Foxconn Technology Launches AI Model FoxBrain, Leveraging Nvidia Hardware

GuruFocus
03-11

Foxconn (FXCOF, Financials) introduced its first large language model, FoxBrain, which was trained in four weeks using a cost-efficient method, the company announced.

The model was trained on 120 Nvidia (NVDA, Financials) H100 GPUs, scaled with Nvidia Quantum-2 InfiniBand networking and supported by the Taipei-1 supercomputer. Foxconn also received technical guidance from Nvidia through the NeMo framework.

FoxBrain is intended to power Foxconn's Smart Manufacturing, Smart EV, and Smart City platforms, complementing the company's broader push into industrial artificial intelligence. The model is built on Meta Platforms' (META, Financials) Llama 3.1 architecture and has 70 billion parameters.

Foxconn, formally known as Hon Hai Precision (HNHAF, Financials), said the model, developed by the Hon Hai Research Institute, is the first Traditional Chinese large language model. The company intends to open-source the model in the future.

Designed for internal use, FoxBrain is applied to data analysis, decision support, document collaboration, mathematics, reasoning and problem-solving, and code generation, the company said.

Foxconn said that while FoxBrain lags DeepSeek's distillation model, it reaches world-class standards in understanding and reasoning. The company added that the model has been optimized for Taiwanese language styles and performs especially well in mathematics and logical reasoning.

Foxconn plans to present the first public results of FoxBrain at Nvidia's GPU Technology Conference on March 20 and intends to extend its AI applications to manufacturing, supply chain management, and intelligent decision-making.

