Foxconn Technology Launches AI Model FoxBrain, Leveraging Nvidia Hardware

GuruFocus
03-11

Foxconn (FXCOF, Financials) has introduced its first large language model, FoxBrain, which the company said was trained in about four weeks using a cost-efficient approach.

The model was trained on 120 Nvidia (NVDA, Financials) H100 GPUs, scaled with Nvidia Quantum-2 InfiniBand networking and supported by the Taipei-1 supercomputer. Foxconn also received technical guidance from Nvidia through the NeMo framework.

FoxBrain is intended to power Foxconn's Smart Manufacturing, Smart EV, and Smart City platforms, complementing the company's broader push into industrial artificial intelligence. The model is built on Meta Platforms' (META, Financials) Llama 3.1 architecture and has 70 billion parameters.

Foxconn, formally known as Hon Hai Precision (HNHAF, Financials), said the model, developed by the Hon Hai Research Institute, is the first large language model built for Traditional Chinese. The company plans to open-source the model in the future.

Designed initially for internal use, FoxBrain supports data analysis, decision support, document collaboration, mathematics, reasoning, problem-solving, and code generation, the company said.

Foxconn acknowledged that FoxBrain trails DeepSeek's distillation model but said it approaches world-class standards in comprehension and reasoning. The company added that the model has been tuned for Taiwanese language usage and performs strongly in mathematics and logical reasoning.

Foxconn plans to present the first public results for FoxBrain at Nvidia's GPU Technology Conference on March 20 and intends to extend AI applications into manufacturing, supply chain management, and intelligent decision-making.

