Meta Tests Secret AI Chip to Slash Costs and Challenge Nvidia

GuruFocus.com
13 Mar

March 13 - According to a Reuters report, Meta (META, Financial) is quietly testing its first in-house AI training chip in a bid to slash infrastructure costs and challenge Nvidia's (NVDA, Financial) dominance in the AI hardware market. The chip, part of Meta's Training and Inference Accelerator (MTIA) series and manufactured by Taiwan's TSMC, recently completed a crucial tape-out phase, a key early milestone in chip development.


Meta has forecast total 2025 expenses of up to $119 billion, including as much as $65 billion in capital expenditure, a large share of it earmarked for AI infrastructure. Having spent billions on Nvidia GPUs, the company is now working to reduce its reliance on external suppliers and to optimize its own systems for workloads such as recommendation algorithms and generative AI.

Chief Product Officer Chris Cox described the project as a "walk, crawl, run situation," acknowledging past setbacks while highlighting the chip's potential. If the test succeeds, Meta plans to deploy the chip at scale in its AI training systems by 2026, a move that could reshape its cost structure and tilt the competitive landscape toward greater AI independence. Any misstep, however, could mean significant delays and cost overruns.

This article first appeared on GuruFocus.

