Nvidia Expected to Unveil Blackwell Ultra AI Chip at GTC 2025

GuruFocus.com
03-19

Nvidia (NVDA, Financials) is set to announce its latest artificial intelligence chip, the Blackwell Ultra, at its GTC conference in San Jose, California. The company is also expected to share details about Rubin, its next-generation AI graphics processor, which is scheduled for release in 2026.


The Blackwell Ultra is an evolution of Nvidia's current AI lineup, with improved performance tailored to AI inference workloads. Industry observers expect Rubin to deliver performance gains comparable to Blackwell's stated thirty-fold speed improvement over earlier AI chips. The growing complexity of large language models, which require substantial computing power, continues to drive demand for high-performance graphics processing units.

Nvidia faces mounting competition as Amazon (AMZN, Financials) and Alphabet (GOOGL, Financials) develop their own AI inference processors in an effort to reduce dependence on outside suppliers. Advances in AI research in China also suggest that capable AI models may be developed without Nvidia's premium processors, potentially eroding the company's market leadership.

Given recent volatility in its stock, Nvidia views the upcoming announcements as critical. Investors are watching closely to see whether the company can sustain its growth in the AI semiconductor market.

At GTC 2025, Nvidia CEO Jensen Huang is expected to take part in a panel discussion on quantum computing. The company has designated March 20 as its inaugural Quantum Day, signaling a deliberate push into quantum computing technology.

This article first appeared on GuruFocus.

