Nvidia Expected to Unveil Blackwell Ultra AI Chip at GTC 2025

GuruFocus.com
03-19

Nvidia (NVDA, Financials) is set to announce its latest artificial intelligence chip, the Blackwell Ultra, at GTC 2025 in San Jose, California. The company is also expected to share details about Rubin, its next-generation AI graphics processor, scheduled for release in 2026.


The Blackwell Ultra is an evolution of Nvidia's current AI lineup, with improved performance tailored to AI inference workloads. Industry observers expect Rubin to deliver a performance jump comparable to Blackwell's stated thirty-fold speedup over earlier AI chips. The growing complexity of large language models, which require substantial computing power, continues to drive demand for high-performance graphics processing units.

Nvidia also faces mounting competition. Amazon (AMZN, Financials) and Alphabet (GOOGL, Financials) are developing their own AI inference processors to reduce their dependence on outside suppliers. In addition, advances in AI research in China suggest that capable models may be built without Nvidia's premium processors, a potential challenge to the company's market leadership.

Given recent volatility in its stock, the upcoming announcements carry significant weight for Nvidia. Investors are watching closely to see whether the company can sustain its growth in the AI semiconductor market.

At GTC 2025, Nvidia CEO Jensen Huang is also expected to take part in a panel discussion on quantum computing. The company has designated March 20 as its first Quantum Day, signaling a deliberate push into quantum computing technology.

This article first appeared on GuruFocus.

