Nvidia Unveils 'Rubin' Superchip -- WSJ

Dow Jones
03-19

By Dan Gallagher

What's Nvidia's next big thing? An AI superchip, named after the astronomer whose observations provided key evidence for dark matter.

Nvidia confirmed at its GTC conference Tuesday that Vera Rubin will succeed its Grace Blackwell lineup as the next generation of the company's artificial intelligence systems. Blackwell chips have only recently started shipping in high volume, with the enhanced Blackwell Ultra lineup due to launch later this year. Rubin chips are expected to start shipping in the second half of next year.

They'll be monsters, at least according to the specs Nvidia CEO Jensen Huang laid out in his GTC keynote on Tuesday. The Vera Rubin systems will sport 3.3 times the computing performance of Blackwell Ultra. Vera Rubin Ultra, the enhanced version of the lineup expected to ship in the second half of 2027, will offer 14 times Blackwell's computing performance.

"Once a year-like clock ticks," Huang said, referring to Nvidia's current pace of new AI chip launches.

Rubin will need to be a monster hit as well. Analysts expect this series of Nvidia's chips to generate nearly $40 billion in revenue in their first year of sales and more than $95 billion in their second year, according to consensus estimates from Visible Alpha. That, incidentally, is more than the annual revenue of nearly three-quarters of the companies in the S&P 500. But as the name suggests, Nvidia has to aim for the stars these days.

This analysis comes from the Journal's Heard on the Street team.


(END) Dow Jones Newswires

March 18, 2025 14:52 ET (18:52 GMT)

Copyright (c) 2025 Dow Jones & Company, Inc.
