Amazon's Secretive GPU Strategy Pays Off as AI Demand Surges

GuruFocus.com
23 Apr

Amazon (NASDAQ:AMZN) put a plan in motion last year to make sure it wouldn't fall behind in the race for AI hardware, and it's starting to pay off. The initiative, known internally as Project Greenland, focused on locking down enough GPU power to support artificial intelligence across Amazon's massive retail business, according to a report from Business Insider.


Instead of letting teams grab GPU resources as needed, Amazon created a more structured system. Access is now based on return on investment and long-term growth goals, not just on who asked first. "GPUs are too valuable to be given out on a first-come, first-served basis," internal documents reportedly said.

While other companies are struggling to get enough GPUs, Amazon's retail division now has full access to them via Amazon Web Services. The company also plans to lean more on its in-house Trainium chips by the end of the year.

Amazon expects to spend $5.7 billion on AWS infrastructure in 2025, reinforcing its bet that AI will keep driving demand across its business. Shares were up over 3% in late morning trading Tuesday.

This article first appeared on GuruFocus.

