By Deepa Seetharaman
In the fierce competition to build the best artificial-intelligence systems, the most precious resource isn't data, researchers or cash. It's an expensive chip called a graphics processing unit.
Tech CEOs including Elon Musk, Mark Zuckerberg and Sam Altman think that the difference between dominance and defeat in AI comes down to amassing as many GPUs as possible and networking them together in massive data centers that cost billions of dollars each. If AI requires building at this scale, Silicon Valley's leaders think, then only giants like Microsoft, Meta Platforms, Alphabet's Google and Amazon, or startups with deep-pocketed investors like OpenAI, can afford to do it.
People like Alex Cheema think there's another way.
Cheema, co-founder of EXO Labs, is among a burgeoning group of founders who say they believe success in AI lies in finding pockets of underused GPUs around the world and stitching them together in virtual "distributed" networks over the internet. These chips can be anywhere -- in a university lab or a hedge fund's office or a gaming PC in a teenager's bedroom.
If it works, the setup would allow AI developers to bypass the largest tech companies and compete against OpenAI or Google at far lower cost. That approach, coupled with engineering techniques popularized by Chinese AI startup DeepSeek and other open-source models, could make AI cheaper to develop.
"The fundamental constraint with AI is compute," Cheema says, using the industry term for GPUs. "If you don't have the compute, you can't compete. But if you create this distributed network, maybe we can."
Most advanced GPUs are made by Nvidia. One of its top-of-the-line HGX H100 GPU systems weighs 70 pounds, contains 35,000 parts and starts at a price of a quarter-million dollars. But smaller, less-expensive ones have long been used for other purposes like making videogames come to life and mining cryptocurrencies.
Distributed AI networks would take advantage of the times when these chips aren't rendering "Call of Duty" or mining bitcoins and connect them online to work together to develop AI systems.
The operators of these networks could pay the GPU owners or ask them to donate their chips' time if the AI was being developed for charitable purposes.
Jared Quincy Davis says that while he was a researcher at Google-owned DeepMind, the company was starting to spend more on computing resources than on people. He left in 2022 and created the company Foundry, a platform where customers can look for spare GPUs and rent out their own that aren't being regularly used.
Entrepreneurs are finding these stranded resources in unexpected places. Cheema was recently introduced to a Canadian law firm that is setting up a GPU cluster to be operated on the premises. "While they're asleep, these GPUs aren't doing anything," he says.
The recently founded EXO Labs, which describes its mission as democratizing access to AI, is at the early stages of finding spare GPUs to put together a network.
Thousands of organizations have somewhere between 10 and 100 GPUs that frequently aren't being used, Cheema estimates. "In aggregate, they have more than xAI," he says, referencing Musk's AI startup, which last year built a 100,000-GPU cluster at a data center in Tennessee.
So far, nobody has built a virtual network of GPUs at scale. Some of those that exist have just hundreds of GPUs. And there are plenty of hurdles to overcome.
Foremost is speed. A distributed network is only as fast as its slowest internet connection, whereas chips in the same data center experience virtually no latency. It also isn't clear if a federated network of GPUs is secure enough to ensure that someone's private information doesn't seep out. And how do you find the people and companies with spare chips in the first place?
Another problem: Building AI models is an expensive endeavor, and the people financing these projects are generally averse to added risks. Vipul Prakash, chief executive of Together.AI, initially founded the company to build a decentralized GPU network and then pivoted to working inside data centers for this reason. "Someone who is going to invest a billion in training a model tends to be conservative," he says. "They're spending a lot of money and they're already taking a lot of other types of risks, and they don't want to take infrastructure risks."
The founders pursuing the decentralized path acknowledge those challenges but argue that it is bad for the economy and entrepreneurs to concentrate computational resources in the hands of a few huge tech companies.
They also say they don't need access to a lot of compute to help new AI companies blossom, as evidenced by the success of DeepSeek.
Paul Hainsworth, CEO of decentralized AI company Berkeley Compute, says he has one customer looking to build a cutting-edge AI model larger than the biggest one operated by Meta, which plans to end this year with 1.3 million GPUs. Hainsworth's startup, founded last year, has about 900 GPUs collectively, in two data centers -- one in Wyoming and the other in California. It is also developing a way to let people own GPUs as a financial asset that they can rent out, like a vacation home.
"I'm making a big bet that the big tech companies are wrong that all of the value will be accreted to a centralized place," Hainsworth says.
Write to Deepa Seetharaman at deepa.seetharaman@wsj.com
(END) Dow Jones Newswires
February 21, 2025 10:00 ET (15:00 GMT)
Copyright (c) 2025 Dow Jones & Company, Inc.