By
Henry Chandonnet
- Google DeepMind CEO Demis Hassabis said that the "whole supply chain" for memory chips is constrained.
- "You need a lot of chips to be able to experiment on new ideas," Hassabis told CNBC.
- Google produces its own TPUs, but Hassabis said that there were still "key components" that were supply-constrained.
The memory shortage takes no prisoners. Even Google isn't immune.
AI companies are duking it out for greater and greater quantities of memory chips. The problem? The industry is heavily supply-constrained. Costs have skyrocketed, products have been tied up, and some companies — especially those in consumer electronics — are increasing prices.
On the AI front, Google DeepMind CEO Demis Hassabis told CNBC that physical challenges were "constraining a lot of deployment." Google sees "so much more demand" for Gemini and its other models than it could serve, he said.
"Also, it does constrain a little bit the research," Hassabis said. "You need a lot of chips to be able to experiment on new ideas at a big enough scale that you can actually see if they're going to work."
Researchers want chips, whether they work at Google, Meta, OpenAI, or other Big Tech companies, and memory is a key component. Mark Zuckerberg said that AI researchers demanded two things beyond money: the fewest people reporting to them, and the most chips possible.
Hassabis said that wherever there was a capacity constraint, there was a "choke point."
"The whole supply chain is kind of strained," Hassabis said. "We're lucky, because we have our own TPUs, so we have our own chip designs."
Google has long built TPUs — Tensor Processing Units — for internal use. The company also leases them to external customers through its cloud, which has also put Nvidia on edge.
But even access to its own TPUs won't save Google from having to navigate the highly competitive memory market. "It still, in the end, actually comes down to a few suppliers of a few key components," Hassabis said.
Three suppliers dominate memory chip production: Samsung, Micron, and SK Hynix. These companies are struggling to meet demand for chips from AI hyperscalers without dropping their longtime electronics customers.
It doesn't help that AI companies mainly want a different type of memory chip than PC manufacturers do. Large language model producers want HBM (high-bandwidth memory) chips.
Don't expect Google's spending on AI infrastructure and chips to go down anytime soon. On its fourth-quarter earnings call, the company projected capital expenditures of $175 billion to $185 billion for 2026.