Meta said in talks to use Google Tensor AI chips in fresh headache for Nvidia

Google already supplies up to 1 million Tensor chips to Anthropic PBC, in what has been called a “really powerful validation” for TPUs. (Reuters)


Meta Platforms Inc. is reportedly in talks to spend billions of dollars on Google’s Tensor AI chips, the processors powering Google’s industry-benchmark Gemini 3 model. Nvidia Corp. shares plunged on the news, even as Alphabet Inc.’s shares surged.


The parent company of Facebook, Instagram and WhatsApp is in discussions to use Google’s tensor processing units in AI data centres in 2027, The Information reported, citing sources. Meta may also rent chips from Google Cloud next year.

A deal here would help establish Google Tensor as an alternative to Nvidia’s chips, currently the gold standard for AI computing. Google already supplies up to 1 million Tensor chips to Anthropic PBC, in what has been called a “really powerful validation” for TPUs.

“A lot of people were already thinking about it (TPUs), and a lot more people are probably thinking about it now,” Seaport analyst Jay Goldberg said.

Google Tensor vs Nvidia AI chips

The tensor chip, first developed more than 10 years ago for AI tasks, is gaining momentum outside Google amid worries about over-reliance on Nvidia’s AI chips, with Advanced Micro Devices Inc. (AMD) remaining a distant rival.

  • Nvidia’s Blackwell is essentially a graphics processing unit (GPU), a class of chip that until the last decade served mainly as the “brain” of video games. GPUs turned out to be well-suited for training AI models because they can handle large volumes of data and computation in parallel.
  • Google’s TPUs belong to a category of specialised chips known as “application-specific integrated circuits” (ASICs), microchips designed for a single purpose: AI compute.

So far, the tensor chips have powered only Google’s own Gemini AI models. Their high degree of customisability (Tensor also powers Google’s Pixel phones) has worked to the search giant’s advantage.

Meta AI, powered by Google Tensor?

If Meta chooses Google over Nvidia, the burgeoning AI industry will have a viable alternative to Nvidia’s chips, if not an outright better one.

“Meta’s capex of at least $100 billion for 2026 suggests it will spend at least $40-$50 billion on inferencing-chip capacity next year,” Bloomberg Intelligence analysts Mandeep Singh and Robert Biggar said.

A deal with Meta would mark a win for Google, but much depends on whether the tensor chips can demonstrate the power efficiency and computing muscle necessary to become a viable option in the long run.


