Google and Meta Explore Major AI Chip Supply Deal

Meta Platforms is in advanced discussions with Google over a multi-billion-dollar agreement to purchase Google’s tensor processing units (TPUs) for deployment in Meta’s data centers beginning in 2027, according to a report published Tuesday by The Information. The negotiations also include the possibility that Meta could begin renting Google’s chips through Google Cloud as early as next year, a move that would represent a notable shift in how Google commercializes its AI hardware.

Google has historically used its TPUs exclusively within its own data centers, integrating them tightly with its cloud infrastructure and internal AI workloads. Opening the chips up for external sale or rental would mark a strategic departure and could significantly broaden Google’s presence in the data-center processor market. The potential shift comes as Google seeks to accelerate adoption of its TPUs among major enterprise customers and compete more directly with Nvidia, whose GPUs currently dominate the fast-growing market for AI computing capacity.

According to the report, some Google Cloud executives believe that expanding TPU availability to customers such as Meta could allow Google to capture up to 10 percent of Nvidia’s annual revenue. If that estimate holds, the shift could generate billions of dollars in additional revenue and firmly establish Google as a challenger to Nvidia in one of the most lucrative segments of the technology industry.

For Meta, access to Google’s chips—whether through long-term purchases or rental arrangements—would add another layer of supply optionality as the company continues to scale its AI infrastructure. The discussions reflect the growing need for large technology platforms to diversify their compute sources amid intense competition for advanced processors.

While no agreement has been finalized, a deal between two of the largest players in the technology sector would have meaningful implications for the AI hardware landscape. It would expand the commercial reach of Google’s processors, reshape competitive dynamics in the data-center market, and potentially intensify the rivalry with Nvidia at a moment when demand for AI compute continues to surge.