Google's Tensor Processing Units (TPUs) are posing a significant challenge to Nvidia's dominance in the AI chip market, prompting concerns within OpenAI and weighing on Nvidia's stock. Google's Gemini 3 model, trained on its in-house TPUs, is demonstrating performance that rivals ChatGPT, leading OpenAI to declare a 'code red'. Meta is also in discussions to deploy Google TPUs in its data centres, potentially starting with rentals in 2026 and direct hardware deployment in 2027. The deal, reportedly worth billions, would mark a major shift for Meta, previously a devoted Nvidia customer.
Nvidia currently holds about 90% of the GPU-based AI chip market, but analysts predict this could fall to 70% by 2030, with Google's TPU share potentially rising from 5% to 25%. TPUs are application-specific integrated circuits (ASICs) optimised for the matrix and tensor operations at the core of deep learning models, offering greater efficiency and lower power consumption than general-purpose CPUs and GPUs. While Nvidia's CUDA software ecosystem gives its GPUs greater versatility, TPUs excel at large-scale deep learning workloads.
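To illustrate the kind of workload those matrix units are built for, here is a minimal sketch using Google's JAX library (JAX is not mentioned in the article; it is assumed here purely as the most common entry point to TPU hardware). It compiles a single dense layer with XLA, which on a TPU host dispatches the matrix multiply to the chip's matrix units.

```python
# Minimal JAX sketch: the dense matrix multiplies at the heart of deep
# learning are exactly the operations TPU matrix units accelerate.
import jax
import jax.numpy as jnp

@jax.jit  # XLA compiles this for whatever backend is present (TPU, GPU, or CPU)
def dense_layer(x, w, b):
    # One fully connected layer: a large matmul plus bias and nonlinearity.
    return jax.nn.relu(x @ w + b)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (1024, 4096))   # batch of activations
w = jax.random.normal(key, (4096, 4096))   # weight matrix
b = jnp.zeros(4096)

print(jax.devices())       # lists TPU cores when run on a TPU host
y = dense_layer(x, w, b)   # on TPU, the matmul runs on the systolic matrix unit
print(y.shape)             # (1024, 4096)
```

The same jitted function runs unchanged on CPU or GPU backends; the hardware-specific lowering is handled entirely by the XLA compiler, which is part of why workloads written this way can move between Nvidia GPUs and Google TPUs.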
TPUs deliver superior cost-performance for AI inference workloads, with some companies reporting significant cost reductions after switching from Nvidia GPUs. This shift towards TPUs is driven by the increasing importance of AI inference, which is projected to consume 75% of AI compute by 2030, creating a $255 billion market. Samsung and SK Hynix are expected to play key roles in Google's TPU supply chain, further solidifying the TPU ecosystem.