Google’s Ironwood TPU powers Gemini 3, reshaping AI’s future with record efficiency
Google’s latest Tensor Processing Unit (TPU), named Ironwood, has become a key driver of recent AI progress. Unveiled in April 2025, this seventh-generation chip, positioned by Google as an inference-focused accelerator, powered the training of Gemini 3, now recognised as one of the world’s leading multi-modal and reasoning models. Its success is drawing wider industry attention, including from Meta Platforms, which is reportedly considering TPU instances for inference workloads.
The chip’s efficiency gains have also sparked discussions about manufacturing partnerships beyond Google’s usual supplier, TSMC. Unveiled by Alphabet in April 2025, Ironwood was designed for markedly better energy efficiency than earlier generations, delivering twice the performance per watt of the sixth-generation Trillium TPUs. That efficiency helped make it the sole hardware used to train Gemini 3, Google’s most advanced AI model to date, which now leads in multi-modal reasoning tasks, outperforming competitors across text, image, and other modalities.
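To make that figure concrete, the short Python calculation below works through what a doubling of performance per watt implies for a fixed training workload. The baseline value and workload size are arbitrary placeholders for illustration, not published Ironwood or Trillium specifications.

```python
# Illustrative arithmetic only: the numbers are invented, but the relationship
# shows what "2x performance per watt" means for a fixed amount of work.
trillium_perf_per_watt = 1.0                      # normalised baseline (6th-gen Trillium)
ironwood_perf_per_watt = 2.0 * trillium_perf_per_watt

workload = 1_000.0  # fixed amount of training compute, arbitrary units

# Energy consumed scales inversely with performance per watt:
energy_trillium = workload / trillium_perf_per_watt
energy_ironwood = workload / ironwood_perf_per_watt

print(energy_ironwood / energy_trillium)  # 0.5 -> the same job for half the energy,
                                          # or twice the throughput in the same power envelope
```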
Production of Ironwood involved Broadcom, which translated Google’s architectural designs into production-ready chip designs; the chips themselves were fabricated by TSMC, Google’s long-standing manufacturing partner. However, rising demand for TPUs has led Google to explore additional suppliers, and reports suggest Intel Foundry could become a secondary manufacturer, easing potential supply constraints at TSMC.
The growing adoption of custom AI chips like TPUs reflects a broader industry shift: companies increasingly seek inference- and cost-optimised compute rather than relying solely on general-purpose hardware. Meta Platforms has shown interest in Ironwood TPU instances, which became generally available in late 2025. Meanwhile, Intel’s own AI roadmap, focused on inference efficiency and cost reduction, could benefit from this momentum as demand for specialised chips continues to climb.
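As a rough illustration of what inference-oriented TPU usage looks like in practice, the sketch below uses JAX, the framework Google pairs with Cloud TPUs, to compile and run a small forward pass on whatever accelerator is available. The two-layer model, its dimensions, and the random weights are hypothetical placeholders rather than anything tied to Gemini or Ironwood; on a Cloud TPU VM, jax.devices() would list TPU cores, while elsewhere JAX falls back to GPU or CPU.

```python
# Minimal JAX sketch: run a jitted inference step on whichever backend JAX detects.
# The two-layer MLP, its sizes, and the random weights are illustrative placeholders.
import jax
import jax.numpy as jnp

def forward(params, x):
    """Tiny two-layer MLP forward pass (hypothetical model)."""
    h = jax.nn.relu(x @ params["w1"] + params["b1"])
    return h @ params["w2"] + params["b2"]

# jax.jit compiles the function with XLA for the available backend
# (TPU on a Cloud TPU VM, otherwise GPU or CPU).
forward_jit = jax.jit(forward)

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
params = {
    "w1": jax.random.normal(k1, (512, 1024)) * 0.02,
    "b1": jnp.zeros(1024),
    "w2": jax.random.normal(k2, (1024, 256)) * 0.02,
    "b2": jnp.zeros(256),
}
batch = jax.random.normal(k3, (8, 512))

print("Devices visible to JAX:", jax.devices())
logits = forward_jit(params, batch)
print("Output shape:", logits.shape)  # (8, 256)
```

The point of the sketch is that the same jitted code targets TPU, GPU, or CPU through XLA, which is broadly how cloud TPU instances are consumed for inference-heavy workloads.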
Ironwood’s role in training Gemini 3 has cemented its position as a high-performance AI accelerator. With Meta considering TPU adoption and Google evaluating Intel Foundry for additional production, the chip’s influence is expanding. The shift towards custom AI hardware may also create new opportunities for Intel’s manufacturing and design services in a competitive market.