Tech
Google Plans New AI Inference Chips (Next-Gen TPUs) to Challenge Nvidia's Dominance
Google is set to announce its next-generation Tensor Processing Units (TPUs) this week at an event in Las Vegas, with a strong focus on AI inference. Separately, Google is reportedly partnering with Marvell Technology to develop custom AI chips.
Details: The new TPUs are designed to run trained AI models efficiently, complementing Google’s existing silicon efforts and reducing its reliance on Nvidia GPUs for inference workloads.
Impact: This accelerates the custom AI chip boom, pressuring Nvidia while giving hyperscalers better cost and performance control over real-world AI deployment. Marvell shares surged on the news, reflecting investor enthusiasm for diversification in the AI hardware ecosystem. The move aligns with a broader trend of Big Tech building bespoke silicon amid supply constraints and high GPU demand.
