Alphabet has launched its eighth-generation Tensor Processing Unit, a chip specialized for AI inference. The move aims to reduce structural costs and expand revenue through Google Cloud and the company's partnership with Broadcom.
- New TPU 8i chip designed specifically for inference and agentic AI
- 80% improvement in performance per dollar over the previous generation
- Larger SRAM and HBM capacity to reduce data-transfer latency
- Integration capabilities with Axion CPUs
- Revenue expansion through Google Cloud and the Broadcom partnership