Google has committed to integrating multiple generations of Intel CPUs into its AI data centers to enhance training and inference capabilities. The partnership focuses on creating balanced systems to optimize performance as AI workloads grow in complexity.
- Google to use Xeon 6 processors for AI training and inference
- Multi-generational commitment to Intel CPU roadmaps
- Joint development of infrastructure processing units for offloading tasks
- Strategic move toward 'balanced systems' in AI data centers
- Google continues parallel development of custom TPU and Axion chips