The new Google Cloud TPU v6e "Trillium" was showcased at SC24 without its heatsinks, offering a rare look at Google's custom AI accelerator and its gains in compute, interconnect, and memory. The v6e doubles HBM capacity from 16GB to 32GB, roughly doubles HBM bandwidth, and delivers around 4.6-4.7x higher INT8 and bfloat16 throughput than the v5e. The new generation also quadruples host interconnect bandwidth, moving from two 100Gbps links to four 200Gbps links, while inter-chip interconnect bandwidth more than doubles.
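To put the headline ratios in context, here is a minimal sketch comparing per-chip v5e and v6e figures. The numbers below are assumptions drawn from Google's publicly listed TPU specifications, not values stated in this article, so treat them as illustrative rather than authoritative:

```python
# Rough per-chip comparison of TPU v5e vs. v6e (Trillium).
# All figures are assumptions based on Google's public spec pages and may
# differ slightly from official numbers; they are here only to illustrate
# the ~4.6-4.7x compute gain and ~2x memory/bandwidth gains cited above.

V5E = {
    "bf16_tflops": 197,   # peak bfloat16 compute, TFLOPS
    "int8_tops":   394,   # peak INT8 compute, TOPS
    "hbm_gb":       16,   # HBM capacity, GB
    "hbm_gbps":    819,   # HBM bandwidth, GB/s
    "ici_gbps":   1600,   # inter-chip interconnect bandwidth, Gbps
}

V6E = {
    "bf16_tflops": 918,
    "int8_tops":  1836,
    "hbm_gb":       32,
    "hbm_gbps":   1640,
    "ici_gbps":   3584,
}

for key in V5E:
    ratio = V6E[key] / V5E[key]
    print(f"{key:12s} v5e={V5E[key]:6} v6e={V6E[key]:6} gain={ratio:.2f}x")
```

Running this prints roughly 4.66x for both bfloat16 and INT8 compute, 2x for HBM capacity and bandwidth, and about 2.2x for inter-chip interconnect bandwidth, consistent with the generational claims described above.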