The new Google Cloud TPU v6e Trillium was showcased at SC24 without its heatsinks, giving a closer look at Google's custom AI accelerator and its improved performance, interconnect, and memory. The v6e doubles HBM capacity from 16GB to 32GB and doubles memory bandwidth, while INT8 and bfloat16 performance improves by roughly 4.6-4.7x over the v5e. The new generation also quadruples interconnect bandwidth, moving from two 100Gbps links to four 200Gbps links, and inter-chip interconnect bandwidth more than doubles.
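For readers who want to sanity-check the "quadrupled" claim, here is a minimal arithmetic sketch. The link counts and speeds are simply the figures quoted above (two 100Gbps links on v5e, four 200Gbps links on v6e); the function name is illustrative, not a Google API.

```python
# Minimal sketch: working out the aggregate link bandwidth quoted above.
# Inputs are the figures stated in the summary, used purely as arithmetic.

def aggregate_bandwidth_gbps(num_links: int, link_speed_gbps: float) -> float:
    """Total bandwidth across all links, in Gbps."""
    return num_links * link_speed_gbps

v5e_total = aggregate_bandwidth_gbps(2, 100)   # 200 Gbps
v6e_total = aggregate_bandwidth_gbps(4, 200)   # 800 Gbps

print(f"v5e: {v5e_total} Gbps, v6e: {v6e_total} Gbps, "
      f"speed-up: {v6e_total / v5e_total:.0f}x")  # -> 4x, i.e. quadrupled
```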