➀ Micron has launched its second-generation HBM3 memory, HBM3e, with a capacity of 36GB, a 50% increase over the previous 24GB limit.
➁ The new memory stacks 12 DRAM layers, up from the previous eight, and is aimed at the AI market.
➂ Despite the increased capacity, Micron claims 30% lower power consumption than competitors' eight-layer HBM3e products.
➃ The new memory is expected to be used in upcoming AI accelerators from AMD, Nvidia, and other hardware vendors.
➄ Demand for high-bandwidth memory has surged with the AI boom, driving intense competition for higher capacity and bandwidth.