- Micron has launched HBM3e, its second-generation HBM3 memory, with a capacity of 36GB, a 50% increase over the previous 24GB limit.
- The new memory uses a 12-layer stack, up from the previous eight layers, and is aimed at the AI market.
- Despite the increased capacity, Micron claims 30% lower power consumption than competitors' eight-layer HBM3e products.
- The new memory is expected to be used in upcoming AI accelerators from AMD, Nvidia, and other hardware vendors.
- Demand for high-bandwidth memory has surged with the AI boom, driving intense competition over capacity and bandwidth.