➀ Micron has launched HBM3e, its second-generation HBM3 memory, with a capacity of 36GB, a 50% increase over the previous 24GB limit.
➁ The new memory uses a 12-layer stack, up from the previous eight layers, and is aimed at the AI market.
➂ Despite the increased capacity, Micron claims 30% lower power consumption than competitors' eight-layer HBM3e products.
➃ The new memory is expected to be used in upcoming AI accelerators from AMD, Nvidia, and other hardware vendors.
➄ Demand for high-bandwidth memory has surged with the AI boom, driving intense competition for higher capacity and bandwidth.