<p>➀ Micron has commenced shipments of HBM4 memory, delivering 2.0TB/s of bandwidth per stack over a 2048-bit interface, a 60% performance boost over HBM3E;</p><p>➁ Initial 36GB stacks target next-generation AI accelerators and are built on Micron's 1-beta DRAM process with advanced memory built-in self-test (MBIST) for reliability;</p><p>➂ Full production ramp is planned for 2026, aligning with next-generation AI hardware releases, while future designs may combine HBM with LPDDR to expand memory capacity.</p>
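As a back-of-the-envelope check, the quoted figures imply a per-pin data rate: 2.0TB/s spread across a 2048-bit interface works out to roughly 7.8Gb/s per data pin. A minimal sketch of that arithmetic (the exact per-pin speed Micron ships is not stated in this summary):

```python
# Derive the implied per-pin data rate from the quoted stack specs.
bandwidth_bytes_per_s = 2.0e12   # 2.0 TB/s per stack (decimal terabytes)
bus_width_bits = 2048            # HBM4 interface width per stack

# Convert bytes/s to bits/s, then divide across the interface pins.
per_pin_gbps = bandwidth_bytes_per_s * 8 / bus_width_bits / 1e9
print(f"{per_pin_gbps:.4f} Gb/s per pin")  # prints "7.8125 Gb/s per pin"
```

The same arithmetic run in reverse shows why the interface doubled: HBM3E's 1024-bit bus would need over 15Gb/s per pin to reach 2.0TB/s, so widening to 2048 bits keeps per-pin signaling rates manageable.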