<p>➀ Micron has commenced shipments of HBM4 memory, delivering 2.0 TB/s of bandwidth per stack over a 2048-bit interface, a 60% performance boost over HBM3E;</p><p>➁ Initial 36GB stacks target next-generation AI accelerators and are built on Micron's 1-beta (1β) DRAM process with memory built-in self-test (MBIST) features for reliability;</p><p>➂ Full production ramp is planned for 2026, aligning with next-generation AI hardware releases, while future designs may pair HBM with LPDDR to expand memory capacity.</p>
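<p>As a rough sanity check on the figures above, the sketch below derives the per-pin data rate implied by the quoted 2.0 TB/s and 2048-bit interface, and compares the stack bandwidth against an assumed ~1.2 TB/s per stack for HBM3E. The HBM3E baseline and the decimal TB/s convention are assumptions for illustration, not figures taken from the summary.</p>
<pre><code># Back-of-the-envelope check of the quoted HBM4 numbers (assumptions noted inline).

HBM4_BANDWIDTH_TBPS = 2.0    # per-stack bandwidth quoted above (decimal TB/s assumed)
HBM4_INTERFACE_BITS = 2048   # interface width quoted above
HBM3E_BANDWIDTH_TBPS = 1.2   # assumed HBM3E per-stack bandwidth (not from the summary)

# Implied per-pin data rate: total bits per second divided by the pin count.
per_pin_gbps = HBM4_BANDWIDTH_TBPS * 1e12 * 8 / HBM4_INTERFACE_BITS / 1e9
print(f"Implied per-pin rate: {per_pin_gbps:.1f} Gb/s")    # ~7.8 Gb/s

# Bandwidth uplift relative to the assumed HBM3E baseline.
uplift_pct = (HBM4_BANDWIDTH_TBPS / HBM3E_BANDWIDTH_TBPS - 1) * 100
print(f"Bandwidth uplift vs. HBM3E: {uplift_pct:.0f}%")    # ~67%
</code></pre>
<p>Under these assumptions the numbers work out to roughly 7.8 Gb/s per pin and about a two-thirds bandwidth gain, which is broadly consistent with the 60% figure cited in the summary.</p>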