➀ Micron has launched its second-generation HBM3 memory, HBM3e, with a capacity of 36GB, a 50% increase over the previous 24GB limit;
➁ The new memory uses a 12-layer stack, up from the previous eight layers, and is aimed at the AI market;
➂ Despite the increased capacity, Micron claims 30% lower power consumption than competitors' eight-layer HBM3e products;
➃ The new memory is expected to appear in upcoming AI accelerators from AMD, Nvidia, and other hardware vendors;
➄ Demand for high-bandwidth memory has surged with the AI boom, fueling intense competition over higher capacity and bandwidth.