SK hynix has announced volume production of its new 12-layer HBM3E memory, offering up to 36GB of capacity and an operating speed of 9.6Gbps per pin. This marks a significant advance in memory technology: it is the largest-capacity HBM product to date.
The new memory is designed for AI GPUs, and SK hynix plans to supply the chips to NVIDIA within the next 12 months. The announcement comes just six months after the company launched the industry's first 8-layer HBM3E memory.
SK hynix's HBM3 and HBM3E memory has been a key component in NVIDIA's Hopper H100 and H200 AI GPUs, as well as its new Blackwell AI GPUs. The new 12-layer HBM3E chips run at 9.6Gbps, the highest memory operating speed on the market.
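The 9.6Gbps figure is a per-pin transfer rate; total bandwidth per stack follows from multiplying it by the interface width. A minimal sketch of that arithmetic, assuming the standard 1024-bit HBM stack interface (an assumption from the JEDEC HBM spec, not a figure from SK hynix's announcement):

```python
# Per-stack HBM bandwidth from per-pin speed (illustrative sketch).
PIN_SPEED_GBPS = 9.6          # per-pin transfer rate, gigabits per second
INTERFACE_WIDTH_BITS = 1024   # assumed standard HBM stack interface width

# Gbps per pin * pins, divided by 8 to convert bits to bytes.
bandwidth_gb_per_s = PIN_SPEED_GBPS * INTERFACE_WIDTH_BITS / 8
print(f"{bandwidth_gb_per_s:.1f} GB/s per stack")  # 1228.8 GB/s, about 1.23 TB/s
```

On these assumptions, a single 12-layer stack would deliver roughly 1.23 TB/s, which is why per-pin speed bumps translate into large gains for bandwidth-hungry AI GPUs.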
SK hynix's President of AI Infrastructure, Justin Kim, emphasized the company's commitment to leadership in AI memory, saying it will remain the world's number-one AI memory provider as it prepares next-generation products to meet the challenges of the AI era.