Recent #HBM3e news in the semiconductor industry

4 months ago

➀ Tariffs encouraged US memory buyers to pull orders forward into Q1, drawing down inventories;

➁ DRAM prices expected to fall 0-5% QoQ in Q2, with HBM pricing to rise 3-8% due to HBM3e shipments;

➂ Samsung's HBM qualification progress slower than expected, but no significant capacity shift back to DRAM;

➃ SK hynix focusing on server and mobile DRAM, limiting PC DDR5 supply;

➄ DDR4 prices weak due to soft consumer demand and capacity expansion, with PC DRAM prices expected to remain flat QoQ in Q2;

➅ Server DRAM demand driven by top North American CSPs and AI server investment in China, with DDR5 prices expected to stabilise in Q2;

➆ Mobile DRAM demand improved, with LPDDR4X supply ample but prices expected to decline modestly;

➇ GDDR7 demand driven by next-generation graphics cards, with tight supply expected to keep prices flat to slightly down;

➈ GDDR6 prices expected to decline 3-8%, with suppliers bundling to stabilise pricing and clear inventory;

➉ Consumer DRAM seeing more aggressive purchasing, with DDR4 contract prices expected to rise 0-5% in 2Q25, and DDR3 prices likely to remain flat.

DRAM, GDDR, HBM3e, SK Hynix, TrendForce, DDR5, memory, semiconductor
4 months ago

NVIDIA has introduced its DGX Station workstation platform, featuring the GB300 Desktop Superchip, which combines a Grace CPU with a Blackwell GPU optimized for AI workloads.

The platform is designed for software developers, researchers, and data scientists, with a focus on high-speed networking and scalability for extensive AI models.

Full specifications of the 'Desktop Superchip' have yet to be disclosed, but it is expected to combine Grace CPU and Blackwell Ultra GPU components optimized for desktop PCs.

AI, HBM3e, NVIDIA, Superchip
8 months ago
➀ SK hynix is set to sample 16-Hi HBM3e memory in the first half of 2025; ➁ Lightsynq Technologies emerges with $18 million in Series A funding for diamond optical interconnects in quantum computing; ➂ DigiKey is optimistic about the future of the electronics market; ➃ Texas Instruments aims to increase in-house manufacturing to 95% by 2030.
Hynix, DigiKey, HBM3e, Packaging Technology, Quantum Computing, TSMC, Texas Instruments, TrendForce, memory
9 months ago

➀ Samsung Electronics has begun supplying its most advanced high-bandwidth memory, HBM3E, to a major customer after passing key validation procedures.

➁ The customer is believed to be NVIDIA, a leading AI chip manufacturer.

➂ SK Hynix currently holds over 50% of the HBM market share and is the main supplier of HBM3 and HBM3E to NVIDIA and AMD.

➃ Micron started mass production of HBM3E in February and plans to capture 20-25% of the HBM market share by 2025.

➄ Micron's stock fell 4.26% on October 31st following the news.

HBM3e, Market Share, Micron, NVIDIA, Samsung
10 months ago
➀ Samsung Electronics faces a threat to its 32-year dominance of the DRAM market because its mass production of 5th-generation high-bandwidth memory (HBM3E) lags competitors SK hynix and Micron; ➁ Reversing the situation will be harder without the timely introduction of next-generation HBM; ➂ The article highlights the competitive landscape and the impact of technological advancements on market leadership.
HBM3e, Samsung Electronics
10 months ago
➀ SK hynix has started mass production of 12-Hi HBM3E memory stacks, setting the stage for next-generation AI and HPC processors; ➁ The new modules offer a peak bandwidth of 1.22 TB/s per module and a total of 9.83 TB/s with eight stacks; ➂ SK hynix is the first company to mass produce 12-Hi HBM3E memory, with plans to ship by the end of the year for AMD's Instinct MI325X and Nvidia's Blackwell Ultra.
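The quoted figures are self-consistent: the aggregate number is simply the per-module bandwidth times the stack count, and the per-module bandwidth follows from the interface width and pin speed. A quick sanity check (the 1024-bit HBM bus width and 9.6 Gb/s pin speed are assumptions, not stated in the summary):

```python
# Back-of-the-envelope check of the quoted HBM3E bandwidth figures.
# Assumes the standard 1024-bit HBM interface and a 9.6 Gb/s pin speed
# (typical for this generation; not stated in the summary).
PIN_SPEED_GBPS = 9.6    # data rate per pin, Gb/s
BUS_WIDTH_BITS = 1024   # data pins per HBM stack
STACKS = 8

per_stack_tbs = PIN_SPEED_GBPS * BUS_WIDTH_BITS / 8 / 1000  # bits -> bytes, GB -> TB
total_tbs = per_stack_tbs * STACKS

print(f"per stack: {per_stack_tbs:.2f} TB/s")   # ~1.23 TB/s, matching the quoted 1.22 TB/s
print(f"eight stacks: {total_tbs:.2f} TB/s")    # ~9.83 TB/s
```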
AI Processors, HBM3e, Mass Production, Memory Technology, SK Hynix
10 months ago
➀ Micron has launched its second-generation HBM3 memory, HBM3e, with a capacity of 36GB, a 50% increase over the previous 24GB limit; ➁ The new memory uses a 12-layer design, up from the previous eight layers, and targets the AI market; ➂ Despite the increased capacity, Micron claims power consumption is 30% lower than competitors' eight-layer HBM3e products; ➃ The new memory is expected to appear in upcoming AI accelerators from AMD, Nvidia, and other hardware vendors; ➄ Demand for high-bandwidth memory has surged with the AI boom, driving intense competition for higher capacity and bandwidth.
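The 24GB-to-36GB step follows directly from the layer count: capacity scales with stack height at a fixed die density. A minimal sketch, assuming 24 Gb (3 GB) DRAM dies per layer, a plausible die density for this generation that the summary does not state:

```python
# Capacity sanity check for the 8-layer vs 12-layer HBM3e stacks above.
# Assumes 24 Gb (3 GB) DRAM dies per layer -- an assumption, not from the article.
GB_PER_DIE = 24 / 8            # a 24 Gb die holds 3 GB

eight_high = 8 * GB_PER_DIE    # previous 8-layer stack
twelve_high = 12 * GB_PER_DIE  # new 12-layer stack

print(f"8-high: {eight_high:.0f} GB, 12-high: {twelve_high:.0f} GB")
# -> 8-high: 24 GB, 12-high: 36 GB
```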
AI, HBM3e, Micron, memory