Samsung Debuts 7th-Gen HBM4E at GTC 2026: 16Gbps Speed and 4TB/s Bandwidth for NVIDIA Vera Rubin
Samsung unveiled its 7th-generation HBM4E memory at NVIDIA GTC 2026, achieving 16Gbps per pin and 4TB/s bandwidth, designed for NVIDIA's next-gen Vera Rubin platform.
On March 17, 2026, Samsung premiered its 7th-generation High Bandwidth Memory, 'HBM4E,' at NVIDIA GTC 2026. The next-generation memory delivers a transfer speed of 16Gbps per pin and 4TB/s of bandwidth, and is optimized for NVIDIA's next-generation AI platform, 'Vera Rubin.'
HBM4E builds on Samsung's 6th-generation HBM4 (11.7Gbps), which is already in mass production. The jump to 16Gbps per pin, an increase of roughly 37%, addresses the massive data-processing demands of next-generation AI workloads, and is expected in particular to ease memory-bandwidth bottlenecks in large language model training and inference.
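The headline figures hang together if HBM4E keeps a 2048-bit-wide data interface per stack, as HBM4 does; the announcement does not state HBM4E's interface width, so that width is an assumption here. Under that assumption, aggregate per-stack bandwidth follows directly from the per-pin rate, as this minimal sketch shows:

```python
# Rough per-stack bandwidth estimate for an HBM stack.
# Assumes a 2048-bit (2048-pin) data interface, as in HBM4;
# the article does not confirm HBM4E's interface width.
def stack_bandwidth_tbps(pin_gbps: float, interface_bits: int = 2048) -> float:
    """Return aggregate stack bandwidth in TB/s (decimal units)."""
    # Gbit/s per pin -> total Gbit/s -> GB/s -> TB/s
    return pin_gbps * interface_bits / 8 / 1000

print(stack_bandwidth_tbps(16.0))   # HBM4E figure -> 4.096 (~4TB/s)
print(stack_bandwidth_tbps(11.7))   # HBM4 figure  -> 2.9952 (~3TB/s)
```

The 16Gbps rate lands almost exactly on the quoted 4TB/s, which suggests the 2048-bit assumption is consistent with the announced numbers.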
Samsung also introduced a new Hybrid Copper Bonding (HCB) technology that enables stacking of 16 or more layers while reducing thermal resistance by over 20%. This technology maintains stable operating temperatures even in high-density stacked structures, maximizing the performance of AI accelerators.
NVIDIA CEO Jensen Huang emphasized the collaborative relationship with Samsung, noting that Samsung manufactures the Groq 3 LPU AI chip. SK hynix and Samsung are competing for dominance in the HBM market, and the HBM4E announcement marks a crucial step toward securing Samsung's position as a key memory supplier for NVIDIA's next-generation platform. As AI infrastructure buildouts accelerate, demand for high-performance memory is expected to grow further.