Showcasing Next-Generation AI Memory Solutions at CES 2026
The world's first public reveal of the 16-high, 48-gigabyte (GB) HBM4 memory chip is the big highlight of SK hynix's next-generation artificial intelligence memory showcase at CES 2026. Under the theme "Innovative AI, Sustainable Tomorrow," the company will present a wide range of products.
Flagship HBM4 and Current-Gen HBM3E
At the center of the event will be the new 48GB HBM4. It succeeds the 12-high, 36GB HBM4 and represents a significant technical breakthrough: stacking 16 DRAM dies required overcoming demanding challenges in thin-die spacing, wafer warpage, and thermal management.
Alongside the new HBM4, SK hynix is also demonstrating its current market-leading product, the 12-high, 36GB HBM3E. The company is displaying a GPU module for Nvidia's latest AI servers equipped with this memory and showing its operation in a real-world AI environment.
Broader AI Memory Portfolio
Beyond HBM, SK hynix is presenting several other memory products optimized for various AI applications.
- SOCAMM2: A low-power memory module specifically designed for AI servers.
- LPDDR6: The next generation of low-power memory, delivering significantly improved data processing speed and power efficiency for on-device AI.
High-Capacity NAND for AI Data Centers
To address the growing storage requirements of AI data centers, SK hynix is introducing a new NAND flash product.
- 321-Layer QLC NAND: A 2-terabit, quadruple-level cell product optimized for ultra-high-capacity solid-state drives (SSDs).
Key Benefits: This NAND offers one of the industry's highest storage densities available today, along with generational improvements in performance and power efficiency over its predecessor, making it well suited for power-constrained AI data centers.
SK hynix stated that it intends to "closely engage with global customers to create new value together in the AI era."
