SK hynix Details Its DRAM and NAND Roadmap Through 2031
SK hynix presented its next-generation technology roadmap at SK AI Summit 2025, unveiling plans for future DRAM and NAND flash products. The roadmap spans two main phases covering upcoming advanced products such as HBM5, DDR6, and 400+ layer 4D NAND, intended to meet the growing memory demands of the AI industry.
Near-Term Roadmap: 2026 to 2028
The first phase centers on establishing SK hynix's HBM4 technology while advancing the company's conventional DRAM and NAND portfolios.
- HBM Memory: Development includes HBM4 16-Hi stacks, HBM4E in 8, 12, and 16-Hi configurations, and a custom HBM solution.
- Conventional DRAM: The LPDDR6 rollout will be accompanied by a range of AI-oriented DRAM (AI-D) solutions, including LPDDR5X SOCAMM2, MRDIMM Gen2, and LPDDR5R.
- NAND Flash: This period will see the introduction of PCIe Gen5 and Gen6 eSSDs and cSSDs, UFS 5.0, and AI-optimized NAND (AI-N) products.
Long-Term Vision: 2029 to 2031
In the second phase, SK hynix plans to introduce the next generation of standards across its product segments.
- HBM Memory: This phase includes HBM5 and HBM5E, along with custom HBM5 solutions.
- Conventional DRAM: This generation introduces DDR6 memory for PCs, next-generation graphics memory (GDDR7-next), and 3D DRAM.
- NAND Flash: Key advances include 400+ layer 4D NAND, PCIe Gen7 eSSDs and cSSDs, UFS 6.0, and High-Bandwidth Flash (HBF) dedicated to AI inference.
Key Technologies in Development
Beyond specific product names, the roadmap highlights several concepts SK hynix is developing to overcome bottlenecks in AI computing.
- Custom HBM: Functions such as the memory controller are integrated directly into the HBM base die, freeing up silicon area on GPUs/ASICs and lowering power consumption during data transfer.
- AI-Optimized Memory (AI-D & AI-N): DRAM and NAND products designed to address AI bottlenecks at a structural level, improving the efficiency of compute resources in operation.
- High-Bandwidth Flash (HBF): A new NAND technology in development to serve the high-bandwidth needs of AI inference tasks in next-generation PCs.
Industry Outlook
The timeline indicates that mainstream adoption of technologies such as DDR6 and post-GDDR7 graphics memory is still several years away. It also hints at an industry shift toward increasingly specialized and efficient memory as AI workloads become more prevalent. SK hynix CEO Kwak No-jung said the company envisions itself as a "full-stack AI memory creator," acting as a co-designer and partner in future AI computing.