SK Hynix NVIDIA HBM4 Memory Testing and Production Updates
SK Hynix is in the final stretch of qualification testing to supply NVIDIA with HBM4 memory. The company is sending NVIDIA samples built to hit the 11.7 Gb/s data transfer speed that the new Rubin AI GPU architecture requires. Insiders say passing this final quality test is key for SK Hynix to land the big production orders, possibly by March 2026.
To get to this point, SK Hynix had to tweak its designs. Since late 2025, it has been fixing a circuit issue between the HBM4 chips and the Rubin GPU, shrinking the spaces between the DRAM layers and boosting the chip's overall performance. This testing phase matters all the more because Samsung Electronics is also competing for HBM4 orders.
- Performance Goal: 11.7 Gb/s max data transfer speed.
- GPU Match: Made for NVIDIA's Rubin AI accelerators, which are due in late 2026.
- Market Spot: SK Hynix wants to lead the Bin 1 category, NVIDIA's highest-performing memory tier.
SK Group Chairman Choi Tae-won plans to attend the GTC 2026 conference in Silicon Valley, starting March 16. He is expected to meet with NVIDIA CEO Jensen Huang to discuss putting SK Hynix's latest memory tech into NVIDIA's products. The meeting follows recent moves to strengthen the bond between the two companies, as demand for high-capacity AI server memory keeps climbing.
Besides HBM4, SK Hynix says it has completed its 16Gb LPDDR6 mobile DRAM. Built on the 6th-generation 10nm (1c) process, this memory targets the next wave of smartphones and gadgets. Compared with the previous generation, the LPDDR6 offers:
- Speed: 33% faster data processing.
- Energy Use: 20% less power needed.
- Timing: Mass-production readiness in the first half of 2026, with shipments starting in the second half.
The results of these HBM4 quality tests will decide where SK Hynix stands in the AI memory race for the rest of 2026. If the samples pass, the company could reclaim its position as NVIDIA's main supplier for the most advanced AI hardware.
