High Bandwidth Flash (HBF): The Future of AI Memory, Poised to Surpass HBM with Huge Capacity


High Bandwidth Flash: The Memory of the Future for AI

As HBM gives way to HBF, the memory market is on the brink of a major technological shift. That assessment comes from Professor Kim Jong-ho of KAIST, widely known as the "father" of HBM, who says the focus in AI is now shifting from graphics processors to memory solutions.

What Is High Bandwidth Flash?

HBF is engineered to address the ever-growing data volumes and bandwidth limitations of AI data centers. It is based on NAND flash memory but uses the same vertical chip-stacking technology (through-silicon vias, TSV) as HBM, which gives it several specific advantages:

  • Huge Capacity: HBF offers dozens of times the memory capacity of HBM.
  • Lower Production Costs: The technology is designed to be cheaper to manufacture.
  • AI-Focused Architecture: It is built to handle the large datasets required by neural networks.

In Professor Kim Jong-ho's view, memory capacity and organization will become the primary performance criteria in AI systems, while raw GPU compute will matter less.

Industry Initiatives and Adoption Timeframes

Industry giants are already stepping forward to develop and standardize the HBF technology.

Key milestones include:

  • In August 2025, SK hynix and SanDisk signed an agreement for joint development and standardization.
  • At the October 2025 OCP Global Summit, SK hynix also showcased its AIN B storage line, which features HBF as a major component.
  • Samsung has initiated early design work on its own HBF solutions, leveraging its expertise in high-performance storage.

HBF memory samples are expected in the second half of 2026, with server solutions likely to follow in early 2027.

The Future Role of HBF in AI

Professor Kim envisions a new memory hierarchy for AI systems in which HBF serves as a vast "underground archive": a capacity layer holding enormous amounts of data. The existing HBM would then act as a smaller, faster cache on top of it, fetching the data the GPU needs for active computation from the HBF archive.
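To make the archive-plus-cache idea concrete, here is a minimal toy model of such a two-tier hierarchy: a small, fast "HBM" cache with LRU eviction sitting in front of a huge "HBF" backing store. All names, sizes, and the LRU policy are illustrative assumptions for the sketch, not real device behavior or any vendor's API.

```python
from collections import OrderedDict

class TieredMemory:
    """Toy model of the proposed hierarchy: a small HBM-like cache
    in front of a large HBF-like backing store (sizes illustrative)."""

    def __init__(self, hbm_slots: int):
        self.hbm = OrderedDict()  # small, fast tier (tracks LRU order)
        self.hbf = {}             # large, slower backing tier
        self.hbm_slots = hbm_slots
        self.hits = 0
        self.misses = 0

    def store(self, key, data):
        # New data lands in the large HBF archive.
        self.hbf[key] = data

    def load(self, key):
        # Serve from HBM if cached; otherwise fetch from HBF,
        # evicting the least-recently-used entry if the cache is full.
        if key in self.hbm:
            self.hits += 1
            self.hbm.move_to_end(key)
            return self.hbm[key]
        self.misses += 1
        if len(self.hbm) >= self.hbm_slots:
            self.hbm.popitem(last=False)  # evict the LRU entry
        self.hbm[key] = self.hbf[key]
        return self.hbm[key]

mem = TieredMemory(hbm_slots=2)
for layer in ("w0", "w1", "w2"):
    mem.store(layer, f"weights:{layer}")

mem.load("w0")  # miss: fetched from HBF into HBM
mem.load("w0")  # hit: already resident in HBM
mem.load("w1")  # miss
mem.load("w2")  # miss: evicts w0 (least recently used)
print(mem.hits, mem.misses)  # 1 3
```

The sketch shows why capacity and organization matter in this scheme: the GPU only ever touches the small fast tier, so overall performance hinges on how well the hierarchy keeps the working set resident there.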

About the author

mgtid
Owner of Technetbook | 10+ Years of Expertise in Technology | Seasoned Writer, Designer, and Programmer | Specialist in In-Depth Tech Reviews and Industry Insights | Passionate about Driving Innovation and Educating the Tech Community Technetbook
