Forms standardization consortium with U.S. memory maker Sandisk
HBF, a next-generation product that addresses the limits of HBM
Growing need for HBF in AI inference
“Implementing composite memory solutions including both HBM and HBF”
SK hynix is launching full-scale efforts to standardize specifications for High Bandwidth Flash (HBF), regarded as a next-generation memory product.
SK hynix announced on the 25th (local time) that it held an “HBF Spec. Standardization Consortium Kick-Off” event with U.S. memory company Sandisk. The event took place at Sandisk’s headquarters in Milpitas, California. At the event, the company unveiled a global standardization strategy for HBF, a next-generation memory solution targeting the era of artificial intelligence (AI) inference.
An SK hynix executive stated, "Together with Sandisk, we will establish HBF as an industry standard and build a foundation for the entire AI ecosystem to grow together," adding, "We will form a dedicated workstream, a collaborative framework focused on core tasks, under the Open Compute Project (OCP, the world's largest open data center technology consortium) together with Sandisk and begin full-scale standardization work."
HBF is considered a next-generation semiconductor product and can be understood as the NAND flash counterpart of High Bandwidth Memory (HBM): where HBM vertically stacks DRAM, HBF vertically stacks NAND flash. Although slower than HBM, HBF is said to offer more than 10 times the capacity at lower cost, which is cited as its key advantage.
In the AI industry, the center of gravity is rapidly shifting from the training phase, in which large language models (LLMs) are created, to the inference phase, in which actual services are provided. As the number of users simultaneously accessing AI services surges, fast and efficient memory has become essential, but with existing memory architectures it is difficult to meet both the large-scale data processing and power-efficiency requirements of the inference phase. HBF is being cited as an alternative that can overcome these limitations.
Positioned as a new memory tier between ultra-high-speed HBM and high-capacity storage SSDs, HBF is expected to fill the gap between these two products. It is said to be capable of securing both capacity scalability and power efficiency required in inference workloads. In this structure, existing HBM provides top-level bandwidth, while HBF plays a complementary role. In particular, HBF is drawing attention as a product that can enhance AI system scalability while reducing total cost of ownership (TCO). The industry expects demand for composite memory solutions including HBF to expand in earnest around 2030.
In the AI inference market, system-level optimization that spans CPUs, GPUs, memory, and storage determines competitiveness more than the performance of a single chip. As a result, the role of comprehensive memory solution providers capable of supplying both HBM and HBF is becoming increasingly important.
SK hynix and Sandisk plan to leverage their accumulated design and packaging technologies and mass-production capabilities in HBM and NAND to proactively drive rapid standardization and commercialization of HBF.
Ahn Hyun, President and Chief Development Officer (CDO) at SK hynix, said, "The core of AI infrastructure goes beyond competition in individual technologies to optimizing the entire ecosystem," adding, "Through HBF standardization, we will establish a cooperative framework and present optimized memory architectures for customers and partners in the AI era, thereby creating new value."
ⓒ dongA.com. All rights reserved. Reproduction, redistribution, or use for AI training prohibited.