Leveraging the highest-performing HBM4 to secure a competitive edge
40% increase in power efficiency compared to existing HBM3E
“Up to 69% improvement in AI service performance”
“A symbolic turning point surpassing AI infrastructure limits”
SK Hynix announced on the 12th that it has completed development of HBM4, its new high-performance memory product for artificial intelligence (AI), and established the world's first HBM4 mass production system. High Bandwidth Memory (HBM) is a high-performance product that vertically stacks and interconnects multiple DRAM dies to dramatically increase data processing speed compared with conventional DRAM. The number after "HBM" denotes the generation: HBM and HBM2 are the first and second generations, HBM2E the third, HBM3 the fourth, and HBM3E the fifth, making the new HBM4 the sixth generation.
SK Hynix emphasized, “We have successfully developed HBM4, which will lead the new AI era, and established the world’s first HBM4 mass production system,” and added, “We have once again demonstrated our leadership in AI memory technology in the global market.”
Cho Joo-hwan, Vice President of SK Hynix in charge of HBM development, stated, “The completion of HBM4 development will set a new milestone for the industry,” and added, “By supplying products that meet customer requirements for performance, energy efficiency, and reliability in a timely manner, we will secure time-to-market competitiveness and a leading position in the AI memory market.”
As AI demand and data processing volumes increase explosively, the demand for high-bandwidth memory to achieve faster system speeds is surging. In HBM products, bandwidth refers to the total data capacity that one HBM package can process per second.
As the operating burden of data centers that consume massive amounts of power continues to grow, memory power efficiency has become a key customer requirement. SK Hynix is targeting this market with HBM4, which offers both higher bandwidth and better power efficiency.
The HBM4 now entering mass production features 2,048 data transmission channels (I/O), doubling bandwidth and improving power efficiency by more than 40% compared with the previous generation. SK Hynix says this delivers world-class data processing speed and power efficiency. The company added that adopting the product in customer systems can improve AI service performance by up to 69%, fundamentally relieving data bottlenecks while significantly reducing data center power costs.
SK Hynix HBM4 (6th Generation)
According to SK Hynix, the product's operating speed, one of the key performance indicators, is 10 gigabits per second (10 Gbps) per pin, surpassing the 8 Gbps standard operating speed set by the Joint Electron Device Engineering Council (JEDEC).
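For reference, the figures quoted above allow a simple back-of-the-envelope estimate of per-stack bandwidth. The short Python sketch below is purely illustrative and assumes the quoted 2,048 I/O channels each run at the quoted 10 Gbps per-pin speed; it is not an SK Hynix specification.

```python
# Illustrative peak-bandwidth estimate from the figures quoted in the article.
# Assumption: 2,048 I/O channels, each running at 10 Gbps (per-pin data rate).

io_channels = 2048      # data transmission channels (I/O) per HBM4 stack
pin_speed_gbps = 10     # operating speed per pin, in gigabits per second

peak_gbps = io_channels * pin_speed_gbps   # total gigabits per second
peak_tb_per_s = peak_gbps / 8 / 1000       # bits -> bytes, giga -> tera

print(f"Theoretical peak bandwidth: {peak_gbps} Gbps ≈ {peak_tb_per_s:.2f} TB/s per stack")
# -> Theoretical peak bandwidth: 20480 Gbps ≈ 2.56 TB/s per stack
```

Under the same arithmetic, the doubling of bandwidth cited above follows directly from doubling the I/O count relative to the previous generation's 1,024-channel interface.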
Risks that could arise during mass production have also been minimized. The company applied its Advanced MR-MUF process and 10-nanometer-class fifth-generation (1bnm) DRAM technology, both of which have proven stability. MR-MUF (Mass Reflow-Molded Underfill) is a process in which a liquid protective material is injected into the gaps between stacked semiconductor chips and then solidified, protecting the circuitry between the chips. Compared with laying down a film-type material each time a chip is stacked, the process is regarded as more efficient and better at dissipating heat. SK Hynix emphasized that its proprietary Advanced MR-MUF process reduces the pressure applied when stacking chips compared with earlier processes, improving warpage control and securing stable mass production capability.
Kim Joo-sun, President of SK Hynix AI Infrastructure (CMO), stated, “The establishment of the world’s first HBM4 mass production system is a symbolic turning point that surpasses the limits of AI infrastructure and will solve the technological challenges of the AI era,” and added, “We will grow into a Full Stack AI Memory Provider by supplying the highest quality and diverse performance memory required by the AI era in a timely manner.”