Samsung Electronics recorded operating profit of KRW 16.4 trillion in its semiconductor business in the fourth quarter of 2025, becoming the first Korean company to usher in the era of KRW 20 trillion in annual profit in this segment. High bandwidth memory (HBM), the core semiconductor of artificial intelligence (AI), emerged as the main pillar of the earnings rebound, pushing annual sales to a new all-time high of KRW 333.6 trillion. The result is seen as a symbolic milestone for gauging the technological competitiveness and strategic direction of K-tech in the era of AI transformation. This article examines the next steps for K-tech around four pillars: Samsung’s semiconductors, AI, global technology hegemony, and the industrial ecosystem.
Samsung Electronics is firmly demonstrating its technological capabilities in the high bandwidth memory (HBM) market and is expanding supply to major global customers such as NVIDIA. Samsung, which at one point lagged in this area, has achieved a technological turnaround, with its next-generation HBM4 receiving NVIDIA’s highest evaluation. With HBM sales projected to more than triple in 2026, attention is focusing on whether Samsung can replicate, in the HBM and custom AI chip markets, the overwhelming technological leadership it once built in memory semiconductors.
The emergence of HBM and its inevitability in the AI era

With the rapid spread of generative AI, the structure of data center servers is changing dramatically. The architecture is shifting from conventional general-purpose processor-based systems to a combination of GPUs (graphics processing units) and high bandwidth memory (HBM). HBM is an advanced memory technology manufactured by vertically stacking multiple layers of conventional DRAM like a building.
The core technology of HBM is known as TSV (Through Silicon Via), which forms fine vertical channels through the stacked silicon dies so that data can move directly between layers. While conventional DRAM exchanges data only horizontally, HBM is interconnected three-dimensionally, including vertically, which dramatically increases data transfer speed (bandwidth). This is a key element in meeting the requirement to process massive volumes of data quickly during AI model training and inference.
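To make the bandwidth gap concrete, the short sketch below compares a conventional DRAM channel with a single HBM stack. The bus widths and per-pin speeds are illustrative values taken from public specifications (a 64-bit DDR5-6400 channel versus a 1,024-bit HBM3 interface), not figures from this article.

```python
# Rough bandwidth comparison: one conventional DRAM channel vs. one HBM stack.
# Illustrative public-spec values, assumed for this sketch:
#  - a DDR5-6400 channel: 64 data pins at 6.4 Gbps per pin
#  - an HBM3 stack: a 1,024-bit interface at 6.4 Gbps per pin

def bandwidth_gb_per_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak transfer rate in gigabytes per second (8 bits = 1 byte)."""
    return bus_width_bits * gbps_per_pin / 8

ddr5_channel = bandwidth_gb_per_s(bus_width_bits=64, gbps_per_pin=6.4)
hbm3_stack = bandwidth_gb_per_s(bus_width_bits=1024, gbps_per_pin=6.4)

print(f"DDR5-6400 channel: {ddr5_channel:.1f} GB/s")  # ~51.2 GB/s
print(f"One HBM3 stack:    {hbm3_stack:.1f} GB/s")    # ~819.2 GB/s
print(f"Ratio: {hbm3_stack / ddr5_channel:.0f}x")     # ~16x wider pipe
```

The takeaway is that even at the same per-pin speed, the much wider, vertically interconnected interface is what gives HBM its bandwidth advantage.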
HBM products are categorized by generation: 4th-generation HBM3 (2021), 5th-generation HBM3E (2023), and the latest 6th generation, HBM4. NVIDIA’s GPUs mount multiple HBM stacks alongside the processor so that data can be fed in at the speed AI computation demands. For example, NVIDIA’s next-generation AI accelerator Rubin, slated for launch in the second half of 2026, will carry multiple high-performance HBM4 stacks to deliver ultra-high-speed AI computation.
Samsung’s HBM development failure and lessons learned
Interestingly, Samsung Electronics was among the first companies to develop HBM technology. However, in 2019, the management of Samsung’s semiconductor division decided to downsize the dedicated HBM team, judging that the HBM market would not be large enough. This contrasted with latecomer SK hynix, which focused on HBM development for survival, even in what was then a small market.
Management’s rationale at the time was that HBM, as high-performance but very expensive memory, was a niche product unlikely to generate mass-market demand. This reflected Samsung’s organizational culture of concentrating on businesses with “guaranteed profitability”: a structure of allocating people and resources to business units with assured near-term profits, rather than to uncertain future technology development, had become entrenched.
The outcome was devastating. As generative AI exploded in 2023, HBM demand far exceeded expectations. SK hynix’s HBM output was sold out through 2025, and other suppliers likewise could not keep up with demand. Failing to read the market’s potential was problem enough, but industry assessments were even harsher on the fact that it took Samsung more than a year to bring product quality up to standard after the market opportunity had become clear.
At the time, there was a view that Samsung’s HBM failed to meet NVIDIA’s quality requirements due to packaging technology issues. However, industry experts pointed to the possibility of fundamental problems in the DRAM design itself. Performance degradation caused by heat generation and power consumption is estimated to have been the reason it did not pass NVIDIA’s rigorous tests.
2025, signs of a technological turnaround

Samsung, however, turned crisis into a turning point. Through intensive technology development and capacity expansion, its market share surged from 17% in the second quarter of 2025 to a significantly higher level in the third quarter, overtaking US-based Micron to reclaim second place. While SK hynix maintained the top spot with a 57–64% share, Samsung’s catch-up began.
Of particular note is that Samsung’s next-generation product HBM4 received the highest scores in NVIDIA’s strict quality evaluations. In December 2025, NVIDIA evaluated Samsung’s 12-layer HBM4 as achieving 11 Gbps (gigabits per second) per pin. This far exceeds the JEDEC international standard of 8 Gbps and meets the high specifications NVIDIA required for Rubin.
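For a rough sense of what that per-pin figure means at the stack level, the sketch below converts per-pin speed into per-stack bandwidth. Only the 8 Gbps and 11 Gbps speeds come from the article; the 2,048-bit interface width is the publicly documented JEDEC HBM4 I/O width, assumed here for illustration.

```python
# Converting per-pin speed into per-stack bandwidth for HBM4.
# Assumption: a 2,048-bit I/O interface per stack (JEDEC HBM4 spec width).

HBM4_IO_WIDTH_BITS = 2048

def stack_bandwidth_tb_per_s(gbps_per_pin: float) -> float:
    """Peak bandwidth of one HBM4 stack in terabytes per second."""
    return HBM4_IO_WIDTH_BITS * gbps_per_pin / 8 / 1000

jedec_baseline = stack_bandwidth_tb_per_s(8.0)   # JEDEC standard speed
tested_speed = stack_bandwidth_tb_per_s(11.0)    # speed cited in the article

print(f"At 8 Gbps/pin:  {jedec_baseline:.2f} TB/s per stack")  # ~2.05 TB/s
print(f"At 11 Gbps/pin: {tested_speed:.2f} TB/s per stack")    # ~2.82 TB/s
print(f"Uplift: {tested_speed / jedec_baseline - 1:.0%}")      # ~38% faster
```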
Samsung Electronics’ industry-first “24Gb GDDR7 DRAM”
Once the performance superiority of Samsung’s HBM4 was validated, NVIDIA requested far larger volumes from Samsung than initially expected. This appears to align with NVIDIA’s push for a multi-sourcing strategy and diversification of its supplier base. Samsung plans to begin supplying HBM4 modules to NVIDIA in the first half of 2026, and the semiconductor industry expects full-scale supply to begin from the second quarter.
HBM revenue in 2026, the reality behind a threefold surge

According to securities industry projections, Samsung Electronics’ HBM revenue in 2026 is expected to surge about threefold from the 2025 level to reach KRW 26 trillion. HBM shipment volume is also projected to triple year-on-year to 11.2 billion Gb, with next-generation HBM4 accounting for about half of total shipments.
What is even more striking is that this outlook is not merely a recovery from the previous downturn. According to KB Securities, Samsung Electronics’ operating profit in 2026 is expected to approach KRW 100 trillion, up 129% from 2025, with rising DRAM prices and expanding HBM shipments identified as the key growth drivers. The fact that Samsung recorded KRW 20 trillion in operating profit in the fourth quarter of 2025 alone indicates it is heading toward the peak of a memory supercycle.
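The 2025 baselines implied by these projections can be recovered with simple arithmetic, sketched below; the derived figures are back-of-the-envelope estimates, not reported numbers.

```python
# Back-of-the-envelope arithmetic implied by the projections cited above.
# The 2025 baselines are derived from the 2026 figures and growth rates in
# the text; they are not independently reported numbers.

hbm_revenue_2026_krw_tn = 26.0   # projected 2026 HBM revenue (KRW trillion)
hbm_growth_multiple = 3.0        # "about threefold" versus 2025
implied_hbm_2025 = hbm_revenue_2026_krw_tn / hbm_growth_multiple

op_profit_2026_krw_tn = 100.0    # KB Securities' 2026 operating profit view
op_profit_growth = 1.29          # "up 129% from 2025"
implied_op_profit_2025 = op_profit_2026_krw_tn / (1 + op_profit_growth)

print(f"Implied 2025 HBM revenue:      ~KRW {implied_hbm_2025:.1f} trillion")       # ~8.7
print(f"Implied 2025 operating profit: ~KRW {implied_op_profit_2025:.1f} trillion")  # ~43.7
```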
The outlook for HBM market share expansion is also striking. From 17% in the second quarter of 2025, Samsung’s share is estimated to climb to around 30–35% in 2026, depending on the institution making the projection. If market share expands to this extent, Samsung is expected to move beyond being a distant second and enter full-scale competition with SK hynix. Since SK hynix currently holds a dominant share of more than 50%, a 30% share for Samsung is seen as the threshold at which a “two-top competition” can emerge.
What is important is that Samsung’s technological capabilities in HBM4 are expected to keep strengthening. A full-scale mass production system for HBM4 is on track to be in place in the first half of 2026, with yields projected to reach the 70% level targeted for volume production around April 2026. With development of next-generation technologies such as HBM5 and HBM6 proceeding in parallel, some industry analysts see the possibility that the technology gap could be fully closed.
Jeon Young-hyun, head of Samsung Electronics’ Device Solutions (DS) Division
The possibility of recreating overwhelming technological leadership

Whether Samsung Electronics can recreate overwhelming technological leadership in HBM, the key memory of the AI era, now appears to be less a question of technology roadmap and more one of organizational determination and execution. The performance of Samsung’s HBM4 is already at a level that surpasses competitors, and its integrated capabilities in design, manufacturing, and packaging are among the best in the world. The key question is how quickly this technological edge can be commercialized and translated into supply stability that satisfies customers.
2026 is expected to be the turning point when these changes enter full swing. Whether Samsung can triple its HBM revenue, push its market share above 30%, and successfully secure customers in the ASIC market is likely to determine the next decade. As the semiconductor industry saying goes, “two years of preparation, ten years of growth”: whether current technological advances evolve into genuine overwhelming leadership will depend on execution over the coming years.
“Q&A” Understanding HBM and Samsung’s technological turnaround in simple terms
Q. What exactly is HBM (high bandwidth memory)?
A. HBM is advanced memory made by vertically stacking layers of DRAM (conventional memory) like a building. Conventional memory exchanges data only horizontally, but HBM is also connected vertically, enabling much faster data transmission. AI training and computation require processing massive amounts of data at high speed, and HBM is the key component that makes this possible.
Q. What is TSV technology and why is it important?
A. TSV (Through Silicon Via) is a technology that creates tiny vertical conduits inside a semiconductor chip to transmit data. Typically, signals move only horizontally across the surface of the chip, but with TSV, a “3D conduit” is created that penetrates the chip. This allows much faster and greater volumes of data transfer, and it is the “core technology” that makes HBM special.
Q. How are HBM3 and HBM4 different?
A. HBM3 was launched in 2021, HBM3E in 2023, and HBM4 is the latest (6th-generation) product. With each generation, data transfer speeds increase. For example, HBM4 has achieved a speed of 11 gigabits per second per pin (for each data connection point), which far exceeds the international standard of 8 gigabits per second.
Q. Why are NVIDIA’s quality requirements so stringent?
A. NVIDIA is the company that makes the world’s most powerful AI chips (GPUs). The HBM used in its latest AI accelerator (Rubin) must meet the highest standards for performance, stability, and thermal management. If the HBM’s performance is inadequate or heat generation is excessive, the performance of the entire system declines. Therefore, NVIDIA conducts extremely rigorous tests and selects only top-quality products.
Q. What is a “memory supercycle”?
A. In the memory chip market, a period when both demand and prices rise sharply is called a “supercycle.” Since 2023, the AI boom has driven explosive demand for HBM and DRAM, pushing prices higher. During such periods, memory chip manufacturers’ profits surge. Saying that Samsung is headed toward the peak of this supercycle means it is on track to maximize earnings in this phase.
Q. What does the “70% yield target in April 2026” mean?
A. “Yield” refers to the proportion of defect-free, fully functional products among those manufactured. For example, if 70 out of 100 produced units pass inspection, the yield is 70%. Because HBM4 uses cutting-edge technology, many imperfect products may be produced initially. Raising the yield to 70% by April 2026 means that preparations for full-scale mass production will be complete. The higher the yield, the lower the cost, and the higher the profitability.
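A minimal sketch of that cost arithmetic follows. Only the 70% yield figure comes from the article; the wafer cost and units-per-wafer values are hypothetical, chosen purely to show why a higher yield lowers the cost of each good unit.

```python
# Why yield drives cost: the same wafer spend is spread over only the good units.
# WAFER_COST and UNITS_PER_WAFER are hypothetical illustration values.

def cost_per_good_unit(wafer_cost: float, units_per_wafer: int, yield_rate: float) -> float:
    """Production cost carried by each unit that passes inspection."""
    good_units = units_per_wafer * yield_rate
    return wafer_cost / good_units

WAFER_COST = 20_000    # hypothetical cost per processed wafer (USD)
UNITS_PER_WAFER = 100  # hypothetical candidate units per wafer

for y in (0.40, 0.55, 0.70):
    print(f"yield {y:.0%}: ${cost_per_good_unit(WAFER_COST, UNITS_PER_WAFER, y):,.0f} per good unit")
# yield 40%: $500 per good unit
# yield 55%: $364 per good unit
# yield 70%: $286 per good unit
```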
Q. What does “two years of preparation, ten years of growth” mean in the article?
A. It is a saying used in the semiconductor industry. It means that if it takes two years to fully develop and prepare a new technology, that technology can generate steady profits for the next ten years. In other words, Samsung’s current development of HBM technology over the past two to three years could drive growth for the next decade, and it underscores how crucial execution in 2026 will be.
Q. Why is it said that Samsung can “recreate” overwhelming technological leadership?
A. Samsung has previously created “overwhelming leads” over competitors in areas such as smartphone memory chips, DRAM, and NAND flash by staying far ahead technologically. The suggestion is that a similar situation is taking shape in HBM. HBM4 performance exceeds that of competitors, and its capabilities in design, manufacturing, and packaging are also world-class. However, technology alone is not enough; what matters is how quickly it can be commercialized and how reliably it can be supplied.