
Samsung·Nvidia

Samsung to Supply Next-Gen DRAM Module SOCAMM 2 to Nvidia

Dong-A Ilbo | Updated 2025.12.18
View of Samsung Electronics’ headquarters in Yeongtong-gu, Suwon, Gyeonggi Province. 2022.3.28/News1
Samsung Electronics announced through its semiconductor newsroom on the 18th that it has supplied NVIDIA with samples of the second generation of its next-generation DRAM module standard, SOCAMM 2. NVIDIA plans to equip “Vera Rubin,” its next-generation artificial intelligence (AI) accelerator slated for release next year, with SOCAMM 2.

SOCAMM 2 is sometimes called the “second HBM,” after high bandwidth memory, because it processes large volumes of data at low power and thereby improves AI computation efficiency. It is a detachable DRAM module based on low-power double data rate (LPDDR) memory that NVIDIA is pushing as a proprietary standard to secure leadership in the AI memory semiconductor market.

SOCAMM 2 more than doubles data processing bandwidth compared with the registered dual in-line memory modules (RDIMMs) used in conventional servers, while cutting power consumption by more than 55%. This is because it replaces existing DDR-based server memory with LPDDR5X DRAM, which excels at low power consumption. It also maintains stable performance under high-load AI computation.

Industry observers expect SOCAMM 2 to establish itself, alongside HBM, as one of the two main pillars of AI memory and to grow into a market worth several billion dollars within a few years. As AI workloads shift from large-scale training to continuous inference, demand is rapidly increasing for server memory that offers both power efficiency and scalability.

Lee Min-a

AI-translated with ChatGPT. Provided as is; original Korean text prevails.