Samsung Ships First Commercial HBM4 Memory Module, Nvidia’s Vera Rubin Expected To Use It: SSNLF Stock Soars 55%

The company announced it has begun mass production of its new HBM4 memory modules.

  • Samsung’s HBM4 module is expected to be used in Nvidia’s next-generation Vera Rubin AI accelerators.
  • Nvidia reportedly plans to source about 30% of its HBM4 requirements for its upcoming Vera Rubin AI chip from Samsung.
  • The AI giant will source the remaining 70% of the HBM4 modules from SK Hynix, with Micron no longer in the picture.

Samsung Electronics (SSNLF) shares soared nearly 55% on Thursday after the company announced that it has shipped the world’s first commercial high bandwidth memory module, HBM4.


The new HBM4 modules are designed for high-performance AI computing.

“Instead of taking the conventional path of utilizing existing proven designs, Samsung took the leap and adopted the most advanced nodes like the 1c DRAM and 4nm logic process for HBM4,” said Sang Joon Hwang, Executive Vice President and Head of Memory Development at Samsung Electronics.

First HBM4 Modules Sent To Customers

Samsung said it has sent the first HBM4 memory modules to its customers but did not name them. The company is competing with rivals in this space, such as Micron Technology Inc. (MU) and SK Hynix Inc., which also make high-performance memory chips.

Samsung’s HBM4 module is expected to be used in Nvidia’s next-generation Vera Rubin AI accelerators. According to a report by the Chosun Daily citing semiconductor analysis firm SemiAnalysis, Nvidia plans to source about 30% of its HBM4 requirements for its upcoming Vera Rubin AI chip from Samsung.

The AI giant will reportedly source the remaining 70% of the HBM4 modules from SK Hynix, with Micron no longer in the picture.

Samsung’s HBM4 Module Meets Nvidia’s Requirements

According to a TrendForce report from last month, Nvidia raised its data transfer speed requirement to 11 Gbps following a specification revision for the Vera Rubin chip in the third quarter (Q3) of 2025.

Samsung’s new HBM4 module meets these requirements, with the company touting up to 11.7 Gbps data transfer speeds. The Korean giant said that this represents a 22% increase over HBM4’s predecessor, HBM3e.

“HBM4’s performance can be further enhanced up to 13 Gbps, as well, effectively mitigating data bottlenecks that intensify as AI models continue to scale up,” it said.

ByteDance Looks To Samsung For Its AI Chips, Says Report

According to a Reuters report, Chinese tech giant ByteDance is looking to Samsung to manufacture an AI chip that it is currently developing.

ByteDance expects to receive samples by March and plans to produce 100,000 AI chips this year, with the aim of ramping up to 350,000 chips annually.

For updates and corrections, email newsroom[at]stocktwits[dot]com.
