Samsung Electronics’ fifth-generation high bandwidth memory (HBM3E) chips have successfully cleared testing for Nvidia’s artificial intelligence (AI) processors, according to three sources familiar with the matter. This achievement marks a significant milestone for Samsung, which has been striving to match the pace of its rival SK Hynix in the competitive advanced memory market.
The successful qualification of Samsung’s 8-layer HBM3E chips represents a crucial advancement for the South Korean tech giant. Although the two companies have not yet formalized a supply agreement, a deal is expected to be concluded soon, with initial shipments slated to begin in the fourth quarter of 2024.
However, Samsung’s 12-layer HBM3E chips have not yet passed Nvidia’s rigorous testing, according to the sources, who requested anonymity because the information is confidential. Nvidia declined to comment on the specifics.
In a statement to Reuters, Samsung said its product testing was proceeding as planned and that it was continuing to optimize its products through collaboration with various customers. The company did not provide additional details.
HBM, a dynamic random access memory (DRAM) standard introduced in 2013, is characterized by vertically stacked chips designed to enhance space efficiency and reduce power consumption. This technology is pivotal for graphics processing units (GPUs) used in AI, facilitating the processing of vast data volumes generated by complex applications.
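For a rough sense of why the stacked design matters to AI accelerators, the sketch below estimates the bandwidth of a single HBM3E stack from commonly cited figures (roughly 9.6 Gb/s per pin across a 1,024-bit interface). These inputs are assumptions drawn from publicly announced specifications, not from this report; the calculation is illustrative only.

```python
# Illustrative back-of-the-envelope estimate of HBM3E per-stack bandwidth.
# The per-pin data rate and interface width are assumed public figures,
# not numbers taken from this article.

GBITS_PER_PIN = 9.6           # assumed HBM3E per-pin data rate, in Gb/s
INTERFACE_WIDTH_BITS = 1024   # assumed width of one HBM stack's interface

# Bandwidth per stack: interface width times per-pin rate, converted from gigabits to gigabytes.
bandwidth_gb_per_s = GBITS_PER_PIN * INTERFACE_WIDTH_BITS / 8

print(f"Estimated bandwidth per HBM3E stack: ~{bandwidth_gb_per_s:.0f} GB/s "
      f"(~{bandwidth_gb_per_s / 1000:.2f} TB/s)")
```

Under these assumptions, one stack delivers on the order of 1.2 TB/s, which is why GPUs used for AI typically pair the processor with several HBM stacks.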
Samsung has been attempting to meet Nvidia’s performance standards for both HBM3E and its predecessor, the fourth-generation HBM3, since last year. Challenges related to heat and power consumption have reportedly hindered progress, though Samsung has reengineered its HBM3E design to address these issues.
After a Reuters report in May said that Samsung’s chips had failed Nvidia’s tests because of heat and power consumption problems, the company denied the claims. Dylan Patel, founder of semiconductor research group SemiAnalysis, noted that Samsung is catching up by shipping its 8-layer HBM3E in the fourth quarter, while SK Hynix stays a step ahead by shipping its 12-layer HBM3E in the same period.
Samsung’s shares rose by 3.0% on Wednesday, outperforming a 1.8% increase in the broader market. In comparison, SK Hynix’s shares increased by 3.4%.
Nvidia recently certified Samsung’s fourth-generation HBM3 chips for use in a less sophisticated processor designed for the Chinese market, a reminder of how the generative AI boom is driving demand for advanced GPUs and the memory that feeds them. Research firm TrendForce expects HBM3E to become the dominant HBM product this year, with shipments concentrated in the second half. SK Hynix forecasts that demand for HBM memory will grow at an annual rate of 82% through 2027.
Samsung projects that HBM3E chips will account for 60% of its HBM chip sales by Q4, a target analysts believe is achievable if the chips receive Nvidia’s final approval by the third quarter. Although Samsung does not disclose revenue specifics for individual chip products, its total DRAM chip revenue for the first half of the year was estimated at 22.5 trillion won ($16.4 billion), with HBM sales potentially comprising about 10% of this figure.
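As a rough cross-check of that 10% estimate, the minimal sketch below uses only the figures cited above and the won-to-dollar rate they imply; the share itself remains an estimate.

```python
# Back-of-the-envelope check of the HBM revenue estimate cited above.
# All inputs come from figures in this article; the 10% share is itself an estimate.

dram_revenue_won = 22.5e12   # estimated first-half DRAM revenue, in won
dram_revenue_usd = 16.4e9    # the article's dollar equivalent
hbm_share = 0.10             # estimated HBM portion of DRAM revenue

won_per_usd = dram_revenue_won / dram_revenue_usd   # exchange rate implied by the article
hbm_revenue_won = dram_revenue_won * hbm_share
hbm_revenue_usd = hbm_revenue_won / won_per_usd

print(f"Implied first-half HBM revenue: ~{hbm_revenue_won / 1e12:.2f} trillion won "
      f"(~${hbm_revenue_usd / 1e9:.1f} billion)")
```

On those assumptions, HBM would account for roughly 2.25 trillion won, or about $1.6 billion, of Samsung’s first-half DRAM revenue.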
Currently, the primary HBM manufacturers are SK Hynix, Micron, and Samsung. SK Hynix has been a major supplier of HBM chips to Nvidia, providing HBM3E chips in late March, while Micron has also committed to supplying Nvidia with HBM3E chips.