Samsung's HBM Chips Face Setbacks in Nvidia's AI Processors Due to Heat Issues

A Samsung smartphone sits on display at an AT&T store on April 16, 2024 in Austin, Texas. Samsung has become the top phonemaker as Apple smartphone shipments have dropped about 10% in the first quarter of 2024. Brandon Bell/Getty Images

Samsung Electronics' latest high bandwidth memory (HBM) chips have yet to pass Nvidia's testing for use in the company's AI processors because of heat and power consumption problems, according to sources cited by Reuters.

This is the first time the reasons for Samsung's failure to pass Nvidia's tests have been reported.

Samsung's HBM Fails Nvidia Tests

The company recently announced the development of HBM3E 12H, the industry's first 12-stack HBM3E DRAM and the highest-capacity HBM product to date.

Samsung's HBM3E 12H offers an unprecedented bandwidth of up to 1,280 gigabytes per second (GB/s) and an industry-leading capacity of 36 gigabytes (GB). Compared to the 8-stack HBM3 8H, both characteristics have improved by more than 50%.
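As a rough check on that comparison, the short Python sketch below computes the percentage improvements. The HBM3 8H baseline figures (roughly 819 GB/s of bandwidth and 24 GB of capacity) are commonly cited specifications assumed here for illustration; they do not appear in the article.

```python
# Rough sanity check of the ">50% improvement" claim for HBM3E 12H vs. HBM3 8H.
# The HBM3 8H baseline figures are assumptions (commonly cited specs),
# not values taken from the article.

hbm3e_12h = {"bandwidth_gbps": 1280, "capacity_gb": 36}  # figures from the article
hbm3_8h   = {"bandwidth_gbps": 819,  "capacity_gb": 24}  # assumed baseline

for metric in ("bandwidth_gbps", "capacity_gb"):
    improvement = (hbm3e_12h[metric] / hbm3_8h[metric] - 1) * 100
    print(f"{metric}: {improvement:.0f}% improvement")

# Approximate output:
# bandwidth_gbps: 56% improvement
# capacity_gb: 50% improvement
```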

However, according to the sources, the heat and power consumption issues affect Samsung's HBM3 chips, the fourth-generation HBM standard currently most commonly used in graphics processing units (GPUs) for artificial intelligence. They also affect the fifth-generation HBM3E chips that the South Korean tech giant and its competitors are introducing this year.

Samsung stated to Reuters that HBM is a customized memory product that requires "optimization processes in tandem with customers' needs," and that it is currently refining its products in close collaboration with customers.

With the rapid expansion of AI applications, the HBM3E 12H is poised to become the go-to choice for upcoming systems that demand increased memory capacity.

Samsung's High Bandwidth Memory Chips

According to the three sources, Samsung has been trying to pass Nvidia's tests for HBM3 and HBM3E since last year; results of a recent failed test of the company's 8-layer and 12-layer HBM3E chips came in April.

It was unclear whether the issues could be easily resolved, but the three sources said that Samsung's failure to meet Nvidia's requirements has raised industry and investor concerns that the company could fall further behind rivals SK Hynix (000660.KS) and Micron Technology (MU.O) in HBM.

The individuals, two of whom were briefed on the matter by Samsung executives, spoke on condition of anonymity because the information is sensitive.

According to Samsung Newsroom, the increased performance and capacity of the product will provide customers with greater flexibility in resource management and help reduce the overall cost of owning and operating datacenters.

In AI applications, Samsung estimates that adopting HBM3E 12H instead of HBM3 8H can increase the average speed of AI training by 34% and support more than 11.5 times as many simultaneous users of inference services.
