Samsung unveils new memory chip for AI applications

Samsung Electronics, the world’s largest memory chip maker, has announced the development of a new high-bandwidth memory (HBM) chip that boasts the highest capacity and performance in the industry.

The new chip, dubbed HBM3E 12H, is designed to meet the growing demand for high-capacity memory from AI service providers, which rely on generative AI models such as OpenAI’s ChatGPT to produce humanlike responses.

According to Samsung, the HBM3E 12H improves both performance and capacity by more than 50% over its previous HBM3 product, the HBM3 8H. The chip stacks 12 DRAM layers but uses an advanced thermal compression non-conductive film that reduces the height of, and the gaps between, the layers, resulting in a more compact and efficient package.

The HBM3E 12H is expected to be an optimal solution for future systems that require more memory, such as datacenters and supercomputers. Samsung claims that the chip will allow customers to manage their resources more flexibly and reduce total cost of ownership.

Samsung has started sampling the chip to customers and plans to mass produce it in the first half of 2024. The company has also secured a deal to supply its HBM3 chips to Nvidia, the leading GPU maker, for its next-generation GPUs.

Samsung’s announcement comes amid the AI boom fueling the chip industry. Nvidia reported a 265% increase in fourth-quarter revenue, driven by soaring demand for its GPUs, which are used to train and run ChatGPT and other AI models. However, Nvidia CEO Jensen Huang cautioned that the company may not be able to sustain this level of growth for the whole year, citing the global chip shortage and competition from other chipmakers.

Samsung’s rival, SK Hynix, is also a major player in the high-performance memory chip market. The South Korean company was previously the sole mass producer of HBM3 chips supplied to Nvidia, according to a Korea Economic Daily report. SK Hynix is also developing its own HBM4 chip, which is expected to offer higher speed and lower power consumption than HBM3.

The HBM market is projected to grow rapidly in the coming years, as AI applications become more widespread and complex. Samsung and SK Hynix are competing to gain the upper hand in this lucrative segment, while also facing challenges from other chipmakers such as Micron and Intel.
