To secure the production and launch of its AI and HPC GPUs in 2024, NVIDIA has entered a supply partnership with SK Hynix and Micron, both of which will supply HBM3e memory. NVIDIA, for its part, is paying both suppliers in advance as it heads into 2024 with a full slate of GPU products to develop and launch.
According to reports, NVIDIA has paid somewhere between 700 billion and 1 trillion won to the two suppliers. The exact figure is not confirmed, but it is a remarkable sum for the industry and signals the scale of the GPU maker's plans. These advance payments should cover NVIDIA for the year, reducing the risk of it struggling to source HBM3e memory for manufacturing.
NVIDIA already has two products on its launch schedule for next year that will use this memory: the H200 GPU and the GH200 superchip. Given the specifications they are set to launch with, both products will require large volumes of HBM3e.
On the performance front, the H200 is reported to be the most powerful AI chip in the world. It is expected to ship in 4-way and 8-way configurations, outperform its predecessor, the H100, and be the first GPU in the world to use HBM3e memory.
These upcoming NVIDIA chips are expected to be in high demand across AI-driven industries, which means NVIDIA will need a steady supply of HBM3e memory throughout the year. That is the main reason behind the bulk memory purchase from its two major suppliers, SK Hynix and Micron.