Samsung Signs $3 Billion HBM3E 12H Supply Agreement with AMD


Korean media reports that Samsung Electronics has signed a 4.134 trillion won ($3 billion) agreement with AMD to supply 12-high HBM3E stacks. AMD uses HBM stacks in its AI and HPC accelerators based on its CDNA architecture. The deal is significant because it gives analysts a sense of the AI GPU volumes AMD is preparing to push into the market, provided they can estimate what share of an AI GPU's bill of materials its memory stacks account for. AMD has probably negotiated a good price for Samsung's HBM3E 12H stacks, given that rival NVIDIA sources its HBM3E almost exclusively from SK Hynix.
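The reported dollar figure follows directly from the won amount. A minimal sketch of that conversion, assuming an exchange rate of roughly 1,378 won per dollar (an assumption consistent with the reported figures, not stated in the article):

```python
# Back-of-the-envelope check of the reported deal size.
# KRW_PER_USD is an assumed exchange rate, not from the article.
KRW_PER_USD = 1378
deal_krw = 4.134e12  # 4.134 trillion won, as reported

deal_usd = deal_krw / KRW_PER_USD
print(f"${deal_usd / 1e9:.2f} billion")  # → $3.00 billion
```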

The AI GPU market is expected to heat up with the ramp of NVIDIA's "Hopper" H200 series, the advent of NVIDIA's next-generation "Blackwell" architecture, AMD's MI350X CDNA3, and Intel's Gaudi 3 generative AI accelerator. Samsung debuted its HBM3E 12H memory in February 2024. Each stack features 12 layers, a 50% increase over first-generation HBM3E, and offers a density of 36 GB per stack. An AMD CDNA3 chip with 8 such stacks would have 288 GB of memory on package. AMD is expected to launch the MI350X in the second half of 2024. The star attraction of this chip is its refreshed GPU tiles built on the TSMC 4 nm EUV foundry node. This seems like the ideal product for AMD to debut HBM3E 12H on.
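The 288 GB figure follows from simple stack arithmetic. A minimal sketch, assuming HBM3E 12H stacks twelve 24 Gb (3 GB) DRAM dies, which matches the 36 GB-per-stack density cited above:

```python
# Capacity math behind the 288 GB on-package figure.
# The 24 Gb (3 GB) per-die density is an assumption consistent
# with the 36 GB-per-stack figure; it is not stated in the article.
DIES_PER_STACK = 12     # "12-high" HBM3E 12H
GB_PER_DIE = 3          # assumed 24 Gb DRAM die
STACKS_PER_PACKAGE = 8  # as on the CDNA3 chip described above

stack_gb = DIES_PER_STACK * GB_PER_DIE      # 36 GB per stack
package_gb = stack_gb * STACKS_PER_PACKAGE  # 288 GB on package
print(stack_gb, package_gb)  # → 36 288
```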