NVIDIA could launch the Hopper H100 PCIe GPU with 120 GB of memory



NVIDIA's high-performance computing hardware stack is now topped by the flagship Hopper H100 GPU. It features 16896 or 14592 CUDA cores, depending on whether it comes in the SXM5 or the PCIe variant, with the former being more powerful. Both variants come with a 5120-bit memory interface, with the SXM5 version using HBM3 memory running at 3.0 Gbps and the PCIe version using HBM2E memory running at 2.0 Gbps. Both versions offer the same capacity, capped at 80 GB. However, that could soon change, as the latest rumor suggests that NVIDIA could be preparing a PCIe version of the Hopper H100 GPU with 120 GB of an unknown type of memory installed.
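For context, peak memory bandwidth follows directly from bus width and per-pin data rate. The short Python sketch below works through that arithmetic using the figures quoted above; the inputs are the article's numbers, not official NVIDIA specifications.

# Back-of-the-envelope peak bandwidth from bus width and per-pin data rate.
# Inputs are the figures quoted in the article, not official NVIDIA specs.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # bits per second across the whole bus, divided by 8 to get bytes
    return bus_width_bits * data_rate_gbps / 8

print(peak_bandwidth_gb_s(5120, 3.0))  # SXM5, HBM3 at 3.0 Gbps -> 1920.0 GB/s
print(peak_bandwidth_gb_s(5120, 2.0))  # PCIe, HBM2E at 2.0 Gbps -> 1280.0 GB/s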

According to the Chinese website "s-ss.cc", the 120 GB variant of the H100 PCIe card will feature a fully enabled GH100 chip with everything unlocked. As the site suggests, this version will improve memory capacity and performance over the regular H100 PCIe SKU. With HPC workloads increasing in size and complexity, larger memory capacity is needed for better performance. With the recent advances in Large Language Models (LLMs), AI workloads use trillions of parameters for training, most of which is done on GPUs like the NVIDIA H100.
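To illustrate why per-GPU capacity matters at that scale, here is a rough sketch: holding the weights of a trillion-parameter model in 16-bit precision alone already exceeds any single accelerator's memory, before counting gradients, optimizer state, or activations. The parameter counts and byte sizes below are illustrative assumptions, not figures from the report.

# Illustrative assumptions only: 2 bytes per parameter (FP16/BF16 weights),
# ignoring gradients, optimizer state, and activations, which add much more.
def weights_footprint_gb(num_params: float, bytes_per_param: float = 2.0) -> float:
    # memory needed just to hold the model weights, in GB (10^9 bytes)
    return num_params * bytes_per_param / 1e9

for params in (175e9, 1e12):  # a GPT-3-scale model and a 1-trillion-parameter model
    gb = weights_footprint_gb(params)
    print(f"{params / 1e9:,.0f}B params: {gb:,.0f} GB of weights "
          f"(~{gb / 80:.0f} x 80 GB GPUs, ~{gb / 120:.0f} x 120 GB GPUs)")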