Gigabyte NVIDIA GB300 NVL72 Compute Node
Gigabyte also had an NVIDIA GB300 NVL72 compute node at its GTC 2025 booth. This is the compute tray for NVIDIA's very dense Grace Blackwell racks.

On the front, we get our E1.S SSDs and BlueField-3 DPUs, both of which can be liquid-cooled.

The stars of the show, however, are the GB300 CPU and GPU complexes on the compute tray, all of which are liquid-cooled.

There are so many liquid-cooled components, from the CPUs and GPUs to the NICs and beyond, that the tray needs a large internal manifold for its hot and cold coolant connections.

Even the power distribution is liquid-cooled. Cooler Master, a brand many will know, has gone big on liquid cooling recently, so its cooling solutions are very popular in GB300 NVL72 systems.

On the rear, we get our guide pins, power inputs, and NVLink cable cartridge connectors.

Of course, this is designed so that eighteen of these compute trays can be housed with the NVLink Switch trays, CDU, and power supplies to create a GB300 NVL72 rack. Each compute tray carries two Grace CPUs and four Blackwell Ultra GPUs, which is where the 72 GPUs in the NVL72 name come from.
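As a quick sanity check on those numbers, here is a minimal Python sketch that tallies a rack from its tray composition. The per-tray CPU and GPU counts match the figures above; the count of nine NVLink Switch trays is an assumption carried over from the GB200 NVL72 design rather than something shown at the booth.

# Minimal sketch: tally a GB300 NVL72 rack from its tray composition.
COMPUTE_TRAYS = 18        # compute trays per rack (per the article)
GPUS_PER_TRAY = 4         # Blackwell Ultra GPUs per compute tray
CPUS_PER_TRAY = 2         # Grace CPUs per compute tray
NVLINK_SWITCH_TRAYS = 9   # assumption, carried over from GB200 NVL72

total_gpus = COMPUTE_TRAYS * GPUS_PER_TRAY  # 18 * 4 = 72, hence "NVL72"
total_cpus = COMPUTE_TRAYS * CPUS_PER_TRAY  # 18 * 2 = 36 Grace CPUs
print(f"GPUs: {total_gpus}, Grace CPUs: {total_cpus}, NVLink Switch trays: {NVLINK_SWITCH_TRAYS}")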
Final Words
It is always fun to see these big servers. The GIGAPOD, offered in both air-cooled and liquid-cooled versions, takes an approach many vendors have not: instead of selling individual systems, or systems with only one cooling option, Gigabyte offers both. Many organizations simply do not have the power density to handle dense racks of today's 4U GPU compute servers, so they are fine using air-cooled servers. Beyond that, getting to see the MGX platform for PCIe GPUs and the GB300 NVL72 compute tray is always a treat. Those four systems combine to cover a wide array of GPU compute scenarios.