Tag: Infiniband

NVIDIA ConnectX-7 Shown at ISC 2022

At ISC 2022, NVIDIA showed off its 200Gbps and 400Gbps generation NVIDIA ConnectX-7 InfiniBand PCIe Gen5 adapters

NVIDIA Cedar Fever 1.6Tbps Modules Used in the DGX H100

Each new NVIDIA Cedar module has four ConnectX-7 controllers for up to 1.6Tbps, and there are two modules in the DGX H100

Building the Ultimate x86 and Arm Cluster-in-a-Box

We have the Ultimate Cluster-in-a-Box with 8 nodes, 120 cores, 184 threads (both x86 and Arm), 1.4Tbps of network bandwidth, and 624GB of RAM

NVIDIA Quantum-2 400G Switches and ConnectX-7 at GTC Fall 2021

NVIDIA has new 400G generation solutions coming, including the NVIDIA Quantum-2 switches, ConnectX-7 NICs, and BlueField-3 DPUs

NVIDIA NDR InfiniBand 400Gbps Switches

NVIDIA NDR InfiniBand switch systems discussed at ISC21, ranging from 32 to 2048 400Gbps NDR ports or 64 to 4096 200Gbps NDR200 ports

NVIDIA Mellanox NDR 400Gbps InfiniBand Announced

The new NVIDIA Mellanox NDR 400Gbps InfiniBand will double current HDR 200Gbps speeds for exascale supercomputers

NVIDIA DGX A100 SuperPod Detailed Look

We get details on how the NVIDIA DGX A100 SuperPod is constructed, including interconnects and storage, direct from Hot Chips 32 in 2020

NVIDIA Acquires Mellanox Bringing a Major Shift

NVIDIA completing its acquisition of Mellanox signals a major shift in the data center. A lot has changed since the deal was announced

Changing Mellanox ConnectX VPI Ports to Ethernet or InfiniBand in Linux

In this guide, we show how to change Mellanox ConnectX VPI ports to either Ethernet or InfiniBand in Linux, with a video walkthrough as well

Mellanox ConnectX-5 VPI 100GbE and EDR InfiniBand Review

Our Mellanox ConnectX-5 VPI 100GbE and EDR IB review shows why this PCIe Gen4-capable 100GbE and 100Gbps EDR InfiniBand adapter is in a class by itself