Tag: InfiniBand
NVIDIA Mellanox NDR 400Gbps InfiniBand Announced
New NVIDIA Mellanox NDR 400Gbps InfiniBand will double current HDR 200Gbps speeds for exascale supercomputers
NVIDIA DGX A100 SuperPod Detailed Look
We get details on how the NVIDIA DGX A100 SuperPod is constructed, including interconnects and storage, direct from Hot Chips 32 in 2020
NVIDIA Acquires Mellanox Bringing a Major Shift
NVIDIA completing its acquisition of Mellanox signals a major shift in the data center. A lot has changed since the deal was announced
Changing Mellanox ConnectX VPI Ports to Ethernet or InfiniBand in Linux
In this guide, we show how to change Mellanox ConnectX VPI ports to either Ethernet or InfiniBand in Linux, with a video walkthrough as well
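For reference, on the older mlx4-driver cards (ConnectX-2/-3 era) the port type can be flipped through sysfs; the Python sketch below illustrates the idea. The PCI address is a placeholder, and newer ConnectX-4/-5 cards instead use mlxconfig's LINK_TYPE parameters, so treat this as illustrative rather than the exact method from the guide.

```python
# Minimal sketch: flip a ConnectX VPI port between Ethernet and InfiniBand
# via the mlx4 sysfs node (requires root; mlx4-based cards only).
from pathlib import Path

PCI_ADDR = "0000:04:00.0"  # placeholder -- find yours with: lspci | grep Mellanox
PORT = 1                   # VPI port number on the adapter

def set_port_type(link_type: str) -> None:
    """Write 'eth' or 'ib' to /sys/bus/pci/devices/<addr>/mlx4_port<N>."""
    node = Path(f"/sys/bus/pci/devices/{PCI_ADDR}/mlx4_port{PORT}")
    node.write_text(link_type + "\n")
    print(f"port {PORT} is now {node.read_text().strip()}")

if __name__ == "__main__":
    set_port_type("eth")  # use "ib" to switch back to InfiniBand
```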
Mellanox ConnectX-5 VPI 100GbE and EDR InfiniBand Review
Our Mellanox ConnectX-5 VPI 100GbE and EDR IB review shows why this PCIe Gen4-capable 100GbE and 100Gbps EDR InfiniBand adapter is in a class by itself
Mellanox ConnectX-6 Brings 200GbE and HDR InfiniBand Fabric to HPC
At SC18, Mellanox showed off several versions of its newest 200GbE and 200Gbps HDR InfiniBand cards. The Mellanox ConnectX-6 family is now the leading NIC solution in the Ethernet and InfiniBand networking race
Move Over EDR: Mellanox Announces 200Gbps HDR InfiniBand Products
100Gbps interconnects not fast enough for you? Mellanox announces 200Gbps HDR InfiniBand switch and adapter products with a 2017 availability date
Custom Firmware for Mellanox OEM InfiniBand Cards – RDMA in Windows...
Build custom firmware for your Mellanox ConnectX-2 InfiniBand card from Dell, Sun, or HP to enable RDMA in Windows Server 2012
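As a rough illustration of the flashing step, the sketch below drives the stock mstflint tool from Python. The PCI address and image filename are assumptions, and burning firmware can brick a card, so this is only a sketch of the workflow the article walks through.

```python
# Hedged sketch: query a ConnectX-2 and burn a custom image with mstflint.
# DEVICE and FIRMWARE are placeholders, not values from the article.
import subprocess

DEVICE = "04:00.0"          # placeholder PCI address of the ConnectX-2
FIRMWARE = "custom-fw.bin"  # placeholder custom image name

# Confirm the card, current firmware version, and PSID before burning.
subprocess.run(["flint", "-d", DEVICE, "query"], check=True)

# Burn the custom image (keep a backup of the original firmware first).
subprocess.run(["flint", "-d", DEVICE, "-i", FIRMWARE, "burn"], check=True)
```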
How To Configure IPoIB with Mellanox HCAs – Ubuntu 12.04.1 LTS
A quick how-to guide on configuring IPoIB with Mellanox HCAs using Ubuntu 12.04.1 LTS. Get up and running in only a few minutes.
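As a quick illustration of what an IPoIB bring-up amounts to, the Python sketch below shells out to standard iproute2 commands; the interface name and address are assumptions, and the original guide makes the same configuration persistent via Ubuntu's /etc/network/interfaces.

```python
# Minimal IPoIB bring-up sketch (requires root). The interface name and the
# 10.10.10.1/24 address are placeholders, not values from the guide.
import subprocess

IFACE = "ib0"            # typical name for the first IPoIB interface
ADDR = "10.10.10.1/24"   # placeholder address on the IPoIB subnet

def run(*cmd: str) -> None:
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

run("modprobe", "ib_ipoib")                   # load the IPoIB driver
run("ip", "link", "set", IFACE, "up")         # bring the interface up
run("ip", "addr", "add", ADDR, "dev", IFACE)  # assign the address
run("ip", "addr", "show", IFACE)              # verify
```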
Mellanox ConnectX-2 MHQH19B-XTR – 40Gbps Windows SMB 3.0 Adapters
In Windows Server 2012 and Windows 8, Microsoft introduced SMB 3.0, which greatly improves shared storage performance. After playing with it a bit, and...