
Tag: Mellanox


NVIDIA ConnectX-7 Shown at ISC 2022

At ISC 2022, NVIDIA showed off its 200Gbps and 400Gbps generation ConnectX-7 InfiniBand PCIe Gen5 adapters

HPE 25GbE NVIDIA ConnectX-4 OCP NIC 3.0 Adapter Review

We take a look at an HPE 25GbE OCP NIC 3.0 adapter based on the Mellanox (now NVIDIA) ConnectX-4 Lx to see how HPE implemented the standard

Building the Ultimate x86 and Arm Cluster-in-a-Box

We have the Ultimate Cluster-in-a-Box with 8 nodes, 120 cores, 184 threads (both x86 and Arm), 1.4Tbps of network bandwidth, and 624GB of RAM

CPU-GPU-NIC PCIe Card Realized with NVIDIA BlueField-2 A100

As a precursor to Grace, we found the NVIDIA BlueField-2 A100 combining an Arm CPU, Mellanox NIC, A100 GPU, memory, storage, and possibly NVLink

NVIDIA NDR InfiniBand 400Gbps Switches

NVIDIA NDR InfiniBand switch systems discussed at ISC21, ranging from 32 to 2,048 ports of 400Gbps NDR or 64 to 4,096 ports of 200Gbps NDR200

Intel IPU is an Exotic Answer to the Industry DPU

The Intel IPU is the company's exotic answer to the industry-wide DPU effort to separate data center infrastructure and application layers

DPU vs SmartNIC and the STH NIC Continuum Framework

We are introducing the 2021 STH NIC Continuum framework for discussing NIC types to help categorize DPU vs SmartNIC and other solutions

A Quick Look at Logging Into a Mellanox NVIDIA BlueField-2 DPU

We have a quick piece to show how the NVIDIA BlueField-2 DPU is effectively a server on a PCIe card by logging into the DPU

NVIDIA BlueField-2 DPU Available, BlueField-3 Samples in Q1 2022

NVIDIA BlueField-2 DPUs are now available, and BlueField-3 samples are expected in Q1 2022 as 400Gbps PCIe Gen5 devices

NVIDIA Mellanox NDR 400Gbps InfiniBand Announced

New NVIDIA Mellanox NDR 400Gbps InfiniBand will double current HDR 200Gbps speeds for exascale supercomputers