NVIDIA InfiniBand Adapters

Enhancing Top Supercomputers and Clouds with NDR 400Gb/s

Leveraging faster speeds and innovative In-Network Computing, NVIDIA InfiniBand smart adapters achieve extreme performance and scale. NVIDIA InfiniBand adapters lower cost per operation, increasing ROI for high-performance computing (HPC), machine learning, advanced storage, clustered databases, low-latency embedded I/O applications, and more.

High-Performance Computing, Accelerated

NVIDIA® ConnectX® InfiniBand smart adapters with acceleration engines deliver best-in-class network performance and efficiency, enabling low latency, high throughput, and high message rates.

  • World-class cluster performance
  • High-performance networking and storage access
  • In-network computing
  • Efficient use of compute resources
  • Guaranteed bandwidth and low-latency services


NVIDIA ConnectX-7 NDR 400Gb/s InfiniBand

ConnectX-7 NDR 400Gb/s InfiniBand

The NVIDIA ConnectX-7 NDR 400Gb/s InfiniBand host channel adapter (HCA) provides the highest networking performance available to take on the world's most challenging workloads. The ConnectX-7 InfiniBand adapter combines ultra-low latency, 400Gb/s throughput, and innovative NVIDIA In-Network Computing acceleration engines, delivering the scalability and feature-rich technology needed for supercomputers, artificial intelligence, and hyperscale cloud data centers.
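As a rough back-of-envelope sketch of what the 400Gb/s figure means in practice (the line rate is from the text; the 1 TB dataset size is an arbitrary assumption, and protocol overhead is ignored):

```python
NDR_GBPS = 400  # NDR line rate from the text, in Gb/s (decimal, SI bits)

# Convert link speed to payload bytes per second: 400 Gb/s -> 50 GB/s.
ndr_bytes_per_sec = NDR_GBPS * 1e9 / 8

# Assumed example: time to move a 1 TB (1e12-byte) dataset at full line
# rate, ignoring encoding, protocol overhead, and congestion.
dataset_bytes = 1e12
transfer_seconds = dataset_bytes / ndr_bytes_per_sec  # 20.0 s
```

In other words, a single NDR port can in principle stream a terabyte-scale dataset in tens of seconds, which is why these links are paired with In-Network Computing offloads rather than left idle waiting on hosts.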

NVIDIA ConnectX-6 VPI HDR/200GbE

ConnectX-6 HDR 200Gb/s InfiniBand

The NVIDIA ConnectX-6 HDR 200Gb/s InfiniBand host channel adapter (HCA) delivers high-performance and NVIDIA In-Network Computing acceleration engines for maximizing efficiency in HPC, artificial intelligence, cloud, hyperscale, and storage platforms.

NVIDIA ConnectX-5 EDR 100Gb/s InfiniBand

ConnectX-5 EDR 100Gb/s InfiniBand

ConnectX-5 VPI smart adapters with intelligent acceleration engines enhance HPC, machine learning, data analytics, cloud, and storage platforms. Supporting two ports of 100Gb/s InfiniBand (EDR) or Ethernet connectivity, PCIe Gen3 and Gen4 server interfaces, a very high message rate, an embedded PCIe switch, and NVMe over Fabrics offloads, ConnectX-5 adapter cards are a high-performance, cost-effective solution for a wide range of applications and markets.

NVIDIA Mellanox ConnectX-4 VPI EDR/100GbE

ConnectX-4 VPI EDR/100GbE

ConnectX-4 Virtual Protocol Interconnect (VPI) smart adapters support EDR 100Gb/s InfiniBand and 100Gb/s Ethernet connectivity. Providing data centers with high-performance, flexible solutions for HPC, cloud, database, and storage platforms, ConnectX-4 smart adapters combine 100Gb/s bandwidth in a single port with the lowest available latency, 150 million messages per second, and application hardware offloads.
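To put the 150-million-messages-per-second figure in context, a hedged arithmetic sketch (both rates are from the text; the framing is an illustration, not a vendor benchmark) shows the average wire bytes per message if the 100Gb/s link were saturated purely by small messages:

```python
EDR_GBPS = 100    # EDR line rate from the text, in Gb/s (decimal, SI bits)
MSG_RATE = 150e6  # message-rate figure from the text, messages per second

# Link payload capacity: 100 Gb/s -> 12.5 GB/s.
link_bytes_per_sec = EDR_GBPS * 1e9 / 8

# Average bytes per message if the link is saturated by messages alone,
# ignoring headers and encoding overhead: roughly 83 bytes per message.
bytes_per_msg = link_bytes_per_sec / MSG_RATE
```

The result (tens of bytes per message) is why message rate, not just bandwidth, matters: small-message workloads such as MPI collectives are limited by how fast the adapter can issue operations.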

NVIDIA Mellanox ConnectX-3 Pro VPI FDR and 40/56GbE

ConnectX-3 Pro VPI FDR and 40/56GbE

ConnectX-3 Pro smart adapters with Virtual Protocol Interconnect (VPI) support InfiniBand and Ethernet connectivity with hardware offload engines for Overlay Networks ("Tunneling"). ConnectX-3 Pro provides great performance and flexibility for PCI Express Gen3 servers deployed in public and private clouds, enterprise data centers, and high-performance computing.


OCP Adapters

The Open Compute Project (OCP) defines a mezzanine form factor that features best-in-class efficiency, enabling the highest data center performance.

NVIDIA Multi-Host Solutions

Multi-Host Solutions

The innovative NVIDIA Multi-Host® technology allows multiple compute or storage hosts to connect to a single network adapter.

NVIDIA Socket Direct Adapters

Socket-Direct Adapters

NVIDIA Socket Direct® technology enables direct PCIe access from multiple CPU sockets, eliminating the need for network traffic to traverse the inter-processor bus.


See how you can build the most efficient, high-performance network.

Configure Your Cluster

Take Networking Courses

Ready to Purchase?