NVIDIA Mellanox InfiniBand Adapters

Enhancing Top Supercomputers and Clouds with HDR 200 Gb/s

Leveraging faster speeds and innovative In-Network Computing, NVIDIA® InfiniBand smart adapters achieve extreme performance and scale. NVIDIA VPI adapters lower cost per operation, increasing ROI for high performance computing, machine learning, advanced storage, clustered databases, low-latency embedded I/O applications, and more.

High Performance Computing, Accelerated

ConnectX® InfiniBand smart adapters with acceleration engines deliver best-in-class network performance and efficiency, enabling low latency, high throughput, and high message rates for applications at SDR, DDR, QDR, FDR, EDR, and HDR InfiniBand speeds.

  • World-class cluster performance
  • High-performance networking and storage access
  • In-network computing
  • Efficient use of compute resources
  • Guaranteed bandwidth and low-latency services
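
For orientation, the speed grades listed above map to nominal 4x link rates as sketched below. The per-lane figures are the commonly quoted nominal rates; actual signaling rates and encoding overhead differ slightly (e.g., FDR is 14.0625 Gb/s per lane with 64b/66b encoding).

```python
# Nominal per-lane data rates (Gb/s) for each InfiniBand generation.
PER_LANE_GBPS = {
    "SDR": 2.5, "DDR": 5, "QDR": 10, "FDR": 14, "EDR": 25, "HDR": 50,
}

def link_rate(generation: str, lanes: int = 4) -> float:
    """Aggregate nominal link rate in Gb/s for an N-lane port."""
    return PER_LANE_GBPS[generation] * lanes

# A standard 4x port gives 10/20/40/56/100/200 Gb/s across the generations.
for gen in PER_LANE_GBPS:
    print(f"{gen} 4x: {link_rate(gen):g} Gb/s")
```

The HDR adapters on this page use the 4x HDR configuration, i.e., 4 lanes at a nominal 50 Gb/s each for the marketed 200 Gb/s.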


NVIDIA Mellanox ConnectX-6 VPI HDR/200GbE

ConnectX-6 VPI HDR/200GbE

ConnectX-6 VPI (Virtual Protocol Interconnect®) smart adapters support innovative Co-Design and In-Network Computing, including the Scalable Hierarchical Aggregation and Reduction Protocol (SHARP) and enhanced congestion control, to maximally accelerate high performance computing, machine learning, cloud, data analytics, and storage platforms.

ConnectX-6 VPI enables the highest performance and most flexible solutions for the most demanding data center applications by providing:

  • Single and dual port 200Gb/s InfiniBand (HDR) and Ethernet connectivity
  • PCI Express Gen3 and Gen4 server connectivity
  • 215 million messages per second
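
As a rough sketch of how a VPI port is switched between its InfiniBand and Ethernet personalities, NVIDIA's mlxconfig utility (part of the Mellanox Firmware Tools) exposes LINK_TYPE parameters per port; the device path below is illustrative and depends on your system.

```shell
# Query the current port protocol settings (device path is illustrative;
# run "mst start" first and check /dev/mst/ for your adapter).
mlxconfig -d /dev/mst/mt4123_pciconf0 query | grep LINK_TYPE

# Set port 1 to InfiniBand (1) and port 2 to Ethernet (2); the change
# takes effect after a firmware reset or reboot.
mlxconfig -d /dev/mst/mt4123_pciconf0 set LINK_TYPE_P1=1 LINK_TYPE_P2=2
```
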

NVIDIA Mellanox ConnectX-5 VPI EDR/100GbE

ConnectX-5 VPI EDR/100GbE

ConnectX-5 VPI smart adapters with intelligent acceleration engines enhance high-performance computing, machine learning, data analytics, cloud, and storage platforms. With support for two ports of 100Gb/s InfiniBand (EDR) and Ethernet network connectivity, PCIe Gen3 and Gen4 server connectivity, a very high message rate, an integrated PCIe switch, and NVMe over Fabrics offloads, ConnectX-5 VPI adapter cards are a high-performance, cost-effective solution for a wide range of applications and markets.

NVIDIA Mellanox ConnectX-4 VPI EDR/100GbE

ConnectX-4 VPI EDR/100GbE

ConnectX-4 Virtual Protocol Interconnect (VPI) smart adapters support EDR 100Gb/s InfiniBand and 100Gb/s Ethernet connectivity. Providing data centers with high-performance, flexible solutions for HPC (high performance computing), cloud, database, and storage platforms, ConnectX-4 smart adapters combine 100Gb/s bandwidth in a single port with the lowest available latency, 150 million messages per second, and application hardware offloads.

NVIDIA Mellanox ConnectX-3 Pro VPI FDR and 40/56GbE

ConnectX-3 Pro VPI FDR and 40/56GbE

ConnectX-3 Pro smart adapters with Virtual Protocol Interconnect (VPI) support InfiniBand and Ethernet connectivity with hardware offload engines for Overlay Networks ("Tunneling"). ConnectX-3 Pro provides great performance and flexibility for PCI Express Gen3 servers deployed in public and private clouds, enterprise data centers, and high-performance computing.

NVIDIA Mellanox OCP Adapters

OCP Adapters

The Open Compute Project (OCP) defines a mezzanine form factor that features best-in-class efficiency to enable the highest data center performance.

NVIDIA Mellanox Multi-Host Solutions

Multi-Host Solutions

The innovative Multi-Host technology allows multiple compute or storage hosts to connect to a single network adapter.

NVIDIA Mellanox Socket Direct Adapters

Socket-Direct Adapters

Socket Direct technology enables direct PCIe access to multiple CPU sockets, eliminating the need for network traffic to traverse the inter-processor bus.


We're here to help you build the most efficient, high-performance network.

  • Configuration Tools
  • Academy Online Courses
  • Ready to Purchase