The NGC container registry provides a comprehensive catalog of GPU-accelerated AI containers that are optimized, tested and ready-to-run on supported NVIDIA GPUs on-premises and in the cloud. AI containers from NGC, including TensorFlow, PyTorch, MXNet, NVIDIA TensorRT™, and more, give users the performance and flexibility to take on their most challenging projects with the power of NVIDIA AI. This helps data scientists and researchers rapidly build, train, and deploy AI models to meet continually evolving demands.
The NGC container registry features the top software for accelerated data science, machine learning, and analytics. Tap into powerful software for executing end-to-end data science training pipelines entirely on the GPU, reducing training time from days to minutes.
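For example, the RAPIDS containers on NGC include cuDF, which keeps a familiar DataFrame workflow in GPU memory. The short sketch below is only an illustration, assuming a container with cuDF installed; the column names and values are invented for the example.

    # Minimal sketch, assuming an NGC RAPIDS container with cuDF installed.
    # The column names and values are invented for illustration.
    import cudf

    # Build a small DataFrame directly in GPU memory.
    df = cudf.DataFrame({
        "sensor": ["a", "b", "a", "b"],
        "reading": [1.0, 2.5, 3.0, 4.5],
    })

    # The group-by and aggregation execute on the GPU; only the small
    # result is copied back to the host for printing.
    print(df.groupby("sensor")["reading"].mean())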
The NGC container registry provides a comprehensive catalog of GPU-accelerated containers for AI, machine learning and HPC that are optimized, tested and ready-to-run on supported NVIDIA GPUs on-premises and in the cloud.
Each AI container includes the NVIDIA GPU Cloud Software Stack, a pre-integrated set of GPU-accelerated software. The stack includes the chosen application or framework, the NVIDIA CUDA Toolkit, NVIDIA deep learning libraries, and a Linux OS, all tested and tuned to work together immediately with no additional setup.
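As an informal way to see what the stack bundles, the sketch below prints the framework, CUDA, and cuDNN versions from inside a container. It assumes it is run inside the PyTorch NGC container; other containers expose their component versions differently.

    # Minimal sketch, assuming it runs inside the PyTorch NGC container,
    # where the framework, CUDA, and cuDNN libraries come pre-installed.
    import torch

    print("PyTorch version:", torch.__version__)
    print("CUDA version:", torch.version.cuda)
    print("cuDNN version:", torch.backends.cudnn.version())
    print("GPU visible:", torch.cuda.is_available())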
The NGC container registry features the top AI software such as TensorFlow, PyTorch, MXNet, NVIDIA TensorRT™, RAPIDS, and many more. Browse the NGC container registry to see the full list.
Containers from the NGC container registry work across a wide variety of NVIDIA GPU platforms, including NVIDIA GPUs on the top cloud providers, NVIDIA DGX Systems, NGC-Ready systems, and PCs and workstations with select NVIDIA TITAN and Quadro GPUs. View the NGC documentation for more information.
Yes, the terms of use allow NGC containers to be used on desktop PCs running NVIDIA Pascal and later GPUs.
NVIDIA offers virtual machine images in the marketplace section of each supported cloud service provider. To run an NGC container, simply pick the appropriate instance type, launch an instance from the NVIDIA image, and pull the container you want from the NGC container registry. The exact steps vary by cloud provider, but you can find step-by-step instructions in the NGC documentation.
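As a rough sketch of that last step (not the official procedure; the NGC documentation covers the per-provider details), the example below pulls and runs a framework container using the Docker SDK for Python. It assumes Docker, the NVIDIA Container Toolkit, and the docker Python package are already installed on the instance, that you have an NGC API key, and that the nvcr.io/nvidia/pytorch image and 24.01-py3 tag are used only as examples of a current release.

    # Rough sketch: pull and run an NGC container from a cloud instance.
    # Assumes Docker, the NVIDIA Container Toolkit, and the docker Python
    # SDK are installed; the image tag below is only an example.
    import docker

    client = docker.from_env()

    # Log in to the NGC container registry (nvcr.io). With an NGC API key,
    # the username is the literal string "$oauthtoken".
    client.login(username="$oauthtoken",
                 password="<your NGC API key>",
                 registry="nvcr.io")

    # Pull a framework container; substitute a current release tag.
    image = client.images.pull("nvcr.io/nvidia/pytorch", tag="24.01-py3")

    # Run it with all GPUs exposed (equivalent to `docker run --gpus all`).
    output = client.containers.run(
        image,
        command="nvidia-smi",
        device_requests=[docker.types.DeviceRequest(count=-1,
                                                    capabilities=[["gpu"]])],
        remove=True,
    )
    print(output.decode())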
The most popular deep learning frameworks, such as TensorFlow, PyTorch, and MXNet, are updated monthly by NVIDIA engineers to optimize the complete software stack and get the most from your NVIDIA GPUs.
Users get access to the NVIDIA DevTalk Developer Forum (https://devtalk.nvidia.com), which is supported by a large community of AI and GPU experts from the NVIDIA customer, partner, and employee ecosystem.
NVIDIA is accelerating the democratization of AI by giving data scientists, researchers and developers simplified access to GPU-accelerated software. This makes it easy for them to run these optimized containers on NVIDIA GPUs in the cloud or on local systems.
There is no charge to download containers from the NGC container registry (subject to the terms of use). However, for running in the cloud, each cloud service provider has its own pricing for GPU compute instances.
Please see https://ngc.nvidia.com/legal/terms