DGL
Features

Description: This container is built with the latest version of Deep Graph Library (DGL), PyTorch, and their dependencies.
Publisher: NVIDIA
Latest Tag: 24.03-py3
Modified: March 26, 2024
Compressed Size: 9.68 GB
Multinode Support: Yes
Multi-Arch Support: Yes

What is inside this container?

Deep Graph Library (DGL) is a Python package built for the implementation and training of graph neural networks on top of existing DL frameworks. NGC Containers are the easiest way to get started with DGL. The DGL NGC Container is built with the latest versions of Deep Graph Library (DGL), PyTorch, and their dependencies.

The DGL NGC Container is optimized for GPU acceleration and contains a validated set of libraries that enable and optimize GPU performance. It also includes software for accelerating data sampling and ETL (cuGraph, NVIDIA RAPIDS), training (cuDNN, NCCL), and inference (TensorRT) workloads.
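
Once you are inside a running container, you can confirm which component versions a given tag ships with; this one-liner only assumes that the dgl and torch Python packages are on the default path:

python3 -c "import dgl, torch; print('DGL', dgl.__version__, '| PyTorch', torch.__version__, '| CUDA', torch.version.cuda)"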

Prerequisites

There are two main prerequisites for DGL containers: a host with NVIDIA GPUs and a supported NVIDIA driver, and Docker with the NVIDIA Container Toolkit, which lets Docker expose GPUs to containers via the --gpus flag used below.
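
As a quick sanity check that both prerequisites are in place, you can run nvidia-smi through the container runtime (using the same <xx.xx> release placeholder as in the commands below):

docker run --rm --gpus all nvcr.io/nvidia/dgl:<xx.xx>-py3 nvidia-smi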

Running the container

Use the following command to run the container, where <xx.xx> is the container version; for example, 23.07 for the July 2023 release:

docker run --gpus all -it --rm nvcr.io/nvidia/dgl:<xx.xx>-py3
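
DGL graph sampling and PyTorch data loaders can spawn multiple worker processes that communicate through shared memory, and Docker's default shared-memory segment is small. If you run into shared-memory errors with multi-process sampling, a common remedy for NGC framework containers (a general suggestion here, not a DGL-specific requirement) is to relax that limit:

docker run --gpus all -it --rm --ipc=host nvcr.io/nvidia/dgl:<xx.xx>-py3

Alternatively, pass an explicit size such as --shm-size=1g instead of --ipc=host.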

Running JupyterLab and examples

To start JupyterLab from the container and view all the included examples:

docker run --gpus all -it --rm -p 8888:8888 nvcr.io/nvidia/dgl:<xx.xx>-py3 bash -c 'source /usr/local/nvm/nvm.sh && jupyter lab'
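
When JupyterLab starts, it prints a URL containing an access token to the console; open the mapped port 8888 from a browser on the host and log in with that token to browse the bundled example notebooks.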

You might want to pull in your own data or persist code outside the DGL container. The easiest method is to mount one or more host directories as Docker bind mounts so your code changes persist.
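
For example, the sketch below mounts a hypothetical host directory /path/to/workspace (a placeholder; substitute your own path) into the container at /workspace:

docker run --gpus all -it --rm -v /path/to/workspace:/workspace nvcr.io/nvidia/dgl:<xx.xx>-py3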

We also have a GraphSAGE training example:

cd examples/graphsage
python3 train_full.py --dataset cora --gpu 0

If you are looking for more examples from DGL, you can find them inside the container under /opt/dgl/dgl-source/.
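
If you just want to verify that DGL can see the GPU from inside the container, a minimal sketch along the following lines (the tiny graph and feature size are invented purely for illustration) builds a small graph and moves it onto the first CUDA device:

import torch
import dgl

# Build a tiny directed graph from explicit source/destination node IDs.
src = torch.tensor([0, 1, 2])
dst = torch.tensor([1, 2, 3])
g = dgl.graph((src, dst))

# Move the graph structure to GPU 0 and attach a random node feature matrix.
g = g.to("cuda:0")
g.ndata["feat"] = torch.randn(g.num_nodes(), 8, device="cuda:0")

print(g)
print("CUDA available:", torch.cuda.is_available())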

Documentation and resources

The container documentation and release notes are available in the NVIDIA Deep Learning Frameworks documentation.

Ethical AI

NVIDIA's platforms and application frameworks enable developers to build a wide array of AI applications. Consider potential algorithmic bias when choosing or creating the models being deployed. Work with the model's developer to ensure:

  • The model meets the requirements for the relevant industry and use case.
  • The necessary instructions and documentation are provided to understand error rates, confidence intervals, and results.
  • The model is being used under the conditions and in the manner intended.