
Merlin Training

Description: This container allows users to do preprocessing and feature engineering with NVTabular, and then train a deep-learning based recommender system model with HugeCTR.
Publisher: NVIDIA
Latest Tag: latest
Modified: March 1, 2024
Compressed Size: 8.11 GB
Multinode Support: No
Multi-Arch Support: No

What is Merlin for Recommender Systems?

NVIDIA Merlin is a framework for accelerating the entire recommender systems pipeline on the GPU: from data ingestion and training to deployment. Merlin empowers data scientists, machine learning engineers, and researchers to build high-performing recommenders at scale. Merlin includes tools that democratize building deep learning recommenders by addressing common ETL, training, and inference challenges. Each stage of the Merlin pipeline is optimized to support hundreds of terabytes of data, all accessible through easy-to-use APIs. With Merlin, better predictions than traditional methods and increased click-through rates are within reach.

The Merlin ecosystem has four main components: Merlin ETL, Merlin Dataloaders, Merlin Training, and Merlin Inference.

Merlin Training for ETL with NVTabular and Training with HugeCTR

The Merlin-training container allows users to do preprocessing and feature engineering with NVTabular, and then train a deep-learning based recommender system model with HugeCTR.

As the ETL component of the Merlin ecosystem, NVTabular is a feature engineering and preprocessing library for tabular data designed to quickly and easily manipulate terabyte scale datasets used to train deep learning based recommender systems. The core features are explained in the API documentation and additional information can be found in the GitHub repository.
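As a conceptual illustration only (this is not NVTabular's actual API; NVTabular runs this kind of operation on the GPU with cuDF over out-of-core datasets), the heart of a categorical-preprocessing op such as Categorify is mapping raw categorical values to contiguous integer IDs that an embedding table can later index:

```python
def categorify(column):
    """Map raw categorical values to contiguous integer IDs.

    ID 0 is reserved for unseen/null values, mirroring the common
    convention for embedding-table padding. This is a pure-Python
    sketch of the idea, not NVTabular's implementation.
    """
    vocab = {}
    encoded = []
    for value in column:
        if value is None:
            encoded.append(0)
            continue
        if value not in vocab:
            vocab[value] = len(vocab) + 1  # IDs start at 1; 0 is reserved
        encoded.append(vocab[value])
    return encoded, vocab

item_ids = ["movie_a", "movie_b", "movie_a", None, "movie_c"]
encoded, vocab = categorify(item_ids)
# encoded == [1, 2, 1, 0, 3]
```

The contiguous IDs matter because they become row indices into the embedding tables trained downstream by HugeCTR.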

HugeCTR is a dedicated framework, written in CUDA C++, for training deep learning recommender systems. It is a recommender-specific framework capable of distributed training across multiple GPUs and nodes for Click-Through Rate (CTR) estimation. HugeCTR supports model-parallel embedding tables and data-parallel neural networks, including variants such as Wide and Deep Learning (WDL), Deep Cross Network (DCN), DeepFM, and the Deep Learning Recommendation Model (DLRM).
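The core computation HugeCTR accelerates can be sketched in a few lines of plain Python (illustrative only: in HugeCTR the embedding table is sharded across GPUs for model parallelism while the dense layers are replicated for data parallelism, and everything runs in CUDA C++):

```python
# Sketch of a CTR prediction: sparse feature IDs -> embedding
# lookup -> dense scoring -> click probability. All sizes and
# values here are made up for illustration.
import math
import random

random.seed(0)

EMBEDDING_DIM = 4
VOCAB_SIZE = 10

# Embedding table: one learned vector per categorical ID.
embedding_table = [
    [random.uniform(-0.1, 0.1) for _ in range(EMBEDDING_DIM)]
    for _ in range(VOCAB_SIZE)
]

# Dense "network": a single linear layer followed by a sigmoid.
weights = [random.uniform(-0.1, 0.1) for _ in range(2 * EMBEDDING_DIM)]

def predict_ctr(user_id, item_id):
    """Predict a click probability from two sparse feature IDs."""
    features = embedding_table[user_id] + embedding_table[item_id]
    logit = sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-logit))

p = predict_ctr(user_id=3, item_id=7)  # a probability in (0, 1)
```

At production scale the embedding table is by far the largest component (often hundreds of gigabytes), which is why sharding it across GPUs is HugeCTR's central design choice.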

Getting Started

Launch Merlin-Training Container

You can pull and launch the training container with the following command:

docker run --runtime=nvidia --rm -it -p 8888:8888 -p 8797:8787 -p 8796:8786 --ipc=host --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-training:22.03 /bin/bash

If you are running Docker version 19.03 or newer, replace --runtime=nvidia with --gpus all. When the run command completes, the container opens a shell; you are responsible for starting JupyterLab inside the container yourself. The prompt should look similar to the one below:

root@2efa5b50b909:

Install JupyterLab with conda or pip (see the JupyterLab Installation Guide):

pip install jupyterlab

Finally start the jupyter-lab server:

jupyter-lab --allow-root --ip='0.0.0.0' --NotebookApp.token=''

Now you can use any browser to access the JupyterLab server at port 8888 on the host (for example, localhost:8888). Once in the server, navigate to the /nvtabular/ directory to explore the code base or try out some of the examples. The container includes the codebase along with all of its dependencies, notably RAPIDS Dask-cuDF. The easiest way to get started is simply to launch the container above and explore the examples within.

Other NVIDIA Merlin containers

Merlin containers are available in the NVIDIA container repository at the following locations:

Table 1: Merlin Containers

Container name             | Container location                                                                  | Functionality
Merlin-training            | https://ngc.nvidia.com/catalog/containers/nvidia:merlin:merlin-training            | NVTabular and HugeCTR
Merlin-tensorflow-training | https://ngc.nvidia.com/catalog/containers/nvidia:merlin:merlin-tensorflow-training | NVTabular, TensorFlow and the TensorFlow Embedding plugin
Merlin-pytorch-training    | https://ngc.nvidia.com/catalog/containers/nvidia:merlin:merlin-pytorch-training    | NVTabular and PyTorch
Merlin-inference           | https://ngc.nvidia.com/catalog/containers/nvidia:merlin:merlin-inference           | NVTabular, HugeCTR and Triton Inference Server

Examples and Tutorials

We provide a collection of examples, use cases, and tutorials for NVTabular and HugeCTR as Jupyter notebooks in our repository. These Jupyter notebooks are based on the following datasets:

  • MovieLens
  • Outbrain Click Prediction
  • Criteo Click Ads Prediction
  • RecSys2020 Competition Hosted by Twitter
  • Rossmann Sales Prediction

With the example notebooks we cover the following:

  • Preprocessing and feature engineering with NVTabular
  • Advanced workflows with NVTabular
  • Accelerated dataloaders for TensorFlow and PyTorch
  • Scaling to multi-GPU and multi-node systems
  • Integrating NVTabular with HugeCTR
  • Deploying to inference with Triton

For more sample models and their end-to-end instructions for HugeCTR, visit https://github.com/NVIDIA/HugeCTR/tree/master/samples.

Learn More

If you are interested in learning more about how NVTabular works under the hood, we have API documentation that outlines in detail the specifics of the calls available within the library. The following are the suggested readings for those who want to learn more about HugeCTR.

HugeCTR User Guide: https://github.com/NVIDIA/HugeCTR/blob/master/docs/hugectr_user_guide.md

Questions and Answers: https://github.com/NVIDIA/HugeCTR/blob/master/docs/QAList.md

Sample models and their end-to-end instructions: https://github.com/NVIDIA/HugeCTR/tree/master/samples

NVIDIA Developer Site: https://developer.nvidia.com/nvidia-merlin#getstarted

NVIDIA Developer Blog: https://medium.com/nvidia-merlin

Contributing

If you wish to contribute to the Merlin library directly please see Contributing.md. We are particularly interested in contributions or feature requests for feature engineering or preprocessing operations that you have found helpful in your own workflows.

License

By pulling and using the container, you accept the terms and conditions of this End User License Agreement.