
ResNet v1.5 TensorFlow checkpoint (AMP)

Description
ResNet v1.5 TensorFlow checkpoint trained with AMP
Publisher
NVIDIA Deep Learning Examples
Latest Version
20.06.0
Modified
April 4, 2023
Size
301.79 MB

Model Overview

With a modified architecture and weight initialization, this version of ResNet50 achieves ~0.5% higher accuracy than the original.

Model Architecture

The ResNet50 v1.5 model is a modified version of the original ResNet50 v1 model.

The difference between v1 and v1.5 lies in the bottleneck blocks that require downsampling: v1 has stride = 2 in the first 1x1 convolution, whereas v1.5 has stride = 2 in the 3x3 convolution.

This change makes ResNet50 v1.5 slightly more accurate (~0.5% top-1) than v1, at a small throughput cost (~5% fewer images/sec).
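The stride placement above can be checked with the standard convolution output-size formula. The sketch below (pure Python; the function names are illustrative, not from the model's code) traces the spatial size through a downsampling bottleneck block for both variants and shows that each halves the feature map, e.g. 56 -> 28; the difference is only in which convolution does the halving.

```python
def conv_out(size, kernel, stride, pad):
    """Spatial output size of a 2D convolution (per dimension)."""
    return (size + 2 * pad - kernel) // stride + 1

def bottleneck_sizes(size, downsample_in_3x3):
    """Trace the spatial size through a downsampling bottleneck block.

    v1 puts stride 2 in the first 1x1 conv (downsample_in_3x3=False);
    v1.5 moves it to the 3x3 conv (downsample_in_3x3=True).
    """
    s1 = 1 if downsample_in_3x3 else 2
    s2 = 2 if downsample_in_3x3 else 1
    size = conv_out(size, kernel=1, stride=s1, pad=0)  # 1x1 reduce
    size = conv_out(size, kernel=3, stride=s2, pad=1)  # 3x3
    size = conv_out(size, kernel=1, stride=1, pad=0)   # 1x1 expand
    return size

# Both variants halve the feature map:
print(bottleneck_sizes(56, downsample_in_3x3=False))  # v1:   28
print(bottleneck_sizes(56, downsample_in_3x3=True))   # v1.5: 28
```

Intuitively, a 1x1 convolution with stride 2 simply discards three of every four input positions, while the padded 3x3 convolution with stride 2 still aggregates over neighborhoods that cover the whole input, which is one way to see why v1.5 loses less information at the downsampling step.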

The following performance optimizations were implemented in this model:

  • JIT graph compilation with XLA
  • Multi-GPU training with Horovod
  • Automatic mixed precision (AMP)
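A core ingredient of AMP is loss scaling: very small FP32 gradients underflow to zero when cast to FP16, so the loss is multiplied by a scale factor before the backward pass and the resulting gradients are unscaled in FP32 before the weight update. The NumPy sketch below illustrates only this arithmetic; it is not the actual AMP implementation, and the variable names are illustrative.

```python
import numpy as np

# A gradient value too small to represent in FP16
# (below the smallest FP16 subnormal, ~5.96e-8):
grad_fp32 = np.float32(1e-8)

# Naive FP16 cast: the value underflows to zero.
assert np.float16(grad_fp32) == 0.0

# With loss scaling: multiply by a scale factor first ...
loss_scale = np.float32(1024.0)
scaled = np.float16(grad_fp32 * loss_scale)  # now representable in FP16
assert scaled != 0.0

# ... then unscale in FP32 to recover (approximately) the true gradient.
recovered = np.float32(scaled) / loss_scale
print(recovered)  # ~1e-8
```

In practice AMP frameworks adjust the scale factor dynamically, backing off when scaled gradients overflow to infinity, but the underflow-avoidance idea is the same.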

This model is trained with mixed precision using Tensor Cores on Volta, Turing, and the NVIDIA Ampere GPU architectures. Therefore, researchers can get results 3x faster than training without Tensor Cores, while experiencing the benefits of mixed precision training. This model is tested against each NGC monthly container release to ensure consistent accuracy and performance over time.

Training

This model was trained using the script available on NGC and in the NVIDIA Deep Learning Examples GitHub repository.

Dataset

The following datasets were used to train this model:

  • ImageNet - an image database organized according to the WordNet hierarchy, in which each noun is depicted by hundreds to thousands of images.

Performance

Performance numbers for this model are available on NGC.

License

This model was trained using open-source software available in the Deep Learning Examples repository. For terms of use, please refer to the license of the training script and of the datasets from which the model was derived.