
NMT Zh En Transformer6x6

Description: Neural Machine Translation (NMT) model to translate from Chinese to English
Publisher: NVIDIA
Latest Version: 1.0.0rc1
Modified: April 4, 2023
Size: 860.75 MB

Model Overview

This model can be used to translate text in the source language, Chinese (Zh), into text in the target language, English (En).

Model Architecture

The model is based on the Transformer "Big" architecture originally presented in the "Attention Is All You Need" paper [1]. In this particular instance, the model has 6 layers in the encoder and 6 layers in the decoder. It uses the YouTokenToMe tokenizer [2].
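
For intuition only, the sketch below builds a plain PyTorch encoder-decoder with the same 6x6 layout and the usual Transformer "Big" dimensions (model size 1024, 16 attention heads, feed-forward size 4096). It is not the NeMo implementation of this model, just an illustration of the architecture's shape.

# Illustration only: a plain PyTorch encoder-decoder with the same 6x6 layout
# and the standard Transformer "Big" dimensions. Not the NeMo implementation.
import torch
import torch.nn as nn

transformer = nn.Transformer(
    d_model=1024,          # "Big" model dimension
    nhead=16,              # "Big" number of attention heads
    num_encoder_layers=6,  # 6 encoder layers, as in this model
    num_decoder_layers=6,  # 6 decoder layers, as in this model
    dim_feedforward=4096,  # "Big" feed-forward size
    dropout=0.1,
    batch_first=True,
)

# Dummy embedded inputs: (batch, src_len, d_model) and (batch, tgt_len, d_model)
src = torch.randn(2, 10, 1024)
tgt = torch.randn(2, 7, 1024)
out = transformer(src, tgt)  # shape (2, 7, 1024)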

Training

This model was trained on a collection of publicly available datasets comprising millions of parallel sentences. The NeMo toolkit [5] was used to train this model for roughly 300k steps.

Datasets

While training this model, we used the following datasets:

Tokenizer Construction

We used the YouTokenToMe tokenizer [2] with separate encoder and decoder BPE tokenizers.
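
As a rough sketch of how separate encoder- and decoder-side BPE models can be built with YouTokenToMe (the corpus file names and vocabulary sizes below are placeholders, not the settings used for this model):

# Sketch only: training and applying separate BPE models with YouTokenToMe.
# File names and vocab sizes are placeholders, not this model's settings.
import youtokentome as yttm

# Train one BPE model on the Chinese side and one on the English side.
yttm.BPE.train(data="train.zh", model="encoder_bpe.model", vocab_size=32000)
yttm.BPE.train(data="train.en", model="decoder_bpe.model", vocab_size=32000)

# Load and apply the encoder-side tokenizer.
encoder_bpe = yttm.BPE(model="encoder_bpe.model")
pieces = encoder_bpe.encode(["这是一个例子。"], output_type=yttm.OutputType.SUBWORD)
print(pieces)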

Performance

The accuracy of translation models is often measured using BLEU scores [3]. The model achieves the following sacreBLEU [4] scores on the WMT'18, WMT'19, and WMT'20 test sets:

WMT'18: 25.2
WMT'19: 25.1
WMT'20: 26.4
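
For reference, sacreBLEU scores like the ones above can be computed with the sacrebleu Python package. The hypothesis and reference strings in this minimal sketch are placeholders, not WMT test data.

# Minimal sketch: corpus-level sacreBLEU with the sacrebleu package.
# Hypotheses and references are placeholders, not WMT test sets.
import sacrebleu

hypotheses = ["The cat sat on the mat."]
references = [["The cat is sitting on the mat."]]  # one list per reference set

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}")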

How to Use this Model

Automatically load the model from NGC

# Load the pre-trained Zh->En model directly from NGC by name.
import nemo.collections.nlp as nemo_nlp

nmt_model = nemo_nlp.models.machine_translation.MTEncDecModel.from_pretrained(model_name="nmt_zh_en_transformer6x6")

Translating text with this model

python [NEMO_GIT_FOLDER]/examples/nlp/machine_translation/nmt_transformer_infer.py \
    --model=nmt_zh_en_transformer6x6.nemo \
    --srctext=[TEXT_IN_SRC_LANGUAGE] \
    --tgtout=[WHERE_TO_SAVE_TRANSLATIONS] \
    --source_lang zh \
    --target_lang en

Input

The translate method of the NMT model accepts a list of de-tokenized strings.

Output

The translate method outputs a list of de-tokenized strings in the target language.
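
For example, a minimal sketch of calling translate on the loaded model (the input sentence is a placeholder, and keyword argument names may vary across NeMo versions):

# Sketch: translating a list of de-tokenized Chinese strings to English.
# The example sentence is a placeholder; argument names may vary by NeMo version.
translations = nmt_model.translate(
    ["我喜欢机器翻译。"],
    source_lang="zh",
    target_lang="en",
)
print(translations)  # e.g. ["I like machine translation."]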

Limitations

No known limitations at this time.

References

[1] Vaswani, Ashish, et al. "Attention is all you need." arXiv preprint arXiv:1706.03762 (2017).

[2] YouTokenToMe: https://github.com/VKCOM/YouTokenToMe

[3] BLEU: https://en.wikipedia.org/wiki/BLEU

[4] sacreBLEU: https://github.com/mjpost/sacreBLEU

[5] NVIDIA NeMo Toolkit: https://github.com/NVIDIA/NeMo

License

The license to use this model is covered by the NGC TERMS OF USE unless another License/Terms of Use/EULA is clearly specified. By downloading the public and release version of the model, you accept the terms and conditions of the NGC TERMS OF USE.