
Megatron-BERT 345M

Description: 345M parameter BERT Megatron model
Latest Version: v0.1_cased
Modified: April 4, 2023
Size: 638.5 MB

Megatron-LM BERT 345M

Megatron is a large, powerful transformer. For this particular Megatron model, we trained a bidirectional transformer in the style of BERT. The model has 345 million parameters, arranged as 24 transformer layers with 16 attention heads and a hidden size of 1024.
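As a rough sanity check on that figure, the sketch below estimates the parameter count from those hyperparameters alone. The vocabulary size (~30k WordPiece tokens) and maximum sequence length (512) are assumptions taken from standard BERT configurations, not values stated on this page; the head count does not affect the total, since the heads partition the hidden dimension.

```python
# Back-of-the-envelope parameter count for a BERT-style encoder.
# vocab (~30k WordPiece tokens) and max_seq (512) are assumed standard
# BERT values; they are not stated on this page.

def bert_param_count(layers=24, hidden=1024, vocab=30522, max_seq=512):
    # Token, position, and segment embeddings.
    embeddings = vocab * hidden + max_seq * hidden + 2 * hidden
    per_layer = (
        4 * hidden * hidden + 4 * hidden    # Q, K, V, and output projections
        + 8 * hidden * hidden + 5 * hidden  # MLP: hidden -> 4*hidden -> hidden
        + 4 * hidden                        # two LayerNorms (scale + bias)
    )
    return embeddings + layers * per_layer

print(f"~{bert_param_count() / 1e6:.0f}M parameters")  # prints ~334M
```

This lands close to the advertised 345M; the gap comes from pieces the estimate ignores, such as the pooler, the pretraining heads, and the exact vocabulary size.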

This model was trained on text sourced from Wikipedia, RealNews, OpenWebText, and CC-Stories. We offer versions of this model pretrained with both cased and uncased vocabularies.
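The practical difference between the two versions is whether input text is lowercased before WordPiece tokenization. A minimal illustration, using the standard BERT vocabularies from Hugging Face as stand-ins (an assumption; the Megatron checkpoints ship with their own equivalent cased/uncased vocab files):

```python
from transformers import BertTokenizer

# "bert-large-cased" / "bert-large-uncased" are stand-in vocabularies,
# assumed here to behave like the cased/uncased vocab files shipped
# with the Megatron checkpoints.
cased = BertTokenizer.from_pretrained("bert-large-cased")
uncased = BertTokenizer.from_pretrained("bert-large-uncased")

text = "NVIDIA trained Megatron on Wikipedia."
print(cased.tokenize(text))    # case is preserved in the token stream
print(uncased.tokenize(text))  # input is lowercased before tokenizing
```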

Find more information at our repo: https://github.com/NVIDIA/Megatron-LM
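The checkpoint can also be pulled directly from NGC. A minimal sketch in Python; the URL pattern below follows NGC's public model-download API and the version name listed above (v0.1_cased), but treat the exact path as an assumption and confirm it on the model page or in the Megatron-LM README:

```python
import urllib.request

# Assumed NGC download URL for the cased checkpoint; verify against
# the NGC page or the Megatron-LM README before relying on it.
url = ("https://api.ngc.nvidia.com/v2/models/nvidia/megatron_bert_345m/"
       "versions/v0.1_cased/zip")
urllib.request.urlretrieve(url, "megatron_bert_345m_v0.1_cased.zip")
```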