
# Adaptive Input Representations for Neural Language Modeling (Baevski and Auli, 2018)

## Pre-trained models

| Description | Parameters | Dataset | Model and Test set(s) |
| --- | --- | --- | --- |
| Adaptive Inputs (Baevski and Auli, 2018) | 1026M | Google Billion Words | download (.tar.bz2) |
| Adaptive Inputs (Baevski and Auli, 2018) | 247M | WikiText-103 | download (.tar.bz2) |

## Example usage

See the language modeling README for instructions on reproducing results for WikiText-103 using the transformer_lm_wiki103 model architecture.
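As a rough sketch of the evaluation workflow (the archive name, extracted layout, and checkpoint filename below are assumptions, not verified against the released tarball), unpacking the WikiText-103 model above and scoring the test set with `fairseq-eval-lm` might look like:

```shell
# Sketch only: archive name, directory layout, and checkpoint path are assumed.
tar -xjf adaptive_lm_wiki103.tar.bz2

# Evaluate perplexity on a binarized (fairseq-preprocess) copy of WikiText-103.
fairseq-eval-lm data-bin/wikitext-103 \
    --path adaptive_lm_wiki103/model.pt \
    --max-tokens 3072 \
    --context-window 2560 \
    --sample-break-mode complete
```

A larger `--context-window` gives each scored token more preceding context and therefore lower (better) perplexity, at the cost of slower evaluation.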

## Citation

```bibtex
@inproceedings{baevski2018adaptive,
  title={Adaptive Input Representations for Neural Language Modeling},
  author={Alexei Baevski and Michael Auli},
  booktitle={International Conference on Learning Representations},
  year={2019},
  url={https://openreview.net/forum?id=ByxZX20qFQ},
}
```