RESEARCH

NLP

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

July 8, 2020

Abstract

We present BART, a denoising autoencoder for pretraining sequence-to-sequence models. BART is trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text. It uses a standard Transformer-based neural machine translation architecture which, despite its simplicity, can be seen as generalizing BERT (due to the bidirectional encoder), GPT (with the left-to-right decoder), and other recent pretraining schemes. We evaluate a number of noising approaches, finding the best performance by both randomly shuffling the order of sentences and using a novel in-filling scheme, where spans of text are replaced with a single mask token. BART is particularly effective when fine-tuned for text generation but also works well for comprehension tasks. It matches the performance of RoBERTa on GLUE and SQuAD, and achieves new state-of-the-art results on a range of abstractive dialogue, question answering, and summarization tasks, with gains of up to 3.5 ROUGE. BART also provides a 1.1 BLEU increase over a back-translation system for machine translation, with only target language pretraining. We also replicate other pretraining schemes within the BART framework, to understand their effect on end-task performance.
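The two corruptions the abstract singles out are straightforward to state concretely. The sketch below is an illustrative approximation in plain Python, not the released implementation: sentence permutation shuffles the order of sentences, and text infilling replaces sampled spans of tokens with a single mask token. The Poisson(λ = 3) span lengths and roughly 30% masking budget follow the paper; the whitespace tokenization, the `<mask>` string, and the 0.2 probability of starting a span at each position are simplifying assumptions made here for illustration.

```python
import random
import numpy as np

MASK = "<mask>"  # placeholder string; the real model uses a tokenizer-specific mask token

def permute_sentences(sentences):
    """Sentence permutation: shuffle the order of sentences in a document."""
    shuffled = list(sentences)
    random.shuffle(shuffled)
    return shuffled

def infill_spans(tokens, mask_ratio=0.3, poisson_lambda=3.0):
    """Text infilling: replace contiguous spans of tokens with a single <mask>.

    Span lengths are sampled from a Poisson distribution; a length-zero span
    inserts a <mask> without removing any tokens.
    """
    tokens = list(tokens)
    budget = int(round(mask_ratio * len(tokens)))   # max tokens we may remove
    removed, i, out = 0, 0, []
    while i < len(tokens):
        if removed < budget and random.random() < 0.2:   # assumed span-start probability
            span = min(int(np.random.poisson(poisson_lambda)),
                       budget - removed, len(tokens) - i)
            out.append(MASK)        # one mask token stands in for the whole span
            i += span
            removed += span
        else:
            out.append(tokens[i])
            i += 1
    return out

if __name__ == "__main__":
    doc = ["the quick brown fox jumps over the lazy dog .",
           "it barely notices .",
           "the fox trots away ."]
    noisy = infill_spans(" ".join(permute_sentences(doc)).split())
    print(" ".join(noisy))
```

During pretraining, text corrupted in this way is fed to the bidirectional encoder, and the left-to-right decoder is trained to reconstruct the original, uncorrupted text.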

AUTHORS

Mike Lewis

Yinhan Liu

Naman Goyal

Marjan Ghazvininejad

Abdelrahman Mohamed

Omer Levy

Ves Stoyanov

Luke Zettlemoyer

Publisher

Association for Computational Linguistics (ACL)

Related Publications

August 01, 2019

NLP

Simple and Effective Curriculum Pointer-Generator Networks for Reading Comprehension over Long Narratives

Yi Tay, Shuohang Wang, Luu Anh Tuan, Jie Fu, Minh C. Phan, Xingdi Yuan, Jinfeng Rao, Siu Cheung Hui, Aston Zhang

July 27, 2019

NLP

Unsupervised Question Answering by Cloze Translation

Patrick Lewis, Ludovic Denoyer, Sebastian Riedel

September 10, 2019

NLP

Bridging the Gap Between Relevance Matching and Semantic Matching for Short Text Similarity Modeling

Jinfeng Rao, Linqing Liu, Yi Tay, Wei Yang, Peng Shi, Jimmy Lin

May 17, 2019

NLP

Unsupervised Hyper-alignment for Multilingual Word Embeddings

Jean Alaux, Edouard Grave, Marco Cuturi, Armand Joulin
