NLP

ML APPLICATIONS

Emerging Cross-lingual Structure in Pretrained Language Models

July 09, 2020

Abstract

We study the problem of multilingual masked language modeling, i.e., training a single model on concatenated text from multiple languages, and present a detailed study of several factors that influence why these models are so effective for cross-lingual transfer. We show, contrary to what was previously hypothesized, that transfer is possible even when there is no shared vocabulary across the monolingual corpora, and also when the text comes from very different domains. The only requirement is that there are some shared parameters in the top layers of the multilingual encoder. To better understand this result, we also show that representations from independently trained models in different languages can be aligned post-hoc quite effectively, strongly suggesting that, much like for non-contextual word embeddings, there are universal latent symmetries in the learned embedding spaces. For multilingual masked language modeling, these symmetries seem to be automatically discovered and aligned during the joint training process.

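The post-hoc alignment finding mirrors the orthogonal-mapping recipe long used for non-contextual word embeddings. As a rough illustration only (not the paper's released code), the sketch below aligns two independently trained embedding spaces with orthogonal Procrustes, assuming a small set of row-aligned anchor representations is available; the function name, toy dimensions, and synthetic data are all hypothetical.

# Minimal sketch: align two independently trained embedding spaces with
# orthogonal Procrustes, as is commonly done for non-contextual word
# embeddings. Names and toy data are illustrative, not from the paper.
import numpy as np

def learn_orthogonal_map(src, tgt):
    """Return the orthogonal matrix W minimizing ||src @ W - tgt||_F,
    given row-aligned anchor representations from the two spaces
    (e.g. embeddings of a small bilingual dictionary)."""
    # Closed-form solution: W = U @ Vt, where U, S, Vt = SVD(src.T @ tgt).
    u, _, vt = np.linalg.svd(src.T @ tgt)
    return u @ vt

# Toy usage: the "source" space is a rotated copy of the "target" space,
# so a perfect alignment exists and Procrustes should recover it.
rng = np.random.default_rng(0)
dim = 8
tgt = rng.normal(size=(100, dim))           # anchor vectors in space B
rotation, _ = np.linalg.qr(rng.normal(size=(dim, dim)))
src = tgt @ rotation.T                      # same anchors, rotated (space A)
W = learn_orthogonal_map(src, tgt)
print(np.allclose(src @ W, tgt))            # True: spaces align post-hoc

In the joint multilingual masked language modeling setting described above, no such explicit mapping step is needed; the sketch only makes the "post-hoc alignment" baseline concrete.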

AUTHORS

Publisher

ACL

Related Publications

December 06, 2021

NLP

Pay Better Attention to Attention: Head Selection in Multilingual and Multi-Domain Sequence Modeling

Hongyu Gong, Yun Tang, Juan Miguel Pino, Xian Li

November 16, 2021

NLP

Can Transformers Jump Around Right in Natural Language? Assessing Performance Transfer from SCAN

Rahma Chaabouni, Roberto Dessì, Evgeny Kharitonov

November 08, 2021

NLP

CORE MACHINE LEARNING

DOBF: A Deobfuscation Pre-Training Objective for Programming Languages

Baptiste Rozière, Marie-Anne Lachaux, Marc Szafraniec, Guillaume Lample

October 29, 2021

ML APPLICATIONS

Antipodes of Label Differential Privacy: PATE and ALIBI

Mani Malek, Ilya Mironov, Karthik Prasad, Igor Shilov, Florian Tramer
