RESEARCH

NLP

A Tale of a Probe and a Parser

June 19, 2020

Abstract

Measuring what linguistic information is encoded in neural models of language has become popular in NLP. Researchers approach this enterprise by training "probes": supervised models designed to extract linguistic structure from another model's output. One such probe is the structural probe (Hewitt & Manning, 2019), designed to quantify the extent to which syntactic information is encoded in contextualised word representations. The structural probe has a novel design, unattested in the parsing literature, whose precise benefit is not immediately obvious. To explore whether syntactic probes would do better to make use of existing techniques, we compare the structural probe to a more traditional parser with an identical lightweight parameterisation. The parser outperforms the structural probe on UUAS (undirected unlabelled attachment score) in seven of nine analysed languages, often by a substantial amount (e.g. by 11.1 points in English). Under a second, less common metric, however, the trend is reversed: the structural probe outperforms the parser. This raises the question: which metric should we prefer?
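At its core, the structural probe learns a single linear transformation B and treats the squared L2 distance between two transformed word vectors as a prediction of the distance between those words in the syntactic tree. The sketch below illustrates this idea in PyTorch; it is a minimal illustration of the technique, not the authors' released code, and the names used (h for contextual embeddings, gold for gold tree distances) are hypothetical stand-ins.

    import torch

    class StructuralProbe(torch.nn.Module):
        # Learns a linear map B; the predicted tree distance between words
        # i and j is the squared L2 norm of B applied to (h_i - h_j).
        def __init__(self, dim: int, rank: int):
            super().__init__()
            self.B = torch.nn.Parameter(0.01 * torch.randn(dim, rank))

        def forward(self, h: torch.Tensor) -> torch.Tensor:
            # h: (n, dim) contextual embeddings for one sentence
            t = h @ self.B                          # (n, rank)
            diff = t.unsqueeze(1) - t.unsqueeze(0)  # (n, n, rank) pairwise differences
            return (diff ** 2).sum(dim=-1)          # (n, n) predicted squared distances

    def probe_loss(pred: torch.Tensor, gold: torch.Tensor) -> torch.Tensor:
        # L1 difference between predicted and gold tree distances,
        # normalised by the squared sentence length.
        n = pred.size(0)
        return (pred - gold).abs().sum() / (n * n)

    # Hypothetical usage: a 10-word sentence with 768-dimensional embeddings.
    probe = StructuralProbe(dim=768, rank=64)
    h = torch.randn(10, 768)                        # stand-in for BERT-style embeddings
    gold = torch.randint(1, 10, (10, 10)).float()   # stand-in for gold tree distances
    loss = probe_loss(probe(h), gold)
    loss.backward()

At evaluation time, an undirected parse is read off by taking a minimum spanning tree over the predicted pairwise distances, and UUAS scores that tree against the gold one; this is the metric under which the traditional parser wins in seven of nine languages.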


AUTHORS

Written by

Adina Williams

Josef Valvoda

Rowan Hall Maudslay

Ryan Cotterell

Tiago Pimentel

Publisher

ACL

Related Publications

December 15, 2021

RESEARCH

Sample-and-threshold differential privacy: Histograms and applications

Akash Bharadwaj, Graham Cormode

December 06, 2021

NLP

Pay Better Attention to Attention: Head Selection in Multilingual and Multi-Domain Sequence Modeling

Hongyu Gong, Yun Tang, Juan Miguel Pino, Xian Li

November 16, 2021

NLP

Can Transformers Jump Around Right in Natural Language? Assessing Performance Transfer from SCAN

Rahma Chaabouni, Roberto Dessì, Evgeny Kharitonov

November 08, 2021

NLP

CORE MACHINE LEARNING

DOBF: A Deobfuscation Pre-Training Objective for Programming Languages

Baptiste Rozière, Marie-Anne Lachaux, Marc Szafraniec, Guillaume Lample
