RESEARCH

NLP

Using Local Knowledge Graph Construction to Scale Seq2Seq Models to Multi-Document Inputs

October 17, 2019

Abstract

Query-based open-domain NLP tasks require information synthesis from long and diverse web results. Current approaches extractively select portions of web text as input to Sequence-to-Sequence models, using methods such as TF-IDF ranking. We propose constructing a local graph-structured knowledge base for each query, which compresses the web search information and reduces redundancy. We show that by linearizing the graph into a structured input sequence, models can encode the graph representations within a standard Sequence-to-Sequence setting. For two generative tasks with very long text input, long-form question answering and multi-document summarization, feeding graph representations as input can achieve better performance than using retrieved text portions.
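To make the linearization step concrete, the sketch below shows one way a set of extracted (subject, relation, object) triples might be merged into a small local graph and flattened into a single encoder input. This is a minimal illustration under assumed conventions: the linearize_graph function and the <sub>, <rel>, and <obj> delimiter tokens are hypothetical placeholders, not the paper's exact scheme.

# A minimal sketch of graph linearization for a Seq2Seq encoder.
# Assumptions: triples come from an upstream extraction step over web
# results; the <sub>/<rel>/<obj> delimiters are illustrative
# placeholders, not the paper's actual special-token vocabulary.
def linearize_graph(triples):
    # Group outgoing edges by subject so each entity is emitted once,
    # compressing repeated mentions gathered across web documents.
    graph = {}
    for subj, rel, obj in triples:
        graph.setdefault(subj, []).append((rel, obj))

    tokens = []
    for subj, edges in graph.items():
        tokens += ["<sub>", subj]
        for rel, obj in edges:
            tokens += ["<rel>", rel, "<obj>", obj]
    return " ".join(tokens)

# Example: two triples share a subject, so "aspirin" appears once.
triples = [
    ("aspirin", "treats", "headache"),
    ("aspirin", "is a", "anti-inflammatory drug"),
    ("ibuprofen", "treats", "headache"),
]
print(linearize_graph(triples))
# -> <sub> aspirin <rel> treats <obj> headache <rel> is a
#    <obj> anti-inflammatory drug <sub> ibuprofen <rel> treats <obj> headache

Grouping edges by subject is what removes the redundancy of repeated entity mentions across retrieved documents; the resulting flattened string can then be consumed by a standard Sequence-to-Sequence encoder like any other input sequence.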

AUTHORS

Angela Fan

Antoine Bordes

Chloe Braud

Claire Gardent

Publisher

EMNLP

Related Publications

December 15, 2021

RESEARCH

Sample-and-threshold differential privacy: Histograms and applications

Akash Bharadwaj, Graham Cormode

December 06, 2021

NLP

Pay Better Attention to Attention: Head Selection in Multilingual and Multi-Domain Sequence Modeling

Hongyu Gong, Yun Tang, Juan Miguel Pino, Xian Li

November 16, 2021

NLP

Can Transformers Jump Around Right in Natural Language? Assessing Performance Transfer from SCAN

Rahma Chaabouni, Roberto Dessì, Evgeny Kharitonov

November 08, 2021

NLP

CORE MACHINE LEARNING

DOBF: A Deobfuscation Pre-Training Objective for Programming Languages

Baptiste Rozière, Marie-Anne Lachaux, Marc Szafraniec, Guillaume Lample
