ML APPLICATIONS

Neural Relational Autoregression for High-Resolution COVID-19 Forecasting

October 1, 2020

Abstract

Forecasting COVID-19 poses unique challenges due to the novelty of the disease, its unknown characteristics, and substantial but varying interventions to reduce its spread. To improve the quality and robustness of forecasts, we propose a new method which aims to disentangle region-specific factors -- such as demographics, enacted policies, and mobility -- from disease-inherent factors that influence its spread. For this purpose, we combine recurrent neural networks with a vector autoregressive model and train the joint model with a specific regularization scheme that increases the coupling between regions. This approach is akin to using Granger causality as a relational inductive bias and allows us to train high-resolution models by borrowing statistical strength across regions. In our experiments, we observe that our method achieves strong performance in predicting the spread of COVID-19 when compared to state-of-the-art forecasts.
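
To make the high-level description above more concrete, the following is a minimal PyTorch sketch of one way such a model could be structured; it is not the authors' implementation. The class name NeuralRelationalAR, the choice of a GRU with a linear head, and the diagonal-penalty regularizer are all assumptions made for illustration. The only elements taken from the abstract are the combination of a recurrent, region-specific component with a vector-autoregressive coupling matrix across regions, trained with a regularizer intended to increase cross-region coupling.

import torch
import torch.nn as nn

class NeuralRelationalAR(nn.Module):
    """Hypothetical sketch: an RNN for region-specific dynamics plus a VAR coupling term."""

    def __init__(self, num_regions: int, hidden_dim: int = 32):
        super().__init__()
        # Recurrent component capturing region-specific temporal dynamics
        # (weights shared across regions; each region is treated as one sequence).
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)
        # Vector-autoregressive coupling matrix between regions.
        self.coupling = nn.Parameter(torch.zeros(num_regions, num_regions))

    def forward(self, cases: torch.Tensor) -> torch.Tensor:
        # cases: (num_regions, seq_len) observed counts per region.
        h, _ = self.rnn(cases.unsqueeze(-1))      # (R, T, H)
        local = self.head(h).squeeze(-1)          # region-specific term, (R, T)
        cross = self.coupling @ cases             # lag-1 cross-region VAR term, (R, T)
        return local + cross                      # prediction for the next time step


def loss_fn(model: NeuralRelationalAR, cases: torch.Tensor, lam: float = 1e-3) -> torch.Tensor:
    pred = model(cases[:, :-1])
    mse = ((pred - cases[:, 1:]) ** 2).mean()
    # One possible regularizer in the spirit described above: penalize the
    # self (diagonal) coefficients so that predictions lean more heavily on
    # other regions, i.e. the off-diagonal, Granger-causal-style couplings.
    reg = torch.diagonal(model.coupling).pow(2).sum()
    return mse + lam * reg

Penalizing only the self (diagonal) coefficients is just one plausible reading of "increasing the coupling between regions"; the paper itself should be consulted for the actual regularization scheme.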

AUTHORS

Matthew Le

Mark Ibrahim

Levent Sagun

Timothee Lacroix

Maximilian Nickel

Publisher

Facebook AI

