Scott Wen-tau Yih

Scott is a Research Scientist at Facebook AI Research (FAIR). His general research interests include natural language processing, machine learning, and information retrieval. He has worked on a variety of problems over the years, including information extraction, semantic role labeling, email spam filtering, keyword extraction, and search & ad relevance. His recent work focuses on continuous representations and neural network models, with applications in knowledge base embedding, semantic parsing, and question answering.

Scott's Publications

September 02, 2020

NLP

RESEARCH

TaBERT: Pretraining for Joint Understanding of Textual and Tabular Data

Recent years have witnessed the burgeoning of pretrained language models (LMs) for text-based natural language (NL) understanding tasks. Such models are typically trained…

Pengcheng Yin, Graham Neubig, Wen-tau Yih, Sebastian Riedel

September 02, 2020

NLP

Language Models as Fact Checkers?

Recent work has revealed that language models (LM) store both common-sense and factual information from data they were pretrained on. In this paper, we explore whether language models can be used as an effective…

Nayeon Lee, Belinda Z. Li, Sinong Wang, Wen-tau Yih, Hao Ma, Madian Khabsa