CORE MACHINE LEARNING

Marginalized Stochastic Natural Gradients for Black-Box Variational Inference

July 18, 2021

Abstract

Black-box variational inference algorithms use stochastic sampling to analyze diverse statistical models, like those expressed in probabilistic programming languages, without model-specific derivations. While the popular score-function estimator computes unbiased gradient estimates, its variance is often unacceptably large, especially in models with discrete latent variables. We propose a stochastic natural gradient estimator that is as broadly applicable and unbiased, but improves efficiency by exploiting the curvature of the variational bound, and provably reduces variance by marginalizing discrete latent variables. Our marginalized stochastic natural gradients have intriguing connections to classic coordinate ascent variational inference, but allow parallel updates of variational parameters, and provide superior convergence guarantees relative to naive Monte Carlo approximations. We integrate our method with the probabilistic programming language Pyro and evaluate it on real-world models of documents, images, networks, and crowd-sourcing. Compared to score-function estimators, we require far fewer Monte Carlo samples and consistently converge orders of magnitude faster.
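The variance gap described above can be illustrated on a toy problem. The sketch below (a hypothetical example, not the paper's estimator or its Pyro integration) compares the score-function (REINFORCE) gradient of an expectation over a single Bernoulli latent variable against the exact gradient obtained by marginalizing that discrete latent: both are unbiased, but the marginalized gradient has zero variance. All names (`theta`, `f`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: one Bernoulli latent z with variational parameter theta,
# q(z=1) = sigmoid(theta), and an arbitrary objective f(z) standing in
# for the integrand of a variational bound.
theta = 0.3
f = np.array([1.0, 4.0])  # f(z=0), f(z=1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

p1 = sigmoid(theta)  # q(z=1)

# Score-function (REINFORCE) estimator: g = f(z) * d/dtheta log q(z).
# For a Bernoulli with a sigmoid link, d/dtheta log q(z) = z - p1.
z = (rng.random(100_000) < p1).astype(int)
score_grads = f[z] * (z - p1)

# Marginalized gradient: sum_z f(z) * d q(z)/dtheta, computed exactly.
# Since d p1/dtheta = p1 * (1 - p1), dq/dtheta = [-p1(1-p1), p1(1-p1)].
dq = np.array([-p1 * (1 - p1), p1 * (1 - p1)])
exact_grad = float(f @ dq)

print("score-function mean:", score_grads.mean())  # matches exact_grad in expectation
print("score-function std: ", score_grads.std())   # strictly positive
print("marginalized (exact):", exact_grad)         # zero-variance, no sampling of z
```

Averaged over many samples, the score-function estimate agrees with the marginalized gradient, but each individual sample is noisy; marginalization removes that noise entirely for the discrete latent, which is the variance-reduction mechanism the abstract refers to.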


AUTHORS

Written by

Geng Ji

Debora Sujono

Erik B. Sudderth

Publisher

ICML 2021

Research Topics

Core Machine Learning

Related Publications

November 03, 2020

CORE MACHINE LEARNING

Robust Embedded Deep K-means Clustering

Deep neural network clustering is superior to the conventional clustering methods due to deep feature extraction and nonlinear dimensionality reduction.…

Rui Zhang, Hanghang Tong, Yinglong Xia, Yada Zhu

December 07, 2020

CORE MACHINE LEARNING

Adversarial Example Games

The existence of adversarial examples capable of fooling trained neural network classifiers calls for a much better understanding of possible attacks to guide the development…

Avishek Joey Bose, Gauthier Gidel, Andre Cianflone, Pascal Vincent, Simon Lacoste-Julien, William L. Hamilton
