Fluctuation-dissipation relations for stochastic gradient descent

May 03, 2019

Abstract

The notion of the stationary equilibrium ensemble has played a central role in statistical mechanics. In machine learning as well, training serves as a generalized equilibration that drives the probability distribution of model parameters toward stationarity. Here, we derive stationary fluctuation-dissipation relations that link measurable quantities and hyperparameters in the stochastic gradient descent algorithm. These relations hold exactly for any stationary state and can in particular be used to adaptively set the training schedule. We can further use the relations to efficiently extract information pertaining to the loss-function landscape, such as the magnitudes of its Hessian and anharmonicity. Our claims are empirically verified.

AUTHORS

Sho Yaida

Publisher

ICLR

Research Topics

Theory
