The Power of Factorial Powers: New Parameter Settings for (Stochastic) Optimization

October 14, 2021

Abstract

The convergence rates for convex and non-convex optimization methods depend on the choice of a host of constants, including step sizes, Lyapunov function constants, and momentum constants. In this work we propose the use of factorial powers as a flexible tool for defining constants that appear in convergence proofs. We list a number of remarkable properties that these sequences enjoy, and show how they can be applied to convergence proofs to simplify or improve the convergence rates of the momentum method, the accelerated gradient method, and the stochastic variance reduced gradient method (SVRG).
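For readers unfamiliar with the term, "factorial powers" commonly refers to sequences such as the rising factorial power k^{(r)} = k(k+1)...(k+r-1); the exact conventions and the specific properties used in the paper may differ. The following minimal Python sketch, assuming this standard definition, illustrates one example of the kind of clean finite-difference identity that makes such sequences convenient when constants in a proof must telescope across iterations.

# A minimal sketch (not code from the paper): the rising factorial power,
# one standard convention for "factorial powers", defined as
#   k^{(r)} = k * (k + 1) * ... * (k + r - 1), with k^{(0)} = 1.
def rising_factorial(k: int, r: int) -> int:
    result = 1
    for i in range(r):
        result *= k + i
    return result

# One convenient (standard) property: the forward difference satisfies
#   (k + 1)^{(r)} - k^{(r)} = r * (k + 1)^{(r - 1)},
# the kind of identity that lets per-iteration bounds telescope when summed.
for k in range(1, 6):
    lhs = rising_factorial(k + 1, 3) - rising_factorial(k, 3)
    rhs = 3 * rising_factorial(k + 1, 2)
    assert lhs == rhs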

AUTHORS

Aaron Defazio

Robert Gower

Publisher

ACML

Research Topics

Theory
