
Learning-Rate-Free Learning by D-Adaptation

June 13, 2023

Abstract

D-Adaptation is an approach to automatically setting the learning rate that asymptotically achieves the optimal rate of convergence for minimizing convex Lipschitz functions, with no back-tracking or line searches and no additional function value or gradient evaluations per step. Our approach is the first hyper-parameter-free method for this class whose convergence rate carries no additional multiplicative log factors. We present extensive experiments for SGD and Adam variants of our method, in which it automatically matches hand-tuned learning rates across more than a dozen diverse machine learning problems, including large-scale vision and language problems. An open-source implementation is available.
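The released implementation follows the standard PyTorch optimizer interface, so adopting the method amounts to swapping out the optimizer and leaving the learning rate at its default. Below is a minimal sketch, assuming the open-source dadaptation package exposes a PyTorch-style DAdaptAdam class whose lr argument acts as a multiplier on the internally estimated step size; the model and training loop are illustrative.

# Minimal sketch: training with a D-Adaptation optimizer in PyTorch.
# Assumes `pip install dadaptation` provides a DAdaptAdam class with
# the usual torch.optim constructor signature (an assumption, not a
# guaranteed API).
import torch
import dadaptation

model = torch.nn.Linear(10, 1)
loss_fn = torch.nn.MSELoss()

# lr=1.0 leaves the automatically estimated step size unscaled,
# so no learning-rate tuning is performed here.
optimizer = dadaptation.DAdaptAdam(model.parameters(), lr=1.0)

for step in range(100):
    x = torch.randn(32, 10)  # synthetic batch of inputs
    y = torch.randn(32, 1)   # synthetic regression targets
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()         # step size adapted internally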


AUTHORS


Aaron Defazio

Konstantin Mishchenko

Publisher

ICML

Research Topics

Core Machine Learning

Related Publications

May 07, 2024

CORE MACHINE LEARNING

ReTaSA: A Nonparametric Functional Estimation Approach for Addressing Continuous Target Shift

Hwanwoo Kim, Xin Zhang, Jiwei Zhao, Qinglong Tian


April 04, 2024

CORE MACHINE LEARNING

DP-RDM: Adapting Diffusion Models to Private Domains Without Fine-Tuning

Jonathan Lebensold, Maziar Sanjabi, Pietro Astolfi, Adriana Romero Soriano, Kamalika Chaudhuri, Mike Rabbat, Chuan Guo


March 28, 2024

THEORY

CORE MACHINE LEARNING

On the Identifiability of Quantized Factors

Vitoria Barin Pacela, Kartik Ahuja, Simon Lacoste-Julien, Pascal Vincent


March 13, 2024

CORE MACHINE LEARNING

GaLore: Memory-Efficient LLM Training by Gradient Low-Rank Projection

Jiawei Zhao, Zhenyu Zhang, Beidi Chen, Zhangyang Wang, Anima Anandkumar, Yuandong Tian

