Closing the convergence gap of SGD without replacement

Jul 12, 2020

About

Stochastic gradient descent without replacement sampling is widely used in practice for model training. However, the vast majority of SGD analyses assume data sampled with replacement, and when the function minimized is strongly convex, an 𝒪(1/T) rate can be established when SGD is run for T iterations. A recent line of breakthrough work on SGD without replacement (SGDo) established an 𝒪(n/T^2) convergence rate when the function minimized is strongly convex and is a sum of n smooth functions, and an 𝒪(1/T^2 + n^3/T^3) rate for sums of quadratics. On the other hand, the tightest known lower bound postulates an Ω(1/T^2 + n^2/T^3) rate, leaving open the possibility of better SGDo convergence rates in the general case. In this paper, we close this gap and show that SGD without replacement achieves a rate of 𝒪(1/T^2 + n^2/T^3) when the sum of the functions is a quadratic, and offer a new lower bound of Ω(n/T^2) for strongly convex functions that are sums of smooth functions.
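For readers unfamiliar with the setting, SGD without replacement (also called random reshuffling) processes the n components in a fresh uniformly random order each epoch, rather than sampling an index independently at every step. Below is a minimal NumPy sketch of that sampling scheme on a least-squares objective, whose sum is a quadratic and thus matches one of the paper's regimes; the function name, step size, and toy problem are illustrative choices, not taken from the paper.

```python
import numpy as np

def sgd_without_replacement(A, b, x0, lr, epochs, seed=0):
    """SGD without replacement (random reshuffling) on the quadratic
    objective f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2.

    Each epoch visits every component exactly once in a fresh random
    order, so T = epochs * n total iterations are performed.
    All names and hyperparameters here are illustrative.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = x0.copy()
    for _ in range(epochs):
        for i in rng.permutation(n):           # sample without replacement
            grad_i = (A[i] @ x - b[i]) * A[i]  # gradient of component i
            x -= lr * grad_i
    return x

# Toy consistent least-squares instance: n smooth components whose sum
# is a strongly convex quadratic (with high probability for n >> d).
rng = np.random.default_rng(1)
n, d = 50, 5
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star
x_hat = sgd_without_replacement(A, b, np.zeros(d), lr=0.01, epochs=200)
print(np.linalg.norm(x_hat - x_star))  # distance to the minimizer
```

The only difference from with-replacement SGD is the `rng.permutation(n)` loop: it is this per-epoch shuffling that the paper's 𝒪(1/T^2 + n^2/T^3) upper bound and Ω(n/T^2) lower bound characterize.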

About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning, a field used in closely related areas such as artificial intelligence, statistics, and data science, as well as in important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest-growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers to entrepreneurs and engineers to graduate students and postdocs.
