Anderson acceleration of coordinate descent

Apr 14, 2021

About

Acceleration of first-order methods is mainly obtained via inertial techniques à la Nesterov, or via nonlinear extrapolation. The latter has seen a recent surge of interest, with successful applications to gradient and proximal gradient techniques. On many Machine Learning problems, coordinate descent achieves performance significantly superior to full-gradient methods. Speeding up coordinate descent in practice is not easy, however: inertially accelerated versions of coordinate descent are theoretically accelerated, but may not lead to practical speed-ups. We propose an accelerated version of coordinate descent using extrapolation, which shows considerable speed-ups in practice, compared to both inertial accelerated coordinate descent and extrapolated proximal gradient descent. Experiments on Least Squares, Lasso, elastic net, group Lasso, and Logistic Regression validate the approach.
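To illustrate the idea behind the talk, here is a minimal sketch (not the authors' implementation) of Anderson extrapolation applied to cyclic coordinate descent on a least-squares objective. Every `K` epochs, the extrapolation coefficients are found by solving a small regularized linear system on the differences of successive iterates; the extrapolated point is accepted only if it decreases the objective. The function names, the offline restart scheme, and the regularization constant are illustrative assumptions.

```python
import numpy as np

def cd_epoch(A, b, x, lipschitz):
    """One pass of cyclic coordinate descent on 0.5 * ||Ax - b||^2."""
    r = A @ x - b  # running residual, updated coordinate by coordinate
    for j in range(A.shape[1]):
        delta = -(A[:, j] @ r) / lipschitz[j]  # exact minimization in coordinate j
        x[j] += delta
        r += delta * A[:, j]
    return x

def anderson_cd(A, b, n_epochs=50, K=5):
    """Coordinate descent with Anderson extrapolation every K epochs (sketch)."""
    n_features = A.shape[1]
    lipschitz = (A ** 2).sum(axis=0)  # per-coordinate Lipschitz constants ||A_j||^2
    x = np.zeros(n_features)
    iterates = [x.copy()]
    for _ in range(n_epochs):
        x = cd_epoch(A, b, x.copy(), lipschitz)
        iterates.append(x.copy())
        if len(iterates) == K + 1:
            # Differences of successive iterates, shape (n_features, K).
            U = np.diff(np.array(iterates), axis=0).T
            # Solve min ||U c|| s.t. sum(c) = 1 via regularized normal equations.
            z = np.linalg.solve(U.T @ U + 1e-10 * np.eye(K), np.ones(K))
            c = z / z.sum()
            x_extr = np.array(iterates[1:]).T @ c
            # Guard: keep the extrapolated point only if it lowers the objective.
            if np.linalg.norm(A @ x_extr - b) < np.linalg.norm(A @ x - b):
                x = x_extr
            iterates = [x.copy()]  # restart the extrapolation window
    return x
```

On a well-conditioned random problem this converges to the least-squares solution; the objective-decrease guard is a common safeguard, since raw extrapolated points need not be descent steps.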

About AISTATS 2021

The 24th International Conference on Artificial Intelligence and Statistics was held virtually from Tuesday, 13 April 2021 to Thursday, 15 April 2021.
