Estimating the Error of Randomized Newton Methods: A Bootstrap Approach

Jul 12, 2020

About

Randomized Newton methods have recently become the focus of intense research activity in large-scale and distributed optimization. Generally, these methods are based on a "computation-accuracy trade-off", which allows the user to gain scalability in exchange for error in the solution. However, the user does not know how much error the randomization introduces, which can be detrimental in two ways: on one hand, the user may try to manage the unknown error with theoretical worst-case error bounds, but this approach is impractical when the bounds involve unknown constants, and it typically leads to excessive computation. On the other hand, the user may select tuning parameters or stopping criteria heuristically, but this is generally unreliable. Motivated by these difficulties, we develop a bootstrap method for directly estimating the unknown error, which avoids excessive computation and offers greater reliability. We also provide non-asymptotic theoretical guarantees showing that the error estimates are valid for several error metrics and algorithms (including GIANT and Newton Sketch). Lastly, we show that the proposed method adds relatively little cost to existing randomized Newton methods, and that it performs well in a range of experimental conditions.
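
To make the idea concrete, here is a minimal sketch of the bootstrap error estimate in the simplest randomized setting, sketch-and-solve least squares with uniform row sampling. This is an illustration of the general principle rather than the authors' implementation or the GIANT/Newton Sketch algorithms themselves; the function names and the parameters m (sketch size), B (bootstrap replicates), and alpha (confidence level) are all illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

def sketched_lstsq(A, b, m, rng):
    # Solve least squares on a uniformly row-sampled (sketched) subproblem.
    idx = rng.integers(0, A.shape[0], size=m)
    As, bs = A[idx], b[idx]
    x_hat, *_ = np.linalg.lstsq(As, bs, rcond=None)
    return x_hat, As, bs

def bootstrap_error_quantile(As, bs, x_hat, B, alpha, rng):
    # Resample the rows of the small sketched problem B times, re-solve,
    # and return the (1 - alpha) quantile of ||x* - x_hat|| as an estimate
    # of the error caused by the randomization.
    m = As.shape[0]
    errs = np.empty(B)
    for t in range(B):
        idx = rng.integers(0, m, size=m)
        x_star, *_ = np.linalg.lstsq(As[idx], bs[idx], rcond=None)
        errs[t] = np.linalg.norm(x_star - x_hat)
    return np.quantile(errs, 1 - alpha)

# Toy problem with n >> d, where sketching is worthwhile.
n, d, m = 20000, 50, 500
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

x_hat, As, bs = sketched_lstsq(A, b, m, rng)
eps_hat = bootstrap_error_quantile(As, bs, x_hat, B=100, alpha=0.05, rng=rng)

x_opt, *_ = np.linalg.lstsq(A, b, rcond=None)  # full solve, only to check the estimate
print("estimated 95% error bound:", round(float(eps_hat), 4))
print("actual error:", round(float(np.linalg.norm(x_hat - x_opt)), 4))

The point of this construction is cost: each bootstrap replicate re-solves only the small m-by-d sketched problem and never touches the full data, which is why, as the abstract notes, the error estimate adds relatively little to the cost of the randomized solve itself.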

About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning, as used in closely related areas such as artificial intelligence, statistics, and data science, as well as in important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest-growing artificial intelligence conferences in the world. Participants span a wide range of backgrounds, from academic and industrial researchers to entrepreneurs and engineers to graduate students and postdocs.
