Refined bounds for algorithm configuration: The knife-edge of dual class approximability

Jul 12, 2020

About

Automating algorithm configuration is growing increasingly necessary as algorithms come with more and more tunable parameters. It is common to tune the parameters using machine learning, optimizing performance metrics such as runtime and solution quality. The training set consists of problem instances from the specific domain at hand. We help answer a fundamental question about these techniques: how large should the training set be to ensure that a parameter’s average empirical performance over the training set is close to its expected, future performance? Prior research provides this type of sample complexity bound when the algorithm's performance as a function of its parameters is "simple." We often find, however, that these functions are not simple, but can be approximated by simple functions. We show that if this approximation holds under the L∞-norm, we can provide strong sample complexity bounds. On the flip side, if the approximation holds only under the Lp-norm for p < ∞, it is not possible to provide meaningful sample complexity bounds in the worst case. We empirically evaluate our bounds in the context of algorithm configuration for integer programming, one of the most powerful tools in computer science. In this way, we merge tools from discrete optimization and machine learning. Via experiments, we obtain sample complexity bounds that are up to 700 times smaller than the previously best-known bounds (Balcan et al., ICML 2018).
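The central question the abstract poses, how large a training set must be before a parameter's average empirical performance tracks its expected performance, can be illustrated with a minimal Monte Carlo sketch. The performance function and instance distribution below are hypothetical stand-ins for illustration only, not the piecewise-structured functions or integer-programming setting studied in the paper:

```python
import random

def performance(param, instance):
    """Hypothetical performance (e.g., runtime) of a parameterized
    algorithm on one problem instance; an illustrative stand-in."""
    return abs(param - instance) + 0.1 * param

def empirical_avg(param, instances):
    """Average empirical performance of one parameter setting
    over a training set of instances."""
    return sum(performance(param, x) for x in instances) / len(instances)

random.seed(0)
param = 0.5

# Approximate the expected performance with a large held-out sample
# (instances drawn uniformly from [0, 1] as a toy distribution).
expected = empirical_avg(param, [random.random() for _ in range(100_000)])

# As the training set grows, the empirical average concentrates around
# the expectation; sample complexity bounds quantify how fast.
for n in (10, 100, 10_000):
    train = [random.random() for _ in range(n)]
    gap = abs(empirical_avg(param, train) - expected)
    print(f"n={n:6d}  |empirical - expected| = {gap:.4f}")
```

The paper's contribution is the theoretical counterpart of this experiment: bounds on how large `n` must be, which hold when the performance function is well approximated by a simple function under the L∞-norm, and provably cannot hold in the worst case under weaker Lp-norm approximation.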


About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.
