Sparse Convex Optimization via Adaptively Regularized Hard Thresholding (ARHT)

Jul 12, 2020

About

The goal of Sparse Convex Optimization is to optimize a convex function f under a sparsity constraint s <= s*γ, where s* is the target number of non-zero entries in a feasible solution (the sparsity) and γ >= 1 is an approximation factor. There has been a lot of work analyzing the sparsity guarantees of various algorithms (LASSO, Orthogonal Matching Pursuit (OMP), Iterative Hard Thresholding (IHT)) in terms of the "Restricted Condition Number" κ. The best known algorithms find an approximate solution of value f(x*) + ε with the sparsity bound γ = O(κ · min{log((f(x0) − f(x*))/ε), κ}), where x* is the target solution and x0 is the starting point. We present a novel "Adaptively Regularized Hard Thresholding" (ARHT) algorithm that makes significant progress on this problem by bringing the bound down to γ = O(κ), which has been shown to be tight for a general class of algorithms including LASSO, OMP, and IHT. This is achieved without any sacrifice in runtime efficiency compared to the fastest known algorithms. For the regime of low sparsity s close to s* and bounded Restricted Condition Number, we propose a greedy "Exhaustive Local Search" algorithm that works under the condition s > s*κ²/4, and we provide an analysis of its performance in terms of approximately minimizing f. We then show that an immediate application of this general theorem yields Compressed Sensing bounds under the Restricted Isometry Property (RIP), akin to those of LASSO, CoSaMP, and IHT. Compared to other Compressed Sensing approaches, it has the advantage of providing a strong tradeoff between the RIP condition and the solution sparsity, and of working for any general function f that meets the RIP condition. Experimental comparison of ARHT and Exhaustive Local Search with other algorithms gives further evidence of their advantage in applications where low sparsity is critical.
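The hard-thresholding step at the core of IHT-style methods (and of ARHT, which adds an adaptive regularizer on top of it) projects an iterate onto the set of s-sparse vectors by keeping its s largest-magnitude entries. As a rough illustration only, and not the paper's ARHT or Exhaustive Local Search algorithms, here is a minimal plain-IHT sketch for sparse least squares; the function names and step-size choice are this sketch's own assumptions:

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x; zero out the rest."""
    out = np.zeros_like(x)
    if s > 0:
        idx = np.argpartition(np.abs(x), -s)[-s:]
        out[idx] = x[idx]
    return out

def iht(A, b, s, step=None, iters=500):
    """Plain Iterative Hard Thresholding for min_x ||Ax - b||^2, ||x||_0 <= s.

    Each iteration takes a gradient step on the least-squares objective and
    then hard-thresholds back onto the s-sparse set. The default step size
    1 / ||A||_2^2 is a conservative choice; this is an illustrative sketch,
    not the ARHT algorithm from the talk.
    """
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = hard_threshold(x - step * grad, s)
    return x
```

On well-conditioned (RIP-type) measurement matrices with exact measurements, this loop typically recovers the underlying sparse vector; ARHT improves on such schemes by guaranteeing sparsity blow-up only O(κ).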

About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.
