Jul 12, 2020
The goal of compressed sensing is to learn a structured signal x from a limited number of noisy linear measurements y ≈ Ax. In traditional compressed sensing, “structure” is represented by sparsity in some known basis. Inspired by the success of deep learning in modeling images, recent work starting with Bora et al. has instead considered structure to come from a generative model G: ℝ^k → ℝ^n. In this paper, we prove results that (i) establish the difficulty of this task and show that existing bounds are tight, and (ii) demonstrate that the latter task is a generalization of the former. First, we provide a lower bound matching the upper bound of Bora et al. for compressed sensing with L-Lipschitz generative models G. In particular, there exists such a function that requires roughly Ω(k log L) linear measurements for sparse recovery to be possible. This holds even for the more relaxed goal of nonuniform recovery. Second, we show that generative models generalize sparsity as a representation of structure. In particular, we construct a ReLU-based neural network G: ℝ^k → ℝ^n with O(1) layers and O(n) activations per layer whose range contains all k-sparse vectors.
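A small sketch may help build intuition for how ReLU networks can have a rich range. This is not the paper's construction (which achieves latent dimension k); it only illustrates the standard identity x = ReLU(x) − ReLU(−x), which shows that a one-layer ReLU map with latent dimension n and 2n activations has all of ℝ^n (and hence every k-sparse vector) in its range. All names below (`relu`, `G`, the weight matrices) are illustrative assumptions.

```python
import numpy as np

def relu(v):
    return np.maximum(v, 0.0)

def G(z):
    # One hidden layer with weights [I; -I], then linear output [I, -I]:
    # G(z) = ReLU(z) - ReLU(-z) = z for every z, so range(G) = R^n.
    # Illustrative only: here the latent dimension equals n, whereas the
    # paper constructs a constant-depth network with latent dimension k
    # whose range still contains every k-sparse vector in R^n.
    n = z.shape[0]
    W1 = np.vstack([np.eye(n), -np.eye(n)])   # hidden layer: 2n ReLU units
    W2 = np.hstack([np.eye(n), -np.eye(n)])   # linear output layer
    return W2 @ relu(W1 @ z)

x = np.array([0.0, -3.0, 0.0, 2.5, 0.0])     # a 2-sparse vector in R^5
print(np.allclose(G(x), x))                   # True: x lies in range(G)
```

The point of the paper's stronger construction is that the latent dimension can be k rather than n, so recovery over the range of G with O(k log L) measurements subsumes classical k-sparse recovery.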
The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.