
            On the Power of Compressed Sensing with Generative Models

            Jul 12, 2020

            Speakers


            Akshay Kamath

            Speaker · 0 followers


            Eric Price

            Speaker · 0 followers


            Sushrut Karmalkar

            Speaker · 1 follower

            About

            The goal of compressed sensing is to learn a structured signal x from a limited number of noisy linear measurements y ≈ Ax. In traditional compressed sensing, “structure” is represented by sparsity in some known basis. Inspired by the success of deep learning in modeling images, recent work starting with Bora et al. has instead considered structure to come from a generative model G: ℝ^k → ℝ^n. In this paper, we prove results that (i) establish the difficulty of this task and show that existing bounds…
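            The setup described in the abstract (recover x from y ≈ Ax when x lies in the range of a generator G) can be illustrated with a small sketch. This is purely illustrative and is not the paper's construction: the generator here is a toy linear map G(z) = Wz, and the dimensions, noise level, and step-size rule are all assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's algorithm): compressed sensing with a
# generative prior. The signal x = G(z*) lies in the range of a toy
# *linear* generator G(z) = W z, and we recover it from m << n noisy
# linear measurements y = A x + noise by gradient descent on
# f(z) = 0.5 * ||A G(z) - y||^2. All dimensions are illustrative.
rng = np.random.default_rng(0)
n, k, m = 100, 5, 20                  # ambient dim, latent dim, #measurements

W = rng.standard_normal((n, k))               # generator weights (hypothetical)
A = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian measurement matrix

z_true = rng.standard_normal(k)
x_true = W @ z_true                   # structured signal in range(G)
y = A @ x_true + 0.01 * rng.standard_normal(m)

M = A @ W                             # composed map z -> A G(z)
lr = 1.0 / np.linalg.norm(M, 2) ** 2  # step size from the spectral norm

z = np.zeros(k)
for _ in range(500):
    z -= lr * (M.T @ (M @ z - y))     # gradient of f(z)

x_hat = W @ z
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative recovery error: {rel_err:.4f}")
```

            With a nonlinear deep generator, the same objective is nonconvex and is typically attacked with gradient-based optimizers over z; the linear case above is just the simplest instance where recovery provably succeeds with m proportional to k rather than n.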

            Organizer


            ICML 2020

            Account · 2.7k followers

            Categories

            AI & Data Science

            Category · 10.8k presentations

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

            Like the format? Trust SlidesLive to capture your next event!

            Professional recording and livestreaming – worldwide.


            Recommended Videos

            Presentations on a similar topic, category or speaker

            Invited Talk 8
            46:19
            Jennifer Listgarten
            ICML 2020 · 5 years ago

            Representation Learning Without Labels - Part 4
            55:59
            Danilo J. Rezende, …
            ICML 2020 · 5 years ago

            Parameter-Free, Dynamic, and Strongly-Adaptive Online Learning
            14:58
            Ashok Cutkosky
            ICML 2020 · 5 years ago

            Designing Bayesian-Optimal Experiments with Stochastic Gradients
            44:29
            Tom Rainforth
            ICML 2020 · 5 years ago

            Modeling the Semantics of Data Sources with Graph Neural Networks
            05:31
            Giuseppe Futia, …
            ICML 2020 · 5 years ago

            Never-Ending Learning
            47:10
            Partha Talukdar
            ICML 2020 · 5 years ago

            Interested in talks like this? Follow ICML 2020