            Evaluating Lossy Compression Rates of Deep Generative Models

            Jul 12, 2020

            Speakers

            Sicong Huang
            Alireza Makhzani
            Yanshuai Cao

            About

            Deep generative models have achieved remarkable progress in recent years. Despite this progress, quantitative evaluation and comparison of generative models remains one of the important challenges. One of the most popular metrics for evaluating generative models is the log-likelihood. While the direct computation of log-likelihood can be intractable, it has recently been shown that the log-likelihood of some of the most interesting generative models such as variational autoencoders (VAE) or g…
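
            The abstract points to log-likelihood as the evaluation metric and notes that computing it directly is intractable, because it requires marginalizing over the latent variable. As a hedged illustration of that point (not the method presented in this talk), the sketch below uses a toy linear-Gaussian decoder, where the exact marginal likelihood is available in closed form, and compares it against a simple Monte Carlo estimate. The decoder, its parameters W, b, sigma, and the sample count are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian "decoder": p(z) = N(0, I), p(x|z) = N(Wz + b, sigma^2 I).
# With a linear decoder the marginal log-likelihood has a closed form, so we can
# check the Monte Carlo estimate against it; with a deep nonlinear decoder the
# integral over z has no closed form, which is why estimation is needed.
d, k, sigma = 5, 2, 0.5          # data dim, latent dim, decoder noise std (hypothetical)
W = rng.normal(size=(d, k))      # toy decoder weights
b = rng.normal(size=d)           # toy decoder bias

def log_gaussian(x, mean, var):
    """Log density of an isotropic Gaussian with scalar variance, summed over dims."""
    return -0.5 * np.sum((x - mean) ** 2 / var + np.log(2 * np.pi * var), axis=-1)

# One observation drawn from the model itself.
x = W @ rng.normal(size=k) + b + sigma * rng.normal(size=d)

# Exact marginal likelihood: x ~ N(b, W W^T + sigma^2 I).
cov = W @ W.T + sigma ** 2 * np.eye(d)
diff = x - b
exact = -0.5 * (diff @ np.linalg.solve(cov, diff) + np.linalg.slogdet(2 * np.pi * cov)[1])

# Monte Carlo estimate of log p(x) = log E_{z ~ p(z)}[p(x|z)],
# using prior samples and a numerically stable log-sum-exp.
S = 200_000
z = rng.normal(size=(S, k))
log_px_given_z = log_gaussian(x, z @ W.T + b, sigma ** 2)
m = log_px_given_z.max()
estimate = m + np.log(np.mean(np.exp(log_px_given_z - m)))

print(f"exact   log p(x): {exact:.4f}")
print(f"sampled estimate: {estimate:.4f}")
```

            For deep, high-dimensional models, sampling the latent variable from the prior like this is far too high-variance to be useful; in practice, estimators based on importance sampling from an approximate posterior or on annealed importance sampling are typically used instead.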

            Organizer

            ICML 2020

            Categories

            AI & Data Science

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

            Recommended Videos

            Presentations on a similar topic, category, or speaker

            Fair k-centers via Maximum Matching (12:37)
            Huy Le Nguyen, …
            ICML 2020 · 5 years ago

            1min Intro
            Placeholder AutoMLWS20
            ICML 2020 · 5 years ago

            Learning to Learn from Failures Using Replay (04:27)
            Tao Chen, …
            ICML 2020 · 5 years ago

            Enhanced POET: Open-ended Reinforcement Learning through Unbounded Invention of Learning Challenges and their Solutions (15:28)
            Rui Wang, …
            ICML 2020 · 5 years ago

            Spectrum Dependent Learning Curves in Kernel Regression and Wide Neural Networks (14:55)
            Blake Bordelon, …
            ICML 2020 · 5 years ago

            Deep Active Learning Toward Crisis-related Tweets Classification (16:57)
            Shiva Ebrahimi, …
            ICML 2020 · 5 years ago
