
AR-DAE: Towards Unbiased Neural Entropy Gradient Estimation

Jul 12, 2020

Speakers

Jae Hyun Lim
Speaker · 0 followers

Aaron Courville
Speaker · 3 followers

Christopher Pal
Speaker · 0 followers

About

Entropy is ubiquitous in machine learning, but it is in general intractable to compute the entropy of the distribution of an arbitrary continuous random variable. In this paper, we propose the amortized residual denoising autoencoder (AR-DAE) to approximate the gradient of the log density function, which can be used to estimate the gradient of entropy. Amortization allows us to significantly reduce the error of the gradient approximator by approaching asymptotic optimality of a regular DAE, in w…
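
The construction rests on a classical DAE identity: for Gaussian corruption of scale σ, the optimal reconstruction is r*(x) = x + σ² ∇_x log p(x) (Alain & Bengio, 2014), so the DAE's residual recovers the gradient of the log density, and that score estimate can stand in for ∇_x log q(x) in the reparameterized entropy gradient. The PyTorch sketch below is a minimal illustration under simplifying assumptions: the names (ResidualDAE, dae_loss), the architecture, and the fixed noise scale are illustrative choices, not the paper's implementation.

```python
import torch
import torch.nn as nn

# Hypothetical sketch, not the paper's code. A DAE trained to denoise
# Gaussian-corrupted samples has optimal reconstruction
#   r*(x) = x + sigma^2 * grad_x log p(x)   (Alain & Bengio, 2014),
# so a residual network f with r(x) = x + sigma^2 * f(x) makes f itself
# the score estimate, with no division by a vanishing sigma^2.

class ResidualDAE(nn.Module):
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.f = nn.Sequential(                 # f(x) ~ grad_x log p(x)
            nn.Linear(dim, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, sigma):
        return x + sigma ** 2 * self.f(x)       # residual reconstruction

def dae_loss(model, x, sigma):
    # Standard denoising objective: reconstruct clean x from noisy input.
    noisy = x + sigma * torch.randn_like(x)
    return ((model(noisy, sigma) - x) ** 2).sum(dim=1).mean()

# Fit the DAE on samples from the distribution of interest (here a
# stand-in Gaussian sampler; a fixed sigma replaces the paper's
# amortization over noise scales).
dim, sigma = 2, 0.1
model = ResidualDAE(dim)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(1000):
    x = torch.randn(256, dim)
    opt.zero_grad()
    dae_loss(model, x, sigma).backward()
    opt.step()

# Entropy gradient via reparameterization: for samples x = g_phi(eps),
# grad_phi H(q_phi) = -E[ (d g_phi/d phi)^T grad_x log q_phi(x) ], so with
# the learned score detached, -(f(x) * x).sum() is a surrogate whose
# phi-gradient matches grad_phi H.
x = torch.randn(64, dim).requires_grad_(True)   # pretend x = g_phi(eps)
score = model.f(x).detach()                     # plug-in score estimate
entropy_surrogate = -(score * x).sum(dim=1).mean()
```

The residual parameterization is the key design point here: since f outputs the score directly, the estimate remains usable as σ → 0, the regime where the DAE identity becomes exact; per the abstract, the paper approaches this asymptotic optimality through amortization rather than the fixed noise scale used above.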

Organizer

ICML 2020
Account · 2.7k followers

Categories

AI and Data Science
Category · 10.8k presentations

About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest-growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


Recommended Videos

Presentations with a similar topic, category, or speaker

Gradient Based Memory Editing for Task-Free Continual Learning
14:46

Xisen Jin, …

ICML 2020 · 5 years ago

Graph Clustering with Graph Neural Networks
04:23

Anton Tsitsulin, …

ICML 2020 · 5 years ago

Evaluating Logical Generalization in Graph Neural Networks
05:35

Koustuv Sinha, …

ICML 2020 · 5 years ago

Retrospective on DeepIV and some general thoughts on research strategy
22:58

Jason Hartford

ICML 2020 · 5 years ago

Learning Factorized Weight Matrix for Joint Image Filtering
14:08

Xiangyu Xu, …

ICML 2020 · 5 years ago

Geometric and Topological Graph Analysis for Machine Learning Applications
30:19

Tina Eliassi-Rad, …

ICML 2020 · 5 years ago

Interested in talks like this? Follow ICML 2020