            Contextual Dropout: An Efficient Sample-Dependent Dropout Module

            May 3, 2021

Speakers

• Xinjie Fan
• Shujian Zhang
• Korawat Tanwisuth

            About

Dropout has been demonstrated to be a simple and effective module that not only regularizes the training of deep neural networks but also provides uncertainty estimates for prediction. However, the quality of uncertainty estimation depends heavily on the dropout probabilities. Most current models use the same dropout distribution across all data samples for simplicity. Despite the potential gains in the flexibility of modeling uncertainty, sample-dependent dropout, on the othe…
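To illustrate the idea of sample-dependent dropout described above, here is a minimal pure-Python sketch in which per-unit dropout probabilities are computed from the input itself by a small "context" function (a linear map plus sigmoid). The function names and parameters (`w`, `b`) are hypothetical illustrations, not the paper's actual module, which is trained within a variational framework:

```python
import math
import random

def contextual_dropout_probs(x, w, b):
    """Per-unit drop probabilities as a function of the input x.

    A hypothetical context network: logit_j = sum_i x[i] * w[i][j] + b[j],
    p_j = sigmoid(logit_j). Not the paper's exact parameterization.
    """
    return [
        1.0 / (1.0 + math.exp(-(sum(xi * w_row[j] for xi, w_row in zip(x, w)) + b[j])))
        for j in range(len(b))
    ]

def contextual_dropout(x, w, b, rng=random):
    """Apply a Bernoulli mask whose drop probabilities depend on x."""
    probs = contextual_dropout_probs(x, w, b)
    mask = [0.0 if rng.random() < p else 1.0 for p in probs]
    # Inverted-dropout scaling keeps the expected activation unchanged.
    return [
        xi * m / (1.0 - p) if p < 1.0 else 0.0
        for xi, m, p in zip(x, mask, probs)
    ]
```

The key contrast with standard dropout is that two different inputs `x` yield two different mask distributions, which is what allows per-sample uncertainty modeling.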

            Organizer


            ICLR 2021


            About ICLR 2021

            The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation learning, but generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.


            Recommended Videos

            Presentations on similar topic, category or speaker

• Interpretable Neural Architecture Search Using Bayesian Optimisation with Weisfeiler-Lehman Kernels · 05:24 · Robin Ru, … · ICLR 2021
• Learning Incompressible Fluid Dynamics from Scratch - Towards Fast, Differentiable Fluid Models that Generalize · 10:56 · Nils Wandel, … · ICLR 2021
• Beyond the Fairness Rhetoric in ML · 1:10:47 · Timnit Gebru · ICLR 2021
• Learning Task-General Representations with Generative Neuro-Symbolic Modeling · 06:13 · Reuben Feinman, … · ICLR 2021
• Estimating informativeness of samples with Smooth Unique Information · 06:05 · Hrayr Harutyunyan, … · ICLR 2021
• Reset-Free Reinforcement Learning via Multi-Task Learning · 11:13 · Abhishek Gupta, … · ICLR 2021
