            R-Drop: Regularized Dropout for Neural Networks

            Dec 6, 2021

            Speakers

            Xiaobo Liang
            Lijun Wu
            Juntao Li

            About

            Dropout is a powerful and widely used technique to regularize the training of deep neural networks. In this paper, we introduce a simple regularization strategy built upon dropout in model training, namely R-Drop, which forces the output distributions of different sub-models generated by dropout to be consistent with each other. Specifically, for each training sample, R-Drop minimizes the bidirectional KL-divergence between the output distributions of two sub-models sampled by dropout. Theoretical an…
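            The regularizer described in the abstract can be sketched in plain Python. This is a minimal toy illustration, not the paper's implementation: the model, weights, and function names below are hypothetical, and in practice the term is added to the usual cross-entropy loss with a weighting coefficient.

```python
import math
import random

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def dropout(x, p, rng):
    # Inverted dropout: zero each unit with probability p,
    # scale survivors by 1/(1-p) to keep the expectation unchanged.
    return [0.0 if rng.random() < p else v / (1.0 - p) for v in x]

def kl(p, q, eps=1e-12):
    # KL(p || q) for two discrete distributions.
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def r_drop_regularizer(logits_fn, x, p=0.1, seed=0):
    # Two stochastic forward passes through the same network produce
    # two dropout sub-models; penalize their disagreement with the
    # symmetric (bidirectional) KL-divergence.
    rng = random.Random(seed)
    dist1 = softmax(logits_fn(dropout(x, p, rng)))
    dist2 = softmax(logits_fn(dropout(x, p, rng)))
    return 0.5 * (kl(dist1, dist2) + kl(dist2, dist1))

# Hypothetical 2-class linear "network" over 3 features.
W = [[0.5, -0.2, 0.1],
     [0.3, 0.8, -0.4]]

def logits_fn(h):
    return [sum(w * v for w, v in zip(row, h)) for row in W]

x = [1.0, 2.0, 3.0]
reg = r_drop_regularizer(logits_fn, x, p=0.5)
```

            With dropout disabled (`p=0`) the two passes coincide and the penalty is zero; with dropout on, the penalty is non-negative and grows as the two sub-models' output distributions diverge.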

            Organizer

            NeurIPS 2021

            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.

            Recommended Videos

            Presentations on similar topic, category or speaker

            • Near-Optimal Offline Reinforcement Learning via Double Variance Reduction · Ming Yin, … · 08:57 · NeurIPS 2021
            • Influence Patterns for Explaining Information Flow in BERT · Caleb Lu, … · 14:16 · NeurIPS 2021
            • Learning latent causal graphs via mixture oracles · Bohdan Kivva, … · 12:33 · NeurIPS 2021
            • How Should a Machine Learning Researcher Think About AI Ethics? · Amanda Askell, … · 1:01:45 · NeurIPS 2021
            • On UMAP's True Loss Function · Sebastian Damrich, … · 14:13 · NeurIPS 2021
            • A Complete Axiomatization of Forward Differentiation · Gordon Plotkin · 12:34 · NeurIPS 2021
