
            Label Noise SGD Provably Prefers Flat Global Minimizers

            Dec 6, 2021

Speakers

Alex Damian

Tengyu Ma

Jason D. Lee

            About

In overparametrized models, the noise in stochastic gradient descent (SGD) implicitly regularizes the optimization trajectory and determines which local minimum SGD converges to. Motivated by empirical studies that demonstrate that training with noisy labels improves generalization, we study the implicit regularization effect of SGD with label noise. We show that SGD with label noise converges to a stationary point of a regularized loss L(θ) + λR(θ), where L(θ) is the training loss, λ is an effe…
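The label-noise mechanism the abstract describes — running SGD where the sampled label is freshly perturbed at every step — can be sketched on a toy overparametrized least-squares problem. This is only an illustration of the update rule, not the paper's algorithm or analysis; the problem sizes, step size, and noise scale below are arbitrary assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy overparametrized regression: more parameters (d) than samples (n),
# so many global minimizers of the clean training loss exist.
n, d = 20, 50
X = rng.normal(size=(n, d))
theta_star = rng.normal(size=d)
y = X @ theta_star  # clean labels

theta = rng.normal(size=d)
lr, sigma = 0.01, 0.5  # step size and label-noise scale (illustrative values)

for step in range(20_000):
    i = rng.integers(n)                     # sample one training example
    y_noisy = y[i] + sigma * rng.normal()   # re-perturb its label at each step
    residual = X[i] @ theta - y_noisy
    theta -= lr * residual * X[i]           # SGD step on the noisy-label loss

# Despite the per-step noise, the iterate hovers near a global minimizer
# of the clean training loss.
train_loss = np.mean((X @ theta - y) ** 2)
```

The key point the sketch shows is that the noise never averages into the labels permanently — it is redrawn every step, so in expectation the iterate still minimizes the clean loss, while the fluctuations bias it among the many global minimizers; per the abstract, the strength of that implicit regularization (λ) depends on quantities like the step size and noise level.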

            Organizer


            NeurIPS 2021


            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.


            Recommended Videos

            Presentations on similar topic, category or speaker

PASS: An ImageNet replacement for self-supervised pretraining without humans · 05:07 · Yuki M. Asano, … · NeurIPS 2021

Status-quo policy gradient in Multi-Agent Reinforcement Learning · 04:17 · Pinkesh Badjatiya, … · NeurIPS 2021

HSVA: Hierarchical Semantic-Visual Adaptation for Zero-Shot Learning · 09:19 · Shiming Chen, … · NeurIPS 2021

Neural Flows: Efficient Alternative to Neural ODEs · 12:09 · Marin Biloš, … · NeurIPS 2021

CARMS: Categorical-Antithetic-REINFORCE Multi-Sample Gradient Estimator · 13:57 · Alek Dimitriev, … · NeurIPS 2021

Towards Reliable and Robust Model Explanations · 37:41 · Hima Lakkaraju · NeurIPS 2021
