            Understanding and Improving Early Stopping for Learning with Noisy Labels

            Dec 6, 2021

            Speakers

Yingbin Bai

Erkun Yang

Bo Han

            About

The memorization effect of deep neural networks (DNNs) plays a pivotal role in many state-of-the-art label-noise learning methods. To exploit this property, the early stopping trick, which halts optimization at an early stage of training, is usually adopted. Current methods generally choose the early stopping point by considering the DNN as a whole. However, a DNN can be viewed as a composition of layers, and we find that the latter layers in a DNN are much more sensitive to lab…
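The abstract is cut off, but the key observation (later layers are more sensitive to label noise) points toward stopping different parts of the network at different times rather than picking a single early-stopping point for the whole model. Below is a minimal, hypothetical PyTorch sketch of that general idea; the layer split, epoch counts, and data are illustrative placeholders, not the authors' actual procedure.

```python
import torch
import torch.nn as nn

# Toy data standing in for a dataset with noisy labels; sizes are placeholders.
x, y = torch.randn(512, 20), torch.randint(0, 2, (512,))

# Split the network into earlier and latter layers, following the abstract's
# observation that latter layers are more sensitive to label noise.
earlier = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU())
latter = nn.Linear(64, 2)
model = nn.Sequential(earlier, latter)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

TOTAL_EPOCHS = 30       # hypothetical training budget
LATTER_STOP_EPOCH = 5   # stop the noise-sensitive latter layers early (assumed value)

for epoch in range(TOTAL_EPOCHS):
    if epoch == LATTER_STOP_EPOCH:
        # "Early stop" only the latter layers: freeze their parameters
        # while the earlier layers continue to train.
        for p in latter.parameters():
            p.requires_grad_(False)

    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```

This is only a sketch of layer-wise early stopping under the stated assumptions; the actual method in the talk may select stopping points differently.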

            Organizer

NeurIPS 2021

            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.


            Recommended Videos

Presentations on a similar topic, category, or speaker

Deep generative models of protein structures create new and diverse proteins
10:58
Zeming Lin, …
NeurIPS 2021

Exploiting Local Convergence of Quasi-Newton Methods Globally: Adaptive Sample Size Approach
14:06
Qiujiang Jin, …
NeurIPS 2021

Learning after Deployment: The Missed Tale of Supervision
04:41
Aviral Chharia, …
NeurIPS 2021

Emergent Communication Under Varying Sizes and Connectivities
05:40
Jooyeon Kim, …
NeurIPS 2021

MADE: Exploration via Maximizing Deviation from Explored Regions
13:09
Tianjun Zhang, …
NeurIPS 2021

BARTScore: Evaluating Generated Text as Text Generation
13:47
Weizhe Yuan, …
NeurIPS 2021
