            On the Role of Optimization in Double Descent: A Least Squares Study

            Dec 6, 2021

Speakers

Ilja Kuzborskij

Csaba Szepesvari

Omar Rivasplata

            About

Empirically, it has been observed that the performance of deep neural networks steadily improves as model size increases, contradicting the classical view of overfitting and generalization. Recently, the double descent phenomenon has been proposed to reconcile this observation with theory, suggesting that the test error undergoes a second descent once the model becomes sufficiently overparametrized, as the model size itself acts as an implicit regularizer. In this paper we add to the growing body of w…

            Organizer

NeurIPS 2021

            About NeurIPS 2021

Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia, and oral and poster presentations of refereed papers. Following the conference, there are workshops that provide a less formal setting.

            Recommended Videos

Presentations on similar topics or by the same speakers

Accelerated Sparse Neural Training: A Provable and Efficient Method to Find N:M Transposable Masks
11:02 · Itay Hubara, … · NeurIPS 2021

How should human translation coexist with NMT? Efficient tool for building high quality parallel corpus
02:07 · Chanjun Park, … · NeurIPS 2021

Fast Inference and Transfer of Compositional Task Structures for Few-shot Task Generalization
05:02 · Sungryull Sohn, … · NeurIPS 2021

Breaking the centralized barrier for cross-device federated learning
13:48 · Sai Praneeth Karimireddy, … · NeurIPS 2021

OpenML Benchmarking Suites
05:42 · Bernd Bischl, … · NeurIPS 2021

RAFT: A Real-World Benchmark for Text Classification
04:52 · Neel Alex, … · NeurIPS 2021
