            Locality defeats the curse of dimensionality in convolutional teacher-student scenarios

Dec 6, 2021

Speakers

Alessandro Favero

Speaker · 0 followers

Francesco Cagnetta

Speaker · 0 followers

Matthieu Wyart

Speaker · 0 followers

About

            Convolutional neural networks perform a local and translationally-invariant treatment of the data: quantifying which of these two aspects is central to their success remains a challenge. We study this problem within a teacher-student framework for kernel regression, using `convolutional' kernels inspired by the neural tangent kernel of simple convolutional architectures of given filter size. Using heuristic methods from physics, we find in the ridgeless case that locality is key in determining t…
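The setup described in the abstract can be illustrated with a short, self-contained example. The snippet below is a minimal sketch, not the authors' code: it assumes a simple "local" kernel that averages a Laplace kernel over all length-t patches of the input (a stand-in for the convolutional, NTK-inspired kernels discussed in the talk), draws a teacher function as a Gaussian-process sample from that same kernel, and fits a student by ridgeless kernel regression. The kernel form, filter size, and all sizes are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the authors' code) of a convolutional teacher-student
# kernel regression experiment: a "local" kernel built by averaging a Laplace
# kernel over length-t patches, a teacher sampled from the corresponding
# Gaussian process, and a student fitted by ridgeless kernel regression.
# The kernel form, filter size t, and all sizes below are assumptions.

def local_kernel(X1, X2, t=3):
    """Average of Laplace kernels over all overlapping length-t patches."""
    d = X1.shape[1]
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for i in range(d - t + 1):
        p1, p2 = X1[:, i:i + t], X2[:, i:i + t]
        dists = np.linalg.norm(p1[:, None, :] - p2[None, :, :], axis=-1)
        K += np.exp(-dists)  # Laplace kernel evaluated on each patch
    return K / (d - t + 1)

rng = np.random.default_rng(0)
d, n_train, n_test = 16, 200, 500

# Teacher: a random function drawn from the same local kernel (GP sample).
X = rng.standard_normal((n_train + n_test, d))
K_full = local_kernel(X, X) + 1e-8 * np.eye(n_train + n_test)  # jitter for stability
y = rng.multivariate_normal(np.zeros(len(X)), K_full)

Xtr, Xte = X[:n_train], X[n_train:]
ytr, yte = y[:n_train], y[n_train:]

# Student: ridgeless kernel regression (pseudo-inverse = zero-ridge limit).
alpha = np.linalg.pinv(local_kernel(Xtr, Xtr)) @ ytr
y_pred = local_kernel(Xte, Xtr) @ alpha
print("test MSE:", np.mean((y_pred - yte) ** 2))
```

In an experiment of this kind, the quantity of interest is the learning curve (test error versus the number of training points), which the abstract says is governed mainly by locality; varying the filter size t in the sketch would probe that effect.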

Organizer

NeurIPS 2021

Account · 1.9k followers

About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.

Recommended videos

Presentations with a similar topic, category, or speaker

            Test-Agnostic Long-Tailed Recognition by Test-Time Aggregating Diverse Experts
            05:18

            Yifan Zhang, …

            NeurIPS 2021 3 years ago

            Scaling Vision with Sparse Mixture of Experts
            12:43

            Carlos Riquelme, …

            NeurIPS 2021 3 years ago

            Does enforcing fairness mitigate biases caused by sub-population shift?
            13:22

            Subha Maity, …

            NeurIPS 2021 3 years ago

            Chebyshev-Cantelli PAC-Bayes-Bennett Inequality for the Weighted Majority Vote
            10:37

            Yi-Shan Wu, …

            NeurIPS 2021 3 years ago

            Invariant Causal Imitation Learning for Generalizable Policies
            14:49

            Ioana Bica, …

            NeurIPS 2021 3 years ago

            Panel Discussion 2
            59:12

            Susan L. Epstein, …

            NeurIPS 2021 3 years ago

Interested in talks like this? Follow NeurIPS 2021