
            A Neural Tangent Kernel Perspective on Function-Space Regularization in Neural Networks

Dec 2, 2022

Speakers

Zonghao Chen

Speaker · 0 followers

Xupeng Shi

Speaker · 0 followers

Tim G. J. Rudner

Speaker · 0 followers

About

Loss regularization can help reduce the gap between training and test error by systematically limiting model complexity. Popular regularization techniques such as L2 weight regularization act directly on the network parameters, but do not explicitly take into account how the interplay between the parameters and the network architecture may affect the induced predictive functions. To address this shortcoming, we propose a simple technique for effective function-space regularization. Drawing on the…
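
The abstract is truncated, but the core contrast it draws (regularizing parameters directly versus regularizing the induced predictive function) can be illustrated. Below is a minimal PyTorch-style sketch, assuming a simple penalty on the network's outputs at a set of context points as a stand-in for a function-space regularizer; this is illustrative only and not the method proposed in the talk. The names model, x_context, lambda_w, and lambda_fs are hypothetical.

```python
# Illustrative sketch (not the authors' method): contrast a standard L2 weight
# penalty, which acts directly on parameters, with a simple function-space
# penalty, which acts on the network's outputs at a set of context inputs.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))

def l2_weight_penalty(model: nn.Module) -> torch.Tensor:
    # Penalizes parameter magnitudes, ignoring how parameters and architecture
    # interact to produce the predictive function.
    return sum((p ** 2).sum() for p in model.parameters())

def function_space_penalty(model: nn.Module, x_context: torch.Tensor) -> torch.Tensor:
    # Penalizes the squared norm of the outputs evaluated at context points,
    # a crude proxy for a function-space (e.g., RKHS-type) norm under a
    # linearized / NTK view of the network.
    f = model(x_context)
    return (f ** 2).sum() / x_context.shape[0]

# Usage in a single training step (hypothetical data and coefficients):
x, y = torch.randn(32, 10), torch.randn(32, 1)
x_context = torch.randn(128, 10)
lambda_w, lambda_fs = 1e-4, 1e-2

loss = nn.functional.mse_loss(model(x), y)
loss = loss + lambda_w * l2_weight_penalty(model) \
            + lambda_fs * function_space_penalty(model, x_context)
loss.backward()
```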

Organizer


            NeurIPS 2022

Account · 962 followers

Like this format? Trust SlidesLive to capture your next event!

Professional recording and livestreaming, worldwide.


Recommended Videos

Presentations with a similar topic, category, or speaker

coVariance Neural Networks
04:56
Saurabh Sihag, …
NeurIPS 2022 · 2 years ago

Closing the Creator-Consumer Gap in XAI: A Call for Participatory XAI Design with End-users
05:01
Sunnie S. Y. Kim, …
NeurIPS 2022 · 2 years ago

Do As You Teach: A Multi-Teacher Approach to Self-Play in Deep Reinforcement Learning
05:14
Chaitanya Kharyal, …
NeurIPS 2022 · 2 years ago

Task Discovery: Finding the Tasks that Neural Networks Generalize on
01:06
Andrei Atanov, …
NeurIPS 2022 · 2 years ago

Doubly Robust Counterfactual Classification
00:48
Kwangho Kim, …
NeurIPS 2022 · 2 years ago

Symmetry-induced Disentanglement on Graphs
04:39
Giangiacomo Mercatali, …
NeurIPS 2022 · 2 years ago

Interested in talks like this? Follow NeurIPS 2022