The Neural Covariance SDE: Shaped Infinite Depth-and-Width Networks at Initialization

Nov 28, 2022

Speakers

Mufan Li
Speaker · 0 followers

Mihai Nica
Speaker · 0 followers

Daniel M. Roy
Speaker · 0 followers

About

The logit outputs of a feedforward neural network at initialization are conditionally Gaussian, given a random covariance matrix defined by the penultimate layer. In this work, we study the distribution of this random matrix. Recent work has shown that shaping the activation function as network depth grows large is necessary for this covariance matrix to be non-degenerate. However, the current infinite-width-style understanding of this shaping method is unsatisfactory for large depth: infinite-w…
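The first sentence of the abstract can be checked numerically: for a randomly initialized network, the readout logits are Gaussian once you condition on the penultimate-layer features, with covariance given by their Gram matrix. Below is a minimal NumPy sketch of that setup (illustrative code, not from the paper; it assumes a plain ReLU network with He initialization rather than the shaped activations the paper studies):

```python
import numpy as np

def penultimate_features(x, depth=10, width=512, rng=None):
    """Push inputs through a randomly initialized ReLU network (He init)
    and return the penultimate-layer activations, shape (n_inputs, width)."""
    rng = np.random.default_rng(0) if rng is None else rng
    h = x
    for _ in range(depth):
        fan_in = h.shape[1]
        W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, width))
        h = np.maximum(h @ W, 0.0)  # ReLU activation
    return h

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 32))            # 4 inputs in R^32
h = penultimate_features(x, rng=rng)

# The random covariance matrix defined by the penultimate layer:
K = h @ h.T / h.shape[1]                # (4, 4) Gram matrix of features

# Conditionally on h, a logit z = h @ v / sqrt(width) with v ~ N(0, I)
# is exactly N(0, K); compare K to the empirical covariance of many draws.
v = rng.normal(size=(h.shape[1], 20000))
z = h @ v / np.sqrt(h.shape[1])         # (4, 20000) logit samples
print(np.abs(np.cov(z, bias=True) - K).max())   # small deviation
```

Marginally (averaging over the weights that produced h), the logits are not Gaussian; the paper studies the distribution of K itself in the joint infinite depth-and-width limit.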

Organizer

NeurIPS 2022
Account · 961 followers

Like the format? Trust SlidesLive to capture your next event!

Professional recording and livestreaming, worldwide.

Recommended Videos

Presentations similar in topic, category, or speaker

Matching in Multi-arm Bandit with Collision
04:06
Yirui Zhang, …
NeurIPS 2022 · 2 years ago

A Coupled Design of Exploiting Record Similarity for Vertical Federated Learning
04:36
Zhaomin Wu, …
NeurIPS 2022 · 2 years ago

When Does Group Invariant Learning Survive Spurious Correlations?
04:51
Yimeng Chen, …
NeurIPS 2022 · 2 years ago

Causal Discovery in Heterogeneous Environments Under the Sparse Mechanism Shift Hypothesis
04:55
Ronan Perry, …
NeurIPS 2022 · 2 years ago

Rethinking and Improving Robustness of Convolutional Neural Networks: a Shapley Value-based Approach in Frequency Domain
05:03
YiTing Chen, …
NeurIPS 2022 · 2 years ago

Label Noise in Adversarial Training: A Novel Perspective to Study Robust Overfitting
04:50
Chengyu Dong, …
NeurIPS 2022 · 2 years ago

Interested in talks like this? Follow NeurIPS 2022