
            Parametric Complexity Bounds for Approximating PDEs with Neural Networks

Dec 6, 2021

Speakers

Tanya Marwah

Zachary Lipton

Andrej Risteski

About

Recent experiments have shown that deep networks can approximate solutions to high-dimensional PDEs, seemingly escaping the curse of dimensionality. However, questions regarding the theoretical basis for such approximations, including the required network size, remain open. In this paper, we investigate the representational power of neural networks for approximating solutions to linear elliptic PDEs with Dirichlet boundary conditions. We prove that when a PDE's coefficients are representable by s…
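The setting the abstract describes (a neural network representing the solution of a linear elliptic PDE with Dirichlet boundary conditions) can be illustrated with a minimal sketch. This is an illustrative toy, not the paper's construction: it fits a random-feature one-hidden-layer tanh network to a 1D Poisson problem, enforcing the boundary conditions exactly through an `x(1-x)` factor, and solves for the output weights by least squares on the PDE residual. All names and parameter choices below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: -u''(x) = f(x) on (0, 1), u(0) = u(1) = 0.
# Manufactured solution u*(x) = sin(pi x), so f(x) = pi^2 sin(pi x).
f = lambda x: np.pi**2 * np.sin(np.pi * x)
u_exact = lambda x: np.sin(np.pi * x)

# Random-feature "one-hidden-layer network": fixed tanh features with
# random weights, trainable output coefficients c. The x(1-x) factor
# makes every candidate function satisfy the Dirichlet conditions.
K = 40
w = rng.normal(scale=5.0, size=K)
b = rng.normal(scale=5.0, size=K)

def features(x):
    x = np.asarray(x)[:, None]                  # shape (n, 1)
    return x * (1 - x) * np.tanh(w * x + b)     # shape (n, K)

# Collocation points; second derivatives via central differences.
n, h = 200, 1e-4
x = np.linspace(0.01, 0.99, n)
d2 = (features(x + h) - 2 * features(x) + features(x - h)) / h**2

# The PDE residual -u'' - f is linear in c, so least squares suffices.
c, *_ = np.linalg.lstsq(-d2, f(x), rcond=None)

err = np.max(np.abs(features(x) @ c - u_exact(x)))
print(f"max pointwise error: {err:.2e}")
```

Because only the output layer is trained, the fit reduces to a linear least-squares problem; a fully trained network, as studied in the paper, would instead optimize all weights against the residual loss.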

Organizer

NeurIPS 2021

Account · 1.9k followers

About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.


Recommended Videos

Presentations similar in topic, category, or speaker

Complexity Lower Bounds for Nonconvex-Strongly-Concave Min-Max Optimization · 14:48
Haochuan Li, … · NeurIPS 2021

Random Noise Defense Against Query-Based Black-Box Attacks · 12:53
Zeyu Qin, … · NeurIPS 2021

eXplainable AI approaches for debugging and diagnosis · 54:01
Andreas Holzinger, … · NeurIPS 2021

Word2Fun: Modelling Words as Functions for Diachronic Word Representation · 13:46
Benyou Wang, … · NeurIPS 2021

Neural Tangent Kernel Maximum Mean Discrepancy · 13:11
Xiuyuan Cheng, … · NeurIPS 2021

On Covariate Shift of Latent Confounders in Imitation and Reinforcement Learning · 19:13
Guy Tennenholtz, … · NeurIPS 2021

Interested in talks like this? Follow NeurIPS 2021.