
            Sparse Flows: Pruning Continuous-depth Models

            Dec 6, 2021

            Speakers

            Lucas Liebenwein · Speaker · 0 followers

            Ramin Hasani · Speaker · 0 followers

            Alexander Amini · Speaker · 2 followers

            About

            Continuous deep learning architectures enable learning of flexible probabilistic models for predictive modeling as neural ordinary differential equations (ODEs), and for generative modeling as continuous normalizing flows. In this work, we design a framework to decipher the internal dynamics of these continuous-depth models by pruning their network architectures. Our empirical results suggest that pruning improves generalization for neural ODEs in generative modeling. Moreover, pruning finds min…
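The abstract does not specify the paper's pruning procedure, but the general idea of pruning the network that parameterizes an ODE's vector field can be sketched with plain global magnitude pruning. The function name, layer shapes, and sparsity level below are illustrative assumptions, not the authors' method:

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.8):
    """Zero out the smallest-magnitude entries across all weight arrays.

    weights: list of np.ndarray, e.g. the layers of an ODE vector field f(t, x)
             (hypothetical example; the paper's actual framework may differ).
    sparsity: fraction of entries to remove, pooled globally across layers.
    Returns pruned copies of the arrays plus the binary keep-masks.
    """
    flat = np.concatenate([w.ravel() for w in weights])
    k = int(sparsity * flat.size)
    # Threshold = magnitude of the k-th smallest entry; everything below it is cut.
    threshold = np.partition(np.abs(flat), k)[k] if k > 0 else -np.inf
    masks = [np.abs(w) >= threshold for w in weights]
    return [w * m for w, m in zip(weights, masks)], masks

# Toy two-layer vector-field network (shapes chosen arbitrarily for the sketch).
rng = np.random.default_rng(0)
layers = [rng.normal(size=(2, 16)), rng.normal(size=(16, 2))]
pruned, masks = magnitude_prune(layers, sparsity=0.8)
kept = sum(int(m.sum()) for m in masks)
total = sum(m.size for m in masks)
print(f"kept {kept}/{total} weights")
```

In practice one would retrain (or fine-tune) the ODE after applying the masks; the sketch only shows the masking step itself.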

            Organizer

            NeurIPS 2021

            Account · 1.9k followers

            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.


            Recommended Videos

            Presentations on similar topic, category or speaker

            Matrix factorisation and the interpretation of geodesic distance · 12:11
            Nick Whiteley, … · NeurIPS 2021 · 3 years ago

            TransMIL: Transformer based Correlated Multiple Instance Learning for Whole Slide Image Classification · 03:14
            Zhuchen Shao, … · NeurIPS 2021 · 3 years ago

            Identifying and Benchmarking Natural Out-of-Context Prediction Problems · 12:36
            David Madras, … · NeurIPS 2021 · 3 years ago

            Challenges of Working with Materials R&D Data · 02:05
            Lenore Kubie, … · NeurIPS 2021 · 3 years ago

            Iterative Iterative · 04:19
            Erin Smith · NeurIPS 2021 · 3 years ago

            Temporal Transductive Inference for Few-Shot Video Object Segmentation · 02:54
            Mennatullah Siam, … · NeurIPS 2021 · 3 years ago
