
            DropNet: Reducing Neural Network Complexity via Iterative Pruning

            Jul 12, 2020

            Speakers

Mehul Motani

Speaker · 1 follower

John Chong Min Tan

Speaker · 0 followers

            About

            Modern deep neural networks require a significant amount of computing time and power to train and deploy, which limits their usage on edge devices. Inspired by the iterative weight pruning in the Lottery Ticket Hypothesis, we propose DropNet, an iterative pruning method which prunes nodes/filters to reduce network complexity. DropNet iteratively removes nodes/filters with the lowest average post-activation value across all training samples. Empirically, we show that DropNet is robust across a wi…
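The pruning rule described in the abstract — iteratively removing the nodes/filters with the lowest average post-activation value across all training samples — can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the two-layer network, random data, and 25% pruning fraction are all hypothetical stand-ins, and the retraining step between pruning iterations is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Hypothetical two-layer MLP; weights and "training data" are random stand-ins.
n_in, n_hidden, n_out = 8, 16, 4
W1 = rng.normal(size=(n_in, n_hidden))
W2 = rng.normal(size=(n_hidden, n_out))
X = rng.normal(size=(100, n_in))
mask = np.ones(n_hidden)  # 1 = node kept, 0 = node pruned

def dropnet_step(X, W1, mask, prune_frac=0.25):
    """One DropNet-style iteration: among the still-live hidden nodes,
    drop those with the lowest average post-activation value
    computed across all training samples."""
    acts = relu(X @ W1) * mask        # post-activations; pruned nodes stay at 0
    avg = acts.mean(axis=0)           # average over the training set
    avg[mask == 0] = np.inf           # exclude already-pruned nodes
    n_drop = int(prune_frac * mask.sum())
    drop = np.argsort(avg)[:n_drop]   # indices of the lowest-average live nodes
    new_mask = mask.copy()
    new_mask[drop] = 0.0
    return new_mask

# Two pruning iterations (the paper retrains in between; omitted here):
mask = dropnet_step(X, W1, mask)      # 16 live nodes -> 12
mask = dropnet_step(X, W1, mask)      # 12 live nodes -> 9
print(int(mask.sum()))
```

In the full method, the masked network is retrained after each pruning iteration before the next round of averaging, analogous to the iterative weight pruning of the Lottery Ticket Hypothesis but at the granularity of whole nodes/filters.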

            Organizer

ICML 2020

Account · 2.7k followers

            Categories

Artificial Intelligence & Data Science

Category · 10.8k presentations

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


            Recommended Videos

            Presentations on similar topic, category or speaker

Incremental Sampling Without Replacement for Sequence Models
14:46

Kensen Shi, …

ICML 2020 · 5 years ago

The Implicit and Explicit Regularization Effects of Dropout
14:30

Colin Wei, …

ICML 2020 · 5 years ago

Online Control of the False Coverage Rate
15:06

Asaf Weinstein, …

ICML 2020 · 5 years ago

DeepKinZero: Zero-shot learning for predicting kinase-phosphosite associations involving understudied kinases
05:10

Iman Deznabi, …

ICML 2020 · 5 years ago

Historical perspective on extreme classification in language modeling
35:42

Tomas Mikolov

ICML 2020 · 5 years ago

Invited talk 4

Invertible Workshop Innf

ICML 2020 · 5 years ago
