            Divide and Conquer for Quantization: Leveraging Intermediate Feature Representations for Quantized Training of Neural Networks

            Jul 12, 2020

Speakers

Ahmed T. Elthakeb

Prannoy Pilligundla

FatemehSadat Mireshghallah

            About

The deep layers of modern neural networks extract a rather rich set of features as an input propagates through the network. This paper sets out to harvest these rich intermediate representations for quantization with minimal accuracy loss while significantly reducing the memory footprint and compute intensity of the DNN. It utilizes knowledge distillation through the teacher-student paradigm (Hinton et al., 2015) in a novel setting that exploits the feature extraction capability of DNNs for…
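The sketch below illustrates the general recipe the abstract points at: a full-precision teacher supervises a quantized student not only through its output logits but also through an intermediate feature representation. It is a minimal Python (PyTorch-style) sketch under assumed choices; the toy network, split point, straight-through fake quantizer, and loss weights are illustrative assumptions, not the paper's exact method.

# Minimal sketch: intermediate-feature knowledge distillation for quantized
# training. All architectural and loss choices here are illustrative
# assumptions, not the paper's configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

def fake_quantize(x, num_bits=4):
    # Uniform fake quantization with a straight-through estimator (assumed scheme).
    qmax = 2 ** num_bits - 1
    scale = x.detach().abs().max() / qmax + 1e-8
    q = torch.clamp(torch.round(x / scale), -qmax, qmax) * scale
    return x + (q - x).detach()  # forward: quantized values, backward: identity

class QuantLinear(nn.Linear):
    # Linear layer whose weights are fake-quantized on every forward pass.
    def forward(self, x):
        return F.linear(x, fake_quantize(self.weight), self.bias)

# Teacher (full precision) and student (quantized) share the same topology,
# split into two sections so intermediate features can be matched.
teacher = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(QuantLinear(784, 256), nn.ReLU(), QuantLinear(256, 10))
teacher.eval()

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))

with torch.no_grad():
    t_feat = teacher[1](teacher[0](x))   # teacher's intermediate representation
    t_logits = teacher[2](t_feat)

s_feat = student[1](student[0](x))       # student's intermediate representation
s_logits = student[2](s_feat)

# Total loss: task loss + intermediate-feature matching + logit distillation.
loss = (F.cross_entropy(s_logits, y)
        + 1.0 * F.mse_loss(s_feat, t_feat)
        + 1.0 * F.kl_div(F.log_softmax(s_logits, dim=1),
                         F.softmax(t_logits, dim=1),
                         reduction="batchmean"))
opt.zero_grad()
loss.backward()
opt.step()

In the paper's divide-and-conquer framing, this kind of intermediate-feature supervision would presumably be applied section by section of the network; the single feature-matching term above only stands in for that idea.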

            Organizer


            ICML 2020

            Categories

            AI & Data Science

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

            Recommended Videos

            Presentations on similar topic, category or speaker

Imputer: Sequence Modelling via Imputation and Dynamic Programming
10:50
William Chan, …
ICML 2020 · 5 years ago

Sequentially Additive Nonignorable Missing Data Modeling: Using Auxiliary Marginal Information
40:58
Mauricio Sadinle
ICML 2020 · 5 years ago

Normalisr: inferring single-cell differential and co-expression with linear association testing
05:19
Lingfei Wang
ICML 2020 · 5 years ago

Automatic semantic segmentation for prediction of tuberculosis using lens-free microscopy images
06:59
Dennis Núñez Fernández, …
ICML 2020 · 5 years ago

Meta-learning with Stochastic Linear Bandits
13:17
Leonardo Cella, …
ICML 2020 · 5 years ago

Linear Mode Connectivity and the Lottery Ticket Hypothesis
12:24
Jonathan Frankle, …
ICML 2020 · 5 years ago
