
            PoWER-BERT: Accelerating BERT Inference via Progressive Word-vector Elimination

            Jul 12, 2020

Speakers

Saurabh Goyal

Speaker · 0 followers

Anamitra Roy Choudhury

Speaker · 0 followers

Saurabh M. Raje

Speaker · 0 followers

About

We develop a novel method, called PoWER-BERT, for improving the inference time of the popular BERT model while maintaining accuracy. It works by: a) exploiting redundancy pertaining to word-vectors (intermediate encoder outputs) and eliminating the redundant vectors; b) determining which word-vectors to eliminate by developing a strategy for measuring their significance, based on the self-attention mechanism; c) learning how many word-vectors to eliminate by augmenting the BERT model and t…
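To make the attention-based significance scoring concrete, here is a minimal NumPy sketch. It is an illustration under assumed shapes and names, not the authors' released code: the `significance_scores` and `eliminate_word_vectors` functions and the fixed retention count `k` are hypothetical. The idea it demonstrates is scoring each word-vector by the total attention it receives across heads and query positions, then keeping only the top-k vectors in their original sequence order.

```python
import numpy as np

def significance_scores(attention: np.ndarray) -> np.ndarray:
    # attention: (num_heads, seq_len, seq_len); entry [h, i, j] is how much
    # query word i attends to word j in head h. A word-vector's significance
    # is taken here as the total attention it receives, summed over heads
    # and query rows (an assumed, simplified form of the paper's strategy).
    return attention.sum(axis=(0, 1))  # shape: (seq_len,)

def eliminate_word_vectors(hidden: np.ndarray, attention: np.ndarray, k: int):
    # hidden: (seq_len, hidden_dim) word-vectors output by an encoder layer.
    # Keep the k highest-scoring word-vectors, preserving sequence order.
    scores = significance_scores(attention)
    keep = np.sort(np.argsort(scores)[-k:])  # top-k indices, in sequence order
    return hidden[keep], keep

# Toy usage: 12 heads, 8 tokens, 16-dim hidden states; retain 5 word-vectors.
rng = np.random.default_rng(0)
attn = rng.random((12, 8, 8))
attn /= attn.sum(axis=-1, keepdims=True)  # row-normalize like softmax
hidden = rng.standard_normal((8, 16))
pruned, kept = eliminate_word_vectors(hidden, attn, k=5)
print(pruned.shape, kept)  # (5, 16) and the surviving token positions
```

Note that in the full method, per point c) of the abstract, the number of word-vectors retained at each layer is learned by augmenting the BERT model rather than fixed in advance as in this sketch.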

Organizer


            ICML 2020

Account · 2.7k followers

Categories

AI & Data Science

Category · 10.8k presentations

About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

Like the format? Trust SlidesLive to capture your next event!

Professional recording and livestreaming, worldwide.

Recommended Videos

Presentations on similar topic, category or speaker

            The Effect of Natural Distribution Shift on Question Answering Models
            16:22

John Miller, …

ICML 2020 · 5 years ago

Task Understanding from Confusing Multi-task Data
            15:29

Xin Su, …

ICML 2020 · 5 years ago

            Black-Box Methods for Restoring Monotonicity
            15:40

Evangelia Gergatsouli, …

ICML 2020 · 5 years ago

            Graph Clustering with Graph Neural Networks
            04:23

Anton Tsitsulin, …

ICML 2020 · 5 years ago

            Tuning-free Plug-and-Play Proximal Algorithm for Inverse Imaging Problems
            11:47

Kaixuan Wei, …

ICML 2020 · 5 years ago

            The Sample Complexity of Best-k Items Selection from Pairwise Comparisons
            13:16

Wenbo Ren, …

ICML 2020 · 5 years ago

Interested in talks like this? Follow ICML 2020