FastFormers: Highly Efficient Transformer Models for Natural Language Understanding

Nov 16, 2020

Speakers

Hany Hassan Awadalla

Young Jin Kim

Organizer

EMNLP 2020

Categories

AI & Data Science

About EMNLP 2020

The 2020 Conference on Empirical Methods in Natural Language Processing
