
An Empirical Study of Pre-trained Transformers for Arabic Information Extraction

Nov 16, 2020

Speakers

Wuwei Lan
Yang Chen
Wei Xu

Organizer

EMNLP 2020

Categories

AI & Data Science

About EMNLP 2020

The 2020 Conference on Empirical Methods in Natural Language Processing
