Modeling Content Importance for Summarization with Pre-trained Language Models

Nov 16, 2020

Speakers

Liqiang Xiao
Lu Wang
Yaohui Jin

Organizer

EMNLP 2020

Categories

AI & Data Science

About EMNLP 2020

The 2020 Conference on Empirical Methods in Natural Language Processing
