
            TGEA 2.0: A Large-Scale Diagnostically Annotated Dataset with Benchmark Tasks for Text Generation of Pretrained Language Models

            Nov 28, 2022

Speakers

Huibin Ge

Xiaohu Zhao

Chuang Liu

            About

            In order to diagnostically analyze and improve the capability of pretrained language models (PLMs) in text generation, we propose TGEA 2.0, to date the largest dataset built on machine-authored texts by PLMs with fine-grained semantic annotations on a wide variety of pathological generation errors. We collect 170K nominal, phrasal and sentential prompts from 6M natural sentences in 3 domains. These prompts are fed into 4 generative PLMs with their best decoding strategy to generate paragraphs. …
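The pipeline the abstract describes (collect prompts from natural sentences, feed them into generative PLMs to produce machine-authored paragraphs) can be sketched roughly as follows. This is a simplified illustration, not the paper's actual code: the prefix-based prompt extraction stands in for the paper's nominal/phrasal/sentential prompts, and `generate_continuation` is a placeholder for a real PLM decode call; all function names here are hypothetical.

```python
# Rough sketch of a TGEA-style data pipeline: extract prompts from
# natural sentences, then have a generative model continue each prompt.
# Both steps are placeholders for the paper's real extraction and PLMs.

def extract_prompts(sentences, max_words=5):
    """Take a short prefix of each sentence as a (simplified) prompt."""
    prompts = []
    for s in sentences:
        words = s.split()
        if len(words) >= max_words:
            prompts.append(" ".join(words[:max_words]))
    return prompts

def generate_continuation(prompt):
    """Stand-in for a PLM decode call (e.g. greedy or top-p sampling)."""
    return prompt + " ..."  # a real PLM would generate a paragraph here

def build_machine_authored_corpus(sentences):
    """Prompts -> generated texts, which annotators would then label."""
    return [generate_continuation(p) for p in extract_prompts(sentences)]

corpus = build_machine_authored_corpus([
    "Pretrained language models sometimes generate pathological text.",
    "Short.",
])
print(corpus)
```

In the actual dataset the generated paragraphs are then annotated with fine-grained error labels; that annotation stage is not sketched here.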

            Organizer

NeurIPS 2022


            Recommended Videos

            Presentations on similar topic, category or speaker

Researchers Comparing DNNs to Brains Need to Adopt Standard Methods of Science. (30:37)
Jeffrey Bowers · NeurIPS 2022

Engineering Uncertainty Representations to Monitor Distribution Shifts (07:15)
Thomas Bonnier, … · NeurIPS 2022

Cost-Sensitive Self-Training for Optimizing Non-Decomposable Metrics (01:01)
Harsh Rangwani, … · NeurIPS 2022

AutoST: Towards the Universal Modeling of Spatio-temporal Sequences (01:04)
Jianxin Li, … · NeurIPS 2022

Contrastive Graph Structure Learning via Information Bottleneck for Recommendation (04:59)
Chunyu Wei, … · NeurIPS 2022

Old can be Gold: Better Gradient Flow can make Vanilla-GCNs Great Again (05:00)
Ajay Jaiswal, … · NeurIPS 2022
