            On the Relation between Quality-Diversity Evaluation and Distribution-Fitting Goal in Text Generation

            Jul 12, 2020

Speakers

Jianing Li

Speaker · 0 followers

Yanyan Lan

Speaker · 0 followers

Jiafeng Guo

Speaker · 0 followers

About

The goal of text generation models is to fit the underlying real probability distribution of text. For performance evaluation, quality and diversity metrics are usually applied. However, it is still not clear to what extent the quality-diversity evaluation can reflect the distribution-fitting goal. In this paper, we try to reveal this relation through a theoretical approach. We prove that, under certain conditions, a linear combination of quality and diversity constitutes a divergence metric between t…
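The truncated claim above, that a linear combination of quality and diversity can act as a divergence between the generated and real distributions, can be illustrated on a toy discrete example. The sketch below is only a hypothetical illustration, not the paper's construction: it uses expected log-likelihoods as stand-in quality and diversity scores and checks that a particular linear combination of them, shifted by the two entropies, equals the symmetric KL divergence between the model and real distributions.

```python
import numpy as np

# Hypothetical toy illustration (not the paper's exact metrics):
# "quality"   = expected log-probability of model samples under the real
#               distribution p (how plausible generated text looks),
# "diversity" = expected log-probability of real samples under the model q
#               (how well the model covers the real data).
# Their negated sum, shifted by the two entropies, equals the symmetric
# KL divergence KL(p||q) + KL(q||p), i.e. a linear combination of
# quality- and diversity-style scores behaves like a divergence.

def entropy(p):
    return -np.sum(p * np.log(p))

def kl(p, q):
    return np.sum(p * np.log(p / q))

rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(10))  # "real" distribution over a toy vocabulary
q = rng.dirichlet(np.ones(10))  # model distribution

quality = np.sum(q * np.log(p))    # higher when model samples are likely under p
diversity = np.sum(p * np.log(q))  # higher when real samples are likely under q

combined = -(quality + diversity) - entropy(p) - entropy(q)
sym_kl = kl(p, q) + kl(q, p)
assert np.isclose(combined, sym_kl)  # the combination is a (symmetric) divergence
print(combined, sym_kl)
```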

Organizer

ICML 2020

Account · 2.7k followers

Categories

AI & Data Science

Category · 10.8k presentations

About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

Like the format? Trust SlidesLive to capture your next event!

Professional recording and live streaming, worldwide.

Recommended Videos

Presentations with a similar topic, category, or speaker

            Energy-Based Models for Object-Oriented Learning
            42:02

Igor Mordatch

ICML 2020 5 years ago

            Learning to Rank Learning Curves
            15:34

Martin Wistuba, …

ICML 2020 5 years ago

            Inferring DQN structure for high-dimensional continuous control
            12:14

Andrey Sakryukin, …

ICML 2020 5 years ago

            FetchSGD: Communication-Efficient Federated Learning with Sketching
            15:26

Daniel Rothchild, …

ICML 2020 5 years ago

            Optimal Sequential Maximization: One Interview is Enough!
            16:44

Alon Orlitsky, …

ICML 2020 5 years ago

            Poster #52

            Avisha Das

ICML 2020 5 years ago

Interested in talks like this? Follow ICML 2020