
            Controlling Overestimation Bias with Truncated Mixture of Continuous Distributional Quantile Critics

            Jul 12, 2020

            Speakers

Arsenii Kuznetsov · Speaker · 0 followers

Pavel Shvechikov · Speaker · 0 followers

Alexander Grishin · Speaker · 0 followers

            About

According to previous studies, one of the major impediments to accurate off-policy learning is overestimation bias. This paper investigates a novel way to alleviate overestimation bias in a continuous control setting. Our method, Truncated Quantile Critics (TQC), blends three ideas: distributional representation of a critic, truncation of the critics' predictions, and ensembling of multiple critics. We show that all components are key for the achieved performance. Distributional representation c…
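The truncation idea from the abstract can be sketched as follows: each critic predicts a set of quantile atoms, all atoms are pooled into one mixture, and the largest atoms are dropped before forming the target. This is a minimal NumPy illustration under assumed shapes and a hypothetical `drop_per_net` hyperparameter, not the authors' implementation:

```python
import numpy as np

def tqc_target_atoms(critic_atoms, drop_per_net=2):
    """Pool quantile atoms from all critics, sort the mixture, and
    truncate the largest atoms to counteract overestimation bias.

    critic_atoms: array of shape (n_nets, n_quantiles) holding each
        critic's predicted quantile locations for one (state, action).
    drop_per_net: atoms dropped from the top, counted per critic
        (illustrative hyperparameter).
    """
    n_nets, n_quantiles = critic_atoms.shape
    pooled = np.sort(critic_atoms.reshape(-1))    # mix atoms from all critics
    keep = n_nets * (n_quantiles - drop_per_net)  # truncate the top of the mixture
    return pooled[:keep]

# Toy example: 2 critics with 5 atoms each; dropping 2 per critic keeps 6 atoms.
atoms = np.array([[1.0, 2.0, 3.0, 4.0, 5.0],
                  [1.5, 2.5, 3.5, 4.5, 5.5]])
target = tqc_target_atoms(atoms, drop_per_net=2)  # the 6 smallest pooled atoms
```

Because only the upper tail of the pooled mixture is discarded, the resulting target is biased downward by a controllable amount, which is how truncation counteracts the upward bias of off-policy value estimates.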

            Organizer


            ICML 2020

            Account · 2.7k followers

            Categories

            Mathematics

            Category · 2.4k presentations

            AI & Data Science

            Category · 10.8k presentations

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


            Recommended Videos

            Presentations on similar topic, category or speaker

• Problems with Shapley-value-based explanations as feature importance measures — I. Elizabeth Kumar, … · 13:56 · ICML 2020

• Train Big, Then Compress: Rethinking Model Size for Efficient Training and Inference of Transformers — Zhuohan Li, … · 15:21 · ICML 2020

• On the Power of Compressed Sensing with Generative Models — Akshay Kamath, … · 16:02 · ICML 2020

• 1min Intro (Placeholder AutoMLWS20) · ICML 2020

• Private Outsourced Bayesian Optimization — Dmitrii Kharkovskii, … · 15:09 · ICML 2020

• Transformer Hawkes Process — Simiao Zuo, … · 14:13 · ICML 2020
