
            Logarithmic Regret for Learning Linear Quadratic Regulators Efficiently

            Jul 12, 2020

            Speakers

            Asaf Cassel

            Speaker · 0 followers

            Alon Cohen

            Speaker · 0 followers

            Tomer Koren

            Speaker · 0 followers

            About

We consider the problem of learning in Linear Quadratic Control systems whose transition parameters are initially unknown. Recent results in this setting have demonstrated efficient learning algorithms with regret growing with the square root of the number of decision steps. We present new efficient algorithms that achieve, perhaps surprisingly, regret that scales only (poly-)logarithmically with the number of steps, in two scenarios: when only the state transition matrix A is unknown, and when o…
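
For readers unfamiliar with the setup, here is a rough sketch in standard LQR notation (the symbols Q, R, w_t and J^* below are conventional choices, not taken from the talk itself). The state x_t evolves under a control input u_t as

    x_{t+1} = A x_t + B u_t + w_t,

where w_t is random noise, and the learner pays a quadratic per-step cost

    c_t = x_t^\top Q x_t + u_t^\top R u_t.

Regret after T steps compares the accumulated cost to that of the optimal controller that knows the true (A, B):

    R_T = \sum_{t=1}^{T} c_t - T J^*,

with J^* the optimal average cost. Prior algorithms in this setting guarantee R_T = O(\sqrt{T}); the result above is that R_T can be made to scale only polylogarithmically in T in the stated scenarios.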

            Organizer

            ICML 2020

            Account · 2.7k followers

            Categories

            Mathematics

            Category · 2.4k presentations

            AI & Data Science

            Category · 10.8k presentations

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

            Recommended Videos

            Presentations on similar topic, category or speaker

Meta-Reinforcement Learning for Robotic Industrial Insertion Tasks
03:27
Gerrit Schoettler, …
ICML 2020 · 5 years ago

Boosted Histogram Transform for Regression
13:38
Yuchao Cai, …
ICML 2020 · 5 years ago

Sparse Subspace Clustering with Entropy-Norm
11:50
Liang Bai
ICML 2020 · 5 years ago

Dynamics of Deep Neural Networks and Neural Tangent Hierarchy
15:27
Jiaoyang Huang, …
ICML 2020 · 5 years ago

Catch Me if I Can: Detecting Strategic Behaviour in Peer Assessment
07:25
Ivan Stelmakh, …
ICML 2020 · 5 years ago

Average-case Acceleration Through Spectral Density Estimation
11:33
Fabian Pedregosa, …
ICML 2020 · 5 years ago