            Minimax Rate for Learning From Pairwise Comparisons in the BTL Model

            Jul 12, 2020

            Speakers

            Alex Olshevsky

            Julien Hendrickx

            Venkatesh Saligrama

            About

            We consider the problem of learning the qualities w_1, ... , w_n of a collection of items by performing noisy comparisons among them. We assume there is a fixed “comparison graph” and every neighboring pair of items is compared k times. We will study the popular Bradley-Terry-Luce model, where the probability that item i wins a comparison against j equals w_i/(w_i + w_j). We are interested in how the expected error in estimating the vector w = (w_1, ... , w_n) behaves in the regime when the numb…
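
            The abstract above specifies the comparison model concretely: for a pair (i, j) joined by an edge of the comparison graph, each of the k comparisons is an independent trial that item i wins with probability w_i/(w_i + w_j). As a minimal illustrative sketch (not part of the talk), the following Python snippet simulates these comparisons; the function name simulate_btl, the example graph, and the quality vector are assumptions made here purely for demonstration.

            import numpy as np

            def simulate_btl(w, edges, k, rng=None):
                """Simulate k Bradley-Terry-Luce comparisons per edge of a comparison graph.

                w     : array of positive item qualities (w_1, ..., w_n)
                edges : list of (i, j) pairs, the neighboring items that are compared
                k     : number of comparisons per edge
                Returns a dict mapping (i, j) to the number of times item i beat item j.
                """
                rng = np.random.default_rng() if rng is None else rng
                wins = {}
                for i, j in edges:
                    # BTL model: item i beats item j with probability w_i / (w_i + w_j)
                    p_ij = w[i] / (w[i] + w[j])
                    wins[(i, j)] = rng.binomial(k, p_ij)
                return wins

            # Hypothetical example: a path graph on 4 items, 50 comparisons per edge
            w = np.array([1.0, 2.0, 0.5, 1.5])
            edges = [(0, 1), (1, 2), (2, 3)]
            print(simulate_btl(w, edges, k=50))

            An estimator of the quality vector w (for instance, the maximum-likelihood estimator studied in this line of work) would then be computed from such win counts; that step is omitted from this sketch.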

            Organizer

            ICML 2020

            Categories

            AI & Data Science

            Mathematics

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

            Recommended Videos

            Presentations on a similar topic, in the same category, or by the same speakers

            Application of Bayesian Techniques to Multi-omic Longitudinal Data · 09:35
            Daniel Ruiz-Perez · ICML 2020

            R2-B2: Recursive Reasoning-Based Bayesian Optimization for No-Regret Learning in Games · 15:20
            Zhongxiang Dai, … · ICML 2020

            Variable Skipping for Autoregressive Range Density Estimation · 13:00
            Eric Liang, … · ICML 2020

            Poster #54
            Srija Chakraborty · ICML 2020

            Learning Retrosynthetic Planning with Chemical Reasoning · 05:17
            Binghong Chen, … · ICML 2020

            Collapsed Amortized Variational Inference for Switching Nonlinear Dynamical Systems · 16:11
            Zhe Dong, … · ICML 2020
