
            NGBoost: Natural Gradient Boosting for Probabilistic Prediction

            Jul 12, 2020

            Speakers

Tony Duan · Speaker

Anand Avati · Speaker

Daisy Ding · Speaker

            About

            We present Natural Gradient Boosting (NGBoost), an algorithm for generic probabilistic prediction via gradient boosting. Typical regression models return a point estimate, conditional on covariates, but probabilistic regression models output a full probability distribution over the outcome space, conditional on the covariates. This allows for predictive uncertainty estimation — crucial in applications like healthcare and weather forecasting. NGBoost generalizes gradient boosting to probabilistic…
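As a concrete illustration of the probabilistic regression described above, the sketch below uses the open-source ngboost Python package (pip install ngboost). This is a minimal example assuming that package's documented API; the dataset, the Normal output distribution, and the hyperparameters are illustrative choices, not details taken from the talk.

    # Minimal probabilistic regression sketch with the ngboost package.
    # Dataset, distribution, and hyperparameters are illustrative assumptions.
    from ngboost import NGBRegressor
    from ngboost.distns import Normal
    from sklearn.datasets import fetch_california_housing
    from sklearn.model_selection import train_test_split

    X, y = fetch_california_housing(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Boosted model whose prediction is a full Normal distribution per example.
    ngb = NGBRegressor(Dist=Normal, n_estimators=500, learning_rate=0.01, verbose=False)
    ngb.fit(X_train, y_train)

    y_point = ngb.predict(X_test)    # point estimates (distribution means)
    y_dist = ngb.pred_dist(X_test)   # full predictive distributions

    # Predictive uncertainty: per-example mean and standard deviation
    # (params dict for the Normal distribution, per the package docs).
    print(y_dist.params["loc"][:5])
    print(y_dist.params["scale"][:5])

    # Negative log-likelihood of the held-out data as a probabilistic score.
    print(-y_dist.logpdf(y_test).mean())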

            Organizer

            ICML 2020

            Account · 2.6k followers

            Categories

            Mathematics

            Category · 2.4k presentations

            AI & Data Science

            Category · 10.8k presentations

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

            Recommended Videos

Presentations on a similar topic, category, or speaker

Guided Learning of Nonconvex Models through Successive Functional Gradient Optimization
15:22 · Rie Johnson, … · ICML 2020

Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems
13:54 · Filip Hanzely, … · ICML 2020

Normalized Loss Functions for Deep Learning with Noisy Labels
16:00 · Xingjun Ma, … · ICML 2020

An end-to-end approach for the verification problem: learning the right distance
13:05 · João Monteiro, … · ICML 2020

Debiasing Evaluations That are Biased by Evaluations
12:12 · Jingyan Wang, … · ICML 2020

Partial Trace Regression and Low-Rank Kraus Decomposition
15:29 · Hachem Kadri, … · ICML 2020

            Interested in talks like this? Follow ICML 2020