
            Why Resampling Outperforms Reweighting for Correcting Sampling Bias with Stochastic Descents

            May 3, 2021

Speakers

Jing An

Lexing Ying

Yuhua Zhu

            About

            A data set sampled from a certain population is biased if the subgroups of the population are sampled at proportions that are significantly different from their underlying proportions. Training machine learning models on biased data sets requires correction techniques to compensate for the bias. We consider two commonly-used techniques, resampling and reweighting, that rebalance the proportions of the subgroups to maintain the desired objective function. Though statistically equivalent, it has b…
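The contrast described above can be made concrete with a small numerical sketch (not the authors' code; the subgroup sizes, proportions, and means below are made up for illustration). Two subgroups appear in a biased sample at 90/10 instead of their true 50/50 population proportions. Reweighting keeps the biased sample and weights each point by (true proportion) / (sampled proportion); resampling instead draws each minibatch with the subgroups at their true proportions. Both target the same population quantity, which is the sense in which they are statistically equivalent:

```python
import random

rng = random.Random(0)
group_a = [rng.gauss(0.0, 1.0) for _ in range(900)]  # oversampled subgroup
group_b = [rng.gauss(4.0, 1.0) for _ in range(100)]  # undersampled subgroup

true_props = {"a": 0.5, "b": 0.5}     # underlying population proportions
sample_props = {"a": 0.9, "b": 0.1}   # proportions in the biased sample

# Reweighting: weight each point by true/sampled subgroup proportion.
w = {g: true_props[g] / sample_props[g] for g in true_props}
reweighted_mean = (
    sum(w["a"] * x for x in group_a) + sum(w["b"] * x for x in group_b)
) / (w["a"] * len(group_a) + w["b"] * len(group_b))

# Resampling: draw minibatches with subgroups at their true proportions.
def resampled_batch(batch_size=64):
    batch = []
    for _ in range(batch_size):
        g = rng.choices(["a", "b"], weights=[0.5, 0.5])[0]
        batch.append(rng.choice(group_a if g == "a" else group_b))
    return batch

resampled_mean = sum(sum(resampled_batch()) for _ in range(500)) / (500 * 64)

# Both estimators recover the population mean (about 2.0 here), even though
# the biased sample mean is near 0.4. The talk's point is that, despite this
# equivalence in expectation, the two corrections induce different stochastic
# gradient dynamics, and resampling behaves better.
```

The weighted mean uses the whole biased sample in one deterministic average, while the resampled estimate is built from random minibatches, mirroring how each correction interacts with stochastic gradient updates.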

Organizer

ICLR 2021

Categories

AI & Data Science

            About ICLR 2021

The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to advancing representation learning, the branch of artificial intelligence generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics, and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.

