            Non-convex Learning via Replica Exchange Stochastic Gradient MCMC

            Jul 12, 2020

Speakers

Wei Deng

Qi Feng

Faming Liang

            About

Replica exchange (RE), also known as parallel tempering, is an important technique for accelerating the convergence of conventional Markov chain Monte Carlo (MCMC) algorithms. However, the method requires evaluating the energy function on the full dataset and therefore does not scale to big data. A naïve mini-batch implementation of RE introduces large biases, so the method cannot be directly extended to stochastic gradient MCMC (SG-MCMC), the standard sampling method…
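The core idea the abstract describes — several chains at different temperatures, with occasional Metropolis-style swaps so the hot chain helps the cold chain escape local modes — can be illustrated with a minimal sketch. This is not the paper's bias-corrected algorithm; it is a plain two-replica Langevin sampler on a toy double-well energy, with full-batch (exact) energy evaluations in the swap test. All function and parameter names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # Toy non-convex (double-well) energy with minima near x = ±2;
    # stands in for a full-dataset loss.
    return 0.25 * x**4 - 2.0 * x**2

def grad_energy(x):
    return x**3 - 4.0 * x

def replica_exchange_ld(n_steps=5000, lr=1e-3, temps=(1.0, 10.0), swap_every=50):
    """Two Langevin chains at different temperatures with periodic swaps."""
    x = np.array([3.0, -3.0])   # one position per replica
    T = np.array(temps)         # T[0] is the low-temperature (target) chain
    samples = []
    for step in range(n_steps):
        # Langevin update per replica: gradient step + temperature-scaled noise.
        noise = rng.normal(size=2)
        x = x - lr * grad_energy(x) + np.sqrt(2.0 * lr * T) * noise
        # Periodically attempt a swap between the replicas.
        # Standard parallel-tempering acceptance: exp((1/T0 - 1/T1)(E0 - E1)).
        if step % swap_every == 0:
            dE = energy(x[0]) - energy(x[1])
            log_accept = (1.0 / T[0] - 1.0 / T[1]) * dE
            if np.log(rng.uniform()) < log_accept:
                x = x[::-1]
        samples.append(x[0])    # track the low-temperature chain
    return np.array(samples)

samples = replica_exchange_ld()
```

In the mini-batch setting the abstract is about, `energy` would be estimated from a noisy subsample, and it is exactly that noise in the swap test that introduces the large bias the paper addresses.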

            Organizer

ICML 2020

            Categories

Mathematics

Physics

AI & Data Science

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

            Recommended Videos

            Presentations on similar topic, category or speaker

Streaming Submodular Maximization under a k-Set System Constraint (14:59)
Ran Haba, … · ICML 2020

Towards Understanding the Regularization of Adversarial Robustness on Neural Networks (15:52)
Yuxin Wen, … · ICML 2020

Using Self-Supervised Learning of Birdsong for Downstream Industrial Audio Classification (10:58)
Patty Ryan, … · ICML 2020

Optimistic bounds for multi-output prediction (14:40)
Henry Reeve, … · ICML 2020

Real-Time Optimisation for Online Learning in Auctions (15:00)
Lorenzo Croissant, … · ICML 2020

Invited talk: Distributions for Parameters (37:39)
Nancy Reid · ICML 2020