
            Time-Consistent Semi-Supervised Learning

            Jul 12, 2020

Speakers

Tianyi Zhou
Shengjie Wang
Jeff A. Bilmes

            About

Semi-supervised learning (SSL) aims to leverage unlabeled data when training a model with scarce labeled data. A common SSL methodology is to enforce consistency of the model's outputs between similar samples. Combined with recent data augmentation methods, this approach has helped train neural networks to promising SSL performance. However, the model's outputs on unlabeled data can vary dramatically across training stages, especially under large learning rates. This may introduce unpredictable…
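The consistency objective described above can be sketched in a few lines. This is a generic, hypothetical illustration, not the paper's implementation: the `model`, `augment`, and loss-weighting choices below are placeholder assumptions, and the time-consistency refinement in the talk's title (comparing outputs across training iterations) is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)


def model(x, w):
    """Toy linear 'model' standing in for a neural network (hypothetical)."""
    return x @ w


def augment(x):
    """Weak augmentation: small additive noise, a stand-in for real augmentations."""
    return x + 0.01 * rng.standard_normal(x.shape)


def consistency_loss(x_unlabeled, w):
    """Mean squared difference between outputs on two augmented views.

    This is the generic consistency term used by many SSL methods:
    similar (augmented) inputs should produce similar outputs.
    """
    out_a = model(augment(x_unlabeled), w)
    out_b = model(augment(x_unlabeled), w)
    return float(np.mean((out_a - out_b) ** 2))


def total_loss(x_labeled, y_labeled, x_unlabeled, w, lam=1.0):
    """Supervised loss on labeled data plus lam-weighted consistency on unlabeled data."""
    supervised = float(np.mean((model(x_labeled, w) - y_labeled) ** 2))
    return supervised + lam * consistency_loss(x_unlabeled, w)
```

In practice both terms are minimized jointly by gradient descent; the consistency term requires no labels, which is what lets the unlabeled data contribute to training.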

Organizer

ICML 2020

Categories

AI & Data Science

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


            Recommended Videos

            Presentations on similar topic, category or speaker

• Imitation Learning Approach for AI Driving Olympics Trained on Real-world and Simulation Data Simultaneously (02:58) · Mikita Sazanovich, … · ICML 2020
• The Implicit Regularization of Stochastic Gradient Flow for Least Squares (16:14) · Alnur Ali, … · ICML 2020
• Are Hyperbolic Representations in Graphs Created Equal? (04:54) · Max Kochurov, … · ICML 2020
• Optimizer Benchmarking Needs to Account for Hyperparameter Tuning (11:54) · Prabhu Teja Sivaprasad, … · ICML 2020
• Barking up the right tree: an approach to search over molecule synthesis DAGs (05:11) · John Bradshaw, … · ICML 2020
• Provable Smoothness Guarantees for Black-Box Variational Inference (08:29) · Justin Domke · ICML 2020
