Landscape Connectivity and Dropout Stability of SGD Solutions for Over-parameterized Neural Networks


            Jul 12, 2020

            Speakers

Alexander Shevchenko

Marco Mondelli

            About

            The optimization of multilayer neural networks typically leads to a solution with zero training error, yet the landscape can exhibit spurious local minima and the minima can be disconnected. In this paper, we shed light on this phenomenon: we show that the combination of stochastic gradient descent (SGD) and over-parameterization makes the landscape of multilayer neural networks approximately connected and thus more favorable to optimization. More specifically, we prove that SGD solutions are co…
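The two quantities the abstract refers to can be made concrete with a small experiment: measure the training loss along a path connecting two independently trained solutions, and the loss after dropping out half the hidden neurons. The sketch below is an illustrative toy, not the paper's construction: it uses plain full-batch gradient descent as a stand-in for SGD, a 1-D regression task, and a straight-line interpolation rather than the piecewise-linear paths of the paper; the network sizes, learning rate, and rescaling convention are all hypothetical choices.

```python
import numpy as np

# Toy data: 1-D regression of y = sin(3x) on [-1, 1].
X = np.linspace(-1.0, 1.0, 64)[:, None]
y = np.sin(3.0 * X)

H = 256  # hidden width, over-parameterized relative to 64 samples

def init_params(seed):
    r = np.random.default_rng(seed)
    return {
        "W1": r.normal(0.0, 1.0, (1, H)),
        "b1": np.zeros(H),
        "W2": r.normal(0.0, 1.0 / np.sqrt(H), (H, 1)),
    }

def forward(p, X):
    h = np.maximum(X @ p["W1"] + p["b1"], 0.0)  # ReLU hidden layer
    return h, h @ p["W2"]

def loss(p, X, y):
    _, out = forward(p, X)
    return float(np.mean((out - y) ** 2))

def train(p, X, y, lr=0.05, steps=3000):
    # Full-batch gradient descent on the squared loss.
    for _ in range(steps):
        h, out = forward(p, X)
        g_out = 2.0 * (out - y) / len(X)       # dL/d(out)
        g_W2 = h.T @ g_out
        g_h = (g_out @ p["W2"].T) * (h > 0)    # backprop through ReLU
        p["W2"] -= lr * g_W2
        p["W1"] -= lr * (X.T @ g_h)
        p["b1"] -= lr * g_h.sum(axis=0)
    return p

# Two SGD-like solutions from independent random initializations.
pA = train(init_params(1), X, y)
pB = train(init_params(2), X, y)

# Loss along the straight line between the two solutions; the "barrier"
# is the largest loss encountered on the path.
ts = np.linspace(0.0, 1.0, 11)
path_losses = [
    loss({k: (1 - t) * pA[k] + t * pB[k] for k in pA}, X, y)
    for t in ts
]

# Dropout stability check: keep a random half of the hidden units and
# double the surviving output weights to preserve the output in expectation.
mask = np.random.default_rng(0).random(H) < 0.5
pD = {"W1": pA["W1"] * mask, "b1": pA["b1"] * mask,
      "W2": pA["W2"] * (2.0 * mask)[:, None]}

print("loss A:", loss(pA, X, y))
print("barrier (max path loss):", max(path_losses))
print("dropout loss:", loss(pD, X, y))
```

With independently trained networks a naive straight-line path typically does show a barrier; the paper's point is that with enough over-parameterization a piecewise-linear path of vanishing excess loss exists, and dropout stability is the key ingredient in constructing it.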

            Organizer

ICML 2020

            Categories

            AI & Data Science


            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


            Recommended Videos

            Presentations on similar topic, category or speaker

• Normalized Flat Minima: Exploring Scale Invariant Definition of Flat Minima for Neural Networks Using PAC-Bayesian Analysis (15:02) · Yusuke Tsuzuku, … · ICML 2020

• Deep Reasoning Networks for Unsupervised Pattern De-mixing with Constraint Reasoning (13:12) · Di Chen, … · ICML 2020

• Attention Option-Critic (05:53) · Raviteja Chunduru, … · ICML 2020

• Learning Affordances in Object-Centric Generative Models (11:46) · Sudhansu Kasewa, … · ICML 2020

• Efficient non-conjugate Gaussian process factor models for spike count data using polynomial approximations (17:15) · Stephen Keeley, … · ICML 2020

• Random Hypervolume Scalarizations for Provable Multi-Objective Black Box Optimization (16:39) · Daniel Golovin, … · ICML 2020
