The role of regularization in classification of high-dimensional noisy Gaussian mixture

Jul 12, 2020

Speakers

Francesca Mignacco
Florent Krzakala
Yue M. Lu

About

We consider a high-dimensional mixture of two Gaussians in the noisy regime, where even an oracle knowing the centers of the clusters misclassifies a small but finite fraction of the points. We provide a rigorous analysis of the generalization error of regularized convex classifiers, including ridge, hinge and logistic regression, in the high-dimensional limit where the number n of samples and their dimension d go to infinity while their ratio is fixed to α=n/d. We discuss surprising effects of…
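The setup above can be sketched numerically. The snippet below is a minimal illustration, not the paper's analysis: it assumes symmetric cluster means ±μ with ‖μ‖ ≈ 1, isotropic noise of level σ, and illustrative values d = 200, α = 2, λ = 1. It trains one of the classifiers mentioned in the abstract (ridge), estimates its generalization error on fresh samples, and compares it to the oracle error Q(‖μ‖/σ), which for this symmetric model is the Gaussian tail probability.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

d = 200                # input dimension (illustrative)
alpha = 2.0            # fixed ratio alpha = n / d
n = int(alpha * d)     # number of training samples
sigma = 1.0            # noise level: clusters overlap, oracle error is finite

# Cluster mean direction with norm ~ 1 (assumed normalization).
mu = rng.standard_normal(d) / sqrt(d)

def sample(m):
    """Draw m points from the two-cluster mixture: x = y * mu + sigma * z."""
    y = rng.choice([-1.0, 1.0], size=m)
    X = y[:, None] * mu[None, :] + sigma * rng.standard_normal((m, d))
    return X, y

X, y = sample(n)

# Ridge-regularized least squares: w = (X^T X + lam * I)^{-1} X^T y.
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Empirical generalization error on fresh samples.
Xt, yt = sample(20000)
test_err = float(np.mean(np.sign(Xt @ w) != yt))

# Oracle error: the best classifier projects onto the true direction mu and
# misclassifies with probability Q(|mu| / sigma), the Gaussian tail function.
oracle_err = 0.5 * (1.0 - erf(np.linalg.norm(mu) / (sigma * sqrt(2.0))))

print(f"ridge test error = {test_err:.3f}")
print(f"oracle error     = {oracle_err:.3f}")
```

Even this crude experiment shows the gap the abstract alludes to: the regularized classifier's test error sits above the oracle floor, and how close it gets depends on α and on the regularization strength λ.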

Organizer

ICML 2020

Categories

AI & Data Science

About ICML 2020

The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


Recommended Videos

Presentations on similar topic, category or speaker:

• Bayesian Sparsification of Deep Complex-valued networks (15:22) · Ivan Nazarov, … · ICML 2020
• Causal Feature Discovery Through Strategic Modification (11:49) · Yahav Bechavod, … · ICML 2020
• Deep Active Learning Toward Crisis-related Tweets Classification (16:57) · Shiva Ebrahimi, … · ICML 2020
• Discount Factor as a Regularizer in Reinforcement Learning (14:46) · Ron Amit, … · ICML 2020
• Wandering Within a World: Online Contextualized Few-Shot Learning (17:17) · Mengye Ren, … · ICML 2020
• Generalization Guarantees for Sparse Kernel Approximation with Entropic Optimal Features (15:02) · Liang Ding, … · ICML 2020