
            Subspace Fitting Meets Regression: The Effects of Supervision and Orthonormality Constraints on Double Descent of Generalization Errors

            Jul 12, 2020

Speakers

Yehuda Dar
Speaker · 0 followers

Paul Mayer
Speaker · 0 followers

Lorenzo Luzi
Speaker · 0 followers

            About

            We study the linear subspace fitting problem in the overparameterized setting, where the estimated subspace can perfectly interpolate the training examples. Our scope includes the least-squares solutions to subspace fitting tasks with varying levels of supervision in the training data (i.e., the proportion of input-output examples of the desired low-dimensional mapping) and orthonormality of the vectors defining the learned operator. This flexible family of problems connects standard, unsupervis…
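The overparameterized regime described above can be illustrated with a minimal sketch (not the paper's code): when the number of parameters exceeds the number of training examples, the minimum-norm least-squares solution interpolates the training data exactly. The dimensions and random data below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# n training examples, d parameters; d > n puts us in the
# overparameterized (interpolating) regime.
n, d = 20, 50
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true  # noiseless targets, for clarity

# Minimum-norm least-squares solution via the pseudoinverse.
w_hat = np.linalg.pinv(X) @ y

# The training residual is numerically zero: the learned linear map
# perfectly interpolates the training examples.
train_residual = np.linalg.norm(X @ w_hat - y)
print(f"training residual: {train_residual:.2e}")
```

Among all interpolating solutions, the pseudoinverse picks the one with the smallest Euclidean norm; the paper's analysis concerns how the generalization error of such interpolating solutions behaves as overparameterization varies.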

Organizer

ICML 2020
Account · 2.7k followers

Categories

AI and Data Science
Category · 10.8k presentations

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


            Recommended Videos

            Presentations on similar topic, category or speaker

AutoAttack: reliable evaluation of adversarial robustness with an ensemble of diverse parameter-free attacks
05:10
Francesco Croce, …
ICML 2020 · 5 years ago

Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models
10:28
Lasse F. Wolff Anthony, …
ICML 2020 · 5 years ago

Word Embeddings to analyze Peruvian computing curriculums
05:07
Jeffri Erwin Murrugarra Llerena, …
ICML 2020 · 5 years ago

Convolutional dictionary learning based auto-encoders for natural exponential-family distributions
14:49
Bahareh Tolooshams, …
ICML 2020 · 5 years ago

Conditioning of Reinforcement Learning Agents and its Policy Regularization Application
08:05
Arip Asadulaev, …
ICML 2020 · 5 years ago

Invariant Rationalization
13:54
Shiyu Chang, …
ICML 2020 · 5 years ago