            Deciphering and Optimizing Multi-Task Learning: a Random Matrix Approach

            May 3, 2021

            Speakers

            Malik Tiomoko

            Hafiz T Ali

            Romain Couillet

            About

            This article provides theoretical insights into the inner workings of multi-task and transfer learning methods, by studying the tractable least-squares support vector machine multi-task learning (LS-SVM MTL) method, in the limit of large ($p$) and numerous ($n$) data. By a random matrix analysis applied to a Gaussian mixture data model, the performance of LS-SVM MTL is shown to converge, as $n,p\to\infty$, to a deterministic limit involving simple (small-dimensional) statistics of the data. We pr…
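
            To make the abstract's setting concrete, here is a minimal single-task sketch, assuming the standard function-estimation form of the LS-SVM (Suykens & Vandewalle) trained on a synthetic two-class Gaussian mixture with $n$ and $p$ of comparable size, the regime in which the random-matrix limit is taken. This is illustrative only, not the authors' multi-task formulation; the function name `lssvm_train`, the regularization `gamma`, and the mean direction `mu` are hypothetical choices, not taken from the talk.

            ```python
            import numpy as np

            def lssvm_train(X, y, gamma=1.0):
                # Solve the LS-SVM linear system (function-estimation form):
                #   [ 0   1^T          ] [ b     ]   [ 0 ]
                #   [ 1   K + I/gamma  ] [ alpha ] = [ y ]
                # with a linear kernel K = X X^T; the decision score of a
                # point x is f(x) = sum_i alpha_i <x_i, x> + b.
                n = X.shape[0]
                K = X @ X.T
                A = np.zeros((n + 1, n + 1))
                A[0, 1:] = 1.0
                A[1:, 0] = 1.0
                A[1:, 1:] = K + np.eye(n) / gamma
                rhs = np.concatenate(([0.0], y))
                sol = np.linalg.solve(A, rhs)
                return sol[0], sol[1:]  # bias b, dual coefficients alpha

            # Two-class Gaussian mixture with n and p of comparable size.
            rng = np.random.default_rng(0)
            p, n = 256, 512
            mu = 2.0 * np.ones(p) / np.sqrt(p)   # class-mean direction (hypothetical choice)
            y = rng.choice([-1.0, 1.0], size=n)
            X = y[:, None] * mu[None, :] + rng.standard_normal((n, p))

            b, alpha = lssvm_train(X, y, gamma=1.0)
            scores = X @ (X.T @ alpha) + b
            # Random matrix theory predicts that per-class score statistics
            # like these concentrate as n, p grow together.
            print("mean score, class +1:", scores[y > 0].mean())
            print("mean score, class -1:", scores[y < 0].mean())
            ```

            Rerunning with different seeds leaves the per-class score means essentially unchanged; the paper's random-matrix analysis characterizes exactly such deterministic limits in terms of small-dimensional statistics of the data.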

            Organizer

            ICLR 2021

            Categories

            AI & Data Science

            About ICLR 2021

            The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation learning, but generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.

            Recommended Videos

            Presentations on a similar topic, category, or speaker

            ABC Problem: An Investigation of Offline RL for Vision-Based Dynamic Manipulation (06:21)
            Kamyar Ghassemipour, … · ICLR 2021

            Approximate Nearest Neighbor Negative Contrastive Learning for Dense Text Retrieval (05:45)
            Lee Xiong, … · ICLR 2021

            FastSpeech 2: Fast and High-Quality End-to-End Text to Speech (07:01)
            Yi Ren, … · ICLR 2021

            Dependency Structure Misspecification in Multi-Source Weak Supervision Models (22:50)
            Salva Rühling Cachay, … · ICLR 2021

            Offline Model-based Optimization via Normalized Maximum Likelihood Estimation (07:37)
            Justin Fu, … · ICLR 2021

            Doing More with Less: Improving Robustness using Generated Data (02:59)
            Sven Gowal, … · ICLR 2021
