            Improved Learning Rates of a Functional Lasso-type SVM with Sparse Multi-Kernel Representation

            Dec 6, 2021

            Speakers

            Shaogao Lv

            Junhui Wang

            Jiankun Liu

            About

            In this paper, we provide theoretical results on estimation bounds and excess risk upper bounds for the support vector machine (SVM) with a sparse multi-kernel representation. These convergence rates for the multi-kernel SVM are established by analyzing a Lasso-type regularized learning scheme within composite multi-kernel spaces. It is shown that the oracle rates of convergence of the classifiers depend on the complexity of the multi-kernels, the sparsity, a Bernstein condition, and the sample size, which sign…
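
            The abstract does not spell out the estimator, so the following is only a sketch of a standard Lasso-type multi-kernel SVM objective consistent with the description above, under assumed notation: hinge loss on a sample (x_i, y_i), i = 1, …, n, candidate kernels K_1, …, K_M with reproducing kernel Hilbert spaces \mathcal{H}_{K_m}, and a regularization parameter \lambda:

            \hat{f} \;=\; \arg\min_{f = f_1 + \cdots + f_M,\; f_m \in \mathcal{H}_{K_m}} \; \frac{1}{n} \sum_{i=1}^{n} \max\bigl\{0,\, 1 - y_i f(x_i)\bigr\} \;+\; \lambda \sum_{m=1}^{M} \lVert f_m \rVert_{\mathcal{H}_{K_m}}.

            The un-squared sum of RKHS norms acts as an ℓ1-type (group-Lasso) penalty over the kernels, driving many components f_m to zero and yielding the sparse multi-kernel representation for which the rates are stated.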

            Organizer

            NeurIPS 2021

            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.


            Recommended Videos

            Presentations on a similar topic, category, or speaker

            Representation Learning for Imitation with Contrastive Fourier Features · Ofir Nachum, … · NeurIPS 2021 · 15:06

            Learning Revenue-Maximizing Auctions With Differentiable Matching · Michael Curry, … · NeurIPS 2021 · 04:02

            Reinforcement Learning Benchmarks for Traffic Signal Control · James Ault, … · NeurIPS 2021 · 03:53

            Uncertainty Quantification and Deep Ensembles · Rahul Rahaman, … · NeurIPS 2021 · 14:41

            HELP: Hardware-adaptive Efficient Latency Prediction for NAS via Meta-Learning · Hayeon Lee, … · NeurIPS 2021 · 11:30

            Oral Session 5: Generative Modeling · NeurIPS 2021 · 1:45:06
