
            How Negative Dependence Broke the Quadratic Barrier for Learning with Graphs and Kernels

            Jun 14, 2019

            Speakers


            Michal Valko

            Speaker · 2 followers

            About

As we gain computational resources, we move from reasoning about single entities to reasoning about pairs and groups. We have beautiful frameworks for this: graphs, kernels, DPPs. However, the methods that work with pairs, relationships, and similarity are slow. Kernel regression, second-order gradient methods, and sampling from DPPs do not scale to large data because of the costly construction and storage of the n×n kernel matrix K_n. Prior work showed that sampling points according to their ridge leverage scores (RLS) generates…
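The quadratic bottleneck the abstract refers to can be made concrete: exact ridge leverage scores are the diagonal of K(K + λI)⁻¹, which requires building and solving against the full n×n kernel matrix. A minimal sketch of the exact (non-scalable) computation follows; this is not the sublinear algorithm of the talk, and the RBF kernel, toy data, and regularization λ below are illustrative assumptions.

```python
import numpy as np

def ridge_leverage_scores(K, lam):
    """Exact ridge leverage scores tau_i = [K (K + lam*I)^{-1}]_{ii}.

    Forming K already costs O(n^2) time and memory, and the solve adds
    O(n^3) time -- the quadratic barrier that approximate RLS sampling
    is designed to avoid.
    """
    n = K.shape[0]
    # K and K + lam*I commute, so (K + lam*I)^{-1} K is symmetric PSD
    # with eigenvalues mu_j / (mu_j + lam) in [0, 1).
    return np.diag(np.linalg.solve(K + lam * np.eye(n), K))

# Toy RBF (Gaussian) kernel on random 1-D points, for illustration only.
rng = np.random.default_rng(0)
x = rng.standard_normal(50)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
tau = ridge_leverage_scores(K, lam=1e-2)
```

The sum of the scores is the "effective dimension" of the kernel at level λ, which governs how many samples RLS-based sketches need.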

            Organizer


            ICML 2019

            Account · 3.2k followers

            Categories

            Mathematics

            Category · 2.4k presentations

            AI & Data Science

            Category · 10.8k presentations

            About ICML 2019

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


            Recommended Videos

            Presentations on similar topic, category or speaker

On Two Ways to use Determinantal Point Processes for Monte Carlo Integration (20:57)

Guillaume Gautier

ICML 2019 · 6 years ago

Learning to Drive with Purpose (23:21)

Alexander Amini

ICML 2019 · 6 years ago

Causal Inference and Stable Learning (2:09:37)

Peng Cui

ICML 2019 · 6 years ago

Enhancing Gradient-based Attacks with Symbolic Intervals (19:08)

Shiqi Wang

ICML 2019 · 6 years ago

From Listening to Watching, A Recommender Systems Perspective (48:32)

Yves Raimond

ICML 2019 · 6 years ago

Tampered Speaker Inconsistency Detection with Phonetically Aware Audio-visual Features (14:43)

Mahesh Nandwana

ICML 2019 · 6 years ago
