
            Correlation Clustering with Asymmetric Classification Errors

            Jul 12, 2020

Speakers

Jafar Jafarov
Sanchit Kalhan
Konstantin Makarychev

            About

In the Correlation Clustering problem, we are given a weighted graph G with its edges labelled as "similar" or "dissimilar" by a binary classifier. The goal is to produce a clustering that minimizes the weight of "disagreements": the sum of the weights of "similar" edges across clusters and "dissimilar" edges within clusters. We study the Correlation Clustering problem under the following assumption: every "similar" edge e has weight w_e ∈ [αw, w] and every "dissimilar" edge e has weight w_e…
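The disagreement objective described in the abstract can be sketched in a few lines. This is an illustrative Python implementation on made-up example data, not code from the presentation:

```python
def disagreement_cost(edges, cluster_of):
    """Sum the weights of 'similar' edges that cross clusters and
    'dissimilar' edges that stay within a cluster."""
    cost = 0.0
    for u, v, label, weight in edges:
        same_cluster = cluster_of[u] == cluster_of[v]
        if label == "similar" and not same_cluster:
            cost += weight          # similar pair split apart
        elif label == "dissimilar" and same_cluster:
            cost += weight          # dissimilar pair kept together
    return cost

# Hypothetical classifier output on 4 nodes: (u, v, label, w_e)
edges = [
    (0, 1, "similar", 1.0),
    (1, 2, "dissimilar", 0.5),
    (2, 3, "similar", 0.8),
    (0, 3, "dissimilar", 1.0),
]
clustering = {0: "A", 1: "A", 2: "B", 3: "B"}
print(disagreement_cost(edges, clustering))  # 0.0: this clustering has no disagreements
```

Under the paper's asymmetric-weight assumption, the "similar" edge weights here would additionally be constrained to lie in [αw, w].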

            Organizer


            ICML 2020

            Account · 2.7k followers

            Categories

            Mathematics

            Category · 2.4k presentations

            AI & Data Science

            Category · 10.8k presentations

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


Recommended Videos

Presentations on similar topic, category or speaker

Adaptive Adversarial Multi-task Representation Learning · Yuren Mao, … · ICML 2020 · 13:23
Gradient Temporal-Difference Learning with Regularized Corrections · Sina Ghiassian, … · ICML 2020 · 10:56
Deep Graph Library: an Update · Zheng Zhang · ICML 2020 · 10:14
Few-shot Domain Adaptation by Causal Mechanism Transfer · Takeshi Teshima, … · ICML 2020 · 11:28
Planning to Explore via Self-Supervised World Models · Ramanan Sekar, … · ICML 2020 · 10:51
Superpolynomial Lower Bounds on Learning One-Layer Neural Nets with Gradient Descent · Surbhi Goel, … · ICML 2020 · 14:37

Interested in talks like this? Follow ICML 2020