
            Memory-Efficient Approximation Algorithms for Max-k-Cut and Correlation Clustering

            Dec 6, 2021

Speakers

Nimita Shinde

Vishnu Narayanan

James Saunderson

            About

Max-k-Cut and correlation clustering are fundamental graph partitioning problems. The methods with the best approximation guarantees for Max-k-Cut and the Max-Agree variant of correlation clustering involve solving SDPs with n^2 variables and n^2 constraints (where n is the number of vertices). Large-scale instances of these SDPs thus present a memory bottleneck. In this paper, we develop simple polynomial-time Gaussian sampling-based algorithms for these two problems that use 𝒪(n+|E|) memory and…
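The abstract only sketches the approach, but the general idea of Gaussian sampling-based rounding for Max-k-Cut (in the spirit of the Frieze–Jerrum rounding scheme) can be illustrated as follows. This is a minimal sketch, not the paper's algorithm: the function names are illustrative, and it assumes a low-rank factor V of the SDP solution is available, which is what keeps memory near-linear instead of the 𝒪(n^2) needed for the full SDP matrix.

```python
import numpy as np

def gaussian_round_max_k_cut(V, k, rng=None):
    """Gaussian rounding sketch for Max-k-Cut (Frieze-Jerrum style).

    V : (n, d) array whose rows v_i factor an SDP solution X = V V^T,
        so only O(n*d) memory is needed rather than the full n x n matrix.
    k : number of parts.
    Returns an array of part labels in {0, ..., k-1}.
    """
    rng = np.random.default_rng(rng)
    n, d = V.shape
    G = rng.standard_normal((d, k))   # k independent Gaussian directions
    # Assign each vertex to the direction with the largest inner product.
    return np.argmax(V @ G, axis=1)

def cut_value(edges, labels):
    """Number of edges whose endpoints land in different parts."""
    return sum(1 for (u, v) in edges if labels[u] != labels[v])
```

For example, on a triangle graph with k = 3, the exact SDP solution has unit vectors at pairwise inner product -1/2, and the rounding usually separates all three vertices. Memory is dominated by V and the edge list, matching the 𝒪(n+|E|) flavor of the abstract only under the stated low-rank assumption.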

            Organizer

            NeurIPS 2021


            About NeurIPS 2021

Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia, and oral and poster presentations of refereed papers. Following the conference, there are workshops, which provide a less formal setting.


Recommended Videos

Presentations on similar topic, category or speaker

• ErrorCompensatedX: error compensation for variance reduced algorithms · Hanlin Tang, … · NeurIPS 2021 · 14:38
• Towards efficient end-to-end speech recognition with biologically-inspired neural networks · Thomas Bohnstingl, … · NeurIPS 2021 · 05:13
• Recursive Bayesian Networks: Generalising and Unifying Probabilistic Context-Free Grammars and Dynamic Bayesian Networks · R. Lieck, … · NeurIPS 2021 · 15:00
• Opening Remarks · Nathan Lambert · NeurIPS 2021 · 12:53
• Deep Generative Models and their Application in Meta-RL · Luisa Zintgraf · NeurIPS 2021 · 27:08
• Error Compensated Distributed SGD can be Accelerated · Xun Qian, … · NeurIPS 2021 · 08:18