
            On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry

            Dec 6, 2021

            Speakers

Andi Han
Speaker · 0 followers

Bamdev Mishra
Speaker · 0 followers

Pratik Kumar Jawanpuria
Speaker · 0 followers

            About

            In this paper, we comparatively analyze the Bures-Wasserstein (BW) geometry with the popular Affine-Invariant (AI) geometry for Riemannian optimization on the symmetric positive definite (SPD) matrix manifold. Our study begins with an observation that the BW metric has a linear dependence on SPD matrices in contrast to the quadratic dependence of the AI metric. We build on this to show that the BW metric is a more suitable and robust choice for several Riemannian optimization problems over ill-c…
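To make the linear-vs-quadratic contrast concrete, here is a minimal NumPy sketch (not the authors' code) of the two Riemannian gradients on the SPD manifold, assuming the standard expressions grad_AI f(X) = X sym(∇f(X)) X and grad_BW f(X) = 4 {sym(∇f(X)) X}_S; the objective f(X) = -log det(X) + tr(AX), the matrix A, and all helper names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sym(M):
    """Symmetrize a square matrix."""
    return 0.5 * (M + M.T)

def ai_grad(X, egrad):
    """Affine-Invariant Riemannian gradient: X sym(egrad) X
    (quadratic dependence on the SPD iterate X)."""
    return X @ sym(egrad) @ X

def bw_grad(X, egrad):
    """Bures-Wasserstein Riemannian gradient: 4 {sym(egrad) X}_S
    (linear dependence on the SPD iterate X)."""
    return 4.0 * sym(sym(egrad) @ X)

# Illustrative objective: f(X) = -log det(X) + tr(A X),
# with Euclidean gradient A - X^{-1}.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B @ B.T + 5 * np.eye(5)        # SPD data matrix (assumed)
X = np.eye(5)                      # current SPD iterate
egrad = A - np.linalg.inv(X)       # Euclidean gradient at X

print("AI gradient norm:", np.linalg.norm(ai_grad(X, egrad)))
print("BW gradient norm:", np.linalg.norm(bw_grad(X, egrad)))
```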

            Organizer


            NeurIPS 2021

Account · 1.9k followers

            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.


            Recommended Videos

Presentations on a similar topic, category, or speaker

Diversity is All You Need to Improve Bayesian Model Averaging · 06:31
Yashvir Singh Grewal, …
NeurIPS 2021 · 3 years ago

Scalable Intervention Target Estimation in Linear Models · 15:16
Burak Varici, …
NeurIPS 2021 · 3 years ago

Adversarial Robustness of Streaming Algorithms through Importance Sampling · 07:14
Vladimir Braverman, …
NeurIPS 2021 · 3 years ago

Capacity and Bias of Learned Geometric Embeddings for Directed Graphs · 14:56
Michael Boratko, …
NeurIPS 2021 · 3 years ago

Channel Permutations for N:M Sparsity · 12:41
Jeff Pool, …
NeurIPS 2021 · 3 years ago

Annotation Quality Framework - Accuracy, Credibility, and Consistency · 02:07
Liliya Lavitas, …
NeurIPS 2021 · 3 years ago
