            Coresets for Wasserstein Distributionally Robust Optimization Problems

            Dec 6, 2022

            Speakers

            Ruomin Huang

            Speaker · 0 followers

            Jiawei Huang

            Speaker · 0 followers

            Wenjie Liu

            Speaker · 0 followers

            About

            Wasserstein distributionally robust optimization (WDRO) is a popular model to enhance the robustness of machine learning with ambiguous data. However, the complexity of WDRO can be prohibitive in practice since solving its “minimax” formulation requires a great amount of computation. Recently, several fast WDRO training algorithms for some specific machine learning tasks (e.g., logistic regression) have been developed. However, the research on designing efficient algorithms for general large-sca…
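
            For context, the "minimax" formulation mentioned above is commonly written (standard WDRO notation; this is an assumption for illustration, not taken from the abstract) as

            \[
            \min_{\theta \in \Theta} \; \sup_{Q:\, W_p(Q, \hat{P}_n) \le \rho} \; \mathbb{E}_{\xi \sim Q}\big[\ell(\theta; \xi)\big],
            \]

            where \hat{P}_n is the empirical distribution of the training data, W_p is the order-p Wasserstein distance, \rho is the radius of the ambiguity ball, and \ell is the task loss (e.g., the logistic loss). The inner supremum over every distribution within Wasserstein distance \rho of the data is what makes the optimization computationally demanding at scale.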

            Organizer

            NeurIPS 2022

            Account · 962 followers

            Recommended Videos

            Presentations with a similar topic, category, or speaker

            DTG-SSOD: Dense Teacher Guidance for Semi-Supervised Object Detection
            01:02

            Gang Li, …

            NeurIPS 2022 · 2 years ago

            Calibration of Large Neural Weather Models
            08:57

            Andre Graubner, …

            NeurIPS 2022 · 2 years ago

            Neural Networks Efficiently Learn Low-Dimensional Representations with SGD
            04:47

            Alireza Mousavi-Hosseini, …

            NeurIPS 2022 · 2 years ago

            Neural Matching Fields: Implicit Representation of Matching Fields for Visual Correspondence
            05:15

            Sunghwan Hong

            NeurIPS 2022 · 2 years ago

            [Re] A Cluster-based Approach for Improving Isotropy in Contextual Embedding Space
            04:38

            Benjamin Džubur

            NeurIPS 2022 · 2 years ago

            Quantitatively Assessing Explainability in Collaborative Computational Co-Creativity
            04:13

            Michael Paul Clemens, …

            NeurIPS 2022 · 2 years ago

            Interested in talks like this? Follow NeurIPS 2022