
            Greedy and Random Quasi-Newton Methods with Faster Explicit Superlinear Convergence

            Dec 6, 2021

Speakers

Dachao Lin

Haishan Ye

Zhihua Zhang

            About

In this paper, we follow Rodomanov and Nesterov's work to study quasi-Newton methods. We focus on the common SR1 and BFGS quasi-Newton methods to establish better explicit (local) superlinear convergence. First, based on the greedy quasi-Newton update, which greedily selects the direction that maximizes a certain measure of progress, we improve the convergence rate to a condition-number-free superlinear rate. Second, based on the random quasi-Newton update, which selects the direction randoml…
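
The greedy update described above can be made concrete in its simplest setting: approximating a fixed positive-definite matrix A. Below is a minimal NumPy sketch (not taken from the paper; the setup and function names are illustrative) of the greedy SR1 variant. At each step it selects the coordinate direction with the largest ratio of diagonal entries of G and A, then applies the rank-one SR1 correction along that direction, which matches A exactly within n steps.

```python
import numpy as np

def greedy_direction(G, A):
    """Greedy rule (in the style of Rodomanov & Nesterov): pick the coordinate
    vector e_i maximizing (e_i^T G e_i) / (e_i^T A e_i), i.e. the largest
    ratio of diagonal entries."""
    i = np.argmax(np.diag(G) / np.diag(A))
    u = np.zeros(A.shape[0])
    u[i] = 1.0
    return u

def sr1_update(G, A, u):
    """SR1 update of the approximation G toward A along direction u:
    G+ = G - (G - A) u u^T (G - A) / (u^T (G - A) u)."""
    r = (G - A) @ u
    denom = u @ r
    if abs(denom) < 1e-12:   # direction already matched exactly; skip the update
        return G
    return G - np.outer(r, r) / denom

# Toy usage: approximate a fixed SPD "Hessian" A starting from G0 = L * I >= A.
rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)                  # symmetric positive-definite target
G = np.linalg.eigvalsh(A)[-1] * np.eye(n)    # initial approximation dominating A
for _ in range(n):
    G = sr1_update(G, A, greedy_direction(G, A))
print(np.linalg.norm(G - A))                 # ~0: SR1 recovers A within n greedy steps
```

In the actual optimization setting, A would be the Hessian at (or near) the current iterate rather than a fixed matrix; the sketch only illustrates the greedy direction choice and the SR1 correction it drives.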

            Organizer

NeurIPS 2021

            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.


            Recommended Videos

Presentations on a similar topic, category, or speaker

FLoRA: Single-shot Hyper-parameter Optimization for Federated Learning
14:07 · Yi Zhou, … · NeurIPS 2021

A Law of Iterated Logarithm for Multi-Agent Reinforcement Learning
15:06 · Gugan Thoppe, … · NeurIPS 2021

Controllable and Compositional Generation with Latent-Space Energy-Based Models
13:13 · Weili Nie, … · NeurIPS 2021

Exploiting Proximity Search and Easy Examples to Select Rare Events
01:54 · Daniel Kang, … · NeurIPS 2021

Near-optimal Offline and Streaming Algorithms for Learning Non-Linear Dynamical Systems
14:43 · Prateek Jain, … · NeurIPS 2021

Oral Session 3: Theory
1:32:58 · NeurIPS 2021
