            Greedy and Random Quasi-Newton Methods with Faster Explicit Superlinear Convergence

December 6, 2021

Speakers

Dachao Lin
Speaker · 0 followers

Haishan Ye
Speaker · 0 followers

Zhihua Zhang
Speaker · 0 followers

About the presentation

In this paper, we follow Rodomanov and Nesterov's work to study quasi-Newton methods. We focus on the common SR1 and BFGS quasi-Newton methods to establish better explicit (local) superlinear convergence. First, based on the greedy quasi-Newton update, which greedily selects the direction so as to maximize a certain measure of progress, we improve the convergence rate to a condition-number-free superlinear convergence rate. Second, based on the random quasi-Newton update that selects the direction randoml…
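
As a rough illustration of the greedy update described in the abstract, here is a minimal NumPy sketch, not the authors' code: the helper names greedy_direction and bfgs_update, the starting matrix, and the synthetic quadratic test problem are our own assumptions. It picks the coordinate direction maximizing the ratio u^T G u / u^T A u and applies a BFGS-type update of the Hessian approximation G toward a fixed target matrix A.

```python
import numpy as np

def greedy_direction(A, G):
    """Return the coordinate vector e_i maximizing e_i^T G e_i / e_i^T A e_i,
    a greedy measure of progress for updating G toward A."""
    i = int(np.argmax(np.diag(G) / np.diag(A)))
    e = np.zeros(A.shape[0])
    e[i] = 1.0
    return e

def bfgs_update(A, G, u):
    """BFGS-type update of the approximation G along direction u,
    taken with respect to the target matrix A."""
    Gu, Au = G @ u, A @ u
    return G - np.outer(Gu, Gu) / (u @ Gu) + np.outer(Au, Au) / (u @ Au)

# Tiny demo on a synthetic quadratic: G contracts toward the target Hessian A.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)          # positive-definite target Hessian (assumed)
G = np.trace(A) * np.eye(5)      # conservative start with G >= A
for _ in range(20):
    G = bfgs_update(A, G, greedy_direction(A, G))
print(np.linalg.norm(G - A))     # approximation error after 20 greedy updates
```

The random variant mentioned in the abstract would replace greedy_direction with a randomly drawn direction (for example rng.standard_normal(n) in this sketch); the paper's exact sampling distribution and analysis are not reproduced here.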

Organizer


            NeurIPS 2021

Account · 1.9k followers

About the organizer (NeurIPS 2021)

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.

Recommended videos

Presentations on a similar topic, in the same category, or by the same speaker

Designing Molecular Models with Machine Learning and Experimental Data · 19:27
Cecilia Clementi
NeurIPS 2021 · 3 years ago

Closing Remarks · 05:00
Andres Munoz
NeurIPS 2021 · 3 years ago

Safe Reinforcement Learning by Imagining the Near Future · 06:50
Garrett Thomas, …
NeurIPS 2021 · 3 years ago

Sparse Quadratic Optimisation over the Stiefel Manifold with Application to Permutation Synchronisation · 11:13
Florian Bernard, …
NeurIPS 2021 · 3 years ago

PROCAT: Product Catalogue Dataset for Implicit Clustering, Permutation Learning and Structure Prediction · 04:52
Mateusz Jurewicz, …
NeurIPS 2021 · 3 years ago

A Risk Model for Predicting Powerline-induced Wildfires in Distribution System · 04:52
Mengqi Yao, …
NeurIPS 2021 · 3 years ago

Interested in similar videos? Follow NeurIPS 2021.