            Linear Convergence of Gradient Methods for Estimating Structured Transition Matrices in High-dimensional Vector Autoregressive Models

            Dec 6, 2021

            Speakers

Xiao Lv

Wei Cui

Yulong Liu

            About

            In this paper, we present non-asymptotic optimization guarantees of gradient descent methods for estimating structured transition matrices in high-dimensional vector autoregressive (VAR) models. We adopt the projected gradient descent (PGD) for single-structured transition matrices and the alternating projected gradient descent (AltPGD) for superposition-structured ones. Our analysis demonstrates that both gradient algorithms converge linearly to the statistical error even though the strong conv…
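
The sketch below (not the authors' code) is a minimal illustration of the single-structure case described above: projected gradient descent (PGD) for a VAR(1) model x_{t+1} = A x_t + e_t with a sparse transition matrix, where the projection is hard thresholding onto the s largest-magnitude entries. The step size eta, sparsity level s, and iteration count are illustrative assumptions, not values taken from the paper.

import numpy as np

def hard_threshold(A, s):
    """Keep the s largest-magnitude entries of A and zero out the rest."""
    flat = np.abs(A).ravel()
    if s >= flat.size:
        return A
    cutoff = np.partition(flat, -s)[-s]
    return np.where(np.abs(A) >= cutoff, A, 0.0)

def pgd_var(X, s, eta=0.1, n_iters=200):
    """Projected gradient descent for a sparse VAR(1) transition matrix.

    X : (T, p) array holding consecutive observations x_1, ..., x_T.
    """
    Y, Z = X[1:], X[:-1]                 # regression targets and regressors: Y_t ~ A Z_t
    n, p = Z.shape
    A = np.zeros((p, p))
    for _ in range(n_iters):
        grad = (A @ Z.T - Y.T) @ Z / n   # gradient of the least-squares loss
        A = hard_threshold(A - eta * grad, s)
    return A

For superposition-structured transition matrices (e.g., sparse plus low-rank), the paper's AltPGD analogously alternates projected gradient updates over the structured components; the sketch above covers only the single-structure case.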

            Organizer

NeurIPS 2021

            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.

            Recommended Videos

Presentations on a similar topic, category, or speaker

            Coresets for Time Series Clustering
            15:19

Lingxiao Huang, …
NeurIPS 2021 · 3 years ago

            Making the most of your day: online learning for optimal allocation of time
            15:17

Etienne Boursier, …
NeurIPS 2021 · 3 years ago

            Amortized Variational Inference (AVI) for Type Ia Supernova Light Curves
            05:12

Alexis Sánchez, …
NeurIPS 2021 · 3 years ago

            Introduction
            04:48

Karen A. McKinnon, …
NeurIPS 2021 · 3 years ago

            Generalizability of density functionals learned from differentiable programming on weakly correlated spin-polarized systems
            15:34

Bhupalee Kalita, …
NeurIPS 2021 · 3 years ago

            Calibrated Ensembles: A Simple Way to Mitigate ID-OOD Accuracy Tradeoffs
            05:02

Ananya Kumar, …
NeurIPS 2021 · 3 years ago
