An Even More Optimal Stochastic Optimization Algorithm: Minibatching and Interpolation Learning

            Dec 6, 2021

Speakers

Blake Woodworth

Nathan Srebro

            About

            We present and analyze an algorithm for optimizing smooth and convex or strongly convex objectives using minibatch stochastic gradient estimates. The algorithm is optimal with respect to its dependence on both the minibatch size and minimum expected loss simultaneously. This improves over the optimal method of Lan, which is insensitive to the minimum expected loss; over the optimistic acceleration of Cotter et al., which has suboptimal dependence on the minibatch size; and over the algorithm of…
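For context, the baseline the paper improves on is plain minibatch SGD: at each step, average the stochastic gradients over a minibatch and take a gradient step. The sketch below is only that baseline on a hypothetical toy problem in the interpolation regime (the data is fit exactly, so the minimum expected loss is zero); it is not the accelerated optimal algorithm of the talk, and the step size, batch size, and dataset are illustrative assumptions.

```python
import random

# Toy interpolation problem: y = 2*x exactly, so the minimum expected
# squared loss is zero (the "interpolation learning" regime in the title).
random.seed(0)
data = [(x, 2.0 * x) for x in (random.uniform(-1.0, 1.0) for _ in range(200))]

def minibatch_grad(w, batch):
    # Average gradient of the squared loss (w*x - y)^2 over the minibatch.
    return sum(2.0 * (w * x - y) * x for x, y in batch) / len(batch)

def minibatch_sgd(w0, lr=0.5, batch_size=8, steps=300):
    # Plain (unaccelerated) minibatch SGD -- the baseline, not the
    # optimal method analyzed in the presentation.
    w = w0
    for _ in range(steps):
        batch = random.sample(data, batch_size)
        w -= lr * minibatch_grad(w, batch)
    return w

w_hat = minibatch_sgd(0.0)  # converges to the interpolating solution w = 2.0
```

Because the minimizer interpolates the data, the stochastic gradient noise vanishes at the optimum, which is exactly why methods can exploit a small minimum expected loss to converge faster than the generic noisy-gradient rates.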

Organizer

NeurIPS 2021

            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.

            Recommended Videos

Presentations on a similar topic, in the same category, or by the same speakers

The impact of weather information on machine-learning probabilistic electricity demand predictions (05:51)
Yifu Ding, … · NeurIPS 2021

Parallel Bayesian Optimization of Multiple Noisy Objectives with Expected Hypervolume Improvement (09:08)
Sam Daulton, … · NeurIPS 2021

Task-Driven Discovery of Perceptual Schemas for Generalization in Reinforcement Learning (04:41)
Wilka Carvalho, … · NeurIPS 2021

Task-Adaptive Neural Network Search with Meta-Contrastive Learning (12:19)
Wonyong Jeong, … · NeurIPS 2021

Automatic Symmetry Discovery with Lie Algebra Convolutional Network (14:42)
Nima Dehmamy, … · NeurIPS 2021

KitchenShift: Evaluating Zero-Shot Generalization of Imitation-Based Policy Learning Under Domain Shifts (04:35)
Eliot Xing, … · NeurIPS 2021
