Exponential Graphs are Provably Efficient for Decentralized Deep Training


            Dec 6, 2021

            Speakers

            Bicheng Ying
            Kun Yuan
            Yiming Chen

            About

            Decentralized SGD is an emerging training method for deep learning, known for requiring much less communication per iteration (and thus running faster): it relaxes the exact averaging step of parallel SGD to inexact averaging. The less exact the averaging, however, the more total iterations training needs. The key to making decentralized SGD efficient is therefore to achieve nearly exact averaging with little communication. This requires a skillful choice of communication topology, which is an un…
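The abstract above describes replacing exact averaging with inexact gossip averaging over a communication topology. As an illustration only (not the authors' code), here is a minimal sketch of gossip averaging on a static exponential graph, where each node mixes with peers at power-of-two offsets; the function name and the n = 8 setup are invented for this demo.

```python
import numpy as np

def exponential_graph_weights(n):
    """Mixing matrix for a static exponential graph on n nodes.

    Node i is connected to peers at offsets 1, 2, 4, ... (powers of two
    below n), plus a self-loop; all edges get equal weight. The resulting
    circulant matrix is doubly stochastic, so repeated mixing drives every
    node toward the global average.
    """
    offsets = [2 ** k for k in range(int(np.log2(n - 1)) + 1)] if n > 1 else []
    deg = len(offsets) + 1  # self-loop plus out-neighbors
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 1.0 / deg
        for off in offsets:
            W[i, (i + off) % n] = 1.0 / deg
    return W

n = 8
W = exponential_graph_weights(n)
x = np.arange(n, dtype=float)   # one scalar "model parameter" per node
mean = x.mean()
for _ in range(10):             # ten inexact-averaging (gossip) rounds
    x = W @ x
residual = np.max(np.abs(x - mean))
print(residual)                 # shrinks geometrically toward 0
```

Each round costs only O(log n) messages per node, yet the disagreement with the exact average contracts at a constant rate per round, which is the communication/accuracy trade-off the abstract refers to.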

            Organizer


            NeurIPS 2021


            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia, and oral and poster presentations of refereed papers. Following the conference, there are workshops, which provide a less formal setting.


            Recommended Videos

            Presentations on similar topics, in the same category, or by the same speakers

            Open Rule Induction · 10:50
            Wanyun Cui, …
            NeurIPS 2021 · 3 years ago

            Fair Sparse Regression with Clustering: An Invex Relaxation for a Combinatorial Problem · 11:37
            Adarsh Barik, …
            NeurIPS 2021 · 3 years ago

            Variational Optimistic Bayesian Sampling · 15:13
            Brendan O'Donoghue, …
            NeurIPS 2021 · 3 years ago

            Meta Learning Backpropagation And Improving It · 12:39
            Louis Kirsch, …
            NeurIPS 2021 · 3 years ago

            Boosting With Multiple Sources · 13:17
            Corinna Cortes, …
            NeurIPS 2021 · 3 years ago

            Boost Neural Networks by Checkpoints · 04:45
            Feng Wang, …
            NeurIPS 2021 · 3 years ago
