
            Two Routes to Scalable Credit Assignment without Weight Symmetry

            Jul 12, 2020

Speakers

Daniel Kunin

Speaker · 0 followers

Aran Nayebi

Speaker · 0 followers

Javier Sagastuy-Brena

Speaker · 0 followers

About

            The neural plausibility of backpropagation has long been disputed, primarily for its use of instantaneous weight transport. A variety of prior proposals that avoid weight transport fail to scale on complex tasks such as ImageNet; however, a recent proposal has reported competitive performance with backpropagation. We find that this local learning rule requires complex hyperparameter tuning that does not transfer across architectures. We identify a more robust local learning rule that transfers t…
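
For context on the "weight transport" problem the abstract mentions: in backpropagation, the backward pass reuses the transpose of the forward weights to route error signals, which is the biologically implausible step. One classic family of proposals that avoids this is feedback alignment, where a fixed random feedback matrix carries the error instead of the transposed forward weights. The sketch below (plain NumPy, with made-up layer sizes, data, and learning rate) illustrates only that general idea; it is not the specific local learning rule evaluated in this talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network: x -> h = relu(W1 x) -> y = W2 h
n_in, n_hid, n_out = 20, 64, 5
W1 = rng.normal(0, 0.1, (n_hid, n_in))
W2 = rng.normal(0, 0.1, (n_out, n_hid))

# Feedback alignment: a fixed random feedback matrix B replaces W2.T,
# so the backward pass never needs to "transport" the forward weights.
B = rng.normal(0, 0.1, (n_hid, n_out))

def relu(z):
    return np.maximum(z, 0.0)

def step(x, target, lr=1e-2):
    """One training step on a single example; returns the squared-error loss."""
    global W1, W2
    # Forward pass
    a1 = W1 @ x
    h = relu(a1)
    y = W2 @ h
    err = y - target                 # grad of 0.5 * ||y - target||^2 w.r.t. y

    # Backward pass:
    # backprop would compute delta_h = (W2.T @ err) * relu'(a1)  <- weight transport
    delta_h = (B @ err) * (a1 > 0)   # feedback alignment: fixed random B instead

    # Weight updates use only locally available activities and error signals
    W2 -= lr * np.outer(err, h)
    W1 -= lr * np.outer(delta_h, x)
    return 0.5 * np.sum(err ** 2)

# Tiny regression demo on random data (purely illustrative)
X = rng.normal(size=(200, n_in))
T = rng.normal(size=(200, n_out))
for epoch in range(20):
    loss = sum(step(x, t) for x, t in zip(X, T)) / len(X)
print(f"final mean loss: {loss:.4f}")
```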

Organizer


            ICML 2020

Account · 2.7k followers

Categories

AI and Data Science

Category · 10.8k presentations

About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

Like the format? Trust SlidesLive to capture your next event!

Professional recording and live streaming, delivered globally.


Recommended videos

Presentations with a similar topic, category, or speaker

Counterfactual Transfer via Inductive Bias in Clinical Settings
05:44
Taylor Killian, …
ICML 2020 · 5 years ago

GCNs: From Summarization to Heterophily
31:12
Danai Koutra
ICML 2020 · 5 years ago

Flows in Probabilistic Modeling & Inference
29:07
Martin Jankowiak
ICML 2020 · 5 years ago

Generative Adversarial Set Transformers
04:31
Karl Stelzner, …
ICML 2020 · 5 years ago

Evolutionary Reinforcement Learning for Sample-Efficient Multiagent Cooperation
15:53
Somdeb Majumdar, …
ICML 2020 · 5 years ago

Imputing Missing Data with the Gaussian Copula
48:09
Madeleine Udell, …
ICML 2020 · 5 years ago

Interested in talks like this? Follow ICML 2020