            Two Routes to Scalable Credit Assignment without Weight Symmetry

July 12, 2020

Speakers

Daniel Kunin

Aran Nayebi

Javier Sagastuy-Brena

About the presentation

            The neural plausibility of backpropagation has long been disputed, primarily for its use of instantaneous weight transport. A variety of prior proposals that avoid weight transport fail to scale on complex tasks such as ImageNet; however, a recent proposal has reported competitive performance with backpropagation. We find that this local learning rule requires complex hyperparameter tuning that does not transfer across architectures. We identify a more robust local learning rule that transfers t…
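The abstract turns on the "weight transport" problem: backpropagation's backward pass reuses the transpose of the forward weights, which a biological circuit would have to copy instantaneously. A minimal NumPy sketch of one prior proposal that avoids this, feedback alignment, which replaces the transposed weights with a fixed random matrix; the network sizes and variable names here are illustrative only, not taken from the paper:

```python
import numpy as np

# Sketch of the weight-transport problem (illustrative sizes, not the
# paper's method). Backprop's hidden-layer error signal uses W2.T, so the
# backward pathway must "transport" the forward weights; feedback
# alignment instead uses a fixed random matrix B, a purely local rule.
rng = np.random.default_rng(0)

n_in, n_hid, n_out = 4, 8, 3
W1 = rng.normal(size=(n_hid, n_in)) * 0.1   # forward weights, layer 1
W2 = rng.normal(size=(n_out, n_hid)) * 0.1  # forward weights, layer 2
B = rng.normal(size=(n_hid, n_out)) * 0.1   # fixed random feedback weights

x = rng.normal(size=(n_in,))
target = rng.normal(size=(n_out,))

h = np.tanh(W1 @ x)   # forward pass through the hidden layer
y = W2 @ h            # linear output layer
e = y - target        # output error

# Hidden-layer error signals (tanh derivative is 1 - h**2):
delta_bp = (W2.T @ e) * (1 - h**2)  # backprop: needs symmetric access to W2
delta_fa = (B @ e) * (1 - h**2)     # feedback alignment: no weight transport

# Both produce a learning signal of the same shape for updating W1, but
# only backprop requires the backward pathway to mirror W2 exactly.
```

In practice the forward weights tend to align with B over training, which is why the rule can work at all; the abstract's point is that such local rules historically failed to scale to tasks like ImageNet.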

Organizer

ICML 2020

Account · 2.7k followers

Category

AI and Data Science

Category · 10.8k presentations

About the organizer (ICML 2020)

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.

Recommended videos

Presentations on a similar topic, category, or by the same speaker

Counterfactual Transfer via Inductive Bias in Clinical Settings · 05:44
Taylor Killian, … · ICML 2020

GCNs: From Summarization to Heterophily · 31:12
Danai Koutra · ICML 2020

Flows in Probabilistic Modeling & Inference · 29:07
Martin Jankowiak · ICML 2020

Generative Adversarial Set Transformers · 04:31
Karl Stelzner, … · ICML 2020

Evolutionary Reinforcement Learning for Sample-Efficient Multiagent Cooperation · 15:53
Somdeb Majumdar, … · ICML 2020

Imputing Missing Data with the Gaussian Copula · 48:09
Madeleine Udell, … · ICML 2020

Interested in similar videos? Follow ICML 2020.