            Delayed Propagation Transformer: A Universal Computation Engine towards Practical Control in Cyber-Physical Systems

Dec 6, 2021

Speakers

Wenqing Zheng

Speaker · 0 followers

Qiangqiang Guo

Speaker · 0 followers

Hao Yang

Speaker · 2 followers

About

Multi-agent control is a central theme in Cyber-Physical Systems (CPS). However, current control methods either receive non-Markovian states due to insufficient sensing and decentralized design, or suffer from poor convergence. This paper presents the Delayed Propagation Transformer (DePT), a new transformer-based model that specializes in the global modeling of CPS while taking into account the immutable constraints from the physical world. DePT induces a cone-shaped spatial-temporal attent…
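The abstract above is truncated, but the cone-shaped spatial-temporal attention prior it mentions can be pictured as a delay-aware mask: an agent may only attend to another agent's past state once a physical signal could have propagated between them. The sketch below is a minimal, hypothetical illustration of that idea; the function name `cone_attention_mask`, the propagation `speed` parameter, and the toy geometry are assumptions for illustration, not DePT's actual implementation.

```python
import numpy as np

def cone_attention_mask(positions, num_steps, dt, speed):
    """Boolean mask of shape (N, N, num_steps).

    Entry [i, j, k] is True when agent i at the current step is allowed to
    attend to agent j's state from k steps in the past, i.e. when a physical
    signal propagating at `speed` has had time to cover the distance
    between the two agents. (Illustrative assumption, not the paper's code.)
    """
    # Pairwise Euclidean distances between agents, shape (N, N).
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)

    # Elapsed physical time for each lag k = 0 .. num_steps - 1.
    elapsed = np.arange(num_steps) * dt

    # Cone condition: influence from j reaches i only after dist / speed,
    # so attention to more recent lags is masked out.
    return dist[:, :, None] <= speed * elapsed[None, None, :]

# Example: three agents on a line, a 5-step context window, unit time step
# and unit propagation speed (all values illustrative).
positions = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 0.0]])
mask = cone_attention_mask(positions, num_steps=5, dt=1.0, speed=1.0)
print(mask[0, 2])  # agent 0 sees agent 2 only at lags >= 3: [False False False True True]
```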

Organizer

NeurIPS 2021

Account · 1.9k followers

About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.

Like the format? Trust SlidesLive to capture your next event!

Professional recording and live streaming, delivered worldwide.


Recommended videos

Presentations with a similar topic, category, or speaker

Accurate Multi-Endpoint Molecular Toxicity Predictions in Humans with Contrastive Explanations
05:05
Bhanushee Sharma, …
NeurIPS 2021 · 3 years ago

PreferenceNet: Encoding Human Preferences in Auction Design with Deep Learning
14:24
Neehar Peri, …
NeurIPS 2021 · 3 years ago

Identifying and Benchmarking Natural Out-of-Context Prediction Problems
12:36
David Madras, …
NeurIPS 2021 · 3 years ago

Partition-Based Formulations for Mixed-Integer Optimization of Trained ReLU Neural Networks
10:54
Calvin Tsay, …
NeurIPS 2021 · 3 years ago

Bridging Non Co-occurrence with Unlabeled In-the-wild Data for Incremental Object Detection
08:05
Na Dong, …
NeurIPS 2021 · 3 years ago

Q&A 3
14:46
César Lincoln Mattos, …
NeurIPS 2021 · 3 years ago

Interested in talks like this? Follow NeurIPS 2021