
            VAST: Value Function Factorization with Variable Agent Sub-Teams

            Dec 6, 2021

Speakers

Thomy Phan

Fabian Ritz

Lenz Belzner

            About

            Value function factorization (VFF) is a popular approach to cooperative multi-agent reinforcement learning in order to learn local value functions from global rewards. However, state-of-the-art VFF is limited to a handful of agents in most domains. We hypothesize that this is due to the flat factorization scheme, where the VFF operator becomes a performance bottleneck with an increasing number of agents. Therefore, we propose VFF with variable agent sub-teams (VAST). VAST approximates a factoriz…
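The abstract contrasts a flat factorization, where one operator mixes every agent's local value, with VAST's two-level scheme over sub-teams. A minimal sketch of the idea, assuming VDN-style additive mixing at both levels (an illustrative simplification; the paper's actual VFF operator and sub-team assignment may differ):

```python
# Flat VFF (VDN-style, assumed): the joint value is the sum of local
# utilities, so the greedy joint action decomposes per agent.
def factorize_flat(local_qs):
    """local_qs: one dict per agent mapping action -> local Q-value."""
    joint_action = [max(q, key=q.get) for q in local_qs]
    q_tot = sum(q[a] for q, a in zip(local_qs, joint_action))
    return joint_action, q_tot

# Sub-team VFF (VAST's idea, sketched): partition agents into
# sub-teams, factorize within each sub-team, then mix the sub-team
# values with a second-level operator (here simply a sum).
def factorize_subteams(local_qs, teams):
    """teams: list of agent-index lists partitioning the agents."""
    joint_action = [None] * len(local_qs)
    team_values = []
    for team in teams:
        actions, q_team = factorize_flat([local_qs[i] for i in team])
        for i, a in zip(team, actions):
            joint_action[i] = a
        team_values.append(q_team)
    return joint_action, sum(team_values)
```

With additive mixing both schemes coincide; the point of VAST is that a richer per-sub-team operator only ever mixes a small, variable number of members, avoiding the bottleneck of one flat operator over all agents.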

            Organizer

NeurIPS 2021

            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.


            Recommended Videos

            Presentations on similar topic, category or speaker

Adaptively Calibrated Critic Estimates for Deep Reinforcement Learning
05:05 · Nicolai Dorka, … · NeurIPS 2021

Fast Abductive Learning by Similarity-based Consistency Optimization
13:25 · Yu-Xuan Huang, … · NeurIPS 2021

Fast Certified Robust Training with Short Warmup
11:49 · Zhouxing Shi, … · NeurIPS 2021

Data-Driven Offline Optimization for Architecting Hardware Accelerators
12:22 · Aviral Kumar, … · NeurIPS 2021

Maximum Mean Discrepancy for Generalization in the Presence of Distribution and Missingness Shift
05:05 · Liwen Ouyang, … · NeurIPS 2021

Oral Session 1: Generative Modeling
1:33:54 · NeurIPS 2021
