
            Gradient Vaccine: Investigating and Improving Multi-task Optimization in Massively Multilingual Models

            May 3, 2021

            Speakers

Zirui Wang

Yulia Tsvetkov

Orhan Firat

            About

            Massively multilingual models subsuming tens or even hundreds of languages pose great challenges to multi-task optimization. While it is a common practice to apply a language-agnostic procedure optimizing a joint multilingual task objective, how to properly characterize and take advantage of its underlying problem structure for improving optimization efficiency remains under-explored. In this paper, we attempt to peek into the black-box of multilingual optimization through the lens of loss funct…
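The abstract describes examining multilingual optimization through per-task (per-language) gradient interactions. As an illustration only (this is my own sketch, not the paper's actual method; the class name, `beta` parameter, and EMA scheme are assumptions), one simple way to "peek into" multi-task optimization is to track an exponential moving average of pairwise cosine similarity between per-language gradients:

```python
import numpy as np

def cosine_sim(g1, g2):
    # Cosine similarity between two flattened task gradient vectors.
    return float(g1 @ g2 / (np.linalg.norm(g1) * np.linalg.norm(g2) + 1e-12))

class GradientSimilarityTracker:
    """Tracks an exponential moving average (EMA) of pairwise gradient
    cosine similarity across tasks (e.g. languages). Illustrative only."""

    def __init__(self, n_tasks, beta=0.99):
        self.beta = beta
        self.ema = np.zeros((n_tasks, n_tasks))

    def update(self, grads):
        # grads: list of flattened per-task gradient vectors for one step.
        n = len(grads)
        for i in range(n):
            for j in range(i + 1, n):
                phi = cosine_sim(grads[i], grads[j])
                self.ema[i, j] = self.beta * self.ema[i, j] + (1 - self.beta) * phi
                self.ema[j, i] = self.ema[i, j]  # keep the matrix symmetric
        return self.ema
```

Persistently negative entries in the EMA matrix would indicate conflicting gradients between language pairs, the kind of structure the talk proposes to characterize and exploit.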

            Organizer

ICLR 2021

            Categories

Visual Arts & Graphic Design

AI & Data Science

            About ICLR 2021

            The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation learning, but generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.


            Recommended Videos

            Presentations on similar topic, category or speaker

Learning to see from few labels (1:01:31) · Bharath Hariharan · ICLR 2021

CPT: Efficient Deep Neural Network Training via Cyclic Precision (08:55) · Yonggan Fu, … · ICLR 2021

Closing Remarks (01:18) · Nayat Sánchez-Pi · ICLR 2021

Graph Energy-Based Model for Molecular Graph Generation (21:32) · Ryuichiro Hataya, … · ICLR 2021

Opening remarks - How Can Findings About the Brain Improve AI Systems? (18:08) · Vy Vo · ICLR 2021

Viewmaker Networks: Learning Views for Unsupervised Representation Learning (05:03) · Alex Tamkin, … · ICLR 2021
