On Learning Language-Invariant Representations for Universal Machine Translation

            Jul 12, 2020

            Speakers

Han Zhao

Speaker · 0 followers

Junjie Hu

Speaker · 0 followers

Andrej Risteski

Speaker · 0 followers

            About

            The goal of universal machine translation is to learn to translate between any pair of languages, given pairs of translated documents for some of these languages. Despite impressive empirical results and an increasing interest in massively multilingual models, theoretical analysis on translation errors made by such universal machine translation models is only nascent. In this paper, we take one step towards better understanding of universal machine translation by first proving an impossibility t…

            Organizer


            ICML 2020

Account · 2.7k followers

            Categories

Linguistics and Philology

Category · 88 presentations

            About ICML 2020

            The International Conference on Machine Learning (ICML) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence known as machine learning. ICML is globally renowned for presenting and publishing cutting-edge research on all aspects of machine learning used in closely related areas like artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, and robotics. ICML is one of the fastest growing artificial intelligence conferences in the world. Participants at ICML span a wide range of backgrounds, from academic and industrial researchers, to entrepreneurs and engineers, to graduate students and postdocs.


            Recommended Videos

            Presentations on similar topic, category or speaker

Q&A with Francis Bach · 33:13

Francis Bach

ICML 2020 · 5 years ago

Heterogeneous Network Representation across Cyber-Physical-Human Domains · 30:22

Wenwu Zhu

ICML 2020 · 5 years ago

Puzzle Mix: Exploiting Saliency and Local Statistics for Optimal Mixup · 13:40

Jang-Hyun Kim, …

ICML 2020 · 5 years ago

Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates · 15:56

Yang Liu, …

ICML 2020 · 5 years ago

DeepMind at WiML Un-workshop · 14:38

Meire Fortunato

ICML 2020 · 5 years ago

Recent Advances in High-Dimensional Robust Statistics - Part I · 42:38

Ilias Diakonikolas

ICML 2020 · 5 years ago
