            Distilling Knowledge from Reader to Retriever for Question Answering

            May 3, 2021

            Speakers


            Gautier Izacard

            Speaker · 0 followers


            Edouard Grave

            Speaker · 0 followers

            About

            The task of information retrieval is an important component of many natural language processing systems, such as open domain question answering. While traditional methods were based on hand-crafted features, continuous representations based on neural networks recently obtained competitive results. A challenge of using such methods is to obtain supervised data to train the retriever model, corresponding to pairs of query and support documents. In this paper, we propose a technique to learn retrie…

            Organizer


            ICLR 2021

            Account · 906 followers

            Categories

            AI and Data Science

            Category · 10.8k presentations

            About ICLR 2021

            The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation learning, but generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.


            Recommended Videos

            Presentations with a similar topic, category, or speaker

            Panel #5: Hyo Gweon & Matt Botvinick
            28:04


            Hyo Gweon, …

            ICLR 2021 4 years ago


            Training GANs with Stronger Augmentations via Contrastive Discriminator
            05:48


            Jongheon Jeong, …

            ICLR 2021 4 years ago


            Risk-Averse Offline Reinforcement Learning
            05:17


            Núria Armengol Urpí, …

            ICLR 2021 4 years ago


            Conjugate Energy-Based Models
            29:13


            Hao Wu, …

            ICLR 2021 4 years ago


            Announcement of the different award winners and closing remarks
            38:07


            ICLR 2021 4 years ago


            Introduction and Opening Remarks
            05:02


            Gautam Kamath

            ICLR 2021 4 years ago


            Interested in talks like this? Follow ICLR 2021