
            Are Neural Rankers still Outperformed by Gradient Boosted Decision Trees?

            May 3, 2021

            Speakers

            Zhen Qin
            Le Yan
            Honglei Zhuang

            About

            Despite the success of neural models on many major machine learning problems, their effectiveness on traditional Learning-to-Rank (LTR) problems is still not widely acknowledged. We first validate this concern by showing that most recent neural LTR models are, by a large margin, inferior to the best publicly available Gradient Boosted Decision Trees (GBDT) in terms of their reported ranking accuracy on benchmark datasets. This was, unfortunately, overlooked in recent neural LTR papers. We…
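            To make the comparison in the abstract concrete, the sketch below trains a LambdaMART-style GBDT ranker with LightGBM and scores it with NDCG@5, the kind of publicly available GBDT baseline the talk measures neural LTR models against. This is an illustrative sketch only, not the speakers' code: the toy data, group sizes, and hyperparameters are placeholder assumptions.

```python
# Minimal sketch (not the speakers' setup): a LightGBM LambdaMART baseline
# evaluated with NDCG@5. All data below is synthetic placeholder data.
import numpy as np
import lightgbm as lgb
from sklearn.metrics import ndcg_score

rng = np.random.default_rng(0)

# Toy setup: 100 queries, 10 candidate documents each, 20 features,
# graded relevance labels in {0, ..., 4}.
n_queries, docs_per_query, n_features = 100, 10, 20
X_train = rng.normal(size=(n_queries * docs_per_query, n_features))
y_train = rng.integers(0, 5, size=n_queries * docs_per_query)
group_train = [docs_per_query] * n_queries  # documents per query, in row order

# LambdaMART-style GBDT ranker (lambdarank objective).
ranker = lgb.LGBMRanker(
    objective="lambdarank",
    n_estimators=300,
    learning_rate=0.05,
)
ranker.fit(X_train, y_train, group=group_train)

# Score one held-out query with NDCG@5.
X_query = rng.normal(size=(docs_per_query, n_features))
y_query = rng.integers(0, 5, size=docs_per_query)
scores = ranker.predict(X_query)
print("NDCG@5:", ndcg_score([y_query], [scores], k=5))
```

            On real LTR benchmark data, the toy arrays would be replaced by query-document features, graded relevance labels, and per-query group sizes; the ranker and the NDCG evaluation stay the same.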

            Organizer

            ICLR 2021

            Categories

            AI & Data Science

            About ICLR 2021

            The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation learning, but generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.

            Recommended Videos

            Presentations on a similar topic, category, or speaker

            Long-tail learning via logit adjustment · 09:22 · Aditya Krishna Menon, … · ICLR 2021
            Continual Learning in Recurrent Neural Networks · 05:16 · Benjamin Ehret, … · ICLR 2021
            Emergent Road Rules In Multi-Agent Driving Environments · 04:48 · Avik Pal, … · ICLR 2021
            Self-Supervision for Learning from the Bottom Up · 1:01:12 · Alexei A. Efros · ICLR 2021
            Using system context information to complement weakly labeled data · 03:18 · Matthias Meyer, … · ICLR 2021
            On the Dynamics of Training Attention Models · 05:09 · Haoye Lu, … · ICLR 2021
