            Tent: Fully Test-Time Adaptation by Entropy Minimization

            May 3, 2021

            Speakers

            Dequan Wang, Evan Shelhamer, Shaoteng Liu

            About

            A model must adapt itself to generalize to new and different data during testing. This is the setting of fully test-time adaptation given only unlabeled test data and the model parameters. We propose test-time entropy minimization (tent): we optimize for model confidence as measured by the entropy of its predictions. During testing, we adapt the model features by estimating normalization statistics and optimizing channel-wise affine transformations. Tent improves robustness to corruptions for im…
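The adaptation rule described above — update only the channel-wise scale and shift parameters by minimizing the entropy of the model's own predictions on unlabeled test data — can be sketched in plain Python. Everything below is illustrative, not the authors' implementation: the toy linear classifier, feature dimensions, learning rate, and finite-difference gradients are assumptions, and the sketch omits the re-estimation of normalization statistics that Tent also performs.

```python
import math

def softmax(logits):
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def entropy(probs):
    # Shannon entropy of a predicted class distribution
    return -sum(p * math.log(p) for p in probs if p > 0)

def predict(x, gamma, beta, W):
    # channel-wise affine transform of features, then a frozen linear head
    feats = [g * xi + b for g, xi, b in zip(gamma, x, beta)]
    return softmax([sum(w * f for w, f in zip(row, feats)) for row in W])

def mean_entropy(batch, gamma, beta, W):
    return sum(entropy(predict(x, gamma, beta, W)) for x in batch) / len(batch)

def tent_step(batch, gamma, beta, W, lr=0.5, eps=1e-5):
    # one entropy-minimization step on the affine parameters only;
    # gradients via central finite differences for clarity (Tent uses backprop)
    for params in (gamma, beta):
        for i in range(len(params)):
            orig = params[i]
            params[i] = orig + eps
            up = mean_entropy(batch, gamma, beta, W)
            params[i] = orig - eps
            down = mean_entropy(batch, gamma, beta, W)
            params[i] = orig - lr * (up - down) / (2 * eps)

# toy setup: 2 features, 2 classes, frozen (hypothetical) classifier weights
W = [[2.0, -1.0], [-1.0, 2.0]]
batch = [[0.5, 0.4], [0.45, 0.5], [0.6, 0.3]]  # unlabeled test inputs
gamma, beta = [1.0, 1.0], [0.0, 0.0]

before = mean_entropy(batch, gamma, beta, W)
for _ in range(25):
    tent_step(batch, gamma, beta, W)
after = mean_entropy(batch, gamma, beta, W)
# prediction entropy decreases, i.e. model confidence increases, with no labels used
```

Note the design point the sketch preserves: the classifier weights `W` are never touched, so adaptation is cheap and cannot catastrophically rewrite the model; only the per-channel scale `gamma` and shift `beta` move.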

            Organizer

            ICLR 2021

            Categories

            AI & Data Science

            About ICLR 2021

            The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation learning, but generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.


            Recommended Videos

            Presentations on similar topic, category or speaker

            • Paper Session 1 Q&A (07:43), Ruohan Gao, ICLR 2021
            • Zero-Cost Proxies for Lightweight NAS (04:06), Mohamed Abdelfattah, …, ICLR 2021
            • A Mathematical Exploration of Why Language Models Help Solve Downstream Tasks (05:16), Nikunj Saunshi, …, ICLR 2021
            • Theoretical bounds on estimation error for meta-learning (04:46), Richard Zemel, …, ICLR 2021
            • not-MIWAE: Deep Generative Modelling with Missing not at Random Data (05:00), Niels Bruun Ipsen, …, ICLR 2021
            • Speeding Up Neural Network Verification via Automated Algorithm Configuration (02:58), Matthias König, …, ICLR 2021
