            Generalization bounds via distillation

            May 3, 2021

            Speakers

            Daniel Hsu
            Speaker · 1 follower

            Ziwei Ji
            Speaker · 0 followers

            Matus Telgarsky
            Speaker · 0 followers

            About

            This paper provides a suite of mathematical tools to bound the generalization error of networks that possess low-complexity distillations --- that is, when there exist simple networks whose softmax outputs approximately match those of the original network. The primary contribution is the aforementioned bound, which upper bounds the test error of a network by the sum of its training error, the distillation error, and the complexity of the distilled network. Supporting this, secondary contribution…
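
            Schematically (this is an informal reading of the abstract above, not the paper's exact theorem statement, and the notation is illustrative: $f$ denotes the original network, $g$ its low-complexity distillation, and $n$ the number of training samples), the main bound has the form

            \[
            \text{test error}(f) \;\lesssim\; \text{training error}(f) \;+\; \text{distillation error}(f, g) \;+\; \frac{\text{complexity}(g)}{\sqrt{n}},
            \]

            so a network that admits an accurate, low-complexity distillation inherits a small generalization gap even when its own complexity is large. The $1/\sqrt{n}$ scaling of the complexity term is the usual convention for such bounds and is assumed here rather than quoted from the paper.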

            Organizer

            ICLR 2021

            Account · 898 followers

            Categories

            AI & Data Science

            Category · 10.8k presentations

            About ICLR 2021

            The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation learning, but generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.

            Recommended Videos

            Presentations on similar topic, category or speaker

            Probing BERT in Hyperbolic Spaces
            05:10
            Boli Chen, …
            ICLR 2021 · 4 years ago

            Learning Reasoning Paths over Semantic Graphs for Video-grounded Dialogues
            05:02
            Hung Le, …
            ICLR 2021 · 4 years ago

            Towards Faster and Stabilized GAN Training for High-Fidelity Few-Shot Image Synthesis
            05:08
            Bingchen Liu, …
            ICLR 2021 · 4 years ago

            Nonseparable Symplectic Neural Networks
            04:58
            Shiying Xiong, …
            ICLR 2021 · 4 years ago

            Rank the Episodes: A Simple Approach for Exploration in Procedurally-Generated Environments
            05:09
            Daochen Zha, …
            ICLR 2021 · 4 years ago

            On the Stability of Fine-tuning BERT: Misconceptions, Explanations, and Strong Baselines
            03:01
            Marius Mosbach, …
            ICLR 2021 · 4 years ago
