
            Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective

May 3, 2021

Speakers

Wuyang Chen

Speaker · 0 followers

Xinyu Gong

Speaker · 0 followers

Zhangyang Wang

Speaker · 1 follower

About

Neural Architecture Search (NAS) has been explosively studied to automate the discovery of top-performing neural networks. Current works require heavy training of a supernet or intensive architecture evaluations, thus suffering from heavy resource consumption and often incurring search bias due to truncated training or approximations. Can we select the best neural architectures without involving any training and eliminate a drastic portion of the search cost? We provide an affirmative answer, by p…
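
The abstract asks whether candidate architectures can be ranked with no training at all. As a rough illustration of that general idea only (the authors' actual criterion is cut off in the abstract above), the sketch below scores randomly initialized PyTorch networks by the condition number of a scalar-output empirical NTK computed on a small random probe batch; `make_candidate`, the toy MLP "search space", the probe batch size, and the choice of condition number as the score are all assumptions made for this example, not details taken from the talk.

```python
import torch
import torch.nn as nn

def ntk_condition_number(model, batch):
    """Condition number of a scalar-output empirical NTK at initialization (no training)."""
    grads = []
    for x in batch:                                    # one probe sample at a time
        out = model(x.unsqueeze(0)).sum()              # scalar function of the input
        g = torch.autograd.grad(out, tuple(model.parameters()))
        grads.append(torch.cat([p.reshape(-1) for p in g]))
    J = torch.stack(grads)                             # (batch, n_params) per-sample gradients
    ntk = J @ J.t()                                    # Gram matrix of those gradients
    eig = torch.linalg.eigvalsh(ntk)                   # eigenvalues in ascending order
    return (eig[-1] / eig[0].clamp_min(1e-12)).item()

def make_candidate(width):
    # Hypothetical toy "search space": MLPs that differ only in width.
    return nn.Sequential(nn.Linear(32, width), nn.ReLU(),
                         nn.Linear(width, width), nn.ReLU(),
                         nn.Linear(width, 10))

torch.manual_seed(0)
probe = torch.randn(8, 32)                             # tiny random probe batch, no labels needed
scores = {w: ntk_condition_number(make_candidate(w), probe) for w in (16, 64, 256)}
print(scores)                                          # candidates compared without any gradient updates
```

Lower condition numbers at initialization are commonly read as a sign of easier trainability, so a search procedure could keep the lowest-scoring candidates; the point of the sketch is only that such a comparison requires no supernet training and no per-architecture evaluation.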

Organizer

            ICLR 2021

Account · 906 followers

Categories

AI & Data Science

Category · 10.8k presentations

About ICLR 2021

            The International Conference on Learning Representations (ICLR) is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation learning, but generally referred to as deep learning. ICLR is globally renowned for presenting and publishing cutting-edge research on all aspects of deep learning used in the fields of artificial intelligence, statistics and data science, as well as important application areas such as machine vision, computational biology, speech recognition, text understanding, gaming, and robotics.

Like the format? Trust SlidesLive to capture your next event!

Professional recording and livestreaming, worldwide.

Recommended videos

Presentations with a similar topic, category, or speaker

Oral Session 8 - Q&A 2
09:22

Denis Yarats, …

ICLR 2021 · 4 years ago

Ditto: Fair and Robust Federated Learning Through Personalization
03:13

Tian Li, …

ICLR 2021 · 4 years ago

Proximal Gradient Descent-Ascent: Variable Convergence under KŁ Geometry
05:10

Ziyi Chen, …

ICLR 2021 · 4 years ago

Panel discussion
38:35

Drew Linsley, …

ICLR 2021 · 4 years ago

Gradient Vaccine: Investigating and Improving Multi-task Optimization in Massively Multilingual Models
10:16

Zirui Wang, …

ICLR 2021 · 4 years ago

On the Geometry of Generalization and Memorization in Deep Neural Networks
05:14

Cory Stephenson, …

ICLR 2021 · 4 years ago

Interested in talks like this? Follow ICLR 2021.