Rockmate: an Efficient, Fast, Automatic and Generic Tool for Re-materialization in PyTorch

Jul 25, 2023

Speakers

Xunyi Zhao
Speaker · 0 followers

Théotime Le Hellard
Speaker · 0 followers

Lionel Eyraud-Dubois
Speaker · 0 followers

About

We propose Rockmate to control the memory requirements when training PyTorch DNN models. Rockmate is an automatic tool that starts from the model code and generates an equivalent model, using a predefined amount of memory for activations, at the cost of a few re-computations. Rockmate automatically detects the structure of computational and data dependencies and rewrites the initial model as a sequence of complex blocks. We show that such a structure is widespread and can be found in many model…
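
The abstract describes trading a few re-computations for a fixed activation-memory budget by rewriting the model as a sequence of blocks. As a rough illustration of that trade-off only, the sketch below uses PyTorch's generic torch.utils.checkpoint utility with made-up layer sizes; it is not Rockmate's own interface, and unlike Rockmate it checkpoints every block uniformly instead of choosing what to re-compute under a memory budget.

# Minimal sketch of the re-materialization idea, assuming only the generic
# torch.utils.checkpoint utility (NOT Rockmate's API).
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Hypothetical model: a sequence of blocks, echoing the block rewriting
# described in the abstract.
blocks = nn.ModuleList(
    [nn.Sequential(nn.Linear(1024, 1024), nn.ReLU()) for _ in range(8)]
)

def forward_with_rematerialization(x):
    # Activations produced inside each checkpointed block are freed after the
    # forward pass and re-computed on demand during the backward pass,
    # trading extra compute for a smaller activation-memory footprint.
    for block in blocks:
        x = checkpoint(block, x, use_reentrant=False)
    return x

x = torch.randn(32, 1024, requires_grad=True)
loss = forward_with_rematerialization(x).sum()
loss.backward()  # blocks run their forward again here to rebuild freed activations

Rockmate's contribution, per the abstract, is automating this decision: it detects the dependency structure, rewrites the model into blocks, and decides which activations to keep or re-compute so that activation memory stays within the predefined budget.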

Organizer

ICML 2023

Account · 657 followers


Recommended Videos

Presentations on similar topic, category or speaker

Leveraging Offline Data in Online Reinforcement Learning
05:17
Andrew Wagenmaker, …
ICML 2023 · 2 years ago

Graph Neural Networks with Learnable and Optimal Polynomial Bases
04:26
Yuhe Guo, …
ICML 2023 · 2 years ago

Diffusion Models as Artists: Are we Closing the Gap between Humans and Machines?
09:19
Victor Boutin, …
ICML 2023 · 2 years ago

Cones: Concept Neurons in Diffusion Models for Customized Generation
06:22
Zhiheng Liu, …
ICML 2023 · 2 years ago

Localized Learning: Past, Present and Future
1:05:33
Jack Kendall, …
ICML 2023 · 2 years ago

Closing Remarks
10:59
Miguel Felipe Arevalo-Castiblanco
ICML 2023 · 2 years ago
