            Transfer Learning of Graph Neural Networks with Ego-graph Information Maximization

            Dec 6, 2021

            Speakers

            Qi Zhu

            Carl Yang

            Yidan Xu

            About

            Graph neural networks (GNNs) have achieved superior performance in various applications, but training dedicated GNNs can be costly for large-scale graphs. Some recent work started to study the pre-training of GNNs. However, none of them provide theoretical insights into the design of their frameworks, or clear requirements and guarantees towards their transferability. In this work, we establish a theoretically grounded and practically useful framework for the transfer learning of GNNs. Firstly,…
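            The abstract describes the framework only at a high level; the ego-graph in the title is its basic data unit. As a purely illustrative sketch (not the authors' EGI method or its information-maximization objective), the snippet below shows how k-hop ego-graphs can be extracted with networkx; the helper name k_hop_ego_graphs and the choice of k are assumptions made for this example.

            import networkx as nx

            # Illustration only: extract the k-hop ego-graph around every node.
            # This does NOT implement the paper's EGI training objective.
            def k_hop_ego_graphs(graph: nx.Graph, k: int = 2):
                """Return a dict mapping each node to its k-hop ego-graph."""
                return {node: nx.ego_graph(graph, node, radius=k) for node in graph.nodes}

            # Example usage on a small built-in graph.
            G = nx.karate_club_graph()
            egos = k_hop_ego_graphs(G, k=1)
            print(len(egos), egos[0].number_of_nodes())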

            Organizer


            NeurIPS 2021

            About NeurIPS 2021

            Neural Information Processing Systems (NeurIPS) is a multi-track machine learning and computational neuroscience conference that includes invited talks, demonstrations, symposia and oral and poster presentations of refereed papers. Following the conference, there are workshops which provide a less formal setting.

            Recommended Videos

            Presentations with a similar topic, category, or speaker

            Data vast and low in variance: Augment machine learning pipelines with dataset profiles to improve data quality without sacrificing scale
            02:04 · Bernease Herman, … · NeurIPS 2021 · 3 years ago

            Deep Molecular Representation Learning via Fusing Physical and Chemical Information
            06:01 · Shuwen Yang, … · NeurIPS 2021 · 3 years ago

            Differentially Private n-gram Extraction
            14:47 · Kunho Kim, … · NeurIPS 2021 · 3 years ago

            Modeling the human brain from invariance and robustness to clutter towards multimodal, multi-task and continuous learning models
            21:40 · Gemma Roig · NeurIPS 2021 · 3 years ago

            Fault-Tolerant Federated Reinforcement Learning with Theoretical Guarantee
            13:35 · Flint Xiaofeng Fan, … · NeurIPS 2021 · 3 years ago

            Compositional Transformers for Scene Generation
            11:54 · Drew A. Hudson, … · NeurIPS 2021 · 3 years ago

            Interested in talks like this? Follow NeurIPS 2021