Provably expressive temporal graph networks

Nov 28, 2022

About

Temporal graph networks (TGNs) have gained prominence as models for embedding dynamic interactions, but little is known about their theoretical underpinnings. We establish fundamental results about the representational power and limits of the two main categories of TGNs: WA-TGNs, which aggregate temporal walks, and MP-TGNs, which augment local message passing with (recurrent) memory modules. Specifically, novel constructions reveal the inadequacy of MP-TGNs and WA-TGNs, proving that neither category subsumes the other. We extend the 1-WL (Weisfeiler-Leman) test to temporal graphs and show that the most powerful MP-TGNs should use injective updates, as in this case they become as expressive as the temporal WL test. Moreover, we show that sufficiently deep MP-TGNs cannot benefit from memory, and that MP-TGNs fail to compute graph properties such as girth. These theoretical insights lead us to introduce PINT, a method provably more expressive than MP-TGNs, WA-TGNs, and the temporal WL test. Our experiments demonstrate that PINT outperforms existing TGNs on several real-world benchmarks.
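To make the temporal WL reference concrete, below is a minimal, illustrative sketch of one way a colour-refinement test could be run on timestamped interactions. The event format, the function name temporal_wl, and the fixed number of rounds are assumptions made here for illustration; this is not the paper's definition or implementation.

```python
from collections import defaultdict

def temporal_wl(events, num_nodes, rounds=3):
    """Illustrative sketch (assumed interface, not the paper's code).
    events: iterable of (u, v, t) timestamped interactions.
    Returns a colour per node after `rounds` of refinement."""
    colours = {v: 0 for v in range(num_nodes)}  # uniform initial colouring
    # Index temporal neighbourhoods: node -> list of (neighbour, timestamp).
    nbrs = defaultdict(list)
    for u, v, t in events:
        nbrs[u].append((v, t))
        nbrs[v].append((u, t))
    for _ in range(rounds):
        # Each node's signature is its own colour plus the multiset of
        # (neighbour colour, timestamp) pairs from its interactions.
        signatures = {
            v: (colours[v],
                tuple(sorted((colours[u], t) for u, t in nbrs[v])))
            for v in range(num_nodes)
        }
        # Injective relabelling: equal signatures receive equal new colours,
        # distinct signatures receive distinct ones (the role played by
        # "injective updates" in the expressiveness result).
        table = {}
        colours = {v: table.setdefault(sig, len(table))
                   for v, sig in signatures.items()}
    return colours

# Example: a small stream of timestamped interactions on four nodes.
events = [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 3.0)]
print(temporal_wl(events, num_nodes=4))
```

Two temporal graphs that receive different colour histograms under such a refinement are distinguishable; the paper's result is that MP-TGNs with injective aggregation/update functions match this distinguishing power, while PINT's relative positional features push beyond it.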

Organizer

NeurIPS 2022