Jul 24, 2023
Graph neural networks (GNNs) are powerful tools for analyzing graph-structured data, and their expressiveness can be further enhanced by attention, i.e., inferring node relations for propagation. Attention-based GNNs infer neighbor importance to weight their propagation. Despite the success of graph attention in various fields, the discussion of deep graph attention and its unique challenges has been limited. In this work, we investigate problematic phenomena related to deep graph attention, including vulnerability to over-smoothed features and smooth cumulative attention. Through theoretical and empirical analysis, we show that various state-of-the-art graph attention models suffer from these problems. Motivated by our findings, we propose AERO-GNN, a novel GNN architecture designed for deep graph attention. We theoretically show that AERO-GNN can mitigate these problems, further evidenced by its higher performance and adaptive attention distribution at deep layers (up to 64) in comparison to the baseline attention-based GNNs. On 10 out of 12 node classification benchmarks, AERO-GNN outperforms the baseline attention-based GNNs, further highlighting the advantages of deep graph attention.
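To illustrate the mechanism the abstract refers to, below is a minimal sketch of a generic attention-based GNN layer in plain PyTorch: the layer infers a score for each edge, normalizes the scores over each node's neighbors, and uses them to weight message propagation. This is a GAT-style illustration under assumed names (AttentionLayer, edge_index), not the AERO-GNN architecture itself.

```python
# Minimal sketch of attention-weighted propagation in a GNN layer (assumed
# GAT-style design; not the AERO-GNN implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionLayer(nn.Module):
    """Infer a score per edge, normalize over each node's neighbors,
    and use the result to weight neighbor aggregation."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)
        self.att_src = nn.Parameter(torch.randn(out_dim))
        self.att_dst = nn.Parameter(torch.randn(out_dim))

    def forward(self, x, edge_index):
        # x: [num_nodes, in_dim]; edge_index: [2, num_edges] as (src, dst)
        h = self.lin(x)
        src, dst = edge_index
        # Raw attention score for each edge from its endpoint features
        score = F.leaky_relu((h[src] * self.att_src).sum(-1)
                             + (h[dst] * self.att_dst).sum(-1), 0.2)
        # Softmax over the incoming edges of each destination node
        score = score - score.max()  # numerical stability
        alpha = torch.exp(score)
        denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, alpha)
        alpha = alpha / denom[dst].clamp_min(1e-16)
        # Propagate: each node aggregates neighbor features weighted by alpha
        out = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * h[src])
        return out


if __name__ == "__main__":
    x = torch.randn(4, 8)                         # 4 nodes, 8 features each
    edge_index = torch.tensor([[0, 1, 2, 3, 0],   # source nodes
                               [1, 0, 1, 1, 3]])  # destination nodes
    layer = AttentionLayer(8, 16)
    print(layer(x, edge_index).shape)             # torch.Size([4, 16])
```

Stacking many such layers is where the problems discussed in the talk arise: the node features that the attention scores are computed from can become over-smoothed, and the attention accumulated across layers can itself become smooth and non-adaptive.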