TY - JOUR
T1 - Flow2GNN: Flexible Two-Way Flow Message Passing for Enhancing GNNs Beyond Homophily
AU - Huang, C.
AU - Wang, Y.
AU - Jiang, Y.
AU - Li, M.
AU - Huang, X.
AU - Wang, S.
AU - Pan, S.
AU - Zhou, C.
PY - 2024/7/10
Y1 - 2024/7/10
N2 - Message passing (MP) is crucial for effective graph neural networks (GNNs). Most local message-passing schemes have been shown to underperform on heterophily graphs because local redundant heterophily information perturbs the updated representations. However, our experimental findings indicate that the distribution of heterophily information during MP can be disrupted by disentangling local neighborhoods. This finding can be applied to other GNNs, improving their performance on heterophily graphs in a more flexible manner than most heterophily GNNs with complex designs. This article proposes a new type of simple message-passing neural network called Flow2GNN. It uses a two-way flow message-passing scheme to enhance the ability of GNNs by disentangling and redistributing heterophily information in the topology space and the attribute space. Our proposed message-passing scheme consists of two steps, one in topology space and one in attribute space. First, we introduce a new disentangled operator with binary elements that disentangles topology information into in-flow and out-flow between connected nodes. Second, we use an adaptive aggregation model that adjusts the flow amount between homophily and heterophily attribute information. Furthermore, we rigorously prove that disentangling in message passing can reduce the generalization gap, offering a deeper understanding of how our model enhances other GNNs. Extensive experimental results show that the proposed model, Flow2GNN, not only outperforms state-of-the-art GNNs but also improves the performance of other commonly used GNNs on heterophily graphs, including GCN, GAT, GCNII, and H2GCN; for GCN in particular, the improvement reaches 25.88% on the Wisconsin dataset.
AB - Message passing (MP) is crucial for effective graph neural networks (GNNs). Most local message-passing schemes have been shown to underperform on heterophily graphs because local redundant heterophily information perturbs the updated representations. However, our experimental findings indicate that the distribution of heterophily information during MP can be disrupted by disentangling local neighborhoods. This finding can be applied to other GNNs, improving their performance on heterophily graphs in a more flexible manner than most heterophily GNNs with complex designs. This article proposes a new type of simple message-passing neural network called Flow2GNN. It uses a two-way flow message-passing scheme to enhance the ability of GNNs by disentangling and redistributing heterophily information in the topology space and the attribute space. Our proposed message-passing scheme consists of two steps, one in topology space and one in attribute space. First, we introduce a new disentangled operator with binary elements that disentangles topology information into in-flow and out-flow between connected nodes. Second, we use an adaptive aggregation model that adjusts the flow amount between homophily and heterophily attribute information. Furthermore, we rigorously prove that disentangling in message passing can reduce the generalization gap, offering a deeper understanding of how our model enhances other GNNs. Extensive experimental results show that the proposed model, Flow2GNN, not only outperforms state-of-the-art GNNs but also improves the performance of other commonly used GNNs on heterophily graphs, including GCN, GAT, GCNII, and H2GCN; for GCN in particular, the improvement reaches 25.88% on the Wisconsin dataset.
U2 - 10.1109/TCYB.2024.3412149
DO - 10.1109/TCYB.2024.3412149
M3 - Article
C2 - 38985552
SN - 2168-2275
SP - 1
EP - 12
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
ER -