pith. machine review for the scientific record.

arxiv: 2605.00951 · v1 · submitted 2026-05-01 · 💻 cs.LG · cs.AI

Recognition: unknown

Graph Rewiring in GNNs to Mitigate Over-Squashing and Over-Smoothing: A Survey

Authors on Pith no claims yet

Pith reviewed 2026-05-09 19:22 UTC · model grok-4.3

classification 💻 cs.LG cs.AI
keywords graph neural networks · over-squashing · over-smoothing · graph rewiring · message passing · topology modification · information propagation

The pith

Graph rewiring modifies topologies to improve information flow and mitigate over-squashing and over-smoothing in Graph Neural Networks.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

This paper surveys graph rewiring methods that change the structure of graphs to address limitations in Graph Neural Networks. Over-squashing compresses distant node information during propagation, and over-smoothing makes representations from different nodes nearly identical after several steps. These problems occur because message passing depends heavily on the original connections in the graph. The review covers how rewiring approaches, such as adding shortcut edges or reweighting connections, can enhance the spread of information across longer distances while keeping node features distinct. Such improvements would allow GNNs to perform better on tasks involving complex, large-scale graphs like those in chemistry or social analysis.
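The shortcut-edge idea mentioned above can be sketched in a few lines. This is a deliberately naive illustration, not any specific method from the survey: it adds an edge between every pair of nodes whose hop distance exceeds an arbitrary threshold, ignoring the density and spectral considerations that real rewiring methods weigh.

```python
from collections import deque

def bfs_distances(adj, source):
    """Hop distances from `source` via breadth-first search."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def rewire_add_shortcuts(adj, max_dist):
    """Return a rewired copy of `adj` (dict of node -> set of neighbours)
    with a shortcut edge between every node pair more than `max_dist`
    hops apart in the ORIGINAL graph. Purely illustrative."""
    rewired = {u: set(vs) for u, vs in adj.items()}
    nodes = list(adj)
    for i, u in enumerate(nodes):
        dist = bfs_distances(adj, u)  # distances in the input graph
        for v in nodes[i + 1:]:
            if dist.get(v, float("inf")) > max_dist:
                rewired[u].add(v)
                rewired[v].add(u)
    return rewired
```

On the 5-node path graph 0–1–2–3–4 with `max_dist=2`, the sketch adds shortcuts (0,3), (0,4), and (1,4), after which no pair of nodes is more than two hops apart, while the original adjacency is left untouched.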

Core claim

The central claim is that graph rewiring techniques, designed to modify the graph topology, enhance information propagation in GNNs and thereby mitigate over-squashing and over-smoothing. The survey examines the theoretical foundations, practical applications, and performance trade-offs of current rewiring methods in the literature.

What carries the argument

Graph rewiring, the modification of graph edges or their weights to facilitate improved message passing between nodes in a GNN.

Load-bearing premise

The root causes of over-squashing and over-smoothing lie in the interaction between standard message passing and the given graph topology, and rewiring can correct this interaction without significant unintended side effects.

What would settle it

A controlled comparison: run the same GNN with and without rewiring on a benchmark graph dataset known to suffer from over-squashing. If the rewired version achieves equal or lower accuracy, the central claim fails; a consistent accuracy gain would support it.
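That comparison can be written as a small harness. Everything named here is a placeholder: `train_and_eval` stands in for any GNN training loop returning test accuracy, and `rewire` for whichever rewiring method is under test.

```python
def ab_test_rewiring(train_and_eval, graph, rewire, labels, seeds=(0, 1, 2)):
    """Sketch of the decisive experiment: train the same model on the
    original and the rewired graph across several seeds, then compare
    mean accuracy. `train_and_eval(graph, labels, seed) -> accuracy`
    and `rewire(graph) -> graph` are hypothetical stand-ins."""
    base = [train_and_eval(graph, labels, s) for s in seeds]
    rew = [train_and_eval(rewire(graph), labels, s) for s in seeds]
    mean = lambda xs: sum(xs) / len(xs)
    return {
        "baseline": mean(base),
        "rewired": mean(rew),
        "claim_holds": mean(rew) > mean(base),  # did rewiring help?
    }
```

Averaging over seeds matters because single-run GNN accuracies on small benchmarks are noisy; a real study would also fix the backbone and hyperparameters across both arms.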

Figures

Figures reproduced from arXiv: 2605.00951 by Davide Buscaldi, Fragkiskos D. Malliaros, Hugo Attali, Nathalie Pernelle.

Figure 1
Figure 1. Example of a graph G and its rewired instance G+, with node colours encoding commute times (see Section 3.3) from the highlighted red node. In G+, intra-clique density is reduced while connectivity across the bottleneck is strengthened, which shortens and balances commute times. This yields a topology more favourable to information propagation, thereby mitigating over-smoothing and over-squashing.
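The commute times behind the figure's colour coding can be estimated without any linear algebra by simulating random walks. The exact quantity satisfies C(u, v) = 2m · R_eff(u, v), where m is the edge count and R_eff the effective resistance; the sketch below is a Monte-Carlo stand-in for that closed form.

```python
import random

def commute_time_mc(adj, u, v, trials=2000, rng=None):
    """Monte-Carlo estimate of the commute time between u and v: the
    expected number of random-walk steps to go from u to v and back.
    `adj` maps each node to a list of neighbours."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    total = 0
    for _ in range(trials):
        node, target, legs = u, v, 0
        while legs < 2:  # leg 1: u -> v, leg 2: v -> u
            node = rng.choice(adj[node])
            total += 1
            if node == target:
                legs += 1
                target = u  # turn around and walk back
    return total / trials
```

On the path graph 0–1–2, the estimate for the pair (0, 1) converges to 4, matching 2m · R_eff = 2 · 2 · 1 for that unit-resistance edge.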
read the original abstract

Graph Neural Networks are powerful models for learning from graph-structured data, yet their effectiveness is often limited by two critical challenges: over-squashing, where information from distant nodes is excessively compressed, and over-smoothing, where repeated propagation makes node representations indistinguishable. Both phenomena stem from the interaction between message passing and the input topology, ultimately degrading information flow and limiting the performance of GNNs. In this survey, we examine graph rewiring techniques, a class of methods designed to modify the graph topology to enhance information propagation in GNNs. We provide a comprehensive review of state-of-the-art rewiring approaches, delving into their theoretical underpinnings, practical implementations, and performance trade-offs.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

0 major / 3 minor

Summary. This survey reviews graph rewiring techniques in GNNs that modify input topology to improve message passing and thereby mitigate over-squashing (excessive compression of distant-node information) and over-smoothing (indistinguishable node representations after repeated propagation). The central descriptive claim is that such rewiring methods enhance information flow; the manuscript covers their theoretical motivations, implementations, and trade-offs without advancing new theorems or experiments.

Significance. If the coverage is accurate and reasonably complete, the survey would serve as a useful organizing resource for the GNN community, consolidating scattered literature on a recognized pair of limitations and highlighting practical design choices. Its value is primarily archival and pedagogical rather than generative of new results.

minor comments (3)
  1. [Abstract] The abstract states that both phenomena 'stem from the interaction between message passing and the input topology' but does not cite the foundational references (e.g., the original over-squashing or over-smoothing papers) that establish this causal link; adding 1-2 key citations here would strengthen the opening claim.
  2. A high-level taxonomy or decision tree classifying rewiring methods (e.g., by whether they add/remove edges, operate statically or dynamically, or preserve vs. alter spectral properties) would improve navigability; the current organization appears to proceed method-by-method without an explicit organizing framework.
  3. Performance trade-offs are mentioned but the manuscript should explicitly note any systematic empirical gaps, such as lack of standardized benchmarks across rewiring papers or missing comparisons on the same datasets and GNN backbones.
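The taxonomy requested in comment 2 could be as simple as one record per method along the three suggested axes. The field names and example values below are illustrative placeholders, not claims about the survey's actual classification of any method.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RewiringMethod:
    """One row of a hypothetical survey taxonomy table."""
    name: str             # method identifier
    edge_op: str          # "add", "remove", or "both"
    schedule: str         # "static" (preprocessing) or "dynamic" (per layer/epoch)
    spectral_aware: bool  # targets spectral quantities (gap, effective resistance)?

def group_by(methods, axis):
    """Bucket methods along one taxonomy axis, survey-table style."""
    groups = {}
    for m in methods:
        groups.setdefault(getattr(m, axis), []).append(m.name)
    return groups
```

A reader could then slice the literature along any axis (`group_by(methods, "schedule")`, say) instead of scanning a method-by-method narrative.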

Simulated Author's Rebuttal

0 responses · 0 unresolved

We thank the referee for their positive assessment of our survey and for recommending minor revision. We appreciate the recognition that the manuscript could serve as a useful organizing resource for the GNN community if the coverage is accurate and complete.

Circularity Check

0 steps flagged

No significant circularity: survey with no derivations or predictions

full rationale

This is a literature survey paper whose content consists of descriptive reviews of existing rewiring methods, their theoretical underpinnings from prior work, and performance trade-offs reported in the literature. No original equations, fitted parameters, predictions, or derivation chains are advanced by the authors. The abstract and structure confirm it summarizes state-of-the-art approaches without introducing self-referential claims, uniqueness theorems, or ansatzes that could reduce to the paper's own inputs. All load-bearing elements are external citations to independent prior research, satisfying the criteria for non-circularity.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

The paper is a survey and does not introduce or rely on new free parameters, axioms, or invented entities; it synthesizes prior work on GNN limitations and rewiring solutions.

pith-pipeline@v0.9.0 · 5427 in / 1032 out tokens · 26675 ms · 2026-05-09T19:22:43.831934+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.

Reference graph

Works this paper leans on

58 extracted references · 7 canonical work pages · 1 internal anchor
