A Bayesian hyperbolic latent space model with an inferable temperature parameter outperforms fixed-temperature and Euclidean models in network reconstruction by better capturing tree-like topologies.
Published in Advances in Neural Information Processing Systems.
2 Pith papers cite this work. Polarity classification is still indexing.
Citing papers:
-
Hyperbolic Latent Space Models for Network Embedding: Model Specification and Bayesian Inference
A Bayesian hyperbolic latent space model with an inferable temperature parameter outperforms fixed-temperature and Euclidean models in network reconstruction by better capturing tree-like topologies.
-
Attention-based graph neural networks: a survey
The survey groups attention-based GNNs into three stages—graph recurrent attention networks, graph attention networks, and graph transformers—while reviewing architectures and future directions.
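To make the hyperbolic latent space idea concrete: a minimal sketch of the usual setup, where nodes live in the Poincaré disk and an edge forms with probability that decays with hyperbolic distance, with a temperature parameter controlling how sharp that decay is. The logistic link, the `radius` cutoff, and the function names are generic hyperbolic-random-graph conventions assumed for illustration, not the cited paper's exact specification.

```python
import math

def poincare_distance(u, v):
    # Hyperbolic distance between two points inside the unit (Poincare) disk:
    # d(u, v) = arccosh(1 + 2||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2))).
    diff2 = sum((a - b) ** 2 for a, b in zip(u, v))
    nu = sum(a * a for a in u)
    nv = sum(b * b for b in v)
    return math.acosh(1.0 + 2.0 * diff2 / ((1.0 - nu) * (1.0 - nv)))

def edge_probability(u, v, radius=1.0, temperature=0.5):
    # Logistic link: nodes closer (hyperbolically) than `radius` connect with
    # high probability; `temperature` sets how sharply probability falls off
    # with distance. In a Bayesian treatment, `temperature` is a latent
    # variable inferred alongside the node positions rather than fixed.
    d = poincare_distance(u, v)
    return 1.0 / (1.0 + math.exp((d - radius) / temperature))

# A close pair of embeddings yields a high edge probability, a far pair a
# low one; shrinking `temperature` pushes both toward 0 or 1.
close_p = edge_probability((0.1, 0.0), (0.2, 0.0))
far_p = edge_probability((0.0, 0.0), (0.9, 0.0))
```

Because distances near the disk boundary blow up, a few hub nodes near the origin plus many leaf nodes near the boundary reproduce the tree-like degree structure that Euclidean latent spaces struggle to capture, which is the intuition behind the summary above.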