HICF: Hyperbolic Informative Collaborative Filtering
2 Pith papers cite this work.
Citing papers:
- Improving Graph Few-shot Learning with Hyperbolic Space and Denoising Diffusion
  IMPRESS improves graph few-shot learning by learning representations in hyperbolic space and using denoising diffusion to better approximate target distributions from few support samples.
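The summary above leans on hyperbolic geometry. Not from the cited paper itself, but as a minimal illustration of what "representations in hyperbolic space" buys you, here is the geodesic distance in the Poincaré ball model (one common choice; the cited works may use other models). Distances blow up near the boundary, which is why hyperbolic space suits hierarchical, tree-like graph data.

```python
import math

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit Poincare ball.

    d(u, v) = acosh(1 + 2 * |u - v|^2 / ((1 - |u|^2) * (1 - |v|^2)))
    """
    sq = lambda x: sum(xi * xi for xi in x)            # squared Euclidean norm
    diff = [ui - vi for ui, vi in zip(u, v)]
    num = 2.0 * sq(diff)
    den = (1.0 - sq(u)) * (1.0 - sq(v))                # shrinks near the boundary
    return math.acosh(1.0 + num / max(den, eps))

# Near the origin the metric is almost Euclidean:
print(poincare_distance([0.0, 0.0], [0.1, 0.0]))   # ≈ 0.2007 (= 2 * atanh(0.1))
# Near the boundary the same Euclidean gap is much farther apart:
print(poincare_distance([0.9, 0.0], [0.95, 0.0]))
```

The second call illustrates the key property: two points 0.05 apart in Euclidean terms are hyperbolically far when they sit close to the boundary, giving the model room to embed exponentially growing neighborhoods.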
- Attention-based graph neural networks: a survey
  The survey groups attention-based GNNs into three stages—graph recurrent attention networks, graph attention networks, and graph transformers—while reviewing architectures and future directions.
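For the middle stage in that taxonomy, a minimal sketch may help fix ideas. This is a single-head, scalar-feature version of the GAT-style attention coefficient (LeakyReLU score followed by a softmax over each node's incoming edges); the feature values and parameters `a_src`, `a_dst` below are hypothetical, not taken from the survey.

```python
import math

def gat_attention(h, edges, a_src, a_dst):
    """Single-head GAT-style attention coefficients (scalar-feature sketch).

    h      : dict node -> scalar feature
    edges  : list of (src, dst) pairs
    a_src, a_dst : scalar attention parameters (illustrative only)
    Returns {dst: {src: weight}} with weights softmax-normalized per dst.
    """
    leaky = lambda x: x if x > 0 else 0.2 * x          # LeakyReLU, slope 0.2
    # Unnormalized score for each edge, as in the GAT formulation.
    scores = {(s, d): leaky(a_src * h[s] + a_dst * h[d]) for s, d in edges}
    # Softmax over the incoming edges of each destination node.
    out = {}
    for (s, d), e in scores.items():
        out.setdefault(d, {})[s] = math.exp(e)
    for d, nbrs in out.items():
        z = sum(nbrs.values())
        for s in nbrs:
            nbrs[s] /= z
    return out

h = {0: 1.0, 1: 2.0, 2: -1.0}
weights = gat_attention(h, [(0, 2), (1, 2)], a_src=0.5, a_dst=0.3)
print(weights[2])   # node 2 attends over neighbors 0 and 1; weights sum to 1
```

Graph transformers, the survey's third stage, generalize this by letting every node attend to every other node (with positional or structural encodings) rather than only graph neighbors.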