Recognition: unknown
Temporal connection probabilities in real networks
Pith reviewed 2026-05-08 04:57 UTC · model grok-4.3
The pith
A closed-form expression unifies latent geometry with memory to predict when links form in temporal networks.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
We derive a closed-form non-Markovian expression for next-step connection probabilities that unifies latent hyperbolic geometry with long-range memory of past interactions. This expression yields interpretable forecasts governed by a small set of parameters. Applied to large-scale real networks, we find quantitative agreement with empirical connection probabilities and reveal how geometry and memory jointly shape link dynamics. These results establish a minimal and extensible foundation for principled probabilistic forecasting of temporal network topology.
What carries the argument
The closed-form non-Markovian expression for next-step connection probabilities, which integrates latent hyperbolic geometry with memory of past interactions.
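The page does not reproduce the expression itself. As a minimal sketch only, assuming the standard Fermi–Dirac form for the geometric connection probability and the power-law kernel w_τ(λ) = τ^(−λ) quoted in the reference excerpts further down, a geometry-plus-memory blend might look like the following. The function names, the parameters `R`, `T`, `lam`, `omega`, and the convex mixing rule are illustrative assumptions, not the paper's closed form.

```python
import math

def geometric_probability(x: float, R: float, T: float) -> float:
    """Fermi-Dirac connection probability at hyperbolic distance x,
    the standard form in hyperbolic network models."""
    return 1.0 / (1.0 + math.exp((x - R) / (2.0 * T)))

def next_step_probability(x, history, R, T, lam, omega):
    """Hypothetical geometry-plus-memory blend (NOT the paper's expression).

    history: past link states for the pair (0/1), oldest first.
    lam:     memory exponent of the kernel w_tau = tau**(-lam).
    omega:   weight mixing the memory term against geometry.
    """
    p_geo = geometric_probability(x, R, T)
    if not history:
        return p_geo  # no past interactions: fall back to pure geometry
    n = len(history)
    # Most recent observation has age tau = 1 and hence the largest weight.
    weights = [(n - i) ** (-lam) for i in range(n)]
    memory = sum(w * a for w, a in zip(weights, history)) / sum(weights)
    return (1.0 - omega) * p_geo + omega * memory
```

With `omega` near 1 the forecast is dominated by recent link history, which qualitatively matches the strong-persistence regime described in the excerpts (ω₁ close to 1).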
If this is right
- Forecasts of future links become possible with a small number of interpretable parameters.
- The expression achieves quantitative agreement with observed connection probabilities across large real networks.
- Geometry and memory together determine the observed patterns of link formation.
- The approach supplies a minimal foundation for probabilistic prediction of how network topology evolves over time.
Where Pith is reading between the lines
- The same expression could be inserted into simulators of spreading processes to test whether geometry-plus-memory improves forecasts of epidemic or information diffusion.
- Parameter values that remain stable across networks suggest a route toward identifying universal scaling relations in temporal network growth.
- If the unification works, similar closed-form combinations of geometry and memory might be derived for other evolving systems such as citation graphs or contact networks.
Load-bearing premise
Real networks possess latent hyperbolic geometry, and long-range memory of past interactions can be captured by a non-Markovian closed-form expression with a small set of parameters that generalize across networks.
What would settle it
A collection of temporal networks in which the measured next-step connection probabilities deviate systematically from the values given by the derived expression, or in which the fitted parameters differ substantially from network to network.
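Such a systematic-deviation check can be sketched as a calibration test: bin node pairs by predicted probability and compare each bin's observed link frequency with its mean prediction. The binning scheme below is an assumption for illustration, not the paper's procedure.

```python
import numpy as np

def calibration_gap(p_pred, linked, n_bins=20):
    """Largest |observed link frequency - mean predicted probability| over
    occupied probability bins; a persistently large gap would signal the
    systematic deviation described above."""
    p_pred = np.asarray(p_pred, dtype=float)
    linked = np.asarray(linked, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    # Map each predicted probability to a bin index in [0, n_bins - 1].
    idx = np.clip(np.digitize(p_pred, edges) - 1, 0, n_bins - 1)
    gap = 0.0
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            gap = max(gap, abs(linked[mask].mean() - p_pred[mask].mean()))
    return gap
```

A well-calibrated model yields a gap near zero; repeating the fit per network would also expose the parameter instability mentioned above.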
Figures
Original abstract
Principled prediction of when and where links form in complex networks is a fundamental problem. We derive a closed-form non-Markovian expression for next-step connection probabilities that unifies latent hyperbolic geometry with long-range memory of past interactions. This expression yields interpretable forecasts governed by a small set of parameters. Applied to large-scale real networks, we find quantitative agreement with empirical connection probabilities and reveal how geometry and memory jointly shape link dynamics. These results establish a minimal and extensible foundation for principled probabilistic forecasting of temporal network topology.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The manuscript derives a closed-form non-Markovian expression for next-step connection probabilities in temporal networks. This expression unifies latent hyperbolic geometry with long-range memory of past interactions and produces interpretable forecasts controlled by a small set of parameters. The authors report quantitative agreement with empirical connection probabilities on large-scale real networks and conclude that geometry and memory jointly determine link-formation dynamics.
Significance. If the derivation is correct, the result supplies a minimal, extensible, and parameter-efficient framework for probabilistic forecasting of temporal network topology. The closed-form unification of hyperbolic geometry and non-Markovian memory, together with direct quantitative empirical comparisons rather than qualitative fits, constitutes a clear strength. The claim that a small parameter set generalizes across networks is also noteworthy and, on the evidence presented, appears internally consistent without circular reduction to fitted quantities.
minor comments (3)
- [§3.2] The precise functional form of the memory kernel is introduced without an explicit equation label; adding an equation number would improve traceability when the kernel is later inserted into the main probability expression.
- [Figure 4] The comparison plots would be clearer if the model curves were accompanied by shaded uncertainty bands derived from the parameter posterior or bootstrap resampling, rather than point estimates alone.
- [Abstract] The abstract states 'quantitative agreement' but does not name the metric (e.g., mean absolute error, R², or log-likelihood); stating the metric and its typical value in the abstract would help readers gauge the strength of the match immediately.
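For concreteness, the metrics named in the last comment can be computed per node pair as follows. The data layout (parallel sequences of predicted probabilities and 0/1 outcomes) is an assumption; the paper's actual evaluation pipeline is not shown on this page.

```python
import math

def mean_absolute_error(p, y):
    """MAE between predicted probabilities p and binary link outcomes y."""
    return sum(abs(pi - yi) for pi, yi in zip(p, y)) / len(p)

def bernoulli_log_likelihood(p, y, eps=1e-12):
    """Log-likelihood of outcomes y under independent Bernoulli(p_i);
    eps guards against log(0) for saturated predictions."""
    return sum(yi * math.log(max(pi, eps))
               + (1.0 - yi) * math.log(max(1.0 - pi, eps))
               for pi, yi in zip(p, y))
```

Reporting one of these values alongside "quantitative agreement" would let readers gauge the strength of the match directly.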
Simulated Author's Rebuttal
We thank the referee for the positive assessment of our manuscript and the recommendation for minor revision. The referee's summary accurately captures the core contribution: a closed-form non-Markovian expression unifying latent hyperbolic geometry with long-range memory, supported by quantitative empirical validation on real networks.
Circularity Check
Derivation is self-contained with no circularity
full rationale
The paper presents a derivation of a closed-form non-Markovian expression for next-step connection probabilities that unifies latent hyperbolic geometry with long-range memory of past interactions. This expression is derived from first principles rather than being defined in terms of its own outputs or fitted parameters. The model uses a small set of parameters whose values are determined from data, but the functional form itself is not forced by construction to match the predictions; instead, it is shown to recover empirical probabilities on real networks through direct quantitative comparisons. No self-definitional reductions, fitted inputs renamed as predictions, load-bearing self-citations, or ansatz smuggling are present. The central claim remains independent of its inputs and is externally falsifiable via network data.
Axiom & Free-Parameter Ledger
free parameters (1)
- A small set of governing parameters (the appendix excerpts name a persistence parameter ω₁ and a memory exponent λ)
axioms (2)
- Domain assumption: networks possess latent hyperbolic geometry.
- Domain assumption: long-range memory of past interactions can be expressed in closed non-Markovian form.
Reference graph
Works this paper leans on
- [1] Quantitative Finance (2003). "Each snapshot corresponds to a one-week interval, obtained by aggregating all AS links observed during that week. The network grows over time, from 466 nodes in the first snapshot to 7786 in the last. PGP Web of Trust: the PGP WoT topology snapshots were extracted from Ref. [27]. Nodes represent users' public-key certificates, and links denote mutual t…"
- [2] "Interpretation of the memory exponent: the power-law memory kernel entering the definition of Φ_ij^t, w_τ(λ) = τ^(−λ), τ = t − l + 1 (Eq. G4), suppresses the contribution of past interactions increasingly strongly as λ grows. As a result, sufficiently large values of λ place the model in a regime where interaction memory is dominated by the most recent observations, and th…"
- [3] "Identifiability in the strong-persistence regime: in all considered networks, the inferred persistence parameter ω₁ is high and, in PGP, Bitcoin, and arXiv, very close to 1. When ω₁ is close to 1, the predictive geometric probability p_ij^pred(t) becomes small for the vast majority of node pairs. Indeed, from Eq. (E7), one has p_ij^pred(t) ≈ (1 − ω₁)/(1 − ω₂) e…"
- [4] "At positive effective distances in the Φ > 0 branch, we again retain bins observed in more than 30 time steps. For the Φ = 0 branch, we retain all effective-distance bins at each time step and apply filtering across time as described earlier. The results for Φ > 0 are shown in Fig. 10 and those for Φ = 0 in Fig. 4 of the main text. In Bitcoin and arXiv, lin…"
- [5] D. Krioukov, "Brain theory," Frontiers in Computational Neuroscience 8 (2014), 10.3389/fncom.2014.00114.
- [6] I. Bonamassa, "Strange kinetics shape network growth," Physics 17, 96 (2024).
- [7] L. Lü, M. Medo, C. H. Yeung, Y.-C. Zhang, Z.-K. Zhang, and T. Zhou, "Recommender systems," Physics Reports 519, 1–49 (2012).
- [8] D. Easley and J. Kleinberg, Networks, Crowds, and Markets: Reasoning about a Highly Connected World (Cambridge University Press, 2010).
- [9] A. Gutfraind, "Understanding terrorist organizations with a dynamic model," in Mathematical Methods in Counterterrorism (Springer, Vienna, 2009), pp. 107–125.
- [10] A.-L. Barabási and R. Albert, "Emergence of scaling in random networks," Science 286, 509–512 (1999).
- [11] F. Papadopoulos, M. Kitsak, M. Á. Serrano, M. Boguñá, and D. Krioukov, "Popularity versus similarity in growing networks," Nature 489, 537–540 (2012).
- [12] F. Papadopoulos, C. Psomas, and D. Krioukov, "Network mapping by replaying hyperbolic growth," IEEE/ACM Transactions on Networking 23, 198–211 (2015).
- [13] V. Martínez, F. Berzal, and J.-C. Cubero, "A survey of link prediction in complex networks," ACM Computing Surveys 49 (2016), 10.1145/3012704.
- [14] M. Kitsak, I. Voitalov, and D. Krioukov, "Link prediction with hyperbolic geometry," Phys. Rev. Res. 2, 043113 (2020).
- [15] J. Xiong, A. Zareie, and R. Sakellariou, "A survey of link prediction in temporal networks," arXiv:2502.21185 [cs.AI] (2025).
- [16] W. Yu, W. Cheng, C. C. Aggarwal, H. Chen, and W. Wang, "Link prediction with spatial and temporal consistency in dynamic networks," in Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence (IJCAI-17) (2017), pp. 3343–3349.
- [17] X. Li, W. Liang, X. Zhang, X. Liu, and W. Wu, "A universal method based on structure subgraph feature for link prediction over dynamic networks," in 2019 IEEE 39th International Conference on Distributed Computing Systems (ICDCS) (IEEE, 2019), pp. 1210–1220.
- [18] L. Zou, X. Zhan, J. Sun, A. Hanjalic, and H. Wang, "Temporal network prediction and interpretation," IEEE Transactions on Network Science and Engineering 9, 1215–1224 (2022).
- [19] E. S. Papaefthymiou, C. Iordanou, and F. Papadopoulos, "Fundamental dynamics of popularity-similarity trajectories in real networks," Phys. Rev. Lett. 132, 257401 (2024).
- [20] D. Krioukov, F. Papadopoulos, M. Kitsak, A. Vahdat, and M. Boguñá, "Hyperbolic geometry of complex networks," Phys. Rev. E 82, 036106 (2010).
- [21] L. Zou, A. Ceria, and H. Wang, "Short- and long-term temporal network prediction based on network memory," Applied Network Science 8, 76 (2023).
- [22] G. García-Pérez, A. Allard, M. Á. Serrano, and M. Boguñá, "Mercator: uncovering faithful hyperbolic embeddings of complex networks," New Journal of Physics 21, 123033 (2019).
- [23] C. L. Vestergaard, M. Génois, and A. Barrat, "How memory generates heterogeneous dynamics in temporal networks," Phys. Rev. E 90, 042805 (2014).
- [24] P. Holme and J. Saramäki, "Temporal networks," Physics Reports 519, 97–125 (2012).
- [25] O. E. Williams, F. Lillo, and V. Latora, "Effects of memory on spreading processes in non-Markovian temporal networks," New Journal of Physics 21, 043028 (2019).
- [26] M. Boguñá, I. Bonamassa, M. De Domenico, S. Havlin, D. Krioukov, and M. Á. Serrano, "Network geometry," Nature Reviews Physics 3, 114–135 (2021).
- [27] S. Zambirinis and F. Papadopoulos, "(ω₁, ω₂)-temporal random hyperbolic graphs," Phys. Rev. E 110, 024309 (2024).
- [28] R. Jankowski, A. Allard, M. Boguñá, and M. Á. Serrano, "The d-mercator method for the multidimensional hyperbolic embedding of real networks," Nature Communications 14, 7585 (2023).
- [29] B. Désy, P. Desrosiers, and A. Allard, "Dimension matters when modeling network communities in hyperbolic spaces," PNAS Nexus 2, pgad136 (2023).
- [30] "Ark IPv6 Topology Dataset," https://www.caida.org/catalog/datasets/ipv6_allpref_topology_dataset/, accessed February 2026.
- [31] "OpenPGP Web of Trust Dataset," https://www.lysator.liu.se/~jc/wotsap/wots2/, accessed February 2026.
- [32] "Kaggle - Bitcoin Dataset for analysis," https://www.kaggle.com/datasets/chrysanthikosyfaki/bitcoin-dataset-for-analysis, accessed February 2026.
- [33] "Kaggle - arXiv Dataset," https://www.kaggle.com/datasets/Cornell-University/arxiv, accessed February 2026.
discussion (0)