pith. machine review for the scientific record.

arxiv: 2604.07493 · v1 · submitted 2026-04-08 · 💻 cs.CR · cs.LG · stat.AP

Recognition: unknown

Differentially Private Modeling of Disease Transmission within Human Contact Networks

Authors on Pith: no claims yet

Pith reviewed 2026-05-10 17:18 UTC · model grok-4.3

classification 💻 cs.CR · cs.LG · stat.AP
keywords differential privacy · contact networks · disease transmission · stochastic block models · exponential random graph models · agent-based simulation · epidemiology

The pith

Differential privacy on contact network summaries adds less error to disease spread simulations than sampling or model misspecification.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper develops a pipeline that applies differential privacy to node-level network statistics from sensitive contact data, fits statistical models such as ERGMs or SBMs to those noisy summaries, generates synthetic networks, and runs agent-based SIS disease simulations on the results. It evaluates this on egocentric sexual network data and finds that the added privacy noise produces only small changes in simulated incidence, prevalence, and intervention effects relative to errors from sampling and model misspecification. A reader would care because contact networks contain private details yet are essential for understanding infectious disease dynamics, so this approach could let curators release useful public-health insights without exposing individuals.

Core claim

When node-level differentially private summaries of a contact network are computed, an ERGM or SBM is fitted to those summaries to produce synthetic networks that reflect the original structure, and SIS disease spread is then simulated via agent-based modeling, the privacy-induced noise remains small compared with sampling variability and model misspecification on ARTNet egocentric sexual network data. Numerical outcomes such as incidence and prevalence, as well as qualitative conclusions about intervention effect sizes, stay comparable with and without the privacy step.
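To make the first step concrete, here is a minimal sketch of a node-level DP release of a truncated degree histogram via the Laplace mechanism. The sensitivity bound 2Δ + 1, the example degrees, and the function name are editorial assumptions, not the paper's exact mechanism:

```python
import math
import random

def noisy_degree_histogram(degrees, max_degree, epsilon, rng):
    """Node-level DP release of a degree histogram, truncated at max_degree.

    Assumption: truncation bounds one node's influence, so adding or
    removing a node changes the histogram by at most 2*max_degree + 1
    in L1 norm (its own bin, plus two bin shifts per affected neighbour).
    """
    truncated = [min(d, max_degree) for d in degrees]
    hist = [0] * (max_degree + 1)
    for d in truncated:
        hist[d] += 1
    scale = (2 * max_degree + 1) / epsilon  # Laplace scale = sensitivity / epsilon

    def laplace_noise():
        # Inverse-CDF sampling of Laplace(0, scale).
        u = rng.random() - 0.5
        return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

    return [count + laplace_noise() for count in hist]

# Example release: degrees of an 8-node network, truncated at 3, epsilon = 1.
true_degrees = [1, 2, 2, 3, 5, 1, 0, 2]
noisy = noisy_degree_histogram(true_degrees, max_degree=3, epsilon=1.0, rng=random.Random(0))
```

At large ε the release approaches the exact truncated histogram; at small ε the model-fitting step must absorb heavier noise.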

What carries the argument

The three-step pipeline of node-level differential privacy on network summaries, statistical model fitting (SBM or ERGM) to generate synthetic networks, and agent-based SIS disease simulation.
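The third step of that pipeline can be sketched as a minimal discrete-time SIS agent-based simulation. The adjacency-list representation and the per-step transmission (beta) and recovery (gamma) probabilities are illustrative assumptions, not the paper's simulation configuration:

```python
import random

def simulate_sis(adj, beta, gamma, steps, seed_infected, rng):
    """Discrete-time SIS on an adjacency list (dict: node -> list of neighbours).

    beta  -- per-contact, per-step transmission probability (assumed)
    gamma -- per-step recovery probability (assumed)
    Returns the prevalence (fraction infected) after each step.
    """
    infected = set(seed_infected)
    prevalence = []
    for _ in range(steps):
        new_infected = set(infected)
        for node in infected:                 # infection along edges
            for nbr in adj[node]:
                if nbr not in infected and rng.random() < beta:
                    new_infected.add(nbr)
        for node in infected:                 # recovery back to susceptible
            if rng.random() < gamma:
                new_infected.discard(node)
        infected = new_infected
        prevalence.append(len(infected) / len(adj))
    return prevalence

# Deterministic check: on a triangle with certain transmission and no
# recovery, one seed infects everyone in a single step.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
curve = simulate_sis(adj, beta=1.0, gamma=0.0, steps=2, seed_infected=[0], rng=random.Random(42))
# curve == [1.0, 1.0]
```

Running this same simulator on original-sample and DP-synthetic networks is what makes the incidence and prevalence comparisons meaningful.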

If this is right

  • Simulated disease incidence and prevalence stay close between private and non-private networks.
  • Qualitative findings on the size of intervention effects remain consistent with and without privacy protection.
  • Curators of sensitive contact data can release epidemiologic insights derived from simulations while satisfying differential privacy.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the authors make directly.

  • The same pipeline could be tested on networks with different degree distributions or community structures to check whether the relative size of privacy noise stays small.
  • If stronger statistical models reduce misspecification error, the practical cost of adding differential privacy would shrink further.
  • Extending the approach to other simulation domains that rely on private networks, such as information or behavior diffusion, would be a direct next step.

Load-bearing premise

The statistical network models fitted to the differentially private summaries capture the structural features that actually drive transmission dynamics in the subsequent agent-based simulations.

What would settle it

Running the same disease simulations on many independent samples of the original network and on DP-protected synthetic networks, then checking whether the spread in incidence or prevalence introduced by the privacy step exceeds the spread from sampling or from alternative model fits.
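One way to operationalize that comparison is a law-of-total-variance split of simulated prevalence, with DP releases as the outer nesting level. The two-level split below is an editorial sketch, not the paper's analysis:

```python
import statistics

def variance_decomposition(outcomes):
    """Two-level law-of-total-variance split.

    outcomes maps a DP-release id to the list of prevalence values from
    all synthetic networks / simulation runs under that release.
    Returns (between-release variance, mean within-release variance).
    """
    release_means = [statistics.fmean(runs) for runs in outcomes.values()]
    between = statistics.pvariance(release_means)
    within = statistics.fmean(statistics.pvariance(runs) for runs in outcomes.values())
    return between, within
```

If the between-release term stays small relative to the within-release term, the privacy step contributes little of the total variation, which is the paper's central claim.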

Figures

Figures reproduced from arXiv: 2604.07493 by Adam Smith, Debanuj Nayak, Iden Kalemaj, Jason R. Gantenberg, Shlomi Hod, Thomas A. Trikalinos.

Figure 1: An overview of our pipeline for differentially private (DP) simulation of infectious disease trans…
Figure 2: Epidemic dynamics comparing baseline and intervention scenarios under the “high prevalence”…
Figure 3: Network representation and its corresponding mixing matrix.
Figure 4: An illustration of the experimental design to evaluate our pipeline as described in Section…
Figure 5: Edge-level network statistics presented on a relative scale, showing percentage difference from the…
Figure 6: Prevalence ratios comparing intervention effects across different modeling approaches and preva…
Figure 7: Granular analysis by age and race for ERGM high prevalence condition across different maximum…
Figure 8: Effect of privacy budget (ε) on prevalence ratio with maximum degree truncated at 3 across all four modeling scenarios. …of the total variation can be attributed to: (1) the differentially private release of summary statistics; (2) sampling a synthetic network from a model fitted on summary statistics; and (3) simulating an epidemic on a synthetic network. These sources of randomness can be illustrated by t…
Figure 9: Variance analysis at privacy budget ε = 1 and truncated degree ∆ = 3 showing the breakdown of different sources of randomness in the experimental pipeline for ERGM at High Prevalence condition. The plots show the average prevalence of the baseline scenario over all synthetic networks and simulations (solid line), per differentially private release (dashed line), per synthetic network (plus sign), and the range p…
Figure 10: Prevalence for the baseline scenario across different modeling approaches and prevalence condi…
Figure 11: Prevalence for the intervention scenario across different modeling approaches and prevalence…
Figure 12: Incidence rate ratios comparing intervention effects across different modeling approaches and…
Figure 13: Granular analysis by age and race for ERGM low prevalence condition across different maximum…
Figure 14: Granular analysis by age and race for BM high prevalence condition across different maximum…
Figure 15: Granular analysis by age and race for BM low prevalence condition across different maximum…
Figure 16: Variance analysis at privacy budget ε = 1 and truncated degree ∆ = 3 showing the breakdown of different sources of randomness in the experimental pipeline across different modeling approaches and prevalence conditions. The plots show the average prevalence of the baseline scenario over all synthetic networks and simulations (solid line), per differentially private release (dashed line), per synthetic network (…
Figure 17: Degree-level network statistics across different modeling approaches. The x-axis represents the…
Figure 18: Prevalence ratio across different maximum degree thresholds for age group 25-34 across all four…
Figure 19: Prevalence ratio across different maximum degree thresholds for Black population across all four…
read the original abstract

Epidemiologic studies of infectious diseases often rely on models of contact networks to capture the complex interactions that govern disease spread, and ongoing projects aim to vastly increase the scale at which such data can be collected. However, contact networks may include sensitive information, such as sexual relationships or drug use behavior. Protecting individual privacy while maintaining the scientific usefulness of the data is crucial. We propose a privacy-preserving pipeline for disease spread simulation studies based on a sensitive network that integrates differential privacy (DP) with statistical network models such as stochastic block models (SBMs) and exponential random graph models (ERGMs). Our pipeline comprises three steps: (1) compute network summary statistics using node-level DP (which corresponds to protecting individuals' contributions); (2) fit a statistical model, like an ERGM, using these summaries, which allows generating synthetic networks reflecting the structure of the original network; and (3) simulate disease spread on the synthetic networks using an agent-based model. We evaluate the effectiveness of our approach using a simple Susceptible-Infected-Susceptible (SIS) disease model under multiple configurations. We compare both numerical results, such as simulated disease incidence and prevalence, as well as qualitative conclusions such as intervention effect size, on networks generated with and without differential privacy constraints. Our experiments are based on egocentric sexual network data from the ARTNet study (a survey about HIV-related behaviors). Our results show that the noise added for privacy is small relative to other sources of error (sampling and model misspecification). This suggests that, in principle, curators of such sensitive data can provide valuable epidemiologic insights while protecting privacy.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

3 major / 2 minor

Summary. The paper proposes a three-step pipeline for privacy-preserving disease transmission modeling: (1) apply node-level differential privacy to network summary statistics from sensitive contact data, (2) fit statistical network models such as stochastic block models (SBMs) or exponential random graph models (ERGMs) to the noisy summaries to generate synthetic networks, and (3) run agent-based SIS simulations on those networks to study incidence, prevalence, and intervention effects. Using egocentric sexual contact data from the ARTNet study, the authors compare epidemic outcomes and qualitative conclusions between DP-constrained and non-DP versions, concluding that the added privacy noise is small relative to sampling and model misspecification errors.

Significance. If the central empirical claim holds with adequate quantification, the work would demonstrate a practical way to release useful epidemiologic insights from sensitive human contact networks while providing formal privacy guarantees. This bridges differential privacy techniques with network epidemiology and could inform data-sharing policies for public-health studies. The use of real ARTNet data and direct comparison of simulation outputs (rather than only summary statistics) is a positive aspect, though the evaluation's lack of detail on privacy parameters and statistical rigor reduces the immediate impact.

major comments (3)
  1. [Abstract, evaluation section] The central claim that 'the noise added for privacy is small relative to other sources of error (sampling and model misspecification)' is presented without reported effect sizes, confidence intervals, or statistical tests comparing DP vs. non-DP simulation outputs (incidence, prevalence, intervention effect sizes). This leaves the relative-error conclusion qualitative rather than quantitative.
  2. [Evaluation section] No specific privacy-budget values (ε) or sensitivity parameters for the node-DP mechanism are stated, nor are the exact summary statistics to which noise is added (e.g., degree sequences or subgraph counts used in ERGM terms). Without these, the privacy-utility tradeoff cannot be assessed or reproduced.
  3. [Pipeline description and results] The assumption that SBM/ERGM fits to node-DP noisy summaries preserve transmission-critical structure (degree heterogeneity, clustering) for SIS dynamics is not directly tested. No comparison of network properties known to affect epidemic thresholds (e.g., degree distribution moments or clustering coefficients) between DP and non-DP synthetic networks is provided.
minor comments (2)
  1. [Abstract] The abstract mentions 'multiple configurations' for the SIS model but does not specify the exact parameter ranges or number of simulation replicates used to generate the reported incidence/prevalence values.
  2. [Methods] Notation for the node-DP mechanism and the subsequent model-fitting step could be clarified with explicit equations showing how noisy statistics enter the ERGM/SBM likelihood.

Simulated Authors' Rebuttal

3 responses · 0 unresolved

We thank the referee for their constructive and detailed comments. These have highlighted opportunities to strengthen the quantitative rigor, reproducibility, and validation aspects of our work. We address each major comment below and have revised (or will revise) the manuscript accordingly.

read point-by-point responses
  1. Referee: [Abstract, evaluation section] The central claim that 'the noise added for privacy is small relative to other sources of error (sampling and model misspecification)' is presented without reported effect sizes, confidence intervals, or statistical tests comparing DP vs. non-DP simulation outputs (incidence, prevalence, intervention effect sizes). This leaves the relative-error conclusion qualitative rather than quantitative.

    Authors: We agree that quantitative support is needed to make the central claim rigorous. In the revised manuscript, we will report effect sizes (e.g., relative percentage differences) for incidence, prevalence, and intervention effect sizes. We will also provide confidence intervals based on multiple independent simulation replicates and include statistical tests (such as paired t-tests or non-parametric equivalents) comparing DP-constrained and non-DP outputs. This will allow a precise assessment of whether privacy noise is indeed small relative to sampling and model misspecification errors. revision: yes

  2. Referee: [Evaluation section] No specific privacy-budget values (ε) or sensitivity parameters for the node-DP mechanism are stated, nor are the exact summary statistics to which noise is added (e.g., degree sequences or subgraph counts used in ERGM terms). Without these, the privacy-utility tradeoff cannot be assessed or reproduced.

    Authors: We acknowledge the importance of these details for reproducibility and evaluation of the privacy-utility tradeoff. The revised manuscript will explicitly report the privacy budget values (ε) used in the experiments, the sensitivity parameters of the node-level DP mechanism, and the precise summary statistics to which noise is added (e.g., degree sequences for SBMs or specific subgraph counts for ERGMs). These additions will enable readers to assess and replicate the results. revision: yes

  3. Referee: [Pipeline description and results] The assumption that SBM/ERGM fits to node-DP noisy summaries preserve transmission-critical structure (degree heterogeneity, clustering) for SIS dynamics is not directly tested. No comparison of network properties known to affect epidemic thresholds (e.g., degree distribution moments or clustering coefficients) between DP and non-DP synthetic networks is provided.

    Authors: We agree that direct comparisons of network properties would provide stronger support for the assumption that critical structures are preserved. In the revised evaluation section, we will add comparisons of key network statistics between DP and non-DP synthetic networks, including degree distribution moments (mean and variance), clustering coefficients, and other metrics relevant to epidemic thresholds. These will complement the SIS simulation outcomes and offer direct evidence on structural fidelity. revision: yes
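The structural comparison promised in this response could start from simple computations on adjacency lists; the statistics chosen (degree mean and variance, global clustering) follow the referee's examples and are an editorial sketch rather than the authors' planned analysis:

```python
def degree_moments_and_clustering(adj):
    """Transmission-relevant structure of an adjacency list: degree mean,
    degree variance, and global clustering (closed triples / connected triples)."""
    degrees = [len(nbrs) for nbrs in adj.values()]
    n = len(degrees)
    mean = sum(degrees) / n
    var = sum((d - mean) ** 2 for d in degrees) / n
    closed, triples = 0, 0
    for node, nbrs in adj.items():
        triples += len(nbrs) * (len(nbrs) - 1) // 2
        for i, u in enumerate(nbrs):          # neighbour pairs that are linked
            for v in nbrs[i + 1:]:
                if v in adj[u]:
                    closed += 1
    clustering = closed / triples if triples else 0.0
    return mean, var, clustering
```

Reporting these three numbers for DP and non-DP synthetic networks side by side would address the comment directly.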

Circularity Check

0 steps flagged

No circularity: empirical pipeline evaluated on external ARTNet data with standard SIS dynamics

full rationale

The paper's derivation consists of an explicit three-step pipeline (node-DP summary computation, statistical model fitting to those summaries, synthetic network generation, then agent-based SIS simulation) whose outputs are compared numerically and qualitatively to the identical pipeline run without DP noise. All reported quantities (incidence, prevalence, intervention effect sizes) are obtained by running the same external ARTNet egocentric data through both pipelines and measuring differences against sampling and misspecification variation; no equation, fitted parameter, or self-citation is used to define or force the relative-error conclusion. The evaluation therefore remains independent of the paper's own fitted values.

Axiom & Free-Parameter Ledger

0 free parameters · 2 axioms · 0 invented entities

The approach rests on standard definitions of differential privacy and established properties of ERGMs and SBMs without introducing new free parameters, axioms beyond domain standards, or invented entities.

axioms (2)
  • standard math Node-level differential privacy mechanisms satisfy the standard DP definition and composition properties
    Invoked in step 1 to protect individual contributions when computing network summaries.
  • domain assumption ERGMs and SBMs can generate networks whose structural statistics match those of the original contact network sufficiently for downstream SIS simulations
    Central to step 2 and the validity of the synthetic networks used in step 3.
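The second axiom can be illustrated with a minimal SBM sampler driven by a mixing matrix, in the spirit of Figure 3. Block sizes and edge probabilities here are placeholders, not fitted values from the paper:

```python
import random

def sample_sbm(block_sizes, mixing, rng):
    """Draw one network from a stochastic block model.

    mixing[a][b] is the probability of an edge between a node in block a
    and a node in block b -- in the pipeline, these probabilities would be
    derived from the DP-noised mixing matrix.
    """
    blocks = [b for b, size in enumerate(block_sizes) for _ in range(size)]
    adj = {i: [] for i in range(len(blocks))}
    for i in range(len(blocks)):
        for j in range(i + 1, len(blocks)):
            if rng.random() < mixing[blocks[i]][blocks[j]]:
                adj[i].append(j)
                adj[j].append(i)
    return adj
```

Whether networks drawn this way preserve the structure that drives transmission is exactly what the load-bearing premise above asserts.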

pith-pipeline@v0.9.0 · 5624 in / 1537 out tokens · 53894 ms · 2026-05-10T17:18:09.155453+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

126 extracted references · 70 canonical work pages

  1. [1]

    J. M. Abowd. The U.S. Census Bureau adopts differential privacy. In Y. Guo and F. Farooq, editors, Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, KDD 2018, London, UK, August 19-23, 2018, page 2867. ACM, 2018. doi: 10.1145/3219819.3226070. URLhttps://doi.org/10.1145/3219819.3226070

  2. [2]

    S. F. Ackley, J. Lessler, and M. M. Glymour. Dynamical modeling as a tool for inferring causation. American Journal of Epidemiology, 191(1):1–6, 2021. doi: 10.1093/aje/kwab222

  3. [3]

    RadioDiff-3D: A 3D× 3D radio map dataset and generative diffusion based benchmark for 6g environment-aware communication,

    F. Ahmed, A. X. Liu, and R. Jin. Publishing social network graph Eigen spectrum with privacy guar- antees.IEEE Transactions on Network Science and Engineering, 7(2):892–906, 2020. doi: 10.1109/TNSE. 2019.2901716

  4. [4]

    Aktay, S

    A. Aktay, S. Bavadekar, G. Cossoul, J. Davis, D. Desfontaines, A. Fabrikant, E. Gabrilovich, K. Gade- palli, B. Gipson, M. Guevara, C. Kamath, M. Kansal, A. Lange, C. Mandayam, A. Oplinger, C. Pluntke, T. Roessler, A. Schlosberg, T. Shekel, S. Vispute, M. Vu, G. Wellenius, B. Williams, and R. J. Wilson. Google COVID-19 community mobility reports: Anonymiz...

  5. [5]

    B. M. Althouse, E. A. Wenger, J. C. Miller, S. V. Scarpino, A. Allard, and L. H´ebert-Dufresne. Super- spreading events in the transmission dynamics of SARS-CoV-2: Opportunities for interventions and control.PLoS Biology, 2020. doi: 10.1371/journal.pbio.3000897

  6. [6]

    R. M. Anderson and G. P. Garnett. Mathematical models of the transmission and control of sexually transmitted diseases.Sexually Transmitted Diseases, 27(10):636–643, 2000

  7. [7]

    S. O. Aral, A. A. Adimora, and K. A. Fenton. Understanding and responding to disparities in HIV and other sexually transmitted infections in African Americans.The Lancet, 372(9635):337–340, 2008. ISSN 0140-6736, 1474-547X. doi: 10.1016/S0140-6736(08)61118-6

  8. [8]

    Backstrom, C

    L. Backstrom, C. Dwork, and J. M. Kleinberg. Wherefore art thou r3579x?: anonymized social networks, hidden patterns, and structural steganography. In C. L. Williamson, M. E. Zurko, P. F. Patel-Schneider, and P. J. Shenoy, editors,Proceedings of the 16th International Conference on World Wide Web, WWW 2007, Banff, Alberta, Canada, May 8-12, 2007, pages 18...

  9. [9]

    Berghel, P

    S. Berghel, P. Bohannon, D. Desfontaines, C. Estes, S. Haney, L. Hartman, M. Hay, A. Machanavajjhala, T. Magerlein, G. Miklau, A. Pai, W. Sexton, and R. Shrestha. Tumult analytics: a robust, easy-to- use, scalable, and expressive framework for differential privacy.CoRR, abs/2212.04133, 2022. doi: 10.48550/ARXIV.2212.04133. URLhttps://doi.org/10.48550/arXi...

  10. [10]

    Blocki, A

    J. Blocki, A. Blum, A. Datta, and O. Sheffet. The Johnson-Lindenstrauss transform itself preserves differential privacy. InProceedings, IEEE Symposium on Foundations of Computer Science (FOCS), pages 410–419, 2012. doi: 10.1109/FOCS.2012.67

  11. [11]

    In: Kleinberg, R.D

    J. Blocki, A. Blum, A. Datta, and O. Sheffet. Differentially private data analysis of social networks via restricted sensitivity. InProceedings, Innovations in Theoretical Computer Science (ITCS), pages 87–96. ACM, 2013. doi: 10.1145/2422436.2422449

  12. [12]

    Blocki, E

    J. Blocki, E. Grigorescu, and T. Mukherjee. Privately estimating graph parameters in sublinear time. InProceedings, International Colloquium on Automata, Languages and Programming (ICALP), volume 229, pages 26:1–26:19, 2022. doi: 10.4230/LIPIcs.ICALP.2022.26

  13. [13]

    Borgs, J

    C. Borgs, J. T. Chayes, and A. D. Smith. Private graphon estimation for sparse graphs. InAdvances in Neural Information Processing Systems (NeurIPS), pages 1369–1377, 2015

  14. [14]

    Borgs, J

    C. Borgs, J. T. Chayes, A. D. Smith, and I. Zadik. Revealing network structure, confidentially: Im- proved rates for node-private graphon estimation. InProceedings, IEEE Symposium on Foundations of Computer Science (FOCS), pages 533–543, 2018. doi: 10.1109/FOCS.2018.00057

  15. [15]

    B. Chen, B. She, C. Hawkins, A. Benvenuti, B. Fallin, P. E. Par ´e, and M. Hale. Differentially private computation of basic reproduction numbers in networked epidemic models. In2024 American Control Conference (ACC), pages 4422–4427, 2024. doi: 10.23919/ACC60939.2024.10644264

  16. [16]

    B. Chen, B. She, C. Hawkins, P. E. Par´e, and M. T. Hale. Scalable distributed reproduction numbers of network epidemics with differential privacy, 2025. URLhttps://arxiv.org/abs/2501.18862

  17. [17]

    R. Chen, B. C. Fung, P. S. Yu, and B. C. Desai. Correlated network data publication via differential privacy.The VLDB Journal, 23(4):653–676, 2014

  18. [18]

    Chen and S

    S. Chen and S. Zhou. Recursive mechanism: towards node differential privacy and unrestricted joins. InProceedings, ACM International Conference on Management of Data (SIGMOD), pages 653–664, 2013. doi: 10.1145/2463676.2465304

  19. [19]

    Daigavane, G

    A. Daigavane, G. Madan, A. Sinha, A. G. Thakurta, G. Aggarwal, and P. Jain. Node-level differentially private graph neural networks, 2021

  20. [20]

    Pets for public health challenge.https://data.org/initiatives/pets-challenge/, 2024

    data.org. Pets for public health challenge.https://data.org/initiatives/pets-challenge/, 2024

  21. [21]

    W. Day, N. Li, and M. Lyu. Publishing graph degree distribution with node differential privacy. In Proceedings, ACM International Conference on Management of Data (SIGMOD), pages 123–138, 2016. doi: 10.1145/2882903.2926745

  22. [22]

    Kim, Vinod Vaikuntanathan, and Or Zamir

    L. Dhulipala, Q. C. Liu, S. Raskhodnikova, J. Shi, J. Shun, and S. Yu. Differential privacy from locally adjustable graph algorithms: k-core decomposition, low out-degree ordering, and densest subgraphs. InProceedings, IEEE Symposium on Foundations of Computer Science (FOCS), pages 754–765, 2022. doi: 10.1109/FOCS54457.2022.00077

  23. [23]

    X. Ding, X. Zhang, Z. Bao, and H. Jin. Privacy-preserving triangle counting in large graphs. In Proceedings of the ACM International Conference on Information and Knowledge Management (CIKM), pages 1283–1292. ACM, 2018. URLhttps://doi.org/10.1145/3269206.3271736

  24. [24]

    Dwork, K

    C. Dwork, K. Kenthapadi, F. McSherry, I. Mironov, and M. Naor. Our data, ourselves: Privacy via distributed noise generation. InInternational Conference on the Theory and Applications of Cryptographic Techniques (EUROCRYPT), volume 4004, pages 486–503, 2006. doi: 10.1007/1176167929

  25. [25]

    Dwork, F

    C. Dwork, F. McSherry, K. Nissim, and A. D. Smith. Calibrating noise to sensitivity in private data analysis.Journal of Privacy and Confidentiality, 7(3):17–51, 2016. doi: 10.29012/jpc.v7i3.405. 23

  26. [26]

    Eli ´aˇs, M

    M. Eli ´aˇs, M. Kapralov, J. Kulkarni, and Y. T. Lee. Differentially private release of synthetic graphs. InProceedings of the Fourteenth Annual ACM-SIAM Symposium on Discrete Algorithms, pages 560–578. SIAM, 2020

  27. [27]

    Ellers, M

    M. Ellers, M. Cochez, T. Schumacher, M. Strohmaier, and F. Lemmerich. Privacy attacks on network embeddings.CoRR, abs/1912.10979, 2019. URLhttp://arxiv.org/abs/1912.10979

  28. [28]

    J. M. Epstein. Agent-based computational models and generative social science.Complexity, 4(5): 41–60, 1999. doi: 10.1002/(SICI)1099-0526(199905/06)4:5Âą41::AID-CPLX9£3.0.CO;2-F

  29. [29]

    J. M. Epstein. Why model?Journal of Artificial Societies and Social Simulation, 11(4):12, 2008

  30. [30]

    Gao and F

    T. Gao and F. Li. Phdp: Preserving persistent homology in differentially private graph publications. InIEEE INFOCOM 2019-IEEE Conference on Computer Communications, pages 2242–2250. IEEE, 2019

  31. [31]

    Gao and F

    T. Gao and F. Li. Sharing social networks using a novel differentially private graph model. In2019 16th IEEE Annual Consumer Communications & Networking Conference (CCNC), pages 1–4. IEEE, 2019

  32. [32]

    Gao and F

    T. Gao and F. Li. Protecting social network with differential privacy under novel graph model.IEEE Access, 8:185276–185289, 2020. doi: 10.1109/ACCESS.2020.3026008. URLhttps://doi.org/10.1109/ ACCESS.2020.3026008

  33. [33]

    S. M. Goodreau, N. B. Carnegie, E. Vittinghoff, J. R. Lama, J. Sanchez, B. Grinsztejn, B. A. Koblin, K. H. Mayer, and S. P. Buchbinder. What drives the US and Peruvian HIV epidemics in men who have sex with men (MSM)?PloS ONE, 7(11):e50522, 2012

  34. [34]

    S. M. Goodreau, S. Cassels, D. Kasprzyk, D. E. Monta˜no, A. Greek, and M. Morris. Concurrent part- nerships, acute infection and HIV epidemic dynamics among young adults in Zimbabwe.AIDS and Behavior, 16:312–322, 2012

  35. [35]

    S. M. Goodreau, E. S. Rosenberg, S. M. Jenness, N. Luisi, S. E. Stansfield, G. A. Millett, and P. S. Sullivan. Sources of racial disparities in HIV prevalence in men who have sex with men in Atlanta, GA, USA: a modelling study.The Lancet HIV, 4(7):e311–e320, 2017

  36. [36]

    Gupta, A

    A. Gupta, A. Roth, and J. Ullman. Iterative constructions and private data release. InTheory of cryp- tography conference, pages 339–356. Springer, 2012

  37. [37]

    Gupta, A

    A. Gupta, A. Roth, and J. Ullman. Iterative constructions and private data release. InProceed- ings, Theory of Cryptography Conference (TCC), volume 7194, pages 339–356, 2012. doi: 10.1007/ 978-3-642-28914-9 19

  38. [38]

    M. Hay, C. Li, G. Miklau, and D. D. Jensen. Accurate estimation of the degree distribution of private networks. InProceedings, SIAM International Conference on Data Mining (ICDM), pages 169–178, 2009. doi: 10.1109/ICDM.2009.11

  39. [39]

    M. Hay, M. Gaboardi, and S. Vadhan. A programming framework for opendp.6th Workshop on the Theory and Practice of Differential Privacy (TPDP 2020), 2020. URLhttps://projects.iq.harvard. edu/files/opendp/files/opendp_programming_framework_11may2020_1_01.pdf

  40. [40]

    In2025 IEEE Symposium on Security and Privacy (SP)

    S. Hod and R. Canetti. Differentially private release of Israel’s National Registry of Live Births. In M. Blanton, W. Enck, and C. Nita-Rotaru, editors,IEEE Symposium on Security and Privacy, SP 2025, San Francisco, CA, USA, May 12-15, 2025, pages 3912–3930. IEEE, 2025. doi: 10.1109/SP61157.2025.00101. URLhttps://doi.org/10.1109/SP61157.2025.00101

  41. [41]

    P. W. Holland, K. B. Laskey, and S. Leinhardt. Stochastic blockmodels: First steps.Social networks, 5 (2):109–137, 1983. 24

  42. [42]

    Holohan, S

    N. Holohan, S. Braghin, P. M. Aonghusa, and K. Levacher. Diffprivlib: The IBM differential privacy library.CoRR, abs/1907.02444, 2019. URLhttp://arxiv.org/abs/1907.02444

  43. [43]

    D. R. Hunter and M. S. Handcock. Inference in curved exponential family models for networks.Journal of Computational and Graphical Statistics, 15(3):565–583, 2006

  44. [44]

    D. R. Hunter, M. S. Handcock, C. T. Butts, S. M. Goodreau, and M. Morris. Ergm: A package to fit, simulate and diagnose exponential-family models for networks. 24(3):nihpa54860, 2008. doi: 10.18637/jss.v024.i03

  45. [45]

    Iftikhar, Q

    M. Iftikhar, Q. Wang, and Y. Lin. dk-microaggregation: Anonymizing graphs with differential privacy guarantees. InPacific-Asia Conference on Knowledge Discovery and Data Mining, pages 191–203. Springer, 2020

  46. [46]

    S. M. Jenness, S. M. Goodreau, M. Morris, and S. Cassels. Effectiveness of combination packages for HIV-1 prevention in sub-Saharan Africa depends on partnership network structure: a mathematical modelling study.Sexually Transmitted Infections, 92(8):619–624, 2016

  47. [47]

    S. M. Jenness, S. M. Goodreau, E. Rosenberg, E. N. Beylerian, K. W. Hoover, D. K. Smith, and P. Sul- livan. Impact of the Centers for Disease Control’s HIV preexposure prophylaxis guidelines for men who have sex with men in the United States.The Journal of Infectious Diseases, 214(12):1800–1807, 2016

  48. [48]

    S. M. Jenness, K. M. Weiss, S. M. Goodreau, T. Gift, H. Chesson, K. W. Hoover, D. K. Smith, A. Y. Liu, P. S. Sullivan, and E. S. Rosenberg. Incidence of gonorrhea and chlamydia following Human Immunodeficiency Virus preexposure prophylaxis among men who have sex with men: a modeling study.Clinical Infectious Diseases, 65(5):712–718, 2017

  49. [49]

    S. M. Jenness, S. M. Goodreau, and M. Morris. Epimodel: An R Package for Mathematical Modeling of Infectious Disease Over Networks.Journal of Statistical Software, 84(8):1–47, 2018. doi: 10.18637/ jss.v084.i08

  50. [50]

    S. M. Jenness, K. S. Willebrand, A. A. Malik, B. A. Lopman, and S. B. Omer. Dynamic network strate- gies for SARS-CoV-2 control on a cruise ship.Epidemics, 37:100488, 2021

  51. [51]

    S. M. Jenness, A. Le Guillou, C. Lyles, K. T. Bernstein, K. Krupinsky, E. A. Enns, P. S. Sullivan, and K. P. Delaney. The role of HIV partner services in the modern biomedical HIV prevention era: A network modeling study. 49(12):801–807, 2022. ISSN 0148-5717. doi: 10.1097/olq.0000000000001711

  52. [52]

    S. Ji, P. Mittal, and R. A. Beyah. Graph data anonymization, de-anonymization attacks, and de-anonymizability quantification: A survey. IEEE Commun. Surv. Tutorials, 19(2):1305–1326, 2017. doi: 10.1109/COMST.2016.2633620. URL https://doi.org/10.1109/COMST.2016.2633620

  53. [53]

    I. Kalemaj, S. Raskhodnikova, A. D. Smith, and C. E. Tsourakakis. Node-differentially private estimation of the number of connected components. In Proceedings, ACM Symposium on Principles of Database Systems (PODS), 2023

  54. [54]

    V. Karwa and A. B. Slavkovic. Differentially private graphical degree sequences and synthetic graphs. In Privacy in Statistical Databases, volume 7556, pages 273–285. Springer, 2012. doi: 10.1007/978-3-642-33627-0_21

  55. [55]

    V. Karwa, S. Raskhodnikova, A. D. Smith, and G. Yaroslavtsev. Private analysis of graph structure. ACM Transactions on Database Systems, 39(3):22:1–22:33, 2014. doi: 10.1145/2611523

  56. [56]

    V. Karwa, A. B. Slavkovic, and P. N. Krivitsky. Differentially private exponential random graphs. In Privacy in Statistical Databases, 2014

  57. [57]

    V. Karwa, P. N. Krivitsky, and A. B. Slavkovic. Sharing social network data: differentially private estimation of exponential family random-graph models. Journal of the Royal Statistical Society: Series C (Applied Statistics), 66, 2015

  58. [58]

    S. P. Kasiviswanathan, K. Nissim, S. Raskhodnikova, and A. D. Smith. Analyzing graphs with node differential privacy. In Proceedings, Theory of Cryptography Conference (TCC), pages 457–476, 2013. doi: 10.1007/978-3-642-36594-2_26. URL https://cs-people.bu.edu/sofya/pubs/nodeprivacy-TCC.pdf

  59. [59]

    M. Keeling. The implications of network structure for epidemic dynamics. Theoretical Population Biology, 67(1):1–8, 2005. doi: 10.1016/j.tpb.2004.08.002

  60. [60]

    M. J. Keeling and K. T. D. Eames. Networks and epidemic models. Journal of The Royal Society Interface, 2:295–307, 2005

  61. [61]

    P. N. Krivitsky and M. S. Handcock. A separable model for dynamic networks. Journal of the Royal Statistical Society. Series B, Statistical Methodology, 76(1):29, 2014

  62. [62]

    P. N. Krivitsky, D. R. Hunter, M. Morris, and C. Klumb. ergm 4: New features for analyzing exponential-family random graph models. Journal of Statistical Software, 105:1–44, 2023. doi: 10.18637/jss.v105.i06

  63. [63]

    D. Lazer, A. Pentland, L. Adamic, S. Aral, A.-L. Barabási, D. Brewer, N. Christakis, N. Contractor, J. Fowler, M. Gutmann, T. Jebara, G. King, M. Macy, D. Roy, and M. Van Alstyne. Computational Social Science. Science, 323(5915):721–723, 2009. doi: 10.1126/science.1167742

  64. [64]

    J. Lessler and D. A. T. Cummings. Mechanistic models of infectious disease and their impact on public health. American Journal of Epidemiology, 183(5):415–422, 2016. doi: 10.1093/aje/kww021

  65. [65]

    C. Li, G. Miklau, M. Hay, A. McGregor, and V. Rastogi. The matrix mechanism: optimizing linear counting queries under differential privacy. The VLDB Journal, 24(6):757–781, 2015

  66. [66]

    G. Z. Li, D. Nguyen, and A. Vullikanti. Computing epidemic metrics with edge differential privacy. In S. Dasgupta, S. Mandt, and Y. Li, editors, International Conference on Artificial Intelligence and Statistics, 2-4 May 2024, Palau de Congressos, Valencia, Spain, volume 238 of Proceedings of Machine Learning Research, pages 4303–4311. PMLR, 2024. URL http...

  67. [67]

    F. Liu, E. C. Eugenio, I.-H. Jin, and C. M. Bowen. Differentially private generation of social networks via exponential random graph models. In 2020 IEEE 44th Annual Computers, Software, and Applications Conference (COMPSAC), pages 1695–1700, 2020

  68. [68]

    G. Liu, X. Ma, and W. Li. Publishing node strength distribution with node differential privacy. IEEE Access, 8:217642–217650, 2020. URL https://doi.org/10.1109/ACCESS.2020.3040077

  69. [69]

    E. T. Lofgren, M. E. Halloran, C. M. Rivers, J. M. Drake, T. C. Porco, B. Lewis, W. Yang, A. Vespignani, J. Shaman, J. N. S. Eisenberg, M. C. Eisenberg, M. Marathe, S. V. Scarpino, K. A. Alexander, R. Meza, M. J. Ferrari, J. M. Hyman, L. A. Meyers, and S. Eubank. Opinion: Mathematical models: A key tool for outbreak response. Proceedings of the National Ac...

  70. [70]

    W. Lu and G. Miklau. Exponential random graph estimation under differential privacy. In ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 921–930. ACM, 2014. doi: 10.1145/2623330.2623683

  71. [71]

    M. Lurie, A. Lysyanskaya, J. Netter, S. Ramachandran, K. Toussaint, T. Trikalinos, B. S. Loucks, P. Luiz, and J. Gantenberg. Report of proceedings and recommendations from the Workshop on Privacy and Ethics in Pandemic Data Collection and Processing. Technical report, Brown University, 2023. URL https://mapps-brown.github.io/workshop2023/

  72. [72]

    C. M. Macal. Everything you need to know about agent-based modelling and simulation. Journal of Simulation, 10(2):144–156, 2016. doi: 10.1057/jos.2016.7

  73. [73]

    K. M. Maloney, A. Le Guillou, R. A. Driggers, S. Sarkar, E. J. Anderson, A. A. Malik, and S. M. Jenness. Projected impact of concurrently available long-acting injectable and daily-oral human immunodeficiency virus preexposure prophylaxis: A mathematical model. The Journal of Infectious Diseases, 223(1):72–82, 2021. ISSN 1537-6613. doi: 10.1093/infdis/jiaa552

  74. [74]

    MAPPS Project. Mobility analysis for pandemic prevention strategies. https://web.archive.org/web/20240726030722/https://www.mappsproject.com/, 2023. Brown University

  75. [75]

    D. J. Mir and R. N. Wright. A differentially private estimator for the stochastic kronecker graph model. In D. Srivastava and I. Ari, editors,Proceedings of the 2012 Joint EDBT/ICDT Workshops, Berlin, Germany, March 30, 2012, pages 167–176. ACM, 2012. doi: 10.1145/2320765.2320818. URLhttps://doi.org/ 10.1145/2320765.2320818

  76. [76]

    M. Morris, editor. Network Epidemiology: A Handbook for Survey Design and Data Collection. International Studies in Demography. Oxford University Press, 2004. ISBN 9780199269013

  77. [77]

    T. T. Mueller, D. Usynin, J. C. Paetzold, D. Ruecker, and G. Kaissis. SoK: Differential privacy on graph-structured data, 2022

  78. [78]

    Y. Mülle, C. Clifton, and K. Böhm. Privacy-integrated graph clustering through differential privacy. In Proceedings of the Workshops of the EDBT/ICDT 2015 Joint Conference, volume 1330, pages 247–254, 2015

  79. [79]

    A. Narayanan and V. Shmatikov. De-anonymizing social networks. In 30th IEEE Symposium on Security and Privacy (SP 2009), 17-20 May 2009, Oakland, California, USA, pages 173–187. IEEE Computer Society, 2009. doi: 10.1109/SP.2009.22. URL https://doi.org/10.1109/SP.2009.22

  80. [80]

    K. N. Nelson, N. R. Gandhi, B. Mathema, B. A. Lopman, J. C. Brust, S. C. Auld, N. Ismail, S. V. Omar, T. S. Brown, S. Allana, et al. Modeling missing cases and transmission links in networks of extensively drug-resistant tuberculosis in KwaZulu-Natal, South Africa. American Journal of Epidemiology, 189(7):735–745, 2020

Showing first 80 references.