3 Pith papers cite this work.
Citing papers
- End-to-End Population Inference from Gravitational-Wave Strain using Transformers
  Dingo-Pop uses a transformer to perform amortized, end-to-end population inference from gravitational-wave strain data in seconds, bypassing per-event Monte Carlo sampling.
- labrador: A domain-optimized machine-learning tool for gravitational wave inference
  Labrador is a domain-optimized neural posterior estimation tool that achieves 1% median importance-sampling efficiency and the first extensive coverage of long-duration, low-mass gravitational-wave signals, via equivariance and a stable procedure for handling differing priors.
- Robust parameter inference for Taiji via time-frequency contrastive learning and normalizing flows
  A glitch-robust amortized inference framework that combines normalizing flows, time-frequency multimodal fusion, and contrastive learning, outperforming MCMC for parameter estimation of Taiji massive-black-hole binaries under noise contamination.
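The Dingo-Pop and Taiji entries both hinge on amortization: a single estimator is trained up front on simulated (parameter, strain) pairs, after which inference on any new event is one forward pass rather than a per-event sampling run. A minimal numpy sketch of that cost structure, using a toy one-parameter signal model and least squares as a stand-in for the papers' transformer or normalizing flow (every name and number below is illustrative, not taken from either paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy amortized inference: train ONE estimator on simulations, reuse it
# for every event, instead of running MCMC per event.
n_train, n_samples = 5000, 64
template = np.sin(np.linspace(0.0, 4.0 * np.pi, n_samples))

# 1. Simulate training data from the prior: strain = theta * template + noise.
theta = rng.uniform(0.5, 2.0, n_train)
strain = theta[:, None] * template + rng.normal(0.0, 0.3, (n_train, n_samples))

# 2. "Train" the amortized estimator: least squares from strain to theta,
#    a toy stand-in for a transformer or normalizing flow.
w, *_ = np.linalg.lstsq(strain, theta, rcond=None)

# 3. Inference on a new event is a single forward pass -- no sampling.
true_theta = 1.3
event = true_theta * template + rng.normal(0.0, 0.3, n_samples)
estimate = float(event @ w)
print(f"estimated theta: {estimate:.2f} (true {true_theta})")
```

The simulation and training in steps 1–2 are paid once; step 3 is effectively instantaneous per event, which is what lets amortized methods quote inference times "in seconds".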
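Labrador's headline figure, median importance-sampling efficiency, is the effective-sample-size fraction ESS/N of reweighted proposal draws: 1.0 when the neural proposal matches the target exactly, lower as the weights spread out. A short sketch of how that quantity is computed from log importance weights (the helper name and the toy inputs are assumptions for illustration, not Labrador's API):

```python
import numpy as np

def is_efficiency(log_weights):
    """Importance-sampling efficiency = ESS / N for a set of log weights."""
    w = np.exp(log_weights - np.max(log_weights))  # subtract max for stability
    return float(w.sum() ** 2 / (len(w) * np.sum(w ** 2)))

# Equal weights (perfectly matched proposal) give efficiency 1.0;
# a mismatched proposal spreads the weights and drives efficiency down.
print(is_efficiency(np.zeros(1000)))              # -> 1.0
rng = np.random.default_rng(1)
print(is_efficiency(rng.normal(0.0, 2.0, 1000)))  # mismatched: < 1.0
```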