Contrastive predictive coding pretraining combined with structured state space models yields the strongest ECG foundation models, with continued gains from scaling data to 11 million samples.
citation dossier
PTB-XL, a large publicly available electrocardiography dataset. Scientific Data, 7(1):1–15
why this work matters in Pith
Pith has found this work in 2 reviewed papers. Its strongest current cluster is cs.LG (1 paper). The largest review-status bucket among citing papers is UNVERDICTED (2 papers). For highly cited works, this page shows a dossier first and a bounded explorer second; it never tries to render every citing paper at once.
years
2026 (2)
verdicts
UNVERDICTED (2)
representative citing papers
ProtoSSL discovers generalizable prototypes from unlabeled time-series via self-supervision and assigns them to new tasks for interpretable predictions, outperforming supervised baselines in low-data regimes on ECG datasets.
citing papers explorer
- Pretraining Strategies and Scaling for ECG Foundation Models: A Systematic Study
  Contrastive predictive coding pretraining combined with structured state space models yields the strongest ECG foundation models, with continued gains from scaling data to 11 million samples.
- ProtoSSL: Interpretable Prototype Learning from Unlabeled Time-Series Data
  ProtoSSL discovers generalizable prototypes from unlabeled time-series via self-supervision and assigns them to new tasks for interpretable predictions, outperforming supervised baselines in low-data regimes on ECG datasets.