2 Pith papers cite this work.
Citing papers
- Pretraining Strategies and Scaling for ECG Foundation Models: A Systematic Study
  Contrastive predictive coding pretraining combined with structured state space models yields the strongest ECG foundation models, with continued gains from scaling the pretraining data to 11 million samples.
- Towards Real-Time ECG and EMG Modeling on $\mu$NPUs
  PhysioLite delivers Transformer-comparable ECG/EMG performance on μNPUs using learnable wavelet filters and hardware-aware design, at a quantized size of ~370 KB.