Our work on benchmarking foundation models for electrocardiography has been accepted at ICLR 2026! We benchmarked 7 ECG FMs (and proposed a highly efficient CPC-based FM of our own) on 26 tasks across 12 datasets, and analyzed label efficiency and representational similarity.

Paper: openreview.net/forum?id=xXR...
Code and ECG-CPC weights: github.com/AI4HealthUOL...
Representation Diversity: Models with similar accuracy learn distinct internal patterns, revealing multiple paths to effective ECG understanding (see the similarity sketch below).
Label Efficiency: ECG FMs improve label efficiency by 3.3–9x vs. supervised baselines (see the probe sketch at the end of this post).
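To make the representation-diversity point concrete, here is a minimal sketch of linear CKA (Kornblith et al., 2019), one standard way to compare what two models learned. Whether the paper uses CKA specifically isn't stated in this post, and all shapes and names below are illustrative, not the paper's setup.

```python
# Minimal linear CKA between two feature matrices: low CKA between two
# equally accurate models means they learned distinct representations.
import numpy as np

def linear_cka(feats_a: np.ndarray, feats_b: np.ndarray) -> float:
    """Linear CKA between (n_samples, dim_a) and (n_samples, dim_b) features."""
    a = feats_a - feats_a.mean(axis=0, keepdims=True)  # center per dimension
    b = feats_b - feats_b.mean(axis=0, keepdims=True)
    # CKA(A, B) = ||A^T B||_F^2 / (||A^T A||_F * ||B^T B||_F)
    cross = np.linalg.norm(a.T @ b, "fro") ** 2
    return float(cross / (np.linalg.norm(a.T @ a, "fro") * np.linalg.norm(b.T @ b, "fro")))

# Hypothetical example: two FMs embed the same 512 ECGs into different spaces.
rng = np.random.default_rng(0)
emb_fm_1 = rng.normal(size=(512, 256))
emb_fm_2 = rng.normal(size=(512, 768))
print(f"CKA = {linear_cka(emb_fm_1, emb_fm_2):.3f}")  # near 0 here: unrelated features
```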
Proposed FM: ECG-CPC
Backbone: Structured State Space Sequence (S4) Model.
Pretraining Dataset: HEEDB (10M samples).
Pretraining Method: Contrastive Predictive Coding.
Model Complexity: 3.8M parameters, 1.741 GFLOPs.
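As a rough illustration of the pretraining objective, here is a toy Contrastive Predictive Coding setup in PyTorch. The real ECG-CPC uses an S4 backbone trained on HEEDB (see the repo for weights); a faithful S4 layer won't fit a short sketch, so a strided convolutional encoder and a GRU context network stand in, and all shapes and hyperparameters are illustrative only.

```python
# Toy CPC sketch: encode raw ECG into latents z_t, summarize them into a
# context c_t, and train linear heads to pick out the true future latents
# among negatives (InfoNCE). Not the paper's architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyCPC(nn.Module):
    def __init__(self, n_leads=12, dim=128, pred_steps=4):
        super().__init__()
        self.pred_steps = pred_steps
        # Encoder: downsample the raw signal into a sequence of latents z_t.
        self.encoder = nn.Sequential(
            nn.Conv1d(n_leads, dim, kernel_size=10, stride=5), nn.GELU(),
            nn.Conv1d(dim, dim, kernel_size=8, stride=4), nn.GELU(),
        )
        # Autoregressive context network: c_t summarizes z_{<=t}.
        self.context = nn.GRU(dim, dim, batch_first=True)
        # One linear predictor per future step k: c_t -> z_{t+k}.
        self.heads = nn.ModuleList(nn.Linear(dim, dim) for _ in range(pred_steps))

    def forward(self, x):  # x: (batch, leads, time)
        z = self.encoder(x).transpose(1, 2)  # (batch, steps, dim)
        c, _ = self.context(z)               # (batch, steps, dim)
        loss = 0.0
        for k, head in enumerate(self.heads, start=1):
            pred = head(c[:, :-k])           # predictions for z_{t+k}
            target = z[:, k:]                # true future latents
            # InfoNCE, simplified: each prediction must identify its own
            # target among all positions in the batch (which act as negatives).
            logits = pred.reshape(-1, pred.size(-1)) @ target.reshape(-1, target.size(-1)).T
            labels = torch.arange(logits.size(0))
            loss = loss + F.cross_entropy(logits, labels)
        return loss / self.pred_steps

model = ToyCPC()
batch = torch.randn(8, 12, 1000)  # 8 ECGs, 12 leads, 1000 samples each
print(model(batch))               # scalar InfoNCE loss
```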
Most important outcomes:
Architecture > Scale: The lightweight S4-based ECG-CPC outperforms larger Transformer models on most tasks, showing that design matters more than size.
Strong baselines are hard to beat: Most FMs struggle to outperform a strong supervised S4 baseline.
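For the label-efficiency claim above, the idea is to ask how many labels a probe on frozen FM embeddings needs to match a fully supervised model trained on all labels. Here is a hedged sketch of such a sweep; the function, synthetic data, and baseline score are all placeholders, and the paper's exact protocol may differ.

```python
# Hedged sketch of a label-efficiency sweep: train a linear probe on frozen
# FM embeddings at growing label budgets and flag where it matches a fully
# supervised baseline. Every name and number here is a stand-in.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def label_efficiency_sweep(train_emb, train_y, test_emb, test_y,
                           supervised_auc, fractions=(0.01, 0.03, 0.1, 0.3)):
    for frac in fractions:
        # Stratified subsample so every class appears even at tiny budgets.
        sub_emb, _, sub_y, _ = train_test_split(
            train_emb, train_y, train_size=frac, stratify=train_y, random_state=0)
        probe = LogisticRegression(max_iter=1000).fit(sub_emb, sub_y)
        auc = roc_auc_score(test_y, probe.predict_proba(test_emb)[:, 1])
        flag = "  <- matches the full-label supervised baseline" if auc >= supervised_auc else ""
        print(f"{frac:>5.0%} of labels: AUC = {auc:.3f}{flag}")

# Synthetic stand-in for frozen FM embeddings of 2,000 ECGs (binary label).
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=2000)
X = rng.normal(size=(2000, 64)) + 0.5 * y[:, None]
label_efficiency_sweep(X[:1500], y[:1500], X[1500:], y[1500:], supervised_auc=0.80)
```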