Resource-Efficient ECG Foundation Networks via Layer-wise Adaptive Compression
Keywords: Foundation models, residual models, layer-wise compression, quantization, pruning, layer importance, ECG classification
Abstract: Foundation models for biosignals face challenges in resource-constrained settings, such as wearable ECG monitors, due to their high memory and computational demands. In this work, we propose an adaptive layer-wise compression framework that leverages quantization and pruning to reduce model size while preserving predictive performance. Layer importance, estimated via parameter contribution and weight variance, guides fine-grained assignment of bit-widths and pruning thresholds, balancing efficiency and accuracy across high- and low-sensitivity layers. Extensive experiments on the Chapman and CPSC ECG datasets demonstrate that our method consistently outperforms fixed global compression schemes, achieving up to 10.44$\times$ compression without any loss in accuracy. Our architecture-agnostic framework scales to both lightweight residual networks and large foundation models, enabling real-time, low-resource ECG monitoring and advancing scalable biosignal AI for edge and mobile health applications.
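The importance-guided bit-width assignment described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the importance score (parameter share times weight variance), the candidate bit-widths, and the linear mapping from normalized importance to bits are all assumptions made for the example.

```python
import numpy as np

def layer_importance(weights: np.ndarray, total_params: int) -> float:
    # Hypothetical importance proxy: the layer's share of all parameters
    # scaled by its weight variance (abstract mentions both signals).
    w = np.asarray(weights, dtype=float)
    return (w.size / total_params) * np.var(w)

def assign_bit_widths(layers, bit_choices=(2, 4, 8)):
    """Map normalized layer importance to a quantization bit-width:
    higher-importance (more sensitive) layers get more bits."""
    total = sum(w.size for w in layers)
    scores = np.array([layer_importance(w, total) for w in layers])
    # Normalize scores to [0, 1]; epsilon avoids division by zero.
    span = scores.max() - scores.min()
    norm = (scores - scores.min()) / (span + 1e-12)
    # Bucket normalized importance into the available bit-widths.
    idx = np.minimum((norm * len(bit_choices)).astype(int),
                     len(bit_choices) - 1)
    return [bit_choices[i] for i in idx]

# Toy layers with increasing size and variance.
rng = np.random.default_rng(0)
layers = [rng.normal(0, s, size=n)
          for s, n in [(0.1, 100), (0.5, 1000), (1.0, 5000)]]
print(assign_bit_widths(layers))
```

In this sketch the small low-variance layer is quantized aggressively while the large high-variance layer keeps the widest bit-width; a pruning threshold could be assigned from the same normalized score.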
Submission Number: 75