FedMH: Federated Learning with Multi-Task Head for Heterogeneous Models in Offline Signature Verification

17 Sept 2025 (modified: 12 Feb 2026) · ICLR 2026 Conference Desk Rejected Submission · CC BY 4.0
Keywords: Offline handwritten signature authentication, Federated Learning
Abstract: Offline handwritten signature verification is a critical biometric authentication technology widely used in high-risk domains such as finance and law. However, data silos constrain models to train solely on local, limited, and imbalanced data, typically causing overfitting to majority classes. Federated learning addresses this issue by enabling collaborative training while preserving data privacy. Yet existing federated learning methods face three key challenges in offline handwritten signature verification: the scarcity of local data and class imbalance, the need for a dual-task head, and heterogeneity constraints in the model aggregation process. To tackle these issues, we propose FedMH, a novel federated learning method based on a multi-task head strategy that supports heterogeneous model environments. Specifically, FedMH comprises three core components: (1) an adaptive data augmentation strategy that enhances data diversity and improves learning on minority classes by targeting few-shot classes in local single-user genuine and forged signature subsets; (2) a dual-task head collaboration mechanism that dynamically guides parameter updates to foster synergy between the two tasks; and (3) a gradient optimization method that employs linear probing in parameter space and Pareto improvement criteria to enable efficient knowledge aggregation across heterogeneous models. We also provide a theoretical convergence proof for FedMH to establish its reliability and stability. Comprehensive experiments on three benchmark handwritten signature datasets validate the effectiveness of FedMH: it achieves state-of-the-art performance compared with heterogeneous federated baselines and addresses the long-standing problem of multi-task head collaboration. Moreover, on unseen datasets, FedMH demonstrates stronger generalization than the baseline methods.
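The abstract's third component accepts an aggregated update only if it satisfies a Pareto improvement criterion across the two task heads. The paper's exact procedure is not given here, so the sketch below is only an illustrative approximation: it treats an update as Pareto-improving when it does not conflict with any task's gradient (non-negative inner products), and falls back to a PCGrad-style conflict projection otherwise. All function names and the projection rule are assumptions, not the authors' method.

```python
import numpy as np

def is_pareto_improving(task_grads, direction, eps=0.0):
    # A direction is treated as Pareto-improving (an assumed proxy) when its
    # inner product with every task's gradient is non-negative, i.e. it is a
    # descent (or neutral) direction for all tasks simultaneously.
    return all(float(np.dot(g, direction)) >= eps for g in task_grads)

def aggregate_pareto(task_grads):
    # Candidate update: the plain average of the per-task gradients.
    grads = [np.asarray(g, dtype=float) for g in task_grads]
    cand = np.mean(grads, axis=0)
    if is_pareto_improving(grads, cand):
        return cand
    # Fallback (assumed, PCGrad-style): remove from each gradient the
    # component that conflicts with another task's gradient, then average.
    projected = []
    for i, gi in enumerate(grads):
        g = gi.copy()
        for j, gj in enumerate(grads):
            if i == j:
                continue
            dot = float(np.dot(g, gj))
            if dot < 0.0:  # conflict: project g off gj
                g = g - (dot / float(np.dot(gj, gj))) * gj
        projected.append(g)
    return np.mean(projected, axis=0)
```

With two strongly conflicting task gradients, the naive average degrades one task while the projected aggregate remains non-conflicting for both, which is the behavior a Pareto criterion is meant to enforce.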
Supplementary Material: zip
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 8888