Keywords: Federated Learning, Financial Distress Prediction, Explainable AI, Financial Inclusion, Data Sovereignty, Responsible AI, Differential Privacy, Solid Pods, Personal Data Spaces, Consent Management, Policy Enforcement
Abstract: We propose a decentralized, privacy-first architecture for predicting consumer financial distress, evolving beyond simulated federated environments toward a deployable, user-centric design. Leveraging Solid pods, we enforce structural data sovereignty by keeping financial data in personal storage and governing access through pod-local access control, verifiable authorization, and decentralized identity, while bringing financial risk analytics to the data via federated learning. To mitigate inference risks from shared model updates in the federated learning process (e.g., reconstruction and membership inference attacks), we integrate Differential Privacy via update clipping and calibrated noise injection within the user's trusted pod environment. This work demonstrates that rigorous privacy guarantees can coexist with the predictive utility required for effective, socially responsible early warning and intervention in consumer financial applications. Specifically, we contribute (i) a Solid-based compute-to-data architecture for financial risk modeling, (ii) a policy-governed consent lifecycle for sensitive financial data, and (iii) a layered privacy design combining structural enforcement with differential privacy.
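The clip-then-noise step the abstract describes can be sketched as follows. This is a minimal illustration of the general technique (as in DP-SGD-style update privatization), not the paper's implementation; the function name, parameters, and default values are assumptions.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Illustrative sketch: clip a model update to a bounded L2 norm,
    then add Gaussian noise calibrated to that bound.

    All names and default values here are hypothetical, not from the paper.
    """
    rng = rng or np.random.default_rng()
    update = np.asarray(update, dtype=float)
    # Clipping bounds each client's sensitivity: scale down if the
    # update's L2 norm exceeds clip_norm.
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    # Noise standard deviation is calibrated to the clipping bound,
    # so the privacy guarantee follows from the bounded sensitivity.
    sigma = noise_multiplier * clip_norm
    return clipped + rng.normal(0.0, sigma, size=clipped.shape)
```

In the architecture described above, this step would run inside the user's trusted pod environment, so only the privatized update leaves personal storage.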
Submission Number: 9