Participatory Systems for Personalized Prediction

Published: 21 Nov 2022, Last Modified: 05 May 2023
Venue: TSRML 2022
Keywords: privacy, participation, prediction, informed consent, healthcare
TL;DR: We introduce a family of personalized prediction models called \emph{participatory systems} that improve privacy, fairness, and performance by supporting user participation and informed consent.
Abstract: Machine learning models often request personal information from users to assign more accurate predictions across a heterogeneous population. However, personalized models are typically not built to support \emph{informed consent}: users cannot "opt out" of providing personal data, nor understand the effects of doing so. In this work, we introduce a family of personalized prediction models called \emph{participatory systems} that support informed consent. Participatory systems are interactive prediction models that let users opt into reporting additional personal data at prediction time, and inform them about how their data will improve their predictions. We present a model-agnostic approach for supervised learning tasks where personal data is encoded as "group" attributes (e.g., sex, age group, HIV status). Given a pool of user-specified models, our approach can create a variety of participatory systems that differ in their training requirements and opportunities for informed consent. We conduct a comprehensive empirical study of participatory systems in clinical prediction tasks and compare them to common approaches for personalization. Our results show that our approach can produce participatory systems that exhibit large improvements in privacy, fairness, and performance at the population and group levels.
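The abstract describes a model-agnostic construction: a pool of models, one per subset of reported group attributes, with each user choosing at prediction time which attributes to disclose and being told how disclosure would change their prediction quality. The sketch below illustrates that idea under simplifying assumptions; the function names, scikit-learn classifiers, synthetic data, and printed consent message are illustrative and are not the authors' implementation.

```python
# Minimal sketch of a participatory prediction system (hypothetical, not the paper's code):
# train one model per subset of optional group attributes, then at prediction time
# use only the attributes the user consents to report and show the estimated benefit.
from itertools import combinations

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


def train_model_pool(X_base, X_group, y, n_groups):
    """Fit one classifier per subset of optional group attributes (including the empty set)."""
    pool = {}
    for r in range(n_groups + 1):
        for subset in combinations(range(n_groups), r):
            feats = np.hstack([X_base, X_group[:, list(subset)]]) if subset else X_base
            pool[subset] = LogisticRegression(max_iter=1000).fit(feats, y)
    return pool


def predict_with_consent(pool, x_base, reported, group_names, Xb_val, Xg_val, y_val):
    """Predict using only the attributes the user opted to report, and tell the user
    the estimated accuracy of reporting versus withholding their data."""
    subset = tuple(sorted(group_names.index(g) for g in reported))
    x = np.concatenate([x_base, [reported[group_names[i]] for i in subset]]).reshape(1, -1)
    val_feats = np.hstack([Xb_val, Xg_val[:, list(subset)]]) if subset else Xb_val
    acc_with = pool[subset].score(val_feats, y_val)
    acc_without = pool[()].score(Xb_val, y_val)
    print(f"Estimated accuracy: {acc_with:.3f} (vs. {acc_without:.3f} if you opt out)")
    return pool[subset].predict(x)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 500
    X_base = rng.normal(size=(n, 3))            # routine features everyone provides
    X_group = rng.integers(0, 2, size=(n, 2))   # optional group attributes (e.g., sex, HIV status)
    y = (X_base[:, 0] + X_group[:, 0] + rng.normal(scale=0.5, size=n) > 0.5).astype(int)

    group_names = ["sex", "hiv_status"]
    Xb_tr, Xb_va, Xg_tr, Xg_va, y_tr, y_va = train_test_split(
        X_base, X_group, y, test_size=0.3, random_state=0)
    pool = train_model_pool(Xb_tr, Xg_tr, y_tr, len(group_names))

    # A user consents to report only their sex; the system explains the benefit of doing so.
    pred = predict_with_consent(pool, X_base[0], {"sex": 1}, group_names, Xb_va, Xg_va, y_va)
    print("Prediction:", int(pred[0]))
```

In this sketch, the printed validation accuracies stand in for the "informing" step described in the abstract; the paper's systems differ in how the pool is trained and how users are informed.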