Abstract: In this work, we address the issue of privacy in distributed constraint reasoning by studying, with tools from utility and game theory, how agents trade off solution quality against privacy preservation. We propose a utilitarian definition of privacy in the context of distributed constraint reasoning, detail its implications, and present a model and solvers together with their properties. We then show how key steps of distributed constraint optimization under privacy requirements can be modeled as a planning problem, and more specifically as a stochastic game. Finally, we present experiments that validate the interest of our approach according to several criteria.
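To give a concrete flavor of the utility-based trade-off mentioned in the abstract, the following is a minimal, hypothetical sketch in Python: an agent compares the expected gain in solution quality from revealing a piece of private information against the utility lost by that disclosure. The names (`RevealOption`, `best_action`) and the simple additive form of the utility are illustrative assumptions, not the paper's actual model or algorithm.

```python
# Hypothetical illustration of a utilitarian privacy trade-off:
# an agent decides whether to reveal private information by comparing
# the expected solution-quality gain with the privacy cost of disclosure.
# The additive utility below is an assumption for illustration only.

from dataclasses import dataclass


@dataclass
class RevealOption:
    name: str            # label of the candidate disclosure
    quality_gain: float  # expected improvement in solution quality
    privacy_loss: float  # utility lost by revealing this information


def best_action(options, keep_private_utility=0.0):
    """Return the disclosure (or silence) with the highest net utility.

    If no disclosure beats staying silent, returns
    ("keep_private", keep_private_utility).
    """
    best_name, best_value = "keep_private", keep_private_utility
    for opt in options:
        net = opt.quality_gain - opt.privacy_loss
        if net > best_value:
            best_name, best_value = opt.name, net
    return best_name, best_value


if __name__ == "__main__":
    candidates = [
        RevealOption("reveal_deadline", quality_gain=3.0, privacy_loss=1.5),
        RevealOption("reveal_budget", quality_gain=2.0, privacy_loss=4.0),
    ]
    print(best_action(candidates))  # ('reveal_deadline', 1.5)
```

In the paper's setting such decisions are interdependent across agents and over time, which is why the abstract frames the problem as a planning problem and, more specifically, as a stochastic game rather than the one-shot choice sketched here.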