Article ID: 423886
Journal: Electronic Notes in Theoretical Computer Science
Published Year: 2011
Pages: 19
File Type: PDF
Abstract

Differential privacy is a promising approach to privacy-preserving data analysis, with a well-developed theory for functions. Despite recent work on implementing systems that aim to provide differential privacy, the problem of formally verifying that these systems are differentially private has not been adequately addressed. We develop a formal probabilistic automaton model of differential privacy for systems by adapting prior work on differential privacy for functions. We present the first sound verification technique for proving differential privacy of interactive systems. The technique is based on a form of probabilistic bisimulation relation; the novelty lies in the way we track quantitative privacy leakage bounds using a relation family instead of a single relation. We illustrate the proof technique on a representative automaton motivated by PINQ, an implemented system that is intended to provide differential privacy. Surprisingly, our analysis yields a privacy leakage bound of 2t·ϵ rather than t·ϵ when ϵ-differentially private functions are called t times. The extra leakage arises from accounting for the bounded-memory constraints of real computers.
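For background, the function-level notion referenced in the abstract can be sketched as follows; this is the standard Dwork-style definition and its usual sequential-composition bound, not the paper's automaton model or proof.

% A randomized function K is \epsilon-differentially private if, for all
% databases D_1, D_2 differing in a single record and all S \subseteq \mathrm{Range}(K):
\[
  \Pr[K(D_1) \in S] \;\le\; e^{\epsilon}\,\Pr[K(D_2) \in S].
\]
% Sequential composition for functions: t calls to \epsilon-DP mechanisms give a
% (t \cdot \epsilon)-DP result. The abstract's point is that the system-level
% analysis instead yields a (2 t \cdot \epsilon) bound once the bounded memory
% of real machines is modeled.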

Related Topics
Physical Sciences and Engineering › Computer Science › Computational Theory and Mathematics