Article code | Journal code | Publication year | English article | Full-text version |
---|---|---|---|---|
423886 | 685301 | 2011 | 19-page PDF | Free download |

Differential privacy is a promising approach to privacy-preserving data analysis with a well-developed theory for functions. Despite recent work on implementing systems that aim to provide differential privacy, the problem of formally verifying that these systems have differential privacy has not been adequately addressed. We develop a formal probabilistic automaton model of differential privacy for systems by adapting prior work on differential privacy for functions. We present the first sound verification technique for proving differential privacy of interactive systems. The technique is based on a form of probabilistic bisimulation relation. The novelty lies in the way we track quantitative privacy leakage bounds using a relation family instead of a single relation. We illustrate the proof technique on a representative automaton motivated by PINQ, an implemented system that is intended to provide differential privacy. Surprisingly, our analysis yields a privacy leakage bound of 2t·ϵ rather than t·ϵ when ϵ-differentially private functions are called t times. The extra leakage arises from accounting for the bounded-memory constraints of real computers.
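As a rough illustration of the composition bound discussed in the abstract (not the paper's formal automaton model or PINQ's actual API), the sketch below implements a standard Laplace mechanism and a hypothetical `BudgetTracker` that charges 2·ϵ per call, mirroring the paper's 2t·ϵ leakage bound, rather than the textbook t·ϵ of sequential composition. All names and parameters here are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value with Laplace noise calibrated for epsilon-DP."""
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

class BudgetTracker:
    """Hypothetical tracker of cumulative privacy leakage over interactive queries.

    Textbook sequential composition charges epsilon per call (t calls -> t*eps);
    the paper's automaton analysis charges 2*epsilon per call (t calls -> 2*t*eps)
    to account for bounded-memory implementations.
    """
    def __init__(self, per_call_factor=2.0):
        self.per_call_factor = per_call_factor  # 2.0 mirrors the paper's bound
        self.spent = 0.0

    def query(self, true_value, sensitivity, epsilon):
        self.spent += self.per_call_factor * epsilon
        return laplace_mechanism(true_value, sensitivity, epsilon)

# t = 3 calls to an epsilon-differentially private function
tracker = BudgetTracker()
for _ in range(3):
    noisy = tracker.query(true_value=42.0, sensitivity=1.0, epsilon=0.1)
print(f"cumulative leakage bound: {tracker.spent:.2f}")  # 2 * 3 * 0.1 = 0.60
```

The multiplicative factor is kept as a parameter so the same tracker expresses both the standard t·ϵ accounting (`per_call_factor=1.0`) and the paper's stricter 2t·ϵ bound.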
Journal: Electronic Notes in Theoretical Computer Science - Volume 276, 29 September 2011, Pages 61-79