Inference-Based Privacy Guarantees for Differentially Private Mechanisms, or: The Physics of Differential Privacy

Friday, March 11, 2016
12:00 PM - 1:00 PM
Location: Baxter B125
Robert Kleinberg, Associate Professor of Computer Science, Cornell University

Abstract

How much information can an attacker learn about an individual by observing the outcome of a computation? If the computation is differentially private, the answer is: "not much more than if the individual's data had been excluded from the input to the computation." A stronger notion of privacy, originally propounded by Dalenius in the 1970s, requires instead that the attacker should not learn much more about an individual than if the outcome of the computation had never been revealed. Simple examples, as well as a general impossibility theorem of Dwork and Naor, preclude the possibility of supplying this stronger "inference-based" type of guarantee against attackers with arbitrary side information, assuming the computation is at least minimally useful.

In this talk we revisit the notion of inference-based privacy (henceforth, IBP) and ask: under what limitations on the adversary's side information can we deduce an IBP guarantee from a differential privacy guarantee? We model the adversary's side information as a prior distribution over datasets (or, more generally, a set of possible priors) and we prove two main results. The first result pertains to distributions that satisfy a natural positive-affiliation condition, and gives an upper bound on the IBP parameter for any differentially private mechanism. This upper bound is matched by a simple mechanism that adds Laplace noise to the sum of the data. The second result pertains to distributions that have weak correlations, defined in terms of a suitable "influence matrix". The result provides an upper bound for IBP in terms of the differential privacy parameter and the spectral norm of this matrix.
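The Laplace-noise mechanism mentioned above is standard in the differential privacy literature, and a minimal sketch is easy to give. The code below is an illustrative assumption of mine, not material from the talk: it releases the sum of values in [0, 1] (so the sum has sensitivity 1) with Laplace noise of scale 1/ε, which yields ε-differential privacy. Laplace noise is sampled as the difference of two i.i.d. exponentials.

```python
import random


def laplace_sample(scale: float) -> float:
    """Sample Laplace(0, scale) as the difference of two i.i.d.
    exponentials with mean `scale` (a standard identity)."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)


def private_sum(data, epsilon: float) -> float:
    """Release the sum of values in [0, 1] with epsilon-differential
    privacy: the sum has sensitivity 1, so Laplace noise with scale
    1/epsilon suffices."""
    return sum(data) + laplace_sample(1.0 / epsilon)
```

For example, with ε = 1 and 100 values, the released sum is typically within a few units of the true sum; larger ε means less noise and a weaker privacy guarantee.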

Statistical mechanics constitutes a leitmotif that runs through the talk, influencing both the proofs and the interpretation of the results. In brief, application of a differentially private mechanism to correlated data is analogous to application of an external field to a spin system. Techniques developed by physicists for bounding the change in magnetization due to an external field provide a blueprint for proving our main results. Physical phenomena such as phase transitions have analogues in the setting of the privacy questions we address.

This is joint work with Arpita Ghosh.

Series: Linde Institute/Social and Information Sciences Laboratory Seminar Series (SISL)
For more information, please phone 626-395-4083 or email bestrada@hss.caltech.edu
