
Noise-Aware Inference for Differential Privacy via Sufficient Statistics Perturbation

Thursday, 01/17/2019 2:00pm to 4:00pm
CS140
Ph.D. Dissertation Proposal Defense

Abstract:

Domains involving sensitive human data, such as health care, human mobility, and online activity, increasingly depend on machine learning algorithms. This leads to scenarios in which data owners wish to protect the privacy of the individuals whose records make up the data, while data modelers wish to analyze and draw conclusions from that data. There is thus a growing demand for effective private inference methods that reconcile the needs of both parties. For this we turn to differential privacy, which provides a framework for executing these algorithms privately by injecting random noise at various points in the process. Among these mechanisms is sufficient statistics perturbation (SSP), in which the sufficient statistics, quantities that capture all of the information the data carry about the model parameters, are corrupted with noise and released. This mechanism offers desirable efficiency properties and simpler implementations than its alternatives.
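As a concrete illustration of SSP, consider a toy Bernoulli model: the sufficient statistic is a count, changing one record changes it by at most 1, so Laplace noise with scale 1/epsilon gives an epsilon-differentially-private release. This is a minimal sketch under those assumed choices, not material from the proposal; the function name ssp_release and the synthetic data are illustrative.

import numpy as np

def ssp_release(data, epsilon, rng=None):
    """Release a noisy sufficient statistic under epsilon-differential privacy.

    For Bernoulli data the sufficient statistic is the sum of the records;
    its L1 sensitivity is 1, so Laplace noise with scale 1/epsilon suffices.
    """
    rng = np.random.default_rng() if rng is None else rng
    s = np.sum(data)                                 # sufficient statistic
    return s + rng.laplace(scale=1.0 / epsilon)      # privatized release

# Example: privately release the count of positives in a sensitive dataset.
data = np.random.default_rng(0).binomial(1, 0.3, size=1000)
print(ssp_release(data, epsilon=0.5))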

Existing work has used SSP to develop private inference algorithms for a multitude of models, but none of these algorithms accounts for the noise added by the privacy mechanism. This work is the first to develop such methods in a principled manner that directly accounts for the injected noise. We do so for maximum likelihood estimation of undirected graphical models, for Bayesian inference of exponential family models, and for Bayesian inference of conditional regression models.
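To illustrate the distinction the proposal draws, the sketch below treats the privacy noise as part of the generative model rather than plugging the noisy statistic in as if it were exact: the likelihood of the released value marginalizes over the unobserved true count. The specific model (Bernoulli with a flat prior and a grid approximation) and the name noise_aware_posterior are illustrative assumptions, not the proposal's algorithms.

import numpy as np
from scipy import stats

def noise_aware_posterior(noisy_s, n, epsilon, grid_size=500):
    """Posterior over theta given only the noisy sufficient statistic.

    The likelihood marginalizes over the unobserved true count s:
        p(noisy_s | theta) = sum_s Binomial(s | n, theta) * Laplace(noisy_s - s; 1/epsilon)
    A naive method would instead condition on noisy_s as if it were the true count.
    """
    grid = np.linspace(1e-4, 1 - 1e-4, grid_size)    # candidate theta values
    s_vals = np.arange(n + 1)                        # possible true counts
    # Laplace density of the observed release given each possible true count.
    noise = stats.laplace.pdf(noisy_s - s_vals, scale=1.0 / epsilon)
    # Binomial pmf of each true count under each candidate theta: shape (G, n+1).
    binom = stats.binom.pmf(s_vals[None, :], n, grid[:, None])
    lik = binom @ noise                              # marginal likelihood p(noisy_s | theta)
    post = lik / lik.sum()                           # posterior weights on the grid (flat prior)
    return grid, post

grid, post = noise_aware_posterior(noisy_s=312.4, n=1000, epsilon=0.5)
print(grid[np.argmax(post)])                         # noise-aware MAP estimate of theta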

Advisor: Dan Sheldon