Joint Theory Seminar / Computer Science Speaking Skills Talk

— 3:00pm

Location:
In Person - Gates Hillman 8102

Speaker:
JUSTIN WHITEHOUSE, Ph.D. Student, Computer Science Department, Carnegie Mellon University
https://jwhitehouse11.github.io/

Brownian Noise Reduction: Maximizing Privacy Subject to Accuracy Constraints

There is a disconnect between how researchers and practitioners handle privacy-utility tradeoffs. Researchers primarily operate from a privacy-first perspective, setting strict privacy requirements and minimizing risk subject to these constraints. Practitioners often prefer an accuracy-first perspective, content with the greatest privacy they can obtain subject to sufficiently small error. Ligett et al. introduced a "noise reduction" algorithm to address the latter perspective.

The authors show that by adding correlated Laplace noise and progressively reducing it on demand, it is possible to produce a sequence of increasingly accurate estimates of a private parameter while only paying a privacy cost for the least noisy iterate released. In this work, we generalize noise reduction to the setting of Gaussian noise, introducing the Brownian mechanism. The Brownian mechanism works by first adding Gaussian noise of high variance corresponding to the final point of a simulated Brownian motion. Then, at the practitioner's discretion, noise is gradually decreased by tracing back along the Brownian path to an earlier time.
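The trace-back step described above can be sketched in a few lines: conditioned on a Brownian motion's value at a later time, its value at an earlier time follows a Brownian bridge distribution, so less-noisy estimates can be sampled consistently with the noisier ones already released. This is an illustrative sketch only; the function name, time schedule, and interface are assumptions, not the paper's actual implementation, and no privacy accounting is shown.

```python
import numpy as np

def brownian_noise_reduction(theta, times, rng=None):
    """Illustrative sketch of Brownian-motion-based noise reduction.

    Releases theta + B(t) at a decreasing sequence of times
    t_1 > t_2 > ..., where B is a standard Brownian motion with
    Var[B(t)] = t. Earlier times correspond to less noise, so each
    successive release is a more accurate estimate of theta.
    """
    rng = np.random.default_rng() if rng is None else rng
    times = sorted(times, reverse=True)  # start from the noisiest point
    t_prev = times[0]
    # Initial high-variance noise: the endpoint B(t_1) ~ N(0, t_1).
    b_prev = rng.normal(0.0, np.sqrt(t_prev))
    releases = [theta + b_prev]
    for t in times[1:]:
        # Trace back along the path via the Brownian bridge:
        # B(t) | B(t_prev) ~ N((t/t_prev) * B(t_prev),
        #                      t * (t_prev - t) / t_prev)
        mean = (t / t_prev) * b_prev
        var = t * (t_prev - t) / t_prev
        b_prev = rng.normal(mean, np.sqrt(var))
        t_prev = t
        releases.append(theta + b_prev)
    return releases
```

Because each earlier point is sampled conditionally on the later ones, the sequence of releases is jointly distributed as a single Brownian path, which is what allows the privacy cost to be charged only for the least noisy iterate.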

Our mechanism is more naturally applicable to the common setting of ℓ2-sensitivity, empirically outperforms existing work on common statistical tasks, and provides customizable control of privacy loss over the entire interaction with the practitioner. Overall, our results demonstrate that one can meet utility constraints while still maintaining strong levels of privacy.

Presented as part of the Theory Lunch Seminar

Presented in Partial Fulfillment of the CSD Speaking Skills Requirement
