pith. machine review for the scientific record.

arxiv: 1603.07294 · v2 · submitted 2016-03-23 · 💻 cs.LG · cs.AI · cs.CR · stat.ML

Recognition: unknown

On the Theory and Practice of Privacy-Preserving Bayesian Data Analysis

Authors on Pith: no claims yet
classification 💻 cs.LG · cs.AI · cs.CR · stat.ML
keywords privacy · data analysis · posterior · approach · bayesian · differential · efficient
abstract

Bayesian inference has great promise for the privacy-preserving analysis of sensitive data, as posterior sampling automatically preserves differential privacy, an algorithmic notion of data privacy, under certain conditions (Dimitrakakis et al., 2014; Wang et al., 2015). While this one posterior sample (OPS) approach elegantly provides privacy "for free," it is data inefficient in the sense of asymptotic relative efficiency (ARE). We show that a simple alternative based on the Laplace mechanism, the workhorse of differential privacy, is as asymptotically efficient as non-private posterior inference, under general assumptions. This technique also has practical advantages including efficient use of the privacy budget for MCMC. We demonstrate the practicality of our approach on a time-series analysis of sensitive military records from the Afghanistan and Iraq wars disclosed by the Wikileaks organization.
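The abstract contrasts the one posterior sample (OPS) approach with the Laplace mechanism, which releases a statistic after adding Laplace noise scaled to the statistic's sensitivity divided by the privacy parameter ε. Below is a minimal sketch of that mechanism, not the paper's specific algorithm; the function name and the bounded-data example are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    # Standard Laplace mechanism: release `value` with epsilon-differential
    # privacy by adding Laplace noise with scale sensitivity / epsilon.
    rng = np.random.default_rng() if rng is None else rng
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Illustrative use: privately release the mean of data bounded in [0, 1].
# Changing one of the n records moves the mean by at most 1/n, so the
# L1 sensitivity of the mean is 1/n.
data = np.array([0.2, 0.7, 0.5, 0.9])
n = len(data)
private_mean = laplace_mechanism(data.mean(), sensitivity=1.0 / n, epsilon=1.0)
```

Smaller ε means stronger privacy but larger noise; the paper's efficiency claim concerns how fast such noisy estimates converge relative to non-private posterior inference.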

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Modulated learning for private and distributed regression with just a single sample per client device

    cs.LG · 2026-05 · unverdicted · novelty 5.0

    Single-sample clients add one calibrated noisy perturbation to their data point and share transformed representations, allowing the server to recover unbiased gradients for private distributed regression.