Calibrating the attack to sensitivity in differentially private mechanisms

Unsal, Ayse ; Önen, Melek
Journal of Cybersecurity and Privacy, Vol. 2, N°4, 18 October 2022

This work studies anomaly detection under differential privacy with Gaussian and Laplacian perturbation using both statistical and information-theoretic tools. In our setting, the adversary aims to modify the content of a statistical dataset by inserting additional data without being detected, exploiting differential privacy to her/his own benefit. To this end, first, via hypothesis testing, we characterize statistical thresholds for the adversary in various settings, which balance the privacy budget against the impact of the attack (the modification applied to the original data) in order to avoid detection. In addition, we establish the privacy-distortion trade-off, in the sense of the well-known rate-distortion function, for the Gaussian mechanism using an information-theoretic approach. Accordingly, we derive an upper bound on the variance of the attacker's additional data as a function of the sensitivity and the original data's second-order statistics. Lastly, we introduce a new privacy metric based on Chernoff information for anomaly detection under differential privacy, as a stronger alternative to (ϵ, δ)-differential privacy for Gaussian mechanisms. Analytical results are supported by numerical evaluations.
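The sketch below is a minimal, illustrative rendering of the setting described in the abstract, not the paper's actual construction: a Gaussian mechanism with noise calibrated to sensitivity via the standard σ = Δ·sqrt(2 ln(1.25/δ))/ϵ rule, an adversary who inserts extra records hoping the perturbation masks the shift, and a simple threshold test on the released statistic. The query (a bounded mean), the reference mean mu0, the threshold tau, and all parameter values are assumptions chosen for illustration.

```python
# Minimal sketch (assumptions noted above): Gaussian mechanism calibrated to
# sensitivity, data-insertion attack, and a simple threshold (hypothesis) test.
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mechanism(true_value, sensitivity, eps, delta):
    """Release true_value + N(0, sigma^2) with sigma calibrated for (eps, delta)-DP."""
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return true_value + rng.normal(0.0, sigma), sigma

# Original dataset: n records in [0, 1]; query = mean, so sensitivity ~ 1/n.
n = 1000
data = rng.uniform(0.0, 1.0, size=n)
eps, delta = 1.0, 1e-5
sensitivity = 1.0 / n

# Honest release of the mean.
honest_release, sigma = gaussian_mechanism(data.mean(), sensitivity, eps, delta)

# Adversary inserts k biased records, hoping the DP noise hides the shift.
k = 50
attacked = np.concatenate([data, np.ones(k)])   # all-ones insertions (assumption)
attacked_release, _ = gaussian_mechanism(attacked.mean(), sensitivity, eps, delta)

# Defender's test: flag the release if it deviates from a reference mean mu0
# by more than tau standard deviations of the combined (sampling + noise) spread.
mu0 = 0.5        # expected mean under the null hypothesis (assumption)
tau = 3.0        # detection threshold (assumption)
spread = np.sqrt(data.var() / n + sigma**2)

for name, release in [("honest", honest_release), ("attacked", attacked_release)]:
    flagged = abs(release - mu0) > tau * spread
    print(f"{name:8s} release = {release:.4f}  flagged = {flagged}")
```

Varying k, ϵ, and tau in this toy setup mirrors the trade-off studied in the paper: a larger privacy budget leaves less noise to hide behind, so the adversary must shrink the inserted modification to stay below the detection threshold.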

Type: Journal
Date: 2022-10-18
Department: Digital Security
Eurecom Ref: 6996
Copyright: MDPI

PERMALINK : https://www.eurecom.fr/publication/6996