Causal Discovery Under Local Privacy - Institut Polytechnique de Paris
Conference paper — Year: 2024


Abstract

Differential privacy is a widely adopted framework designed to safeguard the sensitive information of data providers within a dataset. It is based on the application of controlled noise at the interface between the server that stores and processes the data, and the data consumers. Local differential privacy is a variant that allows data providers to apply the privatization mechanism themselves, individually, on their own data. It therefore also provides protection in contexts where the server, or even the data collector, cannot be trusted. The introduction of noise, however, inevitably affects the utility of the data, particularly by distorting the correlations between individual data components. This distortion can prove detrimental to tasks such as causal structure learning. In this paper, we consider various well-known locally differentially private mechanisms and compare the trade-off between the privacy they provide and the accuracy of the causal structure produced by causal learning algorithms applied to data obfuscated by these mechanisms. Our analysis yields valuable insights for selecting appropriate locally differentially private protocols for causal discovery tasks. We foresee that our findings will aid researchers and practitioners in conducting locally private causal discovery.
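To illustrate the local model the abstract describes, the sketch below implements generalized randomized response (k-RR), one standard locally differentially private mechanism of the kind such comparisons typically include. This is an illustrative example, not the specific protocol set studied in the paper; the function name and domain are assumptions for the sketch.

```python
import math
import random

def randomized_response(value, domain, epsilon, rng=random):
    """Generalized randomized response (k-RR): report the true value
    with probability p = e^eps / (e^eps + k - 1), otherwise report one
    of the other k-1 domain values uniformly at random. This satisfies
    epsilon-local differential privacy for a discrete domain of size k.
    """
    k = len(domain)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if rng.random() < p:
        return value
    # Lie: pick uniformly among the other values.
    others = [v for v in domain if v != value]
    return rng.choice(others)

# Each data provider obfuscates locally, before anything reaches the
# (possibly untrusted) collector; only the noisy reports are sent.
domain = [0, 1, 2]
true_values = [0, 1, 2, 1]
reports = [randomized_response(v, domain, epsilon=1.0) for v in true_values]
```

Because noise is injected per attribute and per individual, joint statistics (and hence the conditional-independence tests used by causal discovery algorithms) are distorted, which is the utility cost the paper measures.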

Dates and versions

hal-04617032, version 1 (19-06-2024)


Cite

Ruta Binkyte, Carlos Pinzón, Szilvia Lestyán, Kangsoo Jung, Héber Hwang Arcolezi, et al. Causal Discovery Under Local Privacy. Third Conference on Causal Learning and Reasoning, Apr 2024, Los Angeles, CA, United States. pp. 325-383. ⟨hal-04617032⟩