Title: Data Correlation behaviour on Privacy Leakage in Differential Privacy
Abstract: Massive volumes of static and continuously generated data from day-to-day activities have led to increased analysis and statistical data publishing. Differential privacy has accordingly seen a significant rise in attention as a privacy framework. Many existing privacy methods still use the traditional differential privacy mechanism as a tool to publish private data. However, these methods ignore the fact that correlations may exist within the data and assume that each data entity is independent, which is rarely the case. In particular, temporal correlation may exist in data generated by a user's continuous publication. If an adversary obtains such correlation information from other sources, the adversary may infer users' private data from the published data. It is therefore necessary to measure (or analyze) the privacy leakage of traditional differential privacy mechanisms as data is published, whether or not the users' data contains correlation over time, and to present a method, under the protection of a differential privacy mechanism, that quantifies the maximum privacy leakage when the data involves correlation. Finally, we conduct an experiment to analyze the maximum privacy leakage, and the results show that the privacy leakage of user data can gradually increase over time as the data is published.
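The effect described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration (not the paper's method): it applies the standard Laplace mechanism independently to each release of a correlated time series, here modeled under the simplifying assumption that the user's true value stays roughly constant. An adversary who knows this correlation can average the noisy releases, so the effective noise shrinks and the inferred estimate tightens as more data is published.

```python
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sampling for Laplace(0, scale)
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(value, sensitivity, epsilon, rng):
    # Standard Laplace mechanism: noise scale = sensitivity / epsilon
    return value + laplace_noise(sensitivity / epsilon, rng)

rng = random.Random(0)
true_value = 10.0            # assumed roughly constant due to temporal correlation
sensitivity, epsilon = 1.0, 1.0

# Each release satisfies epsilon-DP in isolation, but the series is correlated.
releases = [laplace_mechanism(true_value, sensitivity, epsilon, rng)
            for _ in range(100)]

# An adversary aware of the correlation averages the releases; the variance of
# the average falls roughly as 1/T, so the estimate of the private value
# tightens as more data is published.
for T in (1, 10, 100):
    estimate = sum(releases[:T]) / T
    print(T, estimate)
```

This is why, as the abstract notes, per-release privacy guarantees alone do not bound the cumulative leakage when an adversary exploits correlation across publications.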
Publication Year: 2023
Publication Date: 2023-06-08
Language: en
Type: article
Indexed In: Crossref