Title: Differential private noise adding mechanism: Basic conditions and its application
Abstract: Differential privacy is a formal mathematical standard for quantifying the degree to which individual privacy in a statistical database is preserved. A typical method of guaranteeing differential privacy is to add random noise to the original data before release. In this paper, we investigate the basic conditions of differential privacy for a general random noise adding mechanism, and then apply the result to the privacy analysis of a privacy-preserving consensus algorithm. Specifically, we obtain a necessary and sufficient condition for differential privacy, which provides a useful and efficient criterion for achieving it. We use this result to analyze the privacy of several common random noises, and the theory matches the existing literature in special cases. Applying the theory, we investigate the differential privacy property of a privacy-preserving consensus algorithm and obtain a necessary condition for it to be differentially private. In addition, we prove that no privacy-preserving consensus algorithm can guarantee average consensus and differential privacy simultaneously.
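The abstract describes the general approach of adding random noise to data before release. As one concrete illustration (not the paper's general mechanism), the classical Laplace mechanism achieves epsilon-differential privacy for a query with known L1 sensitivity; a minimal sketch, where the function names and parameters are our own:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Draw a Laplace(0, scale) sample via inverse-transform sampling.
    u = random.random() - 0.5          # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    # Release true_value perturbed by Laplace noise with scale sensitivity/epsilon,
    # which is the standard calibration for epsilon-differential privacy of a
    # query with the given L1 sensitivity.
    return true_value + laplace_noise(sensitivity / epsilon)
```

Averaged over many releases, the noisy output is unbiased around the true value, while any single release hides an individual's contribution.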
Publication Year: 2017
Publication Date: 2017-05-01
Language: en
Type: article
Indexed In: ['crossref']
Cited By Count: 34