TY - JOUR
T1 - The Value of Privacy
T2 - Strategic Data Subjects, Incentive Mechanisms and Fundamental Limits
AU - Wang, Weina
AU - Ying, Lei
AU - Zhang, Junshan
N1 - Funding Information:
This work was supported in part by the NSF under Grant ECCS-1255425.
Publisher Copyright:
© 2016 ACM.
PY - 2016/6
Y1 - 2016/6
N2 - We study the value of data privacy in a game-theoretic model of trading private data, where a data collector purchases private data from strategic data subjects (individuals) through an incentive mechanism. The private data of each individual represents her knowledge about an underlying state, which is the information that the data collector desires to learn. Unlike most existing work on privacy-aware surveys, our model does not assume that the data collector is trustworthy. Consequently, each individual retains full control of her own data privacy and reports only a privacy-preserving version of her data. In this paper, the value of ϵ units of privacy is measured by the minimum payment over all nonnegative payment mechanisms under which an individual's best response at a Nash equilibrium is to report her data with a privacy level of ϵ. The higher ϵ is, the less private the reported data is. We derive lower and upper bounds on the value of privacy that are asymptotically tight as the number of data subjects becomes large. Specifically, the lower bound shows that it is impossible to buy ϵ units of privacy with a smaller payment, and the upper bound is achieved by a payment mechanism that we design. Based on these fundamental limits, we further derive lower and upper bounds on the minimum total payment the data collector must make to achieve a given learning accuracy target, and show that the total payment of the designed mechanism is at most one individual's payment away from the minimum.
AB - We study the value of data privacy in a game-theoretic model of trading private data, where a data collector purchases private data from strategic data subjects (individuals) through an incentive mechanism. The private data of each individual represents her knowledge about an underlying state, which is the information that the data collector desires to learn. Unlike most existing work on privacy-aware surveys, our model does not assume that the data collector is trustworthy. Consequently, each individual retains full control of her own data privacy and reports only a privacy-preserving version of her data. In this paper, the value of ϵ units of privacy is measured by the minimum payment over all nonnegative payment mechanisms under which an individual's best response at a Nash equilibrium is to report her data with a privacy level of ϵ. The higher ϵ is, the less private the reported data is. We derive lower and upper bounds on the value of privacy that are asymptotically tight as the number of data subjects becomes large. Specifically, the lower bound shows that it is impossible to buy ϵ units of privacy with a smaller payment, and the upper bound is achieved by a payment mechanism that we design. Based on these fundamental limits, we further derive lower and upper bounds on the minimum total payment the data collector must make to achieve a given learning accuracy target, and show that the total payment of the designed mechanism is at most one individual's payment away from the minimum.
KW - differential privacy
KW - game theory
KW - incentive mechanism
KW - mechanism design
KW - strategic data subjects
UR - http://www.scopus.com/inward/record.url?scp=85112755852&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85112755852&partnerID=8YFLogxK
U2 - 10.1145/2896377.2901461
DO - 10.1145/2896377.2901461
M3 - Article
AN - SCOPUS:85112755852
SN - 0163-5999
VL - 44
SP - 249
EP - 260
JO - Performance Evaluation Review
JF - Performance Evaluation Review
IS - 1
ER -