TY - GEN
T1 - On the fine asymptotics of information theoretic privacy
AU - Kalantari, Kousha
AU - Kosut, Oliver
AU - Sankar, Lalitha
N1 - Funding Information:
This work was supported in part by the National Science Foundation under grant CCF-1422358
Publisher Copyright:
© 2016 IEEE.
PY - 2017/2/10
Y1 - 2017/2/10
N2 - The tradeoff between privacy and utility is studied for small datasets using tools from fixed-error asymptotics in information theory. The problem is formulated as determining the privacy mechanism (random mapping) that minimizes the mutual information (a metric for privacy leakage) between the private features of the original dataset and a released version, subject to a distortion constraint between the public features and the released version. An excess-probability bound is used to constrain the distortion, thus limiting the random variation in distortion due to the finite length. Bounds are derived for the following variants of the problem: (1) whether the mechanism is memoryless (local privacy) or not (global privacy), (2) whether the privacy mechanism has direct access to the private data or not. It is shown that these settings yield different performance in the first order: for global privacy, the first-order leakage decreases with the excess probability, whereas for local privacy it remains constant. The derived bounds also provide tight performance results up to second order for local privacy, as well as bounds on the second-order term for global privacy.
AB - The tradeoff between privacy and utility is studied for small datasets using tools from fixed-error asymptotics in information theory. The problem is formulated as determining the privacy mechanism (random mapping) that minimizes the mutual information (a metric for privacy leakage) between the private features of the original dataset and a released version, subject to a distortion constraint between the public features and the released version. An excess-probability bound is used to constrain the distortion, thus limiting the random variation in distortion due to the finite length. Bounds are derived for the following variants of the problem: (1) whether the mechanism is memoryless (local privacy) or not (global privacy), (2) whether the privacy mechanism has direct access to the private data or not. It is shown that these settings yield different performance in the first order: for global privacy, the first-order leakage decreases with the excess probability, whereas for local privacy it remains constant. The derived bounds also provide tight performance results up to second order for local privacy, as well as bounds on the second-order term for global privacy.
KW - Privacy-utility trade-off
KW - excess distortion
KW - fine asymptotics
KW - mutual information leakage
UR - http://www.scopus.com/inward/record.url?scp=85015202142&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85015202142&partnerID=8YFLogxK
U2 - 10.1109/ALLERTON.2016.7852277
DO - 10.1109/ALLERTON.2016.7852277
M3 - Conference contribution
AN - SCOPUS:85015202142
T3 - 54th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2016
SP - 532
EP - 539
BT - 54th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2016
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 54th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2016
Y2 - 27 September 2016 through 30 September 2016
ER -