Robust Privacy-Utility Tradeoffs under Differential Privacy and Hamming Distortion

Kousha Kalantari, Lalitha Sankar, Anand D. Sarwate

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

A privacy-utility tradeoff is developed for an arbitrary set of finite-alphabet source distributions. Privacy is quantified using differential privacy (DP), and utility is quantified using expected Hamming distortion maximized over the set of distributions. The family of source distribution sets (source sets) is categorized into three classes, based on the different levels of prior knowledge they capture. For source sets whose convex hull includes the uniform distribution, symmetric DP mechanisms are optimal. For source sets whose probability values have a fixed monotonic ordering, asymmetric DP mechanisms are optimal. For all other source sets, general upper and lower bounds on the optimal privacy leakage are developed, and a necessary and sufficient condition for their tightness is established. Differentially private leakage is an upper bound on mutual information (MI) leakage: the two criteria are compared analytically and numerically to illustrate the effect of adopting a stronger privacy criterion.
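As a concrete illustration of the two quantities traded off in the abstract, the sketch below uses $k$-ary randomized response, a standard symmetric $\epsilon$-DP mechanism on a finite alphabet (chosen here for illustration; it is not necessarily the paper's construction), and evaluates the expected Hamming distortion maximized over a candidate source set. The alphabet size, $\epsilon$ value, and example source distributions are all assumptions for the demo.

```python
import numpy as np

def randomized_response(k, eps):
    """k-ary randomized response: a symmetric eps-DP mechanism on an
    alphabet of size k (illustrative sketch, not the paper's mechanism)."""
    p_stay = np.exp(eps) / (np.exp(eps) + k - 1)   # keep the true symbol
    p_move = 1.0 / (np.exp(eps) + k - 1)           # flip to any other symbol
    W = np.full((k, k), p_move)
    np.fill_diagonal(W, p_stay)
    return W  # W[x, y] = Pr(output = y | input = x)

def worst_case_hamming_distortion(W, sources):
    """Expected Hamming distortion E[1{Y != X}], maximized over a set of
    candidate source distributions (the 'source set' in the abstract)."""
    per_symbol_error = 1.0 - np.diag(W)            # Pr(Y != x | X = x)
    return max(float(p @ per_symbol_error) for p in sources)

k, eps = 4, 1.0
W = randomized_response(k, eps)

# Sanity-check the eps-DP constraint: W[x, y] / W[x', y] <= e^eps for all x, x', y.
ratios = W[:, None, :] / W[None, :, :]
assert np.all(ratios <= np.exp(eps) + 1e-12)

# For this symmetric mechanism, Pr(Y != X) is the same under every source,
# so the worst case over any source set equals (k - 1) / (e^eps + k - 1).
sources = [np.ones(k) / k, np.array([0.7, 0.1, 0.1, 0.1])]
d = worst_case_hamming_distortion(W, sources)
print(round(d, 4), round((k - 1) / (np.exp(eps) + k - 1), 4))
```

The source-independence of the distortion shown here is specific to the symmetric mechanism; the abstract's asymmetric-mechanism and bounded-leakage regimes arise precisely when the source set does not justify this symmetric choice.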

Original language: English (US)
Journal: IEEE Transactions on Information Forensics and Security
DOI: 10.1109/TIFS.2018.2831619
State: Accepted/In press - Apr 27 2018

Keywords

  • Differential privacy
  • Distortion
  • Distortion measurement
  • Hamming distortion
  • information leakage
  • Mutual information
  • Privacy
  • utility-privacy tradeoff

ASJC Scopus subject areas

  • Safety, Risk, Reliability and Quality
  • Computer Networks and Communications

Cite this

Robust Privacy-Utility Tradeoffs under Differential Privacy and Hamming Distortion. / Kalantari, Kousha; Sankar, Lalitha; Sarwate, Anand D.

In: IEEE Transactions on Information Forensics and Security, 27.04.2018.


@article{2714832ebfca494791b5435591ebc601,
title = "Robust Privacy-Utility Tradeoffs under Differential Privacy and Hamming Distortion",
abstract = "A privacy-utility tradeoff is developed for an arbitrary set of finite-alphabet source distributions. Privacy is quantified using differential privacy (DP), and utility is quantified using expected Hamming distortion maximized over the set of distributions. The family of source distribution sets (source sets) is categorized into three classes, based on the different levels of prior knowledge they capture. For source sets whose convex hull includes the uniform distribution, symmetric DP mechanisms are optimal. For source sets whose probability values have a fixed monotonic ordering, asymmetric DP mechanisms are optimal. For all other source sets, general upper and lower bounds on the optimal privacy leakage are developed, and a necessary and sufficient condition for their tightness is established. Differentially private leakage is an upper bound on mutual information (MI) leakage: the two criteria are compared analytically and numerically to illustrate the effect of adopting a stronger privacy criterion.",
keywords = "Differential privacy, Distortion, Distortion measurement, Hamming distortion, information leakage, Mutual information, Privacy, utility-privacy tradeoff",
author = "Kalantari, {Kousha} and Sankar, {Lalitha} and Sarwate, {Anand D.}",
year = "2018",
month = "4",
day = "27",
doi = "10.1109/TIFS.2018.2831619",
language = "English (US)",
journal = "IEEE Transactions on Information Forensics and Security",
issn = "1556-6013",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}
