Explaining differential privacy: New methods for understanding data protection

Alexandra Reeve Givens, President & CEO, Center for Democracy & Technology


Data collection is a common practice with significant implications for privacy, prompting organizations in both industry and government to turn to differential privacy (DP) to protect individuals' personal information. DP allows for data analysis while limiting what can be learned about individuals, thus maintaining their privacy. Companies such as Google, Apple, Meta, Microsoft, and Uber, along with government agencies like the U.S. Census Bureau, have adopted this approach. However, the level of privacy protection varies according to a parameter known as epsilon, with smaller values indicating stronger privacy.
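One standard way to achieve differential privacy in practice is the Laplace mechanism, which adds calibrated random noise to a query result; the noise scale grows as epsilon shrinks, trading accuracy for privacy. The sketch below illustrates this for a simple counting query (it is a generic illustration, not necessarily the mechanism used by the organizations named above):

```python
import math
import random

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy via the
    Laplace mechanism. A counting query has sensitivity 1 (adding or
    removing one person changes the count by at most 1), so the noise
    scale is 1/epsilon: smaller epsilon means more noise and stronger
    privacy."""
    # Sample Laplace(0, 1/epsilon) noise by inverting its CDF.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

Averaged over many releases the noisy count is unbiased, but any single release reveals only a blurred value, with the blur controlled by epsilon.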

Epsilon, a measure of information leakage, is rarely communicated to the individuals whose data is collected, largely because of its complexity. As the researchers put it: "Despite epsilon’s importance as an indicator of privacy risk, it is seldom communicated to the people whose personal data are used by technology companies and other large organizations. This is because epsilon is difficult to reason about, even among experts." Rachel Cummings and Priyanka Nanayakkara, working with Gabriel Kaptchuk, Elissa M. Redmiles, and Mary Anne Smart, seek to bridge this gap by designing methods to explain epsilon, empowering individuals to make informed decisions about sharing their data.

Their study evaluated three methods: an odds-based text method, an odds-based visualization method, and an example-based method. The research involved a survey of 963 people using a workplace scenario where employees had to decide whether to share data, based on varying communication methods and epsilon values. The odds-based visualization method notably improved participants' understanding of risks compared to a control scenario without privacy protections. In contrast, the example-based method decreased comprehension.
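Odds-based explanations rest on a standard interpretation of epsilon: an epsilon-DP release can shift the odds of any conclusion about an individual by at most a factor of e^epsilon. A minimal sketch of that bound follows (the function name and the prior-odds default are illustrative assumptions, not the authors' exact framing):

```python
import math

def max_posterior_odds(epsilon: float, prior_odds: float = 1.0) -> float:
    """Upper bound on an observer's posterior odds about an individual
    after seeing an epsilon-DP output. Epsilon-DP guarantees the
    likelihood ratio between any two neighboring datasets is at most
    e^epsilon, so prior odds can shift by at most that factor."""
    return prior_odds * math.exp(epsilon)
```

For example, with even prior odds (1:1), epsilon = ln 2 caps the posterior odds at 2:1, while a smaller epsilon keeps them closer to the prior; this is the kind of quantity an odds-based text or visualization can convey to non-experts.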

"Interestingly, we found that participants were more likely to share data when given one of our methods over the No-Epsilon Control," say the authors. This suggests that clear communication about epsilon can positively influence individuals' willingness to share data. Participants also proved sensitive to changes in epsilon: stronger privacy protections (smaller epsilon values) led to greater willingness to share data.

The research underscores the effectiveness of odds-based explanation methods in improving understanding and supporting informed decisions about personal data under differential privacy. These methods could be deployed in practice to increase transparency and trust in privacy-protecting technologies. As the authors write: "While probabilistic information is often sidestepped in public-facing explanations of epsilon, we hope that in the near future, organizations deploying DP can increase transparency by using methods like ours."

The authors advocate for clearer explanations of DP to support its broader and more responsible adoption, arguing that greater trust in the technology would enable valuable data analysis while preserving privacy.
