Thoughts on a topic I had not heard of until recently; as usual, what interests me is the implications of the measure.
"...As privacy violations have become rampant, and calls for better measures to protect sensitive, personally identifiable information have primarily resulted in bureaucratic policies satisfying almost no one, differential privacy is emerging as a potential solution.
In "Differential Privacy: The Pursuit of Protections by Default" (https://queue.acm.org/detail.cfm?id=3439229), a Case Study in ACM Queue, Google's Damien Desfontaines and Miguel Guevara reflect with Jim Waldo and Terry Coatta on the engineering challenges that lie ahead for differential privacy, as well as what remains to be done to achieve their ultimate goal of providing privacy protection by default.
Differential privacy, an approach based on a mathematically rigorous definition of privacy that allows formalization and proof of the guarantees against re-identification offered by a system, signifies measures of privacy that can be quantified and reasoned about—and then used to apply suitable privacy protections.
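To make the idea concrete, here is a minimal, illustrative sketch of the classic Laplace mechanism, one common way to achieve epsilon-differential privacy for a counting query. The dataset, function names, and parameters are all hypothetical, chosen for illustration; this is not the API of Google's library.

```python
import random

def laplace_noise(scale: float) -> float:
    # Sample from a Laplace(0, scale) distribution: the difference of
    # two independent Exp(1) variables is Laplace(0, 1), which we scale.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def private_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing any one
    # person's record changes the count by at most 1. Adding Laplace
    # noise with scale sensitivity/epsilon = 1/epsilon then satisfies
    # epsilon-differential privacy for this query.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Toy dataset of ages (hypothetical): how many people are 40 or older?
ages = [34, 41, 29, 52, 47, 38, 61, 25]
noisy = private_count(ages, lambda a: a >= 40, epsilon=1.0)
```

The noisy answer is close to the true count (4 here) on average, but any single individual's presence or absence is mathematically masked by the noise; smaller epsilon means stronger privacy and noisier answers.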
In September 2019, Google released its differential privacy library as open source (https://github.com/google/differential-privacy), making the capability generally available. To date, differential privacy has been adopted by the US Census Bureau, along with a number of technology companies.
Queue https://queue.acm.org/ is ACM's magazine for practicing software engineers. Written by engineers for engineers, Queue focuses on the technical problems and challenges that loom ahead, helping readers to sharpen their own thinking and pursue innovative solutions. ... "