Most people think privacy is an impossibility in the IoT world. With ever more sensor data capturing information about people and storing it, it is hard to accept the notion that anyone’s life is private. But believe it or not, Google is trying to make it much harder to identify who someone is from their data. It is doing this by open-sourcing its own tools, which make real data difficult to distinguish from statistical noise. The following article is a great read on what the company calls “differential privacy” and how it works.
Google today announced that it is open-sourcing its so-called differential privacy library, an internal tool the company uses to securely draw insights from datasets that contain the private and sensitive personal information of its users.
Differential privacy is a statistical approach to data science, particularly analysis, that lets someone relying on software-aided analysis draw insights from massive datasets while protecting user privacy. It does so by mixing raw user data with artificial “white noise,” as explained by Wired’s Andy Greenberg. That way, the results of any analysis cannot be used to unmask individuals or allow a malicious third party to trace any one data point back to an identifiable source.
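To make the noise-mixing idea concrete, here is a minimal sketch of the Laplace mechanism, the textbook way of achieving differential privacy for a counting query. This is an illustration only, not Google's library: the function names (`laplace_sample`, `private_count`) and the example dataset are made up for this sketch, and it assumes a count query, whose sensitivity is 1 (adding or removing one person changes the answer by at most 1), so noise drawn from Laplace(0, 1/ε) suffices for ε-differential privacy.

```python
import math
import random

def laplace_sample(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse transform sampling."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Answer a counting query with epsilon-differential privacy.

    A count has sensitivity 1, so adding Laplace(0, 1/epsilon) noise
    masks any single individual's contribution to the result.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_sample(1.0 / epsilon)

# Hypothetical dataset: ages of users. The true answer to
# "how many users are over 40?" is 3, but each published
# answer is perturbed, so no single person can be singled out.
ages = [34, 29, 41, 52, 38, 27, 45]
noisy_answer = private_count(ages, lambda age: age > 40, epsilon=0.5)
```

Averaged over many releases the noisy answers cluster around the true count, which is why an analyst can still draw aggregate insights; but any single published answer could plausibly have come from a dataset with or without any given individual, which is the privacy guarantee.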