Published on the IAPP Privacy Tech Blog.
If you are a privacy professional interested in anonymity, you have probably come across the Wired article “How One of Apple’s Key Privacy Safeguards Falls Short”. It tells the story of how researchers reverse-engineered Apple’s proprietary implementation of differential privacy and found that the privacy loss parameter is set well outside the range generally considered safe.
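For readers less familiar with the underlying math, here is the textbook definition of differential privacy (my own summary, not anything specific to Apple’s system): a mechanism M satisfies ε-differential privacy if, for all datasets D and D′ that differ in one person’s data and all sets of outputs S,

```latex
% \epsilon-differential privacy (textbook definition):
\Pr[\,M(D) \in S\,] \;\le\; e^{\epsilon} \cdot \Pr[\,M(D') \in S\,]
% Smaller \epsilon: the two output distributions are nearly indistinguishable,
% so no individual's data has much influence on what is reported (strong guarantee).
% Larger \epsilon: the bound becomes loose and the formal guarantee weakens.
```

The privacy loss parameter ε is the “knob” the researchers measured: the larger it is set, the weaker the formal guarantee.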
So you might be thinking that Apple got caught trying to pull a fast one: secretly turning the privacy knob to the “more data, less privacy” setting. But it’s not so simple. Contrary to what the Wired article suggests, it may well be that the privacy of Apple’s data collection is in fact quite strong. We don’t actually know, because the differential privacy parameter alone doesn’t tell us one way or the other.
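To make the “privacy knob” metaphor concrete, here is a minimal sketch of the textbook Laplace mechanism. This is my own illustration, not Apple’s proprietary implementation: ε directly controls how much random noise is added to a single reported count.

```python
# Minimal sketch of the textbook Laplace mechanism (illustration only,
# NOT Apple's implementation): epsilon is the "privacy knob".
import numpy as np

def noisy_count(true_count: int, epsilon: float) -> float:
    """Return the count plus Laplace noise calibrated to epsilon."""
    sensitivity = 1.0              # one person changes a count by at most 1
    scale = sensitivity / epsilon  # larger epsilon -> less noise
    return true_count + np.random.laplace(loc=0.0, scale=scale)

if __name__ == "__main__":
    for eps in (0.1, 1.0, 8.0):
        # Small epsilon: heavily perturbed answer. Large epsilon: almost no
        # noise, i.e. the "more data, less privacy" end of the knob.
        print(eps, noisy_count(1000, eps))
```

In general, the noise governs only a single report; the cumulative privacy loss also depends on how often such reports are collected, which is one reason a parameter value taken in isolation doesn’t settle the question.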
I personally think that the root of the problem here lies not with Apple, but with the researchers who over-hype differential privacy. Read why in my full post on the IAPP Privacy Tech Blog.