Apple chief executive Tim Cook told the Worldwide Developers Conference (WWDC) that iOS 10 would use differential privacy (DP) to collect more relevant data about its users – such as how many people are using certain emojis, or where people travel.
As stated in its iOS 10 developer preview guide, DP will initially be used in the new Messages app to recommend appropriate emojis (if you type the word "love", for instance, and other users have inserted a heart symbol, it may pop up) and to improve predictive text – again based on other users' behaviour.
It will also be used in search – Spotlight and Lookup Hints – to improve context sensitivity and to recommend apps, music, restaurants, and more based on what you are doing.
It all sounds innocuous enough, especially as it supposedly masks personally identifiable information (PII) – it is based on the "common" good – and delivers the information and offers that people like you want. In that respect it is better than Gmail's outright scanning of your email and documents, Google feeding you advertising based on your searches, and Facebook's database of private message URLs, all of which use PII.
Craig Federighi, Apple’s SVP of software engineering, said at WWDC that the technology works by adding "incorrect information" to the data Apple collects. Apple’s algorithms extract useful insights while making it very difficult for anyone to link data back to an individual user.
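Apple has not published the algorithm itself, but the classic building block behind this kind of scheme is randomised response, and a minimal sketch is easy to write. The Swift below is illustrative only – the function names and the 25 per cent flip rate are assumptions, not Apple's actual parameters:

```swift
// Randomised response: before reporting a sensitive bit (e.g. "did this
// user type this emoji?"), the device sometimes substitutes a coin flip.
// No individual report can be trusted, yet aggregate rates remain estimable.
func randomisedResponse(truth: Bool, flipProbability: Double = 0.25) -> Bool {
    if Double.random(in: 0..<1) < flipProbability {
        return Bool.random()   // report pure noise
    }
    return truth               // report the real value
}

// Recovering the population rate from noisy reports: with flip probability p,
// E[reported rate] = (1 - p) * trueRate + p * 0.5, so invert that relation.
func estimateTrueRate(reportedRate: Double, flipProbability p: Double = 0.25) -> Double {
    (reportedRate - p * 0.5) / (1 - p)
}
```

This is the "incorrect information" in miniature: any single answer is deniable, but across millions of users the noise averages out.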
Apple says it is being used to gain added insight into user behaviour – to identify patterns in how groups of similar users use their devices, apps, and more. It will assign what it calls a "privacy budget" – a limit on the number of data fragments a single person can submit during a set period. Those that do get submitted go into an anonymous pipeline, and Apple will periodically delete fragment donations from the server.
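Apple has not said how that budget is enforced. One plausible reading – purely a hypothetical sketch, with made-up numbers – is a per-period cap on submissions:

```swift
import Foundation

// Hypothetical sketch of a "privacy budget" as a per-period submission cap.
// Apple has not disclosed its actual mechanism; this only illustrates the
// idea of limiting how many fragments one user can donate in a time window.
struct PrivacyBudget {
    let maxSubmissions: Int   // fragments allowed...
    let period: TimeInterval  // ...per window, in seconds
    var submissionTimes: [Date] = []

    // Returns true if the fragment may be sent, false if the budget is spent.
    mutating func allowSubmission(at now: Date = Date()) -> Bool {
        // Forget submissions that have aged out of the current window.
        submissionTimes.removeAll { now.timeIntervalSince($0) > period }
        guard submissionTimes.count < maxSubmissions else { return false }
        submissionTimes.append(now)
        return true
    }
}

var budget = PrivacyBudget(maxSubmissions: 5, period: 24 * 60 * 60)
if budget.allowSubmission() {
    // ...noise the fragment and enqueue it for the anonymous pipeline
}
```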
The security community is sceptical at best.
The most quoted reference is Schneier on Security, which argues that this kind of anonymisation is much harder than people think and that the technique is likely to be full of vulnerabilities.
Another forum says: "The more information you intend to 'ask' of your database, the more noise has to be injected to minimise the privacy leakage. In DP there is a fundamental trade-off between accuracy and privacy, which can be a big problem when training complex machine learning models. Once data has been leaked, it's gone. Once you've leaked as much data as your calculations tell you is safe, you can't keep going – at least not without risking your users' privacy. At this point, the best solution may be just to destroy the database and start over. If such a thing is possible."
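The trade-off that quote describes can be made concrete with the Laplace mechanism, the textbook way of answering numeric queries under DP – again a generic sketch, not Apple's implementation. A fixed total privacy budget (epsilon) is split across queries, so every extra question makes each answer noisier:

```swift
import Foundation

// Laplace mechanism: answer a counting query by adding Laplace noise with
// scale sensitivity / epsilon. Smaller epsilon = stronger privacy = more noise.
func laplaceNoise(scale: Double) -> Double {
    // Laplace(0, scale) sampled as the difference of two exponential draws.
    let e1 = -scale * log(1 - Double.random(in: 0..<1))
    let e2 = -scale * log(1 - Double.random(in: 0..<1))
    return e1 - e2
}

func noisyCount(trueCount: Double, epsilon: Double, sensitivity: Double = 1) -> Double {
    trueCount + laplaceNoise(scale: sensitivity / epsilon)
}

// Sequential composition: k queries at epsilon/k each spend a total budget
// of epsilon, so asking more questions means accepting noisier answers.
let totalEpsilon = 1.0
for queries in [1.0, 10.0, 100.0] {
    let answer = noisyCount(trueCount: 500, epsilon: totalEpsilon / queries)
    print("\(Int(queries)) queries from the same budget – one answer: \(answer)")
}
```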
Another says Cook must pay marketing executives pretty well for them to conjure up such an artful deception: "Yes we're spying on you (but, trust me, we're not) because the money is so good. Oh, and would you like to donate some blood to Apple while you're at it?"
Another says that while DP is effective for individual databases, once data is lumped together over time it is inevitable that PII will be exposed.
Whatever the case, it will be interesting to see whether Apple ever opens the inner workings of its DP implementation to public scrutiny. For now, it is just more data being collected for Apple's unknown purposes.