Every single day
Every word you say
Every game you play
Every night you stay
I'll be watching you
Oh can't you see
You belong to me
A recent study of popular sports wearables by Open Effect, a Canadian not-for-profit, titled ‘Every Step You Fake’, raises serious issues that, if even remotely true, overstep generally accepted privacy boundaries.
Issues included persistent transmission of a unique Bluetooth identifier, leaked credentials, activity data that could be intercepted or tampered with, and susceptibility to man-in-the-middle attacks.
But the key issue is the personal privacy implications. All that data goes into a cloud hosted by the companies that sell the devices or the developers that make the fitness apps.
The report starts: Electronic fitness trackers [and we include smartwatches] are designed to display aggregate fitness information automatically on mobile devices and, frequently, on websites developed and controlled by the company that makes the given device.
Contemporary consumer fitness wearables collect a broad range of data. The number of floors, or altitude changes, a person climbs each day, the levels and depth of sleep, and heart-rate activity are all captured by best-in-class consumer fitness trackers. All of this data is of interest to the wearers of the devices, to companies interested in mining and selling collected fitness data, to insurance companies, to authorities and courts of law, and even potentially to criminals motivated to steal or access data retained by fitness companies.
In short, the researchers ask:
- Is the data collected noted in the company’s privacy policies and terms of service and, if so, what protections or assurances do individuals have concerning the privacy or security of that data?
- What of that data does the company classify as ‘personal’ data? This was tested by issuing legally compelling requests for each company to disclose all the personal data held on a requesting individual.
- Does the information received by the individual match what a company asserts is ‘personally identifiable information’ in their terms of service or privacy policies?
Big data has become both the panacea for the world’s ills and the beast that can consume us, with exabytes collected from multiple sources on anyone with a digital footprint. Big data can be aggregated [from any source] and used equally for good as for evil.
For example, the success of the Apple Watch – and we are not implying in any way that Apple is doing evil – could be used to predict insurance data (a little like how actuarial tables were developed from public data), or it could be used to track people’s movements: which shops they visited, where they lingered, how they paid, if they went home that night, and so much more. Wearables are meant to be worn 24x7, making them a rich source of data, which raises many questions:
- What do the companies controlling your personal health cloud actually do with it, especially as some app developers may only have a short commercial existence?
- Can you get that data back – remove it from the app cloud?
- Can you set your own privacy limitations on what it can be used for?
- What are the legal implications if the cloud owner is asked by a court, law enforcement, or another organisation for access to bulk or even granular information?
The study found that makers of fitness trackers and apps are today’s de facto stewards [they set their own rules] of self-collected data, and that many people find that commercial stewardship creates particular access challenges. From a self-tracker’s perspective, access to our data is insecure when it is controlled by commercial stewards with conflicting interests whose corporate lifespan may be brief.
A major concern is that a huge range of parties may be interested in accessing fitness-related data or other transmissions from wearable devices, a worry compounded by the relative lack of overt regulation surrounding how fitness-tracker data can be collected, processed, retained, or propagated.
In the case of many fitness-tracker companies these worries are entirely legitimate. Many companies reserve rights to the data collected from devices, from consumers’ manual data entry, and from the social-networking aspects of their services.
Such rights can include commercially sharing it, conducting data analyses of it, providing it to government authorities, and disposing of it as an asset in the case of bankruptcy or merger processes. Data may also be shared either on an individual or aggregate basis, though companies often ‘anonymize’ data prior to providing it to third parties.
For the most part, companies can do this because fitness data is not generally classified as health data, as defined for medical devices.
The report is a work in progress – it has opened a Pandora’s box.
In brief, it has found that many Bluetooth devices:
- Are discoverable by shop beacons as well as other smart devices
- Often transmit personal information unencrypted, in plain text, making packet capture possible
- Are not highly secure and can be intercepted (man-in-the-middle)
- Use a static MAC address (easy to find and track)
- Accept firmware updates via unsigned downloads from the internet
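The static-MAC problem deserves a moment's unpacking. Here is a minimal Python sketch – with invented beacon names, addresses, and timestamps, not data from the report – of why a device that always advertises the same address can be followed from shop beacon to shop beacon, while one using Bluetooth LE Privacy (randomized advertising addresses) cannot be linked this way:

```python
# Illustrative sketch: linking beacon sightings by advertised address.
# All addresses, beacons, and times below are hypothetical.
from collections import defaultdict

def link_sightings(sightings):
    """Group beacon sightings by the advertised Bluetooth address.

    A wearable with a static MAC yields one trail covering all its
    movements; a device that randomizes its address per advertisement
    fragments into unlinkable one-off sightings.
    """
    trail = defaultdict(list)
    for address, beacon, minute_of_day in sightings:
        trail[address].append((minute_of_day, beacon))
    # Sort each trail by time and keep just the ordered beacon names.
    return {addr: [b for _, b in sorted(visits)]
            for addr, visits in trail.items()}

# A static-MAC device seen at three shops during one morning:
static_device = [
    ("AA:BB:CC:DD:EE:FF", "coffee-shop", 540),
    ("AA:BB:CC:DD:EE:FF", "pharmacy", 600),
    ("AA:BB:CC:DD:EE:FF", "supermarket", 660),
]
# The same movements, but with a fresh random address each time:
private_device = [
    ("4E:12:09:A1:33:0B", "coffee-shop", 540),
    ("7C:55:FE:10:2D:91", "pharmacy", 600),
    ("02:8A:44:C3:6F:17", "supermarket", 660),
]

print(link_sightings(static_device))    # one complete, ordered trail
print(len(link_sightings(private_device)))  # three unlinkable fragments
```

No cooperation from the wearer is needed: any beacon operator who logs advertisements can build these trails, which is exactly why LE Privacy matters.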
The notable exception was the new Apple Watch, which uses HTTPS and Bluetooth LE Privacy features. Microsoft’s Band 2 was not in the study but also uses HTTPS and LE Privacy.
My conclusion is not based on anything other than a healthy paranoia: I try to keep a small digital footprint and avoid things like ID theft. I am not concerned about ‘state’ use of big data for planning and predicting the future, but I am concerned about companies [and cybercriminals] using it to know more about me.
But the mantra big data experts are wont to repeat – “Throw it in the data lake and see what bobs up” – scares the bejesus out of me. Seemingly disparate data can be combined in that lake to create links no one ever thought existed. Fitness data added to demographic and geographic (location) data could drive decisions on whether one doctor or a truckload should be set up in an area. Fitness data added to shopping-beacon and social data could change the advertising mix, and so on.
These examples are innocuous enough but a good data scientist will make far more use of it.
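To make the data-lake point concrete, here is a small Python sketch – all records, pseudonyms, and thresholds invented for illustration – of how two separately harmless datasets, joined on a shared attribute, produce an inference neither held on its own:

```python
# Illustrative sketch: two "anonymized" datasets joined in a data lake.
# Every record and value below is hypothetical.

fitness_log = [  # from a fitness cloud: pseudonym, suburb, avg daily steps
    {"user": "u-301", "suburb": "Newtown", "steps": 2100},
    {"user": "u-302", "suburb": "Newtown", "steps": 11500},
]
beacon_log = [  # from shop beacons: suburb and pharmacy foot traffic
    {"suburb": "Newtown", "shop": "pharmacy", "visits_per_week": 4},
]

def join_on(key, left, right):
    """Naive inner join on one shared column - all it takes to link
    two datasets that were released separately."""
    return [{**l, **r} for l in left for r in right if l[key] == r[key]]

combined = join_on("suburb", fitness_log, beacon_log)

# A low-activity user in a suburb with heavy pharmacy traffic might now
# be flagged as a health or insurance risk - a link that existed in
# neither source dataset by itself.
low_activity = [row for row in combined if row["steps"] < 5000]
```

The join itself is trivial; the privacy harm comes from what the combined rows let someone infer downstream.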
It is time fitness data was reclassified as personally identifiable information and brought under the same scrutiny that your medical data enjoys.