An alarming number of high-profile security breaches have shown just how at risk your data is. With wearables, however, there’s another, more sensitive dimension.
“A lot of what a device will collect will be sensitive by nature, such as a wristband that measures heartbeat,” says Robert Bond, head of data protection at law firm Speechly Bircham.
“With many wearables, nobody tells you how the data is being managed. When you upload your data, it’s usually hosted in the cloud – and the cloud isn’t secure.”
As with any technology service, there is usually a trade-off: the user gets something useful, such as insights into their sleep patterns, and in return adds to a vast data bank that is valuable to technology providers and others, including insurers and medical researchers.
Anne Bruinvels, a life sciences entrepreneur who has worked on the development of OWise, an app that helps breast cancer sufferers manage their illness through a diary and by tracking their treatment plan, is very aware of the responsibilities placed on developers to look after sensitive data.
“Data privacy is really key,” she says. OWise takes particular care to exclude data that identifies individuals from its database. “Our data is not contaminated with patient data – we don’t need to know someone’s name or date of birth,” she says. “We’d rather have slightly less data than compromise their privacy.”
One area of concern for anyone using a device that gathers personal data is how aware the developers are of their requirements under the law. At present, data protection in the European Union is governed by a patchwork of country-specific regulations, but come 2016, a new framework will be in place across all the EU member states that sets out tough requirements on privacy and transparency.
What’s comforting for any EU citizen is that, although the developer of the device they’re using might be based in Cupertino, California, as Apple is, the provider is nonetheless bound by the new rules if it handles or processes the data of EU citizens.
Robert Bond warns that many developers aren’t aware of the requirements they face when the new data protection framework comes into force. “They are techies who develop solutions,” he says. “They are focused on what the tech can do, how quickly they can get it to market and how much money they can make.
PRIVACY AS DEFAULT
“We have got to make developers aware that they should be building in privacy as a default. More security should not be added after it has gone wrong. Quite a number of healthcare companies are being more careful about who they use to develop these apps – they are bringing developers in-house.
“Broadly, most of the major medical device manufacturers are increasingly aware of the security risks.”
Mr Bond’s view is borne out by the chief executive of a medical devices company, who says: “Doctor-patient confidentiality is key. Everything we do is anonymised – we in the company can’t tell who people are, which is standard in clinical trials.”
But will the developers of the apps and devices in the consumer space adhere to the same high standards? Mr Bond is pessimistic. “We do a lot of venture capital work and we see a lot of private equity being put into startups that are in the business of data,” he says. “Due diligence seems to indicate that there is almost no consideration given to data privacy. Small businesses need to be building privacy into development.”
Under the current regime, you can’t opt out of parts of a developer’s terms and conditions. If you want to use their device, you have to agree to everything, including any data-sharing the developer chooses to do. So, for example, if your device connects to your health club and shares your data, the club could pass that data on to, say, insurers, who will pay for access to it.
The forthcoming EU data protection framework, however, will give you the right to opt out of such disclosures.
However, there are a number of questions you should ask yourself when considering whether to use a wearable device that shares your data. The first is how transparent and upfront the documentation is about how the developer gathers, stores and shares your data. “The more sensitive the data, the more you should think, ‘Stop, do I need to reassure myself that I have an understanding about what happens next?’” Mr Bond advises.
Another issue is how portable the data you’re generating is. What happens if you stop using that device or that app? Can you extract your data and use it with a new device, or is it locked within a walled garden?
The best providers are very aware of these concerns and are prepared for the tougher regulatory regime. Ms Bruinvels says: “We are fully compliant with the current regulations and we will keep improving that – this happens on a continuous basis.”
There is a great deal of potential benefit in generating and sharing such data. The OWise breast cancer app, for example, makes its dataset available to cancer researchers. But, as Mr Bond concludes: “There’s no such thing as a free lunch. If in doubt, don’t use it.”