Monthly Archives: August 2014

Data Privacy and the Internet of Things

Earlier this month (August 2014) Ofcom announced that UK adults spend an average of eight hours and 41 minutes a day on media devices – which compares with an average night’s sleep of eight hours and 21 minutes …

I have to admit to being something of a science fiction fan and it seems to me that our own world has some interesting parallels with that created by E M Forster in his short novel, The Machine Stops.

The setting is a world where humans live in isolation in underground cells, and where everything is provided by the global “Machine” – music, art, literature, conversation, education, knowledge, interaction with other humans, food, religion, medicine – truly everything that humankind allegedly requires. In Forster’s world, travel is available, but unpopular and treated with suspicion. The physically strong are culled at birth. The weak survive.  When the Machine breaks down, the humans – its subjects – perish, leaving the only hope for the human race with those who had previously escaped the underground world and made their way to the surface to live outside the Machine’s jurisdiction.

In our own world, we have the internet, social media, online music, art, and the ability to educate, work and communicate, both personally and in business, from a distance.

And, of course, we have the Internet of Things, which is currently generating a great deal of interest and discussion, and which brings us ever closer to Forster’s world.

What is the Internet of Things?

The answer lies in the name, though it’s worth mentioning that “Things” include people.

In a nutshell, we are living in a world where broadband is a ubiquitous fact of life, technology is moving ever faster – and becoming ever cheaper – and more and more devices are being built with wi-fi capability and sensors: from smartphones to fridges, remote household heating systems to tumble-dryers, razors to kettles, and TVs to wearable devices.

According to Gartner (a Connecticut-based IT research and advisory company) by 2020 there will be over 26 billion connected devices.  With an assumed 8 billion people on the planet in the same year, that’s an average of over 3 ¼ ‘smart’ devices per man, woman and child!
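The arithmetic behind that figure is easy to check. The device and population counts below are the projections quoted above, not measurements:

```python
# Back-of-the-envelope check of the Gartner projection quoted above:
# 26 billion connected devices shared among an assumed 8 billion people.
devices = 26_000_000_000
people = 8_000_000_000

per_person = devices / people
print(f"Average devices per person: {per_person:.2f}")  # prints 3.25, i.e. just over 3 1/4
```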

For example, LG has developed a fridge that has a camera which allows owners to see what food is inside.  It scans items as they’re added, tracks expiry dates and recommends recipes based on the food available. The owner can also programme Body Mass Index (BMI) and weight loss targets.  Using smart TV and voice recognition technology, the fridge can see who is opening the door, recommend a recipe … and even in future turn on the oven to the right temperature if you choose that recipe!

It’s intended that this fridge will link with online food shopping services so that it can restock itself when supplies run low.  The fridge’s data will all be accessible to the owner via smartphone, tablet or PC, so the owner can stay in control. (If you like the idea, the fridge is scheduled to be on sale in the UK later this year for around £2,000.)

RFID Tags and Security Issues

There is no doubt that the opportunity for automated household management is appealing, and it may well be unavoidable in the future.   And there are many other potential uses too, including tracking wildlife, chipping pets (and even humans), providing access to a person’s medical records, and monitoring our medical conditions to notify us of drugs and dosages to be taken.  We already have RFID technology in our passports, our travel passes, even our clothes (though primarily for stock control reasons rather than intended tracking).

But privacy is a real concern.  Given the sensitivity of some of the data to be collected, it is alarming to read that the default security settings on these devices are often very weak, making it straightforward for hackers to break into devices.  This has been amply demonstrated already:

‘Smart’ Devices Send out Spam emails …

Between December 23rd 2013 and January 6th 2014, about 750,000 spam messages were sent out by smart gadgets.  The malware involved was able to install itself on a range of kitchen appliances, home media systems and web-connected televisions.  It was able to do so because the gadgets had not been set up securely, used default passwords, and the owners were unaware of the potential for security issues – if they were even aware the devices were connected to the internet.

Privacy and Security

Businesses must be mindful of the consumer’s privacy and security when they develop products that can gather and share data about what they, their owners, and other linked “smart” products do.  This new technology will collect private, and sometimes deeply personal and sensitive, data about the owners who may be wearing the technology or installing it in their homes.

Currently it seems that companies are storing data from these smart devices in the cloud, without necessarily informing the consumer or giving them a choice.  Even under the antiquated data protection legislation currently in place, if such data allows the individuals associated with it to be personally identified, that is likely to be a breach of the DPA.

There’s no doubt that becoming compliant and secure in the RFID environment will be much simpler for businesses if they start the process at the very beginning of the technological developments.  They would also be well advised to make their compliance and security solutions scalable to avoid significant problems in the future.

The EU Directive on the Protection of Personal Data states that a person must freely give specific consent and be informed before their personal information is processed.  EU Member States are required to ensure confidentiality of communications by prohibiting unlawful interception and surveillance of personal information unless consent has been provided.

This suggests that using RFID chips raises serious privacy implications.  To remain compliant with EU data protection legislation, organisations should make the following absolutely clear:

  • Whether the merchandise includes RFID tags
  • Whether the user’s data will be collected and stored by the organisation
  • What data will be collected
  • How the data will be used
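As a purely illustrative sketch, those four disclosure points could be tracked as a simple per-product record that a compliance team reviews before launch. The class and field names here are invented for illustration, not drawn from any regulation or real API:

```python
from dataclasses import dataclass, field

@dataclass
class RFIDDisclosure:
    """Hypothetical per-product record of the RFID disclosure points above."""
    product: str
    contains_rfid_tags: bool                # is the tag's presence declared?
    data_is_collected: bool                 # will user data be collected and stored?
    data_collected: list = field(default_factory=list)  # what data will be collected
    purposes: list = field(default_factory=list)        # how the data will be used

    def is_complete(self) -> bool:
        # A disclosure is only complete if, whenever data is collected,
        # both the data items and the purposes are spelled out.
        if not self.data_is_collected:
            return True
        return bool(self.data_collected) and bool(self.purposes)

card = RFIDDisclosure(
    product="Travel card",
    contains_rfid_tags=True,
    data_is_collected=True,
    data_collected=["journey history"],
    purposes=["fare calculation"],
)
print(card.is_complete())  # prints True
```

A record like this makes it obvious when a product is collecting data without a stated purpose – exactly the gap the checklist is meant to close.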


EU RFID Technical Standards

At the end of July, the European Commission put out a series of recommendations to protect consumers from privacy risks associated with RFID chips.  Viviane Reding, the former EU Commissioner, said: “While smart chips working with RFID technology can make businesses more efficient and better organised, I am convinced they will only be welcomed in Europe if they are used by the consumers and not on the consumers. No European should carry a chip in one of their possessions without being informed precisely what they are used for, with the choice of removing or switching it off at any time. The ‘Internet of Things’ will only work if it is accepted by the people.”

Privacy Impact Assessments

While the sentiment is admirable, it has, until now, been difficult to see quite how it is enforceable.  A good starting point, however, is that an RFID Privacy Impact Assessment has been agreed, which should ensure data protection within current EU privacy regulations.

In the meantime, the European Commission’s new RFID logo has been developed for items that include RFID tags so that individuals will know that they are carrying items that can be tracked – eg Oyster cards, fashion items, wearable technology and so on.   Unfortunately the scheme is voluntary, which means that businesses are not obliged to use the logos.

The Future

RFID items are increasingly widespread and popular – the technology is cheap and efficient, retailers find it enormously helpful from a stock control perspective, and consumers find it useful.  It will be fascinating to see how the development of RFID products impacts on our lives, our privacy and our security.  Perhaps we’re not so very far away from the world envisioned by E M Forster back in 1909 – long before the internet and all its trimmings were in place.

As Shakespeare so tellingly put it:  “O brave new world that has such people in’t”

Big Data and the Data Protection Act

Big data is a big issue for organisations across the world.  Businesses, governments, health organisations, analysts and scientists are all looking at the opportunities to be gained from using big data.

Another big issue is the matter of individuals’ privacy and data protection. The EU data protection principles are already established throughout the member states, and in the UK we have the Data Protection Act, which is regulated and enforced by the ICO.

So what is big data?

I first wrote about big data in 2012.  Analyst Doug Laney described it as being three-dimensional – a combination of volume, velocity and variety.

It probably began when customers started shopping over the internet.  Businesses started to save and analyse data from clicks, searches, registrations, purchases and so on. Then came social networks where individuals post personal and business information about themselves, hold conversations with their friends, family and colleagues, post updates and opinions, store their photographs and music and films and videos in the cloud…

This technology is continuing to develop at speed, while big data analysis and algorithms are becoming ever more sophisticated.  As a result, big data’s relationship with data protection and privacy regulations is becoming a serious and significant issue.

Big Data and the Data Protection Act

Of course, not all big data actually uses personal information.  For example, researchers analysing data from particle physics experiments at CERN’s Large Hadron Collider sift through approximately 16 million gigabytes of data every year.  This is hardly a serious threat to individuals’ privacy.

On the other hand, a business using data from social media in combination with sales transactions and loyalty cards is indeed using personal data, and in this case the Data Protection Act (DPA) applies to protect the individual.

Regardless of whether or not we think the DPA is adequate to protect individuals against organisations working with data, it is the only legislation we have.  And the ICO has just produced a report suggesting a number of areas where organisations must be mindful of their big data regulatory responsibilities:

  1. Fair processing: Where big data is used to make decisions affecting individuals, a key requirement is that such processing – including the initial collection of that individual’s data – is fair and transparent.  A clear explanation of why the data is being collected (the Purpose) and, where necessary, consent of the individual to that purpose is a key element in the compliant use of such data.
  2. Consent: any consent must be ‘freely given, specific and informed’.  People must be able to understand how their data is to be used, and there must be a clear indication that they have consented to such use. If an organisation is relying on consent as a condition for processing big data, it is important that the data subjects have a clear choice and are able to withdraw their consent if they wish. Otherwise, the consent does not meet the requirements of the DPA.
  3. Repurposing: where data has been collected for one reason, and is now being used for a completely different purpose, then the organisation needs to make its users or customers aware of this – most particularly if the data is being used for a purpose that the individual could not reasonably have expected at the time the data was initially collected.  In this case, where consent is relied on, fresh consent for the new purpose is required.
  4. Excessive, irrelevant data: using all the available data for analysis might be expected to contravene Principle 3 of the Data Protection Act, which states that data must be adequate, relevant and not excessive.  An organisation must be clear from the outset what it expects to learn or do by processing all the data.  It must also be in a position, if necessary, to demonstrate how it has satisfied itself that the data it is using from perhaps a multiplicity of sources is relevant and not excessive.
  5. Security: organisations using personal data should always be mindful of security and the potential for data security breaches.  The use of big data is no different in this respect – but the number of new datasets that may be acquired in combination with the existing data used may make the security issues a little more widespread, and will require robust risk assessment and risk management policies and procedures.
  6. Anonymisation: if data is correctly anonymised it will no longer be considered personal data and will therefore not be subject to the DPA.  However, when using the multiple data sources associated with big data analytics, genuine anonymisation can be difficult to achieve, and the ICO advises organisations to carry out a robust assessment of the risk of re-identification and to provide solutions proportionate to that risk.
  7. Privacy Impact Assessment (PIA): a PIA is an important part of being compliant as it helps gain an understanding of how the processing will affect the individuals concerned.  For example, there is a difference between using personal data to identify general trends, and using it to make decisions that affect those individuals.
  8. Long-term use: using big data for analytics does not waive the requirement that data should be kept only for the period required for the stated business purposes.  If a business wants to hold the data for long-term use, the reasons must be articulated and justified.
  9. Subject access: don’t forget that people can request to see the data you are processing about them.  When using big data, systems can become complex and unwieldy making such requests difficult, time-consuming or expensive to fulfil.  Keeping the system simple will obviously benefit the organisation.
  10. Third parties: if data has been purchased from a third party in order to run its big data analytics, the purchaser becomes the data controller for the purchased data.  It is now responsible for ensuring it has met the DPA’s conditions for further use of that data, and, if it is relying on the original consent obtained by the supplier, then the purchaser must ensure that this is adequate to cover its further processing requirements.
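Point 6 above deserves a concrete illustration. A common first step is pseudonymisation – replacing direct identifiers with a keyed hash – but this is not the same as anonymisation: anyone holding the key, or able to link the remaining fields across datasets, may still re-identify individuals. A minimal sketch using Python’s standard library, with an invented secret key:

```python
import hmac
import hashlib

SECRET_KEY = b"rotate-and-protect-this-key"  # illustrative value only

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a keyed hash.

    Keyed hashing (HMAC) rather than a plain hash stops trivial
    re-identification by brute-forcing common values, but the result is
    still pseudonymised personal data, not anonymised: whoever holds the
    key can recompute the mapping.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "alice@example.com", "postcode": "CO10", "basket_value": 42.50}
safe_record = {**record, "email": pseudonymise(record["email"])}
# The postcode and purchase pattern may still allow re-identification when
# combined with other datasets - hence the ICO's advice to assess that risk.
```

This is why the ICO frames the question as a risk assessment rather than a binary: the residual fields, not just the obvious identifiers, determine whether the data remains personal data under the DPA.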

In summary

When using big data, always make sure you comply with the DPA.

Don’t be secretive, deceptive or misleading.  Make sure you obtain appropriate consents as required.  Explain clearly what you’re doing with big data to your users, your customers and those from whom you’re collecting data.  And make sure the information is utterly transparent.  It’s also worth being creative about how you tell them what you’re doing, by finding, describing and providing visible benefits that they will appreciate.

If you have any compliance or security questions on your own use of big data, please contact or call 01787 277742. Data Compliant offers the following services: