Category Archives: General Information

GDPR and Data Protection Impact Assessments (DPIAs)


When are they needed? How are they done?

Next year, under the new GDPR data protection legislation, Privacy Impact Assessments (PIAs) will become known as Data Protection Impact Assessments (DPIAs), and will be mandatory rather than merely recommended.

The ICO currently describes PIAs as “a tool which can help organisations identify the most effective way to comply with their data protection obligations and meet individuals’ expectations of privacy.”

While the soon-to-be-rechristened DPIAs will be legally required, data controllers should embrace them as opportunities to avert heavy fines, brand reputational damage and the risks associated with data breaches at an early stage in any planned operation.

When will a DPIA be legally required?

Organisations will be required to carry out a DPIA when data processing is “likely to result in a high risk to the rights and freedoms of individuals.” This may be during an existing project, or before a planned one, that involves data processing carrying a risk to the rights of individuals as provided by the Data Protection Act. DPIAs can also range in scope, depending on the organisation and the scale of the project.

DPIAs will therefore be required when an organisation is planning an operation that could affect anyone’s right to privacy: broadly speaking, anyone’s right ‘to be left alone.’ DPIAs are primarily designed to allow organisations to avoid breaching an individual’s freedom to “control, edit, manage or delete information about themselves and to decide how and to what extent such information is communicated to others.” If there is a risk of any such breach, a DPIA must be carried out.

Listed below are examples of projects, varying in scale, in which the current PIA is advised – and it is safe to assume all of these examples will necessitate a DPIA after the GDPR comes into force:

  • A new IT system for storing and accessing personal data.
  • A new use of technology such as an app.
  • A data sharing initiative where two or more organisations (even if they are part of the same group company) seek to pool or link sets of personal data.
  • A proposal to identify people in a particular group or demographic and initiate a course of action.
  • Processing large quantities of sensitive personal data.
  • Using existing data for a new and unexpected or more intrusive purpose.
  • A new surveillance system (especially one which monitors members of the public) or the application of new technology to an existing system (for example adding Automatic number plate recognition capabilities to existing CCTV).
  • A new database which consolidates information held by separate parts of an organisation.
  • Legislation, policies or strategies which will impact on privacy through the collection or use of information, or through surveillance or other monitoring.

How is a DPIA carried out?

There are seven main steps to a DPIA:

  1. Identify the need for a DPIA

This will mainly involve answering ‘screening questions’ at an early stage in a project’s development to identify the potential impacts on individuals’ privacy. The project management team should begin to think about how to address these issues while consulting with stakeholders.

  2. Describe the information flows

Explain how information will be obtained, used and retained. This part of the process can identify the potential for – and help to avoid – ‘function creep’: when data ends up being processed or used unintentionally, or unforeseeably.

  3. Identify the privacy and related risks

Compile a record of the risks to individuals, in terms of possible intrusions into data privacy, as well as corporate risks to the organisation, such as regulatory action, reputational damage and loss of public trust. This involves a compliance check against the Data Protection Act and the GDPR.

  4. Identify and evaluate the privacy solutions

With the record of risks ready, devise a number of solutions to eliminate or minimise these risks, and evaluate the costs and benefits of each approach. Consider the overall impact of each privacy solution.

  5. Sign off and record the DPIA outcomes

Obtain appropriate sign-offs and acknowledgements throughout the organisation. A report based on the findings and conclusions of the prior steps of the DPIA should be published and accessible for consultation throughout the project.

  6. Integrate the outcomes into the project plan

Ensure that the DPIA is implemented into the overall project plan. The DPIA should be utilised as an integral component throughout the development and execution of the project.

  7. Consult with internal and external stakeholders as needed throughout the process

This is not a ‘step’ as such, but an ongoing commitment to stakeholders to be transparent about the process of carrying out the DPIA, and being open to consultation and the expertise and knowledge of the organisation’s various stakeholders – from colleagues to customers. The ICO explains, “data protection risks are more likely to remain unmitigated on projects which have not involved discussions with the people building a system or carrying out procedures.”

DPIAs – what are the benefits?

There are clear benefits for organisations that conduct DPIAs:

  • cost benefits from adopting a Privacy by Design approach: knowing the risks before starting work allows issues to be fixed early, resulting in reduced development costs and fewer delays to the schedule
  • risk mitigation in relation to fines and loss of sales caused by lack of customer and/or shareholder confidence
  • reputational benefits and trust building from being seen to consider and embed privacy issues into a programme’s design from the outset

For more information about DPIAs and how Data Compliant can help, please email dc@datacompliant.co.uk.

Harry Smithson, 20th July 2017

ICO updates Subject Access Requests (SARs) advice for data controllers following Court of Appeal decisions

The Information Commissioner’s Office (ICO) has updated its ‘Code of Practice on Subject Access Requests’ chiefly in response to several Court of Appeal decisions made earlier this year related to SARs. Under the Data Protection Act 1998, individuals (‘data subjects’) may request access to their personal information held by a ‘data controller.’

These requests for information are called SARs, and can range from a request for specific or limited information to a request for the entirety of the information held, including why it is held and to whom it may have been disclosed. The scope of a data controller’s obligations will therefore vary from case to case, and can be particularly burdensome for large organisations. Currently, data controllers may charge a fee of up to £10 for processing a SAR, and must provide the requester with the relevant information within 40 calendar days. When the GDPR comes into force next year, data controllers will normally not be entitled to charge a fee, irrespective of the inconvenience, and will be expected to provide the information within a shorter timeframe of one month.
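The current 40-calendar-day deadline is simple date arithmetic. As a purely illustrative sketch (the function name is my own, not from any official source):

```python
from datetime import date, timedelta

def sar_deadline_dpa(received: date) -> date:
    """Deadline for responding to a SAR under the Data Protection Act 1998:
    40 calendar days from receipt of the request (and any required fee)."""
    return received + timedelta(days=40)

# A SAR received on 7 July 2017 must be answered by 16 August 2017.
print(sar_deadline_dpa(date(2017, 7, 7)))  # 2017-08-16
```

Under the GDPR the window shrinks and, unlike a fixed day count, runs for one calendar month, so a simple `timedelta` no longer models it exactly.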

However, the ICO has revised its guidance on dealing with SARs to prepare controllers for compliance in light of the Court of Appeal’s judgements on a string of cases in which SARs were made alongside ongoing or threatened litigation. In the opinion of numerous legal commentators, these cases highlight the potential for widespread abuse of SARs to redress grievances outside the purview of data protection law.

The three key changes to the ICO’s Code

  1. Scope for assessing ‘disproportionate effort’

The DPA includes an exemption from having to respond to SARs if this would involve ‘disproportionate effort’ for the data controller. Whereas the Code previously indicated that a refusal to provide information on the grounds of it being difficult is unacceptable, it now, with greater lenience, states: “there is scope for assessing whether, in the circumstances of a particular case, supplying a copy of the requested information in permanent form would result in so much work or expense as to outweigh the requester’s right of access to their personal data.” The ICO expects controllers to evaluate the benefits to the data subject as a result of the SAR against the difficulties in complying with the request, and assess whether the scope of the request is reasonable.

  2. Dialogue between controller and requester

The ICO now advises controllers to enter into dialogue with data subjects following a SAR. This may allow the requester to specify which information they require, thereby refining the request and making the process more manageable and less likely to result in disproportionate effort. The Code goes on to explain that the ICO will take into account both the controller’s and the subject’s willingness to participate in this dialogue if it receives a complaint about the handling of a SAR.

  3. Information management systems and redaction of third-party data

The ICO now expects controllers to have information management systems in which personal information, including archived or backed-up data, can be found quickly in anticipation of a SAR. The system should also allow for the redaction of third-party data. This is important, since certain SARs may be declined if the information requested would in some way result in the disclosure of personal information about another living person.

For more information, see the four Court of Appeal decisions that informed the ICO’s revised guidance: Dawson-Damer v Taylor Wessing LLP, Ittihadieh v 5-11 Cheyne Gardens, Deer v Oxford University and Holyoake v Candy.

Harry Smithson 7th July 2017

Largest cyber attack on Parliament to date prompts fears of major national security compromise

International Trade Secretary Liam Fox stressed that “it’s a warning to everybody, whether they are in Parliament or elsewhere that they need to do everything possible to maintain their own cyber security.”

 

MPs took to Twitter and social media to notify their constituents as their parliamentary email accounts were besieged by a concerted 12-hour ‘brute force’ cyberattack targeting ‘weak passwords.’

On Friday night, parliamentary officials “discovered unauthorised attempts to access accounts of parliamentary networks users.” In response, remote access to the network was cut off, meaning that MPs and aides could not access their official email accounts outside of Westminster.

Parliamentary officials have been working with the National Cyber Security Centre, part of intelligence agency GCHQ, to investigate the attempted breach and assess the potential compromise to national security.

The NCSC’s latest statement on their website, as of Saturday, reads:

The NCSC is aware of an incident and is working around the clock with the UK Parliamentary digital security team to understand what has happened and advise on the necessary mitigating actions.

It is still unclear whether the attempts were successful, or whether any confidential information in the network has been acquired. Moreover, MPs and cyber specialists can only speculate as to the identity of the cyber-attackers.

However, whether those responsible for the attack are foreign ‘state actors’ or organised criminals, the compromise to the confidentiality of private or personal information and national security details is a major risk. Security advisors have warned that the parliamentary email network is a ‘treasure trove’ of information not only for blackmailers, but also for hostile states, crime syndicates and terrorist organisations.

Many Twitter users following the story have been quick to link this attempted breach to Russian state agencies (some using the hashtag #russia), citing interference in European and American elections, as well as the cyberattack on the German Bundestag in 2015, as prior examples of similar assaults on democratic institutions. However, the relatively rudimentary nature of the ‘brute force’ attempted password hacks on Parliament on Friday contrasts, for instance, with the sophisticated attempt to install remote data monitoring software onto the German state’s computer systems two years ago, which German authorities blamed on Russian agents.

While government sources have stated that it is too early to draw conclusions regarding the fallout of the event or the perpetrators, MPs have acknowledged the extent of the threat posed by cybercrime. Tory MP for NW Leicestershire, Andrew Bridgen, stated, “if people thought our emails were not secure it would seriously undermine our constituents’ confidence and trust in approaching their MP at a time of crisis.”

Referencing the ‘WannaCry’ attack on 48 NHS hospitals only a month ago, International Trade Secretary Liam Fox said it was ‘no surprise’ that Parliament would face hacking attempts given the recent attack on our public services.

In the Queen’s Speech last week, the government outlined plans to improve data protection with a new Data Protection Bill, but this did not provide details of plans to counter threats of large-scale hacking or cybercrime at home or abroad.

The government indicated, however, that they hoped the new law would help them to collaborate with former EU partners and international allies in order to confront threats to global security, threats in which cyber-conflict plays an increasingly prominent role. It may well be that these measures are following up the government’s statement in 2015 in the National Security Strategy that cyber-attacks from both organised crime and foreign intelligence agencies are one of the “most significant risks to UK interests.”

GDPR Re-Permissioning needs careful planning

Morrisons becomes the latest high-profile company fined for breaking Privacy and Electronic Communications Regulations (PECR)

The ICO, the independent authority responsible for investigating breaches of data protection law, has fined the fourth-largest supermarket chain in the UK £10,500 for sending 130,671 unsolicited marketing emails to its customers.

These customers had explicitly opted out of receiving marketing emails related to their Morrisons ‘More’ loyalty card when they signed up to the scheme. In October and November 2016, Morrisons used the email addresses associated with these loyalty cards to promote various deals. This contravenes the rules on the misuse of personal information, which stipulate that individuals must give consent to receive personal ‘direct’ marketing via email.

‘Service emails’ versus ‘Marketing emails’

While the emails’ subject heading was ‘Your Account Details,’ the customers were told that by changing the marketing preferences on their loyalty card account, they could receive money off coupons, extra More Points and the company’s latest news.

The subject heading might suggest to the recipient that they are ‘service emails,’ which are defined under the Data Protection Act 1998 (DPA) as any email an organisation has a legal obligation to send, or an email without which an individual would be disadvantaged (for instance, a reminder for a booked train departure). But there is a fine line between a service email and a marketing email: if an email contains any brand promotion or advertising content whatsoever, it is deemed the latter under the DPA. Emails that ask for clarification on marketing preferences are still marketing emails and a misuse of personal contact data.

Morrisons explained to the ICO that the recipients of these emails had opted in to marketing related to online groceries, but opted out of marketing related to their loyalty cards; the emails had therefore ostensibly been sent to clarify marketing preferences, while also including promotional content. However, Morrisons could not provide evidence that these customers had consented to receiving this type of email, and the company was duly fined – although in cases such as this it is often the losses from reputational damage that businesses fear more.

Fines and reputational damage

This comes just three months after the ICO confirmed fines – for almost identical breaches of PECR – of £13,000 and £70,000 for Honda and Exeter-based airline Flybe respectively. Whereas Honda could not prove that 289,790 customers had given consent to direct e-marketing, Flybe disregarded 3.3 million addressees’ explicit wishes to not receive marketing emails.

Even a fine of £70,000 – which can currently be subject to a 20% early payment discount – for sending out emails to existing customers with some roundabout promotional content will seem charitable when the General Data Protection Regulation (GDPR) supersedes the PECR and DPA regime in 2018. Under the new regulation, misuse of data, including illegal marketing, risks a fine of up to €20 million or 4% of annual global turnover, whichever is higher.
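The scale of the new penalty regime is easy to illustrate. As a rough sketch (the function below is purely illustrative, not legal advice), the upper-tier GDPR cap is the greater of €20 million or 4% of annual global turnover:

```python
def gdpr_max_fine(annual_global_turnover_eur: float) -> float:
    """Upper-tier cap on GDPR administrative fines: the greater of
    EUR 20 million or 4% of annual global turnover."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# A company turning over EUR 1 billion faces a cap of EUR 40 million;
# a firm turning over EUR 100 million still faces the EUR 20 million floor.
print(gdpr_max_fine(1_000_000_000))  # 40000000.0
print(gdpr_max_fine(100_000_000))    # 20000000.0
```

Either way, the ceiling dwarfs the five-figure PECR fines handed to Morrisons, Honda and Flybe.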

The ICO has acknowledged Honda’s belief that their emails were a means of helping their firm remain compliant with data protection law, and that the authority “recognises that companies will be reviewing how they obtain customer consent for marketing to comply with stronger data protection legislation coming into force in May 2018.”

These three cases are forewarnings of the imminent rise in stakes for not marketing in compliance with data protection law. The GDPR, an EU regulation that will demand British businesses’ compliance irrespective of Brexit, not only massively increases the monetary penalty for non-compliance, but also demands greater accountability to individuals with regard to the use and storage of their personal data.

The regulator’s recent actions show that companies will not be able to cut legal corners by relying on ambiguity between general service emails and implicitly promotional ones. With the GDPR coming into force next year, marketing departments will need to find the time and resources to prepare for compliance with data protection regulations.

Harry Smithson, 22nd June 2017

Queen’s Speech Confirms New Bill to Replace Data Protection Act 1998

As part of several measures aimed at “making our country safer and more united,” a new Data Protection Bill has been announced in the Queen’s Speech.

The Bill, which follows up proposals in the Conservative manifesto ahead of the election in June, is designed to make the UK’s data protection framework “suitable for our new digital age, allowing citizens to better control their data.”

The intentions behind the Bill are to:

  • Give people more rights over the use and storage of their personal information. Social media platforms will be required to delete data gathered about people prior to them turning 18. The ‘right to be forgotten’ is enshrined in the Bill’s requirement of organisations to delete an individual’s data on request or when there are “no longer legitimate grounds for retaining it.”
  • Implement the EU’s General Data Protection Regulation, and the new Directive which applies to law enforcement data processing. This meets the UK’s obligations to international law enforcement during its time as an EU member state and provides the UK with a system to share data internationally after Brexit is finalised.
  • Update the powers and sanctions available to the Information Commissioner.
  • Strengthen the UK’s competitive position in technological innovation and digital markets by providing a safe framework for data sharing and a robust personal data protection regime.
  • Ensure that police and judicial authorities can continue to exchange information “with international partners in the fight against terrorism and other serious crimes.”

Ultimately, the Bill seeks to modernise the UK’s data protection regime and to secure British citizens’ ability to control the processing and application of their personal information. The Queen’s Speech expressed the Government’s concern not only over law enforcement, but also over the digital economy: over 70% of all trade in services is enabled by data flows, making data protection critical to international trade, and in 2015 the digital sector contributed £118 billion to the economy and employed over 1.4 million people across the UK.

Written by Harry Smithson, 22nd June 2017

Phishing … Christmas, a time for taking?

There I was, at my desk on Monday morning, preoccupied with getting everything done before the Christmas break, and doing about three things at once (or trying to). An email hit my inbox with the subject “your account information has been changed”. Because I regularly update all my passwords, I’m used to these kinds of emails arriving from different companies – sometimes to remind me that I’ve logged in on this or that device, or to tell me that my password has been changed and to check that I was the person who actually changed it.

As I hadn’t updated any passwords for a couple of days, I was rather intrigued to see who had sent the email, and I immediately opened it. It was from Apple, saying I’d added a rescue email to my Apple ID.

apple-email

Well that sounded wrong, so I clicked on the link to ‘Verify Now’ and was taken to a page that looked pretty legitimate.

apple-email-link

 

I thought I should see what was actually going on, so I logged in to my Apple ID using my previous password.  If I had been in any doubt, the fact that it accepted my out-of-date password made it very clear that this was a scam.

The site asked me to continue inputting my data. At the top of the pages were my name and address details. It also, for the first time, told me that my account was suspended – a classic hacker’s trick to get you worried and filling in information too quickly to think about what you’re actually doing.

apple-verify-1

Then the site starts to request credit card details and bank details …

apple-verify-2

And finally my date of birth so they can steal my identity, and a mobile number so that they can send me scam texts.

apple-verify-3

I know seven other people who received exactly the same email. And it’s just too easy to fall for, so any number of people could be waking up tomorrow with their identity stolen, and bank account and credit cards stripped of all money or credit.

With that in mind, here are some things to look out for in phishy (see what I did there) emails:

  1. Check the email address the email came from. If it looks wrong – it probably is!
  2. Hover your mouse over the links in the email to see where they take you. If this email had really been from Apple, the links would have gone to an https:// address on Apple’s own domain.
  3. Check for grammatical errors in the text of the email.
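The second check – inspecting where a link actually points – can be automated. Below is a minimal, illustrative Python sketch; the function name and example URLs are my own, and a real phishing filter would also have to handle lookalike and punycode domains:

```python
from urllib.parse import urlparse

def link_looks_legitimate(url: str, expected_domain: str) -> bool:
    """Rough screen for a suspicious link: the URL must use HTTPS and the
    hostname must be the expected domain or a subdomain of it."""
    parsed = urlparse(url)
    host = (parsed.hostname or "").lower()
    return parsed.scheme == "https" and (
        host == expected_domain or host.endswith("." + expected_domain)
    )

print(link_looks_legitimate("https://id.apple.com/verify", "apple.com"))            # True
print(link_looks_legitimate("http://apple.com-verify.example/login", "apple.com"))  # False
```

Note how `apple.com-verify.example` fails the check even though it starts with “apple.com” – exactly the sort of trick a hover-check (or this sketch) exposes.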

If I’m completely honest, I’m shocked at how closely this email and website resembled the real Apple versions. If you do fall for an email as well executed as this, make sure you notify your bank and credit card companies immediately. Change all of your passwords as soon as possible, because if you use the same login combination for any other accounts, those could be targeted next.

Christmas has always been a time for giving.  Now it’s become the prime time for taking.


 

Written by Charlotte Seymour, 22nd December 2016

RSPCA and British Heart Foundation Fined


So it’s getting closer and closer to Christmas – a time for giving, with more and more charity adverts on the TV, on the radio, on social media – in fact, pretty much everywhere you look. Although Christmas can be a bit tight on the purse strings, thousands of people still give to their favourite charities.

Whether you’re helping children, refugees or animals, or funding cancer or medical research, these organisations all promise that the money goes to a good cause. Unless this ‘good cause’ is paying an ICO fine…?

Two of the major charities we all know and love are the RSPCA and the British Heart Foundation. Both have been under investigation for secretly screening their donors with the aim of targeting those with more money – a process known as “wealth screening”.

The two organisations hired wealth management companies, who pieced together information on their donors from publicly available sources to build data on their income, property value and even friendship circles. This allowed a massive pool of donor data to be created and sold.

The RSPCA and BHF were part of a scheme called Reciprocate where they could share and swap data with other charities to find prospective donors. Donors to both charities were given an opt-out option.

Information included in the scheme was people’s names, addresses, dates of birth and the value and date of their last donation. The ICO ruled that the charities didn’t provide a clear enough explanation for consumers to make an educated decision about what they were signing up for, and that donors had therefore not given their consent.

The RSPCA has admitted that it was not aware which charities it was sharing its data with. It also became clear that the charity shared the data of donors who had opted out.

The BHF insists it had all the correct permissions. However, the ICO disagrees, on the basis that the charities with which it was sharing the data did not support similar causes.

The ICO has fined the RSPCA £25,000 and the British Heart Foundation £18,000. Ironically, the BHF was praised for its data handling by the ICO in June this year, and it is likely to appeal the fine.

In my opinion, the whole thing is a mess. I like to give to charity when I can – which, if I’m honest, isn’t as often as I’d like.

However when you hear of debacles like this, it really does put you off. I want my money to go to a good cause. I don’t want my data being shared without my knowledge so that other charities can investigate how much I earn, whether I own my property and what social circles I move in, and then decide whether I’m worth targeting. Surely these charities should be thankful for every single donation. The widow’s mite springs to mind.

I feel for the poor animals and souls that rely on these charities, which are surely going to take a hit from these fines. It’s not their fault, yet no doubt they are the ones who will pay the price.


 

Written by Charlotte Seymour, 8th December 2016.