Monthly Archives: July 2017

Data Protection Weekly Round-up: PECR breaches, ransomware research and Facebook on security

Two large corporations fined for PECR breaches; Google study reveals ransomware profits, and Facebook urges people-led changes to security methodology

In the blog below, you'll note how the Information Commissioner's Office is taking a hard-line approach to PECR. If an organisation uses electronic channels to re-permission its database in time for GDPR enforcement in May 2018, it must comply with PECR. Moneysupermarket.com is the latest in a series of big names to fall foul of email regulations.

You'll also see an analysis of ransomware profitability, which helps explain its continued growth; the final story summarises Facebook's views on data security.

The ICO issues fines amounting to £160,000 for Provident Personal Credit and Moneysupermarket.com

The Information Commissioner's Office has issued civil monetary penalties of £80,000 each to Provident Personal Credit, a Bradford-based sub-prime lender, and Moneysupermarket.com, a leading brand comparison site, on the 17th and 20th of July respectively. In both cases the fine was for breaching the Privacy and Electronic Communications Regulations (PECR).


Unsolicited texts annoy prospects and customers

Quick-loan credit firm Provident Personal Credit, a brand operated by Provident Financial, was fined £80,000 for sending out nearly 1 million nuisance text messages in the space of 6 months.

The company employed a third party affiliate to send the unsolicited marketing for loans provided by a sister brand, Satsuma Loans.

Under PECR, marketing text messages may not be sent to recipients who have not consented to receive them, so this activity was in breach of the regulations.


Beware of sending “service” emails which are actually “marketing” emails

A few days later, the price and brand comparison website Moneysupermarket.com was fined for sending 7.1 million emails over 10 days updating customers with its Terms and Conditions, despite these customers having explicitly opted-out of receiving this type of email. This offence is almost identical to the breaches for which Morrison’s, Honda and Flybe were fined last month.

One of the key problems was the section “Preference Centre Update” which said: “We hold an e-mail address for you which means we could be sending you personalised news, products and promotions. You’ve told us in the past you prefer not to receive these. If you’d like to reconsider, simply click the following link to start receiving our e-mails.”

In a previous blog, we explained the ambiguity between 'service' emails and 'marketing' emails: sending marketing content, even implicitly, to individuals who have opted out is a breach of the regulations (which will only get stricter once the General Data Protection Regulation comes into force in May 2018).
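
To make the distinction concrete, here is a minimal, purely illustrative sketch of suppressing opted-out contacts before a marketing send – the safeguard missing in the cases above. The field names and helper are hypothetical, not any particular email platform's API:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Contact:
        email: str
        marketing_opt_in: bool   # explicit consent recorded at sign-up; False means opted out

    def marketing_send_list(contacts: List[Contact]) -> List[str]:
        """Return only the addresses with a positive, recorded opt-in.

        Anyone who has opted out (or never opted in) is suppressed, however the
        message is framed - 'service' wording does not change its status if the
        content is promotional.
        """
        return [c.email for c in contacts if c.marketing_opt_in]

    contacts = [
        Contact("a@example.com", True),
        Contact("b@example.com", False),   # opted out - must not receive marketing content
    ]
    print(marketing_send_list(contacts))   # ['a@example.com']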

Google research leads to fears of proliferating ransomware


Ransomware encrypts victims' computer files, which will not be decrypted until a ransom is paid

Research carried out by Elie Bursztein, Kylie McRoberts and Luca Invernizzi from Google has found that cyber-thieves have made $25m (£19m) in the last two years through the use of ransomware. The research suggests that this type of malware regularly makes more than $1m (£761,500) for its creators.

The two strains of ransomware that have seen the most success are 'Locky' and 'Cerber', which have collected $7.8m (£5.9m) and $6.9m (£5.2m) respectively. But fears have arisen that, given the profitability of ransomware, new and more expansive variants will emerge amid the increasingly competitive, aggressive and "fast-moving" market for cybercrime weaponry. Mr Bursztein warns that 'SamSam' and 'Spora' are variants that seem to be gaining traction.

The research collected reports from victims of ransomware but also from an experiment wherein thousands of ‘synthetic’ virtual victims were created online. Mr Bursztein and his colleagues then monitored the network traffic generated by these fake victims to study the movement of money. More than 95% of Bitcoin payments (the preferred currency for ransom payments) were cashed out via Russia’s BTC-e exchange.

The lucrative nature of ransomware has led the Google researchers to conclude that it is “here to stay” and may well proliferate among the many syndicates and crime networks around the world. At a talk at the Black Hat conference, one of the world’s largest information security events, Mr Bursztein warned, “it’s no longer a game reserved for tech-savvy criminals, it’s for almost anyone.”

Facebook’s security boss argues that the industry should change its approach


Hitting the data security balance: user issues vs. tech solutions

In a talk at this year's Black Hat, Facebook's Chief Information Security Officer, Alex Stamos, criticised the information security industry for prioritising technology over people.

Advocating a ‘people-centric’ approach to information security, Mr Stamos stated his belief that most security professionals were too focused on complex ‘stunt’ hacks involving large corporations and state organisations, and tended to ignore problems that the majority of technology users face.

He told the attendees, “we have perfected the art of finding problems without fixing real-world issues. We focus too much on complexity, not harm.”

He explained that most Facebook users are not being targeted by spies or nation states, and that their loss of control over their information stems from simple causes with simple solutions in which, he claims, the security industry takes no interest. He criticised the industry in general for lacking 'empathy' with less tech-savvy people, citing the thought often expressed by security professionals that there would be fewer breaches and data losses if people were perfect.

He cited the widespread criticism from cyber experts that the security team at Facebook subsidiary WhatsApp faced over its decision to use 'end-to-end' encryption in the popular messaging app, a move some characterised as sacrificing security for the sake of usability. No such sacrifice materialised, but Mr Stamos was keen to emphasise that it simply had not occurred to security experts that usability was worth pursuing.

Mr Stamos advocated diversifying the industry by working with less technically minded people who can empathise with the imperfections of technology users, thus helping to develop more straightforward tools and services that would benefit a larger number of people.

Facebook has also committed half a million dollars to fund a new project to secure election campaigns from cyber-attack. The initiative will be run by the Belfer Center for Science and International Affairs, a think-tank affiliated with Harvard University. This is timely, given the scandals around the cyber-attack on French President Emmanuel Macron's recent election campaign, and the Russian hack of the Democratic National Committee during the US elections last year.

If you have any data privacy compliance, governance or security concerns which you’d like to discuss with Data Compliant, please email dc@datacompliant.co.uk.

Harry Smithson   20th July 2017

GDPR – ICO Puts Trust at the Heart of Data Processing


Information Commissioner’s Annual Report

The Information Commissioner's Office (ICO) published its annual report on 13th July. It is the first annual report compiled by Information Commissioner Elizabeth Denham, who took up the post a year ago.

The report highlights the increased powers and expanding caseload and capacities of the regulator. At a time of increasing concern about the use (and abuse) of personal information, the ICO is seeing a great deal more work. This is, in part, reflected by an increase in staff numbers of around 8% year on year.

GDPR and Public Trust

The ICO's foreword emphasises its commitment to regaining public trust in data controllers and processors. It is hoped that changing laws provide the regulator with an opportunity to enable individuals to trust in large organisations handling personal information. The Commissioner states that "trust" will be "at the heart of what the Information Commissioner's Office will do in the next four years." Confidence in the digital economy is a consideration that the regulator acknowledges and aims to encourage, especially since the digital sector is growing 30% faster than any other part of the economy.

This echoes the government’s concerns regarding the digital economy and its relation to data protection principles that were enumerated in the Queen’s Speech and addressed by several measures including a Data Protection Bill, which is designed to implement the General Data Protection Regulation (GDPR).

In a year characterised by the impending replacement of the Data Protection Act 1998 (DPA) with the GDPR in May 2018, the report’s outline of major work undertaken leads with a nod to the many public, private and third sector organisations that will be preparing for the new legislative framework.

Consent

'Consent', which has become one of the watchwords of the GDPR (and a word that will increasingly be found on the bulletin boards and coffee mugs of marketing departments), will soon take on a stricter legal definition – a change for which the ICO anticipates organisations will seek detailed guidance.

Data Breaches

But the GDPR has by no means eclipsed the ICO's other responsibilities. Nuisance calls, unsolicited marketing and data sharing have routinely seen organisations facing fines and other civil measures. Breaches of the DPA and the Privacy and Electronic Communications Regulations 2003 (PECR) by a number of charities – allegations of which the Daily Mail first reported in 2015 – have led the ICO to issue 13 civil monetary penalties to the value of £181,000.

Indeed, some companies – Honda (which we reported on last month) being a clear example – have been fined for unsolicited marketing in breach of PECR over emails asking customers to clarify their marketing preferences, which Honda maintained were a means of preparing for the GDPR. So while the ICO has committed a great deal of resources to GDPR preparation, it has by no means neglected upholding the current law, and has consistently made clear that it is not acceptable to break one law in preparation for another.

Monetary penalties

Overall, the ICO issued more civil monetary penalties for breaches of PECR than ever before (23), to the value of £1,923,000. It also issued 16 fines for serious breaches of data protection principles, totalling £1,624,500. It cannot be stated enough that after May 2018 these figures could skyrocket if organisations do not find ways of complying with the new, more expansive and rigorous legislation. Criminal prosecutions have seen a 267% increase, and the ICO received 18,300 data protection concerns – 2,000 more than last year.

Subject Access Requests (SARs)

Data controllers and organisations handling a wide range of personal data may see increasing numbers of Subject Access Requests (SARs). The report states that 42% of all concerns brought to the ICO where the nature was specified related to subject access. While these requests for data are provided for under the DPA (and will be upheld with more rigour as one of the data subject 'rights' under the GDPR) rather than freedom of information legislation, it nonetheless falls upon organisations of whatever size to be co-operative and compliant when the disclosure of information is required. It is important for organisations to train their staff to recognise a SAR and act promptly. Data controllers must recognise the importance of compliance not only with the law but with ICO audits and investigations, as well as the necessity for efficient and conscientious data handling.

For information about how DC can help you meet the requirements of GDPR, please email dc@datacompliant.co.uk.

Harry Smithson, July 25th 2017

GDPR and Data Protection Impact Assessments (DPIAs)


When are they needed?  How are they done?

Next year, under the GDPR, Privacy Impact Assessments (PIAs) will become known as Data Protection Impact Assessments (DPIAs) and, in certain cases, will be mandatory rather than merely recommended.

The ICO currently describes PIAs as “a tool which can help organisations identify the most effective way to comply with their data protection obligations and meet individuals’ expectations of privacy.”

While DPIAs will soon be a legal requirement, data controllers should continue to embrace them as an opportunity to avert heavy fines, reputational damage and the wider risks of data breaches from an early stage in any planned operation.

When will a DPIA be legally required?

Organisations will be required to carry out a DPIA when data processing is "likely to result in a high risk to the rights and freedoms of individuals." This may be during an existing project or before a planned project involving data processing that carries a risk to the rights of individuals as provided by the Data Protection Act. DPIAs can also range in scope, depending on the organisation and the scale of its project.

DPIAs will therefore be required when an organisation is planning an operation that could affect anyone's right to privacy: broadly speaking, anyone's right 'to be left alone.' DPIAs are primarily designed to allow organisations to avoid breaching an individual's freedom to "control, edit, manage or delete information about themselves and to decide how and to what extent such information is communicated to others." If there is a risk of any such breach, a DPIA must be carried out.

Listed below are examples of projects, varying in scale, in which the current PIA is advised – and it is safe to assume all of these examples will necessitate a DPIA after the GDPR comes into force:

  • A new IT system for storing and accessing personal data.
  • A new use of technology such as an app.
  • A data sharing initiative where two or more organisations (even if they are part of the same group company) seek to pool or link sets of personal data.
  • A proposal to identify people in a particular group or demographic and initiate a course of action.
  • Processing quantities of sensitive personal data
  • Using existing data for a new and unexpected or more intrusive purpose.
  • A new surveillance system (especially one which monitors members of the public) or the application of new technology to an existing system (for example adding Automatic number plate recognition capabilities to existing CCTV).
  • A new database which consolidates information held by separate parts of an organisation.
  • Legislation, policy or strategies which will impact on privacy through the collection or use of information, or through surveillance or other monitoring.

How is a DPIA carried out?

There are 7 main steps that comprise a DPIA:

  1. Identify the need for a DPIA

This will mainly involve answering 'screening questions' at an early stage in a project's development to identify the potential impacts on individuals' privacy (a purely illustrative screening checklist is sketched after the seven steps below). The project team should begin to think about how to address these issues, consulting stakeholders as they go.

  2. Describe the information flows

Explain how information will be obtained, used and retained. This part of the process can identify the potential for – and help to avoid – 'function creep': data ending up being used in ways that were never intended or foreseen.

  3. Identify the privacy and related risks

Compile a record of the risks to individuals in terms of possible intrusions of data privacy, as well as risks to the organisation such as regulatory action, reputational damage and loss of public trust. This involves a compliance check against the Data Protection Act and the GDPR.

  4. Identify and evaluate the privacy solutions

With the record of risks ready, devise a number of solutions to eliminate or minimise these risks, and evaluate the costs and benefits of each approach. Consider the overall impact of each privacy solution.

  5. Sign off and record the DPIA outcomes

Obtain appropriate sign-offs and acknowledgements throughout the organisation. A report based on the findings and conclusions of the prior steps of the DPIA should be published and accessible for consultation throughout the project.

  6. Integrate the outcomes into the project plan

Ensure that the DPIA's outcomes are integrated into the overall project plan. The DPIA should remain an integral reference point throughout the development and execution of the project.

  7. Consult with internal and external stakeholders as needed throughout the process

This is not a 'step' as such, but an ongoing commitment to be transparent about the process of carrying out the DPIA and to remain open to consultation and to the expertise and knowledge of the organisation's various stakeholders – from colleagues to customers. The ICO explains, "data protection risks are more likely to remain unmitigated on projects which have not involved discussions with the people building a system or carrying out procedures."
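
By way of illustration, the 'screening questions' mentioned in step 1 could be captured in something as simple as the sketch below. The questions and the yes/no threshold here are hypothetical, not the ICO's official wording:

    from typing import Dict, List

    # Hypothetical screening questions for illustration only.
    SCREENING_QUESTIONS: List[str] = [
        "Will the project collect new personal data about individuals?",
        "Will personal data be shared with other organisations?",
        "Will sensitive personal data (e.g. health or financial) be processed?",
        "Will individuals be monitored or tracked (e.g. CCTV, location data)?",
        "Will data be used for a purpose it was not originally collected for?",
    ]

    def dpia_likely_needed(answers: Dict[str, bool]) -> bool:
        """A 'yes' to any screening question flags the project for a full DPIA."""
        return any(answers.get(question, False) for question in SCREENING_QUESTIONS)

    answers = {SCREENING_QUESTIONS[4]: True}   # re-using data for an unexpected purpose
    print(dpia_likely_needed(answers))         # True - escalate to a full assessment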

DPIAs – what are the benefits?

There are real benefits for organisations that conduct DPIAs, not least the cost savings that come from knowing the risks before work starts:

  • cost benefits from adopting a Privacy by Design approach: knowing the risks before starting work allows issues to be fixed early, resulting in reduced development costs and fewer delays to the schedule
  • risk mitigation in relation to fines and loss of sales caused by lack of customer and/or shareholder confidence
  • reputational benefits and trust building from being seen to consider and embed privacy issues into a programme’s design from the outset

For more information about DPIAs and how Data Compliant can help, please email dc@datacompliant.co.uk.

Harry Smithson   20th July 2017

ICO updates Subject Access Requests (SARs) advice for data controllers following Court of Appeal decisions

The Information Commissioner’s Office (ICO) has updated its ‘Code of Practice on Subject Access Requests’ chiefly in response to several Court of Appeal decisions made earlier this year related to SARs. Under the Data Protection Act 1998, individuals (‘data subjects’) may request access to their personal information held by a ‘data controller.’

These requests for information are called SARs, and can range from a request for specific or limited information to a request for the entirety of the information held, including why it is held and to whom it may have been disclosed. The scope of a data controller's obligations will therefore vary from case to case, and will be particularly burdensome for large organisations. Currently, data controllers may charge a fee of up to £10 for processing a SAR, and must provide the requester with the relevant information within 40 calendar days. When the GDPR comes into force next year, data controllers will normally not be entitled to charge a fee, irrespective of the inconvenience, and will be expected to provide the information within a shorter timeframe of one month.

However, the ICO has revised its guidance on dealing with SARs to prepare controllers for compliance in light of the Court of Appeal's judgements on a string of cases in which SARs were made alongside ongoing or threatened litigation – cases which, in the opinion of numerous legal commentators, highlight the potential for widespread abuse of SARs to pursue grievances outside the purview of data protection law.

The three key changes to the ICO’s Code

  1. Scope for assessing ‘disproportionate effort’

The DPA includes an exemption from having to respond to SARs if this would involve ‘disproportionate effort’ for the data controller. Whereas the Code previously indicated that a refusal to provide information on the grounds of it being difficult is unacceptable, it now, with greater lenience, states: “there is scope for assessing whether, in the circumstances of a particular case, supplying a copy of the requested information in permanent form would result in so much work or expense as to outweigh the requester’s right of access to their personal data.” The ICO expects controllers to evaluate the benefits to the data subject as a result of the SAR against the difficulties in complying with the request, and assess whether the scope of the request is reasonable.

  2. Dialogue between controller and requester

The ICO now advises controllers to enter into dialogue with data subjects following a SAR. This may allow the requester to specify which information they require, thereby refining the request and making the process more manageable and less likely to result in disproportionate effort. The Code goes on to explain that the ICO will take into account both the controller's and the subject's willingness to participate in this dialogue if it receives a complaint about the handling of a SAR.

  3. Information management systems and redaction of third-party data

The ICO now expects controllers to have information management systems in which personal information, including archived or backed-up data, can be located quickly in anticipation of a SAR. Moreover, the information management system should allow for the redaction of third-party data. This is important, since certain SARs may be declined if the information requested would in some way result in the disclosure of personal information about another living person.
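
Purely as an illustration of that third point, here is a minimal sketch of redacting identified third-party details from a record before it is disclosed in response to a SAR. The helper function and the sample record are hypothetical, not any particular system:

    import re
    from typing import List

    def redact_third_parties(text: str, third_party_terms: List[str]) -> str:
        """Replace each identified third-party detail (name, email address, etc.)
        with a marker before the record is released to the requester."""
        for term in third_party_terms:
            text = re.sub(re.escape(term), "[REDACTED - third party]", text, flags=re.IGNORECASE)
        return text

    record = "Meeting note: Jane Doe (jane.doe@example.com) raised the requester's complaint."
    print(redact_third_parties(record, ["Jane Doe", "jane.doe@example.com"]))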

For more information, have a look at the four Court of Appeal decisions that informed the ICO's revised guidance: Dawson-Damer v Taylor Wessing LLP, Ittihadieh v 5-11 Cheyne Gardens, Deer v Oxford University, and Holyoake v Candy.

Harry Smithson 7th July 2017

Data Compliant's Weekly Roundup: NHS Trust breaches DPA, Wetherspoons deletes entire customer email database, & more

The ICO censures NHS Trust for breaching data protection law

In light of the Royal Free NHS Trust's mishandling of 1.6 million patients' information for research and innovation purposes, the Information Commissioner's Office (ICO) has asked the Trust to sign a four-point undertaking to ensure future compliance with data protection law. This requires the Trust to:

  • establish a proper legal basis under the Data Protection Act for the Google DeepMind project and for any future trials;
  • set out how it will comply with its duty of confidence to patients in any future trial involving personal data;
  • complete a privacy impact assessment, including specific steps to ensure transparency; and
  • commission an audit of the trial, the results of which will be shared with the Information Commissioner.

The Royal Free worked with DeepMind, Google's recently acquired artificial intelligence company, to develop an 'alert, detection and diagnosis system.' The hospital provided details of 1.6 million patients to DeepMind with a view to creating an app called 'Streams' that helps doctors detect patients at risk of acute kidney injury (AKI). The patients had not given their consent to this, and indeed did not know that their information was being shared in this way.

DeepMind's sophisticated, 'self-taught' artificial intelligence software processes and finds solutions to problems that would otherwise require massive amounts of human learning and labour.

Elizabeth Denham, the Information Commissioner, stated:

“There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights.”

Wetherspoons notifies email subscribers that they will be ceasing email correspondence

In a measure anticipating the General Data Protection Regulation (GDPR) coming into force next year, the popular pub chain J. D. Wetherspoons will be deleting its entire customer email database.

This may also be in response to a data breach in 2015 in which over 650,000 of the company’s customer email addresses were affected.

This 'unexpected' news came as Wetherspoons customers received an email on the 23rd of last month explaining:

“I’m writing to inform you that we will no longer be sending our monthly customer newsletters by e-mail.

Many companies use e-mail to promote themselves, but we don’t want to take this approach – which many consider intrusive.

Our database of customers’ email addresses, including yours, will be securely deleted.”

Last month on this blog, we reported the fines the ICO issued to Morrison's, Flybe and Honda for illegal marketing offences. It would seem that these high-profile cases are beginning to influence how major companies deal not only with the imminent tightening of regulations under the GDPR, but also with increasing public awareness of data protection law.

The NCC Group has reported that the fines issued by the ICO in 2016 would have amounted to £69m, rather than the actual £880,500, had the GDPR been in force. This may help explain Wetherspoons' decision.

A Wetherspoons spokesperson told Wired.com:

“We felt, on balance, that we would rather not hold even email addresses for customers. The less customer information we have, which now is almost none, then the less risk associated with data.”

The Government Digital Service (GDS) makes users change passwords after security breach

The GDS website, which allows registered users to find data published by government departments and agencies, public bodies and authorities, has asked its users to change their passwords after a publicly accessible database of usernames and emails was discovered during a security scan.

The Information Commissioner’s Office has been notified but is yet to make an official statement on the matter.

A GDS spokeswoman told the BBC that only data.gov.uk accounts were compromised, not accounts associated with any other government website. She went on to explain that only email addresses, usernames and 'hashed' passwords – passwords scrambled by a one-way function – had become accessible, not personal information such as names and addresses. Hashed passwords are far less useful to cyber-criminals than plain-text ones.
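
As an aside on what 'hashed' means in practice, the sketch below (illustrative only – GDS has not published its implementation) shows one-way password hashing: a site stores only the salt and digest, so a leaked table does not reveal the passwords themselves, although weak passwords can still be guessed offline.

    import hashlib
    import hmac
    import os
    from typing import Optional, Tuple

    def hash_password(password: str, salt: Optional[bytes] = None) -> Tuple[bytes, bytes]:
        """Derive a one-way hash; the original password cannot be read back from it."""
        salt = salt if salt is not None else os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt, digest

    def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
        """Re-derive the hash from the supplied password and compare in constant time."""
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, digest)

    salt, stored = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", salt, stored))   # True
    print(verify_password("wrong guess", salt, stored))                    # False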

However, as a precaution, registered users will have to change their password when they next log in, and have been advised to change their password on any other website or service where they use the same one as on data.gov.uk.

There is no evidence yet to suggest that this breach has been exploited in any way.

Cyber-criminals often send emails to victims of a data breach, masquerading as service emails, in order to tease out more information, so web users ought to be careful in these circumstances and look out for phishing emails in their inboxes.

The ICO announces fines issued to 13 charities for failing to follow data protection rules

This week the ICO announced fines that it issued in April, after a series of investigations dating back to 2015 uncovered the misuse of personal information by 13 charities.

The ICO's online statement highlights fines ranging from £6,000 for Oxfam, for processing information about people that had not been provided with their consent, to £18,000 for the International Fund for Animal Welfare, for numerous breaches of data protection law including the sharing of donor information with other charities.

Government department changes name: DCMS becomes DDCMS

The government has announced that the former Department for Culture, Media and Sport (DCMS) will now be the Department for Digital, Culture, Media and Sport (DDCMS). This reflects the government’s commitment to preparing for an increasingly digitalised world.

However, the department will still be known as the DCMS in all communications.

A government statement outlines the change:

“In a move that acknowledges the way the Department’s remit has evolved, the Prime Minister and Culture Secretary Karen Bradley have agreed a departmental name change. The Department will continue to be referred to as DCMS in all communications, but is now the department for Digital, Culture, Media and Sport.”


Harry Smithson 7th July 2017