Tag Archives: data protection

42 days and counting…

With the Brexit countdown clock still ticking, all has gone (relatively) quiet on the Brexit front. Parliament is not sitting and won’t be back until 14th October, but this has not stopped politicians and commentators on all sides of the debate from reiterating their deeply-held positions.

Behind the scenes, it is reported, there is a great deal of shuttle-diplomacy taking place. Both the Prime Minister and his chief negotiator, David Frost, have become frequent passengers on the Eurostar as they dash between London, Brussels and the other capitals of Europe.  Yet the details of the discussions are still far from clear.

The “non-paper”

On Thursday it emerged that the Government has issued a “non-paper” to the EU outlining some thoughts on how an acceptable Brexit deal could be achieved.  “Non-paper” is a particularly bizarre EU term for written proposals that have no formal status. At the risk of sounding like something from Alice Through the Looking-Glass, it is a paper that is not a “paper”.

The details contained in the non-paper are unlikely to be officially released. But, if past experience is anything to go by, non-papers tend to see the light of day through unofficial and unattributable leaks.

The Law’s Delay

With little information to go on it is perhaps unsurprising that attention has turned to the other burning issue in UK politics – the judgment of the Supreme Court on the lawfulness of the decision to prorogue Parliament. Whilst this is not a Brexit issue in itself, the claimants in the two cases before the Court clearly suspect that Parliament was suspended in order to prevent scrutiny of the Brexit negotiations.

At the time of writing the judges are still out and the judgment is yet to be issued. Whatever the decision, it is clearly going to have an impact on the course of the Brexit countdown.

With attention focussed on legal matters it is perhaps worthwhile spending a little time looking at an often misunderstood aspect of data protection law, specifically the legal basis for processing data. 

On what legal basis can companies process personal data?

The collection and processing of personal data must first and foremost be lawful under the GDPR and the Data Protection Act 2018.  There are six legal grounds for processing and one of them MUST apply.  They are summarised below in no particular order:

  • Consent – the person has given consent for one or more specific purpose(s) (e.g. for consumer electronic marketing purposes)
  • Contract – the processing is necessary for the performance of a contract to which the data subject is a party, or to take steps at the data subject’s request before entering into a contract (e.g. for employee, client or third-party contracts)
  • Legal obligation – the processing is necessary for compliance with a legal obligation (e.g. reporting to HMRC)
  • Vital interests – the processing is necessary to protect the vital interests of the data subject or of another person (e.g. medical records in the case of an accident)
  • Legitimate interests – the processing is necessary for the purposes of the legitimate interests of the data controller, except where such interests are overridden by the interests or fundamental rights and freedoms of the individual (a Legitimate Interests Assessment must take place, e.g. for some direct marketing purposes)
  • Public interest – the processing is necessary for a task carried out in the public interest or in the exercise of official authority vested in the controller
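In practice, Article 30 GDPR requires organisations to document each processing purpose against its lawful basis. As a minimal illustrative sketch (the register structure and helper names here are invented, not from any specific compliance tool), a record-of-processing register might refuse to accept a purpose with no recognised basis:

```python
from enum import Enum

class LawfulBasis(Enum):
    """The six Article 6 GDPR lawful bases summarised above."""
    CONSENT = "consent"
    CONTRACT = "contract"
    LEGAL_OBLIGATION = "legal obligation"
    VITAL_INTERESTS = "vital interests"
    LEGITIMATE_INTERESTS = "legitimate interests"
    PUBLIC_TASK = "public interest / official authority"

def record_purpose(register, purpose, basis):
    """Record a processing purpose against exactly one lawful basis.

    A purpose with no valid Article 6 basis cannot lawfully be
    processed, so this helper refuses to record it.
    """
    if not isinstance(basis, LawfulBasis):
        raise ValueError(f"no valid lawful basis given for {purpose!r}")
    register[purpose] = basis
    return register
```

A register built this way makes the “one of them MUST apply” rule mechanical: every purpose in the register carries its basis, and anything without one is rejected at the point of entry.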

Please feel free to contact us if you have any queries or concerns about how Brexit will affect your business, by calling 01787 277742 or email teambrexit@datacompliant.co.uk

Gareth Evans, 20th September 2019

Data Breaches in Cloud Computing

The cloud computing economy is expected to grow to $191 billion by 2020, an increase of $100 billion in five years, according to the analysts at Forrester. After Monday’s mega-leak, Ecuadorians may be a little hesitant to embrace this secular shift to cloud computing.

The advantages of cloud services for storage and productivity are well-documented, but cloud servers come with several serious security risks.

High-profile breaches of cloud platforms at Evernote, Adobe, Slack and LastPass over the last few years have led to extra scrutiny of cloud computing from a security perspective, as these online databases are more and more relied upon for storing sensitive data.

Outrage as cloud platform leaks Ecuadorians’ personal and financial data

This massive data breach was made possible by an unsecured Elasticsearch server hosted on AWS.  It was discovered on 16th September and caused outrage throughout the Andean state.

Roughly twenty million people, including 6.7 million children, were affected – nearly the entire population. Even the President of Ecuador was affected, as was Julian Assange, who was given a ‘cedula,’ or national ID number, during his stay at the Ecuadorean embassy in London.

Collectively, the information was described by one journalist as being “as valuable as gold in the hands of criminal gangs.”

The scale and detail of the 18GB cache of personal information exposed by the leaky server was such that the researchers were actually able to reconstruct entire family trees.

The types of personal and confidential information available on the database included:

  • names;
  • national ID numbers;
  • dates of birth;
  • places of birth;
  • home addresses;
  • genders;
  • phone numbers;
  • family and marriage records;
  • education and work records;
  • financial information including tax records.

It is not known whether any malicious actors took advantage of the leaky server before it was plugged by Ecuador’s computer emergency response team shortly after the discovery.

How did the breach happen?

A local data analytics company, Novaestrat, held vast amounts of Ecuadorian data on an Elasticsearch server which had no password protection, allowing anyone access.
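As an aside for anyone auditing their own estate: an Elasticsearch node with no security enabled will answer an anonymous request to its root endpoint with cluster metadata, whereas a node requiring authentication returns HTTP 401. A minimal sketch of classifying such a response (the category labels are ours, and you should only probe servers you are authorised to test):

```python
import json

def classify_elasticsearch_response(status_code, body_text):
    """Classify the response to an anonymous GET / on an Elasticsearch node.

    An unsecured node answers with HTTP 200 and JSON cluster metadata;
    a node with authentication enabled returns HTTP 401.
    """
    if status_code == 401:
        return "secured"  # credentials required
    if status_code == 200:
        try:
            body = json.loads(body_text)
        except ValueError:
            return "unknown"
        if isinstance(body, dict) and ("cluster_name" in body or "tagline" in body):
            return "exposed"  # anonymous access to cluster metadata
    return "unknown"
```

Feeding this function the status code and body from a library such as `requests` gives a quick first check that a data store is not open to the world in the way Novaestrat’s was.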

Though there is no evidence that the government’s database was hacked or breached by Novaestrat, these revelations led to the swift arrest of the company’s executive and a full investigation into how the company came to possess the data it held.

Novaestrat was awarded several government contracts by the former political regime, so it is likely that these were the reason the company gained access to the personal data.

Plans for Data Protection Law

This breach has prompted Ecuador’s Ministry of Telecommunications to speed up the process of passing a new data privacy law, intended to match rising international standards of data protection (for example, the GDPR).

Why Data Retention and Deletion Schedules are vital

There is a clear lesson here for both data controllers and data processors: make sure that you have robust data retention and deletion schedules in place.

Data controllers

1. Make sure your data processors are legally obliged to delete the data:
   a) once the purpose of the data sharing has been met, and/or
   b) according to your own retention and deletion policies
2. Demand evidence that the deletion has taken place
3. Exercise your audit rights

Data processors

1. Ensure that you have procedures in place to enable you to meet the requirements of your data processor agreement
2. Ensure you have a robust mechanism for the destruction of the data
3. Be prepared to provide evidence of the destruction
4. Consider backup files as well as live
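A retention schedule only helps if it is actually enforced. As a rough sketch (the categories and retention periods below are invented for illustration, not legal guidance), a controller might periodically sweep its records against the schedule, flagging anything overdue for deletion and anything whose category is unknown for review:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: record category -> retention period in days
RETENTION_SCHEDULE = {
    "marketing_consent_records": 365 * 2,
    "employee_payroll": 365 * 6,
    "recruitment_unsuccessful": 180,
}

def records_due_for_deletion(records, today=None):
    """Return (due, review) for a list of records.

    Each record is a dict with a 'category' and a 'collected' date.
    Records past their retention period go into `due`; records whose
    category is missing from the schedule are flagged for review
    rather than silently kept.
    """
    today = today or date.today()
    due, review = [], []
    for rec in records:
        days = RETENTION_SCHEDULE.get(rec["category"])
        if days is None:
            review.append(rec)
        elif today - rec["collected"] > timedelta(days=days):
            due.append(rec)
    return due, review
```

Running a sweep like this on live *and* backup stores, and keeping the output as evidence, supports both the controller’s audit rights and the processor’s duty to evidence destruction.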

 

  

If you have any questions about data retention and deletion policies or data processor agreements, please contact us via email team@datacompliant.co.uk or call 01787 277742

Brexit: 55 Days to Go, or is it?

It has been a momentous week for UK politics. With Parliament back from the summer recess, MPs moved to seize the Order Paper from the Government. There then followed an audacious move to legislate against a “No Deal” Brexit, a move which would hamstring the Government’s Brexit negotiation strategy. The Government’s strenuous attempts to prevent the passage of legislation taking the “No Deal” option out of the equation led to the withdrawal of the whip from (in effect the suspension of) 21 Conservative MPs.  The legislation that would prevent a No Deal outcome will return to Parliament early next week.  It remains to be seen whether it receives Royal Assent and is written into law.

In the meantime, the plan to prorogue Parliament for five weeks ahead of the Brexit deadline moved on apace despite a number of legal challenges.

Elsewhere 

This week also saw the Prime Minister’s own brother, Jo Johnson, resign as Universities Minister registering his objection to the direction of the Brexit negotiations which he viewed as no longer in the national interest. 

On a happier note Downing Street announced the arrival of a new resident as the Prime Minister and his partner unveiled Dilyn their new puppy.

The Prime Minister is keen to reinforce his Government’s resolve to achieve Brexit, with or without a deal by the 31st October deadline. But despite his vigorous defence of this policy it remains unclear whether this will be achieved.

Implications for businesses

Amongst all this political turmoil it is difficult for businesses to plan ahead especially if the business model includes data transfers to or from EEA companies.

In the case of a No Deal Brexit, on the date of departure the UK becomes a ‘Third Country’ under EU data transfer rules.  This means that the UK will not have adequacy status, so companies need to take particular steps when processing EEA* data – for example, the data of your customers, prospects or clients.

What you need to do

The UK will recognise all EEA countries as adequate under UK law.  So there are no issues with you continuing to send personal data to the EEA. 

The reverse, however, is not the case so there will be major changes when transferring personal data from the EEA to the UK.  You need to prepare:

  1. “Know your data” – specifically the data you process about EEA individuals.  Make sure your data mapping is up to date and identifies those individuals outside the UK but within the EEA.
  2. Take appropriate GDPR safeguards for processing and transfers of EEA data, and update your privacy policy accordingly.
  3. Use Standard Contractual Clauses to enable transfers of personal data from the EEA to the UK and vice versa.
  4. If you are using Binding Corporate Rules, these will need to be adjusted slightly post-Brexit.

*EEA = the 27 remaining EU Member States, plus Iceland, Liechtenstein and Norway.

Please feel free to contact us if you have any queries or concerns about how Brexit will affect your business, by calling 01787 277742 or email teambrexit@datacompliant.co.uk

Countdown to Brexit… 69 days to go

The new Parliamentary session starts on 3rd September. Inevitably the session will, once again, be dominated by Brexit. With so little time between the start of the session and the Brexit deadline of Hallowe’en (31st October), little Parliamentary time will be given over to any issues other than the terms of the UK’s exit from the EU. Parliamentary time is limited further by the Party Conference season, with an additional recess between 14th September and 9th October.

The Conservative Party Conference runs from 29th September to 2nd October in Manchester.  Members of Cabinet will be expected to attend and no doubt their speeches from the platform and on the fringe will be scrutinised for new policy initiatives and especially the direction of policy post Brexit. 

Over the summer the political agenda was dominated by the possibility of a “No Deal” Brexit, with MPs from all parties floating a variety of plans for how such an eventuality could be prevented. Prime Minister Johnson has been resolute in his belief that the No Deal option cannot be removed from the table.

Data Protection Implications

The new Prime Minister wasted no time in assembling his new Cabinet, making his intentions very clear by appointing, with few exceptions, long-standing Brexit supporters. Notable among the exceptions were Amber Rudd, reappointed to the Work & Pensions brief she has held since November 2018, and Nicky Morgan, who assumes a Cabinet role as Secretary of State for Digital, Culture, Media and Sport. The latter appointment is of particular interest because the brief includes Data Protection regulation and writing the “UK GDPR” into UK law.

When the UK exits the EU, as is planned, the EU GDPR will no longer be applicable in the UK (although the Data Protection Act 2018, which references the GDPR, will still apply). The UK government intends to write the GDPR into UK law, with changes to tailor it for the UK. The government has already published the ‘Keeling Schedule’ for the GDPR, which shows the planned amendments. It can be found here: http://bit.ly/2Nsy9sw

The amendments primarily relate to references to the European Parliament, EU Member States, and the EU Commission.

What Next?

Deal or No Deal, on the exit date the UK will become a ‘third country’ (to use the jargon).  It has been suggested that there will be a period of at least two years of negotiations to finalise the full terms of the divorce arrangements.  During this time the UK Government will continue to allow transfers to the EU.  This will be kept under review by the new Secretary of State.  Watch this space!

Gareth Evans 23.08.2019

Framework for EU-US data flows under scrutiny as ‘Schrems II’ case takes place at the CJEU

For those unfamiliar with the Schrems saga, a brief catch-up may be required. The original case, now known as ‘Schrems I,’ involved an Austrian activist, Max Schrems, filing a complaint with the Irish data protection authority against Facebook. The complaint was that Facebook had allowed US authorities to access his personal data on social media in violation of EU data protection law. This case ultimately found its way to the Court of Justice of the European Union (CJEU) and resulted in the invalidation of the ‘Safe Harbor Framework,’ which was the framework companies relied on to transfer data from the EU to the US. This is largely because legislation in the United States does not place adequate limits on what data authorities may access.

With the Safe Harbor Framework invalidated, the Irish DPA asked Max Schrems to reformulate the case. On 9th July, ‘Schrems II’ was heard at the CJEU in Luxembourg. This case took aim at the EU Standard Contractual Clauses (SCCs), which Facebook has been relying on to legitimise its international data flows. Advocates for Schrems also called for the invalidation of the EU-US Privacy Shield, arguing that it provides inadequate protection and privacy to data subjects.

The hearing included many supporters of SCCs, who emphasised the role of DPAs in enforcing SCCs and suspending data flows where necessary and appropriate. The CJEU is unlikely to reach a decision until early 2020, but with the two remaining frameworks for legitimate EU-US data flows under such heavy scrutiny, data protection practitioners should be preparing for the impact these potential invalidations would have on their clients’ or their companies’ data flows.

Harry Smithson, July 2019

University data protection policies under scrutiny as report finds threats of cyber attacks

A report published by the Higher Education Policy Institute and conducted by Jisc, a digital infrastructure provider for HE, has emphasised the expanding risk of cyberattacks facing UK universities and academic institutions in general. Last year saw a 17% increase in attacks and breaches on the year before, and the trend is likely to continue. Cyberattacks will increase not only in frequency but also in sophistication.

It is common knowledge that the higher education sector is expanding massively as more and more young people at home and abroad become students in the UK. On top of this, universities have become increasingly involved in cyber security research, making these institutions ever more desirable targets for, in the report’s words, “organised criminals and some unscrupulous nation states.” According to separate research conducted by VMware, 36% of universities believe that a successful cyberattack on their research data would pose a risk to national security.

The report (titled “How safe is your data? Cyber-security in higher education”) begins by relating a couple of everyday scenarios in academia in which cyberattacks can easily occur. These scenarios include a Distributed Denial of Service (DDoS) attack on a student using a Virtual Learning Environment (VLE); and a ransomware infection affecting a university’s digital infrastructure after a member of staff visits a website containing malicious code.

Threats such as these compound the sector’s somewhat underreported history of data protection challenges (to put it lightly). Thousands of records, many containing special category data (prior to the GDPR, ‘sensitive personal data’), have been breached across a host of institutions throughout 2017 and 2018. A whistle-stop tour of these incidents might include the University of East Anglia’s email scandal in which a spreadsheet containing health records connected to essay extensions was leaked to hundreds of students; the University of Greenwich receiving a £120,000 fine for holding data on an unsecured server; and Oxford and Cambridge research papers being stolen and sold on Farsi language websites.

To understand the extent of the vulnerability that the HE sector’s data protection policies and practices have demonstrated, one need only look at Jisc’s penetration tests of an array of institutions’ resilience to ‘spear-phishing,’ an attack in which a specific individual is targeted with requests for information (often an email using the name of a senior member of staff, requesting, for example, gift voucher purchases or the review of an attached document containing malware). 100% of Jisc’s attempts to use spear-phishing to gain access to data or find cyber vulnerabilities were successful.

Data protection policies come hand in hand with cyber security. Vast amounts of information are stored and used in university research projects, containing data relating not only to students and faculty, but to many external individuals and third parties. Robust data protection policy, including appropriate training for staff and regular risk assessments that analyse cybersecurity penetrability, is vital to reduce the risk of phishing and vulnerability to breaches and hackers.

As the report concludes, “It is imperative that those in higher education continually assess and improve their security capability and for higher education leaders to take the lead in managing cyber risk to protect students, staff and valuable research data from the growing risk of attack.”

Harry Smithson, June 2019

European Commission reports awareness throughout Europe of data rights and data protection

The Special Eurobarometer 487a report on GDPR conducted by survey and data insight consultancy Kantar at the request of the European Commission has been published this month. Where relevant, the report’s findings are compared to findings from the Special Eurobarometer 431 on Data Protection conducted in 2015.

The salient finding is that two-thirds of Europeans have heard of the General Data Protection Regulation (GDPR). Moreover, a clear majority are aware of most of the rights guaranteed by GDPR and nearly six out of ten Europeans know of a national authority tasked with protecting their data and responding to breaches.

The level of general awareness of GDPR varies across the EU, ranging from nine in ten respondents in Sweden to just over four in ten in France (44%).

The sixty-eight page report contains detailed comparative data on European attitudes toward the Internet, social media, online purchasing, data security and other GDPR-related phenomena.

Social Media

Despite a general increase in data awareness, the proportion of social network users in Europe who responded affirmatively to the question, ‘have you ever tried to change the privacy settings of your personal profile from the default settings on an online social network?’ has decreased by four percentage points (from 60% in 2015 to 56% in 2019). Trust in social media giants among Europeans, therefore, seems to remain stable.

Interestingly, while UK internet-users are by some way the most likely in Europe to regularly purchase online (64%, followed by the Dutch and Swedish on 50%), they are also among the most likely to ‘never’ use social networks (one in five), following only the Czech Republic (21%) and France (28%). Might this not place under scrutiny the common assumption of a significantly positive correlation between marketing on social media and online sales? While online purchasing has remained stable since 2015, use of social media has expanded significantly, by 15%.

Privacy Statements

For anyone working on privacy statements, or considering reworking the ones they have, the report’s findings on this subject may be useful. Only 13% of EU28 internet users fully read privacy statements. 47% of respondents said they read them ‘partially,’ while 37% do not read them at all. These figures are fairly consistent across all demographics and member states.

Perhaps unsurprisingly, it is the length of privacy statements that is the main reason respondents give for not fully reading them (at 66% of respondents, who could choose multiple reasons in the survey). In the UK this is higher than average at 75%, in line with the finding that high rates of internet usage correlate with people finding things too long to read.

Length of privacy statement is followed by finding them unclear or difficult to understand (31%), the sufficiency of a privacy statement existing on the website at all (17%), the belief that the law will protect them in any case (15%), the statement ‘isn’t important’ (11%), distrust in the website honouring the statement (10%, although this has fallen by 5% since 2015), and finally ‘other’ and not knowing where to find them (5% each).

Websites do seem to have improved the clarity or wording of their privacy statements slightly over the last four years, given the mild reduction (7%) in Europeans claiming the statements are difficult to read. Respondents in the UK are among the least likely in Europe (at 19%) to find privacy statements unclear, on a par with Croatians and just below Latvians (at 15%).

Concern over control of data

The report shows there is still more that can be done by organisations and even data protection authorities wanting to build confidence among people providing information online. More than six in ten Europeans are concerned about not having complete control over the information they provide online. Indeed, 16% responded that they were ‘very concerned’. The British and the Irish are among the most concerned, either ‘very’ or ‘fairly’, at 73% and 75% respectively.

Overall, there has been a mild decrease across Europe of respondents expressing concern over control of their data, with significant decreases of up to 20% in Eastern Europe. Five countries show minor increases of concern, the highest being France and Cyprus with 5%.

Conclusions

Respondents are not only broadly aware of the rights guaranteed under GDPR, but many have begun to exercise them. Nearly a quarter of Europeans (24%) have exercised the right to object to direct marketing. While awareness of the right to have a say when decisions are automated remains relatively low (41%), this proportion is likely to increase.

As the report states, ‘the GDPR regulation is now more important than ever – almost all respondents use the Internet (84%), with three quarters doing so daily.’ Organisations have the opportunity to pave the way for greater confidence and trust in online activities involving consumers’ data.

Harry Smithson, 28th June 2019

Belgian Data Protection Authority’s first GDPR fine imposed on public official 

The Belgian DPA delivered a strong message on 28th May 2019, that data protection is “everyone’s concern” and everyone’s responsibility, by premiering the GDPR’s sanctioning provision in Belgium with a fine of €2,000 imposed on a mayor (‘bourgmestre’) for the illegal utilisation of personal data. 

Purpose Limitation was Breached 

The mayor in question used personal data obtained for the purposes of mayoral operations in an election campaign, in breach of GDPR, particularly the purpose limitation principle, which states that data controllers and/or processors must only collect personal data for a specific, explicit and legitimate purpose. Given the fairly moderate fine, the data the mayor obtained is unlikely to have contained special category data (formerly known in the UK as sensitive personal data). The Belgian DPA also looked at other factors when deciding on the severity of the sanction, including the limited number of affected data subjects, and the nature, gravity and duration of the infringement.

‘Not Compatible with Initial Purpose’ 

The Belgian DPA received a complaint from the affected data subjects themselves, whose consent to the processing of their data was based on the assumption it would be used appropriately, in this case for administrative mayoral duties. The plaintiffs and the defendant were heard by the DPA’s Litigation Chamber, which concluded along GDPR lines that ‘the personal data initially collected was not compatible with the purpose for which the data was further used by the mayor.’

The decision was signed off by the relatively new Belgian commissioner, David Stevens, as well as the Director of the Litigation Chamber, a Data Protection Authority chamber independent from the rest of the Belgian judicial system. 

Harry Smithson, 3rd June 2019

GDPR’s 1st Birthday

General Data Protection Regulation reaches its first birthday

This blogpost arrives as the General Data Protection Regulation (GDPR) reaches its first birthday, and a week after a report from the Washington-based Center for Data Innovation (CDI) suggested amendments to the GDPR.

The report argues that regulatory relaxations would help foster Europe’s ‘Algorithmic Economy,’ purporting that the GDPR’s restrictions on data sharing herald setbacks for European competitiveness in the AI technology field.

Citing the European Commission’s ambition “for Europe to become the world-leading region for developing and deploying cutting-edge, ethical and secure AI,” the report then proceeds to its central claim that “the GDPR, in its current form, puts Europe’s future competitiveness at risk.”  

That being said, the report notes with approval France’s pro-AI strategy within the GDPR framework, in particular the country’s use of the clause that “grants them the authority to repurpose and share personal data in sectors that are strategic to the public interest—including health care, defense, the environment, and transport.”

Research is still being conducted into the legal and ethical dimensions of AI and the potential ramifications of automated decision-making on data subjects. In the UK, the ICO and the government’s recent advisory board, the Centre for Data Ethics and Innovation (CDEI – not to be confused with aforementioned CDI), are opening discussions and conducting call-outs for evidence regarding individuals’ or organisations’ experiences with AI. There are of course responsible ways of using AI, and organisations hoping to make the best of this new technology have the opportunity to shape the future of Europe’s innovative, but ethical use of data. 

The Information Commissioner’s Office (ICO) research fellows and technology policy advisors release short brief for combatting Artificial Intelligence (AI) security risks 

The ICO’s relatively young AI auditing framework blog (set up in March this year) discusses security risks in its latest post, using comparative examples of data protection threats in traditional technology and AI systems. The post focuses “on the way AI can adversely affect security by making known risks worse and more challenging to control.”

From a data protection perspective, the main issue with AI is its complexity and the volume not only of data, but of externalities or ‘external dependencies’ that AI requires to function, particularly in the AI subfield Machine Learning (ML). Externalities take the form of third-party or open-source software used for building ML systems, or third-party consultants or suppliers who use their own or a partially externally dependent ML system. The ML systems themselves may have over a hundred external dependencies, including code libraries, software or even hardware, and their effectiveness will be determined by their interaction with multiple data sets from a huge variety of sources.

 Organisations will not have contracts with these third parties, making data flows throughout the supply chain difficult to track and data security hard to keep on top of. AI developers come from a wide array of backgrounds, and there is no unified or coherent policy for data protection within the AI engineering community. 

The ICO’s AI auditing blog uses the example of an organisation hiring a recruitment company who use Machine Learning to match candidate CVs to job vacancies. A certain amount of personal data would have been transferred between the organisation and the recruitment agency using manual methods. However, additional steps in the ML system will mean that data will be stored and transferred in different formats across different servers and systems. They conclude, “for both the recruitment firm and employers, this will increase the risk of a data breach, including unauthorised processing, loss, destruction and damage.”

 For example, they write: 

  • The employer may need to copy HR and recruitment data into a separate database system to interrogate and select the data relevant to the vacancies the recruitment firm is working on. 
  • The selected data subsets will need to be saved and exported into files, and then transferred to the recruitment firm in compressed form. 
  • Upon receipt the recruitment firm could upload the files to a remote location, e.g. the cloud. 
  • Once in the cloud, the files may be loaded into a programming environment to be cleaned and used in building the AI system. 
  • Once ready, the data is likely to be saved into a new file to be used at a later time. 

This example will be relevant to all organisations contracting external ML services, which is the predominant method for UK businesses hoping to harness the benefits of AI. The blog provides three main pieces of advice based on ongoing research into this new, wide (and widening) area of data security. They suggest that organisations should:

  • Record and document the movement and storing of personal data, noting when the transfer took place, the sender and recipient, and the respective locations and formats. This will help monitor risks to security and data breaches; 
  • Intermediate files, such as compressed versions, should be deleted when no longer required – as per best-practice data protection guidelines; and 
  • Use de-identification and anonymization techniques and technologies before they are taken from the source and shared either internally or externally. 
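The first of those suggestions, recording and documenting the movement of personal data, can be sketched very simply. The function below is an illustrative example of ours (not from the ICO post): it appends an audit entry capturing when a transfer took place, the sender and recipient, and the format and destination, and adds a content hash so that an intermediate copy can later be matched to the original or shown to have been deleted.

```python
import hashlib
from datetime import datetime, timezone

def log_transfer(log, dataset_path, sender, recipient, data_format, location):
    """Append an audit entry recording a personal-data transfer.

    Captures when the transfer took place, the sender and recipient,
    and the respective format and destination, as the ICO post
    suggests. A SHA-256 digest of the file's contents supports later
    verification of intermediate copies.
    """
    with open(dataset_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "sender": sender,
        "recipient": recipient,
        "format": data_format,
        "location": location,
        "sha256": digest,
    }
    log.append(entry)
    return entry
```

In the recruitment example above, both the employer and the recruitment firm would log an entry at each step – export, compression, transfer, cloud upload – so the full chain of custody for the candidate data can be reconstructed.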

Harry Smithson,  May 2019

What is a Data Protection Officer (DPO), and do you need one?

A DPO (Data Protection Officer) is an individual responsible for ensuring that their organisation processes the data of its staff, customers, providers and any other individuals (i.e. data subjects) in compliance with data protection regulations. Under the EU-wide General Data Protection Regulation (GDPR), a DPO is mandatory for:

  1. Public authorities; and
  2. Organisations that process data:
  • On a large scale; 
  • With regular and systematic monitoring; 
  • As a core activity; or 
  • In large volumes of ‘special category data,’ formerly known as ‘sensitive personal data,’ i.e. information related to a living individual’s racial or ethnic origin, political opinions, religious beliefs, trade union membership, physical or mental health condition, sex life or sexual orientation, or biometric data.  

It may not be immediately obvious whether an organisation must have a designated DPO under GDPR. Where this is unclear, it is necessary to make a formal evaluation, recording the decision and the reasons behind it. The WP29 Guidelines on Data Protection Officers ('DPOs'), endorsed by the European Data Protection Board (EDPB), recommend that organisations conduct and document an internal analysis to determine whether or not a DPO should be appointed. Ultimately, such decision-making should always take into account the organisation's obligation to fulfil the rights of the data subject, the primary concern of the GDPR: does the scale, volume or type of data processing in your organisation risk adversely affecting an individual or the wider public?

What if a DPO is not legally required?

Organisations may benefit from voluntarily appointing an internal DPO or hiring an external advisor – this can help embed best-practice data protection policies and practices, improving cyber security, staff and consumer trust, and bringing other business benefits. When a DPO is designated voluntarily, the appointment is treated as if it were mandatory under GDPR – i.e. the voluntarily appointed DPO's responsibilities, as defined in Articles 37 to 39 of the GDPR, correspond to those of a legally mandated DPO (in other words, GDPR does not recognise a quasi-DPO with reduced responsibility). As the WP29 guidance explains, "if an organisation is not legally required to designate a DPO, and does not wish to designate a DPO on a voluntary basis, that organisation is quite at liberty to employ staff or outside consultants to provide information and advice relating to the protection of personal data."

However, it is important to ensure that there is no confusion regarding their title, status, position and tasks. Therefore, it should be made clear, in any communications within the company, as well as with data protection authorities, data subjects, and the public at large, that the title of this individual or consultant is not a data protection officer (DPO).

But how are the conditions that make a DPO mandatory defined under GDPR?

Large-scale processing: there is no absolute definition under GDPR, but the WP29 guidance suggests data controllers should consider:

  • The number of data subjects concerned;
  • The volume of data processed;
  • The range of data items being processed;
  • The duration or permanence of the data processing activity; and
  • The geographical extent.

Regular and systematic monitoring: as with ‘large-scale processing,’ there is no definition as such, but WP29 guidance clarifies that monitoring involves any form of tracking or profiling on the internet, including for the purposes of behavioural advertising. Here are a number of examples of regular and systematic monitoring:

  • Data-driven marketing activities;
  • Profiling and scoring for purposes of risk assessment;
  • Email retargeting;
  • Location tracking (e.g. by mobile apps); or
  • Loyalty programmes.

What does a Data Protection Officer do?

Article 39 of the GDPR, 'Tasks of the data protection officer,' lists the DPO's obligations. As a minimum, a DPO's tasks are those summarised below:

  1. Inform and advise the controller or the processor, and the employees who carry out processing, of their obligations under the Regulation
  2. Monitor compliance with the Regulation, with other Union or Member State data protection provisions and with the policies of the controller or processor in relation to the protection of personal data, including the assignment of responsibilities, awareness-raising and training of staff involved in processing operations, and the related audits
  3. Provide advice, where requested, as regards data protection impact assessments and monitor their performance
  4. Cooperate with the supervisory authority
  5. Act as the contact point for the supervisory authority on issues relating to data processing

 

Harry Smithson, 2019