
Data sharing under GDPR: What you need to know

In this blog, we’re going to explain how the DPA, UK GDPR and EU GDPR affect the way you process and share personal data. First, here’s a quick intro to the terms used to describe the different roles under data protection law:

  • Data controller: a person or organisation who either alone, or jointly, or in common with other persons, determines the purposes for which and the manner in which any personal data are, or are to be, processed.
  • Data processor: a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller.
  • Data subject: an individual to whom data relates in the context of data protection law; an individual with data protection rights.

Processing

Before you can think about sharing data in the first place, you need to ensure that any data you have (and potentially may wish to share) has been processed and stored lawfully.  The DPA and GDPR apply only to personal data, which is defined as ‘any information relating to an identified or identifiable natural person,’ i.e. a data subject.

Not all of the data you obtain will count as personal data. If data sets are anonymised and an individual can no longer be identified, then the GDPR will not apply, since the information no longer constitutes personal data.

Six Principles

The regulation defines six principles that must be followed when processing personal data. All personal data must:

  1. be processed lawfully, fairly and transparently
  2. be collected and used only for the original, specified purpose
  3. be minimised (i.e. only the personal data that is necessary is collected)
  4. be kept accurate and up to date
  5. be kept no longer than necessary
  6. be kept confidential and have its integrity maintained

Legal Basis for processing

You will also need to have a legal basis for processing personal data, of which there are six possible grounds.  These are not hierarchical – you use the legal basis that is appropriate.

  1. consent of the data subject
  2. necessary for the performance of a contract
  3. legal obligation placed upon controller
  4. necessary to protect the vital interests of the data subject
  5. necessary for a task carried out in the public interest or in the exercise of official authority
  6. legitimate interest pursued by controller

The grounds for processing cannot be retroactively adjusted or changed, i.e. you cannot choose to justify the processing or sharing of data in a different way after having done so. Data protection policies must be consistent and trustworthy, regardless of who you are.

Basic things to remember when sharing personal data

Sharing restrictions apply to personal data; they do not apply to data that has been effectively anonymised. Pseudonymised data is often used in healthcare notes, for example, and carries lower risk when shared without the key. But remember, the pseudonymisation key itself is personal data, so pseudonymised data is still personal data in the hands of anyone able to re-identify it.
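To make the distinction concrete, here is a minimal sketch in Python (the records, field names and tokens are invented purely for illustration) of pseudonymising a small data set before sharing: the rows that are shared carry only a random token, while the key table that maps tokens back to identities stays with the controller and remains personal data in its own right.

```python
import secrets

# Invented example records, for illustration only
records = [
    {"name": "Alice Example", "nhs_number": "943 476 5919", "diagnosis": "asthma"},
    {"name": "Bob Example", "nhs_number": "943 476 5870", "diagnosis": "diabetes"},
]

def pseudonymise(rows):
    """Replace direct identifiers with random tokens.

    Returns the pseudonymised rows (lower-risk to share) and the key
    table that maps tokens back to identities. The key table can
    re-identify every row, so it is still personal data and must be
    protected like the original records.
    """
    key_table = {}
    shared_rows = []
    for row in rows:
        token = secrets.token_hex(8)
        key_table[token] = {"name": row["name"], "nhs_number": row["nhs_number"]}
        shared_rows.append({"patient_id": token, "diagnosis": row["diagnosis"]})
    return shared_rows, key_table

shared, key = pseudonymise(records)
# `shared` contains no direct identifiers; `key` can re-identify every row,
# so it stays with the controller and is handled as personal data.
```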

With whom may I share personal data?

Examples of sharing personal data include sharing with:

  • a joint data controller (for joint purposes)
  • another data controller (a third party for their own use)
  • a data processor engaged to store or use data for you (for your purposes)

Before sharing personal data, you must ensure:

  • there is a good reason for the sharing to take place
  • the individuals have been clearly informed that their personal data is being shared, and the details of the sharing, including:
    • the details of the data to be shared
    • with whom it is to be shared
    • the purpose of the sharing
    • the legal basis for the sharing
    • for how long the data will be held
    • the mechanism by which they can give consent / opt out
  • the volume of personal data that needs to be shared is minimised
  • the availability of the information is also minimised, or the shared data exists for the minimum time
  • any parties processing the data must therefore have clearly stated retention and deletion policies
  • the sharing is secure
  • the sharing is documented

Contracts and Agreements

Where contracts or other data sharing agreements are required, it is wise to have a data sharing agreement framework which can be customised to suit your business needs. A Data Protection Officer (DPO) can help your team create the appropriate frameworks and develop bespoke data sharing agreements.

If you are sharing data with a country outside the UK or EU that has not been declared ‘adequate’ by the EU Commission, then the new EU standard contractual clauses (SCCs) should normally be used, with supplementary measures. These were updated in 2021 to meet the needs of the EU GDPR. The UK has also issued a new “Addendum” to enable these SCCs to be used for international transfers from the UK.

Each data sharing process must be considered on a case by case basis.  If in doubt consult your DPO and / or a specialist data protection lawyer.  And remember, it is important to stay up-to-date by following the latest guidance from a DPO and the relevant data protection authorities (the Information Commissioner’s Office for the UK).

Brexit: 55 Days to Go, or is it?

It has been a momentous week for UK politics. With Parliament back from the summer recess MPs moved to seize the Order Paper from Government. There then followed an audacious move to legislate against a “No Deal” Brexit, a move which would hamstring the Government’s Brexit negotiation strategy. The Government’s strenuous attempts to prevent the passage of legislation to take the “No Deal” option out of the equation led to the withdrawal of the whip (in effect the suspension) of 21 Conservative MPs.  The legislation that would prevent a No Deal outcome will return to Parliament early next week.  It remains to be seen whether the legislation achieves Royal Assent and is written into law.

In the meantime, the plan to prorogue Parliament for five weeks ahead of the Brexit deadline moved on apace despite a number of legal challenges.

Elsewhere 

This week also saw the Prime Minister’s own brother, Jo Johnson, resign as Universities Minister registering his objection to the direction of the Brexit negotiations which he viewed as no longer in the national interest. 

On a happier note Downing Street announced the arrival of a new resident as the Prime Minister and his partner unveiled Dilyn their new puppy.

The Prime Minister is keen to reinforce his Government’s resolve to achieve Brexit, with or without a deal by the 31st October deadline. But despite his vigorous defence of this policy it remains unclear whether this will be achieved.

Implications for businesses

Amongst all this political turmoil it is difficult for businesses to plan ahead especially if the business model includes data transfers to or from EEA companies.

In the case of a No Deal Brexit, on the date of departure the UK becomes a ‘Third Country’ in terms of EU data transfer rules. This means that the UK will not have adequacy status, so companies need to take particular steps when processing EEA* data – for example, the data of your customers, prospects or clients.

What you need to do

The UK will recognise all EEA countries as adequate under UK law.  So there are no issues with you continuing to send personal data to the EEA. 

The reverse, however, is not the case so there will be major changes when transferring personal data from the EEA to the UK.  You need to prepare:

  1. “Know your data” – specifically the data you process about EU individuals. Make sure your data mapping is up to date and identifies those individuals outside the UK, but within the EEA.
  2. Take appropriate GDPR safeguards for processing and transfers of EEA data, and update your privacy policy accordingly
  3. Use Standard Contractual Clauses to enable transfers of personal data from the EEA to the UK and vice versa.
  4.  If you are using Binding Corporate Rules, these will need to be adjusted slightly post-Brexit.

*EEA = The 27 EU Member States, plus Iceland, Liechtenstein and Norway.

Please feel free to contact us if you have any queries or concerns about how Brexit will affect your business, by calling 01787 277742 or email teambrexit@datacompliant.co.uk

Sweden issues first fine under GDPR for the use of facial recognition technology in a school

Previously on this blog, we discussed the UK Information Commissioner’s Office (ICO) investigation into the planned rollout of facial recognition software for a large site around King’s Cross in London. This investigation has renewed scrutiny of the technology among data protection observers, particularly in its relation to privacy rights.

Facial recognition technology for use in schools and on campuses has taken off in the United States and elsewhere, and there are even tech companies dedicated specifically to this section of the security industry. Amid understandable concerns of security at schools in the US, companies offer fairly comprehensive ‘biometric security platforms’ for schools, colleges and universities. Such services claim to identify unauthorised visitors, alert school personnel and secure campus events.

Despite the industry’s seemingly unstoppable uptake, Sweden’s Data Protection Authority (DPA) has issued its first monetary punitive measure to date for the use of this technology in a school. The DPA found a local authority to be in breach of the EU’s General Data Protection Regulation (GDPR), which the Swedish Riksdag adopted as the Data Protection Act in April last year.

The local authority, the Skellefteå municipality in the north, was trialling facial recognition on secondary school students for the purpose of tracking attendance. Pupils’ faces would be scanned and registered remotely as they entered the classroom. Consent from the parents of the twenty-two students who participated in the trial in autumn 2018 had been sought, but this was not deemed sufficient reason to collect the special category (biometric) data: the DPA saw no adequate reason for the municipality to process and control this sensitive and potentially risky data. They took into consideration the students’ privacy expectations, as well as the fact that there are many less intrusive means of automating or economising on attendance tracking. As stated clearly by GDPR, ‘personal data shall be adequate, relevant and not excessive in relation to the purpose or purposes for which they are processed.’

In February, the local authority had told SVT Nyheter, the state broadcaster, that teachers were spending 17,000 hours a year reporting attendance, which is how facial recognition came to the table as a time- and cost-effective replacement for human labour, as is so often the case with new tech.

Countdown to Brexit… 69 days to go

The new Parliamentary session starts on 3rd September. Inevitably the session will be, once again, dominated by Brexit. With so little time between the start of the session and the Brexit deadline of Hallowe’en (31st October) there will be little Parliamentary time given over to any issues other than the terms of the UK’s exit from the EU. Parliamentary time is limited further by the Party Conference season with a further recess between 14th September and 9th October.

The Conservative Party Conference runs from 29th September to 2nd October in Manchester.  Members of Cabinet will be expected to attend and no doubt their speeches from the platform and on the fringe will be scrutinised for new policy initiatives and especially the direction of policy post Brexit. 

Over the summer the political agenda was dominated by the possibility of a “No Deal” Brexit, with MPs from all parties floating a variety of plans for how such an eventuality could be prevented. Prime Minister Johnson has been resolute in his belief that the No Deal option cannot be removed from the table.

Data Protection Implications

The new Prime Minister wasted no time in assembling his new Cabinet, making his intentions very clear by appointing, with few exceptions, long-standing Brexit supporters. Notable among the exceptions were the appointment of Amber Rudd to the Work & Pensions brief she has held since November 2018 and Nicky Morgan who assumes a Cabinet role as Secretary of State for Digital, Culture, Media and Sport. This is of particular interest because the brief includes Data Protection regulation and writing the “UK GDPR” into UK law.

When the UK exits the EU, as is planned, the EU GDPR will no longer be applicable in the UK (although the Data Protection Act 2018, which references the GDPR, will still apply). The UK government intends to write the GDPR into UK law, with changes to tailor it for the UK. The government has already published the ‘Keeling Schedule’ for the GDPR, which shows the planned amendments. It can be found here: http://bit.ly/2Nsy9sw

The amendments primarily relate to references to the European Parliament, EU Member States, and the EU Commission.

What Next?

Deal or No Deal on the exit date, the UK will become a ‘third country’ (to use the jargon).  It has been suggested that there will be a period of at least 2 years of negotiations to finalise the full terms of the divorce arrangements.  During this time the UK Government will continue to allow transfers to the EU.  This will be kept under review by the new Secretary of State.  Watch this space!

Gareth Evans 23.08.2019

Facebook’s cryptocurrency Libra under scrutiny amid concerns of ‘data handling practices’

It would be giving the burgeoning cryptocurrency Libra short shrift to call it ambitious. Its aims as stated in the Libra Association’s white paper are lofty even by the rhetorical standards of Silicon Valley. If defining Libra as ‘the internet of money’ isn’t enough to convince you of the level of its aspiration, the paper boasts the currency’s ability to financially enfranchise the world’s 1.7 billion adults without access to traditional banking networks or the global financial system.

Like its crypto predecessors, Libra uses blockchain technology to remain decentralised and inclusive, enabling anyone with the ability to pick up a smartphone to participate in global financial networks. Distinguishing itself, however, from existing cryptocurrencies, Libra promises stability thanks to the backing of a reserve of ‘real assets,’ held by the Libra Reserve. There is also the added benefit, hypothetically, of Libra proving to be more energy efficient than cryptocurrencies such as Bitcoin because there will be no ‘proof of work’ mechanism such as Bitcoin mining, which requires more and more electricity as the currency inflates.

So far, so Zuckerberg. It may seem unsurprising then, that global data protection regulators have seen the need to release a joint statement raising concerns over the ‘privacy risks posed by the Libra digital currency and infrastructure.’ While risks to financial privacy and related concerns have been raised by Western policymakers and other authorities, this is the first official international statement relating specifically to personal privacy.

The joint statement, reported by the UK’s Information Commissioner’s Office (ICO) on 5th August, has signatories from Albania, Australia, Canada, Burkina Faso, the European Union, the United Kingdom and the United States. The primary concern is that there is essentially no information from Facebook, or their participating subsidiary Calibra, on how personal information will be handled or protected. The implementation of Libra is rapidly forthcoming – the target launch is in the first half of next year. Its uptake is anticipated to be similarly rapid and widescale thanks to Facebook’s goliath global status. It is likely, therefore, that the Libra Association (nominally independent, but of which Facebook, among other tech and communications giants, is a founding member) will become the custodian of millions of people’s data – many of whom will reside in countries that have no data protection laws – in a matter of months.

The statement poses six main questions (a ‘non-exhaustive’ conversation-starter) with a view to getting at least some information on how Libra will actually function both on a user-level and across the network, how the Libra Network will ensure compliance with relevant data protection regulations, how privacy protections will be incorporated into the infrastructure, etc. All of these questions are asked to get some idea of how Facebook and Calibra et al. have approached personal data considerations.

Profiling, Algorithms and ‘Dark Patterns’

The joint statement asks how algorithms and profiling involving personal data will be used, and how this will be made clear to data subjects to meet the standards for legal consent. These are important questions relating to the design of the access to the currency on a user-level, of which prospective stakeholders remain ill-informed. The Libra website does state that the Libra blockchain is pseudonymous, allowing users to hold addresses not linked to their real-world identity. How these privacy designs will manifest remains unclear, however, and there is as yet no guarantee de-identified information cannot be reidentified through nefarious means either internally or by third parties.

The regulators also bring up the use of nudges and dark patterns (sometimes known as dark UX) – methods of manipulating user behaviour that can rapidly become unethical or illegal. Nudges may be incorporated into a site (they may sometimes be useful, such as a ‘friendly reminder’ that Mother’s Day is coming up on a card website) in order to prompt commercial activity that may not have happened otherwise. There is not always a fine line between a reasonable nudge and a dubious one. Consider the example of Facebook asking a user ‘What’s on your mind?’, prompting the expression of a feeling or an attitude, for instance. We already know that Facebook has plans to scan information on emotional states ostensibly for the purposes of identifying suicidal ideation and preventing tragic mistakes. The benefits of this data to unscrupulous agents, however, could prove, and indeed has proved, incalculable.

The Libra Network envisions a ‘vibrant ecosystem’ (what else?) of app-developers and other pioneers to ‘spur the global use of Libra.’ Questions surrounding the Network’s proposals to limit data protection liabilities in these apps are highly pertinent considering the lightspeed pace with which the currency is being designed and implemented.

Will Libra be able to convince regulators that it can adequately distance itself from these practices? Practices which take place constantly and perennially online? Has there been any evidence of Data Protection Impact Assessments (DPIAs), as demanded unequivocally by the European Union’s General Data Protection Regulation (GDPR) on a data-sharing scale of this magnitude?

Hopefully, Facebook or one of its subsidiaries or partners will partake in this conversation started by the joint statement, providing the same level of cooperation and diligence shown to data protection authorities as they have to financial authorities. More updates to come.

Harry Smithson, 9th August 2019

Two high-profile GDPR fines for British Airways and Marriott International, Inc

The Information Commissioner’s Office (ICO) has released two statements this week declaring intention to fine British Airways and Marriott International, Inc £183.39m and £99m respectively for breaches of the General Data Protection Regulation (GDPR). In both cases, which affect data subjects from countries across the world, the ICO was the lead supervisory authority acting on behalf of other EU Member State data protection authorities.

These punitive measures are provided under the GDPR, and are the largest fines issued by the ICO to date. These fines both therefore break the former record, which was the £500,000 fine issued to Facebook last year for the social media giant’s role in the Cambridge Analytica scandal (which was actually the maximum fine possible under the previous, much more lenient legislation, since much of the action had taken place prior to GDPR’s implementation).

These two warning shots are fines amounting to 1.5% of the respective companies’ global turnover, out of a possible 4% provided by GDPR. This relative leniency reflects the companies’ willingness to cooperate with the authority and make immediate improvements where possible. However, both companies are expected to appeal the decisions.

Failure to protect their customers’ data due to negligent digital security was at the heart of the decisions. The ICO discovered that from June to September 2018, users of BA’s website were being diverted to a fraudulent site used to harvest data. Roughly 500,000 customers had their personal information compromised in this way. Arguably on an even greater scale, the hotel giant Marriott was found to be presiding over a system exposing 339 million guest records to the internet.

Due diligence is the important aspect of these decisions, associated with the principle of ‘accountability’ defined in the GDPR. In the case of BA, poor security arrangements on the website were responsible for the cyber attackers being able to harvest personal data relating to log-in details, payment cards, travel bookings, names and addresses. Similarly, Marriott had failed to pursue due diligence when the company acquired Starwood (a hotel chain), which maintained a vulnerability in its guest reservation database dating back to 2014.

Marriott’s CEO has emphasised the fact that their subsidiary was the victim of a cyberattack; indeed, the company itself notified data protection authorities of the breach. But as the Information Commissioner Elizabeth Denham has stated, “the GDPR makes it clear that organisations must be accountable for the personal data they hold. This can include carrying out proper due diligence when making a corporate acquisition, and putting in place proper accountability measures to assess not only what personal data has been acquired, but also how it is protected.”

These decisions set a strong precedent, and will hopefully encourage companies to take greater responsibility for the personal data they hold. Being victim to a cyberattack is not in itself an excuse: companies and organisations must demonstrate that they have attempted to take appropriate and robust security measures. The accountability principle as explained in the GDPR is very clear on this.

Harry Smithson, July 2019

Belgian Data Protection Authority’s first GDPR fine imposed on public official 

The Belgian DPA delivered a strong message on 28th May 2019, that data protection is “everyone’s concern” and everyone’s responsibility, by premiering the GDPR’s sanctioning provision in Belgium with a fine of €2,000 imposed on a mayor (‘bourgmestre’) for the illegal utilisation of personal data. 

Purpose Limitation was Breached 

The mayor in question used personal data obtained for the purposes of mayoral operations in an election campaign, in breach of GDPR, particularly the purpose limitation principle, which states that data controllers and/or processors must only collect personal data for a specific, explicit and legitimate purpose. Given the fairly moderate fine, the data the mayor obtained would not have contained special category data (formerly known as sensitive personal data in the UK). The Belgian DPA also looked at other factors when deciding on the severity of the sanction, including the limited number of affected data subjects; nature and gravity of the infringement; and duration.  

‘Not Compatible with Initial Purpose’ 

The Belgian DPA received a complaint from the affected data subjects themselves, whose consent to the processing of their data was based on the assumption it would be used appropriately, in this case for administrative mayoral duties. The plaintiffs and the defendant were heard by the DPA’s Litigation Chamber, which concluded along GDPR lines that ‘the personal data initially collected was not compatible with the purpose for which the data was further used by the mayor.’

The decision was signed off by the relatively new Belgian commissioner, David Stevens, as well as the Director of the Litigation Chamber, a Data Protection Authority chamber independent from the rest of the Belgian judicial system. 

Harry Smithson, 3rd June 2019

GDPR’s 1st Birthday

General Data Protection Regulation reaches its first birthday

This blogpost arrives as the General Data Protection Regulation (GDPR) reaches its first birthday, and a week after a report from the Washington-based Center for Data Innovation (CDI) suggested amendments to the GDPR.

The report argues that regulatory relaxations would help foster Europe’s ‘Algorithmic Economy,’ purporting that GDPR’s restrictions of data sharing herald setbacks for European competitiveness in the AI technology field. 

Citing the European Commission’s ambition “for Europe to become the world-leading region for developing and deploying cutting-edge, ethical and secure AI,” the report then proceeds to its central claim that “the GDPR, in its current form, puts Europe’s future competitiveness at risk.”  

That being said, the report notes with approval France’s pro-AI strategy within the GDPR framework, in particular the country’s use of the clause that “grants them the authority to repurpose and share personal data in sectors that are strategic to the public interest—including health care, defense, the environment, and transport.”

Research is still being conducted into the legal and ethical dimensions of AI and the potential ramifications of automated decision-making on data subjects. In the UK, the ICO and the government’s recent advisory board, the Centre for Data Ethics and Innovation (CDEI – not to be confused with aforementioned CDI), are opening discussions and conducting call-outs for evidence regarding individuals’ or organisations’ experiences with AI. There are of course responsible ways of using AI, and organisations hoping to make the best of this new technology have the opportunity to shape the future of Europe’s innovative, but ethical use of data. 

The Information Commissioner’s Office (ICO) research fellows and technology policy advisors release short brief for combatting Artificial Intelligence (AI) security risks 

 The ICO’s relatively young AI auditing framework blog (set up in March this year) discusses security risks in their latest post, using comparative examples of data protection threats between traditional technology and AI systems. The post focuses “on the way AI can adversely affect security by making known risks worse and more challenging to control.” 

From a data protection perspective, the main issue with AI is its complexity and the volume of not only data, but externalities or ‘external dependencies’ that AI requires to function, particularly in the AI subfield Machine Learning (ML). Externalities take the form of third-party or open-source software used for building ML systems, or third-party consultants or suppliers who use their own or a partially externally dependent ML system. The ML systems themselves may have over a hundred external dependencies, including code libraries, software or even hardware, and their effectiveness will be determined by their interaction with multiple data sets from a huge variety of sources.

 Organisations will not have contracts with these third parties, making data flows throughout the supply chain difficult to track and data security hard to keep on top of. AI developers come from a wide array of backgrounds, and there is no unified or coherent policy for data protection within the AI engineering community. 

The ICO’s AI auditing blog uses the example of an organisation hiring a recruitment company who use Machine Learning to match candidate CVs to job vacancies. A certain amount of personal data would have been transferred between the organisation and the recruitment agency using manual methods. However, additional steps in the ML system will mean that data will be stored and transferred in different formats across different servers and systems. They conclude, “for both the recruitment firm and employers, this will increase the risk of a data breach, including unauthorised processing, loss, destruction and damage.”

 For example, they write: 

  • The employer may need to copy HR and recruitment data into a separate database system to interrogate and select the data relevant to the vacancies the recruitment firm is working on. 
  • The selected data subsets will need to be saved and exported into files, and then transferred to the recruitment firm in compressed form. 
  • Upon receipt the recruitment firm could upload the files to a remote location, eg the cloud.  
  • Once in the cloud, the files may be loaded into a programming environment to be cleaned and used in building the AI system. 
  • Once ready, the data is likely to be saved into a new file to be used at a later time. 

This example will be relevant to all organisations contracting external ML services, which is the predominant way UK businesses hope to harness the benefits of AI. The blog provides three main pieces of advice based on ongoing research into this new, wide (and widening) area of data security; a brief illustrative sketch follows the list. They suggest that organisations should:

  • Record and document the movement and storing of personal data, noting when the transfer took place, the sender and recipient, and the respective locations and formats. This will help monitor risks to security and data breaches;
  • Delete intermediate files, such as compressed versions, when they are no longer required – as per best-practice data protection guidelines; and
  • Use de-identification and anonymisation techniques and technologies before data is taken from the source and shared either internally or externally.
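To illustrate the first two points, here is a minimal sketch in Python (the file names, parties and log columns are hypothetical, not ICO-prescribed) of recording each movement of a personal-data extract and deleting intermediate files once they are no longer needed:

```python
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

TRANSFER_LOG = Path("transfer_log.csv")  # hypothetical audit-log location

def log_transfer(sender, recipient, file_path, data_format):
    """Append one row per movement of a personal-data extract:
    when it happened, who sent and received it, where the file sits,
    its format, and a hash so the exact file can be identified later."""
    file_path = Path(file_path)
    digest = hashlib.sha256(file_path.read_bytes()).hexdigest()
    new_log = not TRANSFER_LOG.exists()
    with TRANSFER_LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_log:
            writer.writerow(["timestamp", "sender", "recipient", "location", "format", "sha256"])
        writer.writerow([
            datetime.now(timezone.utc).isoformat(),
            sender,
            recipient,
            str(file_path),
            data_format,
            digest,
        ])

def delete_intermediate(file_path):
    """Remove a compressed or temporary extract once it is no longer needed."""
    Path(file_path).unlink(missing_ok=True)

# Example usage with hypothetical parties and paths:
# log_transfer("HR database", "recruitment firm", "candidates_2019.zip", "zip/CSV")
# delete_intermediate("candidates_2019.zip")
```

A record like this makes it much easier to answer, after the fact, where a given extract went, in what format, and whether the intermediate copies were cleaned up.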

Harry Smithson,  May 2019

What is a Data Protection Officer (DPO), and do you need one?

A DPO (Data Protection Officer) is an individual responsible for ensuring that their organisation is processing the data of its staff, customers, providers and any other individuals, i.e. data subjects, in compliance with data protection regulations. As of the EU-wide General Data Protection Regulation (GDPR), a DPO is mandatory for:

  1. Public authorities; and
  2. Organisations that process data:
    • On a large scale;
    • With regular and systematic monitoring;
    • As a core activity; or
    • Involving large volumes of ‘special category data,’ formerly known as ‘sensitive personal data,’ i.e. information related to a living individual’s racial or ethnic origin, political opinions, religious beliefs, trade union membership, physical or mental health condition, sex life or sexual orientation, or biometric data.

It may not be immediately obvious whether an organisation must have a designated DPO under GDPR. Where that is the case, it is necessary to make a formal evaluation, recording the decision and the reasons behind it. The WP29 Guidelines on Data Protection Officers (‘DPOs’), endorsed by the European Data Protection Board (EDPB), recommend that organisations conduct and document an internal analysis to determine whether or not a DPO should be appointed. Ultimately, such decision-making should always take into account the organisation’s obligation to fulfil the rights of the data subject, the primary concern of the GDPR: does the scale, volume or type of data processing in your organisation risk adversely affecting an individual or the wider public?

If a DPO is not legally required

Organisations may benefit from voluntarily appointing an internal DPO or hiring an advisor – this will ensure best-practice data protection policies and practices, improving cyber security, staff and consumer trust, and delivering other business benefits. When a DPO is designated voluntarily, the appointment is treated as if it were mandatory under GDPR – i.e. the voluntarily appointed DPO’s responsibilities as defined in Articles 37 to 39 of the GDPR will correspond to those of a legally mandated DPO (in other words, GDPR does not recognise a quasi-DPO with reduced responsibility). As the WP29 guidance makes clear, if an organisation is not legally required to designate a DPO and does not wish to designate one voluntarily, it remains at liberty to employ staff or outside consultants to provide information and advice relating to the protection of personal data.

However, it is important to ensure that there is no confusion regarding their title, status, position and tasks. Therefore, it should be made clear, in any communications within the company, as well as with data protection authorities, data subjects, and the public at large, that the title of this individual or consultant is not a data protection officer (DPO).

But how are the conditions that make a DPO mandatory defined under GDPR?

Large-scale processing: there is no absolute definition under GDPR, but there are evaluative guidelines. The WP29 guidance on the GDPR suggests data controllers should consider:

  • The number of data subjects concerned;
  • The volume of data processed;
  • The range of data items being processed;
  • The duration or permanence of the data processing activity; and
  • The geographical extent.

Regular and systematic monitoring: as with ‘large-scale processing,’ there is no definition as such, but WP29 guidance clarifies that monitoring involves any form of tracking or profiling on the internet, including for the purposes of behavioural advertising. Here are a number of examples of regular and systematic monitoring:

  • Data-driven marketing activities;
  • Profiling and scoring for purposes of risk assessment;
  • Email retargeting;
  • Location tracking (e.g. by mobile apps); or
  • Loyalty programmes.

 What does a Data Protection Officer do?

Article 39 of the GDPR, ‘Tasks of the data protection officer,’ lists and explains the DPO’s obligations. As a minimum, a DPO’s responsibilities are those summarised below:

  1. Inform and advise the controller or the processor and the employees
  2. Monitor compliance with the Regulation, with other Union or Member State data protection provisions and with the policies of the controller or processor in relation to the protection of personal data, including the assignment of responsibilities, awareness-raising and training of staff involved in processing operations, and the related audits
  3. Provide advice as regards data protection impact assessments and monitor performance
  4. Cooperate with the supervisory authority
  5. Act as the contact point for the supervisory authority on issues relating to data processing

 

 Harry Smithson 2019

 

The GDPR and Profiling

Profiling is a very useful tool which marketers have been using for decades to understand their customers better and to target them appropriately. However, the GDPR does make some changes to how profiling is treated, and these should be considered carefully before profiling is undertaken. For the first time, profiling has been brought within the rules on automated decision-making, and the same rights apply to the individuals whose information is being profiled. So how does this affect businesses?

Profiling Benefits

There are obvious benefits both to businesses and consumers in relation to profiling, which is used in a broad number of sectors from healthcare to insurance, retail to publishing, leisure to recruitment.

It is also an extremely useful tool for marketers, providing benefits of increased efficiency, savings in resource, and the financial and reputational benefits of understanding customers and establishing more personal, relevant communications with them.  The customer or individual benefits in turn from receiving fewer communications, and far more relevant messages.

What is profiling?

The GDPR defines profiling as: “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”

Profiling can be as simple as segmenting your own customers into groups based on gender, purchase history, and other data that the customer has provided to you during your relationship.  It becomes more complex when additional data is added to the mix, for example, adding to the information your customer has provided you, by applying data from external sources such as social media, or providers of geo-demographic or lifestyle data.
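As an illustration of that simpler end of the spectrum, here is a minimal sketch in Python (the customer records and thresholds are invented) that groups a business’s own customers by purchase history. Even this basic, rule-based grouping falls within the GDPR definition of profiling quoted above, since it uses personal data to evaluate aspects of individuals automatically.

```python
# Hypothetical customer records held by the business itself
customers = [
    {"email": "a@example.com", "gender": "F", "orders_last_year": 12},
    {"email": "b@example.com", "gender": "M", "orders_last_year": 1},
    {"email": "c@example.com", "gender": "F", "orders_last_year": 0},
]

def segment(customer):
    """Assign a simple marketing segment from data the customer provided."""
    if customer["orders_last_year"] >= 6:
        return "frequent buyer"
    if customer["orders_last_year"] >= 1:
        return "occasional buyer"
    return "lapsed"

segments = {}
for c in customers:
    segments.setdefault(segment(c), []).append(c["email"])

# e.g. {'frequent buyer': ['a@example.com'],
#       'occasional buyer': ['b@example.com'],
#       'lapsed': ['c@example.com']}
```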

Profiling and the GDPR

As with all processing under the GDPR, those who profile individuals have responsibilities to those individuals. Profiles must be accurate, relevant and non-discriminatory. All six GDPR principles apply, and because profiles are evolutionary and change over time, accuracy and retention become especially important. Privacy by design is key, as is the requirement that individuals must be made aware of such profiling and of their right not to be subject to such decisions.

It’s worth noting that automated decisions can be made with or without profiling.  And the reverse is also true – profiling can take place without making automated decisions.  It’s all a matter of how the data is used.  Where manual decisions are made, Article 22 does not apply.

Consent or Legitimate Interests?

The legal basis under which profiling takes place is a matter for careful consideration.  There has been debate over whether profiling requires the consent of the individual who is being profiled, or whether legitimate interest may apply.

There will be instances where profiling has a legal or similarly significant effect – for example, in financial services (mortgage approved or declined), or when marketing to vulnerable customers – for example, gambling products to those in financial difficulty. Where profiling is considered to have such an effect, an organisation will need to rely on the legal basis of consent before profiling and making decisions on the basis of that profiling.

However, in many cases, marketing will not have such an impact, and in those cases, consent will not be required.  Instead it may be possible to rely on Legitimate Interests.  BUT before such a decision is made, a Legitimate Interest Assessment will need to be conducted.  This will need to consider the necessity of the profiling, the balance of benefits to the individuals versus the business, and the measures taken to protect the personal data and profiles involved.

The Legitimate Interest Assessment will not only help you determine whether it is appropriate to conduct the profiling on this basis, it will also provide evidence that the individuals’ rights have been considered, contributing to the business’s need to meet the GDPR’s new principle of Accountability.

Victoria Tuffill  7th March 2018