Belgian Data Protection Authority’s first GDPR fine imposed on public official

The Belgian DPA delivered a strong message on 28th May 2019 – that data protection is “everyone’s concern” and everyone’s responsibility – by issuing the first fine under the GDPR’s sanctioning provisions in Belgium: €2,000 imposed on a mayor (‘bourgmestre’) for the unlawful use of personal data.

Purpose Limitation was Breached 

The mayor in question used personal data, obtained for the purposes of mayoral operations, in an election campaign – a breach of the GDPR, particularly the purpose limitation principle, which states that data controllers and/or processors must only collect personal data for a specified, explicit and legitimate purpose. Given the fairly moderate fine, the data the mayor obtained is unlikely to have contained special category data (formerly known as sensitive personal data in the UK). The Belgian DPA also weighed other factors when deciding on the severity of the sanction, including the limited number of affected data subjects and the nature, gravity and duration of the infringement.

‘Not Compatible with Initial Purpose’ 

The Belgian DPA received a complaint from the affected data subjects themselves, whose consent to the processing of their data was based on the assumption it would be used appropriately, in this case for administrative mayoral duties. The plaintiffs and the defendant were heard by the DPA’s Litigation Chamber, which concluded along GDPR lines that ‘the purpose for which the personal data was initially collected was not compatible with the purpose for which the data was further used by the mayor.’

The decision was signed off by the relatively new Belgian commissioner, David Stevens, as well as the Director of the Litigation Chamber, a Data Protection Authority chamber independent from the rest of the Belgian judicial system. 

Harry Smithson, 3rd June 2019

Personal Data Protection Act (PDPA) comes into effect in Thailand after royal endorsement 

On 27th May, the Kingdom of Thailand’s first personal data protection law was published in the Government Gazette and made official. This comes three months after the National Legislative Assembly passed the Personal Data Protection Act in late February and submitted the act for royal endorsement.

While the law is now technically in effect, its main ‘operative provisions’ – such as data subjects’ rights to consent and access requests, civil liabilities and penalties – will not come into full effect until a year after publication, i.e. May 2020. This gives data controllers and processors a one-year grace period to prepare for PDPA compliance, whose requirements and obligations were designed to be demonstrably equivalent to those of the EU’s General Data Protection Regulation (GDPR). The GDPR places special restrictions on transfers of personal data outside the EEA, but if Thai public bodies, businesses and other organisations meet the PDPA’s standards within the year, the law may prompt the EU to make an adequacy decision regarding Thailand’s data protection provisions, easing market activity and various other transactions involving data processing between EU member states and Thailand.

It is uncommon in Thai law for provisions to have an ‘extraterritorial’ purview, but the new PDPA was designed to have this scope to protect Thai data subjects from the risks of data processing, particularly marketing or ‘targeting’ conducted by non-Thai agents offering goods or services in Thailand.

The PDPA contains many provisions translated almost word-for-word from the GDPR, including the special category of ‘sensitive personal data’ and Subject Access Requests (SARs). Personal data itself is defined, along GDPR lines, as “information relating to a person which is identifiable, directly or indirectly, excluding the information of a dead person.”

The European Commission has so far recognised Andorra, Argentina, Canada (commercial organisations), Faroe Islands, Guernsey, Israel, Isle of Man, Japan (the most recent addition), Jersey, New Zealand, Switzerland, Uruguay and the United States of America (limited to the Privacy Shield framework) as third countries providing adequate data protection. An adequacy agreement is currently being sought with South Korea, and with Thailand’s latest measures, it may be that the southeast Asian nation will be brought into the fold.

Irrespective of Brexit, the Data Protection Act passed by Theresa May’s administration last year embeds the GDPR within the UK’s regulatory framework. However, on exit the UK will become a so-called third country and will have to seek an adequacy decision. The ICO will continue to be the UK supervisory authority.

Harry Smithson, 30 May 2019 


GDPR’s 1st Birthday

General Data Protection Regulation reaches its first birthday

This blogpost arrives as the General Data Protection Regulation (GDPR) reaches its first birthday, and a week after a report from the Washington-based Center for Data Innovation (CDI) suggested amendments to the GDPR.

The report argues that regulatory relaxations would help foster Europe’s ‘Algorithmic Economy,’ purporting that GDPR’s restrictions of data sharing herald setbacks for European competitiveness in the AI technology field.

Citing the European Commission’s ambition “for Europe to become the world-leading region for developing and deploying cutting-edge, ethical and secure AI,” the report then proceeds to its central claim that “the GDPR, in its current form, puts Europe’s future competitiveness at risk.”

That being said, the report notes with approval France’s pro-AI strategy within the GDPR framework, in particular the country’s use of the clause that “grants them the authority to repurpose and share personal data in sectors that are strategic to the public interest—including health care, defense, the environment, and transport.”

Research is still being conducted into the legal and ethical dimensions of AI and the potential ramifications of automated decision-making for data subjects. In the UK, the ICO and the government’s recently established advisory body, the Centre for Data Ethics and Innovation (CDEI – not to be confused with the aforementioned CDI), are opening discussions and issuing calls for evidence on individuals’ and organisations’ experiences with AI. There are, of course, responsible ways of using AI, and organisations hoping to make the best of this new technology have the opportunity to shape the future of Europe’s innovative but ethical use of data.

The Information Commissioner’s Office (ICO) research fellows and technology policy advisors release short brief for combatting Artificial Intelligence (AI) security risks

The ICO’s relatively young AI auditing framework blog (set up in March this year) discusses security risks in their latest post, using comparative examples of data protection threats between traditional technology and AI systems. The post focuses “on the way AI can adversely affect security by making known risks worse and more challenging to control.”

From a data protection perspective, the main issue with AI is its complexity and the volume not only of data, but of externalities or ‘external dependencies’ that AI requires to function, particularly in the AI subfield of Machine Learning (ML). Externalities take the form of third-party or open-source software used for building ML systems, or third-party consultants or suppliers who use their own or a partially externally dependent ML system. The ML systems themselves may have over a hundred external dependencies, including code libraries, software or even hardware, and their effectiveness will be determined by their interaction with multiple data sets from a huge variety of sources.

Organisations will not have contracts with these third parties, making data flows throughout the supply chain difficult to track and data security hard to keep on top of. AI developers come from a wide array of backgrounds, and there is no unified or coherent policy for data protection within the AI engineering community.

The ICO’s AI auditing blog uses the example of an organisation hiring a recruitment company that uses Machine Learning to match candidate CVs to job vacancies. Traditionally, a certain amount of personal data would be transferred between the organisation and the recruitment agency using manual methods. However, the additional steps in the ML system mean that data will be stored and transferred in different formats across different servers and systems. The blog concludes, “for both the recruitment firm and employers, this will increase the risk of a data breach, including unauthorised processing, loss, destruction and damage.”

For example, they write: 

  • The employer may need to copy HR and recruitment data into a separate database system to interrogate and select the data relevant to the vacancies the recruitment firm is working on.
  • The selected data subsets will need to be saved and exported into files, and then transferred to the recruitment firm in compressed form.
  • Upon receipt the recruitment firm could upload the files to a remote location, eg the cloud.
  • Once in the cloud, the files may be loaded into a programming environment to be cleaned and used in building the AI system.
  • Once ready, the data is likely to be saved into a new file to be used at a later time.
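Purely as an illustration of the steps quoted above (the function names, file layout and columns here are hypothetical, not from the ICO post), the export-and-compress stage of such a pipeline might look something like this, with a checksum recorded so the transfer can later be verified:

```python
import gzip
import hashlib
import shutil

# Hypothetical sketch of the data-handling steps described above:
# select only the records relevant to a vacancy, export them to a file,
# then compress the file for transfer and record a checksum.

def export_relevant_subset(rows, vacancy_id, out_path):
    """Write only the candidate records relevant to one vacancy."""
    relevant = [r for r in rows if r["vacancy_id"] == vacancy_id]
    with open(out_path, "w") as f:
        for r in relevant:
            f.write(f'{r["candidate_id"]},{r["cv_text"]}\n')
    return len(relevant)

def compress_for_transfer(src_path, dest_path):
    """Compress the exported file and return its SHA-256 checksum."""
    with open(src_path, "rb") as src, gzip.open(dest_path, "wb") as dest:
        shutil.copyfileobj(src, dest)
    with open(dest_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()
```

Each such intermediate file is a copy of personal data that must itself be tracked and, eventually, deleted – which is precisely why the ICO flags these extra steps as added risk.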

This example will be relevant to all organisations contracting external ML services – the predominant route for UK businesses hoping to harness the benefits of AI. The blog provides three main pieces of advice based on ongoing research into this new, wide (and widening) area of data security. They suggest that organisations should:

  • Record and document the movement and storage of personal data, noting when each transfer took place, the sender and recipient, and the respective locations and formats – this will help in monitoring security risks and detecting data breaches;
  • Delete intermediate files, such as compressed versions, as soon as they are no longer required, as per best-practice data protection guidelines; and
  • Apply de-identification and anonymisation techniques and technologies to personal data before it is taken from the source and shared, whether internally or externally.
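As a rough sketch of the first and third suggestions (the record format and the salted-hash approach are our own illustration, not the ICO’s prescription), an organisation might log each transfer and pseudonymise direct identifiers before data leaves the source system:

```python
import hashlib
from datetime import datetime, timezone

# Illustrative only: a minimal transfer-log entry plus a simple
# pseudonymisation step (salted SHA-256) applied before sharing.
# A real deployment would also need key management, access controls
# and a documented retention/deletion policy for intermediate files.

SECRET_SALT = b"replace-with-a-securely-stored-secret"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a salted hash before sharing."""
    return hashlib.sha256(SECRET_SALT + identifier.encode()).hexdigest()

def log_transfer(sender, recipient, location_from, location_to, fmt):
    """Record the movement of personal data: when, who, where, what format."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "sender": sender,
        "recipient": recipient,
        "from": location_from,
        "to": location_to,
        "format": fmt,
    }

records = [{"email": "jane@example.com", "cv": "..."}]
shared = [{**r, "email": pseudonymise(r["email"])} for r in records]
entry = log_transfer("employer-hr", "recruitment-firm",
                     "hr-db", "sftp://partner", "csv.gz")
```

Note that salted hashing is pseudonymisation, not anonymisation: the data remains personal data under the GDPR because the mapping can be recreated by anyone holding the salt.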

Harry Smithson,  May 2019

What is a Data Protection Officer (DPO), and do you need one?

A DPO (Data Protection Officer) is an individual responsible for ensuring that their organisation is processing the data of its staff, customers, providers and any other individuals, i.e. data subjects, in compliance with data protection regulations. As of the EU-wide General Data Protection Regulation (GDPR), a DPO is mandatory for:

  1. Public authorities; and
  2. Organisations that process data
  • On a large scale; 
  • With regular and systematic monitoring; 
  • As a core activity; or 
  • Involving large volumes of ‘special category data,’ formerly known as ‘sensitive personal data,’ i.e. information relating to a living individual’s racial or ethnic origin, political opinions, religious beliefs, trade union membership, physical or mental health condition, sex life or sexual orientation, or biometric data.

It may not be immediately obvious whether an organisation must have a designated DPO under the GDPR. Where there is doubt, it is necessary to make a formal evaluation, recording the decision and the reasons behind it. The WP29 Guidelines on Data Protection Officers (‘DPOs’), endorsed by the European Data Protection Board (EDPB), recommend that organisations conduct and document an internal analysis to determine whether or not a DPO should be appointed. Ultimately, such decision-making should always take into account the organisation’s obligation to fulfil the rights of the data subject, the primary concern of the GDPR: does the scale, volume or type of data processing in your organisation risk adversely affecting an individual or the wider public?

Even if a DPO is not legally required, organisations may benefit from voluntarily appointing an internal DPO or hiring an advisor – this will ensure best-practice data protection policies and practices, improving cyber security, staff and consumer trust, and bringing other business benefits. When a DPO is designated voluntarily, the same requirements apply as if the appointment were mandatory – i.e. the voluntarily appointed DPO’s responsibilities as defined in articles 37 and 39 of the GDPR will correspond to those of a legally mandated DPO (in other words, the GDPR does not recognise a quasi-DPO with reduced responsibility). As the WP29 guidance explains, “if an organisation is not legally required to designate a DPO, and does not wish to designate a DPO on a voluntary basis, that organisation is quite at liberty to employ staff or outside consultants to provide information and advice relating to the protection of personal data.”

However, it is important to ensure that there is no confusion regarding their title, status, position and tasks. Therefore, it should be made clear, in any communications within the company, as well as with data protection authorities, data subjects, and the public at large, that the title of this individual or consultant is not a data protection officer (DPO).

But how are the conditions that make a DPO mandatory defined under GDPR?

Large-scale processing: there is no absolute definition under the GDPR, but there are evaluative guidelines. The WP29 guidance suggests data controllers should consider:

  • The number of data subjects concerned;
  • The volume of data processed;
  • The range of data items being processed;
  • The duration or permanence of the data processing activity; and
  • The geographical extent.

Regular and systematic monitoring: as with ‘large-scale processing,’ there is no definition as such, but WP29 guidance clarifies that monitoring involves any form of tracking or profiling on the internet, including for the purposes of behavioural advertising. Here are a number of examples of regular and systematic monitoring:

  • Data-driven marketing activities;
  • Profiling and scoring for purposes of risk assessment;
  • Email retargeting;
  • Location tracking (e.g. by mobile apps); or
  • Loyalty programmes.

What does a Data Protection Officer do?

Article 39 of the GDPR, ‘Tasks of the data protection officer,’ lists the DPO’s obligations. As a minimum, the DPO’s responsibilities include the tasks summarised below:

  1. Inform and advise the controller or the processor and the employees
  2. Monitor compliance with the Regulation, with other Union or Member State data protection provisions and with the policies of the controller or processor in relation to the protection of personal data, including the assignment of responsibilities, awareness-raising and training of staff involved in processing operations, and the related audits
  3. Provide advice as regards data protection impact assessments and monitor performance
  4. Cooperate with the supervisory authority
  5. Act as the contact point for the supervisory authority on issues relating to data processing


 Harry Smithson 2019


HMRC’s 28 days to delete unlawfully obtained biometric data

In a statement released on 3rd May, the Information Commissioner’s Office reiterated its decision to issue HMRC a preliminary enforcement notice in early April. This initial notice was based on an investigation conducted by the ICO after a complaint from Big Brother Watch concerning HMRC’s Voice ID service, which has been used on a number of the department’s helplines since January 2017.


The voice authentication for customer verification uses a type of biometric data considered special category information under the GDPR, and is therefore subject to stricter conditions. ICO’s investigation found that HMRC did “not give customers sufficient information about how their biometric data would be processed and failed to give them the chance to give or withhold consent.” HMRC was therefore in breach of GDPR.

The preliminary enforcement notice issued by the ICO on April 4th stated that HMRC must delete all data within the Voice ID system for which the department was never given explicit consent to have or use. According to Big Brother Watch, this data amounted to approximately five million records of customers’ voices. These records would have been obtained on HMRC’s helplines, but due to poor data security policy for the Voice ID system, the customers had no means of explicitly consenting to HMRC’s processing of this data.

Steve Wood, Deputy Commissioner at the ICO, stated, “We welcome HMRC’s prompt action to begin deleting personal data that it obtained unlawfully. Our investigation exposed a significant breach of data protection law – HMRC appears to have given little or no consideration to it with regard to its Voice ID service.”

The final enforcement notice is expected on 10th May. This will give HMRC a twenty-eight-day timeframe to complete the deletion of this large compilation of biometric data.

The director of Big Brother Watch, Silkie Carlo, was encouraged by the ICO’s actions:

“To our knowledge, this is the biggest ever deletion of biometric IDs from a state-held database. This sets a vital precedent for biometrics collection and the database state, showing that campaigners and the ICO have real teeth and no government department is above the law.”

 Harry Smithson, May 2019. 

Be Data Aware: the ICO’s campaign to improve data awareness

As the Information Commissioner’s Office’s ongoing investigation into the political weaponisation of data analytics and harvesting sheds more and more light on the reckless use of ‘algorithms, analysis, data matching and profiling’ involving personal information, consumers are becoming more data conscious. On 8th May, the ICO launched an awareness campaign featuring a video, legal factsheets reminding citizens of their rights under the GDPR, and advice guidelines on internet behaviour. The campaign is currently circulating on Twitter under #BeDataAware.


While the public is broadly aware of targeted marketing, and fairly accustomed to companies attempting to reach certain demographics, the political manipulation of data is considered, if not a novel threat, then a problem compounded by the new frontier of online data analytics. Ipsos MORI’s UK Cyber Survey, conducted on behalf of the DCMS, found that 80% of respondents considered cyber security to be a ‘high priority,’ yet many of them fell into groups unlikely to take much personal action against cybercrime. This could indicate that while consumers are concerned about cybercrime being used against them, they are also aware of the broader social, economic and political dangers posed by the inappropriate or illegal use of personal information.

ICO’s video (currently on Vimeo, but not YouTube), titled ‘Your Data Matters,’ asks at the beginning, “When you search for a holiday, do you notice online adverts become much more specific?” Proceeding to graphics detailing this relatively well-known phenomenon, the video then draws a parallel with political targeting: “Did you know political campaigners use these same targeting techniques, personalising their campaign messaging to you, trying to influence your vote?” Importantly, the video concludes, “You have the right to know who is targeting you and how your data is used.”

To take a major example of an organisation trying to facilitate this right, Facebook allows users to see why they may have been targeted by an advert with a clickable, dropdown option called ‘Why am I seeing this?’ Typically, the answer will read ‘[Company] is trying to reach [gender] between the ages X – Y in [Country].’ But the question remains as to whether this will be sufficiently detailed in the future. With growing pressure on organisations to pursue best practice when it comes to data security, and with the public’s growing perception of the political ramifications of data security policies, will consumers and concerned parties demand more information on, for instance, which of their online behaviours have caused them to be targeted?

A statement from the Information Commissioner Elizabeth Denham as part of the Be Data Aware campaign has placed the ICO’s data security purview firmly in the context of upholding democratic values.

“Our goal is to effect change and ensure confidence in our democratic system. And that can only happen if people are fully aware of how organisations are using their data, particularly if it happens behind the scenes.

“New technologies and data analytics provide persuasive tools that allow campaigners to connect with voters and target messages directly at them based on their likes, swipes and posts. But this cannot be at the expense of transparency, fairness and compliance with the law.”

Uproar surrounding the data analytics scandal, epitomised by Cambridge Analytica’s data breach beginning in 2014, highlights the public’s increasing impatience with the reckless use of data. The politicisation of cybercrime, and greater knowledge and understanding of data misuse, means that consumers will be far less forgiving of companies that are not seen to be taking information security seriously.

Harry Smithson 9 May 2019

The GDPR and Profiling

Profiling is a very useful tool which marketers have been using for decades to understand their customers better and to target them appropriately.  However, the GDPR does make some changes to how profiling is treated, which should be considered carefully before profiling is undertaken.  For the first time, profiling has been brought within the scope of automated decision-making, and the same rights apply to the individuals whose information is being profiled. So how does this affect businesses?

Profiling Benefits

There are obvious benefits both to businesses and consumers in relation to profiling, which is used in a broad number of sectors from healthcare to insurance, retail to publishing, leisure to recruitment.

It is also an extremely useful tool for marketers, providing benefits of increased efficiency, savings in resource, and the financial and reputational benefits of understanding customers and establishing more personal, relevant communications with them.  The customer or individual benefits in turn from receiving fewer communications, and far more relevant messages.

What is profiling?

The GDPR defines profiling as: “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”.

Profiling can be as simple as segmenting your own customers into groups based on gender, purchase history, and other data that the customer has provided to you during your relationship.  It becomes more complex when additional data is added to the mix, for example, adding to the information your customer has provided you, by applying data from external sources such as social media, or providers of geo-demographic or lifestyle data.
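A minimal sketch of the simple kind of profiling described above – segmenting customers on data they have supplied directly – might look like this (the field names and thresholds are hypothetical):

```python
# Illustrative segmentation of customers into broad marketing groups
# based only on data supplied during the customer relationship.
# Field names and thresholds are made up for the example.

customers = [
    {"name": "A", "gender": "F", "orders_last_year": 12},
    {"name": "B", "gender": "M", "orders_last_year": 1},
    {"name": "C", "gender": "F", "orders_last_year": 0},
]

def segment(customer):
    """Assign a marketing segment from purchase history alone."""
    n = customer["orders_last_year"]
    if n >= 10:
        return "frequent"
    elif n >= 1:
        return "occasional"
    return "lapsed"

segments = {c["name"]: segment(c) for c in customers}
# e.g. {"A": "frequent", "B": "occasional", "C": "lapsed"}
```

Even a toy model like this is profiling under the GDPR once it is used to evaluate personal aspects of identifiable individuals, so the accuracy, retention and transparency obligations discussed below still apply; enriching it with external data only raises the stakes.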

Profiling and the GDPR

As with all processing under the GDPR, those who profile individuals have responsibilities to those individuals.  Profiles must be accurate, relevant and non-discriminatory.  All six GDPR principles become critical because profiles are evolutionary: over time, individuals’ profiles will change, so accuracy and retention are vital.  Privacy by design is key, as is the requirement that individuals be made aware of such profiling and of their right not to be subject to decisions based on it.

It’s worth noting that automated decisions can be made with or without profiling.  And the reverse is also true – profiling can take place without making automated decisions.  It’s all a matter of how the data is used.  Where manual decisions are made, Article 22 does not apply.

Consent or Legitimate Interests?

The legal basis under which profiling takes place is a matter for careful consideration.  There has been debate over whether profiling requires the consent of the individual who is being profiled, or whether legitimate interest may apply.

There will be instances where the impact of the profiling will have a legal or significant effect – for example, in financial services (mortgage yes or no), or when marketing to vulnerable customers – for example, gambling products to those in financial difficulty.  Where profiling is considered to have a legal or significant effect, an organisation will need to rely on the legal basis of Consent before profiling and making decisions on the basis of such profiling.

However, in many cases, marketing will not have such an impact, and in those cases consent will not be required.  Instead it may be possible to rely on Legitimate Interests.  But before such a decision is made, a Legitimate Interest Assessment will need to be conducted.  This will need to consider the necessity of the profiling, the balance of benefits to the individuals versus the business, and the measures taken to protect the personal data and profiles involved.

The Legitimate Interest Assessment will not only help you determine whether it is appropriate to conduct the profiling on this basis, it will also provide evidence that the individuals’ rights have been considered, contributing to the business’s need to meet the GDPR’s new principle of Accountability.


Victoria Tuffill  7th March 2018

Data Protection Roundup: GDPR undermined by Facebook? Morrisons’ breach liability; Google’s iPhone snooping

I find it fascinating to watch how data protection in general, and the GDPR in particular, play out with the huge multinationals they were designed to capture – which arguably have the most to lose in terms of fines.  Facebook and Google are once again in the news in relation to their use of personal data.  And the High Court judgement against Morrisons sets a precedent which aligns with the GDPR’s intention that individuals have the right to have their data protected.

Google accused of bypassing privacy settings to harvest personal information of 5.4 million iPhone users between 2011 and 2012

The search engine tech giant Google is being taken to court by a group called Google You Owe Us, led by ex-Which director Richard Lloyd. The group claims that several hundred pounds could be owed in compensation to the millions of victims of Google’s transgression against privacy rights, meaning Google could face a massive financial penalty.


Google breached DPA and PECR by misusing cookies

Google exploited cookies, which are small pieces of computer text that collect data from devices, to run large-scale targeted ad campaigns. In the UK Google’s actions were in breach of the Data Protection Act (DPA) and the Privacy and Electronic Communication Regulation (PECR). For such breaches after the General Data Protection Regulation (GDPR) comes into force in late May 2018, organisations could face a fine of up to €20 million or 4% of annual global turnover (whichever is higher – and for the billion-dollar giant Google, obviously the latter).  However, this case relates to a period prior to GDPR.


Did you go online with your iPhone? Were your privacy preferences ignored?

Google stands accused of bypassing the default privacy settings on Apple phones for several months in 2011 and 2012 in order to track the online behaviour of Safari users, by placing ad-tracking cookies onto their devices. This enabled advertisers to target content to those devices and their users.

The Google activity has become known as the ‘Safari workaround,’ and while it affected various devices, the lawsuit filed in the High Court addresses the targeting of iPhone users.

Over 5 million people in Britain had an iPhone during the period. “In all my years speaking up for consumers,” Mr Lloyd from Google You Owe Us states, “I’ve rarely seen such a massive abuse of trust where so many people have no way to seek redress on their own. Through this action, we will send a strong message to Google and other tech giants in Silicon Valley that we’re not afraid to fight back.”

According to the veteran privacy rights campaigner, Google claimed that he must go to California, the heartland of the Silicon revolution, if he wanted to pursue legal action against the firm, to which he responded, “It is disappointing that they are trying to hide behind procedural and jurisdictional issues rather than being held to account for their actions.”

According to the BBC, the broadcaster was told by Google that these legal proceedings are “not new” and that they “have defended similar cases before.” Google has stated that they do not believe the case has any merit and that they intend to contest it.

While there is no precedent in the UK for such massive action against Google, in the US the company has settled two large-scale litigation cases out of court. Over the same activity, it agreed to pay a record $22.5m (£16.8m) in a case brought by the US Federal Trade Commission in 2012, and it also reached out-of-court settlements with a small number of British consumers.

According to the BBC, the case will probably be heard in the High Court in Spring 2018, a month or so prior to the enforcement of the GDPR.


Morrisons found liable for employee data breach

Morrisons workers brought a claim against the supermarket after a former member of staff, senior internal auditor Andrew Skelton (imprisoned as a result of his actions) stole and posted online confidential data (including salary and bank details) about nearly 100,000 employees.

In an historic High Court ruling, the supermarket has been found liable for Skelton’s actions, which means that those affected may claim compensation for the “upset and distress” caused.

The case is the first data leak class action in the UK.  Morrisons has said it will appeal the decision.


Facebook claims European data protection standards will not allow its pattern-recognition “suicide alert tool” to be used in the EU.


Facebook blames GDPR for its plans to withhold Suicide Prevention software from EU

Facebook’s decision to deny EU countries a pattern-recognition tool to alert authorities to users possibly suffering from depression or suicidal thoughts has been criticised as a move to undermine the upcoming tightening of EU-wide data protection standards, enshrined in the General Data Protection Regulation (GDPR).

Facebook has argued that their Artificial Intelligence (AI) programme which scans the social media network for troubling comments and posts that might indicate suicidal ideation will not be employed in EU countries on the grounds that European policy-makers and the public at large are too sensitive about privacy issues to allow site-wide scanning.

In a blogpost, Facebook’s VP of Product Management stated, “we are starting to roll out artificial intelligence outside the US to help identify when someone might be expressing thoughts of suicide, including on Facebook Live. This will eventually be available worldwide, except the EU.”

Tim Turner, a data consultant based in Manchester, has suggested that the move might be “a shot across the EU’s bows […] Facebook perhaps wants to undermine the GDPR — which doesn’t change many of the legal challenges significantly for this — and they’re using this as a method to do so.”

Mr Turner continues, “nobody could argue with wanting to save lives, and it could be a way of watering down legislation that is a challenge to Facebook’s data hungry business model. Without details of what they think the legal problems are with this, I’m not sure they deserve the benefit of the doubt.”

Written by Harry Smithson  1st December, 2017



New GDPR Guidance in the Data Compliant Data Protection Roundup

The Information Commissioner’s Office (ICO) releases GDPR guidance on “contracts and liabilities between controllers and processors.”

GDPR 7 Months and Counting

Organisations only have until May 2018 to review, redraft and negotiate controller / processor contracts

Ahead of the May 2018 deadline for GDPR enforcement, the ICO has released a 28-page document providing “detailed, practical guidance for UK organisations on contracts between controllers and processors under the GDPR.” The document aims to explain the requirements and responsibilities of data controllers as well as the new liabilities of processors. The document points out that many of the requirements may already be covered by existing contracts, but that the expansion and clarification of contractual clauses to evidence compliance with all aspects of the new regulations will likely be necessary.

Under the new regulations, contracts will be required between data controllers (the organisations responsible for the holding and use of the data) and data processors (those involved in the collection and ‘processing’ of data). This written contract or “other legal act” is to “evidence and govern” the working relationship of both parties. Under the current rules, these contracts are only advised as a measure to demonstrate compliance when necessary.


EU Commission encourages standard contractual clauses and certification schemes (yet to be drafted)

It is noted that “standard contractual clauses” as well as certification schemes for contractual codes of conduct provided by the EU Commission or a supervisory authority such as the ICO will be allowed and encouraged by the GDPR, but that as yet none have been drafted.

Emphasis is given to the GDPR’s expansion of liability to include data processors as well as controllers, the former now liable to pay damages or become subject to penalties if not found compliant. On top of this, processors will need to have contracts with other processors (sub-processors) if they are to utilise their services, with written authorisation from the controller.

What needs to be included in the contract:

Contracts must explain:



  • The subject matter and duration of the processing
  • The nature and purpose of the processing
  • The type of personal data and categories of data subject
  • The obligations and rights of the controller

Contracts must, as a minimum, require the processor to:

  • Only act on the written instructions of the controller
  • Ensure that people processing the data are subject to a duty of confidence
  • Take appropriate measures to ensure the security of processing
  • Only engage sub-processors with the prior consent of the controller and under a written contract
  • Assist the controller in providing subject access and allowing data subjects to exercise their rights under the GDPR
  • Assist the controller in meeting its GDPR obligations in relation to the security of processing, the notification of personal data breaches and data protection impact assessments
  • Delete or return all personal data to the controller as requested at the end of the contract
  • Submit to audits and inspections, provide the controller with whatever information it needs to ensure that they are both meeting their Article 28 obligations, and tell the controller immediately if it is asked to do something infringing the GDPR or other data protection law of the EU or a member state.

Common Thread Network (CTN) announces Patricia Poku as new co-chair alongside Information Commissioner Elizabeth Denham

The CTN, the forum for data protection and privacy authorities among Commonwealth countries, has appointed a new co-chair to sit alongside the incumbent UK Information Commissioner. The decision was made at the CTN Annual General Meeting on 25th September. The organisation promotes cross-border co-operation for data security and privacy objectives.

Patricia Poku, also recently appointed as Executive Director and Member of the Board for the Data Protection Commission of Ghana, has worked as Head of Data Protection for the 2012 London Olympic Games and Global Director for Data Protection & Privacy at World Vision International.


Increasing cybercrime is driving transnational cooperation

With the rise of cybercrime and data abuse as international phenomena, involving not only state-level operations but also syndicate-level action, typically carried out using malware and the digital currency Bitcoin, transnational co-operation is more important than ever, and it is gaining participants. In July, South Africa joined the CTN, and in August the Cayman Islands issued its first Data Protection Bill, working for “adequacy” with EU data protection law, the GDPR.


Global traction for best-practice policies

The GDPR requires organisations outside the EU to meet data protection adequacy standards if they wish to do business in Europe or process the data of people there, which suggests that the best-practice policies the GDPR encourages may find global traction; organisations such as the CTN have an important role to play in that process. GDPR-level policies and practices will be especially desirable given the emphasis the ICO has placed on the consumer trust that robust data protection builds. In a global digital economy, data protection best practice makes commercial sense.

Written by Harry Smithson

GDPR and Accountants

GDPR Debate

On Monday, 16th October, Data Compliant’s Victoria Tuffill was invited by AccountancyWeb to join a panel discussion on how GDPR will impact accountants and tax agents.

The other members of the panel were our host, John Stokdyk, Global Editor of AccountingWEB, who kept us all on the straight and narrow, while asking some very pertinent questions; Ian Cooper from Thomson Reuters who gave strong insights into technical solutions; and Dave Tucker from Thompson Jenner LLP, who provided a very useful practitioner viewpoint.

GDPR in General

There is a presumption that every professional body is fully informed of all compliance regulations within their field of expertise.  But the continuing barrage of changes and adjustments to European and British law makes it easy to drop the ball.

GDPR is a typical example. To quote the Information Commissioner, Elizabeth Denham, it’s “the biggest change to data protection law for a generation”. Yet for many accountants – and so many others – it’s only just appearing on the radar. This means there’s an increasingly limited amount of time to get ready.

GDPR has been 20 years coming, and is intended to bring the law up to date – in terms of new technology, new ways we communicate with each other, and the increasing press coverage and consumer awareness of personal data and how it’s used by professional organisations and others.  GDPR has been law for 17 months now, and it will be enforced from May 2018.

GDPR and Accountants

So what does GDPR mean for accountants in particular?

  • Accountants will have to deal with the fact that it’s designed to give individuals back their own control over their own personal information and strengthens their rights.
  • It increases compliance and record keeping obligations on accountants. GDPR makes it very plain that any firm which processes personal data is obliged to protect that data – for accountants that responsibility is very significant given the nature of the personal data an accountant holds.
  • There are increased enforcement powers – I’m sure everyone’s heard of the maximum fine of €20 million or 4% of global annual turnover, whichever is higher. But the media also take a strong interest in data breaches, and often the reputational damage has a far greater impact than the fine.
  • Accountancy firms must know precisely what data they hold and where it’s held, so they can assess the scale of the issue and be sure to comply with the demands of GDPR.

The video covers key points for practitioners to understand before they can prepare for compliance, and summarises some initial steps they should take today to prepare their firms.


The session can be found here:  Practice Excellence Live 2017:  GDPR.

It is a 45 minute video, so for those with limited time, I have broken down the areas covered into bite-size chunks:

Data Compliant is working with its clients to help them prepare for GDPR, so if you are concerned about how GDPR will affect your firm or business, feel free to give us a call on 01787 277742, or email us if you’d like more information.




Victoria Tuffill  19th October, 2017