European Commission publishes “White Paper on Artificial Intelligence”

19th February saw the release of the European Commission’s white paper on AI, which remains open to public consultation until May. While extolling the virtues of AI, such as its much-anticipated roles in fine-tuning medical diagnostics and mitigating climate breakdown, the white paper ranks intrusion and privacy risks among the four main issues facing policy-making around AI. The other three are opaque and/or discriminatory decision-making and criminal application.

The expected impact on governance that AI uptake could have, and the resulting conspicuous contrast with governance systems lacking cutting edge AI capacity, leads the Commission to go so far as to note that a common European framework for policy on AI is necessary to avoid “the fragmentation of the single market.”

The paper outlines a largely theoretical “European approach to excellence and trust,” emphasising the requirement for global competitiveness in AI innovation. It states, however, that “trustworthiness is a prerequisite for [AI] uptake.” For instance, safeguards on law enforcement’s expanded capacities due to AI technology are recommended, though currently not detailed. Much of this trust is purportedly to be garnered by taking the “human-centric approach” to AI application. This approach was explicated in a paper, “Communication on Building Trust in Human-Centric Artificial Intelligence,” released by the Commission last year, in which privacy and data governance was among seven “key requirements that AI applications should respect.”

Concrete, technical policies for regulation are somewhat more elusive. Both papers reiterate the accuracy requirement for any datasets that AI may use as fuel for thought, i.e. the necessity for data integrity. However, the requirement for stored data to be accurate is already enforced by the General Data Protection Regulation (GDPR), a framework that will remain in UK law after Brexit thanks to the Data Protection Act 2018 and is being emulated across the world. Quite how the Commission’s value system of human-centric ethics will manifest in AI development remains unclear.

Where the white paper on AI is most outspoken is the perceived limitations of current EU legislation to regulate or even conceptualise AI. Changes to the legal concept of ‘safety’ invoked by AI risk and predictive analysis are anticipated; ambiguity concerning responsibility between economic agents in the supply chain may pose judicial quandaries; and there is even a chapter dedicated to the problem of AI indecipherability: if human officials cannot ascertain how an AI programme reached a decision, how can they know whether such a decision was skewed by bias in a dataset? Human oversight of AI development is therefore recommended at each stage of the industrial chain.

Harry Smithson, 21st February 2020

Government expands Ofcom’s role to combat ‘Online Harms’

Online harms can take a variety of forms, privacy violations being among the most notorious. Regardless of how we categorise negative internet user experiences, we know from a recent Ofcom study that 61% of adults and 79% of 12-15 year olds have reported at least one potentially harmful online experience in the last 12 months.

As part of the government’s response to public consultation on the Online Harms White Paper, the DCMS announced on the 12th February that the UK’s telecoms and broadcasting regulator will also be the new online harms regulator. The Home Office and DCMS have been working together with Barnardo’s charity to provide greater protection for vulnerable internet users, particularly children, building upon growing institutional and regulatory oversight of digital services.

Unlike the General Data Protection Regulation (GDPR), which has a far-reaching purview, the new regulation will likely apply to fewer than 5% of UK businesses, as Ofcom will only be responsible for monitoring organisations that host user-generated content (comments, forums etc.).

But from a data protection perspective, it’s interesting to see how GDPR terminology and values have shaped this initiative – consider, for instance, former secretary of state Nicky Morgan’s statement on the government’s response to the white paper:

“We will give the regulator the powers it needs to lead the fight for an internet that remains vibrant and open but with the protections, accountability and transparency people deserve.”

We can expect to see the official anointing of the new Ofcom coming into force under Nicky Morgan’s recent successor, Oliver Dowden.

In the meantime, the Information Commissioner, Elizabeth Denham, who heads the UK’s GDPR enforcement authority, has welcomed this expanded Ofcom as “an important step forward in addressing people’s growing mistrust of social media and online services.”

She continues, in an ICO press release on the heels of the DCMS announcement, “the scales are falling from our eyes as we begin to question who has control over what we see and how our personal data is used.”

If you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742.

Harry Smithson, 14th February 2020

What does the law say about protecting your health and other sensitive data?

Health data, identity theft and fraud are among the most significant concerns of data protection, especially where sensitive personal data is concerned. Now the Information Commissioner’s Office has published detailed guidance on how data controllers should protect and handle this ‘Special Category’ data.

Special category data

Known as the most sensitive category of personal data, special category data concerns information on a person’s:

  • health
  • sex life or sexual orientation
  • racial or ethnic origin
  • political opinions
  • religious or philosophical beliefs
  • membership of a trade union
  • genetic data
  • biometric data used to uniquely identify a person, such as fingerprint or facial recognition data

Special care must be taken when processing sensitive data. Because of its sensitive nature, there is a high risk to individuals if such data were to fall into the wrong hands. It is illegal to process any of the above categories of data without a specific lawful basis.

So, data controllers MUST select one of the following legal grounds before processing:

  • explicit consent
  • obligations in employment
  • social security and social protection law
  • to protect vital interests
  • processing by not-for-profit bodies
  • manifestly made public
  • establish, exercise or defend legal claims
  • substantial public interest
  • preventative or occupational medicine
  • public health
  • research purposes.

‘Special Category’ data must also be given extra levels of security to protect it.  For example, limiting the number of individuals who may access such data, minimising the amount of data collected, stronger access controls – these and other such measures help protect the privacy of the individual, and to maintain the integrity and confidentiality of the data.
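
The legal grounds above lend themselves to a simple gate in code. Below is a minimal, hypothetical Python sketch (the category and ground labels are invented shorthand, not statutory terms) of a check that refuses to process special category data unless a recognised legal ground has been recorded first:

```python
# Hypothetical sketch: refuse to process special category data unless a
# documented legal ground has been recorded. Labels are illustrative only.
SPECIAL_CATEGORIES = {
    "health", "sex_life_or_orientation", "racial_or_ethnic_origin",
    "political_opinions", "religious_or_philosophical_beliefs",
    "trade_union_membership", "genetic_data", "biometric_data",
}

LEGAL_GROUNDS = {
    "explicit_consent", "employment_obligations", "social_protection_law",
    "vital_interests", "not_for_profit_processing", "manifestly_made_public",
    "legal_claims", "substantial_public_interest", "occupational_medicine",
    "public_health", "research",
}

def check_processing_allowed(data_category, recorded_ground=None):
    """Return True only when processing is permissible for this category."""
    if data_category not in SPECIAL_CATEGORIES:
        return True  # ordinary personal data: the usual lawful bases apply
    # Special category data: a specific recorded ground is mandatory.
    return recorded_ground in LEGAL_GROUNDS
```

A real compliance workflow would of course record *which* ground applies and when it was selected, but even this toy gate captures the rule: no recognised ground, no processing.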

If you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742

Gareth Evans, 15th November 2019

US Privacy Bill

On October 11, 2019, California Governor Gavin Newsom signed the remaining amendments to the California Consumer Privacy Act (CCPA) into law. The CCPA provides unprecedented privacy rights to California residents, similar to those enjoyed by EU citizens since the implementation of GDPR. Most companies that do business in California will need to comply with the requirements of the new law. The deadline for compliance with the CCPA is 1st January 2020, though some commentators believe this deadline may be extended.

Other US states are already considering introducing privacy legislation reflecting the measures taken by California. However, events are moving quickly…

On 5th November two Californian Democrat Congresswomen, Anna G. Eshoo and Zoe Lofgren, introduced an Online Privacy Bill to the US House of Representatives. If enacted, the bill would create a federal Data Protection Agency (DPA) covering the whole of the US.

Corporate Data Privacy Obligations

The draft legislation imposes a raft of obligations on organisations, including:

  • disclose why they need to collect and process data
  • minimise employee and contractor access to personal data
  • not disclose or sell personal information without explicit consent
  • not use private communications such as email to target ads or for “other invasive purposes”

The legislation attempts to tackle a range of privacy abuses. This is illustrated by the requirement for organisations to “notify the agency (the DPA) and users of breaches and data sharing abuses, e.g., Cambridge Analytica.”

Citizens’ Data Privacy Rights

The bill would give citizens the right to:

  • access, correct, delete, and transfer data about them;
  • request a human review of impactful automated decisions;
  • give opt-in consent before their data is used for machine learning / A.I. algorithms;
  • be informed if a covered entity has collected their information; and
  • choose how long their data can be kept.

Sound familiar?

If you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742. You can find more of our blogs here.

Gareth Evans, 15th November 2019

Politics. Fines. Data Deletion.

GDPR Regulations begin to bite

We are now beginning to see the impact of the GDPR across politics, businesses and public services. With the upcoming UK general election, the ICO is issuing timely reminders. In Europe we are starting to see large fines being levied for GDPR breaches.

ICO Issues Letter to UK Political Parties

In a timely reminder the Information Commissioner has written to 13 political parties in the UK. The letter reminds them of their legal obligations regarding the use of Personal Data in the lead-up to the General Election. The ICO letter highlights the need for parties to:

  • provide individuals with clear and accessible information about how their personal data is being used.  This includes

    • data obtained directly from individuals

    • data obtained from third parties, including data brokers 

    • inferred data – i.e. data that is inferred from observed behaviour, such as reading or buying habits, responses to advertising and so on

  • demonstrate compliance with the law. The scope here includes any third-party data processors.  For political parties, this specifically includes data analytics providers and online campaigning platforms
  • have the appropriate records of consent from individuals (where consent is the legal basis for processing) to send political messages through electronic channels (texts, emails)
  • identify lawful bases for processing special category data, such as political opinions and ethnicity.

This places political parties on the same basis as commercial organisations under UK law. 

Record Fine in Austria

The Austrian Data Protection Authority has imposed an €18 million fine on the Austrian Postal Service, Österreichische Post AG (“ÖPAG”). After an investigation, the Austrian DPA established that ÖPAG had, among other violations, processed and sold data regarding its customers’ political allegiances, in breach of the GDPR.

The fine is subject to an appeal.

Record Fine in Germany

On November 5, 2019, the Berlin Commissioner for Data Protection and Freedom of Information announced that it had imposed the highest fine issued in Germany since the EU GDPR became applicable.  Deutsche Wohnen SE, a real estate company, was fined  €14.5 million.

After onsite inspections, the Berlin Commissioner noticed the company was retaining personal data of tenants for an unlimited period. It had not examined whether the retention was legitimate or necessary.

Data should be removed without delay once it is no longer needed for the specific purpose for which it was collected. Deutsche Wohnen SE was using an archiving system that did not enable the removal of such data. Affected data related to financial and personal circumstances, such as bank statements, training contracts, tax, social and health insurance data.

This fine should act as a strong reminder to all companies to review and update their data retention and deletion policies, processes and supporting procedures.
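
The retention lesson can be made concrete with a small sketch. The following hypothetical Python snippet (record fields and retention periods are invented for illustration, not legal advice) flags records that have outlived their stated retention period so they can be reviewed for deletion:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: document kind -> maximum retention period.
# The periods here are examples only, not recommendations.
RETENTION = {
    "bank_statement": timedelta(days=6 * 365),
    "training_contract": timedelta(days=2 * 365),
}

def records_due_for_deletion(records, today):
    """Yield records whose retention period has expired as of `today`."""
    for record in records:
        limit = RETENTION.get(record["kind"])
        if limit is not None and today - record["collected_on"] > limit:
            yield record

records = [
    {"kind": "training_contract", "collected_on": date(2015, 1, 1)},
    {"kind": "bank_statement", "collected_on": date(2019, 6, 1)},
]
due = list(records_due_for_deletion(records, date(2019, 11, 11)))
# The 2015 training contract is past its period; the 2019 bank statement is not.
```

The Deutsche Wohnen finding was precisely that no such check, or any deletion mechanism, existed; a periodic sweep like this, wired to an archiving system that actually supports erasure, is the minimum a retention policy needs.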

More news later this week. In the meantime, if you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742.  You can find more blogs here.

Gareth Evans, 11th November 2019

Politics and social media

Politics, Social Media and Data Protection

This has been a week where the combination of politics, social media and data protection have been much in evidence.

Twitter political advertising ban

Twitter boss Jack Dorsey decided to ban political advertising on Twitter globally, which has focussed attention on the use of personal data in targeting political messages. This has gained traction in the UK particularly as it coincides with an unscheduled General Election campaign.

Facebook agrees to pay maximum fine 

At the same time, the ICO announced an agreement with Facebook over their investigation into the misuse of personal data in political campaigns. The investigation began in 2017.

As part of that investigation, on 24 October 2018 the ICO issued a monetary penalty notice (MPN) of £500,000 against Facebook.  £500,000 was the maximum allowed under the Data Protection Act (DPA) 1998.  The ICO identified “suspected failings related to compliance with the UK data protection principles covering lawful processing of data and data security”.  

Following an appeal referred to a Tribunal, Facebook and the ICO have agreed to withdraw their respective appeals. Facebook has made no admission of liability, but has agreed to pay the £500,000 fine.

In a statement following the joint agreement, Facebook’s General Counsel said:  

“The ICO has stated that it has not discovered evidence that the data of Facebook users in the EU was transferred to Cambridge Analytica. However, we look forward to continuing to cooperate with the ICO’s wider and ongoing investigation into the use of data analytics for political purposes.”

The ICO’s fine is the maximum available under the DPA 1998. Under current law (which implements the GDPR), sanctions can be up to 4% of annual global turnover or €20 million – whichever is greater.

Facebook withdraws political campaigns

In the spirit of cooperation on the responsible use of data analytics in political communications, Facebook has withdrawn a number of political communications. The Government’s MyTown campaign was aimed at key marginal seats.  It was withdrawn as it did not contain the appropriate disclaimers.  In addition, an advert by the Fair Tax Campaign was withdrawn because it did not disclose that it was sponsored content.   

More news next week. In the meantime, if you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742.  You can find more blogs here.

Gareth Evans, 5th November 2019

There Now Follows A Brief Interlude…

This blog, charting the progress of the UK towards Brexit and its impacts from a data protection perspective, has been running for a few months now. It seems that every week there has been a new twist or sudden change in direction.

This week, again contrary to many commentators’ predictions, came the news that we have a General Election in six weeks’ time, on 12th December. It is highly unusual to hold an Election so late in the year, a fact that adds to the overall sense of uncertainty over the final outcome.   Voters are likely to cast their vote in line with their Remain or Leave preference as much as their traditional party allegiance.  It may well throw up some surprises.

What is certain, however, is that during the six weeks of the election campaign Brexit and the Withdrawal Agreement Bill will not progress. There now follows, as broadcasters used to say, a brief interlude. In the meantime, we will continue to monitor and report on data protection developments both in the UK and internationally on our blogs.

Follow the money

And if readers are concerned about the fate of the millions of commemorative 50 pence pieces minted to celebrate the UK’s departure from the EU on 31st October, I understand they have been melted down. They will be re-minted (if that’s the word) when a new Brexit date has been secured.

Please contact us if you have any queries or concerns about how Brexit will affect your business.  Call 01787 277742 or email teambrexit@datacompliant.co.uk

Gareth Evans, 1st November 2019

The Countdown Clock Stalls

Shortly after he was installed in post, the new Chancellor, Sajid Javid, announced that three million new 50 pence coins celebrating Brexit would be issued on 31st October to mark the UK’s exit from the EU.  Millions more were set to be minted in the following months. The move was supposed to underline the Government’s determination to conclude Brexit by their Halloween deadline. Events of the last week mean that the Chancellor may have to shelve plans to issue millions of shiny new 50 pence coins.  How did we get to this situation?

The Prime Minister brought back his version of the Withdrawal Agreement hotfoot from Brussels, cutting short his time at the European Summit of EU heads of state. As widely predicted, a Saturday sitting of Parliament was called to vote on the deal. Despite the Government’s best intentions, the Withdrawal Agreement Bill failed to reach its 2nd Reading (the first hurdle for any Bill), derailed by Sir Oliver Letwin’s amendment.

A new week saw another attempt to progress the Withdrawal Agreement Bill (now referred to as “WAB”). On Tuesday the Bill returned to Parliament and, remarkably, for the first time in four attempts it won a majority. But that wasn’t the end of the story. In order to progress, the Bill needed to secure time for Parliamentary scrutiny. The Government had proposed two days of Parliamentary time. This proposal drew considerable opposition – and not just from official Opposition MPs. As a result, the programme motion containing the timetable was rejected. The vote brought the progress of the WAB to a juddering halt. Suddenly the Government was forced to seek an extension from the EU to the passage of the WAB, possibly for as long as three months. The 31st October deadline for Brexit now looks very unlikely. Treasury officials so far remain tight-lipped on the fate of the three million commemorative 50 pence coins.

We are now in a period of political stand-off. The Prime Minister has offered considerably more Parliamentary time to scrutinise and debate the 130-odd pages of the WAB, provided Opposition MPs agree to a General Election on 12th December. Meanwhile the EU has yet to decide on the length of the extension it is prepared to grant the UK to agree the terms of withdrawal. Opposition parties in the UK will not agree to a General Election until they know the terms the EU is prepared to offer. The stand-off will surely resolve itself but, at the time of writing, it is hard to predict the precise terms of this resolution.

Data Protection and the WAB

The Withdrawal Agreement Bill and accompanying Explanatory Notes are very light on data protection provisions. Each contains a single paragraph on data protection making clear that none of the current arrangements will change. This is good news for businesses planning for the future.

A good deal of the WAB is concerned with the trading status of Northern Ireland and its relationship with the remaining 27 EU Member States.

As it stands the WAB makes Northern Ireland a member of the EU Customs Union while remaining in the United Kingdom. This is a significant advantage for Northern Irish businesses that seek to move goods or services across the border with EU Member States. The legislation is still in draft form and remains subject to amendment, but if UK businesses wish to avoid difficulties in receiving data from other EU members, they could derive significant advantage from locating their data processing capabilities in Ulster rather than on the UK mainland. Just a thought…

Please contact us if you have any queries or concerns about how Brexit will affect your business.  Call 01787 277742 or email teambrexit@datacompliant.co.uk

Gareth Evans, 25th October, 2019

How to conduct a Data Protection Impact Assessment (DPIA) in 8 simple steps

Many business activities these days entail significant amounts of data processing and transfer. It’s not always clear-cut which of your organisation’s activities legally require an impact assessment on the use of personal data – i.e. a Data Protection Impact Assessment (DPIA) – and which do not.

People may be familiar with Privacy Impact Assessments (PIAs), which were advised as best-practice by the Information Commissioner before the EU’s GDPR made DPIAs mandatory for certain activities. Now the focus is not so much on the obligation to meet individuals’ privacy expectations, but on the necessity to safeguard everyone’s data protection rights.

DPIAs are crucial records to demonstrate compliance with data protection law. In GDPR terms, they are evidence of transparency and accountability. They protect your clients, your staff, your partners and any potential third parties. Being vigilant against data protection breaches is good for everyone – with cybercrime on the rise, it’s important that organisations prevent unscrupulous agents from exploiting personal information.

In this blog, we’ll go through a step-by-step guide for conducting a DPIA. But first, let’s see what sort of things your organisation might be doing that need a DPIA.

When is a DPIA required?

The regulations are clear: DPIAs are mandatory for data processing that is “likely to result in a high risk to the rights and freedoms” of individuals. This can be during a current activity, or before a planned project. DPIAs can range in scope, relative to the scope of the processing.

Here are some examples of projects when a DPIA is necessary:

  • A new IT system for storing and accessing personal data;
  • New use of technology such as an app;
  • A data sharing initiative in which two or more organisations wish to pool or link sets of personal data;
  • A proposal to identify people of a specific demographic for the purpose of commercial or other activities;
  • Using existing data for a different purpose;
  • A new surveillance system or software/hardware changes to the existing system; or
  • A new database that consolidates information from different departments or sections of the organisation.

The GDPR also has a couple more conditions for a DPIA to be mandatory, namely:

  • Any evaluation based on automated processing, including profiling and automated decision-making, especially where it can have legal or similarly significant consequences for someone; and
  • The processing of large quantities of special category personal data (formerly known as sensitive personal data).

An eight-step guide to your DPIA

  • Identify the need for a DPIA
    • Looking at the list above should give you an idea of whether a DPIA will be required. But there are also various ‘screening questions’ that should be asked early in a project’s development. Importantly, the data protection team should assess the project’s potential impacts on individuals’ privacy rights. Internal stakeholders should also be consulted and considered.
  • Describe the data flows
    • Explain how information will be collected, used and stored. This is important to address the risk of ‘function creep,’ i.e. when data ends up being used for different purposes, which may have unintended consequences.
  • Identify privacy and related risks
    • Identify and record the risks that relate to individuals’ privacy, including clients and staff.
    • Also identify corporate or organisational risks, for example the risks of non-compliance, such as fines, or a loss of customers’ trust. This involves a compliance check with the Principles of the Data Protection Act 2018 (the UK’s GDPR legislation).
  • Identify and evaluate privacy solutions
    • With the risks recorded, find ways to eliminate or minimise these risks. Consider doing cost/benefit analyses of each possible solution and consider their overall impact.
  • Sign off and record DPIA outcomes
    • Obtain the appropriate sign-off and acknowledgements throughout your organisation. A record of your DPIA evaluations and decisions should be made available for consultation during and after the project.
  • Consult with internal and external stakeholders throughout the project
    • This is not so much a step as an ongoing process. Commit to being transparent with stakeholders about the DPIA process. Listen to what your stakeholders have to say and make use of their expertise; this includes employees as well as customers. Being open to consultation, with clear communication channels for stakeholders to raise data protection concerns or ideas, will be extremely useful.
  • Ongoing monitoring
    • The DPIA’s results should be fed back into the wider project management process. You should take the time to make sure that each stage of the DPIA has been implemented properly, and that the objectives are being met.
    • Remember – if the project changes in scope, or its aims develop over the project lifecycle, you may need to revisit step one and make the appropriate reassessments.

This brief outline should help you understand when a DPIA is appropriate and how to structure one. Eventually, these assessment processes will be second nature and an integral part of your project management system. Good luck!
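
As a minimal illustration of step one, the DPIA triggers listed earlier can be encoded as a simple screening checklist. This is a hypothetical Python sketch; the question wording is illustrative, not an official ICO checklist:

```python
# Hypothetical screening checklist for step one: identify the need for a DPIA.
# Questions mirror the triggers discussed above; wording is illustrative only.
SCREENING_QUESTIONS = [
    "Will the project use a new IT system to store or access personal data?",
    "Does it involve new use of technology, such as an app?",
    "Will personal data be pooled or linked across organisations?",
    "Will people of a specific demographic be identified for commercial or other activities?",
    "Will existing data be used for a different purpose?",
    "Is a surveillance system being introduced or changed?",
    "Will a new database consolidate information across departments?",
    "Is there automated decision-making or profiling with significant or legal consequences?",
    "Will large quantities of special category data be processed?",
]

def dpia_required(answers):
    """A 'yes' to any screening question means a DPIA is likely required."""
    return any(answers)

def screen(yes_no_answers):
    """Pair each answer with its question and return the overall verdict."""
    flagged = [q for q, a in zip(SCREENING_QUESTIONS, yes_no_answers) if a]
    return dpia_required(yes_no_answers), flagged
```

In practice the screening record itself (who answered, when, and why) should be kept, since the DPIA's value as compliance evidence depends on that audit trail.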

If you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742

Harry Smithson, 21st October 2019

Brexit. 14 Days to Go – A Declaration of Intent

Well, who saw that coming? Suddenly, at about 10.30 on Thursday morning, came the announcement that the UK Government and the EU have agreed a Brexit deal. It came as something of a surprise, as public pronouncements from both sides of the negotiations had suggested a deal was increasingly unlikely. Maybe this was expectation management in action, because out of the blue on Thursday morning came a revised Withdrawal Agreement. Importantly, the Agreement was accompanied by a new version of the Political Declaration, which outlines the direction of future UK/EU negotiations (more of this below).

Shortly after the announcement of a deal came the news that the Westminster Parliament had voted to sit on Saturday in order to ratify the Withdrawal Agreement that had been hammered out. With the Democratic Unionist Party and others refusing to support the agreement, it is far from clear that the Agreement will win Parliamentary backing.

The Importance of Data Flows

The Political Declaration that sits alongside the draft Withdrawal Agreement is a statement of intent and is not legally binding on either the UK or the EU. Nevertheless, it is worth pointing out that right up front in the Political Declaration is a section on Data Protection. The prominence of this section – second in the list of “Initial Provisions” – suggests a genuine wish to secure a basis for the transfer of data between the UK and EU once the UK has exited. The opening paragraph of the section reads as follows:

In view of the importance of data flows and exchanges across the future relationship, the Parties [the UK and EU] are committed to ensuring a high level of personal data protection to facilitate such flows between them.

It continues

The Union’s data protection rules provide for a framework allowing the European Commission to recognise a third country’s data protection standards as providing an adequate level of protection, thereby facilitating transfers of personal data to that third country. On the basis of this framework, the European Commission will start the assessments with respect to the United Kingdom as soon as possible after the United Kingdom’s withdrawal, endeavouring to adopt decisions by the end of 2020…

Positive Data Protection News for Businesses

For businesses transferring data to or from EU Member States this is encouraging. It has always been clear that, should the UK leave the EU, it would become a “Third Country” for data protection purposes. Transfers of data from EU member states to the UK would then have to meet an EU assessment of equivalent levels of data protection. We have seen several countries struggle to achieve equivalence standards.

While the UK has written the provisions of the GDPR into UK law, it would appear, on the face of it, to be a formality to pass the European Commission’s assessment. With the publication of the Political Declaration, the EU has signalled its intention to make the equivalence hurdle as easy as possible to clear.

It is currently unclear if the Withdrawal Agreement will pass through Parliament but the crucial importance of efficient and secure data flows between the UK and Europe has been underlined by this week’s Political Declaration.

Please contact us if you have any queries or concerns about how Brexit will affect your business, or if you need help with your data protection.  Call 01787 277742 or email teambrexit@datacompliant.co.uk

Gareth Evans, 18th October, 2019