Tag Archives: data privacy

AI: Balancing Innovation, Ethics, Privacy & Governance  

After last week’s AI Action Summit in Paris, AI ethics and safety legislation has become a hot topic globally.  Various regions are taking different approaches. U.S. Vice President J.D. Vance made it very clear that the Trump administration was firmly opposed to “excessive regulation” of AI, and argued that it would stifle innovation and hinder the growth of the AI industry.

Global Divide in AI Regulation

With different regions of the world taking different approaches, the landscape is complex.  Even within the US, approaches are divided.  In the absence of federal guidance, some states are actively implementing their own state-level AI governance laws to address ethical and safety concerns.  These, of course, will now conflict with the current federal stance, which leans towards minimal regulation in favour of rapid AI development.

Global AI race risks safety, privacy and ethics

Globally, it’s a race, with China and the US at the forefront of AI development. China’s AI strategy focuses on becoming the world leader by 2030, with significant investments in research and development. The US has a similar goal and is doubling its AI research investment. Britain’s Prime Minister Keir Starmer also has ambitions for rapid development.  But the global competitive race clearly risks compromising ethical considerations, safety and sustainability in favour of innovation and rapid development.

Trustworthy AI governance

So it is somewhat reassuring that the data protection authorities of the UK, South Korea, France, Ireland and Australia have issued a joint statement on “building trustworthy data governance frameworks to encourage development of innovative and privacy-protective AI”.  It does at least show that these countries are making a concerted effort to balance innovation with ethical, privacy and safety considerations.

In summary, the joint statement:

  • States the need for AI to be developed and deployed in accordance with data protection and privacy rules, including robust data governance frameworks, and embedding privacy-by-design into AI systems from the start of the planning process
  • Aims to provide legal certainty and safeguards including transparency and fundamental rights
  • Commits to clarifying the legal bases for processing personal data in the context of AI
  • Commits the countries to exchanging and establishing a shared understanding of proportionate security measures, which will be updated to keep up with evolving AI data processing activities
  • They will monitor the technical and societal impacts of AI and leverage the expertise and experience of Data Protection Authorities and other relevant entities
  • They aim to reduce legal uncertainty, while creating opportunities for innovation in a compliant environment
  • Commits to strengthening interaction with other authorities to improve consistency between the various regulatory frameworks for AI systems, tools and applications

It does not, however, address other concerning issues such as:

  • Bias and fairness (for example in areas such as hiring, lending and law enforcement). However, the EU’s AI Act works towards mitigating these biases
  • Environmental impact (including significant electricity demand, substantial drinking water consumption, and the raw-material extraction and electronic waste involved in producing and transporting high-performance computing hardware). The Artificial Intelligence Environmental Impacts Act of 2024 in the US (if Trump doesn’t repeal it) and UNEP’s guidelines are steps towards addressing these concerns.

Data Protection Legislation Applies

In essence, regardless of specific AI legislation and guidelines, the data protection fundamentals do not change just because the processing involves AI. All AI personal data processing must abide by the prevailing data protection legislation – wherever in the world you are. 

Data Compliant

If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742.  And for more information about how to meet your AI obligations, please see here.

Victoria Tuffill

17th February 2025

Data Protection and Privacy Impacts of the New UK Data (Use and Access) Bill

Background

On Wednesday 23 October 2024, the UK Government published its Data (Use and Access) Bill (“DUA“). It promised to “harness the enormous power of data to boost the UK economy by £10 billion” and “unlock the secure and effective use of data for the public interest“. 

The DUA mirrors many of the concepts and provisions from the previous Government’s abandoned Data Protection and Digital Information Bill (“DPDI“), though there are subtle changes. The DUA appears to place greater focus on data sharing and digital. 

It is worth noting that the EU is set to review the UK’s data transfer adequacy status in mid-2025. Maintaining adequacy status is vital to the UK. (Possibly) as a result, some of the more contentious issues included in the discarded DPDI have been removed from the DUA. 

With the mid-2025 adequacy review date in mind, the government will undoubtedly try to get the Bill through as quickly as possible. After two readings in the House of Lords, it is now at Committee Stage.

DUA – Key Points for organisations

The key points of the DUA are:

  • UK Adequacy Status:  As stated above, the EU is reviewing the UK’s adequacy status in mid-2025.
  • Accountability requirements:  in the DPDI, there were plans to amend and simplify the accountability obligations required under GDPR.  These have NOT been carried over into the DUA.  Specifically there are to be no changes to:
    • the requirements for a DPO
    • requirements for Records of Processing Activities
    • requirements for Data Protection Impact Assessments.
  • ICO Reform: The Information Commissioner’s Office will be replaced by a new corporate body called the Information Commission.  Executive members will be appointed and scrutinised by the Chair and non-executive members.  The Commission will be required to consider public interest factors around data protection – for example, the desirability of promoting innovation and competition.  There is also emphasis on protecting children in relation to data processing.
  • Special Category Data:  the Secretary of State has the power to add new special categories of data and to remove them; those that already exist in Article 9 may not be removed. 
  • Data Subject Access Requests (DSARs): The discarded DPDI included the concept of an exception for “vexatious” requests. This has NOT been included in the DUA. However, proportionality is a key consideration in the DUA, which makes responding to DSARs more straightforward, including by confirming that a DSAR search for personal data need only be “reasonable and proportionate”
    • The 30-day time period to complete a DSAR begins only after the organisation has confirmed the individual’s identity.
    • The DUA also helps businesses by turning common DSAR practices, based on ICO guidance, into law. This offers certainty for organisations. For example:
      • If an organisation has large amounts of information about the data subject, it may ask the subject to narrow down the information requested. 
      • While it seeks this information, it may briefly halt the time frame.
  • Legitimate Interests: there is a new concept of recognised legitimate interests where certain data processing activities will not require a full Legitimate Interest Assessment (LIA), specifically, for example:
    • safeguarding national security or public safety
    • responding to an emergency
    • crime prevention / investigation
    • public health
    • exercising data subject rights, regulatory functions or civil law claims. 
  • This list can be updated on an ongoing basis, subject to parliamentary approval. 
  • It is worth noting that the European Court of Justice has consistently ruled that any interest that is legal may be a legitimate interest – i.e. that a purely commercial interest can be a legitimate interest.
  • In addition, when conducting an LIA, it is acceptable to take into account not only the benefits to individuals, but also benefits to the environment (e.g. paper reduction) and the economy (e.g. generating growth and targeting budget spend efficiently).
  • Privacy and Electronic Communications Regulations:  PECR is included in the DUA, and is therefore aligned with the levels of fine available for GDPR breaches.  This is a massive increase from the £500,000 maximum fine currently in place.  In addition, the DPDI’s email soft opt-in for non-commercial organisations (such as charities) is NOT currently included (though lobbying is ongoing).
  • Cookie Consent Exemptions: The aim is to reduce the number of cookie consent banners.  The DUA allows the use of cookies without consent in specific circumstances, such as ensuring security or preventing fraud, collecting information for statistical purposes for the organisation’s own use, improving the website’s functionality and appearance for the user, and providing emergency assistance.  This is particularly beneficial to those parties who do not use advertising cookies – for example B2B websites.
  • Digital Verification Services: DUA aims to create a framework for trusted online identity verification services, moving away from paper-based and in-person tasks (e.g. registering births and deaths online). Companies providing digital verification tools must be certified against government standards and will receive a ‘trust mark’.
  • Smart Data Schemes: The introduction of smart data schemes will require businesses in sectors like financial services and public utilities to enable data interoperability and secure data sharing. This aims to enhance consumer confidence and drive innovation.
  • Data Access Provisions: The DUA introduces data access standards similar to the EU’s Data Governance Act, enabling controlled data sharing between businesses and public authorities. 
  • Automated Decision Making: The DUA will make it easier for organisations to adopt a broader use of automated decision-making for low-risk, beneficial processing – for example when using artificial intelligence (AI) systems. It limits the scope of the UK’s GDPR Article 22 to cover only “significant” decisions, and those based either in part or entirely on special category data. 
  • Data Transfers: the DUA replaces Chapter 5 of the UK GDPR with a new “data protection test” for the Secretary of State to consider international data transfers, in which the objective is to ensure standards are not materially lower than in the UK.  This differs from the EU approach which looks for equivalence.
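
The DSAR timing rules above – a 30-day clock that starts at identity confirmation and pauses while a request to narrow the scope is outstanding – lend themselves to a simple calculation. The sketch below is a simplified illustration of that logic under assumed rules, not legal advice:

```python
from datetime import date, timedelta
from typing import Optional

def dsar_deadline(identity_confirmed: date,
                  pause_start: Optional[date] = None,
                  pause_end: Optional[date] = None) -> date:
    """Estimate a DSAR response deadline: a 30-day clock starting once
    identity is confirmed, halted while a request to narrow the scope
    is outstanding. Simplified illustration only, not legal advice."""
    deadline = identity_confirmed + timedelta(days=30)
    if pause_start and pause_end:
        # The clock stops while clarification is awaited, so the
        # paused days are added back onto the deadline.
        deadline += pause_end - pause_start
    return deadline

# Identity confirmed 1 March 2025; scope query outstanding 5–10 March.
print(dsar_deadline(date(2025, 3, 1), date(2025, 3, 5), date(2025, 3, 10)))
# -> 2025-04-05
```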

Timetable

With the upcoming adequacy review in mind, it seems likely that the government is trying to get the Bill through as quickly as possible – it has already had two readings in the House of Lords and is currently at Committee Stage.

Data Compliant

If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742.  And for more information about how to meet your accountability obligations – both before and after the DUA comes into force – please see here.

Victoria Tuffill

18th December 2024

US Privacy Bill

On October 11, 2019, California Governor Gavin Newsom signed the remaining amendments to the California Consumer Privacy Act (CCPA) into law.  The CCPA provides unprecedented privacy rights to California residents, similar to those enjoyed by EU citizens since the implementation of GDPR. Most companies that do business in California will need to comply with the requirements of the new law.  The deadline for compliance with the CCPA is 1st January 2020, though some commentators believe that this deadline may be extended.

Other US states are already considering introducing privacy legislation reflecting the measures taken by California. However, events are moving quickly…

On 5th November two Californian Democrat Congresswomen, Anna G. Eshoo and Zoe Lofgren, introduced an Online Privacy Bill to the US House of Representatives.  If successfully enacted, the Act would create a federal Data Protection Agency (DPA) covering the whole of the US.

Corporate Data Privacy Obligations

The draft legislation imposes a raft of obligations on organisations, including:

  • disclose why they need to collect and process data
  • minimise employee and contractor access to personal data
  • not disclose or sell personal information without explicit consent
  • not use private communications such as email to target ads or for “other invasive purposes”

The legislation attempts to tackle a range of privacy abuses. This is illustrated by the requirement for organisations to “notify the agency (the DPA) and users of breaches and data sharing abuses, e.g., Cambridge Analytica.”

Citizens Data Privacy Rights

The bill would give citizens the right to:

  • access, correct, delete, and transfer data about them;
  • request a human review of impactful automated decisions;
  • give opt-in consent before their data is used for machine learning / A.I. algorithms;
  • be informed if a covered entity has collected their information; and choose how long their data can be kept

Sound familiar?

If you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742. You can find more of our blogs here.

Gareth Evans, 15th November 2019

Politics. Fines. Data Deletion.

GDPR Regulations begin to bite

We are now beginning to see the impact of the GDPR regulations across politics, businesses and public services.  With the upcoming UK general election, the ICO is issuing timely reminders.  In Europe we are starting to see large fines being levied for GDPR breaches.

ICO Issues Letter to UK Political Parties

In a timely reminder the Information Commissioner has written to 13 political parties in the UK. The letter reminds them of their legal obligations regarding the use of Personal Data in the lead-up to the General Election. The ICO letter highlights the need for parties to:

  • provide individuals with clear and accessible information about how their personal data is being used.  This includes:
    • data obtained directly from individuals
    • data obtained from third parties, including data brokers
    • inferred data – ie data that is inferred from observed behaviour, such as reading or buying habits, responses to advertising and so on
  • demonstrate compliance with the law. The scope here includes any third-party data processors.  For political parties, this specifically includes data analytics providers and online campaigning platforms
  • have the appropriate records of consent from individuals (where consent is the legal basis for processing) to send political messages through electronic channels (texts, emails)
  • identify lawful bases for processing special category data, such as political opinions and ethnicity.

This places political parties on the same basis as commercial organisations under UK law. 

Record Fine in Austria

The Austrian Data Protection Authority has imposed an €18 million fine on the Austrian Postal Service, Österreichische Post AG (“ÖPAG”).  After an investigation, the Austrian DPA established that ÖPAG processed and sold data regarding its customers’ political allegiances, amongst other violations of the GDPR.

The fine is subject to an appeal.

Record Fine in Germany

On November 5, 2019, the Berlin Commissioner for Data Protection and Freedom of Information announced that it had imposed the highest fine issued in Germany since the EU GDPR became applicable.  Deutsche Wohnen SE, a real estate company, was fined €14.5 million.

After onsite inspections, the Berlin Commissioner noticed the company was retaining personal data of tenants for an unlimited period. It had not examined whether the retention was legitimate or necessary.

Data should be removed without delay once it is no longer needed for the specific purpose for which it was collected. Deutsche Wohnen SE was using an archiving system that did not enable the removal of such data. The affected data related to financial and personal circumstances, such as bank statements, training contracts, and tax, social and health insurance data.

This fine should act as a strong reminder to all companies to review and update their data retention and deletion policies, processes and supporting procedures.
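
The kind of retention check the regulator found missing can be as simple as comparing each record’s age against a documented schedule. A minimal sketch – the categories and periods below are invented for illustration and are not recommended retention periods:

```python
from datetime import date, timedelta

# Hypothetical retention rules, in days, per record category. Real
# periods must come from your own retention schedule and legal advice.
RETENTION_DAYS = {
    "bank_statement": 6 * 365,
    "training_contract": 2 * 365,
}

def overdue_for_deletion(category: str, purpose_ended: date,
                         today: date) -> bool:
    """Flag a record whose retention period has expired, so it can be
    deleted rather than lingering in an archive indefinitely."""
    limit = purpose_ended + timedelta(days=RETENTION_DAYS[category])
    return today > limit

print(overdue_for_deletion("training_contract", date(2017, 1, 1),
                           date(2019, 11, 11)))  # True – past 2 years
```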

More news later this week. In the meantime, if you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742.  You can find more blogs here.

Gareth Evans, 11th November 2019

Politics and social media

Politics, Social Media and Data Protection

This has been a week where the combination of politics, social media and data protection have been much in evidence.

Twitter political advertising ban

Twitter boss Jack Dorsey decided to ban political advertising on Twitter globally, which has focussed attention on the use of personal data in targeting political messages. This has gained traction in the UK particularly as it coincides with an unscheduled General Election campaign.

Facebook agrees to pay maximum fine 

At the same time, the ICO announced an agreement with Facebook over their investigation into the misuse of personal data in political campaigns. The investigation began in 2017.

As part of that investigation, on 24 October 2018 the ICO issued a monetary penalty notice (MPN) of £500,000 against Facebook.  £500,000 was the maximum allowed under the Data Protection Act (DPA) 1998.  The ICO identified “suspected failings related to compliance with the UK data protection principles covering lawful processing of data and data security”.  

Following an appeal referred to a Tribunal, Facebook and the ICO have agreed to withdraw their respective appeals. Facebook has made no admission of liability, but has agreed to pay the £500,000 fine.

In a statement following the joint agreement, Facebook’s General Counsel said:  

“The ICO has stated that it has not discovered evidence that the data of Facebook users in the EU was transferred to Cambridge Analytica. However, we look forward to continuing to cooperate with the ICO’s wider and ongoing investigation into the use of data analytics for political purposes.”

The ICO’s fine is the maximum available under the DPA 1998.  Under current law (which implements the GDPR), sanctions can be up to 4% of annual global turnover or €20 million – whichever is greater.  
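
For a sense of scale, the GDPR ceiling is simply the greater of the two figures. A quick sketch (the turnover figures are illustrative):

```python
def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine for the most serious breaches:
    the greater of EUR 20 million or 4% of annual global turnover."""
    return max(20_000_000, 0.04 * annual_global_turnover_eur)

print(max_gdpr_fine(100_000_000))    # floor applies: 20000000
print(max_gdpr_fine(2_000_000_000))  # 4% applies:    80000000.0
```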

Facebook withdraws political campaigns

In the spirit of cooperation on the responsible use of data analytics in political communications, Facebook has withdrawn a number of political adverts. The Government’s MyTown campaign, aimed at key marginal seats, was withdrawn as it did not contain the appropriate disclaimers.  In addition, an advert by the Fair Tax Campaign was withdrawn because it did not disclose that it was sponsored content.   

More news next week. In the meantime, if you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742.  You can find more blogs here.

Gareth Evans, 5th November 2019

New US Privacy Bill on the Way

Core US Privacy Principles

On 18th November 2019, Democratic Senators issued a set of core principles that should underpin any proposed Federal Privacy legislation.

The principles cover several issues across four categories to protect consumer privacy: (1) establish data safeguards, (2) invigorate competition, (3) strengthen consumer and civil rights, and (4) impose real accountability.

New US Privacy Bill

Then on November 26, 2019, the senators unveiled a new comprehensive federal privacy bill entitled the Consumer Online Privacy Rights Act (“COPRA”).

The bill would create a new bureau within the Federal Trade Commission. The bureau would promote data security and strengthen the law at State level.

COPRA would, amongst other measures: 

  • grant citizens new privacy rights, including the rights to access, delete and correct their data, as well as a right to data portability;
  • require organisations to obtain express, affirmative consent for the collection and use of sensitive data;
  • prohibit certain types of personal data, including race, ethnicity and gender, from being used to discriminate in decisions on employment, credit, housing or education; and
  • require organisations using algorithms to take decisions to undertake an algorithmic decision-making impact assessment.

Interestingly, COPRA seeks to exclude small businesses with annual revenue of less than $25 million from its requirements, as long as they process the data of fewer than 100,000 individuals or households annually.

If enacted, the legislation is likely to impact businesses handling the data of US citizens.  It should also ease the transfer and processing of personal data for companies operating across different US States.

If you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742. You can find more of our blogs here.

Gareth Evans, 2nd December 2019

Data Protection Roundup: GDPR undermined by Facebook? Morrisons’ breach liability; Google’s iPhone snooping

I find it fascinating to watch how data protection in general and GDPR in particular play out with the huge multinationals which it has been designed to capture, and which arguably have the most to lose in terms of fines.  Facebook and Google are once again in the news in relation to their use of personal data.  And the High Court judgement against Morrisons sets a precedent which aligns with the GDPR’s intention that individuals’ data should be protected.

Google accused of bypassing privacy settings to harvest personal information of 5.4 million iPhone users between 2011 and 2012

The search engine tech giant Google is being taken to court by a group called Google You Owe Us, led by former Which? director Richard Lloyd. The group claims that several hundred pounds could be owed in compensation to the millions of victims of Google’s transgression against privacy rights, meaning Google could face a massive financial penalty.


Google breached DPA and PECR by misusing cookies

Google exploited cookies – small pieces of computer text that collect data from devices – to run large-scale targeted ad campaigns. In the UK, Google’s actions were in breach of the Data Protection Act (DPA) and the Privacy and Electronic Communications Regulations (PECR). For such breaches committed after the General Data Protection Regulation (GDPR) comes into force in late May 2018, organisations could face a fine of up to €20 million or 4% of annual global turnover (whichever is higher – and for the billion-dollar giant Google, obviously the latter).  However, this case relates to a period prior to GDPR.
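
For readers unfamiliar with the mechanics: a cookie is just a named value that a server asks the browser to store and send back on later requests, which is what makes cross-site ad tracking possible. A minimal sketch using Python’s standard library (the cookie name, value and domain are invented for illustration):

```python
from http.cookies import SimpleCookie

# The server's response asks the browser to store a tracking ID
# (the name, value and domain here are invented for illustration).
cookie = SimpleCookie()
cookie["ad_tracker_id"] = "abc123"
cookie["ad_tracker_id"]["domain"] = ".ads.example"
cookie["ad_tracker_id"]["path"] = "/"
print(cookie.output())  # the Set-Cookie header the browser receives

# On every later request to that domain, the browser echoes the value
# back, letting the ad network recognise the same device across sites.
echoed = SimpleCookie("ad_tracker_id=abc123")
print(echoed["ad_tracker_id"].value)  # abc123
```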


Did you go online with your iPhone? Were your privacy preferences ignored?

For several months in 2011 and 2012, Google stands accused of bypassing the default privacy settings on Apple phones in order to track the online behaviour of Safari users, by placing ad-tracking cookies onto the devices. This then enabled advertisers to target content to those devices and their users.

The Google activity has become known as the ‘Safari workaround,’ and while it affected various devices, the lawsuit filed in the High Court addresses the targeting of iPhone users.

Over 5 million people in Britain had an iPhone during the period.  “In all my years speaking up for consumers,” Mr Lloyd from Google You Owe Us states, “I’ve rarely seen such a massive abuse of trust where so many people have no way to seek redress on their own. Through this action, we will send a strong message to Google and other tech giants in Silicon Valley that we’re not afraid to fight back.”

According to the veteran privacy rights campaigner, Google claimed that he must go to California, the heartland of the Silicon revolution, if he wanted to pursue legal action against the firm, to which he responded, “It is disappointing that they are trying to hide behind procedural and jurisdictional issues rather than being held to account for their actions.”

According to the BBC, the broadcaster was told by Google that these legal proceedings are “not new” and that they “have defended similar cases before.” Google has stated that they do not believe the case has any merit and that they intend to contest it.

While there is no precedent in the UK for such massive action against Google, in the US Google has settled two large-scale litigation cases out of court. Regarding the same activity, the tech company agreed to pay a record $22.5m (£16.8m) in a case brought by the US Federal Trade Commission in 2012. It also made out-of-court settlements with a small number of British consumers.

According to the BBC, the case will probably be heard in the High Court in Spring 2018, a month or so prior to the enforcement of the GDPR.

 

Morrisons found liable for employee data breach

Morrisons workers brought a claim against the supermarket after a former member of staff, senior internal auditor Andrew Skelton (imprisoned as a result of his actions) stole and posted online confidential data (including salary and bank details) about nearly 100,000 employees.

In an historic High Court ruling, the supermarket has been found liable for Skelton’s actions, which means that those affected may claim compensation for the “upset and distress” caused.

The case is the first data leak class action in the UK.  Morrisons has said it will appeal the decision.

 

Facebook claims European data protection standards will not allow its pattern-recognition “suicide alert tool” to be used in the EU.


Facebook blames GDPR for its plans to withhold Suicide Prevention software from EU

Facebook’s decision to deny EU countries a pattern-recognition tool to alert authorities to users possibly suffering from depression or suicidal thoughts has been criticised as a move to undermine the upcoming tightening of EU-wide data protection standards, enshrined in the General Data Protection Regulation (GDPR).

Facebook has argued that their Artificial Intelligence (AI) programme which scans the social media network for troubling comments and posts that might indicate suicidal ideation will not be employed in EU countries on the grounds that European policy-makers and the public at large are too sensitive about privacy issues to allow site-wide scanning.

In a blogpost, Facebook’s VP of Product Management stated, “we are starting to roll out artificial intelligence outside the US to help identify when someone might be expressing thoughts of suicide, including on Facebook Live. This will eventually be available worldwide, except the EU.”

Tim Turner, a data consultant based in Manchester, has suggested that the move might be “a shot across the EU’s bows […] Facebook perhaps wants to undermine the GDPR — which doesn’t change many of the legal challenges significantly for this — and they’re using this as a method to do so.”

Mr Turner continues, “nobody could argue with wanting to save lives, and it could be a way of watering down legislation that is a challenge to Facebook’s data hungry business model. Without details of what they think the legal problems are with this, I’m not sure they deserve the benefit of the doubt.”

Written by Harry Smithson, 1st December 2017

 

 

Weekly Roundup: lack of data protection budgeting among UK businesses; international resolution to secure transparency among subcontractors; fine for ex-council worker

1 in 5 UK businesses have no data protection budget – compared to 4 in 5 local authorities 


A report by international email management company Mimecast states that a fifth of surveyed UK businesses do not have a specific budget dedicated to information security or data protection – a source of great concern ahead of the stringent General Data Protection Regulation (GDPR) coming into force in May 2018.


Over 80% of councils were found to have no funding towards meeting mandatory GDPR requirements

This reinforces the concerns raised by the information provided in response to an FOI request by M-Files Corporation in July, which found that four out of five councils had, at that time, yet to allocate funding towards meeting the new requirements of the GDPR.  That research also found that 56% of local authorities contacted had still not appointed a data protection officer, despite this being mandated by GDPR.

That such a substantial proportion of businesses have no explicit budgetary or financial commitment to combatting cybercrime and personal data abuse may be particularly unwelcome news to proponents and enforcers of the new GDPR. The Information Commissioner’s Office, the independent data protection authority, has been working hard over the last year to publicise and prepare British organisations for the impending legislation.

The lack of data protection budgeting is compounded by Mimecast’s findings that many UK businesses may not be monitoring their data efficiently. For instance, 15% of the surveyed organisations stated that they did not know whether they had suffered a data loss incident during the last year or not. 27% blamed human error for previous losses, which would indicate that a large number of organisations will need to start taking employee data protection and handling training much more seriously.

44% of the surveyed organisations suspect that their email systems contain sensitive personal information as defined under the GDPR, but only 17% believed that this information could be retrieved immediately. On average, it was calculated that British organisations would need eight hours to track down sensitive personal information.

The report suggests that a significant number of organisations are very underprepared for the increased responsibility and accountability demanded by the GDPR. For help and information on preparing for the GDPR, see the Data Compliant main site.

10th International Conference of Information Commissioners (ICIC 2017) resolves to tackle difficulties of access to information on outsourced public services

The Information Commissioner’s Office (ICO) has confirmed a resolution on international action for improving access to information frameworks surrounding contracted-out public services, a system which has seen increased use throughout Europe, and rapid growth in the UK since 2010.

Challenges have been arising for a couple of decades concerning the transparency of information about the “new modes of delivery for public services.” This is often because the analysis of the efficacy of subcontracted services can be rendered difficult when, due to the principle of competition in the private sector, certain information – particularly regarding the production process of public services – can escape public scrutiny on the grounds of the protection of commercial confidentiality.

The International Conference, jointly hosted by Information Commissioner Elizabeth Denham and Acting Scottish Information Commissioner Margaret Keyse, was attended by Commissioners of 39 jurisdictions from 30 countries and seven continents. The resolution was passed in Manchester on 21st September following dialogue with civil society groups.

The resolution highlights the “challenge of scrutinising public expenditure and the performance of services provided by outsourced contractors” and “the impact on important democratic values such as accountability and transparency and the wider pursuit of the public interest.”

The Conference summarised that the first step to be taken would be the promotion of “global open contracting standards,” presumably as a means of garnering consensus on the importance of transparency in this regard for the benefit of the public, researchers and policy-makers. A conference working group is to be formed to “share practice about different initiatives that have been developed to tackle the issue.”

The event lasted two days and ran with the title: ‘Trust, transparency and progressive information rights.’ Contributions were heard from academics, journalists, freedom of information campaigners and regulators.

Access to information on the grounds of individual rights and the safeguarding of public interests will be strengthened by the provisions of the GDPR. This resolution provides a reminder and an opportunity for organisations working as subcontractors to review the ways in which they store and handle data. Transparency and accountability, no longer considered in any way contradictory, are key watchwords for the clutch of data protection reforms taking place throughout the world. Many organisations would do well to assess whether they are in a position to meet the standards of good governance and best practice regarding data management, which will soon become a benchmark for consumer trust.

Ex-employee of Leicester City Council fined for stealing vulnerable people’s personal information

The ICO has confirmed the prosecution of an ex-council worker for unlawfully obtaining the personal information of service users of Leicester City Council’s Adult Social Care Department.

Personal data, including medical conditions, care and financial records were “unlawfully” obtained by an ex-council worker

The personal details of vulnerable people were taken without his employer’s consent, in breach of the Data Protection Act 1998. Before leaving the council, the individual sent 34 emails containing the personal information of 349 individuals – including sensitive personal data such as medical conditions, care and financial details, and records of debt – to a private email address.

The ICO’s Head of Enforcement Steve Eckersley stated, “Employees need to understand the consequences of taking people’s personal information with them when they leave a job role. It’s illegal and when you’re caught, you will be prosecuted.”

 

Harry Smithson, 29th September 2017
Data Compliant News Blog: Cyberattack threatens over 400,000 British consumers, Data Protection Bill 2017 published and fines levied on councils mishandling data

Equifax data breach – hackers may have access to hundreds of thousands of British consumers’ personal details

The Information Commissioner’s Office (ICO) is investigating a hack on Equifax, a large credit rating agency based in Atlanta, USA, to find out whether and to what extent the company’s British consumers’ personal details have been obtained by the hackers. The FBI is also said to be monitoring the situation.

The cyberattack, reported earlier this month, occurred between May and July. The company has already admitted that 143 million American customers’ personal details have been obtained by the hackers.

400,000 UK customers may be affected by Equifax breach

The US information that the hackers may have accessed includes names, social security numbers, dates of birth, addresses and driving licence details, as well as over 200,000 credit card numbers.

The ICO told Equifax that the company must warn British residents of the data breach and inform them of any information relating to them which has been obtained by the cyber attackers. The credit agency promptly issued alerts to the affected Britons, stating however that an ‘identity takeover’ was unlikely.

Britons would do well to be mindful that, once a hacker has a name, date of birth, email address and telephone number, it takes little effort to acquire the missing elements, which is why the ICO has warned members of the public to remain vigilant against unsolicited emails and communications. They should also be particularly wary of unexpected transactions or activity recorded on their financial statements.

Shares in Equifax saw considerable reductions throughout the week, and two of the company’s senior executives, the Chief Information Officer and the Chief Security Officer, have resigned with immediate effect.

The Data Protection Bill 2017, which includes GDPR, has been published

GDPR is included in its entirety in the UK’s Data Protection Bill 2017, now going through Parliament

On 14th September, the Department for Digital, Culture, Media and Sport published the Data Protection Bill 2017. The Bill has been anticipated since the Queen’s speech in June, in which the government outlined its plan to implement the European-wide data protection game-changer GDPR into British law.

Culture secretary Karen Bradley explains: “The Data Protection Bill will give people more control over their data, support businesses in their use of data, and prepare Britain for Brexit.  In the digital world strong cyber security and data protection go hand in hand. This Bill is a key component of our work to secure personal information online.”

While the Bill incorporates the GDPR, and therefore provides the basis for data-sharing and other adequacy agreements with the EU after Brexit, the government has stated that it managed to negotiate some ‘vital’ and ‘proportionate’ exemptions for the UK.

Exemptions are provided for journalists accessing personal data to expose wrongdoing or for the good of the public; for scientific and research organisations, such as museums, whose work would otherwise be hindered; for anti-doping bodies; for financial firms handling personal data on suspicion of terrorist financing or money laundering; and for employers where access to personal data is needed to fulfil the requirements of employment law.

The second reading of the Bill in Parliament will take place on 10th October, followed by a general debate on Brexit and data protection on the 12th.

As yet, there have been few critics of the proposed legislation outside certain industries whose use of big data makes them particularly susceptible to possible data protection breaches and massive fines (£17m or 4% of annual global turnover). Some industry leaders have called for exemptions, including the private pension giant Scottish Widows, which claimed GDPR-level regulations would make it impossible to contact some of its customers without breaking the law. However, according to the government, 80% of Britons do not believe that they have control over their information online, and the Bill enjoys widespread support at this point. The Shadow Cabinet has yet to offer any official response or criticism.

Islington Council fined £70,000 

The Information Commissioner’s Office (ICO) fined Islington Council £70,000 for failing to secure 89,000 people’s personal information on an online parking ticket system.

Design faults in the Council’s ‘Ticket Viewer’ system, which keeps CCTV images of parking offences, compromised the security of 89,000 people’s personal data. Some of this data falls under the category of sensitive personal information, e.g. medical details disclosed in support of an appeal against a parking fine.

Harry Smithson, 23rd September 2017

GDPR – ICO Puts Trust at the Heart of Data Processing

Information Commissioner’s Annual Report

The Information Commissioner’s Office (ICO) published its annual report on 13th July. This is the first annual report compiled by Information Commissioner Elizabeth Denham, who took up the post a year ago.

The report highlights the increased powers and expanding caseload and capacities of the regulator. At a time of increasing concern about the use (and abuse) of personal information, the ICO is seeing a great deal more work. This is, in part, reflected by an increase in staff numbers of around 8% year on year.

GDPR and Public Trust

The ICO’s foreword emphasises its commitment to regaining public trust in data controllers and processors. It is hoped that the changing laws will give the regulator an opportunity to rebuild individuals’ trust in the large organisations that handle personal information. The Commissioner states that “trust” will be “at the heart of what the Information Commissioner’s Office will do in the next four years.” Confidence in the digital economy is a consideration that the regulator acknowledges and aims to encourage, especially since the digital sector is growing 30% faster than any other part of the economy.

This echoes the government’s concerns regarding the digital economy and its relation to data protection principles that were enumerated in the Queen’s Speech and addressed by several measures including a Data Protection Bill, which is designed to implement the General Data Protection Regulation (GDPR).

In a year characterised by the impending replacement of the Data Protection Act 1998 (DPA) with the GDPR in May 2018, the report’s outline of major work undertaken leads with a nod to the many public, private and third sector organisations that will be preparing for the new legislative framework.

Consent

‘Consent’, which has become one of the watchwords of the GDPR (and a word that will be increasingly found on the bulletin boards and coffee mugs of marketing departments), will soon take on a stricter legal definition – an area in which the ICO anticipates organisations will seek detailed guidance.

Data Breaches

But the GDPR has by no means eclipsed the ICO’s other responsibilities. Nuisance calls, unsolicited marketing and data sharing have routinely seen organisations facing fines and other civil measures. Breaches of the DPA and the Privacy and Electronic Communications Regulations 2003 (PECR) by a number of charities – allegations first reported by the Daily Mail in 2015 – have led the ICO to issue 13 civil monetary penalties to the value of £181,000.

Indeed, some companies – Honda (which we reported on last month) being a notable example – have been fined for unsolicited marketing in breach of the DPA after sending emails asking customers to clarify their marketing preferences, which Honda maintained were a means of preparing for the GDPR. So while the ICO has committed a great deal of resources to GDPR preparation, it has by no means neglected upholding the current law, and has consistently made clear that it is not acceptable to break one law in preparation for another.

Monetary penalties

Overall, the ICO issued more civil monetary penalties for breaches of PECR than ever before (23), to the value of £1,923,000. It has also issued 16 fines for serious breaches of data protection principles totalling £1,624,500. It cannot be stated enough that after May 2018, these figures could skyrocket if organisations do not find ways of complying with the new, more expansive and rigorous legislation. Criminal prosecutions have seen a 267% increase, and the ICO has received 18,300 concerns regarding data protection – 2,000 more than last year.

Subject Access Requests (SARs)

Organisations handling a wide range of personal data may face an increasing number of Subject Access Requests (SARs). The report states that 42% of all concerns brought to the ICO where the nature was specified related to subject access. While these requests are provided for under the DPA (and will be upheld with more rigour as one of the data subject ‘rights’ under the GDPR), rather than freedom of information legislation, it nonetheless falls upon organisations of whatever size to be co-operative and compliant when the disclosure of information is required. It is important for organisations to train their staff to recognise a SAR and act promptly. Data controllers must recognise the importance of compliance not only with the law but with ICO audits and investigations, as well as the necessity for efficient and conscientious data handling.

For information about how DC can help you meet the requirements of GDPR,  please email dc@datacompliant.co.uk.

Harry Smithson, July 25th 2017