Tag Archives: data protection

Lessons from Darts: Team Dynamics in Data Protection

Teams are an essential part of life – from school to adulthood, from sports to business.  A well-functioning team leads to extraordinary achievements, whether in a local darts league or a data governance team.

The Darts Team Triumph

Consider my local darts team, which recently won the team title, along with individual singles titles. This victory wasn’t just about individual knowledge and talent; it was the result of shared goals, a strategy to achieve them, collaboration, strong mentoring, and mutual support. Each of our players’ unique skills, camaraderie and collective effort all contributed to the team’s overall success. 

Transferring Team Dynamics to Data Governance

The same principles apply to data protection governance teams. Every member of the team must understand its overall objectives, ensuring that they are responsible and accountable for data management and governance. The team will need a framework for success, including communication and collaboration, and creating and maintaining policies and procedures around data collection, privacy, compliance, integrity and security. And it must provide regular reports to senior management, who are ultimately accountable.

Roles, Goals and Data Stewardship

Individuals within the team will take on data stewardship roles.  In essence they will oversee the entire lifecycle of personal data from collection to deletion, and be accountable for compliance and security at all stages. All team members will support each other, sharing knowledge and expertise to help manage challenges and foster a culture of continuous improvement. And each will have their own individual areas of responsibility including embedding data protection throughout their own area of the business.

Education and Continuous Improvement

Like in darts, governance team members learn from each other’s techniques, and share knowledge, best practices and insights. This knowledge is then used to help build awareness throughout the organisation about data protection and data security, and to educate employees about crucial data protection principles.

Risk Management

Sports and business both carry risks, and the team must take responsibility for identifying, assessing and mitigating them – in data governance, for example through Data Protection Impact Assessments (DPIAs).  The team must also develop and execute its response plans so that it knows how to respond if there is a data breach or security incident.

Enabling Team Leaders

Team Leaders are crucial. They are pivotal in flowing down information to their specific areas of the business – in data governance, for example, it’s helpful to have leaders from IT, HR, Marketing, Operations, Payroll and so on. It’s those Team Leaders who will then ensure that everyone in their team understands their roles and responsibilities, and who provide the resources and training so that every individual in an organisation can thrive and contribute effectively.

Conclusion

Effective teams enable the individuals in your organisation to achieve more together than they ever could alone. With a data governance team that fosters collaboration, shared problem-solving and continuous education, your organisation will benefit from strong and highly successful outcomes.

Data Compliant International

If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742.  And for more information about how to meet your Accountability and Governance obligations, please see here. 

Data Protection and Privacy Impacts of the New UK Data (Use and Access) Bill

Background

On Wednesday 23 October 2024, the UK Government published its Data (Use and Access) Bill (“DUA”). It promised to “harness the enormous power of data to boost the UK economy by £10 billion” and “unlock the secure and effective use of data for the public interest”.

The DUA mirrors many of the concepts and provisions from the previous Government’s abandoned Data Protection and Digital Information Bill (“DPDI”), though there are subtle changes. The DUA appears to place greater focus on data sharing and digital services.

It is worth noting that the EU is set to review the UK’s data transfer adequacy status in mid-2025. Maintaining adequacy status is vital to the UK. (Possibly) as a result, some of the more contentious issues included in the discarded DPDI have been removed from the DUA. 

With the mid-2025 adequacy review date in mind, the government will undoubtedly try to get the Bill through as quickly as possible. After two readings in the House of Lords, it is now at Committee Stage.

DUA – Key Points for organisations

The key points of the DUA are:

  • UK Adequacy Status: As stated above, the EU is reviewing the UK’s adequacy status in mid-2025.
  • Accountability requirements:  in the DPDI, there were plans to amend and simplify the accountability obligations required under GDPR.  These have NOT been carried over into the DUA.  Specifically there are to be no changes to:
    • the requirements for a DPO
    • requirements for Records of Processing Activities
    • requirements for Data Protection Impact Assessments.
  • ICO Reform: The Information Commissioner’s Office will be replaced by a new corporate body called the Information Commission. Executive members will be appointed and scrutinised by the Chair and non-executive members. The Commission will be required to consider public interest factors around data protection. For example, it must consider the desirability of promoting innovation and competition. There is also emphasis on protecting children in relation to data processing.
  • Special Category Data:  the Secretary of State has the power to add and remove new special categories of data. Those that already exist in Article 9 may not be removed. 
  • Data Subject Access Requests (DSARs): The discarded DPDI included the concept of an exception around “vexatious” requests. This has NOT been included in the DUA. However, proportionality is a key consideration in the DUA, which makes responding to DSARs more straightforward, including by confirming that a DSAR search for personal data need only be “reasonable and proportionate”.
    • The 30-day time period to complete a DSAR begins only after the organisation has confirmed the individual’s identity.
    • The DUA also helps businesses by turning common DSAR practices, based on ICO guidance, into law. This offers certainty for organisations. For example:
      • If an organisation has large amounts of information about the data subject, it may ask the subject to narrow down the information requested. 
      • While it seeks this information, it may pause the response time frame.
  • Legitimate Interests: there is a new concept of recognised legitimate interests where certain data processing activities will not require a full Legitimate Interest Assessment (LIA), specifically, for example:
    • safeguarding national security or public safety
    • responding to an emergency
    • crime prevention / investigation
    • public health
    • exercising data subject rights, regulatory functions or civil law claims. 
  • This list can be updated over time, subject to parliamentary approval.
  • It is worth noting that the European Court of Justice has consistently ruled that any interest that is legal may be a legitimate interest – i.e. that a purely commercial interest can be a legitimate interest.
  • In addition, when conducting an LIA, it is acceptable to take into account not only the benefits to the individuals, but also benefits to the environment (e.g. paper reduction) and the economy (e.g. generating growth and spending budgets in a targeted manner).
  • Privacy and Electronic Communications Regulations: PECR is included in the DUA, and is therefore aligned with the levels of fines available for GDPR breaches. This is a massive increase from the current £500,000 maximum fine. In addition, the DPDI’s email soft opt-in for non-commercial organisations (such as charities) is NOT currently included (though lobbying is ongoing).
  • Cookie Consent Exemptions: The aim is to reduce the number of cookie consent banners. The DUA allows the use of cookies without consent in specific circumstances, such as ensuring security or preventing fraud, collecting statistical information for the site’s own use, improving website functionality and appearance for the user, and providing emergency assistance. This is particularly beneficial to parties who do not use advertising cookies – for example B2B websites.
  • Digital Verification Services: DUA aims to create a framework for trusted online identity verification services, moving away from paper-based and in-person tasks (e.g. registering births and deaths online). Companies providing digital verification tools must be certified against government standards and will receive a ‘trust mark’.
  • Smart Data Schemes: The introduction of smart data schemes will require businesses in sectors like financial services and public utilities to enable data interoperability and secure data sharing. This aims to enhance consumer confidence and drive innovation.
  • Data Access Provisions: The DUA introduces data access standards similar to the EU’s Data Governance Act, enabling controlled data sharing between businesses and public authorities. 
  • Automated Decision Making: The DUA will make it easier for organisations to adopt a broader use of automated decision-making for low-risk, beneficial processing – for example when using artificial intelligence (AI) systems. It limits the scope of the UK’s GDPR Article 22 to cover only “significant” decisions, and those based either in part or entirely on special category data. 
  • Data Transfers: the DUA replaces Chapter 5 of the UK GDPR with a new “data protection test” for the Secretary of State to consider international data transfers, in which the objective is to ensure standards are not materially lower than in the UK.  This differs from the EU approach which looks for equivalence.
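As an illustration of the DSAR timing points above – the clock starting only once identity is confirmed, and pausing while an organisation asks the requester to narrow the request – the deadline calculation can be sketched in a few lines of Python. The function name, the 30-day period and the example dates are illustrative assumptions, not text from the Bill:

```python
from datetime import date, timedelta

def dsar_deadline(identity_confirmed, pauses=(), period_days=30):
    """Compute a DSAR response deadline.

    The clock starts only once the requester's identity is confirmed,
    and is extended by any days during which the clock was paused
    (e.g. while asking the requester to narrow the request).
    """
    paused_days = sum((end - start).days for start, end in pauses)
    return identity_confirmed + timedelta(days=period_days + paused_days)

# Identity confirmed 1 March; clock paused for 5 days while the
# request was narrowed, so the deadline moves from 31 March to 5 April.
deadline = dsar_deadline(
    date(2025, 3, 1),
    pauses=[(date(2025, 3, 10), date(2025, 3, 15))],
)
print(deadline)  # 2025-04-05
```

In practice the statutory period and any permitted extensions should be taken from the final legislation and ICO guidance rather than hard-coded figures like these.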

Timetable

With the upcoming adequacy review in mind, it seems likely that the government is trying to get the Bill through as quickly as possible – it has already had two readings in the House of Lords and is currently at Committee Stage there.

Data Compliant

If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742. And, for more information about how to meet your accountability obligations – both before and after the DUA comes into force, please see here.

Victoria Tuffill

18th December 2024

EU Standard Contractual Clauses – Public Consultation

This month (September 2024), the European Commission announced that it plans to ask for public feedback on the EU Standard Contractual Clauses (SCCs) under the General Data Protection Regulation. The public consultation will take place in the fourth quarter of 2024, giving you an opportunity to have your views and opinions heard.

This is not unexpected – Article 97 of the GDPR requires the Commission to review the GDPR’s implementation every four years (see the 2020 Evaluation Report here). The upcoming 2024 review was expected to include an evaluation of the practical application of the SCCs.

New SCCs in 2025

According to the timeline, the public consultation is imminent and due to take place in the 4th quarter of 2024. This would be followed by a draft act, planned for Commission adoption in 2nd quarter of 2025.  You can find more information and a timeline here.

What are SCCs?

Standard contractual clauses are standardised, pre-approved model data protection clauses, which allow controllers and processors to meet their obligations under EU and / or UK data protection law. 

They are widely used as a tool for data transfers to third countries (those countries outside the EEA or the UK that do not have adequacy status). It is quite a simple matter for controllers and processors to incorporate them into their contractual arrangements.

The clauses contain data protection safeguards to make sure that personal data benefits from a high level of protection even when sent to a third country.  By adhering to the SCCs, data importers are contractually committed to abide by a set of data protection safeguards.

Can I change the text?

The core text cannot be changed. If parties do change the text themselves, they will no longer have the legal certainty offered by the EU act. If you amend the clauses, they can no longer be used as a basis for data transfers to third countries, unless they are approved by a national data protection authority as “ad hoc clauses”.

Even so, there are areas where the parties can make choices:

  • To select modules and / or specific options offered within the text
  • To complete the text where necessary (e.g. to specify time periods, supervisory authority and competent courts)
  • To complete the Annexes
  • To include additional safeguards that increase the level of protection for the data. 

Impact on UK use of SCCs

There is not yet any indication of the potential impact on the UK’s international data transfer Agreement (IDTA) or the Addendum to the EU’s SCCs; we would expect to hear more after the EU’s public consultation.

Victoria Tuffill – 13th September 2024

If you have any questions or concerns about how and when to use SCCs, please call 01787 277742 or email dc@datacompliant.co.uk

And please take a look at our services.

Facebook data breach – €265million fine

The Irish DPC has issued a fine of €265 million to Meta Platforms Ireland Limited (MPIL) – the data controller of the Facebook network – after a 19-month enquiry. The DPC also issued a reprimand and has imposed a range of specified remedial actions to be completed within three months.

While the Irish DPC is the lead regulator, this decision included cooperation with the other EU data protection supervisory authorities.  This has been a surprisingly swift process, largely due to the EU countries being in agreement over the issue.

The enquiry began in April 2021. Over 530 million Facebook users’ personal data — including email addresses and mobile phone numbers — were reported to have been exposed online. It appears that the data had been scraped maliciously from Facebook profiles, using a Contact Importer tool provided by Facebook. In September 2019, Facebook adjusted the tool to prevent further malicious activity. The DPC focussed its enquiry on tools running from 25 May 2018 (when GDPR came into force) to September 2019 (when Facebook made its security amendments).

The core issue that led to the fine was Meta’s failure to meet the obligations around Data Protection by Design and Default (Article 25 of the GDPR) by implementing appropriate technical and organisational measures.

Data Protection by Design and Default

Data Protection by Design and Default is not new. But while in the past it’s been “advisable”, under GDPR it is now a legal requirement, which means that you must, by law, have appropriate technical and organisational measures in place to ensure you comply effectively with data protection principles, and that you protect and safeguard individuals’ rights.

In practice, this means that you must think about data protection and privacy compliance – up-front. And build it into all the data processing you undertake. It has to be embedded throughout your business and all its practices.  And it’s important that it starts at the very beginning of the process, from concept and design stage, and runs right through the lifecycle of any personal data processing you do. 

This is the requirement that the DPC determined that Meta did not meet.

Meta Statement

In response to the DPC actions, Meta says it is “reviewing this decision carefully”, and stated: “We made changes to our systems during the time in question, including removing the ability to scrape our features in this way using phone numbers… Unauthorised data scraping is unacceptable and against our rules and we will continue working with our peers on this industry challenge … Protecting the privacy and security of people’s data is fundamental to how our business works. That’s why we have cooperated fully with the Irish Data Protection Commission on this important issue. “

Total Meta GDPR fines?

This latest fine brings the total amount of fines imposed since Autumn 2021 by the DPC on Meta to €912m. Previous fines include €405m just a couple of months ago (teenagers’ Instagram accounts displayed their phone numbers and email addresses on a “public-by-default” setting); in March 2022, a GDPR fine of €17m was levied; and in September 2021 a €225m fine was issued over “severe” and “serious” infringements by WhatsApp.

Avoid GDPR Fines

Privacy by Design and Default is at the heart of the GDPR. A Data Protection Impact Assessment (DPIA) is just one of the vital tools businesses need to help them meet their compliance and security obligations. It is an essential means of demonstrating that you put compliance and the security of your data subjects at the heart of everything you do.   

Consider the individuals whose data you are processing. What will be the impact on them? Will the processing be fair? Is it even legal? Would they expect you to process it in this way? Have you made them aware? Have you told them their rights? Will their data be safe? Have you done your due diligence on your suppliers? Do you have the right contracts? What are the risks? How can the risks be mitigated? Do you have appropriate organisational processes in place? What technical safeguards do you have or need?

Asking yourselves questions like this will help you be sure you are taking appropriate steps towards meeting your obligations when processing personal data.

If you have questions or concerns about the practicalities around Data Protection by Design and Default, or how best to conduct a DPIA, or if you would like to chat about your own measures in this area, please call 01787 277742 or email dc@datacompliant.co.uk. You can find information about some of our services here.

Victoria Tuffill  29th November 2022

Data Protection and Fingerprints

Under the EU General Data Protection Regulation (GDPR), biometric data is considered special category data, which requires more stringent conditions for processing.  Fingerprints are an example of biometric data, and employers need to consider carefully how and where they use such data.

When processing any personal data, an organisation needs to have legal grounds for doing so.  And, in the case of special category data such as fingerprints, an additional Article 9 Condition must be applied.

A company in the Netherlands, which used fingerprints inappropriately to monitor its employees’ attendance and time registration, was recently fined €750,000.

The company had obtained Consent from its employees, but under the GDPR Consent must be freely given, which means that the individuals must be allowed to refuse to give Consent. Because there is a significant imbalance in power between an employer and an employee, it can be difficult for employers to demonstrate that employees have been given a genuine opportunity to refuse Consent.

In this case, some employees had felt obliged to give Consent, so the Dutch DPA found that the company did not have valid legal grounds to process the data for this purpose. 

Though there may be an appeal, this illustrates the seriousness of processing special category data in a way that is considered unnecessary or disproportionate.

If you have any questions about biometric data or data protection in general, please contact us via email team@datacompliant.co.uk or call 01787 277742.

Victoria Tuffill, 25th May, 2020

 

Cybercriminals are increasingly impersonating WHO and the UN

Research by British security software and hardware company Sophos found that coronavirus email scams tripled in the last week of March, and we can expect the volume to be increasing. Over 3% of global spam is related to coronavirus, with many of these fraudulent emails impersonating the World Health Organisation or even the United Nations.

Chester Wisniewski, Principal Research Scientist at Sophos, said:

“Cybercriminals are wasting no time in shifting their dirty, tried-and-true attack campaigns towards advantageous lures that prey on mounting virus fears. Criminals often dip a toe in the water when there is a new or sensational topic in the news.”

He detailed a case in which his company tracked an email pretending to come from a WHO address, purportedly giving health advice in an attachment. But after inspection, the text matched a previous spam campaign from “a familiar criminal.”

While most of these spam operations are used to get information from people, there are even more aggressive cybercriminals out there.

Threatening extortion campaigns are also being pursued. In these, messages over social media or email threaten to give the victim or the victim’s family coronavirus unless they pay up. With the amount of information online, and the procedures used to construct holistic user profiles based on miscellaneous knowledge, attackers can make it seem like they know everything about a victim just by giving a few details. This makes the attacker seem like they have the capacity to execute their threats, and inevitably, people end up being exploited.

Other more sophisticated scammers use HMRC or departmental logos and graphics to get information from consumers, offering spurious sums of money under the guise of lockdown or furlough relief. In the United States, there has been evidence of insurance scams, such as fake COVID-19 health insurance offered at competitive rates.

Scammers and con-artists are sensitive to the news cycle, trends and the current political or economic climate. They will often seem persuasive because what they claim will seem salient, despite the content having most likely been tweaked from a previous scam based on a different news item or trending phenomenon.

Do not let criminals make you take rash decisions over fear of current market turmoil.

If you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742.

Harry Smithson, 10th April 2020

Government expands Ofcom’s role to combat ‘Online Harms’

Online harms can take a variety of forms, privacy violations being among the most notorious. Regardless of how we categorise negative internet user experiences, we know from a recent Ofcom study that 61% of adults and 79% of 12-15 year olds have reported at least one potentially harmful online experience in the last 12 months.

As part of the government’s response to public consultation on the Online Harms White Paper, the DCMS announced on the 12th February that the UK’s telecoms and broadcasting regulator will also be the new online harms regulator. The Home Office and DCMS have been working together with Barnardo’s charity to provide greater protection for vulnerable internet users, particularly children, building upon growing institutional and regulatory oversight of digital services.

Unlike the General Data Protection Regulation (GDPR), which has a far-reaching purview, the regulation will likely only apply to fewer than 5% of UK businesses, as Ofcom will only be responsible for monitoring organisations that host user-generated content (comments, forums etc.).

But from a data protection perspective, it’s interesting to see how GDPR terminology and values have shaped this initiative – consider, for instance, former secretary of state Nicky Morgan’s statement on the government’s response to the white paper:

“We will give the regulator the powers it needs to lead the fight for an internet that remains vibrant and open but with the protections, accountability and transparency people deserve.”

We can expect to see Ofcom’s new role come into force under Nicky Morgan’s recent successor, Oliver Dowden.

In the meantime, the Information Commissioner Elizabeth Denham, head of the UK’s enforcer for GDPR, has welcomed this expanded Ofcom as “an important step forward in addressing people’s growing mistrust of social media and online services.”

She continues, in an ICO press release on the heels of the DCMS announcement, “the scales are falling from our eyes as we begin to question who has control over what we see and how our personal data is used.”

If you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742.

Harry Smithson, 14th February 2020

What does the law say about protecting your health and other sensitive data?

Health data, identity theft and fraud are among the most significant concerns of data protection, especially where sensitive personal data is concerned. Now the Information Commissioner’s Office has published detailed guidance on how data controllers should protect and handle this ‘Special Category’ data.

Special category data

Known as the most sensitive category of personal data, special category data concerns information on a person’s:

  • health
  • sex life or sexual orientation
  • racial or ethnic origin
  • political opinions
  • religious or philosophical beliefs
  • membership of a trade union
  • genetic data
  • biometric data for uniquely identifying a person such as a fingerprint, or facial recognition

Special care must be taken when processing sensitive data. Because of its sensitive nature, there is a high risk to individuals if such data were to fall into the wrong hands. It is illegal to process any of the above categories of data without satisfying a specific condition under Article 9.

So, data controllers MUST select one of the following legal grounds before processing:

  • explicit consent
  • obligations in employment
  • social security and social protection law
  • to protect vital interests
  • processing by not-for-profit bodies
  • manifestly made public
  • establish, exercise or defend legal claims
  • substantial public interest
  • preventative or occupational medicine
  • public health
  • research purposes.

‘Special Category’ data must also be given extra levels of security to protect it.  For example, limiting the number of individuals who may access such data, minimising the amount of data collected, stronger access controls – these and other such measures help protect the privacy of the individual, and to maintain the integrity and confidentiality of the data.
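To illustrate the kind of access limitation described above, here is a minimal Python sketch that strips special category fields from a record unless the caller’s role is explicitly permitted to see them. The role names and field names are illustrative assumptions, not part of the ICO guidance:

```python
# Fields treated as special category data, and the roles allowed to
# view them - both sets are illustrative assumptions for this sketch.
SPECIAL_CATEGORY_FIELDS = {"health", "ethnicity", "trade_union"}
ALLOWED_ROLES = {"dpo", "hr_manager"}

def redact_for(role, record):
    """Return a copy of the record with special category fields
    removed unless the caller's role is explicitly allowed."""
    if role in ALLOWED_ROLES:
        return dict(record)
    return {k: v for k, v in record.items()
            if k not in SPECIAL_CATEGORY_FIELDS}

record = {"name": "A. Smith", "health": "asthma"}
print(redact_for("marketing", record))  # {'name': 'A. Smith'}
```

A real system would enforce this at the database or API layer rather than in application code, but the principle – deny by default, allow by exception – is the same.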

If you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742

Gareth Evans, 15th November 2019

US Privacy Bill

On October 11, 2019, California Governor Gavin Newsom signed the remaining amendments to the California Consumer Privacy Act (CCPA) into law.  The CCPA provides unprecedented privacy rights to California residents similar to those enjoyed by EU citizens since the implementation of GDPR. Most companies that do business with California will need to comply with the requirements of the new law.  The deadline for compliance with CCPA is 1st January 2020 though some commentators believe that this deadline may be extended.

Other US states are already considering introducing privacy legislation reflecting the measures taken by California. However, events are moving quickly…

On 5th November, two Californian Democrat Congresswomen, Anna G. Eshoo and Zoe Lofgren, introduced an Online Privacy Bill to the US House of Representatives. If successfully enacted, the Bill would create a federal Data Protection Agency (DPA) covering the whole of the US.

Corporate Data Privacy Obligations

The draft legislation imposes a raft of obligations on organisations, including:

  • disclose why they need to collect and process data
  • minimise employee and contractor access to personal data
  • not disclose or sell personal information without explicit consent
  • not use private communications such as email to target ads or for “other invasive purposes”

The legislation is attempting to tackle a range of abuse of privacy data. This is illustrated by the requirement for organisations to “notify the agency (the DPA) and users of breaches and data sharing abuses, e.g., Cambridge Analytica.”

Citizens Data Privacy Rights

The bill would give citizens the right to:

  • access, correct, delete, and transfer data about them;
  • request a human review of impactful automated decisions;
  • opt-in consent for using data for machine learning / A.I. algorithms;
  • be informed if a covered entity has collected their information; and to choose for how long their data can be kept.

Sound familiar?

If you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742. You can find more of our blogs here.

Gareth Evans, 15th November 2019

Politics. Fines. Data Deletion.

GDPR Regulations begin to bite

We are now beginning to see the impact of the GDPR across politics, businesses and public services. With the upcoming UK general election, the ICO is issuing timely reminders. In Europe we are starting to see large fines being levied for GDPR breaches.

ICO Issues Letter to UK Political Parties

In a timely reminder the Information Commissioner has written to 13 political parties in the UK. The letter reminds them of their legal obligations regarding the use of Personal Data in the lead-up to the General Election. The ICO letter highlights the need for parties to:

  • provide individuals with clear and accessible information about how their personal data is being used. This includes:
    • data obtained directly from individuals
    • data obtained from third parties, including data brokers
    • inferred data – i.e. data that is inferred from observed behaviour, such as reading or buying habits, responses to advertising and so on
  • demonstrate compliance with the law. The scope here includes any third-party data processors.  For political parties, this specifically includes data analytics providers and online campaigning platforms
  • have the appropriate records of consent from individuals (where consent is the legal basis for processing) to send political messages through electronic channels (texts, emails)
  • identify lawful bases for processing special category data, such as political opinions and ethnicity.

This places political parties on the same basis as commercial organisations under UK law. 

Record Fine in Austria

The Austrian Data Protection Authority has imposed an €18 million fine on the Austrian Postal Service, Österreichische Post AG (“ÖPAG”). After an investigation, the Austrian DPA established that ÖPAG processed and sold data regarding its customers’ political allegiances, amongst other violations of the GDPR.

The fine is subject to an appeal.

Record Fine in Germany

On November 5, 2019, the Berlin Commissioner for Data Protection and Freedom of Information announced that it had imposed the highest fine issued in Germany since the EU GDPR became applicable. Deutsche Wohnen SE, a real estate company, was fined €14.5 million.

After onsite inspections, the Berlin Commissioner noticed the company was retaining personal data of tenants for an unlimited period. It had not examined whether the retention was legitimate or necessary.

Data should be removed without delay once it is no longer needed for the specific purpose for which it was collected. Deutsche Wohnen SE was using an archiving system that did not enable the removal of such data. Affected data related to financial and personal circumstances, such as bank statements, training contracts, tax, social and health insurance data.

This fine should act as a strong reminder to all companies to review and update their data retention and deletion policies, processes and supporting procedures.
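A retention review of the kind described above can be sketched very simply: record the date each item stopped being needed for its original purpose, and periodically sweep for anything whose retention period has expired. The record types, periods and field names below are illustrative assumptions – actual periods must come from your own retention policy and legal obligations:

```python
from datetime import date, timedelta

# Illustrative retention periods per record type - real periods must
# come from your retention policy and applicable law, not this sketch.
RETENTION = {
    "bank_statement": timedelta(days=365 * 6),
    "training_contract": timedelta(days=365 * 2),
}

def records_due_for_deletion(records, today):
    """Return records whose retention period has expired.

    Each record is a dict with 'type' and 'purpose_ended' (the date
    the data stopped being needed for its original purpose).
    """
    due = []
    for rec in records:
        period = RETENTION.get(rec["type"])
        if period and rec["purpose_ended"] + period <= today:
            due.append(rec)
    return due

records = [
    {"id": 1, "type": "bank_statement", "purpose_ended": date(2017, 1, 1)},
    {"id": 2, "type": "training_contract", "purpose_ended": date(2024, 6, 1)},
]
print([r["id"] for r in records_due_for_deletion(records, date(2025, 1, 1))])  # [1]
```

The Deutsche Wohnen case shows the other half of the problem: the archiving system must actually be capable of deleting the records such a sweep identifies.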

More news later this week. In the meantime, if you have any questions about data protection, please contact us via email team@datacompliant.co.uk or call 01787 277742.  You can find more blogs here.

Gareth Evans, 11th November 2019