
Choosing your DPO: Full-Time? Service? Consultant?

I’ve noticed many SMEs running LinkedIn ads to recruit full-time DPOs. And it’s disappointing to see how often the other options are overlooked.

Which DPO Option and Benefits Suit You Best?

I think it’s important to consider all the available options before making a decision. Yes, you could employ a full-time DPO for your business. Or you could contract with DPO specialists to provide DPO-as-a-service. Or you could appoint an internal ‘non-specialist’ and support them with an external data protection consultancy or consultant.

I’ve listed below some of the benefits of these options, so that the next time you need to find a DPO, you can make an informed decision about what type of DPO you actually need and which solution might fit you best.

1. Cost 

As with any new employee, hiring a highly qualified full-time DPO involves significant costs, especially salary and benefits. Outsourcing the DPO role means you only pay for the DPO services and time you use. You don’t need to consider staff benefits, holiday, sickness or appraisals – or the time and expense involved if you need to let them go. Nor do you need to be concerned about FTE overheads like office space, equipment or other resources.

2. Flexibility / Scalability

Internal DPOs may struggle with budget constraints and limited resources. Outsourcing can provide a more cost-effective solution, with access to the necessary resources as needed. An outsourced DPO service can give you more or less support month by month, depending on your needs: you can ramp it up or down depending on whether you have a large project or simply need ongoing maintenance. And if you need an interim DPO, you can appoint a consultant with no need for a long-term commitment.

3. Expertise

Top quality DPOs (whether FTE or outsourced) are experts in data protection laws and regulations. But perhaps the biggest advantage of the DPO-as-a-Service or consultancy option is that those DPOs will have gained considerable and diverse experience from working in many different industry sectors. They see many and varied solutions to common data protection issues across the numerous clients with whom they work. And vitally, within a consultancy team of DPOs, they will always be learning from each other, considering solutions based on the shared knowledge of the whole team. And that shared knowledge becomes your company’s shared knowledge.

4. Unbiased Approach

An outsourced DPO or consultant has the advantage of being independent and unconflicted. They are able to consider your issues with fresh eyes and no bias. This means that they can conduct unbiased audits and assessments of your data protection practices, then help you implement any remedial actions.

5. Internal Challenges

Full-time DPOs often face challenges such as a lack of support from key stakeholders and limited cooperation within the organisation. Although fully engaged with the client and its goals, outsourced DPOs can navigate these challenges more effectively thanks to their independence.

Conclusion

The traditional route of hiring a full-time employee may be perfect for many companies. But it’s clearly not the only solution. So when you next need to appoint a DPO, you could make clear that applications are welcome not only from full-time candidates, but also from DPO-as-a-Service providers and consultants.

That way you can be sure that you don’t miss out by excluding the right person by default.  And of course, you can review and interview applicants as normal and make your own decision about which individual or option fits your needs best. 

Data Compliant International

If you are looking for a DPO or supportive consultant, Data Compliant International provides DPO-as-a-Service and data protection / privacy consultants to a wide range of business sectors. If you’d like to know more about how we help our clients, please take a look here. If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742.

Lessons from Darts: Team Dynamics in Data Protection

Teams are an essential part of life – from school to adulthood, from sports to business.  A well-functioning team leads to extraordinary achievements, whether in a local darts league or a data governance team.

The Darts Team Triumph

Consider my local darts team, which recently won the team title, along with individual singles titles. This victory wasn’t just about individual knowledge and talent; it was the result of shared goals, a strategy to achieve them, collaboration, strong mentoring, and mutual support. Our players’ unique skills, camaraderie and collective effort all contributed to the team’s overall success.

Transferring Team Dynamics to Data Governance

The same principles apply to data protection governance teams. Every member of the team must understand its overall objectives, and be responsible and accountable for data management and governance. The team needs a framework for success, including communication and collaboration, and the creation and maintenance of policies and procedures around data collection, privacy, compliance, integrity and security. And it must provide regular reports to the senior management who are ultimately accountable.

Roles, Goals and Data Stewardship

Individuals within the team will take on data stewardship roles. In essence, they will oversee the entire lifecycle of personal data from collection to deletion, and be accountable for compliance and security at all stages. All team members will support each other, sharing knowledge and expertise to help manage challenges and foster a culture of continuous improvement. And each will have their own individual areas of responsibility, including embedding data protection throughout their own area of the business.

Education and Continuous Improvement

Like in darts, governance team members learn from each other’s techniques, and share knowledge, best practices and insights. This knowledge is then used to help build awareness throughout the organisation about data protection and data security, and to educate employees about crucial data protection principles.

Risk Management

Sports and business both carry risks, and the team must take responsibility for identifying, assessing and mitigating them – in data governance, for example through Data Protection Impact Assessments (DPIAs).  The team must also develop and execute its response plans so that it knows how to respond if there is a data breach or security incident.

Enabling Team Leaders

Team Leaders are crucial. They are pivotal in cascading information down to their specific areas of the business – in data governance, for example, it’s helpful to have leaders from IT, HR, Marketing, Operations, Payroll and so on. It’s those Team Leaders who will then ensure that everyone in their team understands their roles and responsibilities, and who provide the resources and training so that every individual in an organisation can thrive and contribute effectively.

Conclusion

Effective teams enable the individuals in your organisation to achieve more together than they ever could alone. With a data governance team that fosters collaboration, shared problem-solving and continuous education, your organisation will benefit from strong and highly successful outcomes.

Data Compliant International

If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742.  And for more information about how to meet your Accountability and Governance obligations, please see here. 

AI: Balancing Innovation, Ethics, Privacy & Governance  

After last week’s AI Action Summit in Paris, AI ethics and safety legislation has become a hot topic globally, with various regions taking different approaches. U.S. Vice President J.D. Vance made it very clear that the Trump administration is firmly opposed to “excessive regulation” of AI, arguing that it would stifle innovation and hinder the growth of the AI industry.

Global Divide in AI Regulation

With different regions of the world taking different approaches, the landscape is complex. Even within the US, approaches are divided. In the absence of federal guidance, some states are actively implementing their own AI governance laws to address ethical and safety concerns. These, of course, will now conflict with the current federal stance, which leans towards minimal regulation in favour of rapid AI development.

Global AI race risks safety, privacy and ethics

Globally, it’s a race, with China and the US at the forefront of AI development. China’s AI strategy focuses on becoming the world leader by 2030, with significant investments in research and development. The US has a similar goal and is doubling its AI research investment. Britain’s Starmer also has ambitions for rapid development. But the global competitive race is clearly in danger of compromising ethical, safety and sustainability considerations in favour of innovation and rapid development.

Trustworthy AI governance

So it is somewhat reassuring that the data protection authorities of the UK, South Korea, France, Ireland and Australia have issued a joint statement on “building trustworthy data governance frameworks to encourage development of innovative and privacy-protective AI”. It does at least show that these countries are making a concerted effort to balance innovation with ethical, privacy and safety considerations.

In summary, the joint statement:

  • states the need for AI to be developed and deployed in accordance with data protection and privacy rules, including robust data governance frameworks and privacy-by-design embedded into AI systems from the start of the planning process
  • aims to provide legal certainty and safeguards, including transparency and fundamental rights
  • commits to clarifying the legal bases for processing personal data in the context of AI
  • commits the signatory countries to exchanging and establishing a shared understanding of proportionate security measures, to be updated as AI data processing activities evolve
  • commits them to monitoring the technical and societal impacts of AI, leveraging the expertise and experience of Data Protection Authorities and other relevant entities
  • aims to reduce legal uncertainty while creating opportunities for innovation in a compliant environment
  • commits to strengthening interaction with other authorities to improve consistency between the various regulatory frameworks for AI systems, tools and applications

It does not, however, address other concerning issues such as:

  • Bias and fairness (for example in areas such as hiring, lending and law enforcement) – though the EU’s AI Act works towards mitigating such biases
  • Environmental impact – including significant electricity demand, heavy water consumption, the extraction of raw materials, and the electronic waste generated in producing and transporting high-performance computing hardware. The US Artificial Intelligence Environmental Impacts Act of 2024 (if Trump doesn’t repeal it) and UNEP’s guidelines are steps towards addressing these concerns.

Data Protection Legislation Applies

In essence, regardless of specific AI legislation and guidelines, the fundamentals of data protection legislation do not change just because the processing involves AI. All AI processing of personal data must abide by the prevailing data protection legislation – wherever in the world you are.

Data Compliant

If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742. And for more information about how to meet your AI obligations, please see here.

Victoria Tuffill

17th February 2025

Data Protection and Privacy Impacts of the New UK Data (Use and Access) Bill

Background

On Wednesday 23 October 2024, the UK Government published its Data (Use and Access) Bill (“DUA“). It promised to “harness the enormous power of data to boost the UK economy by £10 billion” and “unlock the secure and effective use of data for the public interest“. 

The DUA mirrors many of the concepts and provisions from the previous Government’s abandoned Data Protection and Digital Information Bill (“DPDI“), though there are subtle changes. The DUA appears to place greater focus on data sharing and digital services.

It is worth noting that the EU is set to review the UK’s data transfer adequacy status in mid-2025. Maintaining adequacy status is vital to the UK. (Possibly) as a result, some of the more contentious issues included in the discarded DPDI have been removed from the DUA. 

With the mid-2025 adequacy review date in mind, the government will undoubtedly try to get the Bill through as quickly as possible. After two readings in the House of Lords, it is now at Committee Stage.

DUA – Key Points for organisations

The key points of the DUA are:

  • UK Adequacy Status: As stated above, the EU is reviewing the UK’s adequacy status in mid-June 2025.
  • Accountability requirements: in the DPDI, there were plans to amend and simplify the accountability obligations required under GDPR. These have NOT been carried over into the DUA. Specifically, there are to be no changes to:
    • the requirements for a DPO
    • requirements for Records of Processing Activities
    • requirements for Data Protection Impact Assessments.
  • ICO Reform: The Information Commissioner’s Office will be replaced by a new corporate body called the Information Commission. Executive members will be appointed and scrutinised by the Chair and non-executive members. The Commission will be required to have regard to public interest factors around data protection – for example, the desirability of promoting innovation and competition. There is also emphasis on protecting children in relation to data processing.
  • Special Category Data: the Secretary of State has the power to add new special categories of data and to remove them; those that already exist in Article 9 may not be removed.
  • Data Subject Access Requests (DSARs): The discarded DPDI included the concept of an exception for “vexatious” requests. This has NOT been carried over into the DUA. However, proportionality is a key consideration in the DUA, which makes responding to DSARs more straightforward, including by confirming that a DSAR search for personal data need only be “reasonable and proportionate”.
    • The 30-day time period to complete a DSAR begins only after the organisation has confirmed the individual’s identity.
    • The DUA also helps businesses by turning common DSAR practices, based on ICO guidance, into law, which offers certainty for organisations. For example:
      • If an organisation holds large amounts of information about the data subject, it may ask the subject to narrow down the information requested.
      • While it awaits that clarification, it may briefly pause the time frame (see the sketch after this list).
  • Legitimate Interests: there is a new concept of recognised legitimate interests, where certain data processing activities will not require a full Legitimate Interest Assessment (LIA) – for example:
    • safeguarding national security or public safety
    • responding to an emergency
    • crime prevention / investigation
    • public health
    • exercising data subject rights, regulatory functions or civil law claims. 
  • This list can be updated over time, subject to parliamentary approval.
  • It is worth noting that the European Court of Justice has consistently ruled that any lawful interest may be a legitimate interest – i.e. a purely commercial interest can qualify.
  • In addition, when conducting an LIA, it is acceptable to take into account not only the benefits to individuals, but also benefits to the environment (e.g. paper reduction) and the economy (e.g. generating growth and targeted spending).
  • Privacy and Electronic Communications Regulations: PECR is included in the DUA, and is therefore aligned with the levels of fine available for GDPR breaches – a massive increase from the current £500,000 maximum. In addition, the DPDI’s email soft opt-in for non-commercial organisations (such as charities) is NOT currently included (though lobbying is ongoing).
  • Cookie Consent Exemptions: The aim is to reduce the number of cookie consent banners. The DUA allows the use of cookies without consent in specific circumstances, such as ensuring security or preventing fraud, collecting statistical information for the organisation’s own use, improving website functionality and appearance for the user, and providing emergency assistance. This is particularly beneficial to parties who do not use advertising cookies – for example B2B websites.
  • Digital Verification Services: DUA aims to create a framework for trusted online identity verification services, moving away from paper-based and in-person tasks (e.g. registering births and deaths online). Companies providing digital verification tools must be certified against government standards and will receive a ‘trust mark’.
  • Smart Data Schemes: The introduction of smart data schemes will require businesses in sectors like financial services and public utilities to enable data interoperability and secure data sharing. This aims to enhance consumer confidence and drive innovation.
  • Data Access Provisions: The DUA introduces data access standards similar to the EU’s Data Governance Act, enabling controlled data sharing between businesses and public authorities. 
  • Automated Decision Making: The DUA will make it easier for organisations to adopt a broader use of automated decision-making for low-risk, beneficial processing – for example when using artificial intelligence (AI) systems. It limits the scope of the UK’s GDPR Article 22 to cover only “significant” decisions, and those based either in part or entirely on special category data. 
  • Data Transfers: the DUA replaces Chapter 5 of the UK GDPR with a new “data protection test” for the Secretary of State to consider international data transfers, in which the objective is to ensure standards are not materially lower than in the UK.  This differs from the EU approach which looks for equivalence.
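
To make the DSAR timing rules above concrete, here is a minimal TypeScript sketch of the deadline arithmetic as summarised in this post: the 30-day clock starts when identity is confirmed and pauses while the organisation awaits clarification. The type and function names are our own illustration (nothing here is prescribed by the Bill), and this is not legal advice.

```typescript
// Illustrative sketch of the DUA DSAR clock described above – not legal advice.
const DSAR_PERIOD_DAYS = 30;

interface ClockPause {
  start: Date; // when the organisation asked the requester to narrow the request
  end: Date;   // when the requester responded
}

function dsarDeadline(identityConfirmed: Date, pauses: ClockPause[] = []): Date {
  const MS_PER_DAY = 24 * 60 * 60 * 1000;
  // Days the clock was halted while awaiting clarification
  const pausedDays = pauses.reduce(
    (sum, p) => sum + (p.end.getTime() - p.start.getTime()) / MS_PER_DAY,
    0,
  );
  return new Date(
    identityConfirmed.getTime() + (DSAR_PERIOD_DAYS + pausedDays) * MS_PER_DAY,
  );
}

// Example: identity confirmed 1 May; clock paused 5–10 May while the requester
// narrows the scope, so the deadline moves from 31 May to 5 June.
const deadline = dsarDeadline(new Date("2025-05-01"), [
  { start: new Date("2025-05-05"), end: new Date("2025-05-10") },
]);
console.log(deadline.toISOString());
```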

Timetable

With the upcoming adequacy review in mind, it seems likely that the government is trying to get the Bill through as quickly as possible – it has already had two readings in the House of Lords and is currently at Committee Stage.

Data Compliant

If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742. And for more information about how to meet your accountability obligations – both before and after the DUA comes into force – please see here.

Victoria Tuffill

18th December 2024

EU Standard Contractual Clauses – Public Consultation

This month (September 2024), the European Commission announced that it plans to ask for public feedback on the EU Standard Contractual Clauses (SCCs) under the General Data Protection Regulation. The public consultation will take place in the fourth quarter of 2024, giving you an opportunity to have your views and opinions heard.

This is not unexpected – Article 97 of the GDPR requires the Commission to review the GDPR’s implementation every four years (see the 2020 Evaluation Report here). The upcoming 2024 review was expected to include an evaluation of the practical application of the SCCs.

New SCCs in 2025

According to the timeline, the public consultation is imminent and due to take place in the fourth quarter of 2024. This would be followed by a draft act, planned for Commission adoption in the second quarter of 2025. You can find more information and a timeline here.

What are SCCs?

Standard contractual clauses are standardised, pre-approved model data protection clauses, which allow controllers and processors to meet their obligations under EU and / or UK data protection law. 

They are widely used as a tool for data transfers to third countries (that is, countries outside the EEA or the UK that do not have adequacy status). It is quite a simple matter for controllers and processors to incorporate them into their contractual arrangements.

The clauses contain data protection safeguards to make sure that personal data benefits from a high level of protection even when sent to a third country.  By adhering to the SCCs, data importers are contractually committed to abide by a set of data protection safeguards.

Can I change the text?

The core text cannot be changed. If parties do change the text themselves, they will no longer have the legal certainty offered by the EU act. If you amend the clauses, they can no longer be used as a basis for data transfers to third countries unless they are approved by a national data protection authority as “ad hoc clauses”.

Even so, there are areas where the parties can make choices:

  • To select modules and / or specific options offered within the text
  • To complete the text where necessary (e.g. to specify time periods, the supervisory authority and competent courts)
  • To complete the Annexes
  • To include additional safeguards that increase the level of protection for the data. 

Impact on UK use of SCCs

There is not yet any indication of the potential impact on the UK’s International Data Transfer Agreement (IDTA) or the Addendum to the EU’s SCCs; we would expect to hear more after the EU’s public consultation.

Victoria Tuffill – 13th September 2024

If you have any questions or concerns about how and when to use SCCs, please call 01787 277742 or email dc@datacompliant.co.uk

And please take a look at our services.

Massive €290 Million Uber fine for EU-US data transfers

Last week the Dutch Data Protection Authority fined Uber €290 million for transferring personal data from the EU to US servers without adequate protections – one of the largest fines seen to date under GDPR.

According to the Dutch DPA, Uber collected sensitive information (e.g. account details, taxi licences, location data, photos, identity documents and some criminal and medical data) from its EU drivers and stored it on servers in the US, without protective transfer tools, for two years. The case arose from 170 complaints by French drivers; the Dutch DPA issued the fine because Uber’s European office is in the Netherlands.

When did the breach happen?

The two-year period spanned the time between the invalidation of Privacy Shield and the entry into force of the Data Privacy Framework. According to the DPA, Uber stopped using SCCs in August 2021, so it found that the data of EU drivers had not been adequately protected. Of course, now that the Data Privacy Framework is in force, there is no ongoing breach.

How could it have been avoided?

Uber could have used Standard Contractual Clauses to transfer its personal data to the US.

What does this mean for others?

This gives an indication of how significant data transfer mechanisms and risk assessments are. Uber intends to appeal the fine, and the outcome of the appeal will be of interest to many businesses. The 2020 EU Court ruling that invalidated Privacy Shield left a great deal of legal uncertainty over how to continue the data flows that were already in place, and there was very limited help or guidance after the invalidation. It took until 2023 – three years – for the Data Privacy Framework to be established. Many companies will have been slow to build Standard Contractual Clauses into their contracts – or failed to do so – and will be concerned about the nature of this retroactive fine.

Victoria Tuffill – 2nd September 2024

If you have any questions or concerns about data protection, please call 01787 277742 or email dc@datacompliant.co.uk

And please take a look at our services.

Phishing / Vishing – if you’re tricked … report it fast

In today’s digital age, cybersecurity is critical to every organisation’s operations. One of the most common causes of data breaches is when attackers trick you into giving them personal information they should not have.

Vishing and Phishing tricks

For example, in a vishing attack, somebody calls you pretending to be from a company you work with and asks for what seems like perfectly reasonable information. You provide it.

Or you click on a link in a phishing email because it looks as though it has come from someone you know.  It takes you to a web page asking for information. You provide it. 

In both cases, you have provided the attacker with information that should never have been given to them.

What to do next?

These social engineering attempts are becoming increasingly sophisticated, making them hard to detect.  And it’s too easy to fall into their trap.

Remember, if you DO click on a malicious link, or enter your password details on a site that looks suspicious … don’t just pretend it didn’t happen. It is crucial that the error is immediately reported to your line manager or IT department.

Why should I own up to causing a data breach?

Because if you do so immediately, the breach can be prevented.  Here’s an example from one of our clients. 

One of its employees received an email from a contact he had previously communicated with via a cloud-sharing platform he would normally access, and at first glance it all looked fine – until he clicked on a malicious link, entered his credentials and authenticated using Multi-Factor Authentication (MFA). He realised his mistake and reported it to IT immediately.


How did reporting it help?

Thanks to the swift reporting and decisive actions taken by the IT team, any potential security breach was thwarted before it could escalate into a more serious incident.  There was no breach of confidentiality, inappropriate access, or unavailability of the company’s personal data.

Organisations can prevent potential security breaches and protect their valuable assets by acting swiftly and decisively.  Fast action can make all the difference between a minor incident and a significant data breach.

What if I don’t own up?

Well, let’s imagine it. Those login attempts would, without doubt, have succeeded. All the personal data to which that employee had access would be in the hands of the attacker. The company would be facing a severe data breach. They’d be using time and resources to investigate. They’d probably be calling in expert help to mitigate the damage. The cost, in both time and money, would be substantial, as they considered:

  • How much damage have I done to my customers?
  • How can I contain the damage?
  • Will I be sued?
  • How many customers will I lose?
  • Who can I use – internally and/or externally –  to help me work through this?
  • How much will I have to spend on expert advice and support through the process?
  • How will I pay for it?
  • Is the breach severe enough to warrant reporting to the supervisory authority – if so, how can I do so within 72 hours?
  • How can I save us from the corresponding reputational damage?
  • How much will I be fined, and what can I do to try to reduce the amount?

But because this issue was reported without delay … none of the above was necessary.

Employers must encourage employees to report errors

Employers play a crucial role in fostering a culture of reporting errors. It is important to create an environment where employees feel safe to admit their mistakes and are encouraged to report any suspicious activity.

So, as an employer, it is your job to make sure your staff know that they must report any errors of this kind.  Instead of making them scared to report their mistake, offer them help – for example, by providing additional or repeat training to help them recognise such tricks in future. 

Train your workers to stay vigilant, stay informed, and promptly report any suspicious activity to your IT team.  This will also have the advantage of fostering a culture of vigilance, awareness and honesty.  It helps everyone in the organisation understand the importance of information security and how to safeguard personal data.

If you’re in any doubt, re-read the story above. And stop to think about the potential damage that would have ensued if the individual concerned had not reported his error. The attackers would – without doubt – have logged in successfully, and would then have achieved their aims – to the detriment of your company and your customers.

Victoria Tuffill

20th August 2024

If you have any questions or concerns with data security and increasing staff awareness, please call 01787 277742 or email dc@datacompliant.co.uk

And please take a look at our services.

EU AI Act 2024: Key Obligations for Providers and Deployers

Summary

The EU AI Act, which came into force on 1st August 2024, introduces a risk-based legal framework for AI systems, categorising them into different risk levels with corresponding levels of obligation. Some uses of AI are banned – for example, social scoring by governments or manipulating people subliminally.

Similar to the GDPR, the Act also emphasises transparency and accountability, and highlights the importance of data governance, data quality, and data protection.

This document summarises the key obligations of those subject to the EU AI Act who provide or deploy AI. AI systems and models already on the market may be exempt or have longer compliance deadlines.

Territorial Scope of the AI Act

Territorially, the AI Act applies to:

  • deployers that have their place of establishment, or are located, within the EU
  • providers – whether or not they are established or located within the EU – that place AI systems on the market or put them into use in the EU
  • those who import or distribute AI systems in the EU
  • product manufacturers who place or put into service AI systems in the EU under their own name or trademark
  • providers and deployers of AI systems whose output is used in the EU, regardless of where they are established or located
  • persons located in the EU who are affected by the use of an AI system.

Risk-Based Classification

The AI Act introduces a risk-based legal framework based on the specific AI system’s use and the associated risks. It establishes four risk-based types of AI systems:

  • Unacceptable risk – prohibited AI systems
  • High-risk AI systems
  • Limited risk AI systems
  • Minimal risk AI systems
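
Purely as an illustration of this tiered structure, the TypeScript sketch below models the four risk levels and the broad consequence the Act attaches to each. The tier names mirror the list above; the one-line descriptions are a simplified gloss, not legal advice.

```typescript
// Illustrative model of the AI Act's four risk tiers – a simplified gloss only.
type RiskTier = "unacceptable" | "high" | "limited" | "minimal";

const consequence: Record<RiskTier, string> = {
  unacceptable: "prohibited – may not be placed on the EU market",
  high: "permitted, subject to the provider and deployer obligations below",
  limited: "permitted, subject to transparency obligations",
  minimal: "permitted – no specific obligations under the Act",
};

function describe(tier: RiskTier): string {
  return `${tier}-risk AI system: ${consequence[tier]}`;
}

console.log(describe("high"));
// => "high-risk AI system: permitted, subject to the provider and deployer obligations below"
```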

The AI Act’s obligations mainly fall on providers and deployers of AI systems. A summary of the risk ratings, and of the obligations on providers and deployers of high-risk, limited/minimal-risk and general-purpose AI systems, is shown below:

HIGH RISK: For example: critical infrastructure, hiring processes, employee management, educational and vocational training opportunities and outcomes, credit scoring, emotion recognition, prioritisation of emergency first response services, risk assessment and pricing of life and health insurance, some areas of law enforcement, migration, asylum/border management, administration of justice and democratic processes.

Providers must:

  • register the system in an EU database
  • establish, document and maintain an appropriate AI risk management system
  • implement effective data governance over training, testing and validation data – including bias mitigation
  • draft and maintain AI system technical documentation
  • ensure record keeping, logging and tracing obligations are met
  • put in place robust cyber security controls
  • ensure the system is fair and transparent
  • enable effective human oversight of the system
  • in some instances, complete a Conformity Assessment before releasing the system
  • affix the CE marking to the AI system and include the provider’s contact information on the AI system, packaging or accompanying documentation

Deployers must:

  • ensure a competent person with appropriate training, authority and support provides oversight of the system
  • ensure that data input is relevant and sufficiently representative to meet the purpose
  • inform impacted individuals where a system will make or assist in making decisions about them
  • where AI will impact workers, inform them and their representatives that they are subject to a high-risk AI system
  • conduct a rights impact assessment – e.g. where credit scoring or pricing of life or health insurance is involved
  • where AI results in a legal (or similar) effect, provide a clear, comprehensible explanation of the role of the AI system in the decision-making process, along with the main elements of the decision
  • develop an ethical AI Code of Conduct

LIMITED OR MINIMAL RISK: for example, AI systems with specific transparency risks or an ability to mislead; AI systems intended to interact directly with individuals; AI systems generating synthetic image, audio, video or text content; emotion recognition systems; biometric categorisation systems; deep fake systems; and systems that generate or manipulate public-interest texts made available to the public.

Providers and Deployers must:

  • Be transparent – users must be aware that they are interacting with a machine
  • Ensure that any AI-generated content is detectable as artificially generated and / or manipulated
  • Comply with any labelling obligations
  • Develop an ethical AI Code of Conduct (recommended)

GENERAL PURPOSE AI SYSTEMS, including AI systems that can perform general functions such as image/speech recognition, audio/visual generation, pattern detection, question answering, translation, chatbots, spam filters, and so on.

Providers must:

  • draft and maintain AI system technical documentation
  • provide information to those who intend to integrate the AI model into their own AI systems
  • implement a policy to comply with EU law on copyright and related rights
  • document and make public a meaningful summary of the content used for training the general-purpose AI model

Fines

Fines for non-compliance range from €7.5 million to a maximum of €35 million (or 1% to 7% of global annual turnover), depending on the severity of the infringement.

Penalty / fine and the corresponding compliance issue:

  • €35 million or 7% of total worldwide annual turnover for the preceding year, whichever is higher – non-compliance with the rules on prohibited AI systems
  • €15 million or 3% of total worldwide annual turnover for the preceding year, whichever is higher – non-compliance with most other obligations under the AI Act, including non-compliance by providers of general-purpose AI models
  • €7.5 million or 1% of total worldwide annual turnover for the preceding year, whichever is higher – supply of incorrect, incomplete or misleading information
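
Because each ceiling is the higher of a fixed amount and a turnover percentage, the applicable maximum depends on company size. The short TypeScript sketch below works through that arithmetic using the figures from the table above; the tier names and function are illustrative only.

```typescript
// Illustrative only: the maximum AI Act fine per tier is the HIGHER of a fixed
// amount and a percentage of total worldwide annual turnover (preceding year).
interface FineTier {
  fixedCapEur: number; // fixed ceiling in euros
  turnoverPct: number; // e.g. 0.07 for 7%
}

const TIERS: Record<string, FineTier> = {
  prohibitedSystems: { fixedCapEur: 35_000_000, turnoverPct: 0.07 },
  mostObligations: { fixedCapEur: 15_000_000, turnoverPct: 0.03 },
  misleadingInfo: { fixedCapEur: 7_500_000, turnoverPct: 0.01 },
};

function maxFine(tier: FineTier, annualTurnoverEur: number): number {
  return Math.max(tier.fixedCapEur, tier.turnoverPct * annualTurnoverEur);
}

// A company with €2bn turnover breaching the prohibited-systems rules:
// 7% of €2bn = €140m, which exceeds the €35m fixed ceiling.
console.log(maxFine(TIERS.prohibitedSystems, 2_000_000_000)); // 140000000
```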

If you have any questions or concerns with how you can use AI in compliance with GDPR or other data protection legislation, please call +44 1787 277742 or email dc@datacompliant.co.uk

And please take a look at our services.

Victoria Tuffill – 12th August 2024


Time to put an end to half-baked cookies

Set Compliant Cookies – or Face the Penalty

Cookies are clearly the flavour of the month for UK and European data protection enforcers. Supervisory authorities are now turning their attention to non-compliant websites.

UK cookie compliance

In November 2023, the ICO wrote to 53 of the UK’s top 100 websites, instructing them to change their cookie practices or suffer the consequences. Not surprisingly, 38 companies had already complied by the end of January. Others are in the middle of putting things right, and some are working on alternative models (more on that from the ICO next month).

The ICO is now widening its cookie investigations and warning companies to make their cookies compliant, and it is investing time, money and resources to do so. For example, it is developing an AI solution to help find websites with non-compliant cookie banners. The ICO intends to work through websites that target UK users, checking cookie usage and rooting out non-compliance.

EU cookie compliance

In early January, Spain issued new cookie guidance. The Netherlands has also just issued new guidance, and announced that 2024 will see it investigating cookie and cookie-banner use and misuse.

Cookie Compliance – Key Tips

So, for those who have adopted questionable cookie practices … it’s probably time to put things right before you find yourselves on the receiving end of enforcement penalties. To help you, here are some key tips for your cookie banners, with a short code sketch at the end:

1. Personal data

PECR (UK) or the e-Privacy Directive (Europe) demands consent before setting cookies. But where cookies process personal data, consent MUST be to GDPR standards – so:

    • Do not use pre-ticked boxes
    • Explain the purpose of each cookie so that website visitors can make an informed yes/no decision
    • Ideally, put a link to your cookie policy on the banner. Your cookie policy should list the cookies you are setting and explain what they do, how you will use them, how long they will remain on the user’s device, and how users can manage them

2. Pre-set cookies

You may set “strictly necessary” cookies (for example, for website functionality, security, managing shopping baskets, or other requested online services) without consent.

3. Analytics, tracking, and advertising cookies

Wait to set cookies until AFTER the user accepts them.

4. Balance

Make it as easy for users to reject cookies as it is to accept them. If it’s one click to accept, it must be one click to reject.

    • Their choices must be clear and obvious, so words like “accept”, “reject” and “select” are helpful.
    • If you’re not pre-setting cookies before users accept them, technically you don’t need a reject or refusal button. But if you choose not to use one, you must:
      • make it clear that you have not set any non-essential cookies
      • make it obvious how users can move past the banner – for example, enable them to close it with a single click, or set it to drop if the user takes no action within a specified (short) time.

5. Cookie walls

If you’re using a cookie wall, the wall MUST drop whether the user accepts or rejects cookies. You may not make access to your site dependent on a user accepting cookies.

6. Consent withdrawal

Make sure there is always a link to the cookie banner so visitors can withdraw their consent at any time.
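
To illustrate tips 2 to 4 in practice, here is a minimal TypeScript sketch of a consent-gated cookie banner. The element ids, cookie name and analytics script path are hypothetical placeholders to adapt to your own consent-management setup; the point is simply that nothing non-essential is set until the user clicks accept, and that rejecting takes exactly one click.

```typescript
// Minimal consent-gated banner sketch. Ids, cookie name and script path are
// placeholders – adapt to your own setup.
type ConsentChoice = "accepted" | "rejected";

function recordChoice(choice: ConsentChoice): void {
  // Storing the consent decision itself is "strictly necessary" (tip 2),
  // so this cookie may be set without prior consent.
  document.cookie =
    `cookie_consent=${choice}; max-age=${60 * 60 * 24 * 180}; path=/; SameSite=Lax`;
  document.getElementById("cookie-banner")?.remove();
  if (choice === "accepted") loadAnalytics();
}

function loadAnalytics(): void {
  // Non-essential (analytics/advertising) scripts are injected only AFTER an
  // explicit accept (tip 3) – never pre-set.
  const s = document.createElement("script");
  s.src = "/js/analytics.js"; // placeholder path
  document.head.appendChild(s);
}

// One click to accept, one click to reject (tip 4).
document.getElementById("accept-btn")?.addEventListener("click", () => recordChoice("accepted"));
document.getElementById("reject-btn")?.addEventListener("click", () => recordChoice("rejected"));
```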

If you have any questions or concerns with how you can set compliant cookies, please call 01787 277742 or email dc@datacompliant.co.uk

And please take a look at our services.

Victoria Tuffill – 7th February 2024

EU Commission Adopts EU-US Data Privacy Framework

Well it’s taken a few years of hard negotiation, but at last there’s good news (at least for now …) for EU-US data transfers. The long-awaited EU-US Data Privacy Framework (DPF) has now been approved by the EU. Adequacy status has been granted and entered into force immediately (11th July 2023). This means that the EU considers that the DPF provides protection essentially as robust as that provided within the EU.

How does the EU-US DPF work?

The DPF is designed to provide a safe, reliable, efficient and cost-effective way for businesses to transfer personal data between the EEA and the United States. The mechanism is similar to the previous Privacy Shield. And with immediate effect, EU organisations may send personal data to all self-certified US companies that adhere to the DPF privacy obligations, without the need for additional transfer safeguards.

  • For US companies, the DPF certification process is not onerous – it is very similar to that adopted for the previous Privacy Shield – and participation will arguably be the simplest, most cost-effective and reliable EU-US personal data transfer tool available
  • The DPF will reduce the need for transfer impact assessments, 2021 SCCs and supplementary measures when sharing data with US businesses certified under and compliant with the new DPF.

Is DPF different from Privacy Shield?

There are some improvements over Privacy Shield. Most notably, the US has said it will limit US intelligence services’ access to EU data to what is “necessary and proportionate to protect national security”. EU citizens also have improved redress mechanisms if their personal data is handled in breach of the Framework’s privacy obligations. And there is a new Data Protection Review Court to help address such issues.

Many US companies are already self-certified under the previous Privacy Shield Framework.  They will all be able to access a simplified process for self-certifying their commitment to the new DPF. In practice, this means that they will simply need to sign up to the new Data Privacy Framework principles.

What does this mean for the UK?

This provides both hope and expectation that the much-anticipated UK-US Data Bridge (the UK extension to the Data Privacy Framework) could be in place shortly.   The Department of Commerce states that from 17 July 2023, DPF-certified US organisations can also self-certify for the UK Extension. US organisations signing up to the UK Extension must also self-certify for the EU-US DPF.

However, the UK Extension may not be relied upon for UK personal data transfers until the UK adequacy regulations are in force. There is no clear date for this, but it is clear that finalising the Data Bridge is a key deliverable for UK-US data flows in 2023.

How long will the DPF last?

Impossible to say. Like all adequacy decisions, it is subject to ongoing scrutiny. The real issue, however, is that we should expect significant legal challenges to the EU-US DPF. Though the EU Justice Commissioner has stated that the Commission is “very confident to try to, not only implement such an agreement, but also to defend in all procedures it will have to face”, it is worth noting that legal cases against both original frameworks – Safe Harbor and its replacement Privacy Shield – resulted in both ultimately being ruled invalid by the EU Court of Justice. However, it is expected that the US efforts to address concerns surrounding data subjects’ redress may help the DPF withstand legal challenges.

 

Victoria Tuffill

13th July 2023

If you have any questions or concerns about your data protection measures, or data transfers to third countries, please don’t hesitate to take a look here or call 01787 277742 or email dc@datacompliant.co.uk