Category Archives: Data Compliance

Choosing your DPO: Full-Time? Service? Consult?

I’ve noticed many SMEs running LinkedIn ads to recruit full-time DPOs. And it’s disappointing to see how often the other options are overlooked.

Which DPO Option and Benefits Suit You Best?

I think it’s important to consider all the available options before making a decision.  Yes, you could employ a Full-Time DPO for your business.  Or you could contract with DPO specialists to provide DPO as a service.  Or appoint an internal ‘non-specialist’ and support them with an external data protection consultancy or consultant.

I’ve listed below some of the benefits from these options, so that the next time you need to find a DPO, you have information to help you make an informed decision about what type of DPO you actually need, and what solution might fit you best.

1. Cost 

As with any new employee, hiring a highly qualified full-time DPO involves significant costs, especially in salary and benefits. Outsourcing the DPO role means you only pay for the DPO services and time you use. You don’t need to consider staff benefits, holiday, sickness or appraisals, nor the time and cost involved if you need to let them go. Nor do you need to be concerned about FTE overheads like office space, equipment or other resources.

2. Flexibility / Scalability

Internal DPOs may struggle with budget constraints and limited resources. Outsourcing can provide a more cost-effective solution, with access to the necessary resources as needed. An outsourced DPO service can give you more or less support month by month, depending on your needs: you choose how much time you need, and you can ramp it up for a large project or scale it back for ongoing maintenance. And if you need an interim DPO, you can appoint a consultant with no long-term commitment.

3. Expertise

Top-quality DPOs (whether FTE or outsourced) are experts in data protection law and regulation. But perhaps the biggest advantage of the DPO-as-a-Service or consultancy option is that those DPOs will have gained considerable and diverse experience from working in many different industry sectors.  They see many and varied solutions to common data protection issues across the numerous clients with whom they work.  And vitally, within a consultancy team of DPOs, they are always learning from each other, considering solutions based on the shared knowledge of the whole team.  That shared knowledge becomes your company’s shared knowledge.

4. Unbiased Approach

An outsourced DPO or consultant has the advantage of being independent and unconflicted. They can consider your issues with fresh eyes, conduct impartial audits and assessments of your data protection practices, and then help you implement any remedial actions.

5. Internal Challenges

Full-time DPOs often face challenges such as a lack of support from key stakeholders and limited cooperation within the organisation. Although fully engaged with the client and its goals, outsourced DPOs can navigate these challenges more effectively because of their independence.

Conclusion

The traditional route of hiring a full-time employee may be perfect for many companies. But it’s clearly not the only solution.  So when you next need to appoint a DPO, you could state that full-time employees, DPO-as-a-Service providers and consultants are all welcome to apply.

That way you can be sure that you don’t miss out by excluding the right person by default.  And of course, you can review and interview applicants as normal and make your own decision about which individual or option fits your needs best. 

Data Compliant International

If you are looking for a DPO or supportive consultant, Data Compliant International provides DPO-as-a-Service, and data protection / privacy  consultants to a wide range of business sectors.  If you’d like to know more about how we help our clients, please take a look here.  If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742.  

Lessons from Darts: Team Dynamics in Data Protection

Teams are an essential part of life – from school to adulthood, from sports to business.  A well-functioning team leads to extraordinary achievements, whether in a local darts league or a data governance team.

The Darts Team Triumph

Consider my local darts team, which recently won the team title, along with individual singles titles. This victory wasn’t just about individual knowledge and talent; it was the result of shared goals, a strategy to achieve them, collaboration, strong mentoring, and mutual support. Our players’ unique skills, camaraderie and collective effort all contributed to the team’s overall success.

Transferring Team Dynamics to Data Governance

The same principles apply to data protection governance teams. Every member of the team must understand its overall objectives and be responsible and accountable for data management and governance. The team needs a framework for success, including communication and collaboration, and the creation and maintenance of policies and procedures around data collection, privacy, compliance, integrity and security. And it must provide regular reports to the senior management who are ultimately accountable.

Roles, Goals and Data Stewardship

Individuals within the team will take on data stewardship roles.  In essence they will oversee the entire lifecycle of personal data from collection to deletion, and be accountable for compliance and security at all stages. All team members will support each other, sharing knowledge and expertise to help manage challenges and foster a culture of continuous improvement. And each will have their own individual areas of responsibility including embedding data protection throughout their own area of the business.

Education and Continuous Improvement

As in darts, governance team members learn from each other’s techniques and share knowledge, best practices and insights. This knowledge is then used to build awareness throughout the organisation about data protection and data security, and to educate employees about crucial data protection principles.

Risk Management

Sports and business both carry risks, and the team must take responsibility for identifying, assessing and mitigating them – in data governance, for example through Data Protection Impact Assessments (DPIAs).  The team must also develop and execute its response plans so that it knows how to respond if there is a data breach or security incident.

Enabling Team Leaders

Team leaders are crucial. They are pivotal in cascading information down to their specific areas of the business – in data governance, for example, it’s helpful to have leaders from IT, HR, Marketing, Operations, Payroll and so on. It’s those team leaders who will then ensure that everyone in their team understands their roles and responsibilities, and who provide the resources and training so that every individual in the organisation can thrive and contribute effectively.

Conclusion

Effective teams enable the individuals in your organisation to achieve more together than they ever could alone. With a data governance team that fosters collaboration, shared problem-solving and continuous education, your organisation will benefit from strong and highly successful outcomes.

Data Compliant International

If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742.  And for more information about how to meet your Accountability and Governance obligations, please see here. 

Data Protection and Privacy Impacts of the New UK Data (Use and Access) Bill

Background

On Wednesday 23 October 2024, the UK Government published its Data (Use and Access) Bill (“DUA”). It promised to “harness the enormous power of data to boost the UK economy by £10 billion” and “unlock the secure and effective use of data for the public interest”.

The DUA mirrors many of the concepts and provisions from the previous Government’s abandoned Data Protection and Digital Information Bill (“DPDI”), though there are subtle changes. The DUA appears to place greater focus on data sharing and digital services.

It is worth noting that the EU is set to review the UK’s data transfer adequacy status in mid-2025. Maintaining adequacy status is vital to the UK. (Possibly) as a result, some of the more contentious issues included in the discarded DPDI have been removed from the DUA. 

With the mid-2025 adequacy review date in mind, the government will undoubtedly try to get the Bill through as quickly as possible. After two readings in the House of Lords, it is now at Committee Stage.

DUA – Key Points for organisations

The key points of the DUA are:

  • UK Adequacy Status:  As stated above, the EU is due to review the UK’s adequacy status in mid-2025.
  • Accountability requirements:  in the DPDI, there were plans to amend and simplify the accountability obligations required under GDPR.  These have NOT been carried over into the DUA.  Specifically there are to be no changes to:
    • the requirements for a DPO
    • requirements for Records of Processing Activities
    • requirements for Data Protection Impact Assessments.
  • ICO Reform: The Information Commissioner’s Office will be replaced by a new corporate body called the Information Commission.  Executive members will be appointed and scrutinised by the Chair and the non-executive members.  The Commission will be required to have regard to public interest factors around data protection: for example, it must consider the desirability of promoting innovation and competition.  There is also an emphasis on protecting children in relation to data processing.
  • Special Category Data:  the Secretary of State has the power to add and remove new special categories of data. Those that already exist in Article 9 may not be removed. 
  • Data Subject Access Requests (DSARs): The discarded DPDI included the concept of an exemption for “vexatious” requests. This has NOT been included in the DUA. However, proportionality is a key consideration in the DUA, which makes responding to DSARs more straightforward, including by confirming that a DSAR search for personal data need only be “reasonable and proportionate”.
    • The 30-day time period to complete a DSAR begins only after the organisation has confirmed the individual’s identity.
    • The DUA also helps businesses by turning common DSAR practices, based on ICO guidance, into law. This offers certainty for organisations. For example:
      • If an organisation has large amounts of information about the data subject, it may ask the subject to narrow down the information requested. 
      • While it waits for this information, it may pause the response clock (there is a short illustrative sketch after this list).
  • Legitimate Interests: there is a new concept of ‘recognised legitimate interests’, where certain data processing activities will not require a full Legitimate Interest Assessment (LIA) – for example:
    • safeguarding national security or public safety
    • responding to an emergency
    • crime prevention / investigation
    • public health
    • exercising data subject rights, regulatory functions or civil law claims. 
  • This list can be updated over time, subject to parliamentary approval.
  • It is worth noting that the European Court of Justice has consistently ruled that any lawful interest may be a legitimate interest – i.e. a purely commercial interest can be a legitimate interest.
  • In addition, when conducting an LIA, it is acceptable to take into account not only the benefits to individuals, but also benefits to the environment (e.g. paper reduction) and the economy (e.g. generating growth and spending budgets in a targeted manner).
  • Privacy and Electronic Communications Regulations:  PECR is included in the DUA, and its fines are therefore aligned with the levels available for GDPR breaches – a massive increase on the £500,000 maximum fine currently in place.  In addition, the DPDI’s email soft opt-in for non-commercial organisations (such as charities) is NOT currently included (though lobbying is ongoing).
  • Cookie Consent Exemptions: The aim is to reduce the number of cookie consent banners.  The DUA allows the use of cookies without consent in specific circumstances, such as ensuring security or preventing fraud, collecting information for statistical purposes for the organisation’s own use, improving website functionality and appearance for the user, and providing emergency assistance.  This is particularly beneficial to parties who do not use advertising cookies – for example B2B websites.
  • Digital Verification Services: DUA aims to create a framework for trusted online identity verification services, moving away from paper-based and in-person tasks (e.g. registering births and deaths online). Companies providing digital verification tools must be certified against government standards and will receive a ‘trust mark’.
  • Smart Data Schemes: The introduction of smart data schemes will require businesses in sectors like financial services and public utilities to enable data interoperability and secure data sharing. This aims to enhance consumer confidence and drive innovation.
  • Data Access Provisions: The DUA introduces data access standards similar to the EU’s Data Governance Act, enabling controlled data sharing between businesses and public authorities. 
  • Automated Decision Making: The DUA will make it easier for organisations to adopt a broader use of automated decision-making for low-risk, beneficial processing – for example when using artificial intelligence (AI) systems. It limits the scope of the UK’s GDPR Article 22 to cover only “significant” decisions, and those based either in part or entirely on special category data. 
  • Data Transfers: the DUA replaces Chapter 5 of the UK GDPR with a new “data protection test” for the Secretary of State to consider international data transfers, in which the objective is to ensure standards are not materially lower than in the UK.  This differs from the EU approach which looks for equivalence.
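To make the DSAR timing point above concrete, here is a minimal sketch (in Python) of how the response deadline moves when the clock is paused while a request is narrowed. It is illustrative only: it uses the 30-day figure quoted above and treats the pause as a simple day count, which are assumptions rather than statements of the statutory wording.

```python
from datetime import date, timedelta

# Illustrative only: assumes the 30-day response period quoted above and
# models a "pause" as extra calendar days added while the organisation waits
# for the data subject to narrow the request.
def dsar_due_date(identity_confirmed: date, days_paused: int = 0) -> date:
    """Estimated DSAR deadline given the start date and any paused days."""
    return identity_confirmed + timedelta(days=30 + days_paused)

# Identity confirmed on 1 March 2025; clock paused for 5 days while the
# request was narrowed, pushing the estimated deadline from 31 March to 5 April.
print(dsar_due_date(date(2025, 3, 1), days_paused=5))  # 2025-04-05
```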

Timetable

As noted above, the government appears keen to get the Bill through as quickly as possible: it has already had two readings in the House of Lords and is currently at Committee Stage there.

Data Compliant

If you would like help or assistance with any of your data protection obligations, please email dc@datacompliant.co.uk or call 01787 277742.  And for more information about how to meet your accountability obligations – both before and after the DUA comes into force – please see here.

Victoria Tuffill

18th December 2024

EU Standard Contractual Clauses – Public Consultation

This month (September 2024), the European Commission announced that it plans to ask for public feedback on the EU Standard Contractual Clauses (SCCs) under the General Data Protection Regulation. The public consultation will take place in the fourth quarter of 2024, giving you an opportunity to have your views and opinions heard.

This is not unexpected – Article 97 of the GDPR requires the Commission to review the GDPR’s implementation every four years (see the 2020 Evaluation Report here).  The upcoming 2024 review was expected to include an evaluation of the practical application of the SCCs.

New SCCs in 2025

According to the timeline, the public consultation is imminent and due to take place in the 4th quarter of 2024. This would be followed by a draft act, planned for Commission adoption in the 2nd quarter of 2025.  You can find more information and a timeline here.

What are SCCs?

Standard contractual clauses are standardised, pre-approved model data protection clauses, which allow controllers and processors to meet their obligations under EU and / or UK data protection law. 

They are widely used as a tool for data transfers to third countries (meaning countries outside the EEA or the UK that do not have adequacy status).  It is quite a simple matter for controllers and processors to incorporate them into their contractual arrangements.

The clauses contain data protection safeguards to make sure that personal data benefits from a high level of protection even when sent to a third country.  By adhering to the SCCs, data importers are contractually committed to abide by a set of data protection safeguards.

Can I change the text?

The core text cannot be changed. If the parties do change the text themselves, they will no longer have the legal certainty offered by the EU act.  If you amend the clauses, they can no longer be used as a basis for data transfers to third countries, unless they are approved by a national data protection authority as “ad hoc clauses”.

Even so, there are areas where the parties can make choices:

  • To select modules and / or specific options offered within the text
  • To complete the text where necessary (e.g. to specify time periods, the supervisory authority and the competent courts)
  • To complete the Annexes
  • To include additional safeguards that increase the level of protection for the data. 

Impact on UK use of SCCs

There is not yet any indication of the potential impact on the UK’s International Data Transfer Agreement (IDTA) or the Addendum to the EU’s SCCs; we would expect to hear more after the EU’s public consultation.

Victoria Tuffill – 13th September 2024

If you have any questions or concerns about how and when to use SCCs, please call 01787 277742 or email dc@datacompliant.co.uk

And please take a look at our services.

Phishing / Vishing – if you’re tricked … report it fast

In today’s digital age, cybersecurity is critical to every organisation’s operations. One of the most common causes of data breaches is when attackers trick you into giving them personal information they should not have.

Vishing and Phishing tricks

Take vishing attacks, for example, where somebody calls you pretending to be a company you work with and asks for what seems like perfectly reasonable information. You provide it.

Or you click on a link in a phishing email because it looks as though it has come from someone you know.  It takes you to a web page asking for information. You provide it. 

In both cases, you have provided the attacker with information that should never have been given to them.

What to do next?

These social engineering attempts are becoming increasingly sophisticated, making them hard to detect.  And it’s too easy to fall into their trap.

Remember, if you DO click on a malicious link, or enter your password details in a site which looks suspicious …. don’t just pretend it didn’t happen. It is crucial that the error is immediately reported to your line manager or IT department. 

Why should I own up to causing a data breach?

Because if you do so immediately, the breach can be prevented.  Here’s an example from one of our clients. 

One of its employees received an email from a contact he had previously communicated with via a cloud-sharing platform he would normally access, and at first glance, it all looked fine.  Until he clicked on a malicious link, entered his credentials and authenticated using Multi-Factor Authentication (MFA).  He realised his mistake and reported it to IT immediately.


How did reporting it help?

Thanks to the swift reporting and decisive actions taken by the IT team, any potential security breach was thwarted before it could escalate into a more serious incident.  There was no breach of confidentiality, inappropriate access, or unavailability of the company’s personal data.

Organisations can prevent potential security breaches and protect their valuable assets by acting swiftly and decisively.  Fast action can make all the difference between a minor incident and a significant data breach.

What if I don’t own up?

Well, let’s imagine it. Those login attempts would, without doubt, have succeeded.  All the personal data to which that employee had access would be in the hands of the attacker. The company would be facing a severe data breach situation.  They’d be using time and resources to investigate.  They’d probably be calling in expert help to mitigate the damage.  The cost, both in time and money, would be substantial, as they considered:

  • How much damage have I done to my customers?
  • How can I contain the damage?
  • Will I be sued?
  • How many customers will I lose?
  • Who can I use – internally and/or externally –  to help me work through this?
  • How much will I have to spend on expert advice and support through the process?
  • How will I pay for it?
  • Is the breach severe enough to warrant reporting to the supervisory authority – if so, how can I do so within 72 hours?
  • How can I save us from the corresponding reputational damage?
  • How much will I be fined, and what can I do to try to reduce the amount?

But because this issue was reported without delay … none of the above was necessary.

Employers must encourage employees to report errors

Employers play a crucial role in fostering a culture of reporting errors. It is important to create an environment where employees feel safe to admit their mistakes and are encouraged to report any suspicious activity.

So, as an employer, it is your job to make sure your staff know that they must report any errors of this kind.  Instead of making them scared to report their mistake, offer them help – for example, by providing additional or repeat training to help them recognise such tricks in future. 

Train your workers to stay vigilant, stay informed, and promptly report any suspicious activity to your IT team.  This will also have the advantage of fostering a culture of vigilance, awareness and honesty.  It helps everyone in the organisation understand the importance of information security and how to safeguard personal data.

If you’re in any doubt, re-read the story above.  And stop to think about the potential damage that would have ensued if the individual concerned had not reported his error.  The attackers would – without doubt – have logged in successfully, and would then have been able to succeed in meeting their aims – to the detriment of your company and your customers.

Victoria Tuffill

20th August 2024

If you have any questions or concerns with data security and increasing staff awareness, please call 01787 277742 or email dc@datacompliant.co.uk

And please take a look at our services.

HMRC’s 28 days to delete unlawfully obtained biometric data

In a statement released on 3rd May, the Information Commissioner’s Office reiterated its decision to issue HMRC with a preliminary enforcement notice in early April. The initial notice was based on an investigation conducted by the ICO after a complaint from Big Brother Watch concerning HMRC’s Voice ID service, which has been in use on a number of the department’s helplines since January 2017.


The voice authentication used for customer verification relies on a type of biometric data considered special category information under the GDPR, and is therefore subject to stricter conditions. The ICO’s investigation found that HMRC did “not give customers sufficient information about how their biometric data would be processed and failed to give them the chance to give or withhold consent.” HMRC was therefore in breach of GDPR.

The preliminary enforcement notice issued by the ICO on April 4th stated that HMRC must delete all data within the Voice ID system for which the department was never given explicit consent to have or use. According to Big Brother Watch, this data amounted to approximately five million records of customers’ voices. These records would have been obtained on HMRC’s helplines, but due to poor data security policy for the Voice ID system, the customers had no means of explicitly consenting to HMRC’s processing of this data.

Steve Wood, Deputy Commissioner at the ICO, stated, “We welcome HMRC’s prompt action to begin deleting personal data that it obtained unlawfully. Our investigation exposed a significant breach of data protection law – HMRC appears to have given little or no consideration to it with regard to its Voice ID service.”

The final enforcement notice is expected on 10th May. It will give HMRC a 28-day timeframe to complete the deletion of this large collection of biometric data.

The director of Big Brother Watch, Silkie Carlo, was encouraged by the ICO’s actions:

“To our knowledge, this is the biggest ever deletion of biometric IDs from a state-held database. This sets a vital precedent for biometrics collection and the database state, showing that campaigners and the ICO have real teeth and no government department is above the law.”

 Harry Smithson, May 2019. 

Be Data Aware: the ICO’s campaign to improve data awareness

As the Information Commissioner’s Office’s ongoing investigation into the political weaponisation of data analytics and harvesting sheds more and more light on the reckless use of ‘algorithms, analysis, data matching and profiling’ involving personal information, consumers are becoming more data conscious. On 8th May, the ICO launched an awareness campaign featuring a video, legal factsheets reminding citizens of their rights under GDPR, and advice guidelines on internet behaviour. The campaign is currently circulating on Twitter under #BeDataAware.


While the public is broadly aware of targeted marketing, and fairly accustomed to the process of companies attempting to reach certain demographics, the political manipulation of data is considered, if not a novel threat, then a problem compounded by the new frontier of online data analytics. Ipsos MORI’s UK Cyber Survey conducted on behalf of the DCMS found that 80% of respondents considered cyber security to be a ‘high priority,’ but that many of these people would not be in groups likely to take much action to prevent cybercrime personally. What this could indicate is that while consumers may be concerned about cybercrime being used against themselves, they are also aware of broader social, economic and political dangers that the inappropriate or illegal use of personal information poses.

ICO’s video (currently on Vimeo, but not YouTube), titled ‘Your Data Matters,’ asks at the beginning, “When you search for a holiday, do you notice online adverts become much more specific?” Proceeding to graphics detailing this relatively well-known phenomenon, the video then draws a parallel with political targeting: “Did you know political campaigners use these same targeting techniques, personalising their campaign messaging to you, trying to influence your vote?” Importantly, the video concludes, “You have the right to know who is targeting you and how your data is used.”

To take a major example of an organisation trying to facilitate this right, Facebook allows users to see why they may have been targeted by an advert with a clickable, dropdown option called ‘Why am I seeing this?’ Typically, the answer will read ‘[Company] is trying to reach [gender] between the ages X – Y in [Country].’ But the question remains as to whether this will be sufficiently detailed in the future. With growing pressure on organisations to pursue best practice when it comes to data security, and with the public’s growing perception of the political ramifications of data security policies, will consumers and concerned parties demand more information on, for instance, which of their online behaviours have caused them to be targeted?

A statement from the Information Commissioner Elizabeth Denham as part of the Be Data Aware campaign has placed the ICO’s data security purview firmly in the context of upholding democratic values.

“Our goal is to effect change and ensure confidence in our democratic system. And that can only happen if people are fully aware of how organisations are using their data, particularly if it happens behind the scenes.

“New technologies and data analytics provide persuasive tools that allow campaigners to connect with voters and target messages directly at them based on their likes, swipes and posts. But this cannot be at the expense of transparency, fairness and compliance with the law.”

Uproar surrounding the data analytics scandal, epitomised by Cambridge Analytica’s data breach beginning in 2014, highlights the public’s increasing impatience with the reckless use of data. The politicisation of cybercrime, and greater knowledge and understanding of data misuse, means that consumers will be far less forgiving of companies that are not seen to be taking information security seriously.

Harry Smithson 9 May 2019

GDPR and Accountants

GDPR Debate

On Monday, 16th October, Data Compliant’s Victoria Tuffill was invited by AccountingWEB to join a panel discussion on how GDPR will impact accountants and tax agents.

The other members of the panel were our host, John Stokdyk, Global Editor of AccountingWEB, who kept us all on the straight and narrow, while asking some very pertinent questions; Ian Cooper from Thomson Reuters who gave strong insights into technical solutions; and Dave Tucker from Thompson Jenner LLP, who provided a very useful practitioner viewpoint.

GDPR in General

There is a presumption that every professional body is fully informed of all compliance regulations within their field of expertise.  But the continuing barrage of changes and adjustments to European and British law makes it easy to drop the ball.

GDPR is a typical example.  To quote the Information Commissioner, Elizabeth Denham, it’s “The biggest change to data protection law for a generation”. Yet for many accountants – and so many others – it’s only just appearing on the radar.   This means there’s an increasingly limited amount of time to be ready.

GDPR has been 20 years coming, and is intended to bring the law up to date – in terms of new technology, new ways we communicate with each other, and the increasing press coverage and consumer awareness of personal data and how it’s used by professional organisations and others.  GDPR has been law for 17 months now, and it will be enforced from May 2018.

GDPR and Accountants

So what does GDPR mean for accountants in particular?

  • Accountants will have to deal with the fact that GDPR is designed to give individuals back control over their own personal information and to strengthen their rights.
  • It increases compliance and record keeping obligations on accountants. GDPR makes it very plain that any firm which processes personal data is obliged to protect that data – for accountants that responsibility is very significant given the nature of the personal data an accountant holds.
  • There are increased enforcement powers – I’m sure everyone’s heard of the maximum fine of €20 million or 4% of global turnover, whichever is higher (there is a short illustrative sketch after this list). But the media also take a close interest in data breaches – and often the reputational damage has a far greater impact than the fine.
  • Accountancy firms must know precisely what data they hold and where it’s held, so they can assess the scale of the issue and be sure to comply with the demands of GDPR.
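For those who find worked numbers easier than percentages in the abstract, here is a minimal sketch (in Python) of the “whichever is higher” fine cap mentioned in the list above. The €20 million / 4% figures are the GDPR’s higher tier of fines; the turnover values are made-up examples.

```python
# Illustrative only: GDPR's higher tier of administrative fines is capped at
# EUR 20 million or 4% of total worldwide annual turnover, whichever is HIGHER.
def max_gdpr_fine_eur(annual_global_turnover_eur: float) -> float:
    """Theoretical maximum higher-tier fine for a given worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# A firm with EUR 100m turnover is capped at EUR 20m (the fixed floor applies);
# a firm with EUR 1bn turnover is capped at EUR 40m (4% exceeds the floor).
print(max_gdpr_fine_eur(100_000_000))    # 20000000.0
print(max_gdpr_fine_eur(1_000_000_000))  # 40000000.0
```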

The video covers key points for practitioners to understand before they can prepare for compliance, and summarises some initial steps they should take today to prepare their firms.


The session can be found here:  Practice Excellence Live 2017:  GDPR.

It is a 45 minute video, so for those with limited time, I have broken down the areas covered into bite-size chunks:

[Image: video timings by topic]

Data Compliant is working with its clients to help them prepare for GDPR, so if you are concerned about how GDPR will affect your firm or business, feel free to give us a call on 01787 277742, or email dc@datacompliant.co.uk if you’d like more information.

Victoria Tuffill, 19th October 2017

Data Compliant News Blog: Cyberattack threatens over 400,000 British consumers, Data Protection Bill 2017 published and fines levied on councils mishandling data

Equifax data breach – hackers may have access to hundreds of thousands of British consumers’ personal details

The Information Commissioner’s Office (ICO) is investigating a hack on Equifax, a large credit rating agency based in Atlanta, USA, to find out whether and to what extent the company’s British consumers’ personal details have been obtained by the hackers. The FBI is also said to be monitoring the situation.

The cyberattack, reported earlier this month, occurred between May and July. The company has already admitted that 143 million American customers’ personal details have been obtained by the hackers.


400,000 UK customers may be affected by Equifax breach

The US information that the hackers may have accessed includes names, social security numbers, dates of birth, addresses and driving licence details, as well as over 200,000 credit card numbers.

The ICO told Equifax that the company must warn British residents of the data breach and inform them of any information relating to them which has been obtained by the cyber attackers. The credit agency promptly issued alerts to the affected Britons, stating however that an ‘identity takeover’ was unlikely.

Britons would do well to be mindful that, once a hacker has a name, date of birth, email address and telephone number, it takes little effort to acquire the missing elements, which is why the ICO has warned members of the public to remain vigilant against unsolicited emails and communications.  They should also be particularly wary of unexpected transactions or activity recorded on their financial statements.

Shares in Equifax fell considerably throughout the week, and two of the company’s senior executives, the Chief Information Officer and the Chief Security Officer, have resigned with immediate effect.

The Data Protection Bill 2017, which includes GDPR, has been published


GDPR is included in its entirety in the UK’s Data Protection Bill 2017, now going through Parliament

On 14th September, the Department for Digital, Culture, Media and Sport published the Data Protection Bill 2017. The Bill has been anticipated since the Queen’s speech in June, in which the government outlined its plan to implement the European-wide data protection game-changer GDPR into British law.

Culture secretary Karen Bradley explains: “The Data Protection Bill will give people more control over their data, support businesses in their use of data, and prepare Britain for Brexit.  In the digital world strong cyber security and data protection go hand in hand. This Bill is a key component of our work to secure personal information online.”

While the Bill incorporates the GDPR, and therefore provides the basis for data-sharing and other adequacy agreements with the EU after Brexit, the government has stated that it managed to negotiate some ‘vital’ and ‘proportionate’ exemptions for the UK.

Some of the exemptions cover journalists accessing personal data to expose wrongdoing or in the public interest; scientific and research organisations, such as museums, whose work would otherwise be hindered; anti-doping bodies; financial firms handling personal data on suspicion of terrorist financing or money laundering; and employment, where access to personal data may be needed to fulfil the requirements of employment law.

The second reading of the Bill in Parliament will take place on 10th October, after which a general debate on Brexit and data protection takes place on the 12th.

As yet, there have been few critics of the proposed legislation outside certain industries whose use of big data makes them particularly susceptible to possible data protection breaches and massive fines (£17m or 4% annual global turnover). Some industry leaders have called for exemptions, including the private pension giant Scottish Widows, who claimed GDPR-level regulations would make it impossible for them to contact some of their customers without breaking the law. However, according to the government, 80% of Britons do not believe that they have control over their information online, and the Bill enjoys widespread support at this point. The Shadow Cabinet has yet to offer any official response or criticism.

Islington Council fined £70,000 

The Information Commissioner’s Office (ICO) fined Islington Council £70,000 for failing to secure 89,000 peoples’ personal information on an online parking ticket system.

Design faults in the Council’s ‘Ticket Viewer’ system, which keeps CCTV images of parking offences, compromised the security of 89,000 peoples’ personal data. Some of this data is under the category of sensitive personal information, e.g. medical details disclosed for the sake of appealing against a parking fine.

Harry Smithson 23rd September 2017

Data Protection Weekly Roundup: GDPR exemption appeals, gambling industry exploitation scandal, cyber attacks and data breaches

Corporate pensions company Scottish Widows to lobby for specific exemptions from the General Data Protection Regulation ahead of EU initiative’s May 2018 introduction.


Scottish Widows seeks derogations in relation to communicating with its customers in order to “bring people to better outcomes.”

The Lloyds Banking Group subsidiary Scottish Widows, the 202-year-old life, pensions and investment company based in Edinburgh, has called for derogations from the GDPR.

A great deal has been written across the Internet about the impending GDPR, and much of the information available is contradictory. In fact many organisations and companies have been at pains to work out what exactly will be expected of them come May 2018. While it is true that the GDPR will substantially increase policy enforcers’ remits for penalising breaches of data protection law, the decontextualized figure of monetary penalties reaching €20 million or 4% of annual global turnover – while accurate in severe cases – has become something of a tub-thump for critics of the regulation.

Nevertheless, the GDPR is the most ambitious and widescale attempt to secure individual privacy rights in a proliferating global information economy to date, and organisations should be preparing for compliance. But the tangible benefits from consumer and investor trust provided by data compliance should always be kept in sight. There is more information about the GDPR on this blog and the Data Compliant main site.

Certain sectors will feel the effects of GDPR – in terms of the scale of work to prepare for compliance – more than others. It is perhaps understandable, therefore, why Scottish Widows, whose pension schemes may often be supplemented by semi-regular advice and contact, would seek derogations from the GDPR’s tightened conditions for proving consent to specific types of communications. Since the manner in which consent to communicate with their customers was acquired by Scottish Widows will not be recognised under the new laws, the company points out that “in future we will not be able to speak to old customers we are currently allowed to speak to.”

Scottish Widows’ head of policy, pensions and investments Peter Glancy’s central claim is that “GDPR means we can’t do a lot of things that you might want to be able to do to bring people to better outcomes.”

Article 23 of the GDPR enables legislators to provide derogations in certain circumstances. The Home Office and Department of Health for instance have specific derogations so as not to interfere with the safeguarding of public health and security. Scottish Widows cite the Treasury’s and DWP’s encouragement of increased pension savings, and so it may well be that the company plans to lobby for specific exemptions on the grounds that, as it stands, the GDPR may put pressure on the safeguarding of the public’s “economic or financial interests.”

Profiling low income workers and vulnerable people for marketing purposes in gambling industry provokes outrage and renewed calls for reform.


The ICO penalised charities  for “wealth profiling”. Gambling companies are also “wealth profiling” in reverse – to target people on low incomes who can ill afford to play

If doubts remain that the systematic misuse of personal data demands tougher data protection regulations, these may be dispelled by revelations that the gambling industry has been using third party affiliates to harvest data so that online casinos and bookmakers can target people on low incomes and former betting addicts.

An increase in the cost of gambling ads has prompted the industry to adopt more aggressive marketing and profiling with the use of data analysis. An investigation by the Guardian including interviews with industry and ex-industry insiders describes a system whereby data providers or ‘data houses’ collect information on age, income, debt, credit information and insurance details. This information is then passed on to betting affiliates, who in turn refer customers to online bookmakers for a fee. This helps the affiliates and the gambling firms tailor their marketing to people on low incomes, who, according to a digital marketer, “were among the most successfully targeted segments.”

The data is procured through various prize and raffle sites that prompt participants to divulge personal information after lengthy terms and conditions which, marketers in the industry suspect, serve only to obscure from many users how and where the data will be transferred and used.

This practice, which enables ex-addicts to be tempted back into gambling by the offer of free bets, has been described as extremely effective. In November last year, the Information Commissioner’s Office (ICO) targeted more than 400 companies after allegations that the betting industry was sending spam texts (a misuse of personal data). There is no indication that any official measures were taken after those investigations, which could have included actions such as a fine of up to £500,000 under the current regulations. Gambling companies are regulated by the separate Gambling Commission, which seeks to ensure responsible marketing and practice. But under the GDPR it may well be that the ICO would have licence to take a much stronger stance against the industry’s entrenched abuse of personal information to encourage problem gambling.

Latest ransomware attack on health institution affects Scottish health board, NHS Lanarkshire.

According to the board, a new variant of the malware Bitpaymer, different to the infamous global WannaCry malware, infected its network and led to some appointment and procedure cancellations. Investigations are ongoing into how the malware managed to infect the system without detection.

Complete defence against ransomware attacks is problematic for the NHS because certain vital life-saving machinery and equipment could be disturbed or rendered dysfunctional if the NHS network is changed too dramatically (i.e. tweaked to improve anti-virus protection).

A spokesman for the board’s IT department told the BBC, “Our security software and systems were up to date with the latest signature files, but as this was a new malware variant the latest security software was unable to detect it. Following analysis of the malware our security providers issued an updated signature so that this variant can now be detected and blocked.”

Catching the hackers in the act


Attacks on newly-set up online servers start within just over one hour, and are then subjected to “constant” assault.

According to an experiment conducted by the BBC, cyber-criminals start attacking newly set-up online servers about an hour after they are switched on.

The BBC asked a security company, Cybereason, to carry out an experiment to judge the scale and calibre of cyber-attacks that firms face every day.   A “honeypot” was then set up, in which servers were given real, public IP addresses and other identifying information that announced their online presence; each was configured to resemble, superficially at least, a legitimate server.  Each server could accept requests for webpages, file transfers and secure networking, and was accessible online for about 170 hours.

They found that automated attack tools scanned the servers about 71 minutes after they were set up online, trying to find areas they could exploit.  Once the machines had been found by the bots, they were subjected to a “constant” assault by the attack tools.
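For readers curious how such an experiment measures the time to the first attack, here is a minimal sketch (in Python, standard library only) of a single-port listener that logs when and from where each unsolicited connection attempt arrives. It is not the Cybereason setup – a real honeypot would mimic many services and capture far more detail – just an illustration of the logging principle, with an assumed port number.

```python
import socket
from datetime import datetime, timezone

# Minimal illustrative honeypot listener: accepts TCP connections on one port
# and records the time and source address of every probe. The gap between
# start-up and the first logged probe gives a "time to first attack" figure.
HOST, PORT = "0.0.0.0", 8080  # assumed port; real honeypots watch many services

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((HOST, PORT))
    server.listen()
    print(f"Honeypot listening on {HOST}:{PORT}")
    while True:
        conn, addr = server.accept()
        with conn:
            stamp = datetime.now(timezone.utc).isoformat()
            print(f"{stamp} probe from {addr[0]}:{addr[1]}")
```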

Vulnerable people’s personal information exposed online for five years


Vulnerable customers’ personal data needs significant care to protect the individuals and their homes from harm

Nottinghamshire County Council has been fined £70,000 by the Information Commissioner’s Office for posting the genders, addresses, postcodes and care needs of elderly and disabled people in an online directory – without security or access restrictions such as a login requiring a username or password.  The data also included details of the individuals’ care needs, the number of home visits per day and whether they were or had been in hospital.  Though names were not included on the portal, it would have taken very little effort to identify the individuals from their addresses and genders.

This breach was discovered when a member of the public was able to access and view the data without any need to login, and was concerned that it could enable criminals to target vulnerable people – especially as such criminals would be aware that the home would be empty if the occupant was in hospital.

The ICO’s Head of Enforcement, Steve Eckersley, stated that there was no good reason for the council to have overlooked the need to put robust measures in place to protect the data – the council had financial and staffing resources available. He described the breach as “serious and prolonged” and “totally unacceptable and inexcusable.”

The “Home Care Allocation System” (HCAS) online portal was launched in July 2011, to allow social care providers to confirm that they had capacity to support a particular service user.  The breach was reported in June 2016, and by this time the HCAS system contained a directory of 81 service users. It is understood that the data of 3,000 people had been posted in the five years the system was online.

Not surprisingly, the Council offered no mitigation to the ICO.  This is a typical example of where a Data Protection Impact Assessment will be mandated under GDPR.

Harry Smithson, 6th September 2017