Tag Archives: data compliant

Brexit: 55 Days to Go, or is it?

It has been a momentous week for UK politics. With Parliament back from the summer recess, MPs moved to seize control of the Order Paper from the Government. There then followed an audacious attempt to legislate against a “No Deal” Brexit, a move which would hamstring the Government’s Brexit negotiation strategy. The Government’s strenuous attempts to prevent the passage of legislation taking the “No Deal” option out of the equation led to the withdrawal of the whip from (in effect the suspension of) 21 Conservative MPs. The legislation that would prevent a No Deal outcome will return to Parliament early next week. It remains to be seen whether it receives Royal Assent and is written into law.

In the meantime, the plan to prorogue Parliament for five weeks ahead of the Brexit deadline moved on apace despite a number of legal challenges.

Elsewhere 

This week also saw the Prime Minister’s own brother, Jo Johnson, resign as Universities Minister registering his objection to the direction of the Brexit negotiations which he viewed as no longer in the national interest. 

On a happier note, Downing Street announced the arrival of a new resident as the Prime Minister and his partner unveiled Dilyn, their new puppy.

The Prime Minister is keen to reinforce his Government’s resolve to achieve Brexit, with or without a deal, by the 31st October deadline. But despite his vigorous defence of this policy, it remains unclear whether this will be achieved.

Implications for businesses

Amongst all this political turmoil it is difficult for businesses to plan ahead, especially if their business model includes data transfers to or from EEA companies.

In the case of a No Deal Brexit, on the date of departure the UK becomes a ‘Third Country’ in terms of EU data transfer rules.  This means that the UK will not have adequacy status, so companies need to take particular steps when processing EEA* data, for example the data of your customers, prospects or clients.

What you need to do

The UK will recognise all EEA countries as adequate under UK law.  So there are no issues with you continuing to send personal data to the EEA. 

The reverse, however, is not the case so there will be major changes when transferring personal data from the EEA to the UK.  You need to prepare:

  1. “Know your data”, specifically the data you process about EEA individuals.  Make sure your data mapping is up to date and identifies those individuals outside the UK but within the EEA (a minimal sketch of this step follows the list).
  2. Take appropriate GDPR safeguards for the processing and transfer of EEA data, and update your privacy policy accordingly.
  3. Use Standard Contractual Clauses to enable transfers of personal data from the EEA to the UK and vice versa.
  4. If you are using Binding Corporate Rules, these will need to be adjusted slightly post-Brexit.
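
By way of illustration only, here is a minimal Python sketch of the data-mapping step above. The flat customer table and field names are invented for the example; a real data map would also record systems, transfer routes and legal bases.

```python
# Illustrative only: flag which records in a simple customer table belong to
# EEA data subjects, so they can be reviewed for post-Brexit safeguards.
# The flat structure and field names are invented for this example.

EEA_COUNTRIES = {
    # The 27 remaining EU member states...
    "Austria", "Belgium", "Bulgaria", "Croatia", "Cyprus", "Czechia",
    "Denmark", "Estonia", "Finland", "France", "Germany", "Greece",
    "Hungary", "Ireland", "Italy", "Latvia", "Lithuania", "Luxembourg",
    "Malta", "Netherlands", "Poland", "Portugal", "Romania", "Slovakia",
    "Slovenia", "Spain", "Sweden",
    # ...plus Iceland, Liechtenstein and Norway
    "Iceland", "Liechtenstein", "Norway",
}

def flag_eea_records(records):
    """Return the records whose data subjects are outside the UK but within
    the EEA - the ones needing safeguards such as SCCs after a No Deal exit."""
    return [r for r in records if r.get("country") in EEA_COUNTRIES]

customers = [
    {"id": 1, "name": "Ann", "country": "France"},
    {"id": 2, "name": "Bob", "country": "United Kingdom"},
]
print(flag_eea_records(customers))  # only the French record is flagged
```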

*EEA = the 27 remaining EU Member States, plus Iceland, Liechtenstein and Norway.

Please feel free to contact us if you have any queries or concerns about how Brexit will affect your business, by calling 01787 277742 or emailing teambrexit@datacompliant.co.uk

GDPR’s 1st Birthday

General Data Protection Regulation reaches its first birthday

This blogpost arrives as the General Data Protection Regulation (GDPR) reaches its first birthday, and a week after a report from the Washington-based Center for Data Innovation (CDI) suggested amendments to the GDPR.

The report argues that regulatory relaxations would help foster Europe’s ‘Algorithmic Economy,’ claiming that the GDPR’s restrictions on data sharing set back European competitiveness in the AI technology field. 

Citing the European Commission’s ambition “for Europe to become the world-leading region for developing and deploying cutting-edge, ethical and secure AI,” the report then proceeds to its central claim that “the GDPR, in its current form, puts Europe’s future competitiveness at risk.”  

That being said, the report notes with approval France’s pro-AI strategy within the GDPR framework, in particular the country’s use of the clause that “grants them the authority to repurpose and share personal data in sectors that are strategic to the public interest—including health care, defense, the environment, and transport.” 

Research is still being conducted into the legal and ethical dimensions of AI and the potential ramifications of automated decision-making on data subjects. In the UK, the ICO and the government’s recently established advisory body, the Centre for Data Ethics and Innovation (CDEI – not to be confused with the aforementioned CDI), are opening discussions and issuing calls for evidence regarding individuals’ or organisations’ experiences with AI. There are of course responsible ways of using AI, and organisations hoping to make the best of this new technology have the opportunity to shape the future of Europe’s innovative but ethical use of data. 

The Information Commissioner’s Office (ICO) research fellows and technology policy advisors release a short brief for combating Artificial Intelligence (AI) security risks 

The ICO’s relatively young AI auditing framework blog (set up in March this year) discusses security risks in its latest post, using comparative examples of data protection threats between traditional technology and AI systems. The post focuses “on the way AI can adversely affect security by making known risks worse and more challenging to control.” 

From a data protection perspective, the main issue with AI is its complexity and the volume not only of data, but of externalities or ‘external dependencies’ that AI requires to function, particularly in the AI subfield of Machine Learning (ML). Externalities take the form of third-party or open-source software used for building ML systems, or third-party consultants or suppliers who use their own or a partially externally dependent ML system. The ML systems themselves may have over a hundred external dependencies, including code libraries, software or even hardware, and their effectiveness will be determined by their interaction with multiple data sets from a huge variety of sources. 

Organisations will not have contracts with all of these third parties, making data flows throughout the supply chain difficult to track and data security hard to keep on top of. AI developers come from a wide array of backgrounds, and there is no unified or coherent policy for data protection within the AI engineering community. 

The ICO’s AI auditing blog uses the example of an organisation hiring a recruitment company that uses Machine Learning to match candidate CVs to job vacancies. Traditionally, a certain amount of personal data would be transferred between the organisation and the recruitment agency using manual methods. However, the additional steps in the ML system mean that data will be stored and transferred in different formats across different servers and systems. The blog concludes that, “for both the recruitment firm and employers, this will increase the risk of a data breach, including unauthorised processing, loss, destruction and damage.” 

 For example, they write: 

  • The employer may need to copy HR and recruitment data into a separate database system to interrogate and select the data relevant to the vacancies the recruitment firm is working on. 
  • The selected data subsets will need to be saved and exported into files, and then transferred to the recruitment firm in compressed form. 
  • Upon receipt the recruitment firm could upload the files to a remote location, eg the cloud.  
  • Once in the cloud, the files may be loaded into a programming environment to be cleaned and used in building the AI system. 
  • Once ready, the data is likely to be saved into a new file to be used at a later time. 

This example will be relevant to all organisations contracting external ML services, which is the predominant method for UK businesses hoping to harness the benefits of AI. The blog provides three main pieces of advice based on ongoing research into this new, wide (and widening) area of data security. They suggest that organisations should: 

  • Record and document the movement and storage of personal data, noting when the transfer took place, the sender and recipient, and the respective locations and formats. This will help monitor security risks and data breaches (see the sketch after this list); 
  • Delete intermediate files, such as compressed versions, once they are no longer required – as per best-practice data protection guidelines; and 
  • Apply de-identification and anonymisation techniques and technologies to data before it is taken from the source and shared either internally or externally. 
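
The sketch below loosely illustrates the first and third points. Everything in it is invented for the example (the log file name, the parties, the salt), and the salted hashing shown is pseudonymisation rather than full anonymisation, so it is a starting point rather than a complete safeguard.

```python
# Illustrative only: log each movement of personal data (the blog's first
# suggestion) and pseudonymise identifiers before a file leaves the source
# system (related to the third). File name, parties and salt are invented;
# salted hashing is pseudonymisation, not full anonymisation.
import csv
import hashlib
from datetime import datetime, timezone

TRANSFER_LOG = "transfer_log.csv"

def log_transfer(sender, recipient, location, data_format):
    """Append one row per data movement: when, who, where and in what format."""
    with open(TRANSFER_LOG, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now(timezone.utc).isoformat(), sender, recipient,
             location, data_format]
        )

def pseudonymise(value, salt="rotate-this-salt"):
    """One-way hash of a direct identifier (e.g. an email address) so the
    shared file no longer contains it in the clear."""
    return hashlib.sha256((salt + value).encode()).hexdigest()

# Example: pseudonymise candidate emails, then record the transfer.
candidates = [{"email": "jane@example.com", "skills": "python"}]
shared = [{**c, "email": pseudonymise(c["email"])} for c in candidates]
log_transfer("HR database", "recruitment firm", "cloud bucket", "compressed CSV")
```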

Harry Smithson,  May 2019

What is a Data Protection Officer (DPO), and do you need one?

A DPO (Data Protection Officer) is an individual responsible for ensuring that their organisation processes the data of its staff, customers, providers and any other individuals (i.e. data subjects) in compliance with data protection regulations. Under the EU-wide General Data Protection Regulation (GDPR), a DPO is mandatory for:

  1. Public authorities; and
  2. Organisations that process data
  • On a large scale; 
  • With regular and systematic monitoring; 
  • As a core activity; or 
  • Involving large volumes of ‘special category data,’ formerly known as ‘sensitive personal data,’ i.e. information related to a living individual’s racial or ethnic origin, political opinions, religious beliefs, trade union membership, physical or mental health condition, sex life or sexual orientation, or biometric data.  

It may not be immediately obvious whether an organisation must have a designated DPO under GDPR. Where it is unclear, it is necessary to make a formal evaluation, recording the decision and the reasons behind it. The WP29 Guidelines on Data Protection Officers (‘DPO’), endorsed by the European Data Protection Board (EDPB), recommend that organisations conduct and document an internal analysis to determine whether or not a DPO should be appointed. Ultimately, such decision-making should always take into account the organisation’s obligation to fulfil the rights of the data subject, the primary concern of the GDPR: does the scale, volume or type of data processing in your organisation risk adversely affecting an individual or the wider public?

When a DPO is not legally required

Organisations may benefit from voluntarily appointing an internal DPO or hiring an advisor – this will ensure best-practice data protection policies and practices, improving cyber security, staff and consumer trust, and bringing other business benefits. When a DPO is designated voluntarily, the appointment is treated as if it were mandatory under GDPR – i.e. the voluntarily appointed DPO’s responsibilities, as defined in Articles 37 to 39 of the GDPR, correspond to those of a legally mandated DPO (in other words, GDPR does not recognise a quasi-DPO with reduced responsibility). As the WP29 guidance explains, “if an organisation is not legally required to designate a DPO, and does not wish to designate a DPO on a voluntary basis, that organisation is quite at liberty to employ staff or outside consultants to provide information and advice relating to the protection of personal data.”

However, it is important to ensure that there is no confusion regarding their title, status, position and tasks. Therefore, it should be made clear, in any communications within the company, as well as with data protection authorities, data subjects, and the public at large, that the title of this individual or consultant is not a data protection officer (DPO).

But how are the conditions that make a DPO mandatory defined under GDPR?

Large-scale processing: there is no absolute definition under GDPR, but there are evaluative guidelines. The WP29 guidance suggests data controllers should consider the following (a sketch of how such an analysis might be documented follows the list):

  • The number of data subjects concerned;
  • The volume of data processed;
  • The range of data items being processed;
  • The duration or permanence of the data processing activity; and
  • The geographical extent.
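
Purely as an illustration of documenting such an analysis, here is a minimal Python sketch. The factor values and file name are invented, and GDPR sets no numeric thresholds; the point is simply to record the factors considered and the reasoned decision.

```python
# Illustrative only: record the internal analysis and its outcome so the
# decision can be evidenced later. The factor values and file name are
# invented; GDPR sets no numeric cut-offs.
import json
from datetime import date

def record_dpo_assessment(factors, dpo_required, reasons,
                          path="dpo_assessment.json"):
    """Store the evaluated factors, the decision and the reasoning behind it."""
    with open(path, "w") as f:
        json.dump(
            {
                "assessed_on": date.today().isoformat(),
                "factors": factors,          # e.g. the five criteria above
                "dpo_required": dpo_required,
                "reasons": reasons,
            },
            f,
            indent=2,
        )

record_dpo_assessment(
    factors={
        "data_subjects": 250_000,
        "special_category_data": True,
        "regular_systematic_monitoring": True,
        "duration": "ongoing",
        "geographical_extent": "UK and EEA",
    },
    dpo_required=True,
    reasons="Large-scale, ongoing monitoring including special category data.",
)
```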

Regular and systematic monitoring: as with ‘large-scale processing,’ there is no definition as such, but WP29 guidance clarifies that monitoring involves any form of tracking or profiling on the internet, including for the purposes of behavioural advertising. Here are a number of examples of regular and systematic monitoring:

  • Data-driven marketing activities;
  • Profiling and scoring for purposes of risk assessment;
  • Email retargeting;
  • Location tracking (e.g. by mobile apps); or
  • Loyalty programmes.

 What does a Data Protection Officer do?

Article 39 of the GDPR, ‘Tasks of the data protection officer,’ lists the DPO’s obligations. As a minimum, a DPO’s responsibilities comprise the tasks summarised below:

  1. Inform and advise the controller or the processor, and the employees who carry out processing, of their obligations
  2. Monitor compliance with the Regulation, with other Union or Member State data protection provisions and with the policies of the controller or processor in relation to the protection of personal data, including the assignment of responsibilities, awareness-raising and training of staff involved in processing operations, and the related audits
  3. Provide advice as regards data protection impact assessments and monitor their performance
  4. Cooperate with the supervisory authority
  5. Act as the contact point for the supervisory authority on issues relating to data processing

 

 Harry Smithson 2019

 

HMRC’s 28 days to delete unlawfully obtained biometric data

In a statement released on 3rd May, the Information Commissioner’s Office reiterated their decision to issue HMRC a preliminary enforcement notice in early April. This initial notice was based on an investigation conducted by the ICO after a complaint from Big Brother Watch concerning HMRC’s Voice ID service, in use on a number of the department’s helplines since January 2017.

HMRC did not give customers sufficient information

The voice authentication for customer verification uses a type of biometric data considered special category information under the GDPR, and is therefore subject to stricter conditions. The ICO’s investigation found that HMRC did “not give customers sufficient information about how their biometric data would be processed and failed to give them the chance to give or withhold consent.” HMRC was therefore in breach of GDPR.

The preliminary enforcement notice issued by the ICO on April 4th stated that HMRC must delete all data within the Voice ID system for which the department was never given explicit consent to have or use. According to Big Brother Watch, this data amounted to approximately five million records of customers’ voices. These records would have been obtained on HMRC’s helplines, but due to poor data security policy for the Voice ID system, the customers had no means of explicitly consenting to HMRC’s processing of this data.

Steve Wood, Deputy Commissioner at the ICO, stated, “We welcome HMRC’s prompt action to begin deleting personal data that it obtained unlawfully. Our investigation exposed a significant breach of data protection law – HMRC appears to have given little or no consideration to it with regard to its Voice ID service.”

The final enforcement notice is expected on 10th May. This will give HMRC a twenty-eight-day timeframe to complete the deletion of this large compilation of biometric data.

The director of Big Brother Watch, Silkie Carlo, was encouraged by the ICO’s actions:

“To our knowledge, this is the biggest ever deletion of biometric IDs from a state-held database. This sets a vital precedent for biometrics collection and the database state, showing that campaigners and the ICO have real teeth and no government department is above the law.”

 Harry Smithson, May 2019. 


Be Data Aware: the ICO’s campaign to improve data awareness

As the Information Commissioner’s Office’s ongoing investigation into the political weaponisation of data analytics and harvesting sheds more and more light on the reckless use of ‘algorithms, analysis, data matching and profiling’ involving personal information, consumers are becoming more data conscious. On 8th May, the ICO launched an awareness campaign, featuring a video, legal factsheets reminding citizens of their rights under GDPR, and advice guidelines on internet behaviour. The campaign is currently circulating on Twitter under #BeDataAware.

80% of respondents considered cyber security to be a ‘high priority’ 

While the public is broadly aware of targeted marketing, and fairly accustomed to companies attempting to reach certain demographics, the political manipulation of data is considered, if not a novel threat, then a problem compounded by the new frontier of online data analytics. Ipsos MORI’s UK Cyber Survey, conducted on behalf of the DCMS, found that 80% of respondents considered cyber security to be a ‘high priority,’ but that many respondents fell into groups unlikely to take much personal action against cybercrime. This could indicate that while consumers are concerned about cybercrime directed at themselves, they are also aware of the broader social, economic and political dangers posed by the inappropriate or illegal use of personal information.

The ICO’s video, titled ‘Your Data Matters,’ asks at the beginning, “When you search for a holiday, do you notice online adverts become much more specific?” Proceeding to graphics detailing this relatively well-known phenomenon, the video then draws a parallel with political targeting: “Did you know political campaigners use these same targeting techniques, personalising their campaign messaging to you, trying to influence your vote?” Importantly, the video concludes, “You have the right to know who is targeting you and how your data is used.”

To take a major example of an organisation trying to facilitate this right, Facebook allows users to see why they may have been targeted by an advert with a clickable, dropdown option called ‘Why am I seeing this?’ Typically, the answer will read ‘[Company] is trying to reach [gender] between the ages X – Y in [Country].’ But the question remains as to whether this will be sufficiently detailed in the future. With growing pressure on organisations to pursue best practice when it comes to data security, and with the public’s growing perception of the political ramifications of data security policies, will consumers and concerned parties demand more information on, for instance, which of their online behaviours have caused them to be targeted?

A statement from the Information Commissioner Elizabeth Denham as part of the Be Data Aware campaign has placed the ICO’s data security purview firmly in the context of upholding democratic values.

“Our goal is to effect change and ensure confidence in our democratic system. And that can only happen if people are fully aware of how organisations are using their data, particularly if it happens behind the scenes.

“New technologies and data analytics provide persuasive tools that allow campaigners to connect with voters and target messages directly at them based on their likes, swipes and posts. But this cannot be at the expense of transparency, fairness and compliance with the law.”

Uproar surrounding the data analytics scandal, epitomised by Cambridge Analytica’s data breach beginning in 2014, highlights the public’s increasing impatience with the reckless use of data. The politicisation of cybercrime, and greater knowledge and understanding of data misuse, means that consumers will be far less forgiving of companies that are not seen to be taking information security seriously.

Harry Smithson, 9 May 2019

The GDPR and Profiling

Profiling is a very useful tool which marketers have been using for decades to understand their customers better and to target them appropriately.  However, the GDPR does make some changes to how profiling is treated, which should be considered carefully before profiling is undertaken.  For the first time, profiling has been brought within the rules on automated decision-making, and the same rights apply to the individuals whose information is being profiled. So how does this affect businesses?

Profiling Benefits

There are obvious benefits both to businesses and consumers in relation to profiling, which is used in a broad range of sectors from healthcare to insurance, retail to publishing, leisure to recruitment.

It is also an extremely useful tool for marketers, providing increased efficiency, savings in resource, and the financial and reputational benefits of understanding customers and establishing more personal, relevant communications with them.  The customer or individual benefits in turn from receiving fewer, and far more relevant, messages.

What is profiling?

The GDPR defines profiling as: “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”

Profiling can be as simple as segmenting your own customers into groups based on gender, purchase history, and other data that the customer has provided to you during your relationship.  It becomes more complex when additional data is added to the mix, for example, adding to the information your customer has provided you, by applying data from external sources such as social media, or providers of geo-demographic or lifestyle data.
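
As a toy illustration of that simpler kind of profiling, the Python sketch below segments invented customer records by purchase history alone; the field names and segment thresholds are made up for the example.

```python
# Illustrative only: the simplest kind of profiling - segmenting customers
# by data they provided during the relationship. Field names and segment
# thresholds are invented for the example.
from collections import defaultdict

customers = [
    {"id": 1, "gender": "F", "orders_last_year": 12},
    {"id": 2, "gender": "M", "orders_last_year": 1},
    {"id": 3, "gender": "F", "orders_last_year": 0},
]

def segment(customer):
    """Assign a marketing segment based on purchase history alone."""
    n = customer["orders_last_year"]
    if n >= 10:
        return "frequent"
    if n >= 1:
        return "occasional"
    return "lapsed"

segments = defaultdict(list)
for c in customers:
    segments[segment(c)].append(c["id"])

print(dict(segments))  # {'frequent': [1], 'occasional': [2], 'lapsed': [3]}
```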

Profiling and the GDPR

As with all processing under the GDPR, those who profile individuals have responsibilities to those individuals.  Profiles must be accurate, relevant, and non-discriminatory.  All six GDPR Principles apply: profiles are evolutionary, and individuals’ profiles will change over time, so accuracy and retention are critical.  Privacy by design is key, as is the requirement that individuals must be made aware of such profiling and of their right not to be subject to such decisions.

It’s worth noting that automated decisions can be made with or without profiling.  And the reverse is also true – profiling can take place without making automated decisions.  It’s all a matter of how the data is used.  Where decisions are made manually, Article 22 (which governs automated individual decision-making) does not apply.

Consent or Legitimate Interests?

The legal basis under which profiling takes place is a matter for careful consideration.  There has been debate over whether profiling requires the consent of the individual who is being profiled, or whether legitimate interest may apply.

There will be instances where the profiling will have a legal or similarly significant effect – for example, in financial services (mortgage yes or no), or when marketing to vulnerable customers – for example, gambling products to those in financial difficulty.  Where profiling is considered to have such an effect, an organisation will need to rely on the legal basis of Consent before profiling and making decisions on the basis of such profiling.

However, in many cases, marketing will not have such an impact, and in those cases, consent will not be required.  Instead it may be possible to rely on Legitimate Interests.  BUT before such a decision is made, a Legitimate Interest Assessment will need to be conducted.  This will need to consider the necessity of the profiling, the balance of benefits to the individuals versus the business, and the measures taken to protect the personal data and profiles involved.

The Legitimate Interest Assessment will not only help you determine whether it is appropriate to conduct the profiling on this basis, it will also provide evidence that the individuals’ rights have been considered, contributing to the business’s need to meet the GDPR’s new principle of Accountability.

Victoria Tuffill  7th March 2018

GDPR and Accountants

GDPR Debate

On Monday, 16th October, Data Compliant’s Victoria Tuffill was invited by AccountingWEB to join a panel discussion on how GDPR will impact accountants and tax agents.

The other members of the panel were our host, John Stokdyk, Global Editor of AccountingWEB, who kept us all on the straight and narrow, while asking some very pertinent questions; Ian Cooper from Thomson Reuters who gave strong insights into technical solutions; and Dave Tucker from Thompson Jenner LLP, who provided a very useful practitioner viewpoint.

GDPR in General

There is a presumption that every professional is fully informed of all compliance regulations within their field of expertise.  But the continuing barrage of changes and adjustments to European and British law makes it easy to drop the ball.

GDPR is a typical example.  To quote the Information Commissioner, Elizabeth Denham, it’s “The biggest change to data protection law for a generation”. Yet for many accountants – and so many others – it’s only just appearing on the radar.   This means there’s an increasingly limited amount of time to be ready.

GDPR has been 20 years coming, and is intended to bring the law up to date – in terms of new technology, new ways we communicate with each other, and the increasing press coverage and consumer awareness of personal data and how it’s used by professional organisations and others.  GDPR has been law for 17 months now, and it will be enforced from May 2018.

GDPR and Accountants

So what does GDPR mean for accountants in particular?

  • Accountants will have to deal with the fact that GDPR is designed to give individuals back control over their own personal information, and that it strengthens their rights.
  • It increases compliance and record-keeping obligations on accountants. GDPR makes it very plain that any firm which processes personal data is obliged to protect that data – for accountants that responsibility is very significant given the nature of the personal data an accountant holds.
  • There are increased enforcement powers – I’m sure everyone’s heard of the maximum fine of €20 million or 4% of global turnover, whichever is higher. But the media also have a strong appetite for data breach stories – and often the reputational damage has a far greater impact than the fine.
  • Accountancy firms must know precisely what data they hold and where it’s held, so they can assess the scale of the issue and be sure to comply with the demands of GDPR.

The video covers key points for practitioners to understand before they can prepare for compliance, and summarises some initial steps they should take today to prepare their firms.


The session can be found here:  Practice Excellence Live 2017:  GDPR.

It is a 45 minute video, so for those with limited time, the areas covered are broken down into bite-size chunks.

Data Compliant is working with its clients to help them prepare for GDPR, so if you are concerned about how GDPR will affect your firm or business, feel free to give us a call and have a chat on 01787 277742 or email dc@datacompliant.co.uk if you’d like more information.


Victoria Tuffill  19th October, 2017


Weekly Roundup: lack of data protection budgeting among UK businesses; international resolution to secure transparency among subcontractors; fine for ex-council worker

1 in 5 UK businesses have no data protection budget – compared to 4 in 5 local authorities 


A report by international email management company Mimecast states that a fifth of surveyed UK businesses do not have a specific budget dedicated to information security or data protection – a source of great concern ahead of the stringent General Data Protection Regulation (GDPR) taking effect in May 2018.


Over 80% of councils were found to have no funding towards meeting mandatory GDPR requirements

This reinforces concerns raised by the response to an FOI request by M-Files Corporation in July, which found that four out of five councils had, at that time, yet to allocate funding towards meeting the new requirements of the GDPR.  That research also found that 56% of local authorities contacted had still not appointed a data protection officer, despite this being mandated by GDPR.

That such a substantial proportion of businesses have no explicit budgetary or financial commitment to combatting cybercrime and personal data abuse may be particularly unwelcome news to proponents and enforcers of the new GDPR. The Information Commissioner’s Office, the independent data protection authority, has been working hard over the last year to publicise and prepare British organisations for the impending legislation.

The lack of data protection budgeting is compounded by Mimecast’s findings that many UK businesses may not be monitoring their data efficiently. For instance, 15% of the surveyed organisations stated that they did not know whether they had suffered a data loss incident during the last year or not. 27% blamed human error for previous losses, which would indicate that a large number of organisations will need to start taking employee data protection and handling training much more seriously.

44% of the surveyed organisations suspect that their email system contains personal sensitive information as defined under the GDPR, but only 17% of them believed that this information could be retrieved immediately. The average time it would take British organisations to track down sensitive personal information was calculated at eight hours.
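
To make concrete why retrieval can take so long, here is a hypothetical first-pass scan of exported mailbox text for special category data. The keyword list is invented, and matching of this kind only narrows the search ahead of proper discovery tooling and human review.

```python
# Illustrative only: a hypothetical first-pass scan of exported mailbox text
# for hints of special category data. The keyword list is invented; pattern
# matching like this narrows the search but cannot replace human review.
import re

SPECIAL_CATEGORY_HINTS = re.compile(
    r"\b(health|diagnosis|religion|ethnicity|trade union|sexual orientation)\b",
    re.IGNORECASE,
)

def scan_messages(messages):
    """Return the ids of messages whose body mentions a special category hint."""
    return [m["id"] for m in messages if SPECIAL_CATEGORY_HINTS.search(m["body"])]

mailbox = [
    {"id": "msg-1", "body": "Attached is the staff health questionnaire."},
    {"id": "msg-2", "body": "Lunch on Friday?"},
]
print(scan_messages(mailbox))  # ['msg-1']
```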

The report suggests that a significant number of organisations are very underprepared for the increased responsibility and accountability demanded by the GDPR. For help and information on preparing for the GDPR, see the Data Compliant main site.

10th International Conference of Information Commissioners (ICIC 2017) resolves to tackle difficulties of access to information on outsourced public services

The Information Commissioner’s Office (ICO) has confirmed a resolution on international action for improving access to information frameworks surrounding contracted-out public services, a system which has seen increased use throughout Europe, and rapid growth in the UK since 2010.

Challenges have been arising for a couple of decades concerning the transparency of information about the “new modes of delivery for public services.” This is often because the analysis of the efficacy of subcontracted services can be rendered difficult when, due to the principle of competition in the private sector, certain information – particularly regarding the production process of public services – can escape public scrutiny on the grounds of the protection of commercial confidentiality.

The International Conference, jointly hosted by Information Commissioner Elizabeth Denham and Acting Scottish Information Commissioner Margaret Keyse, was attended by Commissioners of 39 jurisdictions from 30 countries and seven continents. The resolution was passed in Manchester on 21st September following dialogue with civil society groups.

The resolution highlights the “challenge of scrutinising public expenditure and the performance of services provided by outsourced contractors” and “the impact on important democratic values such as accountability and transparency and the wider pursuit of the public interest.”

The Conference summarised that the first step to be taken would be the promotion of “global open contracting standards,” presumably as a means of garnering consensus on the importance of transparency in this regard for the benefit of the public, researchers and policy-makers. A conference working group is to be formed to “share practice about different initiatives that have been developed to tackle the issue.”

The event lasted two days and ran with the title: ‘Trust, transparency and progressive information rights.’ Contributions were heard from academics, journalists, freedom of information campaigners and regulators.

Access to information on the grounds of individual rights and the safeguarding of public interests will be strengthened by the provisions of the GDPR. This resolution provides a reminder and an opportunity for organisations working as subcontractors to review the ways in which they store and handle data. Transparency and accountability, no longer considered in any way contradictory, are key watchwords for the clutch of data protection reforms taking place throughout the world. Many organisations would do well to assess whether they are in a position to meet the standards of good governance and best practice regarding data management, which will soon become a benchmark for consumer trust.

Ex-employee of Leicester City Council fined for stealing vulnerable people’s personal information

The ICO has confirmed the prosecution of an ex-council worker for unlawfully obtaining the personal information of service users of Leicester City Council’s Adult Social Care Department.


Personal data, including medical conditions, care and financial records were “unlawfully” obtained by an ex-council worker

The ex-employee took the personal details of vulnerable people without his employer’s consent, in breach of the current Data Protection Act 1998. Before leaving the council, he sent 34 emails containing the personal information of 349 individuals, including sensitive personal data such as medical conditions, care and financial details and records of debt, to a private email address.

The ICO’s Head of Enforcement Steve Eckersley stated, “Employees need to understand the consequences of taking people’s personal information with them when they leave a job role. It’s illegal and when you’re caught, you will be prosecuted.”

 

Harry Smithson  29th September 2017

 

 

 

Data Protection Weekly Roundup: GDPR exemption appeals, gambling industry exploitation scandal, cyber attacks and data breaches

Corporate pensions company Scottish Widows to lobby for specific exemptions from the General Data Protection Regulation ahead of EU initiative’s May 2018 introduction.


Scottish Widows seeks derogations in relation to communicating with its customers in order to “bring people to better outcomes.”

The Lloyds Banking Group subsidiary Scottish Widows, the 202-year-old life, pensions and investment company based in Edinburgh, has called for derogations from the GDPR.

A great deal has been written across the Internet about the impending GDPR, and much of the information available is contradictory. In fact, many organisations and companies have been at pains to work out what exactly will be expected of them come May 2018. While it is true that the GDPR will substantially increase regulators’ powers to penalise breaches of data protection law, the decontextualised figure of monetary penalties reaching €20 million or 4% of annual global turnover – while accurate in severe cases – has become something of a rallying cry for critics of the regulation.

Nevertheless, the GDPR is the most ambitious and widescale attempt to secure individual privacy rights in a proliferating global information economy to date, and organisations should be preparing for compliance. But the tangible benefits from consumer and investor trust provided by data compliance should always be kept in sight. There is more information about the GDPR on this blog and the Data Compliant main site.

Certain sectors will feel the effects of GDPR – in terms of the scale of work to prepare for compliance – more than others. It is perhaps understandable, therefore, why Scottish Widows, whose pension schemes may often be supplemented by semi-regular advice and contact, would seek derogations from the GDPR’s tightened conditions for proving consent to specific types of communications. Since the manner in which consent to communicate with their customers was acquired by Scottish Widows will not be recognised under the new laws, the company points out that “in future we will not be able to speak to old customers we are currently allowed to speak to.”

Scottish Widows’ head of policy, pensions and investments Peter Glancy’s central claim is that “GDPR means we can’t do a lot of things that you might want to be able to do to bring people to better outcomes.”

Article 23 of the GDPR enables legislators to provide derogations in certain circumstances. The Home Office and Department of Health for instance have specific derogations so as not to interfere with the safeguarding of public health and security. Scottish Widows cite the Treasury’s and DWP’s encouragement of increased pension savings, and so it may well be that the company plans to lobby for specific exemptions on the grounds that, as it stands, the GDPR may put pressure on the safeguarding of the public’s “economic or financial interests.”

Profiling low income workers and vulnerable people for marketing purposes in the gambling industry provokes outrage and renewed calls for reform.


The ICO penalised charities  for “wealth profiling”. Gambling companies are also “wealth profiling” in reverse – to target people on low incomes who can ill afford to play

If doubts remain that the systematic misuse of personal data demands tougher data protection regulations, these may be dispelled by revelations that the gambling industry has been using third party affiliates to harvest data so that online casinos and bookmakers can target people on low incomes and former betting addicts.

An increase in the cost of gambling ads has prompted the industry to adopt more aggressive marketing and profiling with the use of data analysis. An investigation by the Guardian including interviews with industry and ex-industry insiders describes a system whereby data providers or ‘data houses’ collect information on age, income, debt, credit information and insurance details. This information is then passed on to betting affiliates, who in turn refer customers to online bookmakers for a fee. This helps the affiliates and the gambling firms tailor their marketing to people on low incomes, who, according to a digital marketer, “were among the most successfully targeted segments.”

The data is procured through various prize and raffle sites that prompt participants to divulge personal information after lengthy terms and conditions which, marketers in the industry suspect, serve only to obscure from many users how and where the data will be transferred and used.

This practice, which enables ex-addicts to be tempted back into gambling by the offer of free bets, has been described as extremely effective. In November last year, the Information Commissioner’s Office (ICO) targeted more than 400 companies after allegations that the betting industry was sending spam texts (a misuse of personal data). But there is no indication that any official measures were taken after the investigations, which could have included a fine of up to £500,000 under the current regulations. Gambling companies are also regulated by the separate Gambling Commission, which seeks to ensure responsible marketing and practice. Under the GDPR, however, the ICO may well have licence to take a much stronger stance against the industry’s entrenched abuse of personal information to encourage problem gambling.

Latest ransomware attack on health institution affects Scottish health board, NHS Lanarkshire.

According to the board, a new variant of the Bitpaymer malware, different from the infamous global WannaCry malware, infected its network and led to some appointment and procedure cancellations. Investigations are ongoing into how the malware managed to infect the system without detection.

Complete defence against ransomware attacks is problematic for the NHS because certain vital life-saving machinery and equipment could be disturbed or rendered dysfunctional if the NHS network is changed too dramatically (for example, tweaked to improve anti-virus protection).

A spokesman for the board’s IT department told the BBC, “Our security software and systems were up to date with the latest signature files, but as this was a new malware variant the latest security software was unable to detect it. Following analysis of the malware our security providers issued an updated signature so that this variant can now be detected and blocked.”

Catching the hackers in the act


Attacks on newly-set up online servers start within just over one hour, and are then subjected to “constant” assault.

According to an experiment conducted by the BBC, cyber-criminals start attacking newly set-up online servers about an hour after they are switched on.

The BBC asked a security company, Cybereason, to carry out an experiment to judge the scale and calibre of cyber-attacks that firms face every day.  A “honeypot” was then set up, in which servers were given real, public IP addresses and other identifying information that announced their online presence. Each was configured to resemble, superficially at least, a legitimate server: it could accept requests for webpages, file transfers and secure networking, and was accessible online for about 170 hours.

They found that automated attack tools scanned such servers about 71 minutes after they were set up online, trying to find areas they could exploit.  Once the machines had been found by the bots, they were subjected to a “constant” assault by the attack tools.
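
As a toy illustration of the honeypot idea (a sketch of the concept only, not the instrumented servers Cybereason ran for the BBC), the following Python listener records every connection attempt it receives with a timestamp and source address:

```python
# Illustrative only: a toy "honeypot" listener that records every connection
# attempt with a timestamp and source address. A sketch of the idea, not the
# setup Cybereason used for the BBC experiment.
import socket
from datetime import datetime, timezone

def run_listener(host="0.0.0.0", port=8080):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, (src_ip, src_port) = srv.accept()
            print(f"{datetime.now(timezone.utc).isoformat()} "
                  f"connection attempt from {src_ip}:{src_port}")
            conn.close()  # record the probe, then drop the connection

if __name__ == "__main__":
    run_listener()
```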

Vulnerable people’s personal information exposed online for five years


Vulnerable customers’ personal data needs significant care to protect the individuals and their homes from harm

Nottinghamshire County Council has been fined £70,000 by the Information Commissioner’s Office for posting the genders, addresses, postcodes and care needs of elderly and disabled people in an online directory – without security or access restrictions such as a basic login requiring a username and password.  The data also included details of the individuals’ care needs, the number of home visits per day and whether they were or had been in hospital.  Though names were not included on the portal, it would have taken very little effort to identify the individuals from their addresses and genders.

This breach was discovered when a member of the public was able to access and view the data without any need to login, and was concerned that it could enable criminals to target vulnerable people – especially as such criminals would be aware that the home would be empty if the occupant was in hospital.

The ICO’s Head of Enforcement, Steve Eckersley, stated that there was no good reason for the council to have overlooked the need to put robust measures in place to protect the data – the council had financial and staffing resources available. He described the breach as “serious and prolonged” and “totally unacceptable and inexcusable.”

The “Home Care Allocation System” (HCAS) online portal was launched in July 2011, to allow social care providers to confirm that they had capacity to support a particular service user.  The breach was reported in June 2016, and by this time the HCAS system contained a directory of 81 service users. It is understood that the data of 3,000 people had been posted in the five years the system was online.

Not surprisingly, the Council offered no mitigation to the ICO.  This is a typical example of where a Data Protection Impact Assessment will be mandated under GDPR.

Harry Smithson, 6th September 2017