Tag Archives: personal data

How to conduct a Data Protection Impact Assessment (DPIA) in 8 simple steps

Many business activities these days entail significant amounts of data processing and transfer. It’s not always clear-cut which of your organisation’s activities legally require, or do not require, an impact assessment on the use of personal data – i.e. a Data Protection Impact Assessment (DPIA).

People may be familiar with Privacy Impact Assessments (PIAs), which were advised as best-practice by the Information Commissioner before the EU’s GDPR made DPIAs mandatory for certain activities. Now the focus is not so much on the obligation to meet individuals’ privacy expectations, but on the necessity to safeguard everyone’s data protection rights.

DPIAs are crucial records to demonstrate compliance with data protection law. In GDPR terms, they are evidence of transparency and accountability. They protect your clients, your staff, your partners and any potential third parties. Being vigilant against data protection breaches is good for everyone – with cybercrime on the rise, it’s important that organisations prevent unscrupulous agents from exploiting personal information.

In this blog, we’ll go through a step-by-step guide for conducting a DPIA. But first, let’s see what sort of things your organisation might be doing that need a DPIA.

When is a DPIA required?

The regulations are clear: DPIAs are mandatory for data processing that is “likely to result in a high risk to the rights and freedoms” of individuals. This can be during a current activity, or before a planned project. DPIAs can range in scope, relative to the scope of the processing.

Here are some examples of projects for which a DPIA is necessary:

  • A new IT system for storing and accessing personal data;
  • New use of technology such as an app;
  • A data sharing initiative in which two or more organisations wish to pool or link sets of personal data;
  • A proposal to identify people of a specific demographic for the purpose of commercial or other activities;
  • Using existing data for a different purpose;
  • A new surveillance system or software/hardware changes to the existing system; or
  • A new database that consolidates information from different departments or sections of the organisation.

The GDPR also sets out a couple of further conditions under which a DPIA is mandatory, namely:

  • Any evaluation you make based on automated processing, including profiling, and any automated decision-making – especially where it can have legal or similarly significant consequences for someone; and
  • The processing of large quantities of special category personal data (formerly known as sensitive personal data).

An eight-step guide to your DPIA

  • Identify the need for a DPIA
    • Looking at the list above should give you an idea of whether a DPIA will be required. But there are also various ‘screening questions’ that should be asked early in a project’s development. Importantly, the data protection team should assess the project’s potential impacts on individuals’ privacy rights. Internal stakeholders should also be consulted and considered.
  • Describe the data flows
    • Explain how information will be collected, used and stored. This is important to address the risk of ‘function creep,’ i.e. when data ends up being used for different purposes, which may have unintended consequences.
  • Identify privacy and related risks
    • Identify and record the risks that relate to individuals’ privacy, including clients and staff.
    • Also identify corporate or organisational risks, for example the risks of non-compliance, such as fines or a loss of customers’ trust. This involves a compliance check against the principles of the Data Protection Act 2018 (the UK legislation that implements the GDPR).
  • Identify and evaluate privacy solutions
    • With the risks recorded, find ways to eliminate or minimise these risks. Consider doing cost/benefit analyses of each possible solution and consider their overall impact.
  • Sign off and record DPIA outcomes
    • Obtain the appropriate sign-off and acknowledgements throughout your organisation. A record of your DPIA evaluations and decisions should be made available for consultation during and after the project.
  • Consult with internal and external stakeholders throughout the project
    • This is not so much a step as an ongoing process. Commit to being transparent with stakeholders about the DPIA process, listen to what they have to say and make use of their expertise. Stakeholders can include employees as well as customers. Clear communication channels through which stakeholders can raise data protection concerns or ideas will prove extremely useful.
  • Ongoing monitoring
    • The DPIA’s results should be fed back into the wider project management process. You should take the time to make sure that each stage of the DPIA has been implemented properly, and that the objectives are being met.
    • Remember – if the project changes in scope, or its aims develop over the project lifecycle, you may need to revisit step one and make the appropriate reassessments.
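The screening step above lends itself to a simple checklist. The sketch below is purely illustrative – the questions paraphrase the triggers listed earlier and are not an authoritative or exhaustive legal test; a real screening exercise would use your organisation’s own question set, reviewed by your data protection team.

```python
# Illustrative DPIA screening sketch. The questions paraphrase the
# triggers discussed above; they are examples, not legal advice.

SCREENING_QUESTIONS = [
    "Will the project use a new system to store or access personal data?",
    "Will personal data be pooled or shared with another organisation?",
    "Will people be profiled, or decisions about them automated?",
    "Will large quantities of special category data be processed?",
    "Will existing data be used for a new purpose?",
    "Will new surveillance technology be deployed?",
]

def dpia_required(answers):
    """A DPIA is indicated if any screening question is answered True.

    `answers` maps each question to a boolean. Missing answers are
    treated as True, erring on the side of doing the assessment.
    """
    return any(answers.get(q, True) for q in SCREENING_QUESTIONS)

answers = {q: False for q in SCREENING_QUESTIONS}
answers["Will existing data be used for a new purpose?"] = True
print(dpia_required(answers))  # -> True
```

Treating unanswered questions as True mirrors the regulators’ general advice: when in doubt, conduct the DPIA.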

This brief outline should help you understand when a DPIA is appropriate and how to structure one. In time, these assessments will become second nature and an integral part of your project management system. Good luck!

If you have any questions about data protection, please contact us by email at team@datacompliant.co.uk or call 01787 277742

Harry Smithson, 21st October 2019

Operation Yellowhammer

Ornithology

What has Brexit to do with a small member of the bunting family? A bird that is migratory, apparently recognising no national boundaries. One that is found throughout Europe and thrives in its adopted homes in Australia and New Zealand. So widespread is the Yellowhammer (Emberiza citrinella) that it has been adopted as the state symbol of Alabama.

Clearly someone in Whitehall running the No Deal contingency planning is a keen birder. Or has a sense of humour.

Political Wrangling

In extraordinary scenes this week, Parliament was prorogued until 15th October. It is a move not without controversy, and we shall see the judgement of the UK Supreme Court next week.

One of the final acts of the outgoing Parliament was to pass legislation compelling the Government to publish its dossier on the potential consequences of a “No Deal” Brexit, a document bearing the codename “Operation Yellowhammer”.

Extracts from the Yellowhammer report had been leaked and published in the Sunday Times a few weeks ago. Following their publication, arguments raged about the age of the document and whether or not it represented a “Worst Case” or “Base Case” planning scenario.

Wednesday evening saw the publication of a remarkably matter-of-fact 5-page Operation Yellowhammer document with the status “Official Sensitive”. The report is dated 2nd August and it apparently represents the Government’s “Reasonable Worst Case Planning Assumptions”. 

Implications for UK Companies receiving Personal Data from EU

Under the general heading “Key Planning Assumptions” are sections on the widely discussed impacts on HGVs entering the UK particularly at the Channel ports and disruption to the supply of fresh food and medicines.

Listed at Number 9 (of 20 assumptions) is the following:

“The EU will not have made a data decision with regard to the UK before exit.  This will disrupt the flow of personal data from the EU where an alternative legal basis for transfer is not in place. In no deal an adequacy assessment could take years.”

This is an important consideration for companies in their No Deal contingency planning. Under the No Deal outcome, the UK becomes a Third Country and would no longer be covered by EU ‘free flow of data’ rules. 

As the Yellowhammer report states, the UK would need to apply for an adequacy decision to ease the flow of personal data from EU member states. Typically, an adequacy assessment from the EU takes two years – sometimes longer.

To date the EU Commission has adopted 13 adequacy decisions with: Andorra, Argentina, Canada, the Faroe Islands, Guernsey, Israel, the Isle of Man, Jersey, New Zealand, Switzerland, Uruguay, and the United States (for companies using the EU-US Privacy Shield or Swiss-US Privacy Shield). Most recently the EU agreed a reciprocal adequacy decision with Japan. South Korea started the process of applying for an adequacy decision from the Commission in 2015. The decision is still awaited. 

What to do?

The previous blog (“55 Days to Go”) covers the steps that UK companies need to take when handling data from EU member states.

Privacy Notices

Companies must also review and update Privacy Notices to include amongst other items:

  • The legal basis for processing personal data
  • Collection and use of personal data for direct marketing, analytics, research purposes – where applicable
  • Collection of HR personal data
  • Transfers of personal data outside the EEA and the transfer mechanisms in place to govern those transfers

What Next?

An update to the August version of the Yellowhammer report, which will include the current contingency plans in the case of a No Deal Brexit, has been promised. It will be interesting to see whether any progress has been made on the status of data flows in recent weeks.

Meanwhile the Government continues to insist its preferred outcome from the Brexit negotiations is a deal with the EU.  If a deal is secured it may be that the current flows of personal data to and from the EU are not materially affected.

Watch this space.

Please feel free to contact us if you have any queries or concerns about how Brexit will affect your business, by calling 01787 277742 or emailing teambrexit@datacompliant.co.uk

Gareth Evans, 12th September 2019

Facebook’s cryptocurrency Libra under scrutiny amid concerns of ‘data handling practices’

It would be giving the burgeoning cryptocurrency Libra short shrift to call it merely ambitious. Its aims, as stated in the Libra Association’s white paper, are lofty even by the rhetorical standards of Silicon Valley. If defining Libra as ‘the internet of money’ isn’t enough to convince you of the level of its aspiration, the paper boasts that the currency can financially enfranchise the world’s 1.7 billion adults without access to traditional banking networks or the global financial system.

Like its crypto predecessors, Libra uses blockchain technology to remain decentralised and inclusive, enabling anyone with the ability to pick up a smartphone to participate in global financial networks. Distinguishing itself, however, from existing cryptocurrencies, Libra promises stability thanks to the backing of a reserve of ‘real assets,’ held by the Libra Reserve. There is also the added benefit, hypothetically, of Libra proving to be more energy efficient than cryptocurrencies such as Bitcoin because there will be no ‘proof of work’ mechanism such as Bitcoin mining, which requires more and more electricity as the currency inflates.

So far, so Zuckerberg. It may seem unsurprising then, that global data protection regulators have seen the need to release a joint statement raising concerns over the ‘privacy risks posed by the Libra digital currency and infrastructure.’ While risks to financial privacy and related concerns have been raised by Western policymakers and other authorities, this is the first official international statement relating specifically to personal privacy.

The joint statement, reported on the UK Information Commissioner’s Office (ICO) website on 5th August, has signatories from Albania, Australia, Burkina Faso, Canada, the European Union, the United Kingdom and the United States. The primary concern is that there is essentially no information from Facebook, or its participating subsidiary Calibra, on how personal information will be handled or protected. The implementation of Libra is rapidly forthcoming – the target launch is in the first half of next year. Its uptake is expected to be similarly rapid and widescale thanks to Facebook’s goliath global status. It is likely, therefore, that the Libra Association (nominally independent, but of which Facebook, among other tech and communications giants, is a founding member) will become the custodian of millions of people’s data – many of whom reside in countries that have no data protection laws – in a matter of months.

The statement poses six main questions (a ‘non-exhaustive’ conversation-starter) with a view to getting at least some information on how Libra will actually function, both at the user level and across the network; how the Libra Network will ensure compliance with relevant data protection regulations; how privacy protections will be incorporated into the infrastructure; and so on. All of these questions aim to establish how Facebook, Calibra and their partners have approached personal data considerations.

Profiling, Algorithms and ‘Dark Patterns’

The joint statement asks how algorithms and profiling involving personal data will be used, and how this will be made clear to data subjects in order to meet the standards for legal consent. These are important questions about the design of user-level access to the currency, on which prospective stakeholders remain ill-informed. The Libra website does state that the Libra blockchain is pseudonymous, allowing users to hold addresses not linked to their real-world identity. How these privacy designs will manifest remains unclear, however, and there is as yet no guarantee that de-identified information cannot be re-identified through nefarious means, either internally or by third parties.

The regulators also bring up the use of nudges and dark patterns (sometimes known as dark UX) – methods of manipulating user behaviour that can rapidly become unethical or illegal. Nudges may be incorporated into a site in order to prompt commercial activity that may not have happened otherwise; they can sometimes be useful, such as a ‘friendly reminder’ on a card website that Mother’s Day is coming up. But the line between a reasonable nudge and a dubious one is not always clear. Consider the example of Facebook asking a user ‘What’s on your mind?’, prompting the expression of a feeling or an attitude. We already know that Facebook has plans to scan information on emotional states, ostensibly for the purposes of identifying suicidal ideation and preventing tragic mistakes. The value of this data to unscrupulous agents, however, could prove – and indeed has proved – incalculable.

The Libra Network envisions a ‘vibrant ecosystem’ (what else?) of app-developers and other pioneers to ‘spur the global use of Libra.’ Questions surrounding the Network’s proposals to limit data protection liabilities in these apps are highly pertinent considering the lightspeed pace with which the currency is being designed and implemented.

Will Libra be able to convince regulators that it can adequately distance itself from these practices – practices which take place constantly and perennially online? Has there been any evidence of the Data Protection Impact Assessments (DPIAs) that the European Union’s General Data Protection Regulation (GDPR) unequivocally demands for data sharing of this magnitude?

Hopefully, Facebook or one of its subsidiaries or partners will partake in this conversation started by the joint statement, providing the same level of cooperation and diligence shown to data protection authorities as they have to financial authorities. More updates to come.

Harry Smithson, 9th August 2019

Data Protection Weekly Round-up: New Data Protection Bill; the impact of Brexit; £150k fines for failure to apply TPS

This week there’s been much in the media about the UK’s upcoming new Data Protection Bill. Unfortunately, some of the reporting has been unclear, providing very woolly information on some of the new rights of individuals and the circumstances in which they do – or do not – apply. Nonetheless, the main story is that the Data Protection Act will be replaced, and that the new law will include the requirements of the EU’s General Data Protection Regulation (GDPR).

In other news, the ICO has taken further action against companies that fail to follow the current Data Protection Act and the PECR regulations. This week the spotlight falls on companies that fail to screen their call lists against the TPS. This illegal behaviour has resulted in fines totalling £150,000 this week.

Data Protection Bill set to be read out in Parliament in September


As promised in the Queen’s Speech, GDPR will become part of the UK’s new data protection law. The process begins next month in Parliament.

The government has said that it plans to give the Data Protection Bill, announced in the Queen’s speech in June, an airing in Parliament at some point next month. This has been confirmed by the Department for Digital, Culture, Media and Sport (which continues to be officially abbreviated as DCMS, despite the recent addition of ‘Digital’).

The new Bill will replace the existing Data Protection Act 1998 and one of its chief aims is to implement the EU-wide General Data Protection Regulation (GDPR).  The UK must adhere to GDPR during its time as a member state and almost certainly beyond – albeit under different legal provisions. The manner in which this EU initiative could apply in the UK after a finalised Brexit is discussed in the next story.

This first reading of the Bill next month is largely a formality. It gives lawmakers, consultants and interested parties a chance to inform themselves and gather the information they need before a second reading takes place, during which a parliamentary debate is properly staged.

Last month, Germany became the first EU member state to approve its data protection legislation meeting the requirements of GDPR – the German Federal Data Protection Act (‘Bundesdatenschutzgesetz’).

House of Lords publishes a report on the EU data protection package

Responding to the government’s plans outlined in a White Paper on The United Kingdom’s exit from and new partnership with the European Union, the House of Lords has reviewed various options regarding the data protection policy aspect of this new relationship in a report published on 18th July.

Since the government has stated that it wants to “maintain unhindered and uninterrupted data flows with the EU post-Brexit,” the House of Lords has assessed this commitment with a view to providing a more detailed set of practical objectives.


For the UK to continue trading with EU citizens post-Brexit, GDPR or its equivalent will need to apply.

The report summarises that the UK has two feasible options if it wants to continue uninterrupted data flow with the EU, which is now a lynchpin in our service-driven economy. There will be a transitional period of adopting the General Data Protection Regulation (GDPR) and the Police and Criminal Justice Directive (PCJ) while the UK remains an EU Member State, regulations which the government plans to implement with the aforementioned new Data Protection Bill. But the report states that after Brexit, the UK will either have to pursue an ‘adequacy decision’ from the European Commission, “certifying that [the UK] provides a standard of protection which is ‘essentially equivalent’ to EU data protection standards,” or else individual data controllers will have to implement their own data protection safeguards, which would “include tools such as Standard Contractual Clauses, and Binding Corporate Rules.”

The report favours the former – that is, an adequacy decision conferred on the UK as a third country in relation to the EU, provided for under Articles 45 and 36 of the GDPR and PCJ respectively. The report states that the Lords were “persuaded by the Information Commissioner’s view that the UK is so heavily integrated with the EU – three quarters of the UK’s cross-border data flows are with EU countries – that it would be difficult for the UK to get by without an adequacy arrangement.”

The report concludes that there is no prospect of a clean break, since the UK will have to continue to update its domestic data protection policies to remain aligned to the standards of EU data protection in the event of changing regulations – that is, if the UK wants the seamless transfer of data with EU countries that is regarded as crucial to the digital economy and the UK’s competitive position in the modern globalised market.

Information Commissioner’s Office (ICO) levies £150,000 of fines for nuisance calls

The ICO has issued official warnings, “reminding companies making direct marketing calls that people registered with the Telephone Preference Service are ‘off-limits,’” after two Bradford-based firms were fined a total of £150,000 for flouting this preference.

Calling consumers without consent is illegal unless you run the files against TPS.

HPAS Ltd (t/a Safestyle UK) and Laura Anderson Ltd (t/a Virgo Home Improvements) have been fined £70,000 and £80,000 respectively for making illegal nuisance calls to people on the TPS register. Both firms have been issued enforcement notices and will face court action if the practice continues.

The ICO received 264 complaints about Virgo over 20 months (despite repeated warnings and formal monitoring), and 440 complaints about Safestyle in 19 months. Virgo Home Improvements had already been fined £33,000 just over a year ago, bringing its total fines for making nuisance calls to £113,000.

One complaint about Safestyle quoted by the ICO read, “this harassment has been going on for over five years now. I want it to stop.” Members of the public are becoming increasingly aware of data protection policy, and the prospect of new legislation that will crack down on aggravating breaches such as these will be welcomed by many.

Written by Harry Smithson, 8th August 2017

http://www.datacompliant.co.uk

Weekly Roundup: Global Cyber-Attack, Google Scan Emails, Political Party Under Investigation, Nuisance Calls Fine

Malware outbreak in 64 countries, Google scrap email scans, and the Conservative Party face ‘serious allegations’

Global cyber-attack disrupts companies in 64 countries

Corrupted Ukrainian accountancy software ‘MEDoc’ is suspected to be the medium of a cyberattack on companies ranging from British ad agency WPP to a Cadbury’s factory in Tasmania, with many European and American firms reporting disruption to services. Banks in Ukraine, Russian oil giant Rosneft, shipping giant Maersk, a Rotterdam port operator, Dutch global parcel service TNT and US law firm DLA Piper were among those suffering an inability to process orders, or else general computer shutdowns.

Heralded as “a recent dangerous trend” by Microsoft, this attack comes just six weeks after the WannaCry attack that primarily affected NHS hospitals. Both attacks appear to make use of a Windows vulnerability called ‘EternalBlue,’ thought to have been discovered by the NSA and leaked online – although the NSA has not confirmed this. The NSA’s possible use of this vulnerability, which has served to create a model for cyber-attacks by political and criminal hackers, has been described by security experts as “a nightmare scenario.”

A BBC report suggests that, given 80% of all instances of this malware were in Ukraine, and that the email address provided for the ‘ransom’ was closed down quickly, the attack could be politically motivated, aimed at Ukraine or those who do business there. Recent announcements suggest it could be about data, not money.

The malware appears to have been channelled through the automatic update system, according to security experts including the malware expert credited with ending the WannaCry attack, Marcus Hutchins. The MEDoc software would have originally begun this process legitimately, but at some point the update system released the malware into numerous companies’ computer systems.

 

Google to stop scanning Gmail accounts for personalised marketing data

In a blog published at the end of last week, Google confirmed that, by the end of the year, it will stop scanning Gmail users’ emails to accrue data for personalised adverts. This will put the consumer version of Gmail in line with the business edition.

Google had advertised their Gmail service by offering 1GB of ‘free’ webmail storage. However, it transpired that Google was paying for this offer by running these scans.

This recent change of tack has been met with a ‘qualified’ welcome by privacy campaigners. Dr Gus Hosein, executive director of Privacy International, the British charity that has been campaigning for regulators to intervene since it discovered the scans, stated:

When they first came up with the dangerous idea of monetising the content of our communications, Privacy International warned Google against setting the precedent of breaking the confidentiality of messages for the sake of additional income. […] Of course they can now take this decision after they have consolidated their position in the marketplace as the aggregator of nearly all the data on internet usage, aside from the other giant, Facebook.

Google faced a fairly substantial backlash on account of these scans when they were discovered, notably from Microsoft, with their series of critical ‘Gmail man’ adverts, depicting a man searching through people’s messages.

However, digital rights watchdog Big Brother Watch celebrated Google’s move, describing it as “absolutely a step in the right direction, let’s hope it encourages others to follow suit.”

UK Conservative Party under investigation for breaching data protection and election law

A Channel 4 News undercover investigation has provoked ‘serious allegations’ of data protection and election offences against the Conservative Party.

The investigation uncovered the party’s use of a market research firm based in Neath, South Wales, to make thousands of cold calls to voters in marginal seats ahead of the election this month. Call centre staff followed a ‘market research’ script, but under scrutiny this script appears to canvass for specific local Conservative candidates – in a severe breach of election law.

Despite information commissioner Elizabeth Denham’s written warnings to all major parties before the election began, reminding them of data protection law and the illegality of such telecommunications, the Conservatives operated a fake market research company. This constitutes a breach separate from election law, and mandates the Information Commissioner’s Office to investigate.

The ICO’s statement on 23rd June reads,

The investigation has uncovered what appear to be underhand and potentially unlawful practices at the centre, in calls made on behalf of the Conservative Party. These allegations include:

  • Paid canvassing on behalf of Conservative election candidates – banned under election law.
  • Political cold calling to prohibited numbers
  • Misleading calls claiming to be from an ‘independent market research company’ which does not apparently exist

MyHome Installations Ltd fined £50,000 for nuisance calls

Facing somewhat less public scrutiny and condemnation than the Conservative Party, Maidstone domestic security firm MyHome Installations has been issued a £50,000 fine by the ICO for making nuisance calls.

The people who received these calls had explicitly opted out of telephone marketing by registering their numbers with the Telephone Preference Service (TPS), the “UK’s official opt-out of telephone marketing.”

The ICO received 169 complaints from members of the public who’d received unwanted calls about electrical surveys and home security from MyHome Installations Ltd.

Harry Smithson 28 June 2017

Phishing… Christmas… a time for taking?

There I was, at my desk on Monday morning, preoccupied with getting everything done before the Christmas break, and doing about three things at once (or trying to). An email hit my inbox with the subject “your account information has been changed”. Because I regularly update all my passwords, I’m used to these kinds of emails arriving from different companies – sometimes to remind me that I’ve logged in on this or that device, or to tell me that my password has been changed and to check that I was the person who actually changed it.

As I hadn’t updated any passwords for a couple of days, I was rather intrigued to see who had sent the email, and I immediately opened it. It was from Apple, saying I’d added an email as a rescue email to my Apple ID.

[Screenshot: the phishing email claiming a rescue email had been added]

Well that sounded wrong, so I clicked on the link to ‘Verify Now’ and was taken to a page that looked pretty legitimate.

[Screenshot: the fake Apple sign-in page linked from ‘Verify Now’]

 

I thought I should see what was actually going on, so I logged in to my Apple ID using my previous password.  If I had been in any doubt, the fact that it accepted my out-of-date password made it very clear that this was a scam.

The site asked me to continue inputting my data. At the top of the pages are my name and address details. It’s also telling me, for the first time, that my account is suspended – always a hacker’s trick to get you worried and filling in information too quickly to think about what you’re actually doing.

[Screenshot: the fake verification page showing name and address, with an ‘account suspended’ warning]

Then the site starts to request credit card details and bank details …

[Screenshot: the page requesting credit card and bank details]

And finally my date of birth so they can steal my identity, and a mobile number so that they can send me scam texts.

[Screenshot: the page requesting date of birth and mobile number]

I know seven other people who received exactly the same email. And it’s just too easy to fall for, so any number of people could be waking up tomorrow with their identity stolen, and bank account and credit cards stripped of all money or credit.

With that in mind, here are some things to look out for in phishy (see what I did there) emails:

  1. Check the email address the email came from! If it looks wrong – it probably is!
  2. Hover your mouse over the links in the email to see where they take you. If this email had really been from Apple, the links would have gone to an https:// address on an apple.com domain
  3. Check for grammatical errors in the text of the email
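The second tip can be partly automated. The sketch below is a rough illustration only (with apple.com assumed as the legitimate domain): it checks that a link uses HTTPS and that its hostname is, or sits under, the expected domain – which catches look-alike hosts whose address merely begins with ‘apple.com’.

```python
from urllib.parse import urlparse

def link_looks_legitimate(url, expected_domain="apple.com"):
    """Rough phishing check: HTTPS plus a hostname under the expected domain.

    Catches look-alikes such as 'https://apple.com.verify-id.example/',
    whose hostname actually sits under 'example', not 'apple.com'.
    A passing result is NOT proof that a link is safe.
    """
    parts = urlparse(url)
    host = (parts.hostname or "").lower()
    if parts.scheme != "https":
        return False
    return host == expected_domain or host.endswith("." + expected_domain)

print(link_looks_legitimate("https://www.apple.com/uk/"))             # True
print(link_looks_legitimate("http://apple.com/verify"))               # False (no HTTPS)
print(link_looks_legitimate("https://apple.com.verify-id.example/"))  # False (wrong domain)
```

Matching on the end of the hostname, rather than the start of the URL, is the key design choice: phishers routinely put the trusted brand at the front of a hostname they control.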

Now, if you do fall for an email as well executed as this (and, if I’m completely honest, I’m shocked at how closely they resembled a real Apple email and website), make sure you notify your bank and credit card companies immediately. Change all of your passwords as soon as possible, because if you use the same login combination for any other accounts, those could be targeted next.

Christmas has always been a time for giving.  Now it’s become the prime time for taking.


 

Written by Charlotte Seymour, 22nd December 2016

Data Compliant’s Weekly Round-Up


It’s the weekend before Christmas. Have you done all your Christmas shopping? If you’re shopping online, this is the last weekend you can really do your online shopping and still get everything delivered on time. 

Now, you may be bored of hearing it, but please be careful: look after your passwords, change them regularly, and don’t let devices store your information! Let’s start the year without a stranger stealing money from your credit cards and bank accounts!

Yahoo…Again 

This week brings us the news that Yahoo has announced a hack dating from 2013 – a breach separate from the 500 million hacked accounts announced in September.

Yahoo was investigating the 2014 breach when it uncovered the earlier hack – this time discovering that around one billion accounts had been compromised.

The reputational damage to Yahoo is enormous – a clear pattern of poor security is emerging, and if I had an account with Yahoo, I’d be considering changing my provider immediately. Having said that, though, how can we be certain that other companies haven’t had similar breaches that we just don’t know about yet?

The ICO’s deputy commissioner, Simon Entwisle, has released a statement saying that they are talking to Yahoo and will try to find out how many UK users have been affected by the latest hack. Their immediate advice is to strongly recommend that customers change their passwords if they haven’t already.

TalkTalk
An update on the huge TalkTalk hack has been released. One of the hackers, a 17-year-old, has admitted to 7 offences relating to the hack and has been given a 12-month rehabilitation order and an £85 fine. He was told his excellent computer skills should be used for good. 19-year-old Daniel Kelley also pleaded guilty. He has been told that a jail sentence is inevitable, and has been released on bail prior to sentencing in March.

Uber
Uber has come under fire after an ex-worker claimed that staff could track the journeys of celebrities, politicians and even ex-partners. If that’s true, it’s lucky for me that I’ve only ever used it in Australia, where no exes live – and unfortunately I’m not yet a celeb!

Uber released a statement to the Standard stating that the claims made by Mr Spangenberg are “absolutely not true … we have hundreds of security and privacy experts working round the clock  to protect our data … all potential violations are quickly and thoroughly investigated.” Uber also makes it clear that access to personal data is limited to approved workers who may only access the data they need in order to perform their job function. 

Lionhead Studios just as bad as ‘trolls’?
It was revealed this week at a BAFTA event that a teenager targeted Sam van Tilburgh and his team, back in 2003, when they were creating the game Fable. The teen released a screenshot of the hero stabbing a child in the head – something no one was expecting to see.

Rather than go through official routes, van Tilburgh and his team decided to adopt an unconventional approach. They were able to track the boy’s IP address and locate the teenager. They then ‘acquired’ some of his school work and published a part of it, with a demand that he stop or they would publish more and tell his family what he was up to. He did indeed stop.

Van Tilburgh said Lionhead’s legal team knew nothing of the retaliatory hack, and it has taken 13 years for the story to surface! I wonder if there’ll be repercussions.

The National Lottery hit with fine
So it wasn’t so long ago that we heard hackers had attacked The National Lottery (TNL). Today we hear TNL’s operator Camelot has been issued with a £3m fine because of a fraudulent payout back in 2009. How this happened has not yet been announced, but it sounds as if a ‘deliberately damaged ticket’ was to blame. The prize payout is suspected to be around £2.5m, but the actual figure has not yet been officially released.

I, for one, will continue to buy my lottery tickets. Although The National Lottery has come under fire recently, it has funnelled a whopping £36 billion into good causes such as sports, community and heritage projects. Also, imagine if you won… (legitimately).


Written by Charlotte Seymour, 17th December 2016

Insider Threats – Charlotte’s View

Something that is being spoken about more and more (due, unfortunately, to its increasing frequency) is the insider threat. It’s in the news an awful lot more than it ever used to be.

Do you remember the Morrisons auditor who released a spreadsheet detailing the (very) personal details of just shy of 100,000 members of staff? He did end up being jailed for 8 years, but I heard a saying recently: it’s not a digital footprint you leave, it’s more of a digital tattoo. Even two years after the incident, Morrisons is still suffering the effects.

Now obviously that was what you would call a malicious breach. It does unfortunately happen, but there are ways for you to protect your company against it. Firstly, we here at Data Compliant believe that if you have detailed joiner processes in place (i.e. thorough screening, references and criminal checks where appropriate), ongoing appraisals with staff, and good leaver processes, you can minimise your risk.

Other ways insider breaches occur – and much more likely in my opinion – are negligence, carelessness and genuine accidents. Did you know that over 50% of data breaches are caused by staff error? This may be because staff do not follow company procedures correctly and open up pathways for hackers. Or it could be that your staff are tricked into handing over information that they shouldn’t.

Your staff could be your company’s weakest point when it comes to protecting its personal and confidential data. But you can take simple steps to minimise this risk by training your staff in data protection.

Online training has some big advantages for businesses: it’s a quick, efficient and relatively inexpensive way of training large numbers of employees while “taking them out of the business” for the least possible time.

The risk from breaches isn’t just to your business’s reputation, or even a hefty fine from the ICO, but, as mentioned before, a criminal conviction. Now that is a lot to risk.

If you’re interested in online training have a look at this video.


Written by Charlotte Seymour, November 2016

 

Safe Harbor – how does it work?

safe harbor pic

The Data Protection Act 1998 prohibits the transfer of personal data to non-European Union countries unless those countries meet the EU “adequacy” standard for privacy protection. Although both the US and EU profess to similar goals of protecting individuals’ privacy, their actual approaches are quite different.

As a result, the US Department of Commerce consulted with the European Commission, and developed the “Safe Harbor” framework – a cross-border data transfer mechanism that complies with European data protection laws and allows businesses to move personal data from the EU to the United States.  There is a similar but separate framework between the US and Switzerland.

To join the Safe Harbor framework, a company self-certifies to the Department of Commerce that it complies with seven data privacy principles (notice, choice, onward transfer, security, data integrity, access and enforcement) and that it meets the EU adequacy standard.  This self-certification needs to be renewed annually.  If a company fails to complete the annual re-certification process in time, the organisation’s certification is changed to “not current”.

The Federal Trade Commission addresses any violations – indeed, on 21st January 2014 the FTC identified twelve companies who claimed in their marketing material that they currently complied with the US–EU Safe Harbor Framework, but who had allowed their certification to expire.  The twelve companies range across technology, consumer products and accounting – as well as National Football League teams.

To “set an example” and to help ensure the ongoing integrity of the Safe Harbor framework, the twelve companies have been prohibited from misrepresenting the extent to which they participate in any privacy or security programme sponsored by the government or any other self-regulatory or standard-setting organisation (including the Safe Harbor Framework).

It is worth noting that agreeing to adhere to the Safe Harbor Framework is a permanent undertaking, in that an organisation must continue to apply the Safe Harbor Privacy Principles to personal data obtained through the Safe Harbor Framework for as long as it stores, uses or discloses the data, even if the organisation has left the Safe Harbor.

There is a Safe Harbor list, which anybody can check to verify an organisation’s status:   https://safeharbor.export.gov/list.aspx

If you are planning to transfer data between the EU and the US, and would like us to help you, just call Michelle or Victoria on 01787 277742 or email victoria@tuffillverner.co.uk or michelle@tuffillverner.co.uk

NHS … patient data … what’s next?

According to the ICO, there were 388 data breaches relating to health data in the first nine months of 2013.  That is 34% of all the data breaches in the UK during the same period, and the proportion has increased from 27% at the end of March to 38% by the end of September 2013.  The chart below compares data breach levels by industry sector over the same period.  Given the sensitivity of the health data held by medical organisations in this country, those are shocking statistics.

[Chart: Data breaches by sector to 30 September 2013]

Centralised medical records database

Despite this poor track record, very soon the NHS is going to combine all our medical records into one massive database. Every GP practice in the UK will shortly begin to disclose its patients’ personal and sensitive data to care.data at the Health and Social Care Information Centre (HSCIC).  The process is monthly and automatic, and assumes patient consent unless patients actively opt out – which is not necessarily a simple process.

So what does this mean to patients?  Essentially, personal confidential data (PCD) such as family history, vaccinations, diagnoses, referrals, blood pressure, BMI, cholesterol, NHS prescriptions and more will be extracted from GP systems and shared with care.data.

In order to match data from GP surgeries with data acquired by the HSCIC from other sources (such as hospitals), identifying data such as date of birth, postcode, NHS number and gender will be included within the data extracts.  Once matched across all the data sources, the data is pseudonymised (ie identifying characteristics are removed).
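To make the matching-then-pseudonymising step concrete, here is a minimal sketch of one common approach: replacing the identifying fields with a keyed hash, so records from different sources can still be matched without carrying the raw identity. The key, function names and record values are all hypothetical – the HSCIC’s actual method is not described in this detail:

```python
import hashlib
import hmac

# Hypothetical secret key, held only by the trusted matching party.
SECRET_KEY = b"held-by-the-trusted-party-only"

def pseudonym(nhs_number, dob, postcode):
    """Derive a stable pseudonym from the identifying fields.
    The same person yields the same pseudonym in every data set."""
    raw = f"{nhs_number}|{dob}|{postcode}".encode()
    return hmac.new(SECRET_KEY, raw, hashlib.sha256).hexdigest()

gp_record       = pseudonym("943 476 5919", "1970-01-01", "AB1 2CD")
hospital_record = pseudonym("943 476 5919", "1970-01-01", "AB1 2CD")

# Same person, same pseudonym – records match without exposing identity:
print(gp_record == hospital_record)  # True
```

The important property is that matching survives pseudonymisation, which is precisely why the identifying fields have to travel in the extracts in the first place.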

Once an individual is flagged as “deceased” no further data will be collected – though the data already provided will continue to be processed by the HSCIC.

What are the benefits?

If it were possible to trust the security and intentions of those collecting the data, there are some fantastic potential benefits: for example, improved patient care; the effective prevention, treatment and management of illness; hospital performance; management of NHS resources; the analysis and understanding of specific treatment benefits; even the planning of new health services.

What are the risks?

The poor track record of the NHS in terms of protecting our medical data is alarming and raises concerns over confidentiality of our medical records.  In addition, there are increasing numbers of private companies who provide services to the NHS, from physiotherapists to care homes; from private hospitals to insurance companies.  Members of the public are likely to be uneasy about private companies benefiting from their health data, and equally concerned that their GP will no longer be the “gatekeeper” of their confidential medical data.

Furthermore, although the data will be pseudonymised, determined analysts will undoubtedly try – and will probably succeed to some degree – to find ways of matching the data against other commercial data sets to “re-identify” the individuals.
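A toy illustration of that linkage risk, using entirely made-up data: even with names removed, quasi-identifiers such as postcode and year of birth can single a person out when joined against another data set that still carries names.

```python
# "Pseudonymised" release: no names, but quasi-identifiers remain.
medical = [
    {"postcode": "AB1 2CD", "birth_year": 1970, "diagnosis": "asthma"},
    {"postcode": "XY9 8ZW", "birth_year": 1985, "diagnosis": "diabetes"},
]

# Hypothetical commercial list with names attached.
marketing = [
    {"name": "J. Smith", "postcode": "AB1 2CD", "birth_year": 1970},
]

for m in medical:
    matches = [p for p in marketing
               if p["postcode"] == m["postcode"]
               and p["birth_year"] == m["birth_year"]]
    if len(matches) == 1:
        # A unique match re-identifies the "anonymous" medical record.
        print(matches[0]["name"], "->", m["diagnosis"])  # J. Smith -> asthma
```

In a real population, postcode plus full date of birth is unique for a large fraction of people, which is why these fields count as identifying even when the name is gone.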

Who can use the data?

The data can be released for five listed reasons:  health intelligence, health improvement, audit, health service research and service planning. That’s a pretty broad spectrum, and it is evident that the number and range of potential customers for this centralised database of our medical records is enormous.

For example, how long will it be before insurers persuade the HSCIC that it is to the benefit of the health and social care system that they should model and predict medical claims rates based on the UK’s centralised medical database, and use the findings to price their medical insurance policies accordingly?

Can GP practices opt out?

The Health and Social Care Act 2012 creates a statutory obligation for GP practices to disclose the information as directed.  GPs are unable to refuse, as refusal would put them in breach of the statutory requirement.

But because the GP practice is actually the “data controller” of their patients’ confidential medical records, GP practices are also responsible for ensuring that their patients’ personal and sensitive data is handled fairly (as defined under the Data Protection Act 1998).

So it is up to GPs to ensure that patients are aware that their data will be shared with the HSCIC, that the HSCIC has powers to extract personal confidential data, and, arguably, what the HSCIC intends to do with the data.

And if a patient claims they were unaware that their data was to be shared, it would be the GP practice who would be investigated by the ICO.

The GP practices remain data controllers of the data they hold within the practice, but are no longer responsible for the data once it has been disclosed to the HSCIC.  Instead the HSCIC and NHS England become joint data controllers who are obliged to comply with the Data Protection Act.  NHS England will determine the “Purpose” for the data collection, while the HSCIC will determine the manner of processing.

How do patients opt out?

Normally one would expect the sharing of data of this sensitivity and confidentiality to be subject to patient opt-in, rather than the NHS assuming consent.  However, the Health and Social Care Act 2012 empowers the HSCIC to require providers (eg your GP practice) to send it personal confidential data when directed to do so.  And the Act overrides the requirement to seek patient consent.

A patient can inform their GP of their wish to opt out, and no reason is required.  It is worth noting that the right to opt out has been implemented as a constitutional rather than a legal right.  Having opted out, it is up to the GP practice to ensure that the right code is appended to the legal record.

However, the patient has no right to prevent his or her medical data leaving the GP practice if such data carries no identifiable information, as this is anonymous data rather than personal data.  The question, really, is what counts as “identifiable information”?  Is it DOB?  Arguably, in some circumstances, it may be.  And surely an NHS number is identifiable information.

The Secretary of State for Health has given a commitment that individuals’ objections to disclosure to the HSCIC will be respected in “all but exceptional circumstances” (for example, a civil emergency).

Is the process compliant?

You could argue that this data sharing activity defies the second principle of the Data Protection Act:  “Personal data shall be obtained only for one or more specified and lawful purposes, and shall not be further processed in any manner incompatible with that purpose or those purposes”.  In my view, you don’t talk to your doctor about a medical condition for any purpose other than to have him solve – or try to solve – the problem for you.  And while that may include prescriptions, or visits to consultants, hospitals and clinics, making our medical records available to commercial organisations cannot possibly be considered the “Purpose”.