Category Archives: General Information

GDPR and Data Protection Impact Assessments (DPIAs)


When are they needed?  How are they done?

Next year, under the new GDPR data protection legislation, Privacy Impact Assessments (PIAs) will become known as Data Protection Impact Assessments (DPIAs), and in certain circumstances will be mandatory rather than merely recommended.

The ICO currently describes PIAs as “a tool which can help organisations identify the most effective way to comply with their data protection obligations and meet individuals’ expectations of privacy.”

Although the soon-to-be-rechristened DPIAs will be legally required, data controllers should continue to embrace them fully as an opportunity to avert heavy fines, reputational damage and the wider risks of data breaches at an early stage in any planned operation.

When will a DPIA be legally required?

Organisations will be required to carry out a DPIA when data processing is “likely to result in a high risk to the rights and freedoms of individuals.” This applies both to existing processing and to planned projects that carry a risk to individuals’ rights as set out in the Data Protection Act. DPIAs can also vary in scope, depending on the organisation and the scale of its project.

DPIAs will therefore be required when an organisation is planning an operation that could affect anyone’s right to privacy: broadly speaking, anyone’s right ‘to be left alone.’ DPIAs are primarily designed to allow organisations to avoid breaching an individual’s freedom to “control, edit, manage or delete information about themselves and to decide how and to what extent such information is communicated to others.” If there is a risk of any such breach, a DPIA must be carried out.

Listed below are examples of projects, varying in scale, in which the current PIA is advised – and it is safe to assume all of these examples will necessitate a DPIA after the GDPR comes into force:

  • A new IT system for storing and accessing personal data.
  • A new use of technology such as an app.
  • A data sharing initiative where two or more organisations (even if they are part of the same group company) seek to pool or link sets of personal data.
  • A proposal to identify people in a particular group or demographic and initiate a course of action.
  • Processing large quantities of sensitive personal data.
  • Using existing data for a new and unexpected or more intrusive purpose.
  • A new surveillance system (especially one which monitors members of the public) or the application of new technology to an existing system (for example adding Automatic number plate recognition capabilities to existing CCTV).
  • A new database which consolidates information held by separate parts of an organisation.
  • Legislation, policies or strategies which will impact on privacy through the collection or use of information, or through surveillance or other monitoring.

How is a DPIA carried out?

There are 7 main steps that comprise a DPIA:

  1. Identify the need for a DPIA

This will mainly involve answering ‘screening questions,’ at an early stage in a project’s development, to identify the potential impacts on individuals’ privacy. The project management should begin to think about how they can address these issues, while consulting with stakeholders.

  2. Describe the information flows

Explain how information will be obtained, used and retained. This part of the process can identify the potential for – and help to avoid – ‘function creep’: when data ends up being processed or used unintentionally, or unforeseeably.

  3. Identify the privacy and related risks

Compile a record of the risks to individuals – possible intrusions on data privacy – as well as the corporate risks to the organisation, such as regulatory action, reputational damage and loss of public trust. This involves a compliance check against the Data Protection Act and the GDPR. The illustrative sketch below shows the sort of entry a simple risk register might hold.
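The ICO does not prescribe a format for this record, so the structure and field names below are purely illustrative assumptions – a minimal sketch of one risk-register entry, showing the privacy impact, the corporate impact and a compliance reference side by side.

```python
# Purely illustrative DPIA risk-register entry; the ICO does not prescribe
# a format, and these field names and values are invented for the example.
risk_entry = {
    "risk_id": "R-001",
    "description": "Loyalty-card data reused for profiling without clear notice",
    "affected_individuals": "Scheme members",
    "privacy_impact": "Loss of control over how personal data is used",
    "corporate_impact": "Regulatory action, reputational damage, loss of trust",
    "likelihood": "Medium",
    "severity": "High",
    "compliance_reference": "DPA principles 1 and 2 / GDPR Article 5",
}

# A register is simply a list of such entries, reviewed as the project evolves.
print(f"{risk_entry['risk_id']}: {risk_entry['description']} "
      f"(likelihood {risk_entry['likelihood']}, severity {risk_entry['severity']})")
```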

  4. Identify and evaluate the privacy solutions

With the record of risks ready, devise a number of solutions to eliminate or minimise these risks, and evaluate the costs and benefits of each approach. Consider the overall impact of each privacy solution.

  5. Sign off and record the DPIA outcomes

Obtain appropriate sign-offs and acknowledgements throughout the organisation. A report based on the findings and conclusions of the prior steps of the DPIA should be published and accessible for consultation throughout the project.

  6. Integrate the outcomes into the project plan

Ensure that the DPIA’s outcomes are incorporated into the overall project plan. The DPIA should remain an integral point of reference throughout the development and execution of the project.

  7. Consult with internal and external stakeholders as needed throughout the process

This is not a ‘step’ as such, but an ongoing commitment to stakeholders: being transparent about how the DPIA is carried out, and remaining open to consultation and to the expertise and knowledge of the organisation’s various stakeholders – from colleagues to customers. The ICO explains, “data protection risks are more likely to remain unmitigated on projects which have not involved discussions with the people building a system or carrying out procedures.”

DPIAs – what are the benefits?

DPIAs also bring clear benefits to the organisations that conduct them:

  • cost benefits from adopting a Privacy by Design approach: knowing the risks before starting work allows issues to be fixed early, resulting in lower development costs and fewer delays to the schedule
  • risk mitigation in relation to fines and loss of sales caused by lack of customer and/or shareholder confidence
  • reputational benefits and trust building from being seen to consider and embed privacy issues into a programme’s design from the outset

For more information about DPIAs and how Data Compliant can help, please email dc@datacompliant.co.uk.

Harry Smithson   20th July 2017

ICO updates Subject Access Requests (SARs) advice for data controllers following Court of Appeal decisions

The Information Commissioner’s Office (ICO) has updated its ‘Code of Practice on Subject Access Requests’, chiefly in response to several Court of Appeal decisions made earlier this year relating to SARs. Under the Data Protection Act 1998, individuals (‘data subjects’) may request access to their personal information held by a ‘data controller.’

These requests for information are called SARs, and can range from a request for specific or limited information to a request for all of the information held, including why it is held and to whom it may have been disclosed. The scope of a data controller’s obligations will therefore vary from case to case, and can be particularly burdensome for large organisations. Currently, data controllers may charge a fee of up to £10 for processing a SAR, and must provide the requester with the relevant information within 40 calendar days. When the GDPR comes into force next year, data controllers will normally not be entitled to charge a fee, irrespective of the inconvenience, and will be expected to provide the information within a shorter timeframe of one month.

However, the ICO has revised its guidance on dealing with SARs to prepare controllers for compliance in light of the Court of Appeal’s judgements on a string of cases in which SARs were made alongside ongoing or threatened litigation – cases which, in the opinion of numerous legal commentators, highlight the potential for widespread abuse of SARs to pursue grievances that fall outside the purview of data protection law.

The three key changes to the ICO’s Code

  1. Scope for assessing ‘disproportionate effort’

The DPA includes an exemption from having to respond to SARs if this would involve ‘disproportionate effort’ for the data controller. Whereas the Code previously indicated that a refusal to provide information on the grounds of it being difficult is unacceptable, it now, with greater lenience, states: “there is scope for assessing whether, in the circumstances of a particular case, supplying a copy of the requested information in permanent form would result in so much work or expense as to outweigh the requester’s right of access to their personal data.” The ICO expects controllers to evaluate the benefits to the data subject as a result of the SAR against the difficulties in complying with the request, and assess whether the scope of the request is reasonable.

  2. Dialogue between controller and requester

The ICO now advises controllers to enter into dialogue with data subjects following a SAR. This may allow the requester to specify which information they require, thereby refining the request and making the process more manageable and less likely to result in disproportionate effort. The Code goes on to explain that the ICO will take into account both the controller’s and the subject’s willingness to participate in this dialogue if it receives a complaint about the handling of a SAR.

  3. Information management systems and redaction of third-party data

The ICO now expects controllers to have information management systems in which personal information, including archived or backed-up data, can be found efficiently in anticipation of a SAR. Moreover, the information management system should allow for the redaction of third-party data. This is important, since certain SARs may be declined if the information requested would in some way result in the disclosure of personal information about another living person.
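As a toy illustration only – real redaction needs human review and covers far more than exact name matches – the sketch below shows the basic idea of masking known third-party names before material is released in response to a SAR; the function and example text are invented for this post.

```python
# Toy sketch: mask known third-party names before text is disclosed under a SAR.
# Real redaction workflows need human review and must handle nicknames,
# initials and identifying context; this only illustrates the principle.
import re

def redact(text: str, third_party_names: list[str]) -> str:
    for name in third_party_names:
        text = re.sub(re.escape(name), "[REDACTED]", text, flags=re.IGNORECASE)
    return text

note = "Meeting between Jane Smith and the requester on 3 May."
print(redact(note, ["Jane Smith"]))
# -> Meeting between [REDACTED] and the requester on 3 May.
```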

For more information, have a look at the four Court of Appeal decisions that informed the ICO’s revised guidance: Dawson-Damer v Taylor Wessing LLP, Ittihadieh v 5-11 Cheyne Gardens, Deer v Oxford University and Holyoake v Candy.

Harry Smithson 7th July 2017

Largest cyber attack on Parliament to date prompts fears of major national security compromise

International Trade Secretary Liam Fox stressed that “it’s a warning to everybody, whether they are in Parliament or elsewhere that they need to do everything possible to maintain their own cyber security.”

 

MPs took to Twitter and social media to notify their constituents as their parliamentary email accounts were besieged by a concerted 12-hour ‘brute force’ cyberattack targeting ‘weak passwords.’

On Friday night, parliamentary officials “discovered unauthorised attempts to access accounts of parliamentary networks users.” In response, remote access to the network was cut off, meaning that MPs and aides could not access their official email accounts outside of Westminster.

Parliamentary officials have been working with the National Cyber Security Centre, part of intelligence agency GCHQ, to investigate the attempted breach and assess the potential compromise to national security.

The NCSC’s latest statement on their website, as of Saturday, reads:

The NCSC is aware of an incident and is working around the clock with the UK Parliamentary digital security team to understand what has happened and advise on the necessary mitigating actions.

It is still unclear whether the attempts were successful, or whether any confidential information in the network has been acquired. Moreover, MPs and cyber specialists can only speculate as to the identity of the cyber-attackers.

However, whether those responsible for the attack are foreign ‘state actors’ or organised criminals, the compromise to the confidentiality of private or personal information and national security details is a major risk. Security advisors have warned that the parliamentary email network is a ‘treasure trove’ of information not only for blackmailers, but also for hostile states, crime syndicates and terrorist organisations.

Many Twitter users following the story have been quick to link this attempted breach to Russian state agencies (some using the hashtag #russia), citing interference in European and American elections, as well as the cyberattack on the German Bundestag in 2015, as prior examples of similar assaults on democratic institutions. However, the relatively rudimentary nature of the ‘brute force’ attempted password hacks on Parliament on Friday contrasts, for instance, with the sophisticated attempt to install remote data monitoring software onto the German state’s computer systems two years ago, which German authorities blamed on Russian agents.

While government sources have stated that it is too early to draw conclusions regarding the fallout of the event or the perpetrators, MPs have acknowledged the extent of the threat posed by cybercrime. Tory MP for NW Leicestershire, Andrew Bridgen, stated, “if people thought our emails were not secure it would seriously undermine our constituents’ confidence and trust in approaching their MP at a time of crisis.”

Referencing the ‘WannaCry’ attack on 48 NHS hospitals only a month ago, International Trade Secretary Liam Fox said it was ‘no surprise’ that Parliament would face hacking attempts given the recent attack on our public services.

In the Queen’s Speech last week, the government outlined plans to improve data protection with a new Data Protection Bill, but this did not provide details of plans to counter threats of largescale hacking or cybercrime at home or abroad.

The government indicated, however, that they hoped the new law would help them to collaborate with former EU partners and international allies in order to confront threats to global security, threats in which cyber-conflict plays an increasingly prominent role. It may well be that these measures are following up the government’s statement in 2015 in the National Security Strategy that cyber-attacks from both organised crime and foreign intelligence agencies are one of the “most significant risks to UK interests.”

GDPR Re-Permissioning needs careful planning

Morrisons becomes the latest high-profile company fined for breaking Privacy and Electronic Communications Regulations (PECR)

The ICO, the independent authority responsible for investigating breaches of data protection law, has fined the fourth-largest supermarket chain in the UK £10,500 for sending 130,671 unsolicited marketing emails to its customers.

These customers had explicitly opted out of receiving marketing emails related to their Morrisons ‘More’ loyalty card when they signed up to the scheme. In October and November 2016, Morrisons used the email addresses associated with these loyalty cards to promote various deals. This contravenes the rules governing the use of personal information, which stipulate that individuals must give consent to receive ‘direct’ marketing via email.

‘Service emails’ versus ‘Marketing emails’

While the emails’ subject heading was ‘Your Account Details,’ the customers were told that by changing the marketing preferences on their loyalty card account, they could receive money-off coupons, extra More Points and the company’s latest news.

The subject heading might suggest to the recipient that they are ‘service emails,’ which are defined under the Data Protection Act 1998 (DPA) as any email an organisation has a legal obligation to send, or an email without which an individual would be disadvantaged (for instance, a reminder for a booked train departure). But there is a fine line between a service email and a marketing email: if an email contains any brand promotion or advertising content whatsoever, it is deemed the latter under the DPA. Emails that ask for clarification on marketing preferences are still marketing emails and a misuse of personal contact data.

Morrisons explained to the ICO that the recipients of these emails had opted in to marketing related to online groceries but opted out of marketing related to their loyalty cards, so the emails had been sent for the ostensible purpose of clarifying marketing preferences while also including promotional content. Morrisons could not provide evidence that these customers had consented to receiving this type of email, however, and they were duly fined – although in cases such as this it is often the losses from reputational damage that businesses fear more.
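Being able to evidence exactly what each customer agreed to, and when, is central to avoiding this kind of fine. As a minimal sketch – the field names below are illustrative assumptions, not a prescribed schema – a marketer might keep a per-purpose consent record along these lines and check it before every send.

```python
# Minimal illustration of recording consent per purpose so it can be evidenced
# later. Field names are invented for this example, not a prescribed schema.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    email: str
    purpose: str           # e.g. "loyalty-card marketing" vs "online groceries"
    granted: bool
    captured_at: datetime  # when the preference was recorded
    source: str            # where and how the preference was captured

record = ConsentRecord(
    email="customer@example.com",
    purpose="loyalty-card marketing",
    granted=False,
    captured_at=datetime.now(timezone.utc),
    source="More card sign-up form",
)

# Check the specific purpose before any send, not just "we hold an email address".
if not record.granted:
    print(f"Do not email {record.email} about {record.purpose}")
```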

Fines and reputational damage

This comes just three months after the ICO confirmed fines – for almost identical breaches of PECR – of £13,000 and £70,000 for Honda and Exeter-based airline Flybe respectively. Whereas Honda could not prove that 289,790 customers had given consent to direct e-marketing, Flybe disregarded 3.3 million addressees’ explicit wishes to not receive marketing emails.

Even a fine of £70,000 – which can currently be reduced by 20% for early payment – for sending existing customers emails containing indirect promotional content will seem charitable when the General Data Protection Regulation (GDPR) updates the PECR and DPA in 2018. Under the new regulations, misuse of data, including illegal marketing, risks a fine of up to €20 million or 4% of annual global turnover.

The ICO has acknowledged Honda’s belief that their emails were a means of helping their firm remain compliant with data protection law, and that the authority “recognises that companies will be reviewing how they obtain customer consent for marketing to comply with stronger data protection legislation coming into force in May 2018.”

These three cases are forewarnings of the imminent rise in stakes for not marketing in compliance with data protection law. The GDPR, an EU regulation that will demand British businesses’ compliance irrespective of Brexit, not only massively increases the monetary penalty for non-compliance, but also demands greater accountability to individuals with regard to the use and storage of their personal data.

The regulator’s recent actions show that companies will not be able to cut legal corners under the assumption of ambiguity between genuine service emails and implicitly promotional ones. And with the GDPR coming into force next year, adherence to data protection regulations is something marketing departments will need to find the time and resources to prepare for.

Harry Smithson, 22/06/17

Queen’s Speech Confirms New Bill to Replace Data Protection Act 1998

As part of several measures aimed at “making our country safer and more united,” a new Data Protection Bill has been announced in the Queen’s Speech.

The Bill, which follows up proposals in the Conservative manifesto ahead of the election in June, is designed to make the UK’s data protection framework “suitable for our new digital age, allowing citizens to better control their data.”

The intentions behind the Bill are to:

  • Give people more rights over the use and storage of their personal information. Social media platforms will be required to delete data gathered about people before they turned 18. The ‘right to be forgotten’ is enshrined in the Bill’s requirement for organisations to delete an individual’s data on request or when there are “no longer legitimate grounds for retaining it.”
  • Implement the EU’s General Data Protection Regulation, and the new Directive which applies to law enforcement data processing. This meets the UK’s obligations to international law enforcement during its time as an EU member state and provides the UK with a system to share data internationally after Brexit is finalised.
  • Update the powers and sanctions available to the Information Commissioner.
  • Strengthen the UK’s competitive position in technological innovation and digital markets by providing a safe framework for data sharing and a robust personal data protection regime.
  • Ensure that police and judicial authorities can continue to exchange information “with international partners in the fight against terrorism and other serious crimes.”

Ultimately, the Bill seeks to modernise the UK’s data protection regime and to secure British citizens’ ability to control the processing and application of their personal information. The Queen’s Speech expressed the Government’s concern not only over law enforcement, but also the digital economy: over 70% of all trade in services is enabled by data flows, making data protection critical to international trade, and in 2015 the digital sector contributed £118 billion to the economy and employed over 1.4 million people across the UK.

Written by Harry Smithson, 22nd June 2017

Phishing… Christmas… a time for taking?

There I was, at my desk on Monday morning, preoccupied with getting everything done before the Christmas break, and doing about three things at once (or trying to). An email hit my inbox with the subject “your account information has been changed”. Because I regularly update all my passwords, I’m used to these kinds of emails arriving from different companies – sometimes to remind me that I’ve logged in on this or that device, or to tell me that my password has been changed and to check that I was the person who actually changed it.

As I hadn’t updated any passwords for a couple of days, I was rather intrigued to see who had sent the email, and I immediately opened it. It appeared to be from Apple, saying I’d added a rescue email address to my Apple ID.

apple-email

Well that sounded wrong, so I clicked on the link to ‘Verify Now’ and was taken to a page that looked pretty legitimate.

apple-email-link

 

I thought I should see what was actually going on, so I logged in to my Apple ID using my previous password.  If I had been in any doubt, the fact that it accepted my out-of-date password made it very clear that this was a scam.

The site asked me to continue entering my data. At the top of the page were my name and address details. It also, for the first time, told me that my account was suspended – always a hacker’s trick to get you worried and filling in information too quickly to think about what you’re actually doing.

apple-verify-1

Then the site started to request credit card and bank details …

apple-verify-2

And finally, my date of birth so they could steal my identity, and a mobile number so they could send me scam texts.

apple-verify-3

I know seven other people who received exactly the same email. It’s just too easy to fall for, so any number of people could be waking up tomorrow with their identities stolen and their bank accounts and credit cards stripped of all money or credit.

With that in mind, here are some things to look out for in phishy (see what I did there) emails:

  1. Check the email address the email came from! If it looks wrong – it probably is!
  2. Hover your mouse over the links in the email to see where they take you. If this email had really been from Apple, the links would have gone to an https:// address on a genuine Apple domain. (The short sketch below shows one way to list where an email’s links really point.)
  3. Check for grammatical errors in the text of the email.
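For point 2, here is a rough sketch (using only Python’s standard library) of how you might list where the links in a saved suspicious email actually point; the file name is just an example, and this is a quick aid rather than a phishing detector.

```python
# Minimal sketch: print the sender and the real destinations of links in a
# saved suspicious email (.eml), so they can be compared with the claimed
# sender's domain. 'suspicious.eml' is a placeholder file name.
from email import policy
from email.parser import BytesParser
from html.parser import HTMLParser
from urllib.parse import urlparse


class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


with open("suspicious.eml", "rb") as f:
    msg = BytesParser(policy=policy.default).parse(f)

print("From header:", msg["From"])

body = msg.get_body(preferencelist=("html", "plain"))
if body is not None:
    collector = LinkCollector()
    collector.feed(body.get_content())
    for link in collector.links:
        parsed = urlparse(link)
        # A link host that doesn't match the sender's domain, or a plain
        # http:// scheme, is a warning sign.
        print(f"{parsed.scheme or 'no-scheme'} -> {parsed.netloc or link}")
```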

Now, if you do fall for an email as well executed as this – and, if I’m completely honest, I’m shocked at how closely it mimicked a real Apple email and website – make sure you notify your bank and credit card companies immediately. Change all of your passwords as soon as possible, because if you use the same login details for any other accounts, those could be targeted next.

Christmas has always been a time for giving.  Now it’s become the prime time for taking.


Written by Charlotte Seymour, 22nd December 2016

RSPCA and British Heart Foundation Fined


So it’s getting closer and closer to Christmas – a time for giving, with more and more charity adverts on the TV, on the radio, on social media – in fact pretty much everywhere you look. Although Christmas can be a bit tight on the purse strings, thousands of people still give to their favourite charities.

Whether you’re helping children, refugees or animals, or funding cancer and medical research, these organisations all promise that the money goes to a good cause. Unless that ‘good cause’ is paying an ICO fine…?

Two of the major charities we all know and love are the RSPCA and the British Heart Foundation. Both have been under investigation for secretly screening their donors with the aim of targeting those with more money. This process is known as “wealth-screening”.

The two organisations hired wealth-management companies which pieced together information on their donors from publicly available sources to build a picture of their income, property values and even friendship circles. This allowed a massive pool of donor data to be created and sold.

The RSPCA and BHF were part of a scheme called Reciprocate where they could share and swap data with other charities to find prospective donors. Donors to both charities were given an opt-out option.

Information included in the scheme comprised people’s names, addresses, dates of birth and the value and date of their last donation. The ICO ruled that the charities didn’t provide a clear enough explanation to allow consumers to make an educated decision about what they were signing up for, and that they had therefore not given their consent.

The RSPCA has admitted that it was not aware which charities it was actually sharing its data with. It also became clear that the charity shared the data of donors who had opted out.

The BHF insists it had all the correct permissions. However, the ICO disagreed, on the basis that the charities with which the data was shared did not support similar causes.

The ICO has fined the RSPCA £25,000 and the British Heart Foundation £18,000. Ironically, the BHF was praised for its data handling by the ICO in June this year, and it is likely to appeal the fine.

In my opinion, the whole thing is a mess. I like to give to charity when I can, which, if I’m honest, isn’t as often as I’d like.

However when you hear of debacles like this, it really does put you off. I want my money to go to a good cause. I don’t want my data being shared without my knowledge so that other charities can investigate how much I earn, whether I own my property and what social circles I move in, and then decide whether I’m worth targeting. Surely these charities should be thankful for every single donation. The widow’s mite springs to mind.

I feel for the poor animals and souls that rely on these charities, who are I’m sure going to take a hit from these fines. It’s not their fault, yet no doubt it’s them that’s going to pay the price.


Written by Charlotte Seymour, 8th December 2016.

National Lottery customers hacked. But who handed over the key?


Another day … another hack. Such events are almost inescapably becoming daily news. The endless catalogue of everyday cyber crime – hacking, ransom attacks, bullying, breaches, theft and fraud – simply underlines that any crime that can be committed in our physical world can be, and is, equally perpetrated in cyber space.

Given that such attacks and breaches are making the headlines almost daily, it baffles me that companies and customers (that’s us by the way) don’t make a greater effort to protect themselves.

Camelot, The National Lottery’s operator, discovered this latest breach on Sunday and went public on Wednesday morning. Camelot says that only 26,500 of the 9.5 million registered user accounts were compromised, and that there has only been activity on just under 50 of the infiltrated accounts. They have confirmed that no money has been removed or added to any of these accounts and that the National Lottery does not hold full debit card or bank account details. The Information Commissioner’s Office says it has launched an investigation.

Camelot insists that the accounts were compromised because users had been using the same password across multiple websites. (Sound familiar? Last week’s Deliveroo breach comes to mind.)

Quite properly when we hear of a data breach we turn the spotlight onto the companies that we deal with, who are in charge of protecting our information. But it would be no bad thing for us to point the spotlight at ourselves as the other half of the equation. As consumers, we have to take responsibility too.

We have all repeatedly been advised – and frankly, must surely know by now – that it is vital to use a different password for every website. For as long as we fail to take this basic precaution, these breaches will be possible. It would seem we are slow learners, if we learn at all.

I don’t know about you, but I have more accounts than I care to think about. A password including capital letters, symbols and numbers is difficult enough to remember for just one account. However, with hacks happening more and more frequently, I’ve pulled up my socks and changed all of my passwords.

I choose not to have my phone or computer store my passwords, because if either device is stolen (or lost) someone will have all my information in the palm of their hand.

It’s time we all realised how vitally important it is to have safe and secure and different passwords for every account we have, especially when cyber criminals are getting wiser and more sophisticated by the minute. A password is a key. So using just one password to access all your websites means that you are effectively handing criminals the master key to all your online activity.

Hint – a 12-character password mixing upper- and lower-case letters, numbers and symbols can take centuries to crack … that’s the one for me!
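To see why length and variety matter, here is a back-of-envelope sketch. The guess rate is an assumption (attack speed varies enormously with the hashing in use and the attacker’s hardware), so treat the absolute numbers loosely; the point is how fast the search space grows.

```python
# Back-of-envelope brute-force estimate. The guess rate (10 billion per second)
# is an assumed figure for illustration only; real attack speeds vary hugely.
def worst_case_years(length, charset_size, guesses_per_second=1e10):
    keyspace = charset_size ** length            # all possible passwords
    seconds = keyspace / guesses_per_second      # time to try every one
    return seconds / (60 * 60 * 24 * 365)

# Roughly 95 printable ASCII characters: upper, lower, digits and symbols.
print(f"8 characters:  {worst_case_years(8, 95):,.2f} years")
print(f"12 characters: {worst_case_years(12, 95):,.0f} years")
```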


Written by Charlotte Seymour, 30th November 2016

Snoopers’ Charter – What do you think?

The Investigatory Powers Bill, also known as the Snoopers’ Charter, was passed by the House of Lords last week. This means that service providers will now need to keep – for 12 months – records of every website you visit (not the exact URL, but the site itself), every phone call you make and how long each call lasts, including the dates and times the calls were made. They will also track the apps you use on your phone or tablet.

The idea behind the Bill is to prevent terrorism and organised crime, which, it goes without saying, we all fully support.  What it will also obviously do is to place massive amounts of personal information into the hands of the government and other bodies for that 12-month period.  And there has been and will continue to be a huge debate over whether and to what extent this is a breach of our privacy.

This Bill will also allow the police and authorities to look at a specific location and see which websites are highly used in that area, and even who is visiting that area. Dozens of public organisations and departments, such as HMRC, the Food Standards Agency and Gambling Commission, will also be able to access this information without needing evidence for ‘reasonable doubt’ to do so.

What has not changed is that the security services still have the ability to hack into your communications, eavesdrop on your calls and read your texts and emails, but only as long as they have the required warrant to do so. So in theory your actual conversations are still safe unless there is a reason to believe you are involved in something you shouldn’t be.

All this is very well, but is the Bill self-defeating?  Doesn’t it just encourage the use of VPNs which will bounce your IP around the world so you can’t be traced?  If you were doing something you didn’t want officials to know about, isn’t that just what you’d do?

Food for thought here is that the UK will expect companies like Google, Facebook and Apple to weaken or remove encryption in some of their services so that the UK can gain access to those records. These companies aren’t British companies. So can they refuse? The thing that worries me is that if they do refuse, would they be tempted to pull out of working with the UK completely? In which case, what does the government want more – the business and jobs these companies provide, or the data they hold?

Not only that, but we are now living in an age where Yahoo can lose half a billion accounts, a Three Mobile breach can put millions of customers at risk, and thousands of Tesco customers can have money simply removed from their bank accounts. And the list goes on. Isn’t keeping all this data stored for 12 months just a huge red target for hackers? Even though this Bill is driven by national security, the risk is that it still leaves an ocean of information that can be dipped into, hacked and misused.

I feel caught between a rock and a hard place.  I have no issues with the government bodies looking through my history should they choose to, but is it right that they can? And then you have to wonder … has anything really changed that much?  Hmmm…

What do you think? None of this will go away. Our children will inherit this Bill and will grow up with all of its implications.


Written by Charlotte Seymour – November 2016

Data Transfer Security – Not so safe

A Historical Society (the ICO hasn’t released which one) has been fined after a laptop holding sensitive personal data on those who had donated or loaned artefacts was stolen. The laptop was unencrypted, and the ICO found that there were no policies in place covering encryption or homeworking. The organisation was fined just £500, reduced to £400 if paid early. Not much of a punishment if you ask me – who doesn’t encrypt sensitive personal data nowadays?

This announcement was released less than a week after Essex, Suffolk and Norfolk were caught up in a debacle in which thousands of highly sensitive medical records were lost in transit between GP surgeries. It seems extraordinary that, even in this age of drones and virtual reality, doctors’ surgeries still use hard copies of our medical records which, if you change practices, have to be physically picked up from your old doctor’s surgery and transferred to your new GP.

Just imagine moving house – it’s very exciting, and stressful, and can be overwhelming. Whether you are just moving down the road or across counties, you have the worry of unpacking, making your new house a home, and changing your address on things like your driving licence and bank account. You also have to think about changing your dentist and, of course, your doctor.

Capita, an outsourcing company, took on the national £400 million, seven-year NHS contract in September last year to do a number of things, including transferring patient notes from one GP practice to another. It has just been reported that over 9,000 patient records have gone missing in the last couple of months across East Anglia alone. The GPC (the General Practitioners Committee) ran a survey of 281 practices and found that just under a third had received the wrong patient notes, over a quarter had records that Capita failed to collect on the agreed date, and over 80% of urgent requests for records were not processed within three weeks.

The NHS is always under scrutiny for something but when they don’t have the correct information for their patients, it makes you feel a little sorry for them.

The main question on my mind is why are there still physical records for such sensitive information like your medical history? The target to reach the utopia of paperless patient records is currently 2020, so for another 4 years our physical records will still need to be transferred physically if we move to another practice. Given the ransomware attacks, breaches and hacks already prevalent within not only the NHS but across all organisations and business sectors, you have to hope that greater care will be taken with our digital records than we are currently seeing with our physical records.

What I find interesting is that Capita, according to a report from the BBC, has refused to recognise these claims. If that is the case, you have to ask why, only last week, Health minister Nicola Blackwood told MPs that she expects Capita to consider “compensation as an option” and stated that Capita had been ‘inadequately prepared’ to take over the primary care support services contract earlier this year. She also made it plain that there should have been greater scrutiny of Capita’s competence in delivering the contract.

A Capita spokeswoman said: ‘NHS England contracted Capita to both streamline delivery of GP support services and make significant cost savings across what was a highly localised service with unstandardised, generally unmeasured and in some cases, uncompliant processes.

‘We have taken on this challenging initiative and we have openly apologised for the varied level of service experienced by some service users as these services were transitioned and are being transformed.’ She said the company did not recognise ‘whatsoever’ claims that thousands of patient records were missing.

Regardless of the above, what is absolutely clear is that whether you are transferring data physically, as in this case, or electronically, it is vital to have appropriate security procedures in place. Keep records of the data you are transferring. Know whether it has – or has not – arrived. And if you’re transferring digitally, the ICO recommends encrypting not only your files but also the connection you use to transfer them.
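As a minimal sketch of the ‘encrypt the files’ half of that advice – assuming the third-party Python cryptography package, with invented file names and deliberately simplified key handling – a file can be encrypted before it ever leaves your systems, with the transfer itself still running over an encrypted channel such as SFTP or HTTPS.

```python
# Illustrative only: encrypt a file before transfer using the third-party
# 'cryptography' package (pip install cryptography). Key handling is
# deliberately simplified; in practice the key must be generated, stored and
# shared securely, and the transfer should also use an encrypted connection.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this safe and separate from the file
fernet = Fernet(key)

with open("patient_records.csv", "rb") as f:       # placeholder file name
    ciphertext = fernet.encrypt(f.read())

with open("patient_records.csv.enc", "wb") as f:
    f.write(ciphertext)

# The recipient, given the key through a secure channel, can decrypt with:
#   Fernet(key).decrypt(ciphertext)
```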

Now, if you’ll excuse me, I’m just off to call my doctor’s surgery and see if they are still in possession of my medical notes!


Written by Charlotte Seymour – November 2016