
Belgian Data Protection Authority’s first GDPR fine imposed on public official

The Belgian DPA delivered a strong message on 28 May 2019 that data protection is “everyone’s concern” and everyone’s responsibility, issuing Belgium’s first sanction under the GDPR: a fine of €2,000 imposed on a mayor (‘bourgmestre’) for the unlawful use of personal data.

Purpose Limitation was Breached 

The mayor in question used personal data obtained for the purposes of mayoral operations in an election campaign, in breach of the GDPR and in particular the purpose limitation principle, which requires that personal data be collected only for specified, explicit and legitimate purposes and not further processed in a manner incompatible with those purposes. Given the fairly moderate fine, the data the mayor obtained appears not to have contained special category data (formerly known in the UK as sensitive personal data). The Belgian DPA also weighed other factors when deciding on the severity of the sanction, including the limited number of affected data subjects, the nature and gravity of the infringement, and its duration.

‘Not Compatible with Initial Purpose’ 

The Belgian DPA received a complaint from the affected data subjects themselves, whose consent to the processing of their data was based on the assumption it would be used appropriately, in this case for administrative mayoral duties. The plaintiffs and the defendant were heard by the DPA’s Litigation Chamber, which concluded along GDPR lines that ‘the personal data initially collected was not compatible with the purpose for which the data was further used by the mayor.’

The decision was signed off by the relatively new Belgian commissioner, David Stevens, as well as the Director of the Litigation Chamber, a Data Protection Authority chamber independent from the rest of the Belgian judicial system. 

Harry Smithson, 3rd June 2019

Personal Data Protection Act (PDPA) comes into effect in Thailand after royal endorsement 

On 27th May, the Kingdom of Thailand’s first personal data protection law was published in the Government Gazette and made official. This comes three months after the National Legislative Assembly passed the Personal Data Protection Act in late February and submitted the act for royal endorsement.

While the law is now technically in effect, its main ‘operative provisions’ – such as data subjects’ rights to consent and access requests, civil liabilities and penalties – will not come into full effect until a year after publication, i.e. May 2020. This gives data controllers and processors a one-year grace period to prepare for PDPA compliance, whose requirements and obligations were designed to be broadly equivalent to those of the EU’s trail-blazing General Data Protection Regulation (GDPR). The GDPR places special restrictions on the transfer of personal data outside the EEA, but demonstrable PDPA compliance may prompt the European Commission to make an adequacy decision in respect of Thailand, which would permit market activity and other transactions involving proportionate data processing between EU member states and Thailand.

It is uncommon in Thai law for provisions to have an ‘extraterritorial’ purview, but the new PDPA was designed to have this scope to protect Thai data subjects from the risks of data processing, particularly marketing or ‘targeting’ conducted by non-Thai agents offering goods or services in Thailand.

The PDPA contains many word-for-word translations of the GDPR, including the special category of ‘sensitive personal data’ and Subject Access Requests (SARs). Personal data itself is defined, along GDPR lines, as “information relating to a person which is identifiable, directly or indirectly, excluding the information of a dead person.”

The European Commission has so far recognised Andorra, Argentina, Canada (commercial organisations), the Faroe Islands, Guernsey, Israel, the Isle of Man, Jersey, New Zealand, Switzerland, Uruguay, the United States of America (limited to the Privacy Shield framework) and, most recently, Japan as third countries providing adequate data protection. Adequacy talks are currently under way with South Korea, and with Thailand’s latest measures it may be that the southeast Asian nation will be brought into the fold.

Irrespective of Brexit, the Data Protection Act passed by Theresa May’s administration last year embeds the GDPR within the UK’s regulatory framework. However, on exit the UK will become a so-called ‘third country’ and will have to seek an adequacy decision of its own. The ICO will continue to be the UK supervisory authority.

Harry Smithson, 30 May 2019 

 

GDPR’s 1st Birthday

General Data Protection Regulation reaches its first birthday

This blogpost arrives as the General Data Protection Regulation (GDPR) reaches its first birthday, and a week after a report from the Washington-based Center for Data Innovation (CDI) suggested amendments to the GDPR.

The report argues that regulatory relaxations would help foster Europe’s ‘Algorithmic Economy,’ contending that the GDPR’s restrictions on data sharing will set back European competitiveness in the field of AI technology.

Citing the European Commission’s ambition “for Europe to become the world-leading region for developing and deploying cutting-edge, ethical and secure AI,” the report then proceeds to its central claim that “the GDPR, in its current form, puts Europe’s future competitiveness at risk.”

That being said, the report notes with approval France’s pro-AI strategy within the GDPR framework, in particular the country’s use of the clause that “grants them the authority to repurpose and share personal data in sectors that are strategic to the public interest—including health care, defense, the environment, and transport.”

Research is still being conducted into the legal and ethical dimensions of AI and the potential ramifications of automated decision-making for data subjects. In the UK, the ICO and the government’s recently established advisory board, the Centre for Data Ethics and Innovation (CDEI – not to be confused with the aforementioned CDI), are opening discussions and issuing calls for evidence on individuals’ and organisations’ experiences with AI. There are of course responsible ways of using AI, and organisations hoping to make the best of this new technology have the opportunity to shape the future of Europe’s innovative but ethical use of data.

The Information Commissioner’s Office (ICO) research fellows and technology policy advisors release a short brief for combatting Artificial Intelligence (AI) security risks

The ICO’s relatively young AI auditing framework blog (set up in March this year) discusses security risks in their latest post, using comparative examples of data protection threats between traditional technology and AI systems. The post focuses “on the way AI can adversely affect security by making known risks worse and more challenging to control.”

From a data protection perspective, the main issue with AI is its complexity and the volume not only of data but of externalities, or ‘external dependencies,’ that AI requires to function, particularly in the AI subfield of Machine Learning (ML). Externalities take the form of third-party or open-source software used for building ML systems, or third-party consultants or suppliers who use their own or a partially externally dependent ML system. The ML systems themselves may have over a hundred external dependencies, including code libraries, software or even hardware, and their effectiveness will be determined by their interaction with multiple data sets from a huge variety of sources.

Organisations will often not have contracts with these third parties, making data flows throughout the supply chain difficult to track and data security hard to keep on top of. AI developers come from a wide array of backgrounds, and there is no unified or coherent policy for data protection within the AI engineering community.

The ICO’s AI auditing blog uses the example of an organisation hiring a recruitment company who use Machine Learning to match candidate CVs to job vacancies. A certain amount of personal data would have been transferred between the organisation and the recruitment agency using manual methods. However, additional steps in the ML system will mean that data will be stored and transferred in different formats across different servers and systems. They conclude, “for both the recruitment firm and employers, this will increase the risk of a data breach, including unauthorised processing, loss, destruction and damage.”

For example, they write: 

  • The employer may need to copy HR and recruitment data into a separate database system to interrogate and select the data relevant to the vacancies the recruitment firm is working on.
  • The selected data subsets will need to be saved and exported into files, and then transferred to the recruitment firm in compressed form.
  • Upon receipt the recruitment firm could upload the files to a remote location, e.g. the cloud.
  • Once in the cloud, the files may be loaded into a programming environment to be cleaned and used in building the AI system.
  • Once ready, the data is likely to be saved into a new file to be used at a later time.

This example will be relevant to all organisations contracting external ML services, which is the predominant method for UK businesses hoping to harness the benefits of AI. The blog provides three main pieces of advice based on ongoing research into this new, wide (and widening) area of data security. They suggest that organisations should:

  • Record and document the movement and storing of personal data, noting when each transfer took place, the sender and recipient, and the respective locations and formats – this will help monitor security risks and detect data breaches (see the sketch after this list);
  • Delete intermediate files, such as compressed versions, as soon as they are no longer required – as per best-practice data protection guidelines; and
  • Apply de-identification and anonymisation techniques and technologies to data before it is taken from the source and shared either internally or externally.
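Purely as an illustration – a minimal Python sketch of how the first and third suggestions might look in practice. Every name, field and the keyed-hash approach here is a hypothetical assumption rather than anything prescribed by the ICO:

```python
# Illustrative sketch (hypothetical names throughout) of two of the
# suggestions above: logging each movement of personal data, and
# pseudonymising direct identifiers before a dataset leaves its source.
import csv
import hashlib
import hmac
from datetime import datetime, timezone

TRANSFER_LOG = "transfer_log.csv"                 # assumption: a simple CSV audit trail
PSEUDONYM_KEY = b"replace-with-a-managed-secret"  # in practice, keep in a key vault

def log_transfer(sender, recipient, location_from, location_to, data_format):
    """Record a data movement: when it happened, who sent and received it,
    where it went, and in what format."""
    with open(TRANSFER_LOG, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),
            sender, recipient, location_from, location_to, data_format,
        ])

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash before sharing."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Example: log the export of pseudonymised CVs to the recruitment firm
log_transfer("HR database", "recruitment firm", "on-premise server",
             "cloud bucket", "compressed CSV")
print(pseudonymise("jane.doe@example.com"))
```

Note that pseudonymised data of this kind is still personal data under the GDPR – the keyed hash reduces, but does not remove, the risk if an exported file goes astray.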

Harry Smithson,  May 2019

What is a Data Protection Officer (DPO), and do you need one?

A DPO (Data Protection Officer) is an individual responsible for ensuring that their organisation is processing the data of its staff, customers, providers and any other individuals, i.e. data subjects, in compliance with data protection regulations. As of the EU-wide General Data Protection Regulation (GDPR), a DPO is mandatory for:

  1. Public authorities; and
  2. Organisations that process data:
  • On a large scale;
  • With regular and systematic monitoring;
  • As a core activity; or
  • In the form of large volumes of ‘special category data,’ formerly known as ‘sensitive personal data,’ i.e. information relating to a living individual’s racial or ethnic origin, political opinions, religious beliefs, trade union membership, physical or mental health condition, sex life or sexual orientation, or biometric data.

It may not be immediately obvious whether an organisation must have a designated DPO under the GDPR. Where there is doubt, it is necessary to make a formal evaluation, recording the decision and the reasons behind it. The WP29 Guidelines on Data Protection Officers (‘DPOs’), endorsed by the European Data Protection Board (EDPB), recommend that organisations conduct and document an internal analysis to determine whether or not a DPO should be appointed. Ultimately, such decision-making should always take into account the organisation’s obligation to fulfil the rights of the data subject, the primary concern of the GDPR: does the scale, volume or type of data processing in your organisation risk adversely affecting an individual or the wider public?

Even if a DPO is not legally required, organisations may benefit from voluntarily appointing an internal DPO or hiring an advisor – this will help ensure best-practice data protection policies and practices, improving cyber security, staff and consumer trust, and delivering other business benefits. When a DPO is designated voluntarily, the same requirements apply as if the appointment were mandatory – i.e. the voluntarily appointed DPO’s responsibilities, as defined in articles 37 to 39 of the GDPR, correspond to those of a legally mandated DPO (in other words, the GDPR does not recognise a quasi-DPO with reduced responsibility). As the WP29 guidance explains, “if an organisation is not legally required to designate a DPO, and does not wish to designate a DPO on a voluntary basis, that organisation is quite at liberty to employ staff or outside consultants to provide information and advice relating to the protection of personal data.

However, it is important to ensure that there is no confusion regarding their title, status, position and tasks. Therefore, it should be made clear in any communications within the company, as well as with data protection authorities, data subjects, and the public at large, that the title of this individual or consultant is not data protection officer (DPO).”

But how are the conditions that make a DPO mandatory defined under GDPR?

Large-scale processing: there is no absolute definition under GDPR, but there are evaluative guidelines. WP29 guidance suggests data controllers should consider:

  • The number of data subjects concerned;
  • The volume of data processed;
  • The range of data items being processed;
  • The duration or permanence of the data processing activity; and
  • The geographical extent.

Regular and systematic monitoring: as with ‘large-scale processing,’ there is no definition as such, but WP29 guidance clarifies that monitoring involves any form of tracking or profiling on the internet, including for the purposes of behavioural advertising. Here are a number of examples of regular and systematic monitoring:

  • Data-driven marketing activities;
  • Profiling and scoring for purposes of risk assessment;
  • Email retargeting;
  • Location tracking (e.g. by mobile apps); or
  • Loyalty programmes.

 What does a Data Protection Officer do?

Article 39 of the GDPR, ‘Tasks of the data protection officer,’ lists the DPO’s obligations. As a minimum, a DPO is responsible for the tasks summarised below:

  1. Inform and advise the controller or the processor and the employees
  2. Monitor compliance with the Regulation, with other Union or Member State data protection provisions and with the policies of the controller or processor in relation to the protection of personal data, including the assignment of responsibilities, awareness-raising and training of staff involved in processing operations, and the related audits
  3. Provide advice as regards data protection impact assessments and monitor their performance
  4. Cooperate with the supervisory authority
  5. Act as the contact point for the supervisory authority on issues relating to data processing

 

 Harry Smithson 2019

 


GDPR – panic … or not?


GDPR – don’t get bogged down by fear-mongering and myth

GDPR is beset with myth, rumour, and so-called experts. The amount of confusion and misinformation in circulation is incredibly detrimental – largely because many of the organisations and individuals trying to promote their services are using fear tactics to do so.

But they’re missing the point.

We have a Data Protection Act currently in place, and the Privacy and Electronic Communications Regulations to support it.  Any organisation which is ignoring the current data protection legislation has every reason to panic about GDPR. Ignorance is no excuse.  And they won’t be able to get away with wilfully ignoring GDPR just because they consider data protection an inconvenient restriction preventing them from taking unethical actions to make more money.

On the other hand, organisations who conform to the current legislation have a head-start when addressing how to comply with the new regulation.

GDPR – a simple summary

At its simplest, GDPR is a long-overdue evolution which is primarily about all organisations (whether data controllers or data processors):

  1. putting the individual first
  2. being held accountable for protecting that individual’s data

At the same time, GDPR addresses the vast changes to the data landscape since the original data protection legislation of the 1990s:

  • it takes account of technological advances – bear in mind, there was barely an internet in the early ’90s!
  • it seeks to protect EU citizens from misuse of their personal data wherever that data is processed
  • it addresses (at least in part) the disparity in data protection legislation across the EU and its member states

GDPR increases both compliance obligations on the part of organisations, and enforcement powers on the part of the regulator.

Compliance Obligations:  The principle of Accountability puts a heavy administrative burden on data controllers and data processors.  Robust record-keeping in relation to all data processing is essential; evidenced decisions around data processing will be critical.

Enforcement Powers:  Yes, there are massive fines for non-compliance.  And yes, they can go up to €20,000,000 or 4% of annual global turnover, whichever is higher.  But is that really the key headline?

GDPR’s Key Message:  Put the Individual First


As GDPR comes closer, individuals are going to become increasingly aware of their rights – new and old

All organisations who process personal data need to understand that individuals must be treated fairly, and have, under GDPR, greater rights than before.  This means that organisations need to be transparent about their data processing activity, and take full responsibility for protecting the personal or personally identifiable data they process.

What does that mean in practice?

  • Tell the individuals what you intend to do with their data – and make it absolutely plain what you mean
  • Explain that there’s a value exchange – by all means help them understand the benefits of providing the data and allowing the processing – but don’t tell lies, and don’t mislead them
  • If you don’t want to tell them what you’re doing … you probably shouldn’t be doing it
  • If you need their consent, make sure you obtain it fairly, with simple messaging and utter clarity around precisely what it is to which they are consenting
  • Tell them all their rights (including the right to withdraw consent; to object to processing where relevant; to be provided with all the information you hold about them; to be forgotten; etc.)
  • Always balance your rights as an organisation against their rights as an individual

Look out for your Reputation


Never underestimate the reputational damage caused by a data breach

The Information Commissioner, Elizabeth Denham, states clearly that, while the ICO has heavy-weight power to levy massive fines, “we intend to use those powers proportionately and judiciously”.  So the ICO may issue warnings, reprimands, corrective orders and fines, but that could be the least of your worries.

Something that tends to be overlooked when talking about the penalties for non-compliance is reputational damage.  All the ICO’s sanctions (from warnings to fines) are published on the ICO website.  And the press loves nothing more than a nice, juicy data breach.

So even if no fine is levied, reputations will suffer.  At worst, customers will be lost.  Shareholders will lose confidence.  Revenues will decline.  Board members will lose their jobs.  And, to quote Denham again, “You can’t insure against that.”

Victoria Tuffill     18th August 2017

Data Compliant advises on GDPR compliance – if you’d like more information, please call 01787 277742 or email dc@datacompliant.co.uk

 

GDPR and Data Protection Impact Assessments (DPIAs)


When are they needed?  How are they done?

Next year, under the new GDPR data protection legislation, Privacy Impact Assessments will become known as Data Protection Impact Assessments, and will be mandatory in certain circumstances instead of merely recommended.

The ICO currently describes PIAs as “a tool which can help organisations identify the most effective way to comply with their data protection obligations and meet individuals’ expectations of privacy.”

While the soon-to-be-rechristened DPIAs will be legally required, data controllers should continue to fully embrace these opportunities to ensure that heavy fines, brand reputational damage and the associated risks of data breaches can be averted from an early stage in any planned operation.

When will a DPIA be legally required?

Organisations will be required to carry out a DPIA when data processing is “likely to result in a high risk to the rights and freedoms of individuals.” This may be during an existing project or before a planned project involving data processing that carries a risk to the rights of individuals as provided by the Data Protection Act. DPIAs can also range in scope, depending on the organisation and the scale of its project.

DPIAs will therefore be required when an organisation is planning an operation that could affect anyone’s right to privacy: broadly speaking, anyone’s right ‘to be left alone.’ DPIAs are primarily designed to allow organisations to avoid breaching an individual’s freedom to “control, edit, manage or delete information about themselves and to decide how and to what extent such information is communicated to others.” If there is a risk of any such breach, a DPIA must be followed through.

Listed below are examples of projects, varying in scale, in which the current PIA is advised – and it is safe to assume all of these examples will necessitate a DPIA after the GDPR comes into force:

  • A new IT system for storing and accessing personal data.
  • A new use of technology such as an app.
  • A data sharing initiative where two or more organisations (even if they are part of the same group company) seek to pool or link sets of personal data.
  • A proposal to identify people in a particular group or demographic and initiate a course of action.
  • Processing significant quantities of sensitive personal data.
  • Using existing data for a new and unexpected or more intrusive purpose.
  • A new surveillance system (especially one which monitors members of the public) or the application of new technology to an existing system (for example adding Automatic number plate recognition capabilities to existing CCTV).
  • A new database which consolidates information held by separate parts of an organisation.
  • Legislation, policy or strategies which will impact on privacy through the collection or use of information, or through surveillance or other monitoring.

How is a DPIA carried out?

A DPIA comprises seven main steps:

  1. Identify the need for a DPIA

This will mainly involve answering ‘screening questions,’ at an early stage in a project’s development, to identify the potential impacts on individuals’ privacy. The project management should begin to think about how they can address these issues, while consulting with stakeholders.

  2. Describe the information flows

Explain how information will be obtained, used and retained. This part of the process can identify the potential for – and help to avoid – ‘function creep’: when data ends up being processed or used unintentionally, or unforeseeably.

  3. Identify the privacy and related risks

Compile a record of the risks to individuals, in terms of possible intrusions on data privacy, as well as corporate risks – risks to the organisation in terms of regulatory action, reputational damage and loss of public trust. This involves a compliance check against the Data Protection Act and the GDPR.

  4. Identify and evaluate the privacy solutions

With the record of risks ready, devise a number of solutions to eliminate or minimise these risks, and evaluate the costs and benefits of each approach. Consider the overall impact of each privacy solution.

  5. Sign off and record the DPIA outcomes

Obtain appropriate sign-offs and acknowledgements throughout the organisation. A report based on the findings and conclusions of the prior steps of the DPIA should be published and accessible for consultation throughout the project.

  6. Integrate the outcomes into the project plan

Ensure that the DPIA is implemented into the overall project plan. The DPIA should be utilised as an integral component throughout the development and execution of the project.

  7. Consult with internal and external stakeholders as needed throughout the process

This is not a ‘step’ as such, but an ongoing commitment to stakeholders to be transparent about the process of carrying out the DPIA, and being open to consultation and the expertise and knowledge of the organisation’s various stakeholders – from colleagues to customers. The ICO explains, “data protection risks are more likely to remain unmitigated on projects which have not involved discussions with the people building a system or carrying out procedures.”
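Purely by way of illustration (this is not an ICO template), the outputs of steps 3 to 5 lend themselves to a simple structured record. A hypothetical sketch in Python of how such a risk register might be documented:

```python
# Illustrative only - a hypothetical structure for recording the risks,
# solutions and sign-off produced in steps 3 to 5 of a DPIA.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PrivacyRisk:
    description: str         # the possible intrusion on data privacy (step 3)
    affected_subjects: str   # who is at risk
    corporate_impact: str    # regulatory, reputational or trust impact
    solution: str            # measure chosen to eliminate or reduce the risk (step 4)
    residual_risk: str       # what remains after the solution is applied
    approved_by: str = ""    # sign-off (step 5)

@dataclass
class DPIARecord:
    project: str
    information_flows: str                       # summary from step 2
    risks: List[PrivacyRisk] = field(default_factory=list)

register = DPIARecord(
    project="New CRM system",
    information_flows="Customer data collected via web form; stored on EU servers",
)
register.risks.append(PrivacyRisk(
    description="Data reused for analytics beyond stated purpose (function creep)",
    affected_subjects="All registered customers",
    corporate_impact="Regulatory action; loss of public trust",
    solution="Access controls enforce purpose limitation; re-consent if scope changes",
    residual_risk="Low",
    approved_by="Data Protection Officer",
))
```

Keeping a record in this shape also makes the published report (step 5) and ongoing stakeholder consultation (step 7) far easier to maintain.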

DPIAs – what are the benefits?

There are clear benefits for organisations that conduct DPIAs, not least the cost savings that come from knowing the risks before starting work:

  • cost benefits from adopting a Privacy by Design approach: issues are identified and fixed early, reducing development costs and delays to the schedule
  • risk mitigation in relation to fines and loss of sales caused by lack of customer and/or shareholder confidence
  • reputational benefits and trust building from being seen to consider and embed privacy issues into a programme’s design from the outset

For more information about DPIAs and how Data Compliant can help, please email dc@datacompliant.co.uk.

Harry Smithson   20th July 2017

GDPR Re-Permissioning needs careful planning

Morrisons becomes the latest high-profile company fined for breaking Privacy and Electronic Communications Regulations (PECR)

The ICO, the independent authority responsible for investigating breaches of data protection law, has fined the fourth-largest supermarket chain in the UK £10,500 for sending 130,671 unsolicited marketing emails to its customers.

These customers had explicitly opted out of receiving marketing emails related to their Morrisons ‘More’ loyalty card when they signed up to the scheme. In October and November 2016, Morrisons used the email addresses associated with these loyalty cards to promote various deals, in contravention of the rules on the misuse of personal information, which stipulate that individuals must give consent to receive ‘direct’ marketing via email.

‘Service emails’ versus ‘Marketing emails’

While the emails’ subject heading was ‘Your Account Details,’ the customers were told that by changing the marketing preferences on their loyalty card account, they could receive money off coupons, extra More Points and the company’s latest news.

The subject heading might suggest to the recipient that these are ‘service emails’ – emails an organisation has a legal obligation to send, or without which an individual would be disadvantaged (for instance, a reminder of a booked train departure). But there is a fine line between a service email and a marketing email: if an email contains any brand promotion or advertising content whatsoever, it is deemed the latter. Emails that ask for clarification on marketing preferences are themselves marketing emails, and sending them without consent is a misuse of personal contact data.

Morrisons explained to the ICO that the recipients of these emails had opted-in to marketing related to online groceries, but opted-out of marketing related to their loyalty cards, so emails had been sent for the ostensible purpose of qualifying marketing preferences which also included promotional content. Morrisons could not provide evidence that these customers had consented to receiving this type of email, however, and they were duly fined – although in cases such as this it is often the losses from reputational damage that businesses fear more.

Fines and reputational damage

This comes just three months after the ICO confirmed fines – for almost identical breaches of PECR – of £13,000 and £70,000 for Honda and Exeter-based airline Flybe respectively. Whereas Honda could not prove that 289,790 customers had given consent to direct e-marketing, Flybe disregarded 3.3 million addressees’ explicit wishes to not receive marketing emails.

Even a fine of £70,000 – which can currently be subject to a 20% early-payment discount – for sending existing customers emails padded with roundabout promotional content will seem charitable when the General Data Protection Regulation (GDPR) supersedes the DPA in 2018. Under the new regulation, misuse of data, including illegal marketing, risks a fine of up to €20 million or 4% of annual global turnover, whichever is higher.
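To put that change of scale into numbers, a back-of-the-envelope sketch (the turnover figure is hypothetical):

```python
# Back-of-the-envelope comparison (hypothetical turnover figure): the GDPR
# maximum is the higher of EUR 20 million and 4% of annual global turnover.
def gdpr_max_fine(annual_global_turnover_eur: float) -> float:
    return max(20_000_000, 0.04 * annual_global_turnover_eur)

# A business turning over EUR 16bn a year would face a ceiling of EUR 640m -
# compared with the GBP 70,000 levied on Flybe under the current PECR/DPA regime.
print(f"{gdpr_max_fine(16e9):,.0f}")  # 640,000,000
```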

The ICO has acknowledged Honda’s belief that their emails were a means of helping their firm remain compliant with data protection law, and that the authority “recognises that companies will be reviewing how they obtain customer consent for marketing to comply with stronger data protection legislation coming into force in May 2018.”

These three cases are forewarnings of the imminent rise in stakes for not marketing in compliance with data protection law. The GDPR, an EU regulation that will demand British businesses’ compliance irrespective of Brexit, not only massively increases the monetary penalty for non-compliance, but also demands greater accountability to individuals with regard to the use and storage of their personal data.

The regulator’s recent actions show that companies will not be able to cut legal corners under the assumption of ambiguity between general service emails and implicitly promotional ones. And with the GDPR coming into force next year, adherence to data protection regulations is something marketing departments will need to find the time and resources to prepare for.

Harry Smithson, 22/06/17

Data Compliant’s Weekly Round Up


What a week!  We’ve had another hack using login credentials stolen from another provider (see my Camelot breach blog), hundreds of thousands of pounds’ worth of fines issued by the ICO for millions of unsolicited calls and texts, an ‘accidental’ Brexit strategy leak, and people being exploited by cyber blackmail (now called ‘sextortion’).

ICO fines and GSMA
This week Oracle Insurance was reported by consumers to the GSM Association’s (GSMA) spam reporting service, which the ICO accesses. After investigation, the ICO found that Oracle had sent 136,369 marketing texts without sufficient consent, and levied a fine of £30,000.

Similarly, Silver City Tech has been fined an explosive £100,000! The Dorset-based company denies sending any unsolicited texts, let alone 1,132,149 of them. A third-party company sent the texts on behalf of Silver City Tech; however, the ICO sees the third party as a postman just delivering the message – it’s the company behind the message (i.e. the data controller) that is held responsible. Again, the company couldn’t provide any evidence of consent. After the ICO approached the company in December 2015, a further 1,942,182 texts were sent, resulting in the £100,000 fine.  There’s a clear message here: if the ICO investigates and advises you not to do something … it’s as well to stop!

Reporting Spam
It’s worth knowing that if you want to report SPAM, just forward the text message to 7726 (spelling out SPAM).  Then you don’t need to text STOP back to the marketing company – which is always a risk as doing so validates your telephone number, and unscrupulous organisations may well then sell your number to another marketing company.

Brexit Strategy Leak
According to Sky News, the latest victim caught carrying an unguarded document in Downing Street is thought to be Julia Dockerill, who works for Conservative Party vice-chairman (international) Mark Field. She was papped on her way to a cabinet meeting carrying a notepad detailing notes on the Brexit strategy. Now, personally I’m conflicted on this story. With all of the papping, data breaches, hacks and data-in-transit news stories that we all hear about on a daily basis, surely the victim must know that she needs to be safer than this?  Who doesn’t close their notepad after using it – especially outside Number 10? (Or is that me being fussy?)  There are arguments saying that this was planned and wasn’t an accident at all. What do you think?

Sextortion
If you’d asked me what sextortion was on Monday, I would have looked at you blankly and thought you were speaking a different language. By Wednesday, however, the term was everywhere – on the radio, all over the BBC website and all over social media. If you haven’t heard about it, it involves organised criminal gangs enticing individuals (mainly young men) to perform sexual acts on a webcam.  The criminals then threaten to release the footage to the victim’s friends and family unless they pay up. Police say that the number of cases victims have been brave enough to report has more than doubled since last year. There are victims as young as 15, although statistics show that the majority fall into the 18–21 age bracket, and there have been four suicides this year. Police advise paying nothing to blackmailers and contacting the police immediately; 40 of the men responsible have been arrested in the Philippines.

TalkTalk and Post Office Hack
Reports are coming in that TalkTalk and Post Office customers’ internet access has been cut off after a number of routers were targeted. The Post Office has said that the attack has affected 100,000 of its customers, and that the problem started on Sunday. (A lot happened on Sunday – first the National Lottery, now the Post Office – is no one safe on a Sunday!?) Although it has affected a lot of people, we should thank our lucky stars we’re not in Germany, where a similar hack affected an unlucky 900,000 customers.

I think we’ll all be thankful when this week ends. It just seems to be getting worse. However on a positive note it’s December now! Only 22 days until Christmas!!! (Not that I’m counting).


 

Written by Charlotte Seymour, 2nd December 2016

Data Transfer Security – Not so safe

A historical society (the ICO hasn’t released the name of which one) has been fined after a laptop holding sensitive personal data on those who had donated or loaned artefacts was stolen. The laptop was unencrypted, and the ICO found that there were no policies in place for encryption or homeworking. The organisation was fined just £500, reduced to £400 if paid early.  Not much of a punishment if you ask me – who doesn’t encrypt sensitive personal data nowadays??

This announcement was released less than a week after Essex, Suffolk and Norfolk fell into a debacle in which thousands of highly sensitive medical records were lost in transit between GP surgeries. It seems extraordinary that, even in this age of drones and virtual reality, doctors’ surgeries are still using hard copies of our medical records which, if you change practices, have to be physically picked up from your old doctor’s surgery and transferred to your new GP.

Just imagine moving house – it’s very exciting, and stressful, and can be overwhelming. Whether you are just moving down the road or across counties, you have the worry of unpacking, making your new house a home, and changing your address on things like your driving licence, bank account etc. You also have to think about changing your dentist and, of course, your doctor.

Capita, an outsourcing company, took on the national £400 million, seven-year NHS contract in September last year to do a number of things, including transferring patient notes from one GP practice to another. It has just been reported that over 9,000 patient records have gone missing in the last couple of months across East Anglia alone. The GPC (the General Practitioners Committee) ran a survey of 281 practices and found that just under a third had received the wrong patient notes, over a quarter of practices had records that were not collected on the date agreed with Capita, and over 80% of urgent requests for records were not processed within three weeks.

The NHS is always under scrutiny for something but when they don’t have the correct information for their patients, it makes you feel a little sorry for them.

The main question on my mind is: why are there still physical records for information as sensitive as your medical history? The target for reaching the utopia of paperless patient records is currently 2020, so for another four years our records will still need to be transferred physically if we move to another practice. Given the ransomware attacks, breaches and hacks already prevalent not only within the NHS but across all organisations and business sectors, you have to hope that greater care will be taken with our digital records than we are currently seeing with our physical ones.

What I find interesting is that Capita, according to a report from the BBC, has refused to recognise these claims. If that is the case, you have to ask why, only last week, Health minister Nicola Blackwood told MPs that she expects Capita to consider “compensation as an option” and stated that Capita had been ‘inadequately prepared’ to take over the primary care support services contract earlier this year. She also made it plain that there should have been greater scrutiny of Capita’s competence in delivering the contract.

A Capita spokeswoman said: ‘NHS England contracted Capita to both streamline delivery of GP support services and make significant cost savings across what was a highly localised service with unstandardised, generally unmeasured and in some cases, uncompliant processes.

We have taken on this challenging initiative and we have openly apologised for the varied level of service experienced by some service users as these services were transitioned and are being transformed.’ She said the company did not recognise ‘whatsoever’ claims that thousands of patient records were missing.

Regardless of the above, what is absolutely clear is that whether you are transferring data physically, as in this case, or electronically, it is vital to have appropriate security procedures in place. Keep records of the data you are transferring. Know whether or not it has arrived. And, if you’re using digital transfers, the ICO has recommended encrypting not only your files but also the connection you use to transfer them.
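As a minimal sketch of that advice – encrypting the file itself before transfer, here using the third-party Python ‘cryptography’ package (file names are hypothetical, and this is an illustration rather than a complete security procedure):

```python
# Minimal sketch: encrypt a file before it leaves your systems, so a lost
# laptop or intercepted transfer does not expose the data itself.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, store and exchange keys securely,
fernet = Fernet(key)          # never alongside the encrypted file

with open("patient_records.csv", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("patient_records.csv.enc", "wb") as f:
    f.write(ciphertext)

# Then transfer 'patient_records.csv.enc' over an encrypted channel
# (e.g. SFTP or HTTPS) rather than plain FTP or email, and log the
# transfer at both ends so you know it has - or has not - arrived.
```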

Now, if you’ll excuse me, I’m just off to call my doctor’s surgery and see if they are still in possession of my medical notes!


 

Written by Charlotte Seymour – November 2016

 

Insider Threats – Charlotte’s View


Something that is being spoken about more and more (due, unfortunately, to its increasing frequency) is the insider threat. It’s in the news an awful lot more than it ever used to be.

Do you remember the Morrisons auditor who released a spreadsheet detailing the (very) personal details of just shy of 100,000 members of staff? He ended up being jailed for eight years, but I heard a saying recently: it’s not a digital footprint you leave, it’s more of a digital tattoo. Even two years after the incident, Morrisons is still suffering the effects.

Now, obviously, that was what you would call a malicious breach. It does unfortunately happen, but there are ways to protect your company against it. We at Data Compliant believe that if you have detailed joiner processes in place (i.e. thorough screening, references and criminal checks where appropriate), ongoing appraisals with staff, and good leaver processes, you can minimise your risk.

Other causes of insider breaches – and much more likely ones, in my opinion – are negligence, carelessness and genuine accidents. Did you know that over 50% of data breaches are caused by staff error? This may be because staff do not follow company procedures correctly and open up pathways for hackers. Or it could be that your staff are tricked into handing over information that they shouldn’t.

Your staff could be your company’s weakest point when it comes to protecting its personal and confidential data. But you can take simple steps to minimise this risk by training your staff in data protection.

Online training has some big advantages for businesses: it’s a quick, efficient and relatively inexpensive way of training large numbers of employees while “taking them out of the business” for the least possible time.

The risk from breaches isn’t just to your business’s reputation, or even a hefty fine from the ICO, but, as mentioned above, potentially a criminal conviction. Now that is a lot to risk.

If you’re interested in online training have a look at this video.

 


Written by Charlotte Seymour, November 2016