
GDPR – ICO Puts Trust at the Heart of Data Processing


Information Commissioner’s Annual Report

The Information Commissioner’s Office (ICO) published its annual report on 13th July. It is the first annual report compiled by Information Commissioner Elizabeth Denham, who took up the post a year ago.

The report highlights the increased powers, expanding caseload and growing capacity of the regulator. At a time of increasing concern about the use (and abuse) of personal information, the ICO is seeing a great deal more work. This is reflected, in part, by an increase in staff numbers of around 8% year on year.

GDPR and Public Trust

The ICO’s foreword emphasises its commitment to restoring public trust in data controllers and processors. It is hoped that the changing laws will give the regulator an opportunity to enable individuals to trust large organisations with their personal information. The Commissioner states that “trust” will be “at the heart of what the Information Commissioner’s Office will do in the next four years.” The regulator also acknowledges, and aims to encourage, confidence in the digital economy, especially since the digital sector is growing 30% faster than any other part of the economy.

This echoes the government’s concerns regarding the digital economy and its relation to data protection principles that were enumerated in the Queen’s Speech and addressed by several measures including a Data Protection Bill, which is designed to implement the General Data Protection Regulation (GDPR).

In a year characterised by the impending replacement of the Data Protection Act 1998 (DPA) with the GDPR in May 2018, the report’s outline of major work undertaken leads with a nod to the many public, private and third sector organisations that will be preparing for the new legislative framework.

Consent

‘Consent’, which has become one of the watchwords of the GDPR (and a word that will increasingly be found on the bulletin boards and coffee mugs of marketing departments), will soon take on a stricter legal definition – a change for which the ICO anticipates organisations will seek detailed guidance.

Data Breaches

But the GDPR has by no means eclipsed the ICO’s other responsibilities. Nuisance calls, unsolicited marketing and unlawful data sharing have routinely seen organisations facing fines and other civil measures. Breaches of the DPA and the Privacy and Electronic Communications Regulations 2003 (PECR) of this kind by a number of charities – allegations of which were first reported by the Daily Mail in 2015 – led the ICO to issue 13 civil monetary penalties totalling £181,000.

Indeed, some companies – Honda (which we reported on last month) being a notable example – have been fined for sending unsolicited marketing emails in breach of PECR: emails which asked customers to clarify their marketing preferences and which Honda maintained were a means of preparing for the GDPR. So while the ICO has committed a great deal of resources to GDPR preparation, it has by no means neglected upholding the current law. The ICO has consistently made clear that it is not acceptable to break one law in preparation for another.

Monetary penalties

Overall, the ICO issued more civil monetary penalties for breaches of PECR than ever before – 23, to the value of £1,923,000. It also issued 16 fines for serious breaches of data protection principles, totalling £1,624,500. These figures could rise sharply after May 2018 if organisations do not find ways to comply with the new, more expansive and rigorous legislation. Criminal prosecutions have increased by 267%, and the ICO received 18,300 concerns regarding data protection – around 2,000 more than last year.

Subject Access Requests (SARs)

Data controllers and other organisations handling a wide range of personal data may face an increasing number of Subject Access Requests (SARs). The report states that 42% of all concerns brought to the ICO where the nature was specified related to subject access. These requests are provided for under the DPA (and will be upheld with more rigour as one of the data subject ‘rights’ under the GDPR) rather than under freedom of information legislation, but it nonetheless falls upon organisations of whatever size to be co-operative and compliant when disclosure of information is required. It is important for organisations to train their staff to recognise a SAR and act promptly. Data controllers must recognise the importance of compliance not only with the law but also with ICO audits and investigations, as well as the necessity of efficient and conscientious data handling.

For information about how Data Compliant can help you meet the requirements of the GDPR, please email dc@datacompliant.co.uk.

Harry Smithson, 25th July 2017

GDPR and Data Protection Impact Assessments (DPIAs)


When are they needed?  How are they done?

From May next year, under the GDPR, Privacy Impact Assessments (PIAs) will become known as Data Protection Impact Assessments (DPIAs), and will be mandatory in certain circumstances rather than merely recommended.

The ICO currently describes PIAs as “a tool which can help organisations identify the most effective way to comply with their data protection obligations and meet individuals’ expectations of privacy.”

While DPIAs will soon be legally required, data controllers should continue to embrace them as opportunities to avert heavy fines, brand and reputational damage, and the other risks associated with data breaches from an early stage in any planned operation.

When will a DPIA be legally required?

Organisations will be required to carry out a DPIA when data processing is “likely to result in a high risk to the rights and freedoms of individuals.” This may be during an existing project or before a planned project involving data processing that poses a risk to the rights of individuals as provided by the Data Protection Act. DPIAs can also range in scope, depending on the organisation and the scale of its project.

DPIAs will therefore be required when an organisation is planning an operation that could affect anyone’s right to privacy: broadly speaking, anyone’s right ‘to be left alone.’ DPIAs are primarily designed to allow organisations to avoid breaching an individual’s freedom to “control, edit, manage or delete information about themselves and to decide how and to what extent such information is communicated to others.” If there is a risk of any such breach, a DPIA must be followed through.

Listed below are examples of projects, varying in scale, in which the current PIA is advised – and it is safe to assume all of these examples will necessitate a DPIA after the GDPR comes into force:

  • A new IT system for storing and accessing personal data.
  • A new use of technology such as an app.
  • A data sharing initiative where two or more organisations (even if they are part of the same group company) seek to pool or link sets of personal data.
  • A proposal to identify people in a particular group or demographic and initiate a course of action.
  • Processing quantities of sensitive personal data.
  • Using existing data for a new and unexpected or more intrusive purpose.
  • A new surveillance system (especially one which monitors members of the public) or the application of new technology to an existing system (for example adding Automatic number plate recognition capabilities to existing CCTV).
  • A new database which consolidates information held by separate parts of an organisation.
  • Legislation, policy or strategies which will impact on privacy through the collection or use of information, or through surveillance or other monitoring.

How is a DPIA carried out?

There are 7 main steps that comprise a DPIA:

  1. Identify the need for a DPIA

This will mainly involve answering ‘screening questions’ at an early stage in a project’s development to identify the potential impacts on individuals’ privacy. The project management team should begin to think about how to address these issues while consulting with stakeholders.

  2. Describe the information flows

Explain how information will be obtained, used and retained. This part of the process can identify the potential for – and help to avoid – ‘function creep’: when data ends up being processed or used in ways that were not originally intended or foreseen.

  3. Identify the privacy and related risks

Compile a record of the risks to individuals in terms of possible intrusions on data privacy, as well as corporate risks such as regulatory action, reputational damage and loss of public trust. This involves a compliance check against the Data Protection Act and the GDPR.

  4. Identify and evaluate the privacy solutions

With the record of risks ready, devise a number of solutions to eliminate or minimise these risks, and evaluate the costs and benefits of each approach. Consider the overall impact of each privacy solution.

  5. Sign off and record the DPIA outcomes

Obtain appropriate sign-offs and acknowledgements throughout the organisation. A report based on the findings and conclusions of the prior steps of the DPIA should be published and accessible for consultation throughout the project.

  6. Integrate the outcomes into the project plan

Ensure that the DPIA is integrated into the overall project plan. The DPIA should be used as an integral component throughout the development and execution of the project.

  7. Consult with internal and external stakeholders as needed throughout the process

This is not a ‘step’ as such, but an ongoing commitment to stakeholders: being transparent about the process of carrying out the DPIA, and remaining open to consultation and to the expertise and knowledge of the organisation’s various stakeholders – from colleagues to customers. The ICO explains, “data protection risks are more likely to remain unmitigated on projects which have not involved discussions with the people building a system or carrying out procedures.”

DPIAs – what are the benefits?

There are clear benefits for organisations that conduct DPIAs, particularly the cost savings that come from knowing the risks before starting work:

  • cost benefits from adopting a Privacy by Design approach: knowing the risks before starting work allows issues to be fixed early, resulting in reduced development costs and fewer schedule delays
  • risk mitigation in relation to fines and loss of sales caused by lack of customer and/or shareholder confidence
  • reputational benefits and trust building from being seen to consider and embed privacy issues into a programme’s design from the outset

For more information about DPIAs and how Data Compliant can help, please email dc@datacompliant.co.uk.

Harry Smithson, 20th July 2017

Data Compliant’s Weekly Round Up


This week has been a bit hectic when it comes to data breaches and news. We started off with the Snoopers’ Charter being passed, then we heard that Deliveroo accounts had been hijacked, with many customers paying for someone else’s dinner after passwords stolen from other businesses were reused on the site.

We also heard of yet another colossal hack: mobile network Three had been infiltrated by three hackers dotted around the country, putting two thirds of its nine million customers at risk. The hackers accessed the upgrade system using an employee login and were able to intercept new handsets before they reached the customers whose accounts had been ‘upgraded’. Could this be an insider threat? Although Three has confirmed that no financial data was taken, the information that was accessible included names, telephone numbers, addresses and dates of birth – all of which is classed as personal data under the Data Protection Act, and all very handy for criminals looking to steal someone’s identity.

Police are investigating Broxtowe Borough Council after an email containing allegations about an individual’s conduct was sent to all staff members (730 people in total), an incident reported in September. The ICO has said it is not going to take any action.

Hatchimals
Hatchimals are the latest craze with the kids these days, and I bet they’re on everyone’s Christmas wish list. For those who don’t know, Hatchimals are Furby-like toys inside an egg that the child has to nurture until it hatches. Once hatched, the toy learns how to speak from its owner – so I’m told by my overly eager nephew. Because these toys are so popular, scammers are out in force, taking to social media to encourage loving parents to hand over more than double the retail price. Once the scammers have the money, the parents are blocked and never hear from them again – sometimes over £100 worse off. These toys are out of stock in every UK retailer that sells children’s toys, so if an advert online, on social media or in an email says they’re still available – and, better yet, on sale – don’t be fooled: if it seems too good to be true, it usually is.

Black Friday and Cyber Monday
With Black Friday this Friday (25th November) and Cyber Monday on the 28th, fake adverts and phishing emails are likely to be on the rise this week and most of next week too. It is sad to think that scammers use this time of year to steal from loving friends and family, but it unfortunately happens every year. Some of these scams are easy to spot with a bit of common sense; however, they are becoming more and more sophisticated and harder to recognise.

Last year UK consumers spent £2 billion in 24 hours online and in stores on Black Friday, and £3.3 billion over the whole weekend. Predictions for this year are even higher. So if you’re anything like me and are planning to get home from work, make yourself a cup of tea, put your feet up and do your Black Friday shopping online, here are some hints and tips to help you stay safe this weekend.

  • Make sure the websites you are visiting have https:// at the front of the URL. The ‘s’ actually stands for secure! Who knew?
  • If you receive an email from your bank, PayPal or any other company asking you to confirm your payment details via a link, hover your mouse over the link to see where the URL really points. If the domain isn’t the company’s own name followed by .com, .co.uk etc., it’s a scam (see the sketch after this list).
  • Look at the address the email was sent from – does it match the company’s name?
  • Use strong passwords, and a different password for each login (this is how many people got stung with Deliveroo: they had used the same password for their Deliveroo account as for other websites and apps).
  • Read the website’s privacy policy before handing over your personal information. Privacy policies are legally binding and have to tell you what the company plans to do with your data.
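
To make the link-checking advice above a little more concrete, here is a minimal, illustrative Python sketch (the URLs and the expected domain below are invented examples) of the two checks you can do mentally when you hover over a link: does it use HTTPS, and does the hostname actually belong to the company’s own domain?

```python
from urllib.parse import urlparse

def looks_legitimate(url: str, expected_domain: str) -> bool:
    """Rough sanity check for a link found in an email.

    Mirrors the manual advice above: the link should use HTTPS and its
    hostname should be the company's own domain (or a subdomain of it).
    """
    parsed = urlparse(url)
    host = (parsed.hostname or "").lower()
    if parsed.scheme != "https":          # "the 's' stands for secure"
        return False
    return host == expected_domain or host.endswith("." + expected_domain)

# Hypothetical examples:
print(looks_legitimate("https://www.paypal.com/uk/signin", "paypal.com"))          # True
print(looks_legitimate("http://paypal.com.account-update.example", "paypal.com"))  # False
```

This is only a heuristic, of course – scam sites can use HTTPS too – which is why the sender address and the other checks in the list above still matter.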

I could go on and on, but these five main steps should keep you fairly safe this weekend. Don’t be put off by the minority of people who want to scam you out of your money. There are some good people (and even better bargains) out there, so happy shopping!

Written by Charlotte Seymour – 25th November 2016.

Snoopers’ Charter – What do you think?

The Investigatory Powers Bill, also known as the Snoopers’ Charter, was passed by the House of Lords last week. This means that service providers will now need to keep, for 12 months, records of every website you visit (not the exact URL, but the site itself) and every phone call you make, including how long each call lasted and the date and time it was made. They will also record the apps you use on your phone or tablet.

The idea behind the Bill is to prevent terrorism and organised crime, which, it goes without saying, we all fully support.  What it will also obviously do is to place massive amounts of personal information into the hands of the government and other bodies for that 12-month period.  And there has been and will continue to be a huge debate over whether and to what extent this is a breach of our privacy.

The Bill will also allow the police and other authorities to look at a specific location and see which websites are most heavily used in that area, and even who is visiting that area. Dozens of public organisations and departments, such as HMRC, the Food Standards Agency and the Gambling Commission, will also be able to access this information without needing evidence of ‘reasonable doubt’ to do so.

What has not changed is that the security services still have the ability to hack into your communications, eavesdrop on your calls and read your texts and emails – but only as long as they have the required warrant to do so. So in theory your actual conversations are still safe unless there is reason to believe you are involved in something you shouldn’t be.

All this is very well, but is the Bill self-defeating? Doesn’t it just encourage the use of VPNs, which bounce your IP address around the world so you can’t be traced? If you were doing something you didn’t want officials to know about, isn’t that exactly what you’d do?

Food for thought: the UK will expect companies like Google, Facebook and Apple to weaken or remove encryption from some of their software so that the UK can gain access to those records. These companies aren’t British, so can they refuse? What worries me is that, if they do refuse, would they be tempted to pull out of working with the UK completely? In which case, what does the government want more – the business and jobs these companies provide, or the data they hold?

Not only that, but we are now living in an age where Yahoo can lose half a billion accounts, a Three Mobile breach can put millions of customers at risk, and thousands of Tesco customers can have money simply removed from their bank accounts. And the list goes on. Isn’t keeping all this data stored for 12 months just a huge red target for hackers? Even though the Bill is driven by national security, the risk is that it still creates an ocean of information that can be dipped into, hacked and misused.

I feel caught between a rock and a hard place.  I have no issues with the government bodies looking through my history should they choose to, but is it right that they can? And then you have to wonder … has anything really changed that much?  Hmmm…

What do you think? None of this will go away. Our children will inherit this Bill and will grow up with all of its implications.


Written by Charlotte Seymour – November 2016

Insider Threats – Charlotte’s View

Something that is being talked about more and more (due to its unfortunately increasing frequency) is the insider threat. It’s in the news far more than it ever used to be.

Do you remember the Morrisons auditor who released a spreadsheet containing the (very) personal details of just shy of 100,000 members of staff? He ended up being jailed for eight years, but, as a saying I heard recently puts it, it’s not a digital footprint you leave – it’s more of a digital tattoo. Even two years after the incident, Morrisons is still suffering the effects.

That was clearly what you would call a malicious breach. Such breaches do unfortunately happen, but there are ways to protect your company against them. Firstly, we at Data Compliant believe that detailed joiner processes (thorough screening, references and criminal record checks where appropriate), ongoing staff appraisals and good leaver processes can minimise the risk.

Much more likely, in my opinion, are insider breaches caused by negligence, carelessness and genuine accidents. Did you know that over 50% of data breaches are caused by staff error? This may be because staff do not follow company procedures correctly and open up pathways for hackers, or because staff are tricked into handing over information that they shouldn’t.

Your staff could be your company’s weakest point when it comes to protecting its personal and confidential data. But you can take simple steps to minimise this risk by training your staff in data protection.

Online training has some big advantages for businesses: it’s a quick, efficient and relatively inexpensive way of training large numbers of employees while “taking them out of the business” for the least possible time.

The risk from a breach isn’t just damage to your business’s reputation, or even a hefty fine from the ICO, but, as mentioned above, potentially a criminal conviction. That is a lot to risk.

If you’re interested in online training, have a look at this video.

 


Written by Charlotte Seymour, November 2016

 

EU DPA Regulation – 7 Key Changes


A good balance between business needs and individual rights

Talks on ensuring a high level of data protection across the EU are now complete, and the draft text was agreed on Wednesday 16th December 2015. Marketers are delighted with the “strong compromise” agreed by Parliament and Council negotiators in their last round of talks.

The draft regulation aims to give individuals control over their private data, while also creating clarity and legal certainty for businesses to spur competition in the digital market. Back in September, Angela Merkel appealed to the European Parliament to take a business view, rather than simply looking at the Regulation from a data protection perspective, lest the legislation hold back economic growth in Europe. At the same time she described data as the “raw material” of the future and expressed her belief that it is fundamental to the digital single market.

The regulation returns control over citizens’ personal data to citizens. Companies will not be allowed to divulge information that they have received for a particular purpose without the permission of the person concerned.

EU DPA Regulation – 7 Key Changes

  1. 4% Fines: The Council had called for fines of up to two per cent of global turnover, while the Parliament’s version would have increased that to five per cent. In apparent compromise, the figure has been set at four per cent, which for global companies could amount to many millions (see the illustrative calculation after this list).
  2. Data Protection Officers (DPOs):  Companies will have to appoint a data protection officer if they process sensitive data on a large scale or collect information on many consumers.  These do not have to be internal or full-time.
  3. Consent: to marketers’ relief, consent will now have to be ‘unambiguous’ rather than the originally proposed ‘explicit’, which provides a more business-friendly approach to the legislation. In essence this means that direct mail and telephone marketing can still be conducted on an opt-out basis. Nonetheless, businesses will be obliged to ensure that consumers give their consent to the use of their data for a specific purpose by a clear and affirmative action.
  4. Definition of Personal Data – the definition has been expanded, in particular by reference to an identification number, location data, an online identifier, or one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of a person.
  5. Online identifiers – whether cookies and IP addresses constitute personal data has been the subject of discussion for some months. James Milligan of the DMA has expressed the view that a compromise has been reached: “Whether or not online identifiers such as cookies fall into the definition of ‘personal data’ will depend on where they are placed in the online ecosystem. For example, a cookie placed by my internet service provider will be classified as personal data as it could identify me, whereas a cookie placed by an advertiser lower down the online ecosystem and cannot be linked to my email address or anything else which could identify me, is unlikely to be considered as personal data. This represents a sensible compromise as it was feared that all online identifiers would be considered as personal data. This separation means non-identifiable, ‘blind’ data can be more widely used than identifiable personal data.”
  6. Profiling – profiling has now been included under the term ‘automated decision making’. Individuals have the right not to be subject to the results of automated decision making, so they can opt out of profiling. It will be necessary to implement tick-boxes or similar mechanisms to secure the data subject’s positive indication of consent to specific processing activities related to profiling.
  7. Parental consent – Member states could not agree to set a 13-year age limit for parental consent for children to use social media such as Facebook or Instagram. Instead, member states will now be free to set their own limits between 13 and 16 years.
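
As a purely hypothetical illustration of what the 4% cap in point 1 could mean in practice (the turnover figure below is invented), the maximum exposure scales directly with a company’s global turnover:

```python
# Hypothetical illustration only: maximum fine under a 4%-of-global-turnover cap.
annual_global_turnover_eur = 2_500_000_000   # invented example: EUR 2.5bn turnover
max_fine_eur = 0.04 * annual_global_turnover_eur
print(f"Maximum fine under the 4% cap: EUR {max_fine_eur:,.0f}")  # EUR 100,000,000
```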

 

Next Steps

The provisional agreements on the package will be put to a confirmation vote in the Civil Liberties Committee today (Thursday 17 December) at 9.30 in Strasbourg.

If the deal is approved in committee, it will then be put to a vote by Parliament as a whole in the new year, after which member states will have two years to transpose the provisions of the accompanying directive into their national laws. The regulation, which will apply directly in all member states, will also take effect after two years.

Written by Michelle Evans, Compliance Director at Data Compliant Ltd.

If you would like further advice on how the EU Regulation will affect your business, just call Michelle or Victoria on 01787 277742 or email dc@datacompliant.co.uk

 

 

Safe Harbor Framework ruled “Inadequate”


What was Safe Harbour?

The Safe Harbour Framework was a cross-border transfer mechanism which complied with EU data protection laws and allowed the transfer of personal data between the EU and the USA. More details on how Safe Harbour worked can be found here.

Why was the Safe Harbour Framework invalidated?

In its ruling of 6th October in the recent Facebook case, the Court of Justice of the European Union (CJEU) judged that “US Companies do not afford an adequate level of protection of personal data”, and the Safe Harbour Framework is therefore now invalid.

The CJEU indicated that US legislation authorises, on a general basis, the storage of all personal data of all persons whose data is transferred from the EU to the US, without any differentiation, limitation or exception being made in light of the objectives pursued, and without providing an objective criterion for determining limits on the access to and use of this data by public authorities.

The CJEU further observed that the Safe Harbour Framework does not provide sufficient legal remedies to allow individuals to access their personal data and to obtain rectification or erasure of such data. This compromises the fundamental right to effective judicial protection, according to the CJEU.  You can read the European Court of Justice Press Release here.

There have been concerns about the Safe Harbour Framework for some time and the European Commission and the US authorities have been negotiating with a view to introducing an arrangement providing greater protection of privacy to replace the existing agreement.

How can I now transfer my data to the US?

Organisations that have been using Safe Harbour will now have to review how they transfer personal data to the US and come up with alternative solutions.  However, it is worth noting that the Information Commissioner’s Office has recognised that this process will take some time.  And James Milligan at the DMA states that data already transferred to US-based companies under Safe Harbour will be unaffected.

In the meantime, multinational companies transferring data to their affiliates can look at using Binding Corporate Rules (BCRs), which allow the transfer of data from the EEA in compliance with the eighth data protection principle.

Another legal method of transferring personal data to the US is to use the Model Contract Clauses produced by the EU for transfers of personal information outside the EU.

Michelle Evans, Compliance Director at Data Compliant Ltd.

If you are planning to transfer data between the EU and the US, and would like help on how to do so in the light of this new ruling, just call Michelle or Victoria on 01787 277742 or email dc@datacompliant.co.uk