
GDPR and Data Protection Impact Assessments (DPIAs)


When are they needed?  How are they done?

Next year, under the new GDPR data protection legislation, Privacy Impact Assessments will become known as Data Protection Impact Assessments, and will be mandatory in certain circumstances instead of merely recommended.

The ICO currently describes PIAs as “a tool which can help organisations identify the most effective way to comply with their data protection obligations and meet individuals’ expectations of privacy.”

While the soon-to-be-rechristened DPIAs will be legally required, data controllers should embrace them as opportunities to avert heavy fines, reputational damage and the wider risks of data breaches from an early stage in any planned operation.

When will a DPIA be legally required?

Organisations will be required to carry out a DPIA when data processing is “likely to result in a high risk to the rights and freedoms of individuals.” This applies whether the processing is part of an existing operation or a planned project, provided it carries a risk to the rights of individuals as protected by the Data Protection Act. DPIAs can also range in scope, depending on the organisation and the scale of its project.

DPIAs will therefore be required when an organisation is planning an operation that could affect anyone’s right to privacy: broadly speaking, anyone’s right ‘to be left alone.’ DPIAs are primarily designed to allow organisations to avoid breaching an individual’s freedom to “control, edit, manage or delete information about themselves and to decide how and to what extent such information is communicated to others.” If there is a risk of any such breach, a DPIA must be carried out.

Listed below are examples of projects, varying in scale, in which the current PIA is advised – and it is safe to assume all of these examples will necessitate a DPIA after the GDPR comes into force:

  • A new IT system for storing and accessing personal data.
  • A new use of technology such as an app.
  • A data sharing initiative where two or more organisations (even if they are part of the same group company) seek to pool or link sets of personal data.
  • A proposal to identify people in a particular group or demographic and initiate a course of action.
  • Processing quantities of sensitive personal data.
  • Using existing data for a new and unexpected or more intrusive purpose.
  • A new surveillance system (especially one which monitors members of the public) or the application of new technology to an existing system (for example adding Automatic number plate recognition capabilities to existing CCTV).
  • A new database which consolidates information held by separate parts of an organisation.
  • Legislation, policy or strategies which will impact on privacy through the collection or use of information, or through surveillance or other monitoring.

How is a DPIA carried out?

There are 7 main steps that comprise a DPIA:

  1. Identify the need for a DPIA

This will mainly involve answering ‘screening questions,’ at an early stage in a project’s development, to identify the potential impacts on individuals’ privacy. The project management should begin to think about how they can address these issues, while consulting with stakeholders.
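
By way of illustration only, the outcome of the screening stage can be recorded alongside the project record so the decision to run a DPIA is documented from the outset. The sketch below is hypothetical: the questions shown are examples, not an official ICO list.

```python
# Hypothetical screening sketch - the questions are illustrative examples,
# not an official ICO list.
SCREENING_QUESTIONS = [
    "Will the project collect new personal data about individuals?",
    "Will personal data be disclosed to organisations that do not currently have it?",
    "Will existing personal data be used for a new or unexpected purpose?",
    "Does the project involve monitoring or surveillance of individuals?",
]

def dpia_needed(answers: dict) -> bool:
    """A 'yes' to any screening question flags the project for a DPIA."""
    return any(answers.get(question, False) for question in SCREENING_QUESTIONS)

# Example: a project that re-purposes existing data is flagged for a DPIA.
answers = {SCREENING_QUESTIONS[2]: True}
print(dpia_needed(answers))  # True
```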

  2. Describe the information flows

Explain how information will be obtained, used and retained. This part of the process can identify the potential for – and help to avoid – ‘function creep’: when data ends up being processed or used unintentionally, or unforeseeably.
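
One simple way to make function creep visible is to record each flow with its source, declared purposes and retention period, and to flag any proposed use that falls outside those purposes. The structure below is a hypothetical sketch, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class InformationFlow:
    """One row of a hypothetical information-flow map."""
    data_item: str              # e.g. "email address"
    obtained_from: str          # e.g. "online order form"
    declared_purposes: tuple    # purposes stated to the individual
    retention: str              # e.g. "24 months after last order"

flows = [
    InformationFlow("email address", "online order form",
                    ("order confirmation", "delivery updates"), "24 months"),
]

def is_function_creep(flow: InformationFlow, proposed_use: str) -> bool:
    """A proposed use outside the declared purposes is potential function creep."""
    return proposed_use not in flow.declared_purposes

print(is_function_creep(flows[0], "marketing emails"))  # True - needs review
```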

  3. Identify the privacy and related risks

Compile a record of the risks both to individuals, in terms of possible intrusions on their privacy, and to the organisation, in terms of regulatory action, reputational damage and loss of public trust. This stage includes a compliance check against the Data Protection Act and the GDPR.
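
A risk record can pair each risk to individuals with the corresponding corporate impact and the compliance provision it touches. The fields and scoring below are a hypothetical sketch rather than a mandated template.

```python
from dataclasses import dataclass

@dataclass
class PrivacyRisk:
    """One entry in a hypothetical DPIA risk register."""
    description: str            # risk to individuals
    corporate_impact: str       # e.g. regulatory action, reputational damage
    compliance_reference: str   # the DPA / GDPR principle it touches
    likelihood: int             # 1 (rare) to 5 (almost certain)
    severity: int               # 1 (minimal) to 5 (severe)

    @property
    def score(self) -> int:
        return self.likelihood * self.severity

register = [
    PrivacyRisk("CCTV footage identifies individuals beyond the stated purpose",
                "regulatory action and reputational damage",
                "purpose limitation principle", likelihood=3, severity=4),
]

# The highest-scoring risks are tackled first when evaluating solutions (step 4).
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(risk.score, risk.description)
```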

  4. Identify and evaluate the privacy solutions

With the record of risks ready, devise a number of solutions to eliminate or minimise these risks, and evaluate the costs and benefits of each approach. Consider the overall impact of each privacy solution.

  5. Sign off and record the DPIA outcomes

Obtain appropriate sign-offs and acknowledgements throughout the organisation. A report based on the findings and conclusions of the prior steps of the DPIA should be published and accessible for consultation throughout the project.

  6. Integrate the outcomes into the project plan

Ensure that the DPIA outcomes are incorporated into the overall project plan. The DPIA should remain an integral reference point throughout the development and execution of the project.

  7. Consult with internal and external stakeholders as needed throughout the process

This is not a ‘step’ as such, but an ongoing commitment to stakeholders to be transparent about the process of carrying out the DPIA, and being open to consultation and the expertise and knowledge of the organisation’s various stakeholders – from colleagues to customers. The ICO explains, “data protection risks are more likely to remain unmitigated on projects which have not involved discussions with the people building a system or carrying out procedures.”

DPIAs – what are the benefits?

There are clear benefits for organisations that conduct DPIAs:

  • cost benefits from adopting a Privacy by Design approach: knowing the risks before starting work allows issues to be fixed early, reducing development costs and avoiding delays to the schedule
  • risk mitigation in relation to fines and loss of sales caused by lack of customer and/or shareholder confidence
  • reputational benefits and trust building from being seen to consider and embed privacy issues into a programme’s design from the outset

For more information about DPIAs and how Data Compliant can help, please email dc@datacompliant.co.uk.

Harry Smithson   20th July 2017

EU DPA Regulation – 7 Key Changes


A good balance between business needs and individual rights

Talks on ensuring a high level of data protection across the EU are now complete, and the draft text was agreed on Wednesday 16th December 2015. Marketers are delighted with the “strong compromise” agreed by Parliament and Council negotiators in their last round of talks.

The draft regulation aims to give individuals control over their private data, while also creating clarity and legal certainty for businesses to spur competition in the digital market.  Back in September, Angela Merkel appealed to the European Parliament to take a business view rather than simply look at the Regulation from a data protection perspective, lest the legislation hold back economic growth in Europe.  At the same time she described data as the “raw material” of the future and expressed her belief that it is fundamental to the digital single market.

The regulation returns control over citizens’ personal data to citizens. Companies will not be allowed to divulge information that they have received for a particular purpose without the permission of the person concerned.

EU DPA Regulation – 7 Key Changes

  1. 4% Fines:  The Council had called for fines of up to two percent of global turnover, while the Parliament’s version would have increased that to five percent.  In an apparent compromise, the figure has been set at four percent of annual global turnover, which for global companies could amount to many millions (a worked illustration follows this list).
  2. Data Protection Officers (DPOs):  Companies will have to appoint a data protection officer if they process sensitive data on a large scale or collect information on many consumers.  These do not have to be internal or full-time.
  3. Consent:  to marketers’ relief, consent will now have to be ‘unambiguous’ rather than the originally proposed ‘explicit’, which provides a more business-friendly approach to the legislation. In essence this means that direct mail and telephone marketing can still be conducted on an opt-out basis.  Nonetheless, businesses will be obliged to ensure that consumers give their consent to the use of their data for a specific purpose by a clear and affirmative action (a sketch of recording consent follows this list).
  4. Definition of Personal Data – the definition has been expanded, in particular by reference to an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that person.
  5. Online identifiers – whether online identifiers such as cookies constitute personal data has been the subject of discussion for some months.  James Milligan of the DMA has expressed the view that a compromise has been reached: “Whether or not online identifiers such as cookies fall into the definition of ‘personal data’ will depend on where they are placed in the online ecosystem. For example, a cookie placed by my internet service provider will be classified as personal data as it could identify me, whereas a cookie placed by an advertiser lower down the online ecosystem and cannot be linked to my email address or anything else which could identify me, is unlikely to be considered as personal data.  This represents a sensible compromise as it was feared that all online identifiers would be considered as personal data. This separation means non-identifiable, ‘blind’ data can be more widely used than identifiable personal data.”
  6. Profiling – Profiling has now been included under the term ‘automated decision making’.  Individuals have the right not to be subject to the results of automated decision making, so they can opt out of profiling. It will be necessary to implement tick-boxes or similar mechanisms to secure the data subject’s positive indication of consent to specific processing activities related to Profiling.
  7. Parental consent – Member states could not agree to set a 13-year age limit for parental consent for children to use social media such as Facebook or Instagram. Instead, member states will now be free to set their own limits between 13 and 16 years.
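
On point 1, a worked illustration of the four percent cap; the turnover figure below is invented purely for the sake of the arithmetic.

```python
# Hypothetical illustration of the 4% cap - the turnover figure is invented.
global_turnover_eur = 500_000_000           # EUR 500m annual global turnover
max_fine_eur = 0.04 * global_turnover_eur   # 4% of global turnover
print(f"Maximum fine: EUR {max_fine_eur:,.0f}")  # Maximum fine: EUR 20,000,000
```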
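
And on point 3, a minimal sketch of how an unambiguous, affirmative consent action might be recorded. The field names and checks are assumptions for illustration, not a prescribed format.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """A hypothetical record of an unambiguous, affirmative consent action."""
    subject_id: str
    purpose: str        # the specific purpose consent was given for
    action: str         # the affirmative action taken, never a pre-ticked box
    timestamp: datetime

def record_consent(subject_id: str, purpose: str, action: str) -> ConsentRecord:
    if action == "pre-ticked box":
        raise ValueError("A pre-ticked box is not a clear affirmative action")
    return ConsentRecord(subject_id, purpose, action, datetime.now(timezone.utc))

record = record_consent("subject-001", "email marketing", "ticked unchecked box")
print(record.purpose, record.timestamp.isoformat())
```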


Next Steps

The provisional agreements on the package will be put to a confirmation vote in the Civil Liberties Committee today (Thursday 17 December) at 9.30 in Strasbourg.

If the deal is approved in committee it will then be put to a vote by Parliament as a whole in the new year, after which member states will have two years to transpose the provisions of the directive into their national laws. The regulation, which will apply directly in all member states, will also take effect after two years.

Written by Michelle Evans, Compliance Director at Data Compliant Ltd.

If you would like further advice on how the EU Regulation will affect your business, just call Michelle or Victoria on 01787 277742 or email dc@datacompliant.co.uk


EU versus Google – £12 million DPA fine

The pressure on Google over European data privacy issues has been ongoing for several years as EU data protection watchdogs attempt to bring the organisation – and other huge US companies – into line with European data protection principles.

The latest threat to Google comes from Holland, where the Dutch DPA has threatened Google with a fine of up to 15M euros for breaking local laws over how it can use user data.  Google has been given until the end of February 2015 to change the way it handles personal data, before the fine is levied.

Online behaviour used to target advertising

So what has Google done wrong?  The issue is over the way Google uses data about people’s online behaviour to tailor advertisements.  Google builds up a profile for every one of its users based on keywords used in searches, email messages, cookies, location data – even video viewing habits.  However, it does not inform its data subjects that it is collecting and using data in this way, and nor does it obtain consent.

Google’s Data Assets

Google’s data is a core asset for the business, and other businesses like it.  One of Google’s key data privacy issues is that the company has merged all its separate privacy policies into one policy which allows Google to share its user data across all its services – for example, Gmail data and search engine data can be used and combined across the company.  In addition, there is no opt-out for the data subject.

From Google’s point of view, its customer profiling is enhanced considerably by this activity – and advertising to targeted customers is Google’s core revenue stream.  Google also uses customer data to drive new products such as Google Now (an appointment-based app which tells you where your appointment is, how to get there, what the traffic conditions are and what time to leave) – a great concept, but one that would be useless without Google’s ability to collect and use data from its users.

It has been clear for some time that the EU is determined to take on the challenge of the giant US search engines and social media platforms, and curb the way they use data.  Because Google has such a vast share of the market, it, in particular, regularly comes under fire from the EU.

Google Privacy Policy – Fairness and Transparency

The requirement for additional permissions or opt-outs may be more problematic than helpful for Google’s customers.  But fairness and transparency are issues that Google could address relatively simply: as a minimum, customers should be informed about what data Google is collecting about them, why it is being collected and how it is being used. And a little creativity in the wording would serve to illustrate the benefits to the customer.

The single privacy policy makes such transparency difficult.  So perhaps the simplest solution is to re-establish separate privacy policies for each of its business areas.  That might at least serve to reassure not only the EU, but also the US data protection authorities who have also expressed concerns over Google’s single privacy policy.

Your thoughts and views are always welcome – please add your comments below.  If you have any concerns about your data compliance in general or the impact of EU changes in your business, contact us on 01787 277742.  Or email victoria@datacompliant.co.uk


EU Data Protection Regulation – Getting closer?

The EU Regulation is designed to replace the current multiplicity of EU data protection laws with a single set of rules to be applied throughout all Member States.  Time is moving on, so it’s important to keep on top of the discussions and updates being published.

Last month’s proposed revisions to Chapter IV (which deals with data controller and data processor obligations) are summarised below.  However, it is worth remembering that “nothing is agreed until everything is agreed” in relation to the Regulation.

Greater discretion for data controllers – risk-based compliance

Businesses will be relieved to see greater discretion for data controllers in complying with the legislation as recent Chapter IV discussions in Europe have moved towards a risk-based approach to compliance.

A balance between privacy and entrepreneurship

The proposed amendments to Chapter IV suggest that data compliance obligations should be proportional to the organisation’s specific data processing activity and associated risks.

Once these activities and risks have been assessed, appropriate privacy and data protection tools should be instigated by the organisation.

Different activities, even where the same data is involved, may quite often have different consequences, requiring different levels of protection. The risk-based approach allows data controllers a more flexible approach in assessing their data compliance responsibilities within the context of their own particular business.

It appears that most countries welcome the risk-based approach, which they view as providing a good balance between protecting personal data and safeguarding businesses and entrepreneurship.

Chapter IV Proposed Revisions 

Below are some examples of the revisions proposed by the EU Council:

  • Data protection impact assessments are only required where “high” risk (for example identity theft, fraud or financial loss) to the rights and freedoms of individuals is involved.
  • The appointment of Data Protection Officers is voluntary (unless individual Member State legislation states otherwise).
  • Only data breaches that are likely to result in “high risk for rights and freedoms of individuals” need be reported.
  • If stolen or breached data is encrypted or protected in such a way that the data remains indecipherable, there is no requirement to report the breach (a brief encryption sketch follows this list).
  • Required levels of security measures will be established by considering multiple factors, including the nature, scope, context and purpose of the data processing to be undertaken, in combination with the cost of implementation and the technology available.
  • The supervisory data protection authority need only be consulted prior to the start of processing where a data privacy impact assessment indicates that the processing would result in “high risk” to the rights and freedoms of individuals.
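
As a minimal illustration of the breach-reporting point above, symmetric encryption at rest leaves stolen records indecipherable without the key. The sketch assumes the third-party Python cryptography package and is not a complete security design.

```python
# Minimal sketch only: assumes the third-party 'cryptography' package
# (pip install cryptography). Key management - the hard part in practice -
# is not shown here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # keep the key separate from the data store
cipher = Fernet(key)

personal_data = b"Jane Doe, 1 High Street, jane@example.com"
stored_record = cipher.encrypt(personal_data)   # what an attacker would steal

# Without the key, stored_record is indecipherable ciphertext.
print(stored_record[:20])
print(cipher.decrypt(stored_record))            # recoverable only with the key
```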

There is also a suggestion that data controllers may use “adherence of the processor to an approved code of conduct or an approved certification mechanism” to demonstrate compliance with the obligations of a controller.  Organisations may therefore find it worthwhile to select only data processors that hold appropriate data security certification, such as ISO 27001 or DMA DataSeal.

If you have any concerns about your data compliance in general or the impact of EU changes in your business, contact us on 01787 277742.  Or email victoria@datacompliant.co.uk
