Tag Archives: profiling

Facebook’s cryptocurrency Libra under scrutiny amid concerns of ‘data handling practices’

It would be giving the burgeoning cryptocurrency Libra short shrift to call it merely ambitious. Its aims, as stated in the Libra Association’s white paper, are lofty even by the rhetorical standards of Silicon Valley. If defining Libra as ‘the internet of money’ isn’t enough to convince you of the scale of its aspiration, the paper also touts the currency’s ability to financially enfranchise the world’s 1.7 billion adults who lack access to traditional banking networks or the global financial system.

Like its crypto predecessors, Libra uses blockchain technology to remain decentralised and inclusive, enabling anyone who can pick up a smartphone to participate in global financial networks. Unlike existing cryptocurrencies, however, Libra promises stability thanks to the backing of a reserve of ‘real assets,’ held by the Libra Reserve. There is also the hypothetical added benefit of Libra proving more energy efficient than cryptocurrencies such as Bitcoin, because it will dispense with ‘proof of work’ mechanisms such as Bitcoin mining, whose electricity demands keep growing with the network.

So far, so Zuckerberg. It may seem unsurprising, then, that global data protection regulators have seen the need to release a joint statement raising concerns over the ‘privacy risks posed by the Libra digital currency and infrastructure.’ While risks to financial privacy and related concerns have been raised by Western policymakers and other authorities, this is the first official international statement relating specifically to personal privacy.

The joint statement, published on the website of the UK’s Information Commissioner’s Office (ICO) on 5th August, has signatories from Albania, Australia, Canada, Burkina Faso, the European Union, the United Kingdom and the United States. The primary concern is that there is essentially no information from Facebook, or its participating subsidiary Calibra, on how personal information will be handled or protected. The implementation of Libra is rapidly forthcoming – the target launch is in the first half of next year. Its uptake is anticipated to be similarly rapid and widescale thanks to Facebook’s goliath global status. It is likely, therefore, that the Libra Association (nominally independent, but of which Facebook, among other tech and communications giants, is a founding member) will become the custodian of millions of people’s data – many of whom will reside in countries that have no data protection laws – in a matter of months.

The statement poses six main questions (a ‘non-exhaustive’ conversation-starter) with a view to getting at least some information on how Libra will actually function, both at the user level and across the network, how the Libra Network will ensure compliance with relevant data protection regulations, and how privacy protections will be incorporated into the infrastructure. All of these questions aim to establish how Facebook, Calibra and their partners have approached personal data considerations.

Profiling, Algorithms and ‘Dark Patterns’

The joint statement asks how algorithms and profiling involving personal data will be used, and how this will be made clear to data subjects to meet the standards for legal consent. These are important questions about how access to the currency will be designed at the user level, and ones on which prospective stakeholders remain ill-informed. The Libra website does state that the Libra blockchain is pseudonymous, allowing users to hold addresses not linked to their real-world identity. How these privacy designs will manifest remains unclear, however, and there is as yet no guarantee that de-identified information cannot be re-identified through nefarious means, either internally or by third parties.
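To make the pseudonymity claim concrete: in most public blockchains, an on-chain address is derived by hashing key material, so the address itself says nothing about the holder’s identity. The sketch below illustrates that general pattern only – it is not Libra’s actual address scheme (which has not been published in detail), and the hash-of-a-random-key construction stands in for a real elliptic-curve keypair.

```python
import hashlib
import secrets

def new_pseudonymous_address() -> str:
    # Placeholder for a real elliptic-curve private key (illustrative only).
    private_key = secrets.token_bytes(32)
    # Real systems derive the address from the *public* key; here a
    # one-way hash of the private key stands in for key derivation.
    public_key = hashlib.sha256(private_key).digest()
    # The on-chain address is a further hash, so by itself it reveals
    # nothing about the holder's real-world identity.
    return hashlib.sha256(public_key).hexdigest()

# The same person can hold many addresses with no on-chain link between them.
addr_a = new_pseudonymous_address()
addr_b = new_pseudonymous_address()
print(addr_a != addr_b)  # True: nothing ties the two addresses together
```

The regulators’ point survives this design: pseudonymity is not anonymity, because anyone who can join an address to off-chain data (a wallet sign-up, an IP address, a purchase record) can re-identify its holder.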

The regulators also bring up the use of nudges and dark patterns (sometimes known as dark UX) – methods of manipulating user behaviour that can rapidly become unethical or illegal. Nudges may be incorporated into a site (they may sometimes be useful, such as a ‘friendly reminder’ that Mother’s Day is coming up on a card website) in order to prompt commercial activity that may not have happened otherwise. There is not always a fine line between a reasonable nudge and a dubious one. Consider the example of Facebook asking a user ‘What’s on your mind?’, prompting the expression of a feeling or an attitude. We already know that Facebook has plans to scan information on emotional states, ostensibly for the purposes of identifying suicidal ideation and preventing tragic mistakes. The benefits of this data to unscrupulous agents, however, could prove, and indeed have proved, incalculable.

The Libra Network envisions a ‘vibrant ecosystem’ (what else?) of app-developers and other pioneers to ‘spur the global use of Libra.’ Questions surrounding the Network’s proposals to limit data protection liabilities in these apps are highly pertinent considering the lightspeed pace with which the currency is being designed and implemented.

Will Libra be able to convince regulators that it can adequately distance itself from these practices, which take place constantly online? Has there been any evidence of the Data Protection Impact Assessments (DPIAs) that the European Union’s General Data Protection Regulation (GDPR) unequivocally demands for data-sharing on this scale?

Hopefully, Facebook or one of its subsidiaries or partners will join the conversation started by the joint statement, showing data protection authorities the same level of cooperation and diligence they have shown financial authorities. More updates to come.

Harry Smithson, 9th August 2019

The GDPR and Profiling

Profiling is a very useful tool which marketers have been using for decades to understand their customers better and to target them appropriately.  However, the GDPR makes some changes to how profiling is treated, and these should be weighed carefully before any profiling is undertaken.  For the first time, profiling is grouped with automated decision-making, and the same rights apply to the individuals whose information is being profiled. So how does this affect businesses?

Profiling Benefits

There are obvious benefits both to businesses and consumers in relation to profiling, which is used in a broad number of sectors from healthcare to insurance, retail to publishing, leisure to recruitment.

It is also an extremely useful tool for marketers, providing increased efficiency, savings in resources, and the financial and reputational benefits of understanding customers and establishing more personal, relevant communications with them.  The customer or individual benefits in turn from receiving fewer, and far more relevant, communications.

What is profiling?

The GDPR (Article 4(4)) defines profiling as: “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”.

Profiling can be as simple as segmenting your own customers into groups based on gender, purchase history, and other data that the customer has provided to you during your relationship.  It becomes more complex when additional data is added to the mix – for example, enriching the information your customer has provided with data from external sources such as social media, or providers of geo-demographic or lifestyle data.
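The simple end of that spectrum can be sketched in a few lines. The records and segment rules below are entirely hypothetical – the point is only that rule-based segmentation of first-party data is already ‘profiling’ under the GDPR definition quoted above.

```python
from collections import defaultdict

# Hypothetical first-party customer records (illustrative fields only).
customers = [
    {"id": 1, "gender": "F", "orders": 12, "last_purchase": "2018-02"},
    {"id": 2, "gender": "M", "orders": 1,  "last_purchase": "2017-06"},
    {"id": 3, "gender": "F", "orders": 4,  "last_purchase": "2018-01"},
]

def segment(customer: dict) -> str:
    """Assign a simple marketing segment from purchase history."""
    if customer["orders"] >= 10:
        return "loyal"
    if customer["orders"] >= 3:
        return "regular"
    return "lapsed"

segments = defaultdict(list)
for c in customers:
    segments[segment(c)].append(c["id"])

print(dict(segments))  # {'loyal': [1], 'lapsed': [2], 'regular': [3]}
```

Joining these records to external geo-demographic or social media data would be the ‘more complex’ case the paragraph describes, and correspondingly harder to justify.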

Profiling and the GDPR

As with all processing under the GDPR, those who profile individuals have responsibilities to those individuals.  Profiles must be accurate, relevant, and non-discriminatory.  All six GDPR principles apply, and because profiles evolve as individuals change over time, accuracy and retention become especially important.  Privacy by design is key, as is the requirement that individuals be made aware of such profiling and of their right not to be subject to decisions based on it.

It’s worth noting that automated decisions can be made with or without profiling.  And the reverse is also true – profiling can take place without making automated decisions.  It’s all a matter of how the data is used.  Where manual decisions are made, Article 22 does not apply.

Consent or Legitimate Interests?

The legal basis under which profiling takes place is a matter for careful consideration.  There has been debate over whether profiling requires the consent of the individual who is being profiled, or whether legitimate interest may apply.

There will be instances where the impact of the profiling will have a legal or similarly significant effect – for example, in financial services (a mortgage approved or declined), or when marketing to vulnerable customers – for example, gambling products to those in financial difficulty.  Where profiling is considered to have such an effect, an organisation will need to rely on the legal basis of explicit Consent before profiling and making decisions on the basis of such profiling.

However, in many cases, marketing will not have such an impact, and in those cases consent will not be required.  Instead it may be possible to rely on Legitimate Interests.  But before such a decision is made, a Legitimate Interest Assessment will need to be conducted.  This will need to consider the necessity of the profiling, the balance of benefits to the individuals versus the business, and the measures taken to protect the personal data and profiles involved.
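The three limbs of that assessment (necessity, balancing, safeguards) lend themselves to a structured record. The sketch below is a hypothetical shape for such a record – the field names and pass/fail logic are assumptions for illustration, not a prescribed LIA format.

```python
from dataclasses import dataclass, field

@dataclass
class LegitimateInterestAssessment:
    """Hypothetical record of the three-part LIA test."""
    purpose: str                       # why the profiling is needed
    necessity_justified: bool          # no less intrusive method would work
    balance_favours_processing: bool   # individuals' interests vs the business's
    safeguards: list = field(default_factory=list)  # protective measures taken

    def passes(self) -> bool:
        # Legitimate interests is only available if both the necessity
        # and balancing limbs are satisfied; safeguards feed the balance.
        return self.necessity_justified and self.balance_favours_processing

lia = LegitimateInterestAssessment(
    purpose="Segment existing customers for more relevant offers",
    necessity_justified=True,
    balance_favours_processing=True,
    safeguards=["pseudonymisation", "opt-out honoured", "retention limit"],
)
print(lia.passes())  # True
```

Keeping the completed record is itself useful: as the next paragraph notes, the documented assessment is evidence of accountability.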

The Legitimate Interest Assessment will not only help you determine whether it is appropriate to conduct the profiling on this basis, it will also provide evidence that the individuals’ rights have been considered, helping the business meet the GDPR’s new principle of Accountability.

Victoria Tuffill  7th March 2018
