Monday, November 26, 2012

Privacy v Convenience/attraction/gratification/access/community/conformity

Thanks to Martyn Thomas via the invaluable FIPR Alerts list for the pointer to the European Network and Information Security Agency (ENISA) report on Privacy considerations of online behavioural tracking, published on 14 November. ENISA have been fairly active on the privacy front this year with four reports: a study on monetising privacy and one on data collection and storage in the EU, both published in February; the tracking report from a couple of weeks back; and one last week on the right to be forgotten.

The data collection study highlighted "the clear contrast between the importance of the privacy by design principle on the one hand and the reality of lax data protection practices with many online service providers on the other hand" and aimed "to conduct an analysis of the relevant legal framework of European Member States on the principle of minimal disclosure and the minimum duration of the storage of personal data." The authors' recommendations, in brief, were:
  • the national Data Protection Authorities should provide clear guidelines to data controllers;
  • the Article 29 Data Protection Working Party, the European Data Protection Supervisor and ENISA should do the same for specific areas of processing of personal data with pan-European impact;
  • the Data Protection Authorities should aim to improve user awareness relating to the rights stemming from the data protection legislation and on the possibilities offered to users by the legal system to exercise these rights, including by complaining in cases of excessive collection and storage of personal data, and
  • the Member States should identify and eliminate conflicting regulatory provisions relating to the collection and storage of personal data.
The monetising privacy report said that uptake of privacy-enhancing technologies is low and the options are few, possibly because only a small number of people are prepared to pay for them.

The tracking study notes "Internet users are being increasingly tracked and profiled and their personal data are extensively used as currency in exchange for services. It is important that this new reality is better understood by all stakeholders if we are to be able to support and respect the right for privacy." It provides a technical perspective on behavioural tracking, asks "Why are users tracked? What techniques are used? To what extent are we tracked today? What are the trends? What are the risks? What protective measures exist? What could regulators do to help improve user privacy?" and recommends:
"- Development of anti-tracking initiatives and solutions for mobile applications; the users of mobile devices are more exposed as most anti-tracking initiatives are not focusing on mobile devices
- Development of easy-to-use tools for transparency and control; awareness is important but there is a need to enhance transparency tools to allow the users to know how their personal data is collected, managed and transferred
- Enforcement solutions should be deployed to block misbehaving players and to force compliance with rules and regulations regarding personal data protection; mechanisms should be defined by regulatory bodies both for compliance and for monitoring and detection of violation of the rules
- Privacy-by-design should be promoted; regulations have an important role in boosting the adaptation of privacy-preserving solutions, i.e. by enforcing the rules, and by ensuring the existence of complete, compliant, concrete and meaningful privacy policies."
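To make the kind of tracking the study describes concrete, here is a minimal sketch (my illustration, not from the ENISA report) of how a third-party tracker works: a tiny endpoint, embedded as an invisible pixel on many sites, sets a unique cookie and uses the Referer header to link all of a browser's visits into one profile. The names, port and structure here are assumptions for the example.

```python
# A toy third-party tracker (illustrative only): embedded as a 1x1 pixel on
# many sites, it sets a unique cookie and uses the Referer header to link
# every page a browser visits into a single profile.
import uuid
from http.cookies import SimpleCookie
from http.server import BaseHTTPRequestHandler, HTTPServer

PROFILES = {}  # tracker id -> pages where this browser was seen

class Tracker(BaseHTTPRequestHandler):
    def do_GET(self):
        cookies = SimpleCookie(self.headers.get("Cookie", ""))
        tid = cookies["tid"].value if "tid" in cookies else uuid.uuid4().hex
        # The Referer header tells the tracker which embedding page is open.
        PROFILES.setdefault(tid, []).append(self.headers.get("Referer", "unknown"))
        self.send_response(200)
        self.send_header("Set-Cookie", f"tid={tid}")  # persists across sites
        self.send_header("Content-Type", "image/gif")
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Tracker).serve_forever()
```

Browsers that block third-party cookies, and the anti-tracking tools the report calls for, break exactly this linkage.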
The right to be forgotten paper focuses on technical limitations and challenges when trying to enforce such a right.
"The recommendations of the paper cover multiple aspects:
  • Technical means of assisting the enforcement of the right to be forgotten require a definition of the scope of personal data, a clarification of who has the right to ask for the deletion of personal data under what circumstances, and what are acceptable ways to effect the removal of data. Data Protection Authorities, the Article 29 Data Protection Working Party, the European Data Protection Supervisor, etc. should work together to clarify these issues. Furthermore, when providing the above mentioned definitions, the technical challenges in enforcing the right to be forgotten (and the associated costs) for a given choice of definition should be considered carefully.
  • For any reasonable interpretation of the right to be forgotten, a purely technical and comprehensive solution to enforce the right in the open Internet is generally impossible. An interdisciplinary approach is needed and policy makers should be aware of this fact. 
  • A possible pragmatic approach to assist with the enforcement of the right to be forgotten is to require search engine operators and sharing services within the EU to filter references to forgotten information stored inside and outside the EU region. 
  • Particular care must be taken concerning the deletion of personal data stored on discarded and offline storage devices. 
  • Data controllers should be required to provide users with easy access to the personal data they store and ways to update, rectify, and delete data without undue delay and without cost to the user (to the extent that this does not conflict with other applicable laws). 
  • Research communities, industry, etc. should develop techniques and coordinate initiatives that aim at preventing the unwanted collection and dissemination of information (e.g., robots.txt, do not track, access control).
As mentioned above, this paper is complementing two other recent publications of ENISA in this area. In this broader context, given the findings of this paper, ENISA recommends that policy makers should ensure the use of technologies supporting the principle of minimal disclosure in order to minimize the amount of personal data collected and stored online. We also recommend the use of encryption for the storage and transfer of personal data. Particular attention should be focused on tracking and profiling online, and enforcement solutions should be deployed to block misbehaving players and to force compliance with rules and regulations regarding personal data protection.

At the same time, Data Protection Authorities, the Article 29 Data Protection Working Party, the European Data Protection Supervisor, etc. should work together to clarify pending definition issues taking into account the practical implementation aspects while Member States should eliminate conflicting regulations."
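Incidentally, the "do not track" mechanism those recommendations mention is, under the hood, nothing more than an HTTP header ("DNT: 1") that a cooperating server chooses to honour. A minimal sketch (mine, with hypothetical names, not from the paper) of a server that respects it:

```python
# Minimal sketch of honouring the Do Not Track header: the browser sends
# "DNT: 1" and a cooperating server simply skips its tracking code path.
from http.server import BaseHTTPRequestHandler, HTTPServer

def log_for_profiling(request):
    # Placeholder for whatever analytics/profiling a real site would run.
    print("profile:", request.client_address[0], request.path)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.headers.get("DNT") != "1":   # user has not opted out
            log_for_profiling(self)          # hypothetical tracking hook
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello\n")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```

Which is exactly why the paper pairs it with enforcement: nothing in the protocol compels the server to take the polite branch.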
They are all worthy studies by smart people with sensible recommendations. When reading them I found myself nodding and mumbling "absolutely!" and "couldn't agree more" and "you got that one right - it is impossible" and "somebody gets it!".

Then, as is increasingly the case with these things, I turn the final page and depressingly ask: what difference is this going to make?

We have a fundamental problem with privacy and the human condition. We say, when asked, we care about it - and we do - but we act like we don't. That's down to:
  • attraction - we like the stuff we (non-transparently/invisibly) give up our data for
  • gratification - we enjoy and find useful the stuff we (invisibly) give up our data for and we get at it easily and quickly on the Net
  • access - we get at services and deals on the internet and offline (e.g. supermarket "loyalty" card schemes) that we would not otherwise get without (invisibly) giving up our data
  • community - we get access to communities by (invisibly) giving up our data
  • conformity - we get the chance to fit in by (invisibly) giving up our data
  • convenience (and this one beats everything) - it's easier on the net even if we have to (invisibly) give up our data
The payoff is instant, or at least quick, and visible. Yet the damage to privacy at an individual, community, regional, national and global level is abstract, invisible and long-term, and it undermines the fabric of our society.

So how do we deal with the pathological calculus that is -
Privacy vs Convenience/attraction/gratification/access/community/conformity?
Put another way, how can so much be given up by so many for so little so often?

And how do we begin to evolve towards a situation where a significantly greater proportion of the population realise the immense value of our personal data as currency and act accordingly?