Upon a prompt from Jim Killock at the Open Rights Group, I've submitted the following response to the Department for Digital, Culture, Media & Sport Review of Representative Action Provisions, Section 189 Data Protection Act 2018 consultation.
[This is the first time I've used the new Blogger interface and I'm not keen. The HTML interface in particular is dense, in a tiny font, and challenging to read, interpret and use.]
I didn't have a lot of time, so drew heavily from Jim's own and Dr Johnny Ryan's work on challenging the legality of the adtech industry's architecture and operational practices.
Department for Digital, Culture, Media & Sport Review of Representative Action Provisions, Section 189 Data Protection Act 2018
My name is Ray Corrigan. I am a senior lecturer in the Science, Technology, Engineering and Mathematics Faculty at The Open University, but I am responding to this consultation in a personal capacity.
I write, in particular, in relation to the department’s examination of whether to introduce new provisions to permit organisations to act on behalf of individuals who have not given their express authorisation.
I am in favour of such provisions.
Chapter VIII, Article 80(2), of the General Data Protection Regulation provides that EU Member States may provide that any not-for-profit body, organisation or association which has been properly constituted in accordance with the law, independently of a data subject’s mandate, has the right to lodge, in that Member State, a complaint with the supervisory authority which is competent pursuant to GDPR Article 77, and to exercise the rights referred to in GDPR Articles 78 (right to an effective judicial remedy against a supervisory body) and 79 (right to an effective judicial remedy against a data controller or processor), if it considers that the rights of a data subject under the Regulation have been infringed as a result of the processing.
The UK government chose not to incorporate this provision into the Data Protection Act 2018, and I would suggest it is important that this now be rectified.
The big technology and associated “ad tech” companies have been running rings round governments and regulators for too long. As Johnny Ryan of Brave points out, in this formal complaint concerning a massive, web-wide data breach by Google and other “ad tech” companies under the GDPR,
“Every time a person visits a website and is shown a “behavioural” ad on a website, intimate personal data that describes each visitor, and what they are watching online, is broadcast to tens or hundreds of companies. Advertising technology companies broadcast these data widely in order to solicit potential advertisers’ bids for the attention of the specific individual visiting the website.
A data breach occurs because this broadcast, known as a “bid request” in the online industry, fails to protect these intimate data against unauthorized access. Under the GDPR this is unlawful...
Bid request data can include the following personal data:
• What you are reading or watching
• Your location
• Description of your device
• Unique tracking IDs or a “cookie match”. This allows advertising technology companies to try to identify you the next time you are seen, so that a long-term profile can be built or consolidated with offline data about you
• Your IP address (depending on the version of “real time bidding” system)
• Data broker segment ID, if available. This could denote things like your income bracket, age and gender, habits, social media influence, ethnicity, sexual orientation, religion, political leaning, etc. (depending on the version of bidding system)
Dr Ryan said “There is a massive and systematic data breach at the heart of the behavioral advertising industry. Despite the two year lead-in period before the GDPR, adtech companies have failed to comply. Our complaint should trigger an EU-wide investigation into the ad tech industry’s practices, using Article 62 of the GDPR. The industry can fix this. Ads can be useful and relevant without broadcasting intimate personal data”.”
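The mechanics Dr Ryan describes can be made concrete. Real-time bidding messages follow the adtech industry's OpenRTB specification, and the Python sketch below shows, in simplified form, the kind of personal data such a bid request carries. The field names follow OpenRTB conventions, but every value here (the page URL, IDs, broker and segment names) is invented purely for illustration, not taken from any real broadcast.

```python
import json

# Illustrative, simplified sketch of an OpenRTB-style "bid request",
# the broadcast sent to tens or hundreds of companies each time a
# page with behavioural ads is loaded. All values are invented.
bid_request = {
    "id": "example-auction-0001",                 # unique auction ID
    "site": {
        "page": "https://example.com/support-forum",  # what you are reading
        "cat": ["IAB7-28"],                       # IAB content category
    },
    "device": {
        "ip": "203.0.113.7",                      # your IP address
        "ua": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",  # device description
        "geo": {"lat": 51.501, "lon": -0.142},    # your location
    },
    "user": {
        "id": "f5a2c9d1-example",                 # long-term tracking ID
        "buyeruid": "cookie-match-42",            # "cookie match" ID for this bidder
        "data": [
            {
                "name": "example-data-broker",    # hypothetical broker
                "segment": [{"id": "income-bracket-high"}],  # broker segment ID
            }
        ],
    },
}

# Every company receiving the broadcast sees this same profile.
print(json.dumps(bid_request, indent=2))
```

The point the complaint makes is visible in the structure itself: the page being read, the location, the device, the persistent identifiers and the broker segments all travel together in one message, to every potential bidder, whether or not they win the auction.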
For all their flaws, getting the GDPR and the Data Protection Act 2018 in place as legal infrastructure for regulating the collection and processing of personal data was not a bad start. Unfortunately, enforcement efforts have so far been underwhelming at best, with few exceptions such as the Belgian data protection authority’s recent declaration that the behavioural advertising industry has, from the day the GDPR was passed, engaged in routine, systematic, industrial-scale, blanket data collection and management practices in serious breach of multiple provisions of the Regulation.
Ordinary internet users are almost completely oblivious to the mechanics of the hidden personal data processing adtech architecture behind most websites; and, as the Belgian data protection authority has just pointed out, the deployment and operation of that invasive technology is systemically and systematically unlawful. It is almost astonishing that we (commerce, industry and governments) enabled it, but we did, and it is time to do something about it.
Mass data collection, processing, onward dissemination and storage has become incredibly complex. Relying on individuals to spot misbehaviour and malfeasance in this area, and to initiate complaints or legal proceedings to rein in an industry out of control, is unrealistic. The woman on the Clapham omnibus simply does not have the expertise, time or resources. Not-for-profit bodies, human rights organisations and other related associations which have been properly constituted in accordance with the law, however, do have the expertise and understanding, even if, in these difficult times, many are experiencing a shortage of resources. It is more important than ever that such organisations are given the authority in law to raise complaints, independently, about nefarious data collection and management practices. NGOs should be empowered, in the public interest and to protect individual rights, to complain to the Information Commissioner’s Office and to complain to the courts about controllers, processors or failures by the ICO itself.
This power must include the capacity to challenge the Information Commissioner’s Office. In September 2018, Jim Killock of the Open Rights Group and Dr Michael Veale of University College London submitted a formal GDPR complaint to the UK Information Commissioner about “real time bidding”, the core of the industry’s invasive adtech architecture. In June 2019, the ICO gave the adtech industry six months to clean up its act. In January 2020, after six months of substantive inaction on the part of the industry, the ICO threw in the towel and said it would be taking no enforcement action to remedy industry breaches. https://brave.com/ico-faces-action/
The ad tech described in the extract from Dr Ryan above engages in invisible and deeply invasive profiling. Some of the worst breaches of data protection law are attached to sensitive areas of our private lives, like tracking individuals’ use of mental health websites. These areas need to be challenged but often are not, because of their sensitivity.
When you visit a website that delivers ads, your personal data is broadcast to tens or hundreds of companies. What you read, watch or listen to is categorised, and you are profiled accordingly. Some of these categories are bland, e.g. “football” or “jazz”. Some are hugely and outrageously sensitive. The rule-making representative body for the adtech industry, the Interactive Advertising Bureau (IAB), has, for example, an “IAB7-28 Incest/Abuse Support” category. Other categories relate to sensitive or embarrassing health conditions, sexual orientation, religious affiliation and the like. Google’s categories include “eating disorders”, political leanings, etc.
These tags, profiles and trackers can stick with internet users for a long time, and people have no idea of the digital baggage they are carrying round as a result. Such tags are not necessary for ad targeting. They are more a convenience for the industry, making it easier to track, profile and re-identify people. And the obscurity of the whole process, its systems and mechanisms, makes it almost impossible for individuals to exercise their rights under the law, in the UK the Data Protection Act 2018. We cannot find, identify, verify, correct or delete these digital shadows and profiles. The power differential and lack of transparency make it extremely difficult for individuals to take effective action to rectify unlawful and unethical activities on the part of the industries involved.
Industry pretends it deals in anonymous or non-sensitive data, which is a flat-out falsehood. Detailed, invasive personal profiles are constantly and casually created and traded as people innocently surf the internet, unaware of these machinations. Industry treats this as routine business practice, but the fact that mass privacy invasion is routine on the internet does not make it right. It does not have to be this way, and it is time to stop it.
There is no great functional difference between adtech and the techniques Cambridge Analytica used in an attempt to influence voters, but the Cambridge Analytica story, for a time, entered the short-attention-span news cycle. The adtech data management platforms are just a longer-running, invisible scandal.
It is particularly important, in the case of sensitive personal information, therefore, that qualified NGOs be given the power to bring complaints, independently, to protect individual and societal privacy. Privacy is not just an individual value but the fundamental basis of a healthy society.
A couple of final points before I close – firstly to note the necessary parallels with consumer law and secondly on Brexit.
Consumer law allows consumer organisations to initiate complaints in the public interest on behalf of consumers. There is no reason, in principle, why NGOs should be prevented from engaging in an equivalent form of action in relation to consumer privacy.
On the Brexit front, from January 2021 the UK will face the prospect of needing an approved data adequacy decision from the EU in relation to cross-border flows of data. Elements of the Investigatory Powers Act 2016, the Digital Economy Act 2017 and recent government moves to pass the Internal Market Bill mean this could prove difficult. (See e.g. Brown, I. & Korff, D., The inadequacy of UK data protection law, Part One: General inadequacy, https://www.ianbrown.tech/wp-content/uploads/2020/10/Korff-and-Brown-UK-adequacy.pdf) A move to incorporate Article 80(2) of the GDPR into UK domestic law, enabling NGOs and other lawfully constituted public interest organisations to challenge unlawful data collection and management practices, could only help the process of demonstrating that the UK, post Brexit, provides “adequate” protection for personal data.