Showing posts with label human rights. Show all posts

Friday, March 07, 2025

Response to Ofcom technology notices consultation

At the behest of the Open Rights Group, I have written to Ofcom regarding their technology notices consultation.

Dear Ofcom consultation team,

I am responding to your consultation as an individual.

I am happy for you to publish this response.

I wish to respond to the following consultation question:

‘Do you have any views on our audit-based assessment, including our proposed principles, objectives, and the scoring system? Please provide Evidence to support your response.’

It is my view that Ofcom needs to consider, within its scoring, the following risks and threats that could arise from accrediting any scanning technologies:

1. The threat that the system might infringe people’s human rights to freedom of expression and privacy. This is particularly relevant given that these systems could break end-to-end encryption, and in light of the recent ECtHR ruling in Podchasov v. Russia – https://hudoc.echr.coe.int/eng/?i=001-230854.

Ofcom will have a legal duty to assess the proportionality of any such system and to ensure it does not infringe our human rights. It could find itself facing legal challenges over this issue if it does not demonstrate, within its framework, an assessment of the impact on the right to privacy of breaking E2EE.

2. The risk of false positives and wrongful accusations. If the false positive rate is too high, law enforcement agencies will be flooded with spurious results from any scanning system and people will be wrongfully accused, causing them harm. A higher threshold should therefore be applied to accuracy.

3. The risk that any system will break UK data protection laws and/or undermine the nation's cybersecurity by introducing backdoor vulnerabilities into private and secure messaging systems. Apple's recent withdrawal of its Advanced Data Protection product from the UK market highlights that forcing or approving a poor technology upon a company could result in UK users losing access to products. Ofcom should consider the risks to UK consumers of forcing onto providers new technologies that are not feasible to deliver or carry too high an economic and social cost.

4. Equality Act implications and impact on people with protected characteristics – Ofcom will have to consider the impact of any scanning system in relation to the public sector equality duty.

5. Higher weighting in the framework around risks where there are legal duties. The current minimum threshold for ‘fairness’ does not consider the risk Ofcom faces of breaching its legal obligations under the Human Rights Act, the Equality Act and the Data Protection Act. As such, a separate scoring and risk assessment should be undertaken for each technology it considers, to ensure Ofcom can evidence that it has met its statutory legal duties.

6. The risk any system will facilitate the spread of CSEM – A regulator wishing to control the use of image-based sexual abuse (IBSA) removal tools must carefully assess the risks posed by perceptual hash inversion attacks, in which an attacker reconstructs CSEM images from the hashed data the tool was using. For evidence of these attacks, see S. Hawkes, C. Weinert, T. Almeida and M. Mehrnezhad, 'Perceptual Hash Inversion Attacks on Image-Based Sexual Abuse Removal Tools', IEEE Security & Privacy, doi: 10.1109/MSEC.2024.3485497 (https://ieeexplore.ieee.org/document/10762793). Further risks and threats from scanning technologies are set out in H. Abelson, R. Anderson, S. M. Bellovin, J. Benaloh, M. Blaze, J. Callas, W. Diffie, S. Landau, P. G. Neumann, R. L. Rivest, J. I. Schiller, B. Schneier, V. Teague and C. Troncoso, 'Bugs in Our Pockets: The Risks of Client-Side Scanning', Journal of Cybersecurity, Volume 10, Issue 1, 2024, tyad020, https://doi.org/10.1093/cybsec/tyad020.
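To see why hash inversion is a credible threat, it helps to see what a perceptual hash actually preserves. Below is a minimal sketch, in Python, of an "average hash", one of the simplest perceptual hashing schemes; it is my own illustration, not the algorithm used by any accredited tool (real systems use schemes such as PDQ or PhotoDNA), but the principle is the same: the hash deliberately preserves the image's coarse structure so that similar images match, and that same structure is what an inversion attack starts from.

```python
# Illustrative "average hash" -- a toy perceptual hash.
# Real IBSA tools use more sophisticated hashes, but they share the
# key property shown here: the hash encodes the image's coarse
# light/dark layout, which is what inversion attacks exploit.

def average_hash(pixels):
    """Hash an 8x8 grid of greyscale values (0-255) to a 64-bit string.

    Each bit records whether that cell is brighter than the mean, so
    the hash is effectively a crude thumbnail of the image.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(h1, h2):
    """Number of differing bits: small distance means similar images."""
    return sum(a != b for a, b in zip(h1, h2))

# A toy "image": bright top half, dark bottom half.
img = [[200] * 8 for _ in range(4)] + [[50] * 8 for _ in range(4)]
# A slightly brightened copy still hashes to the same value --
# the robustness that makes perceptual matching work...
img2 = [[min(255, p + 10) for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(img2)
print(hamming(h1, h2))  # -> 0: the copies match
# ...but note that h1 itself reveals the light/dark layout of the
# image. That leaked structure is the starting point for inversion.
```

The design tension the consultation response points at is visible here: the more robustly a hash survives small image changes, the more information about the original image it must carry, and the more an inversion attack can recover.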

Yours sincerely,
Ray Corrigan

In relation to item 3 above, I'd also recommend those interested read and thoroughly digest 'Keys under doormats: mandating insecurity by requiring government access to all data and communications' by Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael A. Specter and Daniel J. Weitzner.

Bottom line: there is no way to create a backdoor in a cryptographic system that only the good guys will have access to.

 

Friday, February 07, 2025

Privacy concerns about the Data Use and Access Bill

At the prompting of the Open Rights Group, I've written to Layla Moran about the government's proposed Data Use and Access Bill.

Dear Layla Moran,

I am a resident of Abingdon, getting in touch with you as the Liberal Democrat MP for Oxford West and Abingdon.

I am writing to you with concerns about the Data Use and Access Bill. The Bill will have its second reading in the House of Commons on Wednesday, 12 February.

This Bill threatens to undo data protections that have safeguarded people's privacy and security for years. It is a significant move away from the EU's GDPR model towards a weaker, US-style system. This dangerous step could leave individuals vulnerable to data exploitation and harm. Businesses depend on a shared set of data protection standards under a common Human Rights framework with our closest trading partners. If we allow this to happen, we not only risk the UK's adequacy agreement with the EU—which is crucial for trade—but, more importantly, erode the fundamental rights of millions of people.

I want to live in a country where personal information is handled respectfully, not where corporations and the government can make decisions about us behind closed doors, using algorithms we can't possibly hope to understand. This Bill weakens key protections and removes essential safeguards, leaving people with little recourse if they are unfairly treated. Have we not learnt anything from recent scandals such as the Horizon Computer system scandal with postmasters? I urge you to stand against these dangerous provisions and fight for amendments that protect people's rights.

1. Safeguarding Rights Against Automated Decision-Making and AI

This Bill dismantles the right not to be subject to fully automated decisions (Clause 80), meaning AI systems could make significant choices about people's lives—including their job prospects, credit eligibility, or access to public services. We already know AI systems can reinforce discrimination and errors, so why weaken protections when we should strengthen them?

Recommendation: Drop Clause 80 and expand Article 22 protections to include increased safeguards around automated decision-making. This helps to ensure a safer roll-out of the expanding role of AI in Government.

2. Preventing Unaccountable Ministerial Powers Over Data Protection.

This Bill grants new secondary legislation powers to the Secretary of State (Clauses 70, 71, 74, 80, 85, and Schedule 7), allowing them to override protections with the limited scrutiny Statutory Instruments receive. This is an affront to democratic accountability. Open Rights Group warns in its latest report how this power could undermine integrity in UK elections, as any party in power could, at very short notice, change the rules on how they can use voters' data in elections. Also, recent events in the United States have shown how access to Government data can become a battlefield in the event of a constitutional crisis: giving unaccountable powers to the Secretary of State is a dangerous proposition that unnecessarily exposes us to the risk of seeing data we gave to a local authority or department being abused via Statutory Instruments.

Recommendation: Remove these Henry VIII powers and instead define what you want to do with data in primary legislation.

3. Strengthening Accountability in Law Enforcement Data Sharing

The Bill lowers accountability for how police and other authorities access and use people's data. It removes the requirement for law enforcement to consider the impact of data-sharing on individuals (Schedules 4 and 5) and even eliminates the need for police to record why they are accessing a database (Clause 81). This makes abuse and overreach far more likely. The Sarah Everard case, in which police officers were sacked for wrongfully accessing case files, demonstrates why these safeguards are required.

Recommendation: Drop Schedules 4 and 5 and Clause 81 to restore accountability and transparency in law enforcement data use.

4. Addressing the ICO's Failure to Enforce Data Protection Laws.

The Information Commissioner's Office (ICO) is failing in its duty to enforce data protection laws. This is leaving the people you represent in Oxford West & Abingdon – especially the most vulnerable – at risk of harm. Open Rights Group's Alternative Annual Report 2023-24 reveals that the ICO has repeatedly failed to act against serious breaches, particularly in cases affecting marginalised individuals. Instead of strengthening the ICO, this Bill weakens its independence, allowing more political influence over its work (Clauses 90 and 91, Schedule 14). This must not be allowed to happen.

Recommendation: Strengthen the ICO's independence by removing ministerial interference and increasing its enforcement powers. Also, the ICO's use of 'reprimands' should be limited, as these are ineffective methods of enforcing data protection laws.

Beyond individual rights, this Bill has significant economic implications. Losing the EU's adequacy agreement would be devastating to UK businesses, potentially costing between £1 billion and £1.6 billion in compliance costs and lost opportunities (New Economics Foundation and UCL European Institute). This is an unnecessary, self-inflicted wound that Parliament must prevent.

We need a data protection system that works for people—not just corporations. Please advocate for your constituents' rights by opposing these damaging provisions and working towards amendments that will restore public trust and accountability.

Yours sincerely,
Ray Corrigan

Tuesday, September 30, 2014

Which rights should we discard?

As the Tories launch into their latest 'human rights are the root of all evil' fest, at their last annual conference before the next general election, I'd like to ask David Cameron and his party colleagues a question posed by the late Lord Bingham, relating to the rights laid out in the Charter of Fundamental Rights of the European Union,
"Which of these rights, I ask, would we wish to discard? Are any of them trivial,
superfluous, unnecessary? Are any of them un-British?"
Just to be clear which of -
  • Human dignity? (article 1)
  • Right to life? (article 2)
  • Right to the integrity of the person? (article 3)
  • Prohibition of torture and inhuman or degrading treatment or punishment? (article 4)
  • Prohibition of slavery and forced labour? (article 5)
  • Right to liberty and security? (article 6)
  • Respect for private and family life? (article 7)
  • Protection of personal data? (article 8)
  • Right to marry and right to found a family? (article 9)
  • Freedom of thought, conscience and religion? (article 10)
  • Freedom of expression and information? (article 11)
  • Freedom of assembly and of association? (article 12)
  • Freedom of the arts and sciences? (article 13)
  • Right to education? (article 14)
  • Freedom to choose an occupation and right to engage in work? (article 15)
  • Freedom to conduct a business? (article 16)
  • Right to property? (article 17)
  • Right to asylum? (article 18)
  • Protection in the event of removal, expulsion or extradition? (article 19)
  • Equality before the law? (article 20)
  • Non-discrimination? (article 21)
  • Cultural, religious and linguistic diversity? (article 22)
  • Equality between men and women? (article 23)
  • The rights of the child? (article 24)
  • The rights of the elderly? (article 25)
  • Integration of persons with disabilities? (article 26)
  • Solidarity? (articles 27 to 38)
  • Citizens' rights? (articles 39 to 46)
  • Right to an effective remedy and to a fair trial? (article 47)
  • Presumption of innocence and right of defence? (article 48)
  • Principles of legality and proportionality of criminal offences and penalties? (article 49)
  • Right not to be tried or punished twice in criminal proceedings for the same criminal offence? (article 50)
  • General provisions (articles 51 to 54)
- would the main party of government wish to discard? Are any of these rights trivial, superfluous, unnecessary? Are any of them un-British?

Friday, September 26, 2014

Generations

The mindset of FBI director, James Comey, appears to be that of someone who believes the lives of all citizens should be on permanent display for government inspection and approval. Privacy, remember, is only for criminals. Apart from the fact that this won't be the first or last time that Apple or other commercial enterprise overstates the potential efficacy of its product features in any particular context, Mr Comey's belief that he has the right, nay, duty to berate the company for its modest implementation of privacy enhancing technology is not one I can share.

It's not a big logical leap for a society that normalises mass collection, processing, analysis and storage of communications (metadata and content, even if the distinction is no longer clear) to consider that those who would wish to opt out, or those that would help them do so, should be considered anomalous and suspicious.

It's not a big logical leap to say we'll collect all the information but don't worry it's only "seen" by the computer, not real people, so it's not real surveillance; and it will only be looked at for evidence of wrongdoing by bad guys... or those linked to bad guys... or those linked to those linked to bad guys...

It's not a big logical leap to say we're not collecting enough information.

Everything will be easier or better or more efficient if only we collect more.

Big data is the future of commerce and government.

We already have mass telephone and internet interception, let's install CCTV cameras in all homes. Oh yes, people already have webcams attached to their computers and we have already collected millions of users' Yahoo webcam images. But that's only in the rooms with computers and the cameras don't cover all corners of the rooms. And, after all, the footage will only be collected and "seen" by computers, not real people, so it's not real surveillance.

And we don't have to record the audio. At least at first. So we don't know what people are actually saying to each other. So it's only metadata. We know where they are and who they are talking to and for how long but not what they are actually saying to each other.  If we surreptitiously use lip reading programs that's only to detect serious criminality.  Ah, you know what, we have to protect people so it doesn't matter if we record the audio too. It will only be 'seen' by computers.

And if you have nothing to hide you have nothing to worry about.

It's not a big logical leap to say, you know all that information in all those giant databases? We should be using it not just to catch terrorists but to catch serious criminals too.

It's not a big logical leap to say we shouldn't just limit it to these serious criminals. We need to apply this information to illegal immigrants, benefit scroungers, criminal youths, burglars, petty criminals, drunkards, louts, vandals, offensive people, inner city sink estates with unruly families who may cause trouble, ethnic minority communities which may have a link to a religion, extreme forms of which may be cited as an excuse for murder and mayhem, Romanians (if UKIP ever get a say), the poor, disabled or sick or elderly...

It's not a big logical leap to say we need to use this information to improve public services -
  • the NHS
  • education
  • social welfare
  • policing and criminal justice
  • intelligence
  • defence
  • foreign affairs
  • economy
So people whose HSCIC database records indicate a genetic predisposition towards certain kinds of cancer may be required to take out private health insurance, just in case they become a disproportionate drain on the NHS. Or kids with mental health, special educational or social needs may be excluded from good schools so they don't disrupt the normal kids.

Even the most dedicated of public servants, and I have the privilege to know a lot of them, heavily under-resourced and under relentless pressure for outcomes, from management or politicians of varying levels and competence, succumb, with the best of intentions, to the temptation of mission or function creep.

It's not a big logical leap to convince ourselves it is ok to use data or facilities gathered or created for one purpose for an additional 'useful' or convenient or target-facilitating purpose.

Dedicated people in all walks of life bend/break/ignore/re-interpret/renew the rules because everyone else is doing it, so it's not a big deal.

But this combination of the normalisation of mass surveillance, function creep and (sometimes) well intentioned rule making and rule breaking create the conditions for the poisonous snakes in suits of the world to thrive. And it only takes a handful of them in the wrong places to cause widespread misery and abuse of human rights.

The sad thing is we can and sometimes do design, build, operate and use communications systems, computers and big data in socially, economically and legally enlightened ways, in the public good, when we do so intelligently and ethically. But we're often failing to do so. The seductive attractions of convenience, instant gratification and the ease and power with which all personal data can be collected, processed and stored (the economic agents doing the collection can figure out how to exploit/use/monetize it later), beat intelligence, ethics and the public good, every time.

I was chatting with my elder teen late one evening this week about some of this. I expressed a concern that he and his kids would be asking me, in my dotage, what the hell I thought I was doing when we were building the communications infrastructure of a surveillance state. After all, all that is required for evil to prosper is for good people to do nothing.

"Dad," says he, "I can guarantee that's one question I'll never have to ask you."

I wasn't sure whether to be flattered that he over-estimates the efficacy of my efforts in this landscape, sad that he may change his mind, concerned that I've burdened him with my worries, or optimistic that he and his generation are smarter than me and mine; and that they will put those smarts to good effect in building a more equitable and enlightened world.

Sunday, May 18, 2014

The Clarkson crisis and mass surveillance

I will try and find some time to consider in detail and blog about the European Court of Justice decision imposing an obligation on Google to make an effort to respect what many are calling 'the right to be forgotten.'

Firstly though, on a parallel theme of our recorded digital pasts returning to haunt us, could I point you at an edited version of some thoughts I had on the recent crises Jeremy Clarkson found himself embroiled in, that the very good folks at The Conversation kindly published earlier this week. A more detailed edition of those thoughts resides below.

I see Jeremy Clarkson is in the soup again for saying the wrong thing. This time he's accused of using the reviled, offensive, racist N word, in a Top Gear out-take two years ago. The usual gang of anti-Clarksonites and more than a few others have lined up to demand the BBC fire him. Perhaps surprisingly members of the government and some in the media not otherwise known as Clarkson fans have offered him qualified support.

Elsewhere various sexists, racists, homophobes, hatemongers and other assorted flavours of humanity that dislike people not like them are attracting the attention of the news media and political opponents for being associated in some way with UKIP. The Prime Minister David Cameron has been condemned for saying recently Britain is a Christian country.

The thing is, respect for the principle of freedom of expression means letting people we disagree with speak. It means letting people who say offensive things speak. It means letting people who say nasty, unpleasant, unsavoury, distasteful, dreadful, objectionable, idiotic, mean, poisonous, hostile, malignant things speak.  It means letting people who mumble casual blokey racist comments, in ill-judged attempts at humour, speak.

Letting people speak doesn't mean we have to listen to them. It doesn't mean we have provide them with a platform to speak. It doesn't mean the media is obligated to draw attention to them. And it doesn't mean we have to laugh with them in a way that encourages casual blokey offensiveness.

I fully accept Deborah Lipstadt's mantra that 'Reasoned dialogue has a limited ability to withstand an assault by the mythic power of falsehood' (p.25, Denying the Holocaust – a wonderful book btw). But when destructive speech does take hold we have to counteract it. We must be better at explaining, in widely accessible and persuasive ways, why hate speech is so harmful, pernicious and noxious. And we must expose the falsehoods and the malign intent and/or ignorance underlying it intelligently, accessibly, in publicly appealing ways, and preferably backed up with solid evidence.

The UK Human Rights Act makes the European Convention on Human Rights part of UK law. Article 10 of the Convention says everyone has the right to freedom of expression. We have the legal right to freedom of expression in the UK. As a member of the EU, we also have the fundamental right to freedom of expression guaranteed by Article 11 of the Charter of Fundamental Rights of the EU.

To make life complicated, in the UK there are also criminal offences relating to offending or insulting someone, under a variety of statutes including s127 of the Communications Act 2003 and s4A and s5 of the Public Order Act 1986.

A number of social network users have found this out the hard way, most notably Paul Chambers of Twitter joke trial fame. Mr Chambers was convicted of sending, by a public electronic communications network, a message of a "menacing character", contrary to sections 127(1)(a) and (3) of the Communications Act 2003. He had joked on Twitter about blowing up Robin Hood airport after his flight to see his girlfriend was cancelled due to snow. He lost his job, and another thereafter, subsequently found it difficult to get work, and it took two and a half years of legal wrangling and appeals before the High Court finally cleared his name.

The media, the public and public figures, we all love a good witch-hunt, as long as we are not the object of the hunt. Soundbite politics, the 24/7 news cycle and our world of short attention spans see words and phrases taken out of context and wielded as weapons to demonise and misrepresent opponents, shout insults past each other, blame and preferably punish someone. Public debate can't get past megaphone soundbites of the 'we're the goodies they're the baddies' variety.  This is an arena that is positively hostile to deep and informed engagement with any subject matter but a fertile place for mob rule.

Could any of us withstand the kind of scrutiny to which Mr Clarkson's misspoken offence, recognised at the time but resurrected two years later, or Mr Chambers' Twitter joke was subjected? Well, to be blunt, we are going to have to.

Why?

Well for the best part of the past 25 years commercial entities have been recording, storing, processing and analysing everything we see and do on the world wide web, for how long, from where, with whom and with what equipment. Additionally telecommunications service providers, both fixed line and mobile, have been obliged for some time, under the 2006 EU data retention directive, to store details of and provide government access to everything everyone does on the telephone or internet; for a period of between 6 months and two years.

Invisible digital watchers follow and record everything we do on digital communications networks without our conscious knowledge or consent.

Article 5 of the 2006 directive specifies the data to be gathered by communications service providers throughout the EU. It covers names, addresses, who spoke to whom, where, when, for how long, on what device, how often, websites visited etc. This all paints a very detailed picture, and most people don’t know it has been going on. The who, where, why, how, what and when of individual lives is all there in this 'metadata.'

We've also discovered in the past year, via the revelations of former NSA contractor Edward Snowden, that governments, in particular the UK and US varieties, have been going much further, watching and recording our networked lives in even more detail than previously realised – if we thought about it at all, which most of us don't. Through clandestine programs like GCHQ's 'Tempora' and the NSA's 'PRISM', all telephone and internet traffic is being collected, processed and stored, nominally for current or potential future use in the fight against terrorism or serious crime.

Anyone's complete online life history can be examined in forensic detail even though commerce and governments could not possibly examine everyone's life in detail. The UK intelligence services collect about 40 billion pieces of data per day, for example, and simply do not have the capacity to apply human intelligence to all of it.

Just one of the problems with these mass commercial and governmental silos of personal digital life histories is that small items taken out of context can constitute unexploded digital ordnance, equivalent to the two year old misdemeanour of Jeremy Clarkson. Most of us don't have the public profile of Mr Clarkson, or the interest of the public to anything like the same degree. But as Cardinal Richelieu is rumoured to have said nearly 400 years ago, "Give me six lines written by the most honest man and I'll show you the evidence to hang him."

Innocent ordinary people, not just celebrities of Mr Clarkson's ilk, have found themselves at the sharp end of media witch hunts. And which of us knows what nefarious activities people connected to people connected to people connected to us via the internet might have engaged in at some time in the past, or might engage in in the future? I ask that particular question because the then deputy director of the NSA, Chris Inglis, testified before Congress, in July 2013, that you don't need to be a suspected bad guy to gain the attention of the intelligence services. The NSA tracks people "three hops" from their targets. If I had communicated with 200 people during my online lifetime, I'd be three hops away from over 5 million people. Through my job at the Open University alone I've interacted directly with thousands of people over the past nineteen years. Three hops from thousands connects me to more than the entire population of the world.
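The scale of the "three hops" rule is easy to verify with back-of-the-envelope arithmetic. A short sketch in Python (the 200 contacts per person is the illustrative assumption from the paragraph above, and the count ignores overlapping contacts, so it is an upper bound):

```python
# "Three hops" arithmetic: if each person has on average `contacts`
# distinct contacts, each hop multiplies the reachable population by
# that factor. Overlaps are ignored, so this is an upper bound.

def reachable(contacts, hops):
    """People reachable within `hops` hops of a single target."""
    return sum(contacts ** h for h in range(1, hops + 1))

# 200 contacts, three hops: 200 + 40,000 + 8,000,000 people --
# comfortably "over 5 million", as claimed above.
print(reachable(200, 3))  # -> 8040200

# With a few thousand direct contacts, three hops exceeds the
# population of the planet.
print(reachable(2000, 3) > 8_000_000_000)  # -> True
```

Even allowing for heavy overlap between people's contact lists, the geometric growth is the point: three hops from an ordinary, well-connected person sweeps in a population measured in millions.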

I think it is fair to call this mass surreptitious collection of personal data mass surveillance.

Interestingly enough, in a historic decision on 8 April 2014, the Grand Chamber of the Court of Justice of the European Union hinted at the same conclusion when it decided to invalidate the 2006 data retention directive discussed above. With what may be interpreted as half an eye on the Edward Snowden revelations, the Court effectively condemned pre-emptive, suspicionless, warrantless mass surveillance and the consequent "interference with the fundamental rights of practically the entire European population".

The case was the first major court decision on mass surveillance since the Snowden stories started to break in June 2013, though high courts in Romania (2009), Germany (2010), Bulgaria (2010), the Czech Republic (2011) and Cyprus (2011) had previously all declared the data retention directive unconstitutional and/or a disproportionate, unjustified interference with the fundamental rights to privacy, free speech and confidentiality of communications.

On 23 April 2014, the Slovak Constitutional Court, taking its lead from the Court of Justice, suspended the Slovak implementation of the directive. The UK government, by contrast, has declared that the UK data retention regulations remain in force, despite the directive that required them no longer being so. The Home Secretary, Theresa May, has stated elsewhere that the implications of the ECJ ruling were being assessed. For the first time, on 9 May 2014, a UK parliamentary committee expressed concern at the oversight of the security and intelligence agencies in this context and asked for a prompt and clear resolution of the legal position on data retention.

The previous UK Labour government was one of the key driving forces behind the original implementation of the data retention directive. The current UK government is one of the biggest cheerleaders for, and operators of, mass surveillance standards and practices. Though the UK government was not involved directly in the case (and is scrambling madly to find a way to circumvent the decision as, sadly, is the European Commission), both the current and the previous administrations' behaviour, in the data retention context, is considered so heinous in law that it should never have happened; and the laws facilitating that behaviour should never have existed.

Some commentators have also suggested the Court was firing a message not just to the UK but across the pond (2 min 40sec audio) to the effect that US mass surveillance standards are totally unacceptable in an EU context.

Now we come full circle to the Clarkson furore. In their data retention decision, in passing (also known as 'obiter dicta'), the Court of Justice of the EU noted, in paragraphs 27 and 28 of the decision, the chilling effect of the knowledge that anything we say or do is being recorded and may be used against us -
"Those data, taken as a whole, may allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as the habits of everyday life, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them.
In such circumstances... it is not inconceivable that the retention of the data in question might have an effect on the use, by subscribers or registered users, of the means of communication covered by that directive and, consequently, on their exercise of the freedom of expression"
I don't find casual laddish racist remarks at all funny. I find them offensive. Just as I find offensive casual blokey demonisation and marginalisation – and, more particularly, intentional, vicious insult, dismissal and incitement of hatred – directed at [minority group of choice]. It causes division, discrimination and tension and undermines equality, human rights, decency and collective care.

But Mr Clarkson misspoke, by accident, 2 years ago, when doing a recording for a popular TV programme. The trademark of said programme is three middle aged men, acting like big kids, mucking about with cars, playing pranks and laddishly insulting each other and other people and things for laughs.

Mr Clarkson has apologised for using a word he personally loathes. The motives of those who leaked the recording are not known.

I have no idea whether Mr Clarkson is racist, though I suspect not. Intended or not, ill-used words do cause damage, but it is the presence or absence of hateful intent behind such remarks, rather than the words used, that defines the mindset of the speaker. We can't read minds, so we interpret that intent, by proxy, from people's words.

Nevertheless, I would ask those who wish to throw metaphorical stones at Mr Clarkson to think also of their own many stored and detailed digital dossiers, and how fragments thereof might well, one day, be held against them. Especially if, like a certain Open University academic, they have a 3 hop connection to the population of the world.