Friday, February 07, 2025

Privacy concerns about the Data Use and Access Bill

At the prompting of the Open Rights Group, I've written to Layla Moran about the government's proposed Data Use and Access Bill.

Dear Layla Moran,

I am a resident of Abingdon getting in touch with you as the Liberal Democrat MP for Oxford West and Abingdon.

I am writing to you with concerns about the Data Use and Access Bill. The Bill will have its second reading in the House of Commons on Wednesday, 12 February.

This Bill threatens to undo data protections that have safeguarded people's privacy and security for years. It is a significant move away from the EU's GDPR model towards a weaker, US-style system. This dangerous step could leave individuals vulnerable to data exploitation and harm. Businesses depend on a shared set of data protection standards under a common human rights framework with our closest trading partners. If we allow this to happen, we not only risk the UK's adequacy agreement with the EU—which is crucial for trade—but also, more importantly, erode the fundamental rights of millions of people.

I want to live in a country where personal information is handled respectfully, not where corporations and the government can make decisions about us behind closed doors, using algorithms we can't possibly hope to understand. This Bill weakens key protections and removes essential safeguards, leaving people with little recourse if they are unfairly treated. Have we not learnt anything from recent scandals such as the Post Office Horizon IT scandal and the wrongful prosecution of subpostmasters? I urge you to stand against these dangerous provisions and fight for amendments that protect people's rights.

1. Safeguarding Rights Against Automated Decision-Making and AI

This Bill dismantles the right not to be subject to fully automated decisions (Clause 80), meaning AI systems could make significant choices about people's lives—including their job prospects, credit eligibility, or access to public services. We already know AI systems can reinforce discrimination and errors, so why weaken protections when we should strengthen them?

Recommendation: Drop Clause 80 and expand Article 22 protections to include increased safeguards around automated decision-making. This would help ensure a safer roll-out of AI as its role in government expands.

2. Preventing Unaccountable Ministerial Powers Over Data Protection

This Bill grants new secondary legislation powers to the Secretary of State (Clauses 70, 71, 74, 80, 85, and Schedule 7), allowing them to override protections with only the limited scrutiny that Statutory Instruments receive. This is an affront to democratic accountability. Open Rights Group warns in its latest report that this power could undermine the integrity of UK elections, as any party in power could, at very short notice, change the rules on how it can use voters' data in elections. Recent events in the United States have also shown how access to government data can become a battlefield in a constitutional crisis: giving unaccountable powers to the Secretary of State is a dangerous proposition that needlessly exposes us to the risk of data we gave to a local authority or government department being abused via Statutory Instruments.

Recommendation: Remove these Henry VIII powers and instead define what you want to do with data in primary legislation.

3. Strengthening Accountability in Law Enforcement Data Sharing

The Bill lowers accountability for how police and other authorities access and use people's data. It removes the requirement for law enforcement to consider the impact of data-sharing on individuals (Schedules 4 and 5) and even eliminates the need for police to record why they are accessing a database (Clause 81). This makes abuse and overreach far more likely. The Sarah Everard case, in which police officers were sacked for wrongfully accessing case files, demonstrates why these safeguards are required.

Recommendation: Drop Schedules 4 and 5 and Clause 81 to restore accountability and transparency in law enforcement data use.

4. Addressing the ICO's Failure to Enforce Data Protection Laws

The Information Commissioner's Office (ICO) is failing in its duty to enforce data protection laws. This leaves the people you represent in Oxford West and Abingdon—especially the most vulnerable—at risk of harm. Open Rights Group's Alternative Annual Report 2023-24 reveals that the ICO has repeatedly failed to act against serious breaches, particularly in cases affecting marginalised individuals. Instead of strengthening the ICO, this Bill weakens its independence, allowing more political influence over its work (Clauses 90 and 91, Schedule 14). This must not be allowed to happen.

Recommendation: Strengthen the ICO's independence by removing ministerial interference and increasing its enforcement powers. The ICO's use of 'reprimands' should also be limited, as these are an ineffective means of enforcing data protection laws.

Beyond individual rights, this Bill has significant economic implications. Losing the EU's adequacy agreement would be devastating to UK businesses, potentially costing between £1 billion and £1.6 billion in compliance costs and lost opportunities (New Economics Foundation and UCL European Institute). This is an unnecessary, self-inflicted wound that Parliament must prevent.

We need a data protection system that works for people—not just corporations. Please advocate for your constituents' rights by opposing these damaging provisions and working towards amendments that will restore public trust and accountability.

Yours sincerely,
Ray Corrigan

Thursday, November 07, 2024

Note to MP on parliamentary debate on face recognition

At the behest of Big Brother Watch, I've written to my MP, Layla Moran, asking her to participate in the Westminster Hall debate on police use of face recognition technology scheduled in parliament for next Wednesday, 13 November, from 9:30am to 11am.

Dear Layla,

You may recall that I wrote to you last year noting a survey of over 100 MPs, carried out by YouGov on behalf of Privacy International, indicating that 70% of MPs did not know whether facial recognition technology (FRT) was being used in public spaces in their constituency.

On this occasion I am writing to ask you to attend and speak at a debate taking place next Wednesday, 13 November, at 9:30am in Westminster Hall on the police’s use of live facial recognition. The debate is sponsored by Sir John Whittingdale MP, and I would appreciate it if you could raise the concerns I have about the use of this technology and its impact on human rights and civil liberties in the UK.

Live facial recognition works by creating a 'faceprint' of everyone who passes in front of the camera — processing biometric data as sensitive as a fingerprint. This form of surveillance is deeply intrusive, often subjecting many thousands of innocent people to biometric identity checks without justification. Seven police forces around the UK are currently using this technology despite there being no specific law which governs its use, with forces such as Essex Police, the Metropolitan Police and South Wales Police continuing to use LFR on a regular basis in a way that affects millions of us as we live and travel around the UK.

Uses of live facial recognition have resulted in privacy and data breaches, misidentifications, racial discrimination, and a significant chilling effect on freedom of expression and assembly. In 2020, South Wales Police was found to have deployed live facial recognition surveillance unlawfully (Bridges v SWP), and the Metropolitan Police is also facing a judicial review brought by a Black victim of a live facial recognition misidentification that took place earlier this year. Other victims of live facial recognition harassment and errors are initiating legal action in the retail context.

In recent years, parliamentarians across parties in Westminster, members of the Senedd, rights and equalities groups and technology experts across the globe have called for a stop to the use of this technology. The only detailed inquiry into the use of live facial recognition by a parliamentary committee called for a stop to its use. The Equality and Human Rights Commission (EHRC) called for the UK Government to suspend live facial recognition in 2020.

Governments across the democratic world are legislating to ban and significantly restrict the use of live facial recognition surveillance, for both law enforcement and private companies – but successive governments have left the UK behind. The EU’s AI Act will introduce an almost total ban on the use of live facial recognition, with law enforcement exceptions for only the most serious crimes and on the condition of prior judicial approval for each deployment. In the US, multiple states have banned law enforcement from using the technology entirely. The UK risks becoming an outlier in the democratic world, instead following the approach of countries like Russia and China, which have heavily invested in this technology to the detriment of their citizens’ rights and freedoms.

The legal vacuum when it comes to the police’s use of live facial recognition technology cannot continue. The UK must follow the example set by European states and introduce stringent restrictions on the use of this surveillance technology akin to the EU’s AI Act. For more information, please contact Big Brother Watch at info@bigbrotherwatch.org.uk.

In addition to police use of such technologies, their deployment is increasingly expanding into the retail sector and other areas. The local Budgens supermarket has recently installed a face recognition system. Having used that shop for 25 years, I intend never to cross their threshold again. For ordinary citizens to have to exclude themselves from routine activities and locations they have frequented for generations, simply to avoid these intrusive surveillance technologies, is not conducive to the maintenance of a healthy society.

As my MP I would appreciate it if you could attend next Wednesday’s debate and raise my concerns.

I look forward to hearing from you.

Kind regards,

Ray Corrigan

Friday, September 20, 2024

Proposed reforms to the National Planning Policy Framework and other changes to the planning system

I've responded to the government consultation on Proposed reforms to the National Planning Policy Framework and other changes to the planning system via Foxglove.

Dear Planning Policy Consultation Team,

I am an individual responding to the government’s consultation on ‘Proposed reforms to the National Planning Policy Framework and other changes to the planning system’ (question 64).

Thank you for giving the British public the opportunity to have their say on these plans.

I want to register my concerns as an individual about the potentially devastating impact of new data centres on the environment, on water and electricity production and use, and on the UK’s transition to Net Zero and green energy.

The companies building these data centres are some of the richest on the planet. The government must make their involvement conditional on their investment in green energy, ensuring they give back more to the UK and to the environment than they take.

The impacts can and should be mitigated by the planning application process in the following straightforward and commonsense ways:

1. Companies building new data centres must be able to show that the centre will have no negative impact on our targets to avoid climate catastrophe, including Net Zero by 2050.

2. Companies building new data centres must be able to show that the centre will not place significant additional burdens on our power and water supply.

3. Companies building new data centres should provide a plan explaining how they will sustainably meet the power and water needs of the centre without burdening the UK’s existing power and water supplies. Any costs associated with the additional energy or water requirements of a new data centre must be covered by the company, not by British taxpayers.

Thank you, 

Ray Corrigan

Friday, July 19, 2024

Twitter has locked me out...

... and I'm not sure I want to even bother giving Mr Musk a fictional age to be allowed back in.