At the prompt of the Open Rights Group, I've written to Layla Moran about the government's proposed Data Use and Access Bill.
Dear Layla Moran,
I am a resident of Abingdon getting in touch with you as the Liberal Democrat MP for Oxford West and Abingdon.
I am writing to you with concerns about the Data Use and Access Bill. The Bill will have its second reading in the House of Commons on Wednesday, 12 February.
This Bill threatens to undo data protections that have safeguarded people's privacy and security for years. It is a significant move away from the EU's GDPR model towards a weaker, US-style system. This dangerous step could leave individuals vulnerable to data exploitation and harm. Businesses depend on a shared set of data protection standards under a common Human Rights framework with our closest trading partners. If we allow this to happen, we not only risk the UK's adequacy agreement with the EU—which is crucial for trade—but, more importantly, erode the fundamental rights of millions of people.
I want to live in a country where personal information is handled respectfully, not one where corporations and the government can make decisions about us behind closed doors, using algorithms we cannot possibly hope to understand. This Bill weakens key protections and removes essential safeguards, leaving people with little recourse if they are unfairly treated. Have we learnt nothing from the Horizon IT scandal and its devastating consequences for sub-postmasters? I urge you to stand against these dangerous provisions and fight for amendments that protect people's rights.
1. Safeguarding Rights Against Automated Decision-Making and AI
This Bill dismantles the right not to be subject to fully automated decisions (Clause 80), meaning AI systems could make significant choices about people's lives—including their job prospects, credit eligibility, or access to public services. We already know AI systems can reinforce discrimination and errors, so why weaken protections when we should strengthen them?
Recommendation: Drop Clause 80 and expand Article 22 protections to include increased safeguards around automated decision-making. This helps to ensure a safer roll-out of the expanding role of AI in Government.
2. Preventing Unaccountable Ministerial Powers Over Data Protection
This Bill grants new secondary legislation powers to the Secretary of State (Clauses 70, 71, 74, 80, 85, and Schedule 7), allowing them to override protections with only the limited scrutiny that Statutory Instruments receive. This is an affront to democratic accountability. Open Rights Group warns in its latest report that this power could undermine the integrity of UK elections: any party in power could, at very short notice, change the rules on how it may use voters' data. Recent events in the United States have also shown how access to government data can become a battlefield in a constitutional crisis. Giving unaccountable powers to the Secretary of State is a dangerous proposition that needlessly exposes us to the risk of data we gave to a local authority or government department being abused via Statutory Instruments.
Recommendation: Remove these Henry VIII powers and instead define what may be done with data in primary legislation.
3. Strengthening Accountability in Law Enforcement Data Sharing
The Bill lowers accountability for how police and other authorities access and use people's data. It removes the requirement for law enforcement to consider the impact of data-sharing on individuals (Schedules 4 and 5) and even eliminates the need for police to record why they are accessing a database (Clause 81). This makes abuse and overreach far more likely. The Sarah Everard case, in which police officers were sacked for wrongfully accessing case files, demonstrates why these safeguards are required.
Recommendation: Drop Schedules 4 and 5 and Clause 81 to restore accountability and transparency in law enforcement data use.
4. Addressing the ICO's Failure to Enforce Data Protection Laws
The Information Commissioner's Office (ICO) is failing in its duty to enforce data protection laws. This leaves the people you represent in Oxford West and Abingdon, especially the most vulnerable, at risk of harm. Open Rights Group's Alternative Annual Report 2023-24 reveals that the ICO has repeatedly failed to act against serious breaches, particularly in cases affecting marginalised individuals. Instead of strengthening the ICO, this Bill weakens its independence, allowing more political influence over its work (Clauses 90 and 91, Schedule 14). This must not be allowed to happen.
Recommendation: Strengthen the ICO's independence by removing ministerial interference and increasing its enforcement powers. The ICO's reliance on 'reprimands' should also be limited, as these are an ineffective means of enforcing data protection laws.
Beyond individual rights, this Bill has significant economic implications. Losing the EU's adequacy agreement would be devastating to UK businesses, potentially costing between £1 billion and £1.6 billion in compliance costs and lost opportunities (New Economics Foundation and UCL European Institute). This is an unnecessary, self-inflicted wound that Parliament must prevent.
We need a data protection system that works for people—not just corporations. Please advocate for your constituents' rights by opposing these damaging provisions and working towards amendments that will restore public trust and accountability.
Yours sincerely,
Ray Corrigan