Thursday, February 13, 2025

Lib Dem position on Data Use & Access Bill

I've had a response from the Liberal Democrat MP, Layla Moran, in relation to concerns I raised about the proposed DUA Bill.

Dear Ray,

Thank you for writing to me about the Data (Use and Access) Bill.

We welcome the omission of many of the more objectionable elements of the previous Data Protection and Digital Information Bill, which was introduced by the previous Conservative government but fell when the General Election was called. I spoke in the second reading debate of that Bill and you can read my speech here.

Despite these changes, the retention and enhancement of public trust in data use and sharing remain a major issue in the Bill. The focus on smart data and the sharing of government data means that the Government must do more to educate the public about how and where our data is used and what powers individuals have to find out this information.

There are still major changes proposed to the GDPR, for example as regards police duties and automated decision-making, which make retention of data adequacy for the purposes of digital trade with the EU of the utmost priority in considering any changes.

We continue to believe that the GDPR/Data Protection Act 2018 is not in need of fundamental reform; rather, where there is any ambiguity or difficulty in interpretation, clarifications incorporating the relevant recitals to the GDPR should be made in the legislation and in improved guidance.

We are also supportive of Baroness Kidron’s amendments to strengthen the rights of the creative industry to challenge generative AI business practices which seek to train models on their work. This is a key campaigning area for the Liberal Democrats, as we believe existing copyright law should be enforced to protect the UK’s creative industries – a world-leading British export.

The Government, like the last one, continues to claim huge benefits, nearly £10bn, from reforms in the bill. We are very sceptical about this and will continue to scrutinise how these changes will impact business.

Liberal Democrats believe that the UK should be broadly aligned with the forthcoming EU AI legislation in terms of safeguards. Whilst we agree that Article 22 lacks clarity, we strongly disagree that it should be amended to limit its effect. We advocate amendment to strengthen its reach so that it applies to predominantly automated processing as well.

Individuals must be allowed to request information about algorithmic decisions which affect them and about the data on which those decisions are based.

It will be important that the public understand when and how AI intersects with their lives and what control they have over the process. Transparent and digestible explanations of AI systems and where they are employed will be helpful in demystifying the use of AI and promoting long-term trust in society.

The Liberal Democrats’ Science, Technology, and Innovation spokesperson Victoria Collins spoke in the second reading debate and you can read her speech here.

Overall, the Liberal Democrats support a modernised data framework that upholds digital rights while stimulating innovation. Our vision is of a digital future that spreads the benefits of technology across society while protecting fundamental liberties. However, we do need to see better safeguards in this Bill to protect fundamental rights and embed proper scrutiny. We will work with colleagues across the House, as well as with civil society organisations such as Big Brother Watch, Liberty and the Ada Lovelace Institute, to ensure that those important protections are not overlooked.

Thanks again for writing to me about this incredibly important issue.
 

Best wishes, 

Layla

Layla Moran
Liberal Democrat Member of Parliament for Oxford West & Abingdon

Friday, February 07, 2025

Privacy concerns about the Data Use and Access Bill

At the prompting of the Open Rights Group, I've written to Layla Moran about the government's proposed Data Use and Access Bill.

Dear Layla Moran,

I am a resident of Abingdon, getting in touch with you as the Liberal Democrat MP for Oxford West and Abingdon.

I am writing to you with concerns about the Data Use and Access Bill. The Bill will have its second reading in the House of Commons on Wednesday, 12 February.

This Bill threatens to undo data protections that have safeguarded people's privacy and security for years. It is a significant move away from the EU's GDPR model towards a weaker, US-style system. This dangerous step could leave individuals vulnerable to data exploitation and harm. Businesses depend on a shared set of data protection standards under a common Human Rights framework with our closest trading partners. If we allow this to happen, we not only risk the UK's adequacy agreement with the EU—which is crucial for trade—but, more importantly, erode the fundamental rights of millions of people.

I want to live in a country where personal information is handled respectfully, not where corporations and the government can make decisions about us behind closed doors, using algorithms we can't possibly hope to understand. This Bill weakens key protections and removes essential safeguards, leaving people with little recourse if they are unfairly treated. Have we not learnt anything from recent scandals, such as the Horizon computer system scandal affecting postmasters? I urge you to stand against these dangerous provisions and fight for amendments that protect people's rights.

1. Safeguarding Rights Against Automated Decision-Making and AI

This Bill dismantles the right not to be subject to fully automated decisions (Clause 80), meaning AI systems could make significant choices about people's lives—including their job prospects, credit eligibility, or access to public services. We already know AI systems can reinforce discrimination and errors, so why weaken protections when we should strengthen them?

Recommendation: Drop Clause 80 and expand Article 22 protections to include increased safeguards around automated decision-making. This would help to ensure a safer roll-out of AI as its role in Government expands.

2. Preventing Unaccountable Ministerial Powers Over Data Protection

This Bill grants new secondary legislation powers to the Secretary of State (Clauses 70, 71, 74, 80, 85, and Schedule 7), allowing them to override protections with the limited scrutiny Statutory Instruments receive. This is an affront to democratic accountability. Open Rights Group warns in its latest report how this power could undermine integrity in UK elections, as any party in power could, at very short notice, change the rules on how they can use voters' data in elections. Also, recent events in the United States have shown how access to Government data can become a battlefield in the event of a constitutional crisis: giving unaccountable powers to the Secretary of State is a dangerous proposition that unnecessarily exposes us to the risk of seeing data we gave to a local authority or department being abused via Statutory Instruments.

Recommendation: Remove these Henry VIII powers and instead define what you want to do with data in primary legislation.

3. Strengthening Accountability in Law Enforcement Data Sharing

The Bill lowers accountability for how police and other authorities access and use people's data. It removes the requirement for law enforcement to consider the impact of data-sharing on individuals (Schedules 4 and 5) and even eliminates the need for police to record why they are accessing a database (Clause 81). This makes abuse and overreach far more likely. The example of the Sarah Everard case, in which police officers were sacked for wrongfully accessing case files, demonstrates why these safeguards are required.

Recommendation: Drop Schedules 4 and 5 and Clause 81 to restore accountability and transparency in law enforcement data use.

4. Addressing the ICO's Failure to Enforce Data Protection Laws

The Information Commissioner's Office (ICO) is failing in its duty to enforce data protection laws. This is leaving the people you represent in Oxford West & Abingdon, especially the most vulnerable, at risk of harm. Open Rights Group's Alternative Annual Report 2023-24 reveals that the ICO has repeatedly failed to act against serious breaches, particularly in cases affecting marginalised individuals. Instead of strengthening the ICO, this Bill weakens its independence, allowing more political influence over its work (Clauses 90 and 91, Schedule 14). This must not be allowed to happen.

Recommendation: Strengthen the ICO's independence by removing ministerial interference and increasing its enforcement powers. Also, the ICO's use of 'reprimands' should be limited, as these are ineffective methods of enforcing data protection laws.

Beyond individual rights, this Bill has significant economic implications. Losing the EU's adequacy agreement would be devastating to UK businesses, potentially costing between £1 billion and £1.6 billion in compliance costs and lost opportunities (New Economics Foundation and UCL European Institute). This is an unnecessary, self-inflicted wound that Parliament must prevent.

We need a data protection system that works for people—not just corporations. Please advocate for your constituents' rights by opposing these damaging provisions and working towards amendments that will restore public trust and accountability.

Yours sincerely,
Ray Corrigan

Thursday, November 07, 2024

Note to MP on parliamentary debate on face recognition

At the behest of Big Brother Watch, I've written to my MP, Layla Moran, asking her to participate in the Westminster Hall debate on police use of face recognition technology scheduled in parliament for next Wednesday, 13 November, from 9:30am to 11am.

Dear Layla,

You may recall that I wrote to you last year noting a survey of over 100 MPs, carried out by YouGov on behalf of Privacy International, indicating that 70% of MPs did not know whether facial recognition technology (FRT) was being used in public spaces in their constituency.

On this occasion I am writing to ask you to attend and speak at a debate taking place next Wednesday, 13th November, at 9:30 am in Westminster Hall on the police's use of live facial recognition (LFR). The debate is sponsored by Sir John Whittingdale MP and I would appreciate it if you could raise the concerns I have about the use of this technology and its impact on human rights and civil liberties in the UK.

Live facial recognition works by creating a 'faceprint' of everyone who passes in front of the camera — processing biometric data as sensitive as a fingerprint. This form of surveillance is deeply intrusive, often subjecting many thousands of innocent people to biometric identity checks without justification. Seven police forces around the UK are currently using this technology despite there being no specific law which governs its use, with forces such as Essex Police, the Metropolitan Police and South Wales Police continuing to use LFR on a regular basis in a way that affects millions of us as we live and travel around the UK.

Uses of live facial recognition have resulted in privacy and data breaches, misidentifications, racial discrimination, and a significant chilling effect on freedom of expression and assembly. In 2020, South Wales Police was found to have deployed live facial recognition surveillance unlawfully (Bridges v SWP) and the Metropolitan Police is also facing a judicial review brought by a Black victim of live facial recognition misidentification which took place earlier this year. Other victims of live facial recognition harassment and errors are initiating legal action in the retail context.

In recent years, parliamentarians across parties in Westminster, members of the Senedd, rights and equalities groups and technology experts across the globe have called for a stop to the use of this technology. The only detailed inquiry into the use of live facial recognition by a parliamentary committee called for a stop to its use. The Equality and Human Rights Commission (EHRC) called for the UK Government to suspend live facial recognition in 2020.

Governments across the democratic world are legislating to ban and significantly restrict the use of live facial recognition surveillance, for both law enforcement and private companies – but successive governments have left the UK behind. The EU’s AI Act will introduce an almost total ban on the use of live facial recognition, with law enforcement exceptions for only the most serious crimes and on the condition of prior judicial approval for each deployment. In the US, multiple states have banned law enforcement from using the technology entirely. The UK risks becoming an outlier in the democratic world, instead following the approach of countries like Russia and China, which have heavily invested in this technology to the detriment of their citizens’ rights and freedoms.

The legal vacuum when it comes to the police’s use of live facial recognition technology cannot continue. The UK must follow the example set by European states and introduce stringent restrictions on the use of this surveillance technology akin to the EU’s AI Act. For more information, please contact Big Brother Watch at info@bigbrotherwatch.org.uk

In addition to police use of such technologies, there is an increasing expansion of their deployment in the retail sector and other areas. The local Budgens supermarket has recently installed a face recognition system. Having used that shop for 25 years, I intend never to cross their threshold again. For ordinary citizens to have to exclude themselves from routine activities and locations they have frequented for generations, simply to avoid these intrusive surveillance technologies, is not conducive to the maintenance of a healthy society.

As my MP I would appreciate it if you could attend next Wednesday’s debate and raise my concerns.

I look forward to hearing from you.

Kind regards,

Ray Corrigan

Friday, September 20, 2024

Proposed reforms to the National Planning Policy Framework and other changes to the planning system

I've responded to the government consultation on Proposed reforms to the National Planning Policy Framework and other changes to the planning system via Foxglove.

Dear Planning Policy Consultation Team,

I am an individual responding to the government’s consultation on ‘Proposed reforms to the National Planning Policy Framework and other changes to the planning system’ (question 64).

Thank you for giving the British public the opportunity to have their say on these plans.

I want to register my concerns as an individual about the potentially devastating impact of new data centres on the environment, on water and electricity production and use, and on the UK’s transition to Net Zero and green energy.

The companies building these data centres are some of the richest on the planet. The government must make their involvement conditional on their investment in green energy, ensuring they give more than they take from the UK and from the environment.

The impacts can and should be mitigated by the planning application process in the following straightforward and commonsense ways:

1. Companies building new data centres must be able to show that the centre will have no negative impact on our targets to avoid climate catastrophe, including Net Zero by 2050.

2. Companies building new data centres must be able to show that the centre will not place significant additional burdens on our power and water supply.

3. Companies building new data centres should provide a plan explaining how they will sustainably meet the power and water needs of the centre without burdening the UK's existing power and water supplies. Any costs associated with the additional energy or water requirements of a new data centre must be covered by the company, not by British taxpayers.

Thank you, 

Ray Corrigan

Friday, July 19, 2024

Twitter has locked me out...

... and I'm not sure I want to even bother giving Mr Musk a fictional age to be allowed back in.