Thursday, April 03, 2025

UK Government should allow physical proof of immigration status

In the face of the failing rollout of the UK government's eVisa scheme, I've written to my MP, again, at the suggestion of the Open Rights Group.

"Of the many failures of the Labour government since coming into power last year, their adoption of Conservative/Reform right-wing rhetoric and policies on immigration has been one of the most egregious.

Please can you take action to help people who are facing serious harm because of the Home Office’s flawed eVisa scheme?

The eVisa scheme was supposed to replace physical documents with online proof of immigration status. But the roll-out of the scheme has been chaotic. Technical and administrative problems mean that many people cannot create or access their eVisa at the crucial times that they need it. Many people are currently having to rely on expired documents to prove their right to be in the UK.

Since the scheme was supposed to come into effect on January 1, 2025, people who have the right to be in the UK have been prevented from boarding planes and had their travel plans disrupted. Refugees have been unable to rent homes or get jobs. At least one person has been made homeless because of an error in their eVisa. There are undoubtedly thousands more people facing such harms – including many of your constituents.

The immediate solution:

The Home Office recently updated its guidelines to allow migrants with legacy paper documents to continue using them. This could be extended to all migrants who have the right to remain in the UK. Having a physical document as a backup will mean that people can prove their immigration status even if they are unable to show their eVisa because of technical glitches or Internet outages. It will also help people who do not have a smartphone or who are not digitally literate.

Please will you contact the Minister for Migration and Citizenship, Seema Malhotra, and urge her to take this simple step of allowing people to have an optional physical document to prove their right to be in the UK."

Friday, March 14, 2025

Save small sites & services from the Online Safety Act

At the request of the Open Rights Group I have written to my MP, Layla Moran, about the potential impact of the Online Safety Act on small, safe sites and services.

Dear Layla Moran,

I am a constituent of Oxford West and Abingdon.

I am writing because I am concerned that many small, safe and harmless websites and services are closing down in the UK because of the duties, risk assessments and threats of fines and even imprisonment contained in the Online Safety Act.

Even a small personal blog, with user comments, must comply with the new rules.

The Act comes into force on 17 March 2025. As I understand it, the Secretary of State can change the rules very quickly, under Section 220 of the Act, to exempt small websites. They could extend the onboarding period while working out how to ensure small websites and services do not have to close down.

Overseas websites are also likely to block access to the UK if they understand the risks.

This could include many services which are designed as safe spaces for friendly conversation, like many Mastodon social media sites. These serve the LGBTQ+ community, for example, to keep users safe from bullying and hate speech.

Yet they may block the UK because of Online Safety compliance risks.

This will impact me personally as I have maintained a blog, B2fxxx: Random thoughts on law, the internet and society, available at https://b2fxxx.blogspot.com/ for over 20 years. I have written more than 4700 blogposts in that time on a wide range of technology policy issues, including, for example, copies of submissions I have made to UK government, EU and other consultations. The Online Safety Act means I have to consider shutting down the blog and deleting all of those writings.

The Online Safety Act is meant to keep people safe, not to shut down safe websites.

Please make Peter Kyle, the Secretary of State for Science, Innovation and Technology, aware of these problems and ask for a response.

More information:
https://www.newscientist.com/article/2461213-hundreds-of-small-websites-may-shut-down-due-to-uks-online-safety-act/
https://www.telegraph.co.uk/business/2024/12/17/hundreds-of-websites-to-shut-down-under-chilling-internet/
https://onlinesafetyact.co.uk/in_memoriam/

Yours sincerely,

Ray Corrigan


Friday, March 07, 2025

Response to Ofcom technology notices consultation

At the behest of the Open Rights Group, I have written to Ofcom regarding their technology notices consultation.

Dear Ofcom consultation team,

I am responding to your consultation as an individual.

I am happy for you to publish this response.

I wish to respond to the following consultation question:

‘Do you have any views on our audit-based assessment, including our proposed principles, objectives, and the scoring system? Please provide Evidence to support your response.’

It is my view that Ofcom needs to consider how it scores, and weighs, the following risks and threats that could arise from accrediting any scanning technologies:

1. The threat that the system might infringe people’s human rights to free expression and privacy. This is particularly relevant given that these systems could break end-to-end encryption, and given the recent European Court of Human Rights ruling in the case of Podchasov v. Russia – https://hudoc.echr.coe.int/eng/?i=001-230854.

Ofcom will have a legal duty to assess the proportionality of any such system and to ensure it does not infringe on our human rights. It could find itself facing legal challenges over this issue if it does not demonstrate, within its framework, an assessment of the impact on the right to privacy of breaking E2EE.

2. The risk of false positives and wrongful accusations. If these rates are too high, law enforcement agencies will be flooded with false positive results from any scanning system, and people will be wrongfully accused, causing them harm. A high threshold for accuracy should therefore be applied.

3. The risk that any system will break UK data protection law and/or undermine the nation's cybersecurity by introducing backdoor vulnerabilities into private and secure messaging systems. Apple's recent withdrawal of its Advanced Data Protection product from the UK market highlights that forcing or approving a poor technology upon a company can result in UK users losing access to products. Ofcom should consider the risks to UK consumers of forcing onto providers new technologies that are not feasible to deliver or that carry too high economic and social costs.

4. Equality Act implications and impact on people with protected characteristics – Ofcom will have to consider the impact of any scanning system in relation to the public sector equality duty.

5. Higher weighting in the framework for risks where there are legal duties. The current minimum threshold for ‘fairness’ does not account for the risk Ofcom faces of breaching its legal obligations under the Human Rights Act, the Equality Act and the Data Protection Act. A separate scoring and risk assessment should therefore be undertaken for each technology Ofcom considers, to ensure it can evidence that it has met its statutory legal duties.

6. The risk that any system will facilitate the spread of CSEM – A regulator wishing to control the use of image-based sexual abuse (IBSA) removal tools must carefully assess the risks posed by perceptual hash inversion attacks. These attacks could allow someone to recreate CSEM images from the hashed data the tool was using. For evidence of these attacks, see S. Hawkes, C. Weinert, T. Almeida and M. Mehrnezhad, 'Perceptual Hash Inversion Attacks on Image-Based Sexual Abuse Removal Tools', IEEE Security & Privacy, doi: 10.1109/MSEC.2024.3485497. Further risks and threats from scanning technologies are set out in Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Jon Callas, Whitfield Diffie, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Vanessa Teague and Carmela Troncoso, 'Bugs in our pockets: the risks of client-side scanning', Journal of Cybersecurity, Volume 10, Issue 1, 2024, tyad020, https://doi.org/10.1093/cybsec/tyad020.

Yours sincerely,
Ray Corrigan

In relation to item 3. above I'd also recommend those interested read and thoroughly digest 'Keys under doormats: mandating insecurity by requiring government access to all data and communications' by Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael A. Specter and Daniel J. Weitzner. 

Bottom line: there is no backdoor to a cryptographic system that only the good guys will have access to.
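To put some numbers on the false positive concern in item 2 of my Ofcom response, here is a toy base-rate calculation. The figures are hypothetical, chosen purely for illustration, but they show why even a highly accurate scanner, applied at population scale, produces mostly false alarms:

```python
# Toy base-rate calculation: a scanner with an impressive-sounding
# accuracy still floods investigators with false positives when the
# thing it is looking for is rare.

def flag_counts(messages, prevalence, true_positive_rate, false_positive_rate):
    """Return (true flags, false flags) for a scanner applied to all messages."""
    illegal = messages * prevalence
    legal = messages - illegal
    true_flags = illegal * true_positive_rate
    false_flags = legal * false_positive_rate
    return true_flags, false_flags

# Hypothetical figures: 1 billion messages a day, 1 in 100,000 illegal,
# a scanner that is 99% sensitive with only a 0.1% false-positive rate.
tp, fp = flag_counts(1_000_000_000, 1e-5, 0.99, 0.001)
print(f"true flags:  {tp:,.0f}")    # 9,900
print(f"false flags: {fp:,.0f}")    # 999,990
print(f"share of flags that are false: {fp / (tp + fp):.1%}")  # 99.0%
```

On these (made-up) numbers, around 99% of everything flagged is innocent traffic, which is the point of requiring a far higher accuracy threshold before any such system is accredited.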

 

Thursday, February 13, 2025

Lib Dem position on Data Use & Access Bill

I've had a response from the Liberal Democrat MP, Layla Moran, in relation to concerns raised about the proposed Data (Use and Access) Bill.

Dear Ray 

Thank you for writing to me about the Data (Use and Access) Bill.

We welcome the omission of many of the more objectionable elements of the previous Data Protection and Digital Information Bill, which was introduced by the previous Conservative government but fell when the General Election was called. I spoke in the second reading debate of that Bill and you can read my speech here.

Despite these changes, retention and enhancement of public trust in data use and sharing is a major issue in the bill. The focus on smart data and sharing of government data means that the Government must do more to educate the public about how and where our data is used and what powers individuals have to find out this information.

There are still major changes proposed to the GDPR – e.g. as regards police duties and automated decision making – which make retention of data adequacy for the purposes of digital trade with the EU of the utmost priority in considering any changes.

We continue to believe that the GDPR/Data Protection Act 2018 is not in need of fundamental reform; rather, where there is any ambiguity or difficulty in interpretation, clarifications incorporating the relevant recitals to the GDPR should be made in the legislation and in improved guidance.

We are also supportive of Baroness Kidron’s amendments to strengthen the rights of the creative industry to challenge generative AI business practices which seek to train models on their work. This is a key campaigning area for the Liberal Democrats, as we believe existing copyright law should be enforced to protect the UK’s creative industries – a world-leading British export.

The Government – as did the last one – continues to claim huge benefits, nearly £10bn, from reforms in the bill. We are very sceptical about this and will continue to scrutinise how these changes will impact business.

Liberal Democrats believe that the UK should be broadly aligned with the forthcoming EU AI legislation in terms of safeguards. Whilst we agree that Article 22 lacks clarity, we strongly disagree that it should be amended to limit its effect. We advocate amendment to strengthen its reach so that it applies to predominantly automated processing as well.

Individuals must be allowed to request information about algorithmic decisions which affect them and the data that is subject to decisions made.

It will be important that the public understand when and how AI intersects with their lives and what control they have over the process. Transparent and digestible explanations of AI systems and where they are employed will be helpful in demystifying the use of AI and promoting long-term trust in society.

The Liberal Democrats’ Science, Technology, and Innovation spokesperson Victoria Collins spoke in the second reading debate and you can read her speech here.

Overall, the Liberal Democrats support a modernised data framework that upholds digital rights while stimulating innovation. Our vision is of a digital future that spreads the benefits of technology across society while protecting fundamental liberties. However, we do need to see better safeguards in this Bill to protect fundamental rights and embed proper scrutiny. We will work with colleagues across the House, as well as with civil society organisations such as Big Brother Watch, Liberty and the Ada Lovelace Institute, to ensure that those important protections are not overlooked.

Thanks again for writing to me about this incredibly important issue.
 

Best wishes, 

Layla

Layla Moran
Liberal Democrat Member of Parliament for Oxford West & Abingdon