At the behest of the Open Rights Group, I have written to Ofcom regarding their technology notices consultation.
Dear Ofcom consultation team,
I am responding to your consultation as an individual.
I am happy for you to publish this response.
I wish to respond to the following consultation question:
‘Do you have any views on our audit-based assessment, including our proposed
principles, objectives, and the scoring system? Please provide evidence to
support your response.’
It is my view that Ofcom, in designing its scoring system, needs to consider
the following risks and threats that could arise from accrediting any scanning
technologies:
1. The threat that the system might infringe people’s human rights to freedom
of expression and privacy. This is particularly relevant given these systems
could break end-to-end encryption (E2EE), and given the recent ECHR ruling in
Podchasov v. Russia – https://hudoc.echr.coe.int/eng/?i=001-230854.
Ofcom will have a legal duty to assess the proportionality of any such
system and to ensure it does not infringe our human rights. It could face
legal challenges over this issue if its framework does not demonstrate an
assessment of the impact on the right to privacy of breaking E2EE.
2. The risk of false positives and wrongful accusations. If false positive
rates are too high, law-enforcement agencies will be flooded with false
positive results from any scanning system, and people will be wrongfully
accused, causing them harm. A higher threshold should therefore be applied to
accuracy.
3. The risk that any system will break UK data protection laws and/or
undermine the nation's cybersecurity by introducing backdoor vulnerabilities to
private and secure messaging systems. The recent situation in which Apple
withdrew its Advanced Data Protection product from the UK market highlights
that forcing or approving a poor technology upon a company could result in UK
users losing access to products. Ofcom should consider the risks to UK
consumers of forcing onto providers new technologies that are not feasible to
deliver or that have too high economic and social costs.
4. Equality Act implications and the impact on people with protected
characteristics – Ofcom will have to consider the impact of any scanning
system in relation to the public sector equality duty.
5. A higher weighting should be applied in the framework to risks where legal
duties exist. The current minimum threshold for ‘fairness’ does not consider
the risk Ofcom faces of breaching its legal obligations under the Human Rights
Act, the Equality Act and the Data Protection Act. As such, a separate scoring
and risk assessment should be undertaken for each technology it considers, so
that Ofcom can evidence it has met its statutory legal duties.
6. The risk that any system will facilitate the spread of CSEM – A regulator
wishing to control the use of image-based sexual abuse (IBSA) removal tools
must carefully assess the risks posed by perceptual hash inversion attacks.
These attacks could allow someone to recreate CSEM images from the hashed data
the tool was using. For evidence of these attacks, see S. Hawkes, C. Weinert,
T. Almeida and M. Mehrnezhad, 'Perceptual Hash Inversion Attacks on
Image-Based Sexual Abuse Removal Tools', IEEE Security & Privacy,
doi: 10.1109/MSEC.2024.3485497, https://ieeexplore.ieee.org/document/10762793.
Further risks and threats from scanning technologies are set out in Harold
Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Jon
Callas, Whitfield Diffie, Susan Landau, Peter G. Neumann, Ronald L. Rivest,
Jeffrey I. Schiller, Bruce Schneier, Vanessa Teague and Carmela Troncoso,
'Bugs in our pockets: the risks of client-side scanning', Journal of
Cybersecurity, Volume 10, Issue 1, 2024, tyad020, https://doi.org/10.1093/cybsec/tyad020
Yours sincerely,
Ray Corrigan
In relation to item 3 above, I'd also recommend those interested read and thoroughly digest 'Keys under doormats: mandating insecurity by requiring government access to all data and communications' by Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh,
Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael A. Specter and
Daniel J. Weitzner.
Bottom line: there is no backdoor that can be created to a cryptographic system that only the good guys will have access to.
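For readers unfamiliar with why the perceptual hash inversion attacks mentioned in item 6 are possible at all, here is a minimal toy sketch in Python. It is not any real tool's algorithm – it is a deliberately simplified "average hash", which records one bit per region of a small grayscale grid depending on whether that region is brighter than the image's mean. The point is that, unlike a cryptographic hash, a perceptual hash preserves coarse image structure by design, so the hash itself leaks information an attacker can use to reconstruct an approximation of the original:

```python
def average_hash(pixels):
    """Toy average hash: one bit per cell of a small grayscale grid,
    set when that cell is brighter than the grid's mean value."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [[1 if p > mean else 0 for p in row] for row in pixels]

def crude_inversion(bits, dark=40, bright=200):
    """Rebuild a coarse approximation directly from the hash bits:
    a bright placeholder where the bit is 1, dark where it is 0."""
    return [[bright if b else dark for b in row] for row in bits]

# A 4x4 grayscale "image" with a bright square in the top-left corner.
image = [
    [220, 210,  30,  25],
    [215, 205,  35,  20],
    [ 40,  30,  25,  30],
    [ 35,  25,  20,  25],
]

h = average_hash(image)
approx = crude_inversion(h)

# The hash bits alone reveal where the bright region sits, and the
# crude reconstruction has the same coarse structure as the original.
print(h)
print(approx)
```

Real perceptual hashes (and real inversion attacks, such as those in the Hawkes et al. paper) are far more sophisticated, but the underlying property is the same: similarity-preserving hashes are not one-way in the cryptographic sense, which is precisely why hash databases for abuse-image matching are themselves a security risk.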