Friday, January 08, 2016

Scrambling for Safety 2016 remembering Caspar

Scrambling for Safety 2016 was organised at short notice by Ross Anderson and Julian Huppert in response to the government's publication of the draft Investigatory Powers Bill. The conference took place yesterday and a livestream is available. Embedded copy below.


There were some lovely tributes to the wonderful Caspar Bowden.


I just hope this deeply principled, tenacious, generous, determined, passionate, inspirational, brilliant, warm-hearted, thoroughly decent, kind and knowledgeable gentleman - a real force of nature - had some inkling of how widely respected and appreciated he was by people from all corners of his universe, allies, combatants and observers alike.

Evidence to Joint Committee on Investigatory Powers Bill

The Joint Committee on the Investigatory Powers Bill yesterday published all of the written evidence it received. Over 120 submissions were received, in spite of the incredibly tight timetable. My evidence is available on the website and, as promised, I include a copy below. Apologies for the occasional typographical errors.

Strangely, my opening note about working for The Open University but submitting the evidence in a personal capacity got truncated from the version the committee published.
To the members of the Draft Investigatory Powers Bill Joint Committee,

Thank you for the opportunity to make a submission to your inquiry into the Draft Investigatory Powers Bill.

My name is Ray Corrigan. I’m a Senior Lecturer in the Maths, Computing & Technology Faculty of The Open University, though I write to you in a personal capacity.

Summary

The Joint Committee is being required to analyse the long and complex Draft Investigatory Powers Bill in an unreasonably short timescale.

This submission to your inquiry is divided into 6 sections covering:

·         Bulk collection & retention of personal data by computers
·         Privacy
·         Legality of bulk collection and retention
·         ICRs and relevant communications data
·         The base rate fallacy and the lack of efficacy of bulk data collection
·         Complex system security and equipment interference

The first section challenges a fundamental misunderstanding – the idea that collecting and retaining bulk personal data is acceptable as long as most of the data is only “seen” by computers and not human beings. This is a line that has been promoted by successive governments for some years and seems to be widely accepted. Yet it is seriously flawed.

Next I suggest a simplified version of US scholar Daniel Solove’s model of privacy, to help provide a framework for thinking about information and data processing, in the context of communications surveillance.

Then the April 2014 European Court of Justice Digital Rights Ireland decision, invalidating the Data Retention Directive, is reviewed. Viewed carefully, the decision could be considered an aid to framing surveillance legislation. The draft Investigatory Powers Bill in its current form would be unlikely to meet the tests, laid down in the case, regarding compatibility with privacy and data protection rights, guaranteed by articles 7 and 8 of the EU Charter of Fundamental Rights.

The fourth section examines the tangled web of “internet connection records” and “relevant communications data”. Technology law expert, Graham Smith, has identified and mapped 14 different interlinked definitions in the Bill that are connected to “relevant communications data”. It is difficult to see how the bulk retention of data under the broad and dynamic scope of “relevant communications data” or “internet connection records” could meet the tests of necessity or proportionality laid down in the Digital Rights Ireland case.

The penultimate section looks at the base rate fallacy, a statistical concept that policy makers must familiarise themselves with if intending to approve the indiscriminate bulk collection, retention and processing of personal data. Finding a terrorist is a needle in a haystack problem. You don’t make it easier to find him/her by throwing industrial-scale quantities of the personal data of mostly innocent people onto your haystack. Section 150 of the Bill, Bulk personal datasets: interpretation, states “the majority of the individuals are not, and are unlikely to become, of interest to the intelligence service in the exercise of its functions” – explicit recognition that innocent people would be subject to indiscriminate surveillance under the proposed powers.

The final section discusses security in general and the inadvisability of giving government agencies the powers to engage in bulk hacking of the internet. It argues that targeted surveillance practices are more effective than mass surveillance approaches and recommends that Part 6, Chapter 3 of the draft Bill be removed in its entirety.

I conclude with an appeal to frame surveillance laws within the rule of law, rather than attempting to shape the law to accommodate expansive, costly, ineffective and damaging population-wide surveillance practices. 
Bulk collection & retention of personal data by computers

1.       To begin I would just like to note that it is a mammoth task to expect parliamentarians to analyse this long and complex Bill in the short timescale you have been given.

2.        I would also like to tackle a fundamental misunderstanding at large in Westminster – the idea that collecting and retaining bulk personal data is acceptable as long as most of the data is only “seen” by computers and not human beings; and it will only be looked at by persons with the requisite authority if it is considered necessary.  This is a line that has been promoted by successive governments for some years and seems to be widely accepted. Yet it is seriously flawed.

3.       The logical extension of such an argument is that we should place multiple sophisticated electronic audio, video and data acquisition recording devices in every corner of every inhabited or potentially inhabited space; thereby assembling data mountains capable of being mined to extract detailed digital dossiers on the intimate personal lives of the entire population. They won’t be viewed by real people unless it is considered necessary.

4.       Indeed with computers and tablets in many rooms in many homes, consumer health and fitness monitoring devices, interactive Barbie dolls, fridges, cars and the internet of things lining up every conceivable physical object or service to be tagged with internet connectivity, we may not be too far away from such a world already.[1]

5.       The Home Office, on 16 December 2015, rejected a freedom of information request[2] asking for the “metadata of all emails sent to and from the Home Secretary for the period 1st January 2015 - 31st January 2015 inclusive.” They rejected the request on the grounds that the request “is vexatious because it places an unreasonable burden on the department, because it has adopted a scattergun approach and seems solely designed for the purpose of ‘fishing’ for information without any idea of what might be revealed.”

6.       Yet the same Home Office considers it acceptable to have powers, in the Investigatory Powers Bill, to engage in bulk data collection, retention and equipment interference, to assemble information about every member of the population, in the hope of conducting contemporaneous and/or post hoc ‘fishing’ activities to look for evidence of misbehaviour.

7.       I would contend that this approach is unnecessary, disproportionate and incompatible with the rule of law. It is additionally very costly and technically, mathematically and operationally ineffective. If all that was not bad enough, it risks undermining the security of our already frail and insecure communications infrastructure.


Privacy

8.       Individual and collective privacy underpins a healthy society but privacy is hard to define. We understand the conceptual protection of a person’s home being their castle and what is behind closed doors being private. But privacy was the default state in the pre-internet world and we didn’t think too hard about its subtleties.  With mass personal data processing, however, we need a better understanding of privacy. US scholar Daniel Solove has helpfully characterised privacy as a collection of problems.[3] Privacy harm, Solove argues, is triggered by –

·         Information collection
·         Information processing
·         Information dissemination/sharing
·         Privacy invasion and erosion

9.       We now teach a simplified model of Solove’s taxonomy to our 3rd level information systems students at The Open University, a visual depiction of which I include below.


10.    One of the key issues clarified by Solove’s model is that the initial privacy harm originates at the point of collection of personal data, whether that collection is done by a computer, other device or a person. In the context of investigatory powers it can be helpful to ask whether the specifics under consideration relate to information collection (& retention), information processing, information dissemination or privacy invasion. Whichever of these processes is at issue, their collective effect is the stripping bare of the digital persona of anyone who comes into contact with devices connected to the internet.

11.    Information collection is the surveillance done by commerce, governments and other agents, such as criminal gangs. Solove suggests it also covers the interrogation of the information collected.

12.    Commerce, governments and others are involved in information processing – the storage, use (or misuse), analysis and aggregation of lots of data, secondary use of data, the exclusion of the data subject from knowledge of how information about them is being used and exclusion from being able to correct errors. Solove expresses particular concern about bureaucracy. Surveillance bureaucracy makes life-changing decisions based on secret information, while denying the subject/s of the data the ability to inform, see or challenge the information used. The privacy problem here is all about information. The privacy harms are bureaucratic – powerlessness for the subject, indifference to them, error, lack of transparency and accountability.

13.    On that front, when giant data mountains are conveniently sitting around, there is not a safeguard in existence that will prevent (possibly even well-intentioned) future incarnations of a bureaucracy from tapping that data for secondary uses not originally envisaged by those behind the IP Bill. A related case in point is the expansive interpretation of s7(4) of the Intelligence Services Act 1994 and the Equipment Interference Code of Practice 2015 to justify the equipment interference activities currently undertaken by the security and intelligence services (SIS). Provisions for equipment interference and bulk equipment interference included in part 5 and 6 of the IP Bill present serious economic wellbeing and security risks.

14.    Information dissemination – Data viewed out of context can paint a distorted picture. The novelist researching criminal behaviour might be flagged for buying too many of the wrong kinds of books from an online bookshop. In the UK collection of information ‘of a kind likely to be useful to a person committing or preparing an act of terrorism’ is a criminal offence, under Section 58 of the Terrorism Act 2000. We might expect an analyst to recognise a known novelist but processing purely by algorithm may lead to distortion and faulty inference. We also get dissemination through leaking and misappropriation through stealing of personal information, which can lead to exposure to identity theft, fraud, blackmail, and further distortion.

15.    Privacy invasion is about the information activities and their aggregation mentioned above but also amounts to intrusion into the personal sphere. Overt surveillance, direct interrogation, junk mail, unsolicited phone calls are all disruptive intrusions and cause harm. But there can be a decision making element to this intrusion too. For example someone may be reluctant to consult a doctor if current plans on the sharing of health data through the ill-considered care.data scheme progress further. Innocents may be inhibited from using the internet if they feel under constant surveillance.

16.    Whether or not mass indiscriminate personal data collection and retention is only “seen” by computers it remains mass indiscriminate personal data collection and retention, repeatedly found unlawful by the Court of Justice of the European Union [Digital Rights Ireland, 2014; Schrems 2015], the European Court of Human Rights [Zakharov, 2015] and multiple high courts including Romania (2009), Germany (2010), Bulgaria (2010), the Czech Republic (2011) and Cyprus (2011). Mass indiscriminate personal data collection and retention has been variously described by these courts as unconstitutional and/or a disproportionate unjustified interference with the fundamental right to privacy, free speech and confidentiality of communications.

Legality of bulk collection and retention

17.    On 8 April 2014 the Grand Chamber of the European Court of Justice (ECJ), in joined cases C-293/12 and C-594/12, issued a landmark decision declaring the 2006 data retention directive invalid. The Grand Chamber of the Court effectively condemned pre-emptive, suspicionless, bulk collection and retention of personal data and consequent "interference with the fundamental rights of practically the entire European population". Paragraph 37 of the judgment noted the interference with articles 7 (privacy) and 8 (data protection) of the EU Charter of Fundamental Rights caused by mass data retention “must be considered to be particularly serious.”

18.    Paragraph 58 of the decision criticises the mass surveillance of innocent people not remotely connected to serious crime. Then in recognition of the need for targeted rather than mass surveillance the Court states:

"59.  Moreover, whilst seeking to contribute to the fight against serious crime, Directive 2006/24 does not require any relationship between the data whose retention is provided for and a threat to public security and, in particular, it is not restricted to a retention in relation (i) to data pertaining to a particular time period and/or a particular geographical zone and/or to a circle of particular persons likely to be involved, in one way or another, in a serious crime, or (ii) to persons who could, for other reasons, contribute, by the retention of their data, to the prevention, detection or prosecution of serious offences."

So the Court considers it unnecessary and disproportionate to engage in bulk collection of innocent communications in order to find serious criminals.

19.    Finding a terrorist or serious criminal is a needle in a haystack problem – you can’t find the needle by throwing infinitely more needle-less electronic hay on the stack.  Law enforcement, intelligence and security services need to use modern digital technologies intelligently in their work and through targeted data preservation regimes – not the mass indiscriminate data collection, retention and equipment interference proposed in the Investigatory Powers Bill – engage in technological surveillance of individuals about whom they have reasonable cause to harbour suspicion. That is not, however, the same as building an infrastructure of mass surveillance or facilitating the same through the legal architecture proposed in the Bill.

20.    The ECJ follows up this mass surveillance critique with a clear declaration in paragraph 60 that the data retention directive had no limits on access to and use of retained data to the purpose of fighting serious crime and no criteria for determining such limits. Paragraphs 60 to 68 could be read as a lesson on how to write a data retention law in a way that might be acceptable to the Court. The data retention directive declared invalid by the Court did not –

·         include procedures on determining access to data or its use or even limiting these to crime fighting
·         limit the number of people with access to the retained data to those strictly necessary
·         subject access to the data to the prior review or oversight of a court, in order to limit access to that which is strictly necessary
·         oblige member states to set down such procedures.
·         make any distinction between categories of data
·         attempt to justify the arbitrary period of retention chosen of between 6 months and 2 years
·         lay down clear and precise rules governing the extent of the interference with the fundamental rights to privacy and data protection
·         provide for sufficient safeguards to ensure effective protection of the data retained against the risk of abuse
·         provide for sufficient safeguards to ensure against any unlawful access and use of that data
·         specify a high enough data security threshold
·         require the irreversible destruction of data at the end of the retention period
·         require data to be retained within the borders of the EU
·         ensure control of data protection and access to the retained data by independent authority

21.    Big and complex as the Investigatory Powers Bill is, it too falls at many of these hurdles in relation to the bulk data collection and retention powers proposed. It fundamentally fails to take into account data protection principles, in particular data minimisation.[4] Not only does the IP Bill eschew data protection principles, it promises to offer its own version of data processing rules to deal with bulk data which will appear as a code of practice. A guide to what this code of practice will look like is included in Schedule 6, section 3 of the Bill.

22.    The Court concluded:
"69. Having regard to all the foregoing considerations, it must be held that, by adopting Directive 2006/24, the EU legislature has exceeded the limits imposed by compliance with the principle of proportionality in the light of Articles 7, 8 and 52(1) of the Charter.
70. In those circumstances, there is no need to examine the validity of Directive 2006/24 in the light of Article 11 of the Charter.
71.  Consequently... Directive 2006/24 is invalid."
23.    In short, the data retention directive presented a disproportionate interference with the fundamental rights to respect for private and family life and the protection of personal data. Consequently the directive was invalid, null and void. And because it was invalid on privacy grounds the Court didn't see the need to pursue the question of whether it also might be invalid on the grounds of Article 11 of the Charter of Fundamental Rights relating to freedom of expression.

Internet connection records and relevant communications data
24.    The Home Office appear to have briefed the Joint Committee that the retention of “internet connection records” is the only new power in the Bill.  As far as I can tell the phrase “internet connection records” is mentioned only in section 47 of the Bill (“Additional restriction on grant of authorisations”) which does not deal with data retention.  Section 71 (Powers to require retention of certain data) deals with data retention and uses the term “relevant communications data” rather than internet connection records.

25.    “Relevant communications data” has a five-part definition in s71(9)(a)-(e) relating to the purposes of section 71. “Internet connection record” has a two-part definition in s49(6)(a)-(b) relating to that section. The components of the “relevant communications data” definition then acquire different meanings or definitions depending on what part of the Bill they appear in.

26.    Graham Smith has done a remarkable job of tracking these down. In a blogpost on Sunday, 29 November 2015, Never mind Internet Connection Records, what about Relevant Communications Data[5], he identified and mapped 14 different interlinked definitions in the Bill that are connected to “relevant communications data”.

27.    It is hardly surprising therefore that industry representatives, such as BT’s Mark Hughes, have testified to the joint committee that the definitions in the Bill are unclear.

28.    Relating “relevant communications data” back to the Solove model (at the beginning of my submission) implicates it variously in data collection, retention, processing and dissemination. Unfortunately even that doesn’t help identify exactly what “relevant communications data” is going to mean in practice.

29.    Government representatives have told the joint committee that definitions of ICRs and relevant communications data are “clear” and industry have insisted they are unclear in the Bill. It appears that what they really mean in practice will be worked out in private discussions through the “very good relationship” the government maintains with industry.

30.    Does the joint committee get to oversee these discussions? Who gets the final say on who gets to program the computers for surveillance, and what the specific 'selectors'/filters are? Who decides what the selectors should be? Who decides who decides what the selectors should be? With the best will in the world most parliamentarians are not technical experts, so how can the committee, the Home Secretary or the judicial commissioners effectively scrutinise the technical aspects of this work? How do you measure the efficacy of these filters, given it is widely known in the tech community how ineffective electronic filters can be? How, when someone is tagged as suspicious via these secret algorithms applied to bulk datasets, does the information on that individual then get further processed? What happens when someone is wrongly tagged and how do they retrieve their innocence and clean bill of electronic health?

31.    It is difficult to see how the bulk retention of data under the broad and dynamic scope of “relevant communications data” – or “internet connection records”, if that is to be the common phrase regardless of its definition in the Bill – could meet the tests of necessity or proportionality laid down by the Court of Justice of the European Union in the Digital Rights Ireland case in 2014. The final text of the EU’s new General Data Protection Regulation (GDPR) has also just been agreed, which will further complicate the committee’s efforts to understand the implications of the proposed IP Bill.

32.    One last note on this section on the differences of opinion over the clarity or otherwise of the Bill. The drafters of the Bill have gone to some length to try to distinguish communications data (or metadata) from content. Paul Bernal and others have explained to the joint committee why there is no simple way to distinguish the two, given the complex overlapping nature of both – e.g. does an email address mentioned in the main text of a document constitute content or communications data? Could I, on this point, just commend to you the presentation of your special adviser, Peter Sommer, from the 2012 Scrambling for Safety conference, Can we separate “comms data” and “content” – and what will it cost?[6]

Base rate fallacy
33.    The whole Investigatory Powers Bill approach to signals intelligence – a giant magic computerised terrorist-catching machine that watches everyone and identifies the bad guys – is flawed from a mathematical as well as an operational perspective.
34.    Time and again, from the dreadful attacks on the US on 11 September 2001 through to the recent attacks in Paris, the perpetrators were previously known to the security services, but they lost track of them in the ocean of data noise they were then[7] and are now[8] drowning in.
35.    Even if an IP Bill-mandated magic terrorist-catching machine, watching the entire population of the world, was 99% reliable, it would flag too many innocents for the security services to investigate and swamp the services in unproductive activity.
36.    But it is not even as simple as that mathematically. Is your machine 99% reliable at identifying a terrorist, given they are a terrorist? Or is it 99% reliable at identifying an innocent, supposing they are truly innocent? In general your machine will have two failure rates:

·         A false positive where it identifies an innocent as a terrorist
·         A false negative where it identifies a terrorist as an innocent

37.    The reliability of your identification further depends on the actual number of terrorists in the population as a whole – the base rate. The problem is particularly acute when the base rate is low. Let’s stick with the 99% reliability for both failure rates (though they will rarely be the same – if you adjust your machine to catch more terrorists, it will falsely accuse more innocents; and if you adjust it to accuse fewer innocents it will let more terrorists go). So assume both a false positive and a false negative rate of 1%. Suppose also there are 100 terrorists in every collection of 1 million people.[9] Your terrorist-catching machine, watching these 1 million, will flag 99 of the 100 terrorists, giving one a free pass; but it also flags 1% of the remaining 999,900 innocents, i.e. 9,999 innocent people get tagged as terrorists. So the 99% reliable machine flags 99 + 9,999 = 10,098 people for suspicion. Only 99 of these 10,098 are real terrorists, giving your magic machine a hit rate of 99/10,098 ≈ 0.0098. Your 99% reliable machine is not 99% reliable but less than 1% effective at identifying terrorists.
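The arithmetic above can be verified in a few lines of Python. This is a minimal sketch; the function name and the figures are simply those of the worked example (100 terrorists per million, 1% false positive and 1% false negative rates), not anything drawn from the Bill itself:

```python
def screening_precision(population, positives, false_pos_rate, false_neg_rate):
    """Return (total flagged, true positives flagged, precision)
    for a screening system applied to a whole population."""
    negatives = population - positives
    true_flags = positives * (1 - false_neg_rate)   # terrorists correctly flagged
    false_flags = negatives * false_pos_rate        # innocents wrongly flagged
    flagged = true_flags + false_flags
    return flagged, true_flags, true_flags / flagged

# The worked example: 100 terrorists in 1,000,000 people, 1% error rates.
flagged, true_flags, precision = screening_precision(1_000_000, 100, 0.01, 0.01)
print(f"People flagged: {flagged:.0f}")        # 10098
print(f"Real terrorists among them: {true_flags:.0f}")  # 99
print(f"Hit rate: {precision:.4f}")            # 0.0098 – under 1%
```

Rerunning the sketch with a lower base rate (say 10 terrorists per million) makes the hit rate collapse further, which is the heart of the fallacy: the machine's "99% reliability" tells you almost nothing until you account for how rare the target actually is.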

38.    The numbers underlying this base rate fallacy[10] – the tendency to ignore known base rate statistical data (e.g. the low probability that someone in a large population is a terrorist) in favour of an interpretation of specific data (my magic machine is 99% accurate) that seems as though it might be right – are slightly counter-intuitive but need to be understood if you purport to deploy techniques involving the surveillance of entire populations.

39.    Denmark, following 7 years of ineffective bulk collection of data equivalent to the IP Bill’s internet connection records, in 2014 repealed the law requiring the collection and retention of these records.[11] Because of the base rate fallacy and the fact that terrorists are relatively few in number compared to the population as a whole, mass data collection, retention and mining systems, such as those proposed in the IP Bill, always lead to the swamping of investigators with false positives, when dealing with a large population. Law enforcement authorities end up investigating and alienating large numbers of innocent people. That’s no good for the innocents, for the investigators or for society. In Denmark, over half a million records per citizen were retained in 2013 but the system proved an ineffective tool for law enforcement and security and intelligence services.

40.    If the government has £175 million over ten years to invest in terrorism prevention (roughly equivalent to Wayne Rooney’s wages and, as industry and others have pointed out to the joint committee, nowhere near enough to pay for what the IP Bill requires), then it would be better spent on more security services people, not magic terrorist-catching computer systems. You need more human intelligence and better targeted and managed signals intelligence.

Complex system security and equipment interference
41.    Our communications infrastructure is complex, fragile and insecure.  Large and complex systems like the internet are extremely difficult if not impossible to secure.  Security is hard and complexity kills it. When you make any changes to complex systems, they produce unintended emergent effects. So it is a really bad idea to deliberately undermine the security of an already fragile and insecure communications infrastructure, by giving government the power to undermine that security directly, through the equipment interference measures in the IP Bill.

42.    There may be a case [though it has not been made] for carefully targeted and judicially supervised and controlled, necessary and proportionate equipment interference, to pursue known suspects, about whom the intelligence or law enforcement services have reasonable cause to harbour suspicion. That applies generally to the bulk data collection and retention and equipment interference regime of the IP Bill. The requisite authorities need to use modern digital technologies intelligently in their work and through targeted data preservation regimes – not the mass surveillance regime they are currently operating and the government is proposing to expand under the draft IP Bill – engage in technological surveillance of individuals about whom they have reasonable cause to harbour suspicion.

43.    Targeted equipment interference does, however, compromise digital forensic evidence that may be used in law enforcement cases.

44.    Although equipment interference, better known as hacking, was avowed by the government with the publication of the draft Equipment Interference Code of Practice early in 2015, government legal representatives at the recent Privacy International Investigatory Powers Tribunal hearing denied that the government had yet admitted engaging in bulk equipment interference.

45.    The justification for bulk equipment interference appears to be based on stretching interpretations of the Anderson and Intelligence & Security Committee reports, and the Intelligence Services Act 1994 and the Police Act 1997, beyond breaking point. Anderson, in what I consider one of the few weak/evidence-light parts of his otherwise thorough and impressive report,[12] approved of bulk collection and retention of communications data.  In no part of the Anderson report is there express or implicit approval for bulk equipment interference.

46.    Government are claiming bulk equipment interference (mass hacking of the internet) is their attempt to "build on recommendations made by David Anderson QC and the ISC".  Generally speaking, giving the government the power to hack the internet is really bad security hygiene, undermining communications infrastructure for everyone. Professor Mark Ryan of Birmingham University informed the Joint Committee that equipment interference is “a huge power” which would result in innocent people being targeted. Professor Ryan also described it as an "extremely dangerous game".

47.    Numerous other computer scientists and security experts, including Jon Crowcroft at Cambridge University, have described the Bill as a hacker charter, a description recognisable in part 5 and part 6, chapter 3 of the Bill. The 2015 Equipment Interference Code of Practice appears to have stretched the meaning of s7(4)(a) of the 1994 Intelligence Services Act's "acts of a description specified in the authorisation" to mean it covers bulk hacking. Section 7.11 of the Code of Practice claims s7(4)(a) "may relate to a broad class of operations" – i.e. anything? Part 6 Chapter 3 would appear to be aimed at codifying this in the new law.

48.    There is no case for the open ended bulk equipment interference powers outlined in part 6, chapter 3 of the Bill. These powers seem to be aimed at facilitating the hacking of overseas communications data and equipment. But wherever bulk hacking is aimed it has no place in the toolbox of government authorities. Following an investigation into the Edward Snowden leaks in 2013, President Obama’s Review Group on Intelligence and Communications Technologies recommended that intelligence agencies should focus on defending rather than engaging in attacks on network and computer security.[13]

49.    Part 6 Chapter 3 should be removed in its entirety from the Bill. Part 5 needs significant amendment if it is to remain.

50.    Securing systems of the magnitude of those used by security agencies and industry, and effectively proposed in the IP Bill, from external hackers or the multitude of insiders who have access to these databases (850,000 including Edward Snowden in the case of the NSA), is incredibly difficult. The joint committee will be familiar with the recent TalkTalk hack compromising the personal data of 157,000 customers.[14] You may be familiar with the even more serious and potentially life threatening compromise of the systems of US government’s Office of Personnel Management.[15] The complete dossiers of tens of millions of US federal employees, their families and others who had applied for government jobs were stolen.

51.    Large and valuable databases attract attackers. Conversely, minimising the collection and processing of personal data to what is required for the specified purpose is not only a data protection principle but also good security practice.

52.    Security experts like Ross Anderson, Bruce Schneier, Peter Neumann and others have written extensively about this. To understand the problem of securing these systems you need to think about how such systems can fail - how they fail naturally, through technical problems and errors (a universal problem with computers), and how they can be made to fail by attackers (insiders and outsiders) with malign intentions. Sometimes, as in the case of Edward Snowden, one of 850,000 security cleared people with access to NSA secrets, those insiders or outsiders may have what they believe to be benign intent. Snowden’s stated intention was to disclose unconstitutional and/or illegal government agency practices. Whatever an attacker’s intent, no information to which nearly a million people have access, as a routine part of their job, is secure.

53.    The mood amongst western governments has been leaning towards deliberately mandating the undermining of communications infrastructure security, building in vulnerabilities for law enforcement and intelligence services to exploit. Anderson, Schneier, Neumann and other world renowned security experts recently published Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications.[16] The paper explains, in commendably accessible detail, why this is a bad idea.

54.    From their conclusion: “Even as citizens need law enforcement to protect themselves in the digital world, all policy-makers, companies, researchers, individuals, and law enforcement have an obligation to work to make our global information infrastructure more secure, trustworthy, and resilient. This report’s analysis of law enforcement demands for exceptional access to private communications and data shows that such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend. The costs would be substantial, the damage to innovation severe, and the consequences to economic growth difficult to predict. The costs to developed countries’ soft power and to our moral authority would also be considerable. Policy-makers need to be clear-eyed in evaluating the likely costs and benefits.”


Conclusion

55.    The government has the right to intercept, retain and analyse personal information when someone is suspected of a serious crime. However, current operations and the powers and processes proposed in the draft IP Bill involve the indiscriminate collection of personal data, in bulk and without suspicion, in addition to equipment interference that decimates network security. This is, in effect, mass surveillance.

56.    Due process requires that surveillance of a genuine criminal suspect be based on much more than general, loose and vague allegations, or on suspicion, surmise or vague guesses. To operate the mass data collection and analysis systems proposed in the IP Bill, thereby giving the entire population less protection than has hitherto been afforded to a genuine criminal suspect under a standard of reasonable suspicion, is indefensible.

57.    250 years ago, Lord Chief Justice Camden decided that government agents are not allowed to break your door down and ransack your house and papers in an effort to find some evidence to incriminate you (the case of Entick v Carrington (1765) 19 Howell’s State Trials 1029, 2 Wils 275, 95 ER 807, Court of Common Pleas).

58.    The good judge also declared personal papers to be one’s “dearest property”. It is not unreasonable to suspect he might view personal data likewise in the internet age. I understand Lord Camden's reasoning in Entick became the inspiration behind the 4th Amendment to the US Constitution, which offers protection from unreasonable searches and seizures. The 4th Amendment itself underpins the 46 recommendations of the Report of President Obama’s Review Group on Intelligence and Communications Technologies. For a quarter of a millennium, fishing expeditions of the type proposed in the IP Bill, but at a scale and scope Lord Chief Justice Camden could barely have imagined, have been considered to fundamentally undermine the rule of law. It's time Parliament brought these modern, costly, ineffective and damaging surveillance practices into line with that rule of law rather than, as with the IP Bill, attempting to shape the law to facilitate and expand them in scale and scope.











[1] Executive Office of the President, President’s Council of Advisors on Science and Technology, Report to the President [May 2014], Big Data and Privacy: A Technological Perspective
[2] https://www.whatdotheyknow.com/request/300685/response/745953/attach/html/3/FOI%2037410%20Response.pdf.html
[3] Solove, DJ, [2006], A Taxonomy of Privacy, University of Pennsylvania Law Review, Vol.154 No.3
[4] https://ico.org.uk/for-organisations/guide-to-data-protection/principle-3-adequacy/
[5] http://cyberleagle.blogspot.co.uk/2015/11/never-mind-internet-connection-records.html
[6] http://www.pmsommer.com/sf2012_sommer_commsdata_content.pdf
[7] The NSA’s Call Record Program, a 9/11 Hijacker, and the Failure of Bulk Collection https://www.eff.org/deeplinks/2015/04/nsas-call-record-program-911-hijacker-and-failure-bulk-collection
[8] Intelligence and Security Committee Report on the intelligence relating to the murder of Fusilier Lee Rigby http://isc.independent.gov.uk/committee-reports/special-reports
[9] Various spokespersons of successive UK governments have referred to 6000 dangerous people at large in the UK, so I’ve chosen 100 per million as equivalent to 6000 in 60 million.
[10] For a fuller description of the base rate fallacy see Richards J. Heuer, Jr., Psychology of Intelligence Analysis, Chapter 12: Biases in Estimating Probabilities, available at https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis/art15.html
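The base rate arithmetic behind footnotes [9] and [10] can be made concrete with a short sketch. The 6,000-in-60-million base rate is from footnote [9]; the 99% detection rate and 1% false positive rate below are illustrative assumptions of my own, chosen to be generous to a hypothetical screening system, not figures from the submission:

```python
# Base rate fallacy: even a very accurate screen, applied to the whole
# population, flags vastly more innocent people than genuine suspects.
population = 60_000_000
suspects = 6_000                  # ~100 per million (footnote 9)

detection_rate = 0.99             # P(flagged | dangerous) - assumed
false_positive_rate = 0.01        # P(flagged | innocent)  - assumed

true_positives = suspects * detection_rate
false_positives = (population - suspects) * false_positive_rate

# Probability that a flagged person is actually a genuine suspect
precision = true_positives / (true_positives + false_positives)
print(round(precision, 4))  # roughly 0.0098, i.e. under 1%
```

On these assumptions roughly 600,000 innocent people are flagged for every 6,000 genuine suspects, so fewer than 1 in 100 of those flagged is the person being looked for - which is the point Heuer's chapter makes in general terms.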
[11] Details available from http://itpol.dk/consultations/written-evicence-ipbill-scitech-committee IT-Pol submission to the Science and Technology Committee inquiry into the Investigatory Powers Bill.
[12] A Question of Trust: Report of the Investigatory Powers Review by David Anderson Q.C., June 2015
[13] Richard A. Clarke, Michael J. Morell, Geoffrey R. Stone, Cass R. Sunstein, Peter Swire [13 December 2013] Liberty and Security in a Changing World: Report and Recommendations of The President’s Review Group on Intelligence and Communications Technologies
[14] http://www.bbc.co.uk/news/business-34743185
[15] https://www.opm.gov/cybersecurity/cybersecurity-incidents/
[16] https://dspace.mit.edu/bitstream/handle/1721.1/97690/MIT-CSAIL-TR-2015-026.pdf