The lead story in today's Guardian:
NHS medical research plan threatens patient privacy

"The privacy of millions of NHS patients will be critically undermined by a government plan to let medical researchers have access to personal files, the health information watchdog told the Guardian last night.
The prime minister and Department of Health want to give Britain's research institutes an advantage against overseas competitors by opening up more than 50m records, to identify patients who might be willing to take part in trials of new drugs and treatments.
They are consulting on a proposal that is buried in the small print of the NHS constitution that would permit researchers for the first time to write to patients who share a particular set of medical conditions to seek their participation in trials.
It would result in patients receiving a letter from a stranger who knew their most intimate medical secrets, which would be regarded by many as a breach of trust by doctors who are supposed to keep information confidential."
Harry Cayton, the incoming chairman of the new government watchdog on NHS data, the National Information Governance Board for Health and Social Care, has stated categorically that the plans are "ethically unacceptable". That's likely to have made him very unpopular with ministers who are probably wandering round Whitehall muttering "you can't get the staff these days" or words to that effect. Cayton has been what
Sir Humphrey might have described as "remarkably brave" and robust in his comments:
"There is pressure from researchers and from the prime minister to beef up UK research. They think of it as boosting UK Research plc. They want a mechanism by which people's clinical records could be accessed for the purposes of inviting them to take part in research, which at the moment is not allowed. I think that would be a backward step.
"It would be saying there is a public interest in research that is so great that it overrides consent and confidentiality. That is not a proposition that holds up."
Given the disappointing conclusions on the subject slipped into Richard Thomas and Mark Walport's Data Sharing Review Report for the Prime Minister and Secretary of State for Justice in July,
"5.8 We support the instinctive view that wherever possible, people should give consent to the use or sharing of their personal information, allowing them to exercise maximum autonomy and personal responsibility. However, achieving this in practice is not so simple. It is unrealistic to expect individuals ever to be able to exercise full control over the access to, or the use of, information about them. This is because of a number of factors, not least practical difficulties in seeking and obtaining consent in many circumstances. Moreover, there are many circumstances in which it is not useful, meaningful or appropriate to rely on consent, or indeed to obtain fresh consent at a later stage for the reuse of personal information for a different purpose...
5.17 As a general rule, it seems right that personal information obtained consensually for a specified purpose should not then be used for an incompatible purpose that goes outside the terms of the original consent. If that were to happen, it would breach the terms of the original consent. For this reason, the second Data Protection Principle, which prohibits reuse of information in any manner that is incompatible with the original purpose, stands as a significant safeguard. It is important to note, however, that ‘incompatible with’ is not the same as ‘different from’. Although some respondents to the review have said that the law should prohibit any reuse of personal information without fresh consent, we believe that returning to people on each occasion when an organisation wishes to reuse personal information for clearly beneficial and not incompatible purposes would impose a disproportionately heavy burden, particularly where the data pool is large.
5.18 Again, the example of medical research is particularly helpful here. Respondents in this sector agreed almost unanimously that a requirement to seek fresh consent for any supplementary use of previously collected personal information would be unworkable and have a severely detrimental effect on the ability to conduct important medical research. The time, money and effort required to do this would all have an adverse impact on research programmes and on patient care. This is an example where the principle of implied consent is valid. An NHS patient agreeing to a course of treatment should also be taken to have agreed that information given during the course of the treatment might be made available for future medical research projects, so long as robust systems are in place to protect personal information and privacy. After all, that patient may be benefiting from research using health information from earlier patients."
- it's a pity Mr Cayton's views couldn't have been incorporated when the report was being considered. Anyone taking seriously his belief that -
""It would be saying there is a public interest in research that is so great that it overrides consent and confidentiality. That is not a proposition that holds up."
- would have had great difficulty justifying paragraph 5.18.
The Department of Health has welcomed the comments "and will consider them" alongside others.
Patient confidentiality is a fundamental pillar of a sound healthcare system. To throw it away on the strength of some vague, unsubstantiated notion about the UK becoming the medical research centre of the world would be a serious mistake.
Update: Not surprisingly, Ian Brown has a few things to say about the Guardian report, and he reminded me of one of his many excellent contributions to the field, the Cybersecurity Knowledge Transfer Network's Privacy Engineering Whitepaper. Executive Summary:
"A stronger legal and regulatory environment, high profile privacy failures, and increasing public concerns build the case for enterprises to take privacy seriously. For those new to the subject, this paper describes the harms that privacy failures can lead to, and the reasons why privacy issues must be addressed. Harm may happen to individuals, to organisations, or to society as a whole, and enterprises should address the effects on all of these when contemplating new information systems. Leadership is essential if concern for privacy is to be embedded throughout an organisation’s culture, processes and systems.
For those attempting to design privacy into their systems, this paper provides guidance on the issues that must be addressed. The range of issues is broad, and we can only scratch the surface here. More work is needed to develop the detail, and we hope this paper will inspire that development. But the breadth and complexity of the issues also emphasises the need to develop skills and ethics within a profession of privacy practitioners.
Finally, this paper offers three clear conclusions about the nature of privacy issues, who is responsible, and how the threat of breaches can be vastly reduced by taking swift and appropriate measures."
Update 2: Speaking of patient confidentiality, Declan McCullagh has been taking EPIC and the Patient Privacy Rights group to task for complaining about Google's plans for predicting flu trends in the US.
"Google's recent announcement that it may have found a way to predict U.S. flu trends has led to the inevitable expressions of concern from some privacy groups.
The Electronic Privacy Information Center and Patient Privacy Rights sent a letter this week to Google CEO Eric Schmidt saying if the records are "disclosed and linked to a particular user, there could be adverse consequences for education, employment, insurance, and even travel." It asks for more disclosure about how Google Flu Trends protects privacy."
Declan suggests the privacy groups' real beef should be with the government, since Google are only publishing aggregated statistics, but he does accept that they are making a much more subtle point, and quotes Marc Rotenberg of EPIC in explanation:
"The basic question I'm asking to Google is: how can it be that across all these key terms, you can generate aggregate anonymized data without any risk of reidentification?
Put another way, what if an attorney general in a state where marijuana was illegal sent a subpoena to Google asking for the identities of anyone who typed in "how to grow pot?" Or if abortion were illegal in a certain state, what if the subpoena wanted to know who typed in "how to get an abortion?""
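Rotenberg's point is worth unpacking with a toy illustration. The Python sketch below is purely hypothetical (the log records, the k=3 suppression threshold and the aggregate function are all invented for illustration, and none of it reflects Google's actual pipeline): it shows the difference between the aggregate counts a service publishes and the raw, per-user query log those counts are computed from. The log is what a subpoena of the kind Rotenberg describes would target, and rare, sensitive queries are exactly where naive aggregation gives the game away.

```python
from collections import Counter

# Hypothetical raw query log: (user_id, region, search_term) tuples.
# Records like these, not the published aggregates, are what a
# subpoena would target.
raw_log = [
    ("user_17", "state_A", "flu symptoms"),
    ("user_23", "state_A", "flu symptoms"),
    ("user_42", "state_A", "flu symptoms"),
    ("user_07", "state_B", "how to grow pot"),  # rare, sensitive query
]

K = 3  # assumed suppression threshold: publish only counts from >= K users


def aggregate(log, k):
    """Count queries per (region, term), suppressing small groups.

    Without the threshold, a count of 1 for a sensitive term in a
    small region points straight at a single, re-identifiable user.
    """
    counts = Counter((region, term) for _user, region, term in log)
    return {key: n for key, n in counts.items() if n >= k}


published = aggregate(raw_log, K)
print(published)
# {('state_A', 'flu symptoms'): 3} -- the rare query is suppressed
# from publication, but it still sits in raw_log, tied to "user_07".
```

Even with the suppression threshold, the aggregates may be safe to publish while the underlying log remains fully identifiable. That gap, between "we only publish anonymised statistics" and "we only hold anonymised data", is the subtle point the privacy groups are making.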