Front doors and strong locks: encryption, privacy and intelligence gathering in the digital era 

MIT – March 2016

It’s great to be back here in Cambridge and always a privilege to be at MIT. Thank you to Danny Weitzner and the Internet Policy Research Initiative for inviting me.

When I took up the post of director of GCHQ, shortly after I had spent time here in the US studying privacy policy, I wrote in the Financial Times about the challenges agencies like mine, MI5 and the police in the UK, or the FBI and NSA in the United States, were facing as we began to see a new generation of terrorists and criminals take advantage, for their own purposes, of the extraordinary opportunities offered to all of us by the internet and the web.

The comments caused a bigger stir than I expected, and were widely seen as an attack on the tech industry. In fact I wanted to start a debate in the UK about how democratic government and the tech sector could work together within a clear and sufficiently transparent legal framework. I’m very grateful to MIT for allowing me to develop that a little further this afternoon.

It’s hard to think of a more appropriate setting, partly because of MIT’s role in the development of the technology itself – so many of the creators and shapers of the internet and web are here – but also because of wider Cambridge involvement in the policy implications: CSAIL’s ‘Keys Under Doormats’ paper and the Berkman Center’s more recent ‘Don’t Panic’ report, along with the US National Academy of Sciences’ report on the collection of bulk signals intelligence, are the key contributions of the last few years.

Right at the end of 2015, I suggested that it would be better to start this debate before rather than after acts of violence. I said that partly because we all want to stop those events happening – and I’ve never doubted the shared good intentions of all concerned – but also because in my experience the worst possible time for decision-making is after an atrocity. Emotions are heightened and positions polarised. One of terrorism’s objectives will always be to get free societies to overreact, turning on themselves.

My experience of the past 18 months has been more encouraging: we have had some sensible and constructive dialogue with the tech sector and academia. As I have consistently said in private and in public, government agencies do not have the answer here. The solutions lie with those who run the internet: that wonderful collaboration of industry and academia, civil society, governments and above all the public. The perception that there is nothing but conflict between governments and tech industry is a caricature; in reality companies are routinely providing help within the law and I want to acknowledge that today.

Reflecting on the experience of the past year, I want to do three things this afternoon: first, to say a little about our own approach to encryption, if only to lay to rest a few myths; second, to look at one aspect of the moral problem presented by what I would describe as the abuse of encryption; and finally, to say how we might work together to address this shared problem, from a UK perspective. Throughout, I am consciously avoiding offering solutions, because I don’t have them, and I think we will need to find them together. And I suspect those solutions will be diverse and fragile and dynamic in the future: they will not be 20th century solutions.

Encryption

First, encryption. The idea that we do not favour strong encryption is alien to anyone who has worked in my organisation. Information assurance is at the heart of everything we do. I am accountable to our Prime Minister just as much, if not more, for the state of cyber security in the UK as I am for intelligence collection.

For nearly 100 years we have been intimately involved in strengthening encryption. From traditional protection of military communications, through personal privacy online – including identity verification for government digital services – to the security of domestic smart power meters – where the design principle is that homeowners are in control of the data – to the security of the nuclear firing chain, we understand the importance of encryption for the economy and for the individual. That importance grows as more of our private lives move online and the economy becomes increasingly dependent on digital currency and block chain systems. We advise government, industry and individuals on how to protect their information appropriately.

Much of GCHQ’s work is on cyber security, and given the industrial scale theft of intellectual property from our companies and universities, I am acutely aware of the importance of promoting strong protection in general, and strong encryption in particular. The stakes are high and they are not all about terrorism.

GCHQ has a long-term strategy, called ‘secure by default’, to raise the security bar in commodity products. You’ll see more of this over the coming year, but in 2012 we published a set of principles for secure platforms that was fully endorsed by the industry. We are starting to see those coming to fruition now, making commodity platforms more secure off the shelf. My challenge to our new National Cyber Security Centre, launched as part of GCHQ later this year, is to emulate the breakthroughs of our predecessors by developing and promoting similar advances in secure communications and services for the benefit of all people and our economy.

We have a history here. We  may have become famous for cracking encryption, most notably at Bletchley Park in the Second World War, but it’s worth remembering that Alan Turing, our staff member who is most publicly associated with defeating Enigma, actually spent slightly more of his career with us designing a secure telephony system – here in the US, alongside US industry and government colleagues.

Strengthening and improving encryption has been the focus of our most brilliant mathematicians over the decades, and still is. Today, I am publishing on our website facsimiles of the two original papers by the late James Ellis from January 1970 on ‘The possibility of secure non-secret digital encryption’, and from May 1970 a parallel paper on the possibilities for analogue encryption. I will leave it to those of you far better qualified than me to judge their significance (and I will leave these first facsimiles with the MIT library).

I note that the analogue paper represents a direction which cryptography never took, seeming less relevant at a time when communications were becoming increasingly digital; but as we look again at ideas for security at the physical layer, perhaps this paper will be revisited.

But I publish them for three reasons, apart from the obvious point that in an age of greater transparency, 46 years seems a long enough wait.

First, the sheer boldness of Ellis’s concept (and of Malcolm Williamson and Clifford Cocks’s subsequent work) – mirrored independently on the outside by Diffie, Hellman, Rivest, Shamir, Adleman and others – is still staggering: it reversed centuries of assumptions about how communications could be protected, and that gives me some hope that our current difficulties can be overcome. In the face of their achievement, I instinctively question arguments that suggest technological innovation has no part in the solutions we seek.
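
For those less familiar with the mathematics, the ‘non-secret’ idea that Ellis anticipated, and that Diffie and Hellman later published, is that two parties can agree a shared secret while exchanging only public values over an open channel. A minimal sketch of a Diffie-Hellman style exchange in Python illustrates the principle; the parameters are toy values chosen for readability, not the standardised groups real systems use.

```python
import secrets

# Toy Diffie-Hellman key agreement. The prime below (2**127 - 1) is far too
# small for real use; deployed systems rely on vetted groups of 2048+ bits.
p = 2**127 - 1
g = 3

a_priv = secrets.randbelow(p - 2) + 1   # Alice's private exponent (never sent)
b_priv = secrets.randbelow(p - 2) + 1   # Bob's private exponent (never sent)

a_pub = pow(g, a_priv, p)               # these two values cross the open channel
b_pub = pow(g, b_priv, p)

shared_a = pow(b_pub, a_priv, p)        # Alice combines Bob's public value
shared_b = pow(a_pub, b_priv, p)        # Bob combines Alice's public value
assert shared_a == shared_b             # both arrive at the same secret key material
```

An eavesdropper who sees only g, p and the two public values faces the discrete logarithm problem, which is believed to be intractable at realistic key sizes.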

Second, the transformational power of PKI and RSA came of course from the combination of the altruistic academic brilliance of the people I’ve already mentioned, working on the same issues in the secret and the public academic domains, and the industry players of the emerging internet and web. Strong, relatively cheap encryption became democratised and enabled more secure communications on a global scale. Encryption went from being a tool of strategic advantage between superpower blocs to a key enabler of individual freedom and safety.

And, if we’re honest, we were not instinctively keen to share with academia outside government. Things have changed since those early days: as the Cold War declined, as academic research expanded and, most of all, as encryption became of such fundamental importance to us as individuals and to the societies we live in, we began to engage. We have since published several advances due to Cliff Cocks, which you may have seen. More recently, CESG released details of a failed quantum-safe protocol in order to help the wider cryptographic community better understand the complexity of this important area of research.

Third, economics was a key factor in the creation of PKI, as it has been in the development and direction of the internet itself. The sheer cost of secure distribution of symmetric keys during the Cold War prompted Ellis to look for ‘unthinkable’ alternatives. And of course economics – the relatively high cost of processing – made early development of PKI difficult; the internet and the web, on the other hand, made it absolutely essential. If Ellis were alive I’m sure he would be proud that almost every aspect of life online relies on PKI.
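
To give a feel for why key distribution was so costly, consider a network where every party needs to communicate securely with every other: purely symmetric cryptography needs a pre-shared key for each pair, a number that grows roughly with the square of the number of parties, whereas a public-key scheme needs only one key pair per party. A back-of-the-envelope sketch, with purely illustrative figures rather than anything drawn from Cold War practice:

```python
# Rough comparison of key-management burden (illustrative figures only).
def symmetric_keys(n: int) -> int:
    """Pairwise pre-shared keys needed so every party can talk to every other."""
    return n * (n - 1) // 2

def public_key_pairs(n: int) -> int:
    """Key pairs needed under a public-key scheme: one per party."""
    return n

for n in (10, 1_000, 100_000):
    print(f"{n:>7} parties: {symmetric_keys(n):>13,} symmetric keys "
          f"vs {public_key_pairs(n):>7,} key pairs")
```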

The Berkman Center’s recent excellent report makes this point very well: the economics of the internet makes going dark more complex than it seems, especially where end-to-end encryption is concerned.

The moral issue

But what the history of our cryptology teaches me above all is that the enduring problems in the debate over privacy and security are essentially moral rather than technical. And they are not really new, or at least new only in their application to the still relatively young domain of the internet and the web. In our generation, we are immersed in a debate which has much in common with that around wiretaps – ‘alligator clips’ on wires – or the interception of letters for earlier generations. We should take some comfort from that; if our predecessors could reach a sensible accommodation, albeit on a much smaller scale, then so can we.

At its root, the ethical problem presented by encryption is the problem presented by any powerful, good invention, including the internet itself, namely that it can be misused. TOR is the most obvious example: a brilliant invention that is still invaluable to those who need high degrees of anonymity, notably dissidents, human rights advocates and journalists; but an invention that is these days dominated in volume by criminality of one sort or another. The technology of the internet and the web is morally neutral, but those of us who use it are not.

The rational response is not to panic, or to assume that encryption itself is therefore bad, but to look for a sensible, pragmatic and proportionate response to a shared problem: the abuse of encrypted services by a minority of people who want to do harm to others.

This is the shared problem to which I referred earlier. It isn’t my problem, or law enforcement’s, or governments’, but society’s.

The solution is not of course that encryption should be weakened let alone banned. But neither is it true that nothing can be done without weakening encryption.

I am not in favour of banning encryption. Nor am I asking for mandatory ‘back doors’. I am puzzled by the caricatures in the current debate, where almost every attempt to tackle the misuse of encryption by criminals and terrorists is seen as a back door. It is an overused metaphor, or at least one misapplied in many cases, and I think it illustrates the confusion of the ethical debate in what is a highly charged and technically complex area.

One problem is that we approach this from very different perspectives. For those of us in intelligence and law enforcement, the key question is not which door to use, but whether entry into the house is lawful at all. In our case that means applying the European Convention on Human Rights, as set out in UK domestic law: is what we are doing lawful (and appropriately sanctioned), is it necessary, and is it proportionate, notably in its infringement of privacy?

Proportionality is, of course, the most contentious area. These are difficult judgements. If I look back to the earlier examples, they seem straightforward. The exploitation of a few key flaws in the otherwise brilliant design of the commercial Enigma machine, along with clever maths, early computing power, the outstanding industrial engineering of Tommy Flowers, and some old-fashioned espionage, enabled Allied victory and, as Eisenhower acknowledged, not only saved thousands of Allied lives but also brought the Holocaust to an end before the Nazis could complete their task.

This then is an easy example. Even though it was very large scale, it would be hard to argue that this was not proportionate, particularly in wartime. But it is worth remembering that the benefits at the time were far less clear than they appear with hindsight.

Nor would it make much sense to see this as the exploitation of a back door. What Turing and his colleagues recognised was that no system is perfect and anything that can be improved can almost inevitably be exploited, for good or ill. That is still true today. It does not follow that Enigma was a poor system, or weak or easily broken by anyone. In fact we continued to build and use improved derivatives of Enigma – Typex – for some years after the War. For Turing I do not think there was any such thing as an impregnable door, whether front, back or side: there were strong doors and degrees of security.

Turing also knew that human behaviour was rarely as consistent as technology: we are all, to some extent, too busy or careless in our use of it. Put more positively, we routinely make trade-offs not just between privacy and security but between privacy and usability. The level of security I want to protect the privacy of my communications with my family is high, but I don’t need or want the level of security applied to protect a nuclear submarine’s communications, and I wouldn’t be prepared to make the trade-offs that would require. It is not inconsistent to want, on the one hand, to design out or mitigate poor user behaviour, but at the same time to exploit that behaviour when lawful and necessary.

Today, judging the benefits and therefore what is proportionate is made more complex by the diversity of the threats, the diversity of technology – particularly the shift in terrorism from bespoke systems to the commodity services we all use – and the proliferation and transnational nature of communications, which heightens the potential for intrusion into the privacy of more individuals (though this can be exaggerated; as UK court judgements over the past two years have confirmed, our bulk collection does not equal bulk surveillance. They are different things.)

To come up to date, much of our effort in GCHQ at the moment is inevitably directed against ISIL. Few would disagree that this is a terrorist group that needs to be stopped. It is an easy area in which to establish common ground. This is an organisation which crucifies children in front of their parents and posts this online; that uses rape and sexual violence against girls and young women as a routine weapon of terror; that finds ever more creative and perverted ways of torturing and killing dissidents, journalists and gay people and of course it projects both radicalising propaganda and attacks back into Western countries. Faced with this threat, or with the proliferation of fissile material or chemical weapons, or the live streaming of child abuse to order, or the more routine day-to-day dramas of abduction and kidnap which affect ordinary families in both our countries, what is a proportionate response?

The key point I want to make this afternoon is that it is not for me, as an intelligence official and civil servant, or for a law enforcement officer, to make these broad judgements, whether about the use of data in general or encryption in particular; nor is it for tech company colleagues nor even for independent academics.

Since the trade-offs are for society as a whole, it must surely be for elected representatives to decide the parameters of what is acceptable. Within a transparent legal framework it is for those involved – government agencies, tech companies and academia - to work out what is possible together. And of course it is for the courts to monitor, test and enforce compliance.

Whether you are operating in a framework like the US, with its constitutional protections and separation of powers, or the UK, with its common law framework and the European Convention on Human Rights, lawmakers and the courts are essentially trying to reconcile a state’s first duty to protect its citizens with their right to privacy. Total security is not possible, and both frameworks, in their different ways, qualify the right to privacy – it does not extend to the right to harm others.

Democracy, for all its flaws, remains our best defence against the abuse of power and infringement of liberty and privacy, whether by governments or industry or the individual. It was, after all, from these democratic values that the internet was created and flourished, not vice versa. The internet is enhancing democracy in exciting new ways, but it is not a replacement for the democratic process, nor a parallel universe.

The UK position

In the UK we have just embarked on a new discussion of these broad issues and powers. Our Parliament is discussing a new bill which is intended to set out what is necessary and acceptable in as transparent a way as possible. It is based on a number of independent reviews by lawyers, parliamentarians and experts over the past 18 months.

It does not give the intelligence agencies new powers but it tries to put in one place powers which were spread across numerous statutes.

On encryption, it simply repeats the position of earlier legislation: where access to data is legally warranted, companies should provide data in clear where it is practicable or technically feasible to do so. No-one in the UK government is advocating the banning or weakening of encryption.

Defining what is reasonable and practical of course immediately engages proportionality. Does providing the data in clear endanger the security of others’ data?

The unwelcome answer which dissatisfies advocates at both ends of the spectrum is, it depends. Not everything is a backdoor, still less a door which can be exploited outside a legal framework.

The truth is that, within the parameters set by legislation, it should be possible for technical experts to sit down together and work out solutions to particular manifestations of the abuse of encryption. I suspect those solutions will not be a single fix, but diverse and increasingly dynamic, as the Berkman Center report suggests.

But here is the major challenge which the Berkman Center report understandably leaves hanging: law enforcement in particular needs solutions now, often against a ticking clock, and cannot safely wait for complex or fragile possibilities to be offered. There is an urgency which needs to be met, even if a comprehensive solution is beyond reach. The kind of big data solutions which are critical to us in what we call target discovery – for example finding people we don’t know about in northern Syria who are planning attacks – are not a substitute for what law enforcement needs now against a known individual.

To address this, we recognise that we need a new relationship between the tech sector, academia, civil society and government agencies. We should be bridging the divide, sharing ideas and building a constructive dialogue in a less highly charged atmosphere. That’s why I am here.

I’ve no doubt that we will need a new forum to facilitate this, bringing together the tech industry, government agencies, academia and civil society. A space where we can build confidence, have a frank dialogue and work out how we can best tackle the problems we all recognise, within the law.

For our part we are fully committed to a collaborative approach and want to support this actively. Our Prime Minister will be setting out further details in the coming months on how the UK government plans to facilitate this dialogue.

And this will be a dialogue that starts from the position I’ve set out today – that the government and its agencies support, and want to actively promote, effective encryption.

I hope also that this process will find ways of increasing public understanding of the issues raised by encryption. There is a strong shared interest in tech company customers, and in concerned citizens and voters, fully understanding how their data is protected: they are, after all, the same people.

While our jurisdictions are separate, the internet and its technologies are not, which is partly why I am here in Cambridge today.  I’m sure that any UK process will also therefore want to consider the international dimension and what norms of proportionality and reasonableness might apply.

I do not know where this dialogue might take us. It would be surprising if it ever reaches a final conclusion, not least because the internet and the technologies operating across it are unlikely to become static any time soon.

But pragmatic answers, developed in an atmosphere which is less heated, must be in everyone’s interest.

I hope it will bring us closer to the goal which I think is shared by all sides: moving those who misuse encryption and abuse the internet and web into the reach of the criminal justice system. Even agreeing this goal frees us to begin a new approach. The crucial point on which I hope everyone in the debate can agree is that there are solutions to this problem. Goodwill, expertise and cooperation are the key to finding them.

We have managed this cooperation in every other area of public safety. As The Spectator, a libertarian UK magazine, recently pointed out when writing on government and tech industry relations, agricultural fertiliser can be misused; we work with producers to make that less likely, and we hope that retailers would naturally report obviously suspicious activity, in the way that any concerned citizen would.

The fact that technology is more complex, or that the internet is still young and developing, does not alter the fundamental approach in a democracy: agreed objectives – what is good and bad – enshrined in transparent law-making, implemented by all with responsibility; and a presumption that we all want to help, within the parameters of what is lawful and practicable.

I think we would all of us – customers, companies, agencies – be happy to see the worst behaviour driven off major platforms. Tech industry leaders understand this: as Mark Zuckerberg’s comments in Berlin ten days ago about addressing anti-immigrant hate crime illustrate, we all worry about bad behaviour by a minority on their platforms.

Of course some people will find new places to hide unlawful activities, and new channels of communication, but our agencies were created to tackle those hiding places; what we need to avoid is effort diverted on all sides in tension between governments and the world’s major providers. Instead we should apply our collective goodwill and technical brilliance to meeting the hardest threats to society.

Our problem at the moment, in short, is that those who do harm are hiding in the noise of the internet by using what the rest of us use: pushing them off these channels is surely a shared goal for consumers, industry and government. We do not expect to reach perfection in this, but we need to clear some ground and know where to focus our efforts.

This brings me back to my first point. The intersection between the world I am trying to tackle – involving the worst of human behaviour – and the world of fantastic economic and social opportunity the internet is offering is relatively small. We are talking about a small minority. The security tail should not wag the dog. And of course sometimes there will be nothing we can do and we will have to accept that; but those surely should be the exceptions.

I do not for a second think that terrorism and the other abuses I have mentioned are any less abhorrent to the leaders of the tech industry than they are to me. Nor do I think they are any less interested in the safety and security of their fellow citizens than my staff are. But while this is my core business, I realise that it isn’t and shouldn’t be theirs. Our worlds overlap, but they are not the same; my point this afternoon is that they do not need to collide.

For governments, protection of their citizens is the primary duty. And those citizens expect both their safety and their privacy to be protected in a proportionate way, not one or the other. I think this is common ground between governments and the tech industry. It cannot be an unreasonable or undeliverable demand, if approached as a shared problem. We can, for example, work together – industry, governments, academia, civil society – to drive ISIL off the internet and bring them to justice.

So the debate for me, as I said earlier, is not about back doors or front doors. It is about whether entry into the house is lawful at all. It is about whether you risk letting anyone else in if you accept that the lawful authorities can enter with a warrant. This is a fundamental issue all liberal democracies have to grapple with, about striking the right balance. It is for constitutional and democratic processes, for elected lawmakers and, in some cases, for courts to determine the outcome.

The duty of those of us charged with public protection is twofold. One is to make clear to our elected leaders what assurance they can expect from different postures. I would never advocate a surveillance state, not least because I would not want my family to live in one; nor would my staff. The price of security can be too high. My job is to see what my organisation can and cannot do under various legal and operational frameworks, and what impact that will have on the risks posed to the public.

A second duty is then to operate whatever framework and risk posture the democratic process decides on, to the best of our ability. In the area of encryption that must mean some very practical cooperation with the industry. Whatever high-level framework, whatever posture democratic nations decide upon will need to be implemented by commercial providers. And this will get technical. That is where we will need goodwill on both sides. It is where, in the UK, I hope, the process the Prime Minister will set out in the coming months can shed some really useful light.

And for my part my promise today is to engage in that process with the tech industry openly, respectfully, and in good faith.

- Robert Hannigan