Will proposed surveillance powers weaken security for UK startups?

With the draft Investigatory Powers Bill laying out the UK government’s plans for the future of surveillance, how are tech companies and encryption likely to be impacted?

In January 2015, David Cameron was met with derision from the tech community when, for the sake of national security, he called for an outright ban on end-to-end encryption. This led to much public wrangling between the then coalition government and various stakeholders, leaving something of a question mark over how the government intended to reconcile the interests of tech startups with those of public safety. That was until the publication of the draft Investigatory Powers Bill last November, which revealed the government’s revised plans for surveillance powers in the UK. But how does the proposed legislation stand to affect the way companies employ encryption, not to mention the security of UK consumers?

First of all, it’s important to recognise that encryption doesn’t interfere with much of the traditional work done by the security services. “Encryption doesn’t prevent individual surveillance,” says Rafael Laguna, CEO of Open-Xchange, the provider of web-based communication, collaboration and office productivity software.

Whilst recent incidents such as November’s Paris terror attacks are held up in support of the Investigatory Powers Bill, eight out of nine of those involved were already known to the security services. And intercepting the communications of known individuals through existing legal channels is fairly trivial. “You’re simply tapping the phones themselves, recording keystrokes and shooting screenshots,” Laguna continues. “No matter if the mail or the file was sent encrypted or not, when the user eventually looks at it, then you’ve got it.”

Instead, much of the Investigatory Powers Bill’s focus is on facilitating mass surveillance. But adding more hay won’t necessarily make it easier to find the needles. “People cannot grasp the amount of data that has been collected and the limits of what can effectively be done with it,” says Laguna. Questions have already been raised about the ability of security agencies such as the NSA and GCHQ to interpret the huge quantities of data they have collected and turn it into actionable intelligence. “Collecting even more data won’t help,” Laguna says.

Unlike previous versions, the latest incarnation of the Investigatory Powers Bill doesn’t require the security services to have first-hand access to consumers’ encryption keys; it merely requires companies to be able to decrypt communications when asked. But whilst this sounds like a step forward, it means tech companies must retain encryption keys on their servers, potentially exposing both them and their customers to significant risk.
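
To make the distinction concrete, below is a minimal sketch of end-to-end encryption written with the PyNaCl library (an assumed illustration only, not Apple’s or Open-Xchange’s actual implementation). Key pairs are generated on the users’ devices and only public keys are ever shared, so the provider relaying the messages holds nothing it could be compelled to decrypt.

    # Minimal end-to-end encryption sketch using PyNaCl (illustrative only).
    from nacl.public import PrivateKey, Box

    # Each user generates a key pair locally; private keys never leave the device.
    alice_private = PrivateKey.generate()
    bob_private = PrivateKey.generate()

    # Alice encrypts for Bob using her private key and Bob's public key.
    sending_box = Box(alice_private, bob_private.public_key)
    ciphertext = sending_box.encrypt(b"Meet at six?")

    # The service provider only ever relays ciphertext. Without a private
    # key there is nothing it can read, and so nothing to hand over.
    receiving_box = Box(bob_private, alice_private.public_key)
    assert receiving_box.decrypt(ciphertext) == b"Meet at six?"

A provider built this way could only comply with a decryption notice by changing its architecture, for example by generating or escrowing keys on its own servers, which is exactly the retained-key design the following concerns apply to.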

Essentially the problem is twofold. Firstly, if tech firms are forced to retain the encryption keys of their users, those keys are open to abuse from the inside. In 2013, multiple cases came to light of NSA operatives abusing their positions to spy on romantic interests. Placing customer encryption keys somewhere employees can access them potentially makes tech firms vulnerable to similar misconduct. “If you don’t need it, don’t store it,” Laguna says. “Not having the data in the first place is the best thing that you can do.”

More significantly, gathering encryption keys in one place would make it fairly trivial for an external threat to access consumer communications on a truly massive scale. “If we introduce backdoors, we make it much easier for all the bad hackers to do terrible things,” says Laguna. “Everything is exposed with one hack.” All it would take is for a single exploit to be found in a high-profile company’s security and potentially millions of private conversations could be accessed at a stroke. “Imagine that in the hands of ISIS, Russia or even the US,” he says. “That gives such incredible power that we would be preparing for armageddon.”

This problem is compounded by the fact that the Investigatory Powers Bill is only likely to reach mainstream communication channels, rather than the tools favoured by the most serious security threats. Strong encryption software is freely available on the internet, so it is simple for criminals and terrorists to exchange messages that can’t be intercepted by the security services, whilst common consumer services are potentially compromised. “You’re taking away security for all the lay people and you’re not taking it away for the terrorists,” Laguna says. “That’s the end result.”

But it’s not only consumers who are jeopardised by these changes. We don’t need to look far for examples of high-profile security breaches that have severely damaged confidence in a brand, with the recent TalkTalk, Ashley Madison and iCloud hacks still fresh in the memory. For companies that have staked their reputation on the security their service provides, the price of complying with the regulation could prove too high. “If you are a service whose users heavily rely on the security of their data then you may have to pull out of countries that threaten your model,” Laguna says.

And this is perhaps one of the biggest risks that the UK government is taking with the Investigatory Powers Bill. Apple has already expressed concerns that complying with its requirements will simply prove incompatible with the underlying architecture of key services like iMessage, which employ end-to-end encryption that even the tech giant itself has no way of breaking. “I wouldn’t be surprised if Tim Cook decides they need to pull out of this country,” says Laguna. Whether the government believes it can risk driving one of the world’s largest tech brands away from these shores remains to be seen.

In light of these factors, one could fairly ask why the Investigatory Powers Bill is being so vigorously pursued. Having been born in East Germany and witnessed the abuses of the Stasi first-hand, Laguna believes part of the problem is that British people lack similar experiences, which has made them more complacent about the risks of ubiquitous surveillance. “We should know better about what this kind of mass surveillance really does to the way countries operate,” Laguna says. “I hope that we’ve learnt that lesson from history.”

ABOUT THE AUTHOR
Josh Russell