An international coalition of civic society organizations, security and policy experts and technology companies – including Apple, Google, Microsoft and WhatsApp – has penned a sharply critical letter in response to a surveillance proposal made last year by the UK's intelligence agency, warning that it would undermine trust and security and threaten fundamental rights.
"GCHQ's ghost protocol creates serious threats to digital security: if implemented, it will undermine the authentication process that allows users to verify that they are communicating with the right people, introducing potential unintended vulnerabilities and increasing the risk of communication systems becoming abused or abused, "they drive.
"These cyber security risks mean that users cannot trust their communication is secure, as users would no longer be able to rely on knowing who is on the other end of the communication, thus creating threats to fun Damental human rights, including privacy and free expression. Furthermore, the systems will be exposed to new potential vulnerabilities and risks of abuse. "GCHQ's idea of a so-called" ghost protocol "would be for state intelligence or law enforcement agencies to be invisible CC service Suppliers for encrypted communication ̵[ads1]1; on what is billed as a targeted, authorized authority.
The agency floated the idea in an article published last fall on the Lawfare blog, written by the National Cyber Security Centre's (NCSC) Ian Levy and GCHQ's Crispin Robinson (NB: the NCSC is a public-facing branch of GCHQ) – which they said was intended to open a discussion about the "going dark" problem that robust encryption poses for security agencies.
The pair argued that such an "exceptional access mechanism" could be baked into encrypted platforms to allow end-to-end encryption to be bypassed by state agencies, which could instruct the platform provider to add them as a silent listener in order to eavesdrop on a conversation – but without the encryption protocol itself being compromised.
"It is relatively easy for a service provider to ask a law enforcement party to a group chat or call. The service provider usually checks the identity system and really determines who is who and which entities are involved – they are usually involved in introducing the parties to a conversation or conversation, Levy and Robinson claim. "You end up with everything that is still encrypted from end to end, but there is an end to this particular communication. This type of solution seems to be no more intrusive than the virtual crocodile clips that our democratically elected representatives and judges currently authorize in traditional voice playback solutions, and provide absolutely no power they shouldn't have. "
" We are not talking about the weakening of the encryption or defeat of the end-to-end nature of the service. In a solution like this, we usually talk about suppressing a warning on a target device, and only on the device to the target and possibly those with which they communicate. It's a completely different proposal to discuss, and you don't even have to touch the encryption. "
" [M] ass-scale, commodity, end-to-end encrypted services … today is one of the toughest challenges for targeted legal access to data and an apparent security dichotomy, "they added.
Although encryption might technically remain intact in the scenario they sketch, their argument glosses over both the fact and the risk of bypassing encryption by fiddling with authentication systems in order to enable deceptive third-party snooping.
As the coalition's letter points out, doing so would both undermine user trust and inject extra complexity – with the risk of fresh vulnerabilities that could be exploited by hackers. Compromising authentication would also hand platforms themselves a mechanism they could use to snoop on users' communications – thereby circumventing the wider privacy benefits provided by end-to-end encryption in the first place, perhaps especially when deployed on commercial messaging platforms.
So, in other words, just because what is being asked for is not literally a backdoor in encryption does not mean it is not similarly risky for security and privacy, and just as corrosive to user trust and rights.

"Currently, the overwhelming majority of users rely on their confidence in reputable providers to perform authentication functions and verify that the participants in a conversation are the people they think they are, and only those people. GCHQ's ghost protocol undermines this trust relationship and the authentication process," the coalition writes, pointing out that authentication remains an active area of research – and that that work would likely dry up if the systems in question were suddenly rendered fundamentally insecure by the state.
They further argue that there is no way to confine the security risk to the individuals government agencies specifically want to snoop on. Ergo, the added security risk is universal.
"The ghost protocol would introduce a security threat to all users of a targeted encrypted messaging program since the proposed changes do not being exposed only to a single target, "they warn." In order for providers to suppress alerts when a ghost user is added, messaging software will have to rewrite the software that each user trusts. This means that any errors in the development of this new feature can create an unintentional vulnerability that affects each user of that application. "
There are more than 50 signatories to the letter in all, including other civic society and privacy groups such as Human Rights Watch, Reporters Without Borders, Liberty, Privacy International and the EFF, as well as veteran security professionals such as Bruce Schneier, Philip Zimmermann and Jon Callas, and policy experts such as former FTC CTO and White House security advisor Ashkan Soltani.
While the letter welcomes other elements of the article written by Levy and Robinson – which also sets out a number of principles for defining a "minimum standard" governments should meet to have their requests accepted by companies in other countries (for example, the pair write that "privacy and security protections are critical to public confidence" and that "transparency is essential") – it concludes by urging GCHQ to abandon the ghost protocol idea altogether, and to "avoid any alternative approaches that would similarly threaten digital security and human rights".
Reached for a response to the coalition's concerns, the NCSC sent us the following statement, attributed to Levy:
We welcome this response to our request for thoughts on exceptional access to data – for example, to stop terrorists. The hypothetical proposal was always intended as a starting point for discussion.
It is gratifying to see support for the six principles and we welcome feedback on their practical application. We will continue to engage with interested parties and look forward to having an open discussion to reach the best solutions.
In 2016, the United Kingdom passed updated surveillance legislation that affords state agencies expansive powers to snoop on and hack digital communications. And with such an intrusive regime already in place, it may seem odd that GCHQ is pushing for even more power to snoop on people's digital chats.
Even robust end-to-end encryption can include exploitable vulnerabilities. One bug affecting WhatsApp was disclosed just a few weeks ago, for example (it has since been fixed via an update).
In the Lawfare article, the GCHQ staffers argue that "lawful hacking" of target devices is no panacea for governments' "legitimate access requirements", because it would require governments to keep vulnerabilities on the shelf to use to hack devices – which "is completely at odds with the demands for governments to disclose all vulnerabilities they find to protect the population".
"It seems stupid," they conclude.
Yet it also seems problematic – and predictably so – to propose a side door in authentication systems as an alternative to a backdoor in encrypted messaging apps.