Amber Rudd, the UK’s home secretary, has expressed her concerns over end-to-end encrypted messaging, arguing that tech companies are not doing enough to combat this internet enemy.
Over the years, tech giants such as Facebook, Telegram, Apple and Google have adopted end-to-end encryption, making it impossible for messages to be intercepted without the users’ permission.
End-to-end encryption is a method of secure communication that prevents third parties from accessing data while it is transferred from one end system to another. In clearer terms, nobody can snoop through the messages, whether an internet service provider or an application service provider. The messages are considered unbreakable no matter the circumstances.
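The idea can be sketched in a few lines of Python. This is a toy illustration only: it uses a deliberately insecure XOR cipher as a stand-in for real cryptography (services like WhatsApp actually use the Signal protocol), and the key, message, and party names are hypothetical. The point it shows is structural: only the two endpoints hold the key, so a relay in the middle, whether an ISP or the app’s own server, sees nothing but unreadable ciphertext.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with the key (insecure, demo only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Hypothetical key known only to the two endpoints, never to the relay.
shared_key = b"known-only-to-alice-and-bob"

# Alice encrypts on her own device...
ciphertext = xor_cipher(b"meet at noon", shared_key)

# ...the server in the middle merely relays bytes it cannot interpret...
relayed = ciphertext  # the relay never holds the key or the plaintext

# ...and Bob decrypts on his own device.
plaintext = xor_cipher(relayed, shared_key)
print(plaintext)
```

Applying the cipher twice with the same key restores the original message, so Bob recovers the plaintext while the relay, lacking the key, cannot.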
In view of this, Rudd says it is “true in theory” that end-to-end encryption stops the content of messages from being accessed, but “the reality is different”. In an interview with the BBC, she said:
“We want technology companies to work more closely with us on end-to-end encryption, so that there is particular need, where there is targeted need, under warrant, they share more information with us so that we can access it.”
By working more closely with tech companies, she wants metadata (the who, what, how and when of encrypted messages) to be made available to law enforcement officials. Metadata describes how, when and by whom a particular set of data was collected, but not the content itself.
However, technology companies are likely to resist any measure that would compel them to share too much data with the government.
Rudd added that legislation would be an alternative should companies refuse to clamp down on the spread of extremist content. She told the BBC:
“None of these materials should be on line. They need to take ownership over making sure it isn’t. It’s government that need to urge them to really take action so that we don’t have to go down the road of legislation, and get them to do it on a voluntary but urgent basis.”
Specifically, the home secretary said that tech companies must make an effort to see that online content is kept in check. To make this happen, “they have to make sure that material terrorists want to put up gets taken down, or, even better, doesn’t go up in the first place”.
In contrast to this suggestion, David Greene of the Electronic Frontier Foundation argues that the approach may result in content being blocked incorrectly.
“We are concerned that it’s going to lead to more takedowns; not more terrorist content but more content that’s mistaken for terrorist content being taken down.”
Beyond the trade-off between security and privacy, many innocent people use instant messengers such as WhatsApp to keep their conversations secure, and they would not be happy to see information passed in confidence made public.