New UK government surveillance laws are so over-reaching that tech companies can't possibly meet all of their requirements, according to Apple, which argues the measures will make the online world far less safe.
Apple, WhatsApp, Meta all threaten to quit UK messaging
The UK Home Office is pushing to extend the Investigatory Powers Act (IPA) with a range of proposals that would effectively require messaging providers such as Apple, WhatsApp, or Meta to install backdoors into their services. All three companies are now threatening to withdraw their messaging apps from the UK market if the changes move forward.
They're making those threats for a very good reason: you cannot create a backdoor into software that will only be used by so-called "good guys." Any flaws will be identified and exploited in a range of attacks.
It is noteworthy that Apple sees these laws as so repressive of free speech and so invasive, while also being impossible to comply with, that it would have to cease offering messaging services in the UK, even though it continues to offer them in allegedly censorious China.
A threat to security
Further, the regulation the UK is attempting to pass is so draconian that it even lacks a review system and insists that tech firms share any security updates with the government before they're released. That puts a big block on fast security responses to all kinds of attacks, and means global audiences are left vulnerable while the Home Office decides what to do.
Apple’s lengthy response contains many arguments against the bill’s foolish proposals, and points out that the UK already has a broad set of rules to govern this. (The new rules also suggest the Home Office will seize the power to monitor messages of users located in other countries.)
“Together, these provisions could be used to force a company like Apple, that would never build a backdoor, to publicly withdraw critical security features from the UK market, depriving UK users of these protections,” the company warned.
The extended powers could dramatically disrupt the global market for security technologies, Apple also warns, “putting users in the UK and around the world at greater risk.”
Impossible to follow law under international obligations
I won’t go into all the arguments here (you should read them in their complete form), but one set of criticisms is particularly important: even if Apple could follow the UK law, doing so would put it in breach of existing international legal obligations.
In other words, the UK proposals are not in line with regulations already in place across its allied nations, including the US and European Union (EU). Apple argues the UK law would “impinge on the right of other governments to determine for themselves the balance of data security and government access” in their own countries. In plain English, it means the UK is deliberately putting itself in conflict with laws like the EU’s GDPR and the US CLOUD Act.
“Secretly installing backdoors in end-to-end encrypted technologies in order to comply with UK law for persons not subject to any lawful process would violate that obligation” [under GDPR].
The upshot is that Apple cannot obey this law without breaching existing regulations elsewhere, so it would have no choice but to quit the UK market.
A threat to free speech
Even worse, the way the act is constructed effectively means the UK gets a worldwide gag order on what people can say or share online. “That is deeply problematic, especially considering that the legal systems of most nations treat free speech as a fundamental individual right,” Apple said.
Another set of arguments relates to the way the UK seems to want to control security technologies. Not only does it want to vet which security technologies are used, it also insists on the power to forbid their use secretly, without oversight or review.
And a threat to security
The idea is that a UK minister could issue a notice forbidding the use of a technology, and that notice must be carried out even if subsequent review finds it inappropriate. This would force companies to withhold essential security updates, even when threats are being actively exploited.
This does not make anyone safe. Apple argues, strongly, that this is an inappropriate power, given the increased security threats emerging at this time. Globally, the total number of data breaches more than tripled between 2013 and 2021, the company said, citing this report.
The Act also weakens end-to-end encryption, which helps protect users against attacks, surveillance, fraud and worse.
My take
Apple’s complaints are completely valid. The proposals being rushed through by the UK government do not take into account the nation’s existing obligations. They are also deeply naïve.
Any move to weaken encryption will not only make the UK less digitally secure, but will also undermine digital security and privacy across every connected nation.
Given the value of digital trade across the UK, the proposals are a direct threat to economic prosperity, individual liberty, and state and enterprise security. It’s an appalling piece of legislation that will spawn imitations across every failing authoritarian state. It should be rejected.