“We, the undersigned, support strong encryption, which plays a crucial role in protecting personal data, privacy, intellectual property, trade secrets and cybersecurity,” wrote a bunch of nations on the weekend — the Five Eyes, India, and Japan.
As a statement of intent, it’s right up there with “Your privacy is very important to us”, “Of course I love you”, and “I’m not a racist but…”.
At one level, there's not a lot new in this latest communiqué, the International Statement: End-to-End Encryption and Public Safety.
We like encryption, it says, but you can’t have it because bad people can use it too.
“Encryption is an existential anchor of trust in the digital world and we do not support counter-productive and dangerous approaches that would materially weaken or limit security systems,” the statement said.
“Particular implementations of encryption technology, however, pose significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children.”
The obviously important law enforcement task of tackling child sexual abuse framed the rest of the statement’s two substantive pages too.
End-to-end encryption should not come at the expense of children’s safety, it said. There was only a passing mention of “terrorists and other criminals”.
This statement, like all those that have come before it, tries, but of course fails, to square the circle: a system is either end-to-end encrypted, or it isn't.
According to renowned Australian cryptographer Dr Vanessa Teague, the main characteristic of this approach is “deceitfulness”.
She focuses on another phrase in the statement, where it complains about “end-to-end encryption [which] is implemented in a way that precludes all access to content”.
“That’s what end-to-end encryption is, gentlemen,” Teague tweeted.
“So either say you’re trying to break it, or say you support it, but not both at once.”
What’s interesting about this latest statement, though, is the way it shifts the blame further onto the tech companies for implementing encryption systems that create “severe risks to public safety”.
Those risks are “severely undermining a company’s own ability to identify and respond to violations of their terms of service”, and “precluding the ability of law enforcement agencies to access content in limited circumstances where necessary and proportionate to investigate serious crimes and protect national security, where there is lawful authority to do so”.
Note the way each party’s actions are described.
Law enforcement’s actions are reasonable, necessary, and proportionate. Their authorisation is “lawfully issued” in “limited circumstances”, and “subject to strong safeguards and oversight”. They’re “safeguarding the vulnerable”.
Tech companies are challenged to negotiate these issues “in a way that is substantive and genuinely influences design decisions”, implying that right now they’re not.
“We challenge the assertion that public safety cannot be protected without compromising privacy or cybersecurity,” the statement said.
The many solid arguments explaining why a back door introduced for some actors is a back door for all? No, apparently those are mere assertions.
“We strongly believe that approaches protecting each of these important values are possible and strive to work with industry to collaborate on mutually agreeable solutions.”
This too is an assertion, of course, but the word "belief" sounds so much better, doesn't it?
The “war on mathematics” is a distraction
As your correspondent has previously noted, however, the fact that encryption is either end-to-end or not may be a distraction. There are ways to access communications without breaking encryption.
One obvious way is to access the endpoint devices instead. Messages can be intercepted before they’re encrypted and sent, or after they’ve been received and decrypted.
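The point can be made concrete with a toy sketch. This is not real cryptography (the "cipher" is a throwaway XOR pad, and every name here is hypothetical), but it shows why endpoint access sidesteps encryption entirely: the plaintext necessarily exists on the device before encryption, so a hook planted there captures it without touching the maths.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Toy one-time-pad XOR; stands in for a real end-to-end cipher.
    return bytes(d ^ k for d, k in zip(data, key))

# Shared key known only to the two endpoint devices, never to the server.
key = secrets.token_bytes(64)

def sender_device(plaintext: bytes, capture_hook=None) -> bytes:
    # Spyware installed on the endpoint sees the message *before*
    # it is encrypted -- no cryptography is broken in the process.
    if capture_hook:
        capture_hook(plaintext)
    return xor(plaintext, key)

def receiver_device(ciphertext: bytes) -> bytes:
    return xor(ciphertext, key)

intercepted = []
msg = b"meet at noon"
wire = sender_device(msg, capture_hook=intercepted.append)

assert wire != msg                      # the network sees only ciphertext
assert receiver_device(wire) == msg     # the recipient decrypts normally
assert intercepted == [msg]             # yet the hook captured the plaintext
```

The encryption itself is never weakened; the compromise happens entirely at the endpoint, which is exactly why the either/or framing of "breaking" end-to-end encryption can be a distraction.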
In Australia, for example, the controversial Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018 (TOLA Act) can require communications providers to install software that a law enforcement or intelligence agency has given them.
Providers can also be made to substitute a service they provide with a different service. That could well include redirecting target devices to a different update server, so they receive the spyware as a legitimate vendor update.
Doubtless there are other possibilities, all of which avoid the "war on mathematics" framing that some of the legislation's opponents have been relying on.
Australia is hasty to legislate but slow to review
While Australia’s Minister for Home Affairs Peter Dutton busies himself with signing onto yet another anti-encryption manifesto, progress on the oversight of his existing laws has been slow.
The review of the mandatory data retention regime, due to be completed by 13 April this year, has yet to be seen.
This is despite the Parliamentary Joint Committee on Intelligence and Security having set itself a submissions deadline of 1 July 2019, and holding its last public hearing on 28 February 2020.
The all-important review of the TOLA Act was due to report by 30 September. Parliament has been in session since then, but the report didn't appear.
A charitable explanation would be that the government was busy preparing the Budget. With only three parliamentary sitting days, and a backlog of legislation to consider, other matters had to wait.
A more cynical explanation might be that the longer it takes to review the TOLA Act, the longer it’ll be before recommended amendments can be made.
Those amendments might well include having to implement the independent oversight proposed by the Independent National Security Legislation Monitor.
Right now the law enforcement and intelligence agencies themselves can issue the TOLA Act's Technical Assistance Notices and Technical Assistance Requests. One imagines they wouldn't want to lose that power.
Meanwhile, the review of the International Production Orders legislation, a vital step on the way to Australian law being made compatible with the US CLOUD Act, doesn’t seem to have a deadline of any kind.
In this context, we should also remember the much-delayed and disappointing 2020 Cyber Security Strategy. That seems to have been a minimal-effort job as well.
For years now, both sides of Australian politics have been hasty to legislate national security laws, but slow to review them. The question is: is it planned this way, or is it simply incompetence?