Your Government’s Hacking Tools Are Not Safe
From Cellebrite to Shadow Brokers to the CIA dump, a string of recent data breaches has shown there is a real risk of government hacking tools being exposed.
The hackers will get hacked.
Recent data breaches have made it startlingly clear that hacking tools used by governments really are at risk of being exposed. The actual value of the information in each of these dumps varies, and some of it may not be all that useful on its own, but together they highlight a key point: hackers or other third parties can obtain powerful, supposedly secure tools of cyber espionage. And in most cases, the government does not appear to clean up the fallout, leaving the exploits open to reuse by scammers, criminals, or anyone else—for any purpose.
It’s as if someone posted a skeleton key online for breaking into an unimaginable number of locks.
“What we learn from the disclosures and leaks of the last months is that unknown vulnerabilities are maintained secret even after they’ve been clearly lost, and that is plain irresponsible and unacceptable.”
Indeed, even going back years, there are clear examples of hacking tools designed for governments being exposed to the wider public. In 2015, a vigilante calling themselves Phineas Fisher targeted an Italian surveillance company called Hacking Team. Phineas dumped full malware source code and installers into the public domain, for anyone to use. Just this Thursday, researchers disclosed that criminally connected hackers had downloaded the malware and repurposed it for their own operations. That group targeted European military personnel, journalists, and think tanks with the technology.
More recently, a different hacker breached a server belonging to Cellebrite. Cellebrite is one of the most popular mobile phone forensic firms in the market, supplying millions of dollars worth of phone cracking technology to law enforcement agencies around the world. In February the hacker released a cache of files related to Android, BlackBerry and iOS devices. Although some of the tools were seemingly copied from already publicly available jailbreaks, the files were still sitting on a Cellebrite server.
“It’s important to demonstrate that when you create these tools, they will make it out. History should make that clear,” the hacker told Motherboard at the time.
Now, US intelligence agencies are dealing with serious exposure around their own hacking tools. Wikileaks has allegedly provided details on stolen vulnerabilities used by the CIA to various affected software vendors, after releasing details on some of the hacking tools in March.
Just today, a known group of hackers calling itself The Shadow Brokers publicly released a slew of Windows exploits apparently belonging to the NSA, including at least one designed to target Windows 8 systems. The group has also dumped Unix-based attacks, and exploits for breaking into hardware firewalls.
These are not exploits for striking obscure, niche targets. Corporations, organizations, and individuals all rely on the affected products and technologies: hackers have already taken advantage of one Shadow Brokers tool to target Cisco customers.
Of course, this is not to say that every government exploit will inevitably become public. Vigilantes may target some organizations, nation states may focus on others, and a malicious insider may steal information too. Everyone handling these tools faces a different set of threats.
“[Shadow Brokers] is different in kind in my view. It’s very likely a major intelligence operation directed against the NSA. The others likely are not,” Thomas Rid, a professor at King’s College, told Motherboard in an email.
However, in the face of all this evidence, it would be naïve to think a government or contractor can guarantee the safety of a hacking tool. When an agency is looking to increase its technological capabilities—say, by developing a backdoor in consumer products, as in the recent legal tussle between the FBI and Apple—it should weigh the risk of this kind of exposure.
And when third parties do steal tools, perhaps agencies should take responsibility for that exposure, and work to address it, before their attacks are replicated by criminals or other groups.
At an event in March, Senator Ron Wyden said that agencies need to “be ready to clean up the mess when they do in fact hack innocent people or when their hacking tools fall into the wrong hands.”
Andrew Crocker, staff attorney at the Electronic Frontier Foundation, told Motherboard in an email, “there are significant risks of leaks and other inadvertent disclosure by the government, and any policy about vulnerability disclosure needs to take these risks seriously.”