If you work in the realm of cyber security and monitor its goings-on, then you will probably have come across this hashtag lately: #wassenaar. Here we’re going to explain what’s happening, what exactly it means and how it might affect you.
Wassenaar is the name of the town in the Netherlands where, in 1996, 41 countries signed an arms control treaty. Signatories include the US, Russia and most European countries, but not China. The main aims of the treaty are to limit weapons of mass destruction, such as nuclear arms, and to control conventional weapons so they might be prevented from falling into the hands of terrorists and separatist factions.
Thus far, its main implication for us infosec folk has been that cryptography is listed as a munition, which means organisations such as the NSA can eavesdrop on and decrypt messages in countries belonging to the treaty. It also causes a good deal of bureaucratic hassle for those of us selling software to foreign countries, as most products include some form of encryption.
So, what’s new?
Last year, cyberweapons were added to the Wassenaar list of munitions, and on May 20th the US Bureau of Industry and Security proposed new rules to bring the US into compliance with the Wassenaar treaty. Cyberweapons are defined as follows:
Intrusion malware – A specific example given is malware found on the machines of Bahraini activists based in Washington.
Intrusion exploits – Zero-day exploits, i.e. vulnerabilities which have yet to be reported publicly and which are sold on the dark web and to governmental institutions for large sums of money.
IP surveillance products – Programmes designed to examine the Internet infrastructure of a country, monitor its citizens and try to discover who spies and officials are communicating with.
What concerns cyber security professionals most is the ‘Intrusion Exploits’ element of the new arrangement. Some of us spend a good deal of time examining vulnerabilities: discovering them and fixing them. What does this mean for us? In essence, it means you can’t sell your intelligence to China, or export it at all without releasing it to the NSA first.
Previously, such exploits could legally be sold to the NSA or to any other country, so it’s easy to see why this loophole is being closed. The trouble is that anyone working in cybersecurity, from researchers to product owners, could be affected by the new arrangement: if the NSA don’t know what you know, then you could be in hot water. For such information to leave the US, you would now need to apply for an export license, a process which gives the NSA access to your work and which is obviously controversial.
The real problem is that the ‘bad’ products the rules are trying to control are often indistinguishable from the good ones. This may also have implications for other tools, such as pen-testing tools and vulnerability scanners. So while the new Wassenaar arrangement targets bad products such as intrusion malware, its rules also apply to many good products essential for cybersecurity.
So what about open source?
Well, it seems that as long as your research is published somewhere, such as GitHub, before you try to export it, then you’re safe. If you were to export something unpublished (i.e. have it on your laptop and board a plane to a foreign country) without a license, then you could potentially risk 20 years in jail and a 1 million dollar fine. Costly if you were to make a mistake!
Further opposition to Wassenaar comes from the security research community who, understandably, see themselves as being under attack. The US is currently waging a sort of ‘war on cybercrime’, but unfortunately this is proving detrimental to security research as well. Proposed changes to the anti-hacking law (CFAA) would effectively make research into zero days illegal, and proposed changes to copyright rules (DMCA) could make research into vulnerabilities in existing products, such as aeronautical software, illegal too. Some of these measures seem to favour software companies rather than serving any national security remit. They also seem to extend the reach of the NSA, giving them access to all code before it’s exported as research or product, which raises some First Amendment issues as well.
Currently, this is all at the proposal stage and open for comments, which you can leave at the Federal Register. If these proposals are enacted, they could have huge implications for those of us working in cybersecurity. Watch this space.