Attempts, unofficially dubbed the "Crypto Wars", have been made by the United States (US) and allied governments to limit the public's and foreign nations' access to cryptography strong enough to thwart decryption by national intelligence agencies, especially the National Security Agency (NSA).
Two types of technology were protected: technology associated only with weapons of war ("munitions") and dual-use technology, which also had commercial applications. In the U.S., dual-use technology exports were controlled by the Department of Commerce, while munitions were controlled by the State Department. Since the market for cryptography in the immediate post-WWII period was almost entirely military, encryption technology (techniques as well as equipment and, after computers became important, crypto software) was included as a Category XIII item on the United States Munitions List. Multinational control of cryptography exports on the Western side of the Cold War divide was handled via the mechanisms of CoCom.
Encryption export controls became a matter of public concern with the introduction of the personal computer. Phil Zimmermann's PGP cryptosystem and its distribution on the Internet in 1991 was the first major 'individual level' challenge to controls on export of cryptography. The growth of electronic commerce in the 1990s created additional pressure for reduced restrictions. Shortly afterward, Netscape's SSL technology was widely adopted as a method for protecting credit card transactions using public key cryptography.
SSL-encrypted messages used the RC4 cipher with 128-bit keys, but U.S. government export regulations would not permit crypto systems using 128-bit keys to be exported, so browser vendors shipped weakened "export" editions effectively limited to 40-bit keys. At this stage Western governments had, in practice, a split personality when it came to encryption: policy was made by the military cryptanalysts, who were solely concerned with preventing their 'enemies' from acquiring secrets, but that policy was then communicated to commerce by officials whose job was to support industry.
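The practical gap between export-grade and domestic key lengths is easy to quantify with back-of-the-envelope arithmetic. A minimal sketch, assuming a hypothetical attacker testing one billion keys per second (the rate is an illustrative assumption, not a historical figure):

```python
RATE = 10**9  # assumed brute-force rate: 1e9 key trials per second (illustrative)

def worst_case_years(bits: int) -> float:
    """Time to exhaust a keyspace of 2**bits keys at RATE trials per second."""
    seconds = 2**bits / RATE
    return seconds / (3600 * 24 * 365)

# A 40-bit export-grade key falls in well under an hour at this rate;
# a 128-bit key would take on the order of 1e22 years.
print(f"40-bit:  {worst_case_years(40) * 365 * 24:.2f} hours of search")
print(f"128-bit: {worst_case_years(128):.2e} years of search")
```

The point of the exercise is that the 88-bit difference is a factor of 2^88, so no plausible improvement in hardware closes the gap; the export limit was a deliberate policy choice, not a rounding of the same security level.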
Legal challenges by Peter Junger and other civil libertarians and privacy advocates, the widespread availability of encryption software outside the U.S., and the perception by many companies that adverse publicity about weak encryption was limiting their sales and the growth of e-commerce, led to a series of relaxations in US export controls, culminating in 1996 in President Bill Clinton signing Executive Order 13026, which transferred commercial encryption from the Munitions List to the Commerce Control List. Furthermore, the order stated that "the software shall not be considered or treated as 'technology'" in the sense of the Export Administration Regulations. This order permitted the United States Department of Commerce to implement rules that greatly simplified the export of proprietary and open source software containing cryptography, which it did in 2000.
Until 1996, the government of the United Kingdom withheld export licenses from exporters unless they used weak ciphers or short keys, and generally discouraged practical public cryptography. A debate about cryptography for the NHS brought this out in the open.
The Bullrun program is controversial in that it is believed the NSA deliberately inserts, or keeps secret, vulnerabilities that affect law-abiding US citizens as well as NSA's targets, under its NOBUS ("nobody but us") policy. In theory, NSA has two jobs: prevent vulnerabilities that affect the US, and find vulnerabilities that can be used against US targets; but as Bruce Schneier has argued, NSA seems to prioritize finding (or even creating) vulnerabilities and keeping them secret. Schneier has called for the NSA to be broken up so that the group charged with strengthening cryptography is not subservient to the groups that want to break the cryptography of its targets.
In 2018, the NSA promoted the use of "lightweight encryption", in particular its ciphers Simon and Speck, for Internet of Things devices. However, the attempt to have those ciphers standardized by ISO failed in the face of severe criticism from the body's cryptography experts, fueled by fears that the NSA had non-public knowledge of how to break them.
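What made Simon and Speck attractive for constrained devices is their minimal add-rotate-xor (ARX) structure. A sketch of the smallest published variant, Speck32/64 (16-bit words, 64-bit key, 22 rounds), following the parameters and test vector given in the NSA's design paper:

```python
MASK = 0xFFFF              # 16-bit words (Speck32/64)
ALPHA, BETA, ROUNDS = 7, 2, 22

def ror(v, r): return ((v >> r) | (v << (16 - r))) & MASK
def rol(v, r): return ((v << r) | (v >> (16 - r))) & MASK

def speck_round(x, y, k):
    # One ARX round: rotate-right, modular add, xor with round key; rotate-left, xor.
    x = ((ror(x, ALPHA) + y) & MASK) ^ k
    y = rol(y, BETA) ^ x
    return x, y

def expand_key(k3, k2, k1, k0):
    # The key schedule reuses the round function, with the round index as the "key".
    l, ks = [k1, k2, k3], [k0]
    for i in range(ROUNDS - 1):
        l_new, k_new = speck_round(l[i], ks[i], i)
        l.append(l_new)
        ks.append(k_new)
    return ks

def encrypt(pt, key_words):
    x, y = pt
    for k in expand_key(*key_words):
        x, y = speck_round(x, y, k)
    return x, y

# Test vector from the Speck design paper (key 1918 1110 0908 0100, pt 6574 694c):
assert encrypt((0x6574, 0x694C), (0x1918, 0x1110, 0x0908, 0x0100)) == (0xA868, 0x42F2)
```

The entire cipher fits in a few dozen lines with no S-boxes or tables, which is precisely the "lightweight" property at issue; the ISO objection was not to the simplicity but to the absence of published design rationale.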
Following the 2015 Charlie Hebdo shooting, a terrorist attack, former UK Prime Minister David Cameron called for outlawing non-backdoored cryptography, saying that there should be no "means of communication" which "we cannot read". US President Barack Obama sided with Cameron on this. The call for action does not appear to have resulted in any legislation or changes to the status quo of non-backdoored cryptography being legal and available.
To me the first crypto war was more about encryption export controls and deliberately weakening cryptography than about the Clipper chip itself, that small chip being just one example of a weakened encryption device. We are still paying the consequences of this crypto war (e.g. the recent removal of server-gated cryptography from LibreSSL on June 18, 2015).
The second crypto war will be no different. If the public pushes back with enough force we will win some rhetorical concessions. But intelligence agencies are NOT going to stand for communications they are unable to intercept.
Some have pointed out that Crypto War 1 never ended and kept running quietly under cover, in the form of key escrow and black-box security chips, even though on the surface the UKUSA warhawk administrations seemed to relax export controls and openly held the AES and SHA-3 competitions, with all the distracting and positive noise that entailed (not that I am ungrateful for the progress of those competitions and the crypto advancements they brought).
I think the warhawk world governments have realized that keeping cryptographic algorithms under wraps and under active export controls is useless, given the widespread use of the Internet as a communication medium, and given people printing algorithms on clothing and in books and squeezing cryptographic protocols and implementation code into Twitter-sized chunks (the TweetNaCl library).
Knowledge of high-assurance security, as opposed to mere implementation of innocent cryptographic algorithms, remains largely the exclusive domain of governments. Higher-assurance (though still not high enough) security work has recently begun taking root in academic and commercial research, but the outright lack of usable implementation choices is daunting.
CW-2 is as problematic as ever: there is no form of cryptographic processing and key management that can be mandated to allow third-party access while credibly claiming that fidelity is maintained. Consider the Bitcoin authenticity model, and other distributed key-management systems in which a collection of independent nodes each hold only one piece of the cryptographic keymat puzzle. If this model is inverted, as is the suggested methodology, it quickly breaks down from a risk-analysis perspective.
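The threshold idea this comment gestures at, independent nodes each holding only a fragment of the key material, is usually formalized as Shamir secret sharing. A minimal sketch over a prime field (the field size, share counts, and function names here are illustrative choices, not any particular deployed system):

```python
import random

P = 2**127 - 1  # a Mersenne prime; any prime larger than the secret works

def split(secret, n, t):
    """Split `secret` into n shares; any t of them suffice to reconstruct."""
    # Random degree-(t-1) polynomial with the secret as its constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def poly(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod P
            acc = (acc * x + c) % P
        return acc
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789, n=5, t=3)
assert reconstruct(shares[:3]) == 123456789   # any 3 shares recover the secret
assert reconstruct(shares[2:]) == 123456789
```

The "inversion" the comment worries about would mean a mandated extra shareholder who can always reach the threshold; the risk analysis changes completely once one party holds enough shares unilaterally.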
Which brings me around to why trying to control cryptography will not work for tyrants or bureaucrats, except over authoritarian followers. The crypto genie is out of the bottle; or, for those wishing to suppress it, the horse has bolted and Pandora has opened the box. As far as crypto is concerned, it has escaped beyond the tyrants' and bureaucrats' recovery, and thus their control.
What if your research could help solve a looming national problem, but government officials thought publishing it would be tantamount to treason? A Stanford professor and his graduate students found themselves in that situation 37 years ago, when their visionary work on computer privacy issues ran afoul of the National Security Agency. At the time, knowledge of how to encrypt and decrypt information was the domain of government; the NSA feared that making the secrets of cryptography public would severely hamper intelligence operations. But as the researchers saw it, society's growing dependence on computers meant that the private sector would also need effective measures to safeguard information. Both sides' concerns proved prescient; their conflict foreshadowed what would become a universal tug-of-war between privacy-conscious technologists and security-conscious government officials.
A year earlier, Hellman had published "New Directions in Cryptography" with his student Whitfield Diffie, Gr. '78. The paper introduced the principles that now form the basis for all modern cryptography, and its publication rightfully caused a stir among electrical engineers and computer scientists. As Hellman recalled in a 2004 oral history, the nonmilitary community's reaction to the paper was "ecstatic." In contrast, the "NSA was apoplectic."
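The central mechanism of that 1976 paper, two strangers agreeing on a shared secret over a channel an eavesdropper can read, can be sketched in a few lines. The prime and base below are toy illustrative choices, not a vetted group:

```python
import secrets

p = 2**127 - 1  # toy modulus (a Mersenne prime); real systems use vetted groups
g = 5           # arbitrary public base, chosen for illustration only

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent, never transmitted
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent, never transmitted

A = pow(g, a, p)   # Alice sends A over the public channel
B = pow(g, b, p)   # Bob sends B over the public channel

# Each side combines its own private value with the other's public value:
shared_alice = pow(B, a, p)
shared_bob   = pow(A, b, p)
assert shared_alice == shared_bob  # both derive g^(a*b) mod p
```

An eavesdropper sees p, g, A, and B, but recovering the shared value from those is the discrete-logarithm problem, and that asymmetry is exactly what made the paper revolutionary and what alarmed the NSA.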
The fact that Hellman and his students were challenging the U.S. government's longstanding domestic monopoly on cryptography deeply annoyed many in the intelligence community. The NSA acknowledged that Diffie and Hellman had come up with their ideas without access to classified materials. Even so, in the words of an internal NSA history declassified in 2009 and now held in the Stanford Archives, "NSA regarded the [Diffie-Hellman] technique as classified. Now it was out in the open."
Meyer's letter alarmed many in the academic community and drew coverage by Science and the New York Times for two main reasons. First, the letter suggested that merely publishing a scientific paper on cryptography would be the legal equivalent of exporting nuclear weapons to a foreign country. If Meyer's interpretation of the law was correct, it seemed to place severe restrictions on researchers' freedom to publish. Second, Deborah Shapley and Gina Kolata of Science magazine discovered that Meyer was an NSA employee.
In his memo to Schwartz, Hellman made a lucid case for the value of public-domain cryptography research. Astutely, Hellman first acknowledged that the U.S. government's tight control over cryptographic techniques proved enormously useful in World War II: Allied forces used confidential cryptographic discoveries to improve their own encryption systems while denying those same cryptographic benefits to Axis powers. Even so, Hellman argued that circumstances had changed.