Securius Newsletter

February 24, 2003
Volume 4, Number 1

Security Through Usability

By Seth Ross

In 1883, Dutch-born linguist Auguste Kerckhoffs published a groundbreaking article on military cryptography that is still widely cited for its admonishments against what is now known as "security by obscurity". In reviewing the historical use of cryptography, Kerckhoffs showed how, time and again, cryptographers developed complex and obscure cryptosystems that relied on the principle that the enemy wouldn't find out how they worked. Just as often, enemy cryptanalysts were able to break the systems, usually without detection. Even when generals realized that their crypto was broken, there was little they could do in a timely way, since the only solution was to replace entire cryptosystems.

Historically, cryptography had only been used by the elite, typically royalty, courtiers, diplomats, and top military leaders. By the late 19th century, the complexity of modern warfare and new communications technologies like the telegraph created new security needs. In the wake of France's defeat in the Franco-Prussian War, Kerckhoffs and other military thinkers realized that the need for secure communications was no longer restricted to top military commanders: the communications between generals and officers, and between officers and field units, needed protection as well. The idea of pervasive encryption was born.

Given the rise of field cryptography, Kerckhoffs realized that cryptographers could no longer assume that the enemy would not figure out how the underlying cryptosystem works -- the deployment of cryptography on the front lines vastly increased the risk that the enemy could steal and analyze the system's operations. He advocated what we might now call "open systems": instead of relying on the obscurity of the cryptosystem's underlying operation, Kerckhoffs argued that designers should assume that the enemy will either know or be able to deduce how the system works. Instead of relying on obscurity, he argued, security should depend on the strength of keys. In the event of a breach, only the keying material would need to be replaced, not the whole system.
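This division of labor -- public algorithm, secret key -- survives intact in modern practice. Here is a minimal sketch in Python using the standard library's HMAC-SHA256 message authentication construction (the helper names and key values are illustrative, not from any particular system): the algorithm is fully published, security rests entirely on the key, and recovering from a compromise means swapping the key, not the system.

```python
import hashlib
import hmac


def authenticate(key: bytes, message: bytes) -> bytes:
    """Compute an authentication tag for `message`.

    The algorithm (HMAC-SHA256) is completely public, per Kerckhoffs'
    principle; all security rests on the secrecy of `key`.
    """
    return hmac.new(key, message, hashlib.sha256).digest()


def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(tag, authenticate(key, message))


# If a key is compromised, only the keying material is replaced --
# the (public) system stays exactly as it was.
old_key, new_key = b"compromised key material", b"fresh key material"
msg = b"attack at dawn"
tag = authenticate(old_key, msg)
assert verify(old_key, msg, tag)        # holder of the key verifies
assert not verify(new_key, msg, tag)    # rotating the key invalidates old tags
```

Note that nothing about the code itself needs to be hidden from an adversary; publishing it costs no security, which is precisely Kerckhoffs' point.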

Kerckhoffs was the first to publicly identify the weakness of security by obscurity. The admonition not to rely on obscurity in security systems is often called "Kerckhoffs' Principle" or "Kerckhoffs' Law". The principle is taught in many computer science classes and is frequently cited and discussed by computer security experts.[1, 2, 3, 4]

In his seminal publication, "La cryptographie militaire", Kerckhoffs actually presents five other laws as well.[5] In general, these have received a lot less attention. Here are all six of Kerckhoffs' Laws (thanks to Fabien A. P. Petitcolas for the translation)[6]:

  1. The system must be substantially, if not mathematically, undecipherable;
  2. The system must not require secrecy and can be stolen by the enemy without causing trouble;
  3. It must be easy to communicate and remember the keys without requiring written notes; it must also be easy to change or modify the keys with different participants;
  4. The system ought to be compatible with telegraph communication;
  5. The system must be portable, and its use must not require more than one person;
  6. Finally, regarding the circumstances in which such system is applied, it must be easy to use and must neither require stress of mind nor the knowledge of a long series of rules.

While security designers and engineers have embraced requirement No. 2, there's been less focus on No. 6. Over the past 120 years, "Security Through Usability" has largely been overlooked. As a consequence, if you look at modern cryptosystems, you'll find that many rely on what I'll call "security through complexity".

Security through complexity is embodied in the "more is better" school of software development that achieves greater functionality and perceived value through "feature creep". It's the tendency to develop encryption programs and platforms with so many systems and subsystems, protocols, rules, caveats, and documentation sets that regular people cannot use them. Security engineers are aware of this drift towards complexity and unusability -- Alma Whitten published "Why Johnny Can't Encrypt" in 1999 [7] -- but given market and marketing forces, they are often unable to stop it.

PGP 8.0 embodies this trend. Philip Zimmermann's spare but functional encryption program has been transmogrified over the years into a multi-headed crypto hydra. It does AES. It does TripleDES. It does CAST. It does IDEA. Those are just the symmetric ciphers. It does Diffie-Hellman, RSA, and RSA Legacy. Do you know which ciphers you want to use, and why? It encrypts files, clipboard contents, disk partitions, or email via Outlook, Outlook Express, or Notes plug-ins. It supports key splitting, so that more than one user must "turn the key" at the same time to decrypt, as well as a Secure Viewer designed to resist TEMPEST snooping. It supports no fewer than five Certificate Authorities, four smart cards, and dozens of arcane key server operations.

The list of PGP features goes on and on. The point here is not to beat up on PGP, but to demonstrate a commercial violation of Kerckhoffs' "Security Through Usability" rule. Kerckhoffs was concerned about the "stress of mind" (_tension d'esprit_ in the original French) that complexity creates for crypto users in the field. Given the requirement for pervasive encryption, Kerckhoffs knew that complex systems would either break down in the field or simply be bypassed. While this problem is particularly acute under battlefield conditions, where fatigue, forgetfulness, and stress are chronic, it also impacts modern "road warriors", who are required to deliver timely results from far-flung locations while managing jetlag, technical problems, and cross-cultural pleasantries. Few users -- either military or commercial -- have the time or inclination to study documentation (the "long series of rules" to which Kerckhoffs refers).

Ironically, PGP was the first piece of crypto software with source code published on the Internet, and thus has been a model of compliance with Kerckhoffs' rule against relying on obscurity. The idea of publishing code for review has been almost universally embraced by the academic community and commercial crypto vendors, and most of the important ciphers like AES are open and available for anyone to review or use.

Now that transparency of operation and open systems are nearly universal in cryptography, perhaps it's time for all those who design and deploy cryptosystems to embrace usability. Kerckhoffs' Security Through Usability law should be taught in the universities alongside his admonitions against obscurity. Security buyers should beware of large scale, complex, expensive, and ultimately unusable systems like Public Key Infrastructures, which also violate Kerckhoffs' third requirement ("easy to remember or change keys") and fifth requirement ("not more than one person to operate"). Given the active threats posed by espionage and terrorism, the challenges of pervasive encryption are even more important now than they were in Kerckhoffs' era. It's time to make, and keep, our crypto simple.

See you next issue. 'Til then, keep your guard up!


[1] Whitfield Diffie, "Perspective: Decrypting the secret to strong security"

[2] Bruce Schneier, "Secrecy, Security, and Obscurity"

[3] Randy Bush, Steven M. Bellovin, "Security Through Obscurity Dangerous"

[4] Seungjoo Kim, "Rethinking Chosen-Ciphertext Security under Kerckhoffs' Assumption"

[5] Auguste Kerckhoffs, "La cryptographie militaire", Journal des sciences militaires, vol. IX, pp. 5-38, Jan. 1883, and pp. 161-191, Feb. 1883

[6] Fabien A. P. Petitcolas has a mini-site on Kerckhoffs' article that includes scanned images of the original

[7] Alma Whitten, J. D. Tygar, "Why Johnny Can't Encrypt: A Usability Evaluation of PGP 5.0"

Copyright © 1999-2011 Seth T. Ross. All rights reserved.