Securius Newsletter

May 4, 2000
Volume 1, Number 5

Ten General Security Rules 1-5

By Seth Ross

This issue:

  • Rule 1: Security Through Obscurity Doesn't Work
  • Rule 2: Full Disclosure of Bugs and Holes Benefits Security
  • Rule 3: System Security Degrades in Direct Proportion to Use
  • Rule 4: Do It Right Before Someone Does It Wrong for You
  • Rule 5: The Fear of Getting Caught is the Beginning of Wisdom

Next issue:

  • Rule 6: There's Always Someone Out There Smarter, More
    Knowledgeable, or Better-Equipped Than You
  • Rule 7: There Are No Turnkey Security Solutions
  • Rule 8: Good and Evil Blend into Gray
  • Rule 9: Think Like the Enemy
  • Rule 10: Trust is a Relative Concept

NOTE: You may notice that this issue of the Security Outpost Bulletin is a bit late. The Internet team here at PC Guardian has been hard at work developing our next-generation computer security portal. As part of the launch of the portal, we'll be renaming this newsletter to the "Securius Newsletter", effective as of the June issue. In the meantime, I invite all of you to our sneak preview. Thank you for your continuing support and readership. Onward, Seth Ross


Much of computer security is concerned with specifics and details, procedures and tools. Sometimes it's helpful to pan out and contemplate the bigger picture. I've devised ten general rules or themes that present a way of looking at and thinking about computer security. I present the first five in this issue -- the final five will come in the next issue.

Note that a version of these rules first appeared in my book, _UNIX System Security Tools_ (McGraw-Hill, 1999). You can find out more about the book at

Here we go ...

Rule 1: Security Through Obscurity Doesn't Work

As they say in the movies, you can run but you can't hide. You may think that you're running an obscure home-based PC that no one would dream of breaking into, but your obscurity is no protection in an era when thousands of malicious little punks have access to powerful network scanning tools that may discover your system and its vulnerabilities. You may think that you're hiding critical data by burying it several directories deep, but you'd be wrong, given the powerful search facilities built into modern operating systems. A software or hardware vendor might realize that a hole exists in their offering but ship it anyway, thinking that no one will find it. These kinds of holes are discovered all the time.
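
The point about built-in search facilities can be made concrete. A few lines of Python (a stand-in for an operating system's own find tools) will sweep an entire directory tree for files with telltale names, no matter how deeply they're buried; the "*password*" pattern below is just an illustration, not a real attack signature:

```python
import fnmatch
import os

def find_files(root, pattern):
    """Walk the directory tree under root and collect files whose
    names match the given (case-insensitive) glob pattern."""
    matches = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if fnmatch.fnmatch(name.lower(), pattern):
                matches.append(os.path.join(dirpath, name))
    return matches

# Burying "passwords.txt" ten directories deep won't hide it:
for path in find_files(".", "*password*"):
    print(path)
```

An attacker with a shell on your system can do the same in one command; depth of nesting is no defense at all.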

At best, security through obscurity can provide temporary protection. But never be lulled by it -- with modest effort and time, your system will be found, your secrets discovered. You're better off deploying strong security safeguards -- from filesystem encryption to cables and locks -- than hiding.

Rule 2: Full Disclosure of Bugs and Holes Benefits Security

As noted above, some vendors may feel comfortable shipping software with security holes in the hope that the software is so complex and proprietary that no one will find them -- the tree hasn't fallen if no one was there to hear it fall. Some security professionals feel uncomfortable with the publicity that security holes and problems receive. They worry that announcing security exploits can give the "bad guys" ideas about how to attack systems.

On the other hand, the security community on the Internet has committed itself to sharing knowledge about holes and possible exploits: numerous mailing lists like Bugtraq and various newsgroups maintain open discussions intended to identify and then close holes. It's somewhat paradoxical, but the routine public disclosure of security problems benefits the overall security of the Internet and the systems on it. Security through disclosure works.

Note: This doesn't mean you should widely publicize a security hole as soon as you find it. Protocol requires that you contact the system vendor or authors of the affected program first, thus giving them a chance to develop a fix. It's good when security holes are announced. It's best if they're announced along with fixes.

Rule 3: System Security Degrades in Direct Proportion to Use

This is Farmer's Law (promulgated by computer security guru Dan Farmer): "The Security of a Computer System Degrades in Direct Proportion to the Amount of Use the System Receives."[1]

Ignoring availability for a moment, a computer that's powered down is more secure than one that's powered up. A computer that's powered down, in a locked cage, in a subterranean bomb shelter, with armed guards might be secure. Once one person is using a system, risk increases. Once two or more are using a system, risk increases even more. Put the system on the Internet and provide some services ... I'm sure you get the idea. As Dan says, "Ignorant or malicious users do more damage to system security than any other factors."[2]

The trade-off between security and usefulness/functionality is the classic computer security dilemma. Machines running Windows 95/98 support a wide variety of applications, but they don't natively support meaningful authentication, access control, encryption, etc. Many Linux distributions are built for maximum functionality and thus ship with massive collections of programs and wide-open security settings. On the other end of the continuum are bastion hosts set up as part of a firewall design. Many of these do one thing (e.g., filter packets between network A and network B) and one thing only. Analyze where you need to be along the security vs. functionality continuum and plan appropriately.

Rule 4: Do It Right Before Someone Does It Wrong For You

Computer security can never be implemented in a vacuum. Simply establishing security mechanisms doesn't guarantee that they will work as planned. Security policies and mechanisms must account for the legitimate needs of users: i.e., they must be done right. An organization can decree that no users will have Internet access only to find that savvy users can buy cheap modems to circumvent this policy, thus greatly increasing the organization's vulnerability. It would be better to set more realistic policies and provide for monitored, controlled access to the net in the first place. A firewall administrator may decide to implement a fascist firewall that only allows HTTP/web access via port 80, leaving users who need Telnet access to the outside out of luck. Instead, these users may discover that it's possible to encapsulate forbidden protocols in HTTP packets. The administrator would be better off providing for legitimate needs rather than encouraging workarounds that can create substantial and unknown risks. It's better to set things up properly yourself than to wait for someone to do it wrong for you.

Rule 5: The Fear of Getting Caught is the Beginning of Wisdom

Don't underestimate the value of deterrence. Many potential attacks can be prevented by instilling fear in the potential attackers.[3] Deterrence can be particularly effective against the amateur white-collar criminal or insider. The goal is to prevent the attacker's intent from reaching the critical point of action. There are many kinds of safeguards that can deter an attack, including login banner warnings such as:

"WARNING! Use of this system constitutes consent to security monitoring and testing. All activity is logged with your host name and IP address."[4]

Other possibilities include written reminders of computer-related laws, background checks, security briefings, and audits. Of course, these safeguards may not faze the hardened computer criminal, but even a pro will think twice after surveying a newly cracked system and finding that a strong filesystem crypto product like Encryption Plus for Folders has been installed or that a monitoring tool like Tripwire has been configured to write daily filesystem integrity reports to read-only media.
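
The idea behind an integrity checker like Tripwire can be sketched in a few lines of Python: record a cryptographic hash of every file, then diff the current state against that stored baseline. This is a simplified illustration of the technique, not Tripwire itself:

```python
import hashlib
import os

def snapshot(root):
    """Map each file under root to the SHA-256 hash of its contents."""
    hashes = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                hashes[path] = hashlib.sha256(f.read()).hexdigest()
    return hashes

def compare(baseline, current):
    """Report files added, removed, or modified since the baseline."""
    added = sorted(set(current) - set(baseline))
    removed = sorted(set(baseline) - set(current))
    changed = sorted(p for p in baseline
                     if p in current and baseline[p] != current[p])
    return added, removed, changed
```

A real tool also records ownership, permissions, and timestamps, and -- as noted above -- the baseline belongs on read-only media, so an intruder can't quietly rewrite it to match the tampered files.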

NEXT ISSUE: Rules 6 through 10. 'Til then, keep your guard up.


[1] Farmer admits that he probably wasn't the first to state it, but since he calls it "Farmer's Law," I will too.

[2] Dan Farmer,

[3] "Fear of the Lord is the beginning of wisdom" (Psalms 111:10)

[4] Warning text from

The Securius Newsletter is a service of GuardianEdge Technologies.
Copyright © 2006 GuardianEdge. All rights reserved.