Fail Securely

As the name implies, fail securely means a system remains secure when it fails. Imagine a building with electronic locks: when power is lost, should the doors remain locked or unlock? Fail securely would require them to remain locked.

UNIX/Linux systems use the fsck command (file system check) to repair disks that have been damaged. After rebooting, fsck can check for damaged disk sectors and attempt to repair them automatically. If this effort fails, many systems will open a root (superuser) session on the console terminal, with no authentication required. This is by design: if the disk is damaged, authentication may be impossible (due to a corrupted login program, password file, etc.). This is an example of failing insecurely: failing securely would require authentication in all cases.
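
To illustrate the principle in code, here is a minimal Python sketch (the authenticate and request_access functions are hypothetical, not drawn from any real system): any error encountered while verifying identity results in denial of access, rather than falling back to an open session the way the classic fsck failure dropped to an unauthenticated root shell.

# Hypothetical sketch: a fail-secure (fail-closed) access check.
# Any error during authentication results in denial, never in an
# unauthenticated fallback session.

def authenticate(username: str, password: str) -> bool:
    """Placeholder credential check; assume it may raise on corrupt data."""
    raise IOError("password database unreadable")  # simulate a damaged disk

def request_access(username: str, password: str) -> bool:
    try:
        return authenticate(username, password)
    except Exception:
        # Fail securely: if we cannot verify identity, deny access.
        return False

if __name__ == "__main__":
    print(request_access("root", "hunter2"))  # prints False, never True
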

Separation of Duties (SoD)

As we will discuss in Chapter 8, Domain 7: Security Operations, separation of duties (SoD) requires multiple people to complete critical or sensitive transactions. The goal of separation of duties is to ensure that anyone seeking to abuse their access to sensitive data or transactions must convince another party to act in concert. Collusion is the term for two parties conspiring to undermine the security of a transaction. The classic action movie example of separation of duties involves two keys, a nuclear submarine, and a rogue captain.
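
A minimal Python sketch (the DualControlTransaction class is hypothetical, used only for illustration) shows how dual control can be enforced in software: a sensitive transaction executes only after two distinct approvers sign off, so a single insider cannot act alone without colluding with a second party.

# Hypothetical sketch: separation of duties enforced in code.
# A sensitive transaction only executes when two *different* people
# approve it; a lone insider would have to collude with a second
# approver to proceed.

class DualControlTransaction:
    def __init__(self, description: str):
        self.description = description
        self.approvals: set[str] = set()

    def approve(self, officer: str) -> None:
        self.approvals.add(officer)

    def execute(self) -> str:
        if len(self.approvals) < 2:
            raise PermissionError("two distinct approvers required")
        return f"Executed: {self.description}"

if __name__ == "__main__":
    transfer = DualControlTransaction("wire transfer over $1,000,000")
    transfer.approve("alice")
    transfer.approve("alice")          # duplicate approvals do not count
    try:
        transfer.execute()
    except PermissionError as err:
        print(err)                     # two distinct approvers required
    transfer.approve("bob")            # a second, distinct approver
    print(transfer.execute())
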

Keep It Simple

Keep It Simple is also known as the “KISS principle” (short for “Keep It Simple, Stupid,” or more politely, “Keep It Simple, Silly”). Simpler systems are more secure than complex systems. As we will discuss in Chapter 9, Domain 8: Software Development Security, programmers typically make between 15 and 50 mistakes per thousand lines of code (unless they are formally trained to write secure code), and more lines of code means more bugs. To quote Bruce Schneier: “The worst enemy of security is complexity” [3].

The KISS principle was coined during the 1960s by Kelly Johnson, a lead engineer at Lockheed (now part of Lockheed Martin). During the Vietnam War, planes sometimes had to be repaired in the field, using only simple tools and techniques:

The principle is best exemplified by the story of Johnson handing a team of design engineers a handful of tools, with the challenge that the jet aircraft they were designing must be repairable by an average mechanic in the field under combat conditions with only these tools. Hence, the “stupid” refers to the relationship between the way things break and the sophistication available to fix them. The acronym has been used by many in the United States Air Force and the field of software development [4].

Trust, but Verify

“Trust, but Verify” is a Russian proverb (Doveryai, no proveryai) popularized by Ronald Reagan in the 1980s. The term is now used to describe a security model that relies on accountability and integrity. Trust, but Verify focuses on dual-factor authentication for all access (both local and remote), logging, and assuring the integrity of the logs. It is simpler than Zero Trust (and thus easier and cheaper to accomplish), but the “trust” portion is directly at odds with Zero Trust (often described as “Never Trust, Always Verify”) [5], which we will discuss next.

Aydan R. Yumerefendi and Jeffrey S. Chase describe Trust, but Verify:

  • Undeniable. Actions of an accountable actor are provable and non-repudiable. That is, a service or its clients cannot plausibly deny their actions, and those actions may be legally binding.
  • Certifiable. A client, peer, or external auditor may verify that an accountable service is behaving correctly, and prove any misbehavior to an arbitrary third party. For example, a service may be prompted to prove cryptographically that its actions are justified by the sequence of operations issued by its clients, in accordance with its defined semantics.
  • Tamper-evident. Any attempt to corrupt the service state incurs a high probability of detection. In particular, an external auditor may determine if the internal state could or could not result from the sequence of operations issued on the service [6].
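
As a rough illustration of the tamper-evident property (this sketch is not from the cited paper), the Python example below hash-chains log entries so that altering any earlier record invalidates the digests of everything that follows, which an auditor can detect by recomputing the chain.

# Hypothetical sketch of a tamper-evident log: each entry's digest
# covers the previous digest, so modifying or deleting an earlier
# record breaks verification of every later entry.
import hashlib

def _digest(prev_hash: str, entry: str) -> str:
    return hashlib.sha256((prev_hash + entry).encode("utf-8")).hexdigest()

class HashChainedLog:
    def __init__(self):
        self.entries: list[tuple[str, str]] = []   # (entry, digest)

    def append(self, entry: str) -> None:
        prev = self.entries[-1][1] if self.entries else "genesis"
        self.entries.append((entry, _digest(prev, entry)))

    def verify(self) -> bool:
        prev = "genesis"
        for entry, digest in self.entries:
            if _digest(prev, entry) != digest:
                return False
            prev = digest
        return True

if __name__ == "__main__":
    log = HashChainedLog()
    log.append("alice logged in")
    log.append("alice read payroll.xlsx")
    print(log.verify())                                        # True
    log.entries[0] = ("alice logged out", log.entries[0][1])   # tamper
    print(log.verify())                                        # False
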