The CIA triad of security has become controversial. I think this is due to a lack of understanding of what it means. The CIA triad remains a good fundamental model of why security exists and what it protects. Here’s what the CIA triad is all about, and what’s wrong with the trendy model some say should replace it.
The CIA triad refers to three things: the confidentiality, integrity, and availability of computer systems and data. Although it is an old model, it is also enduring.
An overview of the CIA triad of security
I received my introduction to the CIA triad in my Security+ training way back in 2008. I can’t speak for the Security+ exam of today, but in 2008, understanding the triad was fundamental to passing the exam. And I would argue that the first step in solving many of today’s security problems would be taking a step back and remembering these fundamentals.
Some people believe the order of the triad is important. I do not share that view. The triad is a three-legged stool. If you prioritize one leg over the others, you have an unbalanced stool, and likely a computer system that doesn't meet its requirements. If a problem doesn't neatly fit into one element of the triad, it's probably because it overlaps them. That's not a problem with the triad. It's a testament to its versatility. It's OK for a problem to fit into more than one category.
Let's talk about what each of the three elements of the CIA triad covers. You'll note a pattern in each of these. The CIA triad applies not just to the computer system, but also to the data the system contains.
Confidentiality

The confidentiality of a computer system and its data means that only the people who have a legitimate need for the data have access to it. Every time you hear about a data breach, the confidentiality of that data has been compromised. Data that was supposed to be private is now public.
Confidentiality means keeping internal data internal, but it also applies within a company. I shouldn't be able to look at my coworkers' HR files. I have no right to that data, and no need for that data.
Some companies take this to extremes, slicing their network up so that unrelated computer systems cannot talk to each other. This is good for making breaches more difficult. But it can also cause silos, making collaboration across teams difficult or impossible. Providing access to the people who need it is part of implementing confidentiality correctly. Confidentiality isn’t just saying no. It’s saying no when no is the right answer, and saying yes when yes is the right answer.
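Saying no when no is the right answer and yes when yes is the right answer can be sketched as a simple access check. This is a toy illustration, not any particular product's access model; the roles and records are hypothetical.

```python
# A minimal sketch of confidentiality as access control. The policy,
# roles, and record names below are hypothetical stand-ins.

# Map each record to the roles allowed to read it.
ACCESS_POLICY = {
    "hr_file": {"hr_staff"},
    "sales_report": {"sales", "management"},
}

def can_read(user_roles: set, record: str) -> bool:
    """Return True only if the user holds a role the record's policy allows."""
    allowed = ACCESS_POLICY.get(record, set())
    return bool(user_roles & allowed)

# Saying no when no is the right answer:
print(can_read({"sales"}, "hr_file"))      # a salesperson cannot read HR files
# Saying yes when yes is the right answer:
print(can_read({"hr_staff"}, "hr_file"))   # HR staff can
```

Note the default: a record with no policy entry is readable by no one, which errs on the side of confidentiality.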
Integrity

This leg of the triad arguably applies more to the data than to the system. But if the system doesn't have integrity, the data doesn't either. Integrity in the CIA triad of security means protecting the data from unauthorized or inadvertent changes. I could do a lot of damage by going into the accounting department and making random changes to a bunch of spreadsheets. At best, I make someone redo a lot of work. At worst, I cause a wrong decision that costs the company a lot of money.
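One common integrity control is to record a cryptographic hash of data while it is trusted, then compare against it later. Here's a minimal sketch using Python's standard hashlib; the spreadsheet contents are made up for illustration.

```python
# A minimal sketch of integrity checking: detect unauthorized or
# inadvertent changes by comparing a hash against a known-good value.
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 digest of the data as a hex string."""
    return hashlib.sha256(data).hexdigest()

original = b"Q3 revenue: 1,204,991"        # hypothetical spreadsheet cell
known_good = sha256_of(original)           # recorded while the data was trusted

tampered = b"Q3 revenue: 1,240,991"        # one small "random change"

print(sha256_of(original) == known_good)   # True: data is intact
print(sha256_of(tampered) == known_good)   # False: the change is detected
```

A hash tells you the data changed, not who changed it or why; it's one layer, alongside access controls and backups.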
Integrity is a major reason why I don't recommend overclocking. It can cause mathematical errors, which lead to incorrect calculations and incorrect data. You can probably live with that if you're playing video games. At work, incorrect calculations can lead to loss of life. That's why businesses don't overclock.
Availability

If there's one leg of the CIA triad of security that gets neglected, it's this one. Frequently we get so hung up on confidentiality and integrity that we make decisions that adversely affect availability. There's a joke that says the only perfectly secure computer system is the one that's encased in lead and never powered on. It's also wrong. That system has perfect confidentiality and perfect integrity, but no availability. In that computer system, denial of service is part of the design.
Availability means the people who need to use the computer system and the data contained in it are able to do so. This is the part of the triad where security professionals and the rest of IT are most prone to butt heads. Security wants a new update deployed yesterday. IT wants to test the update to make sure it doesn’t cause a blue screen loop. If you’ve ever had to fix a blue screen loop, you know why.
If the people who need to be able to get at the data can’t get to it, it doesn’t matter how wonderful the data is.
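To make availability concrete, it helps to translate an uptime target into the downtime it actually permits. The targets below are common industry examples, not figures from this article.

```python
# A rough illustration of what availability targets mean in practice:
# converting an uptime percentage into allowed downtime per year.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 (ignoring leap years)

def downtime_minutes_per_year(availability_pct: float) -> float:
    """Minutes of downtime per year permitted by an uptime percentage."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for target in (99.0, 99.9, 99.99):
    print(f"{target}%: about {downtime_minutes_per_year(target):.0f} minutes/year")
```

The jump from 99.9% to 99.99% is the difference between roughly nine hours and under an hour of downtime a year, which is why untested updates that cause blue screen loops are an availability problem, not just an inconvenience.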
The CIA triad of security vs the DIE triad
In recent years, it’s become trendy to say the CIA triad of security is obsolete and should be replaced with a new model, the DIE triad. The DIE triad is a security model for DevOps and the cloud.
That’s what’s good about the DIE triad and what’s bad about it. The CIA triad is universal. The DIE triad applies well to what’s popular today, but it doesn’t necessarily apply well to decentralized systems. Centralization is popular today. But centralization is a trend. Decentralization had its day in the 90s. When someone finds a way to make decentralized computing cheaper than cloud computing, decentralization will come back.
Notice I haven’t even mentioned yet what DIE stands for. That’s another problem with it. Its proponents concentrate more on the acronym than on what it stands for. The concept is computer systems don’t live long. It’s easier to spin up new systems and cut over to the new systems than it is to fix old systems or patch them. DIE DIE DIE. Computers are cattle, not pets. RAR!
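The "cattle, not pets" idea can be sketched in a few lines: instead of patching a long-lived server in place, you build a fresh one from an updated image, cut traffic over, and discard the old one. This is a toy model; the Server class and version strings are hypothetical stand-ins for real provisioning tooling.

```python
# A toy sketch of replace-instead-of-patch, the concept behind DIE.
from dataclasses import dataclass

@dataclass
class Server:
    image_version: str   # the base image the server was built from
    live: bool = False   # whether it is currently serving traffic

def replace_instead_of_patch(old: Server, new_image: str) -> Server:
    """Spin up a new server from a patched image, cut over, retire the old."""
    new = Server(image_version=new_image)  # provision from a fresh image
    new.live = True                        # cut traffic over to the new system
    old.live = False                       # the old system is discarded, not fixed
    return new

old = Server(image_version="2024.01", live=True)
new = replace_instead_of_patch(old, "2024.02")
print(new.image_version, new.live, old.live)
```

In real environments the provisioning, cutover, and teardown are handled by orchestration tooling, but the shape of the operation is the same: nothing is patched in place.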
What the DIE triad stands for
The DIE triad stands for three things: Distributed, Immutable, and Ephemeral. As you can see, some of it maps to the CIA triad, but some of it doesn’t. Distributed maps to availability, sort of. It’s a way to provide availability. Immutable maps to integrity, so that’s good, but you shouldn’t use a $12 word when a 10-cent word will do, and immutable isn’t a 10-cent word. It’s the kind of word we use to try to make ourselves sound smarter than we are. Why can’t we just say integrity?
And then there's Ephemeral. There's a problem in computer security that systems live forever. Design systems to be temporary and replaceable, and you solve the IBM mainframe problem. You also solve the hypothetical problem of the Windows NT 3.1 system that's still in production somewhere, celebrating its thirtieth anniversary soon. (This hypothetical problem ignores the reality that you're much more likely to find NT 4.0 or Windows 98, which causes us to lose credibility when security folks talk to the rest of IT.)
Problems with the DIE triad
DIE doesn’t even talk about confidentiality. I guess that’s because it’s supposed to go without saying? The other problem I have with the DIE triad is it reeks of micromanagement. The Lieutenant Colonel I worked for in my first security job wasn’t an expert in security and never claimed to be, but he sure got one thing right. One of the first things he said to me, that’s stuck with me for a decade and a half, is this:
We don’t care how they do it. We just want it to work.
The ironic thing about the DIE triad is that it ostensibly prevents you from being locked into IBM or Unisys mainframes or Microsoft Windows, but in doing so, it locks you into cloud offerings. It replaces the villains of old with Amazon and Google and a collection of the old villains reinvented. Microsoft, IBM, and Oracle all scrambled to get into the cloud as soon as they noticed Amazon's success with it.
Our job as security professionals isn’t to follow trends. It’s to protect the data and therefore protect the people who rely on it. While the DIE triad provides a model for doing that in the 2020s, it isn’t timeless. It’s a reaction to the 1990s and potentially limits our options in the future. The CIA triad of security, by contrast, works regardless of the decade and the underlying technology. Knocking the CIA triad and the Lockheed Martin Cyber Kill Chain is trendy right now, but if you want to remain useful when this year’s trends have passed, I recommend understanding and learning how to apply them.