Anthem, HIPAA, and encryption

Late last week, the Wall Street Journal reported that Anthem wasn’t encrypting the database containing tens of millions of health records that were stolen by sophisticated hackers.

There are numerous problems with that story, the first being that we don’t know yet whether the data was encrypted. Other unconfirmed reports say the attackers used a stolen username and password to get at the data; if that’s true, they likely could have decrypted the data anyway.

Still, I’m seeing calls now for the government to revise HIPAA to require encryption, rather than merely encourage it. And of course there are good and bad things about that as well.

The first problem is governments’ have-it-both-ways attitude toward encryption. If you’re not hiding anything, you don’t need encryption, the US and UK governments are saying now. The problem is that if you backdoor encryption algorithms to let governments see everything, the bad guys can find the backdoor and use it too.

The Anthem case shows that you need encryption even when you aren’t doing anything wrong, and this hits the government where it hurts because the US government is a large Anthem customer.

But the Anthem case also shows that encryption isn’t a panacea. If a bad guy gets a database administrator’s username and password, he can decrypt the data and run off with it. In fact, a skilled bad guy will probably re-encrypt the data to make it harder to spot as it leaves the company. (The first time my employer handed me a file and told me to steal it and then tell them how I did it, the first thing I did was encrypt it.)
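To make that concrete, here’s a minimal sketch of the re-encryption trick, assuming the third-party Python cryptography package (the stolen record is invented). Once the thief wraps the data in his own key, a content filter watching outbound traffic has nothing to match on:

```python
# pip install cryptography
from cryptography.fernet import Fernet

# A hypothetical stolen record -- exactly the kind of pattern a
# data-loss-prevention filter scans outbound traffic for.
stolen = b"ssn=123-45-6789,dob=1970-01-01"

# The thief generates a throwaway key and re-encrypts before exfiltrating.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(stolen)

# The telltale pattern is gone; to the filter, this is opaque noise.
print(b"123-45-6789" in ciphertext)  # False (barring a freak collision)
print(ciphertext[:24])               # looks like random base64
```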

There’s a lot more to this story than whether Anthem was encrypting its data.

The second problem is that encryption isn’t just an on-off switch. Some of the old, hopelessly broken, poorly written–excuse me, the right word is legacy–code that every company has is going to break when you encrypt the data. Or the code will work fine, but the Pentium 4 server that it’s residing on–it could also be a Pentium II or III–can’t handle the additional workload that encryption adds, so the application runs much more slowly. Suddenly a system that had been adequate can’t take care of the customers anymore.
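As a rough illustration of the extra workload, here’s a toy benchmark, a sketch only, again assuming the cryptography package; real numbers depend entirely on the hardware, and an old server fares far worse:

```python
import time
from cryptography.fernet import Fernet  # pip install cryptography

payload = b"x" * 10_000_000        # stand-in for a 10 MB batch of records
f = Fernet(Fernet.generate_key())

t0 = time.perf_counter()
copied = bytes(payload)            # baseline: just move the bytes
t1 = time.perf_counter()
token = f.encrypt(payload)         # the same data, now encrypted
t2 = time.perf_counter()

print(f"plain copy: {t1 - t0:.4f}s   encrypt: {t2 - t1:.4f}s")
```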

So the company will have to spin up multiple projects to take care of these problems: revising the old code, moving the application to hardware built sometime this decade, and possibly taking care of other problems along the way. Those costs will get passed on to consumers in the form of higher deductibles, higher copayments, reduced benefits, or some combination of the above. In the end, the systems will be faster and more reliable and the company will be more efficient and glad it did it, but there will be pain getting there.

The third problem is with regulation itself. Regulation is a contentious topic in Congress right now. I don’t think revisiting HIPAA is necessarily a bad thing, given that it was written in 1996. But the government has self-contradictory requirements here, and it will take time to pass a law, and the law will have a deadline that’s sometime even further in the future.

There is something to be said for writing better requirements into law, since that’s one way to ensure the industry gives encryption priority rather than letting it compete with other corporate initiatives, but the best-case scenario I can see is that it will take two years for it to make a difference–one year to write and pass a law that has a one-year deadline to begin complying with it. Two years is too long to wait for that solution.

What’s likely to happen in the meantime is that when large companies negotiate with health insurance providers, they will insist that the insurers encrypt their data, and they will write it into the contract. The advantage is speed: I’m sure those discussions started last week, and whoever manages to get it done this year will have an enormous competitive advantage over those who cannot. The problem is the lack of standards. If two companies demand contradictory things, the insurer can’t comply with both. When there’s a legal standard, the contract can simply say “encryption compliant with 2015 HIPAA standards.”

The final problem, though, is that all of this distracts from what might be more effective solutions. No matter how good our intentions are, bad things are going to happen on our networks. There are just too many holes in them. We have these problems because so few companies are good at vulnerability management and incident response. And since so few companies do this well, they’re not exactly in a position to write their better practices into their contracts.

Vulnerability management is the department that ensures those monthly and quarterly security patches get applied effectively. I’ve never seen a place that puts adequate tools and staffing into this effort. It isn’t glamorous work, but it’s absolutely necessary. A common breach scenario starts with a poisoned web page or e-mail attachment that lets the attacker in; the attacker then uses an unpatched operating system flaw to become an administrator. Fewer breaches happen if those holes exist for a week than if they exist for a year.
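Here’s a sketch of the bookkeeping behind that last point; the hosts, dates, and one-week tolerance are all invented for illustration:

```python
from datetime import date

# Hypothetical patch records: (host, patch release date, date applied).
# None means the patch still hasn't been applied.
records = [
    ("web01", date(2015, 1, 13), date(2015, 1, 16)),
    ("db02",  date(2015, 1, 13), date(2015, 2, 9)),
    ("hr03",  date(2014, 4, 8),  None),
]

TODAY = date(2015, 2, 10)
LIMIT = 7  # days of exposure we tolerate

for host, released, applied in records:
    exposed = ((applied or TODAY) - released).days
    status = "OK" if exposed <= LIMIT else f"EXPOSED for {exposed} days"
    print(f"{host}: {status}")
```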

Incident response is the department that spots bad things happening when someone gets around the other defenses and, hopefully, puts an end to them. I’ve never seen a place that puts adequate tools and staffing into this effort either. There’s more innovation going on in this space than in vulnerability management, and there’s a bit more glitz here too because there’s adrenaline involved, but the tools are expensive and not likely to be written into contracts, making them difficult to justify.
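And here’s a toy version of the sort of thing incident-response monitoring looks for; the log format and threshold are my own invention, and real tools parse real formats (syslog, Windows event logs) and do far more:

```python
from collections import Counter

# Made-up log lines: "timestamp user result source_ip".
log = [
    "2015-02-10T02:11:03 dbadmin FAIL 10.9.8.7",
    "2015-02-10T02:11:04 dbadmin FAIL 10.9.8.7",
    "2015-02-10T02:11:05 dbadmin FAIL 10.9.8.7",
    "2015-02-10T02:11:06 dbadmin OK   10.9.8.7",
    "2015-02-10T08:30:12 jsmith  OK   10.1.2.3",
]

THRESHOLD = 3  # failed attempts from one source before we alert

failures = Counter(
    line.split()[3] for line in log if line.split()[2] == "FAIL"
)
for ip, count in failures.items():
    if count >= THRESHOLD:
        print(f"ALERT: {count} failed logins from {ip} -- possible brute force")
```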

The only good news I see is for security professionals, or IT professionals looking for better opportunities. If you want to always have job openings available to you, you can do a lot worse than learning vulnerability management or incident response. Dabble in both enough to see whether either of them interests you, then learn the one that interests you more, and watch for the job openings. If you don’t find any, apply for security jobs and ask whether the job involves one or the other.

If you found this post informative or helpful, please share it!

3 thoughts on “Anthem, HIPAA, and encryption”

    • February 10, 2015 at 8:08 pm

      The idea makes sense, and most places are doing it or something like it because they’re required by law to do so, but it’s not a panacea. There is no panacea. Monitoring is effective only if you continue to patch your systems, centrally collect the logs from all of your servers and network hardware (and perhaps even some workstation data), harden systems, and do all of your traditional security. The problem isn’t so much that traditional security doesn’t work; it’s that networks are so large we can’t secure them quickly enough.

      Also, the headline is pretty misleading. They’re not saying let them in; they’re saying accept that some fraction of a percent will get in, and find them. There’s a difference. If you let your network get overrun with attackers, you’ll overwhelm the monitoring.

  • February 11, 2015 at 2:33 pm

    Thanks for the response. The article makes more sense now.
