Why Anthem Was Wrong Not to Encrypt

Being provocative isn’t always helpful. Such is the case with Fred Trotter’s recent headline ‒ Why Anthem Was Right Not To Encrypt.

His argument that encryption wasn’t to blame for the largest healthcare data breach in U.S. history is technically correct, but lost in that technical argument is the fact that healthcare organizations are notably lax in their overall security profile. I found this out firsthand last year when I logged onto the network of a 300+ bed hospital about 2,000 miles away from my home office in Phoenix. I used a Chrome browser and a single malicious IP address that was provided by Norse. I wrote about the details of that here ‒ Just How Secure Are IT Networks In Healthcare? Spoiler alert: the answer is not very.

I encourage everyone to read Fred’s article, of course, but the gist of his argument is that ‒ technically ‒ data encryption isn’t a simple choice, and it has the potential to cause data processing delays. That can be a critical tradeoff when urgent access to patient records is needed. It’s also valid to argue that the Anthem breach should not be blamed on data that was unencrypted, but the headline itself is misleading ‒ at best.

I don’t disagree with Fred’s narrow technical argument, but there is definitely a larger issue that he chose to ignore. That larger issue ‒ and one I’ve written about frequently ‒ is what industry experts call a “culture of security.” The sheer volume of data breaches suggests a serious lack of that culture specifically in healthcare. A SANS Institute report last year highlighted the dire state of cybersecurity in the industry: New Cyberthreat Report by SANS Institute Delivers Chilling Warning to Healthcare Industry.

Less than six months before Anthem publicized its breach earlier this month, Community Health Systems (CHS) announced a breach of 4.5 million patient records. Some top security analysts have already begun to link the two (Anthem and CHS) ‒ right down to the lethal vulnerability discovered last April ‒ the Heartbleed bug. There’s even speculation that the actual breaches at Anthem and CHS may have occurred in fairly close proximity to each other (after April of last year). Again, something I covered here: Are the Data Breaches at Anthem and CHS Linked?

That “culture of security” means there’s a technical basis ‒ and logic ‒ for using the appropriate technology (software and hardware in tandem) to ensure that adequate data and network security is in place. Note the use of that word ‒ adequate.

There will never be perfect security. The attack surface is increasing ‒ exponentially with IoT ‒ and the attackers have only to find one vulnerability once. Defenders, on the other hand, need to defend against all vulnerabilities ‒ all the time. That equation gives the attackers the upper hand, and the gap between attackers and defenders is widening.

In the end ‒ we’ll likely see at least two outcomes from these new mega breaches.

  1. If it’s determined ‒ in court ‒ that the breach was the result of the Heartbleed bug, both Anthem and CHS will have a much harder time defending against negligence ‒ which means the damage awards will be significant.
  2. Whatever the final cost of both breaches (and those yet to come), as always, they will be passed on to each of us as patients and healthcare consumers in the form of higher premiums.

This last one is simply an extension of many other perverse incentives that exist throughout our for‒profit healthcare system. Why bother paying for an expensive barn door that locks when we can simply pass the cost of all the lost animals onto someone else? Sure, there will be hits to profits and earnings for a while, and some heads may actually roll (the CIO at Sony was summarily dismissed), but will these mega breaches (and others yet to happen) be enough to change the “culture of security” inside healthcare? Probably not ‒ and certainly not if strong technical voices like Fred’s continue to defend what amounts to a cavalier attitude toward security on the basis of a narrow argument ‒ even if that argument is technically correct.

A relatively high proportion of the healthcare executives we interviewed believe that the sophistication or pace of cyberattacks will increase quickly, and all of them agreed that attackers’ capabilities will likely outpace the capabilities of their organization. The healthcare sector appears to be the most underdeveloped, with 56% of healthcare respondents believing that their company spends insufficiently on cybersecurity. ‒ Risk and Responsibility in a Hyperconnected World, McKinsey, January 2014

The author is a writer for Forbes. He is based in Arizona. 

27 replies

  1. Anthem’s security breach is no different from one that any other large organization would face. Was this attack truly sophisticated, or could anyone have pulled it off?

    “Anthem was the target of a very sophisticated external cyber attack. These attackers gained unauthorized access to Anthem’s IT system and have obtained personal information from our current and former members,” Anthem President and CEO, Joseph R. Swedish, said in a statement.

    Healthcare organizations need to take precautions against data breaches. Training employees on how to spot a cyber attack can help to reduce the risk of a healthcare breach. Cybersecurity-related online communities are a good reference for employees who want more information. I would suggest Opsfolio.com, an online community for those involved with healthcare cybersecurity, which has been a good guide for me on healthcare cybersecurity information.

  2. Dan – great article! Especially in light of Fred’s awful and borderline pandering piece. And you have an even better follow-up to the interoperability issue presented here. “Easier” is not a valid reason to implement an insecure system. There are valid approaches to making required data accessible to processing with encryption intact, and leaving other data protected. For example – what everyday claims processing requires employment history and income attached to the patient/payer’s other info? (Note that Anthem has revealed this information was stolen – why do they even have it?!) I can see use cases for using this data in accounts payable, credit assessments, marketing/pricing segmentation and other non-medical purposes, but why does that data have to have the SS#, name attached? What about just an anonymous unique ID to drive the processing (see the sketch below)? I loathe regulation, but the insurance industry lacks other incentives to make them act responsibly on their own, so regulation may be our only hope.
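
    To make that “anonymous unique ID” idea concrete, here is a minimal Python sketch. It is a hypothetical scheme, not any payer’s actual design: the secret key, the ENR-style enrollment ID, and the separate identity vault are all assumptions for illustration. A keyed HMAC gives every processing system a stable, opaque token while name and SSN stay elsewhere.

    ```python
    # Hypothetical sketch of an "anonymous unique ID" for claims processing.
    # The key, enrollment IDs, and claim fields are invented for illustration.
    import hashlib
    import hmac
    import secrets

    PSEUDONYM_KEY = secrets.token_bytes(32)  # held apart from processing systems

    def processing_id(enrollment_id: str) -> str:
        """Stable, opaque token: the same patient always maps to the same
        token, but the token can't be reversed without the key."""
        return hmac.new(PSEUDONYM_KEY, enrollment_id.encode(), hashlib.sha256).hexdigest()

    # A claim record that never carries the SSN or name:
    claim = {
        "patient": processing_id("ENR-0042"),
        "procedure_code": "99213",
        "amount": 125.00,
    }
    print(claim["patient"])  # opaque hex token -- useless to a thief on its own
    ```

    The design mirrors tokenization in payment processing: the mapping back to a real identity lives in one hardened place instead of traveling with every record.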

  3. Can somebody explain the whole back story of the patient identifier thing?

    Why is this such a politically charged idea?

    I think a lot of people don’t get it.

  4. John, I disagree – but it’s technically nuanced. The most common approach to an encryption/decryption function is in software. That way it’s relatively seamless to the end-user – and relies on the credential. If the credential itself is compromised (which is, I believe, the way the Anthem breach was carried out), the software used to encrypt/decrypt would be ineffective because the user was “authorized” to the encryption software (and the software would have decrypted the data automatically). This view is supported by the disclosure that the admin noticed his/her account was in use by an unauthorized user. Subtle and nuanced distinction – but a classic example of where tech solutions are often complex – by design – and are often manipulated for criminal intent. (A minimal sketch of that failure mode follows below.)
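
    To illustrate the distinction, here is a minimal Python sketch using the cryptography library’s Fernet recipe. It is a toy model of “transparent” decryption, not Anthem’s disclosed architecture: the credential check and record layout are assumptions.

    ```python
    # Toy model of "transparent" encryption at rest: decryption is tied to
    # session authorization, so a stolen credential defeats it entirely.
    # Not Anthem's actual architecture -- a hypothetical illustration.
    from cryptography.fernet import Fernet

    KEY = Fernet.generate_key()  # application-layer key
    cipher = Fernet(KEY)

    # The record is encrypted on disk...
    stored_record = cipher.encrypt(b"name=Jane Doe;SSN=123-45-6789")

    def fetch_record(session_is_authenticated: bool) -> bytes:
        """Any 'authorized' session gets automatic decryption."""
        if not session_is_authenticated:
            raise PermissionError("login required")
        return cipher.decrypt(stored_record)  # transparent to the caller

    # To this code, an attacker holding a stolen admin credential is
    # indistinguishable from the admin: encryption at rest never pushes back.
    print(fetch_record(session_is_authenticated=True))
    ```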

  5. Although encryption was not the problem, had the data been encrypted it would have been nearly useless to the hackers. I agree with getting away from the SS# as an identifier. But it seems like a no-brainer that, given how common attacks for the purpose of identity theft are, any of those identifiers should be encrypted if the organization resists full encryption.

  6. Anshu – they resist because, absent a national standard, it’s not in their commercial interest to pay for the size and scope of that software change without industry agreement.

    I’ve long argued that it’s not a technical challenge – it’s a business one – and the for-profit stakeholders (large, incumbent software houses in slugfest competition with each other) can’t agree to one.

    This is why we have national bodies to create standards, like voltage – and which side of the street to drive on. Left to commercial interests – you get iOS v. Android v. Blackberry v. Microsoft every time.

    HHS/ONC was/is the logical choice (ONC actually stands for the Office of the National Coordinator) and HIPAA was the legislation authorizing them to do just that – until Congress stepped in and de-funded that one component. No money – no National Patient Identifier. Interesting footnote – HIPAA was also chartered to develop a National Provider Identifier. That wasn’t de-funded – and that is in use today.

    It’s (loosely) analogous to Vehicle Identification Numbers – or VINs. VINs didn’t really come into use until 1954 – and until 1981, each manufacturer used their own. Huge mess, because it was easy to fool consumers on the history of a car. Maybe it was stolen – or maybe it was flooded – or maybe it was a “lemon.” There was no way to “standardize” the history of a vehicle without a national VIN.

    Recognizing the huge mess – the National Highway Traffic Safety Administration (NHTSA) stepped in (1981) and built the standard still (largely) in use today.

    Problem (at scale) solved.

    We could/should do that in healthcare. HHS/ONC should be authorized to do that – but alas the healthcare IT industry has sold Congress on the idea that a) it’s not a BIG problem and b) they can solve it on their own (even though they can’t exchange data with each other – but that’s a whole ‘nother argument).

  7. Dan: I concur that it’s technically feasible. But I don’t fully understand why the healthcare companies wouldn’t want a national identifier. After all, they already use one (the SSN). Other than the one-time cost, it’s not hugely expensive relative to all the other costs. There may even be some savings longer term. Why do they resist?

  8. Anshu – we can absolutely build a National Patient Identifier. In fact, HIPAA had that baked into the original legislation! I wrote about that here:

    http://www.forbes.com/sites/danmunro/2014/03/16/who-stole-u-s-healthcare-interop/

    It’s NOT a technical challenge – it’s a financial one – on the part of an industry that’s fully committed to protecting its revenue. Lots of blame to go around – but the “industry” lobbied (heavily) to defund HHS/ONC – and they succeeded. As a result – HHS/ONC is legally prohibited from “looking” at the problem – let alone implementing a solution.

    This is also why Fin/Svcs has such a huge advantage. Credit cards are easy to cancel/replace (often), as are things like bank account #s etc… For that reason – the courts wouldn’t even listen to class action lawsuits on the part of consumers.

    All that changes with medical identity – which is why the lawsuits are stacking up like cord wood (both CHS and Anthem). The pay-outs will be sizable – and the industry will push those costs (as they always do) onto consumers in the form of higher premiums etc…

    The silver lining – if there even is one – might well be that Anthem + CHS (and maybe one or two more) will force the system to address an issue it has successfully avoided for decades.

    Digital health (and records/administration) changed the whole health care world – but not our way of thinking. There wasn’t much protection in the paper world either – except one big one. The inability to scale the hack. In the paper world – you had to capture “the paper.” That quasi “safety net” is now completely gone.

  9. Adrian – totally agree – but as long as we continue to use permanent identifiers (SS# and DoB) and things like logins/passwords (the most popular still being 123456 and password) – we’re all at significant/quantifiable medical and financial risk.

    Like you – I’m committed to change – but it is definitely an uphill battle (of Sisyphean proportions). Rick Scott (Governor of Florida) said it best:

    “How many businesses do you know that want to cut their revenue in half? That’s why the healthcare industry won’t reform the healthcare industry.”

    Now many will argue that Rick Scott (founder of Columbia – which merged with HCA) is a bona fide crook and that he should be in jail. Unfortunately, that ship has sailed – but his credibility to say what he said is second to none. If you or I said that – we’d raise a few eyebrows – but for him to say that? Pretty amazing and hard to refute.

  10. The Social Security Number being used as a default identifier is definitely a huge problem. My bank account number is not my SSN, and yet I can uniquely pass my info to you so you can send me money. Why can’t we do the same with health data?

    Using the SSN is the laziest, easiest way to deal with identifiers. There are technical solutions to this; healthcare companies have just not adopted them so far (why bother?). One such approach is sketched below.
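
    A bank-account-style identifier is simple to mint. The Python sketch below is illustrative only (the PT- format and the directory mapping are invented): the point is that a random identifier, unlike an SSN, can be revoked and reissued after a breach.

    ```python
    # Illustrative sketch: minting revocable, bank-account-style patient IDs.
    # The PT- format and the directory mapping are invented for this example.
    import secrets

    directory: dict[str, str] = {}  # public identifier -> internal record key

    def issue_patient_id(internal_key: str) -> str:
        """Mint a fresh random identifier for a patient record."""
        new_id = "PT-" + secrets.token_hex(8)
        directory[new_id] = internal_key
        return new_id

    pid = issue_patient_id("chart-001")
    print(pid)  # e.g. PT-9f86d081884c7d65

    # After a breach: revoke the stolen number and issue a new one.
    # An SSN or date of birth offers no such recourse.
    del directory[pid]
    print(issue_patient_id("chart-001"))
    ```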

  11. Dan — Identity is about people, not patients. It is not healthcare-specific. What is healthcare-specific is third-party payment based on secret contracts between private payers and private providers. Hence your quip about “financial incentive”.

    Whether it’s interoperability or security, the unique structure of US healthcare has forced us into hoping that regulations by HHS/ONC will solve obvious market failure. Five years and $30 Billion later, it’s time to reconsider that approach. What’s $30 B in a market that’s wasting $1000 B per year?

    As far as I know, the only place that is seriously looking at secure and privacy-preserving patient identifiers is our IDESG Healthcare Workgroup. It’s a public-private partnership with initial funding from a federal agency. IDESG is not healthcare-specific but healthcare is the only significant vertical industry in the process. Our success in IDESG is far from assured, partly because healthcare industry stakeholders are standing back from our group and working at yet another industry-specific and privacy-challenged hack.

    Market failures are difficult to fix.

  12. Anthem’s admission that it did not encrypt this database has, oddly, occupied everyone’s attention even after Anthem admitted to other things it did not do – which are far more relevant to whether the company could have prevented this attack. As I wrote on Feb. 14, http://www.ibj.com/articles/51789, Anthem told its employer clients that it did not use multi-factor authentication throughout its IT systems (even though every ATM in America uses that concept and even though Medicare requires its use around all sensitive data), and it did not employ user behavior analytics.

    That second concept would have given Anthem a much better chance of noticing that someone was transferring an estimated 35 gigabytes of data out of its system over the course of seven weeks. One cybersecurity expert described the fact that Anthem did not detect that much data leaking out of its system over such a long period of time as “outrageous” and “shocking.” (A back-of-the-envelope sketch of that kind of egress monitoring follows below.)

    Encryption most likely would not have prevented this data breach, although it might have slowed the hackers down a bit, giving Anthem a better chance at detecting their activity. Anthem’s lack of encryption seems most significant as a signal that it was not doing everything it could to protect consumers’ data. Encryption is a standard feature of the most common database software, and encryption at rest adds roughly an 8 percent delay in processing time. So the price of encrypting seems fairly small given the size of the risk. Then again, hindsight is always 20-20.
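
    Even crude user behavior analytics would struggle to miss an outflow of that scale. The Python sketch below is a back-of-the-envelope illustration (window size, threshold, and traffic volumes are invented): it flags any day whose outbound volume jumps well above a rolling baseline.

    ```python
    # Back-of-the-envelope egress monitoring: flag days whose outbound
    # volume exceeds a rolling baseline. All numbers are invented.
    from statistics import mean, stdev

    def flag_anomalies(daily_gb: list[float], window: int = 14, sigmas: float = 3.0):
        alerts = []
        for i in range(window, len(daily_gb)):
            base = daily_gb[i - window:i]
            mu, sd = mean(base), stdev(base)
            if daily_gb[i] > mu + sigmas * max(sd, 0.01):
                alerts.append((i, daily_gb[i]))
        return alerts

    # Normal traffic ~0.2 GB/day, then an exfiltration averaging ~0.7 GB/day
    # (roughly 35 GB over seven weeks) starting on day 30.
    traffic = [0.2] * 30 + [0.7] * 49
    print(flag_anomalies(traffic))  # catches the onset (days 30-31) before the
    # shifted baseline absorbs it -- one reason real UBA is harder than this.
    ```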

  13. Adrian – I don’t fault Fred directly – it’s a systemic problem. It’s *so* much more fun, sexy and exciting to deal with Silicon Valley “new.” As a computer scientist myself – I’ve seen this firsthand with lots of “digital health” solutions. It’s fun to “imagine” and “dream” everything new – but that’s also why top companies (the ones that really *could* drive change) aren’t remotely interested. Here’s a quote from Sergey Brin:

    “Generally, health is just so heavily regulated. It’s just a painful business to be in. It’s just not necessarily how I want to spend my time.”

    So – if that’s the attitude of the co-founder of Google – how does that translate to all the upcoming tech talent? It’s called signaling – and it’s not good.

    I’m not accusing Fred of this – because I think his personal commitment is definitely deeper and different – but it wasn’t fully articulated in his original piece – which is why I felt compelled to write the rebuttal.

  14. Adrian – because we “stripped” out national patient identifiers when HIPAA was first passed. As you know – the P in HIPAA stands for portability – not privacy. To accommodate that “portability” (and safety) – HIPAA was supposed to deliver a national patient identifier – but it got de-funded. As it is today – HHS/ONC is legally prohibited from even LOOKING at instituting a national patient identifier – even though they recognized the huge value and instituted one for providers.

    Absent an HHS/ONC effort – the default is Social Security #. In a paper-world – that actually had some safety – because it was hard to get – at scale. Today – hackers can walk in and grab 80 million – electronically.

    To coin a phrase intended for something else – digital health fundamentally changed the whole industry – but not our way of thinking. The (false) assumption was that the EHR industry would “sort this out” on their own. Guess what – they didn’t – because there was no “financial incentive” for them to do that. I call it criminal negligence – but, of course, no way to prove that in court (that I’ve found). With 80 million permanent identifiers hacked – we may get there yet ….

    FYI – I wrote about this a year ago under the guise of interoperability – not security – but the two are really flip-sides of the same coin. I’ve given up on interoperability – less of a real threat – in favor of security. If we don’t get this right – trust will evaporate.

    Disclosure: Our family is among the 80 million that could be affected by the Anthem breach. I don’t care that much about all of us as adults, but now we’re victimizing kids – at an unprecedented scale.

    http://www.forbes.com/sites/danmunro/2014/03/16/who-stole-u-s-healthcare-interop/

  15. Dan — Patient Privacy Rights agrees with the need for privacy-preserving and safer patient identifiers. Why is healthcare the only major industry to use “patient matching” instead of voluntary unique IDs?

  16. All – one thing the industry needs to rally around is stopping the default practice of using permanent numbers (like the SS#) for patient data. We adopted an NPI for providers – it’s time to adopt one for patients. HIMSS calculated that 18% of medical errors in hospitals are the result of mismatched patient IDs (the toy sketch below shows why matching fails where a unique ID can’t). Everyone loves the promise of big data/analytics – and it’s fun to work in that exciting tech domain – but let’s get the priorities right. Some of this is *really* basic/fundamental stuff – it’s been there for decades – and we can/should just fix it.
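
    Here is a toy Python sketch of why demographic “patient matching” misfires where a unique ID cannot. The records are invented: two different people can share a name and birth date, and one typo can split a single patient into two.

    ```python
    # Toy sketch: demographic matching vs. unique-ID lookup.
    # All records are invented for illustration.
    records = [
        {"id": "PT-001", "name": "john smith", "dob": "1980-03-14"},
        {"id": "PT-002", "name": "john smith", "dob": "1980-03-14"},  # different person
        {"id": "PT-003", "name": "jon smith",  "dob": "1980-03-14"},  # PT-001, typo'd
    ]

    def match_by_demographics(name: str, dob: str) -> list[dict]:
        return [r for r in records if r["name"] == name and r["dob"] == dob]

    def match_by_id(pid: str) -> list[dict]:
        return [r for r in records if r["id"] == pid]

    print(len(match_by_demographics("john smith", "1980-03-14")))  # 2 -- ambiguous
    print(len(match_by_demographics("jon smith", "1980-03-14")))   # 1 -- missed link
    print(len(match_by_id("PT-001")))                              # 1 -- exact
    ```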

  17. Fred – I agree – but the way the article was written begged for the rebuttal. We ALL need to endorse/support a “culture of security.” That doesn’t mean that encryption is always good – or required – but it does mean that more effort is absolutely required on the part of BIG institutions to protect our data. In Anthem’s case – there’s a pattern here. If you look back in the records – this wasn’t the first (and likely won’t be the last) time that Anthem was breached. Anthem (formerly known as Wellpoint) paid HHS $1.7M in 2013 for leaving data exposed over the web.

    http://www.hhs.gov/news/press/2013pres/07/20130711b.html

  18. Fred is technically correct – the Anthem breach didn’t happen because the data was unencrypted – but that’s no excuse not to embrace a culture of security. Encryption may or may not be appropriate in every circumstance, but the networks today are wide open. It’s not that the door is unlocked – or open – it’s that there’s no door at all.

  19. Correct – there is no going back – but there’s a lot more we can/should do to protect our medical identity. One thing is to stop using the SS# as an identifier – and another is to embrace a “culture” of data/network security. Anthem (and to a lesser extent CHS) are huge wake-up calls.

  20. Fred, I wonder if your perspective is colored by your passion as a data scientist? That hardly matters in this particular case, because hacking is an equal-opportunity menace, but it does provide some insight into what I have come to notice as three increasingly polarized perspectives on (big) health data: research, clinical, and consumer. Encryption stands to hinder “efficient” research. Encryption, to the extent it leads to stronger accountability, is somewhat threatening to clinicians. On balance, it’s more of a nuisance than a duty.

    To consumers, encryption is like hand washing, a sound practice and a show of respect. Technology is cheap and getting cheaper. Today’s helicopter is tomorrow’s drone and I’m sure there’s someone out there designing the first drone lawnmower. If researchers and clinicians expect to gain our trust for health data they would do well to treat it with a lot more respect.

    Adrian

  21. I have apparently failed to communicate the points in my previous article well.

    I had hoped to contrast two oversights that the mainstream press was making in covering the Anthem hack. The first was the knee-jerk reaction that mere encryption at rest is somehow protective against the type of hack that Anthem experienced, which it clearly isn’t. The second was that the NSA, by continuing to put energy into “attacking” our nation’s “enemies” with computers, was implicitly under-investing in helping organizations like Anthem protect themselves adequately.

    I tried to make some more subtle points too, but let’s take a moment and highlight some points that I did NOT make. I did not say anywhere, as Mr. Munro summarizes, that “encryption has the potential to cause data processing delays.”

    The problem with investing in encryption at rest, when it provides no additional security, is that other security investments are not afforded… so the overall security posture is effectively degraded.

    Mr. Briggs, in comments above, claims that I show a “lack of concern around the privacy and protection of patient data” and that I “think that technology can solve all problems, and if it can’t, then there is no solution at all.”

    I hold that using helicopters to cut grass is the use of the wrong tool for a given problem. I do not recommend that either. That is not to say that I am against helicopters. Nor am I against cutting grass. I can be a fan of helicopters, and cutting grass, without holding that helicopters are the right way to cut grass.

    Similarly, I believe in the use of strong encryption at rest for patient data. I also believe that patient data should be secured. The point of that article is that I do not always believe that strong encryption at rest is the best way to secure patient data.

    I do hope that one might click the link above and read what I actually had to say rather than trusting Munro and Briggs… who are both straw-manning my position in their own way.

    Thanks,
    -FT

  22. An excellent reply to Fred Trotter’s less-than-convincing argument for not encrypting patient data. I worked in financial services for several years at one of the biggest global banks in the world, and we figured out how to keep the business going after encrypting data both at rest and in transit. The Federal Reserve seems to be doing just fine exchanging encrypted data with its member banks as well. Most laptops holding sensitive data use full-disk encryption, and the sun continues to rise day after day.

    However, as Mr. Munro points out, the most appalling aspect of Trotter’s argument is the lack of concern around the privacy and protection of patient data. Indeed, the existing culture leaves much to be desired. Trotter’s words speak to the typical pathetic attitude of technical folks who seem to think that technology can solve all problems, and if it can’t, then there is no solution at all.

  23. Dr Palmer, pay attention to Dr Gropper. He knows something about his subject. You do not.

    My feeling is that no one is paying attention to the area in which we’re likely to find the most meaningful “solution” to health care data security, which is to punish the crap out of anyone using any personally identifiable data in anything that hints of a discriminatory fashion.

    Sure, tech wizards should do what they can to thwart snoops, snitchers, snatchers, and other data-filching creeps. But damage happens when people USE filched data in noxious ways.

  24. Technology can work to our security advantage if we put patients ahead of profits. As the attack surface increases, diversity needs to increase as well.

    Of course, diversity is not a strategic interest in our for-profit healthcare system. We’re constantly tending toward more consolidation into “Integrated Delivery Networks”, “Epic Everywhere”, “Health Information Exchanges”, “All Payer Claims Databases”, “Implantable Cardiac Defibrillators all connected to one vendor’s cloud”, etc…

    Everywhere you turn, the kind of diversity that used to secure the patient-physician relationship is being replaced by profit-driven consolidation. Open source, peer reviewed medicine is being replaced by secret-sauce EHRs and apps. What could possibly go wrong?

    Technology monoculture is just as risky as agricultural monoculture. It’s also unnecessary. Health data is high and growing in value while technology and connectivity are almost free and getting cheaper. Patients and physicians can own technology as individuals. The technology can be open source with all of the transparency, security, and diversity of the traditional interaction between a patient and a licensed professional.

    Let’s use open source technology to return control to the patient and her physician. The attacks will continue but they will not impact 80 Million people at a time.

  25. Well, there is no going back to handwriting. And encryption is too slow and non-interoperable. And we have to have interoperability. But the public is not going to stand for much more of this insecurity. We are speeding toward a brick wall.

    So, don’t we have to conclude that patients are simply not going to tell us those histories that are sensitive and will refuse to allow us to enter data that is potentially harmful or embarrassing?

    Therefore, no matter what we do, the EHR will be incomplete? And the payers will get off without paying for the sensitive components of care? …because they won’t get presented with those claims? And the patients may have to pay out-of-pocket for certain elements of care? …the records of which will be tucked away, hidden somewhere?

    The security issue is a crux for going forward, I think.