Categories

Tag: Privacy

For Your Radar — Huge Implications for Healthcare in Pending Privacy Legislation


By VINCE KURAITIS and DEVEN McGRAW

Two years ago we wouldn’t have believed it — the U.S. Congress is considering broad privacy and data protection legislation in 2019. There is some bipartisan support and a strong possibility that legislation will be passed. Two recent articles in The Washington Post and AP News will help you get up to speed.

Federal privacy legislation would have a huge impact on all healthcare stakeholders, including patients. Here’s an overview of the ground we’ll cover in this post:

  • Why Now?
  • Six Key Issues for Healthcare
  • What’s Next?

We are aware of at least 5 proposed Congressional bills and 16 Privacy Frameworks/Principles. These are listed in the Appendix below; please feel free to update these lists in your comments. In this post we’ll focus on providing background and describing issues. In a future post we will compare and contrast specific legislative proposals.

Continue reading…

Give Young Adults Needed Privacy for Health

Under the 2010 health care overhaul, millions of young adults in the United States can access health care on a parent’s health insurance policy. That’s a good thing because it means they are more likely to get preventive care that can keep them from getting sick in the first place.

Yet a glitch in the system means that young adults might forgo treatment for conditions they don’t want their parents to know about — such as sexually transmitted diseases.

These young people are afraid, and rightly so, that an insurance company will send an explanation of benefits (EOB) home to the parent who holds the health insurance policy. And that means Mom or Dad will know about the services they received at the doctor’s office.

Research suggests that young adults ages 19 to 26 will skip a visit to the doctor if they are worried about privacy. In the worst-case scenario, that translates to no treatment at all or delayed care for sexually transmitted diseases, mental health problems, substance abuse, domestic violence, unplanned pregnancies and many other serious and potentially costly conditions.

EOBs do serve an important function. These letters document receipt of health care services, listing specific information such as the type of care, the patient’s name, the provider, total payment made and the date of service. They’re required by law in most states, because they notify the patient about services received and encourage them to report errors or fraudulent billing to the insurer — and in this way, they save money for our health care system.

At the same time, this glitch in the system can negatively affect an individual’s health. For example, if a young woman doesn’t want her parents to know about an unplanned pregnancy, she might delay getting the prenatal care that helps lead to a healthy pregnancy and a full-term baby. If a young man with serious depression doesn’t get treatment, he might end up losing a job, or worse.

And if we consider the effect that this privacy glitch could have on the spread of STDs, it is easy to see this problem as a public health issue.

Chlamydia is the most common STD in the United States, with more than a million cases reported every year. Yet health plan data show that chlamydia screening has remained below the 50 percent mark since 2000. Some experts attribute the low testing levels among privately insured young women to concerns about confidentiality.

The tragedy of the situation is simply this: Left undetected and thus untreated, chlamydia can lead to infertility, pelvic inflammatory disease and potentially deadly ectopic pregnancies. If the EOB loophole were fixed, young adults would be more likely to be screened and treated, and we would prevent many of these costly complications.

Privacy concerns might also drive some minors and young adults to visit publicly funded clinics that provide care for STDs and other conditions — usually at a reduced price that the patient pays up front. In that case, young adults get the treatment they need without a breach in privacy due to a billing disclosure. But that means your tax dollars are paying for care covered by private insurance.

These public safety net providers already are strapped trying to care for uninsured patients who cannot get care any other way. Let’s not add to that burden.

So what’s the solution?

Individual states have eliminated EOB requirements when a dependent requests a sensitive service such as testing for an STD. For example, Washington state allows young adults to maintain privacy for such services as long as a written request goes to the insurance company.

Many insurance companies eliminate the EOB when the holder of the policy, in this case a parent, has no financial obligation. But patchwork solutions will not give young adults all over the country the privacy they deserve.

We believe the time has come for a national solution to this problem, one that might follow the example set by the state of Washington. It is time for a national policy or rule that eliminates the EOB requirement when young adults seek access to or treatment for a limited set of sensitive services and conditions.

Young adults are just that: adults. And it is time we give them the privacy they need to access services they need to stay healthy.

Denise Chrysler is director of the Network for Public Health Law in the Mid-States Region. Robyn Rontal is network collaborator for health information data sharing at the Network for Public Health Law in the Mid-States Region. The views expressed in this article are those of the authors and do not represent the position or policy of the Network for Public Health Law or its funders.

The Privacy Dilemma

I recently had the opportunity to join Boston news media veteran Dan Rea on his AM radio program, Nightside with Dan Rea. It was a one-hour call-in program, and an eye-opening experience for me. Dan and I chatted about connected health and how it can truly disrupt care delivery and put the individual at the center of their own health. Then Dan opened the lines to the fine citizens of New England for questions, and the phones started ringing off the hook.

The overwhelming concern, indeed actual fear, among callers was maintaining their privacy in an increasingly connected world, especially where their personal health data are concerned. This is a topic I touched upon in my recent book, The Internet of Healthy Things, and one I will explore further in my upcoming talk at our Connected Health Symposium in a few weeks. But I was so struck by the extent of the concern that I thought I’d present a few theories I’ve been contemplating on the subject.

Continue reading…

Give up Your Data to Cure Disease? Not so Fast!

This weekend The New York Times published an editorial titled Give Up Your Data to Cure Disease. When will we stop seeing mindless memes and tropes claiming that cures and innovation require the destruction of the most important human and civil right in democracies, the right to privacy? In practical terms, privacy means the right to control personal information, with rare exceptions like saving a life.

Why aren’t government and industry interested in win-win solutions? Privacy and research for cures are not mutually exclusive.

How is it that government and the healthcare industry have zero comprehension that the right to determine uses of personal information is fundamental to the practice of medicine, and an absolute requirement for trust between two people?

Why do the data broker and healthcare industries have so little interest in computer science and great technologies that enable research without compromising privacy?

Today healthcare “innovation” means using technology for spying, collecting, and selling intimate data about our minds and bodies.

This global business model exploits and harms the population of every nation. Today no nation has a map that tracks the millions of hidden databases where health information is collected and used, inaccessible and unaccountable to us. How can we weigh risks when we don’t know where our data are held or how they are used? See www.theDataMap.org.

Continue reading…

Universal Patient Identifiers for the 21st Century

Healthcare is abuzz with calls for Universal Patient Identifiers. Universal people identifiers have been around for decades, and that experience can help us understand what, if anything, makes patients different from people. This post argues that surveillance may be a desirable side-effect of access to a health service, but the use of unique patient identifiers for surveillance needs to be managed separately from the use of identifiers in a service relationship. Surveillance uses must always be clearly disclosed to the patient or their custodian each time an identifier is sent by the service provider or “matched” by the surveillance agency. This includes health information exchanges and research data registries.

As a medical device entrepreneur, physician, engineer, and CTO of Patient Privacy Rights, I have decades of experience with patient identifier practices and standards. I feel particularly qualified to discuss patient identifiers because I serve on the Board and Management Council of the NIST-founded Identity Ecosystems Steering Group (IDESG), where I am the Privacy and Civil Liberties Delegate. I am also a core participant in the industry standards groups Kantara-UMA and OpenID-HEART, which work on personal data, and I consult on patient and citizen identity with public agencies.

Continue reading…

Anthem Was Right Not to Encrypt

The Internet is abuzz criticizing Anthem for not encrypting its patient records. Anthem has been hacked, for those not paying attention.

Anthem was right, and the Internet is wrong. Or at least, Anthem should be “presumed innocent” on the issue. More importantly, by creating buzz around this issue, reporters are missing the real story: that multinational hacking forces are targeting large healthcare institutions.

Most lay people, clinicians and, apparently, reporters simply do not understand when encryption is helpful. They presume that encrypted records are always more secure than unencrypted records, which is simplistic and untrue.

Encryption is a mechanism that ensures that data is useless without a key, much in the same way that your car is made useless without a car key. Given this analogy, what has apparently happened to Anthem is the security equivalent of a carjacking.

When someone uses a gun to threaten a person into handing over both the car and the car keys, no one says, “Well, that car manufacturer needs to invest in more secure keys.”

In general, systems that rely on keys to protect assets offer no protection once the bad guy gets ahold of the keys. Apparently, whoever hacked Anthem was able to crack the system open enough to gain “programmer access”. Without knowing precisely what that means, it is fair to assume that even in a system implementing “encryption-at-rest”, the programmers have the keys. Typically it is the programmer who hands out the keys.

Most of the time, hackers seek to “go around” encryption. Suggesting that we use more encryption, or that we use it differently, is only useful when “going around it” is not simple. In Anthem’s case, going around the encryption is apparently exactly what happened.
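To make this concrete, here is a minimal sketch in Python using the open-source cryptography package. The record and key handling are hypothetical, not Anthem’s actual architecture; the point is only that encryption-at-rest stops a thief who takes the storage, not one who also holds the key:

```python
# Minimal sketch: encryption-at-rest protects stolen *storage*,
# not data exposed to whoever holds the decryption key.
from cryptography.fernet import Fernet

# The application must hold a key to read its own records; an
# attacker with "programmer access" effectively holds it too.
key = Fernet.generate_key()
vault = Fernet(key)

record = b"patient: Jane Doe, member id: 000-00-0000"  # hypothetical record
ciphertext = vault.encrypt(record)

# A thief who steals only the disk sees useless ciphertext...
print(ciphertext[:16])

# ...but a thief who also has the key simply decrypts.
print(vault.decrypt(ciphertext))  # b'patient: Jane Doe, ...'
```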

Continue reading…

Privacy and Security and the Internet of Things


In the future, everything will be connected.

That future is almost here.

Over a year ago, the Federal Trade Commission held an Internet of Things workshop, and it has finally issued a report summarizing comments and recommendations that came out of that conclave.

As in the case of the HITECH Act’s attempt to increase public confidence in electronic health records by ramping up privacy and security protections for health data, the IoT report, along with an accompanying publication recommending that industry take a risk-based approach to development and adhere to best practices (encryption, authentication, etc.), seeks to increase the public’s confidence. But it does so the FTC way: no actual rules, just guidance that can be used later by the FTC in enforcement cases. The FTC can take action against an entity that engages in unfair or deceptive business practices, but such practices are defined by case law (administrative and judicial), not regulations. That creates the U.S. Supreme Court and pornography conundrum: I can’t define it, but I know it when I see it (see Justice Stewart’s timeless concurring opinion in Jacobellis v. Ohio).

To anyone actively involved in data privacy and security, the recommendations seem frighteningly basic:

  • build security into devices at the outset, rather than as an afterthought in the design process;
  • train employees about the importance of security, and ensure that security is managed at an appropriate level in the organization;
  • ensure that when outside service providers are hired, those providers are capable of maintaining reasonable security, and provide reasonable oversight of the providers;
  • when a security risk is identified, consider a “defense-in-depth” strategy whereby multiple layers of security may be used to defend against a particular risk;
  • consider measures to keep unauthorized users from accessing a consumer’s device, data, or personal information stored on the network;
  • monitor connected devices throughout their expected life cycle, and where feasible, provide security patches to cover known risks;
  • consider data minimization – that is, limiting the collection of consumer data, and retaining that information only for a set period of time, not indefinitely (a sketch of this idea follows the list);
  • notify consumers and give them choices about how their information will be used, particularly when the data collection is beyond consumers’ reasonable expectations.
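To illustrate the data-minimization recommendation, here is a minimal Python sketch. The field names, retention window, and helper functions are hypothetical illustrations, not anything taken from the FTC report:

```python
# Minimal data-minimization sketch: collect only needed fields and
# retain records only for a set period, not indefinitely.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # hypothetical retention window
KEEP_FIELDS = {"device_id", "reading", "collected_at"}  # collect only these

def minimize(record: dict) -> dict:
    """Drop any fields beyond what the service actually needs."""
    return {k: v for k, v in record.items() if k in KEEP_FIELDS}

def purge_expired(records: list) -> list:
    """Delete records older than the retention window."""
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [r for r in records if r["collected_at"] >= cutoff]

# Example: a raw reading arrives carrying data the service never needs.
raw = {
    "device_id": "thermostat-42",
    "reading": 21.5,
    "collected_at": datetime.now(timezone.utc),
    "home_address": "123 Main St",  # beyond reasonable expectations; never stored
}
stored = purge_expired([minimize(raw)])
```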

Continue reading…

An Open Letter to the People Who Brought Us HIPAA

Over the last five years, the United States has undergone perhaps its most significant changes to the health care system since Medicare and Medicaid were introduced in the 1960s. The Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009 and the Patient Protection and Affordable Care Act of 2010 have paved the way for tremendous changes to our system’s information backbone and aim to provide more Americans access to health care.

But one often-overlooked segment of our health care system has been letting us down. Patients’ access to their own medical information remains limited. The HIPAA Privacy Rule grants individuals the right to copies of their own medical records, but it comes at a noteworthy cost—health care providers are allowed to charge patients a fee for each record request. As explained on the Department of Health and Human Services’ website, “the Privacy Rule permits the covered entity to impose reasonable, cost-based fees.”

HIPAA is a federal regulation, so the states have each imposed guidelines outlining their own interpretations of “reasonable.” Ideally, the price of a record request would remain relatively constant — after all, the cost of producing these records does not differ significantly from state to state. But in reality, requesting one’s medical record is not only unreasonably expensive; it is also inconsistent, costing dramatically different amounts based on local regulation.

Continue reading…

Black Turtlenecks, Data Fiends and Code. An Interview with John Halamka


Of the nearly 100 people I interviewed for my upcoming book, John Halamka was one of the most fascinating. Halamka is CIO of Beth Israel Deaconess Medical Center and a national leader in health IT policy. He also runs a family farm, on which he raises ducks, alpacas and llamas. His penchant for black mock turtlenecks, along with his brilliance and quirkiness, raises inevitable comparisons to Steve Jobs. I interviewed him in Boston on August 12, 2014.

Our conversation was wide-ranging, but I was particularly struck by what Halamka had to say about federal privacy regulations and HIPAA, and their impact on his job as CIO. Let’s start with that.

Halamka: Not long ago, one of our physicians went into an Apple store and bought a laptop. He returned to his office, plugged it in, and synched his e-mail. He then left for a meeting. When he came back, the laptop was gone. We looked at the video footage and saw that a known felon had entered the building, grabbed the laptop, and fled. We found him, and he was arrested.

Now, what is the likelihood that this drug fiend stole the device because he had identity theft in mind? That would be zero. But the case has now exceeded $500,000 in legal fees, forensic work, and investigations. We are close to signing a settlement agreement where we basically say, “It wasn’t our fault but here’s a set of actions Beth Israel will put in place so that no doctor is ever allowed again to bring a device into our environment and download patient data to it.”

Continue reading…

Is Deborah Peel up to her old tricks?

Long time (well, very long time) readers of THCB will remember my extreme frustration with Patient Privacy Rights founder Deborah Peel, who as far as I can tell spent the entire 2000s opposing electronic health data in general and commercial EMR vendors in particular. I even wrote a very critical piece back in 2008 about her and the people from the World Privacy Forum, whom I felt were fellow travelers. And perhaps nothing annoyed me more than her consistently claiming that data exchange was illegal and that vendors were selling personally identified health data for marketing and related purposes to non-covered entities (which is illegal under HIPAA).

However, in recent years Deborah has teamed up with Adrian Gropper, whom I respect, and seemed to change her tune from “all electronic data violates privacy and is therefore bad” to “we can do health data in a way that safeguards privacy but achieves the efficiencies of care improvement via electronic data exchange.” But she never really came clean on all those claims about vendors selling personally identified health data, and in a semi-related thread on THCB last week it all came back, including some outrageous statements on the extent, value, and implications of selling personally identified health data. So I’ve decided to move all the relevant comments to this blog post and let the disagreement continue.

What started the conversation was a throwaway paragraph at the end of a comment I left, in which I basically told Adrian to rewrite what he was saying in such a way that normal people could understand it. Here’s my last paragraph:

As it is, this is not a helpful open letter, and it makes a bunch of aggressive claims against mostly teeny vendors who have historically been on the patients’ side in terms of accessing data. So Adrian, Deborah & PPR need to do a lot better. Or else they risk being excluded back to the fringes like they were in the days when Deborah & her allies at the World Privacy Forum were making ridiculous statements about the concept of data exchange.

Here’s Deborah’s first comment.

Continue reading…