Categories

Tag: analytics

WEBINAR: The How-To of Healthcare Analytics: Implementation to Activation

SPONSORED POST

The healthcare industry generates 30% of the world’s data volume. We have more information available to us than any other industry, but we have yet to realize the potential of this information to create predictive, personalized care. To help us navigate the data complexities currently holding us back, Reuters Events has united informatics experts from NYC Health + Hospitals, Sutter Health, and UnityPoint Health.

Register here for the March 28 webinar (that’s this coming Friday!) at 11am ET: ‘The How-To of Healthcare Analytics: Implementation to Activation’.

For Your Radar — Huge Implications for Healthcare in Pending Privacy Legislation

Deven McGraw
Vince Kuraitis

By VINCE KURAITIS and DEVEN McGRAW

Two years ago we wouldn’t have believed it — the U.S. Congress is considering broad privacy and data protection legislation in 2019. There is some bipartisan support and a strong possibility that legislation will be passed. Two recent articles in The Washington Post and AP News will help you get up to speed.

Federal privacy legislation would have a huge impact on all healthcare stakeholders, including patients.  Here’s an overview of the ground we’ll cover in this post:

  • Why Now?
  • Six Key Issues for Healthcare
  • What’s Next?

We are aware of at least 5 proposed Congressional bills and 16 Privacy Frameworks/Principles. These are listed in the Appendix below; please feel free to update these lists in your comments.  In this post we’ll focus on providing background and describing issues. In a future post we will compare and contrast specific legislative proposals.

Continue reading…

It’s The Platform, Stupid: Capturing the Value of Data in Campaigns — and Healthcare

If you haven’t yet discovered Alexis Madrigal’s fascinating Atlantic article (#longread) describing “how a dream team of engineers from Facebook, Twitter, and Google built the software that drove Barack Obama’s re-election,” stop right now and read it.

In essence, a team of technologists built the Obama campaign a robust, in-house platform that seamlessly connected analytics, outreach, recruitment, and fundraising. While difficult to construct, the platform ultimately delivered, enabling a degree of logistical support that Romney’s campaign reportedly was never able to match.

It’s an incredible story, and arguably one with significant implications for digital health.

(1) To Leverage The Power of Data, Interoperability Is Essential

Data are useful only to the extent you can access, analyze, and share them. It increasingly appears that the genius of the Obama campaign’s technology effort wasn’t just the specific data tools that permitted microtargeting of constituents, evaluated voter solicitation messages, or enabled the cost-effective purchasing of advertising time. Rather, success flowed from the design attributes of the platform itself: a platform built around the need for interoperability and guided by an integrated strategic vision.

Continue reading…

Will Getting More Granular Help Doctors Make Better Decisions?

I’ve been thinking a lot about “big data” and how it is going to affect the practice of medicine. It’s not really my area of expertise, but here are a few thoughts on the tricky intersection of data mining and medicine.

First, some background: these days it’s rare to find a company that doesn’t use data mining and predictive models to make business decisions. For example, financial firms regularly use analytic models to figure out whether a credit applicant will default; health insurers predict downstream medical utilization based on historic healthcare visits; and the IRS spots tax fraud by looking for suspicious patterns in tax returns. Predictive analytics vendors are seeing explosive growth: Forbes recently noted that big data hardware, software, and services will grow at a compound annual rate of 30% through 2018.

Big data isn’t rocket surgery. The key to each of these models is pattern recognition: correlating a particular variable with another and linking variables to a future result. More and better data typically leads to better predictions.
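The mechanics behind such models can be surprisingly plain. Here is a toy sketch in Python of the two steps described above: measuring how strongly two variables correlate, then fitting a simple line to predict a future result. All of the figures are invented for illustration, not drawn from any real claims data.

```python
def pearson(xs, ys):
    """Pearson correlation: how strongly two variables move together."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def fit_line(xs, ys):
    """Least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Toy history: prior-year visit counts vs. next-year utilization ($1000s).
visits = [1, 2, 3, 5, 8, 9, 12, 15]
utilization = [2.0, 2.5, 3.1, 4.8, 7.9, 9.2, 11.5, 14.8]

r = pearson(visits, utilization)          # step 1: find the pattern
slope, intercept = fit_line(visits, utilization)
predicted = slope * 10 + intercept        # step 2: forecast a future result
```

A real insurer’s model would use thousands of variables and far more sophisticated estimators, but the core move, correlating past observations to forecast a future value, is the same.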

It seems that the unstated, implicit belief in the world of big data is that when you add more variables and get deeper into the weeds, interpretation improves and predictions become more accurate.

Continue reading…

International Classification of Diseases Hampers the Use of Analytics to Improve Health Care

By ANDY ORAM

The health care field is in the grip of a standard that drains resources while infusing little back in return. Stuck in a paradigm that was defined in 1893 and never revised with regard for the promise offered by modern information processing, ICD symbolizes many of the fetters that keep the health industries from acting more intelligently and efficiently.

We are not going to escape the morass of ICD any time soon. As the “I” in its name indicates, the standard is an international one, and its pace of change is too slow to be clocked.

In a period when hospitals are gasping to keep their heads above water and need to invest in improvements such as analytics and standardized data exchange, the government has weighed them down with costs reaching hundreds of thousands of dollars, even millions, just to upgrade from version 9 to version 10 of ICD. An absurd appeal to Congress pushed the deadline back another year, penalizing the many institutions that had faithfully made the investment. But the problems of ICD will not be fixed by version 10, nor by version 11; they are fundamental to the committee’s disregard for the information needs of health institutions.

Disease is a multi-faceted and somewhat subjective topic. Among the aspects health care providers must consider are these:

  • Disease may take years to pin down. At each visit, a person may be entering the doctor’s office with multiple competing diagnoses. Furthermore, each encounter may shift the balance of probability toward some diagnoses and away from others.
  • Disease evolves, sometimes in predictable ways. For instance, Parkinson’s and multiple sclerosis lead to various motor and speech problems that change over the decades.
  • Diseases are interrelated. For instance, obesity may be a factor in such different complaints as Type 2 diabetes and knee pain.

All these things have subtle impacts on treatment and–in the pay-for-value systems we are trying to institute in health care–should affect reimbursements. For instance, if we could run a program that tracked the shifting and coalescing interpretations that eventually lead to a patient’s definitive diagnosis, we might make the process take place much faster for future patients. But all a doctor can do currently is list conditions in a form such as:

E66.0 – Obesity due to excess calories

E11 – Type 2 diabetes mellitus

M25.562 – Pain in left knee

The tragedy is that today’s data analytics allow so much more sophistication in representing the ins and outs of disease. Take the issue of interrelations, for instance.

These are easy to visualize as graphs, a subject I covered recently.
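As one hedged illustration of that point, the interrelations above can be modeled as a small directed graph, something a flat list of ICD codes cannot express. The conditions and links here are illustrative only, not clinical guidance:

```python
# Hypothetical comorbidity graph: each condition maps to conditions it
# can contribute to. Codes and edges are invented for illustration.
comorbidity = {
    "E66.0 Obesity": ["E11 Type 2 diabetes", "M25.562 Left knee pain"],
    "E11 Type 2 diabetes": ["N18 Chronic kidney disease"],
}

def downstream(condition, graph):
    """Collect every condition reachable from a starting diagnosis."""
    seen, stack = set(), [condition]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

A query such as `downstream("E66.0 Obesity", comorbidity)` then surfaces every condition a starting diagnosis may feed into, including indirect links like kidney disease via diabetes, which a flat code list on a billing form never reveals.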

Continue reading…

Very Big Data

The field of analytics has fallen into a few big holes lately that represent both its promise and its peril.  These holes pertain to privacy, policy, and predictions.

Policy.  The biggest analytics project in recent history is the $6 billion federal investment in the health exchanges.  The goals of the health exchanges are to enroll people in the health insurance plans of their choice, determine insurance subsidies for individuals, and inform insurance companies so that they can issue policies and bills.

The project touches on all the requisites of analytics including big data collection, multiple sources, integration, embedded algorithms, real time reporting, and state of the art software and hardware.  As everyone knows, the implementation was a terrible failure.

The CBO’s conservative estimate was that 7 million individuals would enroll in the exchanges.  Only 2.2 million did so by the end of 2013.  (This does not include Medicaid enrollment which had its own projections.)  The big federal vendor, CGI, is being blamed for the mess.

Note that CGI was also the vendor for the Commonwealth of Massachusetts which had the worst performance of all states in meeting enrollment numbers despite its long head start as the Romney reform state and its groundbreaking exchange called the Connector. New analytics vendors, including Accenture and Optum, have been brought in for the rescue.

Was it really a result of bad software, hardware, and coding?   Was it  that the design to enroll and determine subsidies had “complexity built-in” because of the legislation that cobbled together existing cumbersome systems, e.g. private health insurance systems?  Was it because of the incessant politics of repeal that distracted policy implementation?  Yes, all of the above.

The big “hole”, in my view, was the lack of communications between the policy makers (the business) and the technology people.  The technologists complained that the business could not make decisions and provide clear guidance.  The business expected the technology companies to know all about the complicated analytics and get the job done, on time.

This ensuing rift where each group did not know how to talk with the other is recognized as a critical failure point.  In fact, those who are stepping into the rescue role have emphasized that there will be management status checks daily “at 9 AM and 5 PM” to bring people together, know the plan, manage the project, stay focused, and solve problems.

Walking around the hole will require a better understanding as to why the business and the technology folks do not communicate well and to recognize that soft people skills can avert hard technical catastrophes.

Continue reading…

HIMSS Unplugged

By ANDY ORAM

HIMSS has opened and closed in Florida and I’m in Boston with snow up to my rectus abdominis. After several years of watching keynote pageants and scarfing up the amenities at HIMSS conferences, I decided to stay home this year.

Writing articles from earlier conferences certainly called on all my energy and talents. In 2010 I called for more open source and standards in the health care field. In 2012 I decried short-term thinking and lack of interest in real health transformation. In 2013 I highlighted how far providers and vendors were from effective patient engagement.

In general, I’ve found that my attendance at HIMSS leads to moaning and carping about the state of health IT. So this year I figured I could sit in my office while moaning and carping about the state of health IT.

In particular, my theme this year is how health IT is outrunning the institutions that need it, and what will happen to those left behind.

The scissors crisis: more IT expenditures and decreasing revenues

Although the trade and mainstream press discuss various funding challenges faced by hospitals and other health providers, I haven’t seen anyone put it all together and lay out the dismal prospects these institutions have for fiscal health. Essentially, everything they need to do in information technology will require a lot more money, and all the income trends are declining.

Certainly the long-term payoff for the investment in information technology could be cost reductions–but only after many years, and only if it’s done right. And certainly, some institutions are flush with cash and are even buying up others. What we’re seeing in health care is a microcosm of the income gap seen throughout the world. To cite Billie Holiday: them that’s got shall get; them that’s not shall lose.

Here are the trends in IT:

  • Meaningful Use requires the purchase of electronic health records, which run into the hundreds of thousands of dollars just for licensing fees. Training, maintenance, storage, security, and other costs add even more. The incentive payments from the federal government come nowhere near covering the costs. EHR providers who offer their record systems on the Web (Software as a Service) tend to be cheaper than the older wave of EHRs. Open source solutions also cost much less than proprietary ones, but have made little headway in the US.
  • Hot on the heels of Meaningful Use is ICD-10 compliance, a major upgrade to the diagnostic codes assigned to patient conditions. Training costs (and the inevitable loss of productivity caused by transitions) could be staggering. Some 80% of providers may miss the government’s deadline. The American Medical Association, citing estimated prices for a small practice of $56,639 to $226,105 (p. 2), recently urged the government to back off on requiring ICD-10. Their point of view seems to be that ICD-10 might have benefits, but far less benefit than other things providers need money for. Having already put off its deadline, the Department refuses to bend further.
Continue reading…

Are Payors Changing What They Pay For Medical Billing Codes To Adjust For Supply and Demand?

Startup Mojo from Rhode Island writes:

Hey there, maybe THCB readers can weigh in on this one. I work at a healthcare startup. Somebody I know who works in medical billing told me that several big-name insurers they know of are using analytics to adjust reimbursement rates for medical billing codes on an almost daily, even hourly, basis (a bit like the travel sites and airlines do to adjust for supply and demand) and to encourage/discourage certain codes. If that’s true, it’s certainly fascinating and pretty predictable, I guess.

I’m not sure how I feel about this. It sounds draconian. On the other hand, it also sounds cool. Everybody else is doing this sort of thing with analytics: why not insurers? Information on this practice would obviously be useful for providers submitting claims, who might theoretically be able to game the system by timing when and how they submit. Is there any data out there on this?

Is this b.s. or not?

Lost in the health care maze? Having trouble with your health insurance? Confused about your treatment options? Email your questions to THCB’s editors. We’ll run the good ones as posts.

Using Predictive Modeling to Make Better Decisions

In an article posted earlier this year on this blog I argued that hospitals have traditionally done a sub-par job of leveraging what has now been dubbed “big data.” Effectively mining and managing the ever rising oceans of data presents both a major challenge – and a significant opportunity – for hospitals.

By doing a better job of connecting the dots of their big data assets, hospital management teams can start to develop the crucial insights that enable them to make the right and timely decisions that are vital to success today. And better, timelier decisions lead to improved results and higher-quality patient care.

That’s the good news. The less than positive story is that hospitals are still way behind in using the mountains of data that are being generated within their institutions every day. Nowhere is this more apparent than in the advanced data management practice of predictive modeling.

At its most basic, predictive modeling is the process by which data models are created and used to predict the probability of an outcome. The exciting promise of predictive modeling is that it gives hospitals the ability to see into, and predict, the future. Given the massive changes and continuing uncertainty buffeting all sectors of the healthcare industry (and especially healthcare providers), a clearer view of the future represents an important strategic advantage for any hospital leader.
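As a minimal sketch of that idea, a predictive model can be as small as a logistic scoring function that maps a few facts from a patient’s history to an outcome probability. The inputs and weights below are invented for illustration; a real model would be fit to a hospital’s own historical data.

```python
import math

def readmission_probability(prior_admissions, chronic_conditions):
    """Toy model: estimated probability of a 30-day readmission.

    The weights (-3.0 baseline, 0.8 per prior admission, 0.6 per
    chronic condition) are hypothetical, chosen only to illustrate
    the shape of such a model.
    """
    score = -3.0 + 0.8 * prior_admissions + 0.6 * chronic_conditions
    return 1 / (1 + math.exp(-score))  # logistic link maps score to (0, 1)

low = readmission_probability(0, 0)   # first admission, no comorbidities
high = readmission_probability(4, 3)  # frequent admissions plus comorbidities
```

Running the two cases shows the point of the exercise: the model separates low-risk from high-risk patients before discharge, which is exactly the “future view” that lets a hospital target follow-up resources.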

Continue reading…