Go antiquing and you’ll find crazy stuff. Sometimes crazy stuff finds you. It’s past…but not gone. Health IT is like that.
Thanks mostly to the Feds, as of April 2015, about 98% of U.S. hospitals were collecting electronic health records (EHRs). About 95% of hospitals, according to the Office of the National Coordinator for Health IT, had demonstrated “Meaningful Use”. But, while those information systems and the data they were gathering had meaning to the government, they were not necessarily meaningful to healthcare providers or their patients.
Early this year 34 medical societies signed a letter to the National Coordinator for Health IT at Health and Human Services citing their growing frustration: “Many physicians find these (EHR) systems cumbersome, do not meet their workflow needs, decrease efficiency, and have limited, if any, interoperability. Most importantly, certified EHR technology (CEHRT) can present safety concerns for patients. We believe there is an urgent need to change the current certification program to better align end-to-end testing to focus on EHR usability, interoperability, and safety.”
The problem is not hard to isolate. Government metrics for “meaningful use” are satisfied using data collected, stored and pulled from silos in the healthcare enterprise. They are served up by disparate information systems typically designed and implemented to achieve specific clinical, financial, and operational goals.
As part of an integrated hospital system, patient data should be interpreted in the context of clinical and financial purpose. For the most part, they are not. But they could be.
In the ICU at Beth Israel Deaconess Medical Center, data are being mined by MIT engineers for better ways to manage and treat the sickest patients. More than five million patients enter ICUs annually in the U.S. As many as one in three die. The Beth Israel project, called “Risky States,” aims to calculate risk levels at specific times in the hospital’s seven ICUs, according to the Wall Street Journal. Among the risky situations identified: high numbers of admissions; greater numbers of sicker patients; a higher percentage of nurses with less than a year of ICU experience; and a high patient-to-nurse ratio.
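A system like “Risky States” presumably rolls unit-level factors like these into a score that can be tracked over time. Here is a minimal illustrative sketch; the factor weights, scales, and threshold below are invented for illustration and are not Beth Israel’s or MIT’s actual model:

```python
# Hypothetical ICU risk score built from the factors the article names:
# admission surges, patient acuity, nurse experience, and staffing ratio.
# All weights and the threshold are invented for illustration only.

def icu_risk_score(admissions_per_shift, acuity_index,
                   pct_nurses_under_1yr, patients_per_nurse):
    """Return a 0-100 risk score for one ICU at a point in time."""
    score = (
        4.0 * admissions_per_shift +     # high numbers of admissions
        10.0 * acuity_index +            # sicker patient mix (0-5 scale)
        0.3 * pct_nurses_under_1yr +     # share of nurses with <1 yr in ICU
        8.0 * patients_per_nurse         # patient-to-nurse ratio
    )
    return min(100.0, score)

def risk_level(score, high_threshold=60.0):
    """Map a numeric score to a label a charge nurse could act on."""
    return "HIGH" if score >= high_threshold else "NORMAL"

# A busy shift: 5 admissions, acuity 3, 40% new nurses, 2.5 patients/nurse
s = icu_risk_score(5, 3, 40, 2.5)
print(round(s, 1), risk_level(s))  # → 82.0 HIGH
```

The point is not the arithmetic, which is trivial, but that the inputs live in different systems: admissions in the ADT feed, acuity in the clinical record, staffing in HR and scheduling software.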
As America moves from fee-for-service to pay-for-performance reimbursement, the need to factor in cost becomes paramount. The idea is simple; the execution is anything but: integrate financial and operational data with clinical data in ways that make sense for everyone involved.
Information systems, paired with radiofrequency identification (RFID) technologies, track medications at university medical centers in Vermont and Michigan, while cloud-based software analyzes usage and orders replacement stock. An RFID-enabled IT system at UC San Diego monitors the viability of plasma and specialty products, including anti-venom for snake bites.
Big data and advanced analytics are being used to mitigate risk – both to the patient and the institution. Carolinas HealthCare System is using them to evaluate a system for determining the risk of readmission. By modeling readmission events, the Carolinas staff has identified risk factors and the interventions that might keep the hospital from running afoul of the Centers for Medicare and Medicaid Services (CMS).
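Readmission models of this kind are typically probabilistic scores over patient-level risk factors. A minimal sketch in that spirit is below; the features, coefficients, and intervention cutoff are hypothetical examples, not the Carolinas model:

```python
import math

# Illustrative 30-day readmission risk model. The features, coefficients,
# and cutoff are invented for illustration, not any health system's model.

COEFFS = {
    "prior_admissions": 0.45,        # admissions in the last 12 months
    "num_chronic_conditions": 0.30,
    "length_of_stay_days": 0.08,
    "discharged_home_alone": 0.60,   # 1 if no caregiver at home
}
INTERCEPT = -3.0

def readmission_probability(patient):
    """Logistic model: P(readmission within 30 days of discharge)."""
    z = INTERCEPT + sum(COEFFS[k] * patient[k] for k in COEFFS)
    return 1.0 / (1.0 + math.exp(-z))

def flag_for_intervention(patient, cutoff=0.25):
    """Flag patients whose predicted risk justifies follow-up calls,
    home visits, or medication reconciliation before CMS penalties hit."""
    return readmission_probability(patient) >= cutoff

p = {"prior_admissions": 3, "num_chronic_conditions": 4,
     "length_of_stay_days": 6, "discharged_home_alone": 1}
print(round(readmission_probability(p), 2), flag_for_intervention(p))
```

In practice the hard part is upstream of the math: assembling prior admissions, diagnoses, and discharge circumstances from separate silos into one patient record the model can score.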
The Hospital Readmissions Reduction Program is one of several ways CMS is trying to rein in healthcare costs. Some of these programs use a carrot to get hospitals to be more efficient and cost-effective in the management of patients. The Readmissions Reduction Program uses a stick. CMS expects to fine hospitals $428 million this fiscal year (Oct. 1, 2014 to Sept. 30, 2015) for having too many patient readmissions. That’s $200 million more than the previous fiscal year.
Carolinas might have a solution, but it’s too early to know. According to the healthcare analytics firm Jvion, only about 15% of U.S. hospitals and medical centers currently use predictive analytics at all. But of those that do, 27% use it to predict – and potentially reduce – readmissions. Another 27% use it to predict sepsis; 18% for patient deterioration; 10% for general patient health; and 18% for finalizing decisions.
Unquestionably, predictive analytics and other health IT tools have shown promise as a means to improve efficiency, reduce costs and maybe even save lives. Two years ago, Meriter Hospital in downtown Madison, WI, was struggling with the same challenge as many other community hospitals. With its 300-plus beds occupied by different types of patients of different ages, beset by different problems, its staff was buried in data.
Many, if not most, of these data were important. But none by themselves would lead to more efficient or effective patient care.
Meriter used business intelligence software to knit clinical, financial and supply chain data into a consolidated view of care about a specific slice of its clinical pie – orthopedic surgery. The result was savings of between 18% and 20% on orthopedic implants, savings that were achieved because the data gave administrators the knowledge to renegotiate supply contracts.
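The mechanics behind such a consolidated view can be sketched simply: join case-level clinical records to supply-chain contract prices and compare what was paid against the best available price. The vendors, prices, and cases below are invented sample data, not Meriter’s figures:

```python
from collections import defaultdict

# Illustrative cross-silo view: clinical case records joined to
# supply-chain contract prices for orthopedic implants.
# Vendors, prices, and cases are invented sample data.

implant_prices = {              # supply-chain silo: price per unit
    ("hip_stem", "VendorA"): 4200,
    ("hip_stem", "VendorB"): 5100,
    ("knee_tibial", "VendorA"): 3800,
    ("knee_tibial", "VendorB"): 3900,
}

cases = [                       # clinical silo: implant used per case
    {"case_id": 1, "implant": "hip_stem",    "vendor": "VendorB"},
    {"case_id": 2, "implant": "hip_stem",    "vendor": "VendorA"},
    {"case_id": 3, "implant": "hip_stem",    "vendor": "VendorB"},
    {"case_id": 4, "implant": "knee_tibial", "vendor": "VendorB"},
]

def spend_by_vendor(cases, prices):
    """Total implant spend per vendor -- the consolidated view."""
    totals = defaultdict(int)
    for c in cases:
        totals[c["vendor"]] += prices[(c["implant"], c["vendor"])]
    return dict(totals)

def savings_if_best_price(cases, prices):
    """What the same cases would cost at the lowest contracted price
    for each implant type. The gap is negotiating leverage."""
    actual = sum(prices[(c["implant"], c["vendor"])] for c in cases)
    best = sum(min(p for (imp, _), p in prices.items()
                   if imp == c["implant"])
               for c in cases)
    return actual - best

print(spend_by_vendor(cases, implant_prices))
print(savings_if_best_price(cases, implant_prices))  # → 1900
```

Nothing here is sophisticated; the value came from putting clinical and supply-chain data side by side so administrators could see the price spread at all.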
These are some of the building blocks of future medicine – mundane and exotic alike – ones from which the new world of healthcare will rise.
Twice weekly this blog will look at them…vignettes and assessments of HIT, some painted in broad brush and dimpled with today’s technology, some shaded with what might come.
They will describe, explain and prognosticate on the permutations and effects of health IT; recite lessons of what worked, what needs to be improved, and what shouldn’t be repeated.
When healthcare meets IT, the results will be palpable.