One of the more challenging aspects of emergency department (ED) healthcare is determining which patients to admit for observation and which patients to send home. One healthcare system is tackling that challenge with predictive analytics.
Unnecessary hospitalizations cause numerous problems: longer wait times, a shortage of beds for patients who really need them, and wasted time for emergency medical staff, not to mention costs borne by patients, hospitals, and insurers. On the other hand, failing to admit patients who really do need care can have deadly consequences.
NorthShore University HealthSystem, which operates four hospitals in Illinois, is leveraging data and predictive analytics to address that challenge. The project, dubbed ‘Technology-driven Chest Pain Management in the ED,’ won NorthShore a CIO 100 Award in IT Excellence.
An evidence-based approach
Chest pain is the most common reason that ED staff elects to keep patients for observation in NorthShore’s emergency departments. But prior to 2017, NorthShore did not have an evidence-based practice for determining whether a chest pain patient should be admitted. Chest pain could be an indicator of a heart attack, but it could also be a symptom of something far less serious, like heartburn.
“The challenge is hospitals are generally really conservative in terms of ruling out that there’s a serious thing going on, as we should be,” says Chad Konchak, assistant vice president of clinical analytics at NorthShore. “The idea was, could we build tools for physicians and nurses in the emergency room to help better understand and identify patients at high risk of a heart attack?”
NorthShore created a cross-functional team to address the problem. The team included the health system’s chief quality and transformation officer, a hospital president, two ED physicians, the Inpatient Health Information Technology team and the Clinical Analytics team. It also included hospital administration leadership, nursing leadership, finance, and members of the health system’s quality team. The team identified the HEART score (History, Electrocardiogram, Age, Risk factors, and initial Troponin), an assessment tool validated in a multicenter study at nine hospitals in the Netherlands that scores patients according to their short-term risk of major adverse cardiac events. It used the HEART score to develop a predictive model.
“The HEART score, a clinically validated tool developed in the Netherlands, buckets patients into a risk score between one and 10 that indicates how likely they are to have a heart attack. So, the first thing we did was integrate this tool directly into the electronic health record,” Konchak says.
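The published HEART rubric itself is simple enough to express in a few lines: each of the five elements (History, ECG, Age, Risk factors, Troponin) contributes 0–2 points. The sketch below follows the commonly cited point assignments and risk bands; it is an illustration of the published rubric only, not NorthShore's actual EHR implementation, and the function names and band labels are assumptions.

```python
def age_points(age):
    """Age element: <45 -> 0, 45-64 -> 1, >=65 -> 2."""
    if age >= 65:
        return 2
    if age >= 45:
        return 1
    return 0

def risk_factor_points(count, known_atherosclerotic_disease=False):
    """Risk-factor element: none -> 0, 1-2 -> 1, >=3 or known disease -> 2."""
    if known_atherosclerotic_disease or count >= 3:
        return 2
    if count >= 1:
        return 1
    return 0

def troponin_points(ratio_to_upper_limit):
    """Troponin element, as a multiple of the assay's upper normal limit."""
    if ratio_to_upper_limit > 3:
        return 2
    if ratio_to_upper_limit > 1:
        return 1
    return 0

def heart_score(history_pts, ecg_pts, age, rf_count, troponin_ratio,
                known_disease=False):
    """Total the five elements; history_pts and ecg_pts (0-2) are
    clinician judgments that cannot be derived automatically."""
    total = (history_pts + ecg_pts + age_points(age)
             + risk_factor_points(rf_count, known_disease)
             + troponin_points(troponin_ratio))
    if total <= 3:
        band = "low"
    elif total <= 6:
        band = "moderate"
    else:
        band = "high"
    return total, band
```

For example, a 52-year-old with a moderately suspicious history, a normal ECG, two risk factors, and a normal troponin totals 3 points, landing in the low-risk band.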
Signs of success
The score was embedded into an alert within each patient’s electronic medical record (EMR). But the first iteration of the tool had poor compliance, as physicians weren’t necessarily adding all the data required for the score calculation. The team went back and added improved alerts and hard stops that required physicians to score patients before they could be discharged or admitted. The team rolled out the first version of the new workflow in March 2017, with the hard stops added in June and July of that year. The later iterations pushed compliance to nearly 100 percent.
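In EHR terms, a “hard stop” is essentially a validation gate on the disposition order: the order is blocked until every input the score calculation needs has been documented. A minimal sketch of that gate logic, using hypothetical field names rather than NorthShore's actual EMR schema, might look like this:

```python
# Hypothetical chart fields that must be documented before the
# HEART score can be calculated (illustrative names, not a real schema).
REQUIRED_INPUTS = ("history", "ecg", "age", "risk_factors", "troponin")

def missing_heart_inputs(chart):
    """List the HEART score inputs not yet documented in the chart."""
    return [f for f in REQUIRED_INPUTS if chart.get(f) is None]

def can_place_disposition_order(chart):
    """Hard stop: block discharge or admission until every input is scored.
    Returns (allowed, message) so the UI can name the missing fields."""
    missing = missing_heart_inputs(chart)
    if missing:
        # In a real EHR this would surface an alert to the physician.
        return False, "Score required before disposition; missing: " + ", ".join(missing)
    return True, "HEART score complete"
```

The design point the article describes is exactly this ordering: the score fields are prerequisites of the disposition order itself, which is why compliance rose to nearly 100 percent once the stops were in place.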
A key challenge, but also key to the project’s success, was integrating the predictive risk score with a prescriptive set of actions guided by the EMR system. Most solutions like this, explains NorthShore CIO Steve Smith, require clinicians to exit the EMR to interact with a non-integrated tool. Telling clinicians they need to go to another application is perhaps the best way to kill adoption.
“Our physicians and nurses use the electronic health record as the key source of information about the patient. It is their ‘cockpit’ for patient care. Integration of any new analytics-driven technology must be in the clinical chart or the value will diminish. If we have to ask our clinicians to stop, leave that system, go to another system, log in, and start filling out a new tool, they just won’t do it,” Smith says.
The project has not only led to a drop in the health system’s “Chest Pain Observation” rate, but perhaps more importantly, it has not increased the rate of ED returns, mortality, or morbidity.
“We were looking for a reduction in observation days, but also looking to make sure we didn’t create an unintended consequence of people returning to the ED,” Smith says. “We didn’t get an increase in return to the ED. We did get about a 10 percent drop in observation days, which is good for the patient, good for costs, good for quality, and good for our patients who would much rather go home than get admitted to the hospital unnecessarily.”