by Peter Sayer

Mercy turns to analytics to improve implant outcomes

Mar 12, 2020
Analytics | Digital Transformation | Healthcare Industry

Mercy Technology Services is using NLP technologies and an in-memory database to extract and analyze data from doctors’ notes in near real-time.


Curtis Dudley was managing the operating room in Mercy’s largest hospital when he started to wonder whether the constant flow of new medical devices, and the costs associated with them, were really delivering the better outcomes they promised.

“We never had the evidence to really prove it,” he says.

A decade later, Dudley, now vice president of enterprise analytics and data services at Mercy Technology Services (MTS), says clinicians at the organization’s more than 60 hospitals can answer such questions in near real-time.

Having adopted an Epic electronic medical record (EMR) system in 2007, Mercy has more than a decade of de-identified patient data to help improve patient care. In 2009, the hospital network introduced barcode scanners to its operating theaters, enabling staff to log precise details of supplies used and devices implanted. Around five years ago, Mercy launched its Performance Through Metrics initiative to identify which areas of the organization had the greatest impact on results.

The organization now employs a series of data marts built on SAP HANA, one for each service line, to gather key elements from the sea of EMR data. The in-memory database facilitates real-time exploration of large datasets.

“It’s a lot different than when you approach the data with a question and you have to wait 30 minutes for a query to run every time,” Dudley says. “When you can explore in real time, you can approach the data with a hypothesis, and you can mine and explore the data and get to conclusions where you can actually make changes in what you do.”
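The hypothesis-driven workflow Dudley describes can be illustrated with a toy sketch. Here a SQLite in-memory database stands in for SAP HANA, and the table, device names, and figures are invented for illustration, not drawn from Mercy's actual schema:

```python
import sqlite3

# Illustrative only: SQLite's in-memory mode stands in for SAP HANA,
# and this implant-case table is a hypothetical, simplified schema.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE implant_cases (
        case_id INTEGER PRIMARY KEY,
        service_line TEXT,      -- e.g. 'cardiology'
        device_sku TEXT,        -- barcode-scanned device identifier
        cost_usd REAL,
        readmitted_30d INTEGER  -- 1 if readmitted within 30 days
    )
""")
conn.executemany(
    "INSERT INTO implant_cases VALUES (?, ?, ?, ?, ?)",
    [
        (1, "cardiology", "STENT-A", 1200.0, 0),
        (2, "cardiology", "STENT-A", 1150.0, 1),
        (3, "cardiology", "STENT-B", 1900.0, 0),
        (4, "cardiology", "STENT-B", 1850.0, 0),
    ],
)

# Hypothesis-driven exploration: does the pricier device deliver
# measurably better outcomes? Compare cost and readmission by device.
rows = conn.execute("""
    SELECT device_sku,
           ROUND(AVG(cost_usd), 2) AS avg_cost,
           AVG(readmitted_30d)     AS readmit_rate
    FROM implant_cases
    GROUP BY device_sku
    ORDER BY avg_cost
""").fetchall()
for sku, avg_cost, readmit_rate in rows:
    print(sku, avg_cost, readmit_rate)
```

The point of the in-memory approach is that a query like this returns immediately, so an analyst can refine the hypothesis and re-query in one sitting rather than waiting half an hour per run.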

Those changes enabled Mercy to save $33 million on implanted devices and surgical supplies over three years, without sacrificing exceptional care. Thanks to the de-identified nature of its patient data, Mercy can share such insights with other organizations, and is working to incorporate data from other sources into its system, the Real World Evidence Network, which has earned MTS a FutureEdge 50 Award for applications of emerging technology.

Products, not projects

When it comes to delivering data insights, MTS’s key challenge is organizational.

“If somebody gives me specs that my architects can understand, … then I can extract that data in no time flat. The problem is aligning the extraction or the definition or the metric with the way the business wants to define it,” Dudley says.

That’s led Dudley to ask consumers of his data to tell him what problem they’re trying to solve, rather than tell him what data they think they want. “Then I can help you map to the data to help you solve that problem,” he says. “It’s been really effective to change that approach.”

Another key to keeping users happy is not to walk away after delivering a new dashboard or visualization.

“IT wants to treat these things as projects, but they’re products,” he says. That means continuing to work with users to keep their data representations relevant to the needs of the business.

Dudley’s biggest regret is not implementing master data management and metadata management tools at the beginning of the project. “That would have helped on the definition side,” he says.

Putting MDM tools in now presents its own challenges. “Along with those tools you have to have an operating plan and operating resources to run and manage them effectively,” he says, “and until the organization is willing to invest in those, it’s just not worth spending the money on the tool.”

Meanwhile, some service lines have invested in their own resources to help manage the data, and for them, he says, “Great things have happened. For those that did not invest in those resources and just let me do all the work for them, our turnaround time is much slower, and it takes longer.”

There are other workarounds, such as linking all the dashboards to an internal wiki to document the data definitions. “They’re largely dependent on my staff to manage and maintain, which is also very labor-intensive,” he says.

Reading the doctors’ notes

Natural language processing (NLP) software, introduced four years ago, has given Dudley access to a mountain of data previously locked up in clinicians’ freeform notes. Still, every clinician’s documentation is different, and there are few standards.

So Mercy picked a tool from Linguamatics, in large part because of its library of medical ontologies, which helps the tool identify synonyms for a condition or treatment.

“In the past, I would have to code for every variation of how somebody might say shortness of breath,” he says.
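What an ontology buys you can be sketched in a few lines. This is not how Linguamatics works internally; the synonym map and concept name below are hypothetical stand-ins for a real medical ontology, which would contain thousands of such entries:

```python
import re

# Illustrative only: a hand-rolled synonym map stands in for a
# commercial medical ontology. Terms and the concept name are
# hypothetical examples, not entries from any real ontology.
ONTOLOGY = {
    "dyspnea": ["shortness of breath", "short of breath", "sob",
                "dyspnea", "dyspnoea", "breathlessness"],
}

def find_concepts(note: str) -> set:
    """Return canonical concept names whose synonyms appear in a note."""
    found = set()
    lowered = note.lower()
    for concept, synonyms in ONTOLOGY.items():
        for term in synonyms:
            # Word-boundary match so 'sob' doesn't fire inside 'sober'.
            if re.search(r"\b" + re.escape(term) + r"\b", lowered):
                found.add(concept)
                break
    return found

print(find_concepts("Pt reports SOB on exertion, denies chest pain."))
```

With the ontology supplying the variants, the analyst codes against one canonical concept instead of enumerating every way a clinician might phrase it.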

For organizations that tried NLP years ago and failed, Dudley recommends taking another look, as specialist ontology libraries such as the ones Mercy uses are much improved.

Still, having highly skilled individuals on staff is also essential. “People with a data science background seem to be the most effective,” he says.

One of the benefits of extracting data using NLP in this way is that it can aid in de-identifying protected health information (PHI) so that Mercy can share it without breaching HIPAA regulations.

“We will do NLP and extract the data elements we’re looking for, for our analysis. But we don’t provide open access to just the clinical note because there can be lots of different things in the notes that could be PHI and we don’t provide that,” he says.
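The design Dudley describes, sharing only extracted elements and never the note itself, can be sketched as follows. The patterns and field names are invented for illustration; the point is simply that only coded flags leave the function, while the free text, which may contain PHI, does not:

```python
import re

# Illustrative only: toy patterns, not Mercy's pipeline. A production
# NLP tool would also handle negation ("denies chest pain"), spelling
# variants, and context, which these regexes do not attempt.
EXTRACTORS = {
    "shortness_of_breath": r"\b(shortness of breath|sob|dyspn(o)?ea)\b",
    "chest_pain": r"\bchest pain\b",
}

def extract_elements(note: str) -> dict:
    """Return only boolean concept flags, never the note text itself."""
    lowered = note.lower()
    return {name: bool(re.search(pattern, lowered))
            for name, pattern in EXTRACTORS.items()}

# Only this structured record is shared downstream; the raw note,
# which could contain names, dates, or other PHI, is discarded.
record = extract_elements(
    "Pt reports shortness of breath on exertion; no acute distress."
)
print(record)
```

Because downstream consumers see only the structured record, there is no path for stray PHI in the note's free text to leak into shared analyses.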

Better benchmarking

Mercy now uses its data platform to study quality and safety, and to develop its care path. But there’s still more to be done.

The data Mercy collects enables it to benchmark its services but only against internal standards. To get a better idea of how Mercy is performing, Dudley wants to measure its data against that from other organizations, by recruiting them to the Real World Evidence Network. But most organizations are only now starting to ensure their data is well-defined, documented and de-identified, so it’s not easy to find partners with which to share and compare.

For now, the focus is on working with a couple of like-minded organizations to analyze data in the field of cardiology.

“We want to master that with them before we go much broader and wider with the work,” says Dudley. “But ultimately, in order to benchmark and compare across all of the different things that we do, we need some way or some mechanism to be able to do that. And building a network with conformed data is the only way to do that.”