The centaurs of clinical decision support

“The Machine is much, but it is not everything.”
From E.M. Forster’s “The Machine Stops”

Who would want a machine for a doctor?  No bedside manner.  None.  Zippo. Nada.

But machines can be very good at accessing information. And, with no personal entanglements, they are focused on the duties to which they are assigned.  And they stay focused for…well, about as long as need be. Plug in a task and you can be pretty sure it will get done.

The popularity of electronic devices, therefore, is easy to understand. In fact, it's hard to imagine any professional – doctor, pilot or business person – who does not take advantage of digital assistants. In this context, then, clinical decision support (CDS) not only makes sense; it is all but inevitable.

CDS promises to improve patient safety and outcomes by helping to ensure that the right diagnostics are ordered, the right procedures are performed, and the correct medicines and doses are administered. Moreover, in this time of tightening reimbursement and strict budgets, increased efficiency thanks to CDS might allow better utilization of staff and a consequent reduction in FTEs.

Medical centaurs. The development of increasingly sophisticated CDS and its widening adoption raise the prospect of a new kind of healthcare professional – a medical centaur – that combines the reasoning power of the human mind and the speed, comprehensiveness, and focus of automation. The question is to what degree providers will rely on CDS – in short, how much of the medical centaur will be machine? How much reliance on CDS will be too much?

Putting aside philosophical arguments about the dehumanizing potential of machines, practical considerations argue for limiting their use. Science fiction illustrates the most dramatic of these.

In the early 1900s, E.M. Forster wrote in his novella "The Machine Stops" about a world that had become dependent on technology. As "the Machine" begins to malfunction, so does its mechanism for self-repair. Because people have lost the ability to repair the Machine – or even to recognize, until too late, that it is failing – its collapse brings civilization down with it.

Will too much automation in medicine make providers lazy; dissociate them from their patients; even impede decision making through their lessened engagement? Will medical centaurs who have placed their trust in CDS recognize when it is not working properly?

Slippery slopes. As a teenager in Wisconsin, I learned early to pump the brake pedal when braking on icy roads. That skill is now all but obsolete; anti-lock braking technology does it for me automatically. Today I can slam on the brakes in the worst road conditions and chug to a stop efficiently, effectively and safely.

But something unexpected happened last year. A sensor on my ABS went bad. It happened in the middle of summer.

All of a sudden, the ABS kicked in whenever I braked, regardless of road conditions. This lengthened my stopping distance considerably. It significantly – and frighteningly – complicated such mundane tasks as pulling into my garage and into parking spaces.

A friend understood the problem, identified the bad sensor and swapped it for a new one.  Problem solved.

But the fact remains that there was a problem. And, had it not been fixed, bad things could have happened.

CDS vs. ABS. What happens if providers become dependent on automatic alerts that don’t come? How many “non-alerts” will it take before their absence is noticed? How many patients will not get the care they need?

The answers will depend on the degree of trust placed in CDS and on how much responsibility providers delegate to it. Ironically, as CDS becomes more reliable and its use more widespread, the risk of a catastrophic failure will rise.

The problem with my ABS sensor was accompanied by the physical chugging of the car and the intermittent noise of the brakes clamping down again and again. The near misses with the far wall of my garage and the fenders of parked cars indicated early and unmistakably that I had a problem that needed fixing. A CDS malfunction may not be so obvious.

The idea behind CDS is as noble as its potential benefit is great. Automation promises to make healthcare safer and more effective, while reducing cost through increased efficiency and improved patient outcomes.

But, while CDS may take on the role of prefect, it will – by the very nature of technology – be imperfect.  

Users must be vigilant. They must be ready to spot problems…and to correct them.

This article is published as part of the IDG Contributor Network.