How to right the wrongs of clinical decision support alerts

Charna Albert

December 2020—Always think about timing, maintain a log of malfunctions, and make the right decision the easy decision.

These are a few of the clinical decision support tips that Ronald Jackups Jr., MD, PhD, and Amanda Blouin, MD, PhD, presented last month in a CAP20 session.

“When you say clinical decision support, the image that occurs in most people’s minds is the ugly pop-up alert,” said Dr. Jackups, associate professor of pathology and immunology, Washington University School of Medicine in St. Louis. “In reality, the pop-up alert is the last resort.”

Laboratories can affect how providers order tests at the point of data entry in several ways, he said. “How we build the test, what questions or prompts we put in the order, what options we provide to the provider when they search for what they think is the appropriate test.” Then, too, there are lab test reflexes and algorithms. “Ultimately, if deemed necessary, an order alert may be a useful function.”

Dr. Jackups and Dr. Blouin, who is medical and scientific director of the histocompatibility laboratory at Memorial Sloan Kettering Cancer Center, opened the session with this scenario:

The director of infection prevention at a hospital where you serve as laboratory medical director contacts you about the hospital’s clinical decision support system alert designed to reduce unnecessary testing for Clostridium difficile. The alert fires when a provider places an electronic order for C. diff on a patient currently prescribed a laxative and warns the provider that such testing is associated with a high false-positive rate. The infection prevention director complains to you that too many providers continue to order C. diff testing unnecessarily despite this alert, leading to antibiotic overuse and federal financial penalties due to pseudo hospital-acquired infections. What can you do?

Clinical decision support guidelines, Dr. Jackups said, tend to fall under the five “rights” of clinical decision support:

  • The right information: “What are you telling the provider at the time they see the CDS tool? Is it evidence-based, useful, current information?”
  • The right person: “Are you making sure you’re targeting the right person? Should this go to a doctor, a nurse, a pharmacist?”
  • The right format: “Are you using the optimal format to get your information across? An alert may be what you have to use,” but another type of CDS, such as an order set or reference information, may achieve the desired result.
  • The right channel: Should it be in the electronic medical record, ordering system, patient portal, other?
  • The right time: “Primarily it is when the clinician is making a decision or needs more information.”

So what might have gone wrong with the C. diff versus laxatives alert? Dr. Jackups cited a few of the possible malfunctions. The alert may have been built to fire when C. diff toxin was ordered, but perhaps the lab recently updated testing to C. diff PCR and the alert was not updated to account for the new method. Or the rule may have been designed as intended but missed certain triggering events—a commonly used laxative may have been left off the list of patient medications that trigger the alert, or the pharmacy formulary might have changed. It’s also possible the alert was vague, confusing, or wordy, leading providers to ignore it.


“Or perhaps there are conflicting alerts,” he said. “You may not have been aware there was another alert in your hospital that recommends C. diff testing in any patient diagnosed with diarrhea.”
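To see how such malfunctions stay hidden, consider a minimal sketch of the alert’s trigger logic. The order codes and laxative names below are hypothetical, not the actual build; the point is that a rule keyed to hard-coded codes fails silently the moment the lab or pharmacy changes them.

```python
# Minimal sketch of the C. diff alert's trigger logic. Order codes and
# laxative names are hypothetical; real rules key on institution-specific codes.

TRIGGERING_ORDER_CODES = {"CDIFF_TOXIN_EIA"}  # never updated after the lab
                                              # switched to "CDIFF_PCR"
TRIGGERING_LAXATIVES = {"polyethylene glycol", "senna"}  # docusate left off

def should_fire(order_code: str, active_meds: set[str]) -> bool:
    """Fire only if both the ordered test and an active laxative match."""
    return order_code in TRIGGERING_ORDER_CODES and bool(
        active_meds & TRIGGERING_LAXATIVES
    )

print(should_fire("CDIFF_PCR", {"senna"}))        # False: a silent false negative
print(should_fire("CDIFF_TOXIN_EIA", {"senna"}))  # True: the old behavior
```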

In a survey of hospitals and academic medical centers, Dr. Blouin said, 93 percent of chief medical information officers reported CDS system malfunctions at their institutions, and 65 percent reported at least one malfunction per year (Wright A, et al. J Am Med Inform Assoc. 2018;25[5]:496–506).

In that study, the most common malfunctions originated in the initial CDS build and implementation. “This includes not just coding errors but also design errors, often the rule criteria being incomplete,” she said. For example, for a rule meant to fire for any patient on heparin, the designer included only unfractionated heparin in the specifications, so a patient on low-molecular-weight heparin doesn’t trigger the rule. Or a rule prompting providers to start a beta blocker in cardiac patients did not include the category of drugs with combined alpha- and beta-blocking activity, and thus it prompts physicians to order beta blockers for patients already on drugs with beta activity. Also common are errors made when moving a rule from the test-and-build environment into the production environment. And “any sort of system upgrade may affect how CDS rules fire,” Dr. Blouin said.
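The heparin example reduces to the same pattern: criteria written against individual products rather than a drug class miss patients by design. In this sketch the drug names are illustrative assumptions, not an actual formulary.

```python
# Sketch of the incomplete-criteria design error. Drug names are illustrative.

HEPARIN_IN_RULE = {"heparin sodium"}             # as built
HEPARIN_COMPLETE = {"heparin sodium",            # as intended:
                    "enoxaparin", "dalteparin"}  # includes LMWH products

def rule_fires(active_meds: set[str], criteria: set[str]) -> bool:
    """Fire when any active medication matches the rule's criteria."""
    return bool(active_meds & criteria)

patient_meds = {"enoxaparin"}  # patient on low-molecular-weight heparin
print(rule_fires(patient_meds, HEPARIN_IN_RULE))   # False: rule misses the patient
print(rule_fires(patient_meds, HEPARIN_COMPLETE))  # True
```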

Changes to the underlying data dictionaries a rule uses can also cause malfunctions, such as when a new medication is added to the pharmacy formulary or replaces another drug.

“This is a type of error that is really pertinent to the laboratory,” she said. “Not only is CDS built to drive laboratory test ordering, it is driven by lab test results. We are generating the elements of these data dictionaries, and when we modify something in the lab, the CDS could be affected.”

An example: a change in reportable ranges. In one case, a lab that reported undetectable troponin values as “0.1” changed its system to report such values as “less than 0.01.” “There was a CDS alert to assess whether a patient with an acute MI was given aspirin in the ED. This was triggered by a diagnosis code of MI, or a troponin greater than 0.5,” Dr. Blouin said. The lab began to receive complaints from ED physicians that this alert was firing too frequently and on patients with a troponin of less than 0.01 (Stone EG. J Am Med Inform Assoc. 2018;25[5]:564–567).

“With investigation, they realized this rule was interpreting the ‘less than’ sign as a very, very large number, and it was triggering this alert. As more hospitals are adopting the high-sensitivity troponin method, this is something to think about, because this assay has different reportable ranges, different reference ranges, different units, and different clinical decision points,” she said, all of which are constantly evolving.
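That failure is easy to reproduce. The sketch below assumes one plausible mechanism, a parser that treats any unparseable result string as a sentinel maximum; the actual system’s internals were not described, but the observable behavior matches.

```python
# Sketch of how a "less than" result can defeat a numeric threshold rule.
# Mapping an unparseable string to a sentinel maximum is an assumed mechanism.

TROPONIN_ALERT_THRESHOLD = 0.5

def naive_value(result: str) -> float:
    try:
        return float(result)
    except ValueError:
        return float("inf")  # "can't parse" becomes a very, very large number

def safer_value(result: str) -> float:
    text = result.strip()
    if text.startswith("<"):    # censored low result: use the stated bound
        return float(text[1:])  # "<0.01" -> 0.01, well below the threshold
    return float(text)

for parse in (naive_value, safer_value):
    fires = parse("<0.01") > TROPONIN_ALERT_THRESHOLD
    print(parse.__name__, "fires the alert:", fires)  # naive: True, safer: False
```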

Sometimes a rule is built exactly as intended and firing as intended, she said, “but it’s still not having the intended effect,” and the reason can be provider nonacceptance. “Excessive alerts. Increased time on the computer. Decreased time with patients. And clicks—a lot of clicks,” she said, citing just some of the reasons for nonacceptance. One study found that in a single 10-hour emergency department shift, a physician can click a mouse up to 4,000 times (Hill RG Jr., et al. Am J Emerg Med. 2013;31[11]:1591–1594).

In an example from an academic medical center, providers were required to acknowledge a pop-up alert before finalizing an order for red blood cell transfusion. The intention was to ensure the patient had a type and screen within the past 72 hours. But even when the patient’s type and screen was active, the provider had to acknowledge the alert before finalizing the order, interrupting the workflow. “These types of ineffective CDS contribute to provider nonacceptance,” Dr. Blouin said, as do distrust, alert fatigue, and a perceived limitation to professional autonomy.

An estimated 50 to 90 percent of alerts are overridden, she said. “This could be due to a malfunctioning or ineffective CDS system. And the problem is that high alert override rates can eventually lead to someone ignoring a critical alert.”

“So how do you find these CDS malfunctions?” Dr. Blouin asked. They can be discovered before implementation, by outcome, or during monitoring, but published studies have found they are most commonly identified when an end user reports an issue. “These tend to be your false positives”—the alert is firing when it shouldn’t. False negatives—alerts that should have fired but didn’t—might be discovered during an investigation of an adverse event or near miss, through ongoing system monitoring, or by reviewing alert firing and override rates.
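One form ongoing monitoring can take is a periodic comparison of each alert’s firing rate against its historical baseline, so a collapse in firings, like the one a stale test code would cause, gets investigated. A minimal sketch, with invented counts and an arbitrary 50 percent drop threshold:

```python
# Sketch of firing-rate monitoring. Counts and the drop threshold are invented.

def check_firing_rate(eligible_orders: int, alert_firings: int,
                      baseline_rate: float) -> str:
    """Flag a rule whose firing rate falls well below its historical baseline."""
    if eligible_orders == 0:
        return "no eligible orders this period"
    rate = alert_firings / eligible_orders
    if rate < 0.5 * baseline_rate:
        return f"INVESTIGATE: firing rate {rate:.0%} vs. baseline {baseline_rate:.0%}"
    return f"ok: firing rate {rate:.0%}"

# The C. diff alert after an unnoticed toxin-to-PCR switch: firings collapse.
print(check_firing_rate(eligible_orders=120, alert_firings=4, baseline_rate=0.60))
```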

Reviewing the associated free text override comments can point to system malfunctions. A recent study found certain words are strongly associated with clinician frustration and malfunctioning alerts: dumb, idiot, wrong, epic, invalid, false, please stop, and more. Override comments revealed malfunctions in 26 percent of all rules active in the authors’ system. “If possible,” they wrote, “we recommend monitoring all override comments on a regular basis” (Aaron S, et al. J Am Med Inform Assoc. 2019;26[1]:37–43).
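A screen along the lines the authors recommend could be as simple as matching override comments against that keyword list. In this sketch, the rule IDs and comments are invented:

```python
# Sketch of an override-comment screen using the study's frustration keywords.

FRUSTRATION_KEYWORDS = {"dumb", "idiot", "wrong", "epic", "invalid",
                        "false", "please stop"}

def flagged_comments(comments: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """Return (rule_id, comment) pairs whose text contains any keyword."""
    return [(rule_id, text) for rule_id, text in comments
            if any(kw in text.lower() for kw in FRUSTRATION_KEYWORDS)]

overrides = [
    ("CDIFF_LAXATIVE", "patient not on a laxative -- alert is wrong"),
    ("RBC_TS_CHECK", "type and screen already active, please stop"),
    ("BETA_BLOCKER", "will discuss with cardiology"),
]
for rule_id, text in flagged_comments(overrides):
    print(rule_id, "->", text)  # the first two comments are flagged for review
```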

Dr. Blouin suggests maintaining a log of CDS malfunctions. “Periodic review of this log helps you look for patterns. And identifying a pattern may point to a practice that can be implemented during the build to prevent future alert malfunctions.”

How to solve CDS system malfunctions? “If the problem is the build, then the solution is correcting the code, followed by robust pretesting,” Dr. Blouin said. But if the problem lies in the rule’s clinical design, the rule needs to be redesigned. “At this point, engage with specialty physicians or the target group of the CDS in the redesign.”

For unanticipated changes in the system or workflow, develop a process to review CDS rules that might be affected by test changes and build it into the lab test validation process, she said. “This year we have all rapidly implemented new SARS-CoV-2 testing”—viral RNA and antigen detection, antibody testing, different testing algorithms and frequencies, and testing for employees. “All of these are opportunities for clinical decision support.” And as the guidelines for test frequency and methods change, and as the lab switches methods based on reagent availability, think about the CDS that’s downstream of the SARS-CoV-2 testing and may be affected by those lab results, Dr. Blouin said.
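One way to build that review into test validation is an explicit index mapping each test code to the CDS rules that consume its results, so any method or code change surfaces every downstream rule. A sketch, with hypothetical codes and rule names:

```python
# Sketch of a test-to-CDS dependency index. Codes and rule names are hypothetical.

CDS_DEPENDENCIES = {
    "SARS_COV_2_PCR": ["isolation-precautions alert", "employee-testing rule"],
    "TROPONIN_I":     ["acute-MI aspirin alert"],
    "CDIFF_PCR":      ["C. diff-on-laxatives alert"],
}

def rules_to_review(changed_test_codes: list[str]) -> list[str]:
    """List every CDS rule downstream of a test whose method or code changed."""
    return [rule for code in changed_test_codes
            for rule in CDS_DEPENDENCIES.get(code, [])]

# Switching SARS-CoV-2 methods because of reagent availability:
print(rules_to_review(["SARS_COV_2_PCR"]))
```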

“When you’re bringing on a new method or changing methodology, make reviewing affected CDS part of your internal process, along with the RFP, purchasing, wet testing, method validation, and LIS build. Bake in this review and it will become part of your routine.”

Think, too, about how to make CDS a more effective part of provider workflow, she said. “If the CDS has been a barrier, somebody has developed a workaround. Look for those provider workarounds. They can point you toward the pain points or problems with the CDS and then you can focus on your solutions. Count the clicks, and check the comments.”

All of this goes back to the five rights of CDS, she said, and to what have been described as the Ten Commandments for effective CDS (Bates DW, et al. J Am Med Inform Assoc. 2003;10[6]:523–530). Among them: recognize that physicians will strongly resist stopping; changing direction is easier than stopping; and speed is everything. “In following these or considering how these guidelines fit in with the five rights of CDS, the whole point is, can you make the right decision the easy decision?” she said. Getting buy-in, too, is necessary.

For the laboratory director faced with the C. diff ordering problem for patients on laxatives, Dr. Jackups provided troubleshooting advice. Step one: Gather data. How often has the alert fired and has the firing rate changed recently? What percentage of alerts are the ordering providers accepting? “The acceptance rate is important when we troubleshoot CDS alerts, because we see alerts that are accepted nearly all the time, but we also see alerts that are nearly always rejected, in which case there is something about the alert that is convincing the provider not to follow it, or is perhaps aggravating the provider and pushing them away from listening to the advice.”
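Step one lends itself to a simple query over the alert action log. In this sketch the log structure and field names are assumptions, not an EHR interface:

```python
# Sketch: per-alert firing counts and acceptance rates from an action log.
# The (alert_id, provider_action) format is assumed for illustration.

from collections import Counter

alert_log = [
    ("CDIFF_LAXATIVE", "accepted"),
    ("CDIFF_LAXATIVE", "overridden"),
    ("CDIFF_LAXATIVE", "overridden"),
    ("CDIFF_LAXATIVE", "overridden"),
]

firings = Counter(alert_id for alert_id, _ in alert_log)
accepted = Counter(alert_id for alert_id, action in alert_log
                   if action == "accepted")

for alert_id, fired in firings.items():
    rate = accepted[alert_id] / fired
    # A near-zero acceptance rate suggests the alert, not the provider,
    # is the problem.
    print(f"{alert_id}: fired {fired}x, acceptance {rate:.0%}")
```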

Other questions: How often is C. diff testing ordered for patients on laxatives? Was the testing low for a while and then suddenly doubled? And who is receiving the alert? If it’s the nurse because the provider is giving a phone order, Dr. Jackups said, “the nurse is much less inclined to listen to the alert than to listen to what the provider has told the nurse to do.”

Step two: Consider options for improvement. First, make sure there are no obvious malfunctions, such as the test order changing—toxin to PCR—without the alert having been updated. If no malfunction is found, return to the five rights of CDS.


Does the alert clearly and concisely explain why testing is not recommended or why discontinuing the laxatives is recommended? Is it the physician or nurse who sees the alert? Perhaps the format is not aggressive enough or needs to be stepped up a bit because it’s not catching the provider’s eye. Can the alert be changed from something that is simply a recommendation and can easily be bypassed to a restriction that says the lab has to be contacted for approval?

Finally, “always think about the timing of the alert,” Dr. Jackups said. Does it fire before or after the stool sample is obtained? “Perhaps there’s a workflow on the floor when a physician says, ‘Let’s do C. diff testing,’ where someone collects the stool sample and then goes to the electronic chart to place the order. The alert is not going to be effective at this point because the person collecting the sample has already put in the work and is more inclined to proceed with testing anyway.”

And this, Dr. Blouin said, is one of the Ten Commandments: Fit into the user’s workflow. “Don’t provide the CDS at the end, after someone has gone through so many screens of clicking to place an order. They’re not going to go back,” she said, before closing with yet another commandment: “Little things can make a big difference.”

Charna Albert is CAP TODAY associate contributing editor.