Minting QC and efficiency gains with middleware

CAP Today
June 2011
Feature Story

Anne Paxton

For data management challenges ranging from the basic to the complex, middleware increasingly has pride of place in the clinical laboratory. With a long track record facilitating communication between instruments and laboratory information systems, middleware is well known as the go-to go-between for speeding up, standardizing, and subtracting the guesswork from results reporting. But when laboratory directors discover some of its newer capabilities, such as correlation studies and moving averages, they often find themselves surprised by how much middleware can really do to enhance quality assurance and productivity.

“The original concept of middleware was to act as a buffer between the LIS and analyzers, because middleware is a very convenient way to compensate for things the LIS doesn’t do well,” says Martin Kroll, MD, chief of the Department of Laboratory Medicine at Boston Medical Center. “Then it progressed to linking analyzers together as part of that buffer, and over the years the manufacturers have started adding other features such as reports of moving averages, the ability to do calculations, and detection of particular problems either associated with your samples or your data.” With every year, he adds, almost all the middleware manufacturers are improving how the systems work.

“When I talk to people about middleware, I tell them to think of it as almost like artificial medical technologist intelligence,” says Frank Polito, MBA, MT(ASCP)SC, chemistry manager of the Department of Pathology at Dartmouth-Hitchcock Medical Center, Lebanon, NH. The 400-bed medical center has been using Data Innovations middleware in the chemistry lab for more than seven years to perform autoverification and dramatically enhance its quality control, he explains. “It’s a Roche-specific package. When we bought our Roche instrumentation we opted to go with the middleware that was provided.”

But there’s no comparison between the initial capabilities of the middleware and what the laboratory can do with it now. “We went from a very rudimentary system that allowed just autoverification and rules, to a system that’s become very refined, very slick, and very user-friendly.”

In addition to specialized features like value tables, which allow the lab to set up one rule when performing the same function over and over, “as they’ve upgraded the middleware they’ve added a direct connection to our Bio-Rad Unity Real Time QC system. So I can bypass my LIS quality control system altogether and go right from the instrument to the Bio-Rad system.”

With autoverification being applied to 90 to 95 percent of results, “the rules allow us to pull out only the questionable or problem results that I want the techs to follow up on, that are either abnormal or require some kind of technical intervention. These include critical values, delta checks, results outside the analytical range of the analyzer, and screens for things like turbidity and hemolysis.” Sitting before a screen, the technologists can take the five or 10 percent of samples that are flagged for review and decide to accept, not accept, or take additional action.

“When a result is held, you can actually add a comment or prompt that will tell them to dilute a certain type of sample with saline, hold for supervisor’s review, or report out if greater than or less than a certain value. These are all customizable comments based on rules you’ve written. Each facility has the power to integrate its own standard operating procedures into the rules.” On the preanalytic side, “not only do we spin, process, and separate specimens that are going to the analyzers, we can also make aliquot tubes with various labels for other departments. We make a tube and pull off all the serum and plasma, and those are all done through rules that are programmed into the middleware.” He especially likes the middleware’s built-in flexibility. “So often you buy a package and you’re stuck with it, but this one we can change as many times as we want.”
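To make that rule-driven flow concrete, here is a minimal sketch of how such checks might be expressed in code, assuming hypothetical thresholds and a hypothetical Result structure; it is illustrative only and is not Data Innovations’ actual rule syntax.

    from dataclasses import dataclass, field

    @dataclass
    class Result:
        test: str
        value: float
        prior_value: float | None = None   # most recent prior result, for the delta check
        flags: list[str] = field(default_factory=list)
        comments: list[str] = field(default_factory=list)

    # Hypothetical potassium limits: (critical low, critical high,
    # analytical low, analytical high, maximum allowed delta).
    LIMITS = {"K": (2.8, 6.2, 1.0, 10.0, 1.5)}

    def autoverify(r: Result) -> bool:
        """Return True to release automatically; False holds the result for a tech."""
        crit_lo, crit_hi, a_lo, a_hi, max_delta = LIMITS[r.test]
        if not a_lo <= r.value <= a_hi:
            r.flags.append("outside analytical range")
            r.comments.append("Dilute with saline and rerun")   # example of a custom prompt
        if r.value <= crit_lo or r.value >= crit_hi:
            r.flags.append("critical value")
            r.comments.append("Hold for supervisor review")
        if r.prior_value is not None and abs(r.value - r.prior_value) > max_delta:
            r.flags.append("delta check failed")
        return not r.flags   # the unflagged 90 to 95 percent release untouched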

For Polito’s lab, one of middleware’s main benefits has been the ability to go paperless. “We were printing out one piece of paper on every result before, and there’s no paper now.” That’s been important since the laboratory started an outreach program a year ago. “Our volume has gone up a million and a quarter tests just from that outpatient volume, but we’ve added no staff. Without the middleware, there’s no way I could have done the same amount of work. There wouldn’t be enough people to review the results.”

Middleware has helped keep other costs down as well. “We have an ‘in and out’ interface. So I can have numerous analyzers connected to the middleware and still use that same interface, which means I can keep adding new instruments up to capacity and I don’t have to pay the LIS vendor for new interfaces. I would pay Roche a connection fee, but that’s much cheaper than what an interface would be. We’re talking about a few thousand dollars versus $10,000.”

Before it acquired middleware, the microbiology laboratory at United Clinical Laboratories had the ultimate data communications problem: Because of limitations in the vendor’s system, staff had to enter all lab orders and demographic data into bioMérieux’s Vitek 1 system manually. “Our LIS is an internally developed HL7-based system,” says Annette J. Hall, MT, UCL’s supervisor of microbiology, histology, and cytology. “Before bringing on TechniData middleware, we were just taking information straight from our LIS requisitions and manually entering it into the bioMérieux data-management system. Now the bidirectional interface from the LIS to the TechniData middleware does that for us.”

Not having a commercial LIS has its advantages for UCL, which with 150 employees serves three local hospitals in Dubuque, Iowa, and the surrounding tri-state area: “All of our programmers and people who work on the system are in-house just down the hallway when we have a problem.” But when Hall and the LIS director thought about developing their own microbiology system about eight years ago, they never got much past the planning stages. “We sat down and spent some time writing functional specifications, so we finished that part of the project. But the man-hours and resources to end up with a full-function microbiology module were more than we wanted to handle at the time.”

Microbiology has unique data management problems, Hall notes. “In the clinical lab, you run the test and you’re done with it. You never have to see the results again, whereas in microbiology you can continue to work on the same specimen for days, weeks, even months. There’s a continuum for many of the specimens.”

UCL became a pilot site for TechniData microbiology software in 2009. “We had upgraded our bioMérieux Vitek 1 system to the Vitek 2 the year before. However, the software package that came with it for data management didn’t meet our needs. We needed something we could use for our daily reporting. So we developed the capability to report and distribute microbiology results from the Vitek 1 through our LIS and sat on the new analyzer for a year and a half without using it. Then bioMérieux approached us about bringing in the TD Micro middleware.” It proved to be essential to her microbiology lab’s use of the Vitek 2.

Once installed, the TechniData middleware boosted the laboratory’s efficiency at the front end and in the postanalytical phase. “We were definitely able to condense the amount of time it took from plate readout to results generation. As a direct result of the new bidirectional interface to the LIS we no longer had to re-enter demographic information, and we were also able to apply individual rules in TD Micro that would keep us from having to remember a list of if/thens, such as ‘if this particular doctor is ordering, then do a certain report,’ or ‘if this particular antibiotic is sensitive, then append a certain comment.’ The software makes some of those decisions instead of you having to remember every little nuance of the CLSI document guidelines.” On the reporting end, she adds, there’s greater efficiency because the technologists don’t have to enter results or comments manually—or even remember to enter them.
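Those if/then rules might be sketched roughly as follows; the physician name, antibiotic, and comment text below are hypothetical stand-ins, and TD Micro’s real rule language is not shown.

    def reporting_actions(ordering_md: str, susceptibilities: dict[str, str]) -> list[str]:
        """Return report actions for one isolate from simple if/then rules."""
        actions = []
        # "If this particular doctor is ordering, then do a certain report."
        if ordering_md == "Dr. Example":                  # hypothetical physician
            actions.append("use expanded susceptibility report")
        # "If this particular antibiotic is sensitive, then append a certain comment."
        if susceptibilities.get("ampicillin") == "S":     # hypothetical rule
            actions.append("append comment: see current CLSI breakpoint note")
        return actions

    print(reporting_actions("Dr. Example", {"ampicillin": "S"}))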

The laboratory’s clerical help performed that data entry in the past—but no longer. “I can’t say we’ve cut personnel because of the middleware; we’re just not that large and we need one clerical person regardless. But the middleware freed her up to do other things we need to do.” Other savings have been reaped from TD Micro’s capability of printing ad hoc plate numbers. “That has let us produce additional bar coding for plates, instead of hand-writing them,” Hall says.

Dr. Kroll’s laboratory at Boston Medical Center recently installed an Accelerator APS (Automated Processing System) line from Abbott Diagnostics with an Instrument Manager module. “That’s where the Data Innovations middleware sits,” he says, and one of the functions it performs is reporting moving averages for quality control.

“Moving averages have actually been around a long time in hematology, because it’s usually very hard to get good QC material for those tests,” Dr. Kroll says. “In chemistry, a smaller number of tests had moving averages, because the whole idea was to use them between controls, and when chemistry analyzers first came out, we used to run controls every 10 patients. That meant that if the controls were down and you had to repeat values, there weren’t that many patients to worry about.”

Over the past 30 years, the systems have become more stable and controls might be run once per shift, or every 12 hours. “So as you can imagine, if you run controls in the morning, run tests all day, and then find at the end of that shift that you have a problem, you may not know when it occurred. If you use a moving average, that can be a good indication that you’re getting into a problem. It limits the time period that you’re at risk of being out of control.”

Moving averages are very helpful, Dr. Kroll says, because over 10 or 20 or more samples, “you actually get rid of a lot of the variation. One exceedingly high sample isn’t necessarily going to change the average that much. So it really helps you follow trends, because you’ll see the averages slowly creeping up, and that’s not something you would expect with random variation.”
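A minimal sketch of that idea, assuming a sodium-like target of 140 and an illustrative tolerance, might look like this; the window size and limits are examples, not any vendor’s defaults.

    from collections import deque

    def monitor(results, window=20, target=140.0, tolerance=2.0):
        """Yield (index, mean, in_control) as each new patient result arrives."""
        buf = deque(maxlen=window)             # keeps only the last 'window' results
        for i, value in enumerate(results):
            buf.append(value)
            if len(buf) == window:             # wait until the window fills
                mean = sum(buf) / window
                yield i, mean, abs(mean - target) <= tolerance

Because one extreme value shifts a 20-sample mean by only a twentieth of its deviation, a single outlier rarely trips the flag, while a sustained drift pulls the mean past the tolerance and does.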

Typically the technologists get a report of moving averages every hour, “and ideally you would like to get it after some type of trigger showing the averages have moved beyond expectations. Then you might run the controls again after an hour or two, instead of running the whole shift.” For example, his laboratory recently had a problem where sodium values were starting to drift high. “You could see it on the controls, so we recalibrated to bring them back in, and we used moving averages to give us confidence we were doing all right. This way we could run controls every two hours and still feel confident, rather than running controls every half hour.” Running controls not only is time-consuming but also can mean escalating costs for reagents and staff time to look at the results, he notes.

“Patient moving averages are probably the most revolutionary form of quality assurance to come out in years,” says Polito, who is in the process of evaluating how moving averages might be used at Dartmouth-Hitchcock. “There’s a long history of their use in industry, but the problem with using moving averages with patients is that you don’t have a uniform population. In industry, you can write a program that will tell you when you’re out of tolerance for producing widgets. In health care, you have healthy and sick patients, male and female, every race, and the entire span of ages from newborn to over 100.”

So he is conducting a study to decide which moving average calculations will work best for particular tests. The chronic problem with controls being run only once or a few times a day is that “you’re only capturing that one instant in time when something is or is not working, and just one second later you might be completely out of process. In addition, you might not find out until days later when a clinician calls and says ‘I got an abnormal value on a patient but I brought them in and they were normal.’ So you look back and you realize there was a small period of time where things were out of control. You may have had a slightly bent probe or you may have run out of wash solution, for example. But sometimes it’s impossible to go back and re-create what happened.”

Instead of having to face this quandary, “moving averages will allow the lab to stop immediately if there is something funny going on, run controls, do comparisons with other instruments, and thus lessen the chance that large batches of patients may have a problem.”

In his study, Polito is concentrating on the “heavy hitter” tests performed most frequently, starting with the reference ranges as mean values. But he may also start looking at some of the exponential moving averages for tests the lab performs in lower volumes. “This is an optional addition to the Data Innovations/Roche package. The trick is customizing the moving average calculations so that the middleware is sensitive enough to flag values when there’s a problem, but is not issuing too many false alarms. I don’t want the tolerance levels to be so tight that I’m always crying wolf, because that won’t work.”
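For reference, one common formulation of the exponential moving average is EMA_t = alpha * x_t + (1 - alpha) * EMA_(t-1); the sketch below uses that form, with the caveat that the actual Data Innovations/Roche calculation may differ.

    def ema(values, alpha=0.1, start=None):
        """Return the running exponential moving average of the values.

        A smaller alpha weights history more heavily: smoother, but slower
        to alarm. Tuning alpha against the alarm band is exactly the
        sensitivity-versus-false-alarm tradeoff Polito describes.
        """
        current = start
        out = []
        for x in values:
            current = x if current is None else alpha * x + (1 - alpha) * current
            out.append(current)
        return out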

Moreover, the tolerances can’t just be ordered off a list, he notes. “They have to be based on your patient population, so you have to develop them yourself. I have a much different population here than, say, Los Angeles or Boston, because I’m out in the boonies.” People who are contemplating the use of moving averages need to take this into account, he says.

He is a strong advocate for middleware. “If it’s done right, it’s a valuable, powerful tool in the hands of the technologist; it allows them to concentrate. With automation they’re not popping tops off of tubes, loading, capping, archiving, and so on, and if you couple that with the middleware’s autoverification, it’s an extremely efficient system. It allows techs to be techs—to question, to play detective—and I see moving averages as just another piece of that.”

For Michael Sheehan, PhD, technical laboratory manager of Kaiser Permanente Regional Reference Laboratory, Denver, middleware recently has meant turning a months-long process into an afternoon’s work when doing correlation studies. “When you get a new analyzer,” he explains, “you have to prove it produces the same results as the old ones, so you need to find a set of samples that span the analytical range for 15 or 20 different assays.”

This process normally takes a long time, even months, he says. “You have to put a bunch of little racks in the fridge and hope that when the person from the company comes, you’ll have enough samples for them to do the studies.” Now that he is conducting correlation studies with the Advia CentraLink Data Management System middleware from Siemens Healthcare, the logistics are much simpler. He can locate the necessary samples with a query that takes about two minutes. “It takes correlation studies to a whole new level,” Dr. Sheehan says.
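The sample-hunting step he describes might be sketched like this in plain Python; CentraLink’s actual query interface is not shown, and the even-binning scheme here is just one plausible way to span a range.

    def span_range(samples, lo, hi, n=20):
        """samples: (sample_id, value) pairs. Pick about n spanning lo..hi."""
        step = (hi - lo) / n
        chosen, seen_bins = [], set()
        for sid, value in sorted(samples, key=lambda s: s[1]):
            if not lo <= value <= hi:
                continue
            b = min(int((value - lo) / step), n - 1)   # which of n even bins
            if b not in seen_bins:                     # keep one sample per bin
                seen_bins.add(b)
                chosen.append((sid, value))
        return chosen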

But this capability is only one fringe benefit of the middleware, he says. The laboratory got lucky about five years ago when the Kaiser pharmacy refill center, with which it was sharing space, needed to expand. “We said, ‘Sure, we’ll move, as long as we get a new building.’ So we were able to design our new lab from the ground up.” The total automation system, called Bayer LabCell (now Siemens Advia LabCell), and the CentraLink middleware that the laboratory installed allowed it to free up three people for its molecular diagnostic lab. “Without the system we wouldn’t have had any personnel to put in there,” he says.

It takes one person to staff what the laboratory calls its command center, which consists of three computers. “One to move the samples around, one to accept the data back from the instruments—that’s essentially the middleware—and one which is the LIS. The computers are all in a row and, like an air traffic controller, that person is supposed to keep track of them all. It’s pretty hectic, but they can do it.”

The ability to integrate quality control with patient moving averages, Dr. Sheehan has found, is the most useful feature of the laboratory’s middleware. “The user sets how often the averages are taken. For example, I might say for every 30 patients that go by, I want you to average the cholesterol results. Those are plotted on a graph just like QC results. But with moving averages, you have the patients acting as their own QC.”

It can save unnecessary worry, he points out. “If somebody calls and says ‘I’ve been getting a lot of high sodiums in the last hour,’ well, you haven’t run your control in the last hour but you have run your patient averages every 30 points, and if there was something actually wrong with the instrument, that moving average would have gone up.”

On the other hand, he has seen the moving averages uncover a problem when two identical instruments’ averages started drifting apart. “We had two instruments doing sodiums and the controls indicated nothing wrong; you would keep running them forever. But the moving averages showed one instrument having an average of 143 versus 138 in the other. So we knew one was too high; we changed the electrode and then the average instantly came back to 138. Whereas with the commercial controls that we buy, you would not have been able to tell which instrument was off.”

Under the laboratory’s CAP accreditation, if the lab has two identical ways of measuring the same thing, at least twice a year it is required to verify that the analyzers are getting the same answer within experimental error. “Well, we do that every half hour, and I can go instantly into the middleware to check it. And I look at it two or three times a week to assure myself. It’s a really quick way, with analyzers that are redundant, to make sure there’s nothing drastically wrong. It takes about a minute and a half, and there’s no way I could check the QC that well that quickly with any other system.”
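That redundant-analyzer check reduces to comparing two recent patient means, as in this sketch; the window and the agreement threshold are assumptions a lab would have to validate for itself.

    def compare_instruments(results_a, results_b, window=30, max_gap=2.0):
        """Compare each instrument's recent patient mean; True means they agree.

        Assumes both result lists are non-empty. A sodium split like the
        143-versus-138 example above would return False.
        """
        mean_a = sum(results_a[-window:]) / len(results_a[-window:])
        mean_b = sum(results_b[-window:]) / len(results_b[-window:])
        return abs(mean_a - mean_b) <= max_gap, mean_a, mean_b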

Another CAP requirement is to verify normal ranges, and the middleware allows this to be done on the fly, he says. For example, if most people have a calcium of 8.5 to 10.5, then the patient moving average should be 9.5. “So another useful feature is that the middleware gives you an indirect indication of your normal range.” However, he notes, this would not work as well in a hospital where the ranges might be distorted because of the number of sick people. “It works well in reference labs like ours, because we are testing only outpatients here, so most people are normal.”
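The midpoint logic is simple enough to sketch directly; the tolerance below is a hypothetical choice, not a CAP-specified limit.

    def range_midpoint_check(values, ref_lo=8.5, ref_hi=10.5, tolerance=0.3):
        """In a mostly healthy population, the patient mean should sit near
        the midpoint of the reference interval (9.5 for calcium 8.5-10.5)."""
        midpoint = (ref_lo + ref_hi) / 2
        observed = sum(values) / len(values)
        return abs(observed - midpoint) <= tolerance, observed, midpoint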

Ad hoc queries are a middleware feature that Dr. Sheehan also finds helpful. “I can do what are called univariate or bivariate plots.” For example, he can check whether more high potassiums are coming off one instrument than off another. “Or, if I’m just curious to know what percent of calciums have low phosphorus, I can do that with the database on the instrument. You could do the same thing on the LIS, but you’d usually have to ask a member of the computer department to write those queries for you, and then you’d need to wait in line for a week.” He considers himself reasonably computer savvy, but not an expert, so he appreciates that “with the middleware, we can data-mine to our heart’s content.”
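As an illustration only, the calcium/phosphorus question could be answered in a few lines against an exported result set; the phosphorus cutoff is hypothetical, and the middleware’s own query tools are not shown.

    def pct_low_phos(samples, phos_low=2.5):
        """Among samples with both results, the percent whose phosphorus is low."""
        paired = [s for s in samples if "Ca" in s and "Phos" in s]
        if not paired:
            return 0.0
        low = sum(1 for s in paired if s["Phos"] < phos_low)
        return 100.0 * low / len(paired)

    print(pct_low_phos([{"Ca": 9.1, "Phos": 2.2}, {"Ca": 9.8, "Phos": 3.4}]))  # 50.0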

Although his lab is one of the smaller Kaiser laboratories, it turns out 5.4 million test results a year with only two people per shift (two shifts per day, Monday through Friday) and one person on Saturdays running the whole system. “I don’t think you could find anybody that turns out as many results with so few people. And there’s no way we could do this without the very useful middleware.” But perhaps because the manufacturers haven’t provided books that fully explain every capability, he believes many middleware users may not be aware of everything middleware can do. He compares the experience to reading Alice in Wonderland as a child, then reading it more carefully as an adult. “You find there’s a lot more in there than you realized.”

Despite the greater ease of use of newer middleware, it can have a downside, Dr. Kroll says. “The old software gave you a lot of freedom, even though it was more complex. It was basically open, and somebody with the time and energy could make it do all sorts of work.” The newer software can’t be manipulated that much. “So it’s a sort of double-edged sword. It’s a big help, but on the other hand it makes the work environment more complex. When you have a problem, it can be very difficult to come up with solid, rapid, simple solutions. Some days you’ll go two steps forward and one step back, and other days it’s two steps forward and two steps back.”

On the other hand, Dr. Sheehan says, the quality control and quality assurance functionality that middleware provides is an invaluable contribution to the laboratory. “Without the middleware you’d have to rely strictly on the technologists’ always doing what they’re supposed to, and I’d be fearful about the reliability of the results. With the middleware, there are enough checks and balances that I can sleep at night. You can be very secure in the results you turn out.”


Anne Paxton is a writer in Seattle.