Big tests, small problems—POC documentation

CAP Today
May 2009
Feature Story

Anne Paxton

Hospitals appear to be making serious headway in bringing documentation of point-of-care glucose and urinalysis testing up to the mark, though areas of concern remain, particularly in urinalysis.

The 2008 CAP Q-Probes study, “Point of Care Testing Documentation,” reports on a large-scale audit of personnel training, competency assessment, and quality control documentation compliance in point-of-care testing programs. It also makes recommendations for ways laboratories can improve compliance, and it calls on labs to reassess and standardize policy on retaining POC urine specimens.

Overall, the study’s findings are encouraging, says study co-author Peter J. Howanitz, MD, director of clinical laboratories at State University Hospital in Brooklyn, NY, and a member of the CAP Quality Practices and the Point-of-Care Testing committees. “There’s really been a major, major improvement in the last 10 years in these quality attributes.”

The 106 institutions that participated in the study reviewed 22,317 POC glucose and urinalysis test records and the personnel training and/or competency assessment documentation for 4,074 test operators. They audited QC documentation compliance and attempted to determine what practice parameters correlate with good compliance. A majority of labs reported 100 percent compliance in each quality indicator category.

In brief, the study finds that 94 percent of the test operators had completed training or competency assessment within the previous 12 months, 97 percent of the glucose and urine reagent strip test records had documentation of the person who performed the test, and 96 percent of QC events had been documented.

Of the small percentage of QC events (three percent) that fell outside the designated range, 93 percent had documentation of appropriate followup per institutional guidelines. Institutions with a higher percentage of laboratory personnel responsible for POC testing were slightly more likely to document followup action.

That’s solid progress, Dr. Howanitz says. “Looking at the series of important questions from the CAP checklist, 10 years ago when the Laboratory Accreditation Program inspected POC testing, it would find the error rate was in the double digits—some even at 50 percent. Now we’re under 10 percent on all the questions on POC testing.”

The study authors have a core take-home message for labs, calling on them to do the following: emphasize the importance of documentation during personnel training and competency assessment; provide practical, convenient mechanisms to document QC events; provide and perhaps post guidelines for followup of out-of-range QC events; conduct regular audits with feedback to test operators; implement an action plan for addressing inadequate compliance; possibly upgrade technology to assist in documentation compliance; and standardize POC urine specimen storage policy.

As their basic approach in this Q-Probes study, the authors decided to contrast two different tests: blood glucose and urinalysis. “The reason we did that is glucose is the one that everyone uses and there are automated systems in place, and urinalysis is almost universally manual and there aren’t those kinds of computer interfaces available,” Dr. Howanitz explains.

Only one laboratory in the study had an automated urinalysis unit on the floor. “The median hospital is doing about 300 glucose tests a day outside the lab and only 25 urinalysis tests. It’s not very economical to have a high-powered instrument do urinalysis whereas there is usually the volume to support automation for glucose.”

Study co-author David S. Wilkinson, MD, PhD, notes the risk factors associated with glucose and urinalysis results are very different. “Glucose results, of course, have much more serious immediate clinical management implications, and as the devices have been rolled out, there’s been a lot more attention paid to policies and procedures and followup related to POC glucose testing.”

Particularly in the early generation of bedside glucose monitors, the limitations were well known, and there were many more measures to guard against errors, especially at low and high levels where the meters weren’t accurate, says Dr. Wilkinson, director of laboratories for the Virginia Commonwealth University Health System, Richmond, and chair of the CAP Quality Practices Committee.

“Generally, there’s not an immediate life-threatening result or action associated with a urine dipstick measurement. You can pick up very serious things, but if you don’t get the right reading on a dipstick, it’s not usually a matter of the patient possibly dying in the next 10 or 15 minutes. With glucose, it could be.”

The four quality indicators selected for the study were percentage of test operators with current documentation of training or competency assessment, percentage of test records with documentation of test operator, percentage of documented QC events, and percentage of out-of-range QC events with documentation of followup action. “We were all rather pleasantly surprised that overall, performance on these particular quality indicators was quite good,” Dr. Wilkinson says, noting that at the 50th percentile, there was essentially 100 percent compliance among the institutions studied.

A relatively high percentage of operators have current documentation of training and competence, as seen in the study, Dr. Howanitz says. “A number of hospitals certainly need to improve their performance, but I can tell you it is so difficult with nurses on a part-time basis or hired on a shift-by-shift basis. All these registry nurses come in under contract with the hospital, and it’s an immense problem to make sure they are competent.”

Figuring out when to test them is hard, especially when the POC testing supervisor is likely to be on a different shift. “And you have to test them if they’re going to be doing POC testing,” he says. “So buried in this data is that kind of problem.”

The study shows that at least half of the hospitals have tested all their people, “which I think is very, very remarkable, given the difficulty of doing this, and the fact that some of these hospitals have more than 2,000 people doing glucose testing,” Dr. Howanitz says. Add to that the number of people doing urinalysis, pregnancy testing, and coagulation testing, and “this is an immense, immense job. It’s one of the things we need to try to automate and to make sure we’re able to stop people who haven’t been assessed from doing.”

James Nichols, PhD, DABCC, FACB, director of clinical chemistry at Baystate Medical Center, Springfield, Mass., which runs upwards of 20,000 to 25,000 glucose tests a month, says he would expect more competency problems with training for urinalysis than for glucose meters.

The study’s finding that the percentage of test records with documentation of operator is lower for urine testing is somewhat predictable, primarily because of the data-management features of glucose versus the totally manual process with urine, Dr. Nichols says. “I’m actually surprised the compliance was as high as it was in urine testing.” Many glucose meters have an operator lockout and require a number that indicates the operator is competent in that meter to do patient testing. “You can add an operator ID into the automated reader, but you can always just read a urine test visually. So there are always these workarounds.”
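A minimal sketch may make the operator lockout Dr. Nichols describes more concrete. The roster, operator IDs, and dates below are hypothetical, not any vendor’s actual firmware or middleware interface; the point is simply that a data-management meter checks the operator ID against a current-competency list before it will run a patient sample.

```python
from datetime import date

# Hypothetical competency roster: operator ID -> date the operator's training or
# competency assessment expires. In a real data-management system this roster would
# be downloaded from the POC middleware; the IDs and dates here are made up.
COMPETENCY_EXPIRES = {
    "RN1042": date(2009, 11, 30),
    "RN2318": date(2009, 6, 15),
}

def allow_patient_test(operator_id, today):
    """Return True only if the operator has current (unexpired) competency on file."""
    expires = COMPETENCY_EXPIRES.get(operator_id)
    return expires is not None and today <= expires

# The meter "locks out" unknown operators and anyone whose assessment has lapsed.
print(allow_patient_test("RN1042", date(2009, 5, 1)))   # True: test proceeds
print(allow_patient_test("RN2318", date(2009, 7, 1)))   # False: lockout, reassessment needed
print(allow_patient_test("TEMP99", date(2009, 5, 1)))   # False: operator not on roster
```

As Dr. Nichols notes, no comparable lockout exists for a visually read urine strip, which is why the workaround persists.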

Even glucose testing, however, doesn’t show 100 percent compliance on this score. “A lot of smaller hospitals are still using patient meters simply because the data-management meters are so expensive,” Dr. Nichols says. “Unless you have a fairly big volume with a couple thousand tests a year or more—or are part of a larger health system where you can partner with other hospitals—you’re not going to be able to afford data-management meters. And until manufacturers get software alternatives for those smaller clinics and smaller hospitals, I don’t think you’ll ever get 100 percent compliance.”

The study’s findings on the number of full-time equivalent staff dedicated to overseeing POC testing are of concern, in Dr. Howanitz’s view. “This is one of the major issues with POC testing, that if someone is going to implement it they really have to consider the support staff that’s going to be required to make sure this job is done correctly. We have 4.5 FTEs dedicated to oversight at our institution, and it’s not enough,” he says.

Given the range of institution sizes in the study (35 percent of the participating institutions have fewer than 150 beds), Dr. Wilkinson says the low averages of oversight FTEs are not surprising.

“There’s not a linear relationship; the numbers [of POC instruments] grow exponentially with the number of beds in a hospital. Let’s say you might have five devices in a 100-bed hospital. If you go to a 1,000-bed hospital, you’re not going to have just 50; you might have 200.” This is because a 1,000-bed hospital is likely to be doing much more complicated procedures, and its beds are occupied by patients who are sicker.

The study does find a significant relationship between one performance metric and a practice parameter: Better compliance was associated with a higher number of laboratory personnel involved in POC testing. “Ninety percent of POC testing is being done by nurses, but in the best situations it’s being overseen by laboratory personnel,” Dr. Wilkinson says.

“We are the ones who actually do the training, help them write their procedure manuals, establish their policies and procedures, then do audits and make sure they’re doing what they’re supposed to. But they’re the ones who are there with the patients, and they have to do everything else nurses do. So the study confirms the more lab people you have involved, it does have a statistically significant impact.”

Lab personnel’s greater knowledge of what can go wrong with a test, and their training and experience on pre- and postanalytical errors, are also important factors, Dr. Nichols, who was not an author of the study, points out. “You are going to have better compliance with documentation than you are with clinical staff who may not have that kind of background.”

Nurses and residents, he notes, are not thinking about instrument QC on a device. “They’re just thinking about how to get a result out of the device. QC for them isn’t second nature, and if they fail QC, they’re just going to want to fix it to make sure they go on. And with urine, there’s no real need to troubleshoot because the bottom line is you can still run it; it’s not going to lock you out.”

When POC testing operators do troubleshoot out-of-range QC results, this study shows, documentation of followup is inconsistent. The least compliant category in this study was documentation of followup action when out-of-range QC results are reported—with 10 percent of the institutions reporting less than 50 percent compliance. “People may take corrective action but not document it. We have trouble with that even in the main lab,” Dr. Wilkinson says.

“If there’s a QC failure, you don’t report results,” Dr. Howanitz says. “You determine what the problem is, make a correction, then rerun the patient specimen. And [the study shows] there is a major problem.” Part of it, in his view, is the operators’ training. “When they were in school, their curriculum didn’t really focus on clinical lab testing, so they often don’t have the background to be able to understand what they should be doing.” It falls to the POC coordinator and staff and others in leadership to close this education gap, he adds.

Says Dr. Nichols, “You may not force a person to put a comment code that says ‘I performed troubleshooting and repeated QC,’ or ‘This is what I did,’ but we should have followup with every failure whether it’s urine or glucose.” Glucose meters themselves will prompt for a comment code after a failure. The problem is the operators can just go away and turn off the device and the comment doesn’t get appended. “But we do want the comment with every failure because the comment indicates acknowledgment of the failure and action by staff,” he says.
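To make the documentation step concrete, here is a minimal, hypothetical sketch of a QC log that refuses to clear an out-of-range event without a followup comment code, in the spirit of the prompts Dr. Nichols describes. The control range and comment codes are illustrative only; they are not taken from the study or from any real meter.

```python
# Acceptable range for a low-level glucose control, in mg/dL (illustrative values only).
QC_RANGE = (40.0, 60.0)

# Illustrative followup comment codes of the kind a meter might prompt for after a failure.
FOLLOWUP_CODES = {
    "C1": "Repeated QC, passed",
    "C2": "Opened new strip lot, repeated QC",
    "C3": "Meter removed from service, POC coordinator notified",
}

def record_qc_event(value, followup_code, log):
    """Log a QC result; refuse to clear an out-of-range event without a documented
    followup code. Returns True if patient testing may resume."""
    low, high = QC_RANGE
    if low <= value <= high:
        log.append({"value": value, "status": "pass"})
        return True
    if followup_code not in FOLLOWUP_CODES:
        log.append({"value": value, "status": "fail", "followup": "MISSING"})
        return False  # hold patient testing until followup is documented
    log.append({"value": value, "status": "fail", "followup": FOLLOWUP_CODES[followup_code]})
    return True

qc_log = []
print(record_qc_event(52.0, None, qc_log))   # in range: True
print(record_qc_event(75.0, None, qc_log))   # out of range, no comment: False
print(record_qc_event(75.0, "C2", qc_log))   # out of range, documented followup: True
```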

In addition to producing a benchmark of POC testing documentation compliance, the Q-Probes study takes up a little-explored issue: how long to retain POC urinalysis specimens.

The study reveals a gap between what current CAP guidelines require and common practice. “CAP checklists specify how long urinalysis specimens should be stored, at least 24 hours,” Dr. Howanitz says. “But that checklist question was developed long before there was external POC testing. So there’s been a lot of discussion on what we should do in terms of requiring POC sites to retain urine specimens.”

It appears from the study that between 70 percent and nearly 90 percent of institutions discard their urinalysis specimens immediately, probably because they don’t have the storage facilities, Dr. Wilkinson says.

But Dr. Howanitz gives an example of when a retained specimen might be needed. A pregnancy test is done on a woman in the ER, and the result is negative—but erroneously negative. The woman then sees a radiologist, who discovers after performing the radiological procedure that she is pregnant. The clinicians would want to go back and evaluate why the test was negative. If the test was not performed correctly, competency would need to be reassessed, but if there is no specimen, the error can’t be traced.

“This is the kind of reasoning behind why we keep specimens in the clinical lab for a period of time,” he notes. “If anything is required before a procedure is done or there is a mistake, we can go back and evaluate.”

So the gap between guidelines and what’s done in the field does raise a question, he says. “And we didn’t know what the answer ought to be. It was interesting that the majority of labs didn’t have a policy on storage of POC specimens. This is one of the controversies we wanted to raise when we designed the study. We felt we needed to gather this information so others can decide if it’s important to require POC personnel specifically to hold on to these specimens.”

There isn’t any published data that Dr. Howanitz is aware of on what happens with POC urine testing specimens after they are read, but he’s fairly sure that very few people are saving them. “They could send them to the clinical lab for storage, so that might be an appropriate solution for an ED that doesn’t have any facilities for storing.” But the chief concern is that standards should be set for what should be done at the point of care.

In hospital accreditation, the Joint Commission calls for the same standard of practice throughout the institution, Dr. Wilkinson points out. With anesthesia, for example, the requirements for who must be present when a patient is put to sleep are no different in the OR than in an outpatient clinic. “So here, you could end up with a situation where there is a ‘different standard of care.’” It may seem like a minor point, but it should be addressed, he says.

Dr. Howanitz suspects people at the point of care are not going to want to engage in storing urine specimens. “But at the same time there can only be one standard for testing. If we’re keeping it in the clinical lab, why throw it out at POC?” The reverse is also true, he adds. “There needs to be a universal standard that’s developed, and a policy needs to be written according to regulatory guidelines.”

For Dr. Nichols, the recommendation to standardize urine specimen storage policies was a bit of a surprise. “I don’t think the majority of institutions performing POC testing are going to hold on to specimens. They don’t have the resources to do it at the point where they’re doing testing, and there’s really no clinical need. In general they have orders to run a test, and if anything turns up positive, to send that specimen on to the main lab for further testing, and the lab would save that sample for them.”

Then, too, refrigeration is not always at hand. “If you’re going to store them, you’re going to have to have refrigeration, and they’re not necessarily sterilely collected. And staff may be dipping right there into the cup in the room. Are they even viable after they’ve been dipped?” Even when they produce positive results, he adds, urine dips are just a “relative” screening method, because the readings will vary with the visual acuity of the person interpreting the strip.

Baystate is one of a growing number of mostly larger institutions that have automated urinalysis readers on the floor. These readers are more standardized and provide printouts that can be placed directly into the patient’s chart or used to transfer results into the EMR, though urine dips are not yet electronically interfaceable in the way glucose meters are.

The study authors say they believe technology such as that should be considered as a means of remedying some of the quality shortcomings found. There is still a fair amount of variation in how POC test results are recorded in the medical record, the study shows. “For glucose results, it was evenly split between handwriting results versus downloading them,” Dr. Wilkinson says. At Virginia Commonwealth University Health System, until docking stations were added recently, glucose results were recorded on a bedside flow sheet but were not necessarily available in the electronic medical record.

But the clear trend is toward data management with automatic download of instruments directly into the LIS. “Over the last five years, it’s gone from almost zero, with almost all glucose results manually entered, to the many more with automated systems, and they’re now into their second generation,” Dr. Howanitz says. “I suspect if we did this same study next year, we’d find a 10 or 15 percent increase, so I think within five years we’ll be at 100 percent.”

That’s not true for urinalysis, however. The automated strip readers found in the main labs take the human element out of results reporting, but “they’re typically not available at the POC sites. They just do the test the old-fashioned way,” Dr. Wilkinson says.

Dr. Howanitz is sure the companies are working on it, “because this took about 10 or 15 years to really happen with glucose at the bedside,” he notes. Automating result reporting will increase operator efficiency and reduce errors in patient care, and it makes it much more likely results will get into the patient record. “If they have to do it manually, they just don’t like to do it,” he says. There’s also a large chance, with manual entry, of the result landing in someone else’s medical record. “We see that happen all the time.”

Despite such pitfalls, the study demonstrates that on the whole there has been impressive progress on POC testing documentation. With an increased focus on this important area through training, better guidelines, regular audits, and action plans, the authors of the study predict measurable improvement will continue and become apparent in future inspections of POC testing programs.

“We’re still not down to less than a one percent error rate—but we’re fast getting there,” Dr. Howanitz says.


Anne Paxton is a writer in Seattle. Bruce A. Jones, MD, of Henry Ford Hospital in Detroit, was also a co-author of this Q-Probes study.