Checklists 2015: signposts are clarity, consistency

Anne Paxton

August 2015—It doesn’t come swathed in a ribbon on the showroom floor, but the 2015 edition of the CAP Laboratory Accreditation Program checklists is new, improved in style and substance, and ready to roll. More precise and consistent quality terminology, more consolidation of requirements into the All Common checklist, and increased clarity on how labs can demonstrate their level of quality are among the highlights of the 2015 edition.

The most noteworthy changes relate to quality terminology, personnel records, specimen labeling, laboratory-developed tests, cancer protocols, and next-generation sequencing. (See page 62 for comments on the next-generation sequencing checklist revisions, and the July issue for details on new Individual Quality Control Plan requirements. A new checklist section on in vivo microscopy, which has been added to the anatomic pathology checklist, will be the subject of a CAP TODAY story in an upcoming issue.)

The broadest changes this year involve quality terminology, says Gerald Hoeltge, MD, checklist commissioner and a member of the Checklists Committee. These revisions are a work in progress, but in this round they led to hundreds of changes in wording to make the use of important terms more precise and more consistent, to reduce redundancy, and to improve concordance with International Organization for Standardization (ISO) terminology.

“There were a couple of motivations behind the quality terminology changes,” Dr. Hoeltge says. “First is the fact that quality is consistency. We’ve got 21 different checklists and sometimes, not surprisingly, words have different shades of meaning in different locations of the checklist. Any word that appears in the checklists ought to be used the same way wherever it appears.”

“Second, we know that English is not the first language for many of the participants in the accreditation program. So we’ve added a long list of definitions to the checklist.”

The Checklists Committee edited three of every five checklist requirements this year with the quality terminology project in mind, meaning 60 percent of the requirements changed in some way as a result, Dr. Hoeltge says. But the quality terminology project has not changed the meaning of any checklist requirement. “We hope it has made them all a little clearer,” he says.

“Terms like ‘procedure,’ ‘policy,’ ‘document,’ ‘preventive action,’ and ‘corrective action’ are used by the Clinical and Laboratory Standards Institute and international standards groups in a very specific way,” says William W. West, MD, chair of the Checklists Committee. “These are common terms in technical manuals and other laboratory publications. When we looked across the checklists, the terms were generally used properly but there was some variation within and between checklists. For example, in the 21 checklists, we found that each would use the term ‘process’ a little differently, sometimes even within the same checklist.”

“We were trying to standardize use of the terms so that no matter which checklist you pick up, the term will be used the same way and have the same meaning.”

The revisions mean that the word “policy” is sometimes replaced with “procedure,” while the word “documented” may be replaced with “recorded.” For example, the mandate “Data must be documented” becomes “Data must be recorded.”

Other changes standardize use of terms with subtly different meanings, such as “preventive action” and “corrective action.”

“Preventive action is defined as action to eliminate the cause of a potential non-conformity or any other undesirable potential situation,” Dr. West says, “while corrective action is action to eliminate the cause of a detected non-conformity or other undesirable situation. We used these terms interchangeably at one time, but they mean different things.”

Inconsistencies may seem minor and minimally perceptible to an end user, but they can compound and lead to confusion, he says. “As we start to move into international markets—a growing segment of our accredited laboratories—it becomes even more difficult for international users to understand why we would use the same terminology in different ways in different checklists.” Bringing the checklists into line with standards like ISO 15189 is another benefit, “so we’re all speaking the same language, so to speak.”

In a major move toward increased consistency, the 2015 checklists now describe specimen labeling requirements as part of the All Common checklist. “Specimen labeling was described in each of the specialty checklists before, so there were almost 40 ways it was referred to,” Dr. Hoeltge says. “Moving it into the All Common checklist obviously reduces the number of requirements for special labeling, and makes the requirements exactly the same now in every part of the laboratory.”

A handful of areas like transfusion medicine do have additional issues in specimen labeling and some requirements remain in those specialty checklists. “But the heavy lifting now is really done with just the All Common checklist requirements.”

A key issue in standardizing specimen labeling (in COM.06100 and COM.06200) was defining the concept of a primary and a secondary specimen. With the 2015 checklists, a primary specimen is defined as the body fluid, tissue, or sample submitted for examination, study, or analysis; it may be within a collection tube, cup, syringe, swab, slide, data file, or other form as received by the laboratory. “A primary label is the label that’s on the specimen when it arrives in the laboratory, so it has requirements for unequivocal identification of contents,” Dr. Hoeltge explains.

The secondary specimen is any derivative of the primary specimen used in subsequent phases of testing, he says. “So it may be an aliquot, dilution tube, slide, block, culture plate, reaction unit, data extract file, image, or other form during the processing or the testing of the specimen.” The aliquots or images created by automated devices and tracked by internal electronic means are not secondary specimens, he notes.

Taking an aliquot as an example, “we have a primary specimen that comes into the lab and we’re going to do some chemistry on it, some hematology on it, and some microbiology on it,” Dr. Hoeltge says. “So that primary specimen is now divided into three secondary specimens. The accession part of the lab will be really focused on COM.06100, the primary specimen labeling requirement. But the other parts of the lab will be getting secondary specimens, so they will be more interested in the COM.06200 part of the requirement.”

The newly formulated definitions are helpful for laboratories with less common specimen types too, Dr. Hoeltge notes. For laboratories doing next-generation sequencing, for instance, “a primary specimen can be a data file. We have some bioinformatics laboratories in the program that are bioinformatics only. They don’t do any wet testing at all; the data file is their primary specimen.”

In the new labeling requirements, the primary container is basically the innermost container that holds the patient specimen, Dr. West says. “So you have to have two unique ways of identifying that specimen; there have to be two patient-specific identifiers. The secondary specimen containers, on the other hand, just have to have one identifier, but it has to be traceable back to all the information associated with the primary specimen.”
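
To make the distinction concrete, here is a minimal sketch, in Python, of how a laboratory information system might check the two rules just described: two unique patient-specific identifiers on the primary container, and at least one identifier on each secondary container that is traceable back to the primary specimen. The field names and checks are illustrative assumptions, not language from COM.06100 or COM.06200.

    # Illustrative sketch only; field names are hypothetical, not CAP-defined.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class PrimarySpecimen:
        accession_id: str
        patient_identifiers: List[str]   # e.g. full name plus medical record number

    @dataclass
    class SecondarySpecimen:
        identifiers: List[str]           # at least one, traceable to the primary
        parent_accession_id: str         # the link back to the primary specimen

    def check_primary_label(spec: PrimarySpecimen) -> List[str]:
        # Primary container: two unique patient-specific identifiers.
        errors = []
        if len(set(spec.patient_identifiers)) < 2:
            errors.append(f"{spec.accession_id}: primary label needs two unique patient identifiers")
        return errors

    def check_secondary_label(spec: SecondarySpecimen, primary: PrimarySpecimen) -> List[str]:
        # Secondary container: at least one identifier, traceable to the primary specimen.
        errors = []
        if not spec.identifiers:
            errors.append("secondary label needs at least one identifier")
        if spec.parent_accession_id != primary.accession_id:
            errors.append("secondary specimen is not traceable to its primary specimen")
        return errors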

The specimen labeling changes continue the Checklists Committee’s efforts to move as much as possible of the specialty checklists into the All Common checklist. Eventually the committee would like to have the All Common requirements and a specialty’s requirements in a single checklist for that particular section of the lab, “with the added value that those All Common requirements will look identical, whether it is a hematology lab, a chemistry lab, or any other part of the laboratory,” Dr. Hoeltge says.

Addressing one of the most common concerns about how to comply with checklist requirements on personnel records, the 2015 edition revises the Laboratory General checklist item GEN.54400 to allow labs to use a certification verification organization (CVO) to confirm personnel qualifications to perform lab testing.

The difficulty of maintaining current personnel qualifications on file—especially for large organizations with hundreds of laboratory testing personnel, such as nurses, who perform nonwaived testing—has been a chronic source of complaints by laboratories, which have been required to produce personnel documentation for inspectors on the day of inspection.

In response to these complaints, the CAP now provides that a laboratory using a CVO need not keep paper copies of diplomas and transcripts on file for all personnel; instead, the laboratory has seven days to obtain a copy of the diploma or transcript upon request. GEN.54400 will require laboratories using a CVO to perform an initial validation of the effectiveness of the process and an annual audit to confirm that the process continues to be reliable.

The issue of personnel records has been the source of a lot of discrepancies between CAP and CMS laboratory inspections, says Richard M. Scanlan, MD, chair of the Commission on Laboratory Accreditation. “It’s been one of the No. 1 things, and that becomes our concern because if there are too many discrepancies, it gets the accreditation program in trouble. So we’re doing everything we possibly can to minimize these problems. And that’s why this change in the checklist is being made.”

Typically, large institutions that may have 2,000 nurses performing point-of-care testing outsource the verification of credentials to a third-party group, Dr. Scanlan says. “We found that the Joint Commission allows up to seven days to have the verification service produce the documentation, and we didn’t want to be any stricter on this than what was necessary. So we are allowing people to do this, and CMS will accept it too.”

Until now, the CAP was issuing citations if labs didn’t have the credentials on the day the inspectors were there. “So we’re just kind of loosening our standards a little to help the laboratories and let them know it’s okay to use the verification services, which we think will improve overall compliance.”

As the new requirements are written, the laboratory has to have a written plan for getting the personnel documents, and check on an annual basis that the plan works by requesting a sample of records and making sure it can get them within the seven-day period. “If they have a documented procedure for recovering the records and annually show that it works, we’ll accept their program,” Dr. Scanlan says.
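
As a rough illustration of what such an annual check could look like, the sketch below (in Python) samples a few personnel records and flags any for which the verification organization could not return documentation within seven days. The record structure, the sample size, and the request_copy helper are assumptions made for the example, not part of GEN.54400.

    # Illustrative sketch; the record format, sample size, and request_copy helper are assumptions.
    import random
    from datetime import timedelta

    def audit_cvo_turnaround(personnel_records, request_copy, sample_size=5):
        # Pull a random sample of records and confirm the CVO can return each
        # diploma or transcript within the seven-day window in the written plan.
        sample = random.sample(personnel_records, min(sample_size, len(personnel_records)))
        failures = []
        for person in sample:
            requested_on, received_on = request_copy(person)  # dates supplied by the lab's CVO workflow
            if received_on is None or (received_on - requested_on) > timedelta(days=7):
                failures.append(person)
        return failures  # an empty list means this year's audit of the process passed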

In the All Common checklist, the CAP made significant changes to the method performance specifications section to address laboratory-developed tests. COM.40350 is an added checklist requirement that establishes the minimum number of samples labs must use for studies to show analytical accuracy of an LDT—generally a minimum of 20 samples, with concentrations distributed across the analytical measurement range.

Other changes address calibration/quality control procedures, LDT reporting, LDT clinical claims validation, and availability of method performance specifications to the inspection team and clients.

“There were a couple of problems that we needed to address,” says Stephen Sarewitz, MD, a member of the Council on Accreditation and chair of the Workgroup on Laboratory-Developed Tests. “The first is that both the CAP and the CLIA validation requirements, while they’re quite detailed, are actually very non-specific about certain elements of the analytic validation of laboratory-developed tests.”

The validation requirements don’t say anything about the extent of the studies that need to be done, Dr. Sarewitz explains. “This is a problem for a couple of reasons. First, there’s no guidance for the laboratory. But also, even though the vast majority of our labs are excellent, there are always a few that do studies that are clearly inadequate. And neither the checklist requirements nor CLIA have any language to allow the CAP accreditation program to go to those problem labs and say, ‘Look, you need to do a study that’s a little more robust.’”

The second problem, Dr. Sarewitz says, is that with the exception of the molecular pathology checklist requirements, there are no clinical validation requirements in CAP for LDTs and there are none at all in CLIA. “That represents a potentially significant problem. Just because a lab can produce a test that performs well analytically, it doesn’t mean that the test is good for diagnosing a certain condition or disease unless there is solid evidence in the literature, or the lab does a clinical validation study.”

While small hospital labs are likely to have very few LDTs, reference labs and large academic labs may have hundreds, Dr. Sarewitz notes. “So it’s a very big deal for them. And the whole nature of LDTs has changed over the last 10 or 15 years. These types of tests used to be relatively simple and straightforward, but that’s no longer the case. They can be extremely complex with molecular testing methods using next-generation sequencing, immunologic methods, high-performance liquid or gas chromatography, and so forth.”

That’s one of the reasons the FDA wanted to step in and directly regulate LDTs, Dr. Sarewitz says. “The Food and Drug Administration’s regulatory requirements are still under development, and we don’t know what the final outcome will be, though CAP has had extensive discussions with the agency. But regardless of FDA’s final requirements, we feel there needs to still be basic foundational enhancements to the checklist requirements.”

As part of the 2012 revisions, the LDT validation requirements were moved from the Laboratory General checklist to the All Common checklist. “The inspector who uses the All Common checklist is the discipline-specific inspector, not the laboratory general inspector,” Dr. Sarewitz says. “We felt that the inspector looking at individual sections of the lab is better suited to evaluate whether a test is properly evaluated.” A comparison of deficiency citations in 2011 and 2012 showed that inspectors detected more deficiencies after the shift. “We think it’s because you had inspectors concentrating on that section of the lab where their expertise lies.”

For the 2015 changes, the new minimum number of samples for validation studies in COM.40350 was one of the main focuses of the LDT committee, Dr. Sarewitz says. “First of all, it’s a phase-one requirement because it’s new and we want labs to get a gradual introduction to it. We didn’t want to impose it in a stringent way initially. So it states that labs must validate analytic accuracy with 20 samples at a minimum. For quantitative tests, the samples should be distributed across the analytical range, and for qualitative tests, samples should include positives, negatives, and low positives.”

If the validation study uses fewer than 20 samples, the laboratory director must provide the criteria used to determine appropriate sample size. The minimum sample number does not apply to manual microscopy (e.g., histopathology, cytopathology, examination of body fluids or blood, or Gram stains) or to conventional microbiologic cultures and susceptibility studies. Also, for certain methods that test for multiple analytes (next-generation sequencing, FISH, HPLC, GC, and others), analytic accuracy may be established for each method, not necessarily each analyte.
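
The sketch below (in Python) is one way a laboratory might screen its own accuracy studies against that floor before an inspection; the dictionary fields and the simple checks are illustrative assumptions, not checklist language.

    # Illustrative sketch; field names and checks are assumptions, not text from COM.40350.
    def screen_ldt_accuracy_study(study):
        # "study" is a plain dict summarizing one accuracy study, for example:
        # {"n_samples": 24, "kind": "quantitative", "spans_amr": True, "director_rationale": None}
        findings = []
        if study["n_samples"] < 20 and not study.get("director_rationale"):
            findings.append("fewer than 20 samples and no documented director criteria for the sample size")
        if study["kind"] == "quantitative" and not study.get("spans_amr"):
            findings.append("samples not distributed across the analytical measurement range")
        if study["kind"] == "qualitative":
            for group in ("positives", "negatives", "low_positives"):
                if not study.get(group):
                    findings.append("qualitative study is missing " + group.replace("_", " "))
        return findings  # an empty list means nothing obviously falls below the minimum described above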

Checklist requirement COM.40640 now includes additional language addressing the extent of clinical validation studies that need to be done if the laboratory makes a clinical claim about an LDT. A “clinical claim” means a statement by the laboratory (in its catalogue, website, test report, or newsletter, for example) that the LDT can detect a particular disease or condition or the risk of developing a disease or condition. A 20-sample minimum will now be required; if fewer than 20, the laboratory director must provide the criteria used to determine appropriate sample size. “This requirement only applies to LDTs whose validity in detecting the disease or condition is not well established in the medical literature,” Dr. Sarewitz says. “If a lab makes a clinical claim about such a test, then they need to have done these types of studies.”

The number 20 was chosen deliberately, he adds. “It’s not a number high enough to be statistically rigorous in many cases. And that’s because it’s fundamentally impractical for many labs to get large numbers of samples. But the idea was to prevent a clearly inadequate study consisting of one or two or three samples.” (However, labs do have an “out,” Dr. Sarewitz notes. If they have a good reason for an insufficient number of samples, the director needs to record that for the inspector.) For tests put into clinical use before July 31, 2016, laboratories can also use data they have accumulated retrospectively, such as proficiency testing data, to beef up a validation study.

Other important checklist changes relating to LDTs:

  • Definition of terms. For the first time, analytical and clinical validation are now defined, in an effort to prevent confusion, says Dr. Sarewitz. Analytical validation, the checklist says, is “the process used to confirm with objective evidence that a laboratory-developed or modified FDA-cleared/approved test method or instrument system delivers reliable results for the intended application.” Clinical validation is “the determination of the ability of the test to diagnose or predict risk of a particular health condition or predisposition measured by sensitivity, specificity, and predictive values.”
  • Elimination of LDT “grandfather clause” (COM.40350). The older definition of LDTs said that for purposes of CAP requirements, the test had to be first used for clinical testing after April 23, 2003. Originally this provision was included out of concern that older, traditional LDTs might be subject to onerous requirements, but as LDT requirements have evolved this will not be the case, Dr. Sarewitz says.
  • More comprehensive LDT lists (COM.40200). The prior checklist required labs only to maintain a list of LDTs that were implemented in the previous two years; now the list is required to include all LDTs.
  • LDT reporting (COM.40630). This checklist requirement (phase one) has been revised to require only that the report state that the test was developed by the laboratory. The previous requirement that the report contain a description of the method has been changed to a suggestion that a “brief description” be included in the report if the information is not readily available elsewhere.

The most important feature of these LDT checklist requirement changes, Dr. Sarewitz says, is the numerical requirement for analytical accuracy. “Irrespective of what the FDA does, the CAP is going ahead and enhancing its checklist guidelines and requirements for LDTs in the interest of lab quality and in the interest of patients.” Other parameters of LDT analytic validation, such as precision, interferences, or lower limit of detection, may be part of future revisions to the checklist, he adds.

The CAP’s 68 cancer protocols must be used by CAP-accredited laboratories for reporting on the definitive resection specimen in which there is invasive malignancy or ductal carcinoma in situ of the breast. With the 2015 checklist edition, the anatomic pathology checklist will require that the case summaries be written in synoptic format and that audits verify compliance with that format.

While tweaks of the checklist requirements relating to the cancer protocols are frequent, the most recent major overhaul of this part of the AP checklist was a couple of years ago, says Jean Simpson, MD, chair of the Cancer Committee. She says the impetus for the 2015 changes, which are significant, is the release of the new American Joint Committee on Cancer staging manual, scheduled for 2016.

The requirement for synoptic reporting (ANP.12385), until now a phase zero—meaning that laboratories just had to collect the data for the checklist requirement—will be given some teeth. “It will be a significant deficiency if you’re not doing it,” Dr. Simpson says.

Synoptic reporting allows the treating physician to readily identify the essential data elements. “It’s a summary,” Dr. West says. “So if you have a complex case, then someplace in the report you have a list of data elements such as ‘tumor type’ or ‘size of tumor’ with a short answer following each element. By looking down this list in a relatively concise fashion, you would get a good idea of what was going on with that specific tumor in the resection specimen for that patient.”
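
As a rough illustration of that format, the short Python sketch below renders a synoptic-style case summary from element/value pairs; the element names and layout are invented for the example, since the checklist leaves the exact design to the laboratory.

    # Illustrative only; element names and layout are examples, not a CAP-mandated template.
    def render_synoptic(elements):
        # Print each data element followed by a short answer, one per line.
        width = max(len(name) for name, _ in elements)
        return "\n".join(f"{name.ljust(width)}: {value}" for name, value in elements)

    print(render_synoptic([
        ("Procedure", "Partial mastectomy"),
        ("Tumor type", "Invasive ductal carcinoma"),
        ("Tumor size", "1.8 cm"),
        ("Margins", "Negative; closest margin 3 mm"),
        ("Lymph nodes examined", "4"),
        ("Lymph nodes involved", "0"),
    ]))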

“There’s a long history of pathologists using narrative reports,” Dr. Simpson says, “and it’s nice to be able to embellish and clarify things in a comment. But getting the essential elements in a form that is easy for the treating physician to find, so that the appropriate prognostic information and treatment management decisions can be made, is made much easier by using the synoptic report.”

Dr. West says the CAP isn’t alone in talking about synoptic reporting at this time; the Joint Commission and the American College of Surgeons are recommending it as well. But the Checklists Committee did not want the details of the synoptic format to be too prescriptive, he says. “They think it would be good for a laboratory to have a little bit of flexibility in how it designs its synoptic format.”

Dr. Simpson doesn’t expect the transition to be difficult for pathologists because many are already using synoptic reporting regularly. “The majority of cases that I see in my consultative practice do conform to a synoptic format,” she says.

However, Dr. West believes synoptic reporting is still not the most common format used in pathology. “There are a wide variety of specimens that aren’t even tumors, such as diagnostic biopsy specimens and cytology specimens, and we still use narrative reports for much of anatomic pathology. But for complex resection specimens with invasive tumors, now synoptic reporting will be required.” Ductal carcinoma in situ is the only noninvasive cancer for which the same requirement will apply.

The second major change is the addition of ANP.12360, which requires audits. This provision basically says “not only do you have to report the protocols but you also have to conduct an audit, through a random sample review, to show you reported them appropriately,” Dr. West says.

“And that is new,” he adds. “A lot of these cancer protocols are detailed. There are 68 of them altogether that the College put out for a lot of different tumor types, and some of them are one-page reports; some are two pages. They can get very detailed.”

“The question became whether inspectors have time to audit these fairly detailed cancer reports,” Dr. West says. “We thought a self-audit would be appropriate for starters; the inspector can look over the audits and later also spot-check to see if there’s an honest effort to meet the checklist requirements for reporting the cancer protocols.”

The checklist does not state exactly how the audits have to be performed, but it does require documentation that laboratories are conducting the audits and that they are performed at least once a year. “Some effort will be required to conduct these audits,” Dr. West says.

Ideally, Dr. Simpson says, laboratory managers can look for this information and compile a management report verifying that it’s being done. “It will be one more thing pathologists have to do,” Dr. Simpson admits. “But I think it will allow really keeping on top of reporting and making sure it’s done in a consistent manner.”

Overall, this round of revisions affected many checklist requirements, more than 1,400 in total, Dr. West says. “Most of the revisions were minor, involving exchanging one word for another, and will not significantly affect participating labs because they really did not change the intent of the requirements.” However, committee members who worked on the checklist revisions concurred with Dr. West that laboratories would be well advised to pay attention to the changes because a handful are significant.

More changes in the checklists are to come, Dr. Scanlan notes. “The organization of the checklist will be changing so that it better suits the customized structure of the laboratory. So if the laboratory happens to mix, for instance, hematology testing with a chemistry test and an immunology test, right now that section would get four checklists. When the project is completed, they will get just one checklist, but with all the right questions in it. We feel it will be easier for the labs to know which questions are relevant to their activities and which are not.”

The CAP’s Sept. 16 checklist update webinar should help smooth the transition to the 2015 checklist changes, Dr. West says. Register at https://attendee.gotowebinar.com/register/5489285781130757122. Registration is limited.

Anne Paxton is a writer in Seattle.