
A look ahead at AI-based assistance in anatomic pathology


Charna Albert

February 2022—In a survey of the international pathology community on the integration of artificial intelligence into diagnostic pathology practice, 80 percent of the 487 respondents predicted integration within the next five or 10 years. Seventy-one percent indicated AI tools could increase their diagnostic efficiency (Sarwar S, et al. NPJ Digit Med. 2019;2:28). In a review of AI in anatomic pathology published last fall, the authors detailed what it will take to get there.

AI use for clinical work, the review authors write, should be “affordable, practical, interoperable, explainable, generalizable, manageable, and reimbursable” (Cheng JY, et al. Am J Pathol. 2021;191[10]:1684–1692). The domain expertise of pathologists is central to design and development. And beyond the necessary buy-in and guidelines, they write, caution is warranted in implementing machine-based assistance in clinical settings, “as pathologists’ diagnostic decisions are prone to be influenced by AI, introducing novel sources of bias” (Kiani A, et al. NPJ Digit Med. 2020;3:23).

Despite the largely positive attitudes toward AI tools among the international survey’s respondents, 48 percent felt that diagnostic decision-making should remain a predominantly human task. Twenty-five percent said it should be shared equally with an AI algorithm.

Though some pathologists may fear being supplanted by AI, says Liron Pantanowitz, MD, MHA, director of the Division of Anatomic Pathology, University of Michigan Health, and one of the review’s coauthors, “we’re far away from that time. The people making AI algorithms are not making them to replace us. They’re making them to assist us, which is a good thing for now.” Furthermore, he says, most vendors are developing what’s called “narrow AI.”

“Let’s say you have to diagnose prostate cancer. You train a prostate algorithm to look at tissue, find abnormal glands, decide if they’re atypical or not, if they’re atypical how bad, and then you can get the grade.” Such an algorithm is trained to do one task, he says. “And that’s all it will do—very well and reproducibly, but it’s not very broad. If there’s something else in that tissue or biopsy, the algorithm won’t pick it up because it’s not designed to catch everything. It’s not going to have real intelligence.”
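
In software terms, a model like the one he describes amounts to a single-task tile classifier: it maps each patch of tissue to one of a fixed set of classes and is blind to anything outside that set. The sketch below is a hypothetical illustration in Python (PyTorch); the class list, backbone, and preprocessing are assumptions for the example, not any vendor’s actual pipeline.

```python
# Minimal sketch of a "narrow AI" tile classifier for prostate biopsies.
# Hypothetical: class labels, backbone, and preprocessing are illustrative,
# not any vendor's actual pipeline. Requires torch, torchvision, Pillow.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

CLASSES = ["benign", "atypical", "gleason_3", "gleason_4", "gleason_5"]

# A ResNet backbone with a five-way head, standing in for a trained model.
model = models.resnet18(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, len(CLASSES))
model.eval()

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify_tile(tile: Image.Image) -> str:
    """Return the predicted class for one tissue tile.

    The model answers only this one question; a finding outside its
    five classes is invisible to it, which is the sense of "narrow AI."
    """
    with torch.no_grad():
        logits = model(preprocess(tile).unsqueeze(0))
    return CLASSES[int(logits.argmax(dim=1))]
```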

Dr. Pantanowitz and his coauthors say the ultimate test of an AI-based system is whether it can be integrated into pathologists’ workflow, and they point to computer-assisted automated Pap test screening as an early success story in this regard. Hologic, maker of the ThinPrep imaging system, has now developed a new deep-learning-based, fully digital cytology platform known as Genius Digital Diagnostics. “We’re testing their product in our lab. We’ve asked for the scanner and AI, and we’re training everyone to do the validation,” Dr. Pantanowitz says of Genius.

Michael Quick, VP of research and development/innovation at Hologic, says Hologic has developed for Genius a scanner that uses volumetric imaging to capture a digital three-dimensional image of the cellular material. “What allows us to do that is capturing the full depth between the top of the glass and the bottom of the coverslip. So you can think of it as a kind of CT scan of a microscope slide.”


To make the massive amount of data captured clinically relevant, the scanner then collapses the three-dimensional image into a single two-dimensional representation, with all cells in focus in a single plane. “It allows the user to quickly get a good representation of the cellular content without having to focus up and down on the individual cells,” Quick says. The system captures the full cellular detail digitally. “But it’s not just capture it, analyze it, then discard it. We’re retaining it, so now the user can make the diagnosis on a high-resolution monitor.”
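
Hologic has not published the details of its collapsing step, but generic focus-stacking (extended-depth-of-field) methods work along these lines: score every focal plane for local sharpness, then keep, for each pixel, the plane where it is sharpest. A minimal sketch, assuming the z-stack arrives as a grayscale NumPy array:

```python
# Generic focus-stacking sketch: collapse a z-stack into one all-in-focus
# 2-D image. Offered only as an illustration of the technique; it is not
# Hologic's published method.
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def collapse_zstack(stack: np.ndarray) -> np.ndarray:
    """Collapse a (z, h, w) grayscale z-stack into one (h, w) image."""
    # Local sharpness per plane: smoothed squared Laplacian response.
    sharpness = np.stack([
        uniform_filter(laplace(plane.astype(float)) ** 2, size=9)
        for plane in stack
    ])
    best = sharpness.argmax(axis=0)      # index of sharpest plane per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]       # assemble each pixel from its best plane

# Example: a 14-plane z-stack of a 512 x 512 field (synthetic data).
edf_image = collapse_zstack(np.random.rand(14, 512, 512))
```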

The ThinPrep imaging system narrows the cellular content to about 20 percent of the slide for review. “With the Genius platform,” Quick says, “we’re narrowing that even more, with better AI, to get a single gallery of about 30 images of individual cells or cell groups to make the diagnosis.” Genius is CE marked for diagnostic use in Europe, and Hologic is pursuing a regulatory path in the U.S.

Paige received de novo approval from the FDA last September for Paige Prostate. The pivotal study submitted to the FDA found a 70 percent reduction in false negatives when pathologists were aided by Paige Prostate, says Juan Retamero, MD, Paige’s medical director. “This was due to improvements in sensitivity and specificity compared to when pathologists read the same cases without AI assistance,” he says. The study has been submitted for publication.

Developing deep-learning algorithms requires a data labeling step (malignant versus benign, necrosis versus fibrosis, for example), and this process is laborious, Dr. Pantanowitz and coauthors write, “especially considering the large number of images and significant person-hours required for review and annotation.” The annotation process creates a bottleneck and is “almost by definition a limiting process and one of the main problems of supervised learning,” Dr. Retamero said in a presentation at the Digital Pathology Association’s 2021 Pathology Visions conference. Paige didn’t train its algorithm by showing it annotated pixels. Instead, it employed multiple instance learning, a weakly supervised deep-learning approach that uses only the diagnostic report as labels for training.

“What we do is show the whole slide images and corresponding pathology report to the computer and let the computer figure out what’s going on,” Dr. Retamero explained at the conference. “So essentially the model learns from the pathology report.” This means that the model also learns from all the associated processes that may have been reflected in the report, such as additional stains and second opinions. “It’s not that the model learns from the immunohistochemistry images themselves,” he said. “It learns from whatever the pathologist put in the report, which of course may include information from other sources and not just the H&E.”
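
In outline, this kind of weakly supervised training can be sketched as a max-pooling multiple instance learner: each slide is a “bag” of tile embeddings, the slide’s score is that of its most suspicious tile, and the loss is computed against the report-level label alone. The code below is a generic illustration in PyTorch, not Paige’s actual architecture; the embedding size and the scorer network are assumptions.

```python
# Minimal multiple instance learning sketch with only a slide-level label.
# The slide score is the max over tile scores, so no pixel annotation is
# needed; only the report-level label (e.g. cancer / no cancer) supervises
# training. Illustrative only.
import torch
import torch.nn as nn

tile_scorer = nn.Sequential(              # scores one tile embedding
    nn.Linear(512, 128), nn.ReLU(), nn.Linear(128, 1)
)
optimizer = torch.optim.Adam(tile_scorer.parameters(), lr=1e-4)
loss_fn = nn.BCEWithLogitsLoss()

def train_step(bag: torch.Tensor, slide_label: torch.Tensor) -> float:
    """bag: (n_tiles, 512) tile embeddings; slide_label: scalar 0.0 or 1.0."""
    tile_logits = tile_scorer(bag).squeeze(1)   # (n_tiles,)
    slide_logit = tile_logits.max()             # most suspicious tile wins
    loss = loss_fn(slide_logit.unsqueeze(0), slide_label.unsqueeze(0))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# One synthetic slide: 1,000 tiles, labeled malignant in the report.
loss = train_step(torch.randn(1000, 512), torch.tensor(1.0))
```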

The Paige Prostate algorithm was trained on 32,300 slides (from 6,700 patients) originating in multiple laboratories. “So the amount of variability we are exposing the model to is incredible, and this is thanks largely to multiple instance training,” Dr. Retamero said. Annotating that number of slides isn’t feasible. With the alternative multiple instance approach, he said, “the system gets exposed to an enormous amount of variability when it comes to patients, preanalytical variables, staining, section thickness. And this amount of variability is the pillar of generalizability”—that is, a model trained on sufficient data “that can be used out of the box in any setting without calibration or further retraining.”

Dr. Retamero likes to think of AI as training a virtual graduate student, “because artificial intelligence isn’t here to replace pathologists,” he said. “It’s here to help pathologists do a better job.” And if a pathologist were to choose a virtual graduate student to screen cases, he said, “which one would be preferred—one who has seen thousands of slides or one who has seen only a few hundred?” That’s the advantage, he said, of the multiple instance-trained model.

In the many laboratories that are not fully digitally transformed but have some level of digital operations, AI can be deployed for quality control, Dr. Retamero said. “Labs should strive to achieve the complete digital transformation of their operations, like radiology did decades ago. But for those labs that choose not to do so, artificial intelligence can provide a safe tool to perform quality control of an entire caseload very unobtrusively. This can be done by digitizing the cases and running AI after diagnosis,” he said.
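
A post-sign-out QC loop of the kind he describes could be structured as follows. Everything here, including the model.predict call and the confidence threshold, is a hypothetical placeholder rather than a real vendor API.

```python
# Hypothetical sketch of post-sign-out quality control: digitize signed-out
# cases, run an AI model, and flag cases where a confident AI call
# contradicts the reported diagnosis. Names and the 'model' object are
# placeholders, not a real vendor API.
from dataclasses import dataclass

@dataclass
class SignedOutCase:
    case_id: str
    reported_malignant: bool   # final diagnosis from the pathology report
    slide_path: str            # path to the digitized whole slide image

def qc_flags(cases, model, threshold=0.9):
    """Return IDs of cases where a confident AI call contradicts the report."""
    flagged = []
    for case in cases:
        prob_malignant = model.predict(case.slide_path)  # placeholder call
        confident = max(prob_malignant, 1 - prob_malignant) >= threshold
        if confident and (prob_malignant >= 0.5) != case.reported_malignant:
            flagged.append(case.case_id)   # route for human re-review
    return flagged
```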
