Laboratory-developed testing as it relates to next-generation sequencing was up first in the NGS conversation led online by CAP TODAY publisher Bob McGonnagle on March 19. Other topics: in-house NGS testing, artificial intelligence, and bioinformatics. “There’s a reality now where bioinformatics is solid, stable, and reliable,” said José Luis Costa, PhD, of Thermo Fisher Scientific.
CAP TODAY’s guide to next-generation sequencing systems begins here.
May 2024—Laboratory-developed tests are top of mind for many people in laboratories. People who use next-generation sequencing panels and machines are, in many cases, heavily involved in laboratory-developed tests, in addition to the panels that many companies offer. Larissa Furtado, talk to us about the role of laboratory-developed tests in the context of next-generation sequencing testing.

Larissa V. Furtado, MD, molecular pathologist, St. Jude Children’s Research Hospital: They’re essential. In our practice, all the oncology NGS tests we have in-house are laboratory-developed tests. We don’t have FDA-approved tests.
Next-generation sequencing is a technology that evolves quickly. You can test multiple genomic anomalies at different genes simultaneously and multiple samples at one time. The flexibility of LDTs gives laboratories the ability to develop their own tests and update them according to their needs. I am in a pediatric hospital where most of our patients are in clinical trials, so we have a comprehensive testing approach, which may be different from other centers or smaller laboratories that use more targeted testing. With the existing accreditation requirements and framework, pathologists are well positioned to bring in these tests and ensure quality.
We’re on the verge of important decisions about laboratory-developed tests. The president of the CAP, Dr. Donald Karcher, is testifying this week during a congressional hearing on LDTs, as is our colleague Dr. Dara Aisner. José Luis Costa, I’m assuming Dr. Furtado’s comments are similar to those you hear from Thermo Fisher customers. Can you comment on the importance of LDTs for your customers?
José Luis Costa, PhD, global director of scientific affairs, clinical next-generation sequencing and oncology, Thermo Fisher Scientific: Dr. Furtado’s comments mirror what we hear from the community. LDTs were fundamental in the evolution of molecular pathology, are fundamental now in providing molecular testing for patients, and will continue to be fundamental in the future. Behind the strategy and discussion going on now, the question is around quality. To provide a test without quality is worse than providing no test at all. Having a layer of quality implemented with any LDT is something we as a community should look forward to. Of course, there is a balance—we can’t have a perfect test that is not being used to test patients. Equity of access is important, which is where LDTs come in as a key player. It’s a dual balance of having good-quality tests that are equally accessible to patients, as much as possible.
Karla Bellett, are you picking up on an anxiety within your installed base should an LDT decision from the FDA become so drastic that it removes the ability to run some of these tests?
Karla Bellett, MT(ASCP), CLS, senior staff regional segment marketing manager of oncology testing, Illumina: Yes. This is causing a level of concern like that raised by the FDA’s analyte-specific reagents guidance update in 2007. The laboratory community has CLIA and CAP guidelines that help you understand what you need to do for an LDT and how to validate it when it’s not an FDA-approved assay. A lab will do basically what a manufacturer would do; it’s like an analytic evaluation. You can check the limit of detection, variant allele frequency, sensitivity, specificity, CVs at certain levels if there’s a cutoff. Different groups, like the AMP and Cancer Genomics Consortium, have published recommendations for the different types of molecular technologies. As a community, we need to continue to raise awareness about what quality assurance programs are already in place to ensure quality laboratory testing and that the testing is governed in part by CAP and CLIA inspections. These are all ways to verify that the performance of an LDT is as good as, if not better than, an FDA-approved assay.
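The performance metrics Bellett names are simple to compute once the concordance data from a validation study are in hand. A minimal illustrative sketch in Python, using entirely hypothetical numbers (not from any real validation study):

```python
# Sketch of the LDT validation metrics mentioned above, computed from
# hypothetical concordance data against a reference method.
# All figures are illustrative only.

def sensitivity(tp, fn):
    """Fraction of known variants the assay detects (true positives / all positives)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of variant-negative sites correctly called negative."""
    return tn / (tn + fp)

def cv_percent(values):
    """Coefficient of variation (%) for replicate measurements, e.g. VAF near a cutoff."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return (var ** 0.5) / mean * 100

# Hypothetical: 98 of 100 known variants detected; 1 false positive in 500 negatives
print(f"Sensitivity: {sensitivity(98, 2):.1%}")   # 98.0%
print(f"Specificity: {specificity(499, 1):.1%}")  # 99.8%
# Hypothetical replicate VAF measurements near a 5% reporting cutoff
print(f"CV: {cv_percent([4.8, 5.1, 5.0, 4.9, 5.3]):.1f}%")
```

A real validation per CLIA/CAP requirements would of course cover far more (accuracy across variant types, limit of detection, reportable range, reproducibility across runs and operators), but the arithmetic behind each metric is this straightforward.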
There’s a lot of concern in laboratories about the new proposed FDA regulation, and it could cause major institutions, especially the academic medical centers, to have to redo some of their work.
Sam Hester, we see announcements of health care systems that decide to do next-generation sequencing with large gene panels on virtually every cancer patient diagnosed in the system. A severe limitation on the use of LDTs would negatively impact that kind of initiative, wouldn’t it?
Sam Hester, staff product marketing manager, Illumina: Absolutely. There are always new innovations in sequencing, chemistry, costs, and accuracy, as well as new tool providers and new ways of analyzing their data. The research community marches ever onward in terms of setting new benchmarks and capabilities within basic and translational research. LDTs have been a way to potentially access the latest innovations a little faster on the clinical side. An IVD approval takes a while, and by the time it gets through, there’s potentially newer technology that cuts the cost in half again or enables a new benchmark in accuracy. Moving forward, it’s important to consider how we set our patients up to achieve the best possible outcomes with the technology that’s available, whatever system is in place.

Shu Boles, as we talk about the regulatory environment for this kind of testing, tell us what your thoughts are.
Shu Boles, PhD, MBA, director of genomics strategic marketing, Qiagen: This potential decision is affecting many customers in the field and causing anxiety. The consensus we hear is that what we have so far is good and enables the best patient care, so why would we hinder progress that allows customers, patients, and molecular pathologists to access the latest validated assays to help in selecting and monitoring therapy?
There seems to be an increasing appetite to bring NGS in-house, not only because of turnaround time but also because clinicians are eager to begin treatment with a full testing profile on their cancer patients, though there are many noncancer conditions for which NGS is also important. Dr. Furtado, do you find, as you speak to your colleagues at the CAP, AMP, et cetera, that this movement to bring NGS in-house is gaining momentum?
Dr. Furtado (St. Jude): NGS is becoming standard of care because of its flexibility, ability to test multiple genes simultaneously, and cost-effectiveness from a workflow and tissue utilization standpoint. An advantage of developing tests in-house is you get to understand your test better, especially the bioinformatics, compared with onboarding a system or software developed by a third party. There is an advantage to having the flexibility to develop specifically what we need. Even if you’re purchasing commercially available NGS panels that have a large number of genes, if you develop the test in-house, you can decide if you want to validate all genes or a subset of them for clinical reporting. And, as you said, the turnaround time is faster. Also, because you know more about your test, you’re better prepared to answer clinicians’ technical questions related to the interpretation of the findings.
I see a lot of movement toward bringing tests in-house, but there will always be send-outs for tests that one is not equipped to provide in-house.
Shu Boles, other technologies are waiting around the corner—digital PCR has gotten a lot of publicity. Do you regard that as part of the basic family of genomic understanding of patients in the clinical context?
Dr. Boles (Qiagen): I do. We have great technology to support our doctors and patients, and we need to use the right technology to answer the right questions. For example, sequencing can reveal a lot of information, but if you use NGS for tumor classification rather than a methylation array, the cost of obtaining that amount of information would be astronomical. Similarly, you have to evaluate the cost of digital PCR, the information you can get, and the accessibility of the technology, especially when you are considering bringing it in-house. Digital PCR, for specific applications such as monitoring, can be more cost-effective. Maybe you’re not getting the entire picture, but in terms of accessibility and your in-house resources, perhaps it’s better suited for certain applications. It’s a balance.
We’ve been chronicling the evolution of the molecular tumor board. At one point, there was in some places a separate molecular tumor board. Coming away from that, someone from that board would go to a more general tumor board or case planning conference and share the NGS information that would help to inform. Now the molecular tumor board as a freestanding board is disappearing in many places because there’s more confidence that the people around the table can follow the discussion of sequencing and other advanced technology results. Is this true in your experience, Dr. Costa?

Dr. Costa (Thermo Fisher): I wouldn’t say it’s disappearing; it’s maturing. Institutions approach molecular tumor boards in different ways in terms of their participants and their competencies. In some institutions the tumor boards are still molecular based, almost a mutation board discussion on interpreting the identified variants and how they can provide information on prognosis or therapy decisions. Other institutions have not only the molecular biology component but also radiologists, pharmacists, oncologists, and pathologists. It’s a much broader tumor board that, in more complex cases, goes in-depth with all the clinical competencies to bring together the information, not just from NGS but also from imaging and biochemical testing, to provide a more complete interpretation of the status of the patient and better decisions for therapy.
There are also institutions that take part in networks of molecular tumor boards because, for example, a specialty or competency is not present at that institution so they bring external experts into discussions of certain cases.
Karla Bellett, talk to us about the common problem of reference laboratories doing next-generation sequencing, the results of which sometimes go to a surgeon or medical oncologist and are never fully integrated into the original pathology workup of a patient. Is there greater coordination and coherence within tumor boards so specialists have the same information as they talk about cases?
Karla Bellett (Illumina): I definitely see evidence of more specialized tumor boards. The National Cancer Institute hosts a weekly virtual neuro-oncology tumor board for central nervous system tumors. Anyone from across the United States, and even outside the U.S., is welcome to participate and submit cases for review, and they rely on the integrative approach Dr. Costa described. They want to include imaging, results from IHC, FISH, PCR, NGS, DNA methylation. What did the classifier say, what is the CNV analysis, what is the MGMT promoter methylation score. This approach is also being done at trade shows. For example, at the Society of Neuro-Oncology annual meeting there are satellite symposiums with panelists that include interventional radiologists, neuropathologists, neuro-oncologists, and neurosurgeons. They ask all the health care personnel involved in the CNS tumor patient continuum what their approach would be to a case and what they have found to be helpful.
Dr. Furtado, many pathologists, particularly those who are not specialized in molecular pathology, find the large number of tumor boards to be difficult in terms of staff time. What are your thoughts about the future of tumor boards?
Dr. Furtado (St. Jude): I see a lot of value in tumor board because it’s where you integrate the findings from your patient more holistically with an interdisciplinary group that comes together with the goal of management. Everything we are doing in pathology, with ancillary tests, and in radiology is aimed at getting to the best answer for that management. Tumor board discussions help personalize management, especially in more difficult scenarios. My colleagues consider it a part of what they do and a privilege.
Some of the initial excitement around NGS was muted because the informatics seemed to be a struggle. People were thinking they’d have to import experts in computational pathology and computer science to offer NGS. Dr. Costa, can you comment on the informatics component of NGS?
Dr. Costa (Thermo Fisher): There’s a reality now where bioinformatics is solid, stable, and reliable. There are plug-and-play solutions, so there’s no need for bioinformatic expertise or presence in a laboratory to provide NGS results. Another reality is in translational research-related activities, where the bioinformatic component is more impactful but requires more flexibility and expertise. The two realities coexist.
Some institutions might require both situations—clinically oriented, simple plug-and-play answers or more specialized bioinformatic expertise, such as for large academic centers that are doing research, bringing the knowledge, and moving the needle. Many health institutions are just required to offer simple NGS tests to provide clear guidance for patient management. Those solutions do not require any bioinformatic expertise because they are well established in the routine setting.
Sam Hester, what are your thoughts about the application of AI in the interpretation of NGS and other advanced tests?

Sam Hester (Illumina): There’s a lot of excitement and anxiety as well, when you consider the tools available and how rapidly they’re developing. The more that NGS data grows, especially human genetics data as a whole, the more important it is to apply machine learning and AI tools to be able to quickly and efficiently sort through the information, compare it to the databases that exist, and come up with the most integrated and informative answers. There is a race going on now with who can best apply the latest innovations in AI technology to answer those questions. Illumina is looking into that with our PrimateAI.
There are ethical questions in terms of how the data will be used and where and how those access points will integrate within the clinical world. There’s a lot going on in the translational basic science area now—how does this apply to the ultimate goal of providing the best patient answers? Those questions are yet to be fully answered.
Dr. Furtado, what are people at St. Jude talking about in the hallways around AI and its usefulness for your mission and patients?
Dr. Furtado (St. Jude): About four and a half years ago, I implemented a methylation array-based assay here for central nervous system tumor classification, and we developed our own classifier, which is a supervised machine learning method. We started with a neural network and evolved to another architecture. So machine learning is already a reality in our clinical practice with DNA methylation profiling for CNS tumors, and we have started to extend it to the classification of other tumor types, such as leukemias. Other research laboratories at St. Jude are also doing exciting things in this realm, like multimodal and multiomics data analysis, initially for discovery, but hopefully value can be demonstrated and these can be implemented clinically.
AI has a place in genomics—any omics—and in medicine in general if it’s implemented correctly. A framework already exists at the CAP for assay development and implementation that can be applicable to AI. I’m optimistic about it.
Karla Bellett, can you comment on the role of AI and how it’s evolving?

Karla Bellett (Illumina): I agree with Dr. Furtado. St. Jude developed a CNS tumor classifier using a supervised neural network algorithm and has compared it with other algorithmic approaches, like random forest, to determine which is better. The appeal of machine learning algorithms, especially for tumor classification using array technology, is that they’re not only affordable but, once the classifiers are built and the collected data builds up the reference cohort, they function like a fingerprint database. Data from an unknown CNS tumor can be uploaded to the classifier, and a tumor classification result with a confidence score will be returned in as little as 10 minutes. This new tumor’s epigenetic signature is now part of the classifier, so one value of building these AI classifiers is they continue to increase the molecular class representation and become more robust over time.
Illumina was notified on March 1 that Heidelberg Epignostix, a DKFZ spinoff group in Germany that pioneered the initial CNS tumor classification work, had received approval from the AMA for a new MAAA [multianalyte assays with algorithmic analyses] CPT code, 0200M. The code will be for CNS tumor methylation classifier analysis when testing is performed using a DNA methylation microarray kit and array scanner. It will be effective July 1, and laboratories could potentially bill CMS beginning January 1, 2025. This is an example of AI already in use and shows it’s not being done only at places like St. Jude. This will be an opportunity for other laboratories interested in the technology to use it and justify internalizing the testing.
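The classify-with-a-confidence-score pattern Bellett describes can be sketched in a few lines. The following Python example is purely illustrative: the data are synthetic (real classifiers train on curated reference cohorts spanning hundreds of thousands of CpG probes), and the random forest stands in for whichever algorithm a laboratory actually validates.

```python
# Illustrative sketch of a methylation-based tumor classifier that returns
# a class call plus a confidence score. Synthetic data; not any real assay.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_probes = 50  # real arrays measure hundreds of thousands of CpG sites

# Synthetic reference cohort: two tumor classes with shifted methylation levels
class_a = rng.beta(2, 5, size=(40, n_probes))   # hypomethylated pattern
class_b = rng.beta(5, 2, size=(40, n_probes))   # hypermethylated pattern
X = np.vstack([class_a, class_b])
y = ["class_A"] * 40 + ["class_B"] * 40

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# "Unknown" tumor profile resembling class B
unknown = rng.beta(5, 2, size=(1, n_probes))
proba = clf.predict_proba(unknown)[0]
call = clf.classes_[proba.argmax()]
print(f"Classification: {call}, confidence score: {proba.max():.2f}")
```

The class-probability output is what gets reported as the confidence score; production classifiers also apply a calibrated threshold below which no class call is issued.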
Dr. Furtado, gene studies and panels have transformed the classification of tumors in all areas. The WHO blue books are being updated frequently based on the new knowledge. Do you anticipate that as AI applications mature, they will also have a transformative effect on the classification schema in cancers?
Dr. Furtado (St. Jude): If we use methylation for brain tumors as an example, it’s already here. Many of the tumor subtypes in the fifth edition of the WHO classification of CNS tumors have been classified based on methylation. Some tumor types, such as medulloblastoma and ependymoma, cannot be granularly subclassified by any method other than methylation. As it expands to other tumors, we may see more. The bottleneck is having good-quality and sufficiently large data sets to train the classifiers. That may keep other tumor types from moving as fast as what we are seeing for CNS tumors.
Having the manpower and computer power to achieve these things is daunting, but it seems it’s going to happen and follow on quite naturally, based on these discoveries. Dr. Costa, do you agree?
Dr. Costa (Thermo Fisher): I tend to agree. We are thinking of AI in just one aspect of the landscape of a patient’s path. We focus on data analysis and bioinformatics, but there are many AI tools helping with the management of the logistics in a hospital, bringing the information together. The fear in AI is having it do clinical interpretations. This is where things start to become gray. Interpretation will continue to rest with pathologists; they aggregate the information and interpret the data. AI will be a support, bringing together and adding information for a better interpretation.
I did an exercise with ChatGPT in which I asked it to write a report of a patient with an EGFR mutation. It’s impressive how it writes a fantastic report according to the guidelines. There’s still a lot to learn, but AI is here. It’s providing new avenues not only for research but also for implementation.
Dr. Boles (Qiagen): Dr. Costa made an excellent point for what should happen. He asked ChatGPT to write a report about EGFR, but he read through it to see if it made sense. Similarly, if we were to leverage AI tools to do clinical interpretation, someone like Dr. Costa should be there to make sure we’re not going off the rails. Human intervention is necessary.
Dr. Furtado, can you tell us about the developments around NGS and reproductive health?
Dr. Furtado (St. Jude): There have been exciting developments around doing broader testing for preimplantation genetic diagnosis than just targeted chromosome analysis or single-gene analysis to improve screening and management capabilities.
There have been advances in infectious disease and NGS. Shu Boles, tell us about infectious disease and these advanced technologies, from your perspective.
Dr. Boles (Qiagen): The pandemic gave the general public a crash course on the application of NGS, which is fantastic because everyone can approach NGS with a little less fear and understand the potential it can bring, such as to infectious disease. We can harness the power of NGS to facilitate the identification of tuberculosis or drug-resistant TB strains to better manage people’s health, for example. Illumina launched a panel to do that, and Qiagen has its QuantiFERON franchise. We also have an NGS library prep pipeline to develop TB testing and beyond, precisely because of the massive sequencing power for metagenomic and metatranscriptomic studies that can be used to identify viruses and other infectious disease agents—you can detect a multitude of them at the same time. You can use NGS in combination with complementary technology for a faster diagnosis or for surveillance, as we have seen in the past three or four years.
During the height of the pandemic, but particularly as new variants were detected, public health labs were overwhelmed trying to get sequencing to identify the variants. I’m assuming, in preparation for the next pandemic, that more attention will be paid to the infrastructure for gene sequencing and other technologies in public health laboratories. Sam Hester, do you see that activity underway?
Sam Hester (Illumina): Absolutely. The transition away from COVID-specific variant testing to broader viral surveillance panels has been underway since late 2021, early 2022. We’re focusing on areas such as detecting coinfections, comorbidities, and new strains of multiple types of viruses, especially coronavirus and flu strains, and on different virus panels. We’re also focusing on bringing sequencing to a broader market globally. A big change with our recent launch of the NovaSeq X was ambient temperature shipping and the elimination of the cold chain. It expands access to where sequencing can occur, such as on the ground in places where diseases are emerging.
Dr. Furtado, do you have any final thoughts?
Dr. Furtado (St. Jude): NGS is here to stay. There are many exciting new possibilities not only in terms of chemistry but also different analyses we can derive from it. There’s a lot to look forward to, including new algorithms and the ability to do new analyses.
Dr. Costa, a final comment?
Dr. Costa (Thermo Fisher): NGS is not only here to stay, but NGS instruments have become standard laboratory equipment. The goal of the community is to democratize NGS access as much as possible. All the partners and stakeholders should work together to make this a reality. Even though NGS instruments are standard laboratory equipment, more than half of patients with cancer are still not adequately tested for the relevant mutations. There is a huge gap that needs to be filled, and it’s not because of a lack of technology; it’s a lack of accessibility to the technology. That’s where much of the effort needs to be put as a community.