Eyes on faster, cheaper, simpler next-gen sequencing

May 2021—Next-generation sequencing analysis and interpretation, as well as reimbursement, were some of what CAP TODAY publisher Bob McGonnagle asked Illumina and Thermo Fisher executives and Jeremy Segal, MD, PhD, about when they gathered on a March 24 call.

McGonnagle asked, too, about variants of unknown significance and for views on what lies ahead for NGS. “Circulating tumor DNA analysis is starting to move wholesale into the academic setting,” along with other applications, says Dr. Segal, of the University of Chicago.

What they said about NGS now and in the future begins on this page. See also CAP TODAY’s guide to next-generation sequencing instruments.

In all other areas of laboratory instrumentation, we have multiple vendors. Yet in next-generation sequencing, as important as it is and with its rapid growth, we have two companies with offerings for labs to choose from. Can you comment on that?
Pierre Del Moral, PhD, MBA, senior segment marketing manager, oncology testing, Illumina: Only 15 percent of cancer patients nationally are currently being tested with NGS. That means there's a lot of room to grow, and it means adoption is not yet wide across labs, for various reasons. That is probably why we, as NGS vendors, are positioning our solutions to replace some entrenched technologies, and it may be why some other technology vendors are not here yet: their existing technologies are already accepted and implemented in the laboratory.

Sohaib Qureshi, PhD, director of product management, instrumentation, Clinical NGS Division, Thermo Fisher Scientific: I agree with Pierre. As the utility of NGS increases over time, with therapeutic indications based on NGS testing for specific biomarkers, you'll see that shift start to happen. We'll eventually get there as NGS gains entry into laboratories with less expertise.

Part of that is making the workflows much easier. There’s nothing push-button at the benchtop for NGS at the moment. Illumina, like Thermo Fisher, is trying to get there—to make everything faster, cheaper, and more automated. That’s the part that will help propel NGS and make it possible to penetrate that space where NGS testing will be required for the therapeutic indication.

The Association for Molecular Pathology released in March an interesting analysis of the effort required for molecular test interpretation. One of the themes that emerges from the AMP's research is that NGS analysis and interpretation still demand a great deal of time. Is that also your perception, Jeremy, and is that the feeling at the University of Chicago about NGS interpretation on specific cases?
Jeremy Segal, MD, PhD, director, genomic and molecular pathology, and associate professor, University of Chicago: Yes, certain cases do take a lot longer. I can speak more to the oncology setting. In the inherited disease setting, if you’re doing whole exome or whole genome sequencing, it could also take a fair amount of time, but we’re typically running large-scale cancer panels and, yes, some cases take a long time. Other cases are quick. I don’t begrudge the time spent on it; that’s the joy of the job for us. We have five or six people who spend their time signing out cases, and it’s always fun. We discuss the cases. We figure out what’s going on with the patient. We go back to the anatomic pathologist or oncologist if we need to and ask what’s going on. It’s the good part—the medicine part.

Fiona Nohilly, would you like to comment on anything you’ve heard so far, but particularly on reporting and analysis? There may be more anxiety about that step than there is about the actual understanding of NGS. I don’t think I’ve ever seen a single technology catch on as quickly with as many applications as next-gen sequencing.
Fiona Nohilly, staff product marketing manager, AMR regional marketing, Illumina: Analysis and interpretation is the next frontier and focus area for Illumina, and probably for Thermo Fisher, because sequencing has become a bit more mainstream within major academic medical centers. It's definitely not widespread, and we have a lot more work to do on educating the broader pathology community, some of which starts with medical education. So much is learned in medical school that if we, as an industry, were able to get ourselves into medical education, the process of introducing these technologies could begin at that point and then continue through training.

At Illumina we are focusing on the analysis portion; it's something we've invested in. We've acquired several companies in the past year, including Enancio, and we're integrating their genomic data compression technology into many of our platforms. Our technology teams are developing many tools for data interpretation, to help speed it up. That's not to say we don't want the molecular pathologists involved; they have to be there. It's essential. But enabling them to work faster, with less manual effort, is what we're investing in.

Years ago, before Thermo made the acquisition that has led to its important role in NGS, people were saying, “We might have this kind of machine in every doctor’s office. It wouldn’t be much different than a small chemistry analyzer. And we would hope to make the analysis simple, in a black box, and then have a relatively easy report that comes out of the machine.” Sohaib, is that still an ideal of yours?

Dr. Qureshi (Thermo Fisher): Yes, it is. If you're familiar with Thermo Fisher's technology, you know we pride ourselves on having a single point of contact in a single solution. But before I address that, I'd like to return to what Fiona said. Education is critical. Of all my friends who went to medical school, not one was jumping out of their chair to take genetics as an elective course.

There is this anxiety when you’re looking at so much data. How do we make it easier? Part of that is education. At Thermo, we have our Knowledgebase Reporter, and that interpretation tool has been around for some time.

But the final frontier for NGS is not what happens upstream of analysis. It is the analysis and everything downstream: the visualization, how you look at the report, how easy the interpretation is, the data aggregation strategy. All of these things are going to need to be solved, and, like Illumina, Thermo Fisher is working on them. We feel we have a nice solution right now to make it easier, more straightforward, and simpler. But as new applications come on board, as we get new biomarkers, as we move forward, we will need better software on the back end. Thermo has invested heavily in that internally, and we're always looking at a build, buy, or partner strategy as well, so we'll continue on that front. The ultimate goal is to make the interpretation much easier than it is today.

Jeremy, what are your thoughts on widening the availability of NGS? I’m sure you receive a lot of referred cases at the University of Chicago.
Dr. Segal (University of Chicago): We do some outreach, but the majority of our cases are for our own patients at UChicago. Many academic centers are like that. Going back to the topic of expanding NGS, one aspect of this is how many patients are getting tested, and the second aspect is where the testing is taking place. You could have more patients tested using the existing providers today, or you can talk about expanding NGS testing across many places that aren't doing it currently. In some ways these aspects are connected, because if you're being treated at a community center that is not already doing testing and that center then gains NGS capabilities, you might find patients end up getting tested who might not have otherwise, even though there is broad familiarity with labs like Foundation Medicine in the community.

This is an issue we're seeing as a trend in lung cancer, where patients at some clinics are not necessarily getting tested for EGFR because some oncologists are reflexively ordering immunotherapy. This is just one application where proper patient care requires testing, and getting oncologists to order these tests on an expanded basis means education, and perhaps education of patients, because a patient who advocates for testing is more likely to get it.

There’s also the issue of what the actual applications are. Many patients aren’t being tested because there are no great available markers that you would hope to find in their case. So part of it is going to be expansion of immunotherapies and new targeted therapies. As more applications and opportunities come on the market, those patients are going to start getting tested.

So one side of it is what oncologists are doing and whether they are ordering it. The other question is who's doing the testing. Right now the big companies and the academic centers are doing it, but not a lot of community centers and small hospitals are. A combination of factors blocks them from doing it, mainly the expense and expertise required and the lack of optimal reimbursement. Getting NGS testing to expand further into the community will require active work to lower these barriers.

I know you have brilliant surgical pathologists at the University of Chicago, and I’m sure they get cases in consultation. Is it relatively rare for one of those surgical pathologists to request sequencing in a case?
Dr. Segal (University of Chicago): We definitely do that, but it’s not a major fraction of our work.

Dr. Del Moral (Illumina): I'd like to add to what Jeremy said about reimbursement. The adoption of NGS is affected by the NCCN guidelines, so our market development team is working on advancing the inclusion of NGS recommendations in the NCCN guidelines, and working alongside our medical affairs experts to showcase clinical utility and influence reimbursement. It is a multipronged approach that we need to leverage for NGS to become accessible, not just from a technological standpoint.

Yes, and one of the things that strikes me is how the panel size simply keeps increasing. Years ago, two or three genes were identified and examined, and now we’re up to 50 or 60 or more. As these panels become larger, are we on the cusp of changing over to a different technology—whole exome, whole genome?
Dr. Segal (University of Chicago): We’re starting to get there. I don’t think of that as a whole different technology, though. If you look at our panel, or Foundation Medicine’s panel, you take the DNA, you fragment it. You make a library. You set up a capture reaction with your specific probes. We use a capture panel of 1,005 genes right now, and we’re reporting 155 of those genes. But switching to exome just means substituting exome capture reagents for our current probes. The protocol doesn’t even change.

The major difference with exome is just that you have to sequence all the genes, so it gets more expensive. We would probably spike our capture panel into the exome so that we increase the depth for the clinically relevant genes and then get the background exome at the same time. But we can also think about reflexing. We can do our regular capture, and if people wanted to get exome data from a particular sample, we can reflex that same library to an exome recapture and sequence that also.
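To make the capture-versus-report relationship Dr. Segal describes concrete, here is a minimal sketch, not from the discussion and not any lab's actual pipeline, of how variant calls from a broad capture (a large gene panel, or a reflexed exome) might be restricted to a smaller reportable gene list. The file names and the GENE column are hypothetical.

```python
# Minimal sketch: restrict variant calls from a broad capture (or a
# reflexed exome) to a smaller reportable gene list.
# File names and the GENE column are hypothetical.
import csv

def load_gene_list(path):
    """Read one gene symbol per line, e.g. a ~155-gene report list."""
    with open(path) as fh:
        return {line.strip().upper() for line in fh if line.strip()}

def filter_reportable(calls_tsv, report_genes, out_tsv):
    """Keep only variant calls whose GENE column is on the report list."""
    with open(calls_tsv) as fin, open(out_tsv, "w", newline="") as fout:
        reader = csv.DictReader(fin, delimiter="\t")
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames, delimiter="\t")
        writer.writeheader()
        for row in reader:
            if row["GENE"].upper() in report_genes:
                writer.writerow(row)

if __name__ == "__main__":
    genes = load_gene_list("reportable_genes.txt")      # hypothetical file
    filter_reportable("panel_or_exome_calls.tsv", genes,
                      "reportable_calls.tsv")           # hypothetical files
```

The point mirrors the wet-bench argument: whether the input comes from the current capture panel or a reflexed exome, the same downstream filter decides what is surfaced in the report.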

The key thing that will drive a shift to exome is probably the degree to which new genomewide meta-analytes become important, such as homologous recombination deficiency, mutational signatures, or high-accuracy discrimination of very low-level tumor mutational burden. For example, there is suspicion that in prostate cancer a low TMB cutoff may be useful for immunotherapy treatment guidance. Maybe that'll force us into exome, or maybe it will be neoantigen determination.
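As an aside on why footprint size matters for the low-level TMB discrimination Dr. Segal mentions (this illustration is ours, not the panel's, and it ignores real-world complications such as variant filtering and germline subtraction): TMB is estimated as somatic mutations counted per megabase sequenced, so the Poisson counting error of that estimate shrinks as the sequenced footprint grows.

```python
# Rough illustration of why a larger sequencing footprint tightens a
# low TMB estimate. Pure counting statistics only.
import math

def tmb_uncertainty(true_tmb_per_mb, footprint_mb):
    """Expected mutation count and Poisson standard error of the TMB estimate."""
    expected_count = true_tmb_per_mb * footprint_mb
    se_tmb = math.sqrt(expected_count) / footprint_mb  # SE of (count / Mb)
    return expected_count, se_tmb

for label, mb in [("1 Mb targeted panel", 1.0),
                  ("2 Mb large panel", 2.0),
                  ("30 Mb exome", 30.0)]:
    count, se = tmb_uncertainty(true_tmb_per_mb=2.0, footprint_mb=mb)
    print(f"{label:20s} expected mutations ~{count:5.1f}, "
          f"TMB = 2.0 +/- {se:.2f} mut/Mb (1 SD)")
```

At a true TMB of two mutations per Mb, a 1-Mb panel expects only about two mutations, so the estimate carries roughly a 1.4 mut/Mb sampling error, while a 30-Mb exome expects about 60 mutations and tightens that to roughly 0.26 mut/Mb, which is why very low cutoffs are hard to call confidently from small panels.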

On the interpretation side, we know what to say about only a few hundred genes. I don’t know what to say about mutations in the remaining 20,000 genes and don’t know how valuable those mutations could be clinically. So I think it will be these genome-scale meta-analytes that push us toward greatly expanded sequencing.

Does that reflect the views at Thermo Fisher as well?
Dr. Qureshi (Thermo Fisher): With a slight modification. We've come a long way since EGFR; there are a lot more biomarkers out there. And our mindset is that one size doesn't fit all. As Jeremy pointed out, TMB could be a biomarker that requires a different level of throughput. You might need an exome-level analysis, or you might just want to look at hot spots for therapeutic indications, depending on where you're doing the sequencing and why it needs to be done. And people have different interests in the research setting. If there were a silver bullet, that would be fantastic. I don't think that exists today. I agree with Jeremy that it isn't a new technology. I think it will be NGS. It's the most expansive technology out there, able to give you a broad view and a narrow view at the same time if you need it.

Are we launching enough panels to satisfy needs? Could you satisfy that same need with a single very large panel? I don’t necessarily know if that’s true, so our view of life, if you will, is that it’s not one size fits all. You’re going to need a few different panels at minimum to address the different needs.

Fiona, what are your thoughts about this?
Fiona Nohilly (Illumina): I look at it a little more broadly, outside of just oncology. People can have large panels and filter through the genes that are of interest to them, if those genes are included in that list. We also have ways people can customize and develop their own panels. Or, if they want to have some of that information for later, they can do an exome or a genome, and we have many different instruments at Illumina that can address any of those needs, sometimes even on a single instrument.

In addition to oncology-related panels, you can do noninvasive prenatal testing in the reproductive health space, for example. Our sequencers are being used for genomic surveillance related to COVID-19. We see it as one instrument that can serve many applications, including many different types of panels as well as exomes and genomes, all of which could potentially be run in a given hospital.

Dr. Qureshi (Thermo Fisher): The question is, if you have an NGS application or panel that doesn't require you to reflex from a different test up front, why wouldn't you use that technology? Take EGFR as an example; it's the one that's often tested first. If you now get an answer from NGS in the same amount of time that it takes you to do EGFR testing, then that, to us, in Thermo Fisher's world, is the next step forward. We just launched the Genexus instrument, and on that instrument, if you can get the same answer for EGFR mutation as well as for many other mutations that will have a therapeutic indication, and get a patient on therapy much quicker than was previously possible, why wouldn't it be used? So we are building these, and we have one on the market today. That's why we feel that NGS is the future if it can be made simpler, faster, and more cost-efficient.

Several years ago one of the big topics related to large panels was the reporting of variants of unknown significance. At that time, the panels were not that large, but still it was a hot issue. Do we need to report these variants or not? What is the feeling about how hot that topic is and the solution to that question?

Dr. Del Moral (Illumina): It is definitely a hot topic. There are a lot of questions. The answer now would be to leverage the community. In the reporting setting we talked about, we see the benefit of a reporting solution that brings in inputs or findings from the community to help with the interpretation of these variants of unknown significance. NGS has opened the door to detecting all of these variants, but the interpretation is becoming one of the hurdles. Community-based sharing of information helps in dealing with it.

Jeremy, how hot a topic are the variants of unknown significance?
Dr. Segal (University of Chicago): Occasionally it can be very difficult for us to decide between VUS and suspicious for pathogenic impact for a particular variant. But the bigger problem for us is that we have so many VUS, and I just wish we were better able to say something meaningful about them. Probably the majority of the variants of unknown significance we're putting into our reports are benign polymorphisms that the patient has that are not present in databases. But we come across interesting ones from time to time that certain functional studies might suggest are doing something.

Too frequently there isn’t any way for us to follow them up or say much more about them. And it’s a little disheartening sometimes to have to put it into the unknown significance bucket. In an ideal world you would put it into a pathogenic or a benign bucket and wouldn’t have the variant of uncertain significance. We’ll get there over time, but it’s going to take us a long time to figure out, and we’re going to need clever screens and studies to help us evaluate many variants in a high-throughput fashion. We need better data to work with.

One of the questions is how we will remember that a report we did two years ago contained all these variants of unknown significance when, in the meantime, four drugs and multiple Nature Medicine articles have appeared about one of those variants. How can we be sure we don't forget that a patient had it?
Dr. Segal (University of Chicago): It’s rare for a variant of unknown significance to become super important. It’s hard to remember the last time that’s happened. If it does happen, it’s easy to look back and see who else had the variant. But it would be nice if it happened more often.
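The lookback Dr. Segal describes amounts to a simple query over stored sign-out results, provided reported variants are kept in a structured store. A minimal sketch, with an entirely hypothetical SQLite schema and placeholder values:

```python
# Minimal sketch of a VUS lookback: find every past case that carried a
# variant that has since become important. Schema, database file, and
# example values are hypothetical.
import sqlite3

conn = sqlite3.connect("reported_variants.db")  # hypothetical store
conn.execute("""
    CREATE TABLE IF NOT EXISTS reported_variants (
        case_id        TEXT,
        signout_date   TEXT,
        gene           TEXT,
        hgvs_p         TEXT,
        classification TEXT  -- e.g. 'VUS', 'pathogenic', 'benign'
    )
""")

def cases_with_variant(gene, hgvs_p):
    """Return (case_id, signout_date, classification) for every past case."""
    cur = conn.execute(
        "SELECT case_id, signout_date, classification FROM reported_variants "
        "WHERE gene = ? AND hgvs_p = ?",
        (gene, hgvs_p),
    )
    return cur.fetchall()

# Placeholder gene and variant, illustrative only
for case_id, date, cls in cases_with_variant("SOMEGENE", "p.Arg123His"):
    print(f"Case {case_id}, signed out {date}, originally reported as {cls}")
```

Anything more ambitious, such as automatic re-notification when a classification changes, would build on exactly this kind of structured record.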

So that anxiety then is overblown perhaps, or at least it was in the early discussions of this. Would you agree with that, Sohaib?
Dr. Qureshi (Thermo Fisher): I think there are two aspects. First, there must be more of a community-based approach. I attended AGBT Precision Health a few years ago, and a half day was dedicated to a discussion on how to better share data on variants of unknown significance within the medical community. Second, continuing to improve software solutions that can organize variants of unknown significance and make the data easily accessible is just as important as sharing the actual data in uncovering variants that could eventually be super important.

Fiona, are there one or two items in the field of NGS that are less appreciated today than they should be and that you think will grow in significance in the next two or three years? Will we start seeing infectious disease be at least as important as oncology, for example?
Fiona Nohilly (Illumina): It’s already happening. Science, not just medicine, has had quite a moment over the COVID-19 pandemic to come to the forefront of people’s conversations, outside of this group and our own industry. People are talking about things like PCR as if it’s a COVID-specific term, but in reality it’s a technology that’s ubiquitous in the life science tool space. So I think the pandemic will change how NGS is seen in the world, especially within medicine. It’s allowed people to understand the power of this technology and how it can be used. It goes back to what Jeremy said about patients advocating for themselves. That’s where we will find that inflection point: when patients and doctors understand what they need and companies like ours provide the right technology. We are at an exciting time with people understanding a little more about this technology even from a layperson’s standpoint, and that will help us in the next few years to catalyze the uptake of NGS in the clinic.

Pierre, what about your crystal ball?

Dr. Del Moral (Illumina): Looking at oncology and cancer research, I would say that multiomic types of approaches will grow. Spatial transcriptomics, going all the way down to the single-cell level, is making its way into clinical research and clinical trials, and it's likely moving slowly into the pathology lab. So I would say it's the combination of several technologies providing better answers. Not just genomics.

Then what do we do with the data? It's the data aggregation part. Data is truly where the value is. We see that from Google and Facebook and others: data is gold. In terms of mining the data well and providing the best outcomes for patients, nailing this down will prove to be important in the next decade.

Jeremy, what are your predictions?
Dr. Segal (University of Chicago): We will see a continuation of what we’ve already been seeing. There are areas in molecular pathology where there is rapid development and innovation. Then things become a bit stable and end up becoming mainstream or packaged into kitted solutions. For these basic tumor panels, we’re starting to see that. People are starting to build end-to-end solutions for tumor panels, either in companies or in large academic consortia. I think we’ll see more of that, especially as oncology NGS moves into community practices in smaller centers.

But there's no end to the amount of innovation that continues to occur in the NGS oncology space. Circulating tumor DNA analysis is starting to move wholesale into the academic setting, as is heme malignancy measurable residual disease testing using molecular barcode technology to facilitate better monitoring of patients after treatment or bone marrow transplantation. We'll see expanded use of RNA-based analysis to look for expanded sets of fusions and also gene expression metrics, and methylation analysis too, whether that's for profiling individual samples or for doing blood-based analysis for early cancer detection. You'll start to see a lot of these different types of applications begin to be refined down to practice at the clinical level.

Sohaib, would you like to add anything?
Dr. Qureshi (Thermo Fisher): I agree with all of the assessments about the technology. My own comment is more short term, and this is not a company perspective but my own. I'm hopeful that we'll be ready with the right tools to address the cancer cases we will see post-pandemic, and I think NGS will play a large part in that. We as an industry need to be ready for it. We need to gear up to battle all of the cancer cases that are coming our way because they couldn't be treated during COVID. When we get over this, we're going to have to attack cancer with speed.