May 2019—Workflow, data interpretation, communication, and community—that and more came up when CAP TODAY publisher Bob McGonnagle spoke with five NGS experts in April: Boaz Kurtis, MD; Zhiyv (Neal) Niu, PhD; David Eberhard, MD, PhD; Luca Quagliata, PhD; and Arnaud Papin, MSc, MBA. What they said follows. Access CAP TODAY’s NGS product guide here.
Dr. Kurtis, what is the biggest challenge you face now in using next-generation sequencing? First, what is the greatest challenge you face in your immediate clinical responsibilities, and what is the biggest communication challenge?

Boaz Kurtis, MD, laboratory and medical director, Cancer Genetics, Los Angeles: For me, the greatest clinical challenge with NGS relates to its application in smaller panels. If we have clients who are interested in the mutation results for a small number of genes, let’s say three genes, and we are trying to move the lab testing paradigm to leverage the economies of scale so that all our testing is done on NGS, the challenge there becomes twofold.
First, are those economies of scale being used effectively by doing NGS for a sample on which you will ultimately only report and thus only bill for two or three genes? Second, are there other issues associated with that beneath the surface? In other words, is there an ethical dilemma in running an NGS panel—let’s say it’s a 50-gene panel just for argument’s sake—if you’re reporting out only two or three genes? That means that as a lab you have available at least raw data in some form on 47 other genes on a particular patient, and you’re not taking that test to completion for those 47 genes, or you’ve taken it to completion and as an analyst you’re simply blinding yourself to the results of those remaining 47 genes. Is there an ethical dilemma of any sort associated with that?
If yes, is that dilemma resolved by your notifying the clinician that you potentially have access to the results of those genes? Do they need to be informed? Does the client need to be informed?
I would say the communication challenge would be to the end user. Industry folks who are involved in NGS testing are well aware of its power, its limitations, and so on. But many of those on the end user side of the equation—clinicians and certainly patients—don’t necessarily know what its limitations may be. That’s true for all testing methods. But NGS can be limited in that maybe not all exons or all genes are assayed. So just communicating that to the clinicians can be a challenge. We do it via the requisition form; that may or may not get through to every reader of the form.
Other than that, life is easy, right? Dr. Niu, how do you answer the question I posed to Dr. Kurtis, or would you like to make additional comments having heard his answer?
Zhiyv (Neal) Niu, PhD, director, clinical genome sequencing laboratory, Department of Laboratory Medicine and Pathology, and consultant and assistant professor, Mayo Clinic, Rochester, Minn.: We are all dealing with complicated clinical questions and technologies, both of which are evolving quickly. As the clinical NGS laboratory for Mayo Clinic Laboratories, our lab provides next-generation sequencing for multiple clinical specialties, such as solid tumors, hematologic malignancies, hereditary diseases, and microbiology. One big operational challenge is how to manage all the testing workflows and meet clinical requirements such as turnaround time.
Communication is indeed a big challenge. Many lab tests are on the market, there is no universally accepted standard for many of them, and similar panels may have different gene selection and content—all of which can cause confusion for clinicians and ordering health care providers. Very often—and this is part of our job—I get calls from my clinician colleagues who need help in deciding which panel to choose. We often review the test method and performance information provided by labs to choose the one best suited to the clinical question.
Sometimes laboratories update their panels, which is good practice. But that can cause confusion when a clinician sees a returning patient whose genetic test was reported previously.
Dr. Eberhard, what is your reaction to what you’ve just heard? Is this something you hear frequently?
David Eberhard, MD, PhD, senior medical director, oncology, Illumina, San Diego: Absolutely. And we are now in the phase of technology development where we can seriously ask: Does clinical testing require a collection of small panels or should we move to larger, more comprehensive panels, or even whole genome sequencing approaches? We are reaching the point where we can do this economically.

Now that we are able to perform genomic NGS at almost any scale desired, the successful implementation of genomic testing in the clinic faces three key areas of development—actionability, access, and awareness.
Actionability relates to how we can interpret the testing data, report it, and contextualize the results in a way that directly informs clinical decision-making and provides clinical utility.
Access means patients can have NGS testing done when it’s needed. Technically, we can perform the sequencing in what we consider to be an affordable economy of scale. But for it to be widely accessible to patients, it has to be reimbursed by payers. This means the testing can’t just be affordable—it needs to have a positive impact on health economics. So how do we engage payers and demonstrate the clinical utility of NGS results so assays are covered?
Awareness comes from education. We need to improve education around the potential of genomic testing to improve medical practice. The field has moved so far and so fast that it’s daunting even for experts to keep up with the latest developments. We need to provide visibility and information about molecular testing to raise awareness. This applies even to small panels that have already been accepted into professional practice guidelines. The standard of care in molecular testing is not followed nearly as often as one might think.
Dr. Quagliata, what are your thoughts about what you’ve heard so far?
Luca Quagliata, PhD, global head of medical affairs for the clinical NGS and oncology division, Thermo Fisher Scientific, Waltham, Mass.: While we promote and support comprehensive molecular profiling, we never push our customers in the direction of unnecessary testing. And this goes back to giving our customers a lot of flexibility by offering them either small gene panels that contain only clinically relevant information—for example, genes associated with an FDA-approved treatment option—or, of course, the opportunity to choose larger gene panels.
We are conscious of not forcing extensive and unnecessary testing. For the future, to make NGS cost-effective and fully exploit its intrinsic multiplex power, we envision that clinicians should report everything that is clinically relevant and not just what has an FDA-approved treatment option. In fact, only about 50 percent of patients tested with, let’s say, a midsize panel—which could be in the range of 50 genes—will eventually have a clinically relevant variant linked to a specific therapy.
The other 50 percent of patients won’t have anything in that sense, or they won’t have a classical driver mutation. If no driver is found, laboratories should have the option of reporting on the other genes as well. This is something the FDA needs to clarify, and I think it is moving in that direction.
Arnaud Papin, MSc, MBA, senior global product manager, NGS, Qiagen, Waltham, Mass.: I will only add two points from Qiagen’s view: It is important to support advanced, experienced laboratories in transitioning their NGS assays from complex workflows designed more for research purposes than for clinical use into streamlined, integrated workflows that are more efficient for routine testing. Similarly, smaller community hospitals can adopt NGS quickly by building on automated solutions that fully integrate all steps of the workflow, from sample processing to interpretation and reporting—all from one provider and state of the art.
The second point is the role of interpretation. As others have commented, the transition from small panels to larger panels is a massive data analysis and interpretation challenge. In smaller, highly targeted panels, pathologists know the clinical impact of variants such as EGFR T790M or KRAS G12V. But new variants are being considered, and the interpretation of existing ones is evolving, almost weekly. In particular, the large panels used for markers like TMB present a challenge, as variants will show up that a standard molecular pathologist would see only once in a lifetime. Asking each laboratory to perform manual curation of its own database—and especially to maintain it and keep it updated—is a massive resource requirement and simply not feasible.
Interpretation of NGS data is widely regarded as a difficult computational pathology problem. The three of you from industry have reporting capabilities of sorts, some of them very clear, very elegant. Dr. Niu, can you comment on the challenge of the data and the data analysis as you see it?
Dr. Niu (Mayo Clinic): This is a very interesting field now because the advent of next-generation sequencing has produced data at a faster pace than we can handle. We follow multiple clinical interpretation guidelines for each disease group, and we also take advantage of the large databases that have become available through many larger sequencing cohorts. Clinical interpretation guidelines have been improving and will continue to improve. It’s becoming a bit dynamic. Often we use new knowledge to update our understanding of a disease and its gene associations. That’s why we normally state that the knowledge base was evaluated at the time of reporting. Many things can now be done computationally with automatic variant interpretation or classifier tools. However, after a lot of evaluations, I haven’t found a perfect one yet. There are many challenges and opportunities going forward.
At Mayo, do you have what I might call a suite or a toolbox of interpretive tools that you apply based on the patient’s condition and the panel you selected to run?
Dr. Niu (Mayo Clinic): Yes, all clinical reporting teams have specific clinical annotation to facilitate interpretation and defined rules on clinical reporting. Certain genes can be associated with entirely different diseases. And these are always predefined and documented during test development. All informatics tools are thoroughly tested for clinical use.
Dr. Kurtis, one of the anxieties I hear from laboratory directors who are involved in this field, or who are looking at bringing it in-house, relates to NGS results for genes that do not appear to be actionable and that were never ordered but that the lab may have an ethical obligation to report and then to store. And there’s the added anxiety of what’s going to be in Nature Medicine next week. It may unlock an important link to a gene for which we have results on a particular patient. Is that anxiety overblown in your estimation, or is it at the heart of what everyone has to be able to do if they’re going to offer an NGS service?
Dr. Kurtis (Cancer Genetics): I wouldn’t go so far as to say it’s an overblown anxiety because it’s justified to be concerned about what the implications are for the patient if you’re archiving data that don’t make much difference as far as their being actionable today but that could be actionable tomorrow when that article comes out in Nature Medicine.
That said, there is a sensible middle ground approach to this dilemma that basically says something to the effect of we are doing the best we can do today and we will continue to evaluate what best practices are when there is a need.
If best practices or guidelines evolve in the future to suggest that maybe we do need to be engaging the clinicians, who are of course acting on behalf of patients, in the decision-making process as far as whether to archive data that’s not reportable or to report everything, we’ll reach that point. We’re not there yet. And if you step back and consider that NGS is high-parameter testing on a level we haven’t dealt with before—in terms of how many data points you get from a single assay—then it’s fair to say that it’s still a new age in diagnostics and therefore it’s okay if we learn as we go in an iterative process. We’ll evolve as needed. Maybe the science itself will force us to evolve. When there’s a critical mass of Nature Medicine articles that identify a new generation of biomarkers that don’t even exist today, then it could be time to have that discussion. But I wouldn’t go so far as to lose sleep over it today.
Dr. Quagliata, can you react to what you have heard here in terms of these challenges around data? Have you heard that same concern from the field?

Dr. Quagliata (Thermo Fisher): We’ve absolutely heard some of these concerns from our customers. But at the same time those same customers are reassuring us that we are now at a historical turning point for the next-generation sequencing approach and it is time to fully enter the clinical space. That’s why we should be realistic: if a paper comes out next week in Nature or Science saying a specific mutation or gene variant is important in certain contexts, it doesn’t necessarily mean this translates immediately into something clinically relevant and real for a patient today. This is the reason we strongly believe that, at this historical moment, we should first make sure our customers, and everybody else using NGS, are able to answer the clinically relevant questions—making sure that 100 percent of the clinically relevant variants are properly detected and properly reported—rather than focusing on enabling them to look for something that might be relevant in the future.
Second, whenever there are no variants related to an FDA-approved treatment option in a specific sample, we should allow the clinicians and data analysts to look for genes that are in the clinical trial phase. That’s why we have solutions that enable our customers to identify genes that may not at the moment be in guidelines for an approved drug but that are in a different phase of clinical development.
Dr. Eberhard (Illumina): The question of how to use NGS data that are acquired as part of clinical care and testing of patients is complex and has to be broken down into more specific considerations that can be approached in their own light. Of course, we should strive to report actionable variants to patients and clinicians and to provide tools and information for decision support so that test results can contribute to impactful clinical decisions.
What to do with the rest of the data has to do with questions of patient privacy, medicolegal aspects of the providers’ retention of data, and commoditization of data as well as future business development, pharma clinical trials, or research by academia. Those are all different aspects of how data could be used and how data should be protected. As a company, we recognize the critical importance of keeping DNA data secure and, as such, are steadfast in our commitment to privacy.
As an entire genomics ecosystem, this is an area that we as a community need to talk about; we need to address various situations in the appropriate ways to ensure the safeguarding of genomic data.
Qiagen has strong data reporting capabilities with its GeneReader platform. Arnaud, would you like to speak to the points your colleagues made? I’m interested in particular in Dr. Eberhard’s comment about our needing to have a way of discussing this and working together. First, I’m sure you’ll agree with that. But then I would be interested to hear from you and the others about where the various forums are in which this discussion can take place.
Papin (Qiagen): When we at Qiagen look at NGS, we focus on answering the questions and simplifying complexities across the workflow, from sample processing to interpretation and reporting. Interpretation is where we see that users are faced with massive challenges currently, but even more in the future as we move to larger panels.
Previously at Qiagen, I was responsible for launching PCR and pyrosequencing tests. From a lab perspective, data interpretation was a simpler task then. Now, in the era of large panels used for markers such as TMB, and of whole exome and genome testing—with clinical research continually updating variant information and testing volumes increasing massively—Qiagen’s approach is to integrate knowledge bases and create informatics solutions for seamless analysis, interpretation, and reporting. We are seeing strong demand for our knowledge base solutions such as Qiagen Clinical Insight, or QCI, where institutions can benefit from a pool of hundreds of curators who ensure up-to-date and expertly curated knowledge, as well as from other private and public knowledge bases and databases, including intra-institution knowledge bases. With that, we can ensure for our customers access to the growing amount of genomic data that can further enrich continually updated literature findings, and we can also connect customers to create data-sharing forums.
Dr. Quagliata, all of the NGS platforms are probably in most of the large centers that are doing a lot of the NGS work. In the same way there are multiple immunoassay analyzers at Mayo Clinic, there’s also probably a sequencer from each of the three major manufacturers. I’m assuming that everyone in this conversation, on the clinical and industry sides, understands there is still a need for integration and interoperability of the data from the various sites, machines, and panels. I assume I’m correct. Can you tell me where I’m wrong or give me a little more color and describe how I’m right about that?
Dr. Quagliata (Thermo Fisher): That is a good point, and it’s a fact. To be honest, each platform has its pros and cons, and each may be more suited to a defined application. One of Thermo Fisher’s strengths is being able to deliver results from very limited material. But there are other applications that might require a different approach. We are not hiding behind a wall; that’s a matter of fact. And we fully support our customers being in the position to decide which solution is best for them. Peculiarities and differences allow the entire market to move forward and grow. I hope my colleagues from the other companies will agree on that.
Working in the clinical space, we do offer a structured end-to-end solution that allows our customers to work in a friendly but controlled environment. For instance, many of our customers use Ion Reporter and the Oncomine knowledge base to classify the clinical variants they find during sequencing. We give them full flexibility and freedom to push data somewhere else if that is out of the scope of our product. We believe it’s a plus for the market. Again, I hope my colleagues would agree.
This is why we believe data should belong to the institution that generates it. There are companies, not part of this CAP TODAY discussion—especially one that receives samples for outsourced testing—that retain the data. So it’s kind of a black box, because clients do not have access to the raw data.
We are at the opposite of that. Everything generated in the laboratory using our solution belongs to that laboratory. This is why we want to enable our customers to do sequencing in-house. We believe this is a strong benefit and something that should be out there for as many people as possible in the market.
Dr. Eberhard, let me give you a chance to talk about your view and Illumina’s view on these same matters.
Dr. Eberhard (Illumina): Accuracy and reproducibility of results are of primary importance for clinical testing. While high sensitivity is very desirable, pushing the limits of sensitivity will at some point necessarily begin to compromise accuracy and reproducibility. Those limits need to be demonstrated by rigorous validation. When tests are implemented in the clinical laboratory setting, one route to ensuring accuracy and reproducibility is to develop IVD products that are validated, submitted for regulatory approval, and marketed for specific uses. Of course, with an IVD product, the customer’s ability to tweak the operation of the machinery or the analysis of the output is limited. But again, in this case we’re striving toward accuracy and reproducibility.
Dr. Kurtis, has there been adequate progress in areas such as standards, reference materials, and even proficiency testing serving the field? Is it keeping pace?
Dr. Kurtis (Cancer Genetics): With the guidelines for analytical validation of NGS assays in oncology as they are currently set, I would say yes. Those guidelines say that you as a laboratory need to validate your assay around each of the four major classes of genomic alterations. There are various performance metrics you need to establish around each of those classes. But that is your benchmark, as opposed to individual genes or variants. If that’s the threshold, then I would say, yes, our current paradigm for reference testing and standardization is sufficient. The question, though, is whether that paradigm, in which you are measuring genomic alteration classes and not individual genes, will hold up as even higher-parameter assays, like whole exome and whole genome, are introduced into the clinic.
Those are available in limited form in some institutions, with a host of other issues surrounding them, not least the issue of reimbursement. But let’s say it does become much more widely accepted to run, instead of a 50-gene panel or a five-gene panel, a 500-gene panel or a whole exome. Can you plausibly say that validating the assay around just the ability to detect alteration classes is enough? I’m not saying the answer is yes or no, but I am suggesting that at that point it would be fair to ask whether standards and reference materials suffice.
Do our three industry partners have final thoughts to share?

Papin (Qiagen): We are excited about our progress so far, but there is still a lot to do to use the full power of NGS in clinical routine. Most important, our focus is to ensure that our customers can concentrate on the value of the information they are seeking and not lose sleep over the workflows. They should be able to rely on partners like Qiagen to provide the technology solutions that get them to such insights—and allow the power of NGS to benefit labs and patients. This is what Sample to Insight, the Qiagen slogan, stands for. For instance, a focus now is the design of clinically relevant panels encompassing DNA, RNA, and signatures such as MSI and TMB. In general, we are committed to offering advanced yet easy-to-use, highly integrated and automated solutions that benefit from powerful interpretation and reporting capabilities. Our focus is on addressing the growing need to turn the increasing quantity and complexity of NGS evidence into clinically actionable insights. This will support the transition from research applications to IVD solutions that will improve patient outcomes.
Dr. Eberhard (Illumina): Our genomics technology has advanced to where we are now beginning to make a profound clinical impact, and the technologies will continue to develop to make even greater changes. Now we are starting to face the ancillary challenges around how we report and store genomic data. These advances will revolutionize the management paradigms for the millions of patients who are facing grave diseases and need to find the right diagnoses and treatments faster.
We are a community, and working together will help address some of these questions. How do we deliver clinical testing in the most appropriate ways? We work with the regulatory agencies to ensure that the genomics products, instrument platforms, reports, and data we deliver are dependable, accurate, and reproducible, and that they provide benefit to patients.
As a community, we are working together to develop and improve the field, such as with standards. For example, Illumina participates in the Medical Device Innovation Consortium, which includes representatives from several industry partners, academia, the FDA, the NCI, and others, to tackle questions of standardization and reproducibility. Friends of Cancer Research assembled a consortium from all areas to address the issues around tumor mutational burden: How do we harmonize and standardize that approach? These types of collaborative endeavors will continue to grow and be important in reaching consensus practices in how we approach genomics.
Dr. Quagliata (Thermo Fisher): While each company clearly wants to highlight the benefits of the platform it has developed and present it to the market, I like the idea that in order to make sure patients eventually get a clear benefit out of this, we also need to be able to work together. Tumor mutation burden harmonization efforts in the U.S. and Europe are an excellent example of many different companies collaborating to reach a common goal. Despite the fact that TMB is a suboptimal marker, we are part of that.
Each of us will shine under different points of view, but we will make sure the patients and clinicians get the most out of it. We believe the most appropriate treatment always starts with the most appropriate test.
Dr. Kurtis, I began with you, so I will give you the last word.
Dr. Kurtis (Cancer Genetics): While there is a lot of promise in the comments we just heard, and I share that sense of promise in where the field is going, it really is the tip of the iceberg. The validity of tests like tumor mutation burden is still in debate. So, yes, the sense of community is the best approach to take in an emerging field, in an emerging science like this one. At the same time, we have to understand that on a moment’s notice, we as a community may need to pivot to a new biomarker or a new set of biomarkers or integration with a multiomics approach. That’s a good thing because it means there’s a lot more room to grow in this field. It will keep us all busy for the entire duration of our professional lifetimes and beyond.