
Volume? Space? Automation decisions in coagulation


Dr. Higgins, we talked about point of care versus a core lab or a dedicated workcell for coagulation automation. In the UT system, you serve many physicians and others in different locations. How important is a point-of-care strategy for you in your role?
Dr. Higgins (UT Health): In coagulation, it’s important for the warfarin clinics. They like to provide a point-of-care test so they can adjust the medications onsite. Even though we have direct oral anticoagulants available that do not require monitoring, about half the patients are still on warfarin, so providing that INR at point of care is a good service. Once you start moving point of care into the hospital, it becomes more difficult. The workhorses like INR and PTT are costly to do at the point of care—these are high-volume tests. Also, there are limitations, at least in the literature, that would make us pause. For example, there’s one instrument that measures PT/INR that has an electrochemical endpoint and it’s completely insensitive to fibrinogen, so if you had somebody who didn’t have any fibrinogen on the hospital floor, that PT/INR would tell you that patient is normal. A lot of the PT/INR point-of-care tests are indicated for warfarin monitoring only, so it’s hard to know how those tests perform in an ICU, where there are many other considerations.
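For reference, the INR is derived from the prothrombin time using the International Sensitivity Index (ISI) assigned to each thromboplastin reagent and instrument combination, which is one reason point-of-care and central-lab PT/INR systems can behave differently. The following is a minimal sketch of the standard calculation, in Python, with illustrative numbers only; it is not any vendor's implementation.

# Minimal sketch of the standard INR calculation; values are illustrative.
# The ISI is specific to each reagent/instrument pairing.
def inr(patient_pt_sec: float, mean_normal_pt_sec: float, isi: float) -> float:
    """INR = (patient PT / geometric mean of normal PT) ** ISI."""
    return (patient_pt_sec / mean_normal_pt_sec) ** isi

# Example: a PT of 28 s against a mean normal PT of 12 s with ISI 1.0 gives an INR of about 2.3.
print(round(inr(28.0, 12.0, 1.0), 1))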


Dr. Salazar, we still have a lot of warfarin use in this country despite the long-time availability of other oral anticoagulation agents. How do you look at these therapeutic decisions from your perch right now? Do you think warfarin is overused or does it still have a major place in treatment?
Dr. Salazar (UT Health): We’ve seen warfarin increasingly replaced by direct oral anticoagulants, and that trend is continuing. There are certain clinical scenarios where studies have shown that DOACs like direct Xa inhibitors cannot replace warfarin. A good example is a lupus anticoagulant causing a thrombosis. We still have to rely on warfarin in that scenario because studies have shown that the direct Xa inhibitors are not effective enough. From the patient perspective, moving to a direct Xa inhibitor is tremendously convenient.

At the same time, we have to be aware as a field that even though routine monitoring isn’t required for direct Xa inhibitors, or DOACs in general, there are clinical scenarios where it might be clinically useful. We’ve looked at that closely and noticed that what happens to an outpatient might not necessarily apply to the inpatient setting. If you have, for example, a patient with renal failure who is in the ICU or needs to transition to another anticoagulant, having some understanding of a level could be informative. From the laboratory perspective, we have to make an assessment—does that mean we need to offer this kind of testing?

Ken, how much reflex testing is built into the operation of your customers? Coagulation is complex, difficult to understand. Are you seeing a greater demand for reflex testing?
Ken Huffenus (Werfen): Definitely. When we originally designed the ACL Top system, it had a lot of reflex and rerun rules built in. But during the early years of use, it was relatively infrequent to do more than an automated rerun. That’s changed completely, primarily because we now have data management solutions—we have HemoHub—complementing the analyzer software. It has the capability to build an algorithm not only to do a reflex to one test but also to incorporate an entire workflow, and follow that in an automated or manual fashion. Bringing more control back to the laboratorian to define those workflows in-depth has driven the increase in the use of reflex testing.
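As a rough illustration of the kind of reflex and rerun rule Huffenus describes, here is a hypothetical sketch in Python. It is not HemoHub's actual rule syntax; the aPTT cutoff and the follow-up tests are assumptions that a laboratory would define and validate locally.

# Hypothetical reflex/rerun rule; thresholds and test names are illustrative only.
from dataclasses import dataclass

@dataclass
class Result:
    test: str
    value: float
    rerun: bool = False

def follow_up(result: Result) -> list[str]:
    """Return follow-up actions for one result."""
    actions = []
    if result.test == "APTT" and result.value > 40.0:    # assumed local upper limit, in seconds
        if not result.rerun:
            actions.append("RERUN_APTT")                  # confirm the prolongation first
        else:
            actions.append("REFLEX_MIXING_STUDY")         # then reflex to a workup
            actions.append("REFLEX_HEPARIN_ANTI_XA")
    return actions

print(follow_up(Result("APTT", 62.0)))                # ['RERUN_APTT']
print(follow_up(Result("APTT", 62.0, rerun=True)))    # ['REFLEX_MIXING_STUDY', 'REFLEX_HEPARIN_ANTI_XA']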

Matt, can you speak to that? You’re often caught in the middle between an instrument vendor and a customer in a laboratory, and they’re probably looking to you for help or solutions. Are you seeing a lot of demand in these kinds of cases?
Matt Modleski (Orchard): We have a sophisticated rule set inside our LISs that helps labs choose when and how to reflex. It becomes an individual laboratory decision.

We typically don’t develop automation or an automation line until a laboratory says, we’re buying this from XYZ vendor, and they bring it in. We will then work closely with the vendor to develop that software, hardcoded into our LISs. We do a lot of work with vendors pre-launch and once the product is in the marketplace. That drives what we develop.

Nichole, tell us about reflex testing from your perspective. What’s built into the analyzer and the instrument solution and what is left to an LIS or middleware vendor?
Nichole Howard (Diagnostica Stago): We’ve had our Max generation analyzers in the field since 2013, and each one comes connected to a middleware management system called Coag Expert that has the ability to build in rules. We like to work with our customers so they can automate the process and the standard operating procedures for testing. We’re seeing it used frequently in lupus testing, where you can build in the International Society on Thrombosis and Haemostasis guidelines so everything flows end to end and you can automate the process with the full panel. We’re seeing it with factor VIII and IX testing. Once you have well-informed customers, leaders in the industry who understand the testing, they validate locally and feel confident. It’s been powerful, especially with the technologist shortages, to be able to have traveling technologists come in and have less of a learning curve.
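To make the lupus anticoagulant example concrete, here is an illustrative sketch of one piece of such a rule: the normalized dRVVT screen-to-confirm ratio used in ISTH-style interpretation. It is not Coag Expert's rule language, and the 1.2 cutoff is a placeholder; real cutoffs come from each laboratory's own validated reference ranges.

# Illustrative dRVVT interpretation step; all numbers are placeholders.
def drvvt_normalized_ratio(screen_sec: float, confirm_sec: float,
                           normal_screen_sec: float, normal_confirm_sec: float) -> float:
    """Normalized ratio: (patient screen / normal screen) divided by (patient confirm / normal confirm)."""
    return (screen_sec / normal_screen_sec) / (confirm_sec / normal_confirm_sec)

def interpret(ratio: float, cutoff: float = 1.2) -> str:
    # Placeholder cutoff; laboratories derive their own from local reference ranges.
    return "dRVVT consistent with lupus anticoagulant" if ratio > cutoff else "dRVVT not suggestive"

ratio = drvvt_normalized_ratio(screen_sec=55.0, confirm_sec=38.0,
                               normal_screen_sec=36.0, normal_confirm_sec=34.0)
print(round(ratio, 2), interpret(ratio))    # 1.37 dRVVT consistent with lupus anticoagulant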

Dr. Salazar, let’s go to the other end of the testing process. Are you and your clinicians and others happy with the way coagulation test results are reported in the EHR?
Dr. Salazar (UT Health): We have to be aware of whether we are conveying what we want to convey to our clinicians from the testing we’re doing. Are the comments or interpretations we’re making visible to them? Am I happy with it? It can always be improved.

One example is lupus anticoagulant testing, because I’ve seen a few different models for the way those interpretations come across. Lupus anticoagulant testing involves many different tests and sometimes a long interpretation. I get the feeling that sometimes hematologists or clinicians are looking for a yes or no answer, and sometimes we’re coming up with inconclusive. From a clinician’s perspective it’s tough to know how to act on that, so we could work on not just standardizing but making sure it comes across appropriately.

The second example is a little like the elephant in the room for our conversation, and that’s viscoelastic testing. A lot of clinicians are using viscoelastic testing in place of routine testing and making important patient decisions based on these curves and parameters on the curves. Have we as coagulation experts and laboratorians taken a close look at how clinicians are using that information and are we appropriately interpreting that for them? Those curves or dials that viscoelastic platforms are producing are sometimes difficult for the clinician to access. It might be a PDF that gets scanned in after the fact. It might be an image that shows up after the clinical information was needed. We need to look at this area more closely.

Dr. Higgins, tell us more about viscoelastic testing at the point of care. Is it ready for prime time? Is it fully matured, is it understood? What has been your experience?
Dr. Higgins (UT Health): We don’t have a choice, because a lot of this was direct marketed to surgeons and anesthesiologists, and the TEG [thromboelastography] showed up in our ORs and we had to deal with it. In my opinion the literature needs to catch up. What are the triggers for transfusion on this TEG for a particular surgery or scenario? The algorithms I’ve seen that are designed well end up needing a laboratory test in the central lab to confirm, okay, the maximum amplitude is low—is that due to low platelets or fibrinogen? To have an impact, we need to know more. Surgeons and anesthesiologists need an algorithm that they put together so they’re at least treating patients uniformly. That part is hard to control. There may be software coming down the line for some instruments, and if so that would help. We could, as a hospital, as a group, program an algorithm and they could use it to treat patients in a standard way. But it’s like the Wild West at the moment.

Nichole, your reaction to this?
Nichole Howard (Diagnostica Stago): Our teams, our colleagues at HemoSonics, a sister company, are working together to understand this because we are hearing the same—this was sold at the surgeon or cardiac level and now the lab is scrambling to try to deal with it. As vendors it’s our job to facilitate those conversations and get all the stakeholders to the table to figure out a solution that makes a difference in how we provide care and reduce the use of blood products.

Ken, can you comment on this? I’m going to throw in parenthetically that when we start to see testing sold to nonpathologists and nonlaboratorians, troubles often arise.
Ken Huffenus (Werfen): The good news for the laboratory is that Nichole’s company and mine are getting into the viscoelastic testing world now. We have our ROTEM sigma system for viscoelastic testing at the point of care, with the hemostasis expertise to understand what those results mean. We leveraged the expertise from our laboratory hemostasis team and brought it together with the ROTEM team and we’ve found that to be powerful. Now we can tell the story, educate those in the laboratory and in the clinical settings about what the results mean and how they can help improve patient care and blood usage.


Matt, can you comment on how these results look in the EHR? Are you seeing a greater sense of satisfaction with how the EHR is reporting lab tests overall?
Matt Modleski (Orchard): Of all the testing regimens, this one seems more complicated than the average set of testing. What Dr. Salazar said about results going back to the EHR is indicative of what most experts would say, which is, we’re trying to tell a sophisticated story with finite data. This is one of those times when individual hard data points, without some language around them, aren’t as helpful as they could be. Most of the time we like data and a hard number and a clear set of yes/nos. This doesn’t lend itself to that, so there’s room to improve how the clinician receives the total diagnosis.

Dr. Salazar, does coagulation testing and the exploration of this complex science have more to offer that we haven’t yet uncovered completely? In other words, is there more potential in coagulation testing than many people might think?
Dr. Salazar (UT Health): Historically, the way coagulation testing has been applied, and often continues to be applied, is as a decision about whether we should transfuse a patient—should we give a blood product—or should we change anticoagulation? On the other hand, we learned during the pandemic that one of the most important biomarkers for COVID-19 survival was the D-dimer. So we’ve learned that routine coagulation parameters can be more than just a decision to transfuse or to anticoagulate.

We can get into esoteric testing, where you have esoteric biomarkers or diagnostic parameters like an ADAMTS13 or a fibrinolysis marker that’s not offered at many places and may or may not be clinically relevant under different scenarios. There’s a lot of research now into long COVID, and at least two studies suggest that a coagulation parameter is perhaps an important indicator of long COVID, and that’s the ADAMTS13-to-von Willebrand ratio. So the answer to your question is yes, and more to come. We’re learning a lot more.

Ken, do you have a closing comment on that point?
Ken Huffenus (Werfen): Ken Friedman [MD, of Versiti Blood Center of Wisconsin and Medical College of Wisconsin] presented an excellent overview of parameters related to the ADAMTS13–von Willebrand ratio and what they might mean, during the ISTH Congress in London last July. This is available online at Werfen Academy [academy.werfen.com]. As far as what coagulation has to offer, I go back to the pandemic, where hemostasis really shined. Prior to that, when I’d say “hemostasis,” people would ask if it’s clinical chemistry. Now everyone knows, and they usually know about D-dimer testing too. There’s more interest in these different thrombotic complications. We see adoption of a certain testing regimen, and complementary care related to it, in some areas of the world, and then we see it spread to the rest of the world over time. From where I sit, coagulation still has a lot to offer.

Nichole, does coagulation and its study and development still have much to offer?
Nichole Howard (Diagnostica Stago): Yes. D-dimer is one of our health care heroes in COVID-19. When we saw the growth of D-dimer through COVID, we expected things to come down much faster than they have. We’re not seeing patients hospitalized at the same rate; however, D-dimer numbers are still not coming down as much. It will be interesting to see how that shifts the continuum of care in the next 12 to 18 months, especially when you think about the data coming from Jeff Kline [MD, of Wayne State University School of Medicine] that highlight underutilization of D-dimer and overutilization of imaging in the ER and how that changes the overall continuum of care [Kline JA, et al. Circ Cardiovasc Qual Outcomes. 2020;13(1):e005753]. Also, how do we manage using anti-Xa for its care benefits, reduced dosage changes, and reduced length of stay, while also managing DOACs? There’s a lot of development there.

Dr. Higgins, tell us more about the promise that coagulation studies still have.
Dr. Higgins (UT Health): We’re getting much better at what we do in the coagulation lab, and some of it is automation. Think about the ristocetin cofactor assay—this is our classic way of measuring the activity of von Willebrand factor [VWF]. This was done on a platelet aggregometer until about 15 years ago. Now there are automated von Willebrand activity assays that can be put on instruments. That’s huge for monitoring patients who are on therapy in the hospital. We don’t have enough MLS staff to perform the old aggregometry method every day, so putting it on the instrument is a huge win. Another win is the availability of rapid heparin-induced thrombocytopenia testing on automated instruments—getting those results rapidly not only saves money in argatroban costs, it’s better patient care.

The regulatory environment gets in the way a little. For example, the rest of the world had access to tests like the von Willebrand factor glycoprotein 1bM [VWF:GP1bM] assay much earlier than the United States. Guidelines recommend using these newer automated tests [see related story], but at the time they weren’t available and certainly not on every instrument and platform. Additionally, the regulatory environment is such that new tests often get approved only on a specific instrument. Very recently, the FDA approved a VWF:GP1bM test on a few Siemens and Sysmex instruments, but it is not FDA approved on the other manufacturers’ instruments. My favorite approach is when companies send tests to the FDA without tying them to an instrument because we can put them on our instruments and my neighbors can put them on theirs, regardless of the platform. This approach was helpful for the implementation of the bovine chromogenic factor VIII assay in our laboratory, which is another up-and-coming assay. So we’re making strides with automation, and there are great assays coming out that help us take care of patients.
