
Newsbytes


Data analytics: from considerations to implementations

April 2021—The rewards of data analytics can be sizable, but so can the challenges of extracting data, transforming it, and loading it into the systems that put it to use.

“Analytics is born out of an idea that there have to be better ways of visualizing and using data other than simply feeding it into a spreadsheet or templated report,” says David McClintock, MD, associate chief medical information officer and associate director of pathology informatics for Michigan Medicine, at the University of Michigan. “At a minimum, we need to be able to drill down into our data to find what we need and discover what we don’t know.” Yet advancing from a desire to access data in a specified manner to delivery of data analytics can be an arduous process, he says.


Dr. McClintock and Edmunds Reineks, MD, PhD, section head of point-of-care testing at the Cleveland Clinic, shared their experiences with data analytics in separate American Association for Clinical Chemistry meeting presentations in December and in a recent conversation with CAP TODAY.

At the Cleveland Clinic, says Dr. Reineks, test cartridge recalls are just one example of how data analytics has led to time-saving improvements and other gains. Ten years ago, the lab’s point-of-care team might have contacted each of the hundreds of locations within the Cleveland Clinic system that were using point-of-care devices to track down recalled cartridges. Today, however, “we can pull the data, see who is using cartridges with those lot numbers, and home in on those locations right away,” Dr. Reineks says.

While the value of such an approach is obvious, the preparation necessary to reach that point can be less so. A primary issue is that some electronic health record and laboratory information systems store data in hierarchical, or tree-like, formats instead of relational, or spreadsheet-like, formats that can be queried easily, Dr. McClintock explains. Therefore, some labs will need to find or build data analytics platforms that can interface with the former to capture and convert data so it can be readily retrieved.
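The conversion Dr. McClintock describes can be sketched in a few lines. In this illustration, a hierarchical patient record is flattened into one row per test result, the spreadsheet-like shape a relational query expects; the record structure and field names here are invented for the example and do not reflect any particular EHR or LIS schema.

```python
# Flatten a hierarchical (tree-like) record into relational rows.
# The nesting and field names below are hypothetical, for illustration
# only -- real EHR/LIS schemas vary widely.

def flatten_record(record):
    """Yield one flat, query-friendly row per test result."""
    for order in record["orders"]:
        for result in order["results"]:
            yield {
                "patient_id": record["patient_id"],
                "order_id": order["order_id"],
                "test": result["test"],
                "value": result["value"],
                "units": result["units"],
            }

record = {
    "patient_id": "P001",
    "orders": [
        {"order_id": "O10",
         "results": [
             {"test": "calcium", "value": 9.4, "units": "mg/dL"},
             {"test": "sodium", "value": 140, "units": "mmol/L"},
         ]},
    ],
}

rows = list(flatten_record(record))
```

Once records are in this flat form, each row stands alone, so filtering, grouping, and charting no longer require walking the tree.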

“The hardest part for most of us is the data extraction,” says Dr. McClintock, who also notes that many lab systems have older database schemas “with little standardization in how they are set up, meaning they don’t lend themselves well to modern data-extraction tools.” Furthermore, the database schemas used for health care information systems can be complicated—one commonly used EHR has more than 22,000 tables—adding to the uphill battle of data extraction, he says.

Dr. Reineks recalls encountering hurdles with data extraction and transformation when he began exploring data analytics at the Cleveland Clinic. At the time, he was medical director of the core automated chemistry lab, which analyzed approximately 7,000 samples a day and produced massive amounts of data.

“One big challenge that I had was getting that information into a useful format,” he says. “Early on, that’s what I struggled with—identifying a mechanism where if I wanted to look at calcium results during a specific time period, I could quickly get that information.”
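The ad hoc query he describes, once results are in a flat format, amounts to a filter on test name and date range. This sketch uses invented rows and field names purely to illustrate the idea:

```python
# Filter flat result rows by test name and date window -- a sketch of
# the calcium-over-a-time-period query described above. The rows and
# field names are hypothetical.
from datetime import date

rows = [
    {"test": "calcium", "value": 9.4, "resulted": date(2021, 3, 1)},
    {"test": "calcium", "value": 8.9, "resulted": date(2021, 3, 15)},
    {"test": "sodium",  "value": 140, "resulted": date(2021, 3, 10)},
    {"test": "calcium", "value": 9.8, "resulted": date(2021, 4, 2)},
]

def results_between(rows, test, start, end):
    """Return results for one test within an inclusive date window."""
    return [r for r in rows
            if r["test"] == test and start <= r["resulted"] <= end]

march_calcium = results_between(
    rows, "calcium", date(2021, 3, 1), date(2021, 3, 31))
```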

After some investigation, with the assistance of the Cleveland Clinic’s Center for Pathology Informatics, Dr. Reineks’ group helped select third-party software that solved its data-acquisition and formatting issues. About eight years ago, he recalls, the laboratory medicine department adopted Altosoft (now Kofax Altosoft), an analytics platform that can extract information from the lab system and provide pathways to customize dashboards or permit additional ad hoc analytics.

Laboratories should select third-party software based on the analytics use case for which it is intended, Dr. McClintock says. While many labs apply analytics to LIS data, a lab may, for example, seek an analytics platform that can extract and transform data from its automation line with the objective of improving the efficiency of that line and the laboratory in general. Labs need to be precise when determining what data they want to analyze, which systems house that data, and which analytics platforms are best suited to manage that data capture and transformation, he says.

In the Cleveland Clinic’s point-of-care testing division, Dr. Reineks’ team uses a Telcor middleware solution as part of its processes to improve patient care, including to monitor quality issues and trends in patient results. During the early days of the pandemic, for example, the team retrieved Telcor-generated data from point-of-care devices that performed international normalized ratio (INR) tests and observed a concerning trend: Patients seemed reluctant to come to the clinic for blood tests to monitor the effectiveness of their Coumadin therapy. “Getting this data and analyzing it quickly led to the decision to implement drive-thru INR testing at several locations,” he says. “This allowed some concerned patients to continue monitoring their medication with less risk.”
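Spotting a trend like that typically comes down to aggregating test volumes by period and checking the direction of the series. A minimal sketch, with invented dates standing in for real test timestamps:

```python
# Count point-of-care tests per ISO week and flag a steady decline in
# volume, the kind of pattern described above. The dates are invented
# for illustration; each entry represents one test performed.
from collections import Counter
from datetime import date

tests = [date(2020, 3, d) for d in (2, 3, 4, 9, 10, 16, 23)]

# Map ISO week number -> number of tests performed that week.
weekly = Counter(d.isocalendar()[1] for d in tests)
ordered = [weekly[w] for w in sorted(weekly)]

# True when no week's volume exceeds the previous week's.
declining = all(a >= b for a, b in zip(ordered, ordered[1:]))
```

A weekly series like this is also what feeds a dashboard chart; the flag is just the simplest programmatic check on the same aggregation.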

Laboratorians can visualize the data they have loaded into spreadsheets by incorporating it into charts, graphs, or other models, Dr. McClintock says. If they are using analytics platforms that lack sophisticated data visualization capability, he adds, they can employ commercial tools, such as Tableau, Shiny from RStudio, Microsoft Power BI, and Plotly, for this task.

Dr. Reineks uses Tableau and the open-source visualization program Orange, among others, to create graphs, charts, and other visual representations of data. The Orange software is highly icon-driven, he says, which allows him to select data parameters and create, change, or reconfigure graphs easily.

Before determining whether to build or buy a data analytics package, labs need to weigh the opportunities and drawbacks for each option, Dr. McClintock says. A lab considering building an analytics platform should take into account such factors as the skill set of its information technology team and whether that team has the resources to prioritize such a project. An in-house IT team may be well versed in maintaining laboratory systems but have insufficient knowledge and software development resources to tackle a complex analytics development project, he adds.

Regardless of whether labs buy or build an analytics platform, they need to be aware that the new technology initially will add significantly to their technology team’s workload, he says. “Just because you are going with a vendor-based solution doesn’t mean you are going to pay the vendor to do everything for you,” Dr. McClintock explains. Analytics platforms typically require local servers, workstations, large displays, and interfaces, as well as system configuration, data validation, report validation, and general maintenance. “Before you know it,” he says, “you have multiple people and IT groups involved in creating new reports, supporting the system, and maintaining your data feeds.”
