
Newsbytes


Building a lab or modernizing? Don’t forget the following

March 2019—Building a new pathology lab or revamping an existing one gives laboratory decision-makers an opportunity to rethink information technology infrastructure and address persistent problems, plan for new technology, and improve processes. Some enhancements, such as incorporating barcoding and tracking systems, may be obvious, while others may be overlooked. CAP TODAY writer Jan Bowers asked four pathology informatics leaders what IT-related infrastructure or accommodations should be included in forward-looking plans for a new or refurbished lab. Here is some of what they had to say.

Alexis Carter, MD, physician informaticist, pathology and laboratory medicine, Children’s Healthcare of Atlanta: In labs I’ve worked in, there have been problems with where servers are located, specifically middleware servers for chemistry and hematology. Those servers tend to be in the laboratories themselves because the instruments often have to be connected to the servers directly or via a box that will translate an analog signal to TCP/IP.


Often servers are placed in locations that aren’t optimal. For example, if you put one of these middleware servers right behind a big analyzer that’s generating a lot of heat, it’s not good for the server. In addition, the server can make it difficult for an instrument maintenance technician to get in and maintain your instruments, and vice versa; the instruments can make it difficult for an IT analyst to get in and maintain the server. On top of that, you’ve got cords and cables running in and out of the server, and anybody can knock into the server or trip over the cords. When one instrument goes down, that’s one thing, but when the entire connection network for 10 to 20 of your instruments runs through a single server, that is a very different downtime issue. These types of failures don’t happen very often, but when they do, they can be devastating.

For a new hospital we’re planning, our team has requested to have a server closet in the lab that will be accessible to all of our instruments via overhead cabling. A server closet can provide a good cooling system, which servers need, and fire protection. In addition, server closets are locked and their access more controlled, improving general security.

I’ve also found in various laboratories I’ve worked in that they don’t always have a test server. In my opinion, anytime you have a server hooked up to multiple things—especially if it’s middleware that’s handling between 50 and 75 percent of your laboratory data—it’s really important not just to have a production server for your live data but also a test server that allows you to test changes to the middleware prior to putting them into production. The test server can also serve as a backup for the hardware. In the event that your production middleware server crashes, a test server, if you’ve configured it properly, could very quickly be brought back up as a production server. Small to midsize labs sometimes don’t think about using a test server because of the cost associated with it. That can be a disincentive. But the advantages of having it, in my opinion, outweigh the disadvantages. A test server helps you troubleshoot, and it ensures that when you’re testing changes, you’re doing so in a safe way.
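One way such a failover can be kept practical is to hold the production and test servers’ interface settings in named profiles so the test box can be repointed at the production endpoints under change control. The Python sketch below illustrates the idea only; the hostnames, ports, and instrument names are hypothetical and do not represent any particular vendor’s middleware.

```python
# Hypothetical middleware connection profiles (not any specific vendor's
# product): production and test servers share the same instrument listener
# layout, so a properly configured test box can be repointed at the
# production endpoints during a failover.
from dataclasses import dataclass

@dataclass
class MiddlewareProfile:
    role: str               # "production" or "test"
    lis_host: str           # LIS interface endpoint (assumed hostname)
    lis_port: int
    instrument_ports: dict  # instrument name -> TCP/IP listener port

PROFILES = {
    "production": MiddlewareProfile("production", "lis-prod.example.org", 6661,
                                    {"chem-01": 5001, "heme-01": 5002}),
    "test":       MiddlewareProfile("test", "lis-test.example.org", 6661,
                                    {"chem-01": 5001, "heme-01": 5002}),
}

def activate(role: str) -> MiddlewareProfile:
    """Load the profile this server should run; in a failover, the test
    hardware would load the 'production' profile after change control."""
    return PROFILES[role]

if __name__ == "__main__":
    print(activate("test"))
```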

Matthew G. Hanna, MD, clinical instructor in breast pathology and informatics, Memorial Sloan Kettering Cancer Center: Laboratories need to think about having adequate network access control, as well as bandwidth, to support emerging technologies such as digital pathology and molecular bioinformatics. They need to look at how much bandwidth the lab currently uses and make projections for the future so they’re not limiting themselves. Another consideration is planning for adequate digital file storage—again, this is speaking from the aspect of digital pathology or bioinformatics, where the file types are probably the largest in medicine.
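As a rough illustration of that kind of projection, the short Python sketch below estimates annual whole slide image storage from assumed values; slide volumes and per-file sizes vary widely by scanner, magnification, and compression, so a lab would substitute its own figures.

```python
# Back-of-the-envelope storage projection for whole slide imaging. All of
# the inputs below are assumptions for illustration only.
slides_per_day = 1000          # assumed clinical scanning volume
avg_file_size_gb = 1.5         # assumed average compressed WSI file size
working_days_per_year = 260
copies = 2                     # primary copy plus one backup/replica

annual_tb = slides_per_day * avg_file_size_gb * working_days_per_year / 1024
print(f"New image data per year: ~{annual_tb:,.0f} TB")
print(f"With {copies} copies retained: ~{annual_tb * copies:,.0f} TB")
```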


I think some people may be waiting for digital pathology systems to become FDA approved or validated, but I think it’s truly worth investing in now. If you’re rebuilding the lab and you really want to scale prospective whole slide image clinical scanning for primary diagnosis, you want to look at placing those scanners in the lab. For surgical pathology, tissue should go into the scanner right after it is stained. Therefore, the architect planning the operational laboratory blueprints should not create siloed scanning space. If the scanners are in disparate locations, this will create a lot of inefficiency in bringing glass slides back and forth. Efficiency may not be as much of a factor if you’re using digital pathology for research. But for large-scale archival or prospective clinical scanning, improper hardware locations can create significant bottlenecks. Therefore, for clinical scanning, the best-case scenario is having the scanners in the lab. As vendors continue to create modular hardware, the glass slides may never need to leave the lab before moving to storage. You would then view the digital slides on your computer workstation as opposed to filling out a paper or electronic request for somebody to retrieve the glass slides.

Somak Roy, MD, assistant professor and director of molecular informatics and genetics services, Division of Molecular and Genomic Pathology, University of Pittsburgh Medical Center: Make sure you can securely archive sequencing or imaging data off site from the lab space to prevent permanent data loss in the event of a leaking water pipe near the servers, for example, or a massive power surge that affects the entire lab building. “Off site” means in a location that, typically, is a managed data center. It’s tightly controlled in terms of temperature and moisture, redundant power sources, how the servers are connected, the amount of backup that the servers are subjected to, who has access, all in a HIPAA-compliant environment. The data center could be anywhere—next door, or two blocks away, or miles away.
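One integrity check that pairs naturally with off-site archiving is verifying a checksum of the archived copy against the original before any local copy is purged. The Python sketch below shows the idea; the file paths are placeholders.

```python
# A minimal integrity check to pair with off-site archiving: compare a
# SHA-256 checksum of the archived copy with the original before any
# local copy is purged. The file paths below are placeholders.
import hashlib
from pathlib import Path

def sha256sum(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MB chunks so large sequencing or image files
    don't have to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

local_copy = Path("/data/runs/run_001/sample.bam")            # placeholder
archived_copy = Path("/mnt/offsite/runs/run_001/sample.bam")  # placeholder

if sha256sum(local_copy) == sha256sum(archived_copy):
    print("Archive verified; local copy may be purged per retention policy.")
else:
    print("Checksum mismatch; keep the local copy and re-transfer.")
```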


Data transfer over networks is another very critical thing to consider, and people often realize the hard way that it should have been planned for. We’re talking about data transfers from any kind of high-throughput sequencing or digital analysis platforms. One of the most common activities that involves handling these large data sets is backup, just moving the data, say, from the lab to an off-site data center. Most hospital systems have low- to medium-bandwidth network connections (10/100 Mbps), which are adequate for standard network communications like email exchanges and routine communications between EHRs and laboratory information systems. But when we start to move these very large amounts of data on a standard network connection—say, someone is trying to move a sequence (BAM) file from the sequencer to the server that will do the processing—it quickly overwhelms the bandwidth, creating a network traffic jam. It interferes with critical communications and can really impact patient care. As lab directors are preparing to launch their NGS assays or whole slide imaging setups, specific network requirements should be discussed with the IT team. Make sure the discussion includes consideration of a dedicated, high-bandwidth network: 1 Gb per second to 10 Gb per second, depending on the institution’s resources. The idea is to isolate the high-bandwidth network from the regular network.
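To illustrate why the dedicated segment matters, the back-of-the-envelope Python calculation below estimates how long a large BAM file would take to move at different speeds; the 100 GB file size and roughly 70 percent effective throughput are assumptions chosen only for illustration.

```python
# Rough transfer-time estimates for a large sequencing (BAM) file. The
# file size and effective-throughput figures are illustrative assumptions.
file_size_gb = 100
efficiency = 0.7   # protocol overhead, shared traffic, and so on

for label, mbps in [("100 Mbps shared LAN", 100),
                    ("1 Gbps dedicated link", 1_000),
                    ("10 Gbps dedicated link", 10_000)]:
    seconds = (file_size_gb * 8 * 1024) / (mbps * efficiency)
    print(f"{label:>22}: ~{seconds / 60:.0f} minutes")
```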

Another critical aspect is data security. Lab directors, or whoever is the appropriate designee, must fully review the details of data security for the servers. What kind of encryption protocol will be used to transfer the data? What are the service provider’s policies for securing stored data and in the event of a data breach? This should be done at the institutional level. Multiple stakeholders are typically involved in assessing the potential risks, especially when sending the data outside the institution’s firewall. It’s important that lab directors be aware of the necessary questions to ask or that they delegate the task to personnel with expertise in this area.

James H. Harrison, MD, PhD, associate professor and director of clinical laboratory informatics, Department of Pathology, University of Virginia: Make sure you have enough network plugs, located correctly in your physical space, to plug everything in. If the new system supports tracking of histology workflow and you intend to use it, make sure you plan for device placement and network plugs in the histology lab to support the required data entry.


If you plan to use Wi-Fi to connect lab devices, make sure security has been adequately considered. If the Wi-Fi signal will travel outside the secure workplace or be accessible to visitors, it should have secure login and encryption. Base stations should be installed in appropriate locations to cover all required devices.

If you would like to do your own analysis of operational and clinical data that goes beyond using simple Excel spreadsheets, consider setting up a secure server in the local network environment running Python or R. This will allow secure transfer of data extracted from the LIS, EHR, and local business systems. Analyses can be carried out securely on the server using remote desktop technology or a server-based analytic tool like JupyterLab.
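As a minimal sketch of what that kind of analysis might look like, the Python snippet below assumes a hypothetical CSV extract from the LIS has already been transferred to the secure server and summarizes turnaround time with pandas; the file name and column names are illustrative, not a specific LIS export format.

```python
# A minimal sketch, assuming an LIS data extract has already landed on the
# secure analysis server as a CSV. The file name and column names
# (received_dt, verified_dt, test_code) are hypothetical.
import pandas as pd

df = pd.read_csv("lis_extract_2019_03.csv",
                 parse_dates=["received_dt", "verified_dt"])

# Turnaround time in minutes for each result, then percentiles by test code.
df["tat_minutes"] = (df["verified_dt"] - df["received_dt"]).dt.total_seconds() / 60
summary = df.groupby("test_code")["tat_minutes"].describe(percentiles=[0.5, 0.9])
print(summary[["count", "50%", "90%"]])
```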

Gestalt Diagnostics acquires Peak Medical

The digital pathology company Gestalt Diagnostics has purchased Peak Medical, a provider of laboratory integration services, to expand its market reach and laboratory solutions offerings.

The acquisition includes Peak Medical employees who bring to Gestalt “skills and experience for all types of connectivity, with direct expertise in supporting legacy and current LISs alike,” said Dan Roark, chief executive officer of Gestalt Diagnostics, in a press statement.
