October 2002
The microbiology laboratory and bioterrorism
The Laboratory Response Network of the Centers for Disease Control and Prevention
is directed toward ensuring that hospital and community clinical microbiologists
possess the knowledge and skills necessary to recognize the potential agents
of bioterrorism. The agents most likely to be used as bioterrorist weapons include
variola major (smallpox), Bacillus anthracis (anthrax), Yersinia
pestis (plague), Francisella tularensis (tularemia), Brucella
species (brucellosis), and Clostridium botulinum intoxication (botulism).
If any of these agents is suspected or confirmed, the institution's infection
control practitioners and public health authorities must be notified immediately.
The author reviewed data from literature searches from 1997 through June 2001
using the subject headings of bioterrorism, biological weapons, biological warfare,
anthrax, brucellosis, tularemia, smallpox, plague, and botulism. In addition,
she obtained information from publications of the Center for Civilian Biodefense
Studies at Johns Hopkins University, the CDC, the American Society for Microbiology,
and the United States Army Medical Research Institute of Infectious Diseases. The findings
from these studies and publications were analyzed to determine the microorganisms
most likely to be involved in a bioterrorist attack and the most efficient means
by which they could be identified. In all instances, the guidelines from the
CDC for level A laboratories were observed. The aforementioned listing of microorganisms
most likely to be used as biological weapons was derived from this search. The
author concluded that, while knowledge of the potential of these microorganisms
is critical, clinical microbiologists and medical technologists possess the
basic tools to rule out the suspected pathogens or to refer these isolates to
public health authorities for identification and susceptibility testing.
Robinson-Dunn B. The microbiology laboratory's role in response to bioterrorism.
Arch Pathol Lab Med. 2002;126:291-294.
Reprints: Dr. Barbara Robinson-Dunn, Dept. of Clinical Pathology, William
Beaumont Hospital, 3811 W. 13 Mile Road, Royal Oak, MI 48073; brobinson-dunn@smtpgw.beaumont.edu
Healthy ranges for serum alanine aminotransferase levels
Serum alanine aminotransferase (ALT) concentration is commonly used to diagnose and
assess liver disease. The upper limits of normal for the serum ALT level have
been debated, and current reference ranges may underestimate the frequency of
chronic liver disease. Defining the "healthy" range for ALT may be more efficacious
than simply defining a generic normal range. The authors studied 9,221 blood
donors during a four-year period. Each donor underwent a psychosocial assessment,
medical history, physical examination, and infectious disease testing. Laboratory
tests were performed on morning, fasting blood samples. Of the 9,221 blood donors,
6,835 met the inclusion criteria. The median ALT level for the entire study
sample was 200 nkat/L (12 U/L); the median for males was 250 nkat/L (15 U/L)
and for females was 150 nkat/L (9 U/L). A total of 152 individuals had abnormal
ALT levels by standard reference ranges—that is, upper limits of normal
were 40 U/L in men and 30 U/L in women—and these individuals were re-evaluated.
Of this group, 59 had confirmed abnormal ALT levels and were evaluated further
for the presence of liver disease, and 93 had abnormal ALT levels that were
not confirmed. Of the 59 patients with confirmed abnormal ALT levels, 34 had no
definite cause of liver disease and 25 were found to have various liver diseases,
such as chronic alcohol abuse, cholelithiasis, and cholecystitis. The authors
found that the serum ALT was independently related to body mass index and laboratory
measurements of abnormal lipid or carbohydrate metabolism. The new, healthy
upper limit of normal serum ALT was found to be 500 nkat/L (30 U/L) for men
and 317 nkat/L (19 U/L) for women. The authors concluded that consideration
should be given to revising the upper limits of normal for ALT to improve its
sensitivity in identifying subclinical liver disease.
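For readers who want to move between the SI and conventional units quoted above, the conversion is fixed: 1 U/L equals 1 µmol/min per liter, or roughly 16.67 nkat/L. The short sketch below reproduces the proposed healthy upper limits; the function names are illustrative and not part of the study.

```python
# Conversion between conventional (U/L) and SI (nkat/L) units of enzyme activity.
# 1 U = 1 umol of substrate converted per minute; 1 katal = 1 mol/s,
# so 1 U/L = 1000/60 nkat/L (about 16.67 nkat/L).

NKAT_PER_UL = 1000.0 / 60.0  # ~16.67 nkat/L per U/L

def u_to_nkat(u_per_l: float) -> float:
    """Convert ALT activity from U/L to nkat/L."""
    return u_per_l * NKAT_PER_UL

def nkat_to_u(nkat_per_l: float) -> float:
    """Convert ALT activity from nkat/L to U/L."""
    return nkat_per_l / NKAT_PER_UL

# The proposed healthy upper limits from the study:
print(round(u_to_nkat(30)))  # men: 30 U/L  -> 500 nkat/L
print(round(u_to_nkat(19)))  # women: 19 U/L -> ~317 nkat/L
```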
Prati D, Taioli E, Zanella A, et al. Updated definitions of healthy ranges
for serum alanine aminotransferase levels. Ann Intern Med. 2002;137:1-9.
Correspondence: Dr. Daniele Prati, Centro Transfusionale e di Immunologia
dei Trapianti IRCCS Ospedale Maggiore Via Francesco Sforza, 35 20122 Milano,
Italy; dprati@yahoo.com
Quantifying fetomaternal hemorrhage by fluorescence microscopy
Fetal cells can enter the maternal circulation during pregnancy and can induce
maternal immunization against antigens on fetal red blood cells. Approximately
one to two percent of Rh(D)-negative mothers giving birth to Rh(D)-positive
babies will produce detectable anti-D antibody. To detect and prevent immunization
of Rh(D)-negative mothers giving birth to Rh(D)-positive infants, the quantification
of fetal red cells in the maternal circulation followed by Rh immune globulin
administration is necessary. The Kleihauer-Betke test, which identifies cells
containing fetal hemoglobin (hemoglobin F), is a widely used standard method
to quantify fetomaternal hemorrhage. The accuracy of the Kleihauer-Betke test
depends on several variables, and the test cannot distinguish fetal red cells from
adult red cells containing hemoglobin F, such as those seen in thalassemia or sickle cell
disease. The authors compared the sensitivity and specificity of the Kleihauer-Betke
test, flow cytometry, and fluorescence microscopy in quantifying fetomaternal
hemorrhage. Serial dilutions of 1, 0.5, 0.3, and 0.1 percent Rh(D)-positive
fetal blood in maternal Rh(D)-negative blood were prepared. The Kleihauer-Betke
test was performed per standard method. Anti-D antibody with the appropriate
conjugate was used for flow cytometric analysis or fluorescence microscopy.
Six samples of each ABO blood group were studied. Thus, a total of 24 blood
samples (96 titrations or 384 samples) were analyzed. The Kleihauer-Betke test
overestimated fetal blood content by 85 percent (P<0.0001), and the
accuracy of the measurements decreased with increasing dilution. The Kleihauer-Betke
test did not appropriately quantify fetomaternal hemorrhage in the clinically
relevant range of zero to one percent fetal cells in maternal blood, which corresponds
to a fetal bleed of zero to 50 mL. Flow cytometry and fluorescence microscopy
were found to have a high degree of accuracy for quantitating fetomaternal hemorrhage
in the clinically relevant range of fetal cell frequencies (zero to one percent),
and Rh(D)-positive red cells were reliably detected at the 0.1 percent level. The authors
concluded that, in contrast to the Kleihauer-Betke test, anti-D flow cytometry
and fluorescence microscopy are equally reliable and precise in detecting fetomaternal
hemorrhage. Fluorescence microscopy also offers advantages over flow cytometry
and could become the established standard for quantifying fetomaternal hemorrhage.
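The correspondence quoted above between fetal cell percentage and bleed volume (zero to one percent of fetal cells equating to a bleed of zero to 50 mL) follows from an assumed maternal blood volume of roughly 5,000 mL. A minimal sketch of that arithmetic; the 5,000 mL figure and the function name are illustrative assumptions, not values reported by the authors.

```python
# Estimate the volume of a fetomaternal hemorrhage from the fraction of
# fetal red cells found in the maternal circulation.
# Assumption (not from the paper): maternal blood volume of about 5,000 mL.

MATERNAL_BLOOD_VOLUME_ML = 5000.0

def fetal_bleed_ml(fetal_cell_percent: float,
                   maternal_volume_ml: float = MATERNAL_BLOOD_VOLUME_ML) -> float:
    """Approximate fetal whole-blood bleed (mL) from the percentage of fetal cells."""
    return maternal_volume_ml * fetal_cell_percent / 100.0

print(fetal_bleed_ml(1.0))  # 1 percent fetal cells -> about 50 mL bleed
print(fetal_bleed_ml(0.1))  # 0.1 percent -> about 5 mL bleed
```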
Ochsenbein-Imhof N, Ochsenbein AF, Seifert B, et al. Quantification of fetomaternal
hemorrhage by fluorescence microscopy is equivalent to flow cytometry. Transfusion.
2002;42:947-953.
Reprints: Dr. Roland Zimmermann, Dept. of Obstetrics, University Hospital
Zurich, Frauenklinikstrasse 10, CH-8091 Zurich, Switzerland; roland.zimmermann@fhk.usz.ch
Automated chromatography and prenatal diagnosis of β-thalassemia
Prenatal diagnosis of β-thalassemia ideally is conducted in the first
trimester of pregnancy using chorionic villus tissue DNA analysis. Second trimester
fetal blood analysis is done when couples are diagnosed late or when a conclusive
diagnosis cannot be reached by DNA analysis. The conventional approach for
fetal blood analysis is to estimate relative rates of synthesis of globin chains
of hemoglobin in mid-trimester fetuses and base the diagnosis on the beta-to-alpha
(β/α) biosynthetic ratio. The authors compared this traditional method
with the simpler approach of determining the amount of HbA present in the fetal
blood sample using the Bio-Rad Variant hemoglobin testing system. They used
the β-thalassemia short program on the Bio-Rad Variant in comparison with
the conventional globin biosynthesis measurement in 58 pregnancies. The β/α
biosynthesis ratios in 13 homozygous fetuses ranged from zero to 0.03, and the
adult HbA levels by automated chromatography ranged from zero to 0.4 percent.
Normal or heterozygous fetuses had β/α ratios of greater than 0.04
and HbA levels ranging from 2.1 to 10.6 percent. The authors also correlated
the β-gene mutations identified in 17 fetuses with the genotypes predicted by automated
high-performance liquid chromatography. Followup of the 18 unaffected fetuses using
the Variant system at birth showed a significant increase in HbA levels.
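As a rough illustration of how the reported cutoffs separate affected from unaffected fetuses, the sketch below applies the β/α ratio and HbA thresholds quoted in this summary. The function and the decision boundaries are illustrative only; in practice, indeterminate results would be resolved by DNA analysis.

```python
# Illustrative classification of a mid-trimester fetal blood result using the
# cutoffs quoted above: homozygous beta-thalassemia showed beta/alpha ratios
# of 0-0.03 and HbA of 0-0.4 percent; normal or heterozygous fetuses showed
# ratios greater than 0.04 and HbA of 2.1-10.6 percent.

def classify_fetus(beta_alpha_ratio: float, hba_percent: float) -> str:
    """Two-parameter screen; anything outside the quoted ranges is indeterminate."""
    if beta_alpha_ratio <= 0.03 and hba_percent <= 0.4:
        return "consistent with homozygous beta-thalassemia"
    if beta_alpha_ratio > 0.04 and hba_percent >= 2.1:
        return "consistent with a normal or heterozygous fetus"
    return "indeterminate - confirm by DNA analysis"

print(classify_fetus(0.01, 0.2))  # affected pattern
print(classify_fetus(0.08, 5.3))  # unaffected pattern
```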
Wadia MR, Phanasgaokar SP, Nadkarni AH, et al. Usefulness of automated chromatography
for rapid fetal blood analysis for second trimester prenatal diagnosis of
β-thalassemia. Prenat Diagn. 2002;22:153-157.
Reprints: R.B. Colah, assistant director, Institute of Immunohaematology
(ICMR), 13th floor, NMS Bldg., KEM Hospital Campus, Parel, Mumbai 400 012,
India; mohanty@bom5.vsnl.net.in
Osmolality vs. specific gravity of urine
Osmolality of urine often can be estimated by determining the specific gravity
of the specimen. Urinary specific gravity correlates reasonably well with urine
osmolality. Specific gravity, however, reflects the mass of dissolved solute rather than
the number of particles, so the relationship is altered when unusually large or small molecules are present in the
urine. This discrepancy between specific gravity and urine osmolality is well
recognized by nephrologists but less so by primary care physicians. The authors
carried out a detailed in vitro examination of different solutions with varying
molecular weights to compare osmolality and specific gravity. They examined
simulated urines of varying composition. They varied compositions of sodium
chloride, urea, creatinine, glucose, intravenous contrast dye, and albumin and
examined the resulting changes in the relationship between specific gravity
and osmolality. The authors also examined a series of urine samples from patients
with common clinical conditions that were likely to influence the relationship,
such as uncontrolled diabetes mellitus, nephrotic syndrome, post-administration
of intravenous radiocontrast material, or saline diuresis. The authors found
a linear correlation between specific gravity and osmolality for solutions of sodium
chloride, urea, creatinine, glucose, contrast dye, and albumin, and for their combinations,
although the relationship differed depending on the solute present.
Urine samples obtained from patients with different clinical conditions showed
that relying on specific gravity could over- or underestimate urine osmolality.
The authors concluded that in those clinical conditions, urine osmolality should
always be determined and should not be estimated based on specific gravity.
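The shortcut the authors caution against can be written out explicitly. A minimal sketch, assuming the frequently quoted approximation that each 0.001 of specific gravity above 1.000 corresponds to roughly 30 to 40 mOsm/kg; the exact factor varies between sources and is not taken from this study.

```python
# Rule-of-thumb estimate of urine osmolality from specific gravity.
# Assumption: each 0.001 of specific gravity above 1.000 ~ 30-40 mOsm/kg.
# This is the shortcut that breaks down when glucose, radiocontrast,
# or albumin is present in the urine, as the study shows.

def estimate_osmolality(specific_gravity: float, factor: float = 35.0) -> float:
    """Estimate urine osmolality (mOsm/kg); factor is mOsm/kg per 0.001 of SG."""
    return (specific_gravity - 1.000) * 1000.0 * factor

print(estimate_osmolality(1.010))  # roughly 350 mOsm/kg
print(estimate_osmolality(1.020))  # roughly 700 mOsm/kg
```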
Voinescu GC, Shoemaker M, Moore H, et al. The relationship between urine
osmolality and specific gravity. Am J Med Sci. 2002;323:39-42.
Reprints: Dr. G. Voinescu, Division of Nephrology, University Hospital &
Clinics, 1 Hospital Drive, MA 436, Columbia, MO 65201; voinescuc@health.missouri.edu
Oral vitamin K lowers INR in warfarin-associated coagulopathy
Warfarin-associated coagulopathy and excessive prolongation of the international
normalized ratio can cause life-threatening bleeding complications. Vitamin
K administered subcutaneously is typically used to treat warfarin-associated
coagulopathy and to normalize the INR. The authors conducted a multi-center,
randomized, controlled trial to examine the efficacy of oral versus subcutaneous
vitamin K in patients with an INR between 4.5 and 10. For eligible patients,
the warfarin was stopped for at least one day and then the patients were randomized
to receive 1 mg of vitamin K orally or subcutaneously. The INR was measured
the day after vitamin K was administered. Fifty-one patients were studied. The
mean INR on day zero for the 26 patients who received oral vitamin K was 5.8
(range, 4.5 to 7.6). The 25 patients in the subcutaneous vitamin K group had
a mean INR on day zero of 6.2 (range, 4.8 to 9.0). On the day after vitamin
K administration, 15 of 26 patients (58 percent) who received the oral drug
and six of 25 patients (24 percent) who received the subcutaneous drug had INRs
of 1.8 to 3.2 (P=0.015); an INR of less than 4.5 was considered safe. None
of the patients who received oral vitamin K and two patients (eight percent)
who received subcutaneous vitamin K had an increased INR on the day after study
drug administration. On the second and third days, the mean INRs were higher
in the subcutaneous group than in the oral group. During the one-month followup
period, no episodes of thromboembolism or bleeding occurred. The authors concluded
that 1 mg of oral vitamin K lowers the INR faster than 1 mg of subcutaneous
vitamin K in asymptomatic patients with elevated INRs (between 4.5 and 10).
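The abstract does not state which statistical test produced P=0.015, but an uncorrected chi-square on the reported two-by-two counts (15 of 26 oral versus 6 of 25 subcutaneous patients reaching the target INR) gives a very similar value, as the sketch below shows; the choice of test here is an assumption for illustration.

```python
# Check the reported comparison: 15/26 oral vs 6/25 subcutaneous patients
# had an INR of 1.8 to 3.2 the day after vitamin K (reported P = 0.015).
from scipy.stats import chi2_contingency

table = [[15, 11],   # oral vitamin K: reached target INR, did not
         [6, 19]]    # subcutaneous vitamin K: reached target INR, did not

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, P = {p:.3f}")  # about chi-square 6.0, P 0.015
```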
Crowther MA, Douketis JD, Schnurr T, et al. Oral vitamin K lowers the international
normalized ratio more rapidly than subcutaneous vitamin K in the treatment
of warfarin-associated coagulopathy. Ann Intern Med. 2002;137:251-254.
Correspondence: Dr. Mark A. Crowther, St. Joseph's Hospital, Room L208,
50 Charlton Ave. East, Hamilton, Ontario L8N 4A6, Canada; crowthrm@mcmaster.ca
Biochemistry of large pericardial effusions
Pericardial effusions result from a variety of pathologies. Echocardiography
provides an accurate and noninvasive method for diagnosing pericardial effusions,
but the etiology of the effusion is often uncertain. Laboratory testing of pleural
fluid is a well-documented means to identify the cause of pleural effusions,
but there is relatively little literature on the corresponding approach to pericardial
fluids. A study was conducted to examine the diagnostic utility of biochemical
testing for pericardial fluids. One hundred and ten hospital patients older
than 12 years of age who presented to the echocardiography department of a large
academic medical center in South Africa, as well as 12 control subjects who underwent
open-heart surgery, were studied in a consecutive prospective case series. Adenosine
deaminase measurement and microbiologic, hematologic, cytologic, and biochemical tests were performed
on the pericardial fluid. The etiology of each pericardial fluid sample was
established using predetermined criteria. The biochemical composition of the
pericardial exudates differed significantly from that of the pericardial transudates.
According to Light's criteria, an exudate has one or more of the following:
pleural fluid/serum protein ratio greater than 0.5, pleural fluid/serum lactate
dehydrogenase ratio greater than 0.6, or pleural fluid LDH level greater than
200 U/L. When these criteria were applied to pericardial fluids, their
sensitivity for identifying exudates was 98 percent. The authors noted that, although
the laboratory tests provided useful guidance for assessing the etiology of pericardial
effusions, the majority of large, clinically significant pericardial effusions
result from exudates.
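Light's criteria, as quoted above, amount to a simple decision rule, sketched below for a fluid/serum pair; the function and parameter names are illustrative, and the thresholds are those given in this summary.

```python
# Apply Light's criteria as quoted above: a fluid is an exudate if it meets
# one or more of: fluid/serum protein ratio > 0.5, fluid/serum LDH ratio > 0.6,
# or fluid LDH > 200 U/L.

def is_exudate(fluid_protein: float, serum_protein: float,
               fluid_ldh: float, serum_ldh: float) -> bool:
    """Return True if the fluid meets at least one of Light's criteria."""
    return (fluid_protein / serum_protein > 0.5
            or fluid_ldh / serum_ldh > 0.6
            or fluid_ldh > 200.0)

print(is_exudate(4.2, 7.0, 350, 400))  # True: protein ratio 0.6, LDH ratio 0.88
print(is_exudate(2.0, 7.0, 100, 400))  # False: transudate pattern
```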
Burgess LJ, Reuter H, Taljaard JJF, et al. Role of biochemical tests in
the diagnosis of large pericardial effusions. Chest. 2002;121:495-499.
Reprints: Dr. Lesley J. Burgess, Dept. of Cardiology, University of Stellenbosch
and Tygerberg Hospital, P.O. Box 19174, Tygerberg 7505, South Africa; treadres@iafrica.com