Clinical Abstracts

November 2008

Editor:
Michael Bissell, MD, PhD, MPH

Serial brain natriuretic peptides in patients presenting with chest pain
Staphylococcus aureus sensitization and allergies in early childhood
Evaluation and performance characteristics of ACL TOP coagulation analyzer
Confirmation testing for rubella virus infection
Diagnosing leprosy in patients treated by an ophthalmologist
Comparison of disease activity score ESR and CRP threshold values
Use of a microbead assay to detect HPA-1 alloantibodies
Predictive value of serum sodium in cirrhosis patients awaiting liver transplant

Serial brain natriuretic peptides in patients presenting with chest pain

Twenty percent of patients arriving at the emergency department have symptoms suggestive of acute myocardial infarction (MI). Diagnosis of acute MI is based on the rise and fall of troponin or creatine kinase-MB with at least one of the following: ischemic symptoms, electrocardiographic changes, or coronary artery intervention. Although the tissue specificities of cardiac troponin T (cTnT) and cardiac troponin I have improved the ability to make an accurate diagnosis, the search continues for optimal markers for acute coronary syndromes (ACS). Natriuretic peptides, which have been used primarily in the diagnosis of congestive heart failure, have also been found to be elevated with ACS. Natriuretic peptides are vasoactive hormones secreted by the heart as part of a systemic response to cardiac stress and ventricular dysfunction. The precursor peptide of brain natriuretic peptide (BNP) is stored in granules of ventricular myocytes, where it is cleaved into an amino-terminal product (NT-proBNP) and the physiologically active BNP. Release of BNP and NT-proBNP is regulated by wall stress and myocyte stretch. BNP levels trend upward to a peak between 14 and 40 hours after an ischemic event. Elevated BNP and NT-proBNP concentrations at admission with ACS are associated with poor prognosis, including increased mortality, development of congestive heart failure, and recurrent ischemic events. The authors conducted a study to characterize the diagnostic and prognostic utility of short-term dynamic changes in BNP and NT-proBNP in patients presenting to the emergency department with chest pain. Most previous studies involving the serial use of natriuretic peptides in the setting of ACS examined time points in the time frames of days, weeks, and months.
Because BNP is synthesized on demand in response to an appropriate stimulus, the authors were interested in determining if serial measurements of natriuretic peptides during the first 24 hours had diagnostic or prognostic value, or both. For 90 days, the authors followed 276 patients who presented to the emergency department with chest pain. They sampled BNP and NT-proBNP up to five times within 24 hours of presentation and again at discharge. Follow-up data were collected at 30 and 90 days after admission. Adverse events included emergency department visits for chest pain, cardiac-related readmission, and death. The authors assessed the prognostic and diagnostic value of baseline natriuretic peptide measurements with receiver operating characteristic analyses. They found that natriuretic peptides were diagnostic for congestive heart failure and new-onset congestive heart failure, but less so for ACS. The prognostic utility of serial sampling was evaluated by testing the statistical contribution of each future time point, as well as variability over time, over and above the baseline values in logistic regression models. The authors concluded that elevated baseline BNP and NT-proBNP concentrations were predictive of adverse events at 30 and 90 days. Serial sampling did not improve the prognostic value of BNP or NT-proBNP.

Kwan G, Isakson SR, Beede J, et al. Short-term serial sampling of natriuretic peptides in patients presenting with chest pain. J Am Coll Cardiol. 2007;49:1186–1192.

Correspondence: Dr. Robert L. Fitzgerald at rlfitzgerald@vapop.ucsd.edu

Staphylococcus aureus sensitization and allergies in early childhood

Staphylococcus aureus is a human pathogen responsible for a variety of clinical diseases. There is increased interest in strains secreting enterotoxins and their potential role in the pathophysiology of atopic diseases. S. aureus-secreted enterotoxins (SEs) are a large protein family. The skin of patients with eczema is frequently colonized by enterotoxin-excreting S. aureus strains, which can cause a higher degree of disease activity than nontoxigenic strains. Applying SE-B directly to skin elicits a local inflammatory response in healthy people and those with eczema. Adults with allergic rhinitis are four times more likely to be sensitized to SEs than healthy controls (25 percent versus 6.3 percent). Moreover, rhinitis severity is more likely to increase in SE-sensitized patients than in nonsensitized patients. SE sensitization in patients with mite allergy and rhinitis may contribute to a higher mite-specific IgE response. Growing evidence indicates that SEs may be implicated in the pathogenesis of asthma. However, little information exists about the relationship between SEs and atopic and allergic disease in children. Within the context of a population-based birth cohort study (Manchester Asthma and Allergy Study [MAAS]), the authors investigated the potential role of SEs in atopic disease during early childhood using SE-specific IgE as a marker. Children (n=510) were followed from birth to five years of age. The methods used were repeated questionnaires, IgE to inhalant and food allergens, lung function (spirometry and plethysmography), and airway reactivity (dry air challenge). The authors measured SE-mix-specific IgE (SE-A, SE-C, and toxic shock syndrome toxin 1) using fluorescence immunoassay. They found no association between rhinitis and SE-mix sensitization. Children with eczema were more frequently SE-mix sensitized than children without (17.4 versus 8.3 percent; P=.02).
The SE-mix sensitization rate rose significantly with increasing eczema severity (no eczema, mild, moderate/severe: 8.3 percent, 14.8 percent, 42.9 percent; P=.003) and remained independently associated with eczema in a multivariate model adjusting for total IgE (adjusted odds ratio, 2.19; 95 percent confidence interval, 1.05–4.56; P=.04). SE-mix sensitization was associated with current wheeze in the univariate but not the multivariate model. Among wheeze phenotypes, persistent wheezers were most commonly sensitized to SE-mix (never, transient, late onset, persistent: 8.5 percent, 3.8 percent, 7.7 percent, 17.6 percent; P=.05). Among wheezers, those who were SE-mix sensitized had significantly higher airway reactivity compared with those who were nonsensitized (mean FEV1 change, mL [95 percent CI]: –59 [–121, 3] versus 19 [–10.2, 48.9]; P=.04), with little difference after adjusting for atopy. The authors concluded that there are differences in SE-mix IgE antibodies between healthy five-year-old children and children with eczema and wheeze. The proportion of patients sensitized to SE-mix goes up with increasing disease severity. Staphylococcal enterotoxins are potential modifiers of childhood wheeze and eczema.

Semic-Jusufagic A, Bachert C, Gevaert P, et al. Staphylococcus aureus sensitization and allergic disease in early childhood: population-based birth cohort study. J Allergy Clin Immunol. 2007;119:930–936.

Correspondence: Dr. Aida Semic-Jusufagic at aidajusufagic@yahoo.com

Evaluation and performance characteristics of ACL TOP coagulation analyzer

The increasing demand for coagulation tests in an environment of decreasing staff and reagent budgets has raised interest in hemostasis laboratory automation. The capabilities of the current generation of fully automated coagulation analyzers include primary tube sampling, automatic rerun, dilution, and, in some instances, cap piercing. The analyzers can perform basic coagulation tests, such as prothrombin time (PT) and activated partial thromboplastin time (aPTT), as well as more sophisticated coagulation, chromogenic, and immunologic assays using smaller sample and reagent volumes than manual methods. The authors evaluated the automated hemostasis analyzer ACL TOP in routine practice at a coagulation laboratory of a university hospital. The ACL TOP is a fully automated, random-access, multiparameter coagulation analyzer equipped with a photo-optical clot-detection unit. The authors evaluated the analyzer’s technical features for performing routine coagulation (PT, aPTT, fibrinogen, and single coagulation factor levels), chromogenic assays (antiactivated factor X, antithrombin, and protein C activities), and immunologic assays (free protein S and von Willebrand factor antigen concentrations). The evaluation addressed ease of operation, method availability, reagent and patient sample onboard capabilities, ability to perform automatic dilution, rerun, and reflex testing, and validation of performance. The authors found that, using fresh and lyophilized plasma samples, the intra-assay and inter-assay coefficients of variation were below five percent for most of the parameters in the normal and pathological ranges. For clotting assays performed at 671 nm, no significant interference could be demonstrated with hemolytic, icteric, and lipemic samples, as demonstrated by results similar to those obtained using a mechanical clot-detection-based analyzer (STAR).
No sample carryover was detected in alternating measurements of heparinized (1.0 IU/mL unfractionated heparin) and normal plasma samples. The results of the different coagulation, chromogenic, and immunologic assays obtained on the ACL TOP were well correlated with those obtained on the STAR analyzer (correlation coefficient, 0.876–0.990). The authors concluded that using the ACL TOP analyzer, routine hemostasis testing can be performed with satisfactory precision, as can more specialized and specific tests, such as single factor activity or antigen concentration.

Appert-Flory A, Fischer F, Jambou D, et al. Evaluation and performance characteristics of the automated coagulation analyzer ACL TOP. Thrombosis Res. 2007;120:733–743.

Correspondence: Pierre Toulon at toulon.p@chu-nice.fr

Confirmation testing for rubella virus infection

Rubella virus infection in early pregnancy often causes death of the fetus or, if the fetus survives, congenital defects in about 90 percent of newborns. The major defects include deafness, cataracts, and heart disorders, which are collectively known as congenital rubella syndrome (CRS). Although rubella and CRS have been eliminated in the United States through an aggressive vaccination program, rubella is still endemic in countries without an immunization program or with an inadequate one, and explosive outbreaks may occur. It was estimated in 2003 that more than 100,000 infants worldwide are born with CRS each year. In general, a large proportion of unimmunized populations in areas where rubella is endemic are infected and become immune before puberty. Nevertheless, approximately three percent to 23 percent of adults remain susceptible in various countries and areas. Laboratory tests are essential for confirming sporadic cases and outbreaks of rubella. It often is necessary to detect rubella virus to confirm rubella cases and to identify specimens to be used to characterize wild-type rubella viruses. The authors evaluated the sensitivities of four methods for detecting rubella virus infection using 22 throat swabs collected from patients in Henan and Anhui provinces in China who had clinically suspected rubella virus infection. The four methods used were reverse transcription-polymerase chain reaction (RT-PCR) followed by Southern hybridization using RNA extracted directly from clinical specimens, virus growth in tissue culture followed by virus detection by RT-PCR, low-background immunofluorescence in infected tissue culture cells using monoclonal antibodies to the structural proteins of rubella virus, and a replicon-based method of detecting infectious virus. Among these four methods, direct RT-PCR followed by hybridization was the most sensitive. The replicon-based method was the least difficult to perform.

Zhu Z, Xu W, Abernathy ES, et al. Comparison of four methods using throat swabs to confirm rubella virus infection. J Clin Microbiol. 2007;45:2847–2852.

Correspondence: Joseph Icenogle at jcil@cdc.gov

Diagnosing leprosy in patients treated by an ophthalmologist

Leprosy, or Hansen’s disease, is a chronic contagious disease caused by Mycobacterium leprae, an acid-fast, rod-shaped bacterium. It is one of the oldest recorded infectious diseases affecting humans, with potentially incapacitating neurologic and social sequelae. Leprosy has been reported to have the highest incidence of ocular involvement of any human bacterial infection and represents a leading cause of blindness in countries such as India, Nepal, and Brazil. The disease is also widespread in tropical and subtropical areas of Asia, Africa, and Latin America. It is estimated that leprosy affects 10 million to 12 million people worldwide and visually impairs three to seven percent of them. Because of the rarity of leprosy in developed countries, there is a lack of awareness of the manifestations of ocular and periocular leprosy, and such signs as corneal scarring, uveitis, and eyelid anomalies are seldom considered in the differential diagnosis. Herein, the authors present their experience with the diagnosis of ocular and periocular leprosy in patients with no known prior history of leprosy who sought treatment at a tertiary eye care center. The authors’ study involved reviewing the clinical records for these patients. Outcome measures were patient demographics, presenting symptoms and signs, diagnostic studies, complications, and treatment. Among the six patients (five women and one man; average age, 55 years), only two were found to have leprosy based on clinical examination alone. Histopathologic characteristics or demonstration of acid-fast bacilli, suggestive of leprosy, were found in five patients. Definitive confirmation of leprosy was made by polymerase chain reaction performed on formalin-fixed, paraffin-embedded tissues from four patients suspected of having leprosy based on clinicohistopathologic examination results.
The authors concluded that the diagnosis of leprosy relies on the clinical symptom complex, epidemiologic factors, and demonstration of acid-fast bacilli in the tissue sample. An ophthalmologist may be the first one to encounter the disease, in which case suspicion and detection of ocular findings may lead to early treatment of the infection. Polymerase chain reaction may be a new tool for diagnosing leprosy when suspicion of the disease is raised by clinicohistopathologic studies.

Chaudhry IA, Shamsi FA, Elzaridi E, et al. Initial diagnosis of leprosy in patients treated by an ophthalmologist and confirmation by conventional analysis and polymerase chain reaction. Ophthalmology. 2007;114:1904–1911.

Correspondence: Dr. Imtiaz A. Chaudhry at orbitdr@hotmail.com

Comparison of disease activity score ESR and CRP threshold values

The development of new classes of therapeutic agents for treating rheumatoid arthritis has led to a subset of patients experiencing relief from pain and disability in recent years. However, physicians still require a reliable patient evaluation system that allows the effects of these agents to be assessed accurately. The disease activity score based on 28 joint counts (DAS28) has been widely used in clinical trials and to monitor the disease activity of clinic patients with rheumatoid arthritis. The DAS28 is calculated from tender joint count, swollen joint count (both performed by the treating physician), visual analog scale (VAS) score of the patient’s global health, and the laboratory parameter erythrocyte sedimentation rate (ESR). The DAS28 is widely used, and cut-off points of 2.6, 3.2, and 5.1 have been proposed to be indicative of remission, low disease activity, and high disease activity, respectively. A DAS28 based on C-reactive protein (CRP) levels rather than ESR recently has been proposed. Although the formula for calculating DAS28-CRP values was designed to produce results equivalent to those for DAS28-ESR, DAS28-CRP values seem to be lower than DAS28-ESR values in clinical practice. Because changes in ESR and CRP levels represent different underlying pathophysiologies, DAS28-CRP threshold values might be expected to differ from those of the DAS28-ESR. Nevertheless, few studies have been performed to validate or compare these two systems. Therefore, the authors conducted a study to evaluate the DAS28-CRP threshold values that correspond to criteria for remission, low disease activity, and high disease activity according to the DAS28-ESR. They analyzed DAS28 data from a large observational study (Institute of Rheumatology Rheumatoid Arthritis) database of 6,729 patients with rheumatoid arthritis. First, they studied the relationship between DAS28-ESR and DAS28-CRP values.
Second, they calculated the best DAS28-CRP trade-off values for each threshold using receiver operating characteristic (ROC) curves. The authors found that the correlation coefficient of ESR versus CRP was 0.686, whereas that of DAS28-ESR versus DAS28-CRP was 0.946, showing a strong linear relationship between DAS28-ESR and DAS28-CRP values. DAS28-CRP threshold values corresponding to remission, low disease activity, and high disease activity were 2.3, 2.7, and 4.1, respectively. The sensitivity and specificity from the ROC curves were gradually reduced as DAS28 values decreased. The authors concluded that DAS28-CRP and DAS28-ESR were well correlated, but the threshold values should be reconsidered. Because the results were derived only from Japanese patients, it is essential to compare DAS28-CRP threshold values in people of other ethnic groups.
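The abstract quotes the DAS28 components and cut-off points but not the underlying equations. The sketch below assumes the published DAS28-ESR and DAS28-CRP formulas from the DAS28 literature (they are not restated in this abstract) and applies the threshold values discussed above:

```python
import math

def das28_esr(tender28, swollen28, esr, global_health_vas):
    """DAS28 based on erythrocyte sedimentation rate (mm/h).
    global_health_vas is the patient's global health on a 0-100 mm VAS."""
    return (0.56 * math.sqrt(tender28) + 0.28 * math.sqrt(swollen28)
            + 0.70 * math.log(esr) + 0.014 * global_health_vas)

def das28_crp(tender28, swollen28, crp, global_health_vas):
    """DAS28 based on C-reactive protein (mg/L)."""
    return (0.56 * math.sqrt(tender28) + 0.28 * math.sqrt(swollen28)
            + 0.36 * math.log(crp + 1) + 0.014 * global_health_vas + 0.96)

def classify(score, cutoffs):
    """Map a DAS28 value onto an activity category.
    cutoffs = (remission, low, high): (2.6, 3.2, 5.1) for DAS28-ESR, or
    the (2.3, 2.7, 4.1) values the authors derived for DAS28-CRP."""
    remission, low, high = cutoffs
    if score < remission:
        return "remission"
    if score <= low:
        return "low disease activity"
    if score <= high:
        return "moderate disease activity"
    return "high disease activity"
```

For identical joint counts and global health, DAS28-CRP typically comes out lower than DAS28-ESR, which illustrates why the authors argue that the CRP-based score needs its own, lower threshold values.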

Inoue E, Yamanaka H, Hara M, et al. Comparison of Disease Activity Score (DAS)28-erythrocyte sedimentation rate and DAS28-C-reactive protein threshold values. Ann Rheum Dis. 2007;66:407–409.

Correspondence: E. Inoue at einoue@kde.biglobe.ne.jp

Use of a microbead assay to detect HPA-1 alloantibodies

Neonatal alloimmune thrombocytopenia (NAIT) is characterized by thrombocytopenia induced by the destruction of fetal and neonatal platelets through transplacental passage of maternal platelet-specific alloantibodies. NAIT manifestations range from asymptomatic to severe bleeding events and include intracranial hemorrhage, which is the most common cause of mortality and morbidity in 15 to 20 percent of affected people. In Caucasians, NAIT is caused by alloantibody against human platelet antigen 1a (HPA-1a) in approximately 75 percent of cases. In addition, alloantibodies against HPA-1a are the most common cause of platelet destruction in patients with post-transfusion purpura (PTP). The epitopes of HPA-1a are localized on the glycoprotein IIIa (GPIIIa) subunit of the platelet fibrinogen receptor GPIIb/IIIa complex. Recent evidence from the crystal structures of GPIIb/IIIa showed that the N-terminal PSI (plexins, semaphorins, integrins) domain of GPIIIa harbors the single-point mutation Pro33Leu. The presence of Leu33 or Pro33 in this conformationally restrained loop alters its structure, giving rise to the HPA-1a and HPA-1b epitopes, respectively. In the past three decades, several approaches have been implemented to characterize platelet-reactive antibodies. Capture assays based on immobilization of platelet antigens with monoclonal antibodies (MoAbs), such as the MoAb-immobilized platelet antigen (MAIPA) assay, are the gold standard for detecting platelet alloantibodies. However, such capture assays are limited by the availability of MoAbs and require several washing steps. The authors have reported on a rapid gel test based on a microbead assay for detecting platelet alloantibodies. However, that assay still depends on applying MoAbs and includes washing procedures.
More recently, the authors conducted a study in which they developed a rapid gel test, termed gel antigen-specific assay (GASA), for detecting platelet alloantibodies without MoAbs and without washing steps. In the study, GPIIb/IIIa was purified, using affinity chromatography, from outdated platelet concentrates derived from HPA-1aa or HPA-1bb donors. Purified glycoproteins were biotinylated, immobilized onto streptavidin beads, and used to analyze HPA-1a alloantibodies by a microtyping system. HPA-1a serum samples derived from mothers with NAIT (n=36) and from post-transfusion purpura patients (n=2), as well as HPA-1b (n=4), HPA-5b (n=2), HPA-3a (n=4), and HLA class I (n=2) alloantiserum samples from multitransfused patients, were investigated in GASA and MAIPA assays. The authors found that GASA was able to detect all HPA-1a and -1b alloantibodies recognized by MAIPA. The authors did not observe cross-reactivity with other platelet-reactive alloantibodies. Interestingly, three of 36 serum samples, which showed only moderate reactivity in MAIPA, reacted strongly in GASA. The authors concluded that GASA has proved to be a rapid method for detecting HPA-1a alloantibodies and may be useful for platelet antibody screening, especially for initially assessing suspected NAIT cases.

Bakchoul T, Meyer O, Agaylan A, et al. Rapid detection of HPA-1 alloantibodies by platelet antigens immobilized onto microbeads. Transfusion. 2007;47:1363–1368.

Correspondence: Dr. Sentot Santoso at sentot.santoso@immunologie.med.unigiessen.de

Predictive value of serum sodium in cirrhosis patients awaiting liver transplant

The model for end-stage liver disease (MELD) score is the most widely used method for allocating organs in liver transplantation. The model, which includes variables related to liver and renal function, was implemented in the United States in 2002 and is being used in many countries to classify cirrhosis patients awaiting transplantation based on the severity of their liver disease. Nevertheless, several studies, as well as clinical observation, indicate that some subsets of patients with cirrhosis may have high mortality despite low MELD scores. Therefore, it is necessary to improve the MELD score. To this end, several studies have shown that serum sodium concentration is a good marker of prognosis for patients awaiting transplantation. Based on the results of these studies, serum sodium has been recommended for assessing the severity of cirrhosis. However, several issues surround the use of serum sodium as a predictor of prognosis. They are whether serum sodium is equally effective in assessing short-term prognosis (three months) compared with mid-term prognosis (12 months); whether serum sodium is equally accurate in predicting prognosis in different subpopulations of patients with cirrhosis; and whether serum sodium improves the accuracy of the MELD score. The authors conducted a study to investigate these issues. The study included 308 consecutive patients with cirrhosis listed for transplantation during a five-year period. The end point was survival at three and 12 months while awaiting transplantation. Variables obtained at listing were analyzed for prognostic value using multivariable analysis. Accuracy of prognostic variables was analyzed by receiver operating characteristic (ROC) curves. The authors found that the MELD score and serum sodium concentration were the only independent predictors of survival at three and 12 months after listing.
Low serum sodium was associated with an increased risk of death in all subpopulations of patients with cirrhosis categorized according to the major complication that developed before listing. The area under the ROC curves for serum sodium and MELD score was not significantly different at three months (0.83 versus 0.79, respectively) or 12 months (0.70 versus 0.77, respectively). Adding serum sodium did not significantly improve the accuracy of the MELD score in predicting survival at three and 12 months. The authors concluded that, in patients with cirrhosis awaiting liver transplantation, serum sodium and MELD score are independent predictors of survival. Larger studies are needed to determine whether adding serum sodium to MELD can improve its prognostic accuracy.
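For readers unfamiliar with the score itself, the following sketch assumes the standard MELD formula and the clamping conventions commonly used in UNOS allocation; these details come from the MELD literature rather than from this abstract:

```python
import math

def meld(bilirubin_mg_dl, inr, creatinine_mg_dl, on_dialysis=False):
    """MELD = 3.78*ln(bilirubin) + 11.2*ln(INR) + 9.57*ln(creatinine) + 6.43.
    By convention, values below 1.0 are set to 1.0 (so each log term stays
    non-negative), creatinine is capped at 4.0 mg/dL, and patients on
    dialysis are assigned a creatinine of 4.0. The result is rounded to
    the nearest whole number."""
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    creat = 4.0 if on_dialysis else min(max(creatinine_mg_dl, 1.0), 4.0)
    score = (3.78 * math.log(bili) + 11.2 * math.log(inr)
             + 9.57 * math.log(creat) + 6.43)
    return round(score)
```

Because the formula contains no sodium term, the study's question is whether a low serum sodium carries prognostic information beyond what this calculation already captures.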

Londono M-C, Cardenas A, Guevara M, et al. MELD score and serum sodium in the prediction of survival of patients with cirrhosis awaiting liver transplantation. Gut. 2007;56:1283–1290.

Correspondence: Dr. Pere Gines at pgines@clinic.ub.es


Dr. Bissell is Professor and Director of Clinical Services and Vice Chair, Department of Pathology, Ohio State University Medical Center, Columbus.