Recent Research

Probiotics, new-generation anticoagulants and antiplatelet therapy, and more.


Critically ill patients may benefit from probiotics, but evidence is weak

Probiotics may reduce infectious complications, including ventilator-associated pneumonia (VAP), in critically ill patients, a systematic review and meta-analysis suggests.

Researchers searched databases, article reference lists, and personal files from 1980 to 2011. Their review included 23 randomized controlled trials that evaluated use of probiotics versus placebo in critically ill adults for outcomes including infections (primary) and mortality and length of stay (secondary). Subgroup analyses were performed to compare high and low doses of probiotics, different strains of probiotics, and methodological quality of studies. Results were published in the December 2012 Critical Care Medicine.

The mean methodological score of all trials was 9.5 (range, 6 to 13) of a possible 14. Of the 23 trials, seven (30%) tested the viability of the probiotics used in the intervention.

Different probiotic therapies were used among the 23 included trials; eight used Lactobacillus plantarum and three used Lactobacillus rhamnosus strain GG.

In the 11 trials that reported infectious complications, pooled results show that probiotics were associated with a reduction in such complications (risk ratio [RR], 0.82; P=0.03). Pooled data from the seven trials reporting VAP data found a significant reduction in VAP rates associated with probiotics (RR, 0.75; P=0.03). Probiotics were associated with a non-significant trend toward reduced ICU mortality (RR, 0.80; P=0.16) but had no effect on hospital mortality or length of stay in the ICU or hospital. More dramatic treatment effects were seen in trials of lower methodological quality.

The findings in this analysis were “tempered by the presence of significant clinical and statistical heterogeneity and the lack of statistical precision in some analyses,” the authors noted, and were further weakened by the fact that more dramatic results were found in the lower-quality trials. The results are, however, consistent with a 2009 review that suggested probiotics reduce VAP. Unlike other studies, this review didn't find an association between probiotic use and fewer cases of diarrhea, they noted. The best type and dosage of probiotics also were not clear from this analysis, they wrote.

A separate meta-analysis of 13 trials, published in the Sept. 10, 2012, Chest, found that probiotics didn't significantly reduce hospital or ICU mortality or the duration of mechanical ventilation. Probiotics were, however, associated with lower incidence of ICU-acquired pneumonia and a shorter stay in the ICU.

New-generation anticoagulants associated with bleeding after ACS in patients on antiplatelet therapy

New-generation anticoagulants are associated with increased bleeding risk when used after acute coronary syndrome (ACS) in patients taking antiplatelet agents, according to a meta-analysis.

Researchers analyzed data from randomized, placebo-controlled trials that examined anti-Xa or direct thrombin inhibitors in post-ACS patients who were receiving antiplatelet therapy. Stent thrombosis, overall mortality rate and a composite end point including major ischemic events were evaluated as efficacy measures. Thrombolysis in Myocardial Infarction (TIMI) major bleeding was the safety end point. The researchers calculated the net clinical benefit of therapy by summing composite ischemic events and major bleeding events. The results were published in the Nov. 12, 2012, Archives of Internal Medicine.

Seven trials published between Jan. 1, 2000, and Dec. 31, 2011, were included in the meta-analysis. The trials involved 31,286 patients who had been admitted to the hospital for unstable angina pectoris, ST-segment elevation MI, or non-ST-segment elevation MI. Patients with severe cardiac, renal or liver insufficiency and those with high bleeding risk (including those requiring long-term oral anticoagulation) were excluded. Follow-up ranged from three months to five years.

Overall, new-generation oral anticoagulants were associated with a significant increase in major bleeding in patients receiving antiplatelet therapy after ACS (odds ratio, 3.03; P<0.001). Risk for stent thrombosis or composite ischemic events was significantly but moderately reduced with the new agents, while no significant effect was seen on overall mortality. The new anticoagulants did not appear to provide an advantage over placebo in net clinical benefit (odds ratio, 0.98; P=0.57).

The authors noted that some of the included trials had high rates of drug discontinuation or had short follow-up periods. In addition, the trials evaluated different drugs in different dosages and used different definitions of composite outcomes. However, the authors concluded that anti-Xa and direct thrombin inhibitors are linked to increased major bleeding risk in ACS patients receiving antiplatelet therapy, to the point that the net clinical benefit of the drugs is neutral.

“Because the use of new-generation P2Y12 ADP-receptor antagonists may result in greater reductions of ischemic events, with substantially lower risk for bleeding complications, the role of oral anticoagulant agents after an ACS is debatable,” they wrote.

An invited commentary pointed out that while the relative risk for major bleeding seemed dramatic, the absolute risk increase was 0.9% (1.3% for novel oral anticoagulants compared with 0.4% for control therapy). However, the absolute risk decreases for the composite ischemic outcome and for MI were −1.3% (6.5% vs. 7.8%) and −0.8% (3.7% vs. 4.5%), respectively, and the absolute risk differences for net clinical benefit and for overall death were each −0.5%.
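The commentary's point about relative versus absolute risk is easy to verify from the event rates quoted above. A minimal sketch (the rates come from the text; the helper function and its name are illustrative, not from the study):

```python
# Recompute the absolute risk differences quoted in the commentary.
# Rates (as proportions) are taken from the text above.

def risk_difference(treated_rate, control_rate):
    """Absolute risk difference: treated minus control, as a percentage."""
    return round((treated_rate - control_rate) * 100, 1)

major_bleeding = risk_difference(0.013, 0.004)      # 1.3% vs. 0.4%
composite_ischemic = risk_difference(0.065, 0.078)  # 6.5% vs. 7.8%
mi = risk_difference(0.037, 0.045)                  # 3.7% vs. 4.5%

print(major_bleeding)      # 0.9  (harm: more bleeding)
print(composite_ischemic)  # -1.3 (benefit: fewer ischemic events)
print(mi)                  # -0.8 (benefit: fewer MIs)
```

The 0.9% absolute increase in bleeding is roughly offset by the 1.3% absolute decrease in ischemic events, which is the arithmetic behind the neutral net clinical benefit.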

“The benefit is largely canceled by the harm; therefore, the routine use of [novel anticoagulants] among patients with ACS is unwarranted,” the commentary author wrote. He called for additional research to examine the use of newer anticoagulants in specific populations of patients with ACS.

Recurrent TIA and minor stroke associated with resulting disabilities, study finds

A detailed prospective study of transient ischemic attacks (TIAs) and minor stroke found that 15% of patients were disabled at 90 days, even though most of them did not have a recurrent event.

Researchers prospectively enrolled consecutive patients with minor stroke, rated less than 4 on the National Institutes of Health Stroke Scale, or TIA, who were not previously disabled and who underwent computed tomography or computed tomography angiography (CT/CTA) within 24 hours of symptom onset.

Disability was assessed at 90 days using the modified Rankin Scale. Predictors of disability (modified Rankin Scale ≥2) and the relative impact of the initial event versus recurrent events were assessed. Results were published in the November 2012 Stroke.

Among 499 patients, 74 (15%; 95% CI, 12% to 18%) were disabled at 90 days. Baseline factors predicting disability included age 60 years or older, diabetes, a premorbid modified Rankin Scale score of 1, ongoing symptoms, a CT/CTA-positive metric, and positive diffusion-weighted imaging.

In the multivariable analysis, factors that predicted disability included ongoing symptoms (odds ratio [OR], 2.4; 95% CI, 1.3 to 4.4; P=0.004), diabetes (OR, 2.3; 95% CI, 1.2 to 4.3; P=0.009), female sex (OR, 1.8; 95% CI, 1.1 to 3; P=0.025), and CT/CTA-positive metric (OR, 2.4; 95% CI, 1.4 to 4; P=0.001).

Of the 463 patients who did not have a recurrent event, 55 (12%) were disabled. Of the 36 patients who did have a recurrent event, 19 (53%) were disabled (risk ratio, 4.4; 95% CI, 3 to 6.6; P<0.0001). Researchers noted that although recurrent events strongly predict disability, they account for only a minority of disabled patients.
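The reported risk ratio follows directly from these counts; as a quick arithmetic check (counts are from the paragraph above, variable names are mine):

```python
# Risk of disability with vs. without a recurrent event,
# using the counts reported in the study.
disabled_with_recurrence, total_with_recurrence = 19, 36
disabled_no_recurrence, total_no_recurrence = 55, 463

risk_with = disabled_with_recurrence / total_with_recurrence  # about 0.53
risk_without = disabled_no_recurrence / total_no_recurrence   # about 0.12

risk_ratio = risk_with / risk_without
print(round(risk_ratio, 1))  # 4.4, matching the reported risk ratio
```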

“There has been a general emphasis on recurrent events after minor stroke or TIA rather than on disability, yet disability is the accepted outcome after disabling stroke,” researchers wrote. “Our study is novel in that it emphasizes the need to examine disability even in minor stroke and brings together … careful clinical assessments and imaging data to emphasize this point.”

Restarting warfarin after GI bleeding associated with better outcomes

Patients previously on warfarin who don't resume the therapy soon after a gastrointestinal (GI) bleeding event may have a higher risk for thrombosis and death, according to a recent study.

Researchers from Kaiser Permanente Colorado performed a retrospective cohort study to examine time to resumption of anticoagulation and incidence of thrombosis, recurrent GI bleeding and death in the 90 days after a GI bleeding event. Clinical and administrative data were used to identify patients who developed GI bleeding while taking warfarin, stopped the therapy, and then did or did not restart it within 90 days. Outcomes were compared between the “resumed therapy” and “did not resume therapy” groups. The study results were published in the Oct. 22, 2012, Archives of Internal Medicine.

A total of 442 patients with an index event of warfarin-associated GI bleeding were included in the study. Approximately half of the patients were men, and the mean age was 74.2 years. Overall, 46.2% of patients had taken aspirin in the 90 days before their first GI bleeding event.

Two hundred sixty patients (58.8%) resumed warfarin therapy within 90 days of the index event, and 182 (41.2%) did not. Resuming warfarin therapy was associated with a lower adjusted risk for thrombosis and death (hazard ratios, 0.05 [95% CI, 0.01 to 0.58] and 0.31 [95% CI, 0.15 to 0.62], respectively) and did not appear to significantly increase the risk for recurrent GI bleeding (hazard ratio, 1.32 [95% CI, 0.50 to 3.57]).

The authors noted that they could not assess all of the factors involved in physicians' clinical decisions about therapy, that there is potential for confounding, and that they did not have data on aspirin use after the index bleeding event. However, they concluded that thrombosis and death after GI bleeding are more likely in patients who do not resume warfarin therapy within 90 days, and that the benefits of the therapy will outweigh the risk in many cases. They called for further research to determine the optimal interval of warfarin interruption after GI bleeding and to better define which patients can tolerate a longer interruption in therapy.

An accompanying editorial outlined three key findings from the study: Patients and physicians are usually willing to restart warfarin after GI bleeding; anticoagulation was usually restarted within a week of the bleeding event; and recurrent bleeding events, none of which were fatal, occurred at an acceptably low rate.

Based on the available data, the editorialists wrote, “We believe that most patients with warfarin-associated GI bleeding and indications for continued long-term antithrombotic therapy should resume anticoagulation within the first week following the hemorrhage, approximately 4 days afterward.” However, they advised against also continuing antiplatelet therapy in these patients without a “compelling indication” to do so and said that the study's findings should not be applied to patients taking new anticoagulant agents, such as dabigatran and rivaroxaban.

VTE risk appears increased in patients with slightly low eGFR, albumin-creatinine ratio

Even mild kidney disease is associated with an increase in risk of venous thromboembolism (VTE), a review found.

The pooled analysis of individual data included five prospective cohort studies of almost 100,000 people in Europe and the United States. A total of 1,178 VTEs occurred during about six years of follow-up. Using Cox regression models, the researchers calculated that patients with decreased estimated glomerular filtration rate (eGFR) had a higher risk of VTE than patients with an eGFR of 100 mL/min/1.73 m2. The hazard ratio (HR) was 1.29 (95% CI, 1.04 to 1.59) for an eGFR of 75 mL/min/1.73 m2, 1.31 (95% CI, 1.00 to 1.71) for an eGFR of 60 mL/min/1.73 m2, 1.82 (95% CI, 1.27 to 2.60) for an eGFR of 45 mL/min/1.73 m2 and 1.95 (95% CI, 1.26 to 3.01) for an eGFR of 30 mL/min/1.73 m2.

An albumin-creatinine ratio (ACR) higher than the reference value of 5.0 mg/g was also associated with higher risk of VTE: at 30 mg/g, the HR was 1.34 (95% CI, 1.04 to 1.72); at 300 mg/g, 1.60 (95% CI, 1.08 to 2.36); and at 1,000 mg/g, 1.92 (95% CI, 1.19 to 3.09). The associations of the two measurements with VTE were independent of each other and of traditional cardiovascular risk factors, the researchers said. They concluded that both eGFR and ACR are associated with VTE risk, even within their normal ranges. The study was published in the Oct. 16, 2012, Circulation.

The results clarify previous inconsistent findings on this question, the authors said, and support the existence of a direct association between chronic kidney disease and VTE. That association may result from endothelial injury or changes in procoagulant proteins, they suggested. Because these levels of chronic kidney disease are prevalent in the general population and are more modifiable with medication than most VTE risk factors, some VTEs might be prevented by attention to this relationship. Future studies should assess the impact of drugs that improve albuminuria on VTE, the authors said.

Nonpayment for “preventable” infections didn't affect rates of infection

Withholding payment for bloodstream and urinary tract infections deemed preventable didn't have a measurable effect on infection rates in U.S. hospitals, a study found.

Researchers sought to determine the effect of a Centers for Medicare and Medicaid Services (CMS) policy that, starting in October 2008, discontinued payment for some hospital-acquired conditions considered preventable. They used data from January 2006 through March 2011 to study changes in rates of central catheter-associated bloodstream infections and catheter-associated urinary tract infections, both targeted by the CMS policy. They compared these rates to those of ventilator-associated pneumonia, which was not targeted by the policy. Data came from 398 hospitals or health systems in 41 states. Results were published online Oct. 11, 2012, by the New England Journal of Medicine.

There were no significant changes in quarterly rates of central catheter-associated bloodstream infections, catheter-associated urinary tract infections or ventilator-associated pneumonia after the policy implementation (incidence-rate ratios in the post-implementation vs. pre-implementation period, 1.00, 1.03 and 0.99, respectively). Findings didn't differ by whether a hospital was in a state with mandatory reporting, or by hospital size, type of ownership, teaching status, or quartile of percentage of Medicare admissions.

In fact, rates of targeted and untargeted infections began decreasing before the policy was implemented and continued to do so afterward, but without a significant change in the speed of decrease. When researchers limited their analysis to 155 hospital units in states without mandatory reporting of catheter-associated urinary tract infections, they found that implementing the CMS policy actually was associated with a significant slowing of decreases in catheter-associated urinary tract infection rates (relative rate, 1.06; P=0.03).

Possible explanations for the finding that the CMS policy didn't affect infection rates overall include that hospitals may have changed their billing practices, that quality-improvement efforts for the infections measured were already under way, or that the financial incentives at stake were too small, the authors said. “As CMS continues to expand this policy…careful evaluation is needed to determine when these programs work, when they have unintended consequences, and what might be done to improve patient outcomes,” the authors concluded.

Two models may offer accurate assessment of GFR in older adults

Two models to estimate glomerular filtration rate (GFR) may provide more accurate assessment of kidney function in older adults, a study found.

To derive the Berlin Initiative Study (BIS) equations, researchers assembled a cross-sectional, randomly selected, community-based sample of 610 participants age 70 years or older (mean age, 78.5 years) from a large insurance company. Data were split into two sets, one for equation development and one for internal validation.

Two estimates of GFR were developed and validated, one based on creatinine only (BIS1) and one based on both creatinine and cystatin C measurements (BIS2). Iohexol plasma clearance was measured as the gold standard. Results appeared in the Oct. 2, 2012, Annals of Internal Medicine.

The new BIS2 equation yielded the smallest bias, followed by the BIS1 and Cockcroft-Gault equations. All other equations considerably overestimated GFR.

The BIS equations confirmed a high prevalence of persons older than 70 years with a GFR less than 60 mL/min per 1.73 m2 (BIS1, 50.4%; BIS2, 47.4%; measured GFR, 47.9%). The total misclassification rate for this criterion was smallest for the BIS2 equation (11.6%), followed by the cystatin C equation 2 (15.1%) proposed by the Chronic Kidney Disease Epidemiology Collaboration. Among the creatinine-based equations, BIS1 had the smallest misclassification rate (17.2%), followed by the Chronic Kidney Disease Epidemiology Collaboration equation (20.4%).

Researchers noted that the BIS2 equation should be used to estimate GFR in persons age 70 years or older with normal or mild to moderately reduced kidney function. If cystatin C is not available, the BIS1 equation is an acceptable alternative.

“Compared with current creatinine-based or creatinine- and cystatin C-based equations, the new BIS1 and BIS2 equations showed better precision and excellent agreement with [measured] GFR, especially in a population with an [estimated] GFR greater than 30 mL/min per 1.73 m2 (CKD [chronic kidney disease] stages 1 to 3),” researchers wrote. “This is important because a validated equation to estimate GFR in older adults, especially in cases of normal or only moderately reduced kidney function, has been lacking.”

Also, researchers noted that older adults may be a unique population in which traditional assumptions about GFR are not necessarily true. In elderly participants, the MDRD study equation yielded higher estimated GFRs across CKD stages than did the CKD-EPI and especially the Cockcroft-Gault equation. This contrasts with the situation in younger adults in whom implementation of the CKD-EPI equation has reduced CKD prevalence, but agrees with current results seen in older adults.

“The most striking result was that incorporation of cystatin C in the equation decreased the effect of age and sex,” researchers wrote. “This confirms the independence of cystatin C from age- and sex-associated conditions and may thus make it the preferred laboratory variable to be included in a GFR-estimating equation in an elderly population where reduction in muscle mass is common.”

Benzodiazepines may confer dementia risk for elderly patients

Patients who started taking benzodiazepines after age 65 had a greater risk of developing dementia than those who never used the drugs, a study found.

Researchers examined 3,777 community-dwelling French people aged 65 years and older in a prospective cohort study on brain aging. They observed the subjects for three to five years in order to identify factors that led to benzodiazepine initiation, then followed up for 15 years. Eligible subjects (n=1,063; mean age, 78.2 years) were dementia-free and didn't start taking benzodiazepines until at least the third year of follow-up. The main outcome was incident dementia confirmed by a neurologist. Results were published online by BMJ on Sept. 27, 2012.

Nearly 9% (n=95) of patients started taking benzodiazepines during the study. These new users were more likely to be single or widowed, to have less education, to have more significant depressive symptoms, to use antihypertensives, platelet inhibitors, or oral anticoagulants, and to consume wine less regularly.

About 24% (n=253) of all patients developed dementia, including 30 benzodiazepine users and 223 non-users. Starting benzodiazepines was associated with a significant increase in the risk of developing dementia (hazard ratio [HR], 1.60; 95% CI, 1.08 to 2.38), a result that was basically unchanged when adjusted for depressive symptoms (HR, 1.62; 95% CI, 1.08 to 2.43). The incidence rate of dementia was 4.8 per 100 person-years among benzodiazepine users vs. 3.2 per 100 person-years among non-users.
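As a rough consistency check, the crude rate ratio implied by the two incidence rates is close to the adjusted hazard ratio (rates taken from the text; a crude ratio ignores adjustment and censoring, so this is only a sanity check):

```python
# Crude incidence-rate ratio from the per-100-person-year rates quoted above.
rate_users = 4.8      # dementia cases per 100 person-years, benzodiazepine users
rate_nonusers = 3.2   # dementia cases per 100 person-years, non-users

crude_rate_ratio = round(rate_users / rate_nonusers, 2)
print(crude_rate_ratio)  # 1.5, in line with the adjusted HR of 1.60
```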

Benzodiazepines are useful for treating acute anxiety and persistent insomnia, but evidence is mounting that their use may have adverse outcomes in the elderly, including falls, fall-related fractures and, now, dementia, the authors wrote. “Physicians should carefully assess the expected benefits of the use of benzodiazepines” and limit prescriptions to a few weeks when possible, they said. “In particular, uncontrolled chronic use of benzodiazepines in elderly people should be cautioned against,” they wrote.

Moderate or severe hypoglycemia associated with mortality risk in the ICU

Moderate or severe hypoglycemia in the intensive care unit (ICU) was significantly associated with mortality, according to a recent analysis of the NICE-SUGAR trial.

The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) trial randomly assigned intensive care patients to either intensive (mean blood glucose level, 115 mg/dL) or conventional (mean blood glucose level, 144 mg/dL) blood glucose control, and results were published in 2009. This new analysis provides more detail on the associations between intensive control and hypoglycemia and mortality found in the original study. The analysis appeared in the Sept. 20, 2012, New England Journal of Medicine.

Of the approximately 6,000 patients studied, 45% had moderate hypoglycemia, defined as a blood glucose level of 41 to 70 mg/dL; 82% of these patients were in the intensive control group. Severe hypoglycemia (blood glucose level ≤40 mg/dL) occurred in 3.7% of patients (93% of them in the intensive group). Mortality was higher among the patients with severe hypoglycemia (35.4% mortality rate; adjusted hazard ratio [HR] for death, 2.10) and moderate hypoglycemia (28.5%; HR, 1.41) than in those with no hypoglycemia (23.5%; HR, 1). Patients also had a higher risk of dying if they had moderate hypoglycemia on more than one day or had severe hypoglycemia while not taking insulin.

Intensive glucose control leads to hypoglycemia, which in turn is associated with increased risk of death in critically ill patients, the researchers concluded. They noted the existence of a dose-response relationship but cautioned that the data cannot establish a causal relationship. However, a causal relationship would be “plausible,” according to the researchers, and is consistent with the finding that hypoglycemic patients were significantly more likely to die of distributive shock.

It is also possible that the hypoglycemia was a marker rather than a cause of death, at least in some cases, such as the patients not taking insulin whose hypoglycemia likely resulted from underlying disease processes. Still, the authors concluded that critical care clinicians should work to avoid hyperglycemia and hypoglycemia in their patients, and follow current guidelines for a blood glucose target of 144 to 180 mg/dL. An accompanying editorial noted that the study showed the difficulty of implementing insulin protocols and accurately monitoring glucose. The editorialist expressed hope that new technologies will make this easier in the future.

Thrombocytosis in CAP brings mortality, respiratory risks

Patients with thrombocytosis in the setting of community-acquired pneumonia (CAP) are at higher risk of poor outcome, complicated pleural effusion and empyema than those with a normal platelet count, a study found.

Researchers assessed consecutive patients age 14 years or older who were hospitalized with CAP between January 2000 and October 2006 at two hospitals in Spain. A total of 2,423 subjects met inclusion criteria and had a platelet count available at admission; 53 patients (2%) had thrombocytopenia, 204 (8%) had thrombocytosis, and 2,166 (90%) had a normal platelet count. Thrombocytopenia and thrombocytosis were defined as platelet counts of less than 100,000/mm3 and 400,000/mm3 or greater, respectively. Researchers calculated the Pneumonia Severity Index (PSI) and the Confusion, Respiratory rate and Blood pressure plus age ≥65 years (CRB-65) score at admission. They excluded patients with immunosuppression, neoplasm, active tuberculosis or hematological disease. Results were published online Sept. 10, 2012, by Chest.

Thrombocytosis patients were younger (P<0.001) while those with thrombocytopenia more often had chronic heart and liver disease (P<0.001 for both). Thrombocytosis patients presented more frequently with respiratory complications such as complicated pleural effusion and empyema (P<0.001), while those with thrombocytopenia presented more often with severe sepsis (P<0.001), septic shock (P=0.009), need for invasive mechanical ventilation (P<0.001) and intensive care unit admission (P=0.011).

Patients with thrombocytosis and patients with thrombocytopenia both had longer hospital stays (P=0.004), higher 30-day mortality (P=0.001) and higher readmission rates (P=0.011) than those with normal platelet counts, with no significant differences between the two groups. Multivariate analysis confirmed a significant association between thrombocytosis and 30-day mortality (odds ratio, 2.720; P<0.001). Adding thrombocytosis to the CRB-65 score also improved accuracy in predicting mortality (the area under the receiver-operating characteristic curve increased from 0.634 to 0.654; P=0.049).

The study results indicate that platelet count should be monitored in patients with CAP, the authors wrote. Thrombocytopenic patients should be watched for possible septic complications and hemodynamic alterations, while thrombocytosis patients should be watched for respiratory complications like pleural effusion that may need treatment like drainage, they wrote. Since the study has shown thrombocytosis to be a marker of poor outcome in CAP, it should be considered in the severity evaluation of CAP patients, they concluded.

Wells score, negative D-dimer test can rule out pulmonary embolism in primary care patients

A Wells score of 4 or lower and a negative qualitative D-dimer test result can safely and efficiently exclude pulmonary embolism, a recent study found.

Researchers conducted a prospective cohort study among 300 primary care doctors in the Netherlands from July 2007 to December 2010, providing them with study forms and written instructions on how to use the D-dimer test. Results appeared in the Oct. 9, 2012, BMJ.

The study included 598 patients with a mean age of 48 years, 71% of whom were women. Venous thromboembolism was present in 73 patients (12.2%), with 68 cases of pulmonary embolism diagnosed immediately after referral. Four additional cases of pulmonary embolism and one deep vein thrombosis were found during three months of follow-up.

Overall, 422 patients had a Wells score of 4 or less and 237 had a score of less than 2. Venous thromboembolism was present in 21 (5%; 95% CI, 3.1% to 7.5%) and 7 (3%; 95% CI, 1.2% to 6%) of these patients, respectively. Venous thromboembolism occurred in 52 patients with a Wells score of greater than 4 (29.5%; 95% CI, 22.9% to 36.9%).

In total, 272 of the patients had both a Wells score of 4 or lower and a negative D-dimer test result, and only four of them were diagnosed with pulmonary embolism, a failure rate of 1.5% (95% CI, 0.4% to 3.7%). A failure rate of less than 2% is considered safe by most consensus statements, researchers noted. The combination of Wells score and D-dimer had a sensitivity of 94.5% and a specificity of 51%.
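The performance figures quoted above can be reconstructed from the raw counts in this section. A quick check (counts from the text; variable names are mine):

```python
# Reconstruct the rule-out performance of Wells score <=4 plus negative D-dimer
# from the counts reported in the study.
total_patients = 598
total_vte = 73        # all VTE cases in the cohort
rule_negative = 272   # patients with Wells <=4 AND a negative D-dimer
missed_pe = 4         # PEs among rule-negative patients

failure_rate = missed_pe / rule_negative
sensitivity = (total_vte - missed_pe) / total_vte
true_negatives = rule_negative - missed_pe
specificity = true_negatives / (total_patients - total_vte)

print(round(failure_rate * 100, 1))  # 1.5
print(round(sensitivity * 100, 1))   # 94.5
print(round(specificity * 100, 1))   # 51.0
```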

“Such a rule-out strategy makes it possible for primary care doctors to safely exclude pulmonary embolism in a large proportion of patients suspected of having the condition, thereby reducing the costs and burden to the patient (for example, reducing the risk of contrast nephropathy associated with spiral computed tomography) associated with an unnecessary referral to secondary care,” the researchers concluded.

Fast action, individualized treatment necessary in fungal meningitis outbreak

A research letter and an opinion piece published in Annals of Internal Medicine on Dec. 4, 2012, provided details and guidance on caring for patients in the recent outbreak of fungal meningitis due to contaminated methylprednisolone.

The research letter detailed the case of one of the index patients in the outbreak, a 51-year-old woman who sought care in the emergency department for a headache after receiving a steroid injection in her neck. The patient had not received previous injections, was not immunocompromised, and was not taking any long-term medications. She was discharged after a normal physical exam and head CT but returned the next day with neurological symptoms and was hospitalized. Her condition deteriorated rapidly, and she died on hospital day 10.

Exserohilum species was found in the patient's cerebrospinal fluid, and autopsy revealed severe damage to her brain and spinal cord. The authors stressed that this case demonstrates the aggressive, invasive nature of Exserohilum species as well as its short incubation time. Rapid recognition and treatment are necessary, they said, to limit morbidity and mortality.

The opinion piece was written by a physician who treated patients during another fungal meningitis outbreak, involving Exophiala dermatitidis and also due to contaminated methylprednisolone, in 2002. Voriconazole, which was used successfully in the 2002 outbreak, appears to be the logical drug of choice in the current outbreak as well, but the author stressed that exact dosing, outcomes, and drug level monitoring have not been determined and must be based on expert opinion.

“Individual physicians cannot wait for definitive answers and must act decisively at an early stage of infection,” he wrote. Appropriate duration of therapy, use of empirical voriconazole and screening methods are also unknown and management will need to be individualized according to each patient's circumstances, he said.

The author reminded readers about the “importance of sterility and the powerful disease-producing interactions between corticosteroids and fungi” and stressed that regulation of pharmacy compounding at state and national levels will need to be revisited.

Less chloride in IVs lowered rate of creatinine increase, dialysis

Reducing use of chloride-rich intravenous (IV) fluids reduced creatinine levels and risk of acute kidney injury in a recent study of critically ill patients.

The prospective, open-label study included about 1,500 intensive care unit (ICU) patients admitted to one Australian hospital. Half of the patients were treated during a six-month period in which the ICU provided standard IV fluids. The second half (admitted over six months the following year) were treated with a chloride-restrictive strategy. Under that strategy, an attending specialist had to approve any use of chloride-rich IV fluids (0.9% saline, 4% succinylated gelatin solution or 4% albumin solution). Otherwise, patients received a lactate solution (Hartmann solution), a balanced solution (Plasma-Lyte 148) and a chloride-poor 20% albumin solution. Results appeared in the Oct. 17, 2012, Journal of the American Medical Association.

Comparing the two periods, researchers found a smaller mean increase in serum creatinine from baseline to peak level in the chloride-restricted patients (22.6 µmol/L with standard fluids vs. 14.8 µmol/L with chloride restriction; P=0.03). Those patients also had less acute kidney injury, as measured by the RIFLE classification (14% vs. 8.4%; P<0.001), and less need for renal replacement therapy (10% vs. 6.3%; P=0.005). Both of these differences remained significant after adjustment for other factors. Study authors concluded that a chloride-restrictive strategy was associated with a significant decrease in incidence of acute kidney injury and use of renal replacement therapy.

The findings are in keeping with earlier observational evidence, the authors noted. There are multiple possible mechanisms for this result, and the study was not designed to identify a specific one. As a bundle-of-care study, the trial also couldn't definitively show which part of the intervention was responsible for the results. In addition to the chloride restriction, using a balanced solution, removing the gelatin solution, or changing the albumin solutions could have affected the outcomes.

The study was limited by its unblinded design, and restricting chloride could pose risks for patients with hyponatremia, alkalemia, cerebral edema or traumatic brain injury, the authors noted. But the current findings, as well as the higher cost of chloride-rich IV solutions, argue for prudence in the use of these fluids in the ICU, especially in patients at risk of renal dysfunction, they concluded.

ACIP: pneumococcal vaccine schedule outlined for high-risk adults

The Advisory Committee on Immunization Practices (ACIP) outlined recommendations and dosing regimens for the use of 13-valent pneumococcal conjugate vaccine, the Centers for Disease Control and Prevention reported.

ACIP issued the recommendation on June 20, 2012, and gave it a Category A rating, according to the report published in the Oct. 12, 2012, MMWR.

There should be routine use of 13-valent pneumococcal conjugate vaccine (PCV13; Prevnar 13, Wyeth Pharmaceuticals, Inc.) for those age 19 years or older with immunocompromising conditions, functional or anatomic asplenia, cerebrospinal fluid leaks or cochlear implants. PCV13 should be administered to eligible adults in addition to the 23-valent pneumococcal polysaccharide vaccine (PPSV23; Pneumovax 23, Merck & Co.), which is also recommended for this patient population.

This population should receive a dose of PCV13 first, followed by a dose of PPSV23 at least eight weeks later. Subsequent doses of PPSV23 should follow current PPSV23 recommendations for adults at high risk.

Additionally, those who received PPSV23 before age 65 years for any indication should receive another dose of the vaccine at age 65 years or later, once at least five years have elapsed since their previous PPSV23 dose.

Those who previously received at least one dose of PPSV23 should be given a PCV13 dose a year or later after the previous PPSV23 dose. For those who require additional doses of PPSV23, the first such dose should be given no sooner than eight weeks after PCV13 and at least five years after the most recent dose of PPSV23.

All adults are eligible for a dose of PPSV23 at age 65 years, regardless of previous PPSV23 vaccination; however, a minimum interval of five years between PPSV23 doses should be maintained.
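The dosing intervals above amount to simple date arithmetic. As a rough illustrative sketch only (not clinical software; the function name is mine, and five years is approximated as 5 × 365 days rather than exact calendar years), the earliest permissible date for a subsequent PPSV23 dose in a high-risk adult who has already received both vaccines could be computed like this:

```python
from datetime import date, timedelta

def earliest_next_ppsv23(pcv13_date, last_ppsv23_date):
    """Earliest date for the next PPSV23 dose per the intervals above:
    at least 8 weeks after PCV13 and at least 5 years after the most
    recent PPSV23 dose (5 years approximated as 5 * 365 days)."""
    return max(pcv13_date + timedelta(weeks=8),
               last_ppsv23_date + timedelta(days=5 * 365))

# Hypothetical example: PPSV23 given March 1, 2010; PCV13 given June 1, 2013.
# The five-year interval since the last PPSV23 dose is the binding constraint.
print(earliest_next_ppsv23(date(2013, 6, 1), date(2010, 3, 1)))
```

When the PCV13 dose is the more recent event, the eight-week interval becomes the binding constraint instead.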