Clostridium difficile infection (CDI) is an emerging healthcare problem worldwide. The purpose of this study was to perform systematic epidemiological research on CDI in Tongji Hospital, located in central China.
Hospitalized adults suspected of CDI were enrolled, and their stool samples were collected. The diagnosis of CDI was based on a combination of clinical symptoms and laboratory results. Clinical features of CDI and non-CDI patients were compared by appropriate statistical tests to determine the risk factors for CDI. Multilocus sequence typing (MLST) was employed for molecular epidemiological analysis. Antimicrobial susceptibility testing was performed, and relevant resistance genes were screened as well.
From June 2016 to September 2017, 839 hospitalized adults were enrolled. Among them, 107 (12.8%, 107/839) patients were C. difficile culture positive, and 73 (8.7%, 73/839) were infected with toxigenic C. difficile (TCD), with tcdA+tcdB+ strains accounting for 90.4% (66/73) and tcdA-tcdB+ strains for 9.6% (7/73). Two TCD strains were binary toxin positive, and one of them was ultimately identified as CD027; severe symptoms were observed in both cases. Multivariate analysis indicated that antibiotic exposure (p = 0.001, OR = 5.035) and kidney disease (p = 0.015, OR = 8.329) significantly increased the risk of CDI. Phylogenetic tree analysis demonstrated 21 different STs, including one new ST (ST467); the dominant type was ST54 (35.6%, 26/73). Multidrug-resistant (MDR) TCD strains accounted for 53.4% (39/73); resistance rates to ciprofloxacin, erythromycin, and clindamycin were each above 50%. Other antibiotics remained relatively effective, and all strains were susceptible to metronidazole and vancomycin. All moxifloxacin-resistant isolates carried a mutation in GyrA (Thr82 → Ile), and one additionally carried a mutation in GyrB (Ser366 → Ala).
Knowledge of the epidemiology of CDI in China is limited. Our findings indicated that tcdA+tcdB+ C. difficile strains were dominant among CDI cases in our hospital. The significant risk factors for CDI in our setting appeared to be antibiotic exposure and kidney disease. Metronidazole and vancomycin remained effective for CDI. Although no outbreak was observed, the first isolation of CD027 in central China implies the potential spread of this hypervirulent clone. Further studies are needed to enhance our understanding of the epidemiology of CDI in China.
Clostridium difficile is a gram-positive bacterium notorious for causing epidemic diarrhea globally with a significant health burden. The pathogen is clinically challenging with increasing antibiotic resistance and recurrence rate. We provide here an in-depth review of one particular strain/ribotype 027, commonly known as NAP1/B1/027 or North American pulsed-field gel electrophoresis type 1, restriction endonuclease analysis type B1, polymerase chain reaction ribotype 027, which has shown a much higher recurrence rate than other strains.
Clostridium difficile (C. diff) is a gram-positive, anaerobic, motile, spore-forming, rod-shaped bacterium [1-2]. It has been isolated from almost all mammals, including pigs, cows, horses, elephants, and Kodiak bears, as well as from poultry and ostriches. It has also been found in soil and in the feces of humans and animals. It is transmitted from person to person by the fecal-oral route. The C. diff isolates found in animals are similar to those found in humans, but according to Hensgens et al., this similarity does not mean that interspecies transmission occurs. However, immunocompromised people remain at risk for interspecies transmission. Its pathogenicity depends on the two toxins it produces: enterotoxin A (Toxin A or TcdA) and cytotoxin B (Toxin B or TcdB). The enterotoxin damages actin in target cells, which leads to neutrophil infiltration, inflammation, and necrosis of epithelial cells. Cytotoxin B has been shown to damage the tight junctions of epithelial cells, which increases vascular permeability and causes hemorrhage [2-3]. These toxins form the basis of stool analysis when diagnosing patients with suspected infection. Despite all the virulence factors described, C. diff is a poor competitor against other gut flora in the human colon. In a healthy colon, the pathogen is not present in sufficient quantity to produce clinically significant disease. Risk factors that disrupt this balance include antibiotic exposure, the healthcare environment, acid suppressants, and an elemental diet. The bacterium can cause severe watery diarrhea that can progress to pseudomembranous colitis [3-8]. It has been named one of the three microorganisms with an ‘urgent’ threat level by the Centers for Disease Control and Prevention (CDC) based on its public health impact in the United States (US), with an estimated $1.5 billion in annual healthcare expenditures.
Patients who have three or more episodes of unexplained, new-onset unformed stools in 24 hours should be referred for Clostridium difficile infection (CDI) testing. Patients with the risk factors described previously should also undergo testing for this pathogen. The ribotype 027 strain of C. diff is particularly noteworthy, as the literature contains contradicting evidence regarding the severity of disease it causes. We provide here a brief overview of the epidemiology, pathophysiology, and treatment of this particular strain.
Clostridium difficile can be characterized by ribotyping, which is performed using the polymerase chain reaction. Several different ribotypes have been associated with CDI. Ribotypes 001, 002, 014, 046, 078, 126, and 140 have been found to be prevalent in the Middle East [10-12]. In Asia, ribotypes 001, 002, 014, 017, and 018 are more prevalent [13-15]. The predominant strains in Europe and North America include ribotypes 001, 014, 020, 027, and 078. Ribotype 027 (also referred to as NAP1/B1/027) has emerged in the last decade, and studies have underlined antimicrobial resistance as one of the causes of its epidemic outbreaks. Capillary electrophoresis (CE) ribotyping is used as the standard method for characterizing C. diff isolates; it relies on the variability of the intergenic region between the 16S and 23S ribosomal deoxyribonucleic acid (DNA) genes. Ribotype 027 has been found to have reduced susceptibility to metronidazole, rifampicin, moxifloxacin, clindamycin, imipenem, and chloramphenicol [17-18]. It is clinically and financially concerning, as it leads to severe disease presentations and antimicrobial resistance, with high morbidity and mortality rates compared to other strains. Strains such as ribotype 027 (especially its spores) spread more easily within the hospital because they can withstand the hospital environment, cleaning, and disinfectants. An observational study of patients admitted with diarrhea to a Veterans Affairs Medical Center showed that around 22% of the patients who tested positive for CDI were positive for the NAP1/B1/027 strain. Further, a reduction in the rate of diarrhea caused by the NAP1/B1/027 strain was observed, with a prevalence of 16.9% in 2016, down from 26.2% in 2013; an increase in awareness and education was thought to be the reason for this decline. The prevalence of this strain in North America is reportedly around 22%–36%.
Ribotype 027 was identified as the most prevalent strain causing CDI in recent outbreaks in North America [20-22]. Its prevalence was shown to be 48% in hospitals in Poland during an outbreak of CDI between September 2011 and August 2013.
Toxigenicity and Pathogenesis
The North American pulsed-field gel electrophoresis type 1, restriction endonuclease analysis type B1, polymerase chain reaction ribotype 027 (NAP1/B1/027) strain has been shown to contain a gene locus, CdtLoc, that encodes CD196 ADP-ribosyltransferase (CDT), or binary toxin. The bacterium also produces Toxin A and Toxin B, similar to non-027 ribotypes, through the PaLoc gene locus [23-24]. CDT was first isolated by Popoff et al. The toxin comprises two separate components: CDTa and CDTb. CDTa, an ADP-ribosyltransferase enzyme, modifies actin, resulting in depolymerization and destruction of the actin cytoskeleton in the gut. CDTb binds to gut cells and increases uptake of CDTa. The destruction caused by CDT favors adherence of bacteria and increased uptake of Toxin A and Toxin B.
In addition to these toxins, this strain (along with a few others) carries a single base-pair deletion at nucleotide 117 of the tcdC gene, which encodes a negative regulator of Toxins A and B. The resulting frameshift causes hyperexpression of toxins by this particular strain. Warny et al. showed that NAP1/B1/027 produces approximately 16 times more Toxin A and approximately 23 times more Toxin B than control strains. One study also proposed that increased sporulation by this strain may be associated with the increased spread of CDI. The virulence factors associated with the NAP1/B1/027 strain are summarized in Table 1.
Previous studies have shown contradicting evidence regarding the severity of disease caused by this particular strain. A recent retrospective analysis by Bauer et al. concluded that NAP1/B1/027 was associated with decreased odds of severe disease (odds ratio (OR): 0.35, 95% confidence interval (CI) 0.13–0.93) and did not increase in-hospital mortality (OR: 1.02, 95% CI 0.53–1.96) or the recurrence rate (OR: 1.16, 95% CI 0.36–3.77). Several other studies (including cross-sectional, case-control, and cohort studies) did not show any worse outcomes compared to other strains [29-31]. Sirard et al. demonstrated that although the NAP1/B1/027 strain may produce more toxins than other strains, it produced fewer spores and was not always associated with severe disease. On the contrary, Rao et al. conducted a cohort study and concluded that ribotype 027 was associated with severe CDI (OR: 1.73, 95% CI 1.03–2.89; p = 0.037) and increased mortality (OR: 2.02, 95% CI 1.19–3.43; p = 0.009) compared to other ribotypes. Another study showed similar results with the North American pulsed-field gel electrophoresis type 1 (NAP1) strain: multivariate regression analysis showed an increase in the severity of CDI with the NAP1 strain (OR: 1.66, 95% CI: 1.09–2.54) and increased mortality (OR: 2.12, 95% CI: 1.22–3.68). One study from Quebec found this strain responsible for severe disease twice as frequently as other strains.
The basis for these contradictory findings can be explained by several reasons, including study design, study population, sample size, the method of detection for C. diff, study setting, and unmeasured confounders. Given these contradictory results, healthcare providers should focus on treating this infection based on their clinical judgment and markers of severe infection, including the number of diarrheal episodes, signs of dehydration, creatinine level, albumin level, white blood cell count, associated co-morbidities, immunocompromised state, etc.
Preventive strategies employed for the NAP1/B1/027 strain are similar to those for other strains. These include barrier methods (gloves and gown while examining the patient), use of disposable equipment, handwashing with soap and water, disinfecting the environment, and antimicrobial stewardship. Furthermore, vaccines targeting the toxins, including TcdA and TcdB, are being developed for simultaneous prevention and treatment of CDI. Actoxumab and bezlotoxumab, monoclonal antibodies against TcdA and TcdB respectively, are being investigated for this purpose. A combined Phase III trial (MODIFY I (NCT01241552) and MODIFY II (NCT01513239)) showed benefit from bezlotoxumab, but the combination of actoxumab and bezlotoxumab did not yield any further benefit. Bezlotoxumab received Food and Drug Administration (FDA) approval in October 2016 for use in patients more than 18 years of age who are at high risk of recurrence of CDI and are receiving antibiotics. A novel tetravalent vaccine against TcdA, TcdB, CDTa, and CDTb, proposed by Secore et al., has shown promising results in a hamster model.
A novel drug, SYN-004 (ribaxamase), which has shown promising results for preventing CDI, is under investigation. This drug, a β-lactamase, is released into the gut, where it degrades excess antibiotic; by preventing disruption of the normal gut flora, it ultimately prevents CDI. A Phase IIa clinical trial showed that ribaxamase at a dose of 150 mg every six hours resulted in undetectable concentrations of ceftriaxone in the intestine, which could decrease the likelihood of C. diff infection given the lower probability of gut flora disruption.
Resistance to Antibiotics and Treatment
Cases of NAP1/B1/027 reported in Panama were found to be highly resistant to clindamycin, moxifloxacin, levofloxacin, ciprofloxacin, and rifampin but susceptible to metronidazole and vancomycin. The susceptibility of ribotype 027 and non-027 ribotypes to different antibiotics was tested in a Canadian study. Ribotype 027 showed 92.2% resistance to moxifloxacin compared to 11.2% for other strains. Similarly, 78.2% of ribotype 027 strains were resistant to ceftriaxone compared to 15.7% of other strains. Ribotype 027 demonstrated a four-fold higher minimum inhibitory concentration (MIC) for metronidazole (4 vs. 1 μg/mL) and a two-fold higher MIC for fidaxomicin (2 vs. 1 μg/mL). For clindamycin and vancomycin, resistance was similar in both groups.
Resistance to erythromycin is linked to mutations in ribosomal methylase genes, whereas resistance to fluoroquinolones is due to mutations in DNA gyrase. Resistance to rifamycins and fidaxomicin is attributed to mutations in ribonucleic acid (RNA) polymerase. The presence of phenicol and lincosamide resistance genes has been shown to cause resistance to linezolid. A study conducted in hospitals in Mexico found some isolates of ribotype 027 with reduced susceptibility to fidaxomicin despite the drug being unavailable in Mexico and the patients never having been exposed to it. Antibiotics form the basis of treatment for the NAP1/B1/027 strain. Currently, no specific Infectious Diseases Society of America (IDSA) guidelines are available for this particular strain, and hence treatment is the same as for non-NAP1/B1/027 strains. Based on the current guidelines for treating CDI overall, we propose the following table for treating infection caused by the NAP1/B1/027 strain (Table 2).
This strain has generally not shown resistance to fidaxomicin, although some contradicting evidence exists. A case report published in 2017 described a NAP1 C. diff infection, refractory to treatment with fidaxomicin and fecal transplants, that was effectively treated with intravenous immunoglobulin (IVIG). Given the emerging threat of antibiotic resistance, increasing awareness, controlling infections, and antimicrobial stewardship can be effective measures to reduce this threat.
Currently, several novel antibiotics for CDI treatment are under investigation in randomized controlled trials. Ridinilazole and cadazolid have completed Phase II trials, while surotomycin has completed two Phase III trials; results have been promising [44-47].
The data regarding the NAP1/B1/027 strain are inconclusive, with ongoing debate over whether this particular strain is associated with severe disease. Further research, including meta-analyses, is needed to resolve this question. Clinicians should guide treatment based on their judgment and objective evidence of disease severity.
Gaining a better understanding of sources and risk factors for C. diff can help reverse colonization and transmission or prevent it altogether, authors of a new paper suggest.
“This is a review/commentary article that provides a high-level overview of the literature dealing with C. diff colonization and the microbiome changes associated with C. diff colonization,” author Silvia Munoz-Price, MD, PhD, from the Medical College of Wisconsin in Milwaukee told our sister publication MD Magazine.
After reviewing the literature, the authors of the study postulated that, when it comes to potential C. diff colonization, exposure to and transmission of the bacterium occur outside of hospitals. In fact, it appeared that most patients were already colonized and became symptomatic during their hospital stay, rather than acquiring the organism while hospitalized.
For example, the investigators cited one study from Canada that had been conducted from 2006 to 2007 where more than 4000 patients were screened for C. diff colonization upon hospitalization, during their stay (on a weekly basis) and at discharge. They found that 4% of the patients were colonized upon hospitalization and 3% acquired C. diff during their stay in the hospital.
The authors also found evidence indicating that community-acquired C. diff appears to be on the rise. They discuss a decade-long study in Minnesota where community-acquired C. diff infection rates rose from 2.8 to 14.9 per 100,000 person-years within the 10-year span. Patients in that study who acquired C. diff in the community were more likely to be younger, female, and healthier than patients with hospital-acquired C. diff. The reviewers also noted that rates of community-acquired C. diff have been rising in Finland, Australia, and England, according to published studies.
Most of the common risk factors for community-acquired C. diff infection still applied, the researchers found, including antibiotic exposure, household contacts, and animals. A 2013 study showed that two-thirds of community-acquired C. difficile patients had been exposed to antibiotics in the 12 weeks preceding their infection, and about one-third had been exposed to proton pump inhibitors.
While studies examining transmissibility within households are difficult to come by, the study authors found one review from Quebec comprising 2222 cases of C. diff diagnosed between 1998 and 2009, in which 8 cases were attributed to transmission by household contacts. However, the researchers noted, confirmation by strain typing was not performed in that study.
Looking at farm livestock, a 2013 Dutch study showed that individuals with daily contact with pigs showed rates of C. diff positivity of 25%; in those with weekly contact, it was 14%. In the same study, C. diff was found in the manure from all the farms in 10% to 80% of the samples per farm. The reviewers also said that C. diff has been found in the stool of farm chickens, calves, and retail ground meat. Dogs and cats are also known to culture positive for C. diff, and the researchers wrote that the bacteria can also be present in vegetables and water (tap water, swimming pools, as well as rivers, lakes, and seas). They hypothesized that the presence of C. diff in vegetables may come from the use of organic fertilizer.
“We envision that in the future we should be able to take advantage of our increasing knowledge about microbiome changes so that we will be able to: identify patients at risk for de novo C. difficile colonization during their hospitalization and manipulate our patients’ microbiome to prevent or reverse C. difficile colonization,” Dr. Munoz-Price said.
“Different from what we do now, the latter would be accomplished not by withholding or changing antibiotics but by correcting the deficient flora of a patient in an individualized fashion. This new approach would revolutionize the field of Infection Control and Antibiotic Stewardship,” she concluded.
ADULT AND PEDIATRIC
| Clinical Definition | Supportive Clinical Data | Recommended Treatment^a | Strength of Recommendation/Quality of Evidence |
|---|---|---|---|
| Initial episode, non-severe | Leukocytosis with a white blood cell count of ≤15000 cells/mL and a serum creatinine level <1.5 mg/dL | VAN 125 mg given 4 times daily for 10 days, OR | Strong/High |
| | | FDX 200 mg given twice daily for 10 days | Strong/High |
| | | Alternate if above agents are unavailable: metronidazole, 500 mg 3 times per day by mouth for 10 days | Weak/High |
| Initial episode, severe^b | Leukocytosis with a white blood cell count of ≥15000 cells/mL or a serum creatinine level >1.5 mg/dL | VAN, 125 mg 4 times per day by mouth for 10 days, OR | Strong/High |
| | | FDX 200 mg given twice daily for 10 days | Strong/High |
| Initial episode, fulminant | Hypotension or shock, ileus, megacolon | VAN, 500 mg 4 times per day by mouth or by nasogastric tube. If ileus, consider adding rectal instillation of VAN. Intravenously administered metronidazole (500 mg every 8 hours) should be administered together with oral or rectal VAN, particularly if ileus is present. | Strong/Moderate (oral VAN); Weak/Low (rectal VAN); Strong/Moderate (intravenous metronidazole) |
| First recurrence | … | VAN 125 mg given 4 times daily for 10 days if metronidazole was used for the initial episode, OR | Weak/Low |
| | | A prolonged tapered and pulsed VAN regimen if a standard regimen was used for the initial episode (eg, 125 mg 4 times per day for 10–14 days, 2 times per day for a week, once per day for a week, and then every 2 or 3 days for 2–8 weeks), OR | Weak/Low |
| | | FDX 200 mg given twice daily for 10 days if VAN was used for the initial episode | Weak/Moderate |
| Second or subsequent recurrence | … | VAN in a tapered and pulsed regimen, OR | Weak/Low |
| | | VAN, 125 mg 4 times per day by mouth for 10 days followed by rifaximin 400 mg 3 times daily for 20 days, OR | Weak/Low |
| | | FDX 200 mg given twice daily for 10 days, OR | Weak/Low |
| | | Fecal microbiota transplantation^c | Strong/Moderate |
Abbreviations: FDX, fidaxomicin; VAN, vancomycin.
^a All randomized trials have compared 10-day treatment courses, but some patients (particularly those treated with metronidazole) may have a delayed response to treatment, and clinicians should consider extending the treatment duration to 14 days in those circumstances.
^b The criteria proposed for defining severe or fulminant Clostridium difficile infection (CDI) are based on expert opinion. These may need to be reviewed in the future upon publication of prospectively validated severity scores for patients with CDI.
^c The opinion of the panel is that appropriate antibiotic treatment for at least 2 recurrences (ie, 3 CDI episodes) should be tried prior to offering fecal microbiota transplantation.
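The tapered and pulsed vancomycin option in the table above follows a descending schedule (4 times daily, then twice daily, then once daily, then pulsed every 2 or 3 days). As an illustration only, the sketch below expands one example taper into a day-by-day dose count; the function name and the specific durations chosen within the table's stated ranges are assumptions, not part of the guideline.

```python
def taper_schedule(taper_days=14, pulse_weeks=4, pulse_interval=3):
    """Expand one example tapered-and-pulsed vancomycin 125 mg regimen
    into a list of (day, doses_per_day) pairs.

    Illustrative assumption: `taper_days` days 4 times daily, 7 days
    twice daily, 7 days once daily, then one dose every
    `pulse_interval` days for `pulse_weeks` weeks.
    """
    schedule = []
    day = 1
    for _ in range(taper_days):          # qid phase
        schedule.append((day, 4)); day += 1
    for _ in range(7):                   # bid phase, one week
        schedule.append((day, 2)); day += 1
    for _ in range(7):                   # once-daily phase, one week
        schedule.append((day, 1)); day += 1
    pulse_start = taper_days + 14        # first pulse day offset base
    for _ in range(pulse_weeks * 7):     # pulsed phase
        dose = 1 if (day - pulse_start) % pulse_interval == 1 else 0
        schedule.append((day, dose)); day += 1
    return schedule
```

With the defaults, the schedule covers 56 days and the pulsed phase delivers one 125 mg dose every third day.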
| Clinical Definition | Recommended Treatment | Pediatric Dose | Maximum Dose | Strength of Recommendation/Quality of Evidence |
|---|---|---|---|---|
| Initial episode, non-severe | Metronidazole × 10 days (PO), OR | 7.5 mg/kg/dose tid or qid | 500 mg tid or qid | |
| | Vancomycin × 10 days (PO) | 10 mg/kg/dose qid | 125 mg qid | |
| Initial episode, severe/fulminant | Vancomycin × 10 days (PO or PR), with or without metronidazole × 10 days (IV)^a | Vancomycin: 10 mg/kg/dose qid; metronidazole: 10 mg/kg/dose tid | Vancomycin: 500 mg qid; metronidazole: 500 mg tid | |
| First recurrence, non-severe | Metronidazole × 10 days (PO), OR | 7.5 mg/kg/dose tid or qid | 500 mg tid or qid | |
| | Vancomycin × 10 days (PO) | 10 mg/kg/dose qid | 125 mg qid | |
| Second or subsequent recurrence | Vancomycin in a tapered and pulsed regimen^b, OR | 10 mg/kg/dose qid | 125 mg qid | |
| | Vancomycin for 10 days followed by rifaximin^c for 20 days, OR | Vancomycin: 10 mg/kg/dose qid; rifaximin: no pediatric dosing | Vancomycin: 500 mg qid; rifaximin: 400 mg tid | |
| | Fecal microbiota transplantation | | | |
Abbreviations: IV, intravenous; PO, oral; PR, rectal; qid, 4 times daily; tid, 3 times daily.
^a In cases of severe or fulminant Clostridium difficile infection associated with critical illness, consider addition of intravenous metronidazole to oral vancomycin.
^b Tapered and pulsed regimen: vancomycin 10 mg/kg with max of 125 mg 4 times per day for 10–14 days, then 10 mg/kg with max of 125 mg 2 times per day for a week, then 10 mg/kg with max of 125 mg once per day for a week, and then 10 mg/kg with max of 125 mg every 2 or 3 days for 2–8 weeks.
^c No pediatric dosing for rifaximin; not approved by the US Food and Drug Administration for use in children <12 years of age.
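The pediatric rows above pair a weight-based dose with an adult maximum; the administered per-dose amount is the smaller of the two. A minimal sketch of that calculation (the helper name is hypothetical; this is illustrative arithmetic, not clinical software):

```python
def pediatric_dose(weight_kg, mg_per_kg, max_mg):
    """Weight-based per-dose amount in mg, capped at the adult maximum."""
    return min(weight_kg * mg_per_kg, max_mg)

# Vancomycin, non-severe initial episode: 10 mg/kg/dose, max 125 mg
print(pediatric_dose(8, 10, 125))   # 8 kg child: 80 mg per dose
print(pediatric_dose(20, 10, 125))  # 20 kg child: capped at 125 mg
```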
Since publication of the 2010 Infectious Diseases Society of America (IDSA)/Society for Healthcare Epidemiology of America (SHEA) Clostridium difficile infection (CDI) clinical practice guideline, interest in the epidemiology, prevention, diagnosis, and treatment of CDI has continued to expand. This reflects the ongoing magnitude of these infections, which affect all aspects of healthcare delivery and reach into the community. Also new since the previous guideline, the quality of evidence and strength of recommendations were evaluated using GRADE methodology [1–4]. While there is evidence that CDI rates have declined remarkably in England and other parts of Europe since their peak before 2010, rates have plateaued at historic highs in the United States since about 2010. Recent estimates suggest the US burden of CDI is close to 500000 infections annually, although the exact magnitude of the burden is highly dependent upon the type of diagnostic tests used. Depending upon the degree and method of attribution, CDI is associated with 15000–30000 US deaths [6, 7], and excess acute care inpatient costs alone exceed $4.8 billion. Because of this US burden of CDI, national efforts to control and prevent CDI have been put in place, including incentives for public reporting of hospital rates and hospital “pay for performance”. It is in this context of CDI remaining a major public health problem, undermining both patient safety and the efficiency and value of healthcare delivery, that the 2010 recommendations are now revised and updated. There are no updates to the clinical definition of CDI or its clinical manifestations; the reader is referred to the 2010 guideline for the definition, background information, and clinical manifestations of CDI.
Since completion of this guideline, a new therapeutic agent and a molecular diagnostic test platform have become available for CDI. Bezlotoxumab, a monoclonal antibody directed against toxin B produced by C. difficile, has been approved as adjunctive therapy for patients who are receiving antibiotic treatment for CDI and who are at high risk for recurrence . Multiplex polymerase chain reaction (PCR) platforms that detect C. difficile as part of a panel of >20 different enteric pathogens have also become available . These most recent innovations and other innovations that may become available in the near future will be covered in subsequent guideline updates.
“Clinical practice guidelines are statements that include recommendations intended to optimize patient care that are informed by a systematic review of evidence and an assessment of the benefits and harms of alternative care options” .
A panel of 14 multidisciplinary experts in the epidemiology, diagnosis, infection control, and clinical management of adult and pediatric patients with CDI was convened to develop these practice guidelines. A systematic evidence-based approach was adopted for the guideline questions and population, intervention, comparator, outcome (PICO) formulations, the selection of patient-important outcomes, as well as the literature searches and screening of the uncovered citations and articles. The rating of the quality of evidence and strength of recommendation was supported by a Grading of Recommendations Assessment, Development, and Evaluation (GRADE) methodologist. In addition to members of both IDSA and SHEA, representatives from the American Society for Health-Systems Pharmacists (ASHP), the Society of Infectious Diseases Pharmacists (SIDP), and the Pediatric Infectious Diseases Society (PIDS) were included.
For this 2017 guideline update, search strategies, in collaboration with the guideline panel members, were developed and built by independent health sciences librarians from National Jewish Health (Denver, Colorado). Each strategy incorporated medical subject headings and text words for “Clostridium difficile,” limited to human studies or nonindexed citations. In addition, the strategies focused on articles published in English or in any language with available English abstracts. The Ovid platform was used to search 5 electronic evidence databases: Medline, Embase, Cochrane Central Registry of Controlled Trials, Health Technology Assessment, and the Database of Abstracts of Reviews of Effects.
To supplement the electronic search, reviewers also hand-searched relevant journals, conference proceedings, reference lists from manuscripts retained from the electronic searches, and regulatory agency websites for relevant articles. Literature searches were originally implemented on 4 December 2012, updated on 3 March 2014, and further extended to 31 December 2016. The 2010 guideline used a search cutoff of 2009, and thus for this guideline the literature review covered a defined search period of 2009–2016. Separate, nondiscrete evidence libraries were created for adults and pediatrics. The search yielded 14479 citations eligible at the title and abstract phase of screening for the adult literature. As the 2010 guideline did not address pediatrics in any of its searching, a decision was made to reexamine the evidence landscape for pediatric-related studies that could inform the guideline. For this, the period 1977–2016 was searched, yielding 3572 citations eligible at the title and abstract phase. Citations retained at the title and abstract phase of screening were then examined at the full-text phase.
To evaluate the initial search evidence for eligibility, the panel followed a process consistent with other IDSA guidelines. The process for evaluating the evidence was based on the IDSA Handbook on Clinical Practice Guideline Development and involved a systematic weighting of the quality of the evidence and the grade of recommendation using the GRADE system (Figure 1) [1–4].
Each author was asked to review the literature (based on screening of titles and abstracts, full-text examination of manuscripts, and abstraction of relevant variables/data from eligible studies/reports), evaluate the evidence, and determine the strength of the recommendations along with an evidence summary supporting each recommendation. The panel reviewed all recommendations, their strength, and the quality of evidence. For recommendations in the category of good practice statements, which should not be graded, we followed published principles by the GRADE working group on how to identify such recommendations and use appropriate wording. Accordingly, a formal GRADE rating was not pursued for those statements, because it is clear that following them would do greater good than harm, and a formal study addressing the question would not be warranted. Discrepancies were discussed and resolved, and all panel members are in agreement with the final recommendations.
The panel met face-to-face on 3 occasions and conducted numerous monthly subgroup and full panel conference calls to complete the work of the guideline. The panel as a whole reviewed all individual sections. The guideline was reviewed and approved by the IDSA Standards and Practice Guidelines Committee (SPGC) and SHEA Guidelines Committee as well as both organizations’ respective Board of Directors (BOD). The guideline was endorsed by ASHP, SIDP, and PIDS.
All members of the expert panel complied with the IDSA policy on conflicts of interest, which requires disclosure of any financial, intellectual, or other interest that might be construed as constituting an actual, potential, or apparent conflict. To provide thorough transparency, IDSA requires full disclosure of all relationships, regardless of relevancy to the guideline topic . Evaluation of such relationships as potential conflicts of interest (COI) is determined by a review process that includes assessment by the SPGC chair, the SPGC liaison to the development panel, and the BOD liaison to the SPGC, and, if necessary, the COI Task Force of the Board. This assessment of disclosed relationships for possible COI is based on the relative weight of the financial relationship (ie, monetary amount) and the relevance of the relationship (ie, the degree to which an association might reasonably be interpreted by an independent observer as related to the topic or recommendation of consideration). See Acknowledgments section for disclosures reported to IDSA.
At annual intervals and more frequently if needed, IDSA and SHEA will determine the need for revisions to the guideline on the basis of an examination of the current literature and the likelihood that any new data will have an impact on the recommendations. If necessary, the entire expert panel will be reconvened to discuss potential changes. Any revision to the guideline will be submitted for review and approval to the appropriate Committees and Boards of IDSA and SHEA.
A recommended case definition for surveillance requires (1) the presence of diarrhea or evidence of megacolon or severe ileus and (2) either a positive laboratory diagnostic test result or evidence of pseudomembranes demonstrated by endoscopy or histopathology. An incident case is defined as a new primary episode of symptom onset (ie, no episode of symptom onset with positive result within the previous 8 weeks) and positive assay result (eg, toxin enzyme immunoassay [EIA] or nucleic acid amplification test [NAAT]). A recurrent case is defined as an episode of symptom onset and positive assay result following an episode with positive assay result in the previous 2–8 weeks. The minimum surveillance that should be performed by all healthcare facilities is tracking of healthcare facility–onset (HO) cases, which will allow for detection of elevated rates or an outbreak within the facility. HO-CDI cases are defined by the Centers for Disease Control and Prevention (CDC)’s National Healthcare Safety Network (NHSN) as Laboratory-Identified (LabID) Events collected >3 days after admission to the facility (ie, on or after day 4). Facilities may also monitor cases of CDI occurring within 28 days after discharge from a healthcare facility, which are considered community-onset, healthcare facility-associated (CO-HCFA) cases (ie, postdischarge cases).
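The incident/recurrent windows and the NHSN day-4 rule above are mechanical enough to express as code. A minimal sketch in Python (function names are illustrative, and treating a positive within the previous 2 weeks as part of the same episode is an assumption for the sketch, not NHSN specification):

```python
from datetime import date

def classify_case(onset: date, prior_positive_onsets: list[date]) -> str:
    """Classify a symptomatic episode with a positive assay result:
    recurrent = a positive episode 2-8 weeks (14-56 days) before onset;
    incident = no positive episode within the previous 8 weeks."""
    for prior in prior_positive_onsets:
        delta = (onset - prior).days
        if 14 <= delta <= 56:
            return "recurrent"
        if 0 < delta < 14:
            return "same episode"  # assumption: within 2 weeks, not a new case
    return "incident"

def is_healthcare_facility_onset(admission: date, specimen: date) -> bool:
    """NHSN LabID HO event: specimen collected >3 days after admission
    (ie, on or after hospital day 4, counting the admission day as day 1)."""
    hospital_day = (specimen - admission).days + 1
    return hospital_day >= 4
```

For example, a positive episode 28 days after a prior positive episode falls in the 2–8-week window and classifies as recurrent, while a specimen collected on hospital day 3 is not an HO event.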
Because the risk of CDI increases with the length of stay, the most appropriate denominator for HO-CDI rates is the number of patient-days. If a facility notes an increase in the incidence of CDI from the baseline rate, or if the incidence is higher than in comparable institutions or above national and/or facility reduction goals, surveillance data should be stratified by hospital location or clinical service to identify particular patient populations where infection prevention measures may be targeted. In addition, measures should be considered for tracking severe outcomes, such as colectomy, intensive care unit (ICU) admission, or death, attributable to CDI.
In the United States, CDI surveillance in healthcare facilities is conducted via the CDC’s NHSN Multidrug-Resistant Organism and C. difficile Infection Module LabID Event Reporting. To allow for risk-adjusted reporting of healthcare-associated infections (HAIs), CDC calculates the standardized infection ratio (SIR) by dividing the number of observed events by the number of predicted events. The number of predicted events is calculated using LabID probabilities estimated from models constructed from NHSN data during a baseline time period, which represents a standard population. These have been recently updated using a 2015 baseline period with specific models developed for each of 4 facility types: acute care hospitals, long-term acute care hospitals, critical access hospitals (rural hospitals with ≤25 acute care inpatient beds), and inpatient rehabilitation facilities. Use of more sensitive tests (eg, NAATs) for C. difficile has been demonstrated to result in substantial increases in reported CDI incidence rates compared with those derived from toxin detection by enzyme immunoassay [18, 19]. Consistent with this, the impact of test type on facilities’ reported rates is an independent predictor in each of the aforementioned NHSN risk adjustment models except that for critical access hospitals. The prevalence of CO cases not associated with the facility (ie, defined in NHSN as present-on-admission with no discharge from the same facility within the previous 4 weeks) is also associated with HO-CDI [20, 21]. This likely reflects colonization pressure in the admitted patient population, and is an independent predictor in each of the NHSN risk adjustment models except for inpatient rehabilitation facilities.
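The two reporting metrics described above, the crude HO-CDI rate per 10000 patient-days and the SIR, are simple ratios. A brief sketch (hypothetical function names, illustrative only; the NHSN prediction models themselves are not reproduced here):

```python
def standardized_infection_ratio(observed: int, predicted: float) -> float:
    """SIR = observed HO-CDI LabID events divided by the number of events
    predicted by the NHSN baseline model; 1.0 means 'as expected'."""
    if predicted <= 0:
        raise ValueError("predicted events must be positive")
    return observed / predicted

def rate_per_10000_patient_days(events: int, patient_days: int) -> float:
    """Crude (unadjusted) HO-CDI rate, with patient-days as denominator."""
    return 10000 * events / patient_days
```

For example, 37 HO-CDI events over 50000 patient-days gives a crude rate of 7.4 per 10000 patient-days, while 46 observed events against 50 predicted events gives an SIR of 0.92, ie below the baseline expectation.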
Despite these attempts to risk-adjust based upon data that hospitals are already reporting to NHSN, there are limitations. For example, adjustment by test type accounts for only the pooled mean impact on rates resulting from differences in sensitivity between major test categories (eg, NAAT, toxin EIA) and does not account for differences in sensitivity between individual test manufacturers, nor potential interaction of C. difficile strain types on relative test sensitivity [22, 23]. Similarly, there are inherent limitations in all surveillance adjusting for the disease risk in the surveyed population. For example, Thompson et al demonstrated how the Medicare Case Mix Index, a summary metric calculated at the hospital level and reflecting clinical complexity and resource consumption of patients within a hospital, could further explain variation across hospital CDI rates over and above the existing model. However, any potential benefit to hospital performance improvement from additional risk adjustment strategies must be balanced against any increased data-reporting burden or impact on timeliness.
Clostridium difficile is the most commonly recognized cause of infectious diarrhea in healthcare settings. Among 711 acute care hospitals in 28 states conducting facility-wide inpatient LabID-CDI event reporting to NHSN in 2010, the pooled rate of HO-CDI was 7.4 (median, 5.4) per 10000 patient-days. As these data were reported prior to development of the SIR, they were unadjusted; at that time, 35% of NHSN hospitals reported using NAATs. Based on data from the CDC’s Emerging Infections Program (EIP) population-based surveillance system in 2011, the estimated number of incident CDI cases in the United States was 453000 (95% confidence interval [CI], 397100–508500), with an incidence of 147.2 (95% CI, 129.1–165.3) cases/100000 persons. The incidence was highest among those aged ≥65 years (627.7) and was greater among females and whites. Of the total estimated 453000 incident cases, 293300 (64.7%) were considered to be healthcare-associated, of which 37% were HO, 36% had their onset in long-term care facilities (LTCFs), and 28% were CO healthcare-associated (ie, specimen collected in an outpatient setting or ≤3 calendar days after hospital admission and documented overnight stay in a healthcare facility in the prior 12 weeks). Of the estimated 159700 community-associated CDI cases (ie, no documented overnight stay in a healthcare facility in the prior 12 weeks), 82% were associated with outpatient healthcare exposure; therefore, the overwhelming majority (94%) of all cases of CDI had a recent healthcare exposure [6, 27].
A multistate prevalence survey of HAIs conducted by EIP in 2011 found that C. difficile was the most common causative pathogen, accounting for 61 of 504 (12.1%) HAIs identified in 183 hospitals. The increasing burden of CDI was also noted in a network of community hospitals in the southeastern United States, where C. difficile surpassed methicillin-resistant Staphylococcus aureus (MRSA) as the most common cause of HAIs.
Recent hospital discharge data indicate that the total number of hospital discharges with a diagnosis of CDI in the United States plateaued at historic highs between 2011 and 2013. During this apparent plateau in hospital discharges, there has been an 8% decline in the risk-adjusted HO-CDI SIR of NHSN.
As most LTCFs do not report CDI data, limited data are available about the burden of CDI in these settings. LTCF residents are often elderly, have numerous comorbid conditions, and have been exposed to antibiotics, which are important risk factors for C. difficile colonization and infection [32, 33]. Data from the CDC EIP and other sources suggest that the burden is high; >20% of all CDIs identified in 2011 had onset in LTCFs. Furthermore, in 2012 there were an estimated 112800 cases of CDI with onset in LTCFs; 57% of these patients were discharged from a hospital within 1 month. Conversely, 20% of HO-CDI cases were found to occur in patients who had been LTCF residents any time in the previous 12 weeks. In a multilevel longitudinal nested case-control study of Veterans Affairs LTCFs, 75% of the variability in LTCF rates could be explained by 2 factors: the importation of active or convalescing cases with hospital-onset CDI in the previous 8 weeks, and LTCF antibiotic use as measured by antibiotic days per 10000 resident-days.
Severity of CDI has been reported to have increased coincident with the increasing incidence during the outbreaks and emergence of the PCR ribotype 027 epidemic strain (also known as the North American pulsed field type 1 [NAP1] or restriction endonuclease analysis pattern “BI”) in the 2000s [36, 37]. Severity of CDI has been variably defined based on laboratory data, physical examination findings, ICU stay, colectomy, and/or mortality. Reported colectomy rates in hospitalized patients with CDI during endemic periods range from 0.3% to 1.3%, whereas during epidemic periods, colectomy rates range from 1.8% to 6.2%. Other indicators of CDI morbidity include recurrent CDI, readmissions to the hospital, and discharge to LTCFs. Overall, 0.8% of patients develop candidemia in the 120 days after CDI, and both more severe CDI and treatment with the combination of vancomycin and metronidazole are associated with increased candidemia risk. After a first diagnosis of CDI, 10%–30% of patients develop at least 1 recurrent CDI episode, and the risk of recurrence increases with each successive recurrence [40, 41]. A national estimate of first CDI recurrences in 2011 was 83000 (95% CI, 57100–108900).
Prior to 2000, the attributable mortality of CDI was low, with death as a direct or indirect result of infection occurring in <2% of cases [42–45]. Since 2000, CDI-attributable mortality has been reported to be higher, both during endemic periods, where mortality ranges from 4.5% to 5.7%, and during epidemic periods, where mortality ranges from 6.9% to 16.7%. However, a recent study in 6 Canadian hospitals evaluating CDI cases in 2006–2007 found an attributable mortality of 1.7%, similar to historic data. Based on 2011 EIP data, the estimated number of deaths within 30 days of the initial diagnosis of CDI in the United States was 29300 (95% CI, 16500 to 42100). After controlling for demographics, underlying severity of illness, and medications during an index hospitalization, recurrent CDI is associated with a 33% increased risk of mortality at 180 days relative to patients who do not suffer a recurrence.
The attributable excess costs of CDI suggest a substantial burden on the healthcare system. Studies adjusting for cost by propensity score matching have found that the CDI-attributable cost for acute care hospitals is $3427–$9960 per episode (adjusted for 2012 US dollars). Extrapolating these estimates to the nation using 2012 Healthcare Cost Utilization Project data, the total annual US acute care cost attributable to CDI is estimated to be $1.2–$5.9 billion.
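The national extrapolation is simply a per-episode attributable cost multiplied by an annual case count. A sketch with illustrative inputs (the published $1.2–$5.9 billion range was derived from 2012 HCUP discharge counts, which are not reproduced here; the 450000-case count below is a placeholder, so the output is not expected to match the published figure):

```python
def national_cost_range(cases: int, low_per_episode: float,
                        high_per_episode: float) -> tuple[float, float]:
    """Extrapolate a per-episode attributable-cost range to a national total."""
    return cases * low_per_episode, cases * high_per_episode

# Placeholder case count; per-episode range from the propensity-matched studies.
low, high = national_cost_range(450_000, 3427, 9960)
```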
The emergence of the virulent, epidemic ribotype 027 strain was associated with increased incidence, severity, and mortality during the mid-2000s and resulted in outbreaks across North America [36, 48, 49], England [50, 51], parts of continental Europe [52, 53], and Asia. The recent isolates of the 027 strain are more highly resistant to fluoroquinolones compared to historic strains of the same type. This, coupled with increasing use of fluoroquinolones worldwide, likely promoted dissemination of a once-uncommon strain.
Consistent with the presence of one or more molecular markers responsible for increased virulence, patients infected with the 027 epidemic strain in Montreal were shown to have more-severe disease than patients infected with other strains. In a later Canadian multicenter study of hospitalized patients, the 027 strain was predominant among patients with CDI, whereas other strains were more common among asymptomatically colonized patients. Similarly, in a sample of isolates and patient information collected from 10 CDC EIP sites between 2009 and 2011, ribotype 027 was the most prevalent strain (28.4%) and was associated with more severe disease, severe outcomes, and death than other strains, controlling for patient risk factors, healthcare exposure, and antibiotic use.
Since the emergence and spread of 027, recent data from Europe suggest that the prevalence of this strain is decreasing. England has seen a dramatic decrease in 027 prevalence since the establishment of a nationwide ribotyping network in 2007. Ribotype 027 decreased significantly between 2007 and 2010, dropping from 55% prevalence to 21%, coincident with significant decreases in reported CDI incidence and related mortality. The decrease in 027 prevalence was likely driven by significant reductions in fluoroquinolone use during this time period, although increased awareness and improved infection control may also have affected CDI incidence.
Continued molecular typing will enable detection of emerging C. difficile strains with novel virulence factors, risk factors, and antibiotic resistance patterns. For example, evidence of emergence of a virulent strain, ribotype 078, has been reported from the Netherlands. The prevalence of ribotype 078 increased between 2005 and 2008 and was associated with similar severity compared to CDI cases due to ribotype 027, but was associated with a younger population and more community-associated (CA) CDI. There was also a high degree of genetic relatedness between 078 isolates found in humans and pigs, an association also noted in the United States.
In the context of the changing epidemiology of CDI in hospitals in the mid-2000s, evidence suggested increasing incidence of CDI in the community, even in healthy people previously at low risk, including peripartum women [59–64]. The sources of and risk factors for CA CDI (ie, occurring in patients with no inpatient stay in the previous 12 weeks) are not well defined. An analysis of CA CDI cases identified during 2009–2011 in the CDC EIP surveillance found that the majority of cases (82%) had some kind of healthcare exposure in the 12 weeks prior to CDI diagnosis. A relatively large percentage (36%) of CA CDI cases did not report antibiotic exposure in the 12 weeks prior to infection, although medication exposures were self-reported and may have been subject to limitations in recall. Among patients without reported antibiotic exposure, 31% received proton pump inhibitors (PPIs). In another recent study, a predictive risk scoring system developed in one cohort in a capitated-payment healthcare system and validated in another cohort in the same system proved useful for differentiating CDI risk in patients following an outpatient healthcare visit. Major components of the scoring system included age, recent inpatient stay, chronic conditions (eg, liver and kidney disease, inflammatory bowel disease [IBD], cancer), and antibiotics; the role of PPIs was either not examined or not retained in the model.
Patients with IBD, especially ulcerative colitis, are at increased risk of not only primary CDI but also recurrent disease, as well as increased morbidity and mortality from CDI. The risk of CDI within 5 years of a diagnosis of ulcerative colitis may be >3% and worsens prognosis by increasing risk of colectomy, postoperative complications, and death. Patients with IBD are 33% more likely to suffer recurrent CDI. There is an increased colectomy risk from CDI occurrence in patients with IBD overall, especially patients with ulcerative colitis.
Other patient populations at increased risk include solid organ transplant recipients: with an overall prevalence of 7.4%, rates in this population are 5-fold greater than among general medicine patients, and cases are associated with remarkable increases in hospital days and costs [69, 70]. Risks are highest in multiple solid organ transplants, followed by lung, liver, intestine, kidney, and pancreas, with an overall prevalence of severe disease of 5.3% and risk of recurrence approximately 20%. Patients with chronic kidney disease and end-stage renal disease have an approximately 2- to 2.5-fold increased risk of CDI and recurrence, a 1.5-fold increased risk of severe disease, and similarly increased mortality [71, 72]. Finally, hematopoietic stem cell transplant patients have a rate of CDI that is approximately 9 times greater than that in hospitalized patients overall; within this population, rates are about twice as high in allogeneic (vs autologous) transplants, where CDI occurs in about 1 in 10 transplants. Most of this risk is during the peritransplantation period (ie, first 100 days posttransplant).
Clostridium difficile transmission resulting in disease in the healthcare setting is most likely a result of person-to-person spread through the fecal–oral route or, alternatively, direct exposure to the contaminated environment. Studies have found that the prevalence of asymptomatic colonization with C. difficile is 3%–26% among adult inpatients in acute care hospitals [46, 74, 75] and is 5%–7% among elderly patients in LTCFs [33, 76]. In contrast, the prevalence of C. difficile in the stool among asymptomatic adults without recent healthcare facility exposure is <2% [77, 78]. A recent meta-analysis found that the pooled colonization rate upon hospital admission across 19 studies (mostly since 2005 and through 2014) was 8.1%, with the main risk factor for such colonization being a previous hospitalization. Notably, neither antibiotic use nor previous CDI was associated with colonization on hospital admission.
The period between initial colonization with C. difficile and the occurrence of CDI (ie, incubation period) was estimated in 3 earlier studies to be a median of 2–3 days [66, 68]. However, recent evidence suggests a longer incubation period, even >1 week; Curry et al, in a study of asymptomatic C. difficile carriers, found 7 of 100 patients with CDI who tested positive for highly related C. difficile isolates 8–28 days prior to infection diagnosis. Other early studies suggested that persons who remain asymptomatically colonized with C. difficile over longer periods of time are at decreased, rather than increased, risk for development of CDI [74, 80–82]. In contrast, the aforementioned recent meta-analysis found that preceding colonization increased the risk of subsequent CDI 6-fold; however, neither the time course from first detection of colonization to symptom onset nor the impact of diagnostic methods on this risk were examined.
Thus it is likely that the daily risk of progression from colonization to infection is not static but decreases over time; if so, the protection afforded by more long-standing colonization may be mediated in part by the boosting of serum antibody levels against C. difficile toxins A and B [46, 80, 81]. It is also likely that as long as an individual is colonized by one strain they are protected from infection caused by another strain; there is evidence of protection from CDI in both humans and in animal models following colonization with nontoxigenic strains, suggesting competition for nutrients or access to the mucosal surface [82, 83].
The hands of healthcare personnel, transiently contaminated with C. difficile spores, and environmental contamination [75, 85–88] are probably the main means by which the organism is spread within healthcare settings. Although occupying a room where a prior occupant had CDI is a significant risk factor for CDI acquisition, this accounts for approximately 10% of CDI cases, indicating other vectors are more common. There have also been outbreaks in which particular high-risk fomites, such as electronic rectal thermometers or inadequately cleaned commodes or bedpans, were shared between patients and were found to contribute to transmission.
The potential role of asymptomatically colonized patients in transmission has recently been highlighted. Using multilocus variable number of tandem repeats analysis, Curry et al found that 29% of CDI cases in a hospital were associated with asymptomatic carriers, compared to 30% that were associated with CDI patients. Similarly, 2 studies of hospitalized patients in the United Kingdom found that only 25%–35% of CDI cases were genetically linked to previous CDI cases [91, 92], suggesting a role for other sources of transmission such as asymptomatic carriers and the environment. In the Curry et al study, environmental transmission may have occurred in 4 of 61 incident healthcare-associated CDI cases.
Two recent studies highlight how antibiotics may affect CDI risk in hospitalized patients through impacting the contagiousness of asymptomatically colonized patients. Through use of a multilevel model, ward-level antibiotic prescribing (ie, among both CDI and non-CDI patients, therefore including potential asymptomatic carriers) was found to be a risk factor for CDI that was independent of the risk from antibiotics and other factors in individual patients. Meanwhile, the individual risk of symptomatic CDI was found to be higher in patients admitted to a room where a previous patient without CDI was administered antibiotics, suggesting induced shedding of C. difficile from asymptomatic carriers.
Shedding of C. difficile spores is particularly high among patients recently treated for CDI, even after resolution of diarrhea [84, 95], suggesting a population of asymptomatic carriers who might be more likely to transmit the organism. In one study, the frequency of skin contamination and environmental shedding remained high at the time of resolution of diarrhea (60% and 37%, respectively), decreased at the end of treatment, and increased again 1–4 weeks after treatment (58% and 50%, respectively).
Advanced age, potentially as a surrogate for severity of illness and comorbidities, is one of the most important risk factors for CDI [46, 96, 97], as is duration of hospitalization. The daily increase in the risk of C. difficile acquisition during hospitalization suggests that duration of hospitalization may be a proxy for the duration and degree of exposure to the organism, likelihood of exposure to antibiotics, and severity of underlying illness [46, 74, 98]. The most important modifiable risk factor for the development of CDI is exposure to antibiotic agents. Virtually every antibiotic has been associated with CDI through the years, but certain classes—third-/fourth-generation cephalosporins, fluoroquinolones [36, 37, 100], carbapenems, and clindamycin [101, 102]—have been found to be high risk. Receipt of antibiotics increases the risk of CDI because it suppresses the normal bowel microbiota, thereby providing a “niche” for C. difficile to flourish. The relative risk of therapy with a given antibiotic agent and its association with CDI depends on the local prevalence of strains that are highly resistant to that particular antibiotic agent.
The disruption of the intestinal microbiota by antibiotics is long-lasting, and risk of CDI increases both during therapy and in the 3-month period following cessation of therapy. The highest risk of CDI (7- to 10-fold increase) appears to be during and in the first month after antibiotic exposure. Both longer exposure to antibiotics and exposure to multiple antibiotics increase the risk for CDI. Nonetheless, even very limited exposure, such as single-dose surgical antibiotic prophylaxis, increases a patient’s risk of C. difficile colonization and symptomatic disease. However, as previously noted, asymptomatic colonization, at least as detected among patients commonly admitted to the hospital, may not be associated with prior antibiotics.
Cancer chemotherapy is another risk factor for CDI that is, at least in part, mediated by the antibiotic activity of several chemotherapeutic agents [105, 106] but could also be related to the immunosuppressive effects of neutropenia [107, 108]. Evidence suggests that C. difficile is an important pathogen causing bacterial diarrhea in US patients infected with human immunodeficiency virus, which suggests that these patients are at specific increased risk because of their underlying immunosuppression, exposure to antibiotics, exposure to healthcare settings, or some combination of those factors. Other risk factors for CDI include gastrointestinal surgery or manipulation of the gastrointestinal tract, including tube feeding. Meta-analyses of risk factors for recurrence identified many of those described above for initial CDI, including advanced age, antibiotics during follow-up, PPIs, and strain type, as well as previous exposure to fluoroquinolones [111, 112]. Meanwhile, risk factors for complicated disease include older age, leukocytosis, renal failure, and comorbidities, while risk factors for mortality from CDI alone include age, comorbidities, hypoalbuminemia, leukocytosis, acute renal failure, and infection with ribotype 027. Recent data confirm the role of humoral immunity, primarily directed against toxin B, at least for protecting against recurrent disease. There may be an important role for vitamin D in protecting against CDI, with low levels being an independent risk factor among patients with community-associated disease, older patients, and those with underlying inflammatory bowel disease [114, 115].
Breaches in the protective effect of stomach acid caused by acid-suppressing medications, such as histamine-2 blockers and PPIs, are a potential risk factor for CDI, but this association remains controversial. Although a number of studies have suggested an epidemiologic association between use of stomach acid–suppressing medications, primarily PPIs, and CDI [37, 60, 116–119], results of other well-controlled studies suggest this association is the result of confounding with the underlying severity of illness, non-CDI diarrhea, and duration of hospital stay [36, 120, 121].
In a retrospective study of 754 patients with healthcare-associated CDI, continuous use of PPIs was independently associated with a 50% increased risk for recurrence, whereas reexposure to antibiotics was associated with only a 30% increased risk. Moreover, long-term use of PPIs has been shown to decrease lower gastrointestinal microbial diversity. However, whether as a risk factor for primary or recurrent disease, the choice of control group in such epidemiologic studies is important. PPIs and histamine-2 blockers may be associated with CDI when comparing cases to nontested controls but not when comparing cases to tested-negative controls. This reflects why understanding the role of these drugs in the pathogenesis of CDI remains elusive: PPIs induce diarrhea on their own, making it more likely that patients are tested for CDI. More careful assessment of confounding factors, symptoms, and criteria for testing for recurrence, as is typical in a prospective clinical trial, may then explain why PPIs were not associated with recurrence in clinical trials of fidaxomicin.
Similar to the findings in adults, the incidence of CDI has risen in children since 2000 [124–129]. The majority of pediatric studies have evaluated the incidence of CDI-related hospitalizations among multicenter cohorts of hospitalized children [126–128]. More recently, a population-based study of children residing in Olmsted County, Minnesota, between 1991 and 2009 identified an increase in incidence of CDI among pediatric residents from 2.6 to 32.6 per 100000 using standard surveillance definitions.
The incidence of CDI has increased overall, including increases in CDI among children in community and outpatient settings [124, 125, 130]. Using data from active population- and laboratory-based surveillance by the EIP, Wendt et al showed that 71% of pediatric CDI identified by positive C. difficile stool testing arose from the community. These estimates are limited by reliance on laboratory surveillance methods, where differences in testing practices may undermine the accuracy of some longitudinal and interinstitutional comparisons of rates of CDI in children [132, 133]. Nonetheless, these data indicate an epidemiologic shift with increased disease in nonhospitalized children.
One important feature of the epidemiology of C. difficile in children is the presence of asymptomatic colonization with either toxigenic or nontoxigenic strains among many infants and young children, with the highest rates (which can exceed 40%) in infants <12 months of age [134–141]. Nontoxigenic strains are more common than toxigenic strains among colonized infants, but colonization is transient and different strains are found to colonize the same infant at different times [135, 139, 142–144]. Colonization is less frequent among breastfed as compared with bottle-fed infants [140, 145–147]. Some evidence implicates the hospital environment as a source of acquisition of colonizing strains [134, 135, 138, 143, 148–150].
Colonization rates decrease with increasing age [140, 147, 151, 152]. The prevalence of asymptomatic colonization with C. difficile is still elevated in the second year of life, although to a lesser degree than in infants [139, 153, 154]. Therefore, as in infants, testing in this population should be avoided unless other infectious and noninfectious causes of diarrhea have been excluded. Consistent with the epidemiology of CDI in infants and young children, the NHSN does not permit reporting of CDI from newborn nurseries and neonatal ICU locations. Additionally, public reporting of cases in children <2 years of age is strongly discouraged. By 2–3 years of age, approximately 1%–3% of children are asymptomatic carriers of C. difficile (a rate similar to that observed in healthy adults). While young children are unlikely to have C. difficile infection, asymptomatically colonized infants and children may serve as a source of transmission of the organism to adults, leading to C. difficile infection among adult contacts [27, 139, 155, 156].
Many of the risk factors for C. difficile infection in children mirror those for adults, including recent antibiotic exposure, hospitalization, and underlying complex chronic conditions such as malignancy, solid organ transplant, and inflammatory bowel disease [126, 127, 157–160]. In children, the presence of a gastrostomy or jejunostomy tube has been found to be an additional independent risk factor. Recent studies suggest that acid-suppressing medications may also be an independent risk factor for CDI in children, although the association has been more consistently observed in children who receive histamine-2 receptor antagonists than PPIs [161, 162].
Severe disease and complications due to CDI are less common in children [126, 158, 163] but have been described [164, 165]. Among hospitalized children who are otherwise similar in important demographic and clinical characteristics, CDI has been associated with worse outcomes, including prolonged hospital stay, increased total hospital costs, and higher mortality rates [127, 166].
Determining the optimal number of episodes of diarrhea that justifies the need for CDI testing depends on the likelihood of infection (high vs low CDI rates), potential confounders (underlying diseases and/or medical or surgical interventions that increase the chance of iatrogenic diarrhea), risk factors for CDI, and the chosen testing methods (high vs low specificity/predictive value methods).
If a patient has diarrheal symptoms not clearly attributable to underlying conditions (IBD, and therapies such as enteral tube feeding, intensive cancer chemotherapy, or laxatives), then testing to determine if diarrhea is due to C. difficile is indicated. Alternatively, testing may be indicated if symptoms persist after stopping therapies to which diarrhea may be otherwise attributed (eg, laxatives). However, some of these conditions and interventions that cause diarrhea in their own right, such as IBD and enteral tube feeding, have themselves been shown to carry an increased risk of CDI when compared with a matched cohort. Thus, in practice it is difficult to exclude the possibility of CDI on clinical grounds alone in a patient with new-onset or worsened diarrhea.
The evidence base to optimize CDI testing is weak. Clinical criteria for the diagnosis of CDI have shifted as awareness of CDI has increased. Notably, the number and frequency of diarrheal stools required to justify CDI testing have declined over the past 40 years. Tedesco et al defined diarrhea as >5 loose stools per day in 1974; Teasley et al as >6 loose stools over a period of 36 hours in 1983; Fekety et al as liquid stools or >4 bowel movements per day for at least 3 days in 1989; and Johnson et al as ≥3 loose or watery bowel movements in 24 hours in 2013. Using the latter definition of diarrhea, Dubberke et al and Peterson et al (also using additional clinical criteria) examined the frequency of these symptoms in patients whose stool was submitted for CDI testing [171, 172]. Peterson et al found that 39% of patients did not meet the minimal diarrhea definition and were dropped from further analysis.
Dubberke et al used a clinical definition of ≥3 diarrheal bowel movements (type 6 or 7 stool on the Bristol Stool Chart) in the 24 hours preceding stool collection, or diarrhea plus patient-reported abdominal pain or cramping. They found that 36% of patients failed to meet the clinical definition but were retained in the study. The authors caution that even in the presence of clinical diarrheal symptoms, there may be confounding clinical issues such as laxative use, which was found in 19% of patients within the previous 48 hours.
Clinicians can improve laboratory test relevance by only testing patients likely to have C. difficile disease. This includes not routinely performing testing on stool from a patient who has received a laxative within the previous 48 hours. Laboratories can improve specificity by rejecting specimens that are not liquid or soft (ie, take the shape of the container). In addition, laboratories may wish to collaborate with available quality improvement teams such as infection prevention and control and antibiotic stewardship, to assess appropriateness of testing in the population from which samples are submitted. This may involve periodic chart review in a series of patients to assess for clinical risk factors, signs, and symptoms suggestive of CDI.
Two diagnostic testing recommendations based on institutional and laboratory preagreed criteria for patient stool submission are prefaced by questions VII and VIII (Figure 2).
There is a variety of available options for laboratory testing to support the diagnosis of CDI, and these are well described in several recent reviews [174, 175]. In brief, these methods detect either the organism or one or both of its major toxins (A and B) directly in stool. Table 3 lists these methods in decreasing order of analytical sensitivity. Toxigenic culture (TC) uses a prereduced selective agar, cycloserine-cefoxitin-fructose agar or a variant of it, followed by anaerobic incubation for several days. Once there is growth, the organism is identified by several methods including matrix-assisted laser desorption/ionization–time of flight mass spectrometry, although the characteristic “horse barn odor” often heralds its presence. To enhance the recovery of the organism, a spore selection step, whether heat or alcohol shock, is applied to the stool prior to inoculating media. Once an organism is identified, a toxin test must be performed on the isolate to confirm its toxigenic potential. TC, although not standardized, has been one of the reference methods against which other methods are compared.
| Test | Analytical Sensitivity | Specificity for CDI | Substance Detected |
| --- | --- | --- | --- |
| Toxigenic culture | High | Low^a | Clostridium difficile vegetative cells or spores |
| Nucleic acid amplification tests | High | Low/moderate | C. difficile nucleic acid (toxin genes) |
| Glutamate dehydrogenase | High | Low^a | C. difficile common antigen |
| Cell culture cytotoxicity neutralization assay | High | High | Free toxins |
| Toxin A and B enzyme immunoassays | Low | Moderate | Free toxins |

^a Must be combined with a toxin test.
The other reference method is the cell cytotoxicity neutralization assay (CCNA), which detects toxin directly in stool. This assay begins with preparation of a stool filtrate, which is applied to a monolayer of an appropriate cell line, such as Vero cells or human fibroblasts, among others. Following incubation, the cells are observed for cytopathic effect (CPE); duplicate testing is usually carried out simultaneously with neutralizing antibodies to Clostridium sordellii or C. difficile toxin, to ensure that the observed CPE is truly caused by C. difficile toxins and not by other substances in the stool. Incubation continues for up to 48 hours, but the majority of positives are detected after overnight incubation. This method is cumbersome, time-consuming, and lacks standardization, although, if optimized, it is one of the most sensitive and specific methods available for C. difficile toxin detection. As laboratories abandoned their viral cell culture facilities in favor of antigen and molecular tests, CCNA became less popular. Enzyme immunoassays (EIAs), initially for toxin A detection alone and later for both toxins, became available and replaced the above reference methods for routine clinical testing in the late 1980s and early 1990s. EIAs use monoclonal or polyclonal antibodies to detect C. difficile toxins, and numerous commercial assays are available. Performance is variable, and their overall poor performance sparked the development of other methods such as glutamate dehydrogenase (GDH) immunoassays and molecular tests for toxin gene detection [174, 176, 177]. While toxin EIAs remain insensitive for the detection of toxigenic C. difficile when compared with these successor technologies, sensitivities vary among available toxin EIA tests. Results across both sponsored and nonsponsored studies should be considered to select a relatively more sensitive EIA for general use. Also, there is some evidence that newer EIAs have improved sensitivity compared with those examined in older studies.
Glutamate dehydrogenase immunoassays detect the highly conserved metabolic enzyme (common antigen) present in high levels in all isolates of C. difficile. Since this antigen is present in both toxigenic and nontoxigenic strains, GDH immunoassays lack specificity and must be combined with another (usually toxin) test. GDH testing is the initial screening step in 2- and 3-step algorithms that combine it with a toxin test and/or a molecular test for toxin gene detection. The combination has allowed for rapid results and improved sensitivity compared with toxin EIA testing alone, and can be economical [174, 176, 177].
Although NAATs for C. difficile detection in stool began to appear in the literature in the early 1990s, the first US Food and Drug Administration (FDA)–cleared platform was not available in the United States until 2009. There are at least 12 available commercial platforms that detect a variety of gene targets including tcdA, tcdB, and 16S ribosomal RNA (rRNA). These assays are more sensitive for C. difficile detection than toxin EIAs (and possibly than GDH EIAs) but less sensitive than TC. However, the positive predictive value of NAATs for CDI is low to moderate, depending upon disease prevalence and the limit of detection of the assay.
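The dependence of positive predictive value on disease prevalence follows directly from Bayes' theorem, and can be made concrete with a short calculation. The sensitivity and specificity figures below (95%/95%) are illustrative assumptions for a generic sensitive assay, not performance claims for any specific commercial NAAT:

```python
# Hedged sketch: how PPV falls as CDI prevalence falls, for a hypothetical
# assay with 95% sensitivity and 95% specificity (illustrative values only).

def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

for prev in (0.05, 0.10, 0.20):
    print(f"prevalence {prev:.0%}: PPV = {ppv(0.95, 0.95, prev):.1%}")
# At 5% prevalence, half of all positive results are false positives,
# even though both sensitivity and specificity look "high" on paper.
```

This is why the same sensitive test can perform acceptably in a carefully screened population yet yield mostly false positives when all unformed stools are tested indiscriminately.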
The optimum method for laboratory diagnosis of CDI remains elusive, as patients may harbor toxigenic strains and not have clinical disease, an observation made in early studies soon after the discovery of C. difficile [78, 179]. In addition, diarrhea in hospitalized patients is common, and C. difficile is the culprit in <30% and often in as few as 5%–10% of patients [179–181]. Consensus regarding the best laboratory testing method is lacking. Much of the literature comparing diagnostic laboratory methods is limited by use of an inappropriate comparative standard (ie, standards other than clinical disease) or a reference method that has never been standardized (ie, CCNA or the toxigenic component of TC). Furthermore, use of an inappropriate comparative reference method is a recurring issue (eg, using TC to assess the accuracy of a toxin test when the correct comparator is CCNA). In addition, comparative methods are often performed without knowledge of the prevalence of true disease in the population based on clinical presentation. Very few studies incorporate clinical assessment into analyses of test performance; these are discussed below. Finally, much of the literature is derived from single centers and/or is underpowered to support definitive conclusions upon which to base recommendations; thus, current GRADE methodology is not well adapted to gauging the strength of a recommendation using the type of evidence currently available for diagnostic tests.
Given these various conundrums and the paucity of large prospective studies, the recommendations, while strong in some instances, are based upon a very low to low quality of evidence (Table 4).
Recommendation: Use a stool toxin test as part of a multistep algorithm vs a NAAT alone for all specimens received in the clinical laboratory when there are no preagreed institutional criteria for patient stool submission.

| Evidence Supporting Diagnostic Tests | Design | No. of Subjects | Methodologic Limitations | Reference, First Author |
| --- | --- | --- | --- | --- |
| GDH and NAATs had the highest sensitivity but poor PPV in patients with no symptoms; all tests had high NPV regardless of symptoms | Observational study, patient interviews, and ID physician assessment | 150 | Small sample size; only standard-of-care assay was tested in real time; others frozen | Dubberke |
| Toxin-negative, NAAT-positive patients who were not treated did not have adverse outcomes. Recurrence of CDI was more common when both NAAT and toxin assays were positive than when NAAT alone was positive (31% vs 14%; P = .03) | Observational retrospective study | 128 | Small sample size | Kaltsas |
| No difference in toxin EIA positivity between patients with mild vs severe disease (49% vs 58%; P = .31) | Observational study, prospective testing, retrospective chart review | 299 | Single-center study | Humphries |
| Complications were more common among patients positive by both NAAT and GDH/EIA/CCNA compared to NAAT alone (39% vs 3%; P < .001) | Prospective cohort study; observational | 1321 | Only some of the samples were tested using a gold standard | Longtin |
| Patients who were CCNA positive or GDH/EIA positive had higher all-cause mortality than patients who were NAAT or TC positive alone (P = .022) | Prospective, multicenter, observational study | 12,420 | Limited clinical data | Planche |
| Patients who were EIA toxin positive had a longer median duration of diarrhea, more CDI-related complications, and higher CDI-related mortality than toxin-negative/PCR-positive patients (8.4% vs 0.6%; P = .001) | Prospective, single-center observational cohort study | 1416 | Single-center study; differences in empiric treatment and risk allocation between groups | Polage |

Quality of evidence (GRADE^a) for diagnosis when the pretest probability is unknown or low: ⊕⊕⊖⊖ (Low)
Recommendation: Use a NAAT alone or a multistep algorithm (ie, GDH plus toxin; GDH plus toxin, arbitrated by NAAT; or NAAT plus toxin) vs a toxin test alone when there are preagreed institutional criteria for patient stool submission.

| Evidence Supporting Diagnostic Tests | Design | No. of Subjects | Methodologic Limitations | Reference, First Author |
| --- | --- | --- | --- | --- |
| PCR was more sensitive (93.3%) than toxin EIA (73.3%; P < .05) and direct cytotoxin testing (76.7%) when applied to patients who met clinical criteria for C. difficile disease | Observational; prospective patient interviews | 350 | Small number of positive patients | Peterson |
| Using clinical diagnosis as the reference standard, PCR was more sensitive than CCNA and GDH (99.1% vs 51% vs 83.8%). Close to double the number of patients were positive by PCR compared to CCNA, and 91.5% of those were clinically confirmed | Prospective, performed at 2 centers | 1051 | Specimens were not consecutive; limited statistical analysis; limited patient follow-up | Berry |

Quality of evidence (GRADE^a) for diagnosis when there is a high likelihood of CDI: ⊕⊕⊖⊖ (Low)
Abbreviations: CCNA, cell cytotoxicity neutralization assay; CDI, Clostridium difficile infection; EIA, enzyme immunoassay; GDH, glutamate dehydrogenase; GRADE, Grading of Recommendations, Assessment, Development and Evaluation; ID, infectious disease; NAAT, nucleic acid amplification test; NPV, negative predictive value; PCR, polymerase chain reaction; PPV, positive predictive value; TC, toxigenic culture.
^a For GRADE interpretation, see Figure 1.
In 2011, Dubberke and colleagues performed an observational study of 150 patients to assess the impact of clinical symptoms (≥3 diarrheal bowel movements in the 24 hours preceding stool collection, or diarrhea plus patient-reported abdominal pain or cramping) on interpretation of diagnostic assays for CDI. While the study is too small to support definitive conclusions, it illustrates some important caveats about diagnostic evaluations. The authors evaluated 8 diagnostic assays, including 2 toxin EIAs, a test for GDH, a commercial CCNA assay, and 3 NAATs. TC was also performed for all specimens. Two reference standards were assessed, each with and without consideration of patient symptoms. The prevalence of true CDI based upon a gold standard of clinically significant diarrhea and a positive TC was 11%. However, this rate was determined only for the first 100 samples, and given the use of a relatively nonspecific (TC) testing method, it is likely an overestimate of the true CDI rate. As expected, given the choice of reference method (TC), the toxin tests detected fewer positive samples. Conversely, the GDH and NAATs detected the most positive samples. Compared with this TC gold standard, the least sensitive assays were the CCNA (62.9% sensitive; 95% CI, 46.3%–76.8%) and one of the toxin A/B EIA tests (80.0% sensitive; 95% CI, 64.1%–90.0%). The most sensitive methods (all >90%) were the GDH assay, all NAATs, and one of the EIAs performed on frozen stools. While all assays had a negative predictive value of >95%, the positive predictive values (PPVs) for the GDH and NAATs were <50%, suggesting that they were positive in many patients who did not meet the clinical criteria for diarrhea. By contrast, the TechLab toxin EIA PPV (notably when testing freeze-thawed stools) was 59%.
Other important observations from this study were that 19% of patients had received a laxative in the 48 hours prior to testing, and another 36% of patients who were tested did not have clinically significant diarrhea, indicating that improved, validated criteria for deciding when to test patients are needed.
Kaltsas et al attempted to understand the clinical and epidemiological impact of transitioning from a 2-step algorithm, which involved screening with GDH followed by a CCNA, to NAAT for the diagnosis of CDI in a major cancer hospital. Test performance for 128 samples was assessed in the context of symptoms, severity of illness, and patient outcomes. Two time periods were evaluated: May to August 2008 and March to May 2010. For both time periods, CDI cases were defined as having clinical symptoms including diarrhea (84%), fever and abdominal pain (4%), nausea and vomiting (2%), abdominal pain, leukocytosis, or sepsis (2% each), or fever alone (1%), with a positive NAAT or a positive CCNA. Different NAATs were used in the first compared with the second time period, and no information was provided on overall test positivity or other indicators of the prevalence of CDI in the tested population. Testing for CDI was performed on diarrheal (84%) and nondiarrheal (16%) stool samples in patients in whom it may be very difficult to interpret the true clinical significance of diarrhea, namely cancer patients undergoing intensive chemotherapy. There was no statistically significant difference in the clinical presentations at the onset of infection or in severity of disease between patients positive by NAAT alone and those concordant for both NAAT and 2-step algorithm assays. Among 23 toxin-negative, NAAT-positive patients who were not treated, the only possible adverse outcome was recurrence in 3 patients; however, only 15 (65%) had diarrhea on the day of testing. Recurrence of CDI was more common in patients when both assays were positive than when NAAT alone was positive (31% vs 14%; P = .03). In summary, it is not clear what the results mean from this modestly sized cohort of difficult-to-interpret cases (patients with a high frequency of multifactorial diarrhea), other than the impact of a 2-fold increase in reported C. difficile rates when transitioning to the more sensitive, but probably less specific, NAAT method.
Longtin et al assessed the impact of diagnostic test methods on CDI rates and the occurrence of complications based upon the tests used to diagnose CDI. This was a prospective cohort study in Quebec over a 1-year period. CDI was defined by documented diarrhea of ≥3 loose or liquid stools in <24 hours and symptoms lasting ≥24 hours, in combination with a positive test for toxin-producing C. difficile or a clinical diagnosis based upon histopathology or the presence of pseudomembranes on colonoscopy. Structured data collection forms were used to collect information prospectively regarding complications and whether patients with positive tests met the case definition. All samples submitted to the laboratory were tested by a NAAT that detected the toxin B gene and by a 3-step algorithm that began with screening for GDH followed by toxin A/B EIA testing. Samples positive by both methods (NAAT and 3-step algorithm) were considered positive for C. difficile. GDH-positive, toxin EIA-negative samples were retested using a CCNA. Only NAAT results were reported to clinicians and infection control. A total of 1321 stool specimens from 888 patients were assessed over the 1-year period, of which 17% were positive by NAAT and 12.3% were positive by the 3-step algorithm. There were 85 cases of healthcare-associated CDI detected by NAAT, whereas only 56 of these cases were diagnosed by EIA/CCNA (P = .01). Complications (ie, 30-day mortality, colectomy, intensive care unit admission, or readmission for recurrence) were more common among patients positive by both test methods (NAAT and 3-step algorithm) than among cases detected by NAAT alone (39% vs 3%; P < .001). The major limitation of this study was that it was performed at a single center and only some of the specimens were tested by a recognized gold standard method (ie, CCNA). That said, the results support the findings by Planche and colleagues discussed below.
Planche et al sought to validate the reference methods for C. difficile diagnosis, namely TC and CCNA testing, according to clinical outcomes, in an attempt to derive the optimal diagnostic laboratory method. This was a large observational, multicenter study of 12,420 routinely submitted fecal samples. The authors examined the results of the 2 reference assays (TC and CCNA) along with 4 commercial methods: 2 toxin A/B EIAs, a GDH assay, and a NAAT. Limited clinical data were collected (all patients had diarrhea, but stool frequency was not known), and outcomes were assessed for 6522 inpatients who were stratified into 3 groups: CCNA positive (group 1; n = 435), TC positive but CCNA negative (group 2; n = 207), and negative by both methods (group 3; n = 5880). On univariate analysis, leukocytosis was greater in group 1 than in group 2 or 3, and white blood cell counts were similar in groups 2 and 3. However, groups 1 and 2 both had similarly low serum albumin levels compared with group 3; group 2, but not group 1, had a higher mean rise in creatinine than group 3. Groups 1 and 2 both had similarly longer mean lengths of stay (before and after testing) than group 3. All-cause 30-day mortality was markedly higher in group 1 (16.6%) than in group 2 (9.7%) (P = .022). The mortality in group 2 was not significantly different from that in the control group (8.6%). When the analysis was performed using NAAT in place of TC, the findings were similar, with an absolute difference in mortality between patients who were CCNA positive and those who were NAAT positive but CCNA negative of 6.9% (P = .004). The combination of GDH immunoassay plus toxin EIA (TechLab assay) was almost identical in performance to CCNA. In a multivariate logistic regression model, group 1 patients were older and had greater leukocytosis, serum creatinine rise, depressed albumin, and 30-day mortality compared with group 3.
Lengths of stay were not independently associated with group 1, and all other group multivariate comparisons, including mortality in group 1 vs 2, were not significant. The failure to find a mortality difference in groups 1 vs 2 on multivariate analysis may be due to the much smaller number of patients in group 2 than in group 3. Another limitation was the relatively low prevalence of true disease in the tested population based upon the positivity rate of either the CCNA (5.9%) or TC (8.3%); this reflected national endemic rates of CDI at that time.
Clinical outcome data were available for 69% (143/206) of inpatients with discordant reference method results. Of these patients, 75 (52%) who were TC positive but CCNA negative received no CDI treatment. Among the 4 of 75 cases that were TC positive and CCNA negative who died and did not receive CDI treatment, none had a diagnosis of this infection on their death certificate. Also, 64 of 143 (45%) patients with a discordant reference method result did not have diarrhea recorded on their stool chart; for the remainder of the patients, the median duration of diarrhea was 2 days.
The authors concluded that patients with a positive toxin test should be treated and those who are positive by TC and/or NAAT alone could be considered “excretors” who may present an infection control risk but do not require treatment.
In the Planche et al study, based upon the assay comparison validation, the authors recommended using a multistep algorithm such as screening with GDH and confirming positives with a "sensitive" toxin A/B enzyme immunoassay, and this has been national UK policy since 2012. The 2 toxin EIAs used in the study, the Meridian Tox A/B test and the TechLab assay, had significantly differing sensitivities of 69.2% (95% CI, 64.3%–73.8%) and 82.3% (95% CI, 78.1%–85.9%), respectively.
Support for using toxin testing alone instead of a NAAT alone to diagnose CDI is provided in a more recent study by Polage et al. In a large (n = 1416) prospective, observational cohort study performed at a single academic medical center, the authors assessed the natural history and need for treatment of patients who were toxin EIA positive (assay in clinical use) compared with those who were toxin negative/PCR positive (blindly tested). The toxin-positive/PCR-positive arm had 131 patients (9.3%), 162 patients were toxin negative/PCR positive (11.4%), and 1123 patients were toxin negative/PCR negative. Patient demographics were similar among all 3 arms, as were the proportions with leukopenia, renal insufficiency, and hypoalbuminemia. The toxin-positive/PCR-positive group had more diarrhea and a longer duration of diarrhea, more prior antibiotic exposure, and more patients with leukocytosis. In the multivariable model, the frequency of CDI-related complications was highest in the toxin-positive/PCR-positive group compared with the toxin-negative/PCR-positive and toxin-negative/PCR-negative patients (7.6% vs 0% vs 0.3%; P < .001). The rate of CDI-related complications was similar between the PCR-positive/toxin-negative patients and patients who were negative by both tests (0% vs 0.3%; P > .99). Similar observations were noted for mortality: there were 11 CDI-related deaths among the toxin-positive/PCR-positive patients, one death in the PCR-positive-alone cohort, and no deaths in the group with negative tests (P < .001). The authors also assessed repeat testing and treatment within 14 days of symptom onset as surrogates of ongoing clinical suspicion or empiric treatment for CDI in the toxin-negative/PCR-positive group, and again during the 15- to 30-day period following symptom onset to assess recurrent or prolonged CDI during the latter time period. Sixty-one toxin-negative/PCR-positive patients were retested (37.7%), and 8% had toxins detected.
While none of the patients had CDI-related complications, one patient had CDI as a contributing factor to death.
During the early period, only 21 patients (13%) received a full course of treatment, and close to 60% received no treatment. Likewise, in the later period (15–30 days after onset), most (78%) toxin-negative/PCR-positive patients received no treatment. During that period, patients who were toxin positive were twice as likely to have repeat testing and 3 times more likely to be positive compared with toxin-negative/PCR-positive patients. The authors concluded that toxin EIA positivity was a better predictor of CDI-related complications and deaths, and that outcomes in patients who were PCR positive alone were comparable with those in patients who were negative by both tests. The use of molecular tests alone is likely to lead to overdiagnosis and overtreatment. Strengths of this study include the large number of patients assessed, the prospective study design, and the assessment of patient outcomes. Weaknesses include the fact that it was a single-center study and that risk allocation between the 2 groups was not equivalent. In addition, empiric treatment may have affected outcomes in some patients in the toxin-negative/PCR-positive group.
Absence of toxin in stool may not be predictive of CDI severity. Investigators at the University of California, Los Angeles attempted to assess the significance of detecting C. difficile in patient samples in the absence of toxins, for example, in NAAT-positive, EIA-negative situations. The goal was to determine whether patients who tested negative for C. difficile toxins by EIA but positive by NAAT were more likely to have mild disease. Retrospective chart review was performed following completion of laboratory testing. Patients were selected on the basis of the initial NAAT result, selecting one NAAT-negative patient for every NAAT-positive patient until the sample size predicted to be necessary to test the above goal was reached. Thus, 296 patients were enrolled in the study, with 143 (48% of the cohort) classified as true CDIs based on multiple different results, some of which likely lacked specificity for CDI. Among the 143 with CDI, there was no difference in toxin EIA positivity between patients with mild vs severe disease (49% vs 58%; P = .31) according to the criteria of Zar et al; however, patients with mild disease had a 2.7-fold lower all-cause mortality [187, 188]. Although the toxin EIA-positive patients did have significantly longer overall hospital stays, the authors concluded that, because of similarly low toxin EIA positivity in both less and more severe disease, NAAT-positive, EIA-negative results are clinically meaningful and therefore a NAAT should be used for the diagnosis of CDI. This study is likely underpowered and may suffer from bias based upon its retrospective design and suboptimal choice of the toxin EIA; in addition, the reference method used for assessment of the toxin EIA results (toxigenic culture) was not ideal, as it would have underestimated the sensitivity of the latter test.
In summary, if laboratories have no clinical data and accept all unformed stools for testing, it is most appropriate to use a diagnostic approach that includes a test that is more specific for CDI, such as a relatively sensitive toxin test as part of a multistep algorithm.
One of the first studies to incorporate clinical information in the validation of a molecular test was by Peterson et al. This study was performed prior to the availability of the first FDA-cleared molecular assay, using an in-house developed assay that detected the toxin B gene (tcdB). This real-time PCR test was compared with TC, a toxin EIA, and an in-house CCNA. The authors performed 2 clinical evaluations. A checklist of validated clinical criteria for the diagnosis of C. difficile disease was used for both the retrospective and prospective investigations. Toxigenic culture was used as the reference method for assay comparisons in the retrospective study, while the reference method for the prospective study was diarrhea, defined as ≥3 loose stools for at least 1 day, plus ≥2 positive test results. For the initial investigation (retrospective clinical assessment), the authors observed that documentation was so poor that clinical criteria could not be used for correlation with test performance. Compared with toxigenic culture, the toxin EIA had a sensitivity of 66.7% and a specificity of 91.8%; the corresponding values for the PCR assay were 94.4% and 96.8%. For the second investigation, patients were interviewed prospectively, and among the 350 patients with 365 unique episodes of potential CDI, 39% did not have sufficient diarrhea to warrant testing and were not further analyzed. There were 30 true-positive results in this analysis. The PCR was more sensitive (93.3%) than toxin EIA (73.3%; P < .05) and direct cytotoxin testing (76.7%). The authors concluded that PCR outperformed the other diagnostic test methods when applied to patients who met clinical criteria for C. difficile disease.
Overall, the design of this study was quite complex with varying reference methods for the 2 study arms, and despite the prospective design for the second investigation, limitations were the very small numbers of positive patients and the fact that it was a single-center study.
In a later publication, Berry et al prospectively assessed whether a rapid PCR assay correlated well and reliably with clinical CDI diagnosis. The GeneXpert C. difficile assay was compared with CCNA and a GDH/toxin A/B EIA algorithm. Clinical diagnosis, adjudicated by an unblinded team of multidisciplinary experts, served as the reference for evaluation of the different tests' performance (>1000 PCR and CCNA tests were performed). Sixty-two patients were both PCR and CCNA positive, and an additional 59 specimens were PCR positive alone, among which 54 (91.5%) were from patients clinically diagnosed as having CDI. Had the GDH screen alone been used, 16.2% of patients with clinical CDI would not have been detected. Combining GDH and EIA testing, 59.7% of patients with CDI would have been missed (GDH positive, toxin EIA negative). Patients who were CCNA positive/PCR positive had higher all-cause 30-day mortality than CCNA-negative/PCR-positive patients. This study only presented results obtained after repeat testing of indeterminate results. The claimed PPV of 91.9%, using clinical diagnosis as the reference, is much higher than found elsewhere. Patients were not followed long term to assess other clinical outcomes.
In summary, if patients are screened carefully for clinical symptoms likely associated with CDI (at least 3 loose or unformed stools in ≤24 hours with history of antibiotic exposure), then a highly sensitive test such as a NAAT alone or multistep algorithm (ie, GDH plus toxin; GDH plus toxin, arbitrated by NAAT; or NAAT plus toxin) may be best. A 2- or 3-stage approach increases the PPV vs one-stage testing.
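The branching logic of one such multistep approach (GDH plus toxin, arbitrated by NAAT) can be summarized in a short sketch. This is a minimal illustration of the decision flow described above, not an official reporting protocol; the function name and result strings are assumptions, and actual laboratory reporting rules vary by institution and assay:

```python
# Hedged sketch of a 2-step GDH/toxin screen with NAAT arbitration of
# discordant results. Result wording is illustrative, not prescriptive.
from typing import Optional

def gdh_toxin_naat(gdh_pos: bool, toxin_pos: bool,
                   naat_pos: Optional[bool] = None) -> str:
    if gdh_pos and toxin_pos:
        return "positive: CDI likely"            # concordant positive screen
    if not gdh_pos and not toxin_pos:
        return "negative"                        # concordant negative screen
    # Discordant GDH/toxin results are arbitrated by NAAT.
    if naat_pos is None:
        return "discordant: NAAT arbitration required"
    if naat_pos:
        # Organism (or toxin gene) present but free toxin not detected.
        return "toxigenic C. difficile detected; toxin not detected"
    return "negative"
```

In this sketch, a NAAT-positive, toxin-negative result is reported descriptively rather than as CDI, reflecting the evidence reviewed above that such patients often resemble excretors or carriers.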
The issue of whether or when to retest for CDI is inherently linked to the accuracy of the routine testing method employed. Methods with suboptimal sensitivity for C. difficile (eg, stand-alone toxin EIAs) have led to frequent retesting in some settings. Ironically, with tests of suboptimal specificity, repeated testing runs a high risk of eventually generating false-positive results. Ideally, in the absence of clear changes in the clinical presentation of suspected CDI (ie, a change in the character of diarrhea or new supporting clinical evidence), repeat testing should not be performed. This advice is based on the above-mentioned issues and also on studies showing that the diagnostic yield of repeat testing within a 7-day period (with either toxin A/B EIA or NAAT) is approximately 2% [191, 192]. Furthermore, with highly sensitive testing strategies (eg, 2-stage algorithms or stand-alone NAATs), a single test has a very high negative predictive value (typically >99%) for CDI.
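The false-positive risk of repeated testing can be illustrated with a back-of-envelope calculation. The 97% specificity figure below is an illustrative assumption (not a quoted assay characteristic), and the model simplistically treats repeat tests on the same patient as independent:

```python
# Rough illustration: for a truly C. difficile-negative patient, the chance
# that at least one of n repeat tests is falsely positive is
# 1 - specificity**n, assuming (simplistically) independent test errors.
# 97% specificity is an illustrative assumption only.

def p_any_false_positive(specificity: float, n_tests: int) -> float:
    """Probability of >=1 false positive over n independent tests."""
    return 1 - specificity ** n_tests

for n in (1, 2, 3):
    print(f"{n} test(s): {p_any_false_positive(0.97, n):.1%} "
          "chance of at least one false positive")
```

Even under this simple model, the cumulative false-positive risk roughly triples by the third repeat test, which is one reason routine retesting within 7 days is discouraged.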
There may be more value in repeat testing in epidemic settings where CDI acquisition is more frequent [193, 194]. For symptomatic patients with a high clinical suspicion of CDI but a negative CDI test, particularly those in whom symptoms worsen, repeat testing should be considered; this does not equate to routine retesting, given that the great majority of patients with suspected CDI do not have the disease.
Given that recurrent CDI occurs commonly, a recurrence of symptoms following successful treatment and diarrhea cessation should be assessed by repeat testing. Testing for recurrent CDI should ideally include toxin detection, as persistence of toxigenic C. difficile can occur commonly after infection. Patients can have reduced health scores for months after CDI, and may experience altered bowel habits for prolonged periods. In one study in which all CDI patients with recurrent diarrhea were tested for toxin in stool, 35% were negative. Empiric treatment, that is, without confirmatory testing of suspected recurrence, is discouraged, as this may be unnecessary and indeed possibly harmful to microbiome restoration.
A variety of fecal biomarkers to distinguish inflammatory causes of diarrhea from noninflammatory conditions, such as irritable bowel syndrome, have evolved over the last few decades. Lactoferrin is an iron binding glycoprotein found in neutrophils and its concentration in stool is proportional to the number of neutrophils present. Calprotectin is a calcium binding protein found in the cytosol of neutrophils. Secretion of cytokines in the intestines such as interleukin 8 and interleukin 1β has also been evaluated [199–201]. While they have utility in diagnosing IBD, their usefulness in the diagnosis of CDI has not been established. Most of the published studies include small or moderate numbers of patients. There are few prospective studies. Interpretation of the literature is further complicated by the use of different methods of testing (latex agglutination vs EIA in the case of fecal lactoferrin), deviation from the manufacturers’ cutoffs for interpretation, and other confounding factors. Some of these biomarkers may be helpful in identifying patients at risk for severe disease. Given these limitations, no recommendations for their routine use can be made.
The rate of C. difficile colonization among asymptomatic infants can exceed 40% [136, 143, 154]. Colonization rates among hospitalized neonates are greater than observed for healthy infants. Although the rate of colonization declines over the first year of life, intermittent detection of C. difficile toxin can persist throughout infancy. Clostridium difficile toxin can still be detected in approximately 15% of 12-month-old infants. Thus, there is a substantial risk of a biologic false positive when C. difficile diagnostic testing is performed in neonates and infants. Another challenge to defining when an infant with diarrhea should be tested for C. difficile is the absence of a validated definition of clinically significant diarrhea in this age group, where passage of frequent loose stools is common. Children <12 months of age should only be tested for C. difficile if they have evidence of pseudomembranous colitis or toxic megacolon, or if they have clinically significant diarrhea and other causes of diarrhea have been excluded.
The prevalence of asymptomatic colonization with C. difficile is elevated in the second year of life, although to a lesser degree than in infants [139, 153, 154]. Therefore, testing in this population should also be avoided unless other infectious and noninfectious causes of diarrhea have been excluded. However, by 2–3 years of age, approximately 1%–3% of children are asymptomatic carriers of C. difficile (a rate similar to that observed in healthy adults). Rarely, some conditions such as Hirschsprung disease may predispose young children to CDI, and testing should be considered in this population [203, 204]. The role of C. difficile in community-onset diarrhea in otherwise healthy young children remains controversial. Studies of children hospitalized with acute gastroenteritis have documented that C. difficile can be isolated in >50% of children in whom an alternate gastrointestinal pathogen has been identified. Additionally, one recently published study found that among 100 children <2 years of age who were hospitalized with diarrhea and had C. difficile toxin detected, all had resolution of diarrhea regardless of whether C. difficile–specific therapy was administered. Limited data suggest that identification of multiple enteric pathogens (including C. difficile) may predict more severe symptoms.
Isolation of patients with CDI or suspected CDI is a prevention measure used by most healthcare facilities regardless of local epidemiology; however, additional measures are often implemented, particularly when CDI rates are high. An infection control “bundle” strategy has been used to successfully control major CDI outbreaks [207–211]. The “bundle” approach involves multifaceted interventions and includes hand hygiene, isolation measures, environmental disinfection, and antibiotic stewardship. However, it is often difficult to determine which interventions were the most effective in controlling the outbreak as they are implemented simultaneously.
Hospital room design and handwashing accessibility are essential elements in the prevention and control of CDI. Private rooms may facilitate better infection control practices. In a cohort study of healthcare-associated CDI acquisition, higher rates of CDI were demonstrated among patients housed in double rooms than in single rooms (17% vs 7%; P = .08) and there was a significantly higher risk of acquisition after exposure to a roommate with a positive culture result. The effect of private rooms on CDI and other bacterial acquisition rates was studied when an ICU was renovated to only private rooms with accessible handwashing facilities. There was a significant reduction in CDI rates by 43%, although other potential confounders, such as antibiotic utilization, were not examined. Private rooms may not be available and cohorting patients with CDI in a multibed room may be required. The risk of recurrence was examined among patients with CDI admitted to a cohort ward while adjusting for potential risk factors such as age, comorbidities, and continued antibiotic use. Admission to a C. difficile cohort ward was shown to be an independent predictor for recurrence. If cohorting is required, dedicated commodes should be provided to the patients to reduce further cross-transmission.
In conclusion, patients with CDI should be placed in a private room to decrease transmission to other patients. If there is a limited number of private single rooms, CDI patients with stool incontinence should be prioritized for placement in private rooms. If cohorting is required, it is recommended to cohort patients infected or colonized with the same organism(s); ie, do not cohort patients with CDI who are discordant for other multidrug-resistant organisms such as MRSA or vancomycin-resistant Enterococcus (VRE).
Additional isolation techniques (contact precautions, private rooms, and cohorting of patients with active CDI) have been used for control of outbreaks, with variable success [207, 214, 215]. Contact precautions include the donning of gowns and gloves when caring for patients with CDI. The hands of personnel can become contaminated with C. difficile spores, particularly when gloves are not used and when exposed to fecal soiling [74, 216]. Wearing gloves in conjunction with hand hygiene should decrease the concentration of C. difficile organisms on the hands of healthcare personnel. A prospective controlled trial of vinyl glove use for handling body substances showed a significant decrease in CDI rates, from 7.7 cases per 1000 discharges before institution of glove use to 1.5 cases per 1000 discharges after institution of glove use (P = .015), but not on control wards that did not institute the glove intervention. Care should also be taken to prevent contamination of hands when removing gloves.
Clostridium difficile has been detected on nursing uniforms, but there is no evidence that uniforms are a source of transmission to patients. The use of gowns has been recommended because of potential soiling and contamination of the uniforms of healthcare personnel with C. difficile and high quality of evidence for reducing transmission of other enteric multidrug-resistant organisms (ie, VRE) [219, 220]. In addition, the fact that gloves reduce transmission provides further indirect evidence for gowns.
It is important to place patients suspected of having CDI on contact precautions before diagnostic laboratory test confirmation if there will be a lag before test results are available. In a prospective study of 100 patients suspected of CDI, skin contamination was evaluated as well as the average time for test results to become available. The potential for healthcare personnel hand contamination was assessed by applying sterile gloved hands to frequently examined patient skin sites and then imprinting the gloves onto agar for C. difficile culture. Twenty of these 100 patients (20%) were diagnosed with CDI but the test results were not available for 2.07 days. The frequency of C. difficile acquisition on gloved hands of healthcare personnel after skin contact with these patients was 69%. This study supports placing patients with suspected CDI on preemptive contact precautions pending C. difficile test results when results cannot be obtained on the same day the specimen is collected.
The CDC currently recommends that contact precautions be continued for the duration of the illness. The UK guidelines recommend continuing contact precautions for at least 48 hours after diarrhea resolves. Clostridium difficile was suppressed to undetectable levels in stool samples from most patients by the time diarrhea resolved (mean, 4.2 days) in a prospective study of 52 patients. However, at the time of resolution of diarrhea, skin and environmental contamination was high at 60% and 37%, respectively. In addition, stool detection of C. difficile was 56% at 1–4 weeks posttreatment. Continue contact precautions for at least 48 hours after diarrhea has ceased. There are no studies that demonstrate further extending contact precautions results in reductions in CDI incidence. Prolonging contact precautions until discharge remains a special control measure if CDI rates remain high despite implementation of standard infection control measures against CDI.
Transmission of C. difficile strains commonly occurs via the hands of healthcare personnel. After caring for patients with CDI, the proportion of healthcare personnel with hand contamination when gloves are not worn ranges from 14% to 59% [74, 87, 216, 224]. Hand hygiene is considered to be one of the cornerstones of prevention of transmission of C. difficile, as it is for most other healthcare-associated infections. Many studies have documented low rates of handwashing by healthcare personnel, particularly when sinks are not readily accessible [225–228]. The introduction of alcohol-based hand antiseptics has been considered transformative for increasing hand hygiene compliance. Hand hygiene guidelines recommend the use of alcohol-based products, unless the hands have come into contact with body fluids or are visibly soiled, in which case handwashing with soap and water is recommended. These alcohol-based antiseptics are popular because of their ease of use at the point of care and their effectiveness in rapid killing of most vegetative bacteria and many viruses that contaminate hands. However, C. difficile spores are highly resistant to killing by alcohol. Indeed, the addition of ethanol to stool samples in the laboratory facilitates the culture of C. difficile from these specimens. Therefore, healthcare personnel who do not wear gloves or whose hands become contaminated when doffing gloves may be merely redistributing spores over the hand surface when using alcohol-based products. This could potentially increase the risk of transferring C. difficile to patients under their care, but numerous studies have not shown an association between the use of alcohol-based hand hygiene products and an increased incidence of CDI. The impact of using an alcohol-based hand hygiene product on rates of infection with MRSA, VRE, and CDI 3 years before and after its implementation was studied.
After implementation, the rates of MRSA and VRE infections decreased by 21% and 41%, respectively, whereas the incidence of CDI was unchanged. This finding is consistent and has been reproduced in other studies [231–234]. A large prospective, ecological interrupted time series study was conducted from July 2004 to June 2008 in England and Wales to evaluate the impact of the “cleanyourhands” campaign on the rates of hospital procurement of alcohol hand rub and soap and to investigate the association between the rates of MRSA bacteremia and CDI. Procurement of these products was used as a proxy for hand hygiene compliance. This study demonstrated that increased soap procurement was significantly associated with a decline in CDI rates whereas increased alcohol hand rub procurement was significantly associated with a reduction in MRSA bacteremia rates.
The use of alcohol-based products has been compared with other methods of hand hygiene in removal of C. difficile spores [236, 237]. These studies evaluated the efficacy of different handwashing methods among volunteers for removal of spores of a nontoxigenic strain of C. difficile. Handwashing with soap and water, or with an antimicrobial soap and water, was found to be more effective at removing C. difficile spores than alcohol-based hand hygiene products. McFarland et al showed that chlorhexidine-containing antiseptic was more effective than plain soap for eliminating C. difficile from the hands of healthcare personnel. Clostridium difficile was recovered from the hands of 88% of personnel (14 of 16) who had washed with plain soap. Washing with 4% chlorhexidine gluconate reduced the rate to 14% (1 of 7 personnel); in contrast, another study that conducted experimental hand seeding with C. difficile spores showed no difference between plain soap and chlorhexidine gluconate in removing C. difficile from hands.
In summary, there is a theoretical possibility for alcohol-based hand hygiene products to increase the incidence of CDI because of their inability to eliminate C. difficile spores from the hands. However, there have not been any clinical studies to support that the use of alcohol-based hand hygiene products results in an increased incidence of CDI. Therefore, before and after providing care for a patient with CDI, it is recommended to preferentially use soap and water over alcohol-based products alone for hand hygiene in CDI-hyperendemic (sustained high rates) or outbreak settings. It is important to confirm compliance with glove use and to use alcohol-based products in nonoutbreak or endemic settings.
The hands of patients can also become contaminated with C. difficile at a rate of 32%. Potentially, these patients can transmit C. difficile to surfaces. In addition, this could be a factor in CDI recurrence when the spores are ingested from their contaminated hands. Patient bathing can also decrease skin contamination of C. difficile. Among 37 patients with CDI, showering was more effective than bed bathing in decreasing the rate of positive skin cultures. Encouraging patients to wash hands and shower could be a useful strategy to reduce the burden of spores on the skin.
Single-use disposable equipment should be used to prevent CDI transmission. Nondisposable medical equipment should be dedicated to the patient’s room, and other equipment should be thoroughly cleaned after use in a patient with CDI. Environmental contamination has been associated with the spread of C. difficile via contaminated commodes, blood pressure cuffs, and oral and rectal electronic thermometers [74, 241, 242]. Replacement of electronic thermometers with single-use disposable thermometers has been associated with significant decreases in CDI incidence. During simulated routine physical examinations on patients with CDI, stethoscopes were found to acquire and transfer C. difficile spores as often as gloved hands. These results support the recommendation to use disposable patient equipment when possible and to ensure that reusable equipment is cleaned and disinfected with a US Environmental Protection Agency–registered, sporicidal disinfectant, when possible. It is important to ensure that the responsibility and methods for cleaning and disinfection are clearly defined in standard operating procedures.
Clostridium difficile produces spores that are resistant to most standard hospital environmental disinfectants and can survive for months in the hospital environment. Patients who are colonized with C. difficile shed spores and contaminate their local environment. These spores can serve as a source of transmission to other patients. Surfaces from which C. difficile spores have been cultured include toilets, commodes, floors, bed rails, call buttons, sinks, and overbed tables [87, 246]. Although some studies demonstrated that epidemic strains have increased capacity for sporulation, other studies have not. Environmental contamination is lowest in rooms of culture-negative patients (<8% of rooms), intermediate in rooms of patients with asymptomatic C. difficile colonization (8%–30% of rooms), and highest in rooms of patients with CDI (9%–50% of rooms) [74, 87, 245, 248]. Samore et al found the degree of environmental contamination to correlate with the degree of healthcare personnel hand contamination. Hand contamination was 0%, 8%, and 26% when environmental contamination was 0–25%, 26%–50%, and >50%, respectively. Of note, this study was conducted prior to the routine use of contact precautions for patients with CDI, so regular use of gloves may decrease hand contamination if implemented.
Measuring the effect of environmental agents with sporicidal activity on the incidence of CDI is complicated by data that indicate that most patients with CDI do not directly acquire C. difficile from the environment, the existence of different methods to apply these agents, and the record of inconsistent impact of sporicidal agents on reducing CDI incidence in nonoutbreak settings. Several recent studies provide insight as to why this may be. Shaughnessy et al found admission to an ICU room that previously housed a patient with CDI to be a risk factor for CDI, but only 11% of patients who developed CDI had this risk factor. Consistent with this finding, a modeling study found that environmental contamination with C. difficile spores likely contributes to only 10% of new CDI cases. In addition, studies using sequencing to characterize isolates found only 2%–7% of new CDI cases could be attributed to environmental contamination [75, 250]. Studies that have found a reduction in CDI after implementation of a sporicidal agent have mostly occurred in outbreak settings, with implementation of the sporicidal agent occurring concurrently with other interventions to prevent CDI [251–253]. However, sporicidal agents have not been associated with reductions in CDI in nonoutbreak settings [86, 88]. This is likely because in an endemic setting, in the absence of consecutive patients admitted to a room developing CDI, the degree of environmental contamination is not sufficient to cause transmission. In addition, C. difficile spores are physically removed when surfaces are wiped down. Other variables confound comparisons across studies: several different products have been used (various dilutions of sodium hypochlorite, phenol-based agents, peroxide-based agents, and ultraviolet irradiation); the products were applied by people or by automated systems; and they were used with daily cleaning alone, daily cleaning plus terminal cleaning, terminal cleaning alone, or periodic “deep cleaning.”
In outbreak settings, terminal disinfection with a sporicidal agent in conjunction with other interventions to prevent CDI has been associated with reductions in CDI. However, terminal disinfection with a sporicidal agent has not been associated with consistent reductions in CDI in nonoutbreak settings. Therefore, this remains most appropriate as a supplemental intervention for outbreaks, hyperendemic settings, and settings with evidence of repeated cases of CDI in the same room.
If a sporicidal agent is implemented, ensuring compliance with thorough cleaning has been associated with reductions in viable C. difficile spores in the environment.
One hospital evaluated several successive interventions to decrease C. difficile spore contamination, including terminal disinfection with bleach, use of fluorescent markers to assess cleaning adequacy, use of an automated ultraviolet radiation device, and a dedicated team focused on daily cleaning of rooms housing patients with CDI; the dedicated cleaning team was clearly the most effective at removing viable C. difficile spores from the environment. Several methods have been used to assess thoroughness of cleaning, including fluorescent markers and adenosine triphosphate bioluminescence [254, 255]. These measures of cleaning adequacy are most effective when feedback is given in real time. Barriers to effective cleaning may include insufficient time for cleaning, inadequate cleaning supplies, inadequate education, and poor communication. Just as, if not more, important than using markers and providing feedback is having environmental services staff dedicated to thorough cleaning.
“No-touch” disinfection technologies have garnered much interest of late. In general, these products use ultraviolet radiation or hydrogen peroxide vapor to disinfect the environment, and several studies have found that these products are effective at reducing viable C. difficile spores from patient rooms [254, 256, 257]. No single methodology (“no-touch” or otherwise) appears to be superior in regard to reductions in CDI incidence. Automated, terminal disinfection using a sporicidal method has been associated with reductions in viable C. difficile spores from the environment. There have been several reports associating use of no-touch disinfection technologies and reductions in CDI, but all of these have at least one significant limitation. These include before–after study designs, inappropriate statistical methods to analyze the data, other concurrent interventions, high baseline incidence of CDI prior to implementation, reduction of CDI back to baseline prior to no-touch technology implementation, and reductions driven by results from single units without apparent impact on other units [256, 258–264]. Data are currently too limited to draw any conclusions as to whether or when these devices should be a component of a CDI prevention program.
Daily sporicidal disinfection can be effective at reducing C. difficile environmental contamination and has been associated with reductions in CDI in outbreak settings in conjunction with other interventions to prevent CDI. Mayfield et al reported that the introduction of disinfection with a hypochlorite-based solution (5000 ppm available chlorine) was associated with reduced incidence of CDI in a bone marrow transplant unit where there was a relatively high incidence of CDI. Notably, the incidence of CDI increased almost to the baseline level after the reintroduction of the original quaternary ammonium compound as the principal cleaning agent. However, the environmental contamination of C. difficile was not measured in this study, and the results were not reproducible on other units with low CDI incidence. Orenstein et al evaluated the use of daily disinfection with bleach wipes containing 0.55% active chlorine on the incidence of HA-CDI in 2 units with hyperendemic rates. The intervention successfully decreased the incidence by 85%. Daily disinfection of high-touch surfaces using a peracetic acid-based disinfectant was also shown to reduce contamination of healthcare workers’ hands. In contrast to daily disinfection, Hacek et al conducted a study to examine the impact of only terminal room cleaning with hypochlorite containing solution and no change to the daily room cleaning with quaternary ammonium. With this intervention, there was a statistically significant decrease of 48% in the incidence of CDI.
There have not been any head-to-head comparisons of daily vs terminal cleaning using only sporicidal disinfection.
In institutions with higher rates of CDI (7.8–22.5 cases per 1000 discharges), the number of asymptomatic carriers has been found to be considerably higher than the number with CDI [74, 87]. These asymptomatic carriers admitted to a ward could represent an important source of healthcare-associated spread of infection [92, 267, 268]. Results from mathematical modeling studies have suggested that reductions in CDI incidence by 10%–25% could be achieved by identifying and isolating carriers upon hospital admission [269, 270]. This novel approach was implemented by Longtin et al in an acute care hospital in Quebec that had high endemic rates of CDI. Using a quasi-experimental design and time series analysis, the effect of detecting and isolating asymptomatic carriers was evaluated. Potential confounders such as antibiotic and PPI utilization, hand hygiene compliance, and intensity of CDI testing were taken into consideration. The incidence of CDI decreased significantly after this intervention compared with the preintervention period and the lower incidence was sustained for at least 1 year after the study terminated. This study provides the most compelling evidence to date for the significant effect of isolating carriers. However, several potential confounders were not assessed, including compliance with isolation precautions, the effect of environmental cleaning, and the influence of known C. difficile carrier status on the management of a patient. Ultimately, these promising results need to be reproduced in multiple centers prior to being considered for widespread adoption. If these findings are confirmed in different hospital settings, implementation of screening and isolation of asymptomatic carriers may be an important strategy to decrease CDI rates.
Antibiotic restriction may be one of the most useful control measures for a CDI outbreak. Fifteen quasi-experimental studies published between 1994 and 2013 were identified that evaluated the effectiveness of interventions to decrease antibiotic usage and changes in CDI rates [272–286]. Most studies were considered moderate (n = 13) or low (n = 2) quality. No randomized controlled trials (RCTs) were identified. A summary of the published studies is shown in Table 5. Studies published during 1994–2014 from hospitals (n = 13) or long-term care facilities (n = 2) were based in North America (n = 7) or the United Kingdom (n = 8). All studies but one were associated with an ongoing CDI epidemic (defined by most studies as a dramatic increase in rate of CDI), of which 7 studies demonstrated a clonal, epidemic strain. All studies used either a formulary restriction strategy (n = 11) or prospective audit and feedback (n = 4) as their predominant stewardship strategy. Targeted antibiotics included fluoroquinolones (n = 7 studies), cephalosporins (n = 10), clindamycin (n = 5), amoxicillin or amoxicillin-clavulanate (n = 3), other β-lactamase inhibitors, carbapenems, vancomycin, or aztreonam (n = 1 each). Many studies targeted more than one antibiotic (n = 6). Second- and third-generation cephalosporins were more likely targets of intervention from studies published in the 1990s to early 2000s, with fluoroquinolones targeted more frequently in studies published after 2000. Antibiotics within the same class (eg, cephalosporins) may not have the same risk for CDI, and studies usually targeted the antibiotic most likely causing the current epidemic (generally considered the most widely used antibiotic in the hospital). All interventions were highly effective at decreasing usage of the targeted antibiotic(s), with percentage reductions ranging from 50% to >90%, indicative of a successful process implementation.
When reported, a global decrease for all antibiotics was shown in 5 of 9 studies. Change of CDI incidence was recorded as number per 10000 patient-days (10 studies), CDI cases per month (3 studies), or CDI cases per 1000 discharges (2 studies). Three studies evaluated the change in incidence rate of CDI as a result of antibiotic change. Reduction in CDI incidence rates ranged from 33% to >90%, indicative of a successful outcome measure. After the intervention, rates of CDI ranged from 0.3–1.2 cases per 10000 patient-days.
| Year [Reference] | Area | Time Frame | Setting (Bed Size) | Dominant Strain | Stewardship Method | Targeted Antibiotics | Decrease in Targeted Antibiotics | Global Change in Hospital Antibiotic Use | CDI Rate Method | Pre-intervention | Post-intervention | Reduction in CDI Rates |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1994 | US | 1990–1992 | Hospital (168) | J7 | Restrictive use | Clindamycin | >90% | NR | a | 15.8 | 1.9 | 88% |
| 1997 | UK | 1994–1995 | Hospital (NR) | NR | Restrictive use | Cefuroxime | >90% | NR | b | 5.3 | 2.3 | 57% |
| 1998 | US | 1992–1996 | Hospital (703) | Clonal strain A | Restrictive use | Clindamycin | >90% | No change | b | 11.5 | 3.3 | 71% |
| 2003 | UK | 1995–2000 | Hospital (800) | NR | Restrictive use | Ceftriaxone | >90% | NR | c | 14.6 | 3.4 | 77% |
| 2003 | US | 1991–1998 | Hospital (159) | NR | Prospective audit and feedback | Third-generation cephalosporins and aztreonam | 75% | Decreased | c | 2.2 | 0.3 | 86% |
| 2004 | UK | 1997–2002 | Hospital (24 ward beds) | NR | Restrictive use | Cefotaxime | 83% | No change | a | 46 | 22 | 52% |
| 2004 | US | 2001–2003 | LTCF (100) | A | Restrictive use | Gatifloxacin | >90% | No change | c | 1.32 | 0.51 | 61% |
| 2007 | Canada | 2003–2006 | Hospital (683) | 027 | Restrictive use | High-risk antibiotics | 80% | Decreased | c | 2.03 | 0.82 | 60% |
| 2007 | UK | 1999–2003 | Hospital (78 ward beds) | NR | Prospective audit and feedback | Cephalosporins and amoxicillin-clavulanate | 50%–75% | No change | b | NR | NR | 65% |
| 2011 | UK | 2005–2007 | Hospital (495) | 027 | Restrictive use | High-risk antibiotics | 50%–75% | No change | c | 2.22 | 0.45 | 80% |
| 2012 | Canada | 2008–2010 | Hospital (48 ICU beds) | NR | Prospective audit and feedback | High-risk antibiotics | 20% | Decreased | c | 1.12 | 0.71 | 37% |
| 2013 | UK | 2008–2011 | Hospital (215) | NR | Restrictive use | Ceftriaxone and ciprofloxacin | 70%–90% | NR | c | 2.398 | 1.2 | 50% |
| 2011 | Ireland | 2004–2009 | Hospital (665) | 027 | Restrictive use | Quinolones | >90% | NR | c | 0.8 | 0.746 | d |
| 2012 | Ireland | 2004–2010 | Hospital (233) | NR | Restrictive use | High-risk antibiotics | 70%–90% | Decreased | c | 0.8 | 0.7 | e |
| 2012 | US | 2007–2010 | LTCF (160) | NR | Prospective audit and feedback | High-risk antibiotics | 25%–50% | Decreased | c | NR | NR | f |
Abbreviations: CDI, Clostridium difficile infection; ICU, intensive care unit; LTCF, long-term care facility; NR, not reported; UK, United Kingdom; US, United States.
a: CDI cases per 1000 hospital discharges.
b: CDI cases per month.
c: CDI cases per 10000 patient-days.
d: Each defined daily dose reduction in quinolone use per 100 bed-days resulted in a reduced incidence of CDI of 0.054 cases per 100 bed-days.
e: CDI incidence rate decreased by 0.0047/100 bed-days per month.
f: CDI rates reduced by 0.2 cases per 1000 patient-days.
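The "Reduction in CDI Rates" column in Table 5 is simple arithmetic on the pre- and postintervention rates; a quick sketch reproduces it using values taken directly from the table:

```python
def pct_reduction(pre, post):
    """Percentage reduction from a pre-intervention to a post-intervention rate."""
    return round(100 * (pre - post) / pre)

# (pre, post, reported reduction) triples from Table 5 rows
rows = [
    (15.8, 1.9, 88),   # 1994, clindamycin restriction
    (5.3, 2.3, 57),    # 1997, cefuroxime restriction
    (11.5, 3.3, 71),   # 1998, clindamycin restriction
    (2.03, 0.82, 60),  # 2007, high-risk antibiotic restriction
]
for pre, post, reported in rows:
    assert pct_reduction(pre, post) == reported
```

Note the units differ by row (footnotes a, b, and c), so reductions are comparable only as relative changes, not as absolute rates.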
The number of antibiotics and the duration of exposure can also influence the development of CDI. Use of multiple antibiotics (mean number used, 4.2 vs 1.4 antibiotics) was found to be an important risk factor for developing CDI, and the incidence of CDI increases with the number of antibiotics prescribed (relative risk, 1.49; 95% CI, 1.23–1.81) [102, 287]. A retrospective cohort of 241 patients examined the risk of development of CDI and cumulative antibiotic exposures. The risk of CDI was associated with increasing cumulative dose, number of antibiotics, and days of antibiotic exposure. For example, compared to patients who received only 1 antibiotic, the adjusted hazard ratios (HRs) for those who received 2, 3 or 4, or ≥5 antibiotics were 2.5 (95% CI, 1.6–4.0), 3.3 (95% CI, 2.2–5.2), and 9.6 (95% CI, 6.1–15.1), respectively. Therefore, it is critical to avoid unnecessary antibiotics and to minimize the duration of use to reduce the risk of CDI.
Although many hospitals have implemented an antibiotic stewardship program (ASP), it is important to sustain the program with the required resources. The benefits of an ASP include improved patient outcomes, reduced adverse events (including CDI), improved rates of antibiotic susceptibility, and optimized resource utilization.
There is a clinical association between PPI use and CDI [290–293]. Three recent meta-analyses assessed the association between PPI use and the risk for CDI using data from >47 studies containing >300000 patients. All studies demonstrated significant heterogeneity in the dataset, and 2 of the 3 noted publication bias (the third did not perform this analysis due to the underlying heterogeneity of the data). Kwok et al assessed 42 total studies (30 case-control; 12 cohort) totaling 313000 patients. Summary odds ratios (ORs) were presented for incident cases of CDI (OR, 1.74; 95% CI, 1.47–2.85) as well as recurrent CDI (OR, 2.51; 95% CI, 1.16–5.44). Concomitant use of non–C. difficile antibiotics increased the risk of CDI with PPI usage (OR, 1.96; 95% CI, 1.03–3.70). Use of histamine type 2 receptor antagonists was associated with a lower risk of CDI than PPI use. Janarthanan et al assessed 23 total studies (17 case-control and 6 cohort) totaling 288620 patients. Incidence of CDI increased with exposure to PPIs (OR, 1.69; 95% CI, 1.34–1.97). There was no difference in the summary OR if the analysis was limited to cohort (OR, 1.66; 95% CI, 1.23–2.24) or case-control studies (OR, 1.65; 95% CI, 1.38–1.98). Finally, Tleyjeh et al assessed 47 total studies (37 case-control and 14 cohort). Incidence of CDI increased with exposure to PPIs (OR, 1.69; 95% CI, 1.34–1.97). Two studies assessed the number of cases likely to occur with the addition of PPI therapy. The number needed to harm was higher for the general population (range, 899–3925) than for hospitalized patients not on concomitant antibiotics (range, 202–367) or hospitalized patients receiving concomitant antibiotics (range, 28–50). Despite clinical data showing consistently increased risk, heterogeneity of the data, the role of unknown confounders, the lack of dose–response relationships, and other methodologic considerations are considerable limitations to the practical application of these data.
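The dependence of number needed to harm on baseline risk discussed above can be illustrated with a short calculation. This is a minimal sketch of the standard conversion from an odds ratio and an assumed baseline risk to an absolute risk increase; the baseline risks used below are illustrative assumptions for the sketch, not values taken from the cited meta-analyses.

```python
def risk_with_exposure(baseline_risk, odds_ratio):
    """Convert a baseline risk and an odds ratio into the risk under exposure."""
    baseline_odds = baseline_risk / (1 - baseline_risk)
    exposed_odds = odds_ratio * baseline_odds
    return exposed_odds / (1 + exposed_odds)

def number_needed_to_harm(baseline_risk, odds_ratio):
    """NNH is the reciprocal of the absolute risk increase."""
    increase = risk_with_exposure(baseline_risk, odds_ratio) - baseline_risk
    return 1 / increase

# Assumed baseline CDI risks (illustrative only): low for the general
# population, much higher for hospitalized patients on concomitant antibiotics.
print(round(number_needed_to_harm(0.001, 1.74)))  # general population
print(round(number_needed_to_harm(0.05, 1.74)))   # hospitalized, on antibiotics
```

With the same odds ratio, a roughly 50-fold higher baseline risk shrinks the number needed to harm by a similar factor, which is why the cited NNH ranges differ so sharply between the general population and hospitalized patients receiving antibiotics.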
A number of further observational studies have investigated the association between PPI use and CDI after publication of these meta-analyses [27, 294–297]. A large population surveillance study of 984 patients with community-associated CDI showed that 31% of the CDI patients who had not received antibiotics had received a PPI. Three studies investigated the association between PPI usage and recurrent CDI in 1627 patients [294, 295, 297]. Two of the 3 studies did not show an association between PPI use and recurrent CDI. Finally, a study of 483 patients colonized with C. difficile showed that exposure to a PPI increased the risk of developing CDI. Thus, there appears to be a clinical association between PPI use and CDI, but the true causal relationship is unclear. No RCTs or quasi-experimental studies have examined the relationship between discontinuing or avoiding PPI use and the risk of CDI. Thus, a recommendation to globally discontinue PPIs in patients at high risk for CDI or recurrent CDI, regardless of the need for a PPI, will require further proof of causality. However, stewardship activities to discontinue unneeded PPIs are warranted.
Several meta-analyses indicate probiotics may be effective at preventing CDI when given to patients on antibiotics who do not have a history of CDI [298–300]. The typical CDI incidence among hospitalized people >65 years of age on antibiotics with a length of stay >2 days is ≤3%, even during outbreaks of CDI [21, 36, 248]. The studies with the greatest influence on the results of the meta-analyses had a CDI incidence 7–20 times higher in the placebo arms than would otherwise be expected for the patient population studied, potentially biasing the results in favor of the probiotic [301, 302]. When these studies are excluded, a trend toward a reduction in CDI remains, but it is not as great as when they are included. Many limitations remain when the studies with extremely high CDI incidence are excluded, including differences in the probiotic formulations studied, duration of probiotic administration, definitions of CDI, duration of study follow-up, and inclusion of patients not typically considered at high risk for CDI. There is also the potential for organisms in probiotic formulations to cause infections in hospitalized patients [303–305]. Due to these issues, there are insufficient data to recommend administration of probiotics for primary prevention of CDI.
Discontinuation of the inciting antibiotic agent(s) as soon as possible should always be considered, as their continued use has been shown to decrease clinical response and increase recurrence rates [292, 306]. Antibiotic therapy should be started empirically if a substantial delay in laboratory confirmation is expected (eg, >48 hours) or if a patient presents with fulminant CDI. For other patients, antibiotic therapy should be started after diagnosis to limit overuse of antibiotics and associated toxicities, including overgrowth of multidrug-resistant pathogens. Historically, administering antimotility agents to patients with diarrhea without consideration of, or specific therapy for, CDI has led to poor outcomes. Addition of an antimotility agent such as loperamide as an adjunct to specific antibacterial therapy for CDI may be safe, although no prospective or randomized studies are available [308, 309].
For 30 years, metronidazole and oral vancomycin have been the main antibiotic agents used in the treatment of CDI. Consensus on optimal treatment of CDI is evolving with the availability of new data on established agents and the introduction of a new, FDA-approved drug, fidaxomicin. Two RCTs conducted in the 1980s and 1990s that compared metronidazole therapy and vancomycin therapy found no difference in outcomes but included <50 patients per study arm [168, 310]. However, since 2000, additional randomized, placebo-controlled trials have shown that oral vancomycin was superior to metronidazole (Table 6) [170, 188]. The first study assessed clinical cure rates of 150 patients with CDI given oral metronidazole 250 mg 4 times daily (n = 79) compared to oral vancomycin 125 mg 4 times daily (n = 71). Cure was superior for all patients given oral vancomycin (97%) compared to metronidazole (84%; P < .006). Clinical cure superiority was also observed in 69 patients with severe disease given vancomycin (97%) compared to metronidazole (76%; P = .02). The second publication was a combined analysis of 2 multinational studies that compared the efficacy of tolevamer (n = 563), a toxin-binding polymer, with oral vancomycin 125 mg 4 times daily (n = 266) and oral metronidazole 250 mg 4 times daily (n = 289). Tolevamer was inferior to both metronidazole and vancomycin (P < .001). Metronidazole clinical response rates (72.7%) were also inferior to vancomycin (81.1%) response rates (P = .02). Combined, these RCTs published since 2000 demonstrated that metronidazole was inferior to oral vancomycin for clinical cure in patients with CDI (P = .002). These studies also demonstrated that metronidazole was inferior to oral vancomycin for resolution of diarrhea at end of treatment without CDI recurrence 21–30 days after treatment (P = .002).
A recent retrospective study of hospitalized patients with mild-to-moderate CDI found that metronidazole was inferior to vancomycin for treatment response in this population as well.
Nearly all randomized trials have compared 10-day regimens of CDI treatment agents, and 10 days should be sufficient to resolve symptoms in most patients. However, some patients may have a delayed response to treatment, particularly those treated with metronidazole. The recent randomized trial data (Table 6) [170, 188] have confirmed prior observational studies that demonstrated decreased effectiveness of oral metronidazole [312, 313]. If patients have improved but have not had symptom resolution by 10 days, extension of the treatment duration to 14 days should be considered. Use of oral metronidazole, however, should be restricted to an initial episode of nonsevere CDI in cases where other therapies are contraindicated or not available (Tables 4 and 5), and treatment should be limited to one course due to case reports of neurotoxicity with prolonged or repeated use [315, 316]. Although cost and utilization analyses were not specifically addressed in these guidelines, compounding of the intravenous formulation of vancomycin for oral administration has been used as a less expensive alternative when barriers to use of the capsular form of vancomycin exist (Table 7).
|Outcomes||No. of Participants (No. of Studies)||Percentage Resolution||Relative Effecta||P Value||Quality of Evidence (GRADE)b||Reference, First Author|
|Direct comparisons of metronidazole and vancomycin|
|Resolution of diarrhea at end of (10 days) treatment||RCTs prior to 2000: RR, 0.97 (.91–1.03); RCTs since 2000: RR, 0.89 (.82–.96); all RCTs: RR, 0.89 (.85–.96)||⊕⊕⊕⊕ High||Teasley|
|Resolution of diarrhea at end of treatment without CDI recurrence ~1 month after treatment||RCTs prior to 2000: RR, 1.0 (.90–1.2); RCTs since 2000: RR, 0.84 (.74–.94); all RCTs: RR, 0.87 (.79–.96)||⊕⊕⊕⊕ High||Teasley|
|Direct comparisons of fidaxomicin and vancomycin|
|Resolution of diarrhea at end of (10 days) treatment||1105d (2)||88 (FDX) vs 86 (VAN)||RR, 1.0 (.98–1.1)||.36||⊕⊕⊕⊕ High||Louie|
|Resolution of diarrhea at end of treatment without CDI recurrence ~1 month after treatment||1105d (2)||71 (FDX) vs 57 (VAN)||RR, 1.2 (1.1–1.4)||<.0001||⊕⊕⊕⊕ High||Louie|
|Direct comparisons of FMT and vancomycin|
|Resolution of diarrhea at end of treatment without CDI recurrence 56 days after treatment||29 (1)||81 (FMT) vs 31 (VAN)||RR, 2.6 (1.1–6.2)||.01||⊕⊕⊕⊖ Moderate||van Nood|
Abbreviations: CDI, Clostridium difficile infection; CI, confidence interval; FDX, fidaxomicin; FMT, fecal microbiota transplantation; GRADE, Grading of Recommendations, Assessment, Development and Evaluation; MTR, metronidazole; RCT, randomized controlled trial; RR, relative risk; VAN, vancomycin.
aAll relative risks calculated using vancomycin as the comparator agent. An RR <1.0 represents results favoring the use of vancomycin; an RR >1.0 represents results favoring the comparator.
bFor GRADE interpretation, see Figure 1.
cFull analysis set population in the 2 phase 3 tolevamer trials published in the same journal article.
dModified intention-to-treat population (combined analysis of both phase 3 fidaxomicin trials ).
eA second control group of 13 patients who received a bowel lavage in addition to vancomycin was included in this study. The RR for this comparison (FMT vs VAN + lavage) was 3.5 (95% CI, 1.1–9.8).
|Agent||Adult Dose||Costa||Initial Treatment Responseb||Recurrence Riskb||Resistance in Clinical Isolates||Adverse Events||Evidence Supporting Efficacy|
|Vancomycin||125 mg PO qid × 10 days||$ (Liq)||+++||++||Not reported||Minimally absorbed||Multiple RCTs; US FDA approved|
|Fidaxomicin||200 mg PO bid × 10 days|| ||+++||+||One clinical isolate with increased MIC||Minimally absorbed||Two phase 3 RCT comparisons to vancomycin; US FDA approved|
|Metronidazole||500 mg PO tid × 10 days||$||++||++||Increased MIC reported in some studies; hetero-resistance also reported||Neuropathy, nausea||Multiple RCTs|
|Nitazoxanide||500 mg PO bid × 10 days||$$||+++||++||Not reported||GI symptoms||Small RCT comparison to vancomycin and a modest-sized RCT comparison to metronidazole|
|Fusidic acid||250 mg PO tid × 10 days||NA in United States||++||++||Reported to develop in vivo resistance||GI symptoms||Modest-sized RCT comparison to metronidazole and a small RCT comparison to vancomycin|
|Inadequate data to support efficacy|
|Rifaximin||400 mg PO tid × 10 days||$$$||++||+?||Potential for development of high-level resistance||Minimally absorbed||1 small RCT comparison to vancomycin for primary treatment; case series and 1 RCT pilot study show promise for use as a post-vancomycin, “chaser” strategy in management of recurrent CDI|
|Tigecycline||50 mg IV bid × 10 days|| ||++?||?||Not reported||GI symptoms||Case reports and small case series|
|Bacitracin||25000 units PO qid × 10 days||$$||+||+?||Increasing resistance noted||Minimally absorbed, poor taste||Two small RCT comparisons to vancomycin|
Abbreviations: bid, twice daily; CDI, Clostridium difficile infection; FDA, Food and Drug Administration; GI, gastrointestinal; IV, intravenous; Liq, liquid formulation of vancomycin compounded from powder intended for intravenous administration; MIC, minimum inhibitory concentration; NA, not available; PO, oral; qid, 4 times daily; RCT, randomized controlled trial; tid, 3 times daily.
aAll prices are estimated in US dollars as quoted from Red Book Online Search, Micromedex Solutions, last accessed on 10 March 2015 or approximated hospital pharmacy pricing (tigecycline, bacitracin). $, $0–100;
b+, lowest; ++, intermediate; +++, highest; ?, unknown.
The previous IDSA/SHEA guidelines used severity criteria to guide treatment decisions, and use of vancomycin in particular. The criteria used were based on expert opinion and had not been validated at the time. Subsequently, other severity criteria have been used to document improved clinical response rates for patients with severe CDI who received vancomycin as opposed to metronidazole.
Several recent studies have evaluated potential factors for correlation with disease severity or treatment outcome [319, 320]. The database of the recent phase 3 fidaxomicin vs vancomycin treatment trials has been used to develop [319, 320] and validate factors that might predict treatment failure or cure. Bauer et al found that fever (>38.5°C), WBC count >15 × 10⁹/L, and creatinine >1.5 mg/dL correlated with treatment failure, and that the timing of measurement with respect to the positive stool C. difficile assay influenced the values of the variables. Miller et al measured 6 different factors individually and in various combinations to look for correlation with cure following treatment. WBC count was the only single factor that correlated with cure, and a score based on a combination of age, treatment with non-CDI systemic antibiotics, leukocyte (WBC) count, albumin, and serum creatinine (ATLAS) was the most discriminatory. The ATLAS score showed excellent predictive value in the validation cohort, although it was designed as a continuous variable and the optimal cutoff score was not clear. In addition, severely ill patients were not included and metronidazole treatment response was not evaluated.
As a practical measure, we continue to recommend WBC count and serum creatinine as supportive clinical data for the diagnosis of severe CDI, but have changed the creatinine value to an absolute value as opposed to the previous comparison to baseline values, which are not always available (Table 1). Further validation of these criteria is still needed, and these criteria do not perform well for patients with underlying hematologic malignancies or renal insufficiency.
Two RCTs compared oral vancomycin to oral fidaxomicin for the treatment of CDI [321, 324]. The primary and secondary endpoints were resolution of diarrhea at the end of the 10-day treatment course and resolution of diarrhea at the end of treatment without CDI recurrence 25 days after treatment, respectively. In total, 1105 patients were enrolled and eligible for the intention-to-treat analysis. Resolution of diarrhea was similar in patients given fidaxomicin (88%) or vancomycin (86%) (RR, 1.0; 95% CI, .98–1.1). Resolution of diarrhea at end of treatment without recurrence 25 days after treatment (sustained clinical response) was superior for fidaxomicin (71%) compared to vancomycin (57%) (RR, 1.2; 95% CI, 1.1–1.4). A post hoc exploratory time-to-event meta-analysis of the 2 studies investigated a composite endpoint of persistent diarrhea, CDI recurrence, or death over 40 days in patients given fidaxomicin or vancomycin. Fidaxomicin reduced the incidence of the composite endpoint by 40% compared to vancomycin (95% CI, 26%–51%; P < .001), primarily due to decreased recurrence in patients given fidaxomicin. Deaths within the first 12 days of therapy occurred in 7 of 572 patients given fidaxomicin and 17 of 592 given vancomycin (P = .06). The effect of fidaxomicin compared to vancomycin was reduced in patients infected with the epidemic BI strain (HR, 0.78; 95% CI, .51–1.19) compared to non-BI strains (HR, 0.30; 95% CI, .19–.46). Finally, a subanalysis from the North American study demonstrated that patients treated with fidaxomicin were less likely to have acquisition and overgrowth of vancomycin-resistant Enterococcus (VRE) and Candida species. However, subpopulations of VRE with elevated fidaxomicin minimum inhibitory concentrations (MICs) were common, suggesting that this effect may change over time if enterococcal resistance to fidaxomicin becomes common.
Although these data were derived from 2 separate studies and patients with fulminant CDI were not included, both studies included the same treatment protocols and >1000 patients were randomized in a double-blinded manner. Based on these 2 large clinical trials and meta-analyses, fidaxomicin should be considered along with vancomycin as the drug of choice for an initial episode of CDI.
Additional treatment agents that are probably effective, but have less supportive evidence and which have not received FDA approval, include nitazoxanide and fusidic acid (Table 7). Additional agents with inadequate evidence to recommend treatment of an initial CDI episode include rifaximin, tigecycline, and bacitracin (Table 7). Rifaximin, however, has been more extensively studied as an adjunctive postvancomycin treatment regimen in patients with recurrent CDI (see section XXXI). One concern with the use of rifaximin is the potential for resistance. Isolates with high MICs (>256 µg/mL) and development of high MICs during treatment with rifaximin are well documented.
Vancomycin, administered orally at high dosage, has been the historical recommendation for fulminant CDI, and there remains a lack of high-quality evidence to support this recommendation. If an ileus is present, then vancomycin can also be administered per rectum, even though it is unclear whether a sufficient quantity of the drug reaches beyond the left colon [44, 328, 329]. Despite the lack of data, it seems prudent to administer vancomycin by oral and/or rectal routes at higher dosages for patients with fulminant CDI (500 mg every 6 hours by mouth and 500 mg in approximately 100 mL of normal saline by retention enema). Use of high doses of vancomycin is safe, but detectable serum concentrations have been noted with high doses, prolonged exposure, renal failure, and disrupted intestinal epithelial integrity. Hence, it may be appropriate to monitor trough serum concentrations in such circumstances to rule out drug accumulation.
In fulminant CDI, intravenously administered metronidazole (500 mg every 8 hours) should be used in addition to vancomycin. This is especially important if ileus is present, as ileus may impair the delivery of orally administered vancomycin to the colon, whereas intravenously administered metronidazole is likely to achieve therapeutic concentrations in an inflamed colon. In patients not responding to vancomycin and metronidazole, intravenously administered tigecycline (loading dose of 100 mg followed by 50 mg 2 times per day) or passive immunotherapy with intravenous immunoglobulins (150–400 mg/kg) has been used, but no controlled trials have been performed [332–337]. Surgical intervention can be life-saving for selected patients. A rising WBC count (≥25000) or a rising lactate level (≥5 mmol/L) is associated with high mortality and may be helpful in identifying patients whose best hope for survival lies with early surgery. Subtotal colectomy is the established surgical procedure for patients with megacolon, colonic perforation, an acute abdomen, or for patients with septic shock and associated organ failure (renal, respiratory, hepatic, or hemodynamic compromise) [338, 339]. More recently, an alternative procedure (loop ileostomy with antegrade vancomycin lavage) has been proposed as a colon-preserving, less invasive (usually laparoscopic), and less morbid approach; it warrants further investigation as it may lead to improved outcomes as well as colon salvage.
The frequency of further episodes of CDI necessitating retreatment remains a major concern. Approximately 25% of patients treated for CDI with vancomycin can be expected to experience at least 1 additional episode [321, 324]. Recurrent CDI results from the same or a different C. difficile strain but, in clinical practice, it is impossible to distinguish these 2 mechanisms [341, 342]. Diagnosis and management do not differ between the former (relapse) or the latter (new infection). Recurrence rates are significantly lower following treatment of an initial CDI episode with fidaxomicin as compared to vancomycin [321, 322, 324]. Risk factors for CDI recurrence are the administration of other antibiotics during or after initial treatment of CDI, a defective humoral immune response against C. difficile toxins, advancing age, and increasingly severe underlying disease [81, 343]. Continued use of PPIs has also been associated with an increased risk of recurrence [344, 345].
A first recurrence of CDI may be treated with oral vancomycin (particularly if metronidazole was used for the first episode), vancomycin followed by a tapered and pulsed regimen, or fidaxomicin. In a randomized, stratified substudy of patients with a first CDI recurrence, a subsequent, second recurrence was less common following therapy with fidaxomicin compared to a standard 10-day course of vancomycin (19.7% vs 35.5%; P = .045). Uncontrolled, postapproval experience with fidaxomicin suggests less efficacious responses in terms of cure and subsequent recurrence after treatment of patients with recurrent CDI, particularly those with ≥2 recurrences. Oral vancomycin should be used as a tapered and pulsed-dose regimen if a standard 10-day course of vancomycin was used for the initial episode. Various regimens have been used and are similar to the following: after the usual dosage of 125 mg 4 times per day for 10–14 days, vancomycin is administered at 125 mg 2 times per day for a week, 125 mg once per day for a week, and then 125 mg every 2 or 3 days for 2–8 weeks, in the hope that C. difficile vegetative forms will be kept in check while allowing restoration of the normal microbiota. Metronidazole is not recommended for treatment of recurrent CDI, as initial and sustained response rates are lower than for vancomycin (Table 7). Furthermore, metronidazole should not be used for long-term therapy because of the potential for cumulative neurotoxicity [348, 349].
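The tapered and pulsed-dose regimen described above can be written out as an explicit schedule. The sketch below is illustrative only: the phase lengths are arbitrary choices within the stated ranges (14 days for the initial course, a 3-day pulse interval, 4 weeks of pulsed dosing), and nothing here constitutes a dosing recommendation.

```python
# Illustrative encoding of the tapered/pulsed oral vancomycin regimen
# described in the text. Durations marked "chosen" are arbitrary points
# within the ranges the text allows (10–14 days, every 2 or 3 days, 2–8 weeks).
TAPER_PHASES = [
    # (dose, frequency, duration in days)
    ("125 mg", "4 times per day", 14),  # initial course (chosen from 10–14 days)
    ("125 mg", "2 times per day", 7),   # taper, week 1
    ("125 mg", "once per day", 7),      # taper, week 2
    ("125 mg", "every 3 days", 28),     # pulse (chosen from 2–8 weeks)
]

def _with_offsets(phases):
    """Yield each phase together with its starting day offset."""
    start = 0
    for dose, freq, days in phases:
        yield dose, freq, days, start
        start += days

def describe_regimen(phases):
    """Render the regimen as human-readable lines plus its total length in days."""
    lines = [f"Days {start + 1}–{start + days}: {dose} {freq}"
             for dose, freq, days, start in _with_offsets(phases)]
    total = sum(days for _, _, days in phases)
    return lines, total

lines, total_days = describe_regimen(TAPER_PHASES)
for line in lines:
    print(line)
```

Laying the phases end to end makes the overall commitment explicit: with these chosen durations the full tapered and pulsed course spans 8 weeks.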
Second or subsequent CDI recurrences may be treated with oral vancomycin as a tapered and pulsed-dose regimen as described above. In a small RCT, patients received rifaximin 400 mg 3 times daily or placebo for 20 days immediately after completing standard therapy for CDI. CDI recurrences occurred in 5 of 33 (15%) patients given rifaximin and in 11 of 35 (31%) patients given placebo (P = .11). Experience using fidaxomicin to treat multiply recurrent CDI is limited. There is little evidence that adding cholestyramine, colestipol, or rifampin to the treatment regimen decreases the risk of a further recurrence.
Several probiotics including Saccharomyces boulardii and Lactobacillus species have shown promise for the prevention of CDI recurrence [352–354]. However, as yet, none has demonstrated significant and reproducible efficacy in controlled clinical trials.
Some patients need to receive other antibiotics during or shortly after the end of CDI therapy. These patients are at a higher risk of a recurrence and its attendant complications [81, 306, 343]. Many clinicians prolong the duration of CDI treatment in such cases until after the other antibiotic regimens have been stopped. Lower doses may be sufficient to prevent recurrence (eg, vancomycin 125 mg once daily). Whether this approach reduces the risk of CDI recurrence is unknown, but one retrospective study suggested no benefit for extension of CDI treatment beyond 10–14 days. A similar concern is encountered among patients who have successfully completed treatment for CDI but are subsequently administered systemic antibiotics. Two retrospective cohort studies have examined the risk of recurrent CDI in patients with subsequent antibiotic exposure, comparing those who were empirically treated with vancomycin during that exposure with those who were not [356, 357]. One of these studies looked at patients who received antibiotics within 90 days of the prior episode, and the other looked at patients who were rehospitalized (1–22 months later) and given systemic antibiotics. The vancomycin dose and regimen varied considerably, but both studies showed a decreased risk of subsequent CDI for some patients treated empirically with vancomycin. One study showed a decreased risk for those whose previous CDI episode was itself a recurrence, but not for those following a primary CDI episode. The obvious bias in these studies was the unknown factors that dictated prescribing oral vancomycin prophylaxis. In addition, the long-term benefit is unknown.
To date there are no prospective, randomized studies of secondary prophylaxis of CDI to guide recommendations, but if the decision is to institute CDI prevention agents, it may be prudent to administer low doses of vancomycin or fidaxomicin (eg, 125 mg or 200 mg, respectively, once daily) while systemic antibiotics are administered. Factors that might influence the decision to administer secondary prophylaxis include length of time from previous CDI treatment, and patient characteristics (number of previous CDI episodes, severity of previous episodes, and underlying frailty of the patient).
Patients who have failed to resolve recurrent CDI despite repeated antibiotic treatment attempts present a particularly difficult challenge. Clinical investigations of patients with recurrent CDI have shown significant disruption of the intestinal microbiome diversity as well as relative bacterial population numbers. Instillation of processed stool collected from a healthy donor into the intestinal tract of patients with recurrent CDI has been used with a high degree of success to correct the intestinal dysbiosis brought about by repeated courses of antibiotic administration [358–361]. Anecdotal treatment success rates of fecal microbiota transplantation (FMT) for recurrent CDI have been high regardless of route of instillation of feces, and have ranged between 77% and 94% with administration via the proximal small bowel [358, 362]; the highest success rates (80%–100%) have been associated with instillation of feces via the colon [360, 363–366]. By March 2016, >1945 patients (reported as single case reports and larger case series) with recurrent CDI had been described in the peer-reviewed literature (J. S. Bakken, unpublished data).
Although a large number of anecdotal reports have consistently demonstrated high efficacy of FMT, the first prospective randomized clinical trial comparing the outcome of standard antibiotic therapy to FMT was not published until 2013. In this unblinded trial, van Nood and collaborators randomly assigned 43 patients with ≥2 recurrent episodes of CDI to receive either a standard 14-day course of oral vancomycin (13 patients), vancomycin with bowel lavage (13 patients), or a 4-day course of vancomycin followed by bowel lavage and subsequent FMT infusion administered through a nasoduodenal tube (17 patients). The primary endpoint was initial response without relapse for 10 weeks after completion of therapy. The investigation was terminated early after interim analysis, due to the marked difference in treatment outcomes. Thirteen of the 16 (81%) patients in the FMT arm had a sustained resolution of diarrhea after the first fecal infusion; only 7 of the 26 (27%) patients who were treated with vancomycin resolved their CDI (P < .001). Four additional randomized trials of FMT were published through 2016 [368–371]. One of these trials compared FMT to antibiotic treatment, and the other 3 compared various refinements of the FMT product, delivery of the product, or FMT to autologous FMT. In general, the reported efficacy of FMT is lower in most randomized trials than in nonrandomized reports. The largest of these randomized trials reported an efficacy of approximately 50% for a single FMT delivered by enema, which increased to 75% for 2 FMT administrations and approximately 90% for >2 FMT administrations. Patient selection, proximity to the recurrent CDI episode, and antibiotic treatment prior to FMT all likely influence the response to FMT.
FMT has been well accepted by patients and represents a viable alternative treatment approach to an increasing clinical problem. Judged by the published literature, FMT appears to be safe in the short term [359, 367, 372, 373], and mild to moderate posttreatment adverse events are for the most part self-limited. A recent retrospective multicenter case series of 80 immunocompromised patients concluded that FMT was safe and well tolerated, although the patients included a heterogeneous group of conditions. Reported infectious complications directly attributed to the instillation of donor feces have so far been limited to 2 patients who developed norovirus gastroenteritis after FMT for treatment of CDI, despite use of asymptomatic donors and a lack of sick contacts. Physical complications from the FMT instillation procedure (upper gastrointestinal bleed after nasogastric tube insertion, colon perforation during colonoscopy) have occasionally been reported and may occur with the same frequency as when these procedures are performed for gastrointestinal illnesses other than recurrent CDI. Potential unintended long-term infectious and noninfectious consequences of FMT are still unknown in the absence of large-scale controlled trials with sufficient follow-up.
Potential candidates for FMT include patients with multiple recurrences of CDI who have failed to resolve their infection despite repeated treatment attempts with antibiotic agents targeting CDI. Although there are no data to indicate how many antibiotic treatments should be attempted before referral for FMT, the opinion of the panel is that appropriate antibiotic treatments for at least 2 recurrences (ie, 3 CDI episodes) should be tried. There are limited data on FMT administration in patients with severe, refractory CDI [377, 378]. FMT has also been used for treating recurrent CDI in patients with underlying IBD, although it appears to be less effective in this population than in those without IBD, and flares of underlying disease activity have been reported following FMT for recurrent CDI in patients with IBD [379–381]. Once a patient has been found to be a candidate for FMT, an appropriate stool donor must be identified. Occult contagious pathogens may be present in the stool of a candidate FMT donor, which could potentially place the recipient at risk for a transmissible infection. Careful evaluation and selection of all candidate stool donors is therefore important to minimize the risk of an iatrogenic infection and to maximize the likelihood of a successful treatment outcome. The designated stool donor should undergo screening of blood and feces prior to the stool donation in accordance with recently published recommendations. Detection of any transmissible microbial pathogen should disqualify the individual from donating stool. Individuals who have been treated with an antibiotic agent in the 3 months preceding stool donation should also be disqualified, as should those with preexisting chronic medical conditions, such as IBD, malignant diseases, chronic infections, or active autoimmune illnesses, and individuals receiving active treatment with immunosuppressive medication.
Most investigators have recommended that patients who are not receiving active antibiotic treatment prior to planned FMT be placed on a brief “induction course” of oral vancomycin for 3–4 days prior to FMT administration to reduce the burden of vegetative C. difficile. The patient and the treating physician must also decide on the route of FMT instillation, taking into consideration individual preferences and recognizing that the rate of success varies with the route of instillation.
Robust data assessing the optimal approach for treating an initial episode of CDI in children are limited, and evidence of the comparative effectiveness of metronidazole and vancomycin for treating pediatric CDI is lacking. There are no RCTs comparing the use of these agents in children. A few recent studies suggest that failure rates with metronidazole may be higher than traditionally reported, but these data have limitations. Kim et al prospectively studied 82 children with CDI, of whom 56 received metronidazole; 6 (11%) of them had treatment failure, but half of these were children with severe disease. Khanna et al performed a population-based cohort study of CDI epidemiology in children 0–18 years of age. Among 69 patients with community-acquired CDI, the treatment failure rate was 18% for metronidazole and 0% for vancomycin, but these rates were not statistically different. In a survey of pediatric infectious diseases physicians by Sammons et al, 100% of respondents reported using metronidazole for initial therapy in healthy children with mild CDI, but the proportion fell to 41%–79% for treating mild CDI in children with underlying comorbidities. Schwenk et al used a national administrative database to study vancomycin use for pediatric CDI and found that vancomycin use for initial therapy increased significantly between 2006 and 2011, with substantial variability between children’s hospitals. Complications and mortality from CDI in children are uncommon, regardless of severity of disease or choice of antibiotic for treatment [125, 126, 158, 345].
Treatment recommendations for pediatric CDI should balance the accumulated experience of good outcomes with metronidazole for initial mild disease against emerging data in both adults and children suggesting a possible difference in favor of vancomycin. At the current time there are insufficient pediatric data to recommend vancomycin over metronidazole as the preferred treatment, so either metronidazole or vancomycin should be used for an initial episode or first recurrence of nonsevere CDI in children (Table 2). However, because oral vancomycin is not absorbed, the risk of side effects is lower than for metronidazole. Nonetheless, studies have demonstrated that vancomycin exposure promotes carriage of vancomycin-resistant enterococci in the intestinal flora of treated patients, although available data suggest that metronidazole use is also associated with this outcome [307, 384].
There are no well-designed trials that examine the comparative effectiveness of metronidazole and oral vancomycin for the initial treatment of children with severe CDI. As noted above, observational studies of hospitalized children with CDI suggest that the rate of treatment failure may be greater among children with severe disease than among those with nonsevere disease. Although pediatric studies have not demonstrated conclusively that the therapeutic agent used to treat a child with severe CDI is associated with different outcomes, evidence from adult RCTs has demonstrated improved outcomes in adult patients with severe CDI who are treated with oral vancomycin compared with those treated with oral metronidazole. Therefore, clinicians should use vancomycin in children who present with severe or fulminant CDI (Table 2). Because fidaxomicin was not approved for use in patients <18 years of age at the time of this writing, it is not recommended for routine use in the treatment of children with severe CDI, although a recent survey of pediatric infectious disease physicians revealed that it had been used or recommended by 12% of respondents. Of note, neither vancomycin nor fidaxomicin is significantly absorbed when orally administered; thus, there are few systemic adverse events associated with these drugs.
There are no well-designed trials that examine the effectiveness of various treatment regimens in children with multiply recurrent CDI. In addition, pediatric studies have not demonstrated conclusively that there is a difference in the risk of recurrence related to the therapeutic agent used to treat an initial episode [125, 165]. Thus, recommendations about the therapeutic approach to children with multiply recurrent CDI must be guided by evidence drawn from the studies performed in adults and an assessment of the theoretical benefits and harms associated with various treatment regimens. As described above, evidence from adult studies supports the use of an extended course of oral vancomycin (tapered or pulse regimen), oral vancomycin followed by rifaximin, or fidaxomicin in patients with multiply recurrent CDI. For children with a second recurrence of CDI who have been treated exclusively with metronidazole, a conventional course of oral vancomycin should be considered. For children with multiple recurrences of CDI despite conventional courses of metronidazole and oral vancomycin, an alternate therapeutic regimen should be used (Table 2).
Vancomycin, fidaxomicin, and rifaximin are not absorbed when orally administered; thus, there are few systemic adverse events associated with these drugs. Rifaximin has been approved by the FDA for the treatment of traveler’s diarrhea in children ≥12 years of age but has been used in younger children with refractory IBDs and small intestinal bacterial overgrowth with few reports of adverse events. As noted above, fidaxomicin was not approved for use in patients <18 years of age at the time of this writing. In contrast to vancomycin and fidaxomicin, repeated or prolonged exposure to metronidazole has been associated with neuropathies. Additional concerns have been voiced about the risk of resistance associated with the use of rifaximin.
Management of multiply recurrent CDI can be challenging. As detailed above, FMT restores gut microbiota diversity through instillation of donor stool into the gastrointestinal tract of patients with CDI. Good clinical response has been shown in adults with refractory or recurrent CDI with few reports of adverse events. At present, robust data examining the effectiveness of FMT for pediatric patients are lacking. Thus, recommendations regarding the therapeutic approach to multiply recurrent CDI in children should be guided primarily by evidence from adult studies. Limited evidence from case reports and case series in pediatric patients suggests that FMT via nasogastric tube or colonoscopy can be effective in children with multiply recurrent CDI who have failed standard antibiotic therapy, with follow-up periods up to 16 months [387, 388]. In most reported cases, fecal sample donation was from the child’s mother or father. Despite limited pediatric data, a survey of pediatric infectious diseases physicians revealed that 18% of respondents who reported using alternative therapies for CDI had recommended FMT, most commonly for the treatment of a third or later recurrence. Finally, the potential benefits of FMT must be balanced against theoretical risks.
As described above, instillation of donor stool typically requires use of a nasogastric tube or colonoscopy, which may carry procedure-related risks. In addition, use of donor stool introduces the potential for transmission of resistant organisms and blood-borne pathogens, necessitating donor-screening protocols. There is a general concern that FMT might ultimately lead to unexpected adverse events such as metabolic or immune-based disorders.
The initial step in developing a rational clinical research agenda is the identification of gaps in information. The process of guideline development, as practiced by SHEA and the IDSA, serves as a natural means by which such gaps are identified. Clinical questions identified by the IDSA/SHEA Expert Panel and by members of the IDSA Research Committee that could inform a C. difficile research agenda are listed below.
What is the epidemiology of CDI? What is the incubation period of C. difficile? What is the infectious dose of C. difficile? How should hospital rates be risk-adjusted for appropriate interhospital comparisons? Does administration of PPIs increase the risk of CDI and, if so, what is the magnitude of risk? What are the sources for C. difficile transmission in the community? Is exposure to antibiotics (or equivalent agents, such as chemotherapy drugs) required for susceptibility to CDI? If not, what are the antibiotic surrogates or other factors that place patients at risk for CDI, particularly in the community? What is the role of asymptomatic carriers in transmission of C. difficile in the healthcare setting? What are the validated clinical predictors of severe CDI? Can clinical predictors of severe CDI in children be identified? At what age and to what degree is C. difficile pathogenic among infants and young children? How should clinically significant diarrhea be defined in infants and children who are not continent of stool? How should pediatric healthcare facilities conduct surveillance and report rates of C. difficile infection? Should data from infants <12 months of age be included in laboratory-based surveillance and reporting?
What is the role and optimal sequence for multistep testing for CDI? Is GDH detection in stool sufficiently sensitive as a screening test for C. difficile colitis? How well does GDH correlate with culture for toxigenic C. difficile? Which of the “gold standard” assays (culture for toxigenic C. difficile or cell culture cytotoxicity assay) is optimal as a reference test for diagnosis of CDI? Does screening by GDH test, coupled with confirmatory testing for toxigenic C. difficile by cell culture cytotoxicity assay or NAAT for toxin genes, better identify patients with CDI than using NAAT alone? What should be done with patients who are positive by NAAT but toxin negative? What is the best diagnostic method for hospital laboratories that do not have molecular technology available?
What is the role for NAAT in the diagnosis of CDI? Is molecular testing for toxin genes too sensitive for clinical utility? Are there patient populations in whom a NAAT is the method of choice?
Additional diagnostic research questions: Should infants and young children with diarrhea be tested for C. difficile? Which children in the ambulatory setting who present with diarrhea should be tested for C. difficile? Can new diagnostic tests be developed that will accurately distinguish colonization from infection? When should multiplex PCR test platforms for enteric pathogens be used for diagnosis of CDI? Should these platforms exclude C. difficile or should the C. difficile result be hidden given the availability of specific C. difficile diagnostics and the consideration of the different indications for testing (eg, traveler’s diarrhea, hospital onset, antibiotic-associated diarrhea)? Should testing for C. difficile be performed on patients with ileostomy/colostomy?
What is the best treatment for recurrent CDI? What is the best method to prevent recurrent CDI? What is the best way to restore colonization resistance of intestinal microbiota? When should fecal transplant be considered? Should specific commensal bacteria be administered in place of minimally screened fecal specimens from donors? What is the role of adjunctive therapy as new agents become available (eg, monoclonal antibodies [bezlotoxumab, a monoclonal antibody that binds to toxin B, received FDA approval at the time this guideline was being finalized], nontoxigenic C. difficile, toxin-binding agents)? What is the role for new anti–C. difficile antibiotics that are being developed? Does the in vitro spectrum of activity of new CDI treatment agents against gut commensal bacteria predict clinical outcome with respect to CDI recurrence in clinical trials with these agents? Assuming an effective vaccine is developed, what population should be targeted? What is the best approach to treatment of fulminant CDI? What are the criteria for colectomy in a patient with fulminant CDI? Should diverting loop ileostomy be the preferred procedure over colectomy in this setting? What is the role of treatment with vancomycin or other antibiotics alone or in combination, or FMT in fulminant infection? What is the role of treatment with passive antibodies (immunoglobulin or monoclonal antibody therapy) in fulminant infection?
Additional treatment research questions: When should vancomycin be used to treat children with CDI? Is fidaxomicin safe and effective in children? How is a CDI episode best distinguished from an IBD flare in patient with ulcerative colitis or Crohn’s disease? What role does C. difficile play in IBD flares? How is CDI best managed in this population? Can postinfectious irritable bowel syndrome be distinguished from recurrent CDI?
What preventive measures can be taken to reduce the incidence of CDI? What is the best method to identify patients at risk of primary or recurrent CDI? Can administration of probiotics or biotherapeutic agents effectively prevent CDI? What are the most effective antibiotic stewardship strategies to prevent CDI? What are the most effective transmission prevention strategies (ie, environmental management and isolation) to prevent CDI in inpatient settings? What is the incremental impact of each? Is there a core “bundle” infection control strategy that can be used by a wide range of healthcare facilities? Can vaccination effectively prevent CDI, and what would be the composition of the vaccine and the route of administration? What are systemic or mucosal serologic markers that predict protection against CDI? What is the role of anti-CDI agents in secondary prevention of CDI (patients successfully treated for CDI but who receive subsequent oral, intravenous, or intramuscular antibiotics)? What drugs, dosages, and duration? What patient characteristics should be considered for initiating secondary prophylaxis (eg, age, number of previous CDI episodes, and time since previous CDI episode)? What is the effect of screening patients on admission for C. difficile carriage and isolating positive C. difficile carriers on the incidence of hospital-acquired CDI?
What is the biology of C. difficile spores that leads to clinical infection? How do spores interact with the human gastrointestinal immune system? What are the triggers for sporulation and germination of C. difficile in the human gastrointestinal tract? Where does spore germination occur in the human gastrointestinal tract? What is the role of sporulation in recurrent C. difficile disease? What is the role of bile acid metabolism and the potential for using bile acid metabolites for CDI treatment intervention?
What is the basic relationship of C. difficile to the human gut mucosa and immune system? Where in the gut do C. difficile organisms reside? What enables C. difficile to colonize patients? What are the critical constituents of the microbiota that provide colonization resistance to C. difficile? Is there a C. difficile biofilm in the gastrointestinal tract? Is mucosal adherence necessary for development of CDI? Is there a nutritional niche that allows C. difficile to establish colonization? What is the role of mucosal and systemic immunity in preventing clinical CDI? What causes C. difficile colonization to end? Do C. difficile toxins enter the circulation during infection? What are the factors in infants and young children that influence susceptibility to C. difficile infection vs asymptomatic colonization?
Acknowledgments. The expert panel expresses its gratitude for thoughtful reviews of an earlier version by Curtis Collins, PharmD of the ASHP, Christopher Ohl, MD, and Ellie Goldstein, MD. The panel greatly appreciates National Jewish Health for assistance with the literature searches and Valery Lavergne, MD, Paul Alexander, PhD, and Genet Demisashi of the IDSA for their assistance and support in the development of these guidelines.
Disclaimer. The findings and conclusions in this report are those of the author(s), and do not necessarily represent the official position of the CDC or the US Department of Veterans Affairs.
Financial support. Support for this guideline was provided by the IDSA and SHEA.
Potential conflicts of interest. The following list is a reflection of what has been reported to IDSA. To provide thorough transparency, IDSA requires full disclosure of all relationships, regardless of relevancy to the guideline topic. Evaluation of such relationships as potential conflicts of interest (COI) is determined by a review process that includes assessment by the Standards and Practice Guidelines Committee (SPGC) Chair, the SPGC liaison to the development Panel, the Board of Directors liaison to the SPGC, and, if necessary, the COI Task Force of the Board. This assessment of disclosed relationships for possible COI will be based on the relative weight of the financial relationship (ie, monetary amount) and the relevance of the relationship (ie, the degree to which an association might reasonably be interpreted by an independent observer as related to the topic or recommendation under consideration). The reader of these guidelines should be mindful of this when the list of disclosures is reviewed. For activities outside of the submitted work, D. G. has served as a board member for Rebiotix, Merck, Actelion, Summit, and DaVolterra; has served as a consultant for Pfizer, Sanofi Pasteur, and MGB Pharma; has received a grant from Seres Therapeutics; and holds patents and technology for nontoxigenic C. difficile for the treatment and prevention of CDI under NTCD, LLC. For activities outside of the submitted work, S. J. has served as an advisory board member for Bio-K+, Synthetic Biologics, Summit Therapeutics, and CutisPharma; has served on Pfizer’s data and safety monitoring board for a vaccine study; and has received payment for lectures from Merck. For activities outside of the submitted work, K. C. has received research grants from GenePOC, Accelerate, and BD Diagnostics; has received royalties from McGraw-Hill and ASM Press; and has received travel expenses as a board member with ASM. For activities outside of the submitted work, S. C.
has received payment as expert testimony for medical-legal consultation; has received research grants from the Agency for Health Research and Quality, CDC, and National Institutes of Health (NIH); and has received payment for lectures from IDSA, CDC, and American Academy of Pediatrics. For activities outside of the submitted work, E. R. D. has served as a consultant for Sanofi Pasteur, Nestle, Valneva, Pfizer, Rebiotix, GSK, and Merck; has received research grants from Sanofi Pasteur, Pfizer, Merck, and Rebiotix; and has received payment for lectures from Alere and Biofire. For activities outside of the submitted work, K. G. has received research grants from Merck & Co, Summit Pharmaceuticals, and Techlab, served as a consultant for bioMérieux, Merck & Co, and Summit Pharmaceuticals; and received payment for the development of educational presentation by bioMérieux and Merck & Co. For activities outside of the submitted work, C. K. has received research grants from the NIH, Institut Mérieux, and Aptalis; has received personal fees serving as scientific advisor for Facile Therapeutics, Summit (Oxford), Synthetic Biologics, Actelion, Artugen, First Light Diagnostics, Finch, GlaxoSmithKline, Merck, Seres Therapeutics, Summit, Vedanta, Celimmune, Cour Pharma, Takeda, Innovate, Valeant, and ImmunogenX; and has received payment for the development of educational presentations by Merck and Seres. For activities outside of the submitted work, V. L. has served as a consultant for Merck, and received payment for serving on the speaker’s bureau for Merck. For activities outside of the submitted work, J. S. has received grants from CDC Epicenters. For activities outside of the submitted work, M. W. 
has received research grants and consultancy/lecture fees from Actelion, Cubist, Astellas, Optimer, Sanofi Pasteur, Summit, Seres, bioMérieux, Da Volterra, Qiagen, and Pfizer; has served as a consultant for Merck, Valneva, Alere, AstraZeneca, Durata, Nabriva, Pfizer, Roche, The Medicines Company, Abbott, Basilea, and the European Tissue Symposium; and has received research grants from Cerexa, Abbott, and the European Tissue Symposium. All other authors report no potential conflicts of interest. All authors have submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Conflicts that the editors consider relevant to the content of the manuscript have been disclosed.
It is important to realize that guidelines cannot always account for individual variation among patients. They are not intended to supplant physician judgment with respect to particular patients or special clinical situations. IDSA and SHEA consider adherence to the guidelines listed below to be voluntary, with the ultimate determination regarding their application to be made by the physician in the light of each patient’s individual circumstances. While IDSA makes every effort to present accurate and reliable information, the information provided in these guidelines is “as is” without any warranty of accuracy, reliability, or otherwise, either express or implied. Neither IDSA nor its officers, directors, members, employees, or agents will be liable for any loss, damage, or claim with respect to any liabilities, including direct, special, indirect, or consequential damages, incurred in connection with these guidelines or reliance on the information presented.
“This study is consistent with previous literature that has demonstrated a significant and substantial increase in health care resource utilization for CDI over and above similar patients without CDI,” researcher Dongmu Zhang, PhD, of Merck’s Center for Observational and Real-World Evidence, and colleagues wrote. “It has also shown that having rCDI is associated with substantial health care resource use as compared to similar CDI patients who do not have a recurrence.”
To estimate costs and time of hospitalization associated with CDI and rCDI, the researchers conducted a retrospective observational study. They assessed patient records using databases of commercial and Medicare health care claims. Both databases included information on demographics, diagnoses and prescriptions, among other data.
The researchers matched patients without CDI to those with the infection in a 1:1 ratio to estimate costs and lengths of hospital stay due to primary CDI. They then matched patients with primary CDI 1:1 to those with rCDI in a similar comparison. Each patient was followed for 6 months.
The study included records for 55,504 patients diagnosed with CDI between July 2010 and July 2014. The mean patient age was 61.3 years, and 62% of patients were women. Nearly a quarter of patients (24.8%) had rCDI.
The estimated cumulative hospital stays due to CDI and rCDI were 5.2 days and 1.95 days, respectively.
The estimated health care costs due to CDI and rCDI were $24,205 and $10,580, respectively.
Zhang and colleagues said the data show that clinicians must act to control CDI.
“The health care resource utilization and economic burden associated with primary and rCDI are quite substantial,” they wrote. “Better prevention and treatment of CDI, especially rCDI, are needed.” – by Joe Green
To evaluate the epidemiology of Clostridium difficile infection in several Romanian hospitals.
A survey was conducted from November 2013 to February 2014 in 9 hospitals selected from different Romanian regions.
The survey identified 393 patients with C. difficile infection. The median age was 67 years (range: 2-94 years), with 56% of patients older than 65 years. The mean C. difficile infection prevalence was 5.2 per 10,000 patient-days, with the highest prevalences, 24.9 and 20 per 10,000 patient-days, reported in a gastroenterology hospital and an infectious diseases hospital, respectively. The origin of C. difficile infection was healthcare-associated for 70.5% of patients, community-acquired for 10.2%, and indeterminate for the remaining 19.3%. Severe C. difficile infection was registered in 12.3% of cases, and in-hospital all-cause mortality was 8.8%. Polymerase chain reaction ribotype 027 was the most prevalent in all participating hospitals and represented 82.6% of the total ribotyped isolates. Moxifloxacin minimal inhibitory concentrations were higher than 4 μg/mL for 59 of 80 tested isolates (73.8%). Fifty-four of these 59 isolates were highly resistant to moxifloxacin (minimal inhibitory concentration ≥32 μg/mL) and belonged more frequently to polymerase chain reaction ribotype 027 (p<0.0001).
The present study is the first multicentre study performed in Romania and shows that ribotype 027 is largely predominant among C. difficile infection cases in Romania. The prevalence of C. difficile infection in some specialized hospitals is higher than the European mean prevalence, demonstrating the need for strict adherence to infection control programmes.
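The prevalence figures reported above are simple rates expressed per 10,000 patient-days. As an illustrative sketch only (the case and patient-day counts below are hypothetical, chosen solely to reproduce a rate of 5.2; they are not the study's actual denominators, which are not reported here), the calculation is:

```python
def prevalence_per_10k_patient_days(cases: int, patient_days: int) -> float:
    """Prevalence expressed per 10,000 patient-days: cases / patient-days x 10,000."""
    return cases / patient_days * 10_000

# Hypothetical counts for illustration: 130 cases over 250,000 patient-days
rate = prevalence_per_10k_patient_days(130, 250_000)
print(round(rate, 1))  # → 5.2
```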