Coronavirus disease 2019 (COVID-19) changed healthcare across the world. With this change came an increase in healthcare-associated infections (HAIs) and a concerning concurrent proliferation of multidrug-resistant organisms (MDROs). In this narrative review, we describe the impact of COVID-19 on HAIs and MDROs, describe potential causes of these changes, and discuss future directions to combat the observed rise in rates of HAIs and MDRO infections.
Background. The 2014–2015 Ebola epidemic in West Africa had global impact beyond the primarily affected countries of Guinea, Liberia, and Sierra Leone. Other countries, including the United States, encountered numerous patients who arrived from highly affected countries with fever or other signs or symptoms consistent with Ebola virus disease (EVD).
Methods. We describe our experience evaluating 25 travelers who met the US Centers for Disease Control and Prevention case definition for a person under investigation (PUI) for EVD from July 20, 2014 to January 28, 2015. Guided by institutional protocols, all patients were triaged to the emergency department, the outpatient tropical medicine clinic, or Emory's Ebola treatment unit for evaluation. Strict attention to infection control and early involvement of public health authorities enabled the safe evaluation of these patients.
Results. None were diagnosed with EVD. Respiratory illnesses were common, and 8 (32%) PUI were confirmed to have influenza. Four patients (16%) were diagnosed with potentially life-threatening infections or conditions, including 3 with Plasmodium falciparum malaria and 1 with diabetic ketoacidosis.
Conclusions. In addition to preparing for potential patients with EVD, Ebola assessment centers should consider other life-threatening conditions requiring urgent treatment, and travelers to affected countries should be strongly advised to seek pretravel counseling. Furthermore, attention to infection control in all aspects of PUI evaluation is paramount and has presented unique challenges. Lessons learned from our evaluation of potential patients with EVD can help inform preparations for future outbreaks of highly pathogenic communicable diseases.
OBJECTIVE: Understand how the built environment can affect safety and efficiency outcomes during doffing of personal protective equipment (PPE) in the context of coronavirus disease 2019 (COVID-19) patient care. STUDY DESIGN: We conducted (1) field observations and surveys administered to healthcare workers (HCWs) performing PPE doffing, (2) focus groups with HCWs and infection prevention experts, and (3) a design charrette with healthcare design experts. SETTINGS: This study was conducted in 4 inpatient units treating patients with COVID-19, in 3 hospitals of a single healthcare system. PARTICIPANTS: The study included 24 nurses, 2 physicians, 1 respiratory therapist, and 2 infection preventionists. RESULTS: The doffing task sequence and the layout of doffing spaces varied considerably across sites, with field observations showing most doffing tasks occurring around the patient room door and PPE support stations. Behaviors perceived as most risky included touching contaminated items and inadequate hand hygiene. Doffing space layout and types of PPE storage and work surfaces were often associated with inadequate cleaning and improper storage of PPE. Focus groups and the design charrette provided insights into how design affording standardization, accessibility, and flexibility can support PPE doffing safety and efficiency in this context. CONCLUSIONS: There is a need to define, organize, and standardize PPE doffing spaces in healthcare settings and to understand the environmental implications of COVID-19-specific issues related to supply shortages and staff workload. Low-effort, low-cost adaptations of the layout and design of PPE doffing spaces may improve HCW safety and efficiency in existing healthcare facilities.
OBJECTIVES: To determine the association between time period of hospitalization and hospital mortality among critically ill adults with coronavirus disease 2019. DESIGN: Observational cohort study from March 6, 2020, to January 31, 2021. SETTING: ICUs at four hospitals within an academic health center network in Atlanta, GA. PATIENTS: Adults greater than or equal to 18 years with coronavirus disease 2019 admitted to an ICU during the study period (i.e., Surge 1: March to April, Lull 1: May to June, Surge 2: July to August, Lull 2: September to November, Surge 3: December to January). MEASUREMENTS AND MAIN RESULTS: Among 1,686 patients with coronavirus disease 2019 admitted to an ICU during the study period, all-cause hospital mortality was 29.7%. Mortality differed significantly over time: 28.7% in Surge 1, 21.3% in Lull 1, 25.2% in Surge 2, 30.2% in Lull 2, 34.7% in Surge 3 (p = 0.007). Mortality was significantly associated with 1) preexisting risk factors (older age, race, ethnicity, lower body mass index, higher Elixhauser Comorbidity Index, admission from a nursing home); 2) clinical status at ICU admission (higher Sequential Organ Failure Assessment score, higher d-dimer, higher C-reactive protein); and 3) ICU interventions (receipt of mechanical ventilation, vasopressors, renal replacement therapy, inhaled vasodilators). After adjusting for baseline and clinical variables, there was a significantly increased risk of mortality associated with admission during Lull 2 (relative risk, 1.37 [95% CI = 1.03–1.81]) and Surge 3 (relative risk, 1.35 [95% CI = 1.04–1.77]) as compared to Surge 1. CONCLUSIONS: Despite increased experience and evidence-based treatments, the risk of death for patients admitted to the ICU with coronavirus disease 2019 was highest during the fall and winter of 2020. Reasons for this increased mortality are not clear.
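The relative risks above come from an adjusted model, but the underlying point estimate and 95% confidence interval follow the standard log-normal formula for a relative risk. A minimal sketch in Python; the counts used here are purely illustrative (the abstract does not report per-surge cell counts), so the output will not reproduce the adjusted estimates:

```python
import math

def relative_risk(a, n1, b, n2):
    """Unadjusted relative risk of an event in group 1 (a events of n1)
    vs. group 2 (b events of n2), with a 95% CI from the log-normal
    approximation: exp(ln(RR) +/- 1.96 * SE)."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Illustrative counts only, not the cohort's actual data
rr, lo, hi = relative_risk(40, 115, 100, 348)
```

An adjusted relative risk from a regression model would shift this point estimate, but the CI construction on the log scale is the same idea.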
by
Nadezhda Duffy;
Cedric J. Brown;
Sandra N. Bulens;
Wendy Bamberg;
Sarah J. Janelle;
Jesse Jacob;
Chris Bower;
Lucy Wilson;
Elisabeth Vaeth;
Ruth Lynfield;
Paula Snippes Vagnone;
Erin C. Phipps;
Emily B. Hancock;
Ghinwa Dumyati;
Cathleen Concannon;
Zintars G. Beldavs;
P. Maureen Cassidy;
Marion Kainer;
Daniel Muleta;
Isaac See
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are an urgent threat in the United States because of high morbidity and mortality, few treatment options, and potential for rapid spread among patients. To assess for changes in CRE epidemiology and risk among populations, we analyzed CDC Emerging Infections Program (EIP) 2012–2015 surveillance data for CRE. Methods: Active, population-based CRE surveillance was initiated in January 2012 at 3 EIP sites (GA, MN, OR) and expanded to 5 additional sites (CO, MD, NM, NY, TN) by 2014. An incident case was the first Escherichia coli, Enterobacter, or Klebsiella isolate (non-susceptible to at least one carbapenem and resistant to all third-generation cephalosporins tested) collected from urine or a normally sterile body site from a patient during a 30-day period. Data were collected from patients’ medical records. Cases were hospital-onset (HO) or long-term care facility (LTCF) onset if patients were in the respective facility ≥3 days prior to culture or at the time of culture, and community-onset (CO) otherwise. We calculated incidence rates based on census data for EIP sites and described cases by type of infection onset. Results: A total of 1,582 incident CRE cases were reported in 2012–2015. Most cases (88%) were identified through urine cultures; 946 (60%) were female, and median age was 66 years (interquartile range: 55–77). The median incidence by site was 2.95 per 100,000 population (range: 0.35–8.98). Among the three sites with four full years of data, a different trend was seen in each (Figure). Trends in GA and MN were statistically significant, and no significant trend was seen in OR. Overall, 480 cases (30%) were HO, 524 (33%) were LTCF onset, and 578 (37%) were CO. Of CO cases, 308 (53%) had been hospitalized, admitted to a long-term acute care hospital, or were a LTCF resident in the prior year.
Conclusion: CRE incidence varied more than 20-fold across surveillance sites, with evidence of continued increases in MN. Measuring impact of programs aimed at reducing CRE transmission in other regions will require obtaining local data to identify cases occurring during and after healthcare facility discharge. Further study of changes in incidence in some settings and areas might offer opportunities to refine and expand effective control strategies.
by
Uzma Ansari;
Adrian Lawsin;
Davina Campbell;
Valerie Albrecht;
Gillian McAllister;
Sandra Bulens;
Maroya Spalding Walters;
Jesse Jacob;
Sarah Satola;
Lucy E Wilson;
Ruth Lynfield;
Paula Snippes Vagnone;
Sarah J. Janelle;
Karen Xavier;
Ghinwa Dumyati;
Dwight Hardy;
Eric C. Phipps;
Karissa Culbreath;
Zintars Beldavs;
Karim Morey;
Marion A. Kainer;
Sheri Roberts;
Alexander Kallen;
J. Kamile Rasheed;
Maria S. Karlsson
Background: Carbapenem-resistant Enterobacteriaceae (CRE) have emerged as an important cause of healthcare-associated infections. We characterized the molecular epidemiology of CRE in isolates collected through the Emerging Infections Program (EIP) at the Centers for Disease Control and Prevention (CDC). Methods: From 2011–2015, 8 U.S. EIP sites (CO, GA, MD, MN, NY, NM, TN, and OR) collected CRE (Escherichia coli, Enterobacter aerogenes, Enterobacter cloacae complex, Klebsiella pneumoniae, and Klebsiella oxytoca) isolated from a normally sterile site or urine. Isolates were sent to CDC for reference antimicrobial susceptibility testing and real-time PCR detection of carbapenemase genes (blaKPC, blaNDM, blaOXA-48). Phenotypically confirmed CRE were analyzed by whole genome sequencing (WGS) using an Illumina MiSeq benchtop sequencer. Results: Among 639 Enterobacteriaceae evaluated, 414 (65%) were phenotypically confirmed as CRE using CDC’s current surveillance definition (resistant to ertapenem, imipenem, doripenem, or meropenem). Among isolates confirmed as CRE, 303 (73%) were carbapenemase producers (CP-CRE). The majority of CP-CRE originated from GA (39%), MD (35%), and MN (11%); most non-CP-CRE originated from MN (27%), CO (25%), and OR (17%). K. pneumoniae was the predominant carbapenemase-producing species (78%), followed by E. cloacae complex (12%), E. coli (7.9%), E. aerogenes (0.9%), and K. oxytoca (0.6%). The most common carbapenemase genes detected were blaKPC-3 (76%) and blaKPC-2 (19%); blaNDM and blaOXA-48-like genes were detected in 1.6% and 0.3% of isolates, respectively. For carbapenemase-producing K. pneumoniae, Enterobacter spp., and E. coli, the predominant sequence types (ST) were ST258 (65%), ST171 (35%), and ST131 (29%), respectively. Conclusion: The distribution of CP-CRE and non-CP-CRE varied across the catchment sites. Among CP-CRE, KPC-producing K. pneumoniae predominated; other carbapenemases were rarely identified in the locations under surveillance. Strain types known to have increased epidemic potential (ST258 and ST131) were common among carbapenemase-producing K. pneumoniae and E. coli isolates, respectively.
Background: Carbapenem-resistant Enterobacteriaceae (CRE) and Acinetobacter baumannii (CRAB) pose a threat to public health, but comparisons of disease burden are limited. We compared survival in patients following cultures positive for CRE or CRAB. Methods: The Georgia Emerging Infections Program performs active population-based and laboratory-based surveillance for CRE and CRAB in metropolitan Atlanta, GA. Using standard CDC definitions, we included patients who had incident carbapenem-nonsusceptible E. coli, Klebsiella spp., Enterobacter spp., or Acinetobacter baumannii isolated from urine only (noninvasive infection) or a sterile site (invasive infection) between 8/2011 and 12/2015. Death dates, verified by Georgia Vital Statistics records, were used to calculate 30- and 90-day mortality rates. We used the chi-square test for mortality rates and the log-rank test for survival analysis to 90 days to compare patients with invasive CRAB, noninvasive CRAB, invasive CRE, and noninvasive CRE. Results: There were 535 patients with CRE (87 invasive, 448 noninvasive) and 279 (78 invasive, 201 noninvasive) with CRAB. Nearly all patients with CRE and CRAB had healthcare exposures (97.2% vs. 100%), and most were immunosuppressed (62.6% vs. 56.3%). Both 30-day (24.4% vs. 18.3%, p = 0.04) and 90-day (37.6% vs. 30.5%, p = 0.04) mortality were higher in patients with CRAB than CRE. Patients with invasive infections were more likely to die at 90 days than those with noninvasive infections (53.3% vs. 38.4%, p < 0.0001). Overall mortality rates for invasive infection were similar between CRAB and CRE at 30 (44.9% vs. 34.5%, p = 0.2) and 90 days (59.0% vs. 48.3%, p = 0.2). Using survival analysis at 90 days, invasive CRAB had the worst outcomes, followed by invasive CRE, noninvasive CRAB, and noninvasive CRE
(p < 0.0001, see Figure). Conclusion: Ninety-day mortality for invasive infections with CRE and CRAB was ~50%, and patients with CRAB had lower survival than those with CRE, suggesting that prevention efforts may need to prioritize CRAB as highly as CRE in facilities with endemic CRAB. With the high proportion of healthcare exposures and immunosuppression, these infections may signify poor prognosis or directly contribute to mortality.
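The 30- and 90-day mortality comparisons above rest on a chi-square test of a 2×2 table (died vs. survived, CRAB vs. CRE). A minimal Pearson chi-square sketch in Python; the counts below are reconstructed from the reported percentages (≈24.4% of 279 CRAB patients and ≈18.3% of 535 CRE patients at 30 days) and are illustrative, not the study's source data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for the 2x2 table [[a, b], [c, d]], using the shortcut formula
    n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Reconstructed, illustrative 30-day counts: deaths/survivors, CRAB vs. CRE
stat = chi_square_2x2(68, 211, 98, 437)
# At 1 degree of freedom, a statistic above 3.84 corresponds to p < 0.05
```

The log-rank comparison of the four survival curves would additionally require event times, which a single 2×2 table does not capture.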
Background: The Joint Commission (TJC) now requires antimicrobial stewardship programs (ASP) at all hospitals starting January 1, 2017. The purpose of this study was to determine the time it takes to perform ASP activities at a small community hospital, as well as barriers to remote stewardship. Methods: This was a prospective chart review and time study conducted in patients identified by a clinical decision support and electronic surveillance application as potential opportunities for antimicrobial therapy modification at Emory Johns Creek Hospital (EJCH), a suburban, 110-bed acute care hospital. The chart review was conducted remotely between December 12, 2016 and March 31, 2017 using predefined electronic alerts. These results were then communicated electronically to the EJCH pharmacists, who would communicate the recommendations to the patient’s provider. The primary endpoint was the time required to perform stewardship activities at a small community hospital. Secondary endpoints included barriers encountered to remote stewardship and a cost-benefit analysis of remote stewardship at a small community hospital. Results: A total of 3,060 minutes were spent on ensuring regulatory compliance, with 20.5% of that time spent reporting data on antimicrobial utilization. The time study also revealed an average of 11 alerts per day, 9 chart reviews per day, 8 interventions per day, and 5 minutes per chart. Seven hundred twenty-four alerts were evaluated, with the most common alerts constituting opportunities for de-escalation (29%), targeted drugs (22%), positive blood cultures (18%), IV-to-PO conversion (17%), and antimicrobial renal monitoring (8%). Interventions were accepted (11%), accepted with modification (6%), rejected (35%), or undetermined (48%). Barriers to implementation included workflow and indirect communication. For patients with accepted interventions, there was an average savings of $279.82 per patient in pharmacy charges.
Conclusion: Remote stewardship is a feasible option for small community hospitals. In addition to the cost savings, this intervention appeared to positively impact quality and safety of care while providing compliance with the new TJC antimicrobial stewardship standard.
Background: Central line-associated bloodstream infections (CLABSI) are a subset of hospital-onset bacteremia and fungemia (HOB), a potential indicator of healthcare-associated infections (HAIs) that can be objectively and directly obtained from electronic health records. We undertook a pilot study to elucidate the causes and determine the preventability of HOB. Methods: HOB was defined as growth of a microorganism from a blood culture obtained ≥3 calendar days after admission in a hospitalized patient. A random sample of HOB events across 2 academic hospitals and a pediatric intensive care unit in a third academic hospital was identified between October 1, 2014 and September 30, 2015. Medical records were reviewed to identify potential risk factors and sources of bacteremia. Two physicians used underlying patient factors, microorganism, and other clinical data to rate the preventability of each HOB event in an “ideal hospital” on a 6-point Likert scale. Results: Medical records for 60 HOB events (20 in each hospital) were reviewed. The most common organisms were coagulase-negative Staphylococcus (28%) and Candida spp. (17%) (Figure 1). The most likely sources of bacteremia and fungemia included CLABSI (28%) and skin contaminants/commensals (17%) (Figure 2). Forty-nine percent of HOB events not attributed to skin commensals were rated as potentially preventable (Figure 3). Fifty percent of HOB events randomly sampled across 2 hospitals occurred in an intensive care unit. Central venous catheters, urinary catheters, and mechanical ventilation were present in the previous 2 days among 73%, 20%, and 25% of all HOB events, respectively. Only 10% of all HOB events occurred in a patient without an indwelling device. Only 20% of HOB events resulted in a National Healthcare Safety Network (NHSN)-reported CLABSI. Conclusion: Half of HOB events were potentially preventable in this pilot study.
HOB may be an indicator for a large number of preventable HAIs not currently measured by NHSN. Larger studies across a variety of hospital settings are needed to assess the generalizability of these results and the implications of HOB surveillance for infection prevention practices and patient outcomes.
A 46-year-old man with human immunodeficiency virus infection and active intravenous drug use presented with approximately 2 weeks of fevers and body aches. On physical examination, he was somnolent and had a new systolic murmur, bilateral conjunctival hemorrhages, diffuse petechiae, and left-sided arm weakness. Echocardiography revealed a large mitral valve vegetation, and brain imaging demonstrated numerous embolic infarctions. Blood cultures grew Serratia marcescens. Despite aggressive treatment with meropenem, the patient died due to intracranial hemorrhage complicated by herniation. Serratia marcescens is an uncommon cause of infective endocarditis. While this disease has historically been associated with intravenous drug use, more recent reports suggest that it is now largely a consequence of opportunistic infections of the chronically ill. Our case highlights several characteristic features of this infection, including isolation of a nonpigmented strain of the organism, an antibiotic susceptibility profile suggestive of AmpC β-lactamase production, and rapid clinical deterioration with multiple embolic complications resulting in death. In this review, we discuss the history, epidemiology, and management of endovascular infections due to Serratia species, emphasizing the continued importance of considering this organism in the differential diagnosis of endocarditis among intravenous drug users and as a potential indication for surgical therapy.