Background:
Shared decision-making using a decision aid is required for patients undergoing implantation of primary prevention implantable cardioverter-defibrillators (ICDs). It is unknown how much this process has impacted patients’ experiences or choices. Effective shared decision-making requires an understanding of how patients make ICD decisions. A qualitative key informant study was chosen to capture the breadth of patients’ experiences making ICD decisions in the context of required shared decision-making.
Methods and Results:
We conducted in-depth interviews with 20 patients referred to electrophysiology clinics for the consideration of primary prevention ICD implantation. Purposeful sampling from a prior survey study evaluating mandated shared decision-making was based on patient characteristics and responses to the initial survey questions. Qualitative descriptive analysis of the interviews was performed using a multilevel coding strategy. Patients’ paths to an ICD decision often involved multiple visits with multiple clinicians. However, the decision aid was almost exclusively provided to the patient during electrophysiology clinic visits. Some patients used the numeric data in the decision aid to make an ICD decision based on the risk–benefit profile; others made decisions based on other data or based on trust in clinicians’ recommendations. Patients highlighted information related to living with the device as particularly important in helping them to make their ICD decisions. Some patients struggled with the emotional aspects of making an ICD decision.
Conclusions:
Patients’ ICD decision-making paths pose a challenge to episodic shared decision-making and may make tools such as decision aids perfunctory if used solely during the electrophysiology visit. Understanding patients’ ICD decision-making paths, especially in the context of encounters with primary cardiologists, can inform implementation strategies for shared decision-making and help to enhance its impact. Components of decision aids focusing on the experience of living with an ICD, rather than probabilistic data, may also be more impactful, although the nature of their impact will differ.
Background: The incidence of cancer treatment–induced arrhythmia (CTIA) associated with novel, targeted chemotherapeutic agents (TCAs) has not been well described.
Methods and Results: We identified all patients treated at our institution from January 2010 to December 2015 with selected TCAs. We defined CTIA as any new arrhythmia diagnosis code within 6 months after treatment initiation. As a comparison, we also identified patients treated with anthracycline chemotherapy during the same period. We identified 5026 patients, of whom 2951 (58.7%) received TCAs and 2075 (41.3%) received anthracycline chemotherapy. In the overall cohort, 601 patients (12.0%) developed CTIA. Patients with CTIA were significantly older and more likely to have hypertension, diabetes mellitus, congestive heart failure, coronary disease, and sleep apnea. The incidence of CTIA at 6 months was significantly lower in the TCA group (9.3% versus 15.8%; P<0.001). In multivariate analysis, a history of hypertension (hazard ratio, 1.63; 95% confidence interval, 1.34–1.98), congestive heart failure (hazard ratio, 2.12; 95% confidence interval, 1.78–2.68), and male sex (hazard ratio, 1.25; 95% confidence interval, 1.06–1.47) were associated with a significantly increased risk of CTIA, whereas treatment with TCAs, compared with anthracycline chemotherapy, was associated with a significantly lower risk (hazard ratio, 0.60; 95% confidence interval, 0.51–0.71).
Conclusions: Compared with anthracyclines, treatment with TCAs was associated with an ≈40% reduced risk of new-onset arrhythmia diagnoses during the first 6 months of treatment.
Background: T wave oversensing (TWOS) is a major drawback of the subcutaneous implantable cardioverter defibrillator (S-ICD). Data on predictors of TWOS in S-ICD recipients are limited. We sought to investigate predictors of TWOS in a cohort of patients receiving an S-ICD at our institution.
Methods: S-ICD recipients at our center were identified retrospectively and stratified based on the presence or absence of TWOS. Clinical and electrocardiographic parameters were collected and compared between the 2 groups.
Results: Ninety-two patients underwent an S-ICD implantation at our institution between April 2010 and January 2015. Six (6.5%) patients had TWOS. These patients were younger (38.1±13.7 vs. 52.3±16.1 years, p=0.04) and had higher left ventricular ejection fractions (48.5±14.9% vs. 28.4±12.2%, p<0.01) than patients without a history of TWOS. Baseline 12-lead electrocardiogram (ECG) parameters were not different between the 2 groups. Leads I, II, and aVF (which mimic the sensing vectors of the S-ICD) were further inspected to identify ECG characteristics that could predict TWOS. The QRS amplitude in ECG lead I was significantly smaller in the TWOS group than in the non-TWOS group (3.7 vs. 7.4 mV, p=0.02).
Conclusion: In this study, younger age, higher ejection fraction, and lower QRS amplitude were associated with TWOS. These findings could help identify patients referred for S-ICD implantation who are at high risk of TWOS.
Background: The efficacy of implantable cardioverter-defibrillators (ICDs) for primary prevention of sudden cardiac death (SCD) has not been studied in patients with end-stage renal disease (ESRD) and left ventricular dysfunction. We sought to identify predictors of long-term survival among ICD recipients with and without ESRD. Methods: Patients implanted with an ICD at our institution from January 2006 to March 2014 were retrospectively identified. Clinical and demographic characteristics were collected. Patients were stratified by the presence of ESRD at the time of ICD implant. Mortality data were collected from the Social Security Death Index (SSDI). Results: A total of 3453 patients received an ICD at our institution during the pre-specified time period, 184 (5.3%) of whom had ESRD. In general, ESRD patients were sicker and had more comorbidities. Kaplan-Meier survival curves showed that ESRD patients had worse survival compared with non-dialysis patients (p < 0.001). Following adjustment for differences in baseline characteristics, patients with ESRD remained at increased risk of long-term mortality in the Cox model. One-year mortality in the ESRD patients was 18.1%, compared with 7.7% in the non-dialysis cohort (p < 0.001). Three-year mortality in ESRD patients was 43%, compared with 21% in the non-dialysis cohort (p < 0.001). Conclusion: ESRD patients are at significantly increased risk of mortality compared with a non-dialysis cohort. While the majority of these patients survive more than one year post-diagnosis, three-year mortality is high (43%). Randomized studies addressing the benefits of ICDs in ESRD patients are needed to better define their value for primary prevention of SCD.
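The survival comparisons above rely on Kaplan-Meier estimation. As an illustration of how the estimator behaves, here is a minimal sketch in Python on synthetic follow-up data; the function and variable names are our own, and the data are invented for demonstration, not taken from the study:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.

    times:  follow-up time for each patient (any units)
    events: 1 if death was observed, 0 if the patient was censored
    Returns (distinct event times, survival probability after each).
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]

    event_times, surv = [], []
    s = 1.0
    n_at_risk = len(times)
    i = 0
    while i < len(times):
        t = times[i]
        d = 0  # deaths at time t
        c = 0  # censored at time t (still counted as at risk at t)
        while i < len(times) and times[i] == t:
            if events[i]:
                d += 1
            else:
                c += 1
            i += 1
        if d > 0:
            s *= 1.0 - d / n_at_risk  # multiply by conditional survival at t
            event_times.append(t)
            surv.append(s)
        n_at_risk -= d + c
    return event_times, surv

# Toy cohort: deaths at t=1, 2, 3; censoring at t=2 and t=4
et, s = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

Censored patients reduce the risk set for later event times without contributing a step of their own, which is why the curve drops only at observed deaths.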
Background: Clinical outcomes of cardiac resynchronization therapy (CRT) in patients over the age of 80 have not been well described.
Methods: We retrospectively identified 96 consecutive patients ≥ 80 years old who underwent an initial implant or an upgrade to CRT, with or without defibrillator (CRT-D vs. CRT-P), at our institution between January 2003 and July 2008. The control cohort consisted of 177 randomly selected patients < 80 years old undergoing CRT implant during the same time period. The primary efficacy endpoint was all-cause mortality at 36 months, assessed by Kaplan-Meier time to first event curves.
Results: In the octogenarian cohort, mean age at CRT implant was 83.1 ± 2.9 years vs. 60.1 ± 8.8 years among controls (P < 0.001). Across both groups, 70% were male, mean left ventricular ejection fraction (LVEF) was 24.8% ± 14.1% and QRS duration was 154 ± 24.8 ms, without significant differences between groups. Octogenarians were more likely to have ischemic cardiomyopathy (74% vs. 37%, P < 0.001) and more likely to undergo upgrade to CRT instead of an initial implant (42% vs. 19%, P < 0.001). The rate of appropriate defibrillator shocks was lower among octogenarians (14% vs. 27%, P = 0.02) whereas the rate of inappropriate shocks was similar (3% vs. 6%, P = 0.55). At 36 months, there was no significant difference in the rate of all-cause mortality between octogenarians (11%) and controls (8%, P = 0.381).
Conclusion: Appropriately selected octogenarians who are candidates for CRT have similar intermediate-term mortality compared to younger patients receiving CRT.
Primary prevention implantable cardioverter‐defibrillators (ICDs) play a crucial role in the treatment of patients at heightened risk of sudden cardiac death (SCD) caused by ventricular arrhythmias. The use of ICDs for primary prevention is supported by multiple, randomized clinical trials now codified into guidelines that drive clinical practice.1 However, there are well‐known shortcomings of the current paradigm, which determines primary prevention ICD candidacy based largely on left ventricular ejection fraction (LVEF) and heart failure class.2, 3 Most ICD recipients never receive appropriate device therapy, whereas others may receive shocks but die from progressive heart failure or noncardiac causes without deriving a meaningful survival benefit from the device.4
Despite well‐acknowledged limitations, there have been no significant new trials refining patient selection for primary prevention ICDs in nearly 15 years, with the guidelines themselves likely presenting a significant barrier to randomization. In this article, we review areas where current risk stratification algorithms perform poorly and highlight opportunities for improving decision making regarding ICD implantation. We also propose research and policy solutions for improving the yield of primary prevention ICD implantation.
by
Omid Sayadi;
Dheeraj Puppala;
Nosheen Ishaque;
Rajiv Doddamani;
Faisal Merchant;
Conor Barrett;
Jagmeet P. Singh;
E. Kevin Heist;
Theofanie Mela;
Juan Pablo Martinez;
Pablo Laguna;
Antonis A. Armoundas
Background: This study investigates the hypothesis that morphologic analysis of intracardiac electrograms provides a sensitive approach to detect acute myocardial infarction or myocardial infarction-induced arrhythmia susceptibility. Large proportions of irreversible myocardial injury and fatal ventricular tachyarrhythmias occur in the first hour after coronary occlusion; therefore, early detection of acute myocardial infarction may improve clinical outcomes. Methods and Results: We developed a method that uses the wavelet transform to delineate electrocardiographic signals, and we devised an index to quantify the ischemia-induced changes in these signals. We recorded body-surface and intracardiac electrograms at baseline and following myocardial infarction in 24 swine. Statistically significant ischemia-induced changes after the initiation of occlusion compared with baseline were detectable within 30 seconds in intracardiac left ventricle (P<0.0016) and right ventricle-coronary sinus (P<0.0011) leads, 60 seconds in coronary sinus leads (P<0.0002), 90 seconds in right ventricle leads (P<0.0020), and 360 seconds in body-surface electrocardiographic signals (P<0.0022). Intracardiac leads exhibited a higher probability of detecting ischemia-induced changes than body-surface leads (P<0.0381), and the right ventricle-coronary sinus configuration provided the highest sensitivity (96%). The 24-hour ECG recordings showed that the ischemic index was significantly increased compared with baseline in leads I and aVR and in all precordial leads (P<0.0388). Finally, we showed that the ischemic index in intracardiac electrograms is significantly increased preceding ventricular tachyarrhythmic events (P<0.0360). Conclusions: We present a novel method that is capable of detecting ischemia-induced changes in intracardiac electrograms as early as 30 seconds following myocardial infarction or as early as 12 minutes preceding tachyarrhythmic events.
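The published method delineates ECG fiducial points with a specific wavelet transform; as a loose illustration of the underlying idea (tracking ischemia-induced morphology change through wavelet coefficient energy), here is a simplified Python sketch using a level-1 Haar transform. The index definition below is our own assumption for demonstration and is not the study's actual ischemic index:

```python
import numpy as np

def haar_detail_energy(beat):
    """Energy of the level-1 Haar wavelet detail coefficients of one beat."""
    x = np.asarray(beat, dtype=float)
    x = x[: len(x) - len(x) % 2]              # truncate to even length
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # Haar high-pass output
    return float(np.sum(detail ** 2))

def ischemic_index(baseline_beats, test_beats):
    """Relative change in mean detail energy vs. the baseline recording.

    Illustrative only: a positive value indicates that the test beats carry
    more high-frequency morphologic content than the baseline beats.
    """
    e0 = np.mean([haar_detail_energy(b) for b in baseline_beats])
    e1 = np.mean([haar_detail_energy(b) for b in test_beats])
    return (e1 - e0) / e0
```

On synthetic beats, adding a high-frequency component to a smooth template drives the index sharply positive, while identical recordings yield an index of zero.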
Background: Data suggest that same-day discharge after implantation of transvenous pacemakers is safe and feasible. We sought to determine whether same-day discharge was feasible and safe following implantation of Medtronic Micra leadless pacemakers. Methods: We retrospectively identified all patients undergoing Micra placement at our institution between April 2014 and August 2018 (n=167). Patients were stratified into two groups: those discharged on the same day as their procedure (SD, n=25) and those observed for at least one night in the hospital (HD, n=142). The primary endpoint was a composite of major complications: access site complications, new pericardial effusion, device dislodgement, and need for device revision up to approximately 45 days of follow-up. Results: SD and HD patients had similar age (75±13 vs. 75±13 years, p=0.923), prevalence of male sex (49 vs. 44%, p=0.669), and frequency of high-grade heart block as an indication for pacing (38 vs. 32%, p=0.596). There were more Caucasians in the SD group (72 vs. 66%, p=0.038). The rate of the composite endpoint was numerically higher in the HD group but not statistically significantly different (3.5% vs. 0.0%, p=1.00). The rates of the individual components of the composite endpoint were similar between groups. Conclusions: Our data suggest that, in appropriately selected patients, same-day discharge can occur safely following Micra leadless pacemaker implantation.
by
Kanchan Kulkarni;
Faisal Merchant;
Mohamad B. Kassab;
Furrukh Sana;
Kasra Moazzami;
Omid Sayadi;
Jagmeet P. Singh;
E. Kevin Heist;
Antonis A. Armoundas
Sudden cardiac death (SCD) is frequently the initial manifestation of a cardiac arrhythmia, resulting in about 350 000 deaths annually in the United States.1 Devices such as the implantable cardioverter‐defibrillator (ICD) seek to restore normal rhythm and may abort SCD.2 However, given the complex spatiotemporal dynamics of cardiac electrophysiology, predicting the onset of an arrhythmia and preventing the transition from a stable to an unstable rhythm is highly challenging. Deciphering the mechanisms that lead to an unstable heart rhythm and developing therapies to prevent unstable rhythms is an urgent clinical need.
In 1908, Heinrich Hering first described ECG alternans, a pattern of beat‐to‐beat oscillation in the ECG waveform.3 Subsequently, repolarization alternans (RA), or alternans that manifests during ventricular repolarization, has been associated with an increased risk for ventricular tachyarrhythmic events (VTEs) and SCD under a wide range of pathophysiologic substrates, including ischemic and nonischemic cardiomyopathy and recent acute coronary syndromes.4, 5 RA may also be seen in structurally normal hearts under conditions of significant metabolic stress6 and chronotropic stimulation.7, 8 Early pioneering work has shown that different regions of the heart may alternate out of phase to form spatially discordant RA, and that this phenomenon alone is a key factor promoting arrhythmogenesis by predisposing the heart to reentrant wave propagation.9, 10 Furthermore, in silico studies demonstrated that spatially discordant alternans led to markedly increased dispersion of repolarization (DR) that formed an ideal substrate for an ectopic trigger beat to instigate spiral‐wave breakups leading to the onset of lethal arrhythmias, such as ventricular tachycardia (VT) and ventricular fibrillation (VF).11
This review provides a contemporary perspective of the subcellular and cellular mechanisms that give rise to cardiac alternans and potential therapeutic approaches based on this mechanistic understanding.
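Beat-to-beat RA is classically quantified with the spectral method: aligned repolarization segments are stacked into a matrix, a spectrum is computed across beats at each sample point, and power at 0.5 cycles/beat is compared against an adjacent noise band. The following is a minimal Python sketch of that idea; the k-score normalization and band width here are simplified assumptions, whereas clinical implementations use defined beat-sequence lengths and noise bands:

```python
import numpy as np

def alternans_kscore(beats, noise_bins=8):
    """Spectral-method alternans score for a series of aligned beats.

    beats: (n_beats, n_samples) matrix of aligned repolarization segments;
    n_beats should be even so the 0.5 cycles/beat component falls exactly
    on the last rfft bin (the Nyquist bin of the beat-to-beat series).
    Returns (alternans power - mean noise power) / noise std.
    """
    b = np.asarray(beats, dtype=float)
    b = b - b.mean(axis=0)                       # remove the mean beat
    spec = np.abs(np.fft.rfft(b, axis=0)) ** 2   # beat-to-beat spectrum per sample
    agg = spec.mean(axis=1)                      # aggregate spectrum over samples
    alt_power = agg[-1]                          # power at 0.5 cycles/beat
    noise = agg[-1 - noise_bins:-1]              # adjacent reference noise band
    return (alt_power - noise.mean()) / (noise.std() + 1e-12)
```

A beat series whose even and odd beats differ by a fixed offset produces a large score, while a series without alternation scores near zero, mirroring how a significant 0.5 cycles/beat peak above the noise band indicates RA.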
Background: Pulmonary vein (PV) reverse remodeling has been recognized following atrial fibrillation (AF) ablation. However, the extent of physiologic reverse remodeling after AF ablation and the potential impact of reverse remodeling on the radiographic diagnosis of PV stenosis have not been well characterized. Methods: From January 2004 to February 2011, 86 patients underwent paired cardiac magnetic resonance imaging (MRI) to delineate PV orifice dimensions before and after (mean 109 ± 61 days) an initial AF ablation. Results: Negative remodeling of the PV orifice cross-sectional area occurred in 67.8% of veins, with a mean reduction in area of 21.0 ± 14.1%; positive remodeling was seen in the remaining PVs, with an increase in area of 22.1 ± 23.4% compared to baseline. No PVs demonstrated a reduction in cross-sectional area of > 75% (the maximum reduction observed was 58%). Negative remodeling of the PV long-axis dimension was observed in 55.2% of veins, with a mean reduction of 14.6 ± 9.2% compared to pre-ablation, and positive remodeling was observed in 25.3% of PVs, with a mean increase in diameter of 14.7 ± 12.6%. Only 1 PV demonstrated a reduction in orifice diameter of > 50%. There were no clinically evident or suspected cases of PV stenosis in this cohort. Conclusions: Negative remodeling of the PV orifice area was noted in the majority of PVs following AF ablation. However, in almost all cases, the extent of negative remodeling was well below commonly used thresholds for the radiographic diagnosis of PV stenosis.