by Christian T Stackhouse; Joshua C Anderson; Zongliang Yue; Thanh Nguyen; Nicholas J Eustace; Catherine P Langford; Jelai Wang; James R Rowland IV; Chuan Xing; Fady Mikhail; Xiangqin Cui; Hasan Alrefai; Ryan E Bash; Kevin J Lee; Eddy S Yang; Anita B Hjelmeland; C Ryan Miller; Jake Y Chen; G Yancey Gillespie; Christopher D Willey
Key molecular regulators of acquired radiation resistance in recurrent glioblastoma (GBM) are largely unknown, with a dearth of accurate preclinical models. To address this, we generated 8 GBM patient-derived xenograft (PDX) models of acquired radiation therapy-selected (RTS) resistance compared with same-patient, treatment-naive (radiation-sensitive, unselected; RTU) PDXs. These likely unique models mimic the longitudinal evolution of patient recurrent tumors following serial radiation therapy. Indeed, while whole-exome sequencing showed retention of major genomic alterations in the RTS lines, we did detect a chromosome 12q14 amplification that was associated with clinical GBM recurrence in 2 RTS models. A potentially novel bioinformatics pipeline was applied to analyze phenotypic, transcriptomic, and kinomic alterations, which identified long noncoding RNAs (lncRNAs) and targetable, PDX-specific kinases. We observed differential transcriptional enrichment of DNA damage repair pathways in our RTS models, which correlated with several lncRNAs. Global kinomic profiling separated RTU and RTS models, but pairwise analyses indicated that there are multiple molecular routes to acquired radiation resistance. RTS model-specific kinases were identified and targeted with clinically relevant small molecule inhibitors. This cohort of in vivo RTS patient-derived models will enable future preclinical therapeutic testing to help overcome the treatment resistance seen in patients with GBM.
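To make the pairwise RTU-versus-RTS comparison concrete, the sketch below shows a generic paired differential expression test across matched PDX pairs. It is only an illustrative outline, not the study's actual pipeline; the function name, array shapes, and the choice of a paired t-test with Benjamini-Hochberg correction are assumptions of this sketch.

```python
import numpy as np
from scipy.stats import ttest_rel
from statsmodels.stats.multitest import multipletests

def paired_differential_expression(rtu_expr, rts_expr, gene_names, fdr=0.05):
    """Per-gene paired test between matched RTU and RTS models.

    rtu_expr, rts_expr : arrays of shape (n_models, n_genes) holding
        log-transformed expression for the same patients in the same order.
    Returns (gene, mean log difference, q-value) for genes passing the FDR cutoff.
    """
    # Paired t-test per gene across the matched PDX pairs
    t_stats, p_vals = ttest_rel(rts_expr, rtu_expr, axis=0)
    # Benjamini-Hochberg correction across all tested genes
    reject, q_vals, _, _ = multipletests(p_vals, alpha=fdr, method="fdr_bh")
    mean_diff = (rts_expr - rtu_expr).mean(axis=0)
    return [(g, d, q) for g, d, q, r in zip(gene_names, mean_diff, q_vals, reject) if r]
```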
Background: Sepsis is one of the leading causes of hospital mortality, and diabetes is a risk factor for the development of infections. Although strong evidence has shown an association between metformin and reduced risk of infections, the risk of developing infections with newer classes of oral anti-diabetic drugs (OADs) has been less certain. Our study aims to examine the association between outpatient OAD use and hospital admissions for infections. Methods: The study cohort included 1.39 million adults with diabetes identified using the Veterans Health Administration Corporate Data Warehouse. Multivariable logistic regression was used to estimate the effect of each drug class on hospital admission for infection while adjusting for covariates. Results: After adjusting for covariates, those who took metformin during the study period had 3.3% lower odds of hospital admission for infection compared to those who were never on metformin (OR 0.97, 95% CI 0.95-0.98). OADs associated with statistically significantly increased odds of admission included meglitinides (OR 1.22, 95% CI 1.07-1.38), SGLT2 inhibitors (OR 1.16, 95% CI 1.08-1.24), alpha-glucosidase inhibitors (OR 1.09, 95% CI 1.04-1.15), and DPP4 inhibitors (OR 1.04, 95% CI 1.01-1.06). Conclusions: Metformin was associated with lower odds of hospital admission for infection, while meglitinides, SGLT2 inhibitors, alpha-glucosidase inhibitors, and DPP4 inhibitors were associated with higher odds of admission for infection.
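As a rough illustration of the kind of covariate-adjusted model described in the Methods, a minimal sketch follows. It assumes a per-patient table with a binary admission outcome, drug-class indicators, and covariates; the file name, column names, and the covariate set shown are hypothetical, not the study's actual adjustment set.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cohort file: one row per patient with a binary outcome
# 'infection_admit', binary indicators for each OAD class, and covariates.
df = pd.read_csv("oad_cohort.csv")

model = smf.logit(
    "infection_admit ~ metformin + sulfonylurea + meglitinide + sglt2i"
    " + dpp4i + agi + age + sex + hba1c",
    data=df,
).fit()

# Odds ratios and 95% confidence intervals on the exponentiated scale
ci = model.conf_int()
or_table = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_low": np.exp(ci[0]),
    "CI_high": np.exp(ci[1]),
})
print(or_table.round(2))
```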
BACKGROUND: Despite early diagnosis and compliance with phenylalanine (Phe)-restricted diets, many individuals with phenylketonuria (PKU) still exhibit neurological changes and experience deficits in working memory and other executive functions. Suboptimal choline intake may contribute to these impairments, but this relationship has not been previously investigated in PKU. The objective of this study was to determine if choline intake is correlated with working memory performance, and if this relationship is modified by diagnosis and metabolic control. METHODS: This was a cross-sectional study that included 40 adults with PKU and 40 demographically matched healthy adults. Web-based neurocognitive tests were used to assess working memory performance, and 3-day dietary records were collected to evaluate nutrient intake. Recent and historical blood Phe concentrations were collected as measures of metabolic control. RESULTS: Working memory performance was 0.32 z-scores (95% CI 0.06, 0.58) lower, on average, in participants with PKU compared to participants without PKU, and this difference was not modified by total choline intake (F[1,75] = 0.85, p = 0.36). However, in a subgroup with complete historical blood Phe data, increased total choline intake was related to improved working memory outcomes among participants with well-controlled PKU (Phe ≤ 360 µmol/L) after adjusting for intellectual ability and mid-childhood Phe concentrations (average change in working memory per 100 mg change in choline = 0.11; 95% CI 0.02, 0.20; p = 0.02). There was also a nonsignificant trend (p = 0.10) for this association to be attenuated with increased Phe concentrations. CONCLUSIONS: Clinical monitoring of choline intake is essential for all individuals with PKU but may have important implications for working memory functioning among patients with good metabolic control. Results from this study should be confirmed in a larger controlled trial in people living with PKU.
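A minimal sketch of how an effect-modification test of this kind could be set up is shown below; the data file, column names, and covariate set are hypothetical, and the single model shown is a simplification of the adjusted analyses described above.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("working_memory.csv")  # hypothetical file, one row per participant

# Rescale choline so the slope reads as change in working-memory z-score per 100 mg
df["choline_100mg"] = df["choline_mg"] / 100.0

# The group-by-choline interaction asks whether the PKU vs control difference
# is modified by choline intake; intellectual ability enters as a covariate.
fit = smf.ols("working_memory_z ~ group * choline_100mg + iq", data=df).fit()
print(anova_lm(fit, typ=2))               # Type II F-tests, including the interaction
print(fit.params.filter(like="choline"))  # choline main effect and interaction terms
```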
Objective: To evaluate the feasibility of recruitment, preliminary efficacy, and acceptability of auricular percutaneous electrical nerve field stimulation (PENFS) for the treatment of fibromyalgia in veterans, using neuroimaging as an outcome measure and a biomarker of treatment response. Design: Randomized, controlled, single-blind. Setting: Government hospital. Subjects: Twenty-one veterans with fibromyalgia were randomized to standard therapy (ST) control or ST with auricular PENFS treatment. Methods: Participants received weekly visits with a pain practitioner over 4 weeks. The PENFS group received reapplication of PENFS at each weekly visit. Resting-state functional connectivity magnetic resonance imaging (rs-fcMRI) data were collected within 2 weeks prior to initiating treatment and 2 weeks following the final treatment. Analysis of rs-fcMRI used a right posterior insula seed. Pain and function were assessed at baseline and at 2, 6, and 12 weeks post-treatment. Results: At 12 weeks post-treatment, there was a nonsignificant trend toward improved pain scores and significant improvements in pain interference with sleep among the PENFS treatment group as compared with the ST controls. Neuroimaging data displayed increased connectivity to areas of the cerebellum and executive control networks in the PENFS group as compared with the ST control group following treatment. Conclusions: There was a trend toward improved pain and function among veterans with fibromyalgia in the ST + PENFS group as compared with the ST control group. Pain and functional outcomes correlated with altered rs-fcMRI network connectivity. Neuroimaging results differed between groups, suggesting an alternative underlying mechanism for PENFS analgesia.
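For readers unfamiliar with seed-based rs-fcMRI analysis, the following is a minimal sketch of computing a seed connectivity map from preprocessed time series. It assumes array inputs already extracted from the images (for example, a right posterior insula mask) and is not the study's actual processing pipeline.

```python
import numpy as np

def seed_connectivity(voxel_ts, seed_ts):
    """Seed-based resting-state connectivity map.

    voxel_ts : array (n_timepoints, n_voxels) of preprocessed BOLD signals
    seed_ts  : array (n_timepoints,) mean signal within the seed region
    Returns Fisher z-transformed correlations, one value per voxel.
    """
    # Standardize the seed and each voxel time series
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    vox = (voxel_ts - voxel_ts.mean(axis=0)) / voxel_ts.std(axis=0)
    # Pearson correlation of every voxel with the seed
    r = vox.T @ seed / len(seed)
    # Fisher z-transform for group-level comparison
    return np.arctanh(np.clip(r, -0.999999, 0.999999))
```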
Choline is an essential nutrient for brain development and function that is attained through high-protein foods, which are limited in the phenylalanine-restricted diet of people with phenylketonuria (PKU). This study compared choline consumption among individuals with PKU to a reference sample from the National Health and Nutrition Examination Survey (NHANES), and identified treatment and diet-related factors that may modulate choline needs. Participants were individuals with PKU (n = 120, 4–61 years) managed with dietary therapy alone (n = 49), sapropterin dihydrochloride for ≥1 year (n = 38), or pegvaliase for ≥1 year with no medical food (n = 33). NHANES participants were not pregnant or nursing and came from the 2015–2018 cycles (n = 10,681, 4–70 years). Dietary intake data were used to estimate total usual intake distributions for choline, and mean probability of adequacy (MPA) was calculated as a summary index of nutrient adequacy for four methyl-donor/co-factor nutrients that impact choline utilization (folate, vitamin B12, vitamin B6, and methionine). Only 10.8% (SE: 2.98) of the total PKU sample (14.7% [SE: 4.03] of children; 6.8% [SE: 2.89] of adults) achieved the adequate intake (AI) for choline, while 12.2% (SE: 0.79) of the NHANES sample met the recommended level. Adults receiving pegvaliase were the most likely to exceed the AI for choline (14.82% [SE: 4.48]), while adults who were on diet therapy alone were the least likely (5.59% [SE: 2.95]). Without fortified medical foods, individuals on diet therapy and sapropterin would not be able to achieve the AI, and MPA for other methyl donor/co-factor nutrients would be reduced. More frequent monitoring of choline intake and increased choline fortification of medical foods could benefit this population.
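A simplified sketch of how a mean probability of adequacy (MPA) can be computed is shown below; the requirement values and the 10% coefficient of variation are placeholder assumptions for illustration, not the values used in this analysis, which relied on age- and sex-specific requirements and usual-intake distributions.

```python
import numpy as np
from scipy.stats import norm

def probability_of_adequacy(usual_intake, ear, req_cv=0.10):
    """Probability that usual intake meets the requirement, assuming the
    requirement is normally distributed around the EAR with a 10% CV
    (a common default; nutrient-specific values may differ)."""
    return norm.cdf(usual_intake, loc=ear, scale=req_cv * ear)

def mean_probability_of_adequacy(intakes, ears):
    """MPA: average of the per-nutrient probabilities of adequacy."""
    return float(np.mean([probability_of_adequacy(i, e) for i, e in zip(intakes, ears)]))

# Hypothetical example for the four methyl-donor/co-factor nutrients
# (folate mcg DFE, vitamin B12 mcg, vitamin B6 mg, methionine mg);
# both intakes and requirement values here are placeholders.
print(mean_probability_of_adequacy([350, 3.1, 1.4, 800], [320, 2.0, 1.1, 700]))
```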
Immunoglobulin replacement therapy (IGRT) can protect against lung function decline in CVID. We tested whether increasing IgG dosage was beneficial in patients who exhibited a decline in forced expiratory flow at 25–75% of forced vital capacity (FEF25–75%) even though they were receiving IgG doses within the therapeutic range. Of 189 CVID patients seen over 12 years, 38 patients met inclusion criteria, were seen on ≥ 3 visits, and demonstrated a ≥ 10% decrease in FEF25–75% from visits 1 to 2. FEF25–75%, forced expiratory volume in 1 s (FEV1), and FEV1/FVC at visit 3 were compared among those with no dose adjustment (non-DA) versus additional IgG dose adjustment (DA). Three FEF25–75% tiers were identified: top (> 80% predicted), middle (50–80%), and bottom (< 50%). DA and non-DA groups did not differ in clinical infections or bronchodilator use, although the non-DA group tended to use more antibiotics. In the top (normal) tier, FEF25–75% increased in the DA group, but the change did not achieve statistical significance. In the middle (moderate obstruction) tier, visit 3 FEF25–75% increased in the DA group but not the non-DA group (11.8 ± 12.4%, p = 0.003 vs. 0.3 ± 9.9%, p = 0.94). Improvement in FEV1/FVC at visit 3 was also significant in the DA versus non-DA group (7.2 ± 12.4%, p = 0.04 vs. − 0.2 ± 2.7%, p = 0.85). In the bottom (severe) tier, FEF25–75% was unchanged in the DA group (− 0.5 ± 5.2%, p = 0.79) but increased in the non-DA group (5.1 ± 5.2%, p = 0.02). Among CVID patients on IGRT with moderate but not severe obstruction as assessed by spirometry, increasing IgG dosage led to an increase in FEF25–75% and FEV1/FVC.
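The tier definitions and inclusion criterion above can be expressed as a small sketch; whether the 10% decrease was defined on a relative or absolute scale is not stated here, so the relative-change interpretation below is an assumption.

```python
def fef_tier(pct_predicted):
    """Classify FEF25-75% (% predicted) into the three tiers described above."""
    if pct_predicted > 80:
        return "top (normal)"
    if pct_predicted >= 50:
        return "middle (moderate obstruction)"
    return "bottom (severe obstruction)"

def meets_inclusion(fef_visit1, fef_visit2):
    """Inclusion criterion sketch: >= 10% relative decrease from visit 1 to visit 2."""
    return (fef_visit1 - fef_visit2) / fef_visit1 >= 0.10
```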
Background
Somatic mutations in TP53 are present in 20%–30% of all breast tumors. While there are numerous population-based analyses of TP53, none have examined the relationship between somatic mutations in TP53 and tumor-invasive immune cells.
Methods
Clinical and genetic data from 601 women drawn from The Cancer Genome Atlas (TCGA) were used to test the association between somatic TP53 mutation and immune-rich or immune-poor tumor status, determined using the CIBERSORT-based gene expression signature of 22 immune cell types. Our validation dataset, the Molecular Taxonomy of Breast Cancer International Consortium (METABRIC), used a pathologist-determined measure of lymphocyte infiltration.
Results
Within TP53‐mutated samples, a mutation at codon p.R175H was shown to be present at higher frequency in immune‐rich tumors. In validation analysis, any somatic mutation in TP53 was associated with immune‐rich status, and the mutation at p.R175H had a significant association with tumor‐invasive lymphocytes. TCGA‐only analysis of invasive immune cell type identified an increase in M0 macrophages associated with p.R175H.
Conclusions
These findings suggest that TP53 somatic mutations, particularly at codon p.R175H, are enriched in tumors with infiltrating immune cells. Our results confirm recent research showing inflammation‐related gain of function in specific TP53 mutations.
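As an illustration only, an association between a specific mutation and immune-rich status could be tested with a simple 2x2 analysis like the sketch below; the counts are placeholders, not TCGA or METABRIC values, and the study's actual analysis may have used different tests and adjustments.

```python
import numpy as np
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = p.R175H mutation (yes/no),
# columns = tumor immune status (immune-rich / immune-poor).
table = np.array([[12, 3],
                  [150, 180]])

# Fisher's exact test for association between the mutation and immune-rich status
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3g}")
```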
Background:
DNA methylation, an important epigenetic mark, is well known for its regulatory role in gene expression, especially the negative correlation in the promoter region. However, its correlation with gene expression across the genome at the human population level has not been well studied. In particular, it is unclear whether the genome-wide DNA methylation profile of an individual can predict his or her gene expression profile. Previous studies were mostly limited to association analyses between single CpG site methylation and gene expression. It is not known whether DNA methylation of a gene has enough prediction power to serve as a surrogate for gene expression in existing human study cohorts that have DNA samples but not RNA samples.
Results:
We examined DNA methylation in the gene region for predicting gene expression across individuals in non-cancer tissues from three human population datasets: adipose tissue from the Multiple Tissue Human Expression Resource (MuTHER) project, peripheral blood mononuclear cells (PBMCs) from asthma patients and normal controls, and lymphoblastoid cell lines (LCLs) from healthy individuals. Three prediction models were investigated: single linear regression, multiple linear regression, and least absolute shrinkage and selection operator (LASSO) penalized regression. Our results showed that LASSO regression had superior performance among these methods. However, the prediction power was generally low and varied across datasets. Only 30 and 42 genes were found to have cross-validation R2 greater than 0.3 in the PBMC and adipose datasets, respectively. A substantially larger number of genes (258) were identified in the LCL dataset, which was generated from a more homogeneous cell line sample source. We also demonstrated that prediction power improved when no CpG probe was excluded for cross-hybridization or SNP effects.
Conclusion:
In our three population analyses, DNA methylation of CpG sites in the gene region had limited power to predict gene expression across individuals using linear regression models. The prediction power potentially varies depending on tissue, cell type, and data source. In our analyses, LASSO regression combined with all probes on the methylation array, without excluding any probe, provided the best prediction of gene expression.
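A minimal sketch of the LASSO-based prediction with cross-validated R2, evaluated per gene, might look like the following; the function name and fold count are assumptions, and the study's exact implementation may differ.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import cross_val_score

def cv_r2_for_gene(methylation, expression, folds=5, seed=0):
    """Cross-validated R^2 for predicting one gene's expression from the
    methylation values of the CpG probes mapped to that gene.

    methylation : array (n_individuals, n_cpg_probes)
    expression  : array (n_individuals,)
    """
    # LassoCV tunes the penalty internally; the outer CV estimates prediction R^2
    model = LassoCV(cv=folds, random_state=seed, max_iter=10000)
    scores = cross_val_score(model, methylation, expression, cv=folds, scoring="r2")
    return scores.mean()

# Under the criterion above, a gene would count as "predictable" if
# cv_r2_for_gene(X_gene, y_gene) > 0.3.
```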
Patients with coronavirus disease 2019 (COVID-19) seem to be at high risk for venous thromboembolism (VTE) development, but there is a paucity of data exploring both the natural history of COVID-19–associated VTE and the risk for poor outcomes after VTE development. This investigation aims to explore the relationship between COVID-19–associated VTE development and mortality. A prospectively maintained registry of patients older than 18 years admitted for COVID-19–related illnesses within an academic health care network between March and September 2020 was reviewed. International Classification of Diseases, tenth revision (ICD-10) codes for VTE were collected. The charts of those patients with a code for VTE were manually reviewed to confirm VTE diagnosis. There were 2,552 patients admitted with COVID-19–related illnesses. One hundred and twenty-six patients (4.9%) developed a VTE. A disproportionate percentage of patients of Black race developed a VTE (70.9% VTE v 57.8% non-VTE; P = .012). A higher proportion of patients with VTE expired during their index hospitalization (22.8% VTE v 8.4% non-VTE; P < .001). On multivariable logistic regression analysis, VTE was independently associated with mortality (odds ratio = 3.17; 95% confidence interval, 1.9–5.2; P < .001). Hispanic/Latinx ethnicity was associated with decreased mortality (odds ratio = 0.45; 95% confidence interval, 0.21–1.00; P = .049). Hospitalized patients of Black race with COVID-19 were more prone to VTE development, and patients with COVID-19 who developed in-hospital VTE had nearly threefold higher odds of mortality. Further emphasis should be placed on optimizing COVID-19 anticoagulation protocols to reduce mortality in this high-risk cohort.
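The code-based case-finding step described above could be sketched as follows; the ICD-10 prefixes shown (I26 for pulmonary embolism, I80/I82 for venous thrombosis) are an assumption for illustration, since the exact codes queried are not listed, and the table and column names are hypothetical.

```python
import pandas as pd

# Assumed VTE code prefixes (I26 pulmonary embolism; I80/I82 venous thrombosis).
VTE_PREFIXES = {"I26", "I80", "I82"}

def flag_vte_candidates(diagnoses: pd.DataFrame) -> pd.Series:
    """Return unique patient IDs with any VTE-coded diagnosis for manual chart review.

    diagnoses : DataFrame with columns 'patient_id' and 'icd10_code' (hypothetical names).
    """
    prefixes = diagnoses["icd10_code"].str.upper().str[:3]
    return diagnoses.loc[prefixes.isin(VTE_PREFIXES), "patient_id"].drop_duplicates()
```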
Chronic kidney disease (CKD) and its common causes (e.g., diabetes and obesity) are recognized risk factors for severe COVID-19 illness. To explore whether the most common inherited cause of CKD, autosomal dominant polycystic kidney disease (ADPKD), is also an independent risk factor, we studied data from the VA health system and the VA COVID-19 shared resources (e.g., ICD codes, demographics, pre-existing conditions, pre-testing symptoms, and post-testing outcomes). Among 61 COVID-19-positive ADPKD patients, 21 (34.4%) were hospitalized, 10 (16.4%) were admitted to the ICU, 4 (6.6%) required a ventilator, and 4 (6.6%) died by August 18, 2020. These rates were comparable to those of patients with other cystic kidney diseases and cystic liver-only diseases. ADPKD was not a significant risk factor for any of the four outcomes in multivariable logistic regression analyses when compared with other cystic kidney diseases and cystic liver-only diseases. In contrast, diabetes was a significant risk factor for hospitalization [OR 2.30 (1.61, 3.30), p<0.001], ICU admission [OR 2.23 (1.47, 3.42), p<0.001], and ventilator requirement [OR 2.20 (1.27, 3.88), p=0.005]. Black race significantly increased the risk for ventilator requirement [OR 2.00 (1.18, 3.44), p=0.011] and mortality [OR 1.60 (1.02, 2.51), p=0.040]. We also examined the outcome of starting dialysis after COVID-19 confirmation. The main risk factors for starting dialysis were CKD [OR 6.37 (2.43, 16.7)] and Black race [OR 3.47 (1.48, 8.1)]. After controlling for CKD, ADPKD did not significantly increase the risk of newly starting dialysis compared with other cystic kidney diseases and cystic liver-only diseases. In summary, ADPKD did not significantly alter major COVID-19 outcomes among veterans when compared to patients with other cystic kidney and liver diseases.