Background
Health clinics in rural Africa are typically resource-limited. As a result, many patients presenting with fever are treated with anti-malarial drugs based on clinical presentation alone. This is a considerable problem in Uganda, where malaria is routinely over-diagnosed and over-treated, wasting resources and raising the risk of mortality in misdiagnosed patients. However, rapid diagnostic tests (RDTs) for malaria are increasingly being used in health facilities. Because they are fast, easy to use, and inexpensive, RDTs offer feasible diagnostic capacity in resource-limited areas. This study evaluated the rate of malaria misdiagnosis and the accuracy of RDTs in rural Uganda, where presumptive diagnosis still predominates. Specifically, the diagnostic accuracy of the “gold standard” methods, microscopy and PCR, was compared with that of the most feasible method, RDTs.
Methods
Patients presenting with fever at one of two health clinics in the Kabarole District of Uganda were enrolled in this study. Blood was collected by finger prick and used to administer RDTs, make blood smears for microscopy, and blot Whatman FTA cards for DNA extraction, polymerase chain reaction (PCR) amplification, and sequencing. The accuracy of RDTs and microscopy was assessed relative to PCR, considered the new standard of malaria diagnosis.
Results
A total of 78 patients were enrolled, and 31 were diagnosed with Plasmodium infection by at least one method. Pairwise comparison showed that RDTs and microscopy performed similarly, with sensitivities of 92.6 % and 92.0 % and specificities of 95.5 % and 94.4 %, respectively. Combining both methods resulted in a sensitivity of 96.0 % and a specificity of 100 %. However, both RDTs and microscopy missed one case of non-falciparum malaria (Plasmodium malariae) that was identified and characterized by PCR and sequencing. In total, based on PCR, 62.0 % of patients would have been misdiagnosed with malaria had symptomatic diagnosis alone been used.
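The sensitivity and specificity figures above come from pairwise 2×2 comparisons against PCR. A minimal sketch of the calculation follows; the counts below are illustrative placeholders chosen to reproduce the reported RDT percentages, not the study's actual 2×2 table.

```python
def sensitivity(tp, fn):
    """True-positive rate: share of PCR-positive cases the test detects."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: share of PCR-negative cases the test clears."""
    return tn / (tn + fp)

# Illustrative counts only -- the abstract reports percentages, not raw cells.
# RDT vs. PCR: 25 true positives, 2 false negatives,
#              42 true negatives, 2 false positives.
tp, fn, tn, fp = 25, 2, 42, 2

print(f"sensitivity = {sensitivity(tp, fn):.1%}")  # 25/27 ≈ 92.6%
print(f"specificity = {specificity(tn, fp):.1%}")  # 42/44 ≈ 95.5%
```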
Conclusions
Diagnosis of malaria based on symptoms alone appears to be highly inaccurate in this setting. Furthermore, RDTs were very effective at diagnosing malaria, performing as well as or better than microscopy. However, only PCR and DNA sequencing detected non-P. falciparum species, which highlights an important limitation of RDTs and a treatment concern for non-falciparum malaria patients. Nevertheless, RDTs appear to be the only feasible method in rural or resource-limited areas, and therefore offer the best way forward for malaria management in endemic countries.
by
Elizabeth L. Barry;
Leila A. Mott;
Michal L. Melamed;
Judith R. Rees;
Anastasia Ivanova;
Robert S. Sandler;
Dennis J. Ahnen;
Robert S. Bresalier;
Robert W. Summers;
Roberd Bostick;
John A. Baron
Background
Calcium supplements are widely used among older adults for osteoporosis prevention and treatment. However, their effect on creatinine levels and kidney function has not been well studied.
Methods
We investigated the effect of calcium supplementation on blood creatinine concentration in a randomized controlled trial of colorectal adenoma chemoprevention conducted between 2004 and 2013 at 11 clinical centers in the United States. Healthy participants (N = 1,675) aged 45–75 years with a history of colorectal adenoma were assigned to daily supplementation with calcium (1200 mg, as carbonate), vitamin D3 (1000 IU), both, or placebo for three or five years. Changes in blood creatinine and total calcium concentrations were measured after one year of treatment, and multiple linear regression was used to estimate effects on creatinine concentration.
Results
After one year of treatment, blood creatinine was 0.013±0.006 mg/dL higher on average among participants randomized to calcium compared to placebo, after adjustment for other determinants of creatinine (P = 0.03). However, the effect of calcium treatment appeared to be larger among participants who consumed the most alcohol (2–6 drinks/day) or whose estimated glomerular filtration rate (eGFR) was less than 60 mL/min/1.73 m² at baseline. The effect of calcium treatment on creatinine was only partially mediated by a concomitant increase in blood total calcium concentration and was independent of randomized vitamin D treatment. There did not appear to be further increases in creatinine after the first year of calcium treatment.
Conclusions
Among healthy adults participating in a randomized clinical trial, daily supplementation with 1200 mg of elemental calcium caused a small increase in blood creatinine. If confirmed, this finding may have implications for clinical and public health recommendations for calcium supplementation.
Asymptomatic infection by fecal enteropathogens is a major contributor to childhood malnutrition. Here, we investigated the incidence rate of asymptomatic infection by enterotoxigenic Escherichia coli (ETEC) and assessed its association with childhood stunting, wasting, and being underweight among children under 2 years of age. The Malnutrition and Enteric Disease birth cohort study included 1,715 children who were followed from birth to 24 months of age at eight distinct geographic locations: Bangladesh, Brazil, India, Peru, Tanzania, Pakistan, Nepal, and South Africa. The TaqMan array card assay was used to determine the presence of ETEC in the nondiarrheal stool samples collected from these children. Poisson regression was used to estimate the incidence rate, and multiple generalized estimating equations with a binomial family, logit link function, and exchangeable correlation structure were used to analyze the association between asymptomatic ETEC infection and anthropometric indicators such as stunting, wasting, and being underweight. The site-specific incidence rates of asymptomatic ETEC infection per 100 child-months were highest at the study locations in Tanzania (54.81 [95% CI: 52.64, 57.07]) and Bangladesh (46.75 [95% CI: 44.75, 48.83]). At the Bangladesh, India, and Tanzania sites, the composite indicator of anthropometric failure was significantly associated with asymptomatic ETEC infection. Furthermore, a significant association between asymptomatic heat-stable toxin ETEC infection and childhood stunting, wasting, and being underweight was found only at the Bangladesh and Tanzania sites.
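An incidence rate per 100 child-months is simply events divided by accumulated person-time, scaled by 100. The sketch below adds a large-sample confidence interval on the log rate; the counts are hypothetical, and the study's actual estimates come from Poisson regression, not this shortcut.

```python
import math

def incidence_rate(events, child_months, per=100.0):
    """Incidence rate per `per` child-months, with a 95% CI from a
    normal approximation on the log rate (SE = 1/sqrt(events))."""
    rate = per * events / child_months
    se_log = 1.0 / math.sqrt(events)
    lo = rate * math.exp(-1.96 * se_log)
    hi = rate * math.exp(+1.96 * se_log)
    return rate, lo, hi

# Hypothetical example: 2,200 asymptomatic ETEC detections
# over 4,000 child-months of follow-up.
rate, lo, hi = incidence_rate(2200, 4000)
print(f"{rate:.2f} per 100 child-months (95% CI: {lo:.2f}, {hi:.2f})")
```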
Background: It has been estimated that $154 million per year will be required during 2015–2020 to continue the Global Programme to Eliminate Lymphatic Filariasis (GPELF). In light of this, it is important to understand the program's current value. Here, we evaluate the cost-effectiveness and cost-benefit of the preventive chemotherapy provided under the GPELF between 2000 and 2014. We also investigate the potential cost-effectiveness of hydrocele surgery.
Methods: Our economic evaluation of preventive chemotherapy was based on previously published health and economic impact estimates (between 2000 and 2014). The delivery costs of treatment were estimated using a model developed by the World Health Organization. We also developed a model to investigate the number of disability-adjusted life years (DALYs) averted by a hydrocelectomy and identified the cost threshold under which it would be considered cost-effective.
Results: The projected cost-effectiveness and cost-benefit of preventive chemotherapy were very promising, and this was robust over a wide range of costs and assumptions. When the economic value of the donated drugs was not included, the GPELF would be classed as highly cost-effective. We projected that a typical hydrocelectomy would be classed as highly cost-effective if the surgery cost less than $66 and cost-effective if less than $398 (based on the World Bank's cost-effectiveness thresholds for low income countries).
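The classification logic behind such thresholds can be sketched as follows: a surgery's cost per DALY averted is compared against multiples of GDP per capita. The DALYs-averted figure and the GDP-per-capita value below are placeholder assumptions for illustration, not the study's estimates.

```python
# Placeholder inputs -- NOT the study's actual values.
DALYS_AVERTED_PER_SURGERY = 0.11  # assumed lifetime DALYs averted per hydrocelectomy
GDP_PER_CAPITA = 600.0            # assumed low-income-country GDP per capita (USD)

def classify(surgery_cost):
    """WHO-CHOICE-style rule: 'highly cost-effective' if the cost per DALY
    averted is below 1x GDP per capita, 'cost-effective' if below 3x."""
    cost_per_daly = surgery_cost / DALYS_AVERTED_PER_SURGERY
    if cost_per_daly < GDP_PER_CAPITA:
        return "highly cost-effective"
    if cost_per_daly < 3 * GDP_PER_CAPITA:
        return "cost-effective"
    return "not cost-effective"

print(classify(60))   # 60/0.11 ≈ $545 per DALY, under 1x GDP
print(classify(150))  # ≈ $1,364 per DALY, between 1x and 3x GDP
```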
Conclusions: Both the preventive chemotherapy and the hydrocele surgeries provided under the GPELF are highly cost-effective and represent a very good investment in public health.
by
Matthew N. Ezewudo;
Sandeep J. Joseph;
Santiago Castillo-Ramirez;
Deborah Dean;
Carlos Del Rio;
Xavier Didelot;
Jo-Anne Dillon;
Richard F. Selden;
William Shafer;
Rosemary S. Turingan;
Magnus Unemo;
Timothy Read
Neisseria gonorrhoeae is the causative agent of gonorrhea, a sexually transmitted infection (STI) of major importance. As a result of antibiotic resistance, there are now limited options for treating patients. We collected draft genome sequence data and associated metadata on 76 N. gonorrhoeae strains from around the globe and searched the strains for known determinants of antibiotic resistance. The population structure and evolutionary forces within the pathogen population were analyzed. Our results indicated a cosmopolitan gonococcal population made up mainly of five subgroups. The estimated ratio of recombination to mutation (r/m = 2.2) in our data set indicates an appreciable level of recombination occurring in the population. Strains with resistance phenotypes to the more recent antibiotics (azithromycin and cefixime) were found mostly in two of the five population subgroups.
Background: Little is known about environmental determinants of autoimmune diseases.
Objectives: We studied autoimmune diseases in relation to level of exposure to perfluorooctanoic acid (PFOA), which was introduced in the late 1940s and is now ubiquitous in the serum of residents of industrialized countries.
Methods: In 2008–2011 we interviewed 32,254 U.S. adults with high serum PFOA levels (median, 28 ng/mL) associated with drinking contaminated water near a chemical plant. Disease history was assessed retrospectively from 1952 or birth (if later than 1952) until interview. Self-reported history of autoimmune disease was validated via medical records. Cumulative exposure to PFOA was derived from estimates of annual mean serum PFOA levels during follow-up, which were based on plant emissions, residential and work history, and a fate-transport model. Cox regression models were used to estimate associations between quartiles of cumulative PFOA serum levels and the incidence of autoimmune diseases with ≥ 50 validated cases, including ulcerative colitis (n = 151), Crohn’s disease (n = 96), rheumatoid arthritis (n = 346), insulin-dependent diabetes (presumed to be type 1) (n = 160), lupus (n = 75), and multiple sclerosis (n = 98).
Results: The incidence of ulcerative colitis was significantly increased in association with PFOA exposure, with adjusted rate ratios by quartile of exposure of 1.00 (referent), 1.76 (95% CI: 1.04, 2.99), 2.63 (95% CI: 1.56, 4.43), and 2.86 (95% CI: 1.65, 4.96) (p-trend < 0.0001). A prospective analysis of ulcerative colitis diagnosed after the baseline 2005–2006 survey (n = 29 cases) suggested a positive but non-monotonic trend (p-trend = 0.21).
Discussion: To our knowledge, this is the first study of associations between this common environmental exposure and autoimmune diseases in humans. We found evidence that PFOA is associated with ulcerative colitis.
Social networks are believed to affect health-related behaviors and health. Data to examine the links between social relationships and health in low- and middle-income country settings are limited. We provide guidance for introducing an instrument to collect social network data as part of epidemiological surveys, drawing on experience in urban India. We describe development and fielding of an instrument to collect social network information relevant to health behaviors among adults participating in a large, population-based study of non-communicable diseases in Delhi, India. We discuss basic characteristics of social networks relevant to health including network size, health behaviors of network partners (i.e., network exposures), network homogeneity, network diversity, strength of ties, and multiplexity. Data on these characteristics can be collected using a short instrument of 11 items asked about up to 5 network members and 3 items about the network generally, administered in approximately 20 minutes. We found high willingness to respond to questions about social networks (97% response). Respondents identified an average of 3.8 network members, most often relatives (80% of network ties), particularly blood relationships. Ninety-one percent of respondents reported that their primary contacts for discussing health concerns were relatives. Among all listed ties, 91% of most frequent snack partners and 64% of exercise partners in the last two weeks were relatives. These results demonstrate that family relationships are the crux of social networks in some settings, including among adults in urban India. Collecting basic information about social networks can be feasibly and effectively done within ongoing epidemiological studies.
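The network characteristics described (size, share of kin ties, homogeneity) reduce to simple proportions over a respondent's roster. A sketch using a hypothetical four-person roster; the field names and values are invented for illustration, not drawn from the instrument.

```python
from collections import Counter

# Hypothetical roster for one respondent (illustration only).
alters = [
    {"relation": "relative", "discusses_health": True},
    {"relation": "relative", "discusses_health": True},
    {"relation": "friend",   "discusses_health": False},
    {"relation": "relative", "discusses_health": False},
]

network_size = len(alters)
# Share of ties that are kin (the study found ~80% of ties were relatives).
kin_share = sum(a["relation"] == "relative" for a in alters) / network_size
# Homogeneity here = share of ties in the modal relationship category.
homogeneity = (Counter(a["relation"] for a in alters)
               .most_common(1)[0][1] / network_size)

print(network_size, kin_share, homogeneity)
```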
We systematically evaluated studies published through May 2014 in which investigators assessed the dose-response relationship between serum levels of 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) and the occurrence of diabetes mellitus (DM), and we investigated the extent and sources of interstudy heterogeneity. The dose-response relationship between serum TCDD and DM across studies was examined using 2 dependent variables: an exposure level–specific proportion of persons with DM and a corresponding natural log-transformed ratio measure of the association between TCDD and DM. Regression slopes for each dependent variable were obtained for each study and included in a random-effects meta-analysis. Sensitivity analyses were used to assess the influence of inclusion and exclusion decisions, and sources of heterogeneity were explored using meta-regression models and a series of subanalyses. None of the summary estimates in the main models or in the sensitivity analyses indicated a statistically significant association. We found a pronounced dichotomy: a positive dose-response in cross-sectional studies of populations with low-level TCDD exposures (serum concentrations <10 pg/g lipid) and heterogeneous, but on balance null, results for prospective studies of persons with high prediagnosis TCDD body burdens. Considering the discrepancy of results for low current versus high past TCDD levels, the available data do not indicate that increasing TCDD exposure is associated with an increased risk of DM.
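Pooling per-study regression slopes in a random-effects meta-analysis is commonly done with the DerSimonian–Laird estimator, which inflates each study's variance by an estimate of between-study variance before weighting. A self-contained sketch; the slopes and variances below are invented for illustration, not the reviewed studies' values.

```python
import math

def dersimonian_laird(estimates, variances):
    """Pool per-study slopes (e.g., log ratio measures per unit TCDD)
    with a DerSimonian-Laird random-effects model.
    Returns (pooled estimate, standard error)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    # Cochran's Q heterogeneity statistic.
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance, floored at 0
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se

# Hypothetical per-study slopes and variances (illustration only).
est = [0.20, -0.05, 0.10, 0.02]
var = [0.01, 0.02, 0.015, 0.03]
pooled, se = dersimonian_laird(est, var)
print(f"pooled slope = {pooled:.3f} (SE {se:.3f})")
```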