As an agricultural country and one of the world’s major food exporters, Thailand relies heavily on the use of pesticides to protect crops and increase yields. During the past decade, the Kingdom of Thailand has experienced an approximate four-fold increase in pesticide use. This increase presents a challenge for the Royal Thai Government in effectively managing and controlling pesticide use based on the current policies and legal infrastructure. We have reviewed several key components for managing agricultural pesticides in Thailand. One of the main obstacles to effective pesticide regulation in Thailand is the lack of a consolidated, uniform system designed specifically for pesticide management. This deficit has weakened the enforcement of existing regulations, resulting in misuse and overuse of pesticides and, consequently, increased environmental contamination and human exposure. This article provides a systematic review of how agricultural pesticides are regulated in Thailand. In addition, we provide our perspectives on the current state of pesticide management, the potential health effects of widespread, largely uncontrolled use of pesticides on the Thai people, and ways to improve pesticide management in Thailand.
by
Hyacinth Hyacinth;
Eunice B Nartey;
Jonathan Spector;
Seth Adu-Afarwuah;
Catherine L Jones;
Alan Jackson;
Agartha Ohemeng;
Rajiv Shah;
Alice Koryo-Dabrah;
Amma Benneh Akwasi Kuma;
Matilda Steiner-Asiedu
Background: Sickle cell disease (SCD) is an inherited blood disorder that predominantly affects individuals in sub-Saharan Africa. However, research that elucidates links between SCD pathophysiology and nutritional status in African patients is lacking. This systematic review aimed to assess the landscape of studies in sub-Saharan Africa that focused on nutritional aspects of SCD and to highlight gaps in knowledge that could inform priority-setting for future research. Methods: The review was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Inclusion criteria comprised original, peer-reviewed research published between January 1995 and November 2020 involving individuals in Africa with any phenotypic variant of SCD and at least one nutritional status outcome. Nutritional status outcomes were defined as those that assessed dietary intakes, growth/anthropometry, or nutritional biomarkers. Databases used were Ovid Embase, Medline, Biosis and Web of Science. Results: The search returned 526 articles, of which 76 were included in the final analyses. Most investigations (67%) were conducted in Nigeria. Studies were categorized into one of three main categories: descriptive studies of anthropometric characteristics (49%), descriptive studies of macro- or micronutrient status (41%), and interventional studies (11%). Findings consistently included growth impairment, especially among children and adolescents from sub-Saharan Africa. Studies assessing macro- and micronutrients generally had small sample sizes and were exploratory in nature. Only four randomized trials were identified, which measured the impact of lime juice, long-chain fatty acid supplementation, ready-to-use supplementary food (RUSF), and oral arginine on health outcomes. Conclusions: The findings reveal a moderate number of descriptive studies, most with small sample sizes, that focused on various aspects of nutrition and SCD in African patients.
There was a stark dearth of interventional studies that could be used to inform evidence-based changes in clinical practice. Findings from the investigations were generally consistent with data from other regional settings, describing a significant risk of growth faltering and malnutrition among individuals with SCD. There is an unmet need for clinical research to better understand the potential benefits of nutrition-related interventions for patients with SCD in sub-Saharan Africa to promote optimal growth and improve health outcomes.
The need to minimise consumer risk, especially for food that can be consumed uncooked, is a continuing public health concern, particularly in places where safe sanitation and hygienic practices are absent. The use of wastewater in agriculture has been associated with disease risks, though its relative significance in disease transmission remains unclear. This study aimed at identifying key risk factors for produce contamination at different entry points of the food chain. Over 500 produce and ready-to-eat salad samples were collected from fields, markets, and kitchens during the dry and wet seasons in Accra, Ghana, and over 300 soil and irrigation water samples were collected. All samples were analysed for E. coli, human adenovirus and norovirus using standard microbiological procedures and real-time RT-PCR. Finally, critical exposures associated with microbial quality of produce were assessed through observations and interviews. The study found that over 80% of produce samples were contaminated with E. coli, with median concentrations ranging from 0.64 to 3.84 Log E. coli/g produce. Prepared salad from street food vendors was the most contaminated (4.23 Log E. coli/g), and contamination of salad exceeded acceptable health limits for consumption. Key risk factors identified for produce contamination were irrigation water and soil at the farm level. Storage duration and temperature of produce had a significant influence on the quality of produce sold at markets, while observations revealed that the wash water used to rinse produce before sale was dirty. The source of produce and operating with a hygiene permit were found to influence salad microbial quality at kitchens. This study argues for a need to manage produce risk factors at all domains along the food chain, though it would be more effective to prioritise at markets and kitchens due to cost, ease of implementation and public health significance.
by
Jason R. Rohr;
Christopher B. Barrett;
David Civitello;
Meggan E. Craft;
Bryan Delius;
Giulio A. De Leo;
Peter J. Hudson;
Nicolas Jouanard;
Karena H. Nguyen;
Richard S. Ostfeld;
Justin Remais;
Gilles Riveau;
Susanne H. Sokolow;
David Tilman
Infectious diseases are emerging globally at an unprecedented rate while global food demand is projected to increase sharply by 2100. Here, we synthesize the pathways by which projected agricultural expansion and intensification will influence human infectious diseases and how human infectious diseases might likewise affect food production and distribution. Feeding 11 billion people will require substantial increases in crop and animal production that will expand agricultural use of antibiotics, water, pesticides and fertilizer, and contact rates between humans and both wild and domestic animals, all with consequences for the emergence and spread of infectious agents. Indeed, our synthesis of the literature suggests that, since 1940, agricultural drivers were associated with >25% of all — and >50% of zoonotic — infectious diseases that emerged in humans, proportions that will likely increase as agriculture expands and intensifies. We identify agricultural and disease management and policy actions, and additional research, needed to address the public health challenge posed by feeding 11 billion people.
Background
Folate is a B‐vitamin required for DNA synthesis, methylation, and cellular division. Wheat and maize (corn) flour are staple crops consumed widely throughout the world and have been fortified with folic acid in over 80 countries to prevent neural tube defects. Folic acid fortification may be an effective strategy to improve folate status and other health outcomes in the overall population.
Objectives
To evaluate the health benefits and safety of folic acid fortification of wheat and maize flour (i.e. alone or in combination with other micronutrients) on folate status and health outcomes in the overall population, compared to wheat or maize flour without folic acid (or no intervention).
Search methods
We searched the following databases in March and May 2018: Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE and MEDLINE In Process, Embase, CINAHL, Web of Science (SSCI, SCI), BIOSIS, Popline, Bibliomap, TRoPHI, ASSIA, IBECS, SCIELO, Global Index Medicus‐AFRO and EMRO, LILACS, PAHO, WHOLIS, WPRO, IMSEAR, IndMED, and Native Health Research Database. We searched the International Clinical Trials Registry Platform and ClinicalTrials.gov for ongoing or planned studies in June 2018, and contacted authors for further information.
Selection criteria
We included randomised controlled trials (RCTs), with randomisation at the individual or cluster level. We also included non‐RCTs and prospective observational studies with a control group; these studies were not included in meta‐analyses, although their characteristics and findings were described. Interventions included wheat or maize flour fortified with folic acid (i.e. alone or in combination with other micronutrients), compared to unfortified flour (or no intervention). Participants were individuals over two years of age (including pregnant and lactating women), from any country.
Data collection and analysis
Two review authors independently assessed study eligibility, extracted data, and assessed risk of bias.
Main results
We included 10 studies: four provided data for quantitative analyses (437 participants); five studies were randomised trials (1182 participants); three studies were non‐RCTs (1181 participants, 8037 live births); two studies were interrupted time series (ITS) studies (1 study population of 2,242,438, 1 study unreported). Six studies were conducted in upper‐middle‐income countries (China, Mexico, South Africa), one study was conducted in a lower‐middle‐income country (Bangladesh), and three studies were conducted in a high‐income country (Canada). Seven studies examined wheat flour fortified with folic acid alone or with other micronutrients. Three studies included maize flour fortified with folic acid alone or with other micronutrients. The duration of interventions ranged from two weeks to 36 months, and the ITS studies included postfortification periods of up to seven years. Most studies had unclear risk of bias for randomisation, blinding, and reporting, and low/unclear risk of bias for attrition and contamination.
Neural tube defects: none of the included RCTs reported neural tube defects as an outcome. In one non‐RCT, wheat flour fortified with folic acid and other micronutrients was associated with significantly lower occurrence of total neural tube defects, spina bifida, and encephalocoele, but not anencephaly, compared to unfortified flour (total neural tube defects risk ratio (RR) 0.32, 95% confidence interval (CI) 0.21 to 0.48; 1 study, 8037 births; low‐certainty evidence).
Folate status: pregnant women who received folic acid‐fortified maize porridge had significantly higher erythrocyte folate concentrations (mean difference (MD) 238.90 nmol/L, 95% CI 149.40 to 328.40; 1 study, 38 participants; very low‐certainty evidence) and higher plasma folate (MD 14.98 nmol/L, 95% CI 9.63 to 20.33; 1 study, 38 participants; very low‐certainty evidence), compared to no intervention. Women of reproductive age consuming maize flour fortified with folic acid and other micronutrients did not have higher erythrocyte folate (MD ‐61.80 nmol/L, 95% CI ‐152.98 to 29.38; 1 study, 35 participants; very low‐certainty evidence) or plasma folate (MD 0.00 nmol/L, 95% CI ‐0.00 to 0.00; 1 study, 35 participants; very low‐certainty evidence) concentrations, compared to women consuming unfortified maize flour. Adults consuming folic acid‐fortified wheat flour bread rolls had higher erythrocyte folate (MD 0.66 nmol/L, 95% CI 0.13 to 1.19; 1 study, 30 participants; very low‐certainty evidence) and plasma folate (MD 27.00 nmol/L, 95% CI 15.63 to 38.37; 1 study, 30 participants; very low‐certainty evidence), versus unfortified flour. In two non‐RCTs, serum folate concentrations were significantly higher among women who consumed flour fortified with folic acid and other micronutrients compared to women who consumed unfortified flour (MD 2.92 nmol/L, 95% CI 1.99 to 3.85; 2 studies, 657 participants; very low‐certainty evidence).
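Pooled estimates such as the two-study serum folate mean difference above are conventionally obtained by inverse-variance fixed-effect weighting: each study is weighted by the reciprocal of its squared standard error. A minimal sketch of that calculation, using hypothetical per-study mean differences and standard errors rather than the review's actual data:

```python
import math

def fixed_effect_pool(mds, ses):
    """Inverse-variance fixed-effect pooling of mean differences.

    mds: per-study mean differences; ses: their standard errors.
    Returns (pooled MD, 95% CI lower bound, 95% CI upper bound).
    """
    weights = [1.0 / se ** 2 for se in ses]          # inverse-variance weights
    pooled = sum(w * md for w, md in zip(weights, mds)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))        # SE of the pooled estimate
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Hypothetical example: two studies with MDs of 2.5 and 3.4 nmol/L
pooled, ci_lo, ci_hi = fixed_effect_pool([2.5, 3.4], [0.6, 0.7])
print(f"pooled MD {pooled:.2f} nmol/L, 95% CI {ci_lo:.2f} to {ci_hi:.2f}")
```

The more precise study (smaller standard error) pulls the pooled estimate toward its own value, which is the intended behaviour of inverse-variance weighting.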
Haemoglobin or anaemia: in a cluster‐randomised trial among children, there were no significant effects of fortified wheat flour flatbread on haemoglobin concentrations (MD 0.00 nmol/L, 95% CI ‐2.08 to 2.08; 1 study, 334 participants; low‐certainty evidence) or anaemia (RR 1.07, 95% CI 0.74 to 1.55; 1 study, 334 participants; low‐certainty evidence), compared to unfortified wheat flour flatbread.
Authors' conclusions
Fortification of wheat flour with folic acid may reduce the risk of neural tube defects; however, this outcome was only reported in one non‐RCT. Fortification of wheat or maize flour with folic acid (i.e. alone or with other micronutrients) may increase erythrocyte and serum/plasma folate concentrations. Evidence is limited for the effects of folic acid‐fortified wheat or maize flour on haemoglobin levels or anaemia. The effects of folic acid fortification of wheat or maize flour on the other primary outcomes assessed in this review are not known. No studies reported on the occurrence of adverse effects. Limitations of this review were the small number of studies and participants, limitations in study design, and low certainty of evidence due to how included studies were designed and reported.
Despite strong policy and program commitment, essential maternal nutrition services are not reaching enough women in many countries. This paper examined multifactorial determinants (personal, family, community, and health services) associated with maternal nutrition practices in Uttar Pradesh, India. Data were from a household survey of pregnant (n = 667) and recently delivered women (n = 1,835). Multivariable regression analyses were conducted to examine the determinants of four outcomes: consumption of diverse diets, consumption of iron folic acid (IFA) and calcium tablets, and weight monitoring during pregnancy.
Population attributable risk analysis was used to estimate how much the outcomes could be improved under optimal program implementation. During pregnancy, women consumed 28 IFA and 8 calcium tablets, 18% consumed a diverse diet, and 17% were weighed ≥3 times. Nutrition knowledge was associated with consumption of a diverse diet (odds ratio [OR] = 2.2), IFA (OR = 2.3), calcium (OR = 11.7), and weight monitoring (OR = 1.3). Beliefs and self-efficacy were associated with IFA (OR = 2.0) and calcium consumption (OR = 4.6). Family support and adequate health services were also associated with better nutrition practices.
Under optimal program implementation, we estimate that 51% of women would have adequate diet diversity, an average consumption of 98 IFA and 106 calcium tablets, and women would be weighed 4.9 times during pregnancy. Strengthening existing program operations and increasing demand for services have the potential to result in large improvements in maternal nutrition practices from current baseline levels but may not be sufficient to meet World Health Organization-recommended levels without creating an enabling environment, including improvements in education and income levels, to support behaviour change.
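Projections of this kind typically rest on the population attributable risk fraction, which converts an exposure's prevalence and relative risk into the share of the outcome attributable to that exposure. A minimal sketch of Levin's formula, with hypothetical inputs rather than the paper's estimates:

```python
def levin_par(prevalence, relative_risk):
    """Levin's population attributable risk fraction.

    prevalence: proportion of the population exposed to the risk factor;
    relative_risk: risk in the exposed relative to the unexposed.
    """
    excess = prevalence * (relative_risk - 1.0)  # excess risk from exposure
    return excess / (1.0 + excess)

# Hypothetical example: 30% of women exposed, relative risk 2.0
print(round(levin_par(0.30, 2.0), 3))  # → 0.231
```

Here roughly 23% of the outcome would be attributable to the exposure, i.e. the maximum improvement achievable if the risk factor were fully removed under optimal implementation.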
In the agricultural setting, core global food safety elements, such as hand hygiene and worker furlough, should reduce the risk of norovirus contamination on fresh produce. However, the effect of these practices has not been characterized. Using a quantitative microbial risk model, we evaluated the individual and combined effect of farm-based hand hygiene and worker furlough practices on the maximum risk of norovirus infection from three produce commodities (open leaf lettuce, vine tomatoes, and raspberries). Specifically, we tested two scenarios where a harvester's and packer's norovirus infection status was: 1) assumed positive; or 2) assigned based on community norovirus prevalence estimates. In the first scenario with a norovirus-positive harvester and packer, none of the individual interventions modeled reduced produce contamination to below the norovirus infectious dose. However, combined interventions, particularly high handwashing compliance (100%) and efficacy (6 log10 virus removal achieved using soap and water for 30 s), reduced produce contamination to <1–82 residual virus. Translating produce contamination to maximum consumer infection risk, 100% handwashing with a 5 log10 virus removal was necessary to achieve an infection risk below the threshold of 0.032 infections per consumption event. When community-based norovirus prevalence estimates were applied to the harvester and packer, the single interventions of 100% handwashing with 3 log10 virus removal (average 0.02 infection risk per consumption event) or furlough of the packer (average 0.03 infection risk per consumption event) reduced maximum infection risk to below the 0.032 threshold for all commodities. Bundled interventions (worker furlough, 100% glove compliance, and 100% handwashing with 1-log10 virus reduction) resulted in a maximum risk of 0.02 per consumption event across all commodities. 
These results advance the evidence-base for global produce safety standards as effective norovirus contamination and risk mitigation strategies.
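The combined-intervention results above follow from log10-reduction arithmetic: each log10 of removal divides the remaining virus by ten, and independent interventions add on the log scale. A minimal sketch of that bookkeeping, with a hypothetical initial viral load rather than the study's modelled inputs:

```python
def residual_virus(initial_load, log_reductions):
    """Virus remaining after a sequence of interventions.

    initial_load: virus particles on the produce item (hypothetical);
    log_reductions: log10 removal credited to each intervention.
    Independent reductions sum on the log10 scale.
    """
    total_log_reduction = sum(log_reductions)
    return initial_load * 10 ** (-total_log_reduction)

# Hypothetical: 1e6 particles; handwashing (5 log10) + glove use (1 log10)
print(residual_virus(1e6, [5.0, 1.0]))  # ≈ 1 residual virus particle
```

This additivity is why the modelled bundles (e.g. furlough plus gloves plus handwashing) succeed where single interventions fail: their log reductions accumulate until residual contamination falls below the infectious dose.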
Propionibacterium acnes is implicated in the pathogenesis of acne vulgaris, which impacts >85% of teenagers. Novel therapies are in high demand, and an ethnopharmacological approach to discovering new plant sources of anti-acne therapeutics could contribute to filling this void in effective therapies. The aims of our study were two-fold: (1) to determine if species identified in ethnopharmacological field studies as having traditional uses for skin and soft tissue infection (SSTI) exhibit significantly more activity against P. acnes than species with no such reported use; and (2) to chemically characterize active extracts and assess their suitability for future investigation. Extracts of Italian plants and fungi, both species used medicinally for acne and other skin infections and randomly collected species, were screened for growth-inhibitory and anti-biofilm activity against P. acnes using broth microdilution methods. Bioactive extracts were chemically characterized by HPLC and examined for cytotoxicity against human keratinocytes (HaCaTs). Following evaluation of 157 extracts from 10 fungi and 58 plants, we identified crude extracts from seven species exhibiting growth inhibitory activity (MICs 64-256 μg mL-1). All active extracts were examined for cytotoxicity against HaCaTs; extracts from one fungal and one plant species were toxic (IC50 256 μg mL-1). HPLC analysis with chemical standards revealed many of these extracts contained chlorogenic acid, p-coumaric acid, ellagic acid, gallic acid, and tannic acid. In conclusion, species used in traditional medicine for the skin exhibited significantly greater (p < 0.05) growth inhibitory and biofilm eradication activity than random species, supporting the validity of an ethnobotanical approach to identifying new therapeutics. The anti-acne activity of three extracts is reported for the first time: Vitis vinifera leaves, Asphodelus microcarpus leaves, and Vicia sativa aerial parts.
Antibiotic resistance poses one of the greatest threats to global health today; conventional drug therapies are becoming increasingly inefficacious and limited. We identified 16 medicinal plant species used by traditional healers for the treatment of infectious and inflammatory diseases in the Greater Mpigi region of Uganda. Extracts were evaluated for their ability to inhibit growth of clinical isolates of multidrug-resistant ESKAPE pathogens. Extracts were also screened for quorum quenching activity against S. aureus, including direct protein output assessment (δ-toxin), and cytotoxicity against human keratinocytes (HaCaT). Putative matches of compounds were elucidated via LC–FTMS for the best-performing extracts.
These were extracts of Zanthoxylum chalybeum (Staphylococcus aureus: MIC: 16 μg/mL; Enterococcus faecium: MIC: 32 μg/mL) and Harungana madagascariensis (S. aureus: MIC: 32 μg/mL; E. faecium: MIC: 32 μg/mL) stem bark. Extracts of Solanum aculeastrum root bark and Sesamum calycinum subsp. angustifolium leaves exhibited strong quorum sensing inhibition activity against all S. aureus accessory gene regulator (agr) alleles in the absence of growth inhibition (IC50 values: 1–64 μg/mL). The study provided scientific evidence for the potential therapeutic efficacy of these Greater Mpigi medicinal plants used for infections and wounds, with 13 of the 16 species tested validated by in vitro studies.
While curbing the spread of Coronavirus Disease 2019 (COVID-19), lockdown policies and “stay-at-home” restrictions caused massive supply chain disruptions worldwide. This led to breaks in spatial market integration, which could in turn lead to market inefficiency and resource misallocation. Using daily price data from 2016 to 2021, this study investigates COVID-19's effect on the spatial market integration of fish in China using cointegration tests. We find a high degree of spatial market integration for fish in China before the COVID-19 pandemic. Further, our results show that COVID-19's effect on the spatial market integration of fish varies spatially in China. Specifically, COVID-19 reduces the degree of spatial market integration in most provinces, especially those with high infection rates. Meanwhile, the degree of spatial market integration in provinces with low infection rates remains high. Therefore, the government should be regionally specific when formulating market recovery policies.
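The cointegration logic behind such tests can be illustrated with a simplified Engle-Granger-style sketch: two integrated markets share a common stochastic price trend, so the residual from regressing one price on the other should be mean-reverting. The simulation below is a stylised stand-in, not the study's data or method; a real analysis would apply a formal unit-root test to the residual (e.g. an ADF test, as in statsmodels' `coint`).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two "market" price series sharing one random-walk trend,
# a stylised stand-in for spatially integrated fish markets.
n = 1000
trend = np.cumsum(rng.normal(size=n))            # common stochastic trend
price_a = trend + rng.normal(scale=0.5, size=n)  # market A price
price_b = 2.0 + trend + rng.normal(scale=0.5, size=n)  # market B price

# Engle-Granger step 1: regress one price on the other (OLS).
beta, alpha = np.polyfit(price_a, price_b, 1)
resid = price_b - (alpha + beta * price_a)       # the "spread"

# Step 2 (simplified): a mean-reverting spread suggests cointegration.
phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]   # lag-1 autocorrelation
print(f"beta={beta:.2f}, lag-1 autocorrelation of spread={phi:.2f}")
```

For integrated markets the estimated beta is close to one and the spread's autocorrelation is far below one; a break in integration (as the study finds during COVID-19) would show up as a spread that drifts rather than reverts.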