Agricultural land reforms are crucial to promote investments in sustainable land management and food production amidst accelerating urbanization and increasing population growth. However, notable gaps remain in the literature regarding how land reforms designed at the national level are implemented in localized contexts, especially as they interact with customary tenure regimes. Adopting an institutional bricolage perspective, we explore interactions between local tenure arrangements and government land reforms and the resulting implications for food production in rural Mali. We show that specific market-based land tenure arrangements in the study area emerged from a combination of urbanization pressures and government-designed land reform. We find that tenure security is linked to agricultural investment decisions, as also documented by previous studies. We likewise show that anxieties and ambiguities stemming from state-mandated land registration foster the emergence of monetized forms of access to collective land. These new market-based systems drive greater out-migration of productive community members, leading to labour shortages and weakening the social cohesion and mutual support systems upon which the most vulnerable depend. The findings show that top-down land reforms in rural Mali lead to disruptions of the social fabric, along with re-organizations of tenure systems to accommodate social norms and priorities. We illustrate how, in the context of centralized policy making with limited local consultation, community members resist cooperating and creatively search for alternatives to achieve their social goals. Empirical investigations of socio-institutional challenges such as land tenure arrangements are critical for effective scaling of agricultural innovations and sustainable food production.
Innovation platforms have emerged as a way of enhancing the resilience of agricultural and food systems in the face of environmental change. Consequently, a great deal of theoretical reflection and empirical research has been devoted to understanding the factors that enhance and constrain their functionality. In this article, we extend this enquiry by applying the concept of institutional embeddedness, understood as encompassing elements of platform design, structure, and functions as well as aspects of the broader historical, political, and social context to which platforms are connected. We present a case study of sub-national platforms established in three districts of the climatically stressed Upper West Region of Ghana and charged with facilitating climate change responses at the local level and channelling community priorities into national climate change policy. In each district, members chose a different kind of organization (the traditional chief council, the agricultural extension service, or a local NGO) to convene and coordinate the platform. We examine platform members’ accounts of platform formation and the selection of the facilitating agent, their vision for platform roles, and their understandings of platform agendas and impacts. We analyse these narratives through the lens of institutional embeddedness, as expressed mostly, but not solely, in the choice of facilitating agents. We illustrate how the organizational position, and related vested interests, of facilitating agents contribute to shaping platform agendas, functions, and outcomes. This process hinges on the deployment of legitimacy claims, which may appeal to cultural tradition, technical expertise, community engagement, and dominant scientific narratives on climate change.
Institutional embeddedness is thereby shown to be a critical aspect of agency in multi-actor processes, contributing to framing local understandings of climate change and to channelling collective efforts towards select response strategies. In conclusion, we stress that the institutional identity of facilitating agents and their relationships to platform members and to powerholders in the broader context provide a useful diagnostic lens for analysing the processes that shape a platform’s ability to achieve its goals.
Background
Folate is a B‐vitamin required for DNA synthesis, methylation, and cellular division. Wheat and maize (corn) are staple crops consumed widely throughout the world, and their flours have been fortified with folic acid in over 80 countries to prevent neural tube defects. Folic acid fortification may be an effective strategy to improve folate status and other health outcomes in the overall population.
Objectives
To evaluate the health benefits and safety of folic acid fortification of wheat and maize flour (i.e. alone or in combination with other micronutrients) on folate status and health outcomes in the overall population, compared to wheat or maize flour without folic acid (or no intervention).
Search methods
We searched the following databases in March and May 2018: Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE and MEDLINE In Process, Embase, CINAHL, Web of Science (SSCI, SCI), BIOSIS, Popline, Bibliomap, TRoPHI, ASSIA, IBECS, SCIELO, Global Index Medicus‐AFRO and EMRO, LILACS, PAHO, WHOLIS, WPRO, IMSEAR, IndMED, and Native Health Research Database. We searched the International Clinical Trials Registry Platform and ClinicalTrials.gov for ongoing or planned studies in June 2018, and contacted authors for further information.
Selection criteria
We included randomised controlled trials (RCTs), with randomisation at the individual or cluster level. We also included non‐RCTs and prospective observational studies with a control group; these studies were not included in meta‐analyses, although their characteristics and findings were described. Interventions included wheat or maize flour fortified with folic acid (i.e. alone or in combination with other micronutrients), compared to unfortified flour (or no intervention). Participants were individuals over two years of age (including pregnant and lactating women), from any country.
Data collection and analysis
Two review authors independently assessed study eligibility, extracted data, and assessed risk of bias.
Main results
We included 10 studies: four provided data for quantitative analyses (437 participants); five studies were randomised trials (1182 participants); three studies were non‐RCTs (1181 participants, 8037 live births); two studies were interrupted time series (ITS) studies (1 study population of 2,242,438, 1 study unreported). Six studies were conducted in upper‐middle‐income countries (China, Mexico, South Africa), one study was conducted in a lower‐middle‐income country (Bangladesh), and three studies were conducted in a high‐income country (Canada). Seven studies examined wheat flour fortified with folic acid alone or with other micronutrients. Three studies included maize flour fortified with folic acid alone or with other micronutrients. The duration of interventions ranged from two weeks to 36 months, and the ITS studies included postfortification periods of up to seven years. Most studies had unclear risk of bias for randomisation, blinding, and reporting, and low/unclear risk of bias for attrition and contamination.
Neural tube defects: none of the included RCTs reported neural tube defects as an outcome. In one non‐RCT, wheat flour fortified with folic acid and other micronutrients was associated with significantly lower occurrence of total neural tube defects, spina bifida, and encephalocoele, but not anencephaly, compared to unfortified flour (total neural tube defects risk ratio (RR) 0.32, 95% confidence interval (CI) 0.21 to 0.48; 1 study, 8037 births; low‐certainty evidence).
Folate status: pregnant women who received folic acid‐fortified maize porridge had significantly higher erythrocyte folate concentrations (mean difference (MD) 238.90 nmol/L, 95% CI 149.40 to 328.40; 1 study, 38 participants; very low‐certainty evidence) and higher plasma folate (MD 14.98 nmol/L, 95% CI 9.63 to 20.33; 1 study, 38 participants; very low‐certainty evidence), compared to no intervention. Women of reproductive age consuming maize flour fortified with folic acid and other micronutrients did not have higher erythrocyte folate (MD ‐61.80 nmol/L, 95% CI ‐152.98 to 29.38; 1 study, 35 participants; very low‐certainty evidence) or plasma folate (MD 0.00 nmol/L, 95% CI ‐0.00 to 0.00; 1 study, 35 participants; very low‐certainty evidence) concentrations, compared to women consuming unfortified maize flour. Adults consuming folic acid‐fortified wheat flour bread rolls had higher erythrocyte folate (MD 0.66 nmol/L, 95% CI 0.13 to 1.19; 1 study, 30 participants; very low‐certainty evidence) and plasma folate (MD 27.00 nmol/L, 95% CI 15.63 to 38.37; 1 study, 30 participants; very low‐certainty evidence), versus unfortified flour. In two non‐RCTs, serum folate concentrations were significantly higher among women who consumed flour fortified with folic acid and other micronutrients compared to women who consumed unfortified flour (MD 2.92 nmol/L, 95% CI 1.99 to 3.85; 2 studies, 657 participants; very low‐certainty evidence).
Haemoglobin or anaemia: in a cluster‐randomised trial among children, there were no significant effects of fortified wheat flour flatbread on haemoglobin concentrations (MD 0.00 nmol/L, 95% CI ‐2.08 to 2.08; 1 study, 334 participants; low‐certainty evidence) or anaemia (RR 1.07, 95% CI 0.74 to 1.55; 1 study, 334 participants; low‐certainty evidence), compared to unfortified wheat flour flatbread.
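The two-study pooled estimates reported above (e.g. the serum folate MD of 2.92 nmol/L from 657 participants) follow standard inverse-variance meta-analysis. The sketch below is illustrative only, using made-up study-level values rather than the review's data, and shows fixed-effect pooling of mean differences:

```python
def pool_fixed_effect(mds, ses):
    """Inverse-variance fixed-effect pooling of study-level mean differences.

    mds: mean differences from each study; ses: their standard errors.
    Returns the pooled MD and its 95% confidence interval.
    """
    weights = [1.0 / se ** 2 for se in ses]  # precision (1/variance) weights
    pooled = sum(w * md for w, md in zip(weights, mds)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical inputs (NOT the review's data): two studies reporting
# serum folate mean differences of 3.1 and 2.7 nmol/L.
pooled_md, (ci_low, ci_high) = pool_fixed_effect([3.1, 2.7], [0.7, 0.6])
```

The pooled estimate lands between the study-level values, weighted towards the more precise study; this is the same logic behind the MD and CI figures quoted throughout the results.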
Authors' conclusions
Fortification of wheat flour with folic acid may reduce the risk of neural tube defects; however, this outcome was only reported in one non‐RCT. Fortification of wheat or maize flour with folic acid (i.e. alone or with other micronutrients) may increase erythrocyte and serum/plasma folate concentrations. Evidence is limited for the effects of folic acid‐fortified wheat or maize flour on haemoglobin levels or anaemia. The effects of folic acid fortification of wheat or maize flour on the other primary outcomes assessed in this review are not known. No studies reported on the occurrence of adverse effects. Limitations of this review were the small number of studies and participants, limitations in study design, and the low certainty of the evidence due to how the included studies were designed and reported.
This brief addresses the rationale and priorities for investments in trade in livestock and other agricultural commodities (market development and access, cross-border trade, and sanitary, phytosanitary and food safety standards) to build resilience in the drylands.
It should be noted at the outset that livestock trade functions reasonably well in the Intergovernmental Authority on Development (IGAD) countries, as shown by the impressive growth in the volume and value of trade in livestock and animal products in the region since 2001. A rough estimate is that trade in livestock and livestock products in the IGAD countries (Djibouti, Eritrea, Ethiopia, Kenya, Somalia, South Sudan, Sudan) equals USD 1 billion or more in foreign exchange in many years, and probably 5–6 times that amount in local currencies. Live animal and meat exports, especially from Ethiopia, Somalia/Somaliland and Sudan, have increased rapidly, as has domestic trade centred on key urban markets such as Addis Ababa, Khartoum, Mombasa and Nairobi. Much of what we suggest in this brief describes actions that can be taken to ensure that producers in the lowlands of the Horn benefit from growing trade opportunities.
For two decades, Ethiopia has been one of the world’s leading recipients of food aid and the largest recipient in Africa. There are frequent claims that rural Ethiopia suffers from a food aid dependency syndrome that constrains productive investments and hinders sustainable development. Yet, is it true that rural households in Ethiopia are excessively dependent on food aid?
This research brief addresses food aid dependency in one of Ethiopia’s most chronically food insecure areas: South Wollo (including the neighboring Oromiya Zone), which has been referred to as the buckle in the country’s so-called “famine belt.” Using household and community data from a three-year study, this brief argues that, while large numbers of Ethiopians receive food aid, only a small percentage are highly dependent on it, even during the frequent droughts. Instead of food aid, households often rely on purchases, gifts, and other sources to meet consumption needs. Uncertainties surrounding the amounts and timing of food aid delivery have taught local farmers not to depend on it. Yet official perceptions of food aid dependency can be used to justify socially and economically costly programs like resettlement, while discouraging investments in local livelihoods. The research findings caution that these perceptions might be mistaken.
Background: Although the importance of adolescent nutrition has gained attention in the global nutrition community, there is a gap in research focused on adolescent dietary diversity and food group consumption. Objectives: This study aimed to characterize population-level food group consumption patterns and quantify the extent of dietary diversity among United States adolescents using a large nationally representative sample of adolescents aged 10–19 y. Methods: We used 24-h dietary recall data from the National Health and Nutrition Examination Survey (NHANES) from 2007 to 2018 to construct the 10 food groups comprising the minimum dietary diversity for women (MDD-W) indicator and estimated the prevalence of intake of each food group. A composite metric, the adolescent dietary diversity score (ADDS), was derived for each adolescent, with 1 point awarded per food group consumed. Both population scores and the distribution of individual scores were estimated. Differences in proportions of food groups consumed across sociodemographic categories were tested using the Rao–Scott χ2 test, and pairwise comparisons were expressed as population prevalence differences and prevalence ratios. Results: Food group consumption patterns were very similar across 2 d of dietary recall but varied significantly by sex, race/ethnicity, and income status. The food groups with the highest prevalence of consumption were grains, white roots, and tubers (∼99%), milk products (∼92%), and meat, poultry, and fish (∼85%), whereas <15% of adolescents consumed key micronutrient-dense foods, such as vitamin A–rich fruits and vegetables and dark green vegetables. The mean ADDS was 4.69, with modest variation across strata. Conclusions: On average, United States youth consumed fewer than 5 food groups on a given day.
The lack of dietary variety and relatively low prevalence of consumption of several micronutrient-rich plant-based foods could pose a risk for adolescents’ ability to achieve micronutrient adequacy in the United States.
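The ADDS scoring described in the methods (1 point per MDD-W food group consumed in a 24-h recall, maximum 10) is straightforward to operationalize. The sketch below is a minimal illustration with hypothetical group labels; the study's actual coding of NHANES recall items into MDD-W groups is more involved:

```python
# The 10 MDD-W food groups, as named in the abstract (labels are
# illustrative identifiers, not the study's variable names).
MDD_W_GROUPS = [
    "grains_white_roots_tubers",
    "pulses",
    "nuts_seeds",
    "milk_products",
    "meat_poultry_fish",
    "eggs",
    "dark_green_leafy_vegetables",
    "vitamin_a_rich_fruits_vegetables",
    "other_vegetables",
    "other_fruits",
]

def adds(consumed: set) -> int:
    """Adolescent dietary diversity score: 1 point per MDD-W food
    group that appears in the 24-h recall."""
    return sum(1 for group in MDD_W_GROUPS if group in consumed)

# A hypothetical recall touching 4 of the 10 groups:
recall = {"grains_white_roots_tubers", "milk_products",
          "meat_poultry_fish", "other_fruits"}
score = adds(recall)  # 4, below the 5-group average reported above
```

Averaging such scores over a weighted sample is what yields the population mean ADDS of 4.69 reported in the results.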
Using a predetermined set of criteria, including the burden of anemia and neural tube defects (NTDs) and an enabling environment for large-scale fortification, this paper identifies 18 low- and middle-income countries with the highest and most immediate potential for large-scale wheat flour and/or rice fortification in terms of health impact and economic benefit. Adequately fortified staples, delivered at estimated coverage rates in these countries, have the potential to avert 72.1 million cases of anemia among non-pregnant women of reproductive age; 51,636 live births associated with folic acid-preventable NTDs (i.e., spina bifida, anencephaly); and 46,378 child deaths associated with NTDs annually. This equates to a 34% reduction in the number of cases of anemia and a 38% reduction in the number of NTDs in the 18 countries identified. An estimated 5.4 million disability-adjusted life years (DALYs) could be averted annually, and an economic value of 31.8 billion United States dollars (USD) generated, from 1 year of fortification at scale among women and child beneficiaries. This paper highlights a missed opportunity and issues an urgent call to action for the countries identified to avert a significant number of preventable birth defects, cases of anemia, and under-five child deaths, and to move closer to achieving health equity by 2030 under the Sustainable Development Goals.
In the agricultural setting, core global food safety elements, such as hand hygiene and worker furlough, should reduce the risk of norovirus contamination on fresh produce. However, the effect of these practices has not been characterized. Using a quantitative microbial risk model, we evaluated the individual and combined effect of farm-based hand hygiene and worker furlough practices on the maximum risk of norovirus infection from three produce commodities (open leaf lettuce, vine tomatoes, and raspberries). Specifically, we tested two scenarios where a harvester's and packer's norovirus infection status was: 1) assumed positive; or 2) assigned based on community norovirus prevalence estimates. In the first scenario with a norovirus-positive harvester and packer, none of the individual interventions modeled reduced produce contamination to below the norovirus infectious dose. However, combined interventions, particularly high handwashing compliance (100%) and efficacy (6 log10 virus removal achieved using soap and water for 30 s), reduced produce contamination to <1–82 residual virus. Translating produce contamination to maximum consumer infection risk, 100% handwashing with a 5 log10 virus removal was necessary to achieve an infection risk below the threshold of 0.032 infections per consumption event. When community-based norovirus prevalence estimates were applied to the harvester and packer, the single interventions of 100% handwashing with 3 log10 virus removal (average 0.02 infection risk per consumption event) or furlough of the packer (average 0.03 infection risk per consumption event) reduced maximum infection risk to below the 0.032 threshold for all commodities. Bundled interventions (worker furlough, 100% glove compliance, and 100% handwashing with 1-log10 virus reduction) resulted in a maximum risk of 0.02 per consumption event across all commodities. 
These results advance the evidence base for global produce safety standards as effective norovirus contamination and risk mitigation strategies.
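The handwashing arithmetic in this abstract follows standard quantitative microbial risk assessment logic: a log10 removal scales the residual dose on produce, and a dose-response function converts dose into infection probability, which is then compared against the 0.032 threshold. The sketch below is illustrative only; the initial contamination level and the exponential dose-response parameter `r` are placeholder assumptions, not the norovirus model fitted in the study:

```python
import math

# Infection-risk threshold per consumption event, as stated in the text.
INFECTION_RISK_THRESHOLD = 0.032

def residual_virus(initial_virus, log10_removal, compliance):
    """Expected virus remaining after handwashing: a `compliance`
    fraction of events receives the log10 reduction; the remainder
    is left untreated."""
    reduced = initial_virus * 10 ** (-log10_removal)
    return compliance * reduced + (1.0 - compliance) * initial_virus

def infection_risk(dose, r=0.00255):
    """Illustrative exponential dose-response P(inf) = 1 - exp(-r * dose).
    `r` is a placeholder parameter, not the study's fitted value."""
    return 1.0 - math.exp(-r * dose)

# Hypothetical starting contamination of 1e4 virus units on produce,
# with 100% handwashing compliance and a 3-log10 removal:
dose = residual_virus(1e4, log10_removal=3, compliance=1.0)
below_threshold = infection_risk(dose) < INFECTION_RISK_THRESHOLD
```

Under these assumed inputs the residual dose drops by a factor of 1000 and the modeled risk falls below the threshold, mirroring the qualitative finding that 100% handwashing with a 3 log10 removal was risk-mitigating; the study's actual estimates depend on its full contamination and transfer model.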
The South Wollo and Oromiya zones have a terrifying nickname: the “buckle of the Ethiopian famine belt.” Farmers there tell of massive losses of livestock and other assets as a result of the recurrent droughts that afflict the region. It has been estimated that two-thirds of the people there are poor and that one out of seven lives in extreme poverty. Evidence suggests that many households “churn” in and out of poverty, often as a result of severe shocks such as drought. Aggregate statistics and one-time studies miss this poverty dynamic and cannot measure which families recover from a temporary drop into poverty, nor why. BASIS-sponsored research attempted to discover the degree to which the drought of 1999-2000 affected poverty trends in rural Ethiopia.
The water-food nexus literature examines the synergies and trade-offs of resource use but is dominated by large-scale analyses that do not sufficiently engage the local dimensions of resource management. The research presented here addresses this gap with a local-scale analysis of integrated water and food management in Burkina Faso. Specifically, we analyse the implementation of a national food security campaign (Opération Bondofa) to boost maize production in a subbasin that exhibits two important trends in Africa: a large increase in small-scale irrigators and the decentralisation of water management. As surface water levels dropped in the region, entities at different scales asserted increased control over water allocation, exposing the contested nature of new decentralised institutions and powerful actors’ preference for local control. These scalar power struggles intersected with a lack of knowledge of small-scale irrigators’ cultural practices to produce an implementation and water allocation schedule that did not match small-scale irrigators’ needs, resulting in low initial enthusiasm for the project. Greater attention from national governments to strengthening decentralised water management committees, and to building knowledge of and engagement with small-scale irrigators, could improve programme design and better incorporate small-scale irrigators into national food security campaigns.