Confidence in the projected impacts of climate change on agricultural systems has increased substantially since the first Intergovernmental Panel on Climate Change (IPCC) reports. In Africa, much work has gone into downscaling global climate models to understand regional impacts, but there remains a dearth of local-level understanding of impacts and of communities' capacity to adapt. It is well understood that Africa is vulnerable to climate change, not only because of its high exposure, but also because many African communities lack the capacity to respond or adapt to its impacts. Warming trends have already become evident across the continent, and it is likely that by 2100 the continent's mean annual temperature will have risen by more than 2°C relative to 2000. Added to this warming trend, changes in precipitation patterns are also of concern: even if rainfall remains constant, rising temperatures will amplify existing water stress, putting even more pressure on agricultural systems, especially in semiarid areas. In general, high temperatures and changes in rainfall patterns are likely to reduce cereal crop productivity, and new evidence is emerging that high-value perennial crops will also be negatively affected by rising temperatures. Pressures from pests, weeds, and diseases are also expected to increase, with detrimental effects on crops and livestock.
Much of African agriculture's vulnerability to climate change lies in the fact that its agricultural systems remain largely rain-fed and underdeveloped: the majority of Africa's farmers are small-scale farmers with few financial resources, limited access to infrastructure, and uneven access to information. At the same time, because these systems are highly reliant on their environment and farmers depend on farming for their livelihoods, their diversity, context specificity, and generations of accumulated traditional knowledge offer elements of resilience in the face of climate change. Overall, however, the combination of climatic and nonclimatic drivers and stressors will exacerbate the vulnerability of Africa's agricultural systems, though the impacts will not be universally felt. Climate change will affect farmers and their agricultural systems in different ways, and adaptation to these impacts will need to be context specific.
Adaptation efforts are increasing across the continent, but in the long term these are expected to be insufficient to enable communities to cope with the changes brought by longer-term climate change. African farmers are increasingly adopting a variety of conservation and agroecological practices such as agroforestry, contouring, terracing, mulching, and no-till. These practices offer the twin benefits of lowering carbon emissions while supporting adaptation to climate change and of broadening the sources of livelihood available to poor farmers, but there are constraints to their widespread adoption. These constraints range from insecure land tenure to difficulties with knowledge sharing.
While African agriculture faces exposure to climate change as well as broader socioeconomic and political challenges, many of its diverse agricultural systems remain resilient. On a continent with the world's highest population growth rate, rapid urbanization, and rising GDP in many countries, Africa's agricultural systems will need to adapt to more than just climate change as the uncertainties of the 21st century unfold.
Dominic Moran and Jorie Knook
Climate change is already having a significant impact on agriculture through greater weather variability and the increasing frequency of extreme events. International policy is rightly focused on adapting and transforming agricultural and food production systems to reduce vulnerability. But agriculture also has a role to play in climate change mitigation. The agricultural sector accounts for approximately a third of global anthropogenic greenhouse gas emissions, including related emissions from land-use change and deforestation. Farmers and land managers have a significant role to play because measures can be taken to reduce emissions: increasing soil carbon sequestration, managing fertilizer application, and improving ruminant nutrition and waste management. There is also potential to improve overall productivity in some systems, thereby reducing emissions per unit of product. The global significance of such actions should not be underestimated. Existing research shows that some of these measures are low cost relative to the costs of reducing emissions in other sectors such as energy or heavy industry. Some measures are apparently cost-negative, or win–win, in that they have the potential to both reduce emissions and save production costs. However, realizing this mitigation potential is hindered by the biophysical complexity of agricultural systems and by institutional and behavioral barriers that limit the adoption of these measures in developed and developing countries alike. These barriers include the lack of formal agreement on how agricultural mitigation should be treated in national obligations, commitments, or targets, and uncertainty about the policy incentives that can be deployed in different farming systems and along food chains beyond the farm gate. These challenges also overlap with growing concern about global food security, which highlights additional stressors, including demographic change, natural resource scarcity, and economic convergence in consumption preferences, particularly for livestock products. The focus on reducing emissions through modified food consumption and reduced waste is a more recent agenda and is proving more controversial than dealing with emissions related to production.
David E. Clay, Sharon A. Clay, Thomas DeSutter, and Cheryl Reese
Since the discovery that food security could be improved by pushing seeds into the soil and later harvesting a desirable crop, agriculture and agronomy have gone through cycles of discovery, implementation, and innovation. Discoveries have produced both predicted and unpredicted impacts on the production and consumption of locally produced foods. Changes in technology, such as the development of the self-scouring steel plow in the 19th century, provided a critical tool needed to cultivate and seed annual crops in the Great Plains of North America. However, plowing the Great Plains would not have been possible without the domestication of plants and animals and the invention of the yoke and harness. Plowing the prairies was accompanied by extensive soil nutrient mining, a rapid loss of soil carbon, and increased wind and water erosion. More recently, the development of genetically modified organisms (GMOs) and no-tillage planters has contributed to the increased adoption of conservation tillage, which is less damaging to the soil. The ultimate impact of climate change on agronomic practices in the North American Great Plains is unknown. However, projected increases in temperature and decreases in rainfall in the southern Great Plains (SGP) will likely reduce agricultural productivity. Different outcomes are likely in the northern Great Plains (NGP), where higher temperatures can lead to increased agricultural intensification, the conversion of grassland to cropland, increased wildlife habitat fragmentation, and increased soil erosion. Precision farming, conservation practices, cover crops, and the breeding of plants better adapted to their local environments can help mitigate these effects. However, changing practices requires that farmers and their advisers understand the limitations of their soils, plants, environment, and production systems. Failure to implement appropriate management practices can result in a rapid decline in soil productivity, diminished water quality, and reduced wildlife habitat.
Christopher Morgan, Shannon Tushingham, Raven Garvey, Loukas Barton, and Robert Bettinger
At the global scale, conceptions of hunter-gatherer economies have changed considerably over time, and these changes have been strongly affected by larger trends in Western history, philosophy, science, and culture. Seen as either "savage" or "noble" at the dawn of the Enlightenment, hunter-gatherers have been regarded as everything from holdovers from a basal level of human development, to affluent, ecologically informed foragers, and ultimately to what they are now understood to be: an extremely diverse economic orientation entailing the fullest scope of human behavioral diversity. The only thing linking studies of hunter-gatherers over time is consequently the definition of the term itself: people whose economic mode of production centers on wild resources. When hunter-gatherers are considered outside the general realm of their shared subsistence economies, it is clear that their behavioral diversity rivals or exceeds that of other economic orientations. Hunter-gatherer behaviors range along a multivariate continuum: from a focus mainly on large fauna to broad, wild plant-based diets similar to those of agriculturalists; from extreme mobility to sedentism; from reliance on simple, generalized technologies to highly specialized ones; from egalitarian sharing economies to privatized, competitive ones; and from nuclear-family or band-level decision making to centralized, hierarchical decision making. It is clear, however, that hunting and gathering modes of production had to have preceded, and thus given rise to, agricultural ones. Research into the development of human economies shows that transitions from one type of hunting and gathering to another, or alternatively to agricultural modes of production, can take many different evolutionary pathways. The important thing to recognize is that behaviors essential to the development of agriculture (landscape modification, intensive labor practices, the division of labor, and the production, storage, and redistribution of surplus) were present in a range of hunter-gatherer societies beginning at least as early as the Late Pleistocene in Africa, Europe, Asia, and the Americas. Whether these behaviors eventually led to the development of agriculture depended in part on the emergence of a less variable climate and a more CO2-rich atmosphere during the Holocene, but also on a change in the social relations of production that allowed for the hoarding of privatized resources. Ethnographic and archaeological research from the 20th and 21st centuries shows that modern and ancient peoples adopt, or even revert to, hunting and gathering after having engaged in agricultural or industrial pursuits when conditions allow, and that macroeconomic perspectives often mask considerable intragroup diversity in economic decision making: the pursuits and goals of women versus men, and young versus old, within groups are often quite different or even at odds with one another, yet often articulate to form cohesive and adaptive economic wholes. The future of hunter-gatherer research will be tested by the continued decline of traditional hunting and gathering but will also benefit from observation of people who revert to, or supplement their income with, wild resources. It will also draw heavily from archaeology, which holds considerable potential to document and explain the full range of human behavioral diversity, hunter-gatherer or otherwise, over the longest of time frames and the broadest geographic scope.
In recent years, a number of food safety incidents in Europe and East Asia have led to concerns about threats to the environment and human health. In this context, a re-evaluation of risks with regard to food safety is essential, including a revisiting of the Western risk theories advanced by Ulrich Beck and Anthony Giddens. The dimensions of risk and food safety are fourfold.
First, major incidents, such as the disaster at the Fukushima nuclear plant in Japan in 2011 and the melamine crisis in China in 2008, have shaped consumers' perception of food safety. These events led to fears that food safety incidents would increase and to a collapse of trust in established brand products and technologies in post-industrial societies. It is necessary, therefore, to re-assess the risks of utilizing future-oriented technologies and mass food production systems.
Second, the use of genetically modified organisms in food products and the consumption of food additives have produced new food-related risks. This underlines the significance of risk assessment in particular, as "reflexive modernization" requires individuals to familiarize themselves with new and possibly harmful food technologies and to assess, manage, and avoid risks on their own responsibility and on a highly personalized basis.
Third, various food-related incidents, such as the case of imported poisoned dumplings in Japan in 2008, have triggered the emergence of civil engagement in the form of consumer education initiatives. Both governmental and non-governmental initiatives stress the significance of locality, provenance, and the preservation of food heritage as ways to ensure and maintain food safety and balanced nutritional habits. In other words, the notion of locality is linked to the desire to minimize risk.
Fourth, poor individual eating habits, as self-imposed risks, have attracted scholarly attention. Food education initiatives in European and Asian nations seek to strengthen the culinary competence of individuals while at the same time embracing national staple foods and local food specialties. Efforts to provide adequate information about nutrition and to counter the rise of health conditions such as obesity and diabetes often coincide with a return to conservative gender perceptions and family values. This calls for new forms of culinary education that take into consideration the demands of working parents, individualized work schedules, and dining outside the home.