Fifty Million Farmers

by Richard Heinberg. This is an excerpt from the Economic Design dimension of Gaia Education’s Design for Sustainability curriculum.

Gaia Education
Mar 15, 2018

There was a time not so long ago when famine was an expected, if not accepted, part of life. Until the 19th century — whether in China, France, India or Britain — food came almost entirely from local sources and harvests were variable. In good years, there was plenty — enough for seasonal feasts and for storage in anticipation of winter and hard times to come; in bad years, starvation cut down the poorest and the weakest — the very young, the old, and the sickly. Sometimes bad years followed one upon another, reducing the size of the population by several percent. This was the normal condition of life in pre-industrial societies, and it persisted for thousands of years.

Today, in America, such a state of affairs is hard to imagine. Food is so cheap and plentiful that obesity is a far more widespread concern than hunger. The average mega-supermarket stocks an impressive array of exotic foods from across the globe, and even staples are typically trucked from hundreds of miles away. Many people in America did go hungry during the Great Depression, but those were times that only the elderly can recall. In the current regime, the desperately poor may experience chronic malnutrition and may miss meals, but for most the dilemma is finding time in the day’s hectic schedule to go to the grocery store or to cook. As a result, fast-food restaurants proliferate: the fare may not be particularly nutritious, but even an hour’s earnings at minimum wage will buy a meal or two. The average American family spent 20 percent of its income on food in 1950; today the figure is 10 percent.

This is an extraordinary situation; but because it is the only one that most Americans alive today have ever experienced, we tend to assume that it will continue indefinitely. However, there are reasons to think that our current anomalous abundance of inexpensive food may be only temporary; if so, present and future generations may become acquainted with that old, formerly familiar but unwelcome houseguest — famine.

The following are four principal bases (there are others) for this gloomy forecast.

The first has to do with looming fuel shortages. This is a subject I have written about extensively elsewhere, so I shall not repeat myself in any detail. Suffice it to say that the era of cheap oil and natural gas is coming to a crashing end, with global oil production projected to peak in 2010 and North American natural gas extraction rates already in decline. These events will have enormous implications for America’s petroleum-dependent food system.

Modern industrial agriculture has been described as a method of using soil to turn petroleum and gas into food. We use natural gas to make fertilizer, and oil to fuel farm machinery and power irrigation pumps, as a feedstock for pesticides and herbicides, in the maintenance of animal operations, in crop storage and drying, and for transportation of farm inputs and outputs. Agriculture accounts for about 17 percent of the U.S. annual energy budget, making it the single largest industrial consumer of petroleum products. By comparison, the U.S. military, in all of its operations, uses only about half that amount. About 350 gallons (roughly 1,300 liters) of oil equivalents are required to feed each American each year, and every calorie of food produced requires, on average, ten calories of fossil-fuel inputs. This is a food system profoundly vulnerable, at every level, to fuel shortages and skyrocketing prices. And both are inevitable.
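As a rough cross-check on how those two figures relate to one another, here is a minimal back-of-envelope sketch. The per-capita food energy and the energy content of a gallon of oil are approximate constants assumed for illustration; they do not come from the text above.

```python
# Back-of-envelope check: does a 10:1 ratio of fossil-fuel calories to food calories
# roughly imply the ~350 gallons of oil equivalents per person per year cited above?
# Assumed constants (approximate, for illustration only):
FOOD_KCAL_PER_DAY = 2_700        # rough per-capita daily food supply, kcal
KCAL_PER_GALLON_OIL = 31_000     # rough energy content of a gallon of oil, kcal

food_kcal_per_year = FOOD_KCAL_PER_DAY * 365        # about 1 million kcal
fossil_kcal_per_year = 10 * food_kcal_per_year      # ten fossil calories per food calorie
gallons_per_year = fossil_kcal_per_year / KCAL_PER_GALLON_OIL

print(f"Implied oil equivalents: about {gallons_per_year:.0f} gallons per person per year")
# Prints roughly 320 gallons, in the same ballpark as the figure cited above.
```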

An attempt to make up for fuel shortfalls by producing more biofuels — ethanol, butanol, and biodiesel — will put even more pressure on the food system, and will likely result in a competition between food and fuel uses of land and other resources needed for agricultural production. Already 14 percent of the U.S. corn crop is devoted to making ethanol, and that proportion is expected to rise to one quarter, based solely on existing projects-in-development and government mandates.

The second factor potentially leading to famine is a shortage of farmers. Much of the success of industrial agriculture lies in its labor efficiency: far less human work is required to produce a given amount of food today than was the case decades ago (the labor required per unit of food in 2000 was roughly one-seventh of what it was in 1900). But that very success implies a growing vulnerability. We don’t need as many farmers, as a percentage of the population, as we used to; so, throughout the past century, most farming families — including hundreds of thousands and perhaps millions that would have preferred to maintain their rural, self-sufficient way of life — were economically forced to move to cities and find jobs. Today so few people farm that vital knowledge of how to farm is disappearing. The average age of American farmers is over 55 and approaching 60. The proportion of principal farm operators younger than 35 has dropped from 15.9 percent in 1982 to 5.8 percent in 2002. Of all the dismal statistics I know, these are surely among the most frightening. Who will be growing our food twenty years from now? With less oil and gas available, we will need far more knowledge and muscle power devoted to food production, and thus far more people on the farm, than we have currently.

The third worrisome trend is an increasing scarcity of fresh water. Sixty percent of water used nationally goes toward agriculture. California’s Central Valley, which produces the bulk of the nation’s fruits, nuts, and vegetables, receives virtually no rainfall during summer months and relies overwhelmingly on irrigation. But the snowpack in the Sierra Nevada, which provides much of that irrigation water, is declining, and the aquifer that supplies much of the rest is being drawn down at many times its recharge rate. If these trends continue, the Central Valley may be incapable of producing food in any substantial quantities within two or three decades. Other parts of the country are similarly overspending their water budgets, and very little is being done to deal with this looming catastrophe.

Fourth and finally, there is the problem of global climate change. Often the phrase used for this is “global warming,” which implies only the fact that the world’s average temperature will be increasing by a couple of degrees or more over the next few decades. The much greater problem for farmers is destabilization of weather patterns. We face not just a warmer climate, but climate chaos: droughts, floods, and stronger storms in general (hurricanes, cyclones, tornadoes, hail storms) — in short, unpredictable weather of all kinds. Farmers depend on relatively consistent seasonal patterns of rain and sun, cold and heat; a climate shift can spell the end of farmers’ ability to grow a crop in a given region, and even a single freak storm can destroy an entire year’s production. Given the fact that modern American agriculture has become highly centralized due to cheap transport and economies of scale (almost the entire national spinach crop, for example, comes from a single valley in California), the damage from that freak storm is today potentially continental or even global in scale. We have embarked on a century in which, increasingly, freakish weather is normal.

I am not pointing out these problems, and their likely consequences, in order to cause panic. As I propose below, there is a solution to at least two of these dilemmas, one that may also help us address the remaining ones. It is not a simple or easy strategy and it will require a coordinated and sustained national effort. But in addition to averting famine, this strategy may permit us to solve a host of other, seemingly unrelated social and environmental problems.

Intensifying Food Production

In order to get a better grasp of the problems and the solution being proposed, it is essential that we understand how our present exceptional situation of cheap abundance came about. In order to do that, we must go back not just a few decades, but at least ten thousand years.

The origins of agriculture are shrouded in mystery, though archaeologists have been whittling away at that mystery for decades. We know that horticulture (gardening) began at somewhat different periods, independently, in at least three regions — the Middle East, Southeast Asia, and Central America. Following the end of the last Ice Age, roughly 12,000 years ago, much of humanity was experiencing a centuries-long food crisis brought on by the over-hunting of the megafauna that had previously been at the center of the human diet. The subsequent domestication of plants and animals brought relative food security, as well as the ability to support larger and more sedentary populations.

As compared to hunting and gathering, horticulture intensified the process of obtaining food. Intensification (because it led to increased population density — i.e., more mouths to feed) then led to the need for even more intensification: thus horticulture (gardening) eventually led to agriculture (field cropping). The latter produced more food per unit of land, which enabled more population growth, which meant still more demand for food. We are describing a classic self-reinforcing feedback loop.

As a social regime, horticulture did not represent a decisive break with hunting and gathering. Just as women had previously participated in essential productive activities by foraging for plants and hunting small animals, they now played a prominent role in planting, tending, and harvesting the garden — activities that were all compatible with the care of infants and small children. Thus women’s status remained relatively high in most horticultural societies. Seasonal surpluses were relatively small and there was no full-time division of labor.

But as agriculture developed — with field crops, plows, and draft animals — societies inevitably mutated in response. Plowing fields was men’s work; women were forced to stay at home and lost social power. Larger seasonal surpluses required management as well as protection from raiders; full-time managers and specialists in violence proliferated as a result. Societies became multi-layered: wealthy ruling classes (which had never existed among hunter-gatherers, and were rare among gardeners) sat atop an economic pyramid that came to include scribes, soldiers, and religious functionaries, and that was supported at its base by the vastly more numerous peasants — who produced all the food for themselves and everyone else as well. Writing, mathematics, metallurgy, and, ultimately, the trappings of modern life as we know it thus followed not so much from planting in general, as from agriculture in particular.

As important an instance of intensification as agriculture was, in many respects it pales in comparison with what has occurred within the past century or so, with the application of fossil fuels to farming. Petroleum-fed tractors replaced horses and oxen, freeing up more land to grow food for far more people. The Haber-Bosch process for synthesizing ammonia from fossil fuels, invented just prior to World War I, has doubled the amount of nitrogen available to green nature — with nearly all of that increase going directly to food crops. New hybrid plant varieties led to higher yields. Technologies for food storage improved radically. And fuel-fed transport systems enabled local surpluses to be sold not just regionally, but nationally and even globally. Through all of these strategies, we have developed the wherewithal to feed seven times the population that existed at the beginning of the Industrial Revolution. And, in the process, we have made farming uneconomical and unattractive to all but a few.

That’s the broad, global overview. In America, whose history as an independent nation begins at the dawn of the industrial era, the story of agriculture comprises three distinct periods:

The Expansion Period (1600 to 1920): Increases in food production during these three centuries came simply from putting more land into production; technological change played only a minor role.

The Mechanization Period (1920 to 1970): In this half-century, technological advances issuing from cheap, abundant fossil-fuel energy resulted in a dramatic increase in productivity (output per worker hour). Meanwhile, farm machinery, pesticides, herbicides, irrigation, new hybrid crops, and synthetic fertilizers allowed for the doubling and tripling of crop production. Also during this time, U.S. Department of Agriculture policy began favoring larger farms (the average U.S. farm size grew from 100 acres in 1930 to almost 500 acres by 1990), and production for export.

The Saturation Period (1970 to present): In recent decades, the application of still greater amounts of energy has produced smaller relative increases in crop yields; meanwhile an ever-growing amount of energy is being expended to maintain the functioning of the overall system. For example, about ten percent of the energy used in agriculture goes just to offset the negative effects of soil erosion, while increasing amounts of pesticides must be sprayed each year as pests develop resistance. In short, strategies that had recently produced dramatic increases in productivity became subject to the law of diminishing returns.

While we were achieving miracles of productivity, agriculture’s impact on the natural world was also growing; indeed it is now the single greatest source of human damage to the global environment. That damage takes a number of forms: erosion and salinization of soils; deforestation (a strategy for bringing more land into cultivation); fertilizer runoff (which ultimately creates enormous “dead zones” around the mouths of many rivers); loss of biodiversity; fresh water scarcity; and agrochemical pollution of water and soil.

In short, we created unprecedented abundance while ignoring the long-term consequences of our actions. This is more than a little reminiscent of how some previous agricultural societies — the Greeks, Babylonians, and Romans — destroyed soil and habitat in their mania to feed growing urban populations, and collapsed as a result.

Fortunately, during the past century or two we have also developed the disciplines of archaeology and ecology, which teach us how and why those ancient societies failed, and how the diversity of the web of life sustains us. Thus, in principle, if we avail ourselves of this knowledge, we need not mindlessly repeat yet again the time-worn tale of catastrophic civilizational collapse.

The 21st Century: De-Industrialization

How might we avoid such a fate?

Surely the dilemmas we have outlined above are understood by the managers of the current industrial food system. They must have some solutions in mind.

Indeed they do, and, predictably perhaps, those solutions involve a further intensification of the food production process. Since we cannot achieve much by applying more energy directly to that process, the most promising strategy on the horizon seems to be the genetic engineering of new crop varieties. If, for example, we could design crops to grow with less water, or in unfavorable climate and soil conditions, we could perhaps find our way out of the current mess.

Unfortunately, there are some flaws with this plan. Our collective experience with genetically modifying crops so far shows that glowing promises of higher yields, or of a reduced need for herbicides, have seldom been fulfilled. At the same time, new genetic technologies carry with them the potential for horrific unintended consequences in the form of negative impacts on human health and the integrity of ecosystems. We have been gradually modifying plants and animals through selective breeding for millennia, but new gene-splicing techniques enable the re-mixing of genomes in ways and to degrees impossible heretofore. One serious error could result in biological tragedy on an unprecedented scale.

Yet even if future genetically modified commercial crops prove to be much more successful than past ones, and even if we manage to avert a genetic apocalypse, the means of producing and distributing genetically engineered seeds is itself reliant on the very fuel-fed industrial system that is in question.

Is it possible, then, that a solution lies in another direction altogether — perhaps in deliberately de-industrializing production, but doing so intelligently, using information we have gained from the science of ecology, as well as from traditional and indigenous farming methods, in order to reduce environmental impacts while maintaining total yields at a level high enough to avert widespread famine?

This is not an entirely new idea (the organic and ecological farming movements have, after all, been around for decades), but up to this point the managers of the current system have resisted it. This is no doubt largely because those managers are heavily influenced by giant corporations that profit from centralized industrial production for distant markets. Nevertheless, the fact that we have reached the end of the era of cheap oil and gas demands that we re-examine the potential costs and benefits of our current trajectory and its alternatives.

I believe we must and can de-industrialize agriculture. The general outline of what I mean by de-industrialization is simple enough: this would imply a radical reduction of fossil fuel inputs to agriculture, accompanied by an increase in labor inputs and a reduction of transport, with production being devoted primarily to local consumption.

Once again, fossil fuel depletion almost ensures that this will happen. But at the same time, it is fairly obvious that if we don’t plan for de-industrialization, the result could be catastrophic. It’s worth taking a moment to think about how events might unfold if the process occurs without intelligent management, driven simply by oil and gas depletion.

Facing high fuel prices, family farms would declare bankruptcy in record numbers. Older farmers (the majority, in other words) would probably choose simply to retire, whether they could afford to or not. However, giant corporate farms would also confront rising costs — which they would pass along to consumers by way of dramatically higher food prices.

Yields would begin to decline — in fits and starts — as weather anomalies and water shortages affected one crop after another.

Meanwhile, people in the cities would also feel the effects of skyrocketing energy prices. Entire industries would falter, precipitating a general economic collapse. Massive unemployment would lead to unprecedented levels of homelessness and hunger.

Many people would leave cities looking for places to live where they could grow some food. Yet they might find all of the available land already owned by banks or the government. Without experience of farming, even those who succeeded in gaining access to acreage would fail to produce much food and would ruin large tracts of land in the process.

Eventually these problems would sort themselves out; people and social systems would adapt — but probably not before an immense human and environmental tragedy had ensued.

I wish I could say that this forecast is exaggerated for effect. Yet the actual events could be far more violent and disruptive than it is possible to suggest in so short a summary.

Examples and Strategies

Things don’t have to turn out that way. As I have already said, I believe that the de-industrialization of agriculture could be carried out in a way that is not catastrophic and that in fact substantially benefits society and the environment in the long run. But to be convinced of the thesis we need more than promises — we need historic examples and proven strategies. Fortunately, we have two of each.

In some respects the most relevant example is that of Cuba’s Special Period. In the early 1990s, with the collapse of the Soviet Union, Cuba lost its source of cheap oil. Its industrialized agricultural system, which was heavily fuel-dependent, immediately faltered. Very quickly, Cuban leaders abandoned the Soviet industrial model of production, changing from a fuel- and petrochemical-intensive farming method to a more localized, labor-intensive, organic mode of production.

How they did this is itself an interesting story. Eco-agronomists at Cuban universities had already been advocating a transition somewhat along these lines. However, they were making little or no headway. When the crisis hit, they were given free rein to, in effect, redesign the entire Cuban food system. Had these academics not had a plan waiting in the wings, the nation’s fate might have been sealed.

Heeding their advice, the Cuban government broke up large, state-owned farms and introduced private farms, farmer co-ops, and farmer markets. Cuban farmers began breeding oxen for animal traction. The Cuban people adopted a mainly vegetarian diet, mostly involuntarily (meat eating went from twice a day to twice a week). People increased their intake of vegetable sources of protein, while farmers cut back on wheat and rice (Green Revolution crops that required too many inputs). Urban gardens (including rooftop gardens) were encouraged, and today they produce 50 to 80 percent of the vegetables consumed in cities.

Early on, it was realized that more farmers were needed, and that this would require education. All of the nation’s colleges and universities quickly added courses on agronomy. At the same time, wages for farmers were raised to be at parity with those for engineers and doctors. Many people moved from the cities to the country; in some cases there were incentives, in others the move was forced.

The result was survival. The average Cuban lost 20 pounds of body weight, but in the long run the overall health of the nation’s people actually improved as a consequence. Today, Cuba has a stable, slowly growing economy. There are few if any luxuries, but everyone has enough to eat. Having seen the benefit of smaller-scale organic production, Cuba’s leaders have decided that even if they find another source of cheap oil, they will maintain a commitment to their new, decentralized, low-energy methods.

I don’t want to give the impression that Cubans sailed through the Special Period unscathed. Cuba was a grim place during these years, and to this day food is far from plentiful there by American standards. My point is not that Cuba is some sort of paradise, but simply that matters could have been far worse.

It could be objected that Cuba’s experience holds few lessons for our own nation. Since Cuba has a very different government and climate, we might question whether its experience can be extrapolated to the U.S.

Let us, then, consider an indigenous historical example. During both World Wars, Americans planted Victory Gardens. During both periods, gardening became a sort of spontaneous popular movement, which (at least during World War II) the USDA initially tried to suppress, believing that it would compromise the industrialization of agriculture. It wasn’t until Eleanor Roosevelt planted a Victory Garden on the White House lawn that Secretary of Agriculture Claude Wickard relented; his agency then began to promote Victory Gardens and to take credit for them. At the height of the movement, Victory Gardens were producing roughly 40 percent of America’s vegetables, an extraordinary achievement in so short a time.

In addition to these historical precedents, we have new techniques developed with the coming agricultural crisis in mind; two of the most significant are Permaculture and Biointensive farming (there are others — such as efforts by Wes Jackson of The Land Institute to breed perennial grain crops — but limitations of time and space require me to pick and choose).

Currently Biointensive farming is being taught extensively in Africa and South America as a sustainable alternative to globalized monocropping. The term “biointensive” suggests that what we are discussing here is not a de-intensification of food production, but rather the development of production along entirely different lines. While both Permaculture and Biointensive methods have been shown to be capable of dramatically improving yields per acre, their developers clearly understand that even these methods will eventually fail us unless we also limit demand for food by gradually and humanely limiting the size of the human population.

In short, it is possible in principle for industrial nations like the U.S. to make the transition to smaller-scale, non-petroleum food production, given certain conditions. There are both precedents and models.

However, all of them imply more farmers. Here’s the catch — and here’s where the ancillary benefits kick in.

The Key: More Farmers!

One way or another, re-ruralization will be the dominant social trend of the 21st century. Thirty or forty years from now — again, one way or another — we will see a more historically normal ratio of rural to urban population, with the majority once again living in small, farming communities. More food will be produced in cities than is the case today, but cities will be smaller. Millions more people than today will be in the countryside growing food.

They won’t be doing so the way farmers do it today, and perhaps not the way farmers did it in 1900.

Indeed, we need perhaps to redefine the term farmer. We have come to think of a farmer as someone with 500 acres and a big tractor and other expensive machinery. But this is not what farmers looked like a hundred years ago, and it’s not an accurate picture of most current farmers in less-industrialized countries. Nor does it coincide with what will be needed in the coming decades. We should perhaps start thinking of a farmer as someone with 3 to 50 acres, who uses mostly hand labor and twice a year borrows a small tractor that she or he fuels with ethanol or biodiesel produced on-site.

How many more farmers are we talking about? Currently the U.S. has three or four million of them, depending on how we define the term.

Let’s again consider Cuba’s experience: in its transition away from fossil-fueled agriculture, that nation found that it required 15 to 25 percent of its population to become involved in food production. In America in 1900, nearly 40 percent of the population farmed; the current proportion is close to one percent.

Do the math for yourself. Extrapolated to this country’s future requirements, this implies the need for a minimum of 40 to 50 million additional farmers as oil and gas availability declines. How soon will the need arise? Assuming that the peak of global oil production occurs within the next five years, and that North American natural gas is already in decline, we are looking at a transition that must occur over the next 20 to 30 years, and that must begin approximately now.
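For readers who do want to run those numbers, here is a minimal sketch of the extrapolation. The population figure is an assumption (roughly the U.S. level when this was written); the other inputs are the ones cited above.

```python
# Extrapolating Cuba's experience (15 to 25 percent of the population in food
# production) to the United States, using the figures cited in the text.
US_POPULATION = 300_000_000      # assumed, roughly the mid-2000s level
CURRENT_FARMERS = 3_500_000      # "three or four million," per the text

for share in (0.15, 0.25):
    additional = US_POPULATION * share - CURRENT_FARMERS
    print(f"At {share:.0%} of the population: about {additional / 1e6:.0f} million additional farmers")
# Prints roughly 42 million at the low end and 72 million at the high end;
# the text's "minimum of 40 to 50 million additional farmers" sits at the low end of this range.
```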

Fortunately there are some hopeful existing trends to point to. The stereotypical American farmer is a middle-aged, Euro-American male, but the millions of new farmers in our future will have to include a broad mix of people, reflecting America’s increasing diversity. Already the fastest growth in farm operators in America is among female full-time farmers, as well as Hispanic, Asian, and Native American farm operators.

Another positive trend worth noting: Here in the Northeast, where the soil is acidic and giant agribusiness has not established as much of a foothold as elsewhere, the number of small farms is increasing. Young adults — not in the millions, but at least in the hundreds — are aspiring to become Permaculture or organic or Biointensive farmers. Farmers markets and CSAs are established or springing up throughout the region. This is somewhat the case also on the Pacific coast, much less so in the Midwest and South.

What will it take to make these tentative trends the predominant ones? Among other things we will need good and helpful policies. The USDA will need to cease supporting and encouraging industrial monocropping for export, and begin supporting smaller farms, rewarding those that make the effort to reduce inputs and to grow for local consumption. In the absence of USDA policy along these lines, we need to pursue state, county, and municipal efforts to support small farms in various ways, through favorable zoning, by purchasing local food for school lunches, and so on.

This is an excerpt from the Economic Design dimension of Gaia Education’s online course in Design for Sustainability. The course will start on 19 March 2018, so sign up now!

— — —

Gaia Education is a leading-edge provider of sustainability education that promotes thriving communities within planetary boundaries.
