Agriculture, Food, and the Environment (2024)

Introduction

In 1990, in a Journal of American History roundtable on the burgeoning field of environmental history, Donald Worster observed that throughout history, humans needed to find food sources and develop consistent production systems to sustain life. Identifying these processes as “the most vital, constant, and concrete way” that people connected to the natural world, he called for an “agroecological” approach to environmental history that focused on the transition from subsistence farming to capitalist modes of food production.1 Other contributors found the materialist emphasis of Worster’s perspective too limiting. However, they accepted his underlying assumption about the centrality of food. William Cronon applauded examining food as a strategy to “encourage historians to reconstruct the intricate web of linkages between human beings and other organisms,” but added that food is more than “simply a system of bundled calories and nutrients that sustain the life of a human community by concentrating the trophic energy flows of an ecosystem.”2 What people choose to eat and drink, and how they choose to do so, is wrapped within culture, revealing beliefs about society and nature. Many environmental historians place agriculture at the center of their concerns, recognizing that over the long course of history, people’s most essential bond to nature was the food that sustained them physically, economically, and spiritually. Yet, over time, modern agriculture has distanced Americans from the nation’s productive landscapes, leaving them with little knowledge of the source of most food.

Agricultural Beginnings

Between 11,000 and 3,000 years ago, people living in small groups around the world gradually shifted from herding, hunting, and gathering to growing crops and providing pasture for animals. As a result, human populations and the number of domesticated animals and plants increased, growing together in mutual dependence.3 People settled into more densely populated, hierarchical societies. Agriculture, the practice of cultivating soil, producing crops, and raising livestock for food and other products, was a transformative ecological phenomenon that replaced natural biodiversity with a limited number of selected species. By altering environments, however, humans and the biota they sought to tame became more susceptible to pests, disease, and famine.

Agriculture began in Central Mexico approximately 6,000 years ago and in other parts of North America perhaps 2,000 to 1,000 years later. The absence of significant domesticated animals to aid in production slowed agricultural development in the Western Hemisphere, but long before European contact many societies committed to farming, adapting to a wide variety of environments. The Hohokam Indians, for example, founded settlements across a 25,000-square-mile region in present-day New Mexico and Arizona. Emerging around the 1st century ce, they cultivated some fifty plant species and drought-resistant seeds. To water crops, they employed diverse technologies, such as reservoirs and catchment basins, which captured rainwater. Engineered to maintain a consistent downhill gradient, their complex irrigation systems lasted for hundreds of years. The Hohokam disappeared around the end of the 15th century; extended droughts, the effects of long-term irrigation such as salinization, or some combination of these factors prompted an outmigration.4

Founded around 700 ce on the Mississippi River floodplain, Cahokia persisted for nearly 700 years; no city in the present-day United States surpassed its peak population until the 18th century. Agricultural surpluses engendered a large population (10,000–20,000 by the 12th century), an extensive built environment, a hierarchical society, and a centralized government. By the 13th century, the demand for wood led to deforestation, which increased erosion, siltation, and flooding and undermined agricultural production. A period of climatic cooling exacerbated these problems. Declining harvests coincided with outbreaks of tuberculosis and blastomycosis, while competing economic centers emerged. By 1400 ce, residents had abandoned Cahokia.5

Despite the declines of Cahokia and the Hohokam, other Indian agricultural societies followed. By 1000 ce many Indian farmers across North America combined hunting and foraging with mutually beneficial plantings of corn, beans, and squash, which Eastern Woodlands Indians called the “Three Sisters.” The tall stalks of corn (or maize) supported climbing bean vines. Beans restored to the soil the nitrogen that other crops depleted. Low-lying squash blocked sunlight and thus diminished weeds.6


Figure 1. Photograph of Cahokia Mound, Illinois, 1907.

National Photographic Library/Library of Congress, LC-USZ62-118522.

Indians also routinely used fire to clear land, remove plants that competed with crops, add nutrients to soil, and create favorable habitats for desired game animals.7 At the time of contact with Europeans, Native Americans maintained complex food production systems, reflecting the continent’s environmental and cultural diversity.8

The Columbian Exchange

Alfred Crosby coined the phrase “Columbian Exchange” to explain relations between the Western and Eastern Hemispheres following contacts between Europeans, Africans, and Americans beginning around 1500. These exchanges, which were both intentional and unintentional, involved biological elements such as crops, domesticated animals, and pests. The most profound interaction involved the introduction of infectious diseases from Europe and Africa to the Americas. Lacking acquired immunities, indigenous people died in large numbers. Mortality rates varied depending on the disease and overall health of the Indian community, but were significant enough to disrupt kinship networks, food and trade systems, and religious beliefs.9

Foods constituted a substantial share of the Columbian Exchange. Corn, peppers, pumpkins, and potatoes, for example, entered the diets of people in the Eastern Hemisphere. Sugarcane, livestock (cattle, pigs, sheep, horses), and grains (wheat, barley, rice) moved to the Americas. This exchange also included the transfer of ideas and values, often embedded in legal systems, modes of production, and religious practices. The labor-intensive rice culture which developed in Georgia and South Carolina’s coastal lowlands involved transplanting plants, technologies for floodplain irrigation, and cultural practices such as a reliance on female labor. The geographer Judith Carney explains how women “displayed specialized knowledge about soil fertility, seed selection, hoeing, and rice processing,” all of which required a “sophisticated understanding of the specific demands made upon diverse rice microenvironments, such as water availability, the influence of salinity, flooding levels, and soil conditions.”10 A common tendency to undervalue women’s contributions, now and in the past, explains, in part, why slaves’ contributions to colonial and early American agriculture had received minimal credit prior to Carney’s work.


Figure 2. Wood engraving of “Rice Culture on the Ogeechee, near Savannah, Georgia,” sketched by A.R. Waud for Harper’s Weekly, 5 January 1867. Library of Congress, LC-DIG-ds-07365.

Colonial Agriculture

Colonists sold rice and other staple crops within a European mercantile system that promoted government economic regulation and emphasized the need for colonies to augment state power vis-à-vis rival nations. Within this economically subordinate, environmentally exploitive system, colonists shipped crops and raw materials unavailable in the mother country, depending on these commodities as the basis for acquiring high-priced finished goods. Plants became staple crops because they were suited to particular environments and growers developed economies of scale by controlling sufficient labor to produce surpluses and lower prices.11

Colonial farmers’ roles varied across North America. New Englanders built their economy around whaling, commercial cod fishing, and shipbuilding.12 With a shorter growing season, small farmers produced crops for local consumption. In the Middle Colonies, larger commercial farms, although still family ventures, fed the region, but also shipped wheat and other grains to the larger British Empire. In the Southern colonies, a small number of planters controlled thousands of acres and hundreds of slaves to produce labor-intensive staple crops for export, but often needed to import food. Small farmers in all regions fed their families with diverse crops and livestock because of the difficulties of reaching markets and risks associated with single crop production. Livestock also altered the landscape. Cattle and pigs compacted soils, contributing to rainfall runoff that exacerbated erosion and clouded rivers with silt. In the South, free-roaming livestock destroyed river cane brakes, disrupting native animal habitats. And colonists’ insatiable hunger for farmland heightened tensions with Indians.13

When Spanish settlers arrived in New Mexico at the end of the 16th century, they discovered that the Pueblo Indians cultivated native plants as well as watermelons that the Spaniards had previously introduced in Mexico. The Spaniards brought European crops such as wheat and plants from farther south such as tomatoes and new maize varieties, but their irrigation systems confronted the same salinization problems that had plagued pre-contact Pueblo farmers. Meanwhile, Spanish sheep dominated the upper Rio Grande Valley, where overgrazing on common grounds weakened vegetative cover and exacerbated soil erosion. Sheep encroached on Indian lands, contributing to conflicts. With increased populations, many Spanish settlements exceeded “their carrying capacity by the end of eighteenth century.”14 In many ways, patterns of regional dispute over land and resources were set by the end of the colonial era.

The New Nation

Agriculture remained foundational to U.S. society and the economy after the American Revolution. As the 18th century closed, 95 percent of the nation’s nearly 4 million people lived in rural communities. With a few exceptions, families provided the labor to produce most of their food, trading some crops and animals with neighbors for services or goods. Most nutrition came from local foodsheds, the regions that produce food for particular populations. The endurance of horses limited the reach of most foodsheds, although a few ingredients such as coffee, tea, and sugar became familiar under mercantilism and provided limited extensions of foodshed boundaries.15

The young, pre-industrial nation’s land laws reaffirmed agriculture’s significance, laying the foundation for federal policies into the 20th century. Thomas Jefferson celebrated the yeoman farmer, writing in 1785, “Cultivators of the earth are the most valuable citizens. They are the most vigorous, the most independent, the most virtuous & they are tied to their country & wedded to liberty & interest by the most lasting bonds.”16 While federal laws did not require people to farm the lands they acquired, lawmakers assumed that small freeholding farmers, or yeomen, would fill the public domain. Designed to raise revenue and encourage “settlers to farm and farmers to settle,” the Land Ordinance of 1785 created the U.S. Rectangular Land Survey (“the grid”), which divided the public domain into symmetrical sections divisible for sale.17 The initial high purchase price favored speculators over small farmers, but subsequent laws provided preemption for squatters (1841) and free homesteading for qualified Americans (1862).18 As the historian Ted Steinberg observes, “the grid was the outward expression of a culture wedded not simply to democracy, but to markets and exchange as well.”19 By the early 19th century, more American farmers became connected to ever-more-distant markets.20
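The grid’s commercial logic was arithmetical: each survey township was six miles square and contained thirty-six 640-acre sections, which could be halved and quartered into regular parcels whose size any buyer or clerk could compute. As a minimal sketch only (the Python below uses hypothetical names; only the 640-acre section and its standard subdivisions come from the survey system itself), the aliquot arithmetic looks like this:

```python
# A minimal sketch of the rectangular survey's aliquot arithmetic:
# a 640-acre section subdivided by repeated halving into the regular
# parcels offered for sale. Names are illustrative, not historical.

SECTION_ACRES = 640          # one section = 1 square mile
SECTIONS_PER_TOWNSHIP = 36   # a survey township is six miles square

def parcel_acres(halvings: int) -> float:
    """Acres remaining after `halvings` successive subdivisions of a section."""
    return SECTION_ACRES / (2 ** halvings)

if __name__ == "__main__":
    for halvings, name in [(0, "section"),
                           (1, "half section"),
                           (2, "quarter section"),    # the 160-acre homestead unit
                           (4, "quarter-quarter")]:   # the classic "forty"
        print(f"{name}: {parcel_acres(halvings):g} acres")
```

The regularity is the point: a quarter section meant 160 acres anywhere on the public domain, which helped make land legible to distant buyers and speculators.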

Technology, Finance, Transportation

In the 19th century new technologies, innovative financial institutions, and improved transportation increasingly linked food production to national and international exchanges, transforming American farmscapes into more specialized, more commoditized agriculture. Settlers obtained prairie lands through the federal system; after 1837 John Deere’s steel plow allowed them to break and till the dense prairie soils. Agriculturalists replaced natural grasses with grains such as corn and wheat. Monocultural production enhanced economies of scale, offering farmers potentially greater profits. One of the Earth’s most common crops, wheat offered advantages: it grows quickly, produces high yields, self-pollinates, and proves easy to harvest and store. The abundance of grains made carbohydrate-rich breads a regular part of the American diet. Before the Civil War, wheat fields stretched from Ohio to Kansas and wheat emerged as the leading Midwestern crop.21 Following the Homestead Act of 1862, farmers filled the Great Plains from northern Texas to Canada with wheat. Mechanical reapers and threshers (in the 1830s) and binders (in the 1870s) facilitated harvests while significantly reducing the human labor such tasks required.22

In the 19th century, gateway cities funneled agricultural products to markets. In 1848, local businessmen founded the Chicago Board of Trade, the world’s oldest futures and options exchange, to correct instability in the city’s grain markets. Five years later the Crimean War spurred European demand for grain; prices for American wheat trebled, and the amount of grain shipped from Chicago tripled. Railroads facilitated commoditization through use of the steam-powered grain elevator (1842). Following this technological innovation farmers no longer shipped grain in sacks. Instead they delivered it to local elevators, where it was sorted by grade and loaded into freight cars. Mixing one farmer’s grain with a neighbor’s severed ownership of the physical product; wheat and corn became abstract commodities. In 1864, the Board of Trade established a futures market. Speculators gambled on future prices, increasing grain’s abstraction. Farmers objected to these conditions, but responded to enhanced systematic efficiency by expanding their fields, diminishing biodiversity, and producing more crops.23
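The elevator’s role in that abstraction can be pictured in miniature. The following sketch (Python; the class names and grade label are hypothetical illustrations, not historical records) models how grading and commingling replaced ownership of particular grain with a claim on so many bushels of a grade, the kind of standardized, fungible claim a futures contract could later trade:

```python
# A minimal sketch of why elevator grading made grain fungible: deposits
# are pooled by grade, and the farmer leaves holding a receipt for an
# amount and a grade, not for any particular physical grain.

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Receipt:
    grade: str      # e.g., "No. 2 spring wheat" (illustrative label)
    bushels: float  # a claim on the graded pool, not on specific kernels

class Elevator:
    def __init__(self) -> None:
        self.bins: defaultdict[str, float] = defaultdict(float)

    def deposit(self, grade: str, bushels: float) -> Receipt:
        self.bins[grade] += bushels    # commingled with neighbors' grain
        return Receipt(grade, bushels)

    def redeem(self, receipt: Receipt) -> float:
        self.bins[receipt.grade] -= receipt.bushels  # any grain of that grade will do
        return receipt.bushels

elevator = Elevator()
receipt = elevator.deposit("No. 2 spring wheat", 500.0)
# What circulates and trades is the receipt, not the grain itself.
```

Once grain existed as graded, interchangeable units, a contract for future delivery of those units became tradable in its own right, the arrangement the Board of Trade formalized in 1864.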

The technological shift brought by the railroads also reshaped the cattle industry. Introduced by Europeans, cattle thrived in North America. By the 17th century, farmers drove animals to regional markets where butchers slaughtered them to produce meat for Atlantic Coast cities. By the early 1800s, farmers bred beef cattle separately from dairy cows. In Texas, the famed longhorn emerged from mixing Spanish stock from Mexico with Anglo-American breeds from the American South. For Texas ranchers, cattle were a source of capital, not food. Animals wandered across an unfenced range. In the second half of the century the government aided cattle ranching, and wheat farming as well, by relocating the region’s Indians and contributing to the bison’s near-extinction. After the Civil War, the cattle range spread across the Great Plains and new cattle trails developed, although railroads connected these to more distant industrial locations. Chicago was the nation’s railroad hub, and its Union Stock Yards centralized cattle transportation. Factory-style meatpacking facilities sprouted around the yards, earning Chicago the nickname “hog butcher for the world.” Meatpackers dumped animal waste into a fork of the Chicago River that became known as “Bubbly Creek” because of the gaseous byproducts of decomposition.24

Railroads carried cattle and other food to the Northeast’s burgeoning urban populations as improved technologies pushed more people off the nation’s farms into its factories. Agricultural surpluses underwrote the nation’s industrial development.25


Figure 3. Photograph of the Chicago Board of Trade in Session by George R. Lawrence, c. 1901. Library of Congress, LC-USZ62-120484.

Thus, railroads expanded American foodsheds as tastes and practices began to change. By the 1830s, Americans purchased mass-produced flour instead of growing their own corn or wheat. Trains lowered delivery times; city dwellers, for example, received better milk, produced by grass-fed cows in the countryside. In the past, cows in local swill dairies had eaten the fermented mash residue from whiskey production and produced a foul-tasting milk. After the Civil War, the South joined eastern urban foodsheds, converting some cotton plantations to truck farms and supplying winter vegetables and fruits.26 By the 1890s, refrigerated cars carried “fragile” foods such as butter, fish, and meat greater distances. Until the 1880s, few Americans had eaten meat slaughtered more than a few miles from their homes, and meatpackers such as Gustavus Swift faced consumers’ resistance and objections from local butchers. By cutting prices, Swift lured customers to his product. Urbanites discovered that they preferred the cheaper corn-fattened, Chicago-dressed beef over range-fed cattle that previously traveled the rails for local butchering. By 1900, few cattle were slaughtered in eastern cities; most beef arrived already dressed from the Midwest.27

Some historians have argued that corporate and government corruption underwrote the construction of the transcontinental railroads and the transformations they wrought. At the same time, the open-range cattle industry and wheat monoculture undercut Indians’ economies, replaced native flora and fauna, and stressed arid environments, with tragic consequences in subsequent droughts.28

Government Expansion

The Pacific Railroad Act and land laws were among many actions that gave government a greater and permanent role in promoting commercial agriculture. The U.S. Department of Agriculture (USDA), launched in 1862, tried to remedy the consequences of an older policy. The Patent Office, which previously oversaw agriculture, had issued free seeds and cuttings in the 1830s and 1840s, which unintentionally increased susceptibility to destructive pests and plant diseases.29 Monocultural production also contributed to the problem. Grasshoppers plagued Kansas wheat fields in the late 19th century, for example, and wheat stem rust later destroyed crops. With the food demands of World War I, the USDA initiated a program to remove barberry bushes near wheat fields because they were an alternate host for stem rust.30

Not all such problems, however, were attributable to the seed program. In the 1870s and 1880s, the glassy-winged sharpshooter, the vector for Pierce’s Disease from the American Southeast, ravaged southern California vineyards. In northern California, phylloxera, an aphid indigenous to the Mississippi River watershed, disrupted vineyards, requiring expensive replantings of European vines on resistant American rootstocks.31 White settlers found the climate and irrigated fields of Idaho’s Snake River Valley suitable to potato production. By the early 20th century, blight undermined their crops until they accepted USDA promotions of the resistant Russet developed by botanist Luther Burbank.32

In the late 19th century, the USDA’s most popular effort was its annual reports. Filled with data gathered across the nation, these enhanced its authority in scientific agriculture. The USDA promoted mechanization, monocultural production, and chemical inputs.33 Under the Morrill Act (1862), land grant colleges offered alternatives to traditional university curricula, including scientific agriculture. To provide such schools, each state received 30,000 acres of federal land per member of Congress to allocate or sell. The law opened American higher education to a wider range of social classes. Through teaching, research, experiment stations, and cooperative extension, these institutions affected almost every form of food production, increasingly aligning with large agribusiness as it ascended during the 20th century.34

With such actions, the federal government moved farther from the Jeffersonian vision, fostering instead marketing and production systems over which farmers exercised limited control and from which they reaped limited financial rewards.35 These circumstances frustrated many late-19th-century American food producers. Founded in 1867, the Patrons of Husbandry (the Grange) promoted family farmers through education, cooperation, and mutual protection. The organization initially avoided politics but soon engaged in debates over railroad regulation. Some states, particularly in the Midwest, passed laws limiting freight rates. Similar associations emerged in most regions. By the 1890s, growing frustrations fueled the rise of the Populist Party. Among other policies, Populists sought government ownership of railroads and of warehouses where farmers might store crops until prices improved. Despite some electoral successes in 1894, the Populist Party dissolved by the end of the century. Progressive reformers later embraced some Populist initiatives, such as greater railroad regulation and crop warehousing, but other objectives designed to level the playing field for farmers failed to take hold.36

Industrial Agriculture

In the first half of the 20th century, the trends of increasing scale and of consumers’ alienation from their food sources continued and intensified. For producers, this meant greater use of inputs (machinery, artificial fertilizers, chemical pesticides, and feed additives) in pursuit of higher outputs. This “industrial ideal” in agriculture led to increasing levels of debt and ongoing declines in rural populations as farm operations consolidated.37 For consumers, it meant trips to supermarkets to purchase processed foods that offered greater selection and convenience at the expense of knowledge of the foods’ origins.38 These changes, while affecting every agricultural product to some extent, can be understood in the context of three agribusinesses: Great Plains wheat fields, California’s fruit industry, and nationwide meat production.

Wheat, Hybridization, and Milling

In the late 19th and early 20th centuries, wheat ranked foremost among the crops farmers coaxed from the often-unforgiving plains. Bringing new acres under cultivation accounted for much of the United States’ expanded wheat production during this period, but technology also played a critical role. The mechanical reaper reduced labor needs for the harvest. Later the binder mechanically gathered and bound the crop into sheaves, bringing greater efficiency to farms on the American and Canadian plains. Henequen and sisal fibers from Mexico’s Yucatan Peninsula made strong, durable, and insect-resistant twine for the binders, and bound the Yucatan, where workers in slave-like conditions cultivated the plants, to North America’s grain commodities markets for some seven decades. In the mid-20th century, the combine harvester replaced the binder.39

New wheat strains contributed to production increases. Moving into drier environments with greater temperature extremes, farmers found that plants suited to the more hospitable Midwest performed poorly. Bonanza farms, very large operations run on an industrial scale, proliferated in the Dakotas and other parts of the Great Plains, where flat terrain combined with mechanization and widely available rail transport to encourage massive wheat operations; the monocultural practices associated with these farms, however, also invited rust and pests. Under these conditions, simply maintaining yields required innovation. Wheat plants introduced from Russia, the Mediterranean, and elsewhere proved more effective, and agricultural researchers developed new varieties bred for resistance to drought and predation.40 Efforts to diversify wheat stocks in the 19th century presaged similar undertakings in the 20th. Hybrid sorghum played a key role in the development of the Southern Plains after 1950, and hybrid corn made impacts worldwide beginning in the 1920s.41

The harder wheat that predominated in the northern Great Plains by the beginning of the 20th century proved ill suited to the stone mills that ground the softer wheat of the eastern United States. Roller milling, a process introduced from Hungary during the 1880s, broke the hard husks and separated the bran. A subsequent bleaching process produced fine, white flour, and the millers’ advertising campaigns convinced consumers of this patent flour’s desirability by linking its color to purity and enlightenment. Removing the bran and bleaching, however, raised doubts regarding the flour’s nutritional value; dietary deficiencies led to more cases of pellagra wherever patent flour displaced whole grains. Millers eventually compensated for the lost nutrients through enrichment.42


Figure 4. Photograph of Combine Harvester in the San Joaquin Valley, California, by Dorothea Lange for the U.S. Farm Security Administration, 1938. Library of Congress, LC-USF34-018318-D.

Nonetheless, heavily processed white flours found their way into cake mixes and other packaged foods. Intended to save time in the kitchen, these convenience foods reduced the gratification that many housewives received from their cooking and threatened to obviate their hard-earned foodways knowledge.43

During the 1930s, wheat expansion reached its limits in the environmental catastrophe of the Dust Bowl. Farmers had plowed the Southern Plains to take advantage of the high prices of World War I. When prices declined during the 1920s, they tried to compensate by expanding their acreages and producing larger harvests. When the devastating droughts of 1934–1939 struck, crops withered, and the land, deprived of its native cover by the sodbusters, blew away in massive dust storms. Despite government relief efforts, prosperity did not return to the region until the rains and the demands of another world war arrived in the 1940s.44 From this point forward, the nation largely turned from expanding the acreage of wheat and other crops to increasing yields per acre. Post–World War II agriculture saw increasing use of petrochemical fertilizers and pesticides to bring greater productivity. Artificial fertilizer was developed in Germany when World War I cut the nation off from natural nitrate sources, such as Peruvian guano. Artificial fertilizers produced using the Haber–Bosch process gained worldwide ascendency. They enhanced yields and kept worn-out lands in production, but also altered soil chemistry and polluted waterways.45 Mechanization also enhanced agricultural productivity. Fossil fuel–reliant machines replaced human and animal laborers, freeing more land for crops. On the Plains in the 1920s, one-tenth of farmland supported work horses; few remained by century’s end.46 Similarly, the share of Americans living on farms declined from nearly half the population in 1920 to just 3 percent by 1990.47
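The fertilizer chemistry behind this shift can be stated compactly. The Haber–Bosch process fixes atmospheric nitrogen by combining it with hydrogen over an iron catalyst at high temperature and pressure (a standard textbook equation, not drawn from the sources cited here):

```latex
% Haber-Bosch ammonia synthesis: atmospheric nitrogen fixed with
% hydrogen over an iron catalyst at high temperature and pressure.
\begin{equation}
  \mathrm{N_2} + 3\,\mathrm{H_2} \;\rightleftharpoons\; 2\,\mathrm{NH_3}
\end{equation}
```

The resulting ammonia served as the feedstock for synthetic nitrogen fertilizers, which is why the process could substitute for guano and other natural nitrate sources once wartime trade routes closed.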

Fruit and Agricultural Labor

Luther Burbank arrived in California in 1875 with a sack of his potatoes, which quickly supplanted varieties found locally. Burbank turned his attention to other crops, selecting profitable traits from plums, peas, walnuts, and peaches, and cross-breeding them to produce new varieties with greater commercial value. Many possessed stronger resistance to frost and pests. They grew in regular sizes and shapes, lending themselves to mechanized harvesting. Although Burbank never received patents during his lifetime, his efforts helped subsequent breeders gain the same protections as authors and inventors.48 The state, serving an associative role with industry, also engaged in plant research. The Citrus Experiment Station in Riverside, established by the California legislature in 1905, sought improved and standardized varieties of oranges, an endeavor that largely benefitted the California Fruit Growers Exchange (Sunkist).49

Although fruits attained greater regularity in size and appearance, some specimens failed to meet the standards of growers and consumers. Another manifestation of the associative state, the Fruit Products Laboratory at the University of California, Berkeley, offered a solution: fruit cocktail. Combining diced peaches, pears, and pineapples with grapes and cherries, the new product disguised misshapen fruit. Like other processed and packaged foods, fruit cocktail deprived consumers of knowledge of their food’s origins.50


Figure 5. Poster from U.S. Department of Agriculture promoting “Contour Farming,” 1943. USDA National Agricultural Library.

Like other canned products, fruit cocktail resisted spoilage over long journeys or long stays on pantry shelves. Canned meats, beans, and other goods fed the miners who flocked to the Klondike gold fields in 1898. Cans obscured the natural origins of their meals but connected prospectors to the wider world by expanding their foodsheds.51 Canning in California had commenced with the 1849 gold rush, but it remained a chaotic industry in the 19th century. In 1899, several of the state’s largest canners merged under the Del Monte brand. The conglomerate advertised widely, making its shield one of the most recognizable U.S. food logos, and it could afford the mounting costs of canning operations. Del Monte and other large canners influenced federal legislation, which standardized regulations for canning and raised entry costs for smaller competitors. Del Monte told consumers that its products were safe and convenient, while emphasizing the dangers and difficulties of home canning. Botulism contamination, a growing problem in the canning industry in the early 20th century, undermined these efforts. Industry-sponsored research, enforced by state regulations, instituted strict standards for heating canned foods to ensure safety.52 Canneries unleashed a slew of environmental problems. Increased fruit production encouraged by profitable canneries drew vast quantities of groundwater, depleting aquifers and leading to subsidence. Although canneries sought profitable outlets for their byproducts, they often required encouragement from local, state, or federal legislation to improve waste disposal systems that threatened wildlife and urban health.53

Early efforts to unionize cannery workers failed due to the seasonal nature of the work; not until the pro-labor New Deal legislation of the 1930s did these workers successfully organize. Mechanization also facilitated unionization. The teamwork necessitated by mechanized lines helped workers organize, and the speed demanded by machines eliminated the piece-work pay system in favor of hourly wages. Raising these wages constituted one of the chief demands of the cannery unions.54 Orchards, however, resisted mechanization and continued to rely on human hands to pick the tender fruit. This made California an attractive location for agricultural workers displaced by mechanization elsewhere.55

Race and gender often determined labor arrangements. Chinese immigrants, many of whom brought citrus-picking skills with them from tropical Guangdong Province, initially predominated in the orchards. The Chinese Exclusion Act (1882) made room for Japanese workers, although they lacked experience handling delicate fruit. Protecting oranges during picking and packing occupied much of the growers’ time. Slight damage to the skin opened up fruit to destructive mold. Men in the orchards and women in the packing houses were encouraged to handle the fruit with care. Paying workers based on the price their boxes brought on the market rather than the quantity packed helped achieve this end.56

World War II brought labor shortages to California. The Bracero program, which allowed Mexican workers to enter the United States, made the Central Valley’s orchards more Hispanic after 1942.57 Postwar workers suffered numerous health problems due to the increasing use of pesticides in the fruit groves, but the medical paradigms of the time, combined with a race- and class-based disregard for migrant farmworkers, prevented health officials from giving those problems serious attention. Germ and genetic theories of health obscured environmental threats, and workers often received blame for creating their own insalubrious surroundings. Farmworkers recognized the source of their maladies, but unlike infectious diseases, to which workers could become “seasoned,” toxins accumulated in their bodies over time. Only in the 1960s, when pesticide use appeared to threaten consumers and neighboring property owners, did California take action to regulate it.58

Livestock Industry Integration

Infectious diseases affected other food industries over the 20th century. Testing milk for bacteria was a lynchpin in Progressive reformers’ effort to reduce infant mortality. Efforts by agricultural reformers in both government and industry to increase milk consumption necessitated changing perceptions of milk, a food source long held in suspicion. Convincing mothers to turn from breastfeeding to bottles meant greater reliance on experts, including physicians, manufacturers, and public health officials.59 Better milk required improving the farms that produced it. Dairy manufacturers demanded that farmers meet sanitation standards; state inspectors enforced these standards. Farmers who wanted access to milk markets invested in expensive storage tanks and other equipment, raising the cost of entry and favoring larger producers. Butter producers, who placed less exacting standards on suppliers and attracted smaller farmers farther from major markets, faced pressure to standardize and improve their product.60 Distant dairy farms moved closer to their consumers, in time, if not in space. Highways that encouraged suburbanization permitted faster transportation of milk. Like wheat and fruit, dairy cattle underwent improvement. Artificial insemination, introduced by dairy researchers in 1937, was key, allowing dairy farmers to overcome nature’s limitations in animal breeding. Thus, as consumers grew more distant from the sources of their food, dairy cattle were alienated from their own means of reproduction.61

Disease factored heavily into the development of the beef cattle industry. Although widespread in the southern United States by the end of the 18th century, Texas fever gained notoriety after the Civil War, when cattle driven north from Texas brought the infection to Midwestern herds, prompting quarantines that helped end the brief heyday of the long cattle drive. In addition to denying northern markets to the southern herds, Texas fever stunted the growth of southern cattle and discouraged herd improvement. In the early 1890s, scientists linked Texas fever to the ticks cattle carried.62 A massive federal effort commenced to eliminate it. A quarantine line cordoned southern cattle, with raisers required to submit their herds to dipping vats that rid them of ticks and the disease. The process favored large cattle raisers. Given southern cattle’s resistance, the fever rarely proved fatal to them; eradication therefore benefitted those who intended to improve their herds by importing cattle from abroad and then selling them in markets outside the South. Those with large herds could afford to move them to the dipping vats and keep them in quarantine; smallholders could not. Resistance to the dipping process grew fierce. Some small cattle raisers dynamited dipping vats; others harassed and killed federal officials. Nonetheless, the process succeeded in ridding the South of Texas fever by the 1940s, allowing its cattle industry to flourish.63 Extension service researchers conquered other diseases as well. Brucellosis and blackleg came under control during the mid-20th century, which aided cattle raisers in the West. Federal efforts also reduced the numbers of wild horses, which spread disease and consumed forage.64

Improved cattle herds led to more and higher quality cuts of beef. In the early 1950s, Southern Plains landowners began exploiting the vast Ogallala Aquifer under their land. This water source offered relief from drought, and allowed them to grow hybrid sorghum and corn, which had previously been impossible. Seeking a market for abundant new grains, landowners began feeding them to cattle. Vast feedlots housing tens of thousands of head proliferated during the 1960s. Unlike the smaller 19th-century Midwestern feedlots, these enterprises operated year-round.65 The meatpackers, who formerly concentrated in terminal market cities such as Chicago and Omaha to utilize rail networks, followed the feedlots to the plains. New competitors such as Iowa Beef Packers (IBP) brought innovations, rendering older meatpackers such as Swift and Armour obsolete. Rather than ship entire carcasses to markets, Monfort, IBP, and others reduced the carcass to cuts and shipped them as boxed beef, saving space in transit and making them more appealing to shoppers in the suburban supermarkets that appeared after World War II.66 Small towns across the Plains faced new challenges, as numerous workers at the plants, many of them immigrants, overwhelmed social services.

Industrial-scale beef production presented other concerns. Feedlots concentrated manure and threatened nearby watercourses. Feedlots also began using growth hormones such as diethylstilbestrol (DES). Although DES had been used for years to treat female infertility, it was not until the hormone, a synthetic form of estrogen, threatened the masculinity of cattle and of the men who consumed beef that it raised concerns. Finally, when evidence surfaced in 1971 linking DES to several cases of vaginal cancer, federal hearings and a ban ensued.67 Other hormones have taken its place, with long-lasting consequences for the industry. Consumer distrust of meat had focused on the meatpackers since Upton Sinclair published The Jungle (1906), but DES brought scrutiny to the entire chain of beef production. Cattle raisers’ failure to take these concerns seriously sparked consumers’ suspicions. Scientists’ inability to differentiate between dose-dependent toxins and hormone disruptors such as DES undermined the authority they had cultivated since the Progressive era.68 Concerns also mounted over antibiotics, added to feed to stimulate growth. Critics argued they contributed to resistant strains of bacteria, and consumers feared the health risks of antibiotic-laced meat, leading to sagging beef sales in the 1980s.69

The beef industry’s 20th-century transformation paled before those in the chicken and pork industries. Chicken lagged far behind beef and pork in popularity at the beginning of the century. To change that status, scientists in industry and the extension service developed larger, meatier birds and sought more versatile cuts. They classified birds by age and hence their suitability for cooking. Rather than purchase whole chickens, shoppers bought packs of legs or thighs. Chicken producers such as Perdue emphasized the alleged health benefits of chicken compared to red meat. By century’s end, chicken was the nation’s most popular meat. Pork attempted to follow its lead by advertising itself as “the other white meat.”70

Pigs never lacked for versatility, making their way into sausages and diverse products. The challenge for the pork industry rested on transforming pigs from mere consumers of cattle’s leftovers, as they were in the Midwestern feedlots, to efficient meat producers. In the mid-20th century, agricultural scientists at Iowa State University and elsewhere worked to free the industry from the natural rhythms of hog farrowing and the availability of feed on Midwestern farms. This required bringing hogs into confined buildings for strict environmental control. Scientifically designed feeds brought efficiency and regularity. Adding antibiotics to the feed limited disease in the confined conditions, and allowed piglets to wean earlier, decreasing the time between litters. The industry needed to sell farmers and consumers on this shift. The former blamed industrialized farming for hurting their profits; the latter worried about the safety of such pork products.71 Industrialized hog farming grew swiftly, however. On the Southern Plains, landowners found they could just as easily feed their grains to hogs; gigantic hog barns proliferated during the 1990s.72


Figure 6. Photograph of cattle in a feed lot in Van Zandt County, Texas by Carol Highsmith, 2014. The Lyda Hill Texas Collection of Photographs in Carol M. Highsmith’s America Project, Library of Congress, LC-DIG-highsm-29994.

In the 1940s, Tyson Chicken of Arkansas developed a system for raising chickens that tasked landowners with raising hatchlings owned by the company. These landowners, contractors for Tyson, built sheds that met Tyson’s exacting standards. Constructing these sheds, as well as maintaining ventilation and adequate temperatures, proved costly. This system placed the least profitable portion of the enterprise, raising the chickens, on debt-ridden contractors while Tyson controlled the entire operation, including ownership of the birds. Expanding into pork over the latter half of the 20th century, Tyson applied this system to hogs. Working in conjunction with its similarly structured competitors to achieve a near monopoly, Tyson left independent hog raisers without a market for their livestock. Tyson purchased IBP in 2001, but so far the cattle industry has resisted this degree of integration.73

Reversing Trends

As industrialized agriculture alienated consumers from nature and their food’s origins, some looked to reverse that trend. Complaints that processed foods were not “natural” date back to the beginning of the 20th century, when food manufacturers first began advertising their products as exactly that. Defining “natural” proved problematic, however, as some substances could be produced by either human or non-human means; more recently, defining “organic” has posed a similar challenge.74 Others pursued unprocessed or locally produced foods. In 1971, the chef Alice Waters, fearing that American foods did not compare in quality with the fresh-picked ingredients she had experienced in Europe, opened Chez Panisse in Berkeley, California, which specialized in fresh, local foods.75 Raw, unpasteurized milk and grass-fed beef found renewed markets.76 Some urbanites preferred a more hands-on experience, planting herb and vegetable gardens. One Denver suburb set itself apart by permitting residents to raise crops, chickens, and goats on their property.77 Such efforts, however, cannot sustainably feed the nation. America’s demand for beef cannot be met by free-range, grass-fed cattle, nor have “alternative” foods replaced their industrial counterparts. Yet while Americans demanded cheap, plentiful, consistent, and safe foods, many also expressed a desire for more information about their food’s origins. Balancing and satisfying these desires presents a challenge to the food industry, scientists, and anyone concerned with the nature of their food.

Discussion of the Literature

Worster and Cronon, who debated the balance between materiality and culture in 1990 but agreed on the centrality of food, wrote important early histories of capitalist agricultural transitions. Worster’s groundbreaking Dust Bowl (1979) argues that the disaster on the plains was caused by humans, not nature. He indicts the supposedly rational capitalist system that compelled farmers to cultivate and abuse marginal lands in response to distant market forces.78 In Changes in the Land (1983), Cronon contends that English colonists’ notions of property and their pursuit of capitalism transmuted New England’s ecosystems, usurping the ecological balances achieved by the region’s indigenous people. His subsequent book, Nature’s Metropolis (1991), outlined Chicago’s transformative role in American agriculture through its railroads, Board of Trade, and other capitalist mechanisms.79

A few selections capture the spectrum of approaches environmental historians subsequently adopted. In examining whether Europeans’ arrival inevitably involved resource mismanagement, Brian Donahue challenges scholars, including Cronon, who “argued that the profit motive drove [colonial farmers] to commoditize and exploit natural resources as rapidly as possible, rather than conserve them as the Indians had done.”80 Farmers in Concord, Massachusetts, implemented English mixed husbandry practices. Concord’s common field system distributed limited acreage for meadow and tillage and reserved woodlands for grazing. This ecologically viable system persisted until early 19th-century demographic pressures and market demands pushed farmers from diversified landscapes toward specialized dairy farming that required greater pasturage and led to deforestation. Carney’s “black rice” thesis, discussed above, similarly engages a rich debate. Carney argues that African slaves’ innovations and environmental mastery gave them agency even within slavery. Critics suggest that local conditions required adaptations that constrained slaves’ power, but all debaters see slavery as a social and labor system within the context of agroecological production.81

As Linda Nash observes, environmental historians increasingly understood “agriculture not as successful or destructive, modern or traditional, but as a series of material exchanges and negotiations, and a process (sometimes) of environmental learning.”82 Edmund Russell, for example, emphasizes evolutionary history. Animals, when bent to the needs of human culture, served as technology, transforming grass and other feeds unappealing to humans into meat and other products.83 In Industrial Cowboys, David Igler discusses German immigrants, Charles Lux and Henry Miller, who built an empire by acquiring land and water rights in California’s Central Valley in the late 19th century as the state became the nation’s agricultural leader. Engineering landscapes through irrigation and reclamation, they controlled limited regional water, maximized cattle production, and cornered the far western meat market.84 Human needs reshaped plants and animals, which in turn reshaped human cultures.

Other studies stand at this intersection of human culture and non-human nature. Marsha Weisiger’s Dreaming of Sheep in Navajo Country explores the long-term consequences of species introduced to the Southwest by Spanish settlers. The federal solution to an overgrazing problem on tribal lands, a stock reduction program, undermined Navajo economic independence and devastated the community’s women, who, despite their ownership of the stock, found themselves ignored in the decision-making process.85 Cindy Ott’s Pumpkin examines a vegetable with little practical value that offered a badge of American identity through its connection to an imagined past. Ranging from the Pilgrims to Cinderella and Charlie Brown, Ott’s work embodies the cultural turn in environmental history.86

At the same time, Ott’s book reflects a significant shift in agricultural history, which has focused more recently on commodities and the transnational linkages underlying their production. One of the first and still most important books to explore the history of a single commodity was Sidney Mintz’s Sweetness and Power. It explores how Europeans and Americans transformed sugar from a rare foreign luxury into a commonplace necessity of modern life for the multitudes. Grown as a “slave crop” in Europe’s Caribbean colonies, sugar became the opiate of the industrial proletariat, changing work patterns, eating habits, and human health.87

In Bound in Twine, Sterling Evans explores how the promotion of the Canadian and American plains as wheat fields for international markets depended on the development of henequen plantations in Mexico’s Yucatan Peninsula. The latter region’s agave plant produces the fibrous material known as henequen or sisal that worked well in the mechanical binders that proliferated from the late 19th to the mid-20th century. This henequen-wheat complex locked three nations in processes of economic and ecological change, although the more devastating consequences were not experienced equally. No group suffered more than large numbers of Yaqui Indians forced into slave labor on the plantations.88

Finally, in Kitchen Literacy, Ann Vileisis analyzes changing American eating habits. Whatever its ecological disruptions, she argues, colonial agriculture rested on nature’s rhythms. Industrialization, urbanization, and agribusiness, by contrast, attenuated humans’ most basic connection with nature, food, leaving consumers indifferent to and ignorant of the world around them.

Primary Sources

Scholars use a wide variety of sources to explore the relationship between agriculture, food, and the environment, including letters, diaries, recipes, cookbooks, government reports, agricultural trade journals, newspapers, almanacs, advertisements, and scientific studies, among many others. They have increasingly turned to material culture, the evidence of cultures in the objects and architecture they left behind; these physical remains help explain behavior and belief systems. Environmental historians also walk the landscape in an attempt to discern physical changes within specific ecosystems. The last decade has witnessed a proliferation of useful databases, such as Cornell University’s Core Historical Literature of Agriculture.
