Simplification of the food chain occurs at the level of species diversity too. The astounding variety of foods on offer in today’s supermarket obscures the fact that the actual number of species in the modern diet is shrinking. Thousands of plant and animal varieties have fallen out of commerce in the last century as industrial agriculture has focused its attentions on a small handful of high-yielding (and usually patented) varieties, with qualities that suited them to things like mechanical harvesting and processing. Half of all the broccoli grown commercially in America today is a single variety, Marathon, notable for its high yield. The overwhelming majority of the chickens raised for meat in America are the same hybrid, the Cornish cross; more than 99 percent of the turkeys are Broad-Breasted Whites.
With the rise of industrial agriculture, vast monocultures of a tiny group of plants, most of them cereal grains, have replaced the diversified farms that used to feed us. A century ago, the typical Iowa farm raised more than a dozen different plant and animal species: cattle, chickens, corn, hogs, apples, hay, oats, potatoes, cherries, wheat, plums, grapes, and pears. Now it raises only two: corn and soybeans. This simplification of the agricultural landscape leads directly to the simplification of the diet, which is now to a remarkable extent dominated by (big surprise) corn and soybeans. You may not think you eat a lot of corn and soybeans, but you do: 75 percent of the vegetable oils in your diet come from soy (representing 20 percent of your daily calories) and more than half of the sweeteners you consume come from corn (representing around 10 percent of daily calories).
Why corn and soy? Because these two plants are among nature’s most efficient transformers of sunlight and chemical fertilizer into carbohydrate energy (in the case of corn) and fat and protein (in the case of soy). If you want to extract the maximum amount of macronutrients from the American farm belt, corn and soy are the crops to plant. (It helps that the government pays farmers to grow corn and soy, subsidizing every bushel they produce.) Most of the corn and soy crop winds up in the feed of our food animals (simplifying their diets in unhealthy ways, as we’ll see), but much of the rest goes into processed foods. The business model of the food industry is organized around “adding value” to cheap raw materials; its genius has been to figure out how to break these two big seeds down into their chemical building blocks and then reassemble them in myriad packaged food products. The result is that today corn contributes 554 calories a day to America’s per capita food supply and soy another 257. Add wheat (768 calories) and rice (91) and you can see there isn’t a whole lot of room left in the American stomach for any other foods.
Today these four crops account for two thirds of the calories we eat. When you consider that humankind has historically consumed some eighty thousand edible species, and that three thousand of these have been in widespread use, this represents a radical simplification of the human diet. Why should this concern us? Because humans are omnivores, requiring somewhere between fifty and a hundred different chemical compounds and elements in order to be healthy. It’s hard to believe we’re getting everything we need from a diet consisting largely of processed corn, soybeans, rice, and wheat.
3) From Quality to Quantity
While industrial agriculture has made tremendous strides in coaxing macronutrients (calories) from the land, it is becoming increasingly clear that these gains in food quantity have come at a cost to its quality. This probably shouldn’t surprise us: Our food system has long devoted its energies to increasing yields and selling food as cheaply as possible. It would be too much to hope those goals could be achieved without sacrificing at least some of the nutritional quality of our food.
As mentioned earlier, USDA figures show a decline in the nutrient content of the forty-three crops it has tracked since the 1950s. In one recent analysis, vitamin C declined by 20 percent, iron by 15 percent, riboflavin by 38 percent, and calcium by 16 percent. Government figures from England tell a similar story: declines since the fifties of 10 percent or more in levels of iron, zinc, calcium, and selenium across a range of food crops. To put this in more concrete terms, you now have to eat three apples to get the same amount of iron as you would have gotten from a single 1940 apple, and you’d have to eat several more slices of bread to get your recommended daily allowance of zinc than you would have a century ago.
These examples come from a 2007 report entitled “Still No Free Lunch” written by Brian Halweil, a researcher for Worldwatch, and published by the Organic Center, a research institute established by the organic food industry. “American agriculture’s single-minded focus on increasing yields created a blind spot,” Halweil writes, “where incremental erosion in the nutritional quality of our food…has largely escaped the notice of scientists, government, and consumers.” The result is the nutritional equivalent of inflation, such that we have to eat more to get the same amount of various essential nutrients. The fact that at least 30 percent of Americans have a diet deficient in vitamin C, vitamin E, vitamin A, and magnesium surely owes more to eating processed foods full of empty calories than it does to lower levels of nutrients in the whole foods we aren’t eating. Still, it doesn’t help that the raw materials used in the manufacture of processed foods have declined in nutritional quality or that when we are eating whole foods, we’re getting substantially less nutrition per calorie than we used to.*
Nutritional inflation seems to have two principal causes: changes in the way we grow food and changes in the kinds of foods we grow. Halweil cites a considerable body of research demonstrating that plants grown with industrial fertilizers are often nutritionally inferior to the same varieties grown in organic soils. Why this should be so is uncertain, but there are a couple of hypotheses. Crops grown with chemical fertilizers grow more quickly, giving them less time and opportunity to accumulate nutrients other than the big three (nutrients in which industrial soils are apt to be deficient anyway). Also, easy access to the major nutrients means that industrial crops develop smaller and shallower root systems than organically grown plants; deeply rooted plants have access to more soil minerals. Biological activity in the soil almost certainly plays a role as well; the slow decomposition of organic matter releases a wide range of plant nutrients, possibly including compounds science hasn’t yet identified as important. Also, a biologically active soil will have more mycorrhizae, the soil fungi that live in symbiosis with plant roots, supplying the plants with minerals in exchange for a ration of sugar.
In addition to these higher levels of minerals, organically grown crops have also been found to contain more phytochemicals: the various secondary compounds (including carotenoids and polyphenols) that plants produce in order to defend themselves from pests and diseases, many of which turn out to have important antioxidant, anti-inflammatory, and other beneficial effects in humans. Because plants living on organic farms aren’t sprayed with synthetic pesticides, they’re forced to defend themselves, with the result that they tend to produce between 10 percent and 50 percent more of these valuable secondary compounds than conventionally grown plants.
Some combination of these environmental factors probably accounts for at least part of the decline in the nutritional quality of conventional crops, but genetics likely plays just as important a role. Very simply, we have been breeding crops for yield, not nutritional quality, and when you breed for one thing, you invariably sacrifice another. Halweil cites several studies demonstrating that when older crop varieties are grown side by side with modern cultivars, the older ones typically have lower yields but substantially higher nutrient levels. USDA researchers recently found that breeding to “improve” wheat varieties over the past 130 years (a period during which yields of grain per acre tripled) had reduced levels of iron by 28 percent and zinc and selenium by roughly a third. Similarly, milk from modern Holstein cows (in which breeders have managed to more than triple daily yield since 1950) has considerably less butterfat and other nutrients than that from older, less “improved” varieties like Jersey, Guernsey, and Brown Swiss.
Clearly the achievements of industrial agriculture have come at a cost: It can produce a great many more calories per acre, but each of those calories may supply less nutrition than it formerly did. And what has happened on the farm has happened in the food system as a whole as industry has pursued the same general strategy of promoting quantity at the expense of quality. You don’t need to spend much time in an American supermarket to figure out that this is a food system organized around the objective of selling large quantities of calories as cheaply as possible.
Indeed, doing so has been official U.S. government policy since the mid-seventies, when a sharp spike in food prices brought protesting housewives into the street and prompted the Nixon administration to adopt an ambitious cheap food policy. Agricultural policies were rewritten to encourage farmers to plant crops like corn, soy, and wheat fencerow to fencerow, and it worked: Since 1980, American farmers have produced an average of 600 more calories per person per day, the price of food has fallen, portion sizes have ballooned, and, predictably, we’re eating a whole lot more, at least 300 more calories a day than we consumed in 1985. What kind of calories? Nearly a quarter of these additional calories come from added sugars (and most of that in the form of high-fructose corn syrup); roughly another quarter from added fat (most of it in the form of soybean oil); 46 percent of them from grains (mostly refined); and the few calories left (8 percent) from fruits and vegetables.* The overwhelming majority of the calories Americans have added to their diets since 1985 (the 93 percent of them in the form of sugars, fats, and mostly refined grains) supply lots of energy but very little of anything else.
A diet based on quantity rather than quality has ushered a new creature onto the world stage: the human being who manages to be both overfed and undernourished, two characteristics seldom found in the same body in the long natural history of our species. In most traditional diets, when calories are adequate, nutrient intake will usually be adequate as well. Indeed, many traditional diets are nutrient rich and, at least compared to ours, calorie poor. The Western diet has turned that relationship upside down. At a health clinic in Oakland, California, doctors report seeing overweight children suffering from old-time deficiency diseases such as rickets, long thought to have been consigned to history’s dustheap in the developed world. But when children subsist on fast food rather than fresh fruits and vegetables and drink more soda than milk, the old deficiency diseases return, now even in the obese.
Bruce Ames, the renowned Berkeley biochemist, works with kids like this at Children’s Hospital and Research Center in Oakland. He’s convinced that our high-calorie, low-nutrient diet is responsible for many chronic diseases, including cancer. Ames has found that even subtle micronutrient deficiencies, far below the levels needed to produce acute deficiency diseases, can cause damage to DNA that may lead to cancer. Studying cultured human cells, he’s found that “deficiency of vitamins C, E, B12, B6, niacin, folic acid, iron or zinc appears to mimic radiation by causing single- and double-strand DNA breaks, oxidative lesions, or both,” all precursors to cancer. “This has serious implications, as half of the U.S. population may be deficient in at least one of these micronutrients.” Most of the missing micronutrients are supplied by fruits and vegetables, of which only 20 percent of American children and 32 percent of adults eat the recommended five daily servings. The cellular mechanisms Ames has identified could explain why diets rich in vegetables and fruits seem to offer some protection against certain cancers.
Ames also believes, though he hasn’t yet proven it, that micronutrient deficiencies may contribute to obesity. His hypothesis is that a body starved of critical nutrients will keep eating in the hope of obtaining them. The absence of these nutrients from the diet may “counteract the normal feeling of satiety after sufficient calories are eaten,” and such an unrelenting hunger “may be a biological strategy for obtaining missing nutrients.” If Ames is right, then a food system organized around quantity rather than quality has a destructive feedback loop built into it, such that the more low-quality food one eats, the more one wants to eat, in a futile (but highly profitable) quest for the absent nutrient.
4) From Leaves to Seeds
It’s no accident that the small handful of plants we’ve come to rely on are grains (soy is a legume); these crops are exceptionally efficient at transforming sunlight, fertilizer, air, and water into macronutrients: carbohydrates, fats, and proteins. These macronutrients in turn can be profitably converted into meat, dairy, and processed foods of every description. Also, the fact that they come in the form of durable seeds that can be stored for long periods of time means they can function as commodities as well as foods, making these crops particularly well adapted to the needs of industrial capitalism.
The needs of the human eater are a very different matter, however. An oversupply of macronutrients, such as we now face, itself represents a serious threat to our health, as soaring rates of obesity and diabetes indicate. But, as the research of Bruce Ames and others suggests, the undersupply of micronutrients may constitute a threat just as grave. Put in the most basic terms, we’re eating a lot more seeds and a lot fewer leaves (as do the animals we depend on), a tectonic dietary shift the full implications of which we are just now beginning to recognize. To borrow, again, the nutritionist’s reductive vocabulary: Leaves provide a host of critical nutrients a body can’t get from a diet of refined seeds. There are the antioxidants and phytochemicals; there is the fiber; and then there are the essential omega-3 fatty acids found in leaves, which some researchers believe will turn out to be the most crucial missing nutrient of all.
Most people associate omega-3 fatty acids with fish, but fish get them from green plants (specifically algae), which is where they all originate.* Plant leaves produce these essential fatty acids (we say they’re essential because our bodies can’t produce them on their own) as part of photosynthesis; they occupy the cell membranes of chloroplasts, helping them collect light. Seeds contain more of another kind of essential fatty acid, omega-6, which serves as a store of energy for the developing seedling. These two types of polyunsaturated fats perform very different functions in the plant as well as the plant eater. In describing their respective roles, I’m going to simplify the chemistry somewhat. For a more complete (and fascinating) account of the biochemistry of these fats and the story of their discovery, read Susan Allport’s The Queen of Fats.†