To get a better grip on the nature of these changes is to begin to understand how we might alter our relationship to food for the better, for our health. These changes have been numerous and far-reaching, but consider as a start these five fundamental transformations to our foods and ways of eating. All of them can be reversed, if not perhaps so easily in the food system as a whole, then certainly in the life and diet of any individual eater, and without, I hasten to add, returning to the bush or taking up hunting and gathering.
1) From Whole Foods to Refined
The case of corn points to one of the key features of the modern diet: a shift toward increasingly refined foods, especially carbohydrates. People have been refining cereal grains since at least the Industrial Revolution, favoring white flour and white rice over brown, even at the price of lost nutrients. Part of the reason was prestige: Because for many years only the wealthy could afford refined grains, they acquired a certain glamour. Refining grains extends their shelf life (precisely because they are less nutritious to the pests that compete with us for their calories) and makes them easier to digest by removing the fiber that ordinarily slows the release of their sugars. Also, the finer that flour is ground, the more surface area is exposed to digestive enzymes, so the quicker the starches turn to glucose. A great deal of modern industrial food can be seen as an extension and intensification of this practice, as food processors find ways to deliver glucose, the brain’s preferred fuel, ever more swiftly and efficiently. Sometimes this is precisely the point, as when corn is refined into corn syrup; other times, though, it is an unfortunate by-product of processing food for other reasons.
Viewed from this perspective, the history of refining whole foods has been a history of figuring out ways not just to make them more durable and portable but also to concentrate their energy and, in a sense, speed them up. This acceleration took a great leap forward with the introduction in Europe around 1870 of rollers (made from iron, steel, or porcelain) for grinding grain. Perhaps more than any other single development, this new technology, which by 1880 had replaced grinding by stone throughout Europe and America, marked the beginning of the industrialization of our food, reducing it to its chemical essence and speeding up its absorption. Refined flour is the first fast food.
Before the roller-milling revolution, wheat was ground between big stone wheels, which could get white flour only so white. That’s because while stone grinding removed the bran from the wheat kernel (and therefore the largest portion of the fiber), it couldn’t remove the germ, or embryo, which contains volatile oils that are rich in nutrients. The stone wheels merely crushed the germ and released the oil. This had the effect of tinting the flour yellowish gray (the yellow is carotene) and shortening its shelf life, because the oil, once exposed to the air, soon oxidized and turned rancid. That’s what people could see and smell, and they didn’t like it. What their senses couldn’t tell them, however, was that the germ contributed some of the most valuable nutrients to the flour, including much of its protein, folic acid, and other B vitamins; carotenes and other antioxidants; and omega-3 fatty acids, which are especially prone to rancidity.
The advent of rollers that made it possible to remove the germ and then grind the remaining endosperm (the big packet of starch and protein in a seed) exceptionally fine solved the problem of stability and color. Now just about everyone could afford snowy-white flour that could keep on a shelf for many months. No longer did every town need its own mill, because flour could now travel great distances. (Plus it could be ground year-round by large companies in big cities: Heavy stone mills, which typically relied on water power, operated mostly when and where rivers flowed; steam engines could drive the new rollers whenever and wherever.) Thus was one of the main staples of the Western diet cut loose from its moorings in place and time and marketed on the basis of image rather than nutritional value. In this, white flour was a modern industrial food, one of the first.
The problem was that this gorgeous white powder was nutritionally worthless, or nearly so. Much the same was now true of corn flour and white rice, the polishing of which (i.e., the removal of its most nutritious parts) was perfected around the same time. Wherever these refining technologies came into widespread use, devastating epidemics of pellagra and beriberi soon followed. Both are diseases caused by deficiencies in the B vitamins that the germ had contributed to the diet. But the sudden absence from bread of several other micronutrients, as well as omega-3 fatty acids, probably also took its toll on public health, particularly among the urban poor of Europe, many of whom ate little but bread.
In the 1930s, with the discovery of vitamins, scientists figured out what had happened, and millers began fortifying refined grain with B vitamins. This took care of the most obvious deficiency diseases. More recently, scientists recognized that many of us also had a deficiency of folic acid in our diet, and in 1996 public health authorities ordered millers to start adding folic acid to flour as well. But it would take longer still for science to realize that this “Wonder Bread” strategy of supplementation, as one nutritionist has called it, might not solve all the problems caused by the refining of grain. Deficiency diseases are much easier to trace and treat than chronic diseases (indeed, medicine’s success in curing deficiency diseases is an important source of nutritionism’s prestige), and it turns out that the practice of refining carbohydrates is implicated in several of these chronic diseases as well: diabetes, heart disease, and certain cancers.
The story of refined grain stands as a parable about the limits of reductionist science when applied to something as complex as food. For years now nutritionists have known that a diet high in whole grains reduces one’s risk for diabetes, heart disease, and cancer. (This seems to be true even after you correct for the fact that the kind of people who eat lots of whole grains today probably have lifestyles healthier in other ways as well.) Different nutritionists have given the credit for the benefits of whole grain to different nutrients: the fiber in the bran, the folic acid and other B vitamins in the germ, or the antioxidants or the various minerals. In 2003 the American Journal of Clinical Nutrition* published an unusually nonreductionist study demonstrating that no one of those nutrients alone can explain the benefits of whole-grain foods: The typical reductive analysis of isolated nutrients could not explain the improved health of the whole-grain eaters.
For the study, University of Minnesota epidemiologists David R. Jacobs and Lyn M. Steffen reviewed the relevant research and found a large body of evidence that a diet rich in whole grains did in fact reduce mortality from all causes. But what was surprising was that even after adjusting for levels of dietary fiber, vitamin E, folic acid, phytic acid, iron, zinc, magnesium, and manganese in the diet (all the good things we know are in whole grains), they found an additional health benefit to eating whole grains that none of the nutrients alone or even together could explain. That is, subjects getting the same amounts of these nutrients from other sources were not as healthy as the whole-grain eaters: “This analysis suggests that something else in the whole grain protects against death.” The authors concluded, somewhat vaguely but suggestively, that “the various grains and their parts act synergistically” and urged their colleagues to begin paying attention to the concept of “food synergy.” Here, then, is support for an idea revolutionary by the standards of nutritionism: A whole food might be more than the sum of its nutrient parts.
Suffice it to say, this proposition has not been enthusiastically embraced by the food industry, and probably won’t be any time soon. As I write, Coca-Cola is introducing vitamin-fortified sodas, extending the Wonder Bread strategy of supplementation to junk food in its purest form. (Wonder Soda?) The big money has always been in processing foods, not selling them whole, and the industry’s investment in the reductionist approach to food is probably safe. The fact is, there is something in us that loves a refined carbohydrate, and that something is the human brain. The human brain craves carbohydrates reduced to their energy essence, which is to say pure glucose. Once industry figured out how to transform the seeds of grasses into the chemical equivalent of sugar, there was probably no turning back.
And then of course there is sugar itself, the ultimate refined carbohydrate, which began flooding the marketplace and the human metabolism around the same time as refined flour. In 1874, England lifted its tariffs on imported sugar, the price dropped by half, and by the end of the nineteenth century fully a sixth of the calories in the English diet were coming from sugar, with much of the rest coming from refined flour.
With the general availability of cheap pure sugar, the human metabolism now had to contend not only with a constant flood of glucose, but also with more fructose than it had ever before encountered, because sugar (sucrose) is half fructose.* (Per capita fructose consumption has increased 25 percent in the past thirty years.) In the natural world, fructose is a rare and precious thing, typically encountered seasonally in ripe fruit, when it comes packaged in a whole food full of fiber (which slows its absorption) and valuable micronutrients. It’s no wonder we’ve been hardwired by natural selection to prize sweet foods: Sugar as it is ordinarily found in nature, in fruits and some vegetables, gives us a slow-release form of energy accompanied by minerals and all sorts of crucial micronutrients we can get nowhere else. (Even in honey, the purest form of sugar found in nature, you find some valuable micronutrients.)
One of the most momentous changes in the American diet since 1909 (when the USDA first began keeping track) has been the increase in the percentage of calories coming from sugars, from 13 percent to 20 percent. Add to that the percentage of calories coming from grains (roughly 40 percent, or ten servings, nine of which are refined) and Americans are consuming a diet that is at least half sugars in one form or another: calories providing virtually nothing but energy. The energy density of these refined carbohydrates contributes to obesity in two ways. First, we consume many more calories per unit of food; the fiber that’s been removed from these foods is precisely what would have made us feel full and stop eating. Second, the flash flood of glucose causes insulin levels to spike and then, once the cells have taken all that glucose out of circulation, drop precipitously, making us think we need to eat again.
While the widespread acceleration of the Western diet has given us the instant gratification of sugar, in many people, especially those newly exposed to it, the speediness of this food overwhelms the ability of insulin to process it, leading to type 2 diabetes and all the other chronic diseases associated with metabolic syndrome. As one nutrition expert put it to me, “We’re in the middle of a national experiment in the mainlining of glucose.” And don’t forget the flood of fructose, which may represent an even greater evolutionary novelty, and therefore a greater challenge to the human metabolism, than all that glucose.
It is probably no accident that rates of type 2 diabetes are lower among ethnic Europeans, who have had longer than other groups to accustom their metabolisms to fast-release refined carbohydrates: Their food environment changed first.* To encounter such a diet for the first time, as when people accustomed to a more traditional diet come to America or when fast food comes to them, delivers a shock to the system. This shock is what public health experts mean by the nutrition transition, and it can be deadly.
So here, then, is the first momentous change in the Western diet that may help to explain why it makes some people so sick: Supplanting tested relationships to the whole foods with which we coevolved over many thousands of years, it asks our bodies now to relate to, and deal with, a very small handful of efficiently delivered nutrients that have been torn from their food context. Our ancient evolutionary relationship with the seeds of grasses and fruit of plants has given way, abruptly, to a rocky marriage with glucose and fructose.
2) From Complexity to Simplicity
At every level, from the soil to the plate, the industrialization of the food chain has involved a process of chemical and biological simplification. It starts with industrial fertilizers, which grossly simplify the biochemistry of the soil. In the wake of Liebig’s identification of the big three macronutrients that plants need to grow (nitrogen, phosphorus, and potassium, or NPK) and Fritz Haber’s invention of a method for synthesizing nitrogen fertilizer from fossil fuels, agricultural soils began receiving large doses of the big three but little else. Just like Liebig, whose focus on the macronutrients in the human diet failed to take account of the important role played by micronutrients such as vitamins, Haber completely overlooked the importance of biological activity in the soil: the contribution to plant health of the complex underground ecosystem of soil microbes, earthworms, and mycorrhizal fungi. Harsh chemical fertilizers (and pesticides) depress or destroy this biological activity, forcing crops to subsist largely on a simple ration of NPK. Plants can live on this fast-food diet of chemicals, but it leaves them more vulnerable to pests and diseases and appears to diminish their nutritional quality.
It stands to reason that a chemically simplified soil would produce chemically simplified plants. Since the widespread adoption of chemical fertilizers in the 1950s, the nutritional quality of produce in America has declined substantially, according to figures gathered by the USDA, which has tracked the nutrient content of various crops since then. Some researchers blame this decline on the condition of the soil; others cite modern plant breeding, which has consistently selected for industrial characteristics such as yield at the expense of nutritional quality. (The next section will take up the trade-off between quality and quantity in industrial food.)
The trend toward simplification of our food continues up the chain. As we’ve seen, processing whole foods (refining, chemically preserving, and canning them) depletes them of many nutrients, a few of which are then added back: B vitamins in refined flour, vitamins and minerals in breakfast cereal and bread. Fortifying processed foods with missing nutrients is surely better than leaving them out, but food science can add back only the small handful of nutrients that food science recognizes as important today. What is it overlooking? As the whole-grain food synergy study suggests, science doesn’t know nearly enough to compensate for everything that processing does to whole foods. We know how to break down a kernel of corn or grain of wheat into its chemical parts, but we have no idea how to put it back together again. Destroying complexity is a lot easier than creating it.