Michael Pollan, In Defense of Food
The Aborigines divided their seven-week stay in the bush between a coastal and an inland location. While on the coast, their diet consisted mainly of seafood, supplemented by birds, kangaroo, and witchetty grubs, the fatty larvae of a local insect. Hoping to find more plant foods, the group moved inland after two weeks, settling at a riverside location. Here, in addition to freshwater fish and shellfish, the diet expanded to include turtle, crocodile, birds, kangaroo, yams, figs, and bush honey. The contrast between this hunter-gatherer fare and their previous diet was stark: O’Dea reports that prior to the experiment “the main dietary components in the urban setting were flour, sugar, rice, carbonated drinks, alcoholic beverages (beer and port), powdered milk, cheap fatty meat, potatoes, onions, and variable contributions of other fresh fruits and vegetables”-the local version of the Western diet.
After seven weeks in the bush, O’Dea drew blood from the Aborigines and found striking improvements in virtually every measure of their health. All had lost weight (an average of 17.9 pounds) and seen their blood pressure drop. Their triglyceride levels had fallen into the normal range. The proportion of omega-3 fatty acids in their tissues had increased dramatically. “In summary,” O’Dea concluded, “all of the metabolic abnormalities of type II diabetes were either greatly improved (glucose tolerance, insulin response to glucose) or completely normalized (plasma lipids) in a group of diabetic Aborigines by a relatively short (seven week) reversion to traditional hunter-gatherer lifestyle.”
O’Dea does not report what happened next, whether the Aborigines elected to remain in the bush or return to civilization, but it’s safe to assume that if they did return to their Western lifestyles, their health problems returned too. We have known for a century now that there is a complex of so-called Western diseases-including obesity, diabetes, cardiovascular disease, hypertension, and a specific set of diet-related cancers-that begin almost invariably to appear soon after a people abandons its traditional diet and way of life. What we did not know before O’Dea took her Aborigines back to the bush (and since she did, a series of comparable experiments have produced similar results in Native Americans and native Hawaiians) was that some of the most deleterious effects of the Western diet could be so quickly reversed. It appears that, at least to an extent, we can rewind the tape of the nutrition transition and undo some of its damage. The implications for our own health are potentially significant.*
The genius of Kerin O’Dea’s experiment was its simplicity-and her refusal to let herself be drawn into the scientific labyrinth of nutritionism. She did not attempt to pick out from the complexity of the diet (either before or after the experiment) which one nutrient might explain the results-whether it was the low-fat diet, or the absence of refined carbohydrates, or the reduction in total calories that was responsible for the improvement in the group’s health. Her focus instead was on larger dietary patterns, and while this approach has its limitations (we can’t extract from such a study precisely which component of the Western diet we need to adjust in order to blunt its worst effects), it has the great virtue of escaping the welter of conflicting theories about specific nutrients and returning our attention to more fundamental questions about the links between diet and health.
Like this one: To what extent are we all Aborigines? When you consider that two thirds of Americans are overweight or obese, that fully a quarter of us have metabolic syndrome, that fifty-four million have prediabetes, and that the incidence of type 2 diabetes has risen 5 percent annually since 1990, going from 4 percent to 7.7 percent of the adult population (that’s more than twenty million Americans), the question is not nearly as silly as it sounds.
TWO - THE ELEPHANT IN THE ROOM
In the end, even the biggest, most ambitious, and most widely reported studies of diet and health-the Nurses’ Health Study, the Women’s Health Initiative, and nearly all the others-leave undisturbed the main features of the Western diet: lots of processed foods and meat, lots of added fat and sugar, lots of everything except fruits, vegetables, and whole grains. In keeping with the nutritionism paradigm and the limits of reductionist science, most nutrition researchers fiddle with single nutrients as best they can, but the populations they recruit and study are typical American eaters doing what typical American eaters do: trying to eat a little less of this nutrient, a little more of that one, depending on the latest thinking. But the overall dietary pattern is treated as a more or less unalterable given. Which is why it probably should not surprise us that the findings of such research should be so modest, equivocal, and confusing.
But what about the elephant in the room-this pattern of eating that we call the Western diet? In the midst of our deepening confusion about nutrition, it might be useful to step back and gaze upon it-review what we do know about the Western diet and its effects on our health. What we know is that people who eat the way we do in the West today suffer substantially higher rates of cancer, cardiovascular diseases, diabetes, and obesity than people eating any number of different traditional diets. We also know that when people come to the West and adopt our way of eating, these diseases soon follow, and often, as in the case of the Aborigines and other native populations, in a particularly virulent form.
The outlines of this story-the story of the so-called Western diseases and their link to the Western diet-we first learned in the early decades of the twentieth century. That was when a handful of dauntless European and American medical professionals working with a wide variety of native populations around the world began noticing the almost complete absence of the chronic diseases that had recently become commonplace in the West. Albert Schweitzer and Denis P. Burkitt in Africa, Robert McCarrison in India, Samuel Hutton among the Eskimos in Labrador, the anthropologist Aleš Hrdlička among Native Americans, and the dentist Weston A. Price among a dozen different groups all over the world (including Peruvian Indians, Australian Aborigines, and Swiss mountaineers) sent back much the same news. They compiled lists, many of which appeared in medical journals, of the common diseases they’d been hard pressed to find in the native populations they had treated or studied: little to no heart disease, diabetes, cancer, obesity, hypertension, or stroke; no appendicitis, diverticulitis, malformed dental arches, or tooth decay; no varicose veins, ulcers, or hemorrhoids. These disorders suddenly appeared to these researchers under a striking new light, as suggested by the name
given to them by the British doctor Denis Burkitt, who worked in Africa during World War II: He proposed that we call them Western diseases. The implication was that these very different sorts of diseases were somehow linked and might even have a common cause.
Several of these researchers were on hand to witness the arrival of the Western diseases in isolated populations, typically, as Albert Schweitzer wrote, among “natives living more and more after the manner of the whites.” Some noted that the Western diseases followed closely on the heels of the arrival of Western foods, particularly refined flour and sugar and other kinds of “store food.” They observed too that when one Western disease arrived on the scene, so did most of the others, and often in the same order: obesity followed by type 2 diabetes followed by hypertension and stroke followed by heart disease.
In the years before World War II the medical world entertained a lively conversation on the subject of the Western diseases and what their rise might say about our increasingly industrialized way of life. The concept’s pioneers believed there were novelties in the modern diet to which native populations were poorly adapted, though they did not necessarily agree on exactly which novelty might be the culprit. Burkitt, for example, believed it was the lack of fiber in the modern diet; McCarrison, a British army doctor, focused on refined carbohydrates; still others blamed meat eating and saturated fat or, in Price’s case, the advent of processed food and industrially grown crops deficient in vitamins and minerals.
Not everyone, though, bought into the idea that chronic disease was a by-product of Western lifestyles and, in particular, that the industrialization of our food was taking a toll on our health. One objection to the theory was genetic: Different races were apt to be susceptible to different diseases, went the argument; white people were disposed to heart attacks, brown people to things like leprosy. Yet as Burkitt and others pointed out, blacks living in America suffered from the same chronic diseases as whites living there. Simply by moving to places like America, immigrants from nations with low rates of chronic disease seemed to quickly acquire them.
The other objection to the concept of Western diseases, one you sometimes still hear, was demographic. The reason we see so much chronic disease in the West is because these are illnesses that appear relatively late in life, and with the conquest of infectious disease early in the twentieth century, we’re simply living long enough to get them. In this view, chronic disease is the inevitable price of a long life. But while it is true that our life expectancy has improved dramatically since 1900 (rising in the United States from forty-nine to seventy-seven years), most of that gain is attributed to the fact that more of us are surviving infancy and childhood; the life expectancy of a sixty-five-year-old in 1900 was only about six years less than that of a sixty-five-year-old living today.* When you adjust for age, rates of chronic diseases like cancer and type 2 diabetes are considerably higher today than they were in 1900. That is, the chances that a sixty- or seventy-year-old suffers from cancer or type 2 diabetes are far greater today than they were a century ago. (The same may well be true of heart disease, but because heart disease statistics from 1900 are so sketchy, we can’t say for sure.)
Cancer and heart disease and so many of the other Western diseases are by now such an accepted part of modern life that it’s hard for us to believe this wasn’t always or even necessarily the case. These days most of us think of chronic diseases as being a little like the weather-one of life’s givens-and so count ourselves lucky that, compared to the weather, the diseases at least are more amenable to intervention by modern medicine. We think of them strictly in medical rather than historical, much less evolutionary, terms. But during the decades before World War II, when the industrialization of so many aspects of our lives was still fairly fresh, the price of “progress,” especially to our health, seemed more obvious to many people and therefore more open to question.
One of the most intrepid questioners of the prewar period was Weston A. Price, a Canadian-born dentist, of all things, who became preoccupied with one of those glaring questions we can’t even see anymore. Much like heart disease, chronic problems of the teeth are by now part of the furniture of modern life. But if you stop to think about it, it is odd that everyone should need a dentist and that so many of us should need braces, root canals, extractions of wisdom teeth, and all the other routine procedures of modern mouth maintenance. Could the need for so much remedial work on a body part crucially involved in an activity as critical to our survival as eating reflect a design defect in the human body, some sort of oversight of natural selection? This seems unlikely. Weston Price, who was born in 1870 in a farming community south of Ottawa and built a dental practice in Cleveland, Ohio, had personally witnessed the rapid increase in dental problems beginning around the turn of the last century and was convinced that the cause could be found in the modern diet. (He wasn’t the only one: In the 1930s an argument raged in medical circles as to whether hygiene or nutrition was the key to understanding and treating tooth decay. A public debate on that very question in Manhattan in 1934 attracted an overflow audience of thousands. That hygiene ultimately won the day had as much to do with the needs of the dental profession as it did with good science; the problem of personal hygiene was easier, and far more profitable, to address than that of the diet and entire food system.)
In the 1930s, Price closed down his dental practice so he could devote all his energies to solving the mystery of the Western diet. He went looking for what he called control groups-isolated populations that had not yet been exposed to modern foods. He found them in the mountains of Switzerland and Peru, the lowlands of Africa, the bush of Australia, the outer islands of the Hebrides, the Everglades of Florida, the coast of Alaska, the islands of Melanesia and the Torres Strait, and the jungles of New Guinea and New Zealand, among other places. Price made some remarkable discoveries, which he wrote up in articles for medical journals (with titles like “New Light on Modern Physical Degeneration from Field Studies Among Primitive Races”) and ultimately summarized in his 510-page tome, Nutrition and Physical Degeneration, published in 1939.
Although his research was taken seriously during his lifetime, Weston Price has been all but written out of the history of twentieth-century science. The single best account I could find of his life and work is an unpublished master’s thesis by Martin Renner, a graduate student in history at UC Santa Cruz.* This neglect might owe to the fact that Price was a dentist, and more of an amateur scientist in the nineteenth-century mode than a professional medical researcher. It might also be because he could sometimes come across as a bit of a crackpot-one of his articles was titled “Dentistry and Race Destiny.” His discussions of “primitive races” are off-putting to say the least, though he ended up a harsh critic of “modern civilization,” convinced his primitives had more to teach us than the other way around. He was also something of a monomaniac on the subject of diet, certain that poor nutrition could explain not just tooth decay and heart disease but just about everything else that bedeviled humankind, including juvenile delinquency, the collapse of civilizations, and war.
Still, the data he painstakingly gathered from his control groups, and the lines of connection he was able to trace, not only between diet and health but also between the way a people produces food and that food’s nutritional quality, remain valuable today. Indeed, his research is even more valuable today than in 1939, because most of the groups he studied have long since vanished or adopted more Western ways of eating. If you want to study the Western diet today, control groups are few and far between. (You can of course create them, as Kerin O’Dea did in Australia.) Price’s work also points the way toward a protoecological understanding of food that will be useful as we try to escape the traps of nutritionism.