Leave aside for now the virtues, if any, of a low-meat and/or low-fat diet, questions to which I will return, and focus for a moment on language. For with these subtle changes in wording a whole way of thinking about food and health underwent a momentous shift. First, notice that the stark message to “eat less” of a particular food (in this case meat) had been deep-sixed; don’t look for it ever again in any official U.S. government dietary pronouncement. Say what you will about this or that food, you are not allowed officially to tell people to eat less of it or the industry in question will have you for lunch. But there is a path around this immovable obstacle, and it was McGovern’s staffers who blazed it: Speak no more of foods, only nutrients. Notice how in the revised guidelines, distinctions between entities as different as beef and chicken and fish have collapsed. These three venerable foods, each representing not just a different species but an entirely different taxonomic class, are now lumped together as mere delivery systems for a single nutrient. Notice too how the new language exonerates the foods themselves. Now the culprit is an obscure, invisible, tasteless (and politically unconnected) substance that may or may not lurk in them called saturated fat.
The linguistic capitulation did nothing to rescue McGovern from his blunder. In the very next election, in 1980, the beef lobby succeeded in rusticating the three-term senator, sending an unmistakable warning to anyone who would challenge the American diet, and in particular the big chunk of animal protein squatting in the middle of its plate. Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, but would instead arrive dressed in scientific euphemism and speaking of nutrients, entities that few Americans (including, as we would find out, American nutrition scientists) really understood but that, with the notable exception of sucrose, lack powerful lobbies in Washington.*
The lesson of the McGovern fiasco was quickly absorbed by all who would pronounce on the American diet. When a few years later the National Academy of Sciences looked into the question of diet and cancer, it was careful to frame its recommendations nutrient by nutrient rather than food by food, to avoid offending any powerful interests. We now know the academy’s panel of thirteen scientists adopted this approach over the objections of at least two of its members who argued that most of the available science pointed toward conclusions about foods, not nutrients. According to T. Colin Campbell, a Cornell nutritional biochemist who served on the panel, all of the human population studies linking dietary fat to cancer actually showed that the groups with higher cancer rates consumed not just more fats, but also more animal foods and fewer plant foods. “This meant that these cancers could just as easily be caused by animal protein, dietary cholesterol, something else exclusively found in animal-based foods, or a lack of plant-based foods,” Campbell wrote years later. The argument fell on deaf ears.
In the case of the “good foods” too, nutrients also carried the day: The language of the final report highlighted the benefits of the antioxidants in vegetables rather than the vegetables themselves. Joan Gussow, a Columbia University nutritionist who served on the panel, argued against the focus on nutrients rather than whole foods. “The really important message in the epidemiology, which is all we had to go on, was that some vegetables and citrus fruits seemed to be protective against cancer. But those sections of the report were written as though it was the vitamin C in the citrus or the beta-carotene in the vegetables that was responsible for the effect. I kept changing the language to talk about ‘foods that contain vitamin C’ and ‘foods that contain carotenes.’ Because how do you know it’s not one of the other things in the carrots or the broccoli? There are hundreds of carotenes. But the biochemists had their answer: ‘You can’t do a trial on broccoli.’”
So the nutrients won out over the foods. The panel’s resort to scientific reductionism had the considerable virtue of being both politically expedient (in the case of meat and dairy) and, to these scientific heirs of Justus von Liebig, intellectually sympathetic. With each of its chapters focused on a single nutrient, the final draft of the National Academy of Sciences report, Diet, Nutrition and Cancer, framed its recommendations in terms of saturated fats and antioxidants rather than beef and broccoli.
In doing so, the 1982 National Academy of Sciences report helped codify the official new dietary language, the one we all still speak. Industry and media soon followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids, flavonols, carotenoids, antioxidants, probiotics, and phytochemicals soon colonized much of the cultural space previously occupied by the tangible material formerly known as food.
The Age of Nutritionism had arrived.
TWO - NUTRITIONISM DEFINED
The term isn’t mine. It was coined by an Australian sociologist of science by the name of Gyorgy Scrinis, and as near as I can determine first appeared in a 2002 essay titled “Sorry Marge” published in an Australian quarterly called Meanjin. “Sorry Marge” looked at margarine as the ultimate nutritionist product, able to shift its identity (no cholesterol! one year, no trans fats! the next) depending on the prevailing winds of dietary opinion. But Scrinis had bigger game in his sights than spreadable vegetable oil. He suggested that we look past the various nutritional claims swirling around margarine and butter and consider the underlying message of the debate itself: “namely, that we should understand and engage with food and our bodies in terms of their nutritional and chemical constituents and requirements, the assumption being that this is all we need to understand.” This reductionist way of thinking about food had been pointed out and criticized before (notably by the Canadian historian Harvey Levenstein, the British nutritionist Geoffrey Cannon, and the American nutritionists Joan Gussow and Marion Nestle), but it had never before been given a proper name: “nutritionism.” Proper names have a way of making visible things we don’t easily see or simply take for granted.
The first thing to understand about nutritionism is that it is not the same thing as nutrition. As the “-ism” suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it’s still exerting its hold on your culture. A reigning ideology is a little like the weather: all-pervasive and so virtually impossible to escape. Still, we can try.
In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. Put another way: Foods are essentially the sum of their nutrient parts. From this basic premise flow several others.
Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists reach the public) to explain the hidden reality of foods to us. In form this is a quasireligious idea, suggesting the visible world is not the one that really matters, which implies the need for a priesthood. For to enter a world where your dietary salvation depends on unseen nutrients, you need plenty of expert help.
But expert help to do what exactly? This brings us to another unexamined assumption of nutritionism: that the whole point of eating is to maintain and promote bodily health. Hippocrates’ famous injunction to “let food be thy medicine” is ritually invoked to support this notion. I’ll leave the premise alone for now, except to point out that it is not shared by all cultures and, further, that the experience of these other cultures suggests that, paradoxically, regarding food as being about things other than bodily health (like pleasure, say, or sociality or identity) makes people no less healthy; indeed, there’s some reason to believe it may make them more healthy. This is what we usually have in mind when we speak of the French paradox. So there is at least a question as to whether the ideology of nutritionism is actually any good for you.
It follows from the premise that food is foremost about promoting physical health that the nutrients in food should be divided into the healthy ones and the unhealthy ones-good nutrients and bad. This has been a hallmark of nutritionist thinking from the days of Liebig, for whom it wasn’t enough to identify the nutrients; he also had to pick favorites, and nutritionists have been doing so ever since. Liebig claimed that protein was the “master nutrient” in animal nutrition, because he believed it drove growth. Indeed, he likened the role of protein in animals to that of nitrogen in plants: Protein (which contains nitrogen) comprised the essential human fertilizer. Liebig’s elevation of protein dominated nutritionist thinking for decades as public health authorities worked to expand access to and production of the master nutrient (especially in the form of animal protein), with the goal of growing bigger, and therefore (it was assumed) healthier, people. (A high priority for Western governments fighting imperial wars.) To a considerable extent we still have a food system organized around the promotion of protein as the master nutrient. It has given us, among other things, vast amounts of cheap meat and milk, which have in turn given us much, much bigger people. Whether they are healthier too is another question.
It seems to be a rule of nutritionism that for every good nutrient, there must be a bad nutrient to serve as its foil, the latter a focus for our food fears and the former for our enthusiasms. A backlash against protein arose in America at the turn of the last century as diet gurus like John Harvey Kellogg and Horace Fletcher (about whom more later) railed against the deleterious effects of protein on digestion (it supposedly led to the proliferation of toxic bacteria in the gut) and promoted the cleaner, more wholesome carbohydrate in its place. The legacy of that revaluation is the breakfast cereal, the strategic objective of which was to dethrone animal protein at the morning meal.
Ever since, the history of modern nutritionism has been a history of macronutrients at war: protein against carbs; carbs against proteins, and then fats; fats against carbs. Beginning with Liebig, in each age nutritionism has organized most of its energies around an imperial nutrient: protein in the nineteenth century, fat in the twentieth, and, it stands to reason, carbohydrates will occupy our attention in the twenty-first. Meanwhile, in the shadow of these titanic struggles, smaller civil wars have raged within the sprawling empires of the big three: refined carbohydrates versus fiber; animal protein versus plant protein; saturated fats versus polyunsaturated fats; and then, deep down within the province of the polyunsaturates, omega-3 fatty acids versus omega-6s. Like so many ideologies, nutritionism at bottom hinges on a form of dualism, so that at all times there must be an evil nutrient for adherents to excoriate and a savior nutrient for them to sanctify. At the moment, trans fats are performing admirably in the former role, omega-3 fatty acids in the latter. It goes without saying that such a Manichaean view of nutrition is bound to promote food fads and phobias and large abrupt swings of the nutritional pendulum.
Another potentially serious weakness of nutritionist ideology is that, focused so relentlessly as it is on the nutrients it can measure, it has trouble discerning qualitative distinctions among foods. So fish, beef, and chicken through the nutritionist’s lens become mere delivery systems for varying quantities of different fats and proteins and whatever other nutrients happen to be on their scope. Milk through this lens is reduced to a suspension of protein, lactose, fats, and calcium in water, when it is entirely possible that the benefits, or for that matter the hazards, of drinking milk owe to entirely other factors (growth hormones?) or relationships between factors (fat-soluble vitamins and saturated fat?) that have been overlooked. Milk remains a food of humbling complexity, to judge by the long, sorry saga of efforts to simulate it. The entire history of baby formula has been the history of one overlooked nutrient after another: Liebig missed the vitamins and amino acids, and his successors missed the omega-3s, and still to this day babies fed on the most “nutritionally complete” formula fail to do as well as babies fed human milk. Even more than margarine, infant formula stands as the ultimate test product of nutritionism and a fair index of its hubris.
This brings us to one of the most troubling features of nutritionism, though it is a feature certainly not troubling to all. When the emphasis is on quantifying the nutrients contained in foods (or, to be precise, the recognized nutrients in foods), any qualitative distinction between whole foods and processed foods is apt to disappear. “[If] foods are understood only in terms of the various quantities of nutrients they contain,” Gyorgy Scrinis wrote, then “even processed foods may be considered to be ‘healthier’ for you than whole foods if they contain the appropriate quantities of some nutrients.”
How convenient.
THREE - NUTRITIONISM COMES TO MARKET
No idea could be more sympathetic to manufacturers of processed foods, which surely explains why they have been so happy to jump on the nutritionism bandwagon. Indeed, nutritionism supplies the ultimate justification for processing food by implying that with a judicious application of food science, fake foods can be made even more nutritious than the real thing. This of course is the story of margarine, the first important synthetic food to slip into our diet. Margarine started out in the nineteenth century as a cheap and inferior substitute for butter, but with the emergence of the lipid hypothesis in the 1950s, manufacturers quickly figured out that their product, with some tinkering, could be marketed as better (smarter!) than butter: butter with the bad nutrients removed (cholesterol and saturated fats) and replaced with good nutrients (polyunsaturated fats and then vitamins). Every time margarine was found wanting, the wanted nutrient could simply be added (Vitamin D? Got it now. Vitamin A? Sure, no problem). But of course margarine, being the product not of nature but of human ingenuity, could never be any smarter than the nutritionists dictating its recipe, and the nutritionists turned out to be not nearly as smart as they thought. The food scientists’ ingenious method for making healthy vegetable oil solid at room temperature, by blasting it with hydrogen, turned out to produce unhealthy trans fats, fats that we now know are more dangerous than the saturated fats they were designed to replace. Yet the beauty of a processed food like margarine is that it can be endlessly reengineered to overcome even the most embarrassing about-face in nutritional thinking, including the real wincer that its main ingredient might cause heart attacks and cancer. So now the trans fats are gone, and margarine marches on, unfazed and apparently unkillable. Too bad the same cannot be said of an unknown number of margarine eaters.