When Bread Became Bad for Us

Adding chemical nutrients doesn't cut it

Thanks for visiting The Bittman Project, a place where food is everything (or pretty close).

Welcome to the second installment of excerpts from Animal, Vegetable, Junk: A History of Food, from Sustainable to Suicidal, which came out last week. Here, I link vitamania, the fad that never ended, to why we’ve accepted bad bread for most of our lives. And if you missed it, also check out last week’s “why I wrote this,” as well as the first excerpt, on hamburgers. I'm curious to hear what you think. — Mark

You may remember Justus von Liebig, who defined the trinity of plant nutrition: nitrogen, potassium, and phosphorus. Since plants needed only three key nutrients, according to Liebig’s erroneous writings, it should follow that the laws of human metabolism would be analogous. All we are, he reasoned, is the sum of the plants we eat, and of the plants eaten by the animals we, in turn, eat. Recall the reductionist maxim: Everything can be understood as the sum of its constituent parts.

Since humans understood very little about nutrition at the time, not much stood in the way of Liebig’s confirmation bias. And so, in his mid-nineteenth-century Researches on the Chemistry of Food, and the Motion of the Juices in the Animal Body, he described the “dietetic trinity” of human nutrition: proteins, carbohydrates, and fats.

Just before 1900, German researchers began to measure the ideal amount and type of food humans need. Wilbur Olin Atwater, a professor at Wesleyan who’d spent time in Germany, was inspired to develop a kind of respiration chamber called a room calorimeter. These were sealed, well-appointed rooms in which subjects would live for days, performing whatever activities one would consider normal in a twenty-eight-square-foot room. Energy expended was measured by heat exchange, and, through this, Atwater was able to determine the caloric values of thousands of foods. He also pioneered the understanding that the human body metabolized calories of macronutrients — fat, carbohydrate, or protein — differently, and that all three were needed for nutritional balance.

But the calorie reduced food to a measure of heat, and the tautological statement, “A calorie is a calorie” — meaning all food was of essentially the same quality — gained authority. Thermodynamically speaking, it’s true. A calorie is defined as the amount of energy required to increase the temperature of a gram of water one degree centigrade, no matter where or in what it’s found. (That amount of energy is 4.2 joules.) But nutritionally, it’s a harmful, cynical, and worse-than-idiotic argument, fodder for nefarious propaganda from producers of nutritionally bereft foods. Liebig’s and Atwater’s findings reduced the complexity of food to some components of nutrition, and made the latter more important than the former, a fundamental mistake we live with today. Just as soil needs more than potassium, phosphorus, and nitrogen, human nutrition is more complex than the calories contained in proteins, fats, and carbohydrates, along with whatever micronutrients we’ve managed to analyze.

The simplistic approach, however, dominated. Nutritionists and others everywhere began addressing the symptoms of malnourishment rather than the causes. 

For example, although eating brown rice prevents beriberi, white rice had a better shelf life and could stave off beriberi as long as it was dosed with synthetic thiamine. Yet brown rice is more than white rice plus thiamine, and by failing to recognize this complexity (and even mystery), producers have degraded our diets in ways nutritionists don’t fully understand. 

That argument has not been a match for expediency; it was easier and more profitable to address deficiencies by adding micronutrients in chemical form. Furthermore, vitamins provided a novel selling point to deliver peace of mind: Think, for example, of the staying power of orange juice’s vitamin-C-laced reputation. People called this new marketing obsession “vitamania,” and it became a fad that never ended. In 1942, the vitamin market was worth two hundred million dollars a year; today it’s thirty billion dollars.

In the early twentieth century, vitamania changed no traditional food more radically than bread. Until then, white flour had been difficult to produce. It was associated with wealth and believed to be more pure than ground whole wheat, though whole wheat is in fact the purer product. (In America, this association took on racial overtones: white loaves were “chaste,” and dark ones “defiled.”) But without added nutrients, white flour offers little besides calories.

“Grains” are actually a subset of fruits. As a category, they include a long list of some of the world’s most widely consumed foods: rice, quinoa, fonio, and more. (Corn is included, although it’s technically a vegetable. None of this taxonomy is as precise as Linnaeus, the eighteenth-century Swede who developed modern taxonomy, believed.) Whole grains have been providing the bulk of calories for humans since just after the birth of agriculture; they’ve been the most reliable source of nutrition for entire civilizations. And near the top of the list of civilization-sustaining grains is wheat.

The bran of whole wheat — the tough outer layer — contains the fiber that few of us get enough of, as well as some B vitamins and minerals. The germ, the most nutrient-rich part of the kernel, has the greatest share of vitamin E, folic acid, phosphorus, zinc, magnesium, and thiamine. Take those away and you’re left with the starchy endosperm, which makes up the bulk of the grain and contains most of its carbohydrates. It’s a good source of calories, but it’s not a whole, nourishing food.

In addition to nutrients, though, the bran and germ contain oil, and that oil can go rancid with time, ruining the flour. Get rid of those pesky elements and the resulting white flour keeps more or less forever. So wheat producers were faced with a choice: local production, which required quick sale and consumption of a higher-quality ingredient, or mass production, with a long shelf life and a nutritionally inferior product. For large producers, the choice was an easy one.

Separating bran and germ from endosperm had always been challenging. Even simple milling requires that a dried wheat berry, sometimes almost as hard as a pebble, be so finely ground as to make it palatable when mixed with water. “Stone-ground” means just that: pulverizing grains between two stones, traditionally moved and turned by humans, animals, water, or wind.

Historically, discarding bran and germ took more steps, more time, and more work; it was also wasteful. Hence the limited availability of white flour and, again, the tendency for milled grains to be sold locally, before the flour could spoil. 

With the mid-nineteenth-century invention of the steel roller mill in Budapest, powered by steam and later electricity, the process of producing white flour sped up immeasurably. In the United States especially, the sheer amount of flour produced, along with the great distances it traveled to market, meant that never-spoiling white flour quickly became the norm. Industrially produced white bread followed soon thereafter.

Until a hundred years ago, the major problem with commercially made bread was that buyers didn’t know what was in it. Flour was often extended with filling but non-nutritious ingredients, and sometimes dangerous ones like chopped leaves and straw, sand, plaster of Paris, and who knows what else. (In the nineteenth century, sawdust was sometimes dubbed “tree flour.”) When white flour became widely available, the industry followed Heinz’s model, using public fear to its advantage and selling “purity,” as if whiteness were a guarantee. And then white flour became even whiter, through bleaching with chlorine, benzoyl peroxide (now an ingredient in acne medication), calcium peroxide, and chlorine dioxide. All of these are still allowed in U.S. food processing, though none are allowed in the EU.

With new machinery, super-duper yeast, and a host of accelerants, a loaf of bread could be produced way faster than ever before. To protect it from tampering or contamination, the new factory-made white bread was sealed in “hygienic” waxed paper. Because wrapping machines were expensive, smaller local bakeries struggling to keep up with the trends were driven out of business.

And since this packaging also eliminated the ability to see or smell the bread, brands promoted the notion that softer was fresher and therefore better.

100 Percent Whole-Wheat Bread

Makes: 1 loaf
Time: 14 to 28 hours, almost completely unattended

What this loaf lacks in airiness it makes up for with its intensely nutty flavor. Slices are perfect for topping with pungent cheeses, jams, or any spread, and the dough can accommodate all sorts of additional ingredients. You can also substitute rye, cornmeal, oat, or other whole grain flour for up to 1 cup of the wheat flour.


  • 3 cups whole wheat flour

  • 2 teaspoons salt

  • 1/2 teaspoon instant yeast

  • Good-quality vegetable oil for greasing and brushing


1. Whisk the flour, salt, and yeast together in a large bowl. Add 1 1/2 cups water and stir until blended; the dough should be very wet, almost like a batter. Add more water if it’s too thick. Cover the bowl with plastic wrap and let it rest in a warm place for about 12 hours, or in a cooler place (even the fridge) for up to 24 hours. The dough is ready when its surface is dotted with bubbles. Rising time will be shorter at warmer temperatures, a bit longer if your kitchen is chilly.

2. Use a little oil to grease a 9x5-inch loaf pan. Scoop the dough into the loaf pan and use a rubber spatula to gently spread it evenly. Brush or drizzle the top with a little more oil. Cover with a towel and let rise until doubled in size, an hour or 2, depending on the warmth of your kitchen. (It won’t reach the top of the pan, or will just barely.) When it’s almost ready, heat the oven to 350 degrees.

3. Bake until the bottom of the loaf sounds hollow when you tap it or the internal temperature is about 200 degrees on an instant-read thermometer, about 45 minutes. Remove from the pan and cool on a rack before slicing.

— Recipe from How to Cook Everything Vegetarian


Injera

Makes: About 6 large rounds
Time: About a day, mostly unattended

In Ethiopian cuisine, this spongy, sour bread is used to sop up all sorts of fragrant, saucy stews. The main ingredient is teff flour, which is ground from a tiny ancient grain (and just so happens to be gluten-free). It’s mixed with water and fermented overnight (or longer) to produce a distinctly tangy batter that you cook in a skillet much like a pancake.


  • 2 cups teff flour

  • 3/4 teaspoon salt

  • Neutral oil (like grapeseed or corn) for coating the pan, if necessary


1. Put the flour in a large bowl and whisk in 2 1/2 cups water until smooth. Cover with plastic or a kitchen towel and let the batter sit at room temperature at least overnight, but ideally 24 hours (the longer the batter ferments, the more it develops its trademark sourness).

2. After the batter has fermented, gently stir in the salt. Put a large pan over medium-high heat. If it’s nonstick, you don’t need any oil; if not, drizzle a little oil into the pan and spread it around with a crumpled paper towel (this helps cover the entire surface of the pan and soak up excess oil).

3. Ladle about 3/4 cup of the batter into the pan and swirl it around to coat the bottom. Cook, undisturbed, until bubbles appear on the surface (like a pancake), just a minute or two. Cover the pan and continue to cook until the top of the injera is dried out and slightly glossy, the edges begin to curl, and the middle is cooked through, another minute or 2. Invert the pan so the injera falls onto a platter or cutting board (use a rubber spatula to help it out if necessary). Repeat with the remaining batter. Serve warm or at room temperature.

— Recipe from How to Bake Everything