This post was originally published on this site

http://chriskresser.com/

Kids need to eat meat and healthy, whole foods like this apple.

Because of the prevailing idea in our culture that vegetarian and vegan diets are healthy, more and more kids are being raised from birth (or even conception!) without meat on their plates. If you’re considering a plant-based diet for your family, read on. Here’s my take on why kids need to eat meat to grow into healthy adults.

Is a Plant-Based Diet Safe for Your Children?

Both the Academy of Nutrition and Dietetics (AND) and the USDA have stated that vegetarian and vegan diets are safe during pregnancy, but critical analyses by several researchers have questioned whether these recommendations are based on sufficient evidence.

One review remarked that “the evidence on vegan–vegetarian diets in pregnancy is heterogeneous and scant,” suggesting that more research is needed to answer the question of whether they are, in fact, safe during pregnancy. (1)

On meat-free diets for children, another review stated that “the existing data do not allow us to draw firm conclusions on health benefits or risks of present-day vegetarian type diets on the nutritional or health status of children and adolescents.” (2)

The limited studies on meat-free diets in children have severe shortcomings, including:

  • Comparing meat-free diets to the Standard American Diet (SAD), which is far from healthy
  • Healthy-user bias (those who choose meat-free diets are also more likely to engage in “healthier” behaviors like exercising regularly, not smoking, etc.)
  • Upper-class bias (higher-income families are more likely to choose meat-free diets and, in general, have better health)
  • Small sample sizes
  • Inaccuracies in diet reporting, which is a huge problem in nutrition research

Despite these shortcomings, vegetarian and vegan diets for children carry significant risks of nutrient deficiencies that can have dire health consequences. (3, 4, 5)

Your Kids Need Nutrient-Dense Foods to Thrive

I know—it can be a struggle to get your kids to eat the “right” foods. With the processed junk that passes as “kids’ food” today, it’s even more difficult to ensure that your child eats a healthy diet. Research shows that the more restrictive a child’s diet, the greater the risk for nutrient deficiencies. (6) When meat, one of the most nutrient-dense foods available, isn’t part of a child’s diet, these risks are even greater. Meat-free diets, which favor whole grains and legumes over animal products, are unavoidably lower in nutrient density, making it harder for kids to get adequate nutrition. (7)

B12 Deficiency

The prevalence of B12 deficiency is 67 percent in American children, 50 percent in New Zealand children, and 85 percent in Norwegian infants who have followed vegetarian or vegan diets their entire lives. (8)

Studies have shown that kids raised until age 6 or 7 on a vegan diet are still B12 deficient years after adding at least some animal products to their diet. (9) These children demonstrated persistent cognitive deficits 5 to 10 years after switching to a lactovegetarian or omnivorous diet.

One study found an association between B12 status and measures of intelligence and memory, with formerly vegan kids scoring lower than omnivorous kids. (10) Devastating case studies have reported B12 deficiencies in young vegan children that led to neurological damage and developmental delays. (11, 12, 13)

Vitamin D Deficiency

Vitamin D is virtually absent from vegan diets and often severely lacking in vegetarian diets. Along with calcium, magnesium, and vitamin K2, vitamin D is critical for proper bone growth and remodeling, especially during infancy and childhood. During the first year of life, human bone mass nearly doubles. (14) Massive bone growth continues from birth to adulthood, during which a child grows from approximately 20 inches to 60 or 70 inches or more. Even with adequate calcium intake, bone turnover markers are lower in vegetarian children than in omnivores, increasing the risk of impaired bone growth and lower peak bone mass during adolescence. (15, 16)

Hypothyroidism

Low nutrient intake extends beyond vitamins B12 and D. Case studies have also attributed hypothyroidism in young children to a maternal and/or childhood vegan diet, possibly due to insufficient iodine. (17, 18)

DHA and EPA

Compared to breast milk from omnivorous mothers, breast milk from vegan mothers had lower levels of DHA and EPA, which are vital for brain development, especially in the first year of life, when a baby’s brain literally doubles in size. (19)

Iron Deficiency

Iron deficiency is already the most common nutritional deficiency in children. Because meat-free diets require at least 1.8 times the iron intake, due to the lower availability of iron in plant foods, iron deficiency is more common in vegetarian and vegan children than in omnivores. (20, 21, 22, 23) Children on vegetarian and vegan diets can also have lower intakes of vitamin A and zinc. (24, 25)
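
The 1.8x multiplier mentioned above is essentially absorption arithmetic. Here is a rough sketch in Python, with illustrative absorption rates of about 18 percent for a mixed diet and 10 percent for plant-only iron; these rates and the absorption target are assumptions for illustration, not figures from the studies cited:

```python
# Rough sketch of the arithmetic behind the 1.8x iron recommendation for
# meat-free diets. The absorption rates below are illustrative assumptions:
# iron from a mixed diet is absorbed at roughly 18 percent, iron from plant
# foods alone at roughly 10 percent.

def required_intake_mg(absorbed_target_mg: float, absorption_rate: float) -> float:
    """Dietary iron needed to absorb a target amount at a given absorption rate."""
    return absorbed_target_mg / absorption_rate

MIXED_DIET_ABSORPTION = 0.18  # assumed absorption on a mixed diet
PLANT_DIET_ABSORPTION = 0.10  # assumed absorption on a plant-only diet

# Suppose a child needs to absorb about 1.26 mg of iron per day (illustrative).
target_mg = 1.26
mixed_intake = required_intake_mg(target_mg, MIXED_DIET_ABSORPTION)
plant_intake = required_intake_mg(target_mg, PLANT_DIET_ABSORPTION)

print(f"Mixed diet: {mixed_intake:.1f} mg/day; "
      f"plant-only: {plant_intake:.1f} mg/day "
      f"({plant_intake / mixed_intake:.1f}x more)")
```

The exact target varies by age and sex, but the ratio between the two assumed absorption rates is what produces the 1.8x figure.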

Your Kids Need to Eat Meat—Not Supplements

Can you just give your meat-free kid a multivitamin and call it a day? Unfortunately, probably not. Research indicates that regular supplementation with iron, zinc, and B12 does not mitigate all of the serious developmental risks in fetuses and children. (26) Nutrient-dense whole foods, like meat, are the best sources for vitamins and minerals.

Childhood is a critical time for proper nutrition. Kids can be notoriously “picky eaters,” so we should make sure each bite counts by providing the nutrients they need to thrive. In children, the stakes are too high to risk missing key nutrients by skipping meat entirely.

Now I’d like to hear from you. Are you considering a vegetarian or vegan diet for your child, or do you believe that kids need to eat meat? Let me know below in the comment section!

The post Do Your Kids Need to Eat Meat to Thrive? appeared first on Chris Kresser.

An optimal human diet includes a mix of nutrient-rich plant- and animal-based foods, similar to what this member of the Maasai would eat.

Eggs are bad for you. Wait, eggs are good for you! Fat is bad. Wait, fat is good and carbs are bad! Skipping breakfast causes weight gain. Wait, skipping breakfast (intermittent fasting) is great for weight loss and metabolic health.

It’s enough to make you crazy, right? These are just a few of the many contradictory nutrition claims that have been made in the media over the past decade, and it’s no wonder people are more confused about what to eat than ever before.

Everyone has an opinion on the optimal human diet—from your personal trainer to your UPS driver, from your nutritionist to your doctor—and they’re all convinced they’re right. Even the “experts” disagree. And they can all point to at least some studies that support their view. On the surface, at least, all of these studies seem credible, since they’re published in peer-reviewed journals and come out of respected institutions like the Harvard School of Public Health.

This has led to massive confusion among both the general public and health professionals, a proliferation of diet books and fad approaches, and a (justifiably) growing mistrust in public health recommendations and media reporting on nutrition.

Unfortunately, millions of dollars and decades of scientific research haven’t added clarity—if anything, they have further muddied the waters. Why? Because, as you’ll learn below, we’ve been asking the wrong questions, and we’re using the wrong methods.

If you’re confused about what to eat and frustrated by the contradictory headlines that are constantly popping up in your news feed, you’re not alone. The current state of nutritional research, and how the media reports on it, virtually guarantees confusion.

In this article, my goal is to step back and look at the question of what we should eat through a variety of lenses, including archaeology, anthropology, evolutionary biology, anatomy and physiology, and biochemistry—rather than relying exclusively on observational nutrition research, which, as I’ll explain below, is highly problematic (and that’s saying it nicely).

Armed with this information, you’ll be able to make more informed choices about what you eat and what you feed your family members.

Let’s start with the question that is on everyone’s mind …

What Is the Optimal Human Diet?

Drumroll, please!

There isn’t one.

Note the emphasis on “one.”

There is no way to answer the question “What is the optimal human diet?” because there is no single, optimal diet for every human.

When I explain this to people, they immediately understand. It makes sense to them that we shouldn’t all be following the exact same diet.

Yet that is exactly what public health recommendations and dietary guidelines assume, and I would argue that this fallacy is both the greatest source of confusion and the most significant obstacle to answering our key questions about nutrition.

Why? Because although human beings share a lot in common, we’re also different in many ways: we have different genes, gene expression, health status, activity levels, life circumstances, and goals.

Imagine two different people:

  • A 55-year-old, sedentary male office worker who is 60 pounds overweight and has pre-diabetes and hypertension
  • A 23-year-old female Olympic athlete who is in fantastic health, trains three hours a day, and is attempting to build muscle for a competition

Should they eat exactly the same diet? Of course not.

Our Differences Matter When It Comes to Diet

Although that may be an extreme example, it’s no less true that what works for a young, single male CrossFit enthusiast who gets plenty of sleep and isn’t under much stress won’t work for a mother of three who also works outside the home and is burning the candle at both ends.

These differences—in our genes, behavior, lifestyle, gut microbiome, etc.—influence how we process both macronutrients (protein, carbohydrates, and fat) and micronutrients (vitamins, minerals, and trace minerals), which in turn determine our response to various foods and dietary approaches. For example:

  • People with lactase persistence—a genetic adaptation that allows them to digest lactose, the sugar in milk, into adulthood—are likely to respond better to dairy products than people who don’t have this adaptation.
  • Populations with historically high starch intake produce more salivary amylase than populations with low starch intake. (1)
  • Changes to gut microbiota can help with the assimilation of certain nutrients. Studies of Japanese people have found, for example, that their gut bacteria produce specific enzymes that help them break down seaweed, which can be otherwise difficult for humans to digest. (2)
  • Organ meats and shellfish are extremely nutrient dense and a great choice for most people—but not for someone with hemochromatosis, a genetic disorder that leads to aggressive iron storage, since these foods are so rich in iron.
  • Large, well-controlled studies (involving up to 350,000 participants) have found that, on average, higher intakes of saturated fat are not associated with a higher risk of heart disease. (3) But is this true for people with certain genes that make them “hyper-absorbers” of saturated fat and lead to a significant increase in LDL particle number (a marker associated with greater risk of cardiovascular disease)?

This is just a partial list, but it’s enough to make the key point: there are important differences that determine what an optimal diet is for each of us, yet those differences are rarely explored in nutrition studies. Research on diet focuses almost exclusively on top-down, population-level recommendations, and since a given dietary approach will yield variable results in different people, this keeps us stuck in confusion and controversy.

It has also kept us stuck in what Gyorgy Scrinis has called “the ideology of nutritionism,” which he defines as follows: (4)

Nutritionism is the reductive approach of understanding food only in terms of nutrients, food components, or biomarkers—like saturated fats, calories, glycemic index—abstracted out of the context of foods, diets, and bodily processes.

In other words, it is a focus on quantity, not quality.

Nutrition research has assumed that a carbohydrate is a carbohydrate, a fat is a fat, and a protein is a protein, no matter what type of food they come packaged in. If one person eats 50 percent of calories from fat in the form of donuts, pizza, candy, and fast food and another person eats 50 percent of calories from fat in the form of whole foods like meat, fish, avocados, nuts, and seeds, they will still be lumped together in the same “50 percent of calories from fat” group in most studies.

Most people are shocked to learn that this is how nutrition research works. It doesn’t take a trained scientist to understand why this would be problematic.

And yet, although there are some signs that the tide is turning (which I’ll discuss more below), the vast majority of epidemiological studies that have served as the basis for public health recommendations and dietary guidelines are plagued by this focus on quantity over quality.

But Aren’t There Some Foods That Are Better for All Humans to Eat (And Not Eat)?

I just finished explaining why there’s no “one-size-fits-all” approach to diet, but that doesn’t mean there aren’t core nutrition principles that apply to everyone.

For example, I think we can all agree that a steady diet of donuts, chips, candy, soda, and other highly processed and refined foods is unhealthy. And most people would agree that a diet based on whole, unprocessed foods is healthy.

It’s the middle ground where we get into trouble. Is meat good or bad? If it’s bad, does that apply to all meats, or just processed meat or red meat? What about saturated fat? Should humans consume dairy products?

A better question than “What is the optimal human diet?” then, might be “What is a natural human diet?” or, more specifically, “What is the range of foods that human beings are biochemically, physiologically, and genetically adapted to eat?”

In theory, there are two ways to answer this question:

  1. We can look at evolutionary biology, archaeology, medical anthropology, and comparative anatomy and physiology to determine what a natural human diet is.
  2. We can look at it from a biochemical perspective: what essential and nonessential nutrients contribute to human health (and where are they found in foods), how various functional components of food influence our body at the cellular and molecular level, and how certain compounds in foods—especially those prevalent in the modern, industrialized diet—damage our health via inflammation, disruption of the gut microbiome, hormone imbalance, and other mechanisms.

Let’s take a closer look through each of these lenses.

The Evolutionary Perspective

Human beings, like all other organisms in nature, evolved in a particular environment, and that evolutionary process dictated our biology and physiology as well as our nutritional needs.

Archaeological Evidence for Meat Consumption

Isotope analysis from archaeological studies suggests that our hominid ancestors have been eating meat for at least 2.5 million years. (5) There is also wide agreement that going even further back in time, our primate ancestors likely ate a diet similar to modern chimps, which we now know eat vertebrates. (6) The fact that chimpanzees and other primates evolved complex behavior like using tools and hunting in packs indicates the importance of animal foods in their diet—and ours.

Anatomical Evidence for Meat Consumption

The structure and function of the digestive tract of all animals can tell us a lot about their diet, and the same is true for humans. The greatest portion (45 percent) of the total gut volume of our primate relatives is the large intestine, which is good for breaking down fiber, seeds, and other hard-to-digest plant foods. In humans, the greatest portion of our gut volume (56 percent) is the small intestine, which suggests we’re adapted to eating more bioavailable and energy-dense foods, like meat and cooked starches, that are easier to digest.

Some advocates of plant-based diets have argued that humans are herbivores because of our blunt nails, small mouth opening, flat incisors and molars, and relatively dull canine teeth—all of which are characteristics of herbivorous animals. But this argument ignores the fact that we evolved complex methods of procuring and processing food, from hunting to cooking to using sharp tools to rip and tear flesh. These methods/tools take the place of anatomical features that serve the same function.

Humans have relatively large brains and small guts compared to our primate relatives. Most researchers believe that consuming meat and fish is what led to our larger brains and smaller guts compared to other primates because animal foods are more energy dense and easier to digest than plant foods. (7)

Genetic Changes Suggestive of Adaptation to Animal Foods

Most mammals stop producing lactase, the enzyme that breaks down lactose, the sugar in milk, after they’re weaned. But in about one-third of humans worldwide, lactase production persists into adulthood. This allows those humans to obtain nutrients and calories from dairy products without becoming ill. If we were truly herbivores that aren’t supposed to eat animal foods at all, we would not have developed this genetic adaptation.

Studies of Contemporary Hunter–Gatherers

Studies of contemporary hunter–gatherer populations like the Maasai, Inuit, Kitavans, Tukisenta, !Kung, Aché, Tsimané, and Hadza have shown that, without exception, they consume both animal and plant foods, and they go to great lengths to obtain plant or animal foods when they’re in short supply.

For example, in one analysis of field studies of 229 hunter–gatherer groups, researchers found that animal food provided the dominant source of calories (68 percent) compared to gathered plant foods (32 percent). (8) Only 14 percent of these societies got more than 50 percent of their calories from plant foods.

Another report on 13 field studies of the last remaining hunter–gatherer tribes carried out in the early and mid-20th century found similar results: animal food comprised 65 percent of total calories on average, compared with 35 percent from plant foods. (9)

The macronutrient ratios and the proportion of animal vs. plant foods consumed vary widely, but an ancestral population following a completely vegetarian or vegan diet has never been discovered.

The Lifespan of Our Paleolithic Ancestors

Critics of Paleo or ancestral diets often claim that they are irrelevant because our Paleolithic ancestors all died at a young age. This common myth has been debunked by anthropologists. (10) While average lifespan among hunter–gatherers is, and was, lower than ours today, the average is heavily skewed by high rates of infant mortality (due to a lack of emergency medical care and other factors) in these populations.

The anthropologists Gurven and Kaplan studied lifespan in extant hunter–gatherers and found that, if they survive childhood, their lifespans are roughly equivalent to our own in the industrialized world: 68 to 78 years. (11) This is notable because hunter–gatherers today survive only in isolated and marginal environments like the:

  • Kalahari Desert
  • Amazon rainforest
  • Arctic Circle

What’s more, in many cases hunter–gatherers reach these ages without acquiring the chronic diseases that are so common in Western countries. They’re less likely to have heart disease, diabetes, dementia and Alzheimer’s, and many other debilitating, chronic conditions.

For example, one study of the Tsimané people in Bolivia found that they have a prevalence of atherosclerosis 80 percent lower than ours in the United States and that nine in 10 Tsimané adults aged 40 to 94 had completely clean arteries and no risk of heart disease. (12) They also found that the average 80-year-old Tsimané male had the same vascular age as an American in his mid-50s. (Did you notice that this study included adults up to age 94? So much for the idea that hunter–gatherers all die when they’re 30!)

When you put all of this evidence together, it suggests the following themes:

  • Meat and other animal products have been part of the natural human diet for at least 2.5 million years
  • All ancestral human populations that have been studied ate both plants and animals
  • Human beings can survive on a wide variety of foods and macronutrient ratios within the general template of plants and animals they consumed

The Biochemical Perspective

Understanding ancestral diets and their relationship to the health of hunter–gatherer populations is a good starting place, but on its own, it doesn’t prove that such diets are the best option for modern humans.

To know that, we need to examine this question from a biochemical perspective. We need to know what nutrients are essential to human health, where they are found in food, and how various components of the diet and compounds in foods affect our physiology—both positively and negatively.

The good news is, there are tens of thousands of studies in this category. Collectively, they bring us to the same conclusion we reached above:

A whole-foods diet that contains both plants and animals is the best—and in some cases, the only—way to meet our nutrient needs from food.

Nutrient Density

Nutrient density is arguably the most important concept to understand when it comes to answering the question, “What should humans eat?”

The human body requires approximately 40 different micronutrients for normal metabolic function.

Maximizing nutrient density should be the primary goal of our diet because deficiencies of any of these essential nutrients can contribute to the development of chronic disease and even shorten our lifespan.

There are two types of nutrients in food: macronutrients and micronutrients. Macronutrients refer to the three food substances required in large amounts in the human diet, namely:

  • Protein
  • Carbohydrates
  • Fats

Micronutrients, on the other hand, are vitamins, minerals, and other compounds required by the body in small amounts for normal physiological function.

The term “nutrient density” refers to the concentration of micronutrients and amino acids, the building blocks of proteins, in a given food. While carbohydrates and fats are important, the body can synthesize them or draw on its stores for a limited time when dietary intake is insufficient (except for the essential omega-6 and omega-3 fatty acids). Micronutrients and the essential amino acids found in protein, on the other hand, cannot be manufactured by the body and must be obtained from the diet.

With this in mind, what are the most nutrient-dense foods? There are several studies that have attempted to answer this question. In the most comprehensive one, which I’ll call the Maillot study, researchers looked at seven major food groups and 25 subgroups, characterizing the nutrient density of these foods based on the presence of 23 qualifying nutrients. (13)

Maillot and colleagues found that the most nutrient-dense foods were (score in parentheses):

  1. Organ meats (754)
  2. Shellfish (643)
  3. Fatty fish (622)
  4. Lean fish (375)
  5. Vegetables (352)
  6. Eggs (212)
  7. Poultry (168)
  8. Legumes (156)
  9. Red meats (147)
  10. Milk (138)
  11. Fruits (134)
  12. Nuts (120)

As you can see, eight of the 12 most nutrient-dense categories of foods are animal foods. All types of meat and fish, vegetables, fruit, nuts, and dairy were more nutrient-dense than whole grains, which received a score of only 83. Meat and fish, veggies, and fruit were more nutrient dense than legumes, which were slightly more nutrient dense than dairy and nuts.
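
To make the idea of a nutrient density score concrete, here is a simplified, NRF-style scoring sketch in Python. It is not the Maillot study’s actual 23-nutrient method, and the daily values and food data below are rough illustrative assumptions:

```python
# A simplified, NRF-style nutrient density score, NOT the Maillot study's
# actual method: sum each qualifying nutrient's percentage of its daily value
# (capped at 100 percent), normalized to a 100-kcal portion. Daily values and
# food data are illustrative assumptions.

DAILY_VALUES = {"protein_g": 50, "iron_mg": 18, "vitamin_b12_ug": 2.4}

def nutrient_density_score(food: dict) -> float:
    """Sum of capped %DV per qualifying nutrient, scaled to 100 kcal."""
    per_100_kcal = 100 / food["kcal"]
    score = 0.0
    for nutrient, dv in DAILY_VALUES.items():
        pct_dv = 100 * food.get(nutrient, 0.0) / dv
        score += min(pct_dv, 100) * per_100_kcal  # cap each nutrient at 100% DV
    return score

# Approximate values per 100 g (illustrative):
beef_liver = {"kcal": 135, "protein_g": 20.4, "iron_mg": 4.9, "vitamin_b12_ug": 59.3}
white_rice = {"kcal": 130, "protein_g": 2.7, "iron_mg": 0.2, "vitamin_b12_ug": 0.0}

print(f"Liver: {nutrient_density_score(beef_liver):.0f}, "
      f"rice: {nutrient_density_score(white_rice):.0f}")
```

Even with only three nutrients, the gap between an organ meat and a refined grain is dramatic, which is the pattern the Maillot rankings reflect.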

There are a few caveats to the Maillot analysis:

  • It penalized foods for being high in saturated fat and calories
  • It did not consider bioavailability
  • It only considered essential nutrients

Caloric Density and Saturated Fat

In the conventional perspective, nutrient-dense foods are defined as those that are high in nutrients but relatively low in calories. However, recent evidence (which I’ll review below) has found that saturated fat doesn’t deserve its bad reputation and can be part of a healthy diet. Likewise, some foods that are high in calories (like red meat or full-fat dairy) are rich in key nutrients, and, again, can be beneficial when part of a whole-foods diet. Had saturated fat and calories not been penalized, foods like red meat, eggs, dairy products, and nuts and seeds would have appeared even higher on the list.

Bioavailability

Bioavailability is a crucial factor that is rarely considered in studies on nutrient density. It refers to the portion of a nutrient that is absorbed in the digestive tract. The amount of bioavailable nutrients in a food is almost always lower than the amount of nutrients the food contains. For example, the bioavailability of calcium from spinach is only 5 percent. (14) Of the 115 mg of calcium present in a serving of spinach, only about 6 mg is absorbed. This means you’d have to consume 16 cups of spinach to get the same amount of bioavailable calcium as in one glass of milk!
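
The spinach-vs-milk comparison is simple arithmetic. The sketch below uses the spinach figures cited in the text, plus an assumed ~300 mg of calcium and ~32 percent absorption for a glass of milk, which are typical reference values rather than figures from the study cited:

```python
# Sketch of the calcium bioavailability arithmetic. Spinach figures are from
# the text; the milk figures (about 300 mg of calcium per glass, roughly 32
# percent absorbed) are assumed typical reference values.

def absorbed_mg(total_mg: float, bioavailability: float) -> float:
    """Milligrams actually absorbed from a food."""
    return total_mg * bioavailability

spinach_serving = absorbed_mg(115, 0.05)  # about 6 mg absorbed per cup
milk_glass = absorbed_mg(300, 0.32)       # about 96 mg absorbed per glass

# Cups of spinach needed to match the absorbed calcium from one glass of milk:
cups_needed = milk_glass / spinach_serving
print(f"{spinach_serving:.1f} mg vs {milk_glass:.0f} mg: "
      f"about {cups_needed:.1f} cups of spinach per glass of milk")
```

Depending on the exact milk values assumed, the answer lands around 16 to 17 cups.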

The bioavailability of protein is another essential component of nutrient density. Researchers use a measure called the Protein Digestibility Corrected Amino Acid Score (PDCAAS), which combines the amino acid profile of a protein with a measure of how much of the protein is absorbed during digestion to assess protein bioavailability. The PDCAAS rates proteins on a scale of 0 to 1, with values close to 1 representing more complete and better-absorbed proteins than ones with lower scores.

On the scale, animal proteins have much higher scores than plant proteins; casein, egg, milk, whey, and chicken have scores of 1, indicating excellent amino acid profiles and high absorption, with turkey, fish, and beef close behind. Plant proteins, on the other hand, have much lower scores; legumes, on average, score around 0.70, rolled oats score 0.57, lentils and peanuts are 0.52, tree nuts are 0.42, and whole wheat is 0.42.

Thus, had bioavailability been considered in the Maillot study, animal foods would have scored even higher, and plant foods like legumes would have scored lower. 
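
The PDCAAS calculation itself is straightforward: score each amino acid against a reference pattern, take the most limiting ratio, multiply by true digestibility, and cap at 1.0. The sketch below uses made-up amino acid profiles and digestibility values purely for illustration:

```python
# Sketch of a PDCAAS calculation. The reference pattern, amino acid profiles,
# and digestibility values below are illustrative assumptions, not measured
# data for any real food.

# Reference pattern: mg of each amino acid per g of protein (illustrative subset).
REFERENCE = {"lysine": 58, "threonine": 34, "methionine_cysteine": 25}

def pdcaas(profile_mg_per_g: dict, true_digestibility: float) -> float:
    """Most limiting amino acid ratio times digestibility, capped at 1.0."""
    limiting = min(profile_mg_per_g[aa] / REFERENCE[aa] for aa in REFERENCE)
    return min(limiting * true_digestibility, 1.0)

# A hypothetical animal protein: exceeds the reference pattern, highly digestible.
egg_like = {"lysine": 70, "threonine": 47, "methionine_cysteine": 57}
# A hypothetical wheat protein: lysine-limited and less digestible.
wheat_like = {"lysine": 27, "threonine": 29, "methionine_cysteine": 41}

print(pdcaas(egg_like, 0.98))   # capped at 1.0
print(pdcaas(wheat_like, 0.90)) # limited by its lysine content
```

This is why a complete, well-digested protein scores 1 while a lysine-limited grain protein scores far lower, even if its total protein content looks respectable on a label.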

Essential vs. Nonessential Nutrients

The Maillot study—and a similar analysis from Harvard University chemist Dr. Mat LaLonde—only considered essential nutrients. In a nutritional context, the term “essential” doesn’t just mean “important,” it means necessary for life. We need to consume essential nutrients from the diet because our bodies can’t make them on their own.

Focusing on essential nutrients makes sense, since we can’t live without them. That said, over the past few decades many nonessential nutrients have been identified that are important to our health, even if they aren’t strictly essential. These include:

  • Carotenoids
  • Polyphenols
  • Flavonoids
  • Lignans
  • Fiber

Many of these nonessential nutrients are found in fruits and vegetables. Had these nutrients been included in the nutrient density analyses, fruits and vegetables would likely have scored higher than they did.

What Can We Conclude from the Biochemical Perspective?

When we look at a natural human diet through the lens of biochemistry and physiology, we arrive at the same conclusion: our diet should consist of a combination of organ meat, meat, fish, shellfish, eggs, fresh vegetables and fruits, nuts, seeds, and starchy plants.

But how much of the diet should come from animals, and how much from plants? The answer will vary based on individual needs. If we look at evolutionary history, we see that humans obtained, on average, about 65 percent of calories from animal foods and 35 percent from plant foods, but the specific ratios varied depending on geography and other factors.

That does not mean that two-thirds of what you put on your plate should be animal foods! Remember, calories are not the same as volume (what you put on your plate). Meat and animal products are much more calorie-dense than plant foods. One cup of broccoli contains just 30 calories, compared to 338 calories for a cup of beef steak.

This means that even if you’re aiming for 50 to 70 percent of calories from animal foods, plant foods will typically take up between two-thirds and three-quarters of the space on your plate.
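
The calorie-to-volume conversion can be sketched with the two example foods from the text (broccoli at about 30 kcal per cup, steak at about 338 kcal per cup). With these two foods alone as stand-ins, the plant share of the plate comes out even higher than the two-thirds to three-quarters estimate, since real meals also include calorie-denser plant foods like tubers and nuts:

```python
# Converting a calorie split into a plate-volume split, using the two example
# foods from the text as stand-ins: broccoli at about 30 kcal per cup and
# steak at about 338 kcal per cup.

BROCCOLI_KCAL_PER_CUP = 30.0
STEAK_KCAL_PER_CUP = 338.0

def plate_volume_fractions(animal_kcal: float, plant_kcal: float):
    """Fraction of plate volume (in cups) taken by animal vs. plant foods."""
    animal_cups = animal_kcal / STEAK_KCAL_PER_CUP
    plant_cups = plant_kcal / BROCCOLI_KCAL_PER_CUP
    total_cups = animal_cups + plant_cups
    return animal_cups / total_cups, plant_cups / total_cups

# A day with 65 percent of calories from animal foods (650 of 1,000 kcal):
animal_frac, plant_frac = plate_volume_fractions(650, 350)
print(f"Animal foods: {animal_frac:.0%} of plate volume; "
      f"plant foods: {plant_frac:.0%}")
```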

(Side note: this is why I’ve always rejected the notion of Paleo as an “all-meat” diet; a more accurate descriptor would be a plant-based diet that also contains animal products).

When we consider the importance of both essential and nonessential nutrients, it also becomes clear that both plant and animal foods play an important role because they are rich in different nutrients. Dr. Sarah Ballantyne broke this down in part three of her series “The Diet We’re Meant to Eat: How Much Meat versus Veggies.”

Plant Foods

  • Vitamin C
  • Carotenoids (lycopene, beta-carotene, lutein, zeaxanthin)
  • Diallyl sulfide (from the allium class of vegetables)
  • Polyphenols
  • Flavonoids (anthocyanins, flavan-3-ols, flavonols, proanthocyanidins, procyanidins, kaempferol, myricetin, quercetin, flavanones)
  • Dithiolethiones
  • Lignans
  • Plant sterols and stanols
  • Isothiocyanates and indoles
  • Prebiotic fibers (soluble and insoluble)

Animal Foods

  • Vitamin B12
  • Heme iron
  • Zinc
  • Preformed vitamin A (retinol)
  • High-quality protein
  • Creatine
  • Taurine
  • Carnitine
  • Selenium
  • Vitamin K2
  • Vitamin D
  • DHA (docosahexaenoic acid)
  • EPA (eicosapentaenoic acid)
  • CLA (conjugated linoleic acid)

Focus Your Diet on Nutrient Density

Whether we look through the lens of evolutionary biology and history or modern biochemistry, we arrive at the same conclusion:

If you eat only plant foods or only animal foods, your diet will be significantly less nutrient dense than if you ate both. There’s simply no way around it.

Anthropology and archaeology suggest that it’s possible for humans to thrive on a variety of food combinations and macronutrient ratios within the basic template of whole, unprocessed animal and plant foods.

For example, the Tukisenta of Papua New Guinea consumed almost 97 percent of calories in the form of sweet potatoes, and the traditional Okinawans also had a very high intake of carbohydrate and low intake of animal protein and fat. On the other hand, cultures like the Maasai and Inuit consumed a much higher percentage of calories from animal protein and fat, especially at certain times of year.

How much animal vs. plant food you consume should depend on your specific preferences, needs, and goals. For most people, a middle ground is what appears to work best, with between 35 and 50 percent of calories from animal foods and between 50 and 65 percent of calories coming from plant foods. (Remember, we’re talking about calories, not volume.)

Now I’d like to hear from you. What is your “optimal human diet”? Have you experimented with different ratios of animal vs. plant foods? What works for you? Let me know in the comments section. 

The post What Is the Optimal Human Diet? appeared first on Chris Kresser.

Collecting data, like this researcher is doing by examining a petri dish, impacts the latest nutrition headlines.

This article is Part 2 of a two-part series about the problems with nutrition research and the way it’s presented in the media. For more reasons why you should be skeptical of the latest nutrition headlines, check out Part 1 of this series.

In my last article in this series, I talked about why observational studies aren’t a great tool for proving causal relationships; how the data collection methods researchers use rely on memory, not facts; how the healthy-user bias can impact study results; and how, in many cases, nutritional studies uncover “risks” that look an awful lot like pure chance. In this post, I’ll delve deeper into the reasons why you should take nutrition headlines with a grain of salt.

Some Scientific Results Can’t Be Replicated

Science works by experiments that can be repeated; when they are repeated, they must give the same answer. If an experiment does not replicate, something has gone wrong. – Young & Karr, The Royal Statistical Society (1)

As Young and Karr suggest above, replication is a key feature of the scientific method. An initial finding does not carry much weight on its own. For it to be considered valid, it needs to be replicated by other researchers.

In the context of nutrition research, because observational studies cannot prove causality, their findings should ideally be replicated in a randomized controlled trial (RCT). RCTs are specifically designed to prove causality, and while not perfect (see below), they are much more persuasive as evidence than observational studies.

The results from most observational nutrition studies have not been replicated by RCTs. In fact, one analysis found that:

Zero of 52 nutrition claims from observational studies for a wide variety of dietary patterns and nutrient supplementation were replicated, and five claims were statistically significant in the opposite direction.

Yes, you read that correctly. Out of 52 claims made in observational nutrition studies, zero were replicated and five indicated the opposite of what the observational study suggested!

Let’s look at a specific example. Observational studies suggested that people with the highest intakes of beta-carotene, an antioxidant nutrient found primarily in fruits and vegetables, had a 31 percent lower risk of death compared to those with the lowest intake. Yet RCTs of supplementation with beta-carotene not only failed to confirm this benefit, they found an increased risk of cancer in the group with the highest intake. (2) Oops! Similar results have been found with vitamin E. (3)

Researchers Focus on Quantity, Not Quality

People don’t eat nutrition, they eat food. – Margaret Mead

The vast majority of observational studies today focus only on nutrients, isolated food components, or biomarkers—like saturated fats, carbohydrates, calories, LDL cholesterol—abstracted out of the context of foods, diets, and bodily processes.

This reductionist approach, which philosopher of science Gyorgy Scrinis calls “nutritionism,” has interfered with nutrition science’s ability to provide useful individual and public health guidance. (4)

The upside of nutritionism has been the discovery of drugs, vitamins, and minerals that have saved millions of lives. The downside is that Americans (and people all over the industrialized world) are obsessing over details like the percentage of fat or carbohydrates they consume rather than focusing on the broader and more important issues, like the quality of the food they eat.

Two examples of how this has manifested over the past few decades are:

  • The promotion of margarine over the much better-tasting butter because of concerns about butter’s saturated fat content
  • The vilification of eggs due to their cholesterol content without considering their overall nutrient value

(And of course, we now know that butter is healthier than margarine and dietary cholesterol has no impact on heart disease. Another oops!)

Nutritionism is a relatively new phenomenon. It started in 1977 with the McGovern Report, the first widely disseminated nutrition guidance to provide detailed, quantitative, nutrient-focused dietary recommendations. (5) Prior to that, dietary guidelines were based on familiar concepts of food groups and serving sizes and relatively simple information on what foods to buy and eat to maintain health. The average person could easily understand—and most importantly, act on—the guidelines.

After the McGovern Report, dietary guidelines became increasingly complex and difficult for the layperson to comprehend. The 1980 dietary guidelines were published as a short, 19-page brochure; the 1985 edition grew to 28 pages; the 2010 edition ran 112 pages; and the 2015 edition, the most recent, took up 517 pages!

What Happens When You Look at Food Quality

A more recent example of nutritionism can be found in the heated debate over whether low-fat or low-carb diets are superior for weight loss and metabolic and cardiovascular health. Each side of the debate has its advocates, and the controversy continues.

In early 2018, a group of researchers led by Dr. Christopher Gardner set out to settle this debate with an RCT. They randomly assigned participants to one of two groups: low-carb or low-fat. But here’s the catch: they instructed both groups to:

1) maximize vegetable intake; 2) minimize intake of added sugars, refined flours, and trans fats; and 3) focus on whole foods that were minimally processed, nutrient dense, and prepared at home whenever possible. (6)

For example, foods like fruit juice, pastries, white rice, white bread, and soft drinks are low in fat but were not recommended to the low-fat group. Instead, the dietitians encouraged participants to eat whole foods like lean meat, brown rice, lentils, low-fat dairy products, legumes, and fruit. Meanwhile, the low-carb group was instructed to focus on foods rich in healthy fats, like olive oil, avocados, salmon, cheese, nut butters, and pasture-raised animal products.

Perhaps not surprisingly—if you don’t embrace nutritionism, that is—the researchers found that on average, people who cut back on added sugar, refined grains, and processed food lost weight over 12 months—regardless of whether the diet was low-carb or low-fat.

This was a fantastic example of what a nutrition study should look like. It resulted in clinically relevant, practical advice that is easy for people to follow: eat real food. Just imagine where we might be now if most nutrition studies over the past 40 years had been designed like this.

RCTs Are Better than Observational Studies but Still Problematic

If observational studies cannot prove causality, then why do they continue to form the foundation of dietary guidelines and public health recommendations? The answer is that RCTs also have several shortcomings that, thus far, have made them impractical as a tool for studying population health.

Duration

Most relationships between nutritional factors and disease can take years, if not decades, to develop. What’s more, the effects of some nutritional interventions in the short term are different than they are over the long term.

Weight loss is a great example. Both low-carb and low-fat diets have been shown to cause weight loss in the short term, but over the long term (more than 12 months) people tend to regain the weight they lost.

Inadequate Sample Size

The sample size, or number of participants in an RCT, is one of the most important factors in determining whether the results of the study are generalizable to the wider population. Most nutrition RCTs do not have a large enough sample size.

Dr. John Ioannidis, a professor at the Stanford School of Medicine, highlighted this problem in a recent editorial in BMJ called “Implausible Results in Human Nutrition Research.”

To identify a nutrition-related intervention that produces a legitimate 5 to 10 percent relative risk reduction in total mortality, we’d need studies that are 10 times as large as the highly publicized PREDIMED trial (which had around 7,500 participants), in addition to long-term follow-up, linkage to death registries, and careful efforts to maximize adherence.
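To get a feel for why such enormous trials are needed, here is a back-of-the-envelope sketch using the standard normal-approximation sample size formula for comparing two proportions. The baseline mortality rate (2 percent over the trial period) and the 80 percent power / 5 percent significance targets are illustrative assumptions, not figures from Ioannidis’s editorial.

```python
from math import ceil

def n_per_arm(p_control, rel_risk_reduction, alpha_z=1.96, power_z=0.84):
    """Approximate participants needed per arm to detect a relative risk
    reduction between two proportions, using the standard formula:
    n = (z_alpha + z_beta)^2 * (p1(1-p1) + p2(1-p2)) / (p1 - p2)^2.
    Defaults correspond to two-sided alpha = 0.05 and 80% power."""
    p1 = p_control
    p2 = p_control * (1 - rel_risk_reduction)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((alpha_z + power_z) ** 2 * variance / (p1 - p2) ** 2)

# Assumed scenario: 2% mortality in the control arm over the trial period,
# aiming to detect a 10% relative risk reduction.
n = n_per_arm(0.02, 0.10)
print(n, 2 * n)  # roughly 73,000 per arm, ~146,000 total -- vs ~7,500 in PREDIMED
```

Even under these generous assumptions, the required trial is an order of magnitude larger than PREDIMED, which illustrates why well-powered, long-term nutrition RCTs are so rare.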

RCTs Are Expensive

One reason that it’s such a huge challenge to design RCTs with sufficient duration and sample size is cost. RCTs are enormously expensive. In the pharmaceutical world, drug companies pay for RCTs because they have a vested financial interest in their results. But who will pay for long-term RCTs in the nutrition world? Public funding for nutrition research (and many other types of research) is declining, not increasing, which makes it unlikely that we’ll see long-term RCTs with sufficient sample sizes anytime soon.

Quality RCTs Are Difficult to Do

As Dr. Peter Attia points out in his excellent series Studying Studies, designing high-quality RCTs is fraught with challenges:

These trials need to establish falsifiable hypotheses and clear objectives, proper selection of endpoints, appropriate subject selection criteria (both inclusionary and exclusionary), clinically relevant and feasible intervention regimens, adequate randomization, stratification, and blinding, sufficient sample size and power, and anticipation of common practical problems that can be encountered over the course of an RCT.

That’s not an easy task and few nutrition RCTs meet the challenge.

Conflicts of Interest Are Very Common

It’s difficult to get a man to understand a thing if his salary is dependent upon him not understanding it. – Upton Sinclair

Many have written about financial conflicts of interest and their impact on all forms of research, including nutrition research. In short, research has shown that when studies are funded by industry, they are far more likely to report results that are favorable to the sponsor.

In one analysis performed by Marion Nestle, 90 percent of industry-sponsored studies returned sponsor-friendly results. (7) For a summary of the issues and how they impact the quality of nutrition research, I recommend this story from Vox.

In this article, I’d like to focus on another type of conflict of interest: allegiance bias, which is also known as “white hat bias.” Allegiance bias is not as well recognized as financial conflicts of interest are, which is one of the many reasons that it has an insidious effect on nutrition research.

Allegiance bias has been defined as “bias leading to distortion of research-based information in the service of what may be perceived as righteous ends.” (8)

For example, imagine that a vegan researcher sets out to do a study on the health impacts of a vegan diet. Is it possible that the researcher’s ideological commitment to veganism could influence, both consciously and unconsciously, how the study is designed, executed, and interpreted? Of course it could. In fact, it’s difficult to see how it couldn’t.

In a 2018 editorial called “Disclosures in Nutrition Research: Why It Is Different,” Dr. Ioannidis suggests that allegiance bias should be disclosed by researchers, just as financial conflicts of interest are. He says:

Therefore, it is important for nutrition researchers to disclose their advocacy or activist work as well as their dietary preferences if any are relevant to what is being presented and discussed in their articles. This is even more important for dietary preferences that are specific, circumscribed, and adhered to strongly. [emphasis added]

Ioannidis goes on to say that advocacy and activism, while laudable, are contrary to “a key aspect of the scientific method, which is to not take sides preemptively or based on belief or partisanship.” [emphasis added]

Veganism certainly meets the criteria of dietary recommendations that are “specific, circumscribed, and adhered to strongly.” In fact, some have pointed out that veganism meets the four dimensions of religion:

  • Belief: Veganism began as a way to express moral integrity regarding the appropriation and suffering of non-humans.
  • Ritual: Veganism involves strict dietary restrictions, including abstaining from the use of materials made from any animal products.
  • Experience: The “holistic connectedness” of veganism would be considered a religious experience to those who live it.
  • Community: There are many official and unofficial vegan associations across the world, and in 2017 a civil flag was created for the international vegan community.

Researchers and physicians like T. Colin Campbell, Kim Williams, Caldwell Esselstyn, Joel Fuhrman, John McDougall, and Neal Barnard could all be expected to suffer from this “white hat bias.” They’re involved in vegan advocacy and activism, both of which could be expected to be a source of allegiance bias.

A Famous Example of Allegiance Bias at Work

The China Study, a book by vegan researcher T. Colin Campbell, is a perfect example. Campbell claimed that this study—which was not peer-reviewed—proved that:

  • Animal protein causes cancer
  • A plant-based diet protects against heart disease
  • You can get all the nutrients you need from plants

Campbell even went as far as saying, “Eating foods that contain any cholesterol above 0 mg is unhealthy,” a claim that has been thoroughly disproven and is reflected in the 2015 change to the U.S. Dietary Guidelines, which no longer regard dietary cholesterol as a nutrient of concern.

However, since The China Study was published, several independent, peer-reviewed studies of the data have refuted T. Colin Campbell’s claims. For a great summary of the issues with The China Study, see this article by nutritional scientist Dr. Chris Masterjohn.

Allegiance bias can take several forms. It can involve:

  • Cherry-picking studies to support a cherished view
  • Misleadingly describing the results of studies that are cited in a paper
  • “Data dredging” to search for statistical significance within given data sets (when no such significance is present)
  • Not reporting null results
  • Designing experiments for the purpose of obtaining a particular answer
  • And more

It’s important to point out that allegiance bias is not always, or even often, conscious. Most researchers believe they are acting with scientific rigor and integrity. This is exactly why it’s so difficult to guard against, and why it’s so important to disclose.

Nutrition Policy Is Informed by Politics and Religion—Not Just Science

In a perfect world, dietary guidelines and nutritional policy would be the product of a thorough and dispassionate review of the available scientific evidence and not be unduly influenced by politics—and certainly not by religion. Dissenting views that are well informed would be not only welcomed but encouraged. As Syd Shapiro once said, “We should never forget that good science is skeptical science.”

Alas, we don’t live in a perfect world. In our world, dissenting views are not welcomed; they’re suppressed. Dr. D. Mark Hegsted, a founding member of the Nutrition Department at the Harvard School of Public Health, made this opening remark at the 1977 McGovern hearings:

The diet of Americans has become increasingly rich—rich in meat, other sources of saturated fat and cholesterol … [and] the proportion of the total diet contributed by fatty and cholesterol-rich foods … has risen.

The only problem with this statement is that it directly contradicted USDA economic data, which suggested that total calories and the availability of meat, dairy, and eggs at the time of the report were equivalent to, or marginally less than, the amounts consumed in 1909. Full-fat dairy consumption was lower in 1977 than in 1909, having declined steadily from 1950 to 1977. (9) Other evidence that contradicted Dr. Hegsted’s opinion was also ignored.

The feedback from the scientific community on the McGovern Report was “vigorous and constructive,” explicitly stated the “lack of consensus among nutrition scientists,” and presented evidence for the diversity of scientific opinion on the subject. (10) Other countries, such as Canada and Great Britain, also noted the lack of consensus on whether dietary cholesterol intake should be limited. U.S. senators issued the following statement about the McGovern Report:

It is clear that science has not progressed to the point where we can recommend to the general public that cholesterol intake be limited to a specified amount. The variances between different individuals are simply too great. A similar divergence of scientific opinion on the question of whether dietary change can help the heart illustrates that science cannot yet verify with any certainty that coronary heart disease will be prevented or delayed by the diet recommended in this report. (See footnote)

Nevertheless, these cautionary words were ignored, and the recommendations from the McGovern Report were adopted. This kicked off the fat and cholesterol phobia that would grip the United States for the next four decades.

Religion Can Impact Nutrition Guidelines

Another example of how non-scientific factors drive nutrition policy is the influence of the Seventh-day Adventist Church on public health recommendations in the United States and around the world. The Seventh-day Adventist (SDA) Church is a Protestant denomination that grew out of the Millerite movement in the United States. Health has been a focus of SDA teachings since the church’s inception in the 1860s. According to Wikipedia:

Adventists are known for presenting a “health message” that advocates vegetarianism and expects adherence to the kosher laws, particularly the kosher foods described in Leviticus 11, meaning abstinence from pork, shellfish, and other animals proscribed as “unclean.” The church discourages its members from consuming alcoholic beverages, tobacco or illegal drugs. … In addition, some Adventists avoid coffee, tea, cola, and other beverages containing caffeine.

Ellen White, an early SDA church leader, received her first major health reform vision in 1863, and “for the first time, God’s people were urged to abstain from flesh food in general and from swine’s flesh in particular.” Most SDA diet beliefs are based on White’s health visions.

White believed that the church had a duty to educate the public about health as a way to control desires and passions. Adventists continue to believe that eating meat stirs up “animal passions,” and that is one of the reasons for avoiding it.

Another early SDA leader, Lenna Cooper, was a dietitian who cofounded the American Dietetic Association (now the Academy of Nutrition and Dietetics), which continues to advocate a vegetarian diet to this day. Cooper wrote textbooks and other materials that were used in dietetic and nursing programs, not only in the United States but around the world, for more than 30 years. The SDA Church established hundreds of hospitals, colleges, and secondary schools and tens of thousands of churches around the world—all promoting a vegetarian diet—and played a major role in the development and mass production of plant-based foods, such as meat analogues, breakfast cereals, and soy milk. (11)

Adventists have been behind much of the early research on vegetarian diets at Loma Linda University in Loma Linda, California, where SDA leaders established a dietetics department in 1908. This was an ostensibly scientific endeavor at a university established by a religious group that believed vegetarianism was ordained by God.

If you think this raises a huge red flag for allegiance bias, you’re not wrong. In fact, as Jim Banta pointed out in a fascinating review of the SDA influence on diet, administrators at Loma Linda University in the mid-1900s initially discouraged research on vegetarian diets because “if you find the diets of vegetarians are deficient, it will embarrass us.” That is not the attitude of skepticism and open-minded inquiry that characterizes good science.

My Final Thoughts on Nutrition Research

I’d like to conclude with the opening two paragraphs of a recent open letter that scientists Edward Archer and Chip J. Lavie wrote to the National Academies of Sciences, Engineering, and Medicine:

“Nutrition” is now a degenerating research paradigm in which scientifically illiterate methods, meaningless data, and consensus-driven censorship dominate the empirical landscape. Since the 1950s, there was a naïve but politically expedient consensus that a person’s usual diet could be measured simply by asking what he or she remembered eating and drinking. Despite the credulous and unfalsifiable nature of this memory-based method, investigators used it to produce hundreds of thousands of publications and acquire billions of taxpayer dollars.

Over time, the sustained funding of demonstrably pseudo-scientific research methods has subverted the self-correcting nature of science and suppressed skeptical scholarship. Consequently, many decades of politics taking precedence over critical inquiry produced contradictory dietary guidelines, failed public policies, and the continued confusion over “what-to-eat.”

I couldn’t have said it better myself.

What do you think about the latest nutrition headlines? Do you read the newest research with skepticism? Let me know below in the comments—and be sure to check out Part 1 of this two-part series!

Staff of the Select Committee on Nutrition and Human Needs, United States Senate. Dietary Goals for the United States. 2nd ed. Washington, DC: Government Printing Office; December 1977.

The post Why You Should Be Skeptical of the Latest Nutrition Headlines: Part 2 appeared first on Chris Kresser.

This nutrient-dense meal includes organ meat and shellfish.

In this article, I’ll clear up some common misconceptions about nutrient density and explain why an omnivorous diet that includes both animal and plant foods is the best choice from a nutrient density perspective.

Understanding Nutrient Density

The nutrients in our food fall into two categories: macronutrients and micronutrients.

  • Macronutrients refer to the three main substances required in large (macro) amounts in the human diet: protein, carbohydrates, and fats.
  • Micronutrients are vitamins, minerals, and other compounds required by the body in small (micro) amounts for normal physiological function.

We need a mix of both to stay healthy.

The term “nutrient density” refers to the concentration of micronutrients and amino acids, the building blocks of proteins, in a given food. While carbohydrates and fats are important, these macronutrients can be partially synthesized by the body for a limited amount of time if dietary intake has been insufficient. (A major exception: the essential omega-6 and omega-3 fatty acids, which we can only get through food.)

Conversely, micronutrients and the essential amino acids found in protein cannot be manufactured by the body and must be consumed from food. (A reminder: the word “essential” in front of fatty or amino acids means that our bodies can’t produce them; we must get them from food sources.)

Nutrient density means two very different things in the conventional nutrition and ancestral health communities. Among conventional practitioners, nutrient-dense foods are defined as those that are high in nutrients but relatively low in calories. According to MyPlate, the current nutrition guide endorsed by conventional medicine, the most nutrient-dense foods are:

  • Vegetables
  • Fruits
  • Whole grains
  • Legumes
  • Unsalted nuts and seeds
  • Lean meats and poultry
  • Fat-free or low-fat dairy products

MyPlate’s definition of nutrient density excludes foods that are high in saturated fat and animal fat.

In contrast, while the ancestral health community also acknowledges the nutrient density of meat, poultry, vegetables, nuts, and seeds, it does not demonize or overlook foods that are high in calories and saturated fats.

Instead, the ancestral health perspective recognizes that some of the most nutrient-dense foods on the planet are sourced from animals and contain plenty of fat, such as organ meats, red meat, and full-fat dairy. I consider nutrient density and calorie density separately—because some high-calorie foods are exceptionally nutrient-dense and can be healthy additions to our diet.

Why Nutrient Density Matters

The human body requires approximately 40 different micronutrients for normal metabolic function. Maximizing nutrient density should be the primary goal of our diet because deficiencies of any of these essential nutrients can contribute to the development of chronic disease and even shorten our lifespan. Listed below are just a few examples of how nutrient deficiencies contribute to chronic disease.

  • Vitamin C deficiency increases chronic disease risk factors such as C-reactive protein, waist circumference, and blood pressure. (1)
  • Vitamin D deficiency is associated with immune dysfunction and an increased risk of metabolic syndrome and cardiovascular disease. (2, 3, 4)
  • Magnesium deficiency is linked to depression, metabolic syndrome, and cardiovascular disease. (5, 6, 7)
  • Choline deficiency promotes DNA damage and impairs brain development and liver function. (8, 9)
  • Vitamin B12 deficiency contributes to cognitive dysfunction and reversible tremors and other Parkinson’s-like symptoms. (10)
  • Folate deficiency increases the risk of birth defects and promotes the production of a compound called homocysteine that damages blood vessels when present in large amounts and impairs DNA methylation, which in turn can lead to altered gene expression and an increased risk of cancer. (11, 12)

Nutrient deficiencies are not only a cause of chronic health conditions, but they can also be an effect. Small intestinal bacterial overgrowth (SIBO), dysbiosis, and gastritis impair nutrient absorption in the gastrointestinal tract and increase nutrient needs. Chronic inflammation increases the degradation of and need for vitamin B6 and reduces the body’s production of vitamin D from UVB light exposure. Exposure to environmental toxins such as heavy metals increases the need for essential minerals and nutrients involved in methylation. These factors make nutrient density even more crucial.

The SAD State of Nutrition in the United States

If the United States were to receive a report card rating the quality of the Standard American Diet (SAD), it would get a solid “F” for nutrient density. Despite being high in calories, the SAD is nutrient poor. Vegetable oils and sugar, which together comprise 36 percent of the SAD, are virtually devoid of nutrients.

It should come as no surprise that nutrient deficiency is widespread in the United States; recent statistics indicate that nearly one-third of Americans are at risk for at least one vitamin deficiency or anemia, with hundreds of thousands of people at risk for multiple deficiencies. (13)

To make matters worse, the RDA (recommended dietary allowance) used in studies to assess nutrient sufficiency merely represents the daily intake level required to avoid acute deficiency symptoms—it does not represent the nutrient intake needed to promote optimal health! (RDAs were originally developed during World War II to create “nutritious” rations for soldiers. While they have been updated since then, RDA numbers represent the minimum amount of a nutrient a person needs in order to avoid a malnutrition-triggered disease, like scurvy or rickets. Furthermore, they don’t really take into account gender, age, or health, meaning that a teenage athlete and a middle-aged sedentary person are given the same RDAs.)

This means that studies of nutrient status may seriously underestimate the impact of nutrient deficiencies on the health of the general population.

The Crucial (But Underappreciated) Role of Bioavailability

No discussion of nutrient density would be complete without considering another critical nutritional factor: bioavailability.

“Bioavailability” refers to the portion of a nutrient that is absorbed in the digestive tract and released into the bloodstream for the body’s use. The amount of bioavailable nutrients in a food is almost always lower than the amount of nutrients the food contains. For example, the bioavailability of calcium from spinach is only 5 percent. Of the 115 mg of calcium present in a serving of spinach, only 6 mg is absorbed. This means you’d have to consume 16 cups of spinach to get the same amount of bioavailable calcium in one glass of milk. (14)
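The spinach-versus-milk comparison above can be sketched with simple arithmetic. The milk figures used here (roughly 300 mg of calcium per glass, absorbed at about 30 percent) are commonly cited approximations, not numbers from the study referenced above; treat them as illustrative.

```python
def absorbed_mg(total_mg, bioavailability):
    """Milligrams of a nutrient actually absorbed from one serving,
    given its total content and fractional bioavailability."""
    return total_mg * bioavailability

# Spinach: 115 mg calcium per serving, ~5% bioavailable (per the text above).
spinach = absorbed_mg(115, 0.05)   # ~5.75 mg absorbed

# Milk (assumed figures): ~300 mg per glass, ~30% bioavailable.
milk = absorbed_mg(300, 0.30)      # ~90 mg absorbed

# Servings of spinach needed to match one glass of milk:
servings = milk / spinach
print(round(servings))  # ~16 servings, matching the "16 cups" above
```

The point of the arithmetic is that total nutrient content on a label can overstate what your body actually gets by an order of magnitude.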

Factors That Impact Bioavailability

Three factors influence the bioavailability of nutrients in food:

  1. The form of the nutrients
  2. The presence of other nutrients that boost bioavailability (nutrient synergy)
  3. The presence of nutrient inhibitors and anti-nutrients

The form that nutrients take significantly impacts their bioavailability in the body. For example, heme iron, a form of iron found only in animal products such as meat and poultry, is far more bioavailable than nonheme iron, found in plant foods. Fifteen to 35 percent of heme iron is absorbed, whereas only 2 to 20 percent of nonheme iron is absorbed.

The absorption of nutrients is also affected by the presence (or lack) of other nutrients. For example, vitamin C enhances the absorption of iron, and dietary fat increases the solubility of fat-soluble nutrients such as vitamins A, D, and E and carotenoids.

Nutrient inhibitors and anti-nutrients reduce the bioavailability of nutrients in foods. Phytate, an anti-nutrient found in large amounts in grains and legumes, binds to calcium, iron, and zinc, making them unavailable for absorption. The impact of anti-nutrients on bioavailability is one reason why early humans developed nutrient deficiencies when they transitioned from a hunter–gatherer lifestyle, largely free of phytate-rich foods, to one that revolved around agriculture.

The concept of nutrient synergy is closely related to bioavailability. Nutrient synergy refers to how nutrients, enzymes, and other cofactors work together to create greater health effects. Examples of nutrient synergy are found in the family of fat-soluble vitamins (vitamins A, D, E, and K2), which must be balanced to promote optimal health. Another example is the importance of copper for maintaining normal iron metabolism in red blood cell formation. While synergistic groups of nutrients occur naturally in foods, they are often not included in synthetic vitamins.

The role of nutrient synergy may explain the surprising results of studies that looked at supplementation with isolated antioxidants. In fact, both beta-carotene and vitamin E have been associated with increased cancer risk. (15, 16)

The bioavailability of protein is another essential component of nutrient density. Researchers rely on a measure called the Protein Digestibility Corrected Amino Acid Score (PDCAAS), which combines the amino acid profile of a protein with a measure of how much of the protein is absorbed during digestion, to assess protein bioavailability. The PDCAAS rates proteins on a scale of 0 to 1, with values close to 1 representing more complete and better-absorbed proteins than ones with lower scores.

On the scale, animal proteins have much higher scores than plant proteins; casein, egg, milk, whey, and chicken have scores of 1, indicating excellent amino acid profiles and high absorption, with turkey, fish, and beef close behind.

Plant proteins, on the other hand, have much lower scores; legumes, on average, score around 0.70, rolled oats score 0.57, lentils and peanuts are 0.52, tree nuts are 0.42, and whole wheat is 0.42. Importantly, the PDCAAS does not consider the anti-nutrient content of foods. If anti-nutrients were taken into consideration, plant proteins would score even lower on the bioavailability scale due to their high anti-nutrient levels. 
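The scoring arithmetic is simple enough to sketch. The Python snippet below is a minimal illustration of the PDCAAS method described above (the ratio for the limiting amino acid times true digestibility, truncated at 1.0); the amino acid values and digestibility figure are hypothetical placeholders, not measured data:

```python
def pdcaas(test_protein, reference, digestibility):
    """Protein Digestibility Corrected Amino Acid Score (0 to 1)."""
    # The score is driven by the *limiting* essential amino acid:
    # the one with the lowest ratio to the reference pattern.
    limiting_ratio = min(test_protein[aa] / reference[aa] for aa in reference)
    # Correct for how much of the protein is actually digested,
    # then truncate at 1.0, the ceiling of the scale.
    return min(limiting_ratio * digestibility, 1.0)

# Hypothetical mg of each essential amino acid per gram of protein
reference_pattern = {"lysine": 58, "met+cys": 25, "threonine": 34}
wheat_like        = {"lysine": 26, "met+cys": 30, "threonine": 32}

# A lysine-limited, wheat-like protein with 90% digestibility
print(round(pdcaas(wheat_like, reference_pattern, 0.90), 2))  # 0.4
```

A protein with no amino acid shortfall and near-complete digestibility hits the 1.0 ceiling, which is how casein, egg, milk, and whey earn their top scores.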

In Search of a Nutrient-Dense Diet

A nutrient-dense diet is the best protection against nutrient deficiencies. But what exactly is the most nutrient-dense diet?

Several studies have attempted to answer this question. In the most comprehensive one, referred to as the Maillot study, researchers looked at seven major food groups and 25 subgroups, characterizing the nutrient density of these foods based on the presence of 23 qualifying nutrients. The table below displays the results; each food group was given a numerical score, with higher numbers corresponding to greater nutrient density. (17)

Food Group Nutrient Density Score
Meat
Organ meats 754
Shellfish 643
Fatty fish 622
Lean fish 375
Eggs 212
Poultry 168
Red meats 147
Deli meats (processed) 120
Fruits and Vegetables
Vegetables 352
Fruits 134
Nuts 120
Dried fruits 85
Dairy Products
Milk 138
Yogurt 119
Cheese 101
Starches and Grains
Legumes 156
Whole grains 83
Potatoes 75
Refined grains 40
Added Fats
Vegetable fats 80
Animal fats 25

Even without considering bioavailability, all categories of meat and fish, vegetables, fruit, nuts, and dairy were more nutrient dense than whole grains. Meat and fish, veggies, and fruit were more nutrient dense than legumes, which were slightly more nutrient dense than dairy and nuts.

What would the scale have looked like had researchers separated caloric density from nutrient density? Harvard University chemist Dr. Mat Lalonde used a formula similar to that used in the Maillot study, but without penalizing foods for having more calories. (18) (He used a different scoring scale, but higher numbers still correlate with higher nutrient density.) The results, which he shared at the Ancestral Health Symposium in 2012, are below.

Category Average Nutrient Density Score
Organ meats 21.3
Herbs and spices 12.3
Nuts and seeds 7.5
Cacao 6.4
Fish and seafood 6.0
Beef 4.3
Lamb, veal, and wild game 4.0
Vegetables (raw) 3.8
Pork 3.7
Eggs and dairy 3.1
Poultry 3.1
Processed meat 2.8
Legumes 2.3
Vegetables (cooked or canned) 2.0
Fruit 1.5
Plant fats and oils 1.4
Grains and pseudograins* 1.2
Animal fats and oils 1.0

*Pseudograins include quinoa and amaranth.

On the Lalonde scale, organ meats are again the most nutrient-dense foods by far, followed by:

  • Herbs and spices
  • Nuts and seeds
  • Cacao

Seafood, red meat, and wild game were more nutrient dense than raw vegetables; all forms of meat, fish, fruit, and vegetables (raw and cooked) were more nutrient dense than grains and pseudograins.

If bioavailability had been taken into account in both the Maillot study and Lalonde’s analyses, legumes and grains would have been even lower on the scales when compared to organ meats, meats, dairy products, and fruits and vegetables.

It’s worth noting that both the Maillot and Lalonde analyses only included nutrients like vitamins, minerals, protein, fiber, and essential fatty acids, all of which have been proven to be essential. As I touched on earlier, in a nutrition context, the word “essential” doesn’t just mean “important” or “must-have.” It means we can’t live without these substances and we must get them from food sources, as our bodies can’t manufacture them.

Although these nutrients should always be the primary focus, research over the past two decades has shown that other nutrients, while not essential, play a vital role in health. They include:

  • Polyphenols
  • Carotenoids
  • Flavonoids
  • Diallyl sulfides (from the allium class of vegetables)
  • Lignans

These nutrients are found primarily in whole fruits and vegetables, so had they been factors in the Maillot and Lalonde analyses, fruits and vegetables would have scored higher.

Putting It All Together

The most nutrient-dense diet is one that contains a wide variety of both animal and plant foods.

Animals and plants each contain different nutrients that our bodies require, as Dr. Sarah Ballantyne indicated in her excellent article, “How Much Meat versus Veggies.”

Plant Foods

  • Vitamin C
  • Carotenoids (like lycopene, beta-carotene, lutein, and zeaxanthin)
  • Diallyl sulfide (from the allium class of vegetables)
  • Polyphenols
  • Flavonoids (like anthocyanins, flavan-3-ols, flavonols, proanthocyanidins, procyanidins, kaempferol, myricetin, quercetin, and flavanones)
  • Dithiolethiones
  • Lignans
  • Plant sterols and stanols
  • Isothiocyanates and indoles
  • Prebiotic fibers (soluble and insoluble)

Animal Foods

  • Vitamin B12
  • Heme iron
  • Zinc
  • Preformed vitamin A (retinol)
  • High-quality protein
  • Creatine
  • Taurine
  • Carnitine
  • Selenium
  • Vitamin K2
  • Vitamin D
  • DHA (docosahexaenoic acid)
  • EPA (eicosapentaenoic acid)
  • CLA (conjugated linoleic acid)

When we eat a wide variety of animal and plant foods and maximize the nutrient density of every bite of food we put in our mouths, all of the other important aspects of our diet fall into place.

We end up avoiding processed, refined foods because they are calorie dense and nutrient poor. We eat fresh, local, organic, whole foods that are not only higher in nutrients than their conventional counterparts, but also lower in pesticides and other toxins. We eat meat, eggs, wild-caught seafood, and pasture-raised, full-fat dairy when tolerated. Finally, we naturally eat less because nutrient-dense foods tend to be more satiating, have more fiber and water, and often have fewer calories.

Fine-Tuning Your Diet

Some people may need to fine-tune the nutrient density of their diet to replenish depleted nutrients. Key nutrients to consider include vitamins A, D, E, and K2, magnesium, iodine, and calcium.

Vitamin A

Preformed vitamin A can be obtained by eating three to six ounces of beef liver or chicken liver per week or by regularly consuming egg yolks. Carotenoids in carrots, sweet potatoes, winter squash, kale, spinach, collard greens, and pumpkin serve as precursors to vitamin A. If you find that these foods are not enough, you can also take one teaspoon of cod liver oil per day as a source of vitamin A.

Vitamin D

Sun exposure is the best “nutrient” for raising vitamin D levels. Full-body exposure to midday summer sun yields approximately 1,000 IU of vitamin D, though this value is highly dependent on your skin tone and latitude. Frequent sun exposure is not always possible, and vitamin D deficiency is widespread, so one teaspoon of cod liver oil per day (a natural source of vitamin D) may be a beneficial addition to your diet.

Vitamin E

Vitamin E supplementation is not recommended because long-term studies indicate that alpha-tocopherol, the form of vitamin E typically used in multivitamins, may increase the risk of cardiovascular disease. (19) Instead, get your vitamin E from foods such as:

  • Spinach
  • Turnip greens
  • Chard
  • Sunflower seeds
  • Almonds
  • Bell peppers
  • Asparagus
  • Collards
  • Kale
  • Broccoli
  • Brussels sprouts

Eating these foods with fat—plant or animal—will enhance vitamin E absorption.

Vitamin K2

The primary forms of vitamin K2 are menaquinone-4 (MK-4) and menaquinone-7 (MK-7). Grass-fed, full-fat dairy products (including cheeses like Gouda and Brie), poultry liver (especially goose liver), and pastured egg yolks are excellent sources of MK-4, while natto and other fermented foods are rich sources of MK-7.

Magnesium

The optimal intake of magnesium is 500 to 700 mg/day from food and supplements. Food sources of magnesium include:

  • Leafy greens
  • Molasses
  • Dark chocolate
  • Bananas
  • Pumpkin seeds
  • Almonds
  • Lentils

It is hard to get enough magnesium from food alone, so you may need to supplement with 100 to 500 mg of magnesium glycinate or magnesium malate per day.

Iodine

Ironically, iodine deficiency may be more common in people who eat healthier diets, due to the avoidance of iodized salt and processed foods that contain iodized salt. To boost your iodine levels, eat sea vegetables (such as nori), cod, yogurt, milk, and eggs.

Calcium

Most calcium supplements, with the exception of bone meal, are no longer recommended due to studies linking them to an increased risk of cardiovascular disease, kidney stones, and other problems—even increased risk of fracture. Instead, focus on food sources such as:

  • Bone-in sardines
  • Canned sockeye salmon with bones
  • Yogurt
  • Cheese
  • Sesame seeds
  • Collard greens
  • Spinach
  • Turnip greens
  • Bok choy
  • Almonds
  • Summer squash
  • Herbs and spices

If you are looking for nutritional powerhouses that can deliver a bundle of nutrients at once, consider eating liver, beef, and oysters. Liver is nature’s most concentrated source of vitamin A and is an excellent source of B vitamins, bioavailable iron, CoQ10, copper, zinc, and chromium. Beef contains highly bioavailable zinc, iron, selenium, and B vitamins, especially B12. Last but not least, oysters contain 111 percent of the RDA for zinc, 110 percent for selenium, 267 percent for B12, 79 percent for copper, and 28 percent for iron. They are also an excellent source of omega-3 fatty acids.

Misconceptions about Nutrient Density

There are some common misconceptions about nutrient density that circulate throughout the nutrition world and blogosphere. Since these misconceptions have the potential to significantly impact the nutrient density of your diet and your long-term health, I’d like to set the record straight by addressing a few of them here.

“Many Vegetables Are High in Vitamin A.”

This is a common misconception that is even promoted on food labels! Vegetables, especially dark green, orange, and yellow vegetables, are often described as good sources of vitamin A. In reality, they contain vitamin A precursors called carotenoids. Carotenoids must be converted into vitamin A in the body, and unfortunately, this conversion is inefficient for many people. (20)

“You Can Get Vitamin B12 from Fermented Foods and Brewer’s Yeast.”

This is a common myth among vegetarians and vegans. B12 is found almost exclusively in animal products; the rare exceptions are plant foods like nori and some wild mushrooms. (21) Some plant foods do contain compounds that resemble B12, but in most cases these are B12 analogs, which share a similar chemical structure without providing the vitamin’s activity. Despite their molecular similarity, B12 analogs actually block B12 receptors, increasing the need for this crucial vitamin. (22)

“You Can Get All the Calcium You Need from a Vegan Diet.”

Although foods included in a vegan diet contain calcium, that calcium is far less bioavailable. You would need to eat 16 servings of spinach to get the same amount of calcium as in one glass of milk. (23) Oxalate and phytate, which are frequently found in plant foods that contain calcium, inhibit calcium absorption. Dairy products and bone-in fish remain the most bioavailable sources of calcium.

“Vegetarian Foods Are Loaded with Iron.”

Many plant foods contain nonheme iron, but it is far less bioavailable than heme iron in animal foods. Furthermore, phytates and oxalates reduce the absorption of nonheme iron.

“You Can Get Omega-3 Fatty Acids from Plant Foods Such as Chia Seeds and Flax Seeds.”

Some plant foods, such as flax and chia, contain alpha-linolenic acid. It’s possible for alpha-linolenic acid to be converted into EPA and DHA, the omega-3 fatty acids that are essential to our health, but conversion is poor in many people. (24)

Nutrient Deficiencies on Vegetarian and Vegan Diets

The nutrition myths mentioned above are often promoted by advocates of vegetarian and vegan diets. Perhaps not surprisingly, nutrient deficiencies are very common on such diets. (25) Deficiencies frequently occur because the staple foods of these diets (grains, legumes, vegetables, and fruits) are not as nutrient dense as the meat, seafood, and dairy products found in omnivorous diets.

Also, many of the staple foods of vegetarian and vegan diets, especially grains and legumes, are high in phytic acid and other anti-nutrients and thus have low nutrient bioavailability. If your goal is to optimize the nutrient density of your diet, then vegetarianism and veganism may not be your best option. For more information on the nutritional pitfalls of vegetarian and vegan diets, check out my article “Why You Should Think Twice about Vegetarian and Vegan Diets.”

Can Taking a Multivitamin Prevent Nutrient Deficiencies?

For decades, people have been encouraged by nutrition authorities to take a daily multivitamin. Currently, more than a third of Americans take a daily multivitamin, but do we really need one? A recent review of multivitamins published by Examine indicates that multivitamins are only useful for some populations, such as: (26)

  • People who can’t afford or access quality food
  • The elderly
  • Pregnant women
  • Those who can’t obtain nutrients from food (likely due to illness or other health conditions)

Contrary to what we’ve been told, most studies show no health benefits from multivitamins, and some even suggest that they cause harm. How could a multivitamin possibly be harmful? Supplemental nutrients don’t have the same effect on the body as nutrients obtained from food, and large doses of isolated nutrients may promote oxidative stress and inflammation. Foods remain the safest source of nutrients, as they contain a full spectrum of nutrients in forms that are recognized by the body. Furthermore, it is much harder to overdose on nutrients with foods than with supplements.

While supplements still have a place in Functional Medicine, they should not be used to replace a nutrient-dense diet.

I hope this article has helped you gain a better understanding of nutrient density and its crucial role in your health. By consuming a diet composed of nutrient-dense foods and being aware of bioavailability, you can optimize your nutrition status, protect yourself from nutrient deficiencies, and create the foundation for lifelong health.

The post What Is Nutrient Density and Why Is It Important? appeared first on Chris Kresser.



This article is Part 1 of a two-part series about the problems with nutrition research. For more on why you should be skeptical of the latest nutrition headlines, check out Part 2 of this series.

Nutritional epidemiology is basically the board game equivalent of a Ouija board—whatever you want it to say, it will say. – Dr. Peter Attia

Every week, we’re bombarded with splashy headlines in the media about the latest nutrition research. Here’s a sampling from the last few weeks alone:

Within a six-week period, we learned that low-carb diets will both lengthen your lifespan and shorten it, and that they’re both good and bad for diabetes. We also learned that consuming even small amounts of alcohol, which has long been regarded as health promoting, is now unhealthy.

For decades, we were told to limit dietary fat and cholesterol because they would clog our arteries, give us heart attacks, and send us to an early grave. Yet in 2010, the federal government removed its restriction on total fat from the U.S. Dietary Guidelines, and in 2015, they did the same thing for cholesterol, remarking that it is “not a nutrient of concern for overconsumption.” (1)

If you’re confused by this, or you’ve just stopped listening altogether, you’re not alone. And who could blame you? In a recent, scathing critique of nutrition research in JAMA, Dr. John Ioannidis, a professor at the Stanford School of Medicine, said:

Nutritional research may have adversely affected the public perception of science.

… the emerging picture of nutritional epidemiology is difficult to reconcile with good scientific principles. The field needs radical reform. (2)

In other words, you’re not crazy for doubting the latest media headlines or just throwing up your hands in frustration! In this article, I’m going to explore the reasons why skepticism is an appropriate response when it comes to most nutrition studies. Armed with this information, you’ll be better able to protect yourself and your family from the latest media hype and focus on what really matters when it comes to diet and nutrition.

Why You Can’t Trust Observational Studies as “Proof”

An observational study is one that draws inferences about the effect of an exposure or intervention without the researcher having any control over who receives it. It’s not an experiment where researchers are directing a specific intervention (like a low-carb diet) and making things happen. Instead, they are just looking at populations of people and making guesses about the effects of a diet or lifestyle variable.

Observational studies are good for generating hypotheses, but they can’t prove that a specific variable causes a specific outcome.

That is the domain of a randomized controlled trial (RCT), which randomly assigns participants to two groups—a treatment group that receives the intervention being studied and a control group that does not—and then observes them for a specific period of time.

We’ve all seen nutrition headlines promising groundbreaking information that will change the way we view our health. But how many news stories are based on studies with faulty methods, uncontrolled biases, and other major problems? Check out this article to find out.

Every scientist knows this, and most journalists should as well. Yet today, it’s not uncommon to see headlines like “Low-carb diet shortens your lifespan” and “Eating processed meat increases your risk of cancer,” which imply that the studies proved a causal relationship when, in fact, all they did was establish a correlation.

Correlation Is Not Causation

The problem is that two variables that are correlated, or associated together, do not always have a causal relationship. Consider the following examples, from Tyler Vigen’s excellent webpage called Spurious Correlations:

  • U.S. spending on space, science, and technology is 99.8 percent correlated with suicides by hanging, strangulation, and suffocation.
  • Per capita consumption of margarine in the United States and the divorce rate in the state of Maine are correlated at 99.3 percent.
  • Total revenue generated by arcades is 98.5 percent correlated with computer science doctorates awarded in the United States.

Those are incredibly strong correlations, but I think it’s fairly obvious that consumption of margarine in the United States has absolutely no impact on the divorce rate in Maine … right?
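It is easy to manufacture this kind of correlation. The sketch below computes a Pearson correlation from scratch, with made-up numbers standing in for the margarine and divorce series, and shows that any two series that merely trend in the same direction will correlate strongly:

```python
from math import sqrt

def pearson(xs, ys):
    # Pearson correlation coefficient, computed from first principles
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Two made-up series that simply trend upward over the same five years
margarine = [1.0, 2.0, 3.0, 4.0, 5.0]    # hypothetical consumption
divorces  = [3.0, 6.0, 8.0, 11.0, 14.0]  # hypothetical rate

print(round(pearson(margarine, divorces), 3))  # 0.998
```

A correlation of 0.998 with zero causal connection: any shared trend, including simple growth over time, is enough to produce it.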

Another great example of how easy it is to derive spurious correlations—especially when you set out with an agenda—comes from a large study of the most common diagnoses for hospitalization in 10.6 million Canadians. The researchers found that 24 diagnoses were significantly associated with the participants’ astrological signs: (3)

  • People born under Leo had a 15 percent higher risk of hospitalization due to gastrointestinal hemorrhage compared to other residents of Ontario.
  • People born under Sagittarius had a 38 percent higher risk of hospitalization for arm fractures compared to people with other signs.

In Dr. Ioannidis’s editorial in JAMA, he notes:

Almost all nutritional variables are correlated with one another; thus, if one variable is causally related to health outcomes, many other variables will also yield significant associations in large enough data sets.

As an example of just how absurd this can become, he notes that, if taken at face value, observational studies have inferred that:

… eating 12 hazelnuts daily (1 oz) would prolong life by 12 years (ie, 1 year per hazelnut), drinking 3 cups of coffee daily would achieve a similar gain of 12 extra years, and eating a single mandarin orange daily (80g) would add 5 years of life. Conversely, consuming 1 egg daily would reduce life expectancy by 6 years, and eating 2 slices of bacon (30g) daily would shorten life by a decade, an effect worse than smoking.

Are these relationships truly causal? Of course not, Ioannidis says. Yet study authors often use causal language when reporting the findings from these studies.

In fact, according to an analysis in 2013, authors of observational studies made medical or nutritional recommendations (suggesting their data showed a causal relationship) in 56 percent of cases. (4) The study authors summed up their findings as follows:

In conclusion, our empirical evaluation shows that linking observational results to recommendations regarding medical practice is currently very common in highly influential journals. Such recommendations frequently represent logical leaps. As such, if they are correct, they may accelerate the translation of research but, if they are wrong, they may cause considerable harm. [emphasis added]

I should note that it’s at least possible to become reasonably confident of a causal association between variables in an observational study using what is known as the Bradford Hill criteria:

  • Strength of the association
  • Consistency
  • Specificity
  • Temporality
  • Biological gradient
  • Plausibility
  • Coherence
  • Experiment
  • Analogy

The more of these criteria that are met, the more likely causation is present.

However, observational nutrition studies rarely satisfy these criteria, which makes the frequent claims of causality even more dubious.

There Are Problems with Data Collection Methods Too

There’s a saying in science: “Data are only as good as the instrument used to collect them.”

Way back in the 13th century, the English philosopher and Franciscan friar Roger Bacon said that scientific data must be: (5)

  1. Independently observable
  2. Measurable
  3. Falsifiable
  4. Valid
  5. Reliable

To use a simple example, if someone is eating an apple right in front of you, you can observe, measure, and either verify or refute that they’re doing that. But if they simply tell you that they ate an apple at some time in the past, you can neither observe, measure, verify, nor refute their story. You just have to take their word for it—and that is not science.

The term “observational nutrition study” is a misnomer because it suggests that researchers are actually observing what participants eat. But of course that’s not true; researchers aren’t standing around in people’s kitchens and going out to restaurants with them.

Instead, they are collecting data on what people eat by giving them questionnaires to fill out. There are different versions of these used in research, from food frequency questionnaires (FFQs), which may ask people to recall what they ate months or even years prior, to 24-hour recall surveys where people are asked what they ate over the past 24 hours.

These “memory-based assessments,” or “M-BMs,” bear little relation to actual calorie or nutrient consumption. Why? Because memory is not a literal, accurate, or even precise reproduction of past events. (6)

In a paper criticizing the validity of M-BMs for data collection, Edward Archer pointed out:

When a person provides a dietary report, the data collected are not actual food or beverage consumption but rather an error-prone and highly edited anecdote regarding memories of food and beverage consumption. (7)

Going back to the apple example above, researchers aren’t watching participants eat an apple. They’re relying on the participants’ reports of eating apples—sometimes several years prior!

We Can’t Rely on Memory When It Comes to Nutrition Research

But just how inaccurate are M-BMs? To find out, Archer analyzed questionnaires from participants in the National Health and Nutrition Examination Survey (NHANES), which is a long-running series of studies on the health and nutritional status of the American public. NHANES has served as the basis of dietary guidelines and public health recommendations.

Archer found that, over the then 39-year history of NHANES, the self-reported calorie intake of the majority of respondents (67 percent of women and 59 percent of men) was not physiologically plausible, and the average calorie intakes reported by overweight and obese people (i.e., the majority of Americans) were incompatible with life.

In other words, a bedridden, frail, elderly woman (i.e., a person with the lowest possible calorie requirements) could not survive on the number of calories reported by the average person in the NHANES survey!

And this isn’t just a problem in the United States. The inaccuracy of M-BMs has been replicated consistently over three decades and in multiple countries around the world. (8)

Can you see why this would be a problem?

Macronutrients (protein, fat, and carbohydrate) supply the calories we eat, and micronutrients (vitamins, minerals, and trace minerals) come packaged with them, so when calories are misreported, all nutrients are misreported as well.

What’s more, certain subgroups are more prone to underreporting, including people who are obese or have a high calorie intake. Obese subjects have been found to underreport up to half of their calorie intake, and in particular, they underreport fat and carbs. (9)

One consequence of this is that the health risks associated with a high fat (or carb) intake would be overestimated. Imagine that someone reports a saturated fat intake of 50 grams, and they have a total cholesterol of 200 mg/dL. But say they underreported their saturated fat intake by 40 percent, and their actual intake was 80 grams. This would overestimate the effect of saturated fat intake on total cholesterol because it assumed that eating 50 grams—rather than 80 grams—led to a total cholesterol of 200 mg/dL.
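Put in code, the arithmetic of that example looks like this; the figures are the made-up ones from the paragraph above, and the naive per-gram “effect” is only for illustration:

```python
# Made-up figures from the example above: a participant reports 50 g
# of saturated fat but actually eats 80 g (roughly 40% underreported).
reported_g  = 50   # grams on the questionnaire
actual_g    = 80   # grams actually eaten
cholesterol = 200  # mg/dL, the measured outcome

# Naive per-gram "effect" a researcher might attribute to saturated fat
apparent_effect = cholesterol / reported_g  # 4.0 mg/dL per reported gram
true_effect     = cholesterol / actual_g    # 2.5 mg/dL per gram eaten

print(apparent_effect / true_effect)  # 1.6: the association is inflated by 60%
```

The same cholesterol value gets attributed to fewer grams than were actually eaten, so each gram looks more harmful than it is.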

Where does that leave us? Archer doesn’t pull any punches:

Data collected from M-BM are pseudoscientific and inadmissible in scientific research and the formulation of national dietary guidelines. (10)

… the uncritical faith in the validity and value of M-BM has wasted significant resources and continues [to be] the single greatest impediment to actual scientific progress in the fields of obesity and nutrition research. (11)

Most people have no idea that the entire field of observational nutrition research—and all of the media headlines that come out of it—is based on questionnaires about what people eat. Now that you know, will you ever look at nutrition headlines in the same way again?

How the “Healthy-User” Bias Impacts Findings

The “healthy-user” bias refers to the observation that people who engage in a behavior perceived as healthy are more likely to engage in other behaviors that are also perceived as healthy and vice versa.

For example, because red meat has been perceived as “unhealthy” for so many years, on average, people that eat more red meat are more likely to: (12)

  • Smoke
  • Be physically inactive
  • Eat fewer fruits and vegetables
  • Be less educated

Of course, most researchers are well aware of the influence of confounding factors and the healthy-user bias, and good ones do their best to control for as many of these factors as they can. But even in the best studies, researchers can’t control for all possible confounding factors because our lives are simply too complex. As Norman Breslow, a former biostatistician at the University of Washington, once said:

People think they may have been able to control for things that aren’t inherently controllable.

One of the inevitable results of the healthy-user bias is that many observational studies end up comparing two groups of people that are not at all similar, and this casts doubt on the findings.

For example, early studies suggested that vegetarians live longer than omnivores. However, these studies compared Seventh Day Adventists—a religious group that advocates a vegetarian diet and a healthy lifestyle as part of their belief system—with the general population.

That introduces serious potential for healthy-user bias because the members of the SDA church engage in lifestyle behaviors—like not smoking or drinking alcohol, eating more fresh fruits and vegetables, and getting more exercise—that have been shown to reduce the risk of death from cardiovascular disease and all causes. So, we can’t possibly know whether the reduction in deaths observed in these studies was related to the vegetarian diet or to these other behaviors, and thus the findings are not generalizable to the wider population.

(As a side note, four later studies that compared vegetarians with a more health-conscious population of omnivores found that both groups lived longer than the general population, but there was no difference in lifespan between the vegetarians and healthy omnivores. You can read more about this in my article “Do Vegetarians and Vegans Really Live Longer than Meat Eaters?”)

The healthy-user bias plagues most observational nutrition studies, and yet we hardly ever hear it mentioned when these studies are reported in the media. Now that you know about it, how might you respond differently to some of the headlines I shared at the beginning of the article?

  • “Low-carb diets could shorten life, study suggests”
  • “Eating cheese and butter every day linked to living longer”
  • “Whole grains one of the most important food groups for preventing type 2 diabetes”

Would you ask questions like:

  • Since fat has been perceived as unhealthy and low-carb diets are high in fat, were the people eating low-carb diets also engaging in other behaviors perceived as unhealthy?
  • Were the people eating more cheese and butter doing anything else that might have contributed to a longer lifespan?
  • Were the people who were eating more whole grains exercising more or engaging in other behaviors perceived as healthy (since eating whole grains is perceived as healthy)?

The “Risks” Are Often Pure Chance

In 2015, the International Agency for Research on Cancer (IARC) issued a report suggesting that for every 50 grams of processed meat consumed, the relative risk of cancer was increased by 18 percent compared to those who ate the least processed meat. (13)

How confident can we be of that claim? In epidemiology outside the field of nutrition (and even within the nutrition field until recently), the threshold for confidence in relative risk is between 100 and 300 percent. In other words, we’d need to see an increase or decrease of risk of between 100 and 300 percent for a given intervention before we could be confident that the change observed was due to the intervention and not simply to chance.

According to the late epidemiologist Syd Shapiro, cofounder of the Slone Epidemiology Center, at the higher end of this range, one can be guardedly confident, but “we can hardly ever be confident about estimates of less than 100 percent, and when estimates are much below 100 percent, we are simply out of business.” (14)

Marcia Angell, the former editor of the New England Journal of Medicine, said much the same thing in a 1995 article in Science called “Epidemiology Faces Its Limits”:

As a general rule of thumb, we are looking for a relative risk of three or more [before accepting a paper for publication], particularly if it is biologically implausible or if it’s a brand-new finding.

And Robert Temple, who was the director of drug evaluation at the Food and Drug Administration (FDA) at the time, put it even more bluntly in the same Science article:

My basic rule is if the relative risk isn’t at least three or four, forget it.

Most epidemiologists interviewed for the Science article said they would not take seriously a single study reporting a new potential cause of cancer unless the increase in risk was at least three-fold.

This is bad news for observational nutrition studies since the vast majority of relative risks reported fall well below this threshold. Most are well below 100 percent, and many—like the IARC finding on processed meat and cancer—are below 25 percent.

To put this in perspective, the increased relative risk of lung cancer from smoking cigarettes is between 1,000 and 3,000 percent. The increased relative risk of liver cancer from eating grains contaminated with aflatoxin is 600 percent.

It’s also important to consider the difference between absolute and relative risk reduction. Researchers often use relative risk statistics to report the results of nutrition studies. For example, in the IARC report, they said that every 50 grams of processed meat consumed increased the risk of cancer by 18 percent. But when that increase in relative risk is stated in absolute terms, it doesn’t sound quite as impressive. The lifetime absolute risk of colon cancer in vegetarians is 4.5 out of 100; in people eating 50 grams of processed meat every day for a lifetime, the risk is 5.3 out of 100. (15)
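To make the arithmetic concrete, here's a minimal Python sketch using the IARC numbers above (the function names are mine, purely for illustration):

```python
def relative_risk_increase(baseline, exposed):
    """Percent increase in risk relative to the baseline group's risk."""
    return (exposed - baseline) / baseline * 100

def absolute_risk_increase(baseline, exposed):
    """Difference in risk, in percentage points."""
    return exposed - baseline

# Lifetime absolute risk of colon cancer, per the IARC example:
# 4.5 in 100 for vegetarians, 5.3 in 100 for daily processed-meat eaters.
baseline, exposed = 4.5, 5.3

print(f"Relative increase: {relative_risk_increase(baseline, exposed):.0f}%")
print(f"Absolute increase: {absolute_risk_increase(baseline, exposed):.1f} percentage points")
```

The same 0.8-percentage-point difference reads as a headline-friendly "18 percent" in relative terms, which is exactly why the framing matters.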

All of this suggests that most findings in observational nutrition studies are indistinguishable from chance and are unlikely to be validated by RCTs (which is exactly what has happened in most cases, as I’ll explain shortly). Yet despite this, many of these studies are highly publicized in the media and often reported as if they conclusively discovered a causal relationship.

The current climate in both academia and the media, unfortunately, contributes to this. Null results—when researchers don’t find a positive or negative association—are significantly less likely to be published, and without publication, researchers are out of a job. (16, 17) And the pressure to get clicks and generate advertising revenue in the digital media world leads to splashy headlines that overstate or distort what the study actually found. One study found that 43 percent of front-page stories reporting on medical research are based on research with mostly preliminary findings (i.e., they were observational studies that didn’t prove causality). (18)

“The sin,” Dr. Sander Greenland, an epidemiologist at UCLA, has said, “comes from believing a causal hypothesis is true because your study came up with a positive result.” (19)

Sadly, this is more the rule than the exception today.

I hope this article has given you some reasons to remain skeptical about nutrition research. For more information on this topic, check out Part 2 of this article series—and let me know what you think in the comments below!

The post Why You Should Be Skeptical of the Latest Nutrition Headlines: Part 1 appeared first on Chris Kresser.

Be Nice and Share!
This post was originally published on this site

http://chriskresser.com/

An elderly person eats a vegetarian meal, possibly due to a belief that vegetarians and vegans live longer than omnivores.

In this post, I’m going to do a deep dive into the research into whether vegetarians and vegans live longer than omnivores—people who consume a mixed diet including meat. It’s a long article, because I wanted to cover all of the available studies on the topic and be as thorough as possible.

For the time-challenged among you, I’ll spill the beans right up front: while early studies did suggest a survival advantage for vegetarians and vegans, more recent and much higher-quality evidence has found no difference in lifespan between omnivores and vegetarians and vegans.

That’s the TL;DR. Let’s take a closer look at how researchers arrived at this conclusion.

Comparing Apples to Apples: Addressing the “Healthy-User Bias”

One of the biggest issues with studies that compare the health and longevity of vegetarians with omnivores is that they’re not comparing “apples to apples.”

The average vegetarian tends to be more health-conscious than the average omnivore: the very fact that a person has chosen to be a vegetarian indicates that she’s thinking more about her health than a “SAD omnivore”—someone following a Standard American Diet (SAD) and lifestyle.

This explains why studies show that vegetarians typically engage in a healthier lifestyle overall than SAD omnivores: they smoke and drink less, are less likely to be overweight or have conditions like diabetes, have higher levels of physical activity, and eat more fruits and vegetables, among other factors. (1, 2)

If scientists do a study to compare lifespan between SAD omnivores and vegetarians, and they observe that vegetarians live longer, how do they know that it was abstaining from eating meat that led to that survival advantage? How do they know that it wasn’t exercising more, drinking and smoking less, and/or other healthy lifestyle choices that extended the vegetarians’ lives?

They don’t.

These lifestyle factors are highly significant—more on this below—and are almost certainly responsible for the results of the few studies that did find a survival advantage in vegetarians.

This issue is known as the healthy-user bias, and it’s a huge problem that plagues most nutrition studies. I discussed it in detail in a podcast called “Heart Attacks and Red Meat—Correlation or Causation?,” but here’s the short version: People who engage in a behavior perceived as healthy are more likely to engage in other behaviors that are also perceived as healthy, and vice versa.


Since eating meat has (wrongly) been perceived as unhealthy for the past five decades, people who eat less meat are less likely to smoke, drink excessively, or be overweight, and are more likely to exercise, eat fruits and vegetables, and get enough sleep. Can you see how this might cause problems?

If we really want to know whether removing meat and animal products from the diet extends lifespan, we’d need to study two groups of people:

  • Vegetarians
  • Health-conscious omnivores, or “nutrivores” (see footnote)

A nutrivore would be someone who lives a healthy lifestyle (doesn’t smoke, doesn’t drink excessively, gets plenty of exercise and sleep, etc.) and consumes nutrient-dense, anti-inflammatory, whole foods like:

  • Meat
  • Organ meat
  • Shellfish
  • Fish
  • Bone broth
  • A wide variety of fresh fruits and vegetables (including sea vegetables)
  • Nuts
  • Seeds
  • Tubers

Comparing vegetarians with nutrivores would be more like comparing apples to apples.

To date, no studies have done this in a rigorous way. However, there are four studies that at least attempted to select a population of more health-conscious omnivores to compare against vegetarians and vegans:

  • The Health Food Shoppers Study
  • The Oxford Vegetarians Study
  • The EPIC-Oxford Cohort
  • The Heidelberg Study

We also have the 45 and Up Study from Australia. While this study didn’t select a healthier omnivore population, as the four studies I mentioned above did, the researchers did a much better job of controlling for confounding factors—like obesity, diabetes, smoking, drinking, and socioeconomic status—that would be likely to influence lifespan.

I’m going to review these studies in more detail below. But before I do that, let’s first discuss why total mortality is always the most important endpoint to consider in this kind of research.

Dying From Cancer Isn’t Better Than Dying From Heart Disease: Why Total Mortality Is the Most Important Endpoint in Longevity Studies

Let’s say a study finds vegetarians have a 30 percent lower risk of death from heart disease than omnivores. Sounds impressive, right? This difference will be what is reported in splashy media headlines, and people scanning through their newsfeeds will come away with the idea that vegetarians do live longer than meat eaters.

But then we take a closer look at the data, and we find that there was no difference in the number of deaths overall between the two groups. In medical research, the term “total mortality” refers to deaths from all causes—and it’s the most important endpoint to consider in studies comparing lifespan between groups.

Why? Because a reduction in deaths from one condition doesn’t mean much if it comes with an equal or greater increase in deaths from other conditions. The overall risk of death would be the same. Researchers call this “disease substitution”—because one cause of death is simply substituted for another.
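A toy calculation with invented numbers shows how disease substitution can hide inside cause-specific statistics:

```python
# Invented numbers, purely to illustrate "disease substitution":
# deaths per 10,000 people over the study period, by cause.
vegetarians = {"heart disease": 70, "cancer": 80, "other": 100}
omnivores = {"heart disease": 100, "cancer": 60, "other": 90}

# The cause-specific comparison looks impressive...
hd_reduction = (omnivores["heart disease"] - vegetarians["heart disease"]) / omnivores["heart disease"]
print(f"Heart-disease deaths: {hd_reduction:.0%} lower in vegetarians")

# ...but total mortality, the endpoint that actually matters, is identical.
print(sum(vegetarians.values()), sum(omnivores.values()))
```

In this made-up example, a "30 percent lower risk of heart-disease death" headline would be technically true, yet nobody in either group lived any longer overall.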

You’ll see why this is such an important concept to understand as we review the studies below.

The Seventh Day Adventist Studies: The Healthy-User Bias Strikes Again!

The Seventh Day Adventist (SDA) Church is a Protestant Christian denomination that grew out of the Millerite movement in the United States. Health has been a focus of SDA teachings since the church’s inception in the 1860s. According to Wikipedia:

Adventists are known for presenting a “health message” that advocates vegetarianism and expects adherence to the kosher laws, particularly the kosher foods described in Leviticus 11, meaning abstinence from pork, shellfish, and other animals proscribed as “unclean.” The church discourages its members from consuming alcoholic beverages, tobacco or illegal drugs. … In addition, some Adventists avoid coffee, tea, cola, and other beverages containing caffeine.

Interesting side note: If you think U.S. dietary guidelines are only based on science, think again. SDA, a religious group, has influenced dietary guidelines for more than 100 years. For example, Adventist Lenna Cooper co-founded the American Dietetic Association (ADA) in 1917. She was an acolyte of Dr. John Harvey Kellogg, one of the founders of the breakfast cereal industry (Corn Flakes, anyone?), and she wrote textbooks and other materials that were used in dietetic and nursing programs not only in the United States, but around the world, for more than 30 years. For a fascinating look at the history of the SDA church’s influence on the dietetics field and the promotion of a vegetarian diet in the U.S., see this article.

Three studies have been done comparing the lifespan of SDAs to the general population:

  1. Adventist Netherlands (1968–1977)
  2. The U.S. Adventist Health Study (1977–1982)
  3. The U.S. Adventist Health Study 2 (2002–2009)

I’ve summarized the relevant findings of each of these studies below.

Adventist Netherlands

This study focused on 522 Dutch SDAs. Compared to omnivores, vegetarians had: (3)

  • 59 percent fewer deaths from cardiovascular disease
  • 57 percent fewer deaths from ischemic heart disease
  • 46 percent fewer deaths from cerebrovascular disease
  • 55 percent fewer deaths from all causes

The Adventist Health Study

This study examined 34,198 SDAs in the United States. Compared to omnivores, vegetarians had: (4)

  • 38 percent fewer deaths from ischemic heart disease
  • 7 percent fewer deaths from cerebrovascular disease
  • 20 percent fewer deaths from all causes

The Adventist Health Study 2

In this study of 73,308 SDAs in the United States, compared to omnivores, vegetarians had: (5)

  • 19 percent fewer deaths from ischemic heart disease
  • 13 percent fewer deaths from cardiovascular disease
  • 12 percent fewer deaths from all causes

These results certainly do seem impressive on the surface!

But, before you trade in your steak and salmon for tofu and lentils, let’s consider the serious shortcomings of these studies.

The Adventist Netherlands study compared SDAs with the general population—not nutrivores! That introduces serious potential for healthy-user bias, because members of the SDA church engage in lifestyle behaviors—like not smoking or drinking alcohol, eating more fresh fruits and vegetables, and getting more exercise—that have been shown to reduce the risk of death from cardiovascular disease and all causes. So, we can’t possibly know whether the reduction in deaths observed in this study was related to the vegetarian diet or to these other factors, and thus the findings are not generalizable to the wider population.

Both Adventist Health Studies in the United States had better study designs. They recruited both the vegetarians and non-vegetarians from the SDA population. This reduced the healthy-user bias at least somewhat, since SDA omnivores would be expected to be healthier overall than SAD omnivores.

However, the first U.S. SDA study from the late 1970s and early 1980s did not control for smoking, body mass index, and other confounding factors. So, did the vegetarians in that study enjoy a longer lifespan because they didn’t eat meat, or because they smoked and weighed less than their omnivorous counterparts? We simply don’t know, which means the reduced risk of mortality among vegetarians in this study is also not generalizable to the wider population.

The second U.S. SDA study did a better job of controlling for confounding factors such as smoking and body mass index. But that doesn’t remove the potential for confounding altogether. On the contrary, as the authors of the study observed: (6)

Observed mortality benefits may be affected by factors related to the conscious lifestyle choice of a vegetarian diet other than dietary components. Potential for uncontrolled confounding remains.

Caution must be used in generalizing results to other populations in which attitudes, motivations, and applications of vegetarian dietary patterns may differ. [emphasis added]

To summarize:

Of the three SDA studies, two of them aren’t generalizable to the wider population because they either didn’t select a healthy omnivore population to compare the vegetarian groups against or because they failed to control for significant confounding factors like smoking and BMI. The third SDA study, while better than the other two, still suffers from significant potential for healthy-user bias.

The “Apples to Apples” Studies: What Research Comparing Healthy Omnivores to Vegetarians and Vegans Tells Us

Let’s turn our attention now to the studies that compare a healthier omnivore population with vegetarians and vegans.

It’s important to point out that we’re still not talking about nutrivores here; as you’ll see below, all we know about these omnivores is that they shop in health food stores or read health magazines. That said, these studies are at least a step in the right direction toward comparing “apples to apples,” because they at least attempt to reduce the healthy-user bias.

There are four studies that were designed specifically to select a healthier omnivore population and one study that did not do this but was performed using a large sample of the general population (rather than a religious group) and did a much better job controlling for confounding factors than most other studies of this type do.

The Health Food Shoppers Study

The Health Food Shoppers Study was a prospective cohort study of 10,736 subjects in the United Kingdom recruited between 1973 and 1979. Here’s a description of how participants were recruited, directly from the study: (7)

Subjects were recruited by distributing a short questionnaire to customers of health food shops and clinics, subscribers to health food magazines and a Seventh Day Adventist publication, and members of vegetarian and health food societies.

The idea was that omnivores who shop at health food stores, belong to health food societies, or read health food and/or SDA magazines and publications would be more health-conscious than average omnivores. This is certainly a step in the right direction in terms of the likelihood that the study results would be thrown off by the “healthy-user bias.”

What did the researchers find? In a 1999 analysis of the data, both vegetarians and omnivores in the health food store group lived longer than people in the general population—not surprising given their higher level of health consciousness—but there was no survival difference between vegetarians and omnivores. (8)

The Oxford Vegetarians Study

The Oxford Vegetarians Study was a prospective cohort study of 11,045 subjects from the United Kingdom recruited between 1980 and 1984. (9) The researchers recruited vegetarians first through the Vegetarian Society of the United Kingdom. They then asked the vegetarians to invite friends and relatives who consumed animal products, with the assumption that the omnivorous friends of vegetarians are likely to be healthier than the general population of omnivores.

As in the Health Food Shoppers Study, both vegetarians and health-conscious omnivores had lower risk of early death than the general population, but there was no difference in lifespan between the two groups.

EPIC-Oxford Cohort

The EPIC-Oxford Cohort is a component of the European Prospective Investigation into Cancer and Nutrition and included 44,561 participants from the United Kingdom recruited between 2002 and 2007, with a follow-up in 2009. (10) They used the same method of recruiting participants as the Oxford Vegetarians Study mentioned above.

In this study, researchers found that the risk of death for both vegetarians/vegans and omnivores was 52 percent lower than in the general population—similar to findings from the two studies above. However, there was no difference in mortality between vegetarians and omnivores.

Heidelberg Study

The Heidelberg Study was a prospective cohort study of 1,904 subjects from Germany recruited between 1976 and 1999. (11) They used a similar recruiting method to that of the Oxford Vegetarians Study and the EPIC-Oxford Cohort: participants were recruited from readers of vegetarian magazines in Germany using a short questionnaire, and the vegetarians who agreed to participate were asked to invite family members who ate meat.

This study found that vegetarians had slightly higher (10 percent) total mortality than healthy omnivores. What’s more, the data suggested that non-dietary factors played a much greater role in predicting lifespan than diet:

In summary, we conclude that the recommended healthy lifestyle factors, particularly abstinence from smoking, a moderate or high level of physical activity, moderate alcohol intake, and absence of overweight, are important determinants of reduced mortality in both vegetarians and nonvegetarians who already follow a healthy lifestyle.

We’ll discuss the contribution of non-dietary factors to lifespan in more detail below.

The 45 and Up Australian Study

The 45 and Up Study is a longitudinal study that followed 243,096 participants in New South Wales, Australia, for six years. At baseline, the vegetarians in the study were:

… younger, less likely to be overweight or obese, more likely to be female, and less likely to have cardiovascular and metabolic diseases (including Type 2 diabetes, heart disease or stroke) and hypertension at the time of recruitment. They were also more likely to have healthy lifestyle behaviours such as a lower prevalence of smoking and risky alcohol intake. (12)

As you now understand, these significant lifestyle differences between the vegetarians and the omnivores in the study introduce huge potential for the healthy-user bias. However, the study authors were aware of this and mentioned the shortcomings of previous research (like the SDA studies) in this regard:

First, most of these studies were designed to recruit vegetarians or occurred in populations with higher proportions of vegetarians (such as the Adventists, who may have other lifestyle factors and health-enhancing behaviours responsible for the observed protective effects), therefore previous findings may have limited generalizability. Second, vegetarians usually engage in an overall healthier lifestyle compared with their non-vegetarian counterparts, such as a lower prevalence of smoking and excessive alcohol consumption (Key et al., 2009) and have higher levels of physical activity (Bedford and Barr, 2005). Therefore the protective effects observed could be due to the other concurrent behaviours.

This awareness of the high risk of bias in observational nutrition research is likely what led them to pay much more attention to confounding factors than other, similar studies. They controlled for age, gender, education, marital status, geographic remoteness, SEIFA (Socio-Economic Index for Area, a census-based ecologic measure of socio-economic disadvantage), smoking status, physical activity, alcohol intake, and comorbidities including cancer, hypertension, and cardiometabolic disease (CMD), which includes type 2 diabetes, stroke, and heart disease.

While this doesn’t completely eliminate the healthy-user bias, it does at least reduce it. What were the results? They found no significant difference in total mortality between vegetarians and omnivores. There was also no difference in mortality between vegetarians, pesco-vegetarians, and semi-vegetarians.

This finding is especially notable because the study had such a large sample size (around 250,000 participants). Because vegetarianism is relatively rare, a large sample size with sufficient numbers of vegetarians and different variations of vegetarian diets makes the results more likely to be accurate.

Meta-Analyses: What Have Reviews of Individual Studies Added?

There have been two meta-analyses (studies that statistically pool the results of many individual studies) comparing mortality in vegetarians and omnivores, including all of the studies discussed above. Both found no difference in total mortality between vegetarians and omnivores.

The first was done in 2014. (13) It found no difference in total mortality between vegetarians/vegans and omnivores.

What’s more, it concluded that any benefit observed in previous analyses for vegetarian diets was driven by the SDA studies, which, as we’ve seen, suffer from multiple confounding factors that weren’t adequately controlled for:

… while all of the SDA studies demonstrate significant reduction in all-cause mortality with vegetarian diet, this finding was not replicated in four of the non-Adventist studies.

In conclusion, the reduction in IHD [ischemic heart disease] and all-cause mortality with vegetarian diet stems mainly from the Adventist studies, and there is much less convincing evidence from studies conducted in other populations.

In view of these inconsistent findings, we conclude that the benefits of vegetarian diet for reducing death and vascular events remain unproven. [emphasis added]

The study authors suggested several possibilities for why the SDA studies showed a difference in mortality whereas the other studies did not. The SDA diet is characterized by higher intake of fruits and vegetables compared with typical omnivores, and regular SDA churchgoers are more likely to abstain from smoking, have good health practices, and stay married (all of which contribute to longevity in other studies). In addition, they are advised to get sufficient rest, exercise regularly, and maintain healthy relationships. All of these differences explain why the findings from the SDA studies are not generalizable to the wider population.

The second meta-analysis was published in 2017 and reviewed 96 studies. (14) Although they found slight relative reductions in death from heart disease and cancer in vegetarians and vegans compared with omnivores, they found no difference in total mortality.

In the discussion section, the authors remark that the modest reductions in risk factors like BMI, glucose, and cholesterol observed among vegetarians and vegans need to be taken with a grain of salt:

The results of the present meta-analysis report that vegetarians and vegans show significantly lower levels of the most relevant risk factor for chronic disease such as BMI, lipid variables and fasting glucose, when compared to nonvegetarians and nonvegans. These findings, however, are significantly affected by the nature of the cross-sectional studies, which are highly susceptible to biases, as otherwise observed by the moderate-to-high risk of bias assessment in each included study. [emphasis added]

They also suggested that virtually all studies that claim to find health benefits from a vegetarian diet are highly subject to the healthy-user bias:

Indeed, generally speaking, vegetarians tend to be more conscious for the health aspects, slimmer, and in better health when compared with omnivores, and specific cohorts have been demonstrated to be not generalizable to the general population for the low prevalence of risk factors (Kwok et al., 2014). These findings might indicate the presence of flaws in the analysis of possible health benefits of vegetarian diet. [emphasis added]

The Mormon Studies: Another Group of “Healthy Omnivores” That Lives Longer Than the General Population

As we’ve learned, the findings from the SDA studies aren’t generalizable to the wider population because Adventists engage in many behaviors independent of their diet that could contribute to a longer lifespan, such as:

  • Not smoking or drinking
  • Exercising more
  • Eating more fruits and vegetables

What if there were another population we could study that also engaged in these healthy lifestyle practices but followed an omnivorous diet? If such a population also enjoyed better health and a longer lifespan than the general population, that would be another strong piece of evidence that the benefits seen in SDAs are due to their lifestyle, not to abstaining from meat.

Well, turns out there is a population like this: Mormons. Religiously active Mormons practice a healthy lifestyle, which includes:

  • Not smoking or drinking
  • Exercising
  • Getting adequate sleep
  • Having a strong family life
  • Obtaining high levels of education (15)

All of these factors have been associated with reduced risk of disease and extended lifespan in other studies.

There have been three studies that examined the standardized mortality ratio (SMR) of Mormons to assess lifespan. The SMR is the ratio of deaths in a particular group compared to the general population. It can be expressed as a ratio (e.g., 0.75) or a percentage (e.g., 75 percent). If the SMR of a group studied is 75 percent, that means their death rate is 25 percent lower than the death rate in the general population.
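The SMR calculation itself is simple. Here's a short sketch with hypothetical numbers (not taken from the Mormon studies themselves):

```python
def standardized_mortality_ratio(observed_deaths, expected_deaths):
    """Observed deaths in the study group divided by the deaths that would
    be expected if the group died at general-population rates."""
    return observed_deaths / expected_deaths

# Hypothetical cohort: 150 deaths observed where 200 would be expected
# at general-population rates.
smr = standardized_mortality_ratio(150, 200)
print(f"SMR: {smr:.2f}, i.e., a {1 - smr:.0%} lower death rate")
```

With these invented inputs the SMR is 0.75, which is how a "25 percent lower death rate" would be reported.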

Here are the results of the three studies:

Across all three studies, Mormons had a significantly lower death rate, and thus a longer lifespan, than their counterparts in the general population. This provides even more evidence that a healthier lifestyle, rather than abstaining from meat, was responsible for the survival advantage of vegetarians observed in the SDA studies.

There’s No Real Evidence That Vegetarians and Vegans Live Longer Than Meat Eaters

Here’s what we can conclude based on the existing research:

  1. While the average vegetarian may live longer than a “SAD omnivore,” there is no evidence that they live longer than more health-conscious omnivores.
  2. Studies showing health benefits of vegetarian diets are highly susceptible to the healthy-user bias, and their findings are not generalizable to the wider population.
  3. Diet and lifestyle factors such as exercise, alcohol intake, smoking, BMI, sleep, and fruit and vegetable consumption play a strong role in predicting lifespan independently of whether meat or animal products are consumed.

It’s also worth noting that we still don’t have a study that compares “nutrivores” with vegetarians and vegans. Although the methods used in the “apples to apples” studies mentioned above are better than the SDA and most observational studies, they’re far from perfect.

These studies recruited subjects who either shop at health food stores, read certain magazines, or are friends or family members of vegetarians and vegans. But is that really a reliable method for selecting a healthy omnivorous population?

Why It’s Hard to Create an Accurate Study

A truly accurate comparison of health-conscious omnivores and vegetarians would require a randomized controlled trial (RCT) that lasts for at least 25–30 years, locks people up in a metabolic ward, and tightly controls their diet, exercise, sleep, alcohol intake, smoking, and other relevant lifestyle factors so that the only variable that differs between the two groups is whether they consume meat. This will never happen: it would cost hundreds of millions of dollars (who’s going to pay for that?), and researchers would never find people willing to participate.

In the meantime, what’s a concerned person to do? Since modern nutrition research is not sufficient to answer this question, we need to rely on other lines of evidence. For example, what is the natural, human diet? What do studies say about the nutrient density of meat and other animal products? What do RCTs tell us about a Paleo diet—a whole-foods, nutrient-dense diet that emphasizes fresh, whole vegetables and fruits, nuts and seeds, and starchy plants, but also includes animal products?

I’ve addressed many of these topics elsewhere on my blog, podcast, and in my first book, The Paleo Cure, and I’ll be revisiting them soon. For now, you can start with the links in the paragraph above and simply focus on the things we know are important from an ancestral health perspective.

I hope you’ve enjoyed this article and that it has helped to clear up some of the confusion on this topic.

Now I’d like to hear from you. Have you heard that vegetarians and vegans live longer than their meat-eating counterparts? What do you think now? Let me know what you think in the comments section!

The term “nutrivore” comes from Dr. Sarah Ballantyne of The Paleo Mom.

The post Do Vegetarians and Vegans Live Longer than Meat Eaters? appeared first on Chris Kresser.

This post was originally published on this site

http://www.thekitchn.com/feedburnermain

This fast and fancy dessert is the recipe you didn’t know your life was missing. Chocolate mousse, unlike cookies or brownies, is all too often considered a fussy restaurant-only recipe, but Chrissy Teigen proves it’s just the opposite. The mousse is easy enough to make any day of the week, and the crispy-rice-cereal-meets-luxe-hazelnut-crackle makes this recipe extra special.

I’m going on record right now saying that in years to come, Chrissy Teigen’s chocolate mousse recipe will become legendary — achieving the same status as Katharine Hepburn’s brownies or Ina’s coconut cake. It’s just that good.

READ MORE »

This post was originally published on this site

http://www.thekitchn.com/feedburnermain

As someone who proudly identifies as an appliance nerd, I’m often asked by friends and family what my favorite go-to kitchen tool is. Nine times out of 10 you’ll hear me tell them that it’s my trusty Nespresso machine. I’ve owned mine for five years and have even purchased one as a gift — that’s how much I stand by the convenience and quality of this product. On a recent trip to Europe, I noticed Nespresso machines in almost every restaurant I visited, which should tell you something about how good the coffee is.

READ MORE »

This post was originally published on this site

http://www.thekitchn.com/feedburnermain

Kitchn’s Delicious Links column highlights recipes we’re excited about from the bloggers we love. Follow along every weekday as we post our favorites.

Can we have an honest discussion about chicken? Why are we all so obsessed with it? Is it really that great? In the case of this garlic basil chicken that’s swimming in tomato butter sauce from Pinch of Yum, yes — yes it really is that great.

I’m so sorry, chicken, I should have never doubted you. I’ll never do it again, I promise.

READ MORE »

This post was originally published on this site

http://www.thekitchn.com/feedburnermain

Despite the rumors that eggplant is bitter, fussy, or boring, it can play the role of many other kitchen staples. You can transform eggplant into a luscious dip, make it mimic potatoes (when cooked as fries), or grill it as a “burger.” But roasted eggplant might be this nightshade’s greatest trick — turning it from soft and spongy to crispy and tender.

READ MORE »
