What (We Hope) Science-Backed Nutrition Advice Will Look Like In 2020
As 2020 approaches, you'd like to think we have this whole "what to eat" thing figured out. And sure, in some areas there's a pretty solid consensus—not many people would argue that eating a wide variety of colorful veggies and scaling back on refined carbs is going to boost your health. But many areas of nutrition science are significantly more murky (ahem, red meat).
On the one hand, a reasonable case can be made that red meat should be avoided like the plague. For years, major health organizations like the American Heart Association and the World Health Organization (WHO) have recommended limiting red and processed meat consumption, with the WHO going as far as to classify processed meat as "carcinogenic" and red meat as "probably carcinogenic." And wildly popular plant-based meat companies like Beyond Meat and Impossible Foods proudly tout anti-meat research stats on their websites. Beyond Meat, for example, says consumption of animal meat is associated with a 16% increased risk of cancer and a 21% increased risk of heart disease.
But a single new study can call all of that into question, which is exactly what happened this past October, when a research review in Annals of Internal Medicine concluded that eating less red meat won't necessarily benefit you. In the report, which was met with harsh criticism from some in the nutrition community and elation from others (particularly the regenerative ag and carnivore diet crowds), reviewers found little to no connection between red meat and an increased risk of heart disease and cancer, despite years of recommendations to limit consumption.
And it's not just the word on meat that's continually flip-flopping, causing people to freak out in the process. Earlier this year, a much-criticized study published in JAMA claimed that the cholesterol in eggs is associated with a higher risk of heart disease and early death—even though dietary cholesterol had largely been vindicated in previous studies and we were just starting to embrace those yolks again.
Even more frustrating, every time the nutrition research flips, it rarely brings us to a united consensus on what to eat. If anything, we become more deeply entrenched in our beliefs. It causes raging Twitter wars between vegans and meat-eaters (with lengthy diatribes featuring cherry-picked studies to support their arguments), breeding distrust in nutrition science as a whole.
So what gives? Why does it seem like every time we decide on a food's healthfulness, it changes? The truth is, the way most nutritional research is conducted is highly flawed—and with these recent studies causing so much outrage, nutritional science is coming to a head, revealing that we really don't know as much as we thought and that it's time to look at the data with a more critical eye.
The huge problem with the vast majority of nutrition research.
People are often quick to blame the media for the ever-conflicting slew of dietary advice, and it's true that, on the whole, the media doesn't always provide enough context as to how data from a new study stacks up to the overall body of research on a given food, diet, or nutrient. But the bigger problem, it turns out, is the data itself and how it's being collected and interpreted.
Despite the ire that the recent controversial red meat study drew from some in the nutritional community, it brought up some valid points about the prevalence and risk of low-quality studies. Most current dietary recommendations, the study authors wrote, are "primarily based on observational studies that are at high risk for confounding," meaning they're at high risk for resulting in inaccurate associations between a given food and a particular outcome.
In fact, there is no shortage of doctors and dietitians willing to share their distaste and distrust for observational studies (aka nutritional epidemiology), in which people report what they eat over a period of time and researchers compare their health outcomes. For the most part, experts agree that while these studies can point us in the direction of a possible connection between two variables, they shouldn't be used to make black-and-white recommendations.
"Plenty of people have written eloquently about the idea that nutritional epidemiology is just kind of garbage," Ethan Weiss, M.D., cardiologist and associate professor at U.C.–San Francisco's Cardiovascular Research Institute, told mbg in a conversation about the JAMA egg study. "It's basically a tarot card reading; you can see whatever you want in these results. But it gets a lot of attention, and as long as people keep reacting to it the way they have been, we're going to keep seeing it."
The core problem with nutritional epidemiology, Weiss says, is that we are very bad at measuring what people are actually eating. In fact, people tend to only accurately recall about 50% of what they eat. The other problem: Even if the data compiled is good (which it may not be), there are so many confounding factors that it's difficult to identify how one component of a person's diet is truly affecting their health. Someone who eats eggs, for example, may also eat more bread, potatoes, or bacon, and maybe that has something to do with their increased risk of heart disease. "It's really hard to untangle what the actual problem is," he says.
One of the loudest voices against nutritional epidemiology has been John Ioannidis, M.D., DSc, professor of medicine and health research and policy at Stanford University School of Medicine. He has spoken and written extensively on the flaws of observational nutrition studies, suggesting that funds be redirected to fewer, better-designed randomized clinical trials (RCTs). "These studies need to be largely abandoned," he said in a recent Stanford Medicine interview. "Recall biases, in which study participants remember something incorrectly, can be severe...in addition, dietary intake of a single nutrient probably has small or even tiny effects on major health outcomes."
But even though RCTs are the gold standard of medical research, and preferable to observational studies, they're still not ideal when it comes to studying diets. (In an RCT, one group of participants is assigned one drug or diet, and another is assigned a different one or a placebo.) "Nutrition science is still a fairly young science, and it seems that no matter how you conduct a study, there will be room for error," says Frances Largeman-Roth, RDN, author of Eating in Color.
The problem with RCTs is that we cannot study food and diet the same way we study drugs, which was one of the issues with the controversial red meat study. "If you look at how they conducted their study, it was more like a drug trial, and we know that the effects of food are very different from the effects of drugs," she says. For example, "foods interact with other foods, and it may take decades to see the impact of certain dietary habits."
Another issue with RCTs is that it's hard to conduct a "blinded" study—participants can tell what they are eating. And when it comes to long-term studies and controlling for confounding factors, forget it. You would have to lock up a study group for years and force-feed them the studied diet. As Largeman-Roth explains, "The best nutrition studies are the ones that measure out food for people and have them come to the study location to eat. At least you know exactly what they've eaten. But these types of well-controlled studies are expensive and labor-intensive."
Not to mention, nutrition research is not well funded by the government, so conflicts of interest and industry funding have become a huge problem, as renowned nutrition expert Marion Nestle, MPH, Ph.D., author of Unsavory Truth: How Food Companies Skew the Science of What We Eat, frequently points out. Picture, for example, a soda company funding research that promotes physical activity as a more effective way to prevent obesity than cutting back on added sugars. This, of course, only muddies the waters further.
Nutrition science in 2020 and beyond: Until we get better studies, we need to get better at contextualizing the ones we have.
As became more apparent than ever in 2019, nutrition science is kind of broken. But that doesn't mean it's without value (after all, nutrition science is how we figured out things like the fact that folate deficiencies cause birth defects). Here at mbg, we're optimistic that the growing attention surrounding these issues will start to inspire scientists, journalists, and health professionals to step up their game in 2020. And while we can't expect nutrition researchers to overhaul their field overnight, there's a lot that we can do now by way of delivering more nuanced dietary advice and comprehensive reporting.
While some flip-flopping in dietary advice is inevitable (the science can and does change and requires a certain amount of open-mindedness, says Largeman-Roth), it's also crucial to understand and interpret nutrition science in the context of the larger system. "While we should speak with authority, we have to also acknowledge that certain topics haven't been fully studied, and we need to communicate that to the public," she says.
Longtime New York Times health and nutrition reporter Anahad O’Connor echoed this sentiment in a recent conversation with mbg: "I think that as health reporters, we have to be careful about giving the public whiplash by sensationalizing every single contradictory finding." A better approach, and one we’ve personally started taking: considering the full body of research on a topic, weighing systematic reviews and meta-analyses rather than single studies, and checking whether different types of studies (RCTs, observational studies, animal studies, lab studies) all point to a similar conclusion.
The good news: Going into this new decade, we'll all be looking at nutrition science more critically than ever, which, we believe, is what could finally help prompt some much-needed change.