You might have seen the following headlines in the news last week:
- “Changing your meat-eating habits could mean a longer life, study suggests” (CNN)
- “Red and processed meat can shorten life, say scientists” (The Guardian)
- “Eating red meat can cut your life expectancy, say scientists” (iNews)
- “Study: Eating More Red Meat Can Shorten Your Life” (NewsMax)
If you’ve been eating red meat, these headlines might have you second-guessing your eating habits.
You might even be thinking about giving up red meat altogether.
Headlines such as these, especially in this uber-distracted internet world, are often worded only to grab our attention and get clicks.
But what truth, if any, is there to these headlines?
What does the study to which they refer actually tell us?
Is it true that eating red meat might shorten our ever-precious time enjoying all this world has to offer?
The intent of this post is not to thoroughly explore the relationship between red meat consumption and our health, but rather to explore what the study behind these headlines actually says and what that means for you.
We’ll also discuss a little bit about nutritional research so that you know what to consider the next time you read headlines such as these.
First, let’s take a 30,000-foot look at the study on which these headlines are based.
The study, titled “Association of changes in red meat consumption with total and cause specific mortality among US women and men: two prospective cohort studies”, was published June 12, 2019, in the British Medical Journal (BMJ).
Here’s how the authors describe what they did:
“We analyzed the association of changes in red meat consumption over eight years with mortality risk during the subsequent eight years. Participants were US women from the Nurses’ Health Study and US men from the Health Professionals Follow-up Study. The Dietary Guidelines for Americans 2015-2020 include the recommendation: “Strategies to increase the variety of protein foods include incorporating seafood as the protein foods choice in meals . . . and using legumes or nuts and seeds in mixed dishes instead of some meat or poultry.” Therefore, we used statistical models to estimate the effects of replacing red meat with equivalent amounts of other protein sources, such as nuts, poultry, fish, dairy, eggs, and legumes, and whole grains and vegetables.”
The researchers looked at data from two studies – the Nurses’ Health Study and the Health Professionals Follow-up Study – to investigate the relationship between increases or decreases in participants’ red meat consumption over eight years and the risk of death over the following eight years.
The researchers also investigated the relationship between replacing red meat with other “protein sources” – quotation marks because the researchers list pretty much all other foods as being protein sources – and risk of death.
The headlines use words like “shorten” and “cut” to imply a causal effect between changes in red meat consumption and lifespan.
One of the first things to look for when reading headlines such as these is whether the study in question actually demonstrated a causal relationship.
That is, does the study actually show that “X” causes “Y”?
Here are two excerpts from the “Results” section of the 2019 analysis.
“Changes in processed meat and unprocessed meat were significantly associated with mortality (both pooled P for trend<0.05), and such associations were mainly driven by the increased consumption (table 3). We found that changes in processed meat consumption had a stronger association with mortality than changes in unprocessed meat consumption (table 3).”
“A decrease in processed meat and a simultaneous increase in whole grains, vegetables, or other protein sources was even more strongly associated with lower total mortality, with the largest reductions in risk seen with increases in nuts (0.74, 0.70 to 0.79) and fish (0.75, 0.68 to 0.84). We found that a decrease in unprocessed red meat and a simultaneous increase in other protein sources except for legumes, whole grains, vegetables, or dairy was also associated with a substantially lower risk of death.”
Notice that the researchers didn’t report that eating red meat will “shorten your life” or “cut your life expectancy”.
They reported that changes in red meat consumption were associated with changes in risk of mortality.
The difference between associations and causation is something news headlines misrepresent repeatedly when reporting new scientific findings.
Just because there’s an association between two variables does not mean one of those things causes the other.
For example, you might see a relationship between children’s shoe sizes and their reading abilities.
Obviously, the size of one’s feet does not cause one to read better and one’s ability to read does not encourage foot growth.
Rather, children with larger feet are often older and have had additional education, which is the real reason that they are better able to read.
In this example, the children’s level of education, which would be causal to their reading ability and nearly perfectly correlate with their shoe size, is a confounding variable, or “a variable that distorts the association between two other variables (the exposure and the outcome)”.
Sometimes there isn’t even an identifiable confounding variable behind a correlation; the two variables just happen to move together by chance (a spurious correlation).
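To make the shoe-size example concrete, here’s a minimal simulation (a sketch with invented numbers, not real data) in which age drives both shoe size and reading score, producing a strong correlation between the two even though neither causes the other:

```python
import random
import statistics

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Age (the confounder) drives both variables; the noise terms are independent,
# so shoe size has no direct effect on reading score, or vice versa.
ages = [random.uniform(5, 12) for _ in range(1000)]
shoe_sizes = [0.8 * a + random.gauss(0, 0.5) for a in ages]
reading = [10 * a + random.gauss(0, 5) for a in ages]

print(round(pearson(shoe_sizes, reading), 2))  # strong positive correlation
```

Despite the strong correlation this prints, changing a child’s shoe size in this model would do nothing to their reading score.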
The next time you see a news headline implying one thing causes another, look for the study it references and get a feel for what the researchers actually concluded (skimming the abstract often suffices).
It’s not uncommon for headlines to misrepresent studies that conclude there’s a correlation between two variables as showing a cause and effect relationship.
This isn’t the only problem with these headlines or the study they report.
The results in real numbers
Let’s look at exactly how strong an association the researchers observed and translate the results into more meaningful numbers.
Here’s an excerpt from the “Results” section of the study abstract:
“An increase in total red meat consumption of at least half a serving per day was associated with a 10% higher mortality risk (pooled hazard ratio 1.10, 95% confidence interval 1.04 to 1.17). For processed and unprocessed red meat consumption, an increase of at least half a serving per day was associated with a 13% higher mortality risk (1.13, 1.04 to 1.23) and a 9% higher mortality risk (1.09, 1.02 to 1.17), respectively. A decrease in consumption of processed or unprocessed red meat of at least half a serving per day was not associated with mortality risk.”
The death rate for individuals who did not change their red meat consumption at all was 4016 deaths per 328,381 person-years.
Basically, 9.78% of participants who did not make any change to their red meat consumption died over the eight years included in the analysis.
You might be thinking that the “10%” increase means that those who increased their red meat consumption saw a mortality rate of 19.78% (an absolute increase of 10 percentage points).
Notice, however, the phrase “pooled hazard ratio 1.10”, from which the “10%” figure is derived.
What this means is that the observed mortality risk wasn’t 10% higher in absolute terms (19.78% vs. 9.78%), but rather that it was 10% higher relative to the baseline.
The mortality risk among those who increased their red meat consumption was therefore about 10.76% (9.78% × 1.10).
That’s just under a 1-percentage-point increase in absolute risk of mortality.
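The arithmetic behind these figures is simple enough to sketch. (Treating the hazard ratio as a flat multiplier on the eight-year cumulative risk is an approximation, since hazard ratios technically apply to instantaneous rates, but at risks this low it’s very close.)

```python
# Baseline: deaths per person-year among participants who didn't change intake,
# as reported in the study.
deaths = 4016
person_years = 328_381
years = 8

baseline_risk = deaths / person_years * years      # ~9.78% over eight years
hazard_ratio = 1.10                                # pooled HR for increased intake

relative_risk = baseline_risk * hazard_ratio       # ~10.76%
absolute_increase = relative_risk - baseline_risk  # ~0.98 percentage points

print(f"{baseline_risk:.2%} -> {relative_risk:.2%} "
      f"(+{absolute_increase * 100:.2f} percentage points)")
```

A “10% higher risk” in relative terms works out to less than one extra death per hundred people over eight years.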
Notice also that the authors report that a decrease in red meat consumption was not associated with any change in mortality.
Let’s imagine we take 300 people and split them into three groups of 100 – one of which increased their red meat consumption, one of which decreased their red meat consumption, and one of which kept their red meat intake the same.
After eight years, we’d expect to see roughly 10 deaths in the group that made no change to their red meat consumption and 11 deaths in the group that increased their red meat consumption.
Oh, and we’d expect to see no improvement at all in mortality in the group that decreased their red meat consumption.
We’ll talk about that little detail a bit later in this post.
Observational studies in the hierarchy of evidence
Let’s talk a bit more about the two studies the researchers analyzed – the Nurses’ Health Study and the Health Professionals Follow-up Study.
“The Nurses’ Health Study is a prospective cohort study of 121,700 US registered female nurses aged 30-55 at enrollment. The study started in 1976 and nurses completed a baseline questionnaire about demographic factors, diet habits, lifestyle, and medical history. The Health Professionals Follow-up Study was established in 1986 when 51,529 US male health professionals aged 40-75 returned a baseline questionnaire about detailed medical history, lifestyle, and usual diet. In both cohorts, questionnaires were completed biennially after baseline to collect and update information on lifestyle and occurrence of new onset diseases.”
These are what are called observational studies, so named because the participants were observed with no specific intervention or control of variables.
In the hierarchy of evidence, not all studies are created equal.
At the top of the evidence hierarchy are randomized controlled trials (RCTs), studies in which participants are randomly divided into separate groups, exposed to different interventions with as many other variables as possible controlled, and observed for an effect from the intervention.
At the bottom of the evidence hierarchy are anecdotes – “this worked for me” – or opinions based on fundamental theories – “considering X and Y, we suspect Z”.
Observational studies such as the Nurses’ Health Study and the Health Professionals Follow-up Study fall somewhere in the middle of the hierarchy, although they are considered by many to be “low quality”.
The applicability of nutritional epidemiology
The use of large, population-based observational studies like the Nurses’ Health Study and the Health Professionals Follow-up Study to identify health outcomes across populations falls under a type of research known as “epidemiology”.
While epidemiology plays a major role in shaping public health initiatives, the practice is not without its critics, particularly when it comes to nutrition.
One such critic is Dr. John P.A. Ioannidis of Stanford University.
In his recent paper, “The Challenge of Reforming Nutritional Epidemiologic Research”, Dr. Ioannidis points out the following:
“Individuals consume thousands of chemicals in millions of possible daily combinations. For instance, there are more than 250 000 different foods and even more potentially edible items, with 300 000 edible plants alone. Seemingly similar foods vary in exact chemical signatures (eg, more than 500 different polyphenols).
Much of the literature silently assumes disease risk is modulated by the most abundant substances; for example, carbohydrates or fats. However, relatively uncommon chemicals within food, circumstantial contaminants, serendipitous toxicants, or components that appear only under specific conditions or food preparation methods (eg, red meat cooking) may be influential.
Risk-conferring nutritional combinations may vary by an individual’s genetic background, metabolic profile, age, or environmental exposures. Disentangling the potential influence on health outcomes of a single dietary component from these other variables is challenging, if not impossible.”
Epidemiology is by no means worthless, and we shouldn’t completely discount observational studies.
However, these kinds of studies must be done properly in order for us to take any meaningful information from them, especially when it comes to making suggestions about mortality.
The food frequency questionnaire
Now would be a good time to discuss how dietary data was collected in the Nurses’ Health Study and the Health Professionals Follow-up Study.
“The two cohorts completed a validated semiquantitative food frequency questionnaire in 1986 and every four years thereafter. Participants were asked how often, on average, they consumed a standard portion of each food in the past year. Frequency response categories ranged from never or less than once a month, to six or more times each day.
Questionnaire items on unprocessed red meat (one serving, 85 g) included beef, pork, and lamb as a main dish; hamburger; and beef, pork, or lamb as a sandwich or mixed dish. Items on processed red meat included bacon (one serving, two slices, 13 g), hot dogs (one serving, one hot dog, 45 g), and sausage, salami, bologna, and other processed red meats (one serving, one piece, 28 g). Total red meat included unprocessed and processed red meat.”
The study participants filled out what are called “food frequency questionnaires” once in 1986 and then every four years thereafter.
Food frequency questionnaires are forms participants use to report, from memory, how frequently they ate certain foods in a given time period.
Similar to observational studies, food frequency questionnaires can be useful but have some serious limitations so far as the strength of the evidence they provide.
In the paper, “Controversy and Debate: Memory based Methods Paper 1: The Fatal Flaws of Food Frequency Questionnaires and other Memory-Based Dietary Assessment Methods”, six reasons are offered for why memory-based dietary assessment methods (M-BMs) like food frequency questionnaires “are invalid and inadmissible for scientific research and cannot be employed in evidence-based policy making”:
“Herein, we present the empirical evidence, and theoretic and philosophic perspectives that render M-BMs data both fatally flawed and pseudo-scientific.
First, the use of M-BMs is founded upon two inter-related logical fallacies: a category error and reification.
Second, human memory and recall are not valid instruments for scientific data collection.
Third, in standard epidemiologic contexts, the measurement errors associated with self-reported data are non-falsifiable (i.e., pseudo-scientific) because there is no way to ascertain if the reported foods and beverages match the respondent’s actual intake.
Fourth, the assignment of nutrient and energy values to self-reported intake (i.e., the pseudo-quantification of qualitative/anecdotal data) is impermissible and violates the foundational tenets of measurement theory.
Fifth, the proxy-estimates created via pseudo-quantification are physiologically implausible (i.e. meaningless numbers) and have little relation to actual nutrient and energy consumption.
Finally, investigators engendered a fictional discourse on the health effects of dietary sugar, salt, fat and cholesterol when they failed to cite contrary evidence or address decades of research demonstrating the fatal measurement, analytic, and inferential flaws presented herein.”
Beyond these six points, it’s worth noting that in the food frequency questionnaires used for this study, items like hamburgers, sandwiches, and other “mixed dishes” are considered “unprocessed”.
This presents at least two potential problems:
- The physiological effects of a grass-fed lamb chop are significantly different than the physiological effects of a Whopper or a Big Mac, all of which would be considered equivalent in this study.
- The people who are aware of the differences between burgers or sandwiches and red meat on its own – without a bun, cheese, or sauce – are likely also taking other steps to improve or maintain their health that those who are eating burgers or sandwiches aren’t.
We’ll touch more on that second point a bit later in this post.
The key takeaway is that food frequency questionnaires in general are problematic, and the way foods were grouped in this particular study only compounds the problem.
It’s arguable that we don’t have many better options for assessing dietary intake, but we should be aware of the strengths and weaknesses of whatever method we use.
Remember confounding variables and how they might result in what looks like a relationship where there isn’t one?
Well, fortunately, researchers can make statistical adjustments/corrections to their analysis to try to account for these confounding variables.
Looking at our earlier example of the relationship between shoe size and reading ability, there are statistical adjustments one can make to account for the influence of education levels to get a better feel for if there actually is a direct relationship between shoe size and reading ability.
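One simple form of adjustment is stratification: compare shoe size and reading ability only among children of the same age, so age can’t drive the comparison. Here’s a hypothetical sketch (invented numbers, purely illustrative):

```python
import random
import statistics

random.seed(1)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Shoe size and reading score are both driven by age (the confounder).
children = []
for _ in range(5000):
    age = random.randint(5, 12)
    shoe = 0.8 * age + random.gauss(0, 0.5)
    score = 10 * age + random.gauss(0, 5)
    children.append((age, shoe, score))

# Raw correlation across all ages: strongly positive (confounded by age).
raw = pearson([c[1] for c in children], [c[2] for c in children])

# Stratified: within each single age group, the correlation is near zero.
within = []
for age in range(5, 13):
    group = [c for c in children if c[0] == age]
    within.append(pearson([c[1] for c in group], [c[2] for c in group]))

print(round(raw, 2), round(statistics.mean(within), 2))
```

Once age is held fixed, the apparent shoe-size/reading relationship evaporates. Real studies typically do this with regression models rather than literal strata, but the logic is the same, and it only works for confounders the researchers measured and thought to include.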
Let’s take a look at how well the researchers accounted for potential confounding variables in the 2019 analysis of changes in red meat intake and mortality.
“We adjusted multivariable models for initial age, calendar year as the underlying time scale, race (white v other), family history of myocardial infarction, diabetes, or cancer (yes v no), initial aspirin use (yes v no), and initial multivitamin use (yes v no). We also adjusted for initial consumption of red meat (in fifths); body mass index categories (<23, 23-24.9, 25-29.9, 30-34.9, and ≥35); menopausal status and hormone therapy use in women (premenopausal, postmenopausal and hormone therapy never user, postmenopausal and hormone therapy current user, postmenopausal and hormone therapy past user, or missing indicator); simultaneous changes in smoking status (never to never, never to current, former to former, former to current, current to former, current to current, or missing indicator); initial and simultaneous changes in physical activity, alcohol consumption, total energy intake, and other main food groups, including vegetables, fruits, whole grains, and sugar-sweetened beverages (all in fifths).”
The authors did, to their credit, adjust for several important potential confounding variables.
Want to know what was not adjusted for?
- Perceived stress
- Mental health (depression, anxiety…)
- Sleep quality/quantity
- Medication usage
- Photoperiod (sun and blue light)
- Time spent in nature
These are only examples; countless other variables may have played a role in participants’ mortality.
I’m not sure what data was or was not collected during the Nurses’ Health Study or Health Professionals Follow-up Study, but other variables could have had an effect on the outcomes of this study – and likely did, as we’ll discuss next.
Healthy user bias
Let’s now talk about a concept known as “healthy user bias”:
“The healthy user effect is best described as the propensity for patients who receive one preventive therapy to also seek other preventive services or partake in other healthy behaviors. Patients who choose to receive preventive therapy may exercise more, eat a healthier diet, wear a seatbelt when they drive, and avoid tobacco. As a result, an observational study evaluating the effect of a preventive therapy (e.g., statin therapy) on a related outcome (e.g., myocardial infarction) without adjusting for other related preventive behaviors (e.g., healthy diet or exercise) will tend to overstate the effect of the preventive therapy under study.”
Healthy user bias is the concept that individuals who engage in one behavior that is perceived to be healthy – following nutritional recommendations like limiting red meat, for example – are likely to live healthier lifestyles in general.
Thus, when assessing the effects of these individuals’ nutrition habits through observational studies, it’s difficult, if not impossible, to correct for all of the other lifestyle factors that might affect health outcomes.
The results of the Nurses’ Health Study and Health Professionals Follow-up Study suggest this is a valid concern.
Individuals who ate less red meat over the course of the analysis also…
- drank less alcohol
- increased their physical activity levels
- ate fewer total calories
- gained less weight
- ate higher quality food (per AHEI score)
- smoked less
…than the individuals who increased their red meat intake over the course of the analysis.
While the researchers corrected for these and other factors, there are countless others that play a role in one’s risk of mortality.
What differences in lifestyle were not accounted for in the study or its adjustments?
Given that the two analyzed studies were of licensed health professionals, it’s not unreasonable to assume that the individuals who increased their red meat consumption against established recommendations were engaged in other problematic behaviors, many of which were not assessed or corrected for during analysis.
The result left out of the headlines
Let’s circle back to the fact that the researchers reported that a decrease in red meat consumption was not associated with any change in mortality.
“A decrease in consumption of processed or unprocessed red meat of at least half a serving per day was not associated with mortality risk.”
While the study showed that participants who increased their red meat intake over the course of the study had a higher risk of mortality, the participants who decreased their red meat intake over the course of the study did not see a reduction in risk of mortality.
This was observed of changes in total, processed, and unprocessed red meat.
It’s possible that the participants who decreased their red meat consumption over the course of the study increasingly engaged in problematic lifestyle factors that weren’t fully accounted for in adjusting for confounding variables.
What we know about healthy user bias suggests that this isn’t likely.
The individuals who decreased their meat consumption likely weren’t making other changes to their lifestyle that “cancel” out a beneficial effect from reduced red meat consumption.
So, why is CNN telling us, right in their headline, that “changing your meat-eating habits could mean a longer life”, when the results of this study show that the people who reduced their red meat intake did not live longer lives?
The following headline would more accurately present the findings of the study:
“Reducing red meat intake does not improve lifespan”
Such a headline would still be making a bold claim based on less-than-conclusive data, but no less bold than those that we saw this week.
Now, the authors do note that decreasing red meat intake and increasing intake of other foods showed a favorable effect on mortality.
“The pooled results showed a substantially lower mortality risk with a decrease in red meat consumption and a simultaneous increase in the consumption of nuts (pooled hazard ratio 0.81, 95% CI 0.79 to 0.84); fish (0.83, 0.76 to 0.91); whole grains (0.88, 0.83 to 0.94); poultry without skin (0.90, 0.86 to 0.95); vegetables without legumes (0.90, 0.87 to 0.93); dairy (0.92, 0.86 to 0.99); eggs (0.92, 0.89 to 0.96); or legumes (0.94, 0.90 to 0.99). A decrease in processed meat and a simultaneous increase in whole grains, vegetables, or other protein sources was even more strongly associated with lower total mortality, with the largest reductions in risk seen with increases in nuts (0.74, 0.70 to 0.79) and fish (0.75, 0.68 to 0.84). We found that a decrease in unprocessed red meat and a simultaneous increase in other protein sources except for legumes, whole grains, vegetables, or dairy was also associated with a substantially lower risk of death.”
This observation, though, essentially only serves to add confounding variables to the equation.
Was it the reduction of red meat that was more effective towards improving mortality or was it the increase in nut consumption?
Given that the results looking solely at red meat intake, adjusted for changes in other foods, show no effect, the increased consumption of these other foods – not the decreased consumption of red meat – is more likely to have driven the observed change in mortality risk.
Keep in mind also what healthy user bias suggests about the individuals who might increase their intake of foods known to be associated with improved health.
All that is to say that the results of this analysis are probably an indication that the relationship observed between changes in red meat intake and mortality just isn’t strong enough for us to make any meaningful conclusion, at least not from this study.
What you can do with this information
Do I think this study and the headlines it prompted are totally worthless?
Actually, I kinda do.
When I first typed that question to kick off this section, I expected to follow it up with the answer “not entirely…” but I don’t think that accurately reflects my opinion.
This was an observational study.
The data was based on food frequency questionnaires.
There are several confounding variables left unaccounted for.
The results themselves aren’t that convincing.
Most importantly, the headlines grossly misrepresent the findings of the study.
To be clear, I couldn’t care less how much red meat people eat.
While red meat can serve as an excellent source of essential nutrients and, if sourced properly, help to hedge against the impact of one’s diet on the environment, there’s no such thing as an essential food.
If you want to eat red meat, have at it.
If not, that’s cool, too.
You do you.
What I do care about, however, are headlines and articles that use scare tactics to discourage us from engaging in perfectly safe eating habits.
Such headlines serve no purpose other than to get clicks and continue to push the status quo, one-size-fits-all, nutritional advice that’s been failing us for decades.
Always take nutrition news headlines with a hefty serving of salt.
Eat real food.
With or without red meat.
You’ve got this.