News about dark chocolate & diabetes drawing false conclusions
Association does not equal causation
Here comes the holiday season, when chocolate treats often find their way into stockings hung by the mantel or candy trays on the kitchen counter. And here comes another “benefits of dark chocolate” study and lots of media coverage - right between Thanksgiving and Christmas and Hanukkah and Kwanzaa and other holidays during which chocolate is often enjoyed. Women’s Health, CNN, The Guardian, Al Jazeera, Fox News, US News & World Report… many media outlets had a sweet tooth for the latest study.
The NPR Shots blog posted a story, “How sweet! A daily dose of dark chocolate may cut your risk of diabetes.” Here’s a tip: anytime you see “may” in a story about a study, feel free to substitute “may not.” Either could be true. But “may not” just doesn’t draw the same attention.
Note how risk is communicated in news coverage. “A 21% lower risk of developing Type 2 diabetes” - or some similar phrase - is ubiquitous in the write-ups. In the NPR story, the study author is quoted: “We are a little bit surprised to see that effect size.” But “lower risk” and “effect size” imply that cause-and-effect - a causal relationship between eating dark chocolate and Type 2 diabetes risk - has been established. It has not. Only a statistical association - not a causal one - has been shown.
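A “21% lower risk” is also a relative figure, and relative figures can sound far more impressive than the underlying absolute difference. A toy calculation makes the distinction concrete - note that these incidence numbers are entirely invented for illustration, not taken from the study:

```python
# Toy illustration of relative vs absolute risk. Both incidence
# figures below are made up; only the arithmetic is the point.
# A "21% lower risk" is relative - the absolute difference can be
# tiny and depends entirely on the baseline incidence.

baseline_risk = 0.050   # hypothetical: 5.00% of non-eaters develop Type 2 diabetes
exposed_risk = 0.0395   # hypothetical: 3.95% among dark-chocolate eaters

relative_reduction = 1 - exposed_risk / baseline_risk
absolute_reduction = baseline_risk - exposed_risk

print(f"relative risk reduction: {relative_reduction:.0%}")   # 21%
print(f"absolute risk reduction: {absolute_reduction:.2%}")   # 1.05%
```

The same “21% lower risk” headline could equally describe a drop from 0.50% to 0.395% - an absolute difference of about a tenth of a percentage point - which is one reason careful coverage reports both numbers.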
The researchers themselves used the terms “associated with lower risk.”
The NPR story - like many others - is sugar-coated with other causal inferences:
“may offer health benefits”
“lower risk of strokes and other types of cardiovascular disease” (from another study)
“reduction in heart disease risk”
“can help improve insulin sensitivity”
“benefits for obesity, Type 2 diabetes and for metabolic syndrome”
All of these statements were made in connection with interesting research. But we’re talking about observational research, which cannot establish cause-and-effect.
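Why can’t observational data establish causation? Because a confounder can manufacture an association out of nothing. Here is a minimal simulation - every variable and number is invented for illustration - in which chocolate has no effect at all on diabetes, yet chocolate eaters still show a lower diabetes rate, simply because a third factor (here, income) drives both:

```python
import random

random.seed(1)

# Simulated confounding: "income" drives both chocolate consumption and
# diabetes risk. Chocolate has NO direct effect on diabetes in this model.
n = 100_000
eats_chocolate, has_diabetes = [], []
for _ in range(n):
    high_income = random.random() < 0.5
    # Higher-income subjects eat more dark chocolate...
    chocolate = random.random() < (0.6 if high_income else 0.3)
    # ...and independently have lower diabetes risk.
    diabetes = random.random() < (0.05 if high_income else 0.15)
    eats_chocolate.append(chocolate)
    has_diabetes.append(diabetes)

def rate(group):
    """Diabetes rate among subjects whose chocolate status matches `group`."""
    cases = sum(d for c, d in zip(eats_chocolate, has_diabetes) if c == group)
    total = sum(1 for c in eats_chocolate if c == group)
    return cases / total

print(f"diabetes rate, chocolate eaters: {rate(True):.1%}")
print(f"diabetes rate, non-eaters:       {rate(False):.1%}")
# Eaters show a lower rate even though chocolate does nothing in this model.
```

An observational study of this simulated population would report that chocolate is “associated with lower risk” - and it would be right, as far as that wording goes. It’s the leap to “chocolate lowers risk” that the data cannot support.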
So the words matter.
The New York Times, in comparison, walked a finer line with its headline.
The Times put the important emphasis early in the story - in the second sentence:
The research did not prove that the chocolate itself was responsible for this health benefit.
The Times also dedicated an entire section, with its own headline, to the study’s limitations.
Reading the limitations of the study - which are published along with the findings in The BMJ - is important. It always amazes me how often journalists either don’t read the study or ignore the limitations. Among other limitations, the researchers acknowledged: “we cannot entirely rule out the role of confounding in our observed associations.” In other words, other things in the research subjects’ lives could have influenced the results. They also admit that “food frequency questionnaires are subject to measurement errors.” This is a common shortcoming of studies that rely on self-reported data and on subjects keeping a diary. One team of researchers addressed this in a letter to the editor of The BMJ:
The reliance on self-reported dietary data introduces potential measurement error and recall bias. For instance, the clarification of a "serving" of chocolate is not clearly standardized, and significant variability exists in chocolate composition, particularly in cocoa content, sugar, and fat levels.
Letters to the editor are very often interesting. One PhD wrote:
I find it especially frustrating when talking or writing about nutrition and health, that for every positive statement about a food, there is an equally negative one, and it is difficult to balance these with the general public.
Imagine how frustrating it is for the public.
I also think it is essential to read the researchers’ disclosures when they publish their work. It’s worth noting that one of the co-authors of the journal article disclosed that she has received grants from Mars Edge, which touts its “innovative partnerships with academia, start-up companies and philanthropic organizations to bring our ideas to life.” That’s the same Mars of chocolate and candy brands fame.
A few years ago a New York Times story highlighted “More Evidence That Nutrition Studies Don’t Always Add Up.” A few of the things that were covered:
Academic misconduct and misreporting of data
An “alarming number of food studies are misleading, unscientific or manipulated to draw dubious conclusions”
Data dredging or torturing the data: “the process of running exhaustive analyses on data sets to tease out subtle signals that might otherwise be unremarkable…fairly common in health research, and especially in studies involving food. It is one reason contradictory nutrition headlines seem to be the norm: One week coffee, cheese and red wine are found to be protective against heart disease and cancer, and the next week a new crop of studies pronounce that they cause it.”
Pressure to publish: “Marion Nestle, a professor of nutrition, food studies and public health at New York University, said that many researchers are under enormous pressure to churn out papers. One recent analysis found that thousands of scientists publish a paper every five days. ‘You can’t get a job if you don’t have papers,’ she said.”
“The problem extends to science journalists as well: Many reporters are encouraged to produce articles that get lots of clicks. That is another reason researchers and universities feel pressure to put out studies and news releases with exaggerated findings.”
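The “data dredging” problem described above is easy to demonstrate. In this sketch - all data is pure random noise, with no real foods or outcomes involved - we correlate 100 made-up “food intake” variables against one made-up “health outcome,” and roughly 5% of them will clear a naive p < 0.05 significance cutoff by chance alone:

```python
import random
import statistics

random.seed(0)

# Data dredging, sketched with pure noise: test 100 fake "foods"
# against one fake "outcome". By construction there is nothing to find,
# yet some correlations look "significant" purely by chance.
n_subjects = 100
outcome = [random.gauss(0, 1) for _ in range(n_subjects)]

def correlation(xs, ys):
    """Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

hits = 0
for food in range(100):
    intake = [random.gauss(0, 1) for _ in range(n_subjects)]
    r = correlation(intake, outcome)
    if abs(r) > 0.197:  # approx. critical |r| for p < 0.05 at n = 100
        hits += 1

print(f"{hits} of 100 pure-noise 'foods' look 'statistically significant'")
```

Run enough analyses on enough variables and “significant” findings are guaranteed - which is exactly how coffee, cheese, and red wine can protect against heart disease one week and cause it the next.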
Journalism serves the public much better with more stories like that one than with dozens more “chocolate lowers risk” claims.
Addendum later on December 17:
Last year at this time, Yale and Emory co-authors spoofed nutritional epidemiology studies in the Christmas edition of The BMJ. The paper was entitled:
Association of health benefits and harms of Christmas dessert ingredients in recipes from The Great British Bake Off: umbrella review of umbrella reviews of meta-analyses of observational studies
You really should read the entire piece yourself, but to cut to the chase, the conclusion was:
This Christmas, if concerns about the limitations of observational nutrition research are set aside, you can have your cake and eat it too.
The failure of schools to teach students the difference between correlation and causation results in so many people making health decisions that have no basis in fact. Many people are convinced that some nugget they have heard is grounded in fact, when in reality it comes from one or two studies that received a lot of publicity, and/or from studies based only on correlation.