“Myth-Busting” Can Impair, Rather Than Correct, Consumers’ Health Knowledge, Study Suggests


Generally speaking, the Internet isn’t a great place to go for health advice. Search your symptoms and you’ll no doubt find an article — probably apocryphal — about the woman who foolishly brushed off her case of the sniffles and was dead within 48 hours. New research has identified a different pitfall of online health publications: unclear or misleading information from trusted sources.

Psychological scientists and education researchers led by Derek Powell of Stanford University found that reading diabetes health facts presented as “myths” confused readers and led them to perform worse on a quiz of basic diabetes knowledge than people who hadn’t read the information at all. Rephrasing the same statements as questions instead of myths did not hurt quiz scores in the same way. The research appears online in Psychological Science.

Powell and colleagues use these results, along with previous research showing that one in three Americans seek health information online, to argue that health communicators must be careful about how they present material: once people read and learn incorrect information, the resulting misconceptions are difficult to correct.

Stephan Lewandowsky of the University of Western Australia and an international team of psychological scientists wrote about the challenges of correcting misinformation in a 2012 review article in Psychological Science in the Public Interest. They wrote that misinformation is sticky: after something is accepted as fact, it can take multiple debunkings for a person to unlearn the falsehood. They add that people believe much of the information they read, especially if it fits into a plausible narrative or comes from a trusted source. The diabetes “myths” that Powell and his team showed study participants came from the American Diabetes Association (ADA), a source readers would reasonably trust for medical advice.

Powell and colleagues write that many educational materials are designed by experts with the best of intentions, but that such materials should be tested empirically to ensure that readers understand the information correctly.

Powell and his colleagues give an example of paltering — using true statements to induce false conclusions:

Suppose someone is selling a car and a potential buyer asks, “Does this car need any maintenance?” The car is scheduled to need extensive maintenance fairly soon, but a paltering seller might respond, “It’s a great car; it’s always run beautifully.” This statement might be literally true, but it gives the false impression that the car does not need maintenance.

The authors say that paltering is usually a deceptive tactic; they include it in the report because it illustrates how true statements don’t always convey the truth. In creating its “myths” page, the ADA was not trying to mislead but to inform. In this case, however, its statements were technically true yet led to misunderstandings about diabetes.

For example, one entry read: “Myth: People with diabetes can’t eat sweets or chocolate.” Powell and colleagues note that while people with diabetes can eat sweets and chocolate sparingly, framing this fact as a myth can lead some readers to treat the statement as completely false and conclude that people with diabetes need not watch their intake of fats and sugars at all. While the ADA likely meant to convey nuance and destigmatize the experience of people with diabetes, its “myths” introduced misconceptions in many readers.

Powell’s experiment included a number of conditions to test better ways of phrasing the ADA’s information. A baseline group of participants simply took a quiz on their diabetes knowledge. Another group read the ADA’s diabetes “myths” and then took the quiz. The myth readers scored worse, even on items about basic diabetes risk: 98% of baseline participants correctly identified “Being overweight significantly increases the likelihood that someone will become diabetic” as a true statement, compared with only 81% of myth readers. On difficult questions the difference was more pronounced: 76% of baseline quiz-takers knew that people with diabetes have compromised immune systems, while only 38% of those who read the myths answered the item correctly.

Another group of participants read similar statements reformatted as questions. Instead of “Myth: People with diabetes can’t eat sweets or chocolate,” they saw “Can people with diabetes eat sweets or chocolate?” The question format yielded more correct responses than the “myths” format and scores similar to those of baseline quiz-takers.

The article authors suggest that phrasing these ideas as questions leaves the issue open in a reader’s mind, whereas labeling a statement a “myth” implies the matter is already settled.

 

References

Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131. doi:10.1177/1529100612451018

Powell, D., Keil, M., Brenner, D., Lim, L., & Markman, E. M. (2018). Misleading health consumers through violations of communicative norms: A case study of online diabetes education. Psychological Science. doi:10.1177/0956797617753393

 

