How to make sense of contradictory health news
ABC Health & Wellbeing, by Tegan Taylor, posted 24 April 2018

Red meat is good for you; red meat is bad for you. Sugar is killing us; sugar’s still better than artificial sweeteners. Fat is bad; fat is good.

It seems like every day there’s a new piece of health advice in the news that contradicts the advice we got yesterday. Add to that a barrage of self-styled wellness experts and self-interested corporations, and it’s hard to know who to believe if taking care of your health is important to you.

So before you empty your pantry of “cancer-causing” soy sauce, only to stock up on it again tomorrow because of its health benefits, allow us to present seven litmus tests you can apply to see if a health claim or news story is legit.

1. Who says?

First, it’s worth asking that playground retort: “Who says?” Who’s put out this study? And who’s paid for it?

Following the money to see who funded a study can help uncover whether there are any vested interests that might have influenced the results.

GIF: Who’s paid for this research?

Are they a supplement manufacturer? A pharmaceutical brand? A food company publishing a survey they’ve run that happens to put their product in a good light?

It’s preferable to see a study from a research institution with funding that comes through a transparent process, such as the National Health and Medical Research Council, advises Australian Science Media Centre director of news Lyndal Byford.

“It’s much more reliable than, say, a study that is hailing the benefits of pomegranate juice that’s funded by a pomegranate juice company,” Ms Byford said.

“Although it doesn’t automatically mean a study is biased if it is funded by a company or industry, it is something to keep in mind and be cautious about.”

Once it’s clear where the study’s come from, it’s worth looking at where it’s been published.

Ideally, look for research that’s been peer-reviewed and published in a high-quality academic journal.

2. Sample size matters

How many people (or things) did the study look at? A small sample size can throw up results that aren’t actually true for the majority of the population.

“Quite often you’ll get a single study that might be showing a link between, say, broccoli and cancer, but then when you group that together with a whole lot of other studies that have been done that are quite similar, what initially looks like a link actually goes away,” Ms Byford said.

There’s no hard and fast rule as to how big a study “should” be — some studies are necessarily small because they’re looking at a rare condition and there just aren’t that many people who can participate. But bigger tends to be better.

When it comes to surveys and polls, Ms Byford said to look for numbers in the thousands.

“You generally need about 1,000 people to be representative of the whole population,” she said.

“Anything smaller than that should immediately raise questions.”
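
If you’re wondering where that rough figure of 1,000 comes from, it falls out of the standard margin-of-error formula for a simple random poll: roughly 1.96 × √(p(1 − p)/n) at a 95 per cent confidence level, which works out to about plus or minus three percentage points for 1,000 respondents. Here’s a minimal sketch of that arithmetic (our illustration, not part of Ms Byford’s comments):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random poll.

    n -- number of respondents
    p -- assumed proportion (0.5 gives the worst case / widest margin)
    z -- z-score for the confidence level (1.96 is roughly 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 500, 1000, 5000):
    print(f"{n:>5} respondents -> about +/-{margin_of_error(n) * 100:.1f} percentage points")

# Roughly:
#   100 respondents -> about +/-9.8 percentage points
#   500 respondents -> about +/-4.4 percentage points
#  1000 respondents -> about +/-3.1 percentage points
#  5000 respondents -> about +/-1.4 percentage points
```

Notice how the gains flatten out: going from 1,000 to 5,000 respondents only shaves a couple of percentage points off the uncertainty, which is part of why well-run polls tend to settle in the low thousands.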

3. Lab benchtop or real world?

Another question to ask is: where was this stuff measured? Was it in a petri dish? In animals like mice? Or in a nice, big, representative sample of human beings?

In vitro experiments in petri dishes and animal studies are important and informative — but news stories have been known to overstate the implications of lab experiments for humans in the real world.

For example, last year there was a rash of stories on a “breakthrough” that purported to show Vegemite could prevent birth defects. But dig a little deeper into the research paper itself and that link becomes so tenuous it all but disappears.

The study was done in mice that had been genetically modified to have certain mutations known to cause birth defects. Researchers found these mouse mums were naturally deficient in NAD, a chemical the body makes from vitamin B3 (which is found in Vegemite, among other foods).

When researchers made sure these birth-defect-prone mouse mums were not deficient in NAD, it averted the defects.

This was an important study — but the implications for humans were oversold by many media outlets, including the ABC.

As Robin Bisson wrote in an opinion piece for Cosmos Magazine discussing the story, “strong claims need to be backed up by strong evidence”.

That’s why looking at the research methods used, and at what was actually measured, is so important when you’re planning on basing your health decisions on a study.

4. Correlation vs causation

Look out for those hedging words. “Linked to”. “Could cause”. It’s rare that a single study — or even a robust meta-analysis, which weighs a lot of different studies of the same thing to see where the balance of evidence lies — ever proves something definitively.

Especially when it comes to health issues, there are so many factors at play it’s virtually impossible to say one thing for sure causes something else.

That’s right, as you’ve probably heard before, correlation does not imply causation.

 

As Ms Byford says, a lot of health research relies on asking people about their behaviour and then looking at their health outcomes.

“For example, they might say watching an extra five hours of TV a night has been linked to an increased risk of diabetes,” she said.

“Asking … ‘how many hours a day do you watch TV?’ and then looking for differences between people who do a small amount of that and a big amount of that.

“Those kinds of studies are great and really important to ask questions and throw up questions and really give scientists a direction to head in but what they can’t say is one thing causes another thing.”

That’s why hedging words are important. Each study advances our knowledge a little, so that over time we get a clearer idea of what might be risk factors for certain health problems.

5. Risky business

“ABC triples risk of XYZ.”

Sounds serious — almost as if we should do away with the alphabet altogether. But when it comes to claims of multiplying risk, it’s worth looking at what the base level of risk is.

GIF: One in 10,000 or three in 10,000?

If we’re tripling risk but the base level of risk is one in 10,000 and it’s tripling to three in 10,000, we’re still talking about a relatively low risk.

“That can sound quite scary but the main question that you need to ask there is what is the risk in the first place,” Ms Byford said.

“Where you can, ask for evidence and look at what the actual numbers are that are involved.

“What is the real risk of these conditions, not just what’s called the ‘relative risk’, this doubling or tripling you might often read about.”
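
To see why that distinction matters, here’s a small, hypothetical worked example (the numbers are illustrative, not from any particular study): a “tripled” risk on a baseline of one in 10,000 only amounts to two extra cases per 10,000 people.

```python
def absolute_risk_change(baseline_risk, relative_risk):
    """Turn a 'relative risk' headline into absolute numbers.

    baseline_risk -- the risk without the exposure (e.g. 1/10000)
    relative_risk -- the multiplier from the headline (e.g. 3.0 for 'triples')
    """
    new_risk = baseline_risk * relative_risk
    extra_cases_per_10k = (new_risk - baseline_risk) * 10_000
    return new_risk, extra_cases_per_10k

# Hypothetical example: a "tripled" risk on a 1-in-10,000 baseline.
baseline = 1 / 10_000
new_risk, extra = absolute_risk_change(baseline, 3.0)
print(f"Baseline risk: {baseline:.4%}")               # 0.0100%
print(f"'Tripled' risk: {new_risk:.4%}")              # 0.0300%
print(f"Extra cases per 10,000 people: {extra:.0f}")  # 2
```

The relative risk (a factor of three) is what makes the headline; the absolute change (two extra cases in 10,000) is what actually matters for your decisions.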

6. The dose makes the poison

Too much fluoride will indeed kill you; just enough will keep your pearly whites pearly and white. It’s all about the dose.

That’s why you’ll occasionally see headlines linking seemingly innocuous foods, nutrients or medications with extreme health outcomes.

“If it’s done in animals, for example, they may be giving these animals a very, very high dose to try and see if something is toxic or causes cancer,” Ms Byford said.

“But actually the dose that humans are naturally exposed to is probably substantially lower than that and it may not have the same effects at a high dose and a low dose.”

GIF: Living in a fairytale universe is associated with an increased risk of apples being poisonous.

Similarly, isolating certain nutrients, minerals or vitamins can lead to misleading headlines.

“Just because the specific ingredient in a food is very good for you or very bad for you doesn’t necessarily mean that eating the whole food is going to make a difference,” she said.

Looking at the delivery method of a seemingly super substance is also important. Just because it does you good when applied to one part of your body doesn’t mean it’s going to have great effects everywhere.

“Often in the digestive system, the enzymes and acids in your gut actually destroy a whole bunch of useful chemicals in the food that you eat,” Ms Byford said.

“So if the chemical is good rubbed on your skin, for example, that doesn’t necessarily mean it’s going to be useful if you’re eating it as a food.”

7. The bottom line? Read to the bottom line

Answering the questions above means reading beyond a sensationalist headline, clicking through from a social media post to the full story, and going through it carefully. Sometimes it will involve doing more digging of your own to find the original research paper, or comparing what’s being claimed in a story against what has been published previously.

But when we’re talking about health stuff — the science of these bodies that transport us through life — isn’t a bit of extra legwork worth it?

If you do think you’ve come across some research that is relevant to a health issue you have, Ms Byford advises you talk it over with your doctor before taking any drastic action.

“If you read something in the newspaper that you think is relevant or interesting to you or it raises questions about a condition that you’ve got, it’s absolutely worth going and speaking to your GP or specialist about it because it’s certainly the case that a one-off study doesn’t mean that the whole approach to a treatment should be changed,” she said.

 
