With modern media, every day we are bombarded with information and claims about reality. The global economy will crash this year, red meat causes cancer, the earth is warming up, Syrian refugees cause crime, homeopathy works. The question is: what should we believe and what not? Fortunately, there are ways to distinguish the more credible claims from the less credible ones.
The quality of evidence
The likelihood of any claim being true basically depends on one thing: the quality of the supporting (or opposing) evidence. Now, especially since the rise of the internet, you can find evidence for almost any claim you like. For instance, if you are looking for support for your flat-earth belief, just visit the website of the Flat Earth Society.
The key thing to keep in mind, though, is that not all evidence is equal. Some evidence is of higher quality than other evidence. And the way to determine this is by using the scientific method.
To be clear: this does not mean science is always right! Scientists are people too, and they have prejudices and make mistakes just like anybody else. They too are poor observers, have unreliable memories, presume patterns that don’t exist and are bad at estimating probabilities.
The good thing is that the scientific method is specifically designed to overcome these human shortcomings. It requires, for instance, that studies are properly designed, that they can be replicated by independent researchers, that results are statistically significant and that everything is published in peer-reviewed scientific journals.
Again, mistakes are frequent, so in reality the method is far from perfect, but we know by now that in the long run it definitely works. After all, over the last few thousand years science and technology have utterly transformed our world, from living in the Stone Age to the International Space Station, and that’s pretty impressive.
The scientific approach does require some background knowledge and time, two things not everybody has. Still, a number of its principles can be applied in everyday life without first getting a Ph.D. or quitting your day job.
For instance, if you’re confronted with a dubious claim, you could try to determine the quality of the supporting evidence by asking certain critical questions, as a scientist would. To give you an idea of the type of questions, I designed the Claim Credibility Checklist.
Claim Credibility Checklist
Answering the following questions will help you determine the credibility of a claim based on the quality of evidence.
Source of the claim…
1. Who presents the claim?
E.g. a scientist, a politician, a business person, a conspiracy theorist?
Someone’s background is, in principle, independent of the credibility of their claim, but in practice it is an important indicator.
For instance, most scientists are well-trained researchers with critical peers who judge their work, so they tend to provide evidence of reasonable quality.
Many politicians and business people, on the other hand, have had less scientific training and might be more interested in promoting their ideology or making money than in testing their claims, so the quality of their evidence is often lower.
As a general rule, you should be more vigilant about claims when there is a potential conflict of interest, e.g. someone making money off their claim.
2. Where was the claim presented?
E.g. in a peer-reviewed scientific journal, an online newspaper, TV News, an entertainment website?
Again, the medium in which a claim is presented is, in principle, independent of its truth, but in practice it is an important indicator.
Among the most careful media are peer-reviewed scientific journals. These journals will only publish an article after other scientists, specialized in the same field, have checked whether the author did a well-designed study, documented it sufficiently and presented convincing evidence. The downside of scientific journals is that they are very hard for non-experts to read, so in that respect they are an impractical source for many.
Established newspapers and TV news organizations are less careful about what they publish than scientific journals, although they do often fact-check claims before publishing or airing them, and often interview experts (like scientists) to verify things.
And then there are countless websites and other news sources with minimal checks and balances in place to verify the truth of what they publish. There you should be most skeptical.
3. What do other sources say?
E.g. is the evidence verified by independent researchers? How much evidence is there? Did others find opposing evidence?
If others make the same claim based on independently obtained evidence, the credibility of the claim goes up. And the more supporting evidence there is (e.g. the number of independent studies), the higher the chance that the claim is true.
However, if other sources make an opposite claim based on conflicting evidence, you have to investigate this. If the quality of the opposing evidence is high, the credibility of the original claim obviously goes down.
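If you want an intuition for why independent confirmations add up, here is a very simplified sketch in Python of a Bayesian way of thinking about it. The starting probability and the factor of 3 per study are purely illustrative assumptions of mine, not values from the checklist or from any real study; treat it as a toy model, not a recipe.

```python
def update_credibility(prior, studies, likelihood_ratio=3.0):
    """Toy Bayesian update: each independent supporting study multiplies the
    odds that the claim is true by `likelihood_ratio`; each opposing study
    divides the odds by it. All numbers are illustrative, not empirical."""
    odds = prior / (1 - prior)
    for supports in studies:
        odds *= likelihood_ratio if supports else 1 / likelihood_ratio
    return odds / (1 + odds)

# Starting from a skeptical 20% prior, three independent supporting studies:
print(update_credibility(0.20, [True, True, True]))   # ~0.87
# A mix of supporting and opposing evidence is far less convincing:
print(update_credibility(0.20, [True, True, False]))  # ~0.43
```

The point of the toy model is simply that several independent confirmations can move a claim from "probably not" to "probably true", while conflicting evidence largely cancels out.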
If a study was done…
4. Is it explained how the study was done?
E.g. info on the test subjects, procedure, measurements?
If a claim is supported by evidence from a study, it is important that this study is described in detail, so that other researchers can replicate it and reproduce the evidence.
When a claim is presented in popular media, sometimes the claimants refer to a scientific journal article for the study details, and that is fine, as long as you are able to check the study’s methods.
5. Was the study properly designed?
E.g. was the study large enough, was there a control group, were statistics correctly applied?
(Admittedly, if you have never taken a course on research methodology, checking a study’s design can be difficult, and you will probably first need to do some studying of the concepts involved yourself.) Here are some quality indicators:
You should check whether the study was large enough (e.g. enough test subjects) to measure the claimed effect with confidence. As a general rule: the more test subjects, the better a study can detect an effect.
You could also check if the study had, besides an experimental group, a control group for comparison. E.g. in medicine, if you claim that a new drug is effective, you need people in your study getting the drug (the experimental group) and people not getting the drug (the control group) to be able to compare.
You might also want to check whether the researchers used the right statistics. E.g. is the effect they measured large enough, or could it be coincidence? And are they presenting their data fairly or in a misleading way? (The sketch after this list illustrates these first checks with a small simulation.)
There are even more checks you can do. For instance, in the case of a medical study: checking whether a placebo was used, whether test subjects were randomly assigned, or whether the study had a double-blind design. It goes too far to explain all these safeguards here in detail, but you can find more information online, e.g. on the Wikipedia page on Experiments.
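To get a feel for why sample size and a control group matter, the short simulation below may help. It is only a sketch with made-up numbers (a hypothetical drug that shifts an outcome by a small amount against a lot of noise), and it assumes Python with the numpy and scipy packages; it is not a substitute for a proper power analysis.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)

def run_trial(n_per_group, true_effect=2.0, noise_sd=10.0):
    """Simulate one hypothetical drug trial: a control group and an
    experimental group whose average outcome is shifted by `true_effect`."""
    control = rng.normal(loc=0.0, scale=noise_sd, size=n_per_group)
    treated = rng.normal(loc=true_effect, scale=noise_sd, size=n_per_group)
    # The t-test asks: could a difference this large be mere coincidence?
    return ttest_ind(treated, control).pvalue

for n in (10, 100, 1000):
    # Fraction of 200 simulated studies that detect the (real) effect at p < 0.05
    detected = np.mean([run_trial(n) < 0.05 for _ in range(200)])
    print(f"n = {n:4d} per group: effect detected in {detected:.0%} of studies")
```

In this toy setup, studies with only 10 people per group usually miss an effect that is genuinely there, while studies with 1,000 per group detect it almost every time, which is exactly why small studies deserve extra skepticism.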
6. Were the right conclusions drawn?
E.g. was a causal relationship found between the variables or just an association? Are there possible alternative explanations?
A frequent mistake in interpreting study results that include two related variables is thinking that one variable must be causing the other. E.g. if you study what makes some people grow older than others, and you find that your oldest subjects tend to eat a lot of blueberries, that does not automatically mean that eating blueberries causes people to live longer. (Maybe it’s the other way around and old age increases one’s preference for blueberries. Or there is a third factor in play influencing both variables, e.g. leading a healthy lifestyle.) You can only prove a causal relationship between two variables if you do a proper experiment, where you have two comparable groups of test subjects, manipulate one of the variables (e.g. the amount of blueberries they eat) and then measure the other variable (e.g. how old they become). Unfortunately, this type of study is expensive and takes a lot of time, but without this design it’s hard to be sure.
You also want to check whether the study’s results might have alternative explanations that the claimants may have overlooked.
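To make the blueberry example concrete, here is a minimal sketch, assuming Python with numpy and using entirely invented numbers, in which a hidden "healthy lifestyle" factor drives both blueberry consumption and lifespan. The two variables end up clearly correlated even though, by construction, neither causes the other.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical third factor: how health-conscious someone is (0 = not at all, 1 = very).
healthy_lifestyle = rng.uniform(0, 1, size=n)

# Both variables depend on the lifestyle factor, but NOT on each other.
blueberries_per_week = 5 * healthy_lifestyle + rng.normal(0, 1, size=n)
lifespan_years = 70 + 10 * healthy_lifestyle + rng.normal(0, 5, size=n)

# The two variables are clearly correlated...
r = np.corrcoef(blueberries_per_week, lifespan_years)[0, 1]
print(f"correlation between blueberries and lifespan: r = {r:.2f}")
# ...yet in this simulation eating blueberries has zero causal effect on lifespan:
# the association is produced entirely by the hidden lifestyle factor.
```

An observational study of these simulated people would find a solid association between blueberries and longevity, yet there is no causal link at all, which is exactly the trap described above.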
Unfortunately, answering these questions can still take quite some time. Even checking out an apparently simple claim like “warming up before sports prevents injuries” can easily take you a few hours of online research, if not longer. And for complicated subjects, it’s even worse.
In addition, you will constantly have to fight some major weaknesses in our human reasoning capacity. Psychologist Michael Shermer, for instance, wrote in his book The Believing Brain that research shows people tend to form beliefs quickly, on emotional and social grounds, and much less on a careful evaluation of the evidence. And once a belief is formed, they subconsciously seek out confirmatory evidence and ignore disconfirmatory evidence, an effect that is called confirmation bias.
So finding out the truth is hard work. The key is to remain an open-minded and critical thinker and demand quality evidence before accepting a claim; eventually the truth should reveal itself.