My bubble is a kind of echo chamber of peers. It grew out of my upbringing, friends, family, work, culture and other environmental factors. It's a comfortable place, but it unconsciously shapes all my opinions, and dissenting voices go unheard. A painful confrontation with reality lurks, so it's time for an escape plan.
My bubble is filled with highly educated, progressive, socially conscious, vegetarian, yoga-practicing folks from Amsterdam (The Netherlands). Friendly people who worry about refugees, racism, climate change, privacy, work-life balance and right-wing populism.
As long as our beliefs are correct, there is nothing to worry about. But sometimes we’re wrong, usually without realising it. Did Geert Wilders actually say something meaningful the other day? Might nuclear energy also have benefits? Are there also arguments in favour of genetically modified foods? We probably missed those memos.
The problem is that false beliefs often lead to wrong actions. For example: if you do not believe in climate change, why would you change your energy consumption? If you think that vaccinations cause autism, it's unlikely you'll have your children vaccinated. And if you think that gay people are sinners, you will not accept their rights.
Practically everyone lives in a bubble, and some bubbles are even worse than mine. Think of the people of North Korea, members of the Church of Scientology or supporters of the Taliban.
Your bubble also influences your media consumption. Take for instance the upcoming presidential elections in the United States. On the one hand you have the Republican Trump supporters, who watch the Fox News channel, read the Wall Street Journal and listen to Rush Limbaugh's conservative talk radio show. On the other hand you have the Democratic Clinton supporters, who mostly watch CNN, read the New York Times and listen to National Public Radio. As a consequence, each group hardly knows what is happening on the other side and doesn't understand the other's motives. By now they basically view each other as beings from different planets.
You would expect that with the advent of the internet this ideological compartmentalisation would be reduced, because new information and ideas have become more accessible. Our global network certainly has this potential. People in isolated areas who have come online are introduced to a new world of information (assuming they understand the language and their local government does not censor it). Social media can also play a crucial role in revolutions like the Arab Spring.
On the other hand, the internet can actually strengthen your bubble: more information is available than ever before, and you can easily limit yourself to the ideas you already believed anyway. Even the really weird ones. You still believe the earth is flat? Visit the Flat Earth Society website. Always thought that 9/11 was an inside job by the Bush administration, or that the moon landings were faked? You will find more than enough conspiracy websites, blogs and YouTube channels on this. And by following your fellow believers on Twitter, befriending them on Facebook and chatting with each other in forums, you will only get your beliefs confirmed further.
A revealing example of how people prefer to remain in their "idea silos", even when buying books online, comes from a study of the American presidential elections of 2004 and 2008. A social network analyst looked at which combinations of political books were bought on Amazon, using Amazon's 'customers who bought this book also bought…' feature. When he then visualised every book as a dot and drew lines between books that were often bought together, two clouds became visible: one of closely related pro-Democrat books and one of closely related pro-Republican books, with hardly any connections between the two. In other words, readers mostly limited their purchases to books of one party and rarely bought books from both.
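The analysis behind that study can be sketched in a few lines: treat each book as a node, each "also bought" pair as an edge, and look for disconnected clusters in the resulting graph. Below is a minimal, stdlib-only sketch with invented book titles (the real study of course used actual Amazon co-purchase data):

```python
from collections import defaultdict

# Hypothetical co-purchase pairs: each tuple means "customers who bought
# the first book also bought the second". Titles are invented for illustration.
co_purchases = [
    ("Blue A", "Blue B"), ("Blue B", "Blue C"), ("Blue A", "Blue C"),
    ("Red X", "Red Y"), ("Red Y", "Red Z"), ("Red X", "Red Z"),
]

def clusters(edges):
    """Group books into connected components of the co-purchase graph."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, components = set(), []
    for node in graph:
        if node in seen:
            continue
        # Depth-first traversal collects everything reachable from this node.
        stack, comp = [node], set()
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(graph[n] - comp)
        seen |= comp
        components.append(comp)
    return components

print(sorted(len(c) for c in clusters(co_purchases)))  # prints [3, 3]
```

With no edges crossing the partisan divide, the graph falls apart into two separate clusters — exactly the two "clouds" the analyst saw in the visualisation.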
Why are we so vulnerable to this kind of tunnel vision? The psychologist and leader of the American skeptical movement, Michael Shermer, has been interested in this subject for a long time. His book The Believing Brain from 2011 is a fascinating summary of 30 years of research on why people believe.
Shermer’s conclusion is alarming: people don’t form their beliefs by accurately weighing the evidence in favour or against an issue, but make quick intuitive decisions on emotional grounds in a social context consisting of family, friends, colleagues, culture and the society at large.
And once a belief has formed, the believer tries to rationalise it and the confirmation bias emerges. This bias is the (largely unconscious) tendency of people to quickly believe new information that supports their current conviction and to ignore information that contradicts it.
Researchers have demonstrated this bias in many areas including the judicial system, investing, politics, education, aviation, and science itself.
For instance, a study was done at Stanford University in the U.S. among people who were strongly in favour of the death penalty or strongly against it. During the experiment they got to see both a study that showed that the death penalty was effective in reducing crime and a study that showed that it was not effective. (Both studies were made up by the researchers and of equal quality.) After seeing the first study, the participants adjusted their views a little in the direction of the presented study, as you would expect. But after seeing both studies, they were back at their original conviction, and even to a greater extent than before the experiment! Interviews revealed that participants regarded the study that supported their views as one of good quality and noticed a variety of methodological shortcomings in the other study.
Professional forensic experts don't seem to fare any better. One study showed, for instance, that if fingerprint specialists believe that a suspect is guilty, they are more likely to see a match with a fingerprint found at the crime scene than when they believe the accused is innocent. The researchers argue that the same confounding effect occurs with facial recognition in composite sketches or in a line-up, with the interpretation of polygraph results, in DNA analysis, and in voice and handwriting recognition.
The role that emotion plays in the confirmation bias was demonstrated in a study involving statements made by U.S. presidential candidates in 2004. In this experiment, voters with a strong preference for one of the candidates, George W. Bush or John Kerry, got to see some of their statements (or statements from a neutral person) in which the speaker seemed to contradict himself. Bush voters noticed Kerry's contradictory statements more, while Kerry voters noticed Bush's more. (No difference between the two groups was found when a neutral person had made the statements.) MRI recordings of the voters revealed that when they read conflicting statements from their favourite candidate, brain regions associated with negative emotions were active. It seems that voters then tried to resolve this negative emotional state by rationalising the conflicting data away.
How to escape?
The confirmation bias turns out to be ubiquitous, largely unconscious and difficult to avoid. Seen this way, escaping your bubble can seem hopeless.
Yet there are some things you can do. One of them is deliberately seeking out sources that challenge your beliefs: read the opposition's newspaper, try out a foreign news channel, follow an opponent on Twitter and try to follow her arguments. Occasionally looking over the fence does not have to take much time and will at least give you an unfiltered look at the enemy camp.
You could also occasionally ask yourself what would be required for you to change your mind on an important issue. What data or arguments would have to surface, for example, for you to become pro gun ownership, against abortion, or pro the death penalty? Just as every hypothesis in science has to be falsifiable (because otherwise it can never be proven incorrect), so should every conviction be vulnerable to proper criticism. If you cannot think of anything that would make you change your mind on a certain subject, then your belief could be a dogma and you're trapped inside your bubble.
If you really want to break out of your bubble, you will also have to discuss matters with the opposition, even though this can be quite frustrating.
Someone who frequently gives this a try is the American neuroscientist and writer Sam Harris. On YouTube, in his podcasts and on his blog, he speaks with people from all kinds of backgrounds about what he calls 'toxic' subjects, such as religion, racism, terrorism and violence. And that's not easy. Sometimes there is simply not enough mutual respect to achieve a good discussion, and Harris's intentions and arguments are regularly distorted afterwards in the media because of the combustible nature of the subjects. But the conversations are fascinating for sure. Harris sees this war of ideas as the only way to ultimately achieve a flourishing global society.
Finally, it really helps if you know how to assess the quality of new data, whether that data supports your point of view or not. For example, if a study claims that Moroccan immigrants in the Netherlands are more criminal than Dutch natives or that vegetarians live longer than meat eaters, you should be able to get an idea about the trustworthiness of that conclusion.
The best tool we've got for this is the scientific method. It is specifically designed to counter all kinds of shortcomings and biases in human perception and reasoning. This is not to say that scientists don't make mistakes (they are only human), but the scientific and technological progress all around us shows that the method works. So if you've never heard of things like control groups, replication studies, peer review and double-blind tests, it wouldn't hurt to read up on this method.
Education also plays an important role here, of course. In our digital age of information overload, assessing the reliability of online information has become an indispensable skill. Critical thinking, online fact-checking and research methodology should therefore become key subjects in schools.
Completely freeing ourselves from our bubbles won't happen anytime soon, but actively fighting them is a good start.
Other critical thinking articles
- What to believe? (2016)
- Meet the Watchdogs of Science: Ben Goldacre & Mark Henderson (2012)
- Mass Psychogenic Illness – How to Survive a Panic Epidemic (2012)
- Why We Deny (2011)