We scientists like to think of ourselves as people who react to facts and only believe things backed by concrete evidence. We get very angry at people who ignore obvious facts and fight the majority of the scientific community, like climate-change and vaccination deniers. Indeed, many scientists can’t even understand why anyone disagrees, given the mountain of evidence that climate change is real and that vaccines help vastly more people than they hurt. So, why does this happen?
Well, interestingly, it turns out that humans are somewhat complicated. Who knew? It also turns out that scientists are definitely not immune to the phenomenon of ending up on the wrong side of an argument.
When a person has an idea about how something works, it is typically called a theory. As a scientist, the person would probably like to make sure that this theory is true. So, they write a paper and show some evidence that supports their theory. They might write a grant, and if they are really lucky (with funding rates at about 10%–15%, they have to be really, really lucky), they get some funds to take their theory and try to prove that it is the law of the land. To do this, they may run some statistical analysis or model simulations or whatever. If their results support the theory, they publish another paper or two and then try to do more. Some people spend their whole careers working on a single topic.
If the theory is interesting and/or important, other people will pick up on it. They will also try to prove that it is true or try to disprove it. They may take a completely separate approach to the problem, which is an incredibly good thing.
Now, at this point, we have a trap. If scientists were robots, they would look at the evidence dispassionately and objectively evaluate the merits of the theory and the evidence for and against it. But, sadly, they are not. Some scientists start to believe that their theories are true, and begin seeing that truth in the data. One podcast that I listened to pointed out an easy way to do this innocently: keep adding more events, one at a time, until it seems clear that the theory is the predominant mechanism. (The example they gave: you believe that when you flip a certain coin, you will get heads more often than tails. So, you flip the coin ten times and get something like 6 heads and 4 tails. You decide that you need more data, so you flip two more times, both of which come up heads. Now you have 8 heads and 4 tails. At this point, a proper statistical analysis will point out that this is not really significant, since you only have 12 events. But, because you have sunk a huge amount of time and energy into each coin flip, it is a tempting place to stop and declare victory. “Special coin gets 100% more heads than tails when flipped!”)
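For readers who want to check the coin-flip arithmetic themselves, here is a quick back-of-the-envelope calculation in Python (using only the standard library; the function name is just for illustration). It computes the chance that a perfectly fair coin would produce a result at least as head-heavy as 8 heads in 12 flips:

```python
from math import comb

def one_sided_p(heads, flips, p=0.5):
    """P(X >= heads) for a coin with heads-probability p:
    the chance of a result at least this head-heavy by luck alone."""
    return sum(comb(flips, k) * p**k * (1 - p)**(flips - k)
               for k in range(heads, flips + 1))

# The example from the text: 8 heads out of 12 flips.
p_value = one_sided_p(8, 12)
print(f"P(>= 8 heads in 12 fair flips) = {p_value:.3f}")  # ~0.194
```

A fair coin gives 8 or more heads in 12 flips about 19% of the time, nowhere near the conventional 5% threshold for significance. And even that calculation is too generous, because stopping the experiment the moment the data look good (rather than fixing the number of flips in advance) inflates the chance of a false positive.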
We, as humans who have sunk a lot of energy into our research, believe in our own theories. We talk to people who back up our opinions. We dig in. We become stubborn and refuse to see any other viewpoint, even when the preponderance of evidence starts to stack up against us. Our science becomes a belief – something that is no longer really based on fact, but on a desire for it to be true.
A good scientist, when confronted with objective facts that disprove their theory, will withdraw it and state that it was not true. This might not be a public event or anything, but they will probably stop publishing on that topic. Not always, though. There are plenty of stories of researchers who held onto their beliefs long after the community had moved on.
On the opposite side, we can sometimes be stubborn about embracing things that are most likely true. As a personal example, I really don’t like the idea of dark energy or dark matter. Cosmologists have noted that the universe behaves differently than can be explained by the distribution of the stuff we can see: galaxies hold together as if they contain far more mass than is visible, and the expansion of the universe is accelerating when there is nothing visible to drive it. So, they have come up with the idea of dark matter – a substance that exists in our universe but that we can’t see or interact with in any real way – to supply the missing mass, and dark energy to push the universe apart. There are experiments where people try to detect dark matter directly, but so far they can’t. I look at this and think that it is crazy talk. But I think that the majority of physicists believe that it is real. And when I say “believe”, I mean that they look at the evidence for it and against it and pass a judgement. They feel that the evidence supports dark matter and dark energy existing.
In many ways, science is all about belief. We believe that some things are true and some things are not true, based on evidence. If we feel like the evidence is strong enough, then we believe it. If we feel like the evidence is not strong enough, we don’t.
Sometimes it is in our own personal interest to believe something to be true or false. For example, a lot of companies have a huge amount invested in oil-based infrastructure. This leads them to not want to believe that they are leading the world into a horrifying future. It is definitely not in their interest to believe that they are the cause of climate change, which could ultimately displace more than a billion people from their homes when the ice melts and put Florida under water. Who would want to believe that? It is much easier to deny that it is happening and look for any chinks in the theory.
It is very hard to fight this, since it is human nature to be invested in ideas and things that you have put a lot of energy into. (You are telling me that my whole life is a lie???) We are irrational. We, as scientists, need to come up with some other way to communicate with other people besides just stating facts and arguing. This doesn’t even work on other scientists most of the time.
One thing is for sure, though: making a movie about a world covered in water (starring Kevin Costner) won’t convince people that climate change is real.