Tuesday, May 3, 2011

The unchanged mind.

Author Chris Mooney wrote an article a while back titled The Science of Why We Don't Believe Science. In it, he discusses the various obstacles to getting the public to accept science. The problem is clear-cut: the anti-vaccine movement is growing, belief in evolution is remarkably low, and alternative medicine practices proven not to work, like homeopathy, continue to have millions of practitioners. Mooney tries to explain why, in the face of such evidence, people continue to believe:

The theory of motivated reasoning builds on a key insight of modern neuroscience (PDF): Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it. That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

In other words, information that contradicts our beliefs, or the patterns we feel we've witnessed, is fought outright or fled from more readily than information that coincides with them. When a study flies in the face of what we accept, we scrutinize it far more harshly than studies that agree with us, or we ignore it outright. I know I've fallen victim to this kind of reasoning before; what I believe heavily influences what I'm inclined to accept at face value. Some of this stems from my religious upbringing, some of it from my nature. Even now that I strive to follow the road of the skeptic, I fall victim to it. I know this and try to work on it.

What this really means, though, is that the more certain we are that we are right, the more difficult it is to change our minds. Anyone who has tried to get into a legitimate debate with someone ideologically opposed will find that it's nigh impossible to truly change minds. Altering one's opinion is a lengthy process, and a single discussion will not sway minds. It's why, as a general rule, I avoid debating the most passionate; they are good arguers and have at the forefront of their minds arguments that I would need to sit down and contemplate before rebutting. It's easy to see what we want to believe and ignore what we don't, whether we be scientists or lay people.

Looking at all of this, I've come to a broad conclusion: I cannot legitimately discuss a point with someone who claims to have never changed their mind. It doesn't have to be about faith. They could even eventually swing back to their original ideas. The more firm and unyielding we become, the less willing we are to admit our own foolishness. I feel uncomfortable around anyone who speaks so brazenly that they cannot admit they're wrong, and when that feeling is directed at me, the unease explodes to new levels.

This isn't to say that we don't know anything, or that we are more often wrong than we are right. What it does say is that we need to be aware of our tendency to cherry-pick data and information, and we need to constantly seek out criticism of our ideas to see how well they withstand open critique. I suspect many look at this as further evidence that science is flawed, that we don't know what we're talking about, and that we shouldn't knock homeopathy or blindly accept evolution because science "requires as much faith" or "means we're probably just looking for studies that confirm what we believe". The difference is in the scientific method, in our efforts to continually refine ideas and eliminate bias as much as possible. Forged results like the Andrew Wakefield study eventually get rooted out, because there will eventually be enough data that the more fallacious results will not withstand the test of time. Certainly individual studies and opinions may turn out to be wrong, but over the broad history of science, the truth tends to win in the end.

And maybe it doesn't. Maybe tomorrow I'll find something that tells me I'm absolutely, completely wrong in my assumptions about science. I don't think so, and as I grow older and wiser, things seem to more elegantly coincide in support of science and away from my meager, small-minded instincts. As Tim Minchin says in his song Storm:

If you show me/ That, say, homeopathy works,/ Then I will change my mind/ I’ll spin on a fucking dime/ I’ll be embarrassed as hell,/ But I will run through the streets yelling/ It’s a miracle! Take physics and bin it!


  1. I remember reading that article a few weeks ago on why we don't believe science and thought it was an interesting read. I was, however, a little put off by how biased it was from a political standpoint. I remember thinking "what a dirty liberal" several times while reading through it, but the message it was trying to send about people being beholden to their personal beliefs is a good one.
    I think your decision to avoid discussion with people who are unwavering in their beliefs is an interesting one. Usually those are the beliefs and issues in our society that cause the most harm when people do not understand them. It's not fun 'arguing with idiots', as one comedian put it, but sometimes those idiots pose a clear and present danger to the public, and I for one think the fight is worth fighting regardless of how relentless and unyielding the opposition is. Eternal vigilance is a good thing to have. There's always the chance that, had you argued your opinion when you decided not to, that person might have 'seen the light'. But then, you have a blog now, so I guess this is a pretty easy way of getting your opinion out there for the entire world to read.

    Anyway, good first post! I enjoy reading what you have to say, David.

  2. Chris Mooney is decidedly liberal, there's no question about that. If you haven't noticed by now I'm fairly heavy on the liberal side myself, so I don't always notice. Still, yes, I'm quite fond of trying to analyse my own arguments. Interestingly, I read another report recently suggesting that we humans are such good arguers from an evolutionary perspective not to help us see the truth but to win arguments.

    And yes, those beliefs that people are locked in on no matter the evidence are the ones that cause the most harm, but there are more productive things to do than argue pointlessly. I'm not the best at arguments because I tend to want to think on a topic for a while before I legitimately respond, and I don't have the wit to come up with a response "from the hip," so to speak.

    And thanks, Adam. I'm glad you appreciate my thoughts. :)