Notes on Stupidity

I stumbled upon a great definition of science in The Atlantic:

“Arguably, science is the gradual process by which the cognitive parts of our brains discover the profound inaccuracies in our deeper, evolutionarily built-in models of the world.”

Of course, this hardly captures the totality of what science is (the methods of accumulating, testing, verifying, and reproducing results), but it does address our fundamental inability to perceive the world with any reliable degree of accuracy.

Science does not deal in certainties, and contrary to popular usage of the term, a scientific theory is more than just a guess: it is a comprehensive explanation that accounts for the information known at the time.  As more information accumulates, as advances in technology allow us to see more deeply into the inner workings of the natural world, and as more scientific minds rethink what is known, old theories have to give way to new ones.  Scientific worldviews are built to be replaced.

But human beings aren’t built for uncertainty.  Not only do we like to know things, we like to think that we already know things pretty well.  It’s one thing to try to convince someone that what they’re certain of is incorrect; it’s quite another to persuade them not only that their worldview is wrong, but that the very idea of legitimately possessing a fixed worldview is invalid.

As any social critic or contrarian is aware, the validity of one’s argument will sometimes be judged by one’s answers to non sequiturs like: “If that’s the problem, what’s your solution?” or, “How would you do it better?”  To the extent that people are willing to surrender their certainties, it is far more comfortable to trade them in for new ones.  Skepticism and doubt can look like nihilism from afar, and most of us can concede that amongst the many hallways of the human mind, there are certain doors we may wish to keep shut, even if we disagree on which ones.

Which brings me to the next quote:

“In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds.  In fact, they often became even more strongly set in their beliefs.  Facts, they found, were not curing misinformation.  Like an underpowered antibiotic, facts could actually make misinformation even stronger.”

This is from an article entitled “How Facts Backfire,” which uses the study cited above to argue that the foundational myths of democracy have been laid upon the swamp water of confirmation bias.  How can we have an informed citizenry when the very act of getting informed runs so contrary to our evolutionary journey from single-celled life forms to hairless weakling apes that occasionally vote for the red team or the blue team?

Is this not why evil always wins?  Marketers, populists, the ugly talking heads on cable whose punditry is molded out of appeals to emotion.  Maybe.  But the idea that evil does always win is an untested hypothesis founded on primitive binary thinking.  It’s reassuring to watch others succumb to their own fallacious logic; delusional to presume that we ourselves have somehow risen above it.