Sometimes, a simple explanation is better.
A recent article in Alternet asked an important question: Why are some people so resistant to science and evidence?
Currently, there are three important issues on which there is scientific consensus but controversy among laypeople: climate change, biological evolution and childhood vaccination. On all three issues, prominent members of the Trump administration, including the president, have lined up against the conclusions of research.
This widespread rejection of scientific findings presents a perplexing puzzle to those of us who value an evidence-based approach to knowledge and policy.
Agreed. So far, so good.
The author of the piece, a psychologist, then notes that many people resist complexity and shades of gray; they live in an either-or, black or white universe, and are extremely uncomfortable with "non-dichotomous" thinking. He notes that this characteristic is a factor in depression, anxiety, aggression and, especially, borderline personality disorder.
In this type of cognition, a spectrum of possibilities is divided into two parts, with a blurring of distinctions within those categories. Shades of gray are missed; everything is considered either black or white. Dichotomous thinking is not always or inevitably wrong, but it is a poor tool for understanding complicated realities because these usually involve spectrums of possibilities, not binaries….
In my observations, I see science deniers engage in dichotomous thinking about truth claims. In evaluating the evidence for a hypothesis or theory, they divide the spectrum of possibilities into two unequal parts: perfect certainty and inconclusive controversy. Any bit of data that does not support a theory is misunderstood to mean that the formulation is fundamentally in doubt, regardless of the amount of supportive evidence.
Similarly, deniers perceive the spectrum of scientific agreement as divided into two unequal parts: perfect consensus and no consensus at all. Any departure from 100 percent agreement is categorized as a lack of agreement, which is misinterpreted as indicating fundamental controversy in the field.
The article goes on to explain that people whose minds work this way will latch onto any anomaly or disagreement, any "non-consistent" factoid, as confirmation that the entire theory (evolution, climate change, the efficacy and safety of vaccination) is bogus.
Where I part company with the author is his willingness to see this "conceptual approach" as a sign of mental maladaptation, an indicator of other (generally mild, but troubling) mental illness. Although I'm certainly willing to concede that this may sometimes be the case, a couple of other explanations are more consistent with Occam's razor: the principle that, when presented with competing hypothetical answers to a problem, one should select the answer that requires the fewest assumptions.
In other words, simpler is likelier.
Among the elected officials who dismiss climate science, for example, are a significant number whose campaign coffers are regularly replenished by fossil fuel companies. I suspect these lawmakers' expressed opinions are more convenient than sincere.
And if I may be permitted a decidedly un-politically-correct observation, a genuine inability to understand the difference between the scientific method and religious dogma (the inability to recognize the difference between empirical evidence and a preferred and comforting worldview) may be a sign of limited intellectual capacity.
In other words, these people aren’t mentally ill. They’re just not very smart.