Why Are Science and Politics So Hard?

Climate change, the effects of the Affordable Care Act, environmental hazards of fracking, the effects of widespread gun ownership on crime — all of these are questions that should be answerable by science or mathematics. Somehow, though, they never seem to be.

Of course, the political left has had an explanation for this: conservatives are not grounded in reality the way liberals are. Chris Mooney has made rather an industry out of this with his books The Republican War on Science and The Republican Brain: The Science of Why They Deny Science–and Reality, and of course the political left has long tried to label itself “the reality-based community.” Recently, Salon reprinted an article by Marty Kaplan, originally published in AlterNet, that is based on an article by Chris Mooney in Grist, which was in turn based on the paper “Motivated Numeracy and Enlightened Self-Government,” posted on SSRN by Dan M. Kahan and others.

Here’s how Kaplan summarizes it:

[S]ay goodnight to the dream that education, journalism, scientific evidence, media literacy or reason can provide the tools and information that people need in order to make good decisions.  It turns out that in the public realm, a lack of information isn’t the real problem.  The hurdle is how our minds work, no matter how smart we think we are.  We want to believe we’re rational, but reason turns out to be the ex post facto way we rationalize what our emotions already want to believe.

Kaplan then goes on to summarize two papers by Brendan Nyhan and Jason Reifler. I’m just going to quote a couple of his summary paragraphs.

  • People who thought WMDs were found in Iraq believed that misinformation even more strongly when they were shown a news story correcting it.
  • People who said the economy was the most important issue to them, and who disapproved of Obama’s economic record, were shown a graph of nonfarm employment over the prior year – a rising line, adding about a million jobs.  They were asked whether the number of people with jobs had gone up, down or stayed about the same.  Many, looking straight at the graph, said down.

Now, here’s the interesting thing about these: in both cases, the “right” answer can be confirmed to be factually incorrect.

In the case of Iraqi WMD, specifically chemical weapons, while it was widely reported that none were found, chemical weapons were in fact found, albeit in smaller quantities than expected. Here’s the list, via Wikipedia but drawn from UNMOVIC documents:

  • 50 deployed Al-Samoud 2 missiles
  • Various equipment, including vehicles, engines and warheads, related to the AS2 missiles
  • 2 large propellant casting chambers
  • 14 155 mm shells filled with mustard gas, the mustard gas totaling approximately 49 litres and still at high purity
  • Approximately 500 ml of thiodiglycol
  • Some 122 mm chemical warheads
  • Some chemical equipment
  • 224.6 kg of expired growth media

What’s more, WikiLeaks documents showed:

… for years [after Iraq fell], U.S. troops continued to find chemical weapons labs, encounter insurgent specialists in toxins and uncover weapons of mass destruction.

An initial glance at the WikiLeaks war logs doesn’t reveal evidence of some massive WMD program by the Saddam Hussein regime – the Bush administration’s most (in)famous rationale for invading Iraq. But chemical weapons, especially, did not vanish from the Iraqi battlefield. Remnants of Saddam’s toxic arsenal, largely destroyed after the Gulf War, remained. Jihadists, insurgents and foreign (possibly Iranian) agitators turned to these stockpiles during the Iraq conflict – and may have brewed up their own deadly agents.

Similarly, with reference to jobs, the flaw is that while employment had risen from its recession lows, the total number of jobs, and of people employed, was still lower than it had been before the downturn began.
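
A toy calculation makes that ambiguity concrete. The figures below are hypothetical round numbers, not actual BLS data; they only illustrate how a one-year graph can slope upward while total employment remains below its earlier peak, so whether jobs are “up” or “down” depends on which baseline the respondent has in mind.

```python
# Hypothetical figures (millions of jobs), NOT actual BLS data; chosen only to
# show how both readings of the employment graph can be defended at once.
employment = {
    "pre-recession peak": 138.0,
    "trough":             130.0,
    "one year later":     131.0,   # the "rising line, adding about a million jobs"
}

change_over_prior_year = employment["one year later"] - employment["trough"]
change_from_peak = employment["one year later"] - employment["pre-recession peak"]

print(f"Change over the prior year:         {change_over_prior_year:+.1f} million")
print(f"Change from the pre-recession peak: {change_from_peak:+.1f} million")
```

On these made-up numbers the graph’s answer is “up about a million” while the answer measured against the earlier peak is “down seven million,” which is exactly the wiggle room the survey question leaves open.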

The point is that, like a whole lot of social science research, the results and their interpretations aren’t unequivocal. One interpretation is that “conservatives” are stubborn in their views and unwilling to change them when presented with contradictory news stories; the other is that “conservatives,” or at least people who believe that Saddam did have gas weapons, are better informed and so aren’t inclined to be swayed by a single news story that they may well consider untrustworthy.

Now, let’s look at the Kahan et al. paper, “Motivated Numeracy and Enlightened Self-Government.” It takes an interesting approach to a real question: how do people evaluate quantitative (that is, numerical) data? The authors ran an experiment in which subjects were assessed for “numeracy,” the ability to reason about numerical quantities, as well as for their political leanings. Subjects were asked to make a decision about relative likelihoods, a kind of problem that people often find difficult and unintuitive: first a problem about whether a skin-rash treatment worked, and then a structurally identical problem, with artificial data, about whether prohibiting concealed carry would increase or decrease the crime rate.
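
To make the task concrete, here is a minimal sketch of the kind of 2×2 problem the paper describes. The counts are illustrative numbers of my own, chosen so that the intuitive shortcut (compare the raw “got better” counts) points the opposite way from the correct comparison (compare the proportions); relabel the rows as cities that did or did not ban concealed carry and the columns as crime going down or up, and you have the politically charged version of the same arithmetic.

```python
# Illustrative counts only; these are not the paper's data, just numbers chosen
# to reproduce the trap built into this kind of 2x2 covariance problem.

def improvement_rate(better: int, worse: int) -> float:
    """Fraction of cases in a group that improved."""
    return better / (better + worse)

treated   = {"better": 223, "worse": 75}   # patients who used the skin cream
untreated = {"better": 107, "worse": 21}   # patients who did not

rate_treated   = improvement_rate(**treated)
rate_untreated = improvement_rate(**untreated)

# The tempting shortcut compares raw "better" counts (223 vs. 107) and concludes
# the cream works.  The correct comparison is between the two proportions.
print(f"improved with cream:    {rate_treated:.1%}")    # ~74.8%
print(f"improved without cream: {rate_untreated:.1%}")  # ~83.6%
print("cream helps" if rate_treated > rate_untreated else "cream appears to hurt")
```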

The results were that on the skin-rash problem, the more numerate people were, the more likely they were to find the “correct” answer. On the gun-prohibition question, however, people who considered themselves “liberal” were more likely to infer that gun prohibition would decrease crime, while people who considered themselves “conservative” were more likely to infer that it would increase crime.

Kahan et al. interpret this as showing that even highly numerate people are likely to skew their decisions to match beliefs they already hold, when those beliefs have become important to their identification with their group.

To which I say “no kidding? Y’think?”

What’s absent is the obvious next point: that people hold those beliefs on what they consider to be a rational basis, and so when presented with a difficult decision problem, they unconsciously factor those beliefs into their decisions.

People across all points of the political spectrum like to pretend that there is some vast edifice called Science that is intended to arrive at Truth, but the reality is that science is a social process itself, carried out by human beings, with human motivations and frailties. I see no way around it — or rather I think all the other possibilities are worse. Part of that process is what I’ve called the “social contract of science”: the understanding that your work must be presented as clearly as possible, that you must make your data available for critical inspection, and that you expect your results to be examined with a critical eye and subject to robust debate.

What Kahan’s and Nyhan’s papers may accidentally have done, though, is call into question whether social science research of this kind can ever be done in a trustworthy way. It “deconstructs itself,” if you will, because it suggests that people’s reasoning, mathematical numeracy, and so forth become impaired when they’re presented, as “fact,” with data that contradicts their current opinions. In Nyhan and Reifler’s paper, we see questions where a common opinion is presented as “fact,” but where there is significant contradictory information available to anyone who has looked into the question. Nyhan and Reifler think a newspaper article should settle the question, and consider it a sign of irrationality when it doesn’t. The possibility that one can rationally question the interpretation of the “facts” they believe correct isn’t considered.

That appears to me to be a flaw in the whole experiment: the people coming to this question aren’t blank slates; they bring their past knowledge with them when they sign the releases. There’s no way to determine whether the people who chose the “wrong” answer did so because they were simply better informed than Nyhan and Reifler, or whether the interpretation of what is “right” is determined as much by the authors’ own cognitive biases as by the facts. But the paper made it through the experimental design, through the actual experiment, through data reduction and peer review and on to publication, and I would bet cash money that no one ever asked the authors, “Are you sure that no WMD were ever found?”

It would be interesting to perform the same experiment with the premise reversed: present the claim that no WMD were found, and then challenge the subjects with some of the sources I linked above.

It would be interesting not just to see whether anyone’s opinions would change, but also to see whether a paper based on those “facts” could make it through the same peer-review process at all.
