A premise of democracy that I believe is worth defending is that it is incumbent on those seeking either change or the preservation of the status quo to define and defend their arguments, even against robust criticism, and even against seemingly stupid and evil opinion. Needless to say, I also believe that this principle is entirely absent from the green argument. Instead, the environment’s putative voices have preferred to question the intellectual capacity and moral character of their critics, no matter how big a question mark that puts over their own hearts and minds. The most significant development in this regard has been the recruitment of the cognitive and behavioural sciences into the climate debate, with their own ‘standards’ of evidence. Yet more recent developments have shone more light on this dark tendency.
While I was putting together the previous post, I was interested in where David Grimes was taking his claims from. For example, Grimes wrote:
Conspiratorial beliefs, which attribute events to secret manipulative actions by powerful individuals, are widely held [1] by a broad cross-section of society.
The basis for this claim was a 2008 paper by Cass R. Sunstein and Adrian Vermeule, called simply ‘Conspiracy Theories’. As was pointed out, Sunstein and Vermeule’s claim was itself second-hand. More troubling, the second-hand evidence had little academic rigour of its own, and moreover it tried to establish belief in conspiracy theories in a place that had seen a massive incident in its very recent history. Any traumatic private or public event is bound to seed the formation of such beliefs. The failure of any public institution to do what people expect of it will rightly raise questions about that failure, prompting hypotheses where convincing explanations are lacking, or where attempts to avoid responsibility are suspected. We should therefore be suspicious of research into the phenomenon of ‘conspiracy theories’ which takes no account of their context. What is its motivation?
The abstract of Sunstein and Vermeule’s paper reads as follows:
Those who subscribe to conspiracy theories may create serious risks, including risks of violence, and the existence of such theories raises significant challenges for policy and law. The first challenge is to understand the mechanisms by which conspiracy theories prosper; the second challenge is to understand how such theories might be undermined.
I believe we have answered the question ‘why do conspiracy theories prosper’. One only needs to look as far as the caricature of the conspiracy theorist to understand that the condition of conspiracy theorising is a relationship of distrust.
The character played by Mel Gibson in Conspiracy Theory was the archetypal conspiracy theorist: able to accumulate lots of information, but inclined to over-associate and to marshal the facts accordingly. As the film shows, the conspiracy theorist’s paranoia, demeanour and distrust of all forms of official authority isolate him, further fuelling his alienation. The whack-job has no credibility.
But rather than probing the reasons for the phenomenon of distrust in society, the paper’s motivation is more interesting: ‘the second challenge is to understand how such theories might be undermined’. Why is this a challenge? What kind of threat is the lonely, isolated nutter?
I was wondering where I had heard Sunstein’s name before, but it didn’t occur to me to look until after the post. Amazon provided the answer…
We are all susceptible to biases that can lead us to make bad decisions that make us poorer, less healthy and less happy. And, as Thaler and Sunstein show, no choice is ever presented to us in a neutral way. By knowing how people think, we can make it easier for them to choose what is best for them, their families and society. Using dozens of eye-opening examples the authors demonstrate how to nudge us in the right directions, without restricting our freedom of choice. Nudge offers a unique new way of looking at the world for individuals and governments alike.
‘Nudge’ always sounds bland enough. But it always seemed to me to treat people as means, rather than as ends. Indeed, ‘nudge’ is always presented as making it easier for people to do the right things. But when was that really the responsibility of the state, and if the state assumes responsibility for making sure people do the right thing, what autonomy is the individual really left with? When does a ‘nudge’ become a shove?
Nudge became especially popular under the previous coalition government, which established a ‘Behavioural Insights Team‘ (BIT), also known as the ‘Nudge Unit’. BIT claim:
We use insights from behavioural science to encourage people to make better choices for themselves and society.
I believe that the right and proper rejoinder to such a mission statement is ‘Foxtrot Oscar’. While the interventions it proposes may seem trivial, it represents one of the concerns that this blog has highlighted: the transformation of the relationship between individuals, the state, and, increasingly, academia. Suffice it to say that the latter’s recruitment into matters of public policy is wholly regressive and anti-democratic, and assumes far too much about its own rectitude, not to say about its ability to understand the choices individuals make better than they do themselves.
“Choice architects” thus became flavour of the month with governments throughout the Anglosphere. Sunstein himself was made head of the Office of Information and Regulatory Affairs by his personal friend, Obama, to nobody’s delight (not even the greens’).
The psychologist-as-bureaucrat, then, reflects the increasing tendency towards official intrusion, not merely into the private sphere, but into the mind itself. And mirroring this is the academy’s increasingly unhealthy interest in mind-probing, as a means to understanding what is happening in society, and how to intervene in it.
The point here, which is made here often, is that ‘politics is prior’ to a great deal of climate research: the presuppositions of environmentalism are routinely passed off as the ‘findings’ of studies which invariably ‘show’ precisely what the green perspective already held. The head-shrinking of the public by… let’s call them ‘psychocrats’… is the broader phenomenon which encompasses, or at least overlaps with, what we have seen in the climate debate, most notably from the likes of Lewandowsky.
That is to say that we can see the politics loading researchers’ questions. The cognitive scientists seeking roles for themselves in policy-making circles would, no doubt, see this as a conspiracy theory… But the questions should be asked, nonetheless, with or without the protection of a tin foil hat: is it just a coincidence that an otherwise not-particularly-remarkable academic has found such favour amongst policy-makers? Are the insights yielded by psychocrats’ research really a sound basis on which to reorganise public institutions? And are psychocrats not using their science as a vehicle for a particular form of politics?
The anxious psychocrat can relax; the point here is not to credit him or her with sufficient nous to have organised a conspiracy, but that they are the useful idiots of people who look up to them as intellectual giants. The point, then, is to put the psychocrat’s anxiety under the microscope — just as we would with any ideology or doctrine that governments embrace.
Grimes and Sunstein have both been bothered by the fact that people who do not believe the right things seem to present a problem for policymakers. The obvious problem here is that such a worry presumes their own infallibility. Grimes, for instance, like much other politically-motivated research into the phenomenon of ‘denial’ (the examples of Lewandowsky and Chris Mooney were given in the previous post), takes belief in climate science as a proxy for belief in science; that is, to take a sceptical view of climate science is to be ‘anti-science’. This is easily debunked: we can find seemingly respectable scientists and scientific institutions involved with, and fuelling, most conspiracy theories. The interesting point, however, is that the corollary of presuming oneself right is to presume the other stupid.
More trouble for the psychocrat has emerged (hat-tip to Paul Matthews) and is summarised over at Dan Kahan’s Cultural Cognition blog…
First, as science comprehension goes up, people become more polarized on climate change.
Still not surprising; that’s old, old, old, old news.
But second, as science comprehension goes up, so does the perception that there is scientific consensus on climate change—no matter what people’s political outlooks are!
Accordingly, as relatively “right-leaning” individuals become progressively more proficient in making sense of scientific information (a facility reflected in their scores on the Ordinary Science Intelligence assessment, which puts a heavy emphasis on critical reasoning skills), they become simultaneously more likely to believe there is “scientific consensus” on human-caused climate change but less likely to “believe” in it themselves!
[…]
One thing that is clear from these data is that it’s ridiculous to claim that “unfamiliarity” with scientific consensus on climate change “causes” non-acceptance of human-caused global warming.
But that shouldn’t surprise anyone. The idea that public conflict over climate change persists because, even after years and years of “messaging” (including a $300 million social-marketing campaign by Al Gore’s “Alliance for Climate Protection”), ordinary Americans still just “haven’t heard” yet that an overwhelming majority of climate scientists believe in AGW is absurd.
[…]
These new data, though, show that acceptance of “scientific consensus” in fact has a weaker relationship to beliefs in climate change in right-leaning members of the public than it does in left-leaning ones.
I can come up w/ various “explanations,” but really, I don’t know what to make of this!
Kahan could save himself some head-scratching by reading this blog, of course. One can take the fact of the consensus for granted without committing to any of the imperatives greens would say it generates. The point being that there is a great deal of ground between observing the effect of CO2 on the planet and claims about what it means, ground which has been obscured by many green advocates’ use of the consensus without regard for its actual substance. Kahan should have realised this, because he is a relatively able critic of the 97% strategy. That is to say that the paradox is not that so many recalcitrant climate sceptics also hold with ‘the consensus’, but that researchers who aimed to measure the public’s understanding of climate have been largely ignorant of the nuances of the debate, if not extremely partial players in it.
Ask a stupid question, as they say…
… And you will get a stupid answer. Thus the psychocrat’s estimation of the public in fact measures only the mind that authored his own facile hypothesis. The more stupid the researcher, the lower his estimation of the public, and concomitantly, the greater utility his work has to psychocracy. This should remind us of Lewandowsky’s attempt to argue otherwise.
Back in 2014, Lewandowsky and Richard Pancost wrote:
It is an unfortunate paradox: if you’re bad at something, you probably also lack the skills to assess your own performance. And if you don’t know much about a topic, you’re unlikely to be aware of the scope of your own ignorance.
[…]
Ignorance is associated with exaggerated confidence in one’s abilities, whereas experts are unduly tentative about their performance. This basic finding has been replicated numerous times in many different circumstances. There is very little doubt about its status as a fundamental aspect of human behaviour.
Lewandowsky was attempting to deploy the alleged Dunning-Kruger effect — which claims that people who do less well in tests of their knowledge over-estimate their performance — in his latest salvo in his war on climate scepticism. Sceptics, he argued, were stupid, and thus over-estimated themselves. But it was Lewandowsky who was claiming too much expertise, as was pointed out here.
The professor of psychology makes bold claims. He believes that he understands the entire world’s relationship to the natural world. He believes he understands the natural world, and professes expertise in climate science. And he believes he knows how society should be organised. Surely he is a true Renaissance Man… A polymath… A Renaissance Polymath… Or is he an epic blowhard?
The point of all this is that the psychocrat’s real project is to deny democracy. Not purposefully, and not out of some clearly defined malevolent intent, but through bad faith nonetheless, or hubris at best: the aim is to belittle ordinary people, and to elevate whichever university has been canny enough to establish a School of Psychocracy.
What that tendency costs us is the real dynamic that helps us sort good ideas and beliefs from bad: the public contest of ideas. It fosters a condition of mutual cynicism between people and official institutions, the very thing that brings forth conspiracy theories. The upshot is that confidence in the authorities we turn to for knowledge, the academy among them, will be undermined. The slower the academy responds to the bullshit coming from within its own corridors, the longer and deeper its decline in public estimation will be.
I am under orders to make these posts shorter. So, to save 3,000 words from what is essentially a footnote… Sunstein has authored a number of books of interest here.
Mr. Sunstein is author of many articles and books, including Republic.com (2001), Risk and Reason (2002), Why Societies Need Dissent (2003), The Second Bill of Rights (2004), Laws of Fear: Beyond the Precautionary Principle (2005), Worst-Case Scenarios (2007), Nudge: Improving Decisions about Health, Wealth, and Happiness (with Richard H. Thaler, 2008), Simpler: The Future of Government (2013) and most recently Why Nudge? (2014) and Conspiracy Theories and Other Dangerous Ideas (2014). He is now working on group decisionmaking and various projects on the idea of liberty.
Chief amongst these is his attack on the precautionary principle. What I suspect, however, from reading the blurbs, is that rather than wanting to depart from the Precautionary Principle, Sunstein wants to own it more completely. With ‘risk’ at the centre of his perspective, we can see Sunstein as a victim of Risk Society, and his work as very much belonging to that movement, more of which can be read about here and here.
While the prototypical Climate Conspiracy is invoked (it seems everywhere) to describe the anti-AGW crowd as not wanting to believe in AGW etc. etc., the AGW crowd fails to recognize its own conspiracy theories, the claim that Big Oil and the Koch Brothers fund the entire anti-AGW crowd being a primary example.
Now that’s the kind of bite-size article I can deal with. This kind of archaeology of social science is fastidious and time-consuming, but often pays off. Unfortunately my comment is rather long, so I’ve made it at
http://cliscep.com/2016/02/09/sources-of-the-climate-conspiracy-theory-conspiracy-theory/
Although the version of Sunstein & Vermeule that you link to is just called “Conspiracy theories”, the final version cited by Grimes, published in the Journal of Political Philosophy, is titled “Conspiracy Theories: Causes and Cures”.
So conspiracy theories are seen as a problem that needs to be “cured” by government intervention. Which brings us back to your question, “Why is this a challenge? What kind of threat is the lonely, isolated nutter?”
In the draft version they sort of raise this question themselves:
“If children believe in Santa Claus or the Easter Bunny, there is no problem for government to solve”
though sadly this sentence didn’t make it to the final published version.
They go on to claim that the Oklahoma City bombing happened because of a conspiratorial belief.
Even if that is the case, their proposed solution, that governments should indulge in “cognitive infiltration”, would, it seems to me, only make matters worse.
It might be helpful to everyone (or at least everyone not committed to 100% advocacy irrespective of the truth) if somebody drew up a short-list of sensible questions these journalists, psychologists, and sociologists might consider asking whenever they think about conducting questionnaires and interviews on this topic.
Or at least tell them which questions are really vague and dumb, and should be avoided. Questions like “Do you believe in climate change or anthropogenic climate change?”
As soon as they fail to ask some specific, semi-quantitative question along the lines of “What percentage of warming since year Y was caused by human activity Z?”, their research project falls flat on its face. Immediately.
But they keep on doing it.
I also think I’m one of those naive people who fail to see a real difference between a “conspiracy theory” and the statement “Some humans tell lies about the motives for their actions.” It doesn’t sound so grand when expressed that way.
Michael Hart
Your plea for sensible questions should start with some questions the researchers should ask themselves, such as:
– “Are we looking for opinions, beliefs, attitudes, unconscious motivational factors, or what?”
– “Are we researching the general public (in which your proposed quantitative question would be meaningless) or the tiny proportion of the public who’ve given the subject some thought?”
In the end, the only thing you can sensibly do is keep the question simple and ask the same thing repeatedly over time. Though a question such as “do you believe in climate change?” is strictly meaningless, change in the number of responses over time may tell you something useful.
geoff, I’m still going to go with the argument that some questions are so vague as to be useless, however often they are asked, answered, or analysed. E.g. if we asked people on the street, every day for 50 years, “Do you think it’s turned out nice today?”, then I suspect we would learn precisely zero about weather and climate changes over that period, or about what they thought of them. It might reflect what they saw on TV today, yesterday, or last week. But probably not a lot. Language itself probably changes much faster than the climate.