It is time to stop quivering in our boots in pointless fear of the future and just roll up our sleeves and build it.
- Ray Pierrehumbert

Monday, March 8, 2010

Monbiot's paradox

Quoth George Monbiot here, with a tip of the ol' Stetson to Brian Dupuis of Alberta:
Arthur C Clarke remarked that “any sufficiently advanced technology is indistinguishable from magic”. He might have added that any sufficiently advanced expertise is indistinguishable from gobbledegook. Scientific specialisation is now so extreme that even people studying neighbouring subjects within the same discipline can no longer understand each other. The detail of modern science is incomprehensible to almost everyone, which means that we have to take what scientists say on trust. Yet science tells us to trust nothing; to believe only what can be demonstrated. This contradiction is fatal to public confidence.
There are lots of other interesting observations in the remarkable piece linked above, but this one is something of a revelation.

pic: George Monbiot by George Monbiot via Wikipedia, Creative Commons.


jstults said...

From the linked article:
“We need one passionate, persuasive scientist who can connect and convince … We need to be taught to believe by a true believer”

Thinking that most people are interested in this sort of pantheistic religious experience from a secular priesthood is probably a big part of the PR problem.

...but it reinforces the disturbing possibility that nothing works.

Nothing but transparency works; slick PR campaigns led by charismatic true believers are doomed.

Michael Tobis said...

Transparency does not suffice, for the reasons explained by Monbiot.

That said, I am very much in favor of transparency.

Brian D said...

Viewing transparency as a necessary but not sufficient condition is probably for the best, particularly in a position where a) your operations require some training to understand (see the section Monbiot quoted above; any special training beyond middle school seems to be above "the general public" nowadays) and, especially, where b) people who disagree with you but are better at speaking to the public are waiting to pounce on the slightest misstep. In such a scenario, transparency without, for lack of a better term, public relations is an open invitation for the intellectual equivalent of a punch in the face.

Dan Olner said...

It's a good piece from Monbiot, but I think he's wrong. I only started getting into the climate science argument about six months back, and initially I expected some actual sophistication from those attacking AGW theory. What I found instead was, almost consistently, very simple errors of logic, many of which can be dealt with by simple, intuitive examples.

What I think that means: slow, patient, tortoise-like repetition of the basics will, in the end, succeed - along with, maybe, a little bit of pointing out the obvious fallacies in some of the most prominent cuckoo scientists. (Monckton: chaos isn't that complicated. Many simple examples: seasons vs weather, tides vs waves.)

Yes, it can get as abstruse as you like, but I think once a person sees the science has a solid bedrock they can understand, they prefer that to the shower of FUD they get from elsewhere.

David Pennell said...

There's some truth there, but not enough to explain his flakiness the last few months.

skanky said...

I think it's just worth pointing out, as there are always some who don't RTFR, that jstults' quote was by Peter Preston, and GM follows it with:

"Would it work? No. Look at the hatred and derision the passionate and persuasive Al Gore attracts."

I realise that jstults wasn't intentionally implying that it was GM who said that, but it could be read that way.

jstults said...

MT: "Transparency does not suffice, for the reasons explained by Monbiot."

Monbiot's article: "As a result, groups with opposing values often become more polarized, not less, when exposed to scientifically sound information."

He (and you?) both think that science has something to say about what the public's values should be, and you are surprised when talking about science doesn't change anyone's values. Check your assumptions. There's no sound metaphysical basis for thinking science has anything to say about ethics or morals. It is a method for learning about the physical world; that is all.

The reason transparency is insufficient is that everyone involved is trying to borrow credibility from The Science to prop up their value system. That wasn't one of the reasons Monbiot gave, though. I wonder why?

guthrie said...

jstults - the most likely reason Monbiot doesn't get into the science-and-ethics question is that he has already put on record that he thinks ecosystems and other things under threat from AGW are valuable, and therefore, given the evidence of likely damage, he is for action on AGW. He may not see that transparency in science is not enough if value systems are in play. Indeed, too many people do not realise that your approach to AGW does depend to some extent on your value system.

Michael Tobis said...

Regarding transparency, another way to put it is one man's convincing is another man's opaque; one man's transparency is another's whitewashing.

In the end a real science takes ten years to appreciate and twenty to make a substantial contribution. Transparency is good, but very few people will take the ten years (or ten thousand hours, really) to appreciate it.

Opening the books is fine but leaving yourself open to paranoia isn't.

My copy of Norbert Wiener's most popular book, "The Human Use of Human Beings", is secondhand, and the previous owner was a Marxist who completely failed to understand the book. When Wiener speaks of the relationship between "communication" and "control" he is speaking of a mathematical analogy, not a program of propagandistic manipulation. Whenever words like "constraint" or "exploit" appear in the text, they are underlined several times in ballpoint pen and accompanied by a scrawled leftist polemic in the margin.

The fellow read most of the book and, based on the extensive scribbling in the margins that I see, understood not a word of it. He was just looking for hints to confirm his suspicion that Wiener was in the pay of the oligarchy.

He probably didn't know that FBI director J. Edgar Hoover considered Wiener a subversive. Maybe he would have underlined different words.

I will go along with jstults in saying that transparency is necessary, but it has to be accompanied by some way of preserving the ability to work of the people who have already put in the twenty thousand hours. Not every yahoo who sees the word "instability" actually has a worthwhile opinion on energy flows in fluid dynamics.

Put yet another way, why are we discussing jstults' familiar point while ignoring Monbiot's remarkable observation?

Monbiot identifies what seems to be a fundamental flaw in scientific ideology, which I think ties directly into why openness is insufficient, and why ignorant attacks on science succeed. Is he right?

Kooiti MASUDA said...

Fragmentation of the scientific community is a long-standing problem, discussed by Derek de Solla Price and by Thomas Kuhn since the 1960s. Perhaps scholars in mid-19th-century Japan knew it when they translated "science" as "kagaku", literally "branched learning".

The remedy I suggest is to have people who can speak dialects of multiple branches -- "interactional experts" in the technical term of science-technology-society scholars Harry Collins and Robert Evans (2007, "Rethinking Expertise", Univ. of Chicago Press). They may or may not be working experts ("contributory experts" in Collins and Evans' words) of each of the branches.

One cannot become an interactional expert instantly. Perhaps it requires experience equivalent to a part-time M.Sc. course. (Cf. becoming a contributory expert requires the equivalent of a Ph.D. course.)

gmcrews said...

Hi Michael,

You ask the question: "Monbiot identifies what seems to be a fundamental flaw in scientific ideology, which I think ties directly into why openness is insufficient, and why ignorant attacks on science succeed. Is he right?"

If you mean is there a fundamental flaw in the scientific method, the answer is no. The consensus desiderata for the scientific method are:

1. Theory must be consistent.
2. The sole test of theory is experiment.
3. Theory and experiment shall be parsimonious.
4. All theory shall be independently verified and validated (i.e., the process of scientific confirmation).

These assumptions are necessary and sufficient. They contain no fundamental flaw. No attack on these fundamental principles (dating back to before the Church's attack on Galileo) has ever been successful.

But Monbiot's idea goes far beyond these desiderata, as noted in your quote of him at the beginning of this post. He thinks science must lead everyone to consensus truths. But science is not designed to do this. It is designed to avoid everyone coming to a consensus falsehood. The scientific method is an error-management process, not a truth machine.

For example, note that nowhere does the scientific method require that theory be true. Only that it be consistent with the outcome of experiment. Newton's laws of motion are not true. But they work for most experiments. Experiment, not truth, allows Newton's laws to earn the label of being scientific.

IMHO, the problem is one of scientists (and others) asking science to do something it was never intended to accomplish. Scientists should have the sophistication to know that rational Bayesians can often come to diverging viewpoints given the same evidence.
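That last claim about diverging Bayesians can be made concrete with a toy sketch (my illustration, not gmcrews's). Suppose two agents start from the same prior on a hypothesis H but interpret the same piece of evidence E through different likelihood models; Bayes' rule then pushes their beliefs in opposite directions:

```python
# Toy illustration: two Bayesian agents observe the same evidence E
# but assign it different likelihoods, so their posteriors on H diverge.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.5  # both agents start undecided

# Agent A reads E as likely if H is true; Agent B reads the same E as unlikely.
a = posterior(prior, p_e_given_h=0.8, p_e_given_not_h=0.2)
b = posterior(prior, p_e_given_h=0.2, p_e_given_not_h=0.8)

print(round(a, 2))  # 0.8 -- A's belief in H rises
print(round(b, 2))  # 0.2 -- B's belief in H falls
```

Both updates are perfectly rational given each agent's model of the evidence, which is the point: showing people "the same data" does not force convergence unless they already agree on what the data would look like under each hypothesis.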


Michael Tobis said...

Unless you include observation as "experiment", the earth sciences and the space sciences do not qualify as sciences. I do not think that any universally accepted definition of "the scientific method" actually exists.

The finding of truth by elimination of error does seem to be a common thread though, so the import of your comment stands.

The question here is not how science works internally but how it is to function as a component of society. The understandings of the postwar period have broken down.

Peculiarly, it was a cabal of physicists that was as much responsible for this disaster as anybody. This in turn may be because they think the methods of physics are "the scientific method".

But this is complete nonsense. Science is pragmatic. When you are disrupting the biosphere, pragmatism is called for. Intellectual styles that may be very elegant and appropriate in searching for quarks and superstrings have little to do with grasping the actual physical object that, uniquely in the universe, supports our lives.

The people who think about that most are worried. The time for putting up with petty bullshit must end. But how?

gmcrews said...

Hi Michael,

I consider observation to qualify as experiment if a prediction about what will be observed is documented before the observation is performed.

But I'm afraid I don't really understand the rest of your comment. You seem to be bringing up a subject that may be at the pragmatic end of what Wikipedia defines as technoscience. Am I correct? Who are the cabal of physicists you speak of?


Michael Tobis said...

Shudder... Please don't put me in any intellectual categories with Bruno Latour.

Regarding the physicists in question, go watch Naomi Oreskes' recent talk. It's currently linked at the top of the page, or here.

Regarding observations, consider getting prehistoric CO2 concentrations from ice cores. It is a very robust procedure. But the concentrations we obtain could not have been predicted from first principles. At the same time, immense effort is going into explaining them. Do you exclude this from science?

Phil said...

@gmcrews, I prefer Karl Popper's concept of "conjecture and refutation". For a theory to be useful, it has to be refutable.

Confirmation is a much harder thing.

Popper also came up with an interesting idea which is relevant in our context.

"You cannot have a rational discussion with a man who prefers shooting you to being convinced by you", he says in Conjectures and Refutations> He then points out that even if you convinced him of your irrefutable logic, he could still choose to shoot you.

Does this remind anybody of the denialists?

Hank Roberts said...

The paper Monbiot cited in fn6 via a news story is available online now. Worth reading:

When Corrections Fail: The persistence of political misperceptions

Forthcoming, Political Behavior

An extensive literature addresses citizen ignorance, but very little research focuses on misperceptions. Can these false or unsubstantiated beliefs about politics be corrected? Previous studies have not tested the efficacy of corrections in a realistic format. We conducted four experiments in which subjects read mock news articles that included either a misleading claim from a politician, or a misleading claim and a correction. Results indicate that corrections frequently fail to reduce misperceptions among the targeted ideological group. We also document several instances of a “backfire effect” in which corrections actually increase misperceptions among the group in question.

Hank Roberts said...

Monbiot focuses on one possible explanation (his fn7)--that people are influenced by the crowds they move in.

I wonder, though, if the studies need to control for another variable, which is how the refutation is presented.

Earlier work suggests strongly that _how_ you refute bogus claims makes the difference, and that our first impulse (to repeat the claim then debunk it) is counterproductive. Seems to me the studies Monbiot points to don't take this into account.

I've posted this before. Here it is again, relevant to the newer work Monbiot cites I think:


Deck is stacked against "mythbusters"

Posted on: September 6, 2007 5:05 PM, by aetiology (Tara C. Smith)

Correcting misinformation can backfire.

The federal Centers for Disease Control and Prevention recently issued a flier to combat myths about the flu vaccine....

When University of Michigan social psychologist Norbert Schwarz had volunteers read the CDC flier, however, he found that within 30 minutes, older people misremembered 28 percent of the false statements as true. Three days later, they remembered 40 percent of the myths as factual.

Oh, and it only gets worse; more after the jump. ....
----end excerpt---

Hank Roberts said...

Indeed, Nyhan and Reifler don't cite the Schwarz paper, and the method they used ("Study 1", done in 2005) was to have subjects read a newspaper article that gave a false impression of balance, then read a refutation of the bogus claim, then read an affirmative statement of the bogus claim and rate its credibility.

Seems to me this experimental approach is very susceptible to the outcome Skurnik et al. (2005) described, that stating a bunk claim in order to refute it makes it more memorable.

Skurnik, Yoon, Park, & Schwarz (2005) How warnings become recommendations

Telling people that a consumer claim is false can make them misremember it as true. In two experiments older adults were especially susceptible to this “illusion of truth” effect. Repeatedly identifying a claim as false helped older adults remember it as false in the short term, but paradoxically made them more likely to remember it as true after a three-day delay. This unintended effect of repetition comes from increased familiarity with the claim itself, but decreased recollection of the claim’s original context. Findings provide insight into susceptibility over time to memory distortions and exploitation via repetition of claims in media and advertising. -- Skurnik, I., Yoon, C., Park, D.C., & Schwarz, N. (2005).

Journal of Consumer Research, 31, 713-724. [This paper received the 2008 Best Article Award from the Association for Consumer Research.]

see also:

Schwarz, N., Sanna, L., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight. Advances in Experimental Social Psychology, 39, 127‐161.

Aside: I somehow suspect this paper may also be relevant to blogging exchanges:

Chandler, J. & Schwarz, N. (2009). How extending your middle finger affects your perception of others: Learned movements influence concept accessibility. Journal of Experimental Social Psychology, 45, 123-128.