It is time to stop quivering in our boots in pointless fear of the future and just roll up our sleeves and build it.
- Ray Pierrehumbert

Sunday, January 16, 2011

Science for Everybody #159

158 illustrious contributors have been asked for a very short essay answering the question

"What scientific concept would improve everybody's cognitive toolkit?"

This is a wonderful question and I look forward to the answers, many of which may also be wonderful, even though I may end up burdened with as many as 158 concepts.

(Note the linked page is quite a godawful mess, web-designically speaking. Scroll down to "BEGIN READING HERE" when you are ready to start reading...)

However, judging by the 158 titles, they should have asked me, because the one I would propose appears to be missing. So, before I go exploring the other 158, here's my contribution.


At first glance, there's a fundamental problem with science as a method. On the one hand, one is expected to take nothing on faith, to doubt everything, never to defer to authority. On the other hand, science is obviously a cumulative business. By now millions of person-years of thought have been amassed, and it would seem impossible, and indeed to have long been impossible, to replicate enough of that thinking to get to the cutting edge where knowledge could be advanced.

In practice this presents no problems. We base our work on "established science". But how is this possible when we take nothing on faith? We pretend it is by reference to the "peer reviewed literature" but that is something of a pose. Reviewers are insufficiently rewarded to provide the requisite review. The system is there to make a plausible claim for a good-faith effort. It doesn't actually provide effective gatekeeping, and many trivial or incorrect results are published. Consequently the literature is really not enough. (*) But obviously science does work.

The undercelebrated key to science is coherence. The facts and methods that work in any discipline are the ones that "make sense" in the context of all the other facts and methods. Outsiders coming in and offering ideas generally fall outside the established context. The experienced scientist's first reaction is "no, that can't be true", followed by identification of a contradiction with a well-known bit of reality. That is, the wrong idea is incoherent with established knowledge.

Ideas that fall into "hey that might be true" are dramatically rarer, especially if they have a soupçon of "and that would explain this other thing that has been bugging me, too". And those are the ones to which we need to apply the full brunt of our skepticism, the ones that might survive into the shared network of coherence. So, our ability to advance the truth is based fundamentally on our understanding the truth well enough to quickly discard most untruths. Constructing a realistic and productive perspective is a matter, like the sculptor removing all the marble that is not part of his subject, of getting good at throwing away the stuff that doesn't matter. The stuff that doesn't matter is the stuff that is incoherent with what is already known.

(*) Highlighted text used to say: We pretend it is by reference to the "peer reviewed literature" but everyone knows that much of what passes peer review, probably a majority, is nonsense published for the benefit of tenure committees and grant reviewers. The literature is no help.

It's been argued that this is too strong and in retrospect I agree.


Zowish said...

From Utne, in response to your banner quote about democracy diminishing with increased population:

Brian D said...

I haven't finished the deeper reading of the original source, but I'm surprised that none so far have brought up recognizing a legitimate authority (and its corollary, recognizing incompetence, especially in yourself).

The corollary would help reduce the noise in the system (gutting Dunning-Kruger), the suggestion itself would dramatically strengthen the signal in our media (i.e. no more false balance or sounds-like-science reported as science).

Plus, it doesn't require much special training - arguably about the same as recognizing the need for a coherent worldview.

Anna Haynes said...

> recognizing a legitimate authority

Good point Brian - except I think the Q is about what "*doing* science" concept to add, not what "using science" concept - though public awareness of "emergent properties of the scientific endeavor" really would help.

And they've added the author of Proofiness. I hope we can Edge to accord MT Public Intellectual status without our sending him around to talk shows.

Who knows the Edge folk, that can get a word in, ah, edgewise?

Anna Haynes said...

s/we can Edge/we can get Edge/

And fyi, I have emailed the editors...

Jim Bouldin said...

"...everyone knows that much of what passes peer review, probably a majority, is nonsense published for the benefit of tenure committees and grant reviewers. The literature is no help."

You're going to have to explain that one to me. Way way way way over the top, pegging the cynicism meter.

Jim Bouldin said...

"improve everybody's cognitive toolkit"

I will continue my work on the grand unified theory of everything, explained in terms that everyone can easily understand, and which will have a positive future impact on everything.

Michael Tobis said...

Reviewers are insufficiently rewarded to provide the requisite review. The system is there to make a plausible claim for a good-faith effort. It doesn't actually provide effective gatekeeping, and many trivial or incorrect results are published. The important factor is not who gets published but who gets cited.

There should be a way to cite with the equivalent of a rel = nofollow attribute...

Jim Bouldin said...

It would be a helluva lot worse if there were no peer review at all Michael.

Insufficiently rewarded? Who is expecting a reward for reviewing a paper? It's just part of the job of being a scientist.

Michael Tobis said...

I stipulate that it would be worse if there were no peer review and nothing else changed. But not much.

If we were to design a system around 1) modern technology and 2) consequences for good or bad reviewing work, it could work much better.

Now that distribution costs are trivial, I think peer review should come after initial publication of a draft.

Jim Bouldin said...

Consequences for good or bad reviewing work are easily advanced by simply making public the names of reviewers, including for rejected manuscripts. That would change things in a hurry.

I should add that I agree with much of what you say as far as how to improve things. The more open the better.

EliRabett said...

Teach em how to bullshit test.

Kooiti MASUDA said...

A question to MT about the meaning of the word "wrong" in "the wrong idea is incoherent with established knowledge".
Does it merely describe the fact that the experienced scientist generally thinks so?
Or does it reflect your meta-scientific judgment that an idea which is incoherent with established knowledge may be categorized as "wrong"?

RP_sqrd said...

Michael Tobis: There is a huge gap between:

"Reviewers are insufficiently rewarded to provide the requisite review. The system is there to make a plausible claim for a good-faith effort. It doesn't actually provide effective gatekeeping, and many trivial or incorrect results are published."

with which most active researchers, publishing scientists, and reviewers would enthusiastically concur, and:

"everyone knows that much of what passes peer review, probably a majority, is nonsense published for the benefit of tenure committees and grant reviewers"

which I consider to be not only objectively false (as a senior scientist who has reviewed many grants and participated in several tenure committees) but also gratuitously offensive - the sort of thing that I am more used to seeing from the denialist camp.

Michael Tobis said...

OK, I will back down. I don't want to weaken the main point, and perhaps it was written too hastily.

Michael Tobis said...

Masuda-san, in practice it is a matter of degree.

When somebody proposes a perpetual motion machine, for instance, the scientifically educated person has such considerable confidence that there is an external source of power that it is fair to reject the idea immediately.

Of course, changes in ideas are necessary to make progress. Usually the productive idea will upset some established ideas but will align some previously awkward mismatches.

Coherence is not a definition of truth, but it is a characteristic of a successful scientific endeavor. Immature sciences may have less rich networks of coherence. For example, many statements in economics attract vigorous arguments about their truth or falsity. This makes economics an immature science.

In fluid dynamics, disagreements about fundamentals are rare, and two fluid dynamicists can be expected to come up with completely coherent analyses of a given practical problem except in extraordinary cases.

A big part of our present problems originates from the fact that the maturity of climate science is underestimated. Certainly, there is broad disagreement about many issues of great importance, though not as broad as a casual observer might conclude. The sorts of problems on which a rich network of coherence exists do not come to the attention of the public. But there are many quantitative problems on which earth system scientists are in sufficient agreement that independent study of the problems by different groups will yield mutually compatible answers.

steven said...


Suggest some folks put Quine on their reading list

John Mashey said...

"The system is there to make a plausible claim for a good-faith effort. It doesn't actually provide effective gatekeeping, and many trivial or incorrect results are published."

I think this is an over-generalization quite likely to terribly confuse people unfamiliar with peer review, and actually to be quoted by those who wish to erase any distinction between, say, Energy & Environment and Science.

1) Like most other things, talking about "peer review" as though it were binary is guaranteed to confuse the unwary.

2) A better model is to say that journals range in peer-review intensity from ~zero to very difficult, and that people inside a field tend to know which is which. I don't know the distribution shapes.

3) The fiercest peer review I've seen was the internal review inside Bell labs for papers proposed for external publication. A paper went up your own management chain to your Executive Director, across to 2 others, down their chains to reviewers, back up, across and down again. I several times signed off on reviews, forwarded by my E.D. elsewhere, where he added a note:
"Dear XXX, Once again, I am sad to say that my folks think a paper is junk, and I agree." or words to that effect.

Imagine that a proposed paper with statistics in it came back from Tukey saying it was bad. Not a career plus if it happens often.

4) So, I would claim:

a) Peer review is no guarantee of correctness.

b) The likelihood of correctness varies widely by journal, but errors get through even the best.

c) Peer review alone is never sufficient, although it is usually practically necessary in some form.

d) If there is a startling result, the fact that it gets through peer review is far less significant than if it simply cannot get through peer review in any relevant credible journal, and instead gets published in some odd place without relevant expertise, mostly to be able to claim it was peer-reviewed, as though that were a binary attribute.

For example, one of my current favorites was one of those involved in the Wegman mess, specifically, Said, et al (2008), that Deep Climate found long ago, and I discussed in SSWR, Appendix W.5.6.

In this case, a social-networks-analysis paper with a) substantial plagiarism, b) serious methodology problems, and c) unsupported conclusions was accepted in 6 days from original receipt, in a journal that did not publish SNA or have associate editors listing it as a topic.

Had that paper gotten through the sister journal Social Networks, the level of credibility might have been different.

Kooiti MASUDA said...

MT, Thanks for clarification.

I was thinking about posting a translation of your post at our blog (run by a colleague scientist and myself, somewhat like RealClimate).

But I am afraid, even with your clarification, it is difficult to convey the message. Some readers who think that we scientists always defend established knowledge and purge heretics will likely have their notion reinforced. I need to think more.

I agree that "the maturity of climate science is underestimated" by many of our critics. Some social scientists think that climate science is just as coherent as theirs. Some physicists think that climate science is much less coherent than theirs.

In some situations we need to do such "climate science", perhaps more aptly called "sustainability science", that covers the subjects of IPCC WG 1+2+3, e.g. when we need to close the loop of industrial activity - emission - concentration - climate - impact. Then the coherence of knowledge may be as weak as the weakest link (social science, perhaps).

But, of course, we have many situations within the scope of climate science in our usual sense (in terms of IPCC, the scope of WG1), where coherence of the knowledge is strong.

It seems we need to write a long message, and we need to attract the readers' attention for a significant duration, in order to transmit our understanding.

EliRabett said...

Kooiti, perhaps this, although a bit more raw, is a bit clearer

Kooiti MASUDA said...

Eli's reference shows that there is no coherent system of knowledge in "AGW skeptics" as a community. Some (though not all) individual AGW deniers make self-incoherent arguments. "The value of coherence in science" by Stephan Lewandowsky at SkepticalScience criticizes such cases.

But I do not think that the focus of the present discussion by MT is incoherence of "AGW skeptics", but what kind of knowledge science is.

Peter T said...


I agree entirely with the importance of coherence. But it's not just a property of science - disciplines like history or archeology (and, from a bit of reading, even literary analysis) are cumulative too. A good recent history book will stand on a huge amount of tested and accepted data and interpretation. One thing that marks out economics is that there are unresolved basic arguments well over a century old.

What distinguishes science from literary criticism (and, though less so, from history) is that the total, coherent picture can be tested against the external world at many points.

Robert said...

Good post Michael. I agree coherence is a very important aspect of science

Michael Tobis said...

@PeterT: yes, it applpies outside of science. The fact that it applies outside of science is the exact point of the exercise.

Michael Tobis said...

applies! No pies!

Anna Haynes said...

Would the next Science for Everybody (#160) be "Type I vs Type II errors"?
(did anyone who took the time to read through all the answers notice if this one was suggested?)

Some people are keenly aware (& wary) of the risk of accepting a hypothesis that's actually false, but oblivious to the risk of failing to accept one that's actually true.

And when the real-world consequences of one of these errors are orders of magnitude more severe than the other...

Anna Haynes said...


Anna Haynes said...

(I'm cleaning up the Edge Q11 HTML, btw; stay tuned.)

Anna Haynes said...

A simpler Edge 2011 answers index, with one addition. (Which in retrospect I should have labeled as such)

The file's not editable though, w/o getting a new tinyurl; so it is likely to get out of date.

Hopefully the Edge editors will just grab the HTML, edit to their taste & throw it up somewhere on their site.

Michael Tobis said...

Anna, thanks, but your URL is broken.

Anna Haynes said...

It's the thought that counts, right?

this might work, albeit not nicely.
(not sure yet; I have to quit being me in order to test whether you can see it, but I have to be me in order to comment here...)

So. Where is the site (free, & no bandwidth limit) where I can deposit a webpage with a multitude of links, and have a simple URL that goes to it, and not have the host throw up its pixelated hands in horror assuming it's spam?

Anna Haynes said...

Try this?

Paul Baer said...

Interesting post and interesting thread. Your discussion of coherence is excellent. I look forward to reading more of the other 158 also.

Anna Haynes said...

Recommended, William Calvin's Find that Frame - has he been reading MT's red&blue dialogue?

( Found from my single-post blog at )

Anna Haynes said...

Read William Calvin's submission, "Find that Frame"