So somehow this RC comment from the halcyon pre-CRU-scandal days popped up in my RSS feed. I'm glad it did, because even though I remember the thread well I had forgotten this insightful story by John Mashey:
People are making an error common to those comparing science to commercial software engineering. In general this relates to the common error of people putting expectations from their own professional lives onto other disciplines, including the endlessly misplaced emphasis on frequentist reasoning in climate from engineers and MDs (Crichton and McIntyre both), or the desire for tight proofs from physicists (Dyson, Laughton, even Motl). In neither group is a "balance of evidence" argument useful, but that's how most of earth science works.
Research: *insight* is the primary product.
Commercial software development: the *software* is the product.
Of course, sometimes a piece of research software becomes so useful that it gets turned into a commercial product, and then the rules change.
===
It is fairly likely that any “advanced version control system” people use has an early ancestor or at least inspiration in PWB/UNIX Source Code Control System (1974-), which was developed by Marc Rochkind (next office) and Alan Glasser (my office-mate) with a lot of kibitzing from me and a few others.
Likewise, much of modern software engineering’s practice of using high-level scripting languages for software process automation has a 1975 root in PWB/UNIX.
It was worth a lot of money in Bell Labs to pay good computer scientists to build tools like this, because we had to:
- build mission-critical systems
- support multiple versions in the field at multiple sites
- regenerate specific configurations, sometimes with site-specific patches
- run huge sets of automated tests, often with elaborate test harnesses, database loads, etc.
This is more akin to doing missile-control or avionics software, although those are somewhat worse, given that “system crash” means “crash”. However, having the US telephone system “down”, in whole or in part, was not viewed with favor either.
We (in our case, a tools department of about 30 people within a software organization of about 1000) were supporting software product engineers, not researchers. The resulting *software* was the product, and errors could of course damage databases in ways that weren’t immediately obvious, but could cause $Ms worth of direct costs.
It is easier these days, because many useful tools are widely available, whereas we had to invent many of them as we went along.
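[For readers who have never worked in that kind of shop, a minimal Python sketch of the workflow Mashey describes might look like the following: regenerate the configuration for a specific site, apply its patches, and run automated tests against the result. The site names, settings, and tests are invented for illustration; the real tooling was SCCS, shell, and far larger test harnesses, not this toy.]

```python
# Toy analogue of "regenerate a specific configuration, apply
# site-specific patches, run the automated tests".  Everything here
# (site names, settings, tests) is invented for illustration.

import copy
import unittest

BASE_CONFIG = {
    "max_lines": 10_000,      # hypothetical switch capacity
    "billing": "standard",
    "debug": False,
}

# Hypothetical per-site overrides ("patches") applied on top of the base.
SITE_PATCHES = {
    "indian_hill": {"max_lines": 50_000},
    "columbus": {"billing": "metered", "debug": True},
}


def build_config(site: str) -> dict:
    """Regenerate the configuration for one site: base plus its patches."""
    config = copy.deepcopy(BASE_CONFIG)
    config.update(SITE_PATCHES.get(site, {}))
    return config


class ConfigTests(unittest.TestCase):
    """A tiny 'test harness': every generated configuration must pass."""

    def test_capacity_is_positive(self):
        for site in SITE_PATCHES:
            self.assertGreater(build_config(site)["max_lines"], 0)

    def test_billing_mode_is_known(self):
        for site in SITE_PATCHES:
            self.assertIn(build_config(site)["billing"], {"standard", "metered"})


if __name__ == "__main__":
    unittest.main()
```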
By the late 1970s, most Bell Labs software product developers used such tools.
But, Bell Labs researchers? Certainly not the physicists/chemists, etc., and usually not computing research (home of Ritchie & Thompson). That's because people knew the difference between R & D and had decent perspective on where money should be spent and where not.
The original UNIX research guys did a terrific job making their code available [but "use at your own risk"], but they’d never add the overhead of running a large software engineering development shop. If they got a bunch of extra budget, they would *not* have spent it on people to do a lot of configuration management, they would have hired a few more PhDs to do research, and they’d have been right.
The original UNIX guys had their own priorities, and would respond far less politely than Gavin does to outsiders crashing in telling them how to do things, and their track record was good enough to let them do that, just as GISS’s is. They did listen to moderate numbers of people who convinced them that we understood what they were doing, and could actually contribute to progress.
Had some Executive Director in another division proposed to them that he send a horde of new hires over to check through every line of code in UNIX and ask them questions … that ED would have faced some hard questions from the BTL President shortly thereafter for having lost his mind.
As I’ve said before, if people want GISS to do more, help get them more budget … but I suspect they’d make the same decisions our researchers did, and spend the money the same way, and they’d likely be right. Having rummaged a bit on GISS’s website, and looked at some code, I’d say they do pretty well for an R group.
Finally, for all of those who think random “auditing” is doing useful science, one really, really should read Chris Mooney’s “The Republican War on Science”, especially Chapter 8 ‘Wine, Jazz, and “Data Quality”‘, i.e., Jim Tozzi, the Data Quality Act, and “paralysis-by-analysis.”
When you don’t like what science says, this shows how you can slow scientists down by demanding utter perfection. Likewise, you *could* insist there never be another release of UNIX, Linux, MacOS, or Windows until *every* bug is fixed, and the code thoroughly reviewed by hordes of people with one programming course.
Note the distinction between normal scientific processes (with builtin skepticism), and the deliberate efforts to waste scientists’ time as much as possible if one fears the likely results. Cigarette companies were early leaders at this, but others learned to do it as well.
While Mashey's piece is a particular gem, the whole thread is worth a revisit. I was the centrist with some sympathy for the inactivists on that one, so for those who like to cast me as a knee-jerk extremist, I'd appreciate it if you gave it a look.
PS - I still cannot fathom why Mashey does not have a climate blog.
Thanks for the kind words.
As for the blog, sometime I may write "Why I don't have a blog" and if so, I will post it here. You might guess there is a complex web of reasons, some fairly subtle.
"In neither group is a "balance of evidence" argument useful, but that's how most of earth science works."
You stole my point. I see climate science as a whole lot closer to epidemiology than to a fiddly, one-at-a-time precision activity like eye surgery.
And those "auditors" are in the analysis to paralysis game.
For me the question of "auditor" motivation is whether they know they are playing the paralysis game or are merely the Dunning-Kruger effect writ large.
Mashey is infuriating. He is a better science writer and communicator than the bulk of dead-tree published science writers and communicators. His existence is a cruel joke against us of tiny intellectual gifts.
From: tobis@skool.ssec.wisc.edu (Michael Tobis)
Source: Healthy Fiesta by Jacqueline Higuera McMahan (Olive Press, (heh)
POB 194, Lake Hughes CA 93532) (1990)
JACQUIE'S EVERYDAY SALSA
========================
Ingredients:
------------
4 oz tomatillos
2 lb tomatoes
1 cup chopped onion
1/2 cup chopped green onions
1 tblsp minced garlic
1/2 cup canned green chiles
1/2 cup jalapeño chiles, some seeds removed
2 tsp ground red chile
1/2 tsp ground cumin
1/2 tsp salt
1/2 cup minced cilantro
3 tblsp white wine vinegar
Instructions:
-------------
1. Soak tomatillos in warm water and remove dry husks. Dip tomatoes in boiling water for 30 sec. or hold over a gas flame and char. Remove skins and squeeze out seeds. (I ignored this and just opened a large can of stewed tomatoes.)
2. ROUGHLY puree everything.
3. Simmer in an open 2 quart saucepan for 5 minutes to blend flavors and help preserve the salsa. Salsa keeps well. If you want salsa even hotter, just add more jalapenos or keep more seeds. (I didn't have jalapenos, so just added some cayenne and some tabasco to taste. I also found that about a tablespoon of lime juice was nice.)
Obviously not a gourmet recipe. But it turned out very nice, and solves the problem of what to do with the other 90% of the cilantro I buy every week or two.
MT: In general this relates to the common error of people putting expectations from their own professional lives onto other disciplines....
There's also some pretty biased reporting of what actually goes on in the real world, with, in particular, McIntyre and some of his followers painting a rosy picture of the supposedly rigorous standards of review and auditing in resource industries.
The main goal of resource audits is to ensure consistency in the calculation and public reporting of resource and reserves figures. Of course, the auditors do varying amounts of checking of interpretations of the underlying data but often many of the originating company's maps and interpretations are accepted as is. Just read the disclaimers at the beginning of any auditor's report to see how little they are prepared to take responsibility for the accuracy of their work.
Commercial auditing is a form of peer review (albeit with the auditors being in the pay of the audited company) but it's often very limited in scope. Third parties contemplating an asset purchase are well aware of this (having had their own work audited) and will invariably conduct their own evaluation work and will seldom take the auditor's word for it.
In other words, auditing, like peer review, is a necessary but imperfect process; but the only thing that really matters in the long run is independent replication. The notion that academics have big lessons to learn from commerce about rigor, accountability and data disclosure is nonsense.
Andy, I don't entirely agree.
The lessons to be learned are from the software hobbyist sector. In their better moments some of the McIntyre crowd recognize this.
The interesting thing is that Linus Torvalds claims he was motivated by the scientific method, but we see it in action in the open source community.
You can create an involuntary John Mashey blog by doing a Google blog search for him and putting the search as a link on your blogroll:
http://blogsearch.google.com/blogsearch?oe=utf-8&client=firefox-a&um=1&ie=UTF-8&q=%22john+mashey%22&as_drrb=q&as_qdr=w
That's what I've done at my blog.
To be clear, my comments were riffing, somewhat off-topic, on that particular comment of yours that I highlighted.
When it comes to software--or salsa--I'm just a consumer, and am happy to remain entirely innocent of how it's made. ;-)
Andy:
Commercial auditing is a form of peer review (albeit with the auditors being in the pay of the audited company) but it's often very limited in scope.
Indeed. I also think it strange that the people at CA never pay any attention to materiality, which has been pretty much the key organising concept for all of the audit and reconciliation processes I have ever had to deal with.
It's almost as if they don't know much about how auditors and financial control types actually go about their business...
Regards
Luke
Agree Luke.
Let's face it, even a tax audit focuses on the areas likely to find rich pickings in actual taxable transactions.
They really don't care about trivial variations in the supply of pens or paper towels.
Auditors do check every little thing - but only after they've selected the target transactions carefully.
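A toy sketch of that select-then-scrutinize pattern, with made-up thresholds, field names, and transactions (purely illustrative, not how any particular audit firm actually works):

```python
# Toy sketch of materiality-based audit sampling: pick the transactions
# worth looking at first, then check every detail of the ones selected.
# Thresholds and transaction records are invented for illustration.

import random

MATERIALITY_THRESHOLD = 10_000   # hypothetical: skip anything smaller
RANDOM_SAMPLE_RATE = 0.05        # plus a small random slice of the rest

transactions = [
    {"id": 1, "amount": 250_000, "tax_ok": True,  "approved": True},
    {"id": 2, "amount": 42,      "tax_ok": True,  "approved": False},
    {"id": 3, "amount": 18_500,  "tax_ok": False, "approved": True},
]


def select_targets(txns):
    """Everything material, plus a random sample of the immaterial rest."""
    material = [t for t in txns if t["amount"] >= MATERIALITY_THRESHOLD]
    rest = [t for t in txns if t["amount"] < MATERIALITY_THRESHOLD]
    return material + [t for t in rest if random.random() < RANDOM_SAMPLE_RATE]


def check_in_detail(txn):
    """Only the selected transactions get the 'every little thing' treatment."""
    issues = []
    if not txn["tax_ok"]:
        issues.append("tax treatment")
    if not txn["approved"]:
        issues.append("missing approval")
    return issues


for txn in select_targets(transactions):
    problems = check_in_detail(txn)
    print(txn["id"], "OK" if not problems else problems)
```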
At RealClimate, the quoted comment was not found in the thread "On replication" (8 Feb. 2009) but in the thread "Warm reception to Antarctic warming story" (27 Jan. 2009).
ReplyDeleteWell, Moe is also too kind, even if he thinks I'm infuriating :-)
Really, practice helps, and I've spent a lot of my career explaining technical subjects to wide varieties of audiences, which is why I did ~500 public lectures and ~1000 sales pitches.
This particular Bell Labs story was easy: I worked in several of the best software-savvy organizations there (i.e., anywhere). When I was a supervisor I was also a sort of troubleshooter & special projects guy for 2 Directors, one of whom later became CTO and the other President of Bell Labs. The standard software engineering management course had been developed by my at-the-time boss; I used to help teach it, and one of the lessons was about choosing the software engineering methodology appropriate to a project. Projects sometimes failed because they used such heavyweight methods that they just took too long. Some of that was in these talks.
I suppose I might have been more specific on "Likewise, much of modern software engineering’s practice of using high-level scripting languages for software process automation has a 1975 root in PWB/UNIX."
This is somewhat off-topic, but... has anyone come across a paleotemperature graph that looks like this?
-- frank
Frank, I haven't seen that one exactly but it looks like a typical spaghetti graph ca. 2005 or 2006.
Rattus Norvegicus:
Thanks. It now seems that the graph is a tweaked version of the paleotemperature graph in Briffa et al. (2001); however, I'll be blessed if I can find which precise IDL file in FOIA/document/osborn-tree6/ it came from.
-- frank
This is why no one takes you seriously. Until you alarmists can provide 6 sigma quality control for million-year time flitches that function as a sendmail daemon I can run in Fortran77, why should they*?
*A very-slightly-paraphrased portmanteau of actual arguments from a commenter on RC, Ian Plimer, and Eric Raymond.
"In general this relates to the common error of people putting expectations from their own professional lives onto other disciplines..."
Putting that on the twitternets.