Met Office Proposal

The Met Office is proposing a new international analysis of land surface air temperature data.  This is good news in light of the damage done to the credibility of the datasets by the media misrepresentations arising from the CRU hack. But methinks celebration is premature: judging from my small sampling of ‘skeptic’, contrarian and denialist sites, even this will not quell the paranoia.

CA covers it here.

While the general tone is positive, some are wary:

Dr Iain McQueen

Posted Feb 23, 2010 at 6:55 PM | Permalink | Reply

This seems a very good document from the Met Office, and the concept, interestingly suggested a while back by Steve, is very encouraging.

It does seem, in the current climate, so to speak, to put strong emphasis on unhindered accessibility of the data and, importantly I think, on homogenization routines. Hopefully a more organized and available record than Professor Jones’s might be established!

Could there be a potential for subversion by Government of such a highly centralized and effectively government bought scheme? I do see obvious advantages however in the centralization but I also see risks. Maybe just paranoia!

Some are worried about their taxes:


Posted Feb 23, 2010 at 7:04 PM | Permalink | Reply

I read this story at Bishop’s place and I will add the same comment here: the proposed network will obviously require a large amount of funding and therefore to my mind is just another pitch for even more of my taxes. Obviously I would be interested to hear why they think this is needed, given the absolute certainty with which they’ve promoted and sold their existing data.

Fox News covers it here.

Here’s an excerpt:

The Met proposal says that its old datasets “are adequate for answering the pressing 20th Century questions of whether climate is changing and if so how. But they are fundamentally ill-conditioned to answer 21st Century questions such as how extremes are changing and therefore what adaptation and mitigation decisions should be taken.”

Those “21st Century questions” are not small and they are very far from cheap. At Copenhagen, wealthy nations were being asked to spend trillions of dollars on answering them, a deal that only fell through when China, India, and other near-developed nations refused to join the mammoth climate-control deal.

The question after the Met Office’s proposal may be whether environmentalists eager to move those mountains of cash are also ready to stand down until the 21st century questions get 21st century answers.

One can already see the wheels turning on how to spin this.

You can read the proposal here:

Here’s an excerpt:

The current surface temperature datasets were first put together in the 1980s to the best standards of dataset development at that time; they are independent analyses and give the same results, thereby corroborating each other.

In the case of the CRU land surface temperature dataset (CRUTEM3, which forms the land component of the HADCRUT dataset) there are substantial IPR issues around the raw station data that underpin the dataset: we are actively pursuing resolution of these issues so that the base data can be made openly available. We know that several stations have already been explicitly forbidden from release by the rights holders so we will not be able to release all the underpinning station data.

Consequently we have been considering how the dataset can be brought up to modern standards and made fit for the purpose of addressing 21st Century needs.  We feel that it is timely to propose an international effort to reanalyze surface temperature data in collaboration with the World Meteorological Organization (WMO), which has the responsibility for global observing and monitoring systems for weather and climate.

There are a number of elements in the proposal, and it looks as if all the complaints and demands of the critics are being taken into consideration, including open access to data and code, independent assessments and comprehensive audit trails to “deliver confidence in the results”.

From the Fox News article:

  • “verifiable datasets starting from a common databank of unrestricted data”
  • “methods that are fully documented in the peer reviewed literature and open to scrutiny;”
  • “a set of independent assessments of surface temperature produced by independent groups using independent methods,”
  • “comprehensive audit trails to deliver confidence in the results;”
  • “robust assessment of uncertainties associated with observational error, temporal and geographical inhomogeneities.”

Bishop Hill covers it here:

Here’s an excerpt:

Update on Feb 23, 2010 by Registered Commenter Bishop Hill

The text of the Met Office’s proposal (or at least the executive summary thereof) is here. Interestingly it requires the data to be publicly available and the methodology to be published in the peer reviewed literature.

I guess this means that we will have access to the data but not the code. Adjustments to remain a secret then, and remember the warming is all in the adjustments (or it is in the US at least).

While this is potentially a good development, and should quell the criticisms of skeptics that the different organizations responsible for creating these datasets are hiding something, even this won’t satisfy deniers and contrarians of course, who reject AGW for reasons other than science.

Here’s another interesting comment:

Given the climate scientists’ abject failure to challenge or blow the whistle on the shenanigans at CRU and NASA, they are generally the last people who should be put in charge of the project. Perhaps a group from across the spectrum (eg Julia, Prof Curry, etc) could oversee a public effort to rebuild the database of raw data, including if possible the “Zombie stations” and others ignored by GISS.

No disrespect to Dr. Curry and Julia (whoever she is) but come on…

Here’s more:

The key here is that the Met Office want the business before it is taken away from them… this is a pre-emptive strike to retain their control.

The point should also be made that the Met Office should not be the Controller of this data, just a user. A user that would have to compete on the new worldwide internet based marketplace of climate studies.

There must be a SEPARATE “OPEN SOURCE” FOUNDATION with a “board” accountable to the people who want to use the data.

I guess this is more proof that the project will not quell the truly tin hat crowd.

I suppose it’s too soon to hope that McI will STFU, retire, and spend his days panning for gold or digging up diamonds somewhere in the wilds of Alberta.


About Policy Lass

Exploring skeptic tales.

14 Responses to “Met Office Proposal”

  1. GISS & NCDC have their own independent analysis, but according to the denialist crowd, they’re just the U.S. side of the hoax. After all, their data, which includes a complete spatial representation, shows slightly more warming over the recent decade as compared with HadCrut, which neglects the Arctic.

  2. For the record, all data, methodology, and code are available through GISS. Those making broad claims about climate data/code being hidden are generally just engaging in some form of the telephone game.

    As for McIntyre, he’ll retire someday, but some other individuals will step up to meet the public’s insatiable demand for global warming contrarianism. Supermarket tabloids remain popular for a reason.

  3. People are cynical because whenever the climate community is given the chance to redeem itself by acknowledging its failures and demonstrate a willingness to change its ways they choose whitewash and cover up.

    In any case, there is a simple way to produce a dataset that even the most cynical skeptic will accept: don’t adjust the raw data with opaque algorithms that add warming to stations for no apparent reason. If there is a good reason for such an adjustment (i.e. a station move) then that should be clearly indicated in the station metadata.
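For what it’s worth, the transparent, metadata-gated adjustment this commenter asks for is easy to sketch. The code below is purely illustrative (the function, station series, and the 0.5 °C offset are all invented): an adjustment is applied only from the year a documented event, such as a station move, appears in the metadata, and undocumented series pass through untouched.

```python
# Hypothetical sketch: apply step adjustments to a station series only at
# years where the metadata documents an event (e.g. a station move).
def adjust_series(temps, moves):
    """temps: list of (year, value); moves: {year: offset in deg C} from metadata."""
    adjusted = []
    offset = 0.0
    for year, value in temps:
        if year in moves:            # only documented events trigger an adjustment
            offset += moves[year]
        adjusted.append((year, round(value + offset, 2)))
    return adjusted

raw = [(1990, 1.0), (1991, 1.1), (1992, 0.6), (1993, 0.7)]
# Metadata records a 1992 move that cooled readings by 0.5 deg C.
print(adjust_series(raw, {1992: 0.5}))
# With no documented events, the raw series comes back unchanged.
print(adjust_series(raw, {}))
```

The point of the sketch is the audit trail: every departure from the raw values is traceable to a named metadata entry rather than an opaque algorithm.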

  4. The announcement by the Met Office is a very good thing because it is a direct attack on the lack of openness practiced by the pseudoscientists in climate science. This is a stinging rebuke of Phil Jones and posse.

    On the other hand, there is no doubt the effort by the Met Office will fall short of what should happen and would happen if a truly Open Source Temp Record was being developed. If computer programmers from the open source community were driving this bus, it would be done like Firefox or Linux was developed. The result would be a computer interface that could analyze data using many different assumptions and adjustments. Users would actually be able to see the impact of adjustments or lack of adjustments. How does the temp record change if all of the airport stations are left out? How does it change if you adjust for UHI? Or do not adjust for UHI? How does it change if you only use CRN 1/2 stations?
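Ron’s wished-for interface is not far-fetched, at least in miniature. Here is a toy sketch (station names, the `airport` flag, and all values are invented for illustration) of recomputing a mean temperature series under a user-chosen station filter, so the effect of, say, dropping airport stations is immediately visible. It assumes every station reports every year; real records have gaps that a serious tool would have to handle.

```python
# Hypothetical sketch: recompute a mean annual series under arbitrary
# station filters, so users can see the impact of inclusion choices.
def mean_series(stations, keep=lambda s: True):
    """Average annual values across the stations passing the filter."""
    selected = [s for s in stations if keep(s)]
    years = sorted({y for s in selected for y in s["data"]})
    return {y: round(sum(s["data"][y] for s in selected) / len(selected), 2)
            for y in years}

stations = [
    {"name": "A", "airport": True,  "data": {2000: 10.0, 2001: 10.4}},
    {"name": "B", "airport": False, "data": {2000: 9.0,  2001: 9.1}},
    {"name": "C", "airport": False, "data": {2000: 11.0, 2001: 11.1}},
]

print(mean_series(stations))                                  # all stations
print(mean_series(stations, keep=lambda s: not s["airport"])) # airports excluded
```

The same `keep` hook could select on any metadata field (CRN rating, urban/rural flag, and so on), which is exactly the “many different assumptions” knob the comment imagines.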

    • Yeah, Ron. Firefox is great because it crashes all the time. And it is tons slower than IE8. And BTW, there is an open source version of GISTemp, check out

      And as far as Linux goes, I worked with some of the Linux code at SGI (around 15 years ago) and the interrupt handling code for X86 sucked. I feel that I can say this since I lived and breathed this code for System V for close to a decade. I’m sure that the problems I observed have been fixed now, but the hard bits seemed to get a lot less attention than the sexy bits. If you want to see your questions answered, download the CCC code and hack away. That’s the point of it. If you are good at what you do, it will probably make it into the codebase.

    • Ron, ever heard of Clear Climate Code?
      Of course, I see no reason why all those things you would *like* to see, *should* be part of such a temperature record. No one is stopping you from writing such a thing yourself. No one! You have free access to the raw data (GHCN). Free! What’s stopping you, Ron?

      And the Met Office is one of those organisations that blocked open data for decades…

  5. “People are cynical because whenever the climate community is given the chance to redeem itself by acknowledging its failures and demonstrate a willingness to change its ways they choose whitewash and cover up.”

    As a member of the climate science community, I would like to know what you are specifically referring to. Most of my colleagues are quite happy to correct legitimate errors when they are pointed out. For example:

    “don’t adjust the raw data with opaque algorithms that add warming to stations for no apparent reason”


    It really is statements like these that make myself, and other climate scientists, reluctant to participate in communication efforts with the public. How are we supposed to respond and engage in good faith discussions when we are constantly being lied about?

    • Examples of failures that make sceptics cynical:

      1) Penn State Inquiry. Many of the CRU e-mails involved people who claim they have been wronged by Mann and his colleagues. There was no excuse for not seeking these people out, asking for their side of the story and insisting that Mann address their complaints directly.

      2) UEA Inquiry. Started out with reasonable objectives but screwed up by asking the editor of a journal that was implicated in the CRU emails to be on the review team. He did resign but the fact that he was even asked in the first place damaged confidence. Confidence was further eroded when the claim that none of the team members had a predetermined view on climate science was shown to be false, because Boulton has a long history of political advocacy for the alarmist view of climate change. Boulton has refused to resign yet the claim of ‘no predetermined view on climate change’ remains on the inquiry website.

      You can probably rationalize these actions but you should be able to understand why sceptics feel these inquiries are designed to be a whitewash rather than an honest effort to investigate the issues that have come up.

      The problem is not with errors. The problem is with the meat grinder homogenization algorithms that apply adjustments that don’t make sense when individual station records are reviewed.

      For example:
      http //

      UHI is a known issue so the adjustments to the Anchorage station make sense. But the adjustments to a rural station like Matanuska make zero sense and should not be applied unless there is some documented reason like a station move. It is not enough to say “whatever the algorithm spits out must be correct because it was ‘peer reviewed'”. There are many other stations with similar issues.

      • Tim,

        Instead of vague references and innuendo, can you specify who and what you are talking about? For example, who specifically did Mann “wrong” that should have been contacted and involved? How should they have been involved? Incidentally, an independent investigation by Penn State exonerated Mann on 3/4 charges. Investigation of the fourth charge is ongoing because of lack of evidence to make a decision.

        Also, there are currently four investigations occurring at UEA right now, including at least one parliamentary investigation. Curiously, no one seems concerned with investigating the illegal release of these emails. So I would say that those who are really interested in having their say against scientists are certainly well represented.

        • One example: McIntyre asked Mann for the residuals from the MBH series. Mann refused and claimed it would take too much time. The CRU emails show that Mann had no problems sending the residuals to Osborn and admitting that they were dirty laundry. The panel should have asked for McIntyre’s side of the story.

          Basically, they should have contacted everyone who was mentioned in Mann’s emails and asked them for submissions.

          I don’t see what the number of inquiries UEA has to do with anything. The independent Muir inquiry was supposed to be objective yet through a series of dumb decisions they have created the appearance of bias. Maybe they will surprise us with a fair report but many sceptics are rightly pessimistic given the actions of the chair to date.

          The police are investigating the leak vs. hack issue. They don’t seem to have found any evidence that it was a deliberate hack. The last I heard, they seem to think it was either an inside job or an accidental release.

          • Tim, unless the UEA enquiry comes with claims of fraud and conspiracy, I very much doubt the ‘skeptics’ would accept the result of the enquiry. There is only one acceptable outcome for the ‘skeptics’. This was obvious enough with the Penn State enquiry, which proceeded *without any official complaints filed*.

            “The last [you] heard” is a claim by a journalist. He didn’t even contact the police, he just made a claim. Within a few days that claim became a ‘fact’.

  6. suppose it’s too soon to hope that McI will STFU

    Kind of odd to see a parasite wishing for the death of its host.
