
“these will be artificially adjusted”

November 22, 2009

The emails are only the start of this. The zip includes data and code. Reader Neal writes as follows (SM Note: Anthony reports below that he has verified these comments in the following location /documents/osborn-tree6/mann/oldprog, in the files maps12.pro, maps15.pro and maps24.pro):

People are talking about the emails being smoking guns, but I find the remarks in the code, and the code itself, more of a smoking gun. The code is so hacked around to give predetermined results that it shows the bias of the coder. In other words: make the code ignore inconvenient data to show what I want it to show. After a quick scan, the code is quite a mess. Anyone with any pride would be too ashamed to let it out for public viewing. As examples of bias, take a look at the following remarks from the MANN code files:

function mkp2correlation,indts,depts,remts,t,filter=filter,refperiod=refperiod,$
datathresh=datathresh
;
; THIS WORKS WITH REMTS BEING A 2D ARRAY (nseries,ntime) OF MULTIPLE TIMESERIES
; WHOSE INFLUENCE IS TO BE REMOVED. UNFORTUNATELY THE IDL5.4 p_correlate
; FAILS WITH >1 SERIES TO HOLD CONSTANT, SO I HAVE TO REMOVE THEIR INFLUENCE
; FROM BOTH INDTS AND DEPTS USING MULTIPLE LINEAR REGRESSION AND THEN USE THE
; USUAL correlate FUNCTION ON THE RESIDUALS.
;

pro maps12,yrstart,doinfill=doinfill
;
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses 'corrected' MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
;

;
; Plots (1 at a time) yearly maps of calibrated (PCR-infilled or not) MXD
; reconstructions
; of growing season temperatures. Uses 'corrected' MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
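
For readers who don't use IDL: the p_correlate workaround described in the first excerpt is just a partial correlation computed by hand. A minimal sketch of the idea, not the original routine, with hypothetical variable names:

function remove_influence, ts, remts
  ; regress ts on the multiple series in remts (an nseries x ntime array)
  ; and return the residuals; yfit includes the fitted constant
  coef = regress(remts, ts, yfit=yfit)
  return, ts - yfit
end

; partial correlation of indts and depts, holding the remts series constant:
; r = correlate(remove_influence(indts, remts), remove_influence(depts, remts))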

86 Comments
  1. twit permalink
    November 22, 2009 6:14 pm

    Why don’t people use a serious programming language like Ada?

  2. Mike permalink
    November 22, 2009 6:26 pm

    If there ever is a need to actually _run_ some code on a cluster to see what it does, I might try to get access to our university cluster (don’t know how busy it is, but it is quite good I hear).

  3. ali baba permalink
    November 22, 2009 6:39 pm

    Two things are missing in this post — the code and an explanation.

    As far as I can tell, the comments document (1) a hack around a bug in IDL5.4 p_correlate and (2) the “trick”.

    Apparently your correspondent Neal knows better but forgot to say.

  4. crosspatch permalink
    November 22, 2009 6:42 pm

    Judging from the comments in HARRY_READ_ME.txt putting together the database from all the input was an awful chore spanning three years. And this exposes what I had suspected was a very real problem. You have several different people doing climate research and they all apparently make their own databases from the raw input. The raw input seems to change as stations are added or drop out or numbering systems change and so forth.

    What the world needs is a common standard temperature record database that they can all draw from, where the data is standardized. It would be a very difficult job but once done, could provide some continuity and make future studies much easier and less expensive. You could have better assurance that people are comparing apples to apples and not constructing databases that differ because of how they handled data input inconsistencies.

    Following HARRY_READ_ME.txt is like a techie version of a Stephen King novel and I imagine that just about anyone compiling any kind of climate records writes their own version of it.

  5. Mike permalink
    November 22, 2009 7:06 pm

    I’m wondering whether Harry might be the guy who released this… someone with an IT background who also had had a good look at the ugly side of all this.

  6. PaulS permalink
    November 22, 2009 7:09 pm

    ; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
    ; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
    ; gives a zero mean over 1881-1960) after extending the calibration to boxes
    ; without temperature data (pl_calibmxd1.pro). We have identified and
    ; artificially removed (i.e. corrected) the decline in this calibrated
    ; data set. We now recalibrate this corrected calibrated dataset against
    ; the unfiltered 1911-1990 temperature data, and apply the same calibration
    ; to the corrected and uncorrected calibrated MXD data.
    ;
    matchvar=1 ; 0=regression, 1=variance matching
    calper=[1911,1990]
    verper=[1856,1910]
    ;
    ; Get corrected and uncorrected calibrated MXD data
    ;
    if matchvar eq 0 then fnadd='_regress' else fnadd=''
    restore,filename='calibmxd3'+fnadd+'.idlsave'
    ;

    Comments please? From file FOI2009.zip\FOIA\documents\osborn-tree6\summer_modes\calibrate_correctmxd.pro
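
    For context, "variance matching" (matchvar=1 above) typically means a standard alternative to regression for calibrating a proxy: rescale it so its mean and standard deviation over the calibration period match the instrumental target. A minimal sketch with hypothetical variable names, not the code from calibrate_correctmxd.pro:

    function varmatch, proxy, temp, calidx
      ; shift and scale proxy so that, over the calibration samples calidx,
      ; its mean and standard deviation equal those of the instrumental series
      s = stddev(temp[calidx]) / stddev(proxy[calidx])
      return, mean(temp[calidx]) + s * (proxy - mean(proxy[calidx]))
    end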

  7. Eric permalink
    November 22, 2009 7:12 pm

    UEA may be careless with archiving – from README_GRIDDING.TXT:
    Bear in mind that there is no working synthetic method for cloud, because Mark New
    lost the coefficients file and never found it again (despite searching on tape
    archives at UEA) and never recreated it. This hasn’t mattered too much, because
    the synthetic cloud grids had not been discarded for 1901-95, and after 1995
    sunshine data is used instead of cloud data anyway.

    I am also reading the various Fortran source files – it would have been nice if they had heard of the concept of descriptive comments. I wonder if this code has ever been peer reviewed, verified or validated? (I am a software engineer.)

  8. November 22, 2009 7:13 pm

    Reading some of these letters really makes me realise how often the ‘professionals’ who show us graphs and data to ‘prove beyond a doubt the obvious truth’ actually have to put a lot of bits of chosen data together to get things working and displaying right.
    For some reason I thought there would not be so much discussion on how to make graphs and what years/which data to draw from – I assumed that if they needed a graph showing CO2 levels for the last 10 years, they would just get the data from some singular indisputable source that everyone knows about, and voila.

    It suddenly seems all so human and subjective.
    …maybe ‘marooned.jpg’ is something that got spat out of their trend/prediction models…

  9. Eric permalink
    November 22, 2009 7:18 pm

    From documents\harris-tree\recon_esper.pro:

    ; Computes regressions on full, high and low pass Esper et al. (2002) series,
    ; anomalies against full NH temperatures and other series.
    ; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
    ;
    ; Specify period over which to compute the regressions (stop in 1960 to avoid
    ; the decline
    ;

    Note the wording here “avoid the decline” versus “hide the decline” in the famous email.

  10. Calvin Ball permalink
    November 22, 2009 7:26 pm

    void fubar(void) {
        if (dataset == hockeystick)
            plot(dataset);
        else
            fudge(dataset);
    }

  11. benjamin permalink
    November 22, 2009 7:27 pm

    “So, we can have a proper result, but only by including a load of garbage! ”
    — HARRY_READ_ME.txt

  12. Calvin Ball permalink
    November 22, 2009 7:30 pm

    Analogous to the open source voting project, there needs to be an open source climate project. That seems doable. The data and reconstructions are a place to start, but it can and should also be done with the GCMs.

    And we all know what objections we’re going to hear.

  13. Eric permalink
    November 22, 2009 7:45 pm

    Another “avoid the decline” in documents\harris-tree\recon1.pro – only this time the decline to avoid is after 1940. There could be perfectly reasonable explanations for these, so I am not drawing any conclusions:

    ;
    ; Computes regressions on full, high and low pass MEAN timeseries of MXD
    ; anomalies against full NH temperatures.
    ; THIS IS FOR THE AGE-BANDED (ALL BANDS) STUFF OF HARRY'S
    ;
    ; Specify period over which to compute the regressions (stop in 1940 to avoid
    ; the decline
    ;

    Also appears in recon_jones.pro and many other files in this directory.

    The file calibrate_nhrecon.pro gets more specific:
    ; Specify period over which to compute the regressions (stop in 1960 to avoid
    ; the decline that affects tree-ring density records)

    The file plotagehuger.pro seems to be creating correlations at different time scales – including 20 and 40 year time scales.

    Interesting – plotregions_instr2.pro:
    ; Early N. American ones seem biased to being too warm early on (due to
    ; using north facing walls rather than Stephenson screens; the walls get
    ; sunny during high latitude summer!). so cut the first two regions off:

    recon_mann.pro:
    ; Computes regressions on full, high and low pass MEAN timeseries of MXD
    ; anomalies against full NH temperatures.
    ; THIS IS FOR THE Mann et al. reconstruction
    ; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
    ; IN FACT, I NOW HAVE AN ANNUAL LAND-ONLY NORTH OF 20N VERSION OF MANN,
    ; SO I CAN CALIBRATE THIS TOO - WHICH MEANS I'm ONLY ALTERING THE SEASON

    briffa_sep98_e.pro:
    ;
    ; PLOTS 'ALL' REGION MXD timeseries from age banded and from hugershoff
    ; standardised datasets.
    ; Reads Harry’s regional timeseries and outputs the 1600-1992 portion
    ; with missing values set appropriately. Uses mxd, and just the
    ; "all band" timeseries
    ;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********

    After doing that, this code then looks at both 20-year and 50-year components.

    The code of all these files is very similar – mostly changing the data sets, it seems. From a s/w engineering perspective, having so much duplicated code can lead to maintenance, modification, test and validation problems.

    The complete 2001 Briffa tree ring data set is located in
    documents\harris-tree\banding\b01abd_site.txt

    Table of tree-ring sites used in

    Briffa KR et al. (2001) Low-frequency temperature variations from a northern tree-ring-density network. J. Geophysical Research 106, 2929-2941.

    Number of sites: 389

    This directory, too, consists of many files containing roughly the same code with minor changes.
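
    For what it's worth, the "stop in 1960/1940" comments describe restricting the calibration fit to the years before the divergence. A sketch of such a restricted fit in IDL, on dummy data with hypothetical variable names (not the released code):

    ; dummy data purely for illustration
    year = 1850 + findgen(140)
    mxd  = randomn(seed, 140)
    temp = 0.5*mxd + 0.1*randomn(seed, 140)

    pre   = where(year le 1960)          ; calibrate only on pre-divergence years
    coef  = linfit(mxd[pre], temp[pre])  ; fits temp = coef[0] + coef[1]*mxd
    recon = coef[0] + coef[1]*mxd        ; apply the calibration over the full period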

  14. Nicholas permalink
    November 22, 2009 7:50 pm

    It sounds like they don’t have much experience with databases or programming. The correct way to do this is to write code which reads the original input data, does whatever corrections/merging are required, then spits out the new data. That way, if the upstream (input) data changes, you just re-run it and generate the updated output file. Doing it manually is silly, a waste of time, and a maintenance nightmare.
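
    A minimal sketch of that approach in IDL (the file layout is hypothetical; the point is that every correction lives in re-runnable code rather than in hand-edited files):

    pro rebuild_database
      ; regenerate the working database from the raw inputs every time,
      ; so an upstream change only ever requires a re-run
      raw = file_search('raw_input/*.dat', count=nfiles)
      openw, out, 'database.dat', /get_lun
      line = ''
      for i = 0, nfiles-1 do begin
        openr, in, raw[i], /get_lun
        while ~eof(in) do begin
          readf, in, line
          ; scripted, documented corrections/merging would go here
          printf, out, line
        endwhile
        free_lun, in
      endfor
      free_lun, out
    end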

  15. Eric permalink
    November 22, 2009 7:58 pm

    documents/osborn-tree3/declineseries.pdf is a chart of numerous locations apparently showing ring density and widths.

    For the density category, there is a divergence post 1940. For the width category (as I understand the chart labels), there is a divergence post-1960.

    Someone familiar with tree rings may wish to help explain what these charts show and what they mean, and how the declining values may relate to the name “declineseries” and perhaps the phrase “hide the decline”.

  16. Peter Sullivan permalink
    November 22, 2009 8:06 pm

    Hey Steve, this mathematically challenged bloke can still follow your logic. I doff my lid to you, sir.

  17. DRE permalink
    November 22, 2009 8:14 pm

    Keep looking at the code comments. I’ve found some interesting stuff there as well.

    Holy &*%$*@ Batman, I hadn’t seen this before!

    ; plot past 1960 because these will be artificially adjusted to look closer to
    ; the real temperatures.

    Smoking howitzer?

    “trick to hide the decline” perhaps?

  18. MattN permalink
    November 22, 2009 8:18 pm

    I commented on another forum this morning about how the code was really going to get people in trouble. Can’t wait to see what you guys turn up…

  19. Jim permalink
    November 22, 2009 8:21 pm

    *********************
    crosspatch permalink
    What the world needs is a common standard temperature record database that they can all draw from, where the data is standardized. It would be a very difficult job but once done, could provide some continuity and make future studies much easier and less expensive. You could have better assurance that people are comparing apples to apples and not constructing databases that differ because of how they handled data input inconsistencies.
    ****************
    The truth is that probably there are not enough surface instrumental records from instruments that are well sited and well distributed around the globe. So there probably never will be a good global record of the past. Probably the satellite record is the best, but we only have 30 years worth.

  20. Calvin Ball permalink
    November 22, 2009 8:36 pm

    Sounds like they don’t understand how email works, either. There’s no such thing as “deleted”.

  21. Frank K. permalink
    November 22, 2009 9:07 pm

    MattN permalink

    “I commented on another forum this morning how the code was really going to get people in trouble. Can’t wait to see what you guys turn up…”

    The fact that the codes are proving to be amateurish hacks is no surprise to me, given what I have seen in both GISTEMP and Model E. The amazing thing to me is that these guys will write papers describing applications of the most intricate and complicated mathematical algorithms to climate data; but when the curtain is opened to reveal how their results are ** really ** computed, we find incomprehensible junk code. This is not unlike what I sometimes see in the world of CFD…

  22. Calvin Ball permalink
    November 22, 2009 9:57 pm

    If someone has an innocent interpretation of “;****** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE*********”, I’m all ears. Seriously.

  23. Declan O'Dea permalink
    November 22, 2009 10:09 pm

    The episode has made the front page of Australia’s national newspaper: http://www.theaustralian.com.au/news/nation/hackers-expose-climate-brawl/story-e6frg6nf-1225801879912

  24. Nick Stokes permalink
    November 22, 2009 10:55 pm

    Calvin – there is a “community project” developing a framework for GCMs etc – CCSM.

  25. richard permalink
    November 22, 2009 10:55 pm

    Here’s an interesting May 2007 e-mail from Phil Jones that Ben Santer is replying to in 1178107838.txt. Jones says:

    As a side issue , the disappearance of sea ice in the Arctic is going
    to cause loads of problems monitoring temps there as when SST data
    have come in from the areas that have been mostly sea ice, it is always
    warm as the 61-90 means are close to -1.8C. Been talking to Nick
    Rayner about this. It isn’t serious yet, but it’s getting to be a problem.
    In the AR4 chapter, we had to exclude the SST from the Arctic plot
    as the Arctic (north of 65N) from 1950 was above the 61-90 average
    for most of the years that had enough data to estimate a value.

    Now, to me, this sounds like he’s talking about a real issue in the data: i.e. the land temps on ice are colder than the sea temps when the ice melts and the 61-90 anomaly is based mostly on land temps, therefore anytime the ice melts, the temperature anomaly jumps up more than it should. Sounds like a legitimate data issue to be concerned about.

    But then, he quickly moves on to how they had to exclude SST from the Arctic plot in AR4 because the values from 1950 on were above the 61-90 average. Interesting…. Are they doing the same in plots of Arctic temperatures now? Or is it okay when it happens now (increases the trend) but needs to be suppressed in the past (decreases the trend)?

    It’s all of these little gatekeeping choices that concern me the most.

  26. stevemcintyre permalink*
    November 23, 2009 12:40 am

    Can someone please provide an exact location for the code provided by reader Neal that I cited in the head post? (Obviously I should have done so already but have been very busy today.)

    In the file harris-tree/briffa_sep98_e.pro, I’ve verified the phrase

    ***** APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE********

    but I’d like to get an exact reference for Neal’s quote.

  27. wattsupwiththat permalink*
    November 23, 2009 12:50 am

    Steve, confirmed.

    The source files with the comments that are the topic of this thread are in this folder of the zip file

    /documents/osborn-tree6/mann/oldprog

    in the files
    maps12.pro
    maps15.pro
    maps24.pro

    The first two files are dated 1/18/2000, and the maps24 file 11/10/1999, so it fits with Jones’s 1999 email mentioning “Mike’s Nature trick”, which was dated 11/16/1999.

  28. stevemcintyre permalink*
    November 23, 2009 12:59 am

    Thanks, Anthony. I’ve looked through the predecessor emails to Mike’s trick – they refer to the IPCC spaghetti diagram which I’ve discussed on a few occasions at CA. I’ll have a detailed post on this.

  29. J.Hansford permalink
    November 23, 2009 2:21 am

    Yes, it’s pretty bad code considering that CRU obtained 13.5 million pounds sterling with which to write and correlate it…. Lot of British taxpayer money there for a virtual nothing. (pun intended)

  30. aha permalink
    November 23, 2009 2:26 am

    I like this one from README_GRIDDING.TXT..

    “Use dist to specify the correlation decay distance for the climate
    variable being interpolated – necessary information to determine where
    to add dummy or synthetic data.”

    Keen to find out whether dummy and synthetic data were used during the runs that produced the graph data. Moreover, what was the methodology used to produce the synthetic data? Oh, and whether dummy data was just used for testing.

    Not sure if there’s anything untoward going on here, but you know what modelers tend to say… Garbage In Garbage Out (GIGO).

  31. KeithIsDeepThroat permalink
    November 23, 2009 2:57 am

    “This whole project is SUCH A MESS. No wonder I needed therapy!!”

    — HARRY_READ_ME.txt

  32. Dev permalink
    November 23, 2009 3:13 am

    Found in calibrate_mxd.pro:

    ;
    ; Calibrates the gridded and infilled MXD data against instrumental
    ; summer temperatures (land&sea). On a grid-box basis first, using the
    ; period 1911-1990 for calibration and the period 1856-1910 for verification,
    ; where data is available.
    ;
    ; Due to the decline, all time series are first high-pass filter with a
    ; 40-yr filter, although the calibration equation is then applied to raw
    ; data.
    ;
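
    For reference, a 40-yr high-pass filter of the kind that comment describes can be as simple as subtracting a smoothed version of the series. A sketch on dummy data, with a boxcar smooth as a stand-in for whatever filter was actually used:

    ; dummy annual series purely for illustration
    mxd = randomn(seed, 200)
    ; high-pass = series minus its ~40-yr low-pass component
    lowpass  = smooth(mxd, 41, /edge_truncate)   ; 41-point boxcar (odd width)
    highpass = mxd - lowpass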

  33. FINN permalink
    November 23, 2009 4:30 am

    How should this be explained:

    \f77\mnew\master.dat.com
    29740 6030 2500 58 HELSINKI/SEUTULA—- FINLAND—— 1829 1999 297400 2974001
    1829 -129 -150 -97 -18 76 139 173 139 112 34 -47 -77

    This makes me suspicious… was there a measuring point in Seutula (Helsinki/Vantaa airport) since 1829 – long before the airport opened in 1952? Or what?

    I suspect the coordinates are in the 2nd and 3rd header fields: 60° 30′N and 25° 00′E?

    According to the map, that is 5 km southwest of the centre of Järvenpää, in the middle of the forest. Pretty interesting: in the middle of the forest near Järvenpää there is a station called “Helsinki/Seutula” (which is the airport)?

  34. FINN permalink
    November 23, 2009 5:20 am

    I am also very concerned about this:
    ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2/v2.mean.Z – here we find measurements from the Pello station (WMO code 02844).
    According to FMI, measurements in Pello started 1.9.2000:
    http://www.fmi.fi/weather/stations_35.html

    Do they calculate graphs from made-up data, or are the GHCN codes just mixed up?

  35. FINN permalink
    November 23, 2009 5:44 am

    Forgot to say that in NOAA’s package we find measurements from 1.11.1801, while FMI says measurements started 1.9.2000. Highly suspicious.

  36. bender permalink
    November 23, 2009 8:59 am

    In fact that’s what “Harry” himself says. The decision early on to continue with past code & methods was probably the pointy-haired boss’s idea, to maintain consistency with earlier products. Bad decision.

  37. Ace permalink
    November 23, 2009 10:48 am

    I’ve written a tool that can search through a batch of files and quickly find key phrases in any type of document.

    What I’ve found is that most of what people have posted is typical conspiracy theorist alarmist nonsense, with almost no substance.

    For example:
    “One particular thing you said – and we agreed – was about the IPCC reports and the broader climate negotiations were working to the globalisation agenda driven by organisations like the WTO.

    So my first question
    is do you have anything written or published, or know of anything
    particularly on this subject, which talks about this in more detail?”

    I located the file in question (greenpeace.txt). It is an email from a Paul Horsman at Greenpeace to a Mick Kelly at UEA. It simply implies that Mick must have said at some stage that the IPCC reports were working in favour of a globalisation agenda. It doesn’t imply that the IPCC have intentionally skewed data to favour a globalisation agenda, as the blogger wrote.

    Once again: “these will be artificially adjusted to look closer to the real temperatures.” What exactly will be artificially adjusted, and what is meant by real temperatures? To me, this means actual temperature measurements that have been taken, as opposed to studying tree rings for implied temperatures.

    As far as I am concerned, there is nothing wrong with doing science like this if he is talking about replacing data after 1960 from less reliable sources with data that is current from more reliable sources.

    I’m not a climate scientist, and I don’t know what he meant. But I can see that it is possible that what he is doing could be legitimate.

    All I’m saying is that although it might be fun to find a conspiracy theory in every document, people need to be responsible: look deeper and stay open-minded to other possible interpretations before getting carried away.

    I believe that most people who comment here, and most bloggers who have written about this, are not qualified to pass judgement on what they have found, unless they are themselves climate scientists and have a full understanding of what the author’s intended message was at the time of authorship.

    This needs to be left to a board for independent inquiry, but God forbid, not a bunch of conspiracy theorists with blogs.

  38. Ace permalink
    November 23, 2009 11:28 am

    I think this proves my point.

    http://www.realclimate.org/index.php/archives/2009/11/the-cru-hack/

    No doubt, instances of cherry-picked and poorly-worded “gotcha” phrases will be pulled out of context. One example is worth mentioning quickly. Phil Jones in discussing the presentation of temperature reconstructions stated that “I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (ie from 1981 onwards) and from 1961 for Keith’s to hide the decline.” The paper in question is the Mann, Bradley and Hughes (1998) Nature paper on the original multiproxy temperature reconstruction, and the ‘trick’ is just to plot the instrumental records along with reconstruction so that the context of the recent warming is clear. Scientists often use the term “trick” to refer to a “a good way to deal with a problem”, rather than something that is “secret”, and so there is nothing problematic in this at all. As for the ‘decline’, it is well known that Keith Briffa’s maximum latewood tree ring density proxy diverges from the temperature records after 1960 (this is more commonly known as the “divergence problem”–see e.g. the recent discussion in this paper) and has been discussed in the literature since Briffa et al in Nature in 1998 (Nature, 391, 678-682). Those authors have always recommend not using the post 1960 part of their reconstruction, and so while ‘hiding’ is probably a poor choice of words (since it is ‘hidden’ in plain sight), not using the data in the plot is completely appropriate, as is further research to understand why this happens.

  39. DaveJR permalink
    November 23, 2009 1:35 pm

    “As far as I am concerned, there is nothing wrong with doing science like this if he is talking about replacing data after 1960 from less reliable sources with data that is current from more reliable sources.”

    Nice idea, but these “less reliable sources” are being used to tell us what the temperature was for the past 2000 years! You don’t think that maybe using “reliable” data for the modern period and “less reliable” data for the earlier part is going to introduce any kind of visual bias at all?

    If the tree ring data is so unreliable it has to be replaced with temperature data to make it look good, what scientific use is it when there is no temperature data available to correct the “unreliability”?

  40. Sean permalink
    November 23, 2009 2:16 pm

    Hilarious!

  41. Sean permalink
    November 23, 2009 2:23 pm

    Great line!

    What do you mean by your screen name by the way?

  42. Sean permalink
    November 23, 2009 2:36 pm

    Hi Ace, what you’re missing is that the proxies which supposedly track global temperatures for the past 1,000 years (when we can’t check them against anything else) demonstrably fail to track global temperatures as reported by HAD-CRU for the last 100 years. Starting in about 1960, the proxies suggest that temps are falling while the HAD-CRU temp record shows that they are rising.

    If the proxies don’t work now, that should cause you to question whether they ever worked. There may be a good answer to that question BUT if the observed temperatures are padded into the proxy chronology, the proxy chronology will appear to track observed temps and then you, the reader, will not even know that there is a question that needs to be answered.

    And what does the padding tell you about their confidence that there is a good answer to the question they are so studiously avoiding?
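
    To make the point concrete, here is what padding looks like mechanically, on dummy data with hypothetical variable names (purely illustrative; the released code is not reproduced here):

    ; dummy series purely for illustration
    year  = 1900 + findgen(100)
    proxy = randomn(seed, 100) - 0.5*(year gt 1960)    ; proxy diverges low after 1960
    instr = 0.01*(year - 1900) + 0.1*randomn(seed, 100)

    padded = proxy
    late   = where(year gt 1960, n)
    if n gt 0 then padded[late] = instr[late]   ; padded series now tracks instr by construction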

  43. Layman Lurker permalink
    November 23, 2009 2:48 pm

    Steve, here are some re-posts from tAV of follow-up comments by Jeff C. regarding “mbh98-osborn.zip”

    Jeff C. Nov 23, 1:19 pm

    “Following up on #9 and #10, the zip file ”mbh98-osborn.zip” in the documents directory has some other interesting stuff. The directory ”mbh98-osborn\TREE\COMPARE” appears to have multiple versions of various recons presumably to compare the differences. There is even a file named “resid-fudge.dat” that has quite a hockey stick. Probably just someone goofing around with the data but the filename makes one wonder.”
    -
    Jeff C. Nov 23, 1:35 pm

    “Looks like someone might have been indulging in a bit of wishful thinking with the “resid-fudge.dat” file. If you compare it to several others in the same directory (resid-lowf.dat, resid-lowf-1750.dat, and resid-best.dat) look what you find:

    year resid-lowf resid-fudge resid-lowf-1750 resid-best
    1952 0.199307 0.199307 0.199307 0.199307
    1953 0.203005 0.203005 0.203005 0.203005
    1954 0.206704 0.206704 0.206704 0.206704
    1955 0.211684 0.211684 0.211684 0.211684
    1956 0.216662 0.216662 0.216662 0.216662
    1957 0.222889 0.222889 0.222889 0.222889
    1958 0.229115 0.229115 0.229115 0.229115
    1959 0.236529 0.236529 0.236529 0.236529
    1960 0.243945 0.243945 0.243945 0.243945
    1961 0.25247 0.25247 0.25247 0.24
    1962 0.260995 0.260995 0.260995 0.24
    1963 0.270529 0.270529 0.270529 0.24
    1964 0.280065 0.280065 0.280065 0.24
    1965 0.290492 0.290492 0.290492 0.24
    1966 0.300919 0.300919 0.300919 0.24
    1967 0.312104 0.312104 0.312104 0.24
    1968 0.323288 0.323288 0.323288 0.24
    1969 0.335084 0.335084 0.335084 0.24
    1970 0.346879 0.346879 0.346879 0.24
    1971 0.359128 0.359128 0.359128 0.24
    1972 0.371377 0.371377 0.371377 0.24
    1973 0.383913 0.383913 0.383913 0.24
    1974 0.396452 0.396452 0.396452 0.24
    1975 0.409108 0.409108 0.409108 0.24
    1976 0.421766 0.521766 0.421766 0.24
    1977 0.434374 0.534374 0.434374 0.24
    1978 0.446982 0.546982 0.446982 0.24
    1979 0.459376 0.559376 0.459376 0.24
    1980 0.471771 0.771771 0.471771 0.24

    Note how the values are the same up until 1975. After that someone helpfully added 0.1 to the “resid-fudge” values from 1976 to 1979 and 0.3 to 1980. Again, no smoking gun but it does make me think this deserves more scrutiny.”

    Question: Could this be intermediate work which ties in with “Mike’s Nature Trick”? The date of the zip file is 5/22/2000
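
    Jeff C.’s observation is easy to check; the offsets fall straight out of a subtraction (values copied from the table above):

    ; last five years of the two columns, as quoted above
    lowf  = [0.421766, 0.434374, 0.446982, 0.459376, 0.471771]
    fudge = [0.521766, 0.534374, 0.546982, 0.559376, 0.771771]
    print, fudge - lowf   ; -> 0.1, 0.1, 0.1, 0.1, 0.3 (to float rounding)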

  44. Sean permalink
    November 23, 2009 2:50 pm

    They have not been clear about this, and even today they are not clear about it in the mainstream media.

    This is from Andy Revkin’s article in the New York Times:

    “Through the last century, tree rings and thermometers show a consistent rise in temperature until 1960, when some tree rings, for unknown reasons, no longer show that rise, while the thermometers continue to do so until the present.”

    So far so good, but here comes the obfuscation:

    “Dr. Mann explained that the reliability of the tree-ring data was called into question, so they were no longer used to track temperature fluctuations.”

    To any layman this would sound like the reliability of tree-rings was called into question, so tree-ring proxies were dropped from use due to their unreliability. Sounds very scientific to me.

    But that’s not what happened at all. No, the tree ring data was truncated at 1940, so the inconvenient bits were snipped away and the convenient (but equally questionable) pre-1960 data was not dropped. Note that “truncated” is kind of like “no longer used”, so Mann is choosing his words very carefully.

    And in case you think that I am just misunderstanding Mann even though his meaning is clear, here’s the next sentence of the article:

    “But he said dropping the use of the tree rings was never something that was hidden, and had been in the scientific literature for more than a decade.”

    So even Andy Revkin (who knows this area very well) interpreted Mann’s statement to mean that the unreliable data set was dropped, not selectively used.

    So, contrary to the claims of Real Climate, “hiding” is indeed an excellent choice of words to describe Mann’s ongoing obfuscation.

    By the way, I tried to post this same point to Andy Revkin’s blog on dotearth twice and it never made it through moderation, as near as I can tell.

  45. Layman Lurker permalink
    November 23, 2009 2:59 pm

    Jeff C’s comments on tAV found on this thread: http://noconsensus.wordpress.com/2009/11/22/steve-mcintyres-at-it-again/#comment-12737

  46. John Baltutis permalink
    November 23, 2009 5:55 pm

    there is a “community project” developing a framework for GCMs etc – CCSM.

    Hmmm! Started in 1983, redone in 1996, and the last version in June 2006, with not much else going on. Seems more like a propaganda exercise, mimicking the IPCC’s stance.

  47. Sean permalink
    November 23, 2009 6:50 pm

    The use of the word “fudge” seems fishy. It’s almost the perfect smoking gun. Could it be that someone has released some real emails but fiddled with others and with the code? Either someone trying to (further) discredit CRU, or someone who hopes that critics of CRU will seize upon this and end up looking like conspiracy theorists?

    It doesn’t seem like any of this was made up, so far, but I note that there’s really only one email that CRU is acknowledging (the trick/hide email) and many of the rest (especially about deleting documents) are being ignored. I would ignore them too, if they were mine and they were accurate, but in an abundance of caution, critics should be careful until they are verified (maybe through a fresh FOI request?)

  48. Sean permalink
    November 23, 2009 6:58 pm

    Correction! It seems the emails about deleting documents are being acknowledged (at least the most damning one), by none other than the Mann himself:

    “Mann, a climatologist and statistician at Pennsylvania State University, University Park, said, “I did not delete any emails at all in response to Phil Jone’s [sic] request, nor did I indicate to him that I would.”

    http://blogs.sciencemag.org/scienceinsider/2009/11/in-climate-hack.html

    By the way, the article is very interesting and has some information on the potential criminal implications of deleting emails subject to a FOI request.

  49. Jon permalink
    November 23, 2009 7:24 pm

    snip – don’t leverage this thread to introduce this topic

  50. Gerry permalink
    November 23, 2009 7:54 pm

    What programming language is this?

  51. John Murphy permalink
    November 23, 2009 8:58 pm

    Does anyone know whether this is the standard of coding used for the many GCMs floating about?

    If it is, we can have no confidence at all in them either, much less in the data they are “supposed” to be using.

  52. John Murphy permalink
    November 23, 2009 9:06 pm

    Well, there was one of those.

    The English High Court held last year that Gore’s DVD was, in effect, scientific hocus-pocus and political propaganda.

    Otherwise, it’s a bit hard to get proper independent inquiries. Even governments now have a vested interest – gaining power, avoiding loss of face – and it’s governments who set up inquiries.

  53. Layman Lurker permalink
    November 23, 2009 9:47 pm

    If you were going to do a run such as this, what else would you call it? I think it is obvious that it is not a final product, but intermediate code/data. Therefore the “fudge” name is just something that helps document what this file represents. So if this represents something intermediate, then intermediate “to what,” I ask? The WMO graph? Playing around with the data to see what type of work is needed to create “Mike’s Nature Trick”? Dunno.

  54. November 23, 2009 10:02 pm

    Gerry, I believe this is IDL. If so, a license is $6200 – not sure if there are viable alternatives (there is a GPL version, but I’m not sure how compatible it is, or maybe this ran on that originally).

  55. TurkeyLurkey permalink
    November 24, 2009 12:26 am

    Mr. Mc;
    So very glad you had gotten through that esperesque treatment of the Yamal trees before this all hit the fan.
    I’m thinking that will be worthy of revisitation after this noise dies down.
    Perhaps the ninth word of the prior sentence is better as ‘if’ …
    Highest regards,
    TL

  56. kingdiamond25 permalink
    November 24, 2009 4:25 am

    How do you know that tree ring data is all they had to plot historical temperatures? Maybe they had other findings that confirmed the tree ring data pre-1960. Maybe they have a good reason for ignoring tree ring data post-1960; maybe some other anthropogenic effect could influence those results (such as excess CO2 in the atmosphere).

    I’m not saying we should ignore these emails. I am saying that they should be scrutinised by independent scientists. Not by conspiracy theorists with blogs, who are not qualified to pass judgement on what they have found.

  57. Larry Huldén permalink
    November 24, 2009 6:00 am

    FINN said: 29740 6030 2500 58 HELSINKI/SEUTULA—- FINLAND—— 1829 1999 297400 2974001
    1829 -129 -150 -97 -18 76 139 173 139 112 34 -47 -77

    This makes me suspicious… was there a measuring point in Seutula (Helsinki/Vantaa airport) since 1829 – long before the airport opened in 1952? Or what?

    Helsinki temperature data extends back to 1829, when the park of Kaisaniemi was the measuring point. It was moved a few times and finally placed at Helsinki/Vantaa airport, about 15 km north of the city.
    The history is relatively well understood, but homogenized data have been produced only from 1880 onward.
    The given coordinates don’t explain the history of the measuring point.
    Larry Huldén
    Finnish Museum of Natural History

  58. Joe permalink
    November 24, 2009 2:54 pm

    Calvin,

    I can. When I put a comment like that in my code amongst several other lines of non-capitalized / asterisked sentences, it means “WARNING TO ANYONE READING THIS CODE: TAKE THE FOLLOWING WITH A GRAIN OF SALT, PLEASE DON’T MISINTERPRET THESE RESULTS AS REAL.” As in, the code is meant simply to demonstrate something. Has anyone even figured out what (if any) end product this code truly was used for?

  59. Syl permalink
    November 24, 2009 6:47 pm

    ACE

    This part of Gavin’s response you quoted is the actual trick:

    “The paper in question is the Mann, Bradley and Hughes (1998) Nature paper on the original multiproxy temperature reconstruction, and the ‘trick’ is just to plot the instrumental records along with reconstruction so that the context of the recent warming is clear.”

    The reconstruction flattened centuries past to emphasize the recent warming.

    Up until this paper was published, it was thought there had been much more variation in past temperatures.

  60. Uninformed Luddite permalink
    November 24, 2009 9:40 pm

    Eric,

    You are a software engineer and you have to ask? I don’t even use the language, but a quick glance at the code’s structure and a perusal of a few manuals make it pretty clear that this looks more like the work of a script kiddie (with a preconceived result in mind) than the work of a real scientist. Although the code may have been written by someone who hasn’t had any exposure to Comp. Sci. (although in that case you would expect the code to have been written by a professional).

  61. Asimov permalink
    November 25, 2009 12:36 am

    Please, people, before you start dissecting the code, read the file HARRY_READ_ME.txt in its entirety. It will make MANY things clear. Few of which are likely to make the CRU people happy… But it does help make sense of the… ridiculous… state that the code is in.

  62. Steve permalink
    November 25, 2009 7:33 am

    Obviously he has a preconceived result in mind. I am a computational physicist, and this is how all modern science works.

    It might not seem right to you, but this method of approaching problems actually works. If it didn’t, no scientific progress would have been made since about the time of Newton and we’d all still be living in the dark ages.

    Before you approach a problem, especially a computational one, you have a predicted outcome in mind based on theory. You then apply known rules to your program, and hope its results agree with your predicted outcome. If they don’t, you try to adjust your theory slightly, write these adjustments into the program, and so the cycle continues until you have agreement.

    This is certainly the way my field of physics is done, and I know from studying philosophy of science that this is the way all modern science is done.

  63. W W permalink
    November 25, 2009 8:49 pm

    For the problem of divergence ClimateAudit refers to this article:
    http://www.nature.com/nature/journal/v391/n6668/full/391678a0.html

    The abstract says: “The cause of this increasing insensitivity of wood density to temperature changes is not known, but if it is not taken into account in dendroclimatic reconstructions, past temperatures could be overestimated.”

    Obviously the code did the exact opposite: past temperatures are underestimated and the proxy must be corrected to compensate.

    Am I missing something?

  64. W W permalink
    November 25, 2009 9:10 pm

    I meant “RealClimate refers to this article” (not ClimateAudit)

  65. UninformedLuddite permalink
    November 26, 2009 5:53 am

    As I cannot reply to your post, Steve, I will reply here.

    Maybe I didn’t phrase what I meant that well. Some of the code (two files verified with this behaviour so far) will output certain ‘preconceived’ results from a very wide range of inputs, and appears to have been designed that way, with some very basic obfuscation (hence my pejorative ‘script-kiddie’ reference). Sorry I was not clearer.

  66. UninformedLuddite permalink
    November 26, 2009 6:07 am

    Actually, mate, upon re-reading your response to my initial post, just delete anything I have said on this blog (just a little too condescending for me). I promise not to return. You have been given several smoking guns and they aren’t in the emails. Find them yourself.

  67. Asimov permalink
    December 1, 2009 4:43 pm

    Steve said: “Obviously he has a preconceived result in mind. I am a computational physicist, and this is how all modern science works.”

    That isn’t the issue. Does modern science also go back and change old data PERMANENTLY and IRREVOCABLY because it doesn’t fit in with their preconceived result? Oh, and throw the real data away after the changes?

    Sorry, but that’s NOT how modern science works. Having a preconceived result in mind is totally acceptable, changing the facts because they don’t fit in with your particular view is NOT acceptable.

  68. j r walker permalink
    December 11, 2009 4:29 pm

    If you transform the raw data, then discard the raw data so that no one can reproduce your results, then it’s not science. End of story.


Trackbacks

  1. Steve McIntyre’s at it Again « the Air Vent
  2. Climategate Code Models Show Evidence Of Being Rigged « Fascist Soup
  3. Nice Deb
  4. CRU Emails “may” be open to interpretation, but commented code by the programmer tells the real story | NW0.eu Daily News
  5. Forget The Emails, Code Discusses “Artificially Adjusted” Temperatures « Wars & Rumor
  6. Mer Climategate | Sultans Blogg
  7. CRU Emails “may” be open to interpretation, but commented code by the programmer tells the real story « NWO News
  8. Forget The Emails, Code Discusses “Artificially Adjusted” Temperatures | NW0.eu Daily News
  9. Forget The Emails, Code Discusses “Artificially Adjusted” Temperatures | Conspiratorium 101
  10. Climategate – La fraude à propos du climat s’effondre « Les 7 du Québec
  11. CRU Emails “may” be open to interpretation, but commented code by the programmer tells the real story
  12. Global Warming Research Group Hacked: E-mails May Indicate Fraud, Conspiracy « The Seldon Plan
  13. Climate Gate And The Crisis Of Modern Science | The Laser Guided Loogie
  14. The Progressive Mind » “these will be artificially adjusted” « Climate Audit – mirror site [OBSOLETE!]
  15. NA-151-2009-11-26
  16. NoAgenda.tv | Blog | NA-151-2009-11-26
  17. Climategate – La fraude à propos du climat s’effondre | Les 7 du Québec
