
How much of global warming is due to data corruption?

August 27, 2014

The Australian Bureau of Meteorology (BOM) is scrambling to justify its intentional corruption of data. Dr. Jennifer Marohasy has a new post demonstrating that the excuses on offer do not hold up.

Whereas the Australian establishment uses “homogenisation” as its euphemism for “intentional data corruption”, the US uses “adjustment”: How NOAA Data Tampering Destroys Science

The temperature record at Rutherglen has been corrupted by managers at the Australian Bureau of Meteorology.

Of course raw data often needs to be adjusted, but when the magnitude of the adjustment is greater than the magnitude of the conclusion drawn from the data, the adjustment or homogenisation becomes “data corruption” or “data tampering”. As my professor, Doug Elliott, told me some 40 years ago, when I wanted to make calculated corrections for presumed radiation errors in flame temperature measurements: “You can argue for whatever corrections you want to make, but you cannot replace the measurement. The measurement is the measurement is the measurement.”
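
To make that threshold concrete, here is a minimal sketch in Python using entirely invented numbers, not any actual BOM record: it compares the size of a hypothetical homogenisation step with the size of the trend the adjusted series would then show.

    import numpy as np

    # Hypothetical annual mean temperatures (degC) for a station, 100 years.
    # None of this is real BOM data; it only illustrates the size comparison.
    rng = np.random.default_rng(0)
    years = np.arange(1913, 2013)                       # 100 values
    raw = 14.0 + 0.1 * rng.standard_normal(years.size)  # essentially trendless raw record

    # Hypothetical "homogenisation": cool the first half of the record by 0.6 degC.
    adjusted = raw.copy()
    adjusted[years < 1963] -= 0.6

    def trend_per_century(series, years):
        # Least-squares linear trend, expressed in degC per century.
        slope = np.polyfit(years, series, 1)[0]
        return slope * 100.0

    raw_trend = trend_per_century(raw, years)
    adj_trend = trend_per_century(adjusted, years)
    mean_adjustment = np.mean(np.abs(adjusted - raw))   # average size of the change made

    print(f"Raw trend:       {raw_trend:+.2f} degC/century")
    print(f"Adjusted trend:  {adj_trend:+.2f} degC/century")
    print(f"Mean adjustment: {mean_adjustment:.2f} degC")

    # If the adjustment (0.6 degC where applied) is larger than the warming
    # the adjusted record is then used to demonstrate, the conclusion rests
    # on the adjustment, not on the measurement.

The point is the comparison printed at the end: an essentially trendless raw record acquires a warming trend comparable in size to the step that was inserted, which is precisely the situation where the adjustment, rather than the measurement, is doing the work.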

A “science” built on the falsification of data?

As was recently pointed out, the fudging of both data and model results seems endemic in “climate science”. Consider a recent paper from ETH Zurich: if the model data is corrected downwards, as suggested by the ETH researchers, and the measurement data is corrected upwards, as suggested by the British and Canadian researchers, then the model and actual observations are very similar, as the simple sketch below illustrates.
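
The sketch uses invented correction sizes (the figures are not taken from the ETH, British or Canadian papers); its only point is that once one series is shifted down and the other shifted up, their subsequent agreement reflects the chosen corrections rather than independent confirmation of either.

    # Hypothetical warming rates in degC per decade; not the published figures.
    model_trend    = 0.30   # model ensemble, said to run too hot
    observed_trend = 0.17   # measured record, said to be biased low

    model_correction = -0.06  # model corrected downwards (invented size)
    obs_correction   = +0.07  # observations corrected upwards (invented size)

    print(f"Before: model {model_trend:.2f} vs observed {observed_trend:.2f}")
    print(f"After:  model {model_trend + model_correction:.2f} "
          f"vs observed {observed_trend + obs_correction:.2f}")

    # 0.24 vs 0.24: the series now "agree", but only because each was moved
    # toward the other.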

 

