Global Warming theory is based on the idea of a measurable global temperature. The first problem with that is that some stations used to measure it are poorly sited. The second, discussed here, is that to adjust for these and other biases, the data is “corrected”, and this correction accounts for over half the claimed warming in the last 100 years.
Physics is based on accurate measurement – for example Einstein’s Special Theory of Relativity explained the strange observation that the speed of light is constant, regardless of the motion of its source or observer. That measurement was made by Michelson and Morley. Actually, they found a very small residual effect, but:
Although this small “velocity” was measured, it was considered far too small to be used as evidence of aether, and it was later said to be within the range of an experimental error that would allow the speed to actually be zero.
So accurate measurements are quite central to understanding of the workings of the universe.
Measuring climate is much more complicated than measuring the speed of light.
First we have to decide on the definition of “global temperature”, and a sensible person might decide that it’s approximated by the average of a regular grid of measurements covering the entire surface of the earth.
Weather is local, so the finer the grid, the more accurate the average. But the grid isn’t very fine – the entire US has only about 1,200 measurement points, while at any given time there will be many more than 1,200 distinct weather events in progress, so some will be missed. The climate modelers compensate by interpolating to fill in the gaps.
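Gap-filling of this kind is usually some form of spatial interpolation. Here is a minimal sketch of one common approach, inverse-distance weighting – the station coordinates and readings below are invented for illustration, not real data:

```python
import math

# Hypothetical station readings: (latitude, longitude, temperature in F).
# Values are illustrative only.
stations = [(40.0, -105.0, 55.2), (41.0, -104.0, 53.8), (39.0, -106.0, 57.1)]

def idw_estimate(lat, lon, readings, power=2):
    """Estimate temperature at an unsampled grid point as a
    weighted average of nearby stations, weighting each station
    by the inverse of its distance raised to `power`."""
    num = den = 0.0
    for s_lat, s_lon, temp in readings:
        d = math.hypot(lat - s_lat, lon - s_lon)
        if d == 0:
            return temp  # grid point sits exactly on a station
        w = 1.0 / d ** power
        num += w * temp
        den += w
    return num / den

# Fill in a grid cell that has no station of its own:
estimate = idw_estimate(40.5, -105.5, stations)
```

The estimate is always bounded by the surrounding readings – which is exactly the point being made in the text: it is a plausible guess, not a measurement.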
Then there are inaccessible areas – seas, deserts, and ice caps. These have nothing like the density of coverage of the US, so more guesses need to be made.
And then how can we compare measurements if the measuring device is moved, or its surroundings change – for example because a gas station is built beside it? That’s another correction.
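A simplified version of that kind of correction is a step-change adjustment: given a known move date, shift the later part of the record so its average lines up with the earlier part. A toy sketch with invented numbers (the readings and the 2 F jump are made up for illustration):

```python
# Hypothetical monthly readings from one station; a site change at
# index 6 introduces a spurious jump of about +2 F. Values invented.
readings = [50.1, 50.3, 49.8, 50.0, 50.2, 49.9,   # before the move
            52.0, 52.3, 51.9, 52.1, 52.2, 52.0]   # after the move
move_index = 6

def adjust_for_move(series, break_idx):
    """Remove the step change at break_idx by shifting the later
    segment so its mean matches the mean of the earlier segment."""
    before = series[:break_idx]
    after = series[break_idx:]
    offset = sum(after) / len(after) - sum(before) / len(before)
    return before + [t - offset for t in after]

adjusted = adjust_for_move(readings, move_index)
```

Note the built-in assumption: the adjuster must decide how much of the jump was the move and how much was real warming. This sketch attributes all of it to the move – a different choice yields a different “record”.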
Satellites help, but they’ve only been around for about 50 years, so they tell us nothing about the long term. Plus they measure atmospheric as well as surface temperature. And anyway, they show no warming in the southern hemisphere, and only a bit in the north.
So we’re forced back to those surface stations and their pesky corrections. The next one we have to deal with is time of day. If we take simultaneous measurements at a grid of points across the globe, some will be at high noon, some at midnight, and the rest in between.
Then there’s time of year. It’s hot now in the northern hemisphere, but it’ll be cold in 6 months because of the tilt of the earth’s axis.
So to compare temperature measurements across our grid, we need to know exactly when each was taken – time (to the second) and date – and average over the full 24-hour daily and 365-day yearly cycles. But the data links don’t allow 60×60×24×365 measurements per sensor per year, so again we need to interpolate to fill in the gaps.
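The usual dodge for both the time-of-day and time-of-year problems is to work in anomalies: each station is compared only with its own long-term average for the same month, so a desert noon is never averaged directly against a polar midnight. A toy sketch with invented numbers (station names and values are made up):

```python
# Invented example: two stations with wildly different climates.
# Comparing raw readings mixes climate with change; anomalies
# compare each station only with its own history.
baseline = {"desert": 95.0, "ice_cap": -10.0}   # long-term July means, F
this_july = {"desert": 96.2, "ice_cap": -8.9}   # this year's July means, F

anomalies = {name: this_july[name] - baseline[name] for name in baseline}
global_anomaly = sum(anomalies.values()) / len(anomalies)
# Both stations are roughly 1 F above their own normals, so the
# average anomaly is about 1 F even though the raw readings
# differ by over 100 F.
```

Of course, this only shifts the problem: the baseline itself has to be built from the same sparse, gap-filled record the rest of the piece describes.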
The claimed change in world temperature since 1900 is under 1 degree F, and this correction accounts for over half of it. And the shape of the correction curve is exactly the one that’s used to prove that the warming is man-made: taking off in the 1920s as the world industrialized.
This is the quality of data that’s being fed to a bunch of computer climate models that can’t predict tomorrow’s weather.
And these models are being used by the Climate Changers to justify de-industrializing our societies and preventing the poor people of the world from rising out of poverty and disease.
Michelson and Morley will be spinning in their graves.