how temperatures are measured
By Dr J Floor Anthoni (2010)
Measuring temperature should be one of the simplest scientific exercises, one that a primary school student could carry out to full satisfaction. It is therefore a surprise that it becomes a major problem to do it right, in such a way that temperatures all over the world can be compared and stored in a database.
Today there are still two temperature scales in use: Fahrenheit (previously in the UK, still in the USA today) and Celsius (the rest of the world). In science the Fahrenheit scale has been replaced by the Celsius scale (formerly called Centigrade), and later complemented by the Kelvin scale, which has identical one-degree steps.
Temperature is an important quality in daily life, science and industry. Just about all processes depend on temperature because heat makes molecules move or vibrate faster, resulting in faster chemical reactions. Heat is wanted and wasted, and so is cold. When substances are cold, the processes within proceed more slowly, as in chilled or frozen foods. It does not surprise therefore that many ways have been invented to measure and control temperature.
|Based on the known expansion of a substance
When a substance (solid or liquid or gaseous) is heated, it extends or expands (with few exceptions). When such an extension can be seen, a thermometer can be made. Substances with high expansion coefficients are of course most suitable but there are other requirements.
The mercury thermometer is the classical thermometer, based on the known expansion of mercury, a liquid metal. Its principle is simple: a (relatively large) volume of mercury inside a rigid glass 'bulb' is warmed and expands into a narrow capillary tube of rigid glass. The larger the bulb and the smaller the capillary, the more sensitive the instrument becomes. Medical mercury thermometers are capable of measuring to a tenth of a degree Celsius. The mercury thermometer has the following properties:
+ mercury expands easily
+ it conducts heat easily, being a liquid metal
+ it is silvery opaque and clearly visible
+ it does not stick to glass
+ a minimum-maximum thermometer can be made with it
+ it has a high boiling point (357ºC) and can thus be used for high temperatures
- it freezes at -39ºC and this could cause the bulb to crack
- it is relatively expensive
- it is considered an ecological hazard, even though liquid mercury is harmless
The alcohol thermometer is also widely used, with the following properties:
+ it expands easily, even more than mercury
- it is not a good conductor of heat
+ it can be coloured in any colour to be easily visible
- it has a low boiling point of +78ºC
+ it has a low freezing point of -112ºC and is suitable for low temperatures
+ it is inexpensive
- it wets glass and gives a less precise readout
+ it is not harmful to the environment
The Six's maximum and minimum thermometer is a clever use of an alcohol bulb thermometer with some mercury in its capillary, topped up with more alcohol and ending in an empty bulb with some vacuum. Because mercury is so dense, a magnetic metal needle will float on it, and can be pushed along against some friction (a metal back plate). At the maximum temperature the furthest needle will stay behind, held by the metal backing plate. Likewise, at the minimum temperature the closest needle will stay behind. After reading the thermometer, the two needles can be re-set (drawn back onto the mercury level) with an external magnet, or by pushing the metal back plate away from the magnetic needles, which then descend by the pull of gravity. The Six's thermometer has the advantages and disadvantages of both mercury and alcohol thermometers. But its capillary must be wide enough to hold the metal floating pins, which means that it cannot be read very accurately (0.5ºC is difficult).
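The bulb-and-capillary trade-off described above (large bulb, narrow bore) can be put in rough numbers. This is a minimal sketch with illustrative dimensions; the volumetric expansion coefficient of mercury is taken as a nominal 1.8e-4 per degree C:

```python
# Sketch: how bulb volume and capillary bore set a liquid-in-glass
# thermometer's sensitivity (column rise per degree of warming).
# Assumed values: beta ~ 1.8e-4 /degC for mercury; dimensions illustrative.
import math

def column_rise_mm_per_degC(bulb_volume_mm3, capillary_diameter_mm, beta=1.8e-4):
    """Rise of the liquid column (mm) for a 1 degC warming."""
    bore_area = math.pi * (capillary_diameter_mm / 2) ** 2  # mm^2
    return bulb_volume_mm3 * beta / bore_area

# A 300 mm^3 bulb feeding a 0.1 mm bore gives several mm of rise per
# degree, which is why tenths of a degree become readable:
print(round(column_rise_mm_per_degC(300.0, 0.1), 1), "mm per degC")
```

Doubling the bulb volume or halving the bore diameter (which quarters the bore area) both increase the sensitivity, exactly as the text describes.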
Please note that bulb thermometers are sensitive to outside pressure
and are thus less suitable for deep sea temperature measurements, unless
they are encased inside a rugged mantle.
still to do: drawing of these thermometers
The industrial bulb thermometer consists of a relatively large copper bulb with long capillary tube that can be bent and guided through the innards of an appliance. At its end it has a tiny pressure sensor (manometer) which operates an electrical switch. With a screw its setting can be altered. These thermo-controllers are extensively used in air conditioners, washing machines and other appliances.
A metal spring thermometer can be made by coiling a metal strip with an indicator attached to its loose end. When the strip expands, the coil unwinds somewhat, which moves the indicator. This kind of thermometer is useful where a wide range of temperatures needs to be measured with low accuracy, as in cooking food and for ovens.
The bi-metal thermometer is based on the difference in extension
between two metal strips, sandwiched together and riveted or spot-welded
at both ends. This causes the strip to bend when temperature changes. The
strip can be bent, folded or coiled to amplify its effect. Bi-metal thermometers
are extensively used in temperature controllers to switch electrical devices
like warmers and coolers on or off. They are less suitable for absolute
temperature measurement. Some bi-metal thermometers are dimpled to give
a click-clack effect, a positive transition at a certain temperature (click),
but with hysteresis (lagging behind) when clacking back.
Temperature also makes the atoms in conductors like metals vibrate faster, which impedes the moving electrons, thereby changing the conductor's resistance.
The platinum resistance thermometer is based on the resistance of platinum changing precisely with temperature. The change in resistance can be measured with an electronic circuit, amplified as an electrical signal and shown on a voltage indicator. To minimise external influences like supply voltage variations, a 'bridge' circuit is used, which essentially measures the difference in voltage between the platinum resistance and another known resistance. Because platinum is a noble metal, the thermometer is very stable while able to operate over a very wide range of temperatures. For ultimate precision, linearising circuits are applied, and the 'known' resistor may be kept at a known temperature.
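As a sketch of how such a reading is linearised, the standard Callendar-Van Dusen relation for a Pt100 sensor can be inverted (IEC 60751 coefficients, valid above 0ºC); the bridge and amplifier electronics are omitted here:

```python
# Sketch: converting a Pt100 resistance reading to temperature with the
# Callendar-Van Dusen equation, R(T) = R0 * (1 + A*T + B*T^2),
# using the standard IEC 60751 coefficients (valid for T >= 0 degC).
import math

R0 = 100.0        # ohms at 0 degC for a Pt100
A = 3.9083e-3
B = -5.775e-7

def pt100_temperature(resistance_ohm):
    """Solve the quadratic B*T^2 + A*T + (1 - R/R0) = 0 for T (degC)."""
    return (-A + math.sqrt(A * A - 4 * B * (1 - resistance_ohm / R0))) / (2 * B)

print(round(pt100_temperature(100.0), 2))   # 0.0 degC at the nominal point
print(round(pt100_temperature(138.51), 2))  # about 100 degC
```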
The thermocouple thermometer is based on the difference in conductivity
(electron mobility) between two metals, brought into contact with one another
or spot-welded together. When two dissimilar conductors are brought together,
a voltage difference occurs, which can be measured. When warmed, the voltage
increases due to a higher electron mobility. Thermocouple thermometers
can measure a large range of temperatures and are very stable. They are
also independent of the contact area, and are thus easy to make. They are
also insensitive to outside pressures. However, thermocouple junctions occur
in pairs, and one of them (the reference or 'cold' junction) must be kept at a constant known temperature.
When thermocouples are stacked in series, their sensitivity increases proportionally; such a stack is known as a thermopile. Thermopiles can be used for measuring heat flow.
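The need for a known reference junction can be sketched in a few lines. This assumes a constant, linearised type-K sensitivity of roughly 41 microvolt per degree C; real instruments use standard polynomial tables, so treat this as an illustration only:

```python
# Sketch: cold-junction compensation for a thermocouple, assuming a
# linearised type-K sensitivity of about 41 uV/degC (an approximation).
S = 41e-6  # volts per degC, assumed constant here

def thermocouple_voltage(t_hot, t_ref):
    """Voltage produced between a hot junction and a reference junction."""
    return S * (t_hot - t_ref)

def hot_junction_temperature(v_measured, t_ref):
    """Recover the hot temperature: the reference temperature is added back."""
    return v_measured / S + t_ref

v = thermocouple_voltage(100.0, 25.0)  # ~3.1 mV for a 75 degC difference
print(round(hot_junction_temperature(v, 25.0), 1))  # 100.0
```

A thermopile of n junction pairs in series would simply produce n times this voltage, which is why stacking increases sensitivity proportionally.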
The thermistor thermometer is based on the conductivity of a semiconductor, which is quite sensitive to temperature. So it acts like a resistance thermometer. Unfortunately the resistance change is not linear and can be corrected only to some degree. It also has a very limited range. Thermistor thermometers are suitable for measuring the temperature of living organisms, like humans. They can be made rather small (less than 1mm).
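The non-linearity mentioned above is usually handled with the Beta-equation approximation. The part values below (10 kilo-ohm at 25ºC, Beta = 3950 K) are typical for a common NTC thermistor but are assumptions for this sketch:

```python
# Sketch: the non-linear response of an NTC thermistor via the common
# Beta-equation approximation (temperatures in kelvin):
#   1/T = 1/T0 + (1/Beta) * ln(R/R0)
# Assumed part: 10 kohm at 25 degC, Beta = 3950 K.
import math

R0, T0, BETA = 10_000.0, 298.15, 3950.0

def ntc_temperature_c(resistance_ohm):
    inv_t = 1.0 / T0 + math.log(resistance_ohm / R0) / BETA
    return 1.0 / inv_t - 273.15

print(round(ntc_temperature_c(10_000.0), 2))  # 25.0 at the nominal point
# Resistance falls as temperature rises (NTC = negative temperature
# coefficient), and the response is clearly non-linear:
print(round(ntc_temperature_c(5_000.0), 1))
```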
Infra-red thermometers measure the infra-red (IR) radiation of substances, and therefore do not need to be in direct contact with them. But the measured object must be warmer than the infra-red detector, so they are most suitable for measuring high temperatures at a safe distance. By cooling the IR detector to a known temperature, lower temperatures, like those of living organisms, can also be measured. Note that the CO2 in air absorbs IR radiation, which would limit their use, but manufacturers work around this by excluding the CO2 absorption band. The accuracy of IR thermometers is limited.
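The physics behind such an instrument is the Stefan-Boltzmann law: total thermal radiance grows with the fourth power of absolute temperature, so a measured radiance can be inverted to a temperature. A minimal sketch (ignoring band filtering and emissivity calibration, which real instruments must handle):

```python
# Sketch: inverting the Stefan-Boltzmann law, W = emissivity * sigma * T^4,
# to get a "brightness" temperature from measured thermal radiance.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def brightness_temperature_k(radiance_w_m2, emissivity=1.0):
    return (radiance_w_m2 / (emissivity * SIGMA)) ** 0.25

# A black surface at 300 K (about 27 degC) emits ~459 W/m^2; inverting
# the measured radiance recovers the temperature:
w = SIGMA * 300.0 ** 4
print(round(brightness_temperature_k(w), 1))  # 300.0
```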
Passive infra-red (PIR) detectors also detect warmer-than-air objects, but they are used for detecting movement of such objects, and not their precise temperature.
|The Stevenson Screen
The Stevenson screen was designed by Thomas Stevenson (1818-1887), a British civil engineer, in order to measure air temperature more accurately, rather than side effects like solar irradiation heating up the thermometers. To reflect heat back, it is painted white, though reflective aluminium would have been better still. It has louvered sides to let the air through but not the sunlight. Once it became an accepted standard, the Stevenson screen spread all over the world, allowing temperatures to be compared wherever they are measured.
A lot of thought and experience went into its design: the door swings down rather than to one side so that the wind won't catch it on windy days and rip it off the hinges, and it opens facing north, to keep the sun from shining directly on the thermometers while they are being read.
Inside it one finds two normal thermometers (alcohol for cold areas, mercury for warm places), but one of these has its bulb wetted by a wick soaked in a bottle of water. This wet bulb thermometer gives an indication of evaporation, because evaporation of water causes cooling. There is usually also a max-min thermometer. The thermometers are placed such that they can be read with ease and replaced with minimum effort.
An important consideration is also that the louvered box stands a fixed
distance above the ground, for least interference with low objects that
may impede wind flow (and snow).
|Temperature reading errors
Suppose we have stations with the finest thermometers inside the most standard Stevenson screens and located in rural areas, away from urban disturbances, then surely, readings must always be accurate? They are not, for various reasons:
In a paper by Patrick Frank (referenced below), scientists are reminded of the natural uncertainty (or inaccuracy) in thermometer measurements, arising from reading errors, instrument errors, time-of-day errors, poor location and short-term weather fluctuations. Together these create a band of almost 1 degree C around observations. In scientific terms, it means that it cannot be said with certainty that the world has warmed since 1880. Draw a horizontal line from just above 0 on the left to the right and it will traverse the grey envelope. In the words of the author:
"The ±0.46ºC lower limit of uncertainty shows that between 1880 and 2000, the trend in averaged global surface air temperature anomalies is statistically indistinguishable from 0ºC at the 1-sigma level [half the width of the grey envelope]. One cannot, therefore, avoid the conclusion that it is presently impossible to quantify the warming trend in global climate since 1880."
Frank, Patrick (2011): Uncertainty in the Global Average Surface Air Temperature. Multi-Science vol 21/8. http://multi-science.metapress.com/content/c47t1650k0j2n047/?p=7857ae035f62422491fa3013c9897669&pi=4 (not free).
|What do we measure?
What do we measure with Stevenson screen meteorological thermometers? The problems with temperature measurements do not end with the ones described above, because the real question is what these instruments actually measure. It is claimed that they measure Earth's surface temperature, but is that really so? What do the maximum and minimum temperatures tell us? Is the day's average equal to the midpoint between maximum and minimum? The graphs show some of the problem.
A day begins with the blue curve of net sunlight, starting just before six in the morning and ending just after six in the evening (apparently in spring). It doesn't take long before the air begins to warm too (sensible heat, orange) due to the warming of the surface, and later still some evaporation happens (latent heat, cyan). But watch what infrared out-radiation does (net IR, magenta), shown upside down because it goes out rather than in. It increases somewhat during the day and is still present at night, its total area equalling that of sensible heat (conduction and convection). In other words, the idea of infra-red out-radiation from the surface is only half supported by measurements. The part that does radiate is soon absorbed by air molecules and converted into sensible heat.
|This graph shows measured temperatures during a single year. MSAT means Meteorological Surface Air Temperature, the temperature inside the Stevenson screen. It has two outcomes: Min MSAT, the minimum temperature (black), and Max MSAT, the maximum (magenta). The average between these is considered the surface temperature for the global temperature datasets. But as you can see, it does not represent the actual surface temperature, 1.5m lower, shown in blue (Max) and yellow (Min). The average between these two is considerably larger. Also note that the Min MSAT follows the minimum surface temperature and that Max MSAT comes close to the real average.|
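The midpoint convention itself is worth testing. A toy example with hypothetical hourly readings shows that the (max+min)/2 value used in the global datasets need not equal the true daily mean, because the diurnal cycle is asymmetric:

```python
# Sketch: (max+min)/2 "midrange" versus the true daily average for a
# hypothetical, asymmetric diurnal cycle (24 hourly readings in degC).
temps = [5, 4, 4, 3, 3, 3, 4, 6, 9, 12, 14, 16,
         17, 18, 18, 17, 15, 12, 10, 8, 7, 6, 6, 5]

midrange = (max(temps) + min(temps)) / 2     # what the datasets use
true_mean = sum(temps) / len(temps)          # what a full record would give
print(midrange, round(true_mean, 2))         # the two disagree
```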
Roy Clark (2010): What surface temperature is your model really predicting? http://hidethedecline.eu/media/BLANDET/What%20Surface%20Temperature%20V2_R%20Clark_9%2020%2010.pdf
Roy Clark (2010): It Is Impossible For A 100 ppm Increase In Atmospheric CO2 Concentration To Cause Global Warming. http://venturaphotonics.com/GlobalWarming.html
|Urban Heat Islands
It is human nature to change one's environment for maximum comfort, which means shutting out the nasty aspects of weather like rain, cold wind and intolerable heat. So where people live, one finds wind breaks, shading trees, houses, roofs, concrete, parking lots, roads, air conditioners, cars, airplanes, all contributing to a change in air temperature. And they all cause extra heat. Where Stevenson screens once stood isolated in a meadow, over time they find themselves surrounded by civilisation, causing the air temperature to rise. This is called the Urban Heat Island effect, which can corrupt temperature data substantially.
|This image (courtesy Anthony Watts) shows the urban heat island effect over Reno, California, USA before midday. The measured temperature varies from 47 to 57ºF (a spread of about 5ºC). So the question is: what is THE temperature of Reno? Is it the average (51) or the minimum (47)? Clearly, the UHI causes a formidable difference between cities and rural places, and more so with bigger cities. Its main problem lies in its unpredictability from place to place and over time.|
|Tokyo with its 18 million inhabitants and massive urbanisation and transport systems, has a very significant UHI signature, as shown in this graph (from Anthony Watts). It has increased by a massive 3ºC in the past century and is still increasing further. By comparison nearby Hachijo island which has also suffered some urbanisation, shows a modest temperature increase of less than 0.5ºC in a century. Which of the two stations would you exclude from a world temperature database? Guess what the people of Tokyo are more interested in? Note also that temperature swings (a decadal cycle) are larger at Hachijo, perhaps caused by swings in sea temperature.|
|The graph shown here was derived from 47 counties in California, averaging their temperature trends for the period 1940-1996 and plotting them against their population size. Rural stations on left and urban stations on right. From the data points a straight line can be drawn which would cross the zero temperature trend. Also shown on this graph are the six stations "X" used by NASA GISS from which global averages are calculated. As can be seen, five out of six are located where a significant Urban Heat Island (UHI) effect is experienced, of about 0.6 degrees. Not shown is the historical growth of these counties over the 56 years, but it is evident that much of 'global warming' consists of the UHI. Many similar studies exist, all consistently showing that UHI seriously pollutes the instrumental record.|
|In 1996 Goodridge grouped Californian counties by population size and obtained these three temperature curves for the 20th century, using standard temperature datasets. Once more it showed that population density (UHI) is the main contributor to 'warming'.|
On a daily basis, 1600 weather balloons are released from 800 stations, usually at the same time: 0:00 UTC and 12:00 UTC. The 2m diameter rubber latex balloon is filled with hydrogen gas. Its mission is to measure temperature, relative humidity and pressure, which are used for weather forecasting and observation. Modern weather balloons can now also measure position and wind speed by using GPS positioning.
|The advantage of weather balloons is that they truly measure the air's temperature, unaffected by Urban Heat Island effects. Satellite temperature measurements also have this advantage, but cannot measure over a range of altitudes. This graph compares the three methods over a period of 20 years. Note how balloons and satellites agree, and how the surface temperatures show an urban heat island effect of some +2 degrees. Not shown is how regular adjustments aim to bring these measurements into agreement. For instance, the starting point in this graph has been aligned this way, and perhaps 1998 as well.|
NOAA National Weather Service Radiosonde Observations
http://www.webmet.com/ Meteorological Resource Centre. Met Monitoring Guide: http://www.webmet.com/met_monitoring/toc.html chapter 9.1.2
Ocean surface temperatures have been measured by ships for several centuries. First it was done by collecting surface water in a bucket while steaming on, but later the engine's cooling-water inlet was used. Unfortunately this made a difference, because the water inlet sits at some depth under water. Today this may serve as an advantage, because satellites can measure only the top few centimetres of the sea, since infrared radiation is rapidly absorbed by water. Because water continually evaporates from the sea, the surface film is somewhat colder than a few metres down. This map from Reynolds (2000) shows where the ships' tracks are, and that their measurements are in no way representative of the entire oceans.
|The graph shows both land and ocean temperatures from thermometers, since 1880. As can be seen, the land temperature rises more steeply than the sea temperature, most likely caused by the Urban Heat Island effect. Even so, both follow similar oscillations; a steep short decline followed by a long slow incline. The sea warms by about 0.5 degrees per century whereas the land warms by about 1.2 degrees per century. Compare this with the UHI effect of Tokyo above. What is omitted from this graph is the steep decline before 1880.|
|Ocean temperature buoys
Since the year 2000, and benefiting from technological advancement, an aggressive programme was begun to measure the oceans entirely, with tide gauge stations, moored buoys, drifters and ships of opportunity. The ARGOS satellite system circles Earth to collect the data, while the AOML has responsibility for the logistics of drifter deployment and quality control of the resulting data. The map shows the locations of ARGOS drifters from the USA (blue) and UK (red/orange). Of course their positions change daily.
A main advantage of the ocean drifters is that they collect data of the air as well as the sea at various depths, and entirely without human error.
|A drifting buoy is an inexpensive, autonomous device which is deployed by ships of opportunity. Distributed throughout the oceans of the world, it is designed to drift freely with the ocean surface currents, has an average lifetime of more than a year, and can measure sea surface temperature, surface currents, and sea level pressure. The buoy is a round sphere of about 0.5m diameter, from which an array of cables and sensors hangs. It measures temperature, salinity and ocean currents. The collected data are then transmitted back to shore via satellite. In July 1995, data were logged from more than 750 buoys.|
An expendable bathythermograph (XBT) is another inexpensive device which is also deployed by ships of opportunity. An XBT is a small instrument that is dropped into the ocean from a ship. During its descent at a constant rate, an XBT measures the temperature of the seawater through which it descends, and sends these measurements back to the ship through two fine wires that connect the ship to the instrument. XBTs generally have a depth limit of 750 meters, but some reach depths of 1800 meters. Many ships relay summaries of the vertical profiles of temperature back to the shore by satellite. Meteorological centers throughout the world receive data from both the XBTs and the buoys via a global communications network, and use it to prepare the analyses that are essential for forecasts of weather and climate. The complete vertical temperature profiles are sent to data collection centers after the ships reach port. The Upper Ocean Thermal Center at AOML has responsibility for quality control of an average of 2,000 XBTs per month.
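A subtlety of the XBT is that depth is not measured at all: it is inferred from elapsed time via a fall-rate equation. A minimal sketch, using nominal published fall-rate coefficients for one common probe type (treat the exact values as assumptions here):

```python
# Sketch: how an XBT assigns a depth to each temperature sample.
# Depth is inferred from elapsed time with a fall-rate equation,
#   z(t) = a*t - b*t^2,
# where a ~ 6.472 m/s and b ~ 0.00216 m/s^2 are nominal published
# coefficients for one common probe type (assumed here).
A_FALL, B_FALL = 6.472, 0.00216

def xbt_depth_m(seconds):
    """Inferred depth (m) after a given descent time."""
    return A_FALL * seconds - B_FALL * seconds ** 2

print(round(xbt_depth_m(60.0), 1))  # depth after one minute of descent
```

The quadratic term reflects the probe slowing down as wire pays out, so any error in these coefficients translates directly into a depth error in the temperature profile.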
The latest drifters are semi-autonomous, capable of making deep dives to 200m, drifting there for 9 days, and surfacing at intervals to transmit their data and recharge their batteries. Over 3000 of these autonomous drifters have been released so far. As their technology becomes more sophisticated, they could perhaps at some time also measure clarity, light extinction with depth, pH, pCO2, plankton concentrations, oxygen and carbon fluxes, and more.
|Satellite Sea Surface Temperatures (SST)
Since satellites began to be used for measuring environmental variables (GOES), both land and sea temperatures have been measured with good accuracy. The map here shows average ocean temperatures for a given year. It is important to remember that this represents only the very thin surface of the oceans.
The advantage of satellite measurements is that they truly cover the whole of the world. Their disadvantage is that they cannot measure absolute temperatures, and that they vary slowly with time (drifting).
http://www.aoml.noaa.gov/general/ Atlantic Oceanographic and Meteorological Laboratory (AOML).
http://www.aoml.noaa.gov/phod/dac/gdp.html Global drifter program
http://www.aoml.noaa.gov/phod/dac/2006_gdp_report.pdf An impressive report on the ocean drifter programme (PDF) slideshow.
The places where thermometers are placed were never selected with a view to collecting a representative set of temperatures from which the world's average could be calculated. They are simply located where people live, and that introduces the urban heat island effect. The two maps below show that the world is not adequately or evenly covered. To make matters worse, many temperature stations are quite recent and do not have a long-term record. Others do not satisfy stringent quality requirements.
|Averaging the temperature data
From the above maps one can see that it is impossible to arrive at an average temperature for every square on the grid. Besides, the squares become smaller towards the poles (but this can be accounted for). Yet this is precisely what NASA (USA) and the Climatic Research Unit (UK) have done, with disastrous results. These results were then used in the IPCC reports as if they were reliable.
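The 'accounting for' of shrinking grid squares is typically done by weighting each cell with the cosine of its latitude, since the area of a fixed longitude-latitude square shrinks that way towards the poles. A minimal sketch with hypothetical grid-cell values:

```python
# Sketch: area-weighted averaging of gridded temperatures, weighting each
# cell by cos(latitude) to account for grid squares shrinking poleward.
import math

def area_weighted_mean(temps_by_lat):
    """temps_by_lat: list of (latitude_deg, temperature) grid-cell values."""
    weights = [math.cos(math.radians(lat)) for lat, _ in temps_by_lat]
    total = sum(w * t for w, (_, t) in zip(weights, temps_by_lat))
    return total / sum(weights)

cells = [(0.0, 27.0), (45.0, 12.0), (80.0, -15.0)]  # hypothetical cells
print(round(area_weighted_mean(cells), 2))  # pulled towards the tropical cell
```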
To make matters worse, these scientists have been 'adjusting' the original data to fit their expectations. It is important to remember that 'world average' temperatures mean less than a good time series of a single remote station. It also implies that the evidence from thermometers to support 'global warming', is entirely unreliable.
There is also a thermodynamic 'finer point': if one wishes to know the effective out-radiation, which is proportional to the fourth power of absolute temperature (T x T x T x T), then this should be taken into account, making the effective temperature noticeably larger than the average temperature.
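This 'finer point' is easy to demonstrate. Because out-radiation goes as the fourth power of absolute temperature, the radiatively effective temperature of a region with mixed temperatures always exceeds the plain arithmetic average (the values below are hypothetical):

```python
# Sketch: effective (fourth-power) temperature versus arithmetic average.
# Hypothetical sample of -20, 0, 20 and 40 degC, converted to kelvin:
temps_k = [253.15, 273.15, 293.15, 313.15]

arithmetic_mean = sum(temps_k) / len(temps_k)
# The temperature that would radiate the same total power:
effective = (sum(t ** 4 for t in temps_k) / len(temps_k)) ** 0.25
print(round(arithmetic_mean, 2), round(effective, 2))  # effective is larger
```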
Finally, were average temperatures to have any meaning, it should also be related to the heat content where it was measured. Ice caps and oceans have large latent heat, whereas deserts have low latent heat. Thus in climatology, one should be very cautious about 'temperature averages'.
For various known and unknown reasons, the chemical elements found on Earth have 'sister' elements or isotopes (Gk: isos = equal; topos = place; as in the same place in the periodic table of elements). Isotopes behave chemically alike but differ in mass (different number of neutrons). Some isotopes are unstable and fall apart by radioactive decay (alpha, beta or gamma radiation).
One of the best known isotopes is radioactive carbon-14, which is created in the atmosphere from the element nitrogen. Because of its beta-decay (emitting an electron) and half-life of about 5,730 years, it is extensively used in radio-carbon dating of biological substances (wood, shell, hair, etc.). Carbon-14 measures time rather than temperature.
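The dating calculation follows directly from the decay law: every half-life, half of the remaining carbon-14 disappears, so the fraction left fixes the age. A minimal sketch using the accepted half-life of 5,730 years:

```python
# Sketch: radio-carbon dating from the decay law,
#   age = half_life * log2(1 / remaining_fraction),
# using the accepted carbon-14 half-life of 5730 years.
import math

HALF_LIFE_C14 = 5730.0  # years

def radiocarbon_age_years(remaining_fraction):
    """Age from the fraction of the original carbon-14 still present."""
    return HALF_LIFE_C14 * math.log2(1.0 / remaining_fraction)

print(round(radiocarbon_age_years(0.5)))   # 5730  (one half-life)
print(round(radiocarbon_age_years(0.25)))  # 11460 (two half-lives)
```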
Note that the correct notation for the isotope carbon-14 is ¹⁴C, with the mass number written as a superscript before the element symbol.
Tip: for the ºdegree symbol hold the ALT key while typing 167 (ALT+167)
Similarly ‰ = ALT+0137 and the ñ in La Niña = ALT+164. Micro µ = ALT+0181 Beta ß = ALT+0223
Beryllium is the fourth element in the Periodic Table, after lithium and before boron. It has an atomic mass of 9, made up of 4 protons and 5 neutrons. It can be made as a fragment from heavier elements (nitrogen-14, oxygen-16) by cosmic bombardment (spallation), which expels protons and neutrons. Also, cosmic radiation itself contains beryllium. Radioactive beryllium-10 has a half-life of 1.51 million years, and decays by beta decay to stable boron-10 with a maximum energy of 556.2 keV.
|This figure shows two different proxies of solar activity during the last several hundred years. In red is shown the Group Sunspot Number (Rg) as reconstructed from historical observations by Hoyt and Schatten (1998a, 1998b). In blue is shown the beryllium-10 concentration (10E4 atoms/(gram of ice)) as measured in an annually layered ice core from Dye-3, Greenland (Beer et al. 1994). Beryllium-10 is a cosmogenic isotope created in the atmosphere by galactic cosmic rays. Because the flux of such cosmic rays is affected by the intensity of the interplanetary magnetic field carried by the solar wind, the rate at which Beryllium-10 is created reflects changes in solar activity. A more active sun results in lower beryllium concentrations (note inverted scale on the blue plot). Note that the sun's variability is much more than suggested by the satellite record (the solar constant).|
Oxygen-18 or 18-O has two extra neutrons on top of the usual 8 (10n + 8p). It is a somewhat mysterious isotope that occurs in concentrations of around 0.2% and is stable (not radioactive). Practical measurements have shown that it correlates with temperature: higher concentrations mean lower temperatures, but the why and how remain somewhat elusive. The graph shows 18-O variations in foraminifers, which are usually found on sea bottoms in the shallow coastal zone.
Present thinking is that colder temperatures cause ice caps to expand; these are deficient in 18-O, leaving the sea more abundant in 18-O. Thus the delta-18-O measures the amount of ice in the ice caps rather than actual surface temperature. As a consequence, the 18-O signature lags many hundreds of years behind surface temperature. When Earth is cooling, water is transported through the air to the ice caps, so the time lag is maximal and the 18-O signature changes more gradually than surface temperature. When Earth is warming, ice caps melt and meltwater flows almost instantaneously back to the sea, so the warming part of the 18-O signature lags less and changes more steeply.
Scientists use the symbol delta for the Greek letter 'd', for differences in quantities.
The variations in isotopes are expressed as a percentage (%) or per-mille (‰), and calculated the way one would calculate relative profit:
profit (%) = ((sales - cost) / cost) x 100%
Likewise:
delta-18-O (‰) = ((measured value - standard value) / standard value) x 1000‰
where the standard value comes from either a standard sample (as in Pee Dee Belemnite for 13-C) or any other agreed sample.
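The delta formula above is small enough to code directly; the isotope ratio used below is an illustrative round number, not a particular laboratory standard:

```python
# Sketch: per-mille delta notation as defined above,
#   delta = ((measured - standard) / standard) * 1000 permille.
def delta_permil(r_sample, r_standard):
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample whose isotope ratio sits 0.5% below an illustrative standard
# ratio of 0.002 comes out at -5 permille:
print(round(delta_permil(0.002 * 0.995, 0.002), 1))  # -5.0
```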
Carbon-13 is a natural stable isotope of carbon and has one extra neutron (7n + 6p). It makes up about 1.1% of all natural carbon on Earth. Whereas isotopes are normally detected by mass spectroscopy, carbon-13 can be detected sensitively with Nuclear Magnetic Resonance (NMR). It is also a somewhat mysterious isotope that is preferentially avoided by plants: wherever carbon has been processed by plants, there is less 13-C. 13-C is always measured against a world standard called Pee Dee Belemnite (PDB) or similar. Belemnite is a calcium-rich deposit from the soft internal shells of ancient belemnite inkfish, with a delta-13-C agreed to be the zero base.
The diagram shows typical concentrations (almost always negative), and where they occur. Note that the modern 'grasses' (maize, sorghum, sugarcane) have a four-step photosynthetic process (C4) which is more efficient than the much more common three-step (C3) process, but requires more warmth. See our soil section for more.
12-C and 13-C can be used as temperature tracers that explain ocean circulation. Plants find it easier to use the lighter isotope (12-C) when they convert sunlight and carbon dioxide into food, so large blooms of plankton (free-floating organisms) draw large amounts of 12-C into the oceans. If those oceans are stratified (layers of warm water near the top and colder water deeper down), the water cannot circulate; thus when the plankton dies it sinks and carries the 12-C with it, making the surface layers relatively rich in 13-C. Where the cold waters well up from the depths (North Atlantic), they carry the 12-C with them. Thus, when the ocean was less stratified than today, there was plenty of 12-C in the skeletons of surface-dwelling species. Other indicators of past climate include the presence of tropical species, coral growth rings, etc.
Due to the differential uptake of 13-C in plants as well as in marine carbonates, it is possible to use these isotopic signatures in earth science. In aqueous geochemistry, the source of surface and ground waters can be identified by analysing their delta-13-C values.
However, there are some insurmountable problems with this isotope for detecting a 'human footprint' in CO2:
|13-C/18-O clumped-isotope geochemistry
There is a slight thermodynamic tendency for heavy isotopes to form bonds with each other, in excess of what would be expected. Thus the occurrence of a CO2 molecule made up of one 13-C atom, one 18-O atom and one normal 16-O atom, adding up to a molecular weight of 47 (13 + 18 + 16), is just common enough to be used to detect temperature changes.
Lab experiments, quantum mechanical calculations and natural samples (with known crystallisation temperatures) all indicate that delta-47 is correlated to the inverse square of temperature. Thus delta-47 measurements provide an estimate of the temperature at which a carbonate formed. 13-C/18-O paleothermometry does not require prior knowledge of the concentration of 18-O in the water (which the delta-18-O method does). This allows the 13-C/18-O paleothermometer to be applied to some samples, including freshwater carbonates and very old rocks, with less ambiguity than other isotope-based methods. The method is presently limited by the very low concentration of isotopologues of mass 47 or higher in CO2 produced from natural carbonates, and by the scarcity of instruments with appropriate detector arrays and sensitivities.
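The inverse-square relation means a calibration of the form Delta47 = a/T² + b can be inverted for formation temperature. The coefficients below are purely hypothetical placeholders to show the shape of the calculation, not a published calibration:

```python
# Sketch: inverting a clumped-isotope calibration of the assumed form
#   Delta47 = a / T^2 + b   (T in kelvin)
# for carbonate formation temperature. The coefficients a and b below
# are hypothetical placeholders, NOT a published calibration.
import math

A_CAL, B_CAL = 0.0392e6, 0.0266  # hypothetical a (K^2) and b

def formation_temperature_c(delta47):
    return math.sqrt(A_CAL / (delta47 - B_CAL)) - 273.15

# Round-trip check: the Delta47 of a carbonate formed at 25 degC inverts
# back to 25 degC:
d = A_CAL / 298.15 ** 2 + B_CAL
print(round(formation_temperature_c(d), 1))  # 25.0
```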
 Beryllium-10 http://www.onafarawayday.com/Radiogenic/Ch14/Ch14-3.htm
In the previous chapter we discussed isotopes used to measure temperature and, strictly speaking, these are also proxies (L: procurare = to take care of, to deal with; proxy = substitute, delegate, representative), even though they are methods rather than substitutes. Here we'll look at various other ways scientists have tried to measure past temperatures.
This graph from Globalwarming Art (after Huang & Pollack, 1998) shows a borehole temperature reconstruction (showing 1ºC warming), aligned with the trace from the instrumental record of Brohan et al. 2006 (which shows the most warming of all instrumental records, watch out!). The graph goes back some 500 years, but the further back in time (the deeper down), the larger the error margin and the flatter the curve, as details also disappear. The basis for borehole temperature measurement is the fact that rock is a very poor heat conductor, so a change in surface temperature only slowly works its way deeper down.
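The physics behind borehole thermometry can be sketched in a few lines of code: heat from a surface temperature change diffuses slowly into poorly conducting rock, so the depth profile preserves a smoothed and delayed copy of surface history. The diffusivity is a typical textbook figure for rock; the grid and duration are arbitrary choices for illustration.

```python
# Explicit finite-difference solution of the 1D heat equation,
# showing how a +1 degree surface step diffuses into rock.
kappa = 1.0e-6            # m^2/s, typical thermal diffusivity of rock
dz = 5.0                  # m, depth step
dt = 0.4 * dz**2 / kappa  # s, satisfies the explicit stability criterion
n = 100                   # grid points: 0 .. 495 m depth
T = [0.0] * n             # initial uniform temperature anomaly

years = 100
steps = int(years * 365.25 * 86400 / dt)
for _ in range(steps):
    T[0] = 1.0            # surface held +1 degree warmer from t = 0
    T = [T[0]] + [T[i] + kappa * dt / dz**2 * (T[i+1] - 2*T[i] + T[i-1])
                  for i in range(1, n - 1)] + [T[-1]]

# After a century the warming has penetrated only a hundred metres or
# so, and the signal is strongly smoothed with depth.
```

This smoothing is exactly why detail disappears further back in time: the rock acts as a low-pass filter, and inverting the profile back to a surface history is an ill-posed problem.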
|The year before (1997) the same authors (Huang & Pollack) produced a radically different graph from the same 6000 boreholes, and this one showed the Little Ice Age and the Medieval Warm Period earlier on. The 1998 publication selected 358 boreholes out of the qualifying set of 6000. What made the authors change their minds? The hockey stick was published in 1998. Coincidence? Peer pressure? Fraud?|
+ direct measurement of temperature; no proxies. The graph shows how difficult it is to make sense of borehole temperature data. In fact, it makes little sense. Researchers try to work backwards from the borehole data, using computer models, to a surface temperature record that looks plausible. This is not reliable.
Look at the grey cluster of actual measurements to notice that nearly half the samples disagree with the other half. In other words, they disprove what the others are saying. In real science one cannot average such disagreements to arrive at a single agreement. It is called nonsense.
"How many lies does one need to average to arrive at a single truth?" - Floor Anthoni
http://www.co2science.org/subject/b/summaries/boreholes.php a balanced account of various borehole measurements by various scientists.
http://www.ncdc.noaa.gov/paleo/borehole/borehole.html University of Michigan global database of boreholes.
Some of the ice masses on Earth, like those on Antarctica and Greenland, have remained for hundreds of thousands of years. An ice core is drilled with a hollow core drill, in 6m sections at a time. The technique is surprisingly difficult and has been improved over time. The ice mass consists of layers accumulated from snow on top. As layer upon layer forms, the lower layers experience pressure and compaction. At some depth the firn (loose ice and snow) becomes compacted enough that enclosed air becomes isolated. From here on the ice remains surprisingly similar in texture, with year bands, until a zone is reached where the ice 'flows' as described in part2/glaciers. From there on the age of the ice can no longer be ascertained from year bands.
Some trees grow very old, and within their stems they retain traces of ancient climates. The widths of tree rings represent growth rate, and are thought to track temperature because trees grow faster when it is warmer. But such trees depend even more on thaw, cloud level, nutrient availability, sunlight, moisture, CO2, root space, root competition and bacterial activity. A tree surrounded by larger trees receives less light. During droughts trees won't grow and may die. In other words, the widths of tree rings are poor proxies for ancient temperatures.
comments about CRU tree ring 'hockey stick' as used by the IPCC
The infamous hockey stick graph produced by Mann, Bradley & Hughes (1998), and used by the IPCC in their Third Assessment Report as the 'smoking gun' of Global Warming, has been criticised and rebutted scientifically:
McKitrick : ".. our model performs better when using highly autocorrelated noise rather than proxies to ”predict” temperature. The real proxies are less predictive than our ”fake” data."
McShane and Wyner : "We find that the proxies do not predict temperature significantly better than random series generated independently of temperature. Furthermore, various model specifications that perform similarly at predicting temperature produce extremely different historical backcasts. Finally, the proxies seem unable to forecast the high levels of and sharp run-up in temperature in the 1990s either in-sample or from contiguous holdout blocks, thus casting doubt on their ability to predict such phenomena if in fact they occurred several hundred years ago." - "Furthermore, it implies that up to half of the already short instrumental record is corrupted by anthropogenic factors, thus undermining paleoclimatology as a statistical enterprise."
Calcite or calcium carbonate (CaCO3) is a common building material for sea creatures. Because it has both carbon and oxygen, it can be used for the carbon-14 (time) and oxygen-18 (temperature) proxies.
Dripstones or stalactites (hanging down) and stalagmites (growing up from below) form where ground water drips from a ceiling. Dissolved in the groundwater are several minerals, among which dissolved limestone. As the water slowly drips down, pausing at the lowest point of the stalactite (the upper part hanging down from the ceiling), some of the water may evaporate, leaving a little limestone behind at a rate of 0.1-3mm per year. Because moisture has an annual cycle, year rings can be seen. At the bottom a stalagmite is formed, and at some time the two may meet. Dripstones are surprisingly hard. The stalagmites have a more consistent form because droplets splatter, spreading moisture more evenly.
Dissolution of limestone:
CaCO3(solid) + H2O + CO2(aq) => Ca(HCO3)2(aq)
Formation of limestone: Ca(HCO3)2(aq) => CaCO3(solid) + H2O + CO2(aq)
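As a rough plausibility check on the quoted growth rate of 0.1-3mm per year, one can compute how much drip water is needed to deposit 1mm of calcite. The dissolved-calcium concentration assumed below is hypothetical, and since each mole of Ca(HCO3)2 that reverts yields one mole of CaCO3 (the formation reaction above) while real drip water precipitates only part of its load, the answer is a lower bound on the water required.

```python
# Stoichiometric sketch: litres of drip water needed to deposit
# 1 mm of calcite over a 1 cm^2 dripstone tip.
M_CACO3 = 100.09      # g/mol, molar mass of CaCO3
RHO_CALCITE = 2.71    # g/cm^3, density of calcite
ca_conc = 2.0e-3      # mol/L of Ca(HCO3)2, an assumed concentration

# Calcite deposited per litre if the dissolved load fully precipitates:
grams_per_litre = ca_conc * M_CACO3                 # ~0.20 g/L
# Mass of a 1 mm layer over 1 cm^2:
volume_cm3 = 0.1 * 1.0                              # 0.1 cm^3
grams_needed = volume_cm3 * RHO_CALCITE             # ~0.27 g
litres_per_mm = grams_needed / grams_per_litre      # roughly 1-2 L
```

A litre or two of water per millimetre of growth is easily supplied by a year of slow dripping, which is consistent with the quoted growth rates.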
Foraminifers (L:foramen= a hole; Gk: phero= to bear; hole-bearers) are complex single-celled animals, mostly living on the sea bottom, particularly in the shallow coastal zone. They occur in a great variety of species, often in zones defined by subtle changes in living conditions. All have a hard outer skeleton made of calcite, riddled with holes through which they extend long 'hairy' arms for feeding and for moving slowly.
Corals are animal polyps that live in clear sun-lit waters in symbiosis with plant cells within their skins. They build extensive coral skeletons that join up to make coral reefs. The individual hard corals are joined up by crustose calcareous algae, technically red seaweeds, which also build limestone skeletons. As coral reefs grow, they incorporate a chemical history of the atmosphere, but their mass is too chaotic for analysis.
But there are some coral colonies that slowly grow to massive forms of several metres tall and wide, like Porites corals. These are called 'massive' corals even though their polyps remain small. Their mass is neatly ordered in growth rings like those of a tree, and can be used for analysis. One coral analysis has been dissected on this web site and is worth studying (Declining coral calcification ..).
http://en.wikipedia.org/wiki/Proxy_(climate) about climate proxies
temperature in perspective
Average global temperature has little meaning without viewing it in perspective, which is what Australian wine maker Erland Happ did from publicly available NCEP data. As a wine maker he noticed that Australia has been cooling rather than warming, and he set out on a quest to understand what was going on. He divided the world into three zones: the arctic where hardly anyone lives (blue zone), the northern hemisphere where most of the world lives (green zone), and the southern hemisphere down to where no more people are found (red zone). His results are shown in the three panels below.
A number of things strike immediately:
 Erland Happ (2011): The character of climate change, part 2. http://wattsupwiththat.com/2011/08/16/the-character-of-climate-change-part-2. Must read.
In the chapters on the Urban Heat Island and thermometer locations above, we've seen that the instrumental temperature dataset is rather primitive and not representative of global temperature. But at least the data from rural stations could have shown credible temperature trends. Unfortunately the institutions charged with collecting temperature data have been making adjustments in order to show global warming. In this chapter we'll examine how they've done that and to what extent.
As one can see, the climate data is in the hands of very few actors, which invites corruption of the data towards political ends. Fortunately much of the data is freely available (after adjustments), even though much has also been kept under wraps (CRU), as exposed by the Climategate scandal. Determined skeptics like Ross McKitrick, Stephen McIntyre, Anthony Watts, Joe d'Aleo, Fred Singer, John Daly and many others managed to show how much the temperature data has been corrupted, mainly in four invisible ways:
Q: Where would you safely store precious ice cores?
A: In the desert (UCAR, Boulder, Colorado USA)
[Ross McKitrick (Jul 2010): A Critical Review of Global Surface Temperature Data Products. For more detail about how temperature data is collected, stored and corrected, and the anomalies discovered. http://rossmckitrick.weebly.com/uploads/4/8/0/8/4808045/surfacetempreview.pdf. PDF 78pp]
|Rural USA temperature
The graph here shows average temperature over the USA from 1895 to 1996, spanning a whole century. Even though it includes urban thermometers, it shows no appreciable rise in temperature. The 1960-1970s were cooler whereas the 1930-1940s were warmer. Unanimously, rural records have shown no significant rise in temperatures. Please note that this is a very important scientific test of the AGW hypothesis, since any exception to the hypothesis (global + warming) disproves it. We may ask ourselves why the scientific method has been abandoned when it comes to global warming.
 John Daly (2006): What The Stations Say. http://www.john-daly.com/stations/stations.htm - check if you can find any that show systematic warming. Excellent world-wide database.
|Central Europe Temperature
Visit http://news.thatsit.net.au/Science/Climate/Global-Temperatures.aspx for more thermometer sites around the world, showing basically no significant warming either.
Reader please note that the scientific method protects against nonsense. It goes as follows:
Reader, the importance of the above cannot be overstated, yet somehow the scientific fraternity (brotherhood) did not adhere to its own scientific principles in the case of Catastrophic Anthropogenic Global Warming (CAGW) - an unforgivable misbehaviour.
A hypothesis is pronounced (global warming occurs due to rising CO2 levels). The consequence (prediction) is that temperatures go up (not down) as more CO2 stays in the atmosphere. In fact, by about +2ºC for 100ppm of additional CO2 (IPCC). CO2 is spread quite evenly through the atmosphere, from north to south, so all places should experience some or similar warming. In the past century we've seen CO2 increase by about 100ppm, thus the world must have warmed by +2ºC (not cooled). Indeed the IPCC temperature record comes close to this, due to the UHI and fraudulent adjustments (see below). But all rural records disagree: there is no warming, and many even show slight cooling. A temperature station does not just produce data; each is an independent 'experiment' testing the hypothesis, and its results must be seen in this light. Hundreds if not thousands of these 'experiments' falsified (proved wrong) the hypothesis. Indeed NONE of the projections (predictions) made by the IPCC have happened - enough to disprove the hypothesis on the basis of its own predictions. Thus CO2 does NOT produce warming. The hypothesis is false. End of scientific debate. The scientific method protects against nonsense.
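The arithmetic of the test described above can be sketched as follows. The linear +2ºC per 100ppm sensitivity is the figure quoted in the text (not a general result), and the rural trends are hypothetical numbers standing in for actual station records.

```python
# Hypothesis testing sketch, using the text's stated linear
# assumption of +2 degC of warming per 100 ppm of extra CO2.
def predicted_warming(co2_rise_ppm, sensitivity=2.0 / 100.0):
    """Expected warming (degC) for a given CO2 rise, assuming the
    linear sensitivity quoted in the text."""
    return co2_rise_ppm * sensitivity

# Hypothetical century-long trends from rural stations (degC/century):
rural_trends = [0.1, -0.2, 0.0, 0.3, -0.1]

expected = predicted_warming(100.0)   # the prediction for the past century
# In the author's framing, every station trend far below the
# prediction counts as a failed test of the hypothesis:
failures = [t for t in rural_trends if t < expected - 1.0]
```

With these (invented) trends every station fails the prediction, which is the shape of the argument the text makes from real rural records.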
"It doesn't take 100 scientists to prove me wrong, it takes a single fact." - Albert Einstein
"It is a typical soothsayer's trick to predict things so vaguely that the predictions can hardly fail: that they become irrefutable." - Sir Karl Popper
We'll now investigate how climate fraud was committed.
|Hushing up instrument failures
Where 'global warming' is involved, it has become common practice not to report instrument failures, particularly where such faults produce lower temperature readings. The satellite that first ignited the fury is NOAA-16. But as we have since learned, there are now five key satellites that have become either degraded or seriously compromised, resulting in ridiculous temperature readings. The Indian government was long ago onto these faults: researcher Devendra Singh tried and failed to draw attention to the increasing problems with the satellite as early as 2004, but his paper remained largely ignored outside his homeland. For at least five years and perhaps longer, NOAA's National Climatic Data Centre (NCDC) has been hushing up the faults in their satellites, which is a cardinal sin for any scientist or scientific institute. The picture shows how the scanned path failed to reproduce the landscape below, resulting in an erroneous stripy pattern, now known as the barcode. The data was automatically fed into climate records. This scandal places the entire satellite record in doubt, and the use the IPCC made of it.
Dr. Timothy Ball: “At best the entire incident indicates gross incompetence, at worst it indicates a deliberate attempt to create a temperature record that suits the political message of the day.”
 CO2insanity.com: link. climatechangedispatch.com: link.
The graph shows temperatures and their adjustments in Darwin (a smallish town) in northern Australia. The blue curve is the actual temperature, which suffered a drop in 1940, thought to be 'unusual', but happening again around 1987. The average trend of the raw data (blue) shows 0.7 degrees of cooling per century. After undocumented adjustments (black curve), the red curve was arrived at, showing warming of 1.2 degrees per century. This is a very blatant case of cooking the temperature, and many such cases have been documented from all over the world. For more information, visit http://climateaudit.org/.
|Upward adjustment of all raw data
Steven Goddard discovered that all US temperatures have been gradually adjusted upward by a whopping 0.5ºF without appropriate documentation. The reasoning behind this adjustment was entirely arbitrary: "many sites were relocated from city locations to airports and from roof tops to grassy areas. This often resulted in cooler readings than were observed at the previous sites." The graph shows the difference between what the thermometers read (RAW data), and the temperatures corrected by the USHCN. One would have expected that adjustments canceled one another out as thermometers are relocated. Could one call this fraud?
|This table shows raw and adjusted trends for the 7 important temperature stations of New Zealand. Averaging the unadjusted trends arrives at +0.08ºC per century, but after adjustment the trend becomes +0.59ºC per century. The New Zealand temperature database is managed and kept by NIWA, who have not been able to explain the adjustments since the culprit, Jim Salinger, left. For more details see http://www.climatescience.org.nz/ who are fighting for the truth. See also an overview with links: http://wattsupwiththat.com/2012/03/07/the-cold-kiwi-comes-home-to-roost/|
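The effect of the adjustments can be reproduced with simple averaging. The per-station trends below are hypothetical values chosen only so that their means match the quoted +0.08 and +0.59ºC per century; the actual station values differ.

```python
# Raw-vs-adjusted comparison in the style of the NZ table above,
# with invented per-station trends (degC/century).
raw_trends      = [0.2, -0.1, 0.1, 0.0, 0.3, -0.2, 0.26]
adjusted_trends = [0.7,  0.4, 0.6, 0.5, 0.9,  0.3, 0.73]

mean_raw = sum(raw_trends) / len(raw_trends)            # ~ +0.08
mean_adj = sum(adjusted_trends) / len(adjusted_trends)  # ~ +0.59
adjustment_effect = mean_adj - mean_raw                 # ~ +0.51
```

Nearly all of the reported trend, in other words, comes from the adjustments rather than from the thermometer readings themselves.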
The graph shown here of unadjusted (green) and adjusted (red) temperatures shows the degree of fraud involved. One cannot believe that there are other scientists willing to defend this fraud.
UPDATE 8 Oct 2010: the High Court has decided that the 'adjusted' temperature data cannot be used as an official record, and NIWA has also distanced itself: NIWA now denies there was any such thing as an “official” NZ Temperature Record, and "NZ authorities formally stated that, in their opinion, they are not required to use the best available information nor to apply the best scientific practices and techniques available at any given time. They don’t think that forms any part of their statutory obligation to pursue 'excellence'.” - what a mess, what a defeat for 'science'. link.
Please note that NZ temperatures have a large influence on the 'world average' because very few thermometers exist in the Southern Ocean. The NZ temperatures are then 'extrapolated' over a very large area. But NZ is not alone, as their Australian colleagues are doing the same.
|Australian Bureau of Meteorology (BOM) data corruption
The BOM was caught red-handed in "homogenising" Australia's temperature data, always cooling the past while warming the present (red graph). But Australian biologist Jennifer Marohasy collected actual temperatures which reveal a different story (blue graph).
Jennifer Marohasy's website: jennifermarohasy.com. Her rebuttal of BOM: http://jennifermarohasy.com/2014/05/corrupting-australias-temperature-record/
 Ira Glickstein (2011): The PAST is Not What it Used to Be (GW Tiger Tale). http://wattsupwiththat.com/2011/01/16/the-past-is-not-what-it-used-to-be-gw-tiger-tale/
|Rise and fall in thermometers
This graph shows annual mean temperature (magenta) and the number of thermometers taking part (dark blue). Thermometers were sparse before the Industrial Revolution (1850) but gradually rose in numbers, mainly in industrialised nations (Northern Hemisphere). After 1980 most were deselected in favour of automated thermometers. Note how temperatures jumped, first when thermometer numbers jumped up, and again when they dropped down.
This is a detailed view of average temperature and thermometer numbers after 1950. Note how average temperature suddenly began to look like a hockey stick. How did they do this? Mainly by promoting thermometers from warm places and demoting those from cold places (higher altitudes and remote locations).
And in the United States, Anthony Watts - in a volunteer survey of over 1000 of the 1221 instrument stations - found that 89% were poorly or very poorly sited, using NOAA’s own criteria. This resulted in a warm bias of over 1ºC. A warm contamination of up to 50% has been shown by no fewer than a dozen peer-reviewed papers, including ironically one by Tom Karl (1988), director of NOAA’s NCDC, and another by the CRU’s Phil Jones (2009). (Tom Karl and Phil Jones are at the centre of the Climategate scandal.)
|Urbanisation by selection
 Joseph D’Aleo (2009): Response to Gavin Schmidt on the Integrity of the Global Data Bases -
|Selecting warmer sites
This diagram from the reference above shows how, over time, more warmer stations were selected. Horizontal is time, over one century; vertical is the average latitude of the temperature stations used for calculating the world's temperature, i.e. their distance from the equator. One century ago their average latitude was 35 degrees, but gradually over time it changed to 20 degrees, with some inexplicable swings in between, as more southern stations were included and northern stations dropped off. Thus, by design or by accident, more and more thermometer stations from warmer places were used, and/or fewer and fewer from colder places. The result gives substantial over-all warming.
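The selection effect described above is easy to demonstrate: if the station mix drifts toward warmer sites, a naive average of absolute temperatures rises even when no single station warms at all. The station climatologies below are invented for illustration.

```python
# Hypothetical station climatologies (degC); none of them ever changes.
stations = {"high_lat_a": -5.0, "high_lat_b": -2.0,
            "mid_lat": 10.0, "tropical_a": 26.0, "tropical_b": 28.0}

def naive_average(active):
    """Simple mean of absolute temperatures over the active stations."""
    return sum(stations[s] for s in active) / len(active)

early_mix = ["high_lat_a", "high_lat_b", "mid_lat", "tropical_a"]
late_mix  = ["mid_lat", "tropical_a", "tropical_b"]  # cold sites dropped

t_early = naive_average(early_mix)
t_late = naive_average(late_mix)
spurious_warming = t_late - t_early   # entirely from station selection
```

This is why serious temperature indices average anomalies (departures from each station's own baseline) rather than absolute temperatures; that reduces, though does not remove, the sensitivity to a changing station mix.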
|More minimum records
This graph shows that the numbers of minimum and maximum temperature records went out of lock-step. Before 1920 their numbers were roughly equal, the maxes sometimes outnumbering the mins. But since 1930 things went wrong, with the minimum records outnumbering the maximums; since 1980 the maxes are in the majority again, and since 2000 they vastly outnumber the mins, at a time that the globe has been cooling. As a result the past was artificially cooled and the present artificially warmed. Thus the average temperature has been doctored to fit the AGW hypothesis.
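Counting record highs and lows, as in the graph above, amounts to a simple running comparison. The series below is hypothetical; the point is that a drifting baseline tilts the count toward one kind of record, while in a stationary climate new records of both kinds become rarer over time.

```python
def count_records(series):
    """Count new record maxima and minima in a temperature series."""
    hi = lo = series[0]
    n_max = n_min = 0
    for x in series[1:]:
        if x > hi:
            hi, n_max = x, n_max + 1
        elif x < lo:
            lo, n_min = x, n_min + 1
    return n_max, n_min

# A hypothetical steadily drifting series produces only record maxima:
warming = [10.0, 10.2, 10.1, 10.5, 10.4, 10.9, 11.2]
n_max, n_min = count_records(warming)
```

Because record counts depend on the whole history of the series, any retroactive adjustment of early data changes which later readings count as records, which is why record tallies are so sensitive to "corrections" of the past.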
Fudging the data in any way whatsoever is quite literally a sin against the holy ghost (spirit) of science. I’m not religious, but I put it that way because I feel so strongly. It’s the one thing you do not ever do. You’ve got to have standards. - James Lovelock
|Accidental data corruption
In the year 2000, a most curious and massive jump occurred in the temperature data held by NASA, affecting 48 states of the USA. It was detected not by the data keepers but by an attentive outsider, Steve McIntyre. The IPCC was over the moon with this sudden demonstration of catastrophic warming, but when it was exposed as a year-2000 bug in the programs, the correction was quietly made and hushed up. No longer was 1998 the warmest year on record, as had been trumpeted around the world. The important lesson is that outsiders are needed to keep a watchful eye on all intended and unintended data corruption. Also important to note is that keeping temperature data is not just a question of storing it: massive computer programs are at work massaging and adjusting this data, which then becomes 'available' to the public as 'raw' data. What these programs do has not been documented or made public. It may take decades before the mess has been sorted out - if it ever will be.
http://climateaudit.org/2010/01/23/nasa-hide-this-after-jim-checks-it/ - you could not have imagined this. Essential reading.
“Anyone who doesn't take truth seriously in small matters cannot be trusted in large ones either.” - Albert Einstein
Investigators Joe D’Aleo and Anthony Watts reported the following shortcomings in the temperature records: