The history of temperature measurement

Ancient man could distinguish between hot and cold by common sense alone: deciding, for instance, how big a fire needed to be, or how close to it to sit in order to stay warm. Even today, a tribe in Australia reportedly uses dogs rather than blankets to keep warm at night. A fairly cold night might call for two dogs, and is therefore called a two-dog night; a frosty night is a six-dog night, and so on.

In the beginning, humans preserved fires started by natural causes, and so can be called fire tenders. In time they discovered how to create fire themselves and became fire makers. Ultimately they became fire managers, using fire for daily tasks such as cooking meat and boiling water, and later for working metals such as iron and even for making glass. Although they had no measuring instruments, they judged the heat of a fire by developing guidelines for which kind of fire to use for each procedure.

Although the ancient Greeks discovered that air expands when heated, they never built a device to measure temperature or to determine how much heat a given task required.

In the year 1592, Galileo Galilei became the first person to attempt to measure temperature. His device consisted of a bulb of air connected to a column of liquid standing in a container; when the bulb was heated, the expanding air drove the liquid down the column. (See Figure 1).

In the year 1611, the device was standardised by Galileo's associate Sanctorius Sanctorius, who noted the level of the liquid when the bulb was cooled with melting snow and when it was heated with a candle flame. He then divided the span between the two levels on the column into 110 equal parts.

In the year 1632, the French physician Jean Rey built the first liquid thermometer. It consisted of a flask with a long, slender neck, partly filled with water; as the temperature changed, the water level rose or fell in response. (See Figure 2).

However, it was later realised that these devices were significantly affected by changes in atmospheric pressure, and they were therefore inaccurate.

In the year 1644, liquid-in-glass thermometers were first developed. The main liquids used were water, alcohol and mercury; alcohol was favoured because it showed the greatest expansion in response to a rise in temperature.

During the next century, scientists focused on building more accurate instruments, and more than 35 temperature scales were devised. A scale was usually standardised at two fixed temperature points, with the interval between them divided into equal parts. The most famous of these was the twelve-degree scale built by Sir Isaac Newton, on which the melting point of ice was marked as zero and body temperature as twelve.

In the year 1701, the Danish astronomer Ole Romer built a better scale, marking the temperature of a mixture of ice and salt as zero degrees and the boiling point of water as sixty degrees. On his scale, the freezing point of water fell at seven and a half degrees and body temperature at twenty-two and a half.

Ole Romer's scale was the best yet devised, and in the year 1714 it influenced the famous Dutch instrument maker Daniel Gabriel Fahrenheit. Keeping Romer's zero at the temperature of the salt-and-ice mixture, Fahrenheit set out to build a scale with finer and more accurate unit divisions. He marked thirty-two for the freezing point of water and ninety-six for body temperature. He later realised that body temperature varies through the day, and so he sought a more reliable fixed point. At last he settled on the boiling point of water, set at 212 degrees, which shifted normal body temperature to about 98.6 degrees. The Fahrenheit scale used today takes the freezing and boiling points of water as its two fixed points.

In the year 1742, the Swedish astronomer Anders Celsius built his own temperature scale. Celsius assigned zero degrees to the boiling point of water and one hundred degrees to the temperature of melting snow. Not long afterwards, the points were reversed, assigning zero degrees to the freezing point of water and one hundred degrees to the boiling point. In the year 1948, this scale was officially given the name Celsius scale.
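Because both scales are anchored to the freezing and boiling points of water, a simple worked conversion follows: the Fahrenheit scale places 180 degrees between those two points where the Celsius scale places 100, so F = (9/5)C + 32. Normal body temperature of 37 degrees Celsius, for example, works out to (9/5)(37) + 32 = 98.6 degrees Fahrenheit.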

Jacques Charles's and Joseph Gay-Lussac's gas law showed that, at constant pressure, a gas loses 1/273.15 of its volume at 0°C for each one-degree fall in temperature. This law pointed to an absolute zero of temperature, defined as the temperature at which an ideal gas would occupy no volume at all, making it the coldest temperature theoretically possible. It was later determined to be -273.15 degrees Celsius.
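In modern notation, this reasoning can be written as a single equation: at constant pressure, V(T) = V0 × (1 + T/273.15), where V0 is the volume of the gas at 0°C and T is its temperature in degrees Celsius. Setting V(T) = 0 gives T = -273.15°C, which is exactly the absolute zero described above.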

In the year 1848, William Thomson, better known as Lord Kelvin, proposed a thermodynamic temperature scale derived from the efficiency of the ideal heat engine conceived in the year 1824 by Sadi Carnot. Thomson showed that the temperatures defined by his results coincided with those defined by the gas laws. So that his scale would be compatible with those already in use in many countries, he made his temperature intervals the same size as those of the Celsius scale. As a result, the kelvin unit has the same size as the degree Celsius.
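In practical terms, the Kelvin scale simply shifts the Celsius scale so that its zero sits at absolute zero: T(K) = T(°C) + 273.15. Water therefore freezes at 273.15 K and boils at 373.15 K, and a temperature difference of one kelvin is identical to a difference of one degree Celsius.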