by Sara Shay

Early Technologies: The Thermometer

News
Apr 15, 2001 | 2 mins
Enterprise Applications

Quantifying natural phenomena was all the rage among early 17th-century scientists. As they grappled with measuring heat and cold, several came up with the thermoscope. This thin-necked flask was warmed between the hands and inverted over a dish of water; as the air inside cooled and contracted, water rose into the neck, and the level then rose and fell with the temperature of the air trapped inside.

The bulb thermometer is a direct descendant of the thermoscope and is based on the principle that a liquid expands when heated. The liquid in the bulb, usually mercury, rises into a thin tube as it warms up. While this type remains the most common, other kinds of thermometers, including electrical and bimetallic, have since been developed. Electrical thermometers measure the resistance of a material, such as nickel wire, and convert that measurement into a temperature reading. Bimetallic thermometers, found in ovens and backyards, are made of two strips of different metals bonded together. Because the metals expand at different rates when heated, the strip bends, and the amount of bending indicates the temperature.
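To make the electrical variety concrete, here is a minimal sketch in Python of how a resistance reading might be turned into a temperature, assuming a simple linear model R = R0 * (1 + alpha * T). The reference resistance R0 and coefficient alpha below are illustrative values for a hypothetical nickel-wire sensor, not the calibration of any real instrument.

# Sketch: convert a resistance reading into a temperature, assuming the
# linear model R = R0 * (1 + alpha * T). Values of r0 and alpha are
# illustrative, not real sensor calibration data.
def resistance_to_celsius(resistance_ohms, r0=100.0, alpha=0.006):
    # Invert R = r0 * (1 + alpha * T) to recover T in degrees Celsius.
    return (resistance_ohms / r0 - 1.0) / alpha

# Example: with these assumed constants, a reading of 112 ohms
# corresponds to 20 degrees Celsius.
print(resistance_to_celsius(112.0))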

German physicist Daniel Fahrenheit created the first numerical temperature scale to become a standard. In 1714 he mixed snow and salt, stuck a thermometer in, noted how high the mercury rose and labeled that point zero. Then he boiled mercury, stuck the thermometer in again and called that point 600. He marked off even divisions between the two points, thus securing his place in history.

Swedish astronomer Anders Celsius created a more practical scale in 1742. He labeled the freezing point of water 100 degrees and the boiling point zero degrees. (The scale was later inverted so that zero became the freezing point.) Celsius is now the standard scale in the sciences and in most countries, generally those that embrace the metric system.
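The two scales line up at fixed points, water freezes at zero degrees Celsius (32 Fahrenheit) and boils at 100 (212), so converting between them is simple arithmetic. A minimal Python sketch:

# Standard conversions between the Celsius and Fahrenheit scales.
def celsius_to_fahrenheit(c):
    return c * 9.0 / 5.0 + 32.0

def fahrenheit_to_celsius(f):
    return (f - 32.0) * 5.0 / 9.0

print(celsius_to_fahrenheit(100.0))  # boiling point of water: 212.0
print(fahrenheit_to_celsius(32.0))   # freezing point of water: 0.0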