
History of Thermometers

By David A. Fryxell

Before the invention of the thermometer, our ancestors had no objective comeback to such snappy repartee as, “Cold enough for ya?” Temperature was subjective, measured only by the Goldilocks and the Three Bears scale: too hot, too cold, just right. Hot and cold, in fact, were thought of as two independent properties, rather than a thermodynamic continuum.

Then along came Daniel Gabriel Fahrenheit (1686-1736), a German physicist who spent much of his life creating precision meteorological instruments. Fahrenheit fashioned his first thermometer 300 years ago, in 1709; it was filled with alcohol rather than mercury (an innovation Fahrenheit introduced five years later). At least by some accounts, it was also 300 years ago—during the brutal winter of 1708-1709—when Fahrenheit took the measure of what would become zero on his temperature scale in 1724.

Ancient history
Like most inventors, Fahrenheit built on earlier ideas. In ancient Alexandria, both the philosopher Philo and the mathematician Hero observed that the expansion and contraction of air with changing temperature caused the level of a liquid in a tube to rise and fall. In the early 11th century, the Persian scientist Avicenna developed a device based on this principle. And none other than Galileo is often identified as the inventor of the thermometer, though his innovation was actually a thermoscope—lacking a scale or standard to compare temperature from one place or device with another. Galileo’s gizmo, which wasn’t sealed against changes in air pressure, was as much barometer as thermometer.

Another Italian, Santorio (aka Sanctorius), is credited with the first thermometer scale, as early as 1612. But Santorio’s device was also vulnerable to changes in air pressure—a problem solved in 1654 by yet another Italian, Ferdinand II, Grand Duke of Tuscany. The duke crafted the first thermometer using a liquid (alcohol) sealed in a tube, the familiar form of thermometers today.

Despite suggestions from scientists ranging from Christiaan Huygens to Isaac Newton, however, the successful combination of a sealed thermometer and an accurate temperature scale took another 60 years. Not merely a physicist, Fahrenheit was also a skilled craftsman capable of manufacturing thermometers of previously unknown precision. His 1714 invention of the modern mercury thermometer enabled still more accurate measurements, which in turn made possible his eponymous scale in 1724.

Measuring up
Several competing stories explain how Fahrenheit wound up with the freezing and boiling points of water at, respectively, 32 and 212 degrees. What is certain is that he drew upon the earlier work of Ole Christensen Rømer (1644-1710), who’d created a scale using a thermometer filled with red wine.

But Fahrenheit wanted to avoid Rømer’s need for negative numbers—hence his measurement of an especially cold winter in his hometown of Danzig (later part of Poland). Fahrenheit was blissfully unaware of temperatures in places such as North Dakota that would extend his scale “below zero.”

He later confirmed this low temperature in the lab with a mixture of ice, salt and water. Fahrenheit then set an upper fixed point at his own body temperature and divided the interval between the two points into 12 parts, then each of those into 8, yielding 96 degrees. (The scale was later recalibrated so that average human body temperature falls at about 98.6 degrees.) That put the freezing point of water at 32 degrees and the boiling point at 212. Other versions have him using the blood temperature of horses or the melting point of butter as the basis for 100 degrees. In any case, Fahrenheit’s scale became the basis for measuring temperature in most English-speaking countries until the 1960s, and it remains standard in the United States.

Competition
Other countries adopted the “centigrade” scale created by Swedish scientist Anders Celsius (1701-1744), which allotted 100 degrees to the difference between the boiling and freezing points of water. Celsius’ original scale ran backwards, with 100 as the freezing point; taxonomic pioneer Carl Linnaeus, a contemporary of Celsius, suggested flipping the scale to its current form. In 1948, the Ninth General Conference on Weights and Measures renamed “degrees centigrade” as “degrees Celsius,” so that the unit honors its inventor just as “degrees Fahrenheit” does.

But the 10th such conference, convened in 1954, selected the scale invented in 1848 by William Thomson, Baron Kelvin (1824-1907), as the metric measure of thermodynamic temperature. Kelvin’s scale uses the same size of degree as Celsius’ (nine-fifths of a Fahrenheit degree), but assigns zero to absolute zero, the point at which a substance has no heat energy and its molecules don’t move (equal to minus 459.67 degrees Fahrenheit). Kelvin got a further posthumous honor at the 13th conference in 1967, which eliminated the word degree: The freezing point of water could now be referred to, for instance, as “273.15 kelvins.”
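
For readers who want to convert between the scales just described, the standard relationships follow from the fixed points above: water freezes at 32 degrees Fahrenheit, 0 degrees Celsius and 273.15 kelvins, and a Celsius (or Kelvin) degree spans nine-fifths of a Fahrenheit degree. Written out, with T_F, T_C and T_K denoting the same temperature on each scale:

\[
T_{\mathrm{F}} = \tfrac{9}{5}\,T_{\mathrm{C}} + 32,
\qquad
T_{\mathrm{K}} = T_{\mathrm{C}} + 273.15 .
\]

Running the arithmetic backwards, absolute zero (0 kelvins) works out to minus 273.15 degrees Celsius, or minus 459.67 degrees Fahrenheit, the figure given above.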

Nonetheless, at least in the United States, when the weatherman says the temperature will be zero, he doesn’t mean the freezing point of water (zero degrees Celsius) or—thank goodness!—absolute zero (zero Kelvin). He’s referring to zero degrees Fahrenheit, pretty much as Herr Fahrenheit himself measured it, 300 years ago.
