Comment by shadowgovt

3 days ago

This reminds me of how the Fahrenheit scale came about.

For all its flaws, Fahrenheit was based on some good ideas and firmly grounded in what you could easily measure in the 1720s. A brine solution and body heat are two things you can measure without risking burning or freezing the observer. Even the gradations were intentional: in the original scale, the reference temperatures mapped to 32 and 96, and since those are 64 units apart, you could mark the rest of the thermometer with a bit of string and some halving geometry. Marking a Celsius scale from 0 to 100 accurately? Hope you have a good pair of calipers to divide a range into five evenly-spaced divisions...
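The halving trick is easy to check: 64 = 2^6, so six rounds of bisection with a folded string produce every one-degree mark between the two references. A quick sketch of the arithmetic (hypothetical helper name, assuming Fahrenheit's 32 and 96 reference points):

```python
def halving_marks(low: float, high: float, rounds: int) -> list[float]:
    """Return mark positions after `rounds` of bisecting every interval."""
    marks = [low, high]
    for _ in range(rounds):
        # One fold of the string: a new mark at the midpoint of each gap.
        mids = [(a + b) / 2 for a, b in zip(marks, marks[1:])]
        marks = sorted(marks + mids)
    return marks

# Fahrenheit's references sit 64 units apart (32..96), so six halvings
# yield all 64 one-degree intervals (65 marks, spaced exactly 1 apart).
marks = halving_marks(32.0, 96.0, 6)
assert len(marks) == 65
assert marks[1] - marks[0] == 1.0
```

No such shortcut exists for 100 divisions, since 100 has a factor of 5 that string-folding can't produce.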

Nowadays, we have machines capable of calibrating such mundane temperature ranges to far higher accuracy than the needle or alcohol column can even show, but back then, when scientists had to craft their own thermometers? Ease of manufacture mattered a lot.

They say two points, but it was really three: the ammoniac brine mixture at 0F, water freezing at 32F, and body temperature at 96F.

Also Celsius, for whatever reason, originally put boiling at 0 and freezing at 100. Maybe Sweden is just that cold.

James Burke's "Connections" series covered this in series 3, episode 10. Here's that clip:

https://youtu.be/w4ujTt0gDx8?si=XUV9J3srYdaBwqwm&t=1227

128 degrees (100 + 28) are no harder to mark by halving than 64; you just have to aim 0 and 100 properly. :-/

  • What would be the process to do that? To aim 0 and 100 properly, you'd need a tool that lays out a 100:28 (25:7) ratio over an arbitrary distance, wouldn't you?

    One can build such a tool, but it's not a doubled-over piece of string.

    • Make marks on the thermometer at 0 and 100 degrees C, then use a candle to project them onto a wall at, say, 5x magnification. Project the marks from the 128-mark ruler onto the same wall, align the endpoints of both sets, and you can then place marks on the thermometer with 5x better accuracy.
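The alignment arithmetic behind the 128-mark ruler idea can be sketched briefly: scale the ruler so its mark 0 and mark 100 land on the thermometer's measured fixed points, and every intermediate degree position falls out by linear interpolation (marks 101–128 simply go unused). A hypothetical sketch, with positions in arbitrary length units:

```python
def degree_positions(pos0: float, pos100: float, n: int = 128) -> list[float]:
    """Positions of the ruler's n+1 halving-made marks, scaled so that
    marks 0 and 100 coincide with the thermometer's 0 C and 100 C points."""
    scale = (pos100 - pos0) / 100  # physical length of one degree
    return [pos0 + k * scale for k in range(n + 1)]

# Example: fixed points measured at 0.0 and 200.0 units along the tube.
marks = degree_positions(0.0, 200.0)
assert marks[100] == 200.0  # mark 100 hits the boiling-point reference
assert marks[50] == 100.0   # midpoint of the usable range
```

The catch, as the question above notes, is that physically achieving that 25:7 scaling is exactly the step a doubled-over piece of string can't do.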
