Comment by dmoy

1 year ago

I grew up in a place where it'd get above 100F and below 0F pretty much every year.

But I will say, F is pretty decent still, even if the GP statement is a bit off:

100F is getting uncomfortably hot for a human. You gotta worry about heat stroke and stuff.

0F is getting uncomfortably cold for a human. You gotta worry about frostbite and dying from the cold if underdressed.

In the middle, you'll probably live. Get locked out of the house taking out the trash when it's 15F? You're probably okay until you find a neighbor. Get locked out of the house taking out the trash when it's -15F? You have a moment of sheer mental panic where you realize you might be getting frostbite and require medical attention if you don't get inside in under 10 minutes.

But yea I still use C for almost everything.

80F is uncomfortably hot for me unless I strip off; that's when my aircon goes on. And 55F is uncomfortably cold...

I think basically all of these are rationalisation (and that goes for the Celsius numbers too). They don't matter. You learn very early which numbers you actually care about, and they're pretty much never going to be 0 or 100 on either scale.

You're not going to be thinking about whether it's 0 outside if you're locked out; just whether or not you're freezing.

  • It's not the bookends themselves that are the issue, it's the coarseness. Celsius is too coarse because it's extrapolated from the 0 (freezing) and 100 (boiling) points. People can generally feel the difference between 1°F increments, and roughly two of them make up a 1°C difference. Also, you can't really say "in the 70s" etc. with Celsius. I watch a foreign weather report and that entire country is in the 20s °C for an entire week.

    It's a minor difference either way, but I'm not going to switch to something slightly worse.

    • In my 48 years of using Celsius I can safely say I have never cared about increments smaller than 1°C. You're not keeping a room stable with that precision, for example, nor will the temperature at any given specific location outside be within that precision of your weather reports. Or anywhere close. And we can, and do, say "low 20s", "high 20s", "low 30s" etc., which serves the same effect. It has, again, never mattered in my 48 years.

      Either system is only "worse" when you're not used to it. It makes no practical difference other than when people try to argue for or against either system online.

      The only real reason to consider switching would be that it's a pointless difference that creates minor friction in trade, but there too it's hardly a big deal given how small the effect is and how long it'd likely take to "pay for itself" in any kind of way, if ever.


    • People don't generally need to communicate the difference between 20C and 20.6C (68F and 69F) unless measuring it directly, in which case you would use the exact decimal number.

      I also don't think most people can tell the difference between 68F and 69F unless they experience them in close succession, and perceived heat at that precision depends on a lot more than just the measured temperature.

      I don't get why saying "in the 70s" is better than saying "around 24" besides being used to one way or the other.

      Fahrenheit is not better, and for any scientific/engineering/proper measurement you would use Celsius or Kelvin (which has the same increment size as Celsius but a different zero point) anyway, so why keep Fahrenheit, unless for purely traditional or cultural reasons?
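
      For reference, the arithmetic behind these figures is easy to check; here's a minimal sketch (plain Python, the helper names are mine) confirming that 68F is exactly 20C, that one 1F step is the ~0.56C mentioned above, and that Kelvin is just Celsius shifted by 273.15:

        def f_to_c(f):
            # Fahrenheit to Celsius: remove the 32F offset, scale by 5/9
            return (f - 32) * 5 / 9

        def c_to_k(c):
            # Kelvin uses the same degree size as Celsius, zero at -273.15C
            return c + 273.15

        print(f_to_c(68))  # 20.0
        print(f_to_c(69))  # 20.555... -- one 1F step is ~0.56C, so two ~= 1C
        print(c_to_k(20))  # 293.15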
