Comment by hot_gril

1 year ago

It's not the bookends themselves that are the issue, it's the coarseness. Celsius is too coarse because it's interpolated between the 0˚ freezing and 100˚ boiling points. People can generally feel the difference between 1˚F increments, and roughly two of those make up a 1˚C difference. Also, you can't really say "in the 70s" etc. with Celsius. I watch a foreign weather report and that entire country is in the 20s ˚C for an entire week.
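
To put numbers on that: a Celsius degree is exactly 1.8 Fahrenheit degrees, so "roughly two" slightly overstates it. A quick arithmetic sketch (plain Python, nothing here beyond the standard conversion formula):

    # A change of 1 degree C corresponds to a change of 9/5 = 1.8 degrees F.
    def c_to_f(c):
        """Convert a Celsius temperature to Fahrenheit."""
        return c * 9 / 5 + 32

    print(c_to_f(20))    # 68.0
    print(c_to_f(21))    # 69.8  -> one Celsius step spans nearly two F steps
    print(c_to_f(20.6))  # 69.08 -> about one F step above 20 C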

It's a minor difference either way, but I'm not going to switch to something slightly worse.

In my 48 years of using Celsius I can safely say I have never cared about increments of Celsius smaller than 1. You're not keeping a room stable with that precision, for example, nor will the temperature at any given location outside be within that precision of your weather reports, or anywhere close. And we can, and do, say "low 20s", "high 20s", "low 30s", etc., which serves the same purpose. Again, it has never mattered in my 48 years.

Either system is only "worse" when you're not used to it. It makes no practical difference other than when people try to argue for or against either system online.

The only real reason to consider switching would be that it's a pointless difference that creates minor friction in trade, but even there it's hardly a big deal, given how small the effect is and how long the switch would likely take to "pay for itself" in any way, if ever.

  • You might not tell the difference, but evidently enough people can that digital thermostats commonly add 0.5˚ increments when switched into ˚C mode. And when they don't, some people put them into ˚F mode just for the extra precision.

    • I'm sure some do, and that more think they do. I still don't buy that the difference affects their lives in any meaningful way. My thermostat, by the way, has 0.1˚ increments.

      It does not matter, because when the heating is on, the temperatures measured at the floor, at the ceiling, at the edges, and at the centre of the room will easily be a couple of degrees or more apart, depending on how large the temperature differential with the outside is. I have measured this, as part of figuring out how the hell to get different places in the same open living area to within even 3-4 degrees of each other.

      Very few people live in houses that are insulated well enough, and with good enough temperature control, that they have anything close to that level of precise control over the temperature in their house.

      But if it makes them feel better to think they do, then, hey, they can get my kind of thermostats. At last count there are now 5 thermostats on different heating options in my living room, all with 0.1˚C steps.

People don't generally need to communicate the difference between 20˚C and 20.6˚C (68˚F and 69˚F) unless measuring it directly, in which case you would use the exact decimal number.

I also don't think most people can tell the difference between 68˚F and 69˚F unless they experience them in close succession, and perceived temperature at that precision depends on a lot more than just the measured temperature.

I don't get why saying "in the 70s" is better than saying "around 24", other than being used to one way or the other.

Fahrenheit is not better, and for any scientific/engineering/proper measurement you would use Celsius or Kelvin (which shares its step size with Celsius but has a different zero point) anyway, so why keep Fahrenheit, unless for purely traditional or cultural reasons?

  • We tend to be much better at noticing temperature changes than fixed temperatures anyway, and are more likely to react to feeling that we're getting warmer or colder than to the specific temperature differential causing it. I think a lot of the people who think they feel differences at that precision are really feeling the difference of their heating/cooling turning on or off at different intervals.

    As I noted in another comment, having spent time trying to figure out how to make all of my living room - which isn't that big - comfortable at the same time, the difference is often huge, even with thermostats with 0.1 steps, because when a thermostat triggers, it's not like it will precisely lift the temperature at its measured zone by 0.1. It will run my underfloor heating or my radiators until they first produce a 0.1 increase, plus the margin before it triggers in the other direction again, at which point they'll get turned off and significantly overshoot while the floor or radiator cools down.

    Setting a thermostat to 24.3 is not going to leave you with a room at 24.3, it's going to leave you with a room fluctuating between something like 22 and 26 at different places, heights, and time intervals (see the sketch after this comment)...

    The only time I'll buy that anyone manages that level of precision is if they live in a very modern house with near-perfect insulation, where the heating or cooling input needed to keep it in balance is near nothing.
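
    For what it's worth, here's a toy numerical sketch of that trigger-and-overshoot cycle. Every constant in it (radiator temperature, heat-transfer rates, hysteresis band) is invented purely for illustration, not taken from any real thermostat:

        # Toy bang-bang thermostat with a 0.1 C hysteresis band and a radiator
        # that stays warm after switch-off. All constants are made up.
        setpoint, outside = 24.3, 5.0
        room, radiator, heating_on = 22.0, 22.0, False
        history = []

        for minute in range(600):
            if room < setpoint - 0.1:    # fell below the band: start heating
                heating_on = True
            elif room > setpoint + 0.1:  # rose above the band: stop
                heating_on = False
            # Radiator warms toward 50 C while on, cools toward the room while off.
            radiator += (50.0 - radiator) * 0.05 if heating_on else (room - radiator) * 0.02
            # Room gains heat from the still-warm radiator and leaks it to the outside.
            room += (radiator - room) * 0.01 + (outside - room) * 0.005
            if minute >= 300:            # skip the initial warm-up transient
                history.append(room)

        print(f"room swings between {min(history):.1f} and {max(history):.1f} C")
        # With these invented constants the swing is a couple of degrees wide,
        # far more than the 0.1 steps on the dial.

    The exact numbers don't matter; the point is that the oscillation amplitude is set by the thermal lag, not by the display resolution.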

    • It's more achievable in a small single-story apartment, or better yet, a car. I can really feel the difference without looking at the number. There's also what you said about the trigger points, but precision on a thermostat is still worth having. I felt like it was slightly too cold yesterday, so I moved the thermostat up 1˚F and it felt warm enough.

      And I'm not a scientist, but in science classes we only used Kelvin, not Celsius. C and F aren't useful for proportions because their zero points aren't absolute zero. Even Rankine would be fine, just with different constants.
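
      To make the "0 isn't 0" point concrete: ratios only mean something on an absolute scale, so "twice the temperature" has to be computed in Kelvin (or Rankine, which has Fahrenheit-sized steps with its zero at absolute zero). A minimal sketch:

          # Doubling a Celsius reading is physically meaningless because 0 C
          # is an arbitrary point; on an absolute scale it is well defined.
          def c_to_k(c):
              return c + 273.15   # Kelvin: Celsius-sized steps, zero at absolute zero

          def k_to_c(k):
              return k - 273.15

          t = 10.0                         # 10 C
          doubled = k_to_c(c_to_k(t) * 2)  # twice the absolute temperature
          print(doubled)                   # 293.15, i.e. nowhere near 20 C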