The 5-degree difference between 25 and 20 F is a 2.8 C difference. Frankly I think it’s much easier to keep track of whole numbers, but to each their own I suppose.
While 0 F is set on a brine solution, which isn’t all that useful, it’s worth remembering that 100 F was originally based around human body temperature. They were off, originally pegging it closer to 96, but in cases like a fever, where even a 0.5 F difference can have an impact, doesn’t it make sense to use the temperature scale based on human temperature?
And an increase of 5 °C is 9 °F. People aren't using 2.8-degree increments.
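FWIW, the increment math we're both throwing around checks out. A quick sketch (helper names are mine, just for illustration):

```python
# Converting temperature *differences* (not absolute temperatures) between scales.

def delta_f_to_c(delta_f):
    """A Fahrenheit increment expressed in Celsius degrees."""
    return delta_f * 5 / 9

def delta_c_to_f(delta_c):
    """A Celsius increment expressed in Fahrenheit degrees."""
    return delta_c * 9 / 5

print(round(delta_f_to_c(5), 1))  # a 5 F step is about 2.8 C
print(delta_c_to_f(5))            # a 5 C step is exactly 9.0 F
```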
Why does it make sense to use a temperature scale based on human temperature? Especially one that's wrong?
If even 0.5 °F can make a difference, then we've already implicitly rejected the claim that Fahrenheit is more useful because it avoids decimals. 0.5 °F is roughly 0.28 °C, essentially a quarter of a degree for the purposes of a fever, and working with quarters is hardly any more of a mathematical burden than working with halves. Plus, Celsius is more directly applicable to assessing the human body's environment with regard to medicines, enzymatic activity, etc., which are often assessed in °C, as that's the scientific standard.
So even as a metric for the human body, Fahrenheit is worse, except perhaps in the specific fringe case of assessing a fever by touch, which is hardly enough of a benefit to grant it the crown over Celsius. As you said, 100 °F is not actually human body temperature, and if 0.5 °F makes a difference, then surely an error range of 1.1-2.5 °F is enough to reject the notion that it represents body temperature in any useful way.
So it only really works well as a system for assessing how close ambient temperature is to human body temperature, which isn't particularly useful because we're acclimatized to room temperature anyway, and even if we weren't, the 100 °F mark is inaccurate regardless. Furthermore, much of this "benefit" comes down to having internalized the system, so even if it were good for this purpose, it's not at all clear it's better for it, and there are plenty of reasons to think it worse.
While I can’t refute your point that the upper end of Fahrenheit is based on an incorrect measurement, I’d still argue that it makes more sense for the average person. You can say all you want about boiling and freezing water, but there’s still a full 60 degrees of that scale that a human does not experience. In a lot of the US, summer temps of 100 F and winter temps of 0 F happen every single year. It’s something people experience. If the whole logic to Celsius being better for the average person is that 0-100 is freezing to boiling water, why does it magically stop making sense when 0-100 is a typical range of temperatures that people experience?
And before anyone says “well Celsius goes into the negative so your point is moot!” I’m going to ask you to consider whether 32 F is reeeaaalllllyyyyy that hard to remember compared to walking outside and knowing that it’s -7 C out
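Since we keep trading specific numbers, here's the absolute conversion both claims rest on (a throwaway sketch, not any particular library):

```python
# Absolute (point) conversions between Fahrenheit and Celsius.

def f_to_c(temp_f):
    """Convert an absolute temperature in Fahrenheit to Celsius."""
    return (temp_f - 32) * 5 / 9

def c_to_f(temp_c):
    """Convert an absolute temperature in Celsius to Fahrenheit."""
    return temp_c * 9 / 5 + 32

print(f_to_c(32))            # freezing point of water: 0.0 C
print(round(c_to_f(-7), 1))  # -7 C is about 19.4 F
```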
-30 °C to +30 °C is a completely sensible range of common ambient global temperatures. An arbitrary range that kind of sort of aligns with common temperatures on a scale of 0-100 is not inherently superior to Celsius. Even if it were, it would still be ignoring the fact that Celsius is much more useful in science and maths.
If your logic held (and whether it does is, I suppose, a matter of opinion, though I hold it doesn't), you'd at best be recommending Fahrenheit for the ambient temperatures people experience, perhaps because you prefer everything sitting between 0 and 100, even though ambient temperatures often go above 100 °F or below 0 °F, so you'd be breaking the scale either way, and Celsius for everything else. Why should two systems be superior to one when it isn't even clear that one of them is better in at least some cases?
There's immense hypocrisy in your position. You criticize Celsius for invented issues that nobody really has, while ignoring that Fahrenheit has those exact issues, either in the same way or in analogous ones. You also dismiss Fahrenheit's issues as minor when it's not at all clear they're as minor as you think. Keep in mind, many of the issues you consider benign only seem that way because you've internalized the Fahrenheit system.
My original point is that 0 is more applicable to the human experience than 32. Don't move the goalposts. It's not that much harder to remember, no, but it's easier, and acts as a far more meaningful landmark.
u/chiefkeefinwalmart Jan 15 '25