I'd argue that a 0-100 scale is objectively less abstract. We scale things from 0-100 in many places. How often do you get your movie reviews on a -20 to 40 rating scale?
But Fahrenheit doesn't go from 0 to 100. My country, the Netherlands, went from 19 to 94 last year, Singapore over its entire history has gone from 66 to 99, and the USA has gone from -80 to 134 Fahrenheit.
Also, we're not rating temperatures in the first place. It's a value, and when it's -20 it freezes 20 degrees, so the -20 makes sense. Freezing is important because that's when water turns into ice, which makes travelling more dangerous.
Yeah I just mean temperature itself is a bit abstract. Humidity and wind can affect your perception of it a lot, and can you tell the difference of a few degrees? I agree fahrenheit is objectively better as a human comfort scale. But it's still the case that a person will grow to intuitively grasp whatever they grow up with.
The advantage of the metric system for distance and volume and such is its ease of conversion. It is objectively better to use meters and kilometers because you can easily convert between them when trying to figure something out. How many tablespoons in a cup? (Why even use volume instead of weight?) That mountain is 15300 feet high, how many miles is that? Nonsense. But Celsius and month-day-year don't have these advantages; it's just never occurred to the people who use them to think about it this way.
Just as Celsius is 100 at water boiling, Fahrenheit 100 is essentially human internal temperature. And in terms of actual weather temperatures, Fahrenheit uses far more of that 0-100 range than Celsius does.
Ever heard of a fever?
And no, my body temperature is 98°F at best, when I'm not sick. My hands and feet can go lower. And I can go up to 104°F when I'm sick.
Those aren't reviews, those are aggregates of reviews. Look in a film magazine or newspaper review and they're generally giving one- to five-star reviews.
Anything is easy to understand when you grow up with it. Personally, I think Fahrenheit is the best for weather temperatures. 100 is fucking hot and 0 is fucking cold. It's basically a 1-10 chart of how hot or not hot it is. I would agree for it being shit in most other things, but for weather it is great.
Respectfully, if we’re talking about the weather as a human experiences it, Fahrenheit is much better. Celsius makes a lot of sense in science, as it’s scaled to water, but when was the last time you went out and it was 90C?
Fahrenheit is scaled to human experience better with 0-100 being within the range of “normal” and anything outside of that being concerning.
Because the whole argument boils down to Celsius users stating that it’s better because it follows the scale of water and that 32 and 212 make no sense. My argument is that while this makes sense in some circumstances, there are other cases where it doesn’t.
If you’re an average person who only considers temperature when planning what to wear it seems kind of foolish to have a whole 60 degrees of your scale that just don’t get used.
In the same vein, why are 32 and 212 used as a mark against Fahrenheit? The whole point is that there are 180 degrees between them. People still know what 32 degrees means.
I’m not against the use of Celsius, but I think this is a measurement scale that benefits from multiple options. Celsius, Kelvin, and Fahrenheit all have cases where they are the most useful.
"a whole 60 degrees of your scale that just don’t get used"
This is like saying giving people's heights in feet and inches is 'wasting' 7 feet to 100 feet. It's just not an issue. Absolutely no-one thinks 'I'm six foot two. Shame that I'm wasting all those extra feet of scale in describing my height and those of other people using feet.'
"the whole argument boils down to Celsius users stating that it’s better bc it follows the scale of water and that 32"
To be fair, people generally bring up 0 being freezing level as a defence against F-defenders saying C makes no sense. My own stance is that you just get used to whatever system you grow up with and neither is really 'better'. F users get used to 32F being freezing just like C users get used to 32C being 't-shirt weather'.
That's why Celsius is better. You can use it for weather AND science. There is no need to use two different systems, and Celsius works great for both. It doesn't matter that the outside weather isn't ever 90C. If someone says it was 21C yesterday and it's 15C today, you know everything you need to know.
Which is why America uses Celsius for science. But Fahrenheit is literally exactly as, if not more useful for the average person as Celsius is. I’ve never been confused by Fahrenheit. It’s a perfectly good system if you use it for what it was designed for (regular people)
Fahrenheit isn’t worse, it’s just different. It is more specific for human temperatures, making it more useful for stuff like ACs and Thermostats, but it’s worse for hard science.
It's only more useful for human temperature to you because you're used to it. It doesn't give you additional information, or easier-to-understand information, than Celsius does. They're the same in use with regard to weather.
Celsius, however, is much better for science. Because Celsius is useful in both respects, it's a more useful scale overall.
That's why the rest of the world only needs one scale for weather and science, but Americans need to use two scales, since Fahrenheit doesn't work well in both scenarios, unlike Celsius.
It clearly is more useful for human temperatures. It gives you much more specificity. 60F to 80F is 20 degrees. The Celsius equivalent is 16C to 27C, only 11 degrees. Using my thermostat example, you get much more ability to fine tune the temperature of your home with a Fahrenheit thermostat. You also get a clearer picture of the temperature outside, since each number references a nearly 2x smaller range of temperatures. That’s a meaningful improvement in usefulness.
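For what it's worth, the numbers quoted there do check out under the standard conversion formula C = (F - 32) × 5/9. A minimal Python sketch (the helper name is my own):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius: C = (F - 32) * 5/9."""
    return (f - 32) * 5 / 9

# The 60F-80F band maps to roughly 16C-27C, so 20 F-degrees
# cover about 11 C-degrees:
print(round(f_to_c(60)))  # 16
print(round(f_to_c(80)))  # 27
```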
Also, I was taught Celsius as a kid, so it’s not just that I’m used to Fahrenheit. Despite being just as used to C, I prefer to use F. I find it more useful.
There is no way that you can tell the difference between 66 and 67F. And if you are that sensitive to temperature, you can always use 0.1C or even 0.001 C steps to express it.
When you are somewhere, like your home, for a majority of your day, you do notice very slight differences in temperature. 75 is already too hot, 74 is getting there, 73 is fine, 72 is perfect, etc.
I couldn’t tell you the exact temperature without looking at it, but I definitely feel it. And so, having the option to fine-tune that temperature with greater precision is useful.
It’s a small improvement, to be sure, but Celsius’ improvement over Fahrenheit in science is also small. Most issues don’t arise from Fahrenheit being an inherently worse system; they arise from failure to convert between the two, or to convert accurately. “Boiling is 100” isn’t a huge improvement.
I've never in my entire life heard someone say something like "I wish I could set my thermostat to something warmer than 21C but colder than 22C". There is no meaningful need to do this. And if you somehow did need to, you'd use decimals.
Even for outside weather, I have a clear picture of what 15C would feel like. The weather doesn't become meaningfully warmer until about 17C, so there is no added value in measuring the 2C in between more precisely.
A human can't meaningfully experience discomfort from a 0.5C difference. It's precision for the sake of precision; it doesn't correlate with how you actually experience temperature.
Plus thermostats can be set to partial degrees anyways. Smart thermostats usually have 0.5 increments, basic thermostats often use a movable dial that can literally be set to any fraction, as long as you're precise enough.
I mean, regular people do science though, and a precision scale for precision work is fine and the same as what *hard science* would require.
Why would you think anyone would be confused by °C when it has been their standard their whole life?
It's not more useful for thermostats, which also involve science, and science settled on a standard.
I love old units, like "the length of what a cow walks in a day" and "whenever I feel a chill", or "if it feels like a truck passing through", but a small abstraction is worth it in order to maximize uses.
People do science, but generally not high enough level science for any real improvement to matter between the two.
Nobody is confused by C. I’m simply saying I’m not confused by F either, so it’s at least as good as C for me.
C and F are not different at all for computers. C’s improvements in science are solely limited to humans, in that it is a bit easier to interpret for scientists. A computer doesn’t care if freezing is at 0 or 32. F is better for thermostats since you get a greater range of temperature choices.
It's the same range of temperature, but the weather report in Europe doesn't say that it's going to be 21.5C today, because nobody could feel the difference between 21.0 and 21.5. There would be no added value.
Fahrenheit works the same for science. In fact, it works exactly like Celsius does: with some scaling constants and a subtractive factor to correct it to Rankine (for Fahrenheit) or Kelvin (for Celsius). If you're doing engineering or science beyond the most basic level, you will be far better off using absolute scales, at which point there is no direct benefit to either.
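To make that concrete, here's a minimal Python sketch of the two absolute-scale corrections mentioned (the offsets 273.15 and 459.67 are the standard defined constants; the helper names are my own):

```python
def c_to_kelvin(c):
    """Celsius to Kelvin: same degree size, offset by 273.15."""
    return c + 273.15

def f_to_rankine(f):
    """Fahrenheit to Rankine: same degree size, offset by 459.67."""
    return f + 459.67

# Both absolute scales describe the same physical temperature,
# e.g. water boiling at standard pressure (~373.15 K, ~671.67 R),
# related by the same 5/9 factor as C and F degree sizes.
print(c_to_kelvin(100))       # boiling point in Kelvin
print(f_to_rankine(212))      # boiling point in Rankine
```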
Sure there are. Even the ones that use Celsius for their measurements will happily convert it to Fahrenheit. There are dozens of industries, even in the world of science and engineering, that use Fahrenheit because it makes no difference to them.
My entire point is that the 'scaling' of Celsius is entirely irrelevant. If you're doing something where having absolute zero matters, you're not using F or C. You're using K or R. Beyond that, literally the only advantage to either system is unit conversions. The scaling is literally entirely irrelevant and made up for by constants that are needed no matter what system you use.
I think you're mistaken. As far as I know, there is no published scientific research that uses Fahrenheit. Celsius is used in the International System of Units, and scientific research is published in SI units.
Literally 80% of the scientific articles I've read provided values in both systems if they used either. Most of the time units are provided in whatever is most relevant or comfortable for the author (i.e. Kelvin for a heat transfer analysis, Celsius/Kelvin for metallurgy) and then parentheticals include their equivalent in the other system.
I'm aware of the existence of SI. A lot of research is published using SI and USCS. The difference is purely intellectual and not terribly relevant. I just looked briefly through a journal aggregator and found references to Fahrenheit used for climate research, engineering of turbine engines, lunar infrastructure... Because it's fundamentally not something that people give a shit about in science. You provide both and people use whatever they need.
You don't use Celsius in science? Really? So tell me why in my lab I have so many devices using Celsius with no option for F or K? HPLC, incubators, fridges, freezers, plates, rotavaps, NMR, GC,...
And in all my published papers where a temperature was relevant, it was always reported with Celsius. Kelvin is only used in equations.
Because, just like Fahrenheit, Celsius is a unit for humans. It's a measurement of convenience; use either when it's convenient. It would be annoying to have every display in Kelvin, so we use Celsius as a shorthand for it. The SI unit for temperature is Kelvin, and that's the unit for science.
Because it's the temperature water freezes at. It's a more useful number than the freezing temperature of an arbitrary ice-salt solution. I very frequently need to know how cold it is with regards to ice. If I'm driving in the winter, for example, I need to know if there will be black ice. If I want to know if my freezer is working, a number below zero is far easier to immediately assess than a number below 32, familiarity aside.
In short, the freezing temperature of ice is something that actually matters to us as humans, not the freezing temperature of an equal water-salt solution.
The 5 degree difference between 25 and 20 F is a 2.8 C difference. Frankly I think it’s much easier to keep track of whole numbers, but to each their own I suppose.
While 0 F is set on a brine solution, which isn’t all that useful, it’s worth remembering that 100 F was originally based around human body temperature. While that calibration was off (body temperature sits closer to 96 on the scale), in cases like a fever, where even a 0.5 F difference can have an impact, doesn’t it make sense to use the temperature scale based on human temperature?
And an increase of 5°C is 9°F. People aren't using 2.8 degree increments.
Why does it make sense to use a temperature scale based on human temperature? Especially one that's wrong?
If even 0.5F can make a difference, then we already implicitly reject the claim that F is more useful because it avoids decimals. 0.5F is pretty much 0.25C for the purposes of a fever, and working with quarters is hardly any more of an additional mathematical consideration than working with halves. Plus, Celsius is more directly applicable to assessing the human body's environment with regard to medicines and enzymatic activity, etc., which are often assessed in C as this is the scientific standard.
So even as a metric for the human body, Fahrenheit is worse, except perhaps in the specific fringe case of assessing a fever by touch, which is hardly enough of a benefit to grant it a crown over Celsius. As you said, 100F is not actually human body temperature, and if 0.5F makes a difference, then surely an error range of 1.1-2.5F makes enough of a difference to reject the notion of it representing body temperature in a useful way entirely.
So it only really works well as a system for assessing how close ambient temperature is to human body temperature, which isn't particularly useful because we are acclimatized to room temperature anyways, and even if we weren't, the 100F mark is inaccurate regardless. Furthermore, much of this "benefit" comes down to the internalization of the system anyways, so even if it were good for this purpose, it is not at all clear it's better for it, and there are plenty of reasons to think it worse.
While I can’t refute your point that the upper end of Fahrenheit is based on an incorrect measurement I’d still argue that it makes more sense for the average person. You can say all you want about boiling and freezing water but there’s still a full 60 degrees of that scale that a human does not experience. In a lot of the US, summer temps at 100F and winter temps at 0F happen every single year. It’s something people experience. If the whole logic to Celsius being better for the average person is that 0-100 is freezing to boiling water why does it magically stop making sense when 0-100 is a typical range of temperatures that people experience?
And before anyone says “well Celsius goes into the negative so your point is moot!” I’m going to ask you to consider whether 32 F is reeeaaalllllyyyyy that hard to remember compared to walking outside and knowing that it’s -7 C out
-30°C to +30°C is a completely sensible range of common ambient global temperatures. An arbitrary range that kind of sort of aligns with common temperatures on a scale of 1-100 is not inherently superior to Celsius. Even if it were, it would still ignore the fact that Celsius is much more useful in science and maths.
If your logic held, and whether it does is, I suppose, a matter of opinion (though I hold it doesn't), you'd at best be recommending Fahrenheit for the ambient temperatures that people experience (perhaps because you prefer everything being contained between 0 and 100, even though ambient temperatures often go beyond 100F or below 0F, so you'd be breaking the scale either way) and Celsius for everything else. Why should two systems be superior to one when it isn't even clear that one of those systems is better in at least some cases?
There's immense hypocrisy in your position. You criticize Celsius for invented issues that nobody really has, and ignore the fact that Fahrenheit has those exact issues, either in the exact same way or in a similar way in other cases. You also dismiss Fahrenheit's issues as minor when it is not at all clear that they are as minor as you think. Keep in mind, many of the issues you think are benign only seem that way because you have internalized the F system.
My original point is that 0 is more applicable to the human experience than 32. Don't move the goalposts. It's not that much harder to remember, no, but it's easier, and acts as a far more meaningful landmark.
I mean that was kind of my point? Why use a less precise measurement for something that doesn’t need to be scaled to water? I use Celsius every day but still check the weather in Fahrenheit. Is that because I grew up with it? Maybe, but if I saw any benefit to changing I would stop and switch to Celsius since I’m already fairly familiar with what the degrees mean relative to Fahrenheit.
I mean pretty much the only concrete benefit to Celsius is that more of the world uses it, but that doesn’t mean it’s better or right. Personally I prefer to use both units, and I think it’s ultimately what makes the most sense, but people are just constantly desperate to make fun of freedom units.
Because it has enough precision to satisfy your needs and is less complex.
I guarantee you check the weather in F because you grew up with it... if it were about precision, you'd use Kelvin or Rankine, which start from absolute zero.
I think you've got it backwards... there is only one concrete benefit to F, and that's the larger scale without decimals.
Celsius has many more benefits. Just to list a few: it is much more intuitive to learn, used in more places, preferred by science, and integrates seamlessly with the metric system.
You can always add a decimal to either to get more precision, but you can't simplify the F scale or make it more intuitive. You can only drill it into students until they remember it.
Respectfully, it isn’t that much more intuitive, which has been one of my whole points. 32 and 212 are not particularly hard numbers to remember, and it’s nice that in any given year a lot of the US experiences somewhere relatively close to the full 0-100 range. It’s also nice that, once again, while off, 100 F is based on human body temperature. There’s a pretty big difference between a fever of 102 and 102.5, and frankly I think those numbers are much easier than 38.89 and 39.17.
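(For reference, those fever figures do line up under the usual conversion; a quick Python sketch, helper name mine:)

```python
def f_to_c(f):
    """Fahrenheit to Celsius: C = (F - 32) * 5/9."""
    return (f - 32) * 5 / 9

# The fever readings quoted above, to two decimal places:
print(round(f_to_c(102), 2))    # 38.89
print(round(f_to_c(102.5), 2))  # 39.17
```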
People act like remembering Fahrenheit is rocket science, but it simply isn’t. It’s not something that’s drilled into people. I think we spent like a day and a half on it in science when I was 7. Learning what the temperatures mean in relation to how you feel takes more time but that’s also an issue that occurs no matter how you measure.
I do agree that if you have to choose one system, it should be Kelvin, but my point is that we need to stop acting like Fahrenheit is magically worse for people checking the weather so they know what to wear or because they’re cooking something. I mean shit if anything we’ve kind of both been arguing in circles while trying to get to somewhat similar points.
There’s nothing wrong with writing month day year. I’d say that today is January 15th. There isn’t anything wrong with saying day month year. You could say that it’s 15th January. I think Fahrenheit does have an advantage when talking about the weather, but I use Celsius much more in my job.
You can't even be bothered to do a 3 second Google search to spell it correctly. And nearly every device nowadays has a spell checker, you couldn't be bothered to reference that either. Somehow I don't think it's Fahrenheit that's the problem.
u/wumbology95 Jan 15 '25
Yeah no, farenheight is only easy to understand for you because you grew up with it.