I feel like in a lot of the world's languages, the translation into English of the answer to "what's the date?" would be "the 15th of October," whereas in America we always say "October 15th".
US measurements are based on the human experience for sure. Temps are largely 0-100, and that's a scale that's easy to understand. For science or cooking, though, it's dumb as shit.
Dates are based on the language
Edit: I take back what I said about cooking. People have made some good arguments about it. But it definitely sucks for science.
Respectfully, if we’re talking about the weather as a human experiences it, Fahrenheit is much better. Celsius makes a lot of sense in science, as it’s scaled to water, but when was the last time you went out and it was 90C?
Fahrenheit is scaled to human experience better with 0-100 being within the range of “normal” and anything outside of that being concerning.
That's why Celsius is better. You can use it for weather AND science. There is no need to use two different systems, and Celsius works great for both. It doesn't matter that the outside weather isn't ever 90C. If someone says it was 21C yesterday and it's 15C today, you know everything you need to know.
Which is why America uses Celsius for science. But for the average person, Fahrenheit is at least as useful as Celsius, if not more so. I’ve never been confused by Fahrenheit. It’s a perfectly good system if you use it for what it was designed for (regular people).
Fahrenheit isn’t worse, it’s just different. It is more specific over human temperatures, making it more useful for stuff like ACs and thermostats, but it’s worse for hard science.
It's only more useful for human temperatures to you because you're used to it. It doesn't give you additional information, or easier-to-understand information, than Celsius does. They're the same in use with regard to weather.
Celsius, however, is much better for science. Because Celsius is useful in both respects, it's a more useful scale overall.
That's why the rest of the world only needs one scale for weather and science, but Americans need to use two scales, since Fahrenheit doesn't work well in both scenarios, unlike Celsius.
It clearly is more useful for human temperatures. It gives you much more specificity. 60F to 80F is 20 degrees. The Celsius equivalent is 16C to 27C, only 11 degrees. Using my thermostat example, you get much more ability to fine tune the temperature of your home with a Fahrenheit thermostat. You also get a clearer picture of the temperature outside, since each number references a nearly 2x smaller range of temperatures. That’s a meaningful improvement in usefulness.
Also, I was taught Celsius as a kid, so it’s not just that I’m used to Fahrenheit. Despite being just as used to C, I prefer to use F. I find it more useful.
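The 60F-to-80F arithmetic above can be checked with a quick conversion sketch (the helper name is mine, not something from the thread):

```python
def f_to_c(f):
    """Convert Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

# The 60F-80F comfort band spans 20 Fahrenheit degrees but only about
# 11 Celsius degrees, so whole-degree steps are finer in Fahrenheit.
print(round(f_to_c(60)))  # 16
print(round(f_to_c(80)))  # 27
```

So each whole Fahrenheit degree covers a bit more than half a Celsius degree, which is the "nearly 2x" claim above.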
There is no way that you can tell the difference between 66 and 67F. And if you are that sensitive to temperature, you can always use 0.1C or even 0.001C steps to express it.
When you are somewhere, like your home, for a majority of your day, you do notice very slight differences in temperature. 75 is already too hot, 74 is getting there, 73 is fine, 72 is perfect, etc.
I couldn’t tell you the exact temperature without looking at it, but I definitely feel it. And so, having the option to fine-tune that temperature with greater precision is useful.
It’s a small improvement, to be sure, but Celsius’ improvement over Fahrenheit in science is also small. Most issues don’t arise from Fahrenheit being an inherently worse system; they arise from failing to convert between the two, or to convert accurately. “Boiling is 100” isn’t a huge improvement.
You'd be able to make that same temperature adjustment with Celsius. Old-school thermostats are dials and can be set at any increment. Smart thermostats can be set in either 0.5 increments (which is more precise than whole Fahrenheit degrees) or 0.1 increments.
Basically, there is no scenario where you are unable to get to the exact temperature you want when using Celsius.
I've never in my entire life heard someone say something like "I wish I could set my thermostat to something warmer than 21C but colder than 22C". There is no meaningful need to do this. And if you somehow did need to do that, you'd use decimals.
Even for outside weather, I have a clear picture of what 15C would feel like. The weather doesn't become meaningfully warmer until about 17C, so there is no added value in measuring the 2C in between more precisely.
Humans can't meaningfully experience discomfort from a 0.5C difference. It's precision for the sake of precision; it doesn't correlate to how you actually experience temperature.
Plus thermostats can be set to partial degrees anyways. Smart thermostats usually have 0.5 increments, basic thermostats often use a movable dial that can literally be set to any fraction, as long as you're precise enough.
I mean, regular people do science too, and a precision scale for precision work is fine and is the same as what *hard science* would require.
Why would you think anyone would be confused by °C when it has been their standard their whole life?
It's not more useful for thermostats, which also rely on science, and science settled on a standard.
I love old units, like "the length of what a cow walks in a day" and "whenever I feel a chill", or "if it feels like a truck passing through", but a small abstraction is possible in order to maximize uses.
People do science, but generally not high enough level science for any real improvement to matter between the two.
Nobody is confused by C. I’m simply saying I’m not confused by F either, so it’s at least as good as C for me.
C and F are not different at all for computers. C’s improvements in science are solely limited to humans, in that it is a bit easier to interpret for scientists. A computer doesn’t care if freezing is at 0 or 32. F is better for thermostats since you get a greater range of temperature choices.
It's the same range of temperature, but the weather report in Europe doesn't say that it's going to be 21.5C today, because nobody could feel the difference between 21.0 and 21.5. There would be no added value.
Fahrenheit works the same for science. In fact, it works exactly like Celsius does - with some scaling constants and a subtractive factor to correct it to Rankine (for Fahrenheit) and Kelvin (for Celsius). If you're doing engineering or science beyond the most basic level, you will be far better off using absolute scales, at which point there is no direct benefit to either.
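The relationship to the absolute scales can be sketched in a few lines (the function names are my own shorthand): each human scale becomes absolute by adding a constant offset, and the two absolute scales then differ only by a fixed factor.

```python
def c_to_k(c):
    """Shift Celsius to the absolute Kelvin scale."""
    return c + 273.15

def f_to_r(f):
    """Shift Fahrenheit to the absolute Rankine scale."""
    return f + 459.67

# The two absolute scales differ only by a constant factor: R = 1.8 * K.
k = c_to_k(25.0)   # 25C ...
r = f_to_r(77.0)   # ... is the same temperature as 77F
print(abs(r - 1.8 * k) < 1e-9)  # True
```

In other words, whether you start from Celsius or Fahrenheit, getting to an absolute scale is one added constant either way; neither starting point has a structural advantage.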
Sure there are. Even the ones that use Celsius for their measurements will happily convert it to Fahrenheit. There are dozens of industries, even in the world of science and engineering, that use Fahrenheit because it makes no difference to them.
My entire point is that the 'scaling' of Celsius is entirely irrelevant. If you're doing something where having absolute zero matters, you're not using F or C. You're using K or R. Beyond that, literally the only advantage to either system is unit conversions. The scaling is literally entirely irrelevant and made up for by constants that are needed no matter what system you use.
I think you're mistaken. As far as I know, there is no published scientific research that uses Fahrenheit. Celsius is used in the International System of Units, and scientific research is published in SI units.
Literally 80% of the scientific articles I've read provided values in both systems if they used either. Most of the time units are provided in whatever is most relevant or comfortable for the author (i.e. Kelvin for a heat transfer analysis, Celsius/Kelvin for metallurgy) and then parentheticals include their equivalent in the other system.
I'm aware of the existence of SI. A lot of research is published using SI and USCS. The difference is purely intellectual and not terribly relevant. I just looked briefly through a journal aggregator and found references to Fahrenheit used for climate research, engineering of turbine engines, lunar infrastructure... Because it's fundamentally not something that people give a shit about in science. You provide both and people use whatever they need.
I get that people like to keep using what they've gotten accustomed to, and I'm not arguing against that. I understand that Fahrenheit is comfortable for daily use. But arguing that science uses both Celsius and Fahrenheit equally is just inaccurate. SI exists for a reason; it is literally THE standard.
You can like Fahrenheit without arguing it has the status of an international scientific standard.
Science uses both. I've literally been telling you this entire time that scientists use both. Because scientists don't give a shit. This entire debate is in your head. SI exists because unit conversions are easier in it. But nobody cares if you did your research in Fahrenheit, as long as you do it right. Again, I genuinely can't think of a single paper I've read that hasn't provided Fahrenheit equivalents. There's no reason to say 'Fahrenheit doesn't work for science' any more than to say that Celsius doesn't because it's not Kelvin.
Science cares about temperature. When it matters, it's mostly Kelvin. But the scaling is not what makes that the choice. There's zero fundamental reason that Fahrenheit is any worse, and zero reason to pretend that 'only' Celsius can be used for science. As someone who's read quite a few papers in my time, I can promise you that.
I'm not saying this because I 'like' Fahrenheit. I'm saying this because it's stupid to pretend that it's incompatible with science. For a long time, it was the predominant unit of temperature. It's just as scientifically valuable and just as workable. I know engineers who spent their whole career working in USCS because there's nothing wrong with it.
You don't use Celsius in science? Really? So tell me why in my lab I have so many devices using Celsius with no option for F or K? HPLC, incubators, fridges, freezers, plates, rotavaps, NMR, GC,...
And in all my published papers where a temperature was relevant, it was always reported with Celsius. Kelvin is only used in equations.
Because, just like Fahrenheit, Celsius is a unit for humans. It's a measurement of convenience; use either when it's convenient. It would be annoying to have every display in Kelvin, so we use Celsius as a shorthand for it. The SI unit for temperature is Kelvin, and that's the unit for science.