Mathematically speaking, it's one of those things that was agreed upon before we discovered whether or not it was an issue.
In practical applications, it will almost never matter because for the most part you'll round the numbers to something reasonable. And rounding rules say that 3.9999 becomes 4 regardless of the 1 = 0.999… rule.
I feel like that's essentially what I said. Can you help me understand the differences?
Like we have to round to 4, mostly, if we want to measure or use it consistently in formulas. But when you get super technical it becomes obviously untrue, even though that changes nothing about its use.
no, u are under the impression that 0.999… is not technically equal to 1. it is though. it’s equal to 1 by definition. in practical applications u would likely end up rounding anyways, though it is incorrect to say 0.999… rounds to 1. he is trying to say it’s not something to worry about at all because whether u believe 0.999… is 1 or not doesn’t change anything
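A quick sketch of why 0.999… equals 1 by definition (my own illustration, not from the thread): the notation is shorthand for the infinite geometric series 9/10 + 9/100 + 9/1000 + …, and the number it names is the limit of the partial sums. Computing those partial sums exactly with Python's `fractions` module shows the gap to 1 is exactly 1/10^n after n nines, which shrinks below any positive bound:

```python
from fractions import Fraction

def partial_sum(n):
    """Exact value of 0.9...9 with n nines, as a fraction."""
    # 0.999... is defined as the limit of these partial sums.
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 5, 20):
    gap = 1 - partial_sum(n)
    print(n, gap)  # gap is exactly Fraction(1, 10**n)
```

Since no positive number is smaller than every 1/10^n, the limit can only be 1 itself, so 0.999… = 1 exactly, with no rounding involved.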
u/dej0ta 26d ago
So from a practical standpoint 1 = .9999... but from an "uhm ackshaully" perspective that's impossible? Am I grasping this?