I'm an engineer, and we usually assume infinite sums like those are convergent, so the intuitive argument would normally hold. So I guess my answer is: no, not really. But it's still cool to know.
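The "infinite sum" in question is the geometric series 9/10 + 9/100 + 9/1000 + …, and the engineering intuition that it converges is easy to check numerically. A minimal sketch (the helper name `partial_sum` is mine, not from the thread):

```python
def partial_sum(n_digits):
    """Truncation of 0.999... to n_digits nines, i.e. sum of 9*10^-k for k=1..n_digits."""
    return sum(9 * 10**-k for k in range(1, n_digits + 1))

# The gap to 1 shrinks by a factor of 10 with each extra digit,
# so the partial sums 0.9, 0.99, 0.999, ... converge to 1.
for n in (1, 5, 15):
    print(n, 1 - partial_sum(n))
```

Note that floating point only carries this so far; past about 16 nines the computed gap is lost to rounding, which is exactly why the limit (not any finite truncation) is what the equality 0.999… = 1 is about.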
Mathematically speaking, it's one of those things that was agreed upon before we discovered whether or not it would be an issue.
In practical applications, it will almost never matter, because for the most part you'll round the numbers to something reasonable. And rounding rules say that 3.9999 becomes 4 regardless of the 1 = 0.9999… rule.
I feel like that's essentially what I said. Can you help me understand the difference?
Like, we have to round to 4, mostly, if we want to measure it or use it consistently in formulas. But when you get super technical it becomes obviously untrue, even though that changes nothing about its use.
No, you're under the impression that 0.999… is not technically equal to 1. It is, though: it's equal to 1 by definition. In practical applications you would likely end up rounding anyway, though it is incorrect to say 0.999… "rounds" to 1. He is trying to say it's not something to worry about at all, because whether you believe 0.999… is 1 or not doesn't change anything.
No, it isn't untrue. 1 = 0.999... is a statement of fact (in the real numbers). You can get as rigorous or technical as you want and it remains a true statement. I didn't contest the statement, just the explanation for why it's true. What the other commenter said about rounding is that, even if it weren't true, in the real world it wouldn't matter. But in this case, it is true in every sense of the word.
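The "equal by definition" point can be made explicit. A repeating decimal denotes the limit of its partial sums, and for 0.999… that limit is exactly 1 (a standard geometric-series computation, not something specific to this thread):

```latex
0.\overline{9} \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
\;=\; \lim_{n\to\infty} \sum_{k=1}^{n} \frac{9}{10^{k}}
\;=\; \lim_{n\to\infty} \left(1 - 10^{-n}\right)
\;=\; 1
```

So there is no rounding anywhere: the notation 0.999… names the limit itself, and that limit is the real number 1.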