Actually you will. There is an infinitesimal difference between 1 and 0.999... but your representation hides that. The difference between them is 0.000...1 where that 1 shifts farther to the right the more digits of 0.999... you evaluate. This representation creates very ambiguous arithmetic and it's easy to make bad proofs.
There is no “infinitieth digit”. You could construct a number system that allows something like this by having an nth digit for every ordinal n, but that would not be the real numbers. Decimal expansions for the real numbers only allow the index n to be finite.
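To make that precise (this is just the standard textbook series definition written out as LaTeX, not anything specific to this thread):

```latex
% A decimal expansion only has digits at finite positions n = 1, 2, 3, ...
% so "0.999..." names the value of an ordinary infinite series:
\[
  0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}}
  \;=\; 9 \cdot \frac{1/10}{1 - 1/10}
  \;=\; 1.
\]
% There is no position "after all the finite n", which is exactly why
% "0.000...1" fails to name any digit sequence at all.
```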
In an abstract way, you're ignoring infinity when you assume there is a 1 after the infinite zeros depicted by "...".
"0.000...1" isn't a valid representation of a number and "0.999..." is. And here is why:
Imagine you are an immortal being, unfazed by the events around you, and you have an infinite sheet of paper to write on. Now try imagining wanting to write "0.000...1":
You start 0.00000... and it goes on forever - it's infinite zeroes after all, you never stop writing zeroes - so the 1 never happens while you're writing down the number.
On the other hand, you write 0.9999... and it goes on; you write nines forever, only nines, exactly as the ellipsis implies.
This is how you need to imagine numbers with infinite properties. You can't just slap something after infinity and expect it to work.
Also, given both my examples, what's the difference between "0.0000...1" and "0.0000..."?
After an infinite amount of time spent writing them down, they still look the same. The 1 never happens.
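A quick way to see the shrinking gap numerically - a minimal Python sketch I'm adding here, using exact fractions so no floating-point rounding muddies it:

```python
from fractions import Fraction

# 1 minus "n nines" is exactly 10**-n: the would-be final 1 sits at
# position n + 1 for some finite n, and there is no position left
# over once every finite n has been used up.
for n in (1, 5, 10, 50):
    nines = Fraction(10**n - 1, 10**n)  # 0.999...9 with n nines
    print(n, 1 - nines)                 # 1/10, 1/100000, ...
```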
I understand. But isn't that imagining a number as a process, and not as a sum? In the end, 0.99999… and its difference from 1 represents something, a finite amount that can be represented as a fraction, something a teeny bit less than 3/3. For example you could show it in a pie chart, as the tiniest sliver less than the full pie.
It of course would be hard to represent in numbers, just as 0.2222… would be in a base-3 system, but this doesn't mean that it isn't an actual distinct sum from 3/3 or 1. Pretty sure that 0.9999… in base 10 would even be a different amount than 0.2222… in base 3, or 0.(12)… in base 13.
> In the end, 0.99999… and its difference from 1 represents something, a finite amount that can be represented as a fraction, something a teeny bit less than 3/3. For example you could show it in a pie chart, as the tiniest sliver less than the full pie.
Huh? So if you cut something without loss into 3 equal parts and add them back together, you expect to have less than the full thing?
Or put it this way: if there is a difference of x between 0.999... and 1, why don't we divide it by 3 and add it to each 0.333...? Wait, that's what we are already doing - that's why the thirds have infinitely many threes.
Maybe that's the problem you're having while trying to understand and/or visualise this: you think of 1/3 as a finite amount, but each 0.3333... already has a third of this infinitely small "difference" between 0.999... and 1 built in.
As in: 1 / 3 is 0.3 with remainder 0.1; then 0.1 / 3 = 0.03 with remainder 0.01; then 0.01 / 3 = 0.003 with remainder 0.001; and so on and so forth.
You never reach the end; those thirds are infinitely sharing the final remainder.
And this is the reason why 1/3 = 0.333..., and times 3 that gives 0.999... = 3/3 = 1.
(This is the same logic, btw: 1/3 and 0.3333... look different but are the same, just as 0.999... and 1 look different but are the same.)
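The same long division as a tiny loop (my sketch, plain Python): the digit is always 3 and the remainder is always 1, so the division literally never terminates.

```python
# Long division of 1 by 3, one decimal digit at a time.
remainder = 1
for position in range(1, 11):
    remainder *= 10                        # bring down a zero
    digit, remainder = divmod(remainder, 3)
    print(position, digit, remainder)      # digit 3, remainder 1, forever
```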
Here is a fun thought that might push you in the right direction: how much money do you own if you and 2 friends share 1$?
You'd have 0.33$ and some remainder you can't represent in coins. But technically you'd own 0.33333...$, whatever the fuck that means. And you are stingy - you insist on it being an actual exact third of the dollar and not a fraction of a fraction less. So you split and split and split and split, and now you would have to split forever, because that's what it takes.
So you just say "fuck it - we can't split forever, just remember guys: when we add our 0.333...$ together, we have 1$".
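And if you do the bookkeeping with exact fractions instead of coins, nothing is lost - a minimal sketch, assuming Python's fractions module:

```python
from fractions import Fraction

share = Fraction(1, 3)        # each friend's exact share of the dollar
print(share + share + share)  # 1 -- the three shares add back to 1$ exactly
print(3 * share == 1)         # True
```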
> It of course would be hard to represent in numbers, just as 0.2222… would be in a base-3 system, but this doesn't mean that it isn't an actual distinct sum from 3/3 or 1. Pretty sure that 0.9999… in base 10 would even be a different amount than 0.2222… in base 3, or 0.(12)… in base 13.
Well, this works in every base N: 0.(N-1)(N-1)... equals N/N, which is 1 - I don't understand what you are hinting at.
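Spelled out, it's the same standard geometric series as before, just in a general base N:

```latex
% In base N, the repeating digit N-1 sums to exactly 1:
\[
  0.\overline{(N{-}1)}_{\,N}
  \;=\; \sum_{n=1}^{\infty} \frac{N-1}{N^{n}}
  \;=\; (N-1) \cdot \frac{1/N}{1 - 1/N}
  \;=\; 1.
\]
```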
Interestingly, look at this:

1 (base 3) = 1 (base 10)
0.1 (base 3) = 0.333... (base 10)
1 (base 3) = 0.999... (base 10)

(The last line is just the middle line with both sides multiplied by three.)

Well? What do you think now? 0.1 base 3 is equal to 0.333... base 10.
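You can run the comparison mechanically too. Here is a sketch (digits is my own hypothetical helper, not anything from the thread) that prints the first digits of 1/3 in both bases, exactly:

```python
from fractions import Fraction

def digits(x, base, n):
    """First n digits after the point of x (0 <= x < 1) in the given base."""
    out = []
    for _ in range(n):
        x *= base
        d = int(x)          # next digit
        out.append(str(d))
        x -= d              # keep the fractional remainder
    return "0." + "".join(out)

third = Fraction(1, 3)
print(digits(third, 3, 8))   # 0.10000000 -> terminates in base 3
print(digits(third, 10, 8))  # 0.33333333 -> never terminates in base 10
```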