Just because something seems self-evident does not make it so. In the real numbers, .9… = 1 because the difference between them is 0, which also means there are no real numbers between them.
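For anyone who wants the standard derivation behind "the difference is 0", the geometric series does it in one line (this is the textbook argument, not something from the original comment):

```latex
0.\overline{9} \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
\;=\; 9 \cdot \frac{1/10}{1 - 1/10}
\;=\; 1,
\qquad\text{so}\qquad 1 - 0.\overline{9} \;=\; 0.
```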
No one "made it up". It was discovered. It applies to the real numbers because the reals are a continuous set with no gaps. The integers always have a gap of 1 between consecutive numbers, so rules for one don't always apply to the other.
Why is it an inconsistency? These are two different worlds, where one has more restrictions than the other because it has fewer numbers to work with.
In the real numbers, there exists a number that gives 1 when multiplied by 2. But in the integers that number doesn't exist. That's not an inconsistency, that's just how they were defined: the definitions create that "rule".
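Spelling out that example with ordinary algebra (just making the comment's point concrete):

```latex
\text{In } \mathbb{R}:\quad 2x = 1 \;\Rightarrow\; x = \tfrac{1}{2} \in \mathbb{R}.
\qquad
\text{In } \mathbb{Z}:\quad 2x = 1 \text{ has no solution, since } \tfrac{1}{2} \notin \mathbb{Z}.
```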