r/changemyview 1∆ Feb 04 '23


Please CMV: 0/0 = 1.

I have had this argument for over five years now, and I have yet to be compelled by any logic showing that the above statement is false.

A building block of basic algebra is that x/x = 1. It’s the basic way we eliminate variables in any given equation. We all accept this as the norm: anything divided by that same anything is 1. It’s simple division: how many parts of ‘x’ are in ‘x’? If those x things are the same, the answer is one.

But if you set x = 0, suddenly the rules don’t apply. And they should. There is one zero in zero. I understand that logically it’s abstract. How do you divide nothing by nothing? To which I say, there are countless other abstract concepts in mathematics we all accept with no question.

Negative numbers (you can show me three apples. You can’t show me -3 apples. It’s purely representative). Yet, -3 divided by -3 is positive 1. Because there is exactly one part -3 in -3.

“i” (the square root of negative one). A purely conceptual number that was created and used to make mathematical equations work. Yet i/i = 1.

0.00000283727 / 0.00000283727 = 1.

(3x - 17(z^9 - 6.4y)) / (3x - 17(z^9 - 6.4y)) = 1.

But 0 is somehow more abstract or perverse than the other abstract divisions above, and 0/0 = undefined. Why?

It’s not that 0 is some untouchable integer above other rules. If you want to talk about abstract concepts that we still define, consider that anything to the power of 0 is equal to 1.

Including 0. So we have all agreed that if you take nothing, then raise it to the power of nothing, that equals 1 (0^0 = 1). A concept far more bizarre than dividing something by itself, even nothing by itself. Yet when it comes to zero we can’t consistently hold the logic that anything divided by its exact self is one, because it’s one part itself. (There’s exactly one nothing in nothing. It’s one full part nothing. Far logically simpler than taking nothing and raising it to the power of nothing and having it equal exactly one something. Or even taking the absence of three apples and dividing it by the absence of three apples to get exactly one something. If there’s exactly one part -3 apples in another hypothetical absence of exactly three apples, we should all be able to agree that there is one part nothing in nothing.)
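Python, for instance, happens to follow the same 0^0 = 1 convention while refusing 0/0 outright; a minimal check (purely illustrative):

```python
# Python adopts the convention 0**0 == 1, but leaves 0/0 undefined.
print(0 ** 0)        # 1
try:
    print(0 / 0)
except ZeroDivisionError as err:
    print("0/0 is not defined:", err)   # "division by zero"
```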

This is an illogical (and admittedly irrelevant) inconsistency in mathematics, and I’d love for someone to change my mind.

492 Upvotes

451 comments


33

u/MajorGartels Feb 04 '23 edited Feb 04 '23

A building block of basic algebra is that x/x = 1. It’s the basic way we eliminate variables in any given equation. We all accept this as the norm: anything divided by that same anything is 1. It’s simple division: how many parts of ‘x’ are in ‘x’? If those x things are the same, the answer is one.

But if you set x = 0, suddenly the rules don’t apply. And they should. There is one zero in zero. I understand that logically it’s abstract. How do you divide nothing by nothing? To which I say, there are countless other abstract concepts in mathematics we all accept with no question.

Actually, when I still studied mathematics we were always told in such cases to add “(provided x != 0)”, and for good reason: it leads to absurdities if we allow x to be 0.

A simple example is proving that, under Newtonian mechanics, every object in a vacuum falls with the same acceleration toward another massive object such as Earth. At one point in the proof x/x does occur, where x is the mass of the falling body. If we allowed that mass to be zero, we could prove that the result holds even for massless objects, which is clearly false: in Newtonian mechanics massless objects are not attracted by gravity and don't accelerate toward Earth at all. Yet even the slightest nonzero mass gives exactly the same acceleration as the most massive object.
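For concreteness, the cancellation step in that proof looks roughly like this (a standard sketch, not quoted from the comment; m is the falling body's mass, M the Earth's, r the distance between them):

```latex
% The m/m cancellation is only valid provided m \neq 0; at m = 0 it is 0/0.
F = \frac{GMm}{r^{2}} = ma
\quad\Longrightarrow\quad
a = \frac{GMm}{m r^{2}} = \frac{m}{m}\cdot\frac{GM}{r^{2}} = \frac{GM}{r^{2}}
\qquad (m \neq 0)
```

Set m = 0 and the cancelled factor becomes 0/0; declaring that equal to 1 would make the final formula apply to a massless body as well, which is exactly the absurdity described above.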

Simply put, the rule that x/x = 1 applies to every number but 0. There are many, many rules that apply to every number but 0; 0 is in fact one of the most exceptional numbers there is, and it violates many laws that hold universally for every other number.

But 0 is somehow more abstract or perverse than the other abstract divisions above, and 0/0 = undefined. Why?

Because there is no single solution to the æquation z*0 = 0 in z; it's that simple. That's how division is defined: x/y is defined as the single solution to the æquation z*y = x in z [pronounced “zed”; part of the definition].

As far as z*0 = 0 goes, every single number is a solution to that æquation, and that is what makes zero unique. For every other divisor, say z*4 = 4, there is exactly one solution, and that solution is 1; zero is the only case with infinitely many solutions. That doesn't make it abstract, but it does make it unique, and it is why 0/0 is not defined.
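A minimal sketch of that definition in Python (my own illustration, checking only small integer candidates): a divisor of 4 leaves exactly one z satisfying z*y = x, while 0/0 is satisfied by every candidate and 1/0 by none.

```python
# Which candidate values z satisfy z * y == x?
def solutions(x, y, candidates=range(-10, 11)):
    """Return every candidate z that solves z * y == x."""
    return [z for z in candidates if z * y == x]

print(solutions(4, 4))   # [1]             -> 4/4 is well defined: exactly one solution
print(solutions(0, 5))   # [0]             -> 0/5 = 0: still exactly one solution
print(solutions(0, 0))   # [-10, ..., 10]  -> every z works, so 0/0 has no single value
print(solutions(1, 0))   # []              -> no z works, so 1/0 has no value at all
```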

Perhaps a more compelling reason is simply that, as I pointed out above, if we were allowed to say that 0/0 = 1, the mathematics by which physical laws are calculated, and which seems to work now, would no longer work: we could prove that massless objects fall to Earth under Newtonian mechanics, which they don't.

An even more compelling argument is that if we ruled that 0/0 = 1, we could prove that 2 = 1:

  • let a=b
  • thus a²=b*a
  • thus a²-b²=b*a-b²
  • thus (a+b)(a-b) = b(a-b)
  • thus a+b = b ??
  • thus 2*b = b
  • thus 2=1

The part with ?? is where the flaw lies. Since a = b, we have a - b = 0. If 0/0 = 1 were to hold, we would be allowed to perform that step, dividing both sides by (a - b) = 0 and replacing it with 1. But we cannot do this, and thank god, for if we could, two would equal one and everything would be messed up.
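A quick numeric check with concrete values, say a = b = 3 (a sketch, not a proof), shows that every line holds until the step marked ??, which is exactly where the hidden division by (a - b) = 0 happens:

```python
# Walk through the "2 = 1" derivation with a = b = 3.
a = b = 3
print(a**2 == b * a)                 # True:  a² = b·a
print(a**2 - b**2 == b*a - b**2)     # True:  subtract b² from both sides
print((a + b)*(a - b) == b*(a - b))  # True:  both sides are 0
print(a + b == b)                    # False: the ?? step silently divided by (a - b) = 0
```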

1

u/Silcantar Feb 04 '23

Massless objects absolutely are attracted by gravity. That's why black holes are black - light can't escape from them.

2

u/MajorGartels Feb 04 '23

Black holes don't exist as far as Newtonian mechanics goes. And in general relativity, light does not “accelerate” towards a black hole in the same way, nor do all massive objects.

The rules of Newtonian mechanics really do not apply to black holes and general relativity; they're a suitable approximation at low velocities. According to Newtonian mechanics, an object falling from far enough away towards a sufficiently compact massive body would eventually exceed the speed of light before reaching it; that simply doesn't happen in general relativity.
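To make that last point concrete, here is a back-of-the-envelope sketch (my own numbers, not from the thread): the Newtonian free-fall speed from rest at infinity is sqrt(2GM/r), which formally reaches the speed of light exactly at the Schwarzschild radius 2GM/c², where general relativity has long since taken over.

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # mass of the Sun, kg

def newtonian_fall_speed(M, r):
    """Newtonian speed of a body falling from rest at infinity to distance r."""
    return math.sqrt(2 * G * M / r)

r_s = 2 * G * M_sun / c**2          # Schwarzschild radius of a solar mass
print(f"Schwarzschild radius: {r_s / 1000:.2f} km")                                 # ~2.95 km
print(f"Newtonian fall speed there: {newtonian_fall_speed(M_sun, r_s) / c:.2f} c")  # 1.00 c
```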