Seeking elegant proof why 0 divided by 0 does not equal 1























Several years ago, I was bored, and so for amusement I wrote out a proof that $\dfrac00$ does not equal $1$. I began by assuming that $\dfrac00$ does equal $1$ and was eventually able to deduce from my assumption (which, as we know, is false) that $0=1$. As this is clearly false, and if all the steps in my proof were logically valid, the conclusion is that my only assumption (that $\dfrac00=1$) must be false. Unfortunately, I can no longer recall the steps I used to arrive at the contradiction. If anyone could help me out, I would appreciate it.






























  • 9




    Hint: $a\cdot0=0$ for all $a$.
    – Lucian
    Nov 17 '14 at 9:08








  • 12




    Unfortunately, it isn't possible to form a proof in the way you describe. $\frac{0}{0}$ is simply undefined, and so it is impossible to perform any manipulations on the expression you have given.
    – Benjamin Alderson
    Nov 17 '14 at 9:09








  • 9




    @BenjaminAlderson, your objection is not valid. The reason $0/0$ is undefined is that it is impossible to define it to be equal to any real number while obeying the familiar algebraic properties of the reals. It is perfectly reasonable to contemplate particular values for $0/0$ and obtain a contradiction. This is how we know it is impossible to define it in any reasonable way. To say "it's simply undefined, so this is invalid" is not the way mathematics is done. OP is interested in why it can't be defined, not in blindly accepting authority.
    – Ittay Weiss
    Nov 17 '14 at 9:32






  • 6




    @Ittay: While you're right, Benjamin is not wrong. $0/0$ being undefined is simply a matter of definition, and the question the OP asked doesn't really make sense. What you argue is that the OP really should be asking a different question: "what motivated mathematicians to define division in a way so as to leave $0/0$ undefined?".
    – Hurkyl
    Nov 17 '14 at 10:00








  • 4




    I think the title of the question clarifies OP's intentions sufficiently well, though the question certainly could have been worded with more care. However, saying it's undefined because it's undefined is a poor argument, if it's an argument at all.
    – Ittay Weiss
    Nov 17 '14 at 10:06






















Tags: arithmetic, definition














edited Nov 17 '14 at 14:53









MJD











asked Nov 17 '14 at 9:05









Doug Kennedy



















11 Answers






























If $0/0$ were equal to $1$, then $1=\frac{0}{0}=\frac{0+0}{0}=\frac{0}{0}+\frac{0}{0}=1+1=2$.
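
For a concrete feel of this, here is a minimal Python sketch (my addition, not part of the answer): division is patched to return $1$ for $0/0$, and the rule $\frac{a+b}{c}=\frac{a}{c}+\frac{b}{c}$ used above then fails immediately.

    from fractions import Fraction

    def div(a, b):
        # Hypothetical division that declares 0/0 = 1 -- the assumption under test.
        if a == 0 and b == 0:
            return Fraction(1)
        return Fraction(a, b)

    # Ordinary fractions satisfy (a + b)/c == a/c + b/c, but with div(0, 0) == 1
    # that law forces 1 == 2, reproducing the contradiction above:
    lhs = div(0 + 0, 0)          # 1, by the assumed value of 0/0
    rhs = div(0, 0) + div(0, 0)  # 1 + 1 == 2
    print(lhs, rhs, lhs == rhs)  # 1 2 False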






answered Nov 17 '14 at 9:40 – Ittay Weiss





















  • Is there a similar (algebraic) reason why $\frac{0}{0}$ doesn't equal $0$?
    – vuur
    Nov 17 '14 at 15:05






  • 2




    @vuur Sure, $1=0+1=\tfrac00+\tfrac11=\tfrac{0+0}{0}=0.$
    – Hakim
    Nov 17 '14 at 16:31












  • @Hakim How do you go from $\frac00+\frac11$ to $\frac{0+0}0$?
    – Jaycob Coleman
    Nov 17 '14 at 21:41












  • @JaycobColeman $\tfrac{a}{b}+\tfrac{c}{d}=\tfrac{ad+bc}{bd}$.
    – Hakim
    Nov 17 '14 at 21:49






  • 1




    I really think you need to work with a definition of division on whatever set of numbers you choose (naturals, integers, rationals, or reals). Then it should be immediately apparent that you can neither prove nor disprove that $0/0=1$.
    – Dan Christensen
    Nov 18 '14 at 5:18































In lay terms, evaluating $0/0$ is asking "what number, when multiplied by zero, gives zero?" Since the answer to this is "any number", it cannot be defined as a specific value.
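
To make "any number" concrete, a tiny brute-force Python check (my own sketch, not the answer's) can search a sample range for numbers $z$ with $z\cdot 0=0$:

    # Every candidate z satisfies z * 0 == 0, so no single value is singled out.
    candidates = range(-5, 6)
    print([z for z in candidates if z * 0 == 0])  # all of -5..5 qualify
    print([z for z in candidates if z * 3 == 6])  # [2] -- by contrast, 6/3 is unique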






answered Nov 17 '14 at 16:22 – Joe Lee-Moyet





















  • but perhaps there are compelling reasons then to choose one particular solution as the value of $0/0$. Your argument does not exclude $0/0=1$.
    – Ittay Weiss
    Nov 17 '14 at 20:04










  • @IttayWeiss if you reduce an expression, say $f(x)$, to $0/0$, then that does not tell you that $f(x) = 1$, nor that it is not $1$. $0/0$ is undefined and could have any value (or even a different value in each place it appears) in your formula, so it gives you no information about the value of $f(x)$.
    – Joe Lee-Moyet
    Nov 17 '14 at 23:06










  • your answer only shows that treating $0/0$ as the solution to $x\cdot 0 =0$ does not determine any unique value for $0/0$. That is correct, but does not answer OP's question. OP asked what is inconsistent with defining $0/0=1$. You did not answer that question.
    – Ittay Weiss
    Nov 17 '14 at 23:13










  • @IttayWeiss You could define $0/0$ to be $1$, but you then lose the very useful property that division is the inverse of multiplication. I believe that also loses you the equivalence $\frac{a+b}{c}\equiv\frac{a}{c}+\frac{b}{c}$ that you rely on in your answer.
    – Joe Lee-Moyet
    Nov 17 '14 at 23:29












  • I'm not trying to convince you that $0/0=1$ is a good idea. I know why it does not work. I'm just pointing out that your answer does not address OP's question.
    – Ittay Weiss
    Nov 18 '14 at 1:04































The accepted definition of division on the natural numbers is something like:

For all natural numbers $x, y, z$ where $y\ne 0$, we have $x/y = z$ iff $x=y\times z$. (This also works for the integers, rational numbers, and reals.)

Using this definition, you can neither prove nor disprove that $0/0=1$. You wouldn't be able to draw any inferences from your assumption that $0/0=1$. If $y$ (the divisor) is $0$, this definition tells you nothing.

Suppose we did not have the restriction $y\ne 0$ and that, instead, we simply defined $x/y = z$ iff $x=y\times z$ for any natural numbers $x, y$ and $z$.

Then consider two cases: $x=0$ and $x\ne 0$.

If $x=0$, then the definition would be inconsistent with our definition of the natural numbers:

$0/0$ could be $0$ because $0\times 0 = 0$

$0/0$ could be $1$ because $0\times 1 = 0$

$0/0$ could be $256$ because $0\times 256 = 0$

All natural numbers would have to be equal (a contradiction). This alone would be enough to reject our restriction-free alternative definition. It is inconsistent.

If $x\ne 0$, then no natural number would work for $x/0$: for any natural number $z$, we could not have $x=0\times z$, since zero times any number is zero.

Either way, the alternative definition simply doesn't work.
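
As a concrete illustration (my own sketch, not part of the answer; `divide` and `universe` are made-up names), the restriction-free definition can be mimicked in Python by searching a finite sample of the naturals for the unique $z$ with $x = y\times z$; the search fails in exactly the two ways described above.

    def divide(x, y, universe=range(1000)):
        # Return the unique z in `universe` with x == y * z, per the definition.
        zs = [z for z in universe if x == y * z]
        if len(zs) != 1:
            raise ValueError(f"{x}/{y} is undefined: {len(zs)} candidate(s) for z")
        return zs[0]

    print(divide(6, 3))  # 2
    # divide(0, 0)  ->  ValueError: 0/0 is undefined: 1000 candidate(s) for z
    # divide(5, 0)  ->  ValueError: 5/0 is undefined: 0 candidate(s) for z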






answered Nov 17 '14 at 15:34 – Dan Christensen (edited Nov 18 '14 at 5:17)











































Let's view the problem within ring theory, i.e., in an algebraic structure with addition and multiplication satisfying the familiar axioms. Usually, when we write $0$ in this context, it means an additive neutral element, i.e. $x+0=0+x=x,\ \forall x$. In any ring, it follows from the distributive laws that for any $x$ we have: $$x\cdot 0 = x\cdot(0+0) = x\cdot 0 + x\cdot 0\implies x\cdot 0 = 0.$$ Now, when we write $\frac xy = z$, in this context we mean that there is a unique $z$ such that $x = zy$. If there were more than one $z$ meeting this condition, $\frac xy$ wouldn't be well defined.

Let us assume that we have a ring $R$ with additive neutral element $0$ such that division by $0$ is well defined for some $x$, that is, there is a unique $y\in R$ such that $y\cdot 0 = x$. First we note that $x = y\cdot 0 = 0$, so only $\frac 00$ might be defined. Now, assuming $\frac 00 = y$ means, as we said before, that $y$ is the unique element in $R$ such that $y\cdot 0 = 0$. But since every $y\in R$ satisfies that condition, we conclude that $R$ has exactly one element, namely $R=\{0\}$. We call this ring the zero ring.

TLDR: Division by $0$ is possible, but only in the trivial (zero) ring, and nowhere else.
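
A brute-force illustration of the last step (my own addition): in the modular rings $\Bbb Z_n$, every element $y$ satisfies $y\cdot 0 = 0$, so the condition defining $\frac00$ picks out a unique $y$ only when $n=1$, i.e. in the zero ring.

    # Count the y in Z_n with y * 0 == 0 (mod n); uniqueness requires n == 1.
    for n in range(1, 6):
        solutions = [y for y in range(n) if (y * 0) % n == 0]
        status = "well defined" if len(solutions) == 1 else "not well defined"
        print(f"Z_{n}: {len(solutions)} solution(s) -> 0/0 {status}")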






answered Aug 8 '15 at 14:10 – Ennar





















    • Why has this been downvoted?
      – Karl
      Feb 20 '16 at 16:47










    • I agree it is important to start by assuming a structure such as a ring and deducing the fact from the given axioms. I'm honestly not sure why your answer has been downvoted.
      – Karl
      Feb 20 '16 at 16:55































Yes, there is an algebraic reason. In a field there is no reasonable way to divide by zero, because one cannot have both the identities $(a/b)\times b = a$ and $c\times 0 = 0$ hold simultaneously if $b$ is allowed to be zero.

Note that the cancellation law depends on the absence of zero divisors:

Proposition (Integers have no zero divisors). Let $a$ and $b$ be integers such that $ab=0$. Then either $a=0$ or $b=0$ (or both).

Corollary (Cancellation law for integers). If $a, b, c$ are integers such that $ac=bc$ and $c$ is non-zero, then $a=b$.

EDIT. On the other hand, it is possible to construct an algebraic structure with $0/0=1$ (similar to a ring, but with additional axioms, perhaps $0/0=1$ itself). But in that case we must keep in mind that we are no longer working with the rationals $\Bbb Q$ or the reals $\Bbb R$, since those are fields, and so their theorems need not remain true.
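
To see why the "no zero divisors" hypothesis in the corollary matters, here is a small Python check (my own addition, not part of the answer): in $\Bbb Z_6$, which does have zero divisors ($2\times 3=0$), cancellation fails.

    # In Z_6: 4 * 2 == 8 == 2 (mod 6) and 1 * 2 == 2 (mod 6), yet 4 != 1,
    # so ac = bc does not imply a = b once zero divisors are around.
    n, a, b, c = 6, 4, 1, 2
    print((a * c) % n == (b * c) % n)  # True  -- ac = bc in Z_6
    print(a == b)                      # False -- cancellation fails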





























    • $0/0=1$ is consistent with the equalities you write. Moreover, it is incorrect to think of L'Hopital's rule as "dividing by a quantity that approaches zero". In fact, there is no such thing as "a quantity that approaches zero": limits are numbers, not approximations or anything vague such as quantities approaching anything.
      – Ittay Weiss
      Nov 17 '14 at 18:53










    • You're right, L'Hospital wasn't a good example. On the other hand, if we define $c:=a/b$ and $b:=0$, we can have $(a/b)\times0=0$ or $(a/b)\times0=a$, but not both simultaneously. I don't say that $0/0=1$ is impossible; we can construct an algebraic structure with $0/0=1$. I'm saying that in that case we're not working with a "field", and so we are not working with the rationals $\mathbb Q$ or the reals $\mathbb R$.
      – Cristhian Gz
      Nov 17 '14 at 19:33












    • that is true only if $a\ne 0$. But $0/0=1$ is not inconsistent with the equalities you had written.
      – Ittay Weiss
      Nov 17 '14 at 20:01










    • as for your later edit, $0/0=1$ is not the case in any ring.
      – Ittay Weiss
      Nov 17 '14 at 20:02










    • OK. I meant "the ring axioms plus additional axioms, maybe $0/0=1$ as an axiom".
      – Cristhian Gz
      Nov 17 '14 at 20:13

































Assume $\frac{0}{0} = 1$ and define $x := \frac{0}{0^2}$. Then
$$
0 = 0\cdot x = 0 \cdot \frac{0}{0^2} = \frac{0}{0} = 1.
$$















































Here is one of the famous "fake" proofs that $0=1$.

Let $a=b=1$ (but here you can choose another number if you want).

$a^2=ab$

$a^2-b^2=ab-b^2=b(a-b)$

$(a-b)(a+b)=b(a-b)$

Then you divide each side by $(a-b)$ (here is the clincher, of course...) and you get

$a+b=b$, hence $a=0$ (but remember that $a=1$...).

Of course, in order to "get" this result, you have to divide by $0$, which has NO meaning whatsoever... And you can get whatever result you want, which will be meaningless, since you did something meaningless.
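
A quick symbolic check of where the proof breaks (my own sketch, using sympy): with $a=b=1$ both sides of $(a-b)(a+b)=b(a-b)$ are $0$, and "cancelling" $(a-b)$ is precisely a division by zero.

    from sympy import symbols, cancel

    a, b = symbols('a b')
    lhs = (a - b) * (a + b)
    rhs = b * (a - b)

    # With a = b = 1 the equation holds only because both sides are 0:
    print(lhs.subs({a: 1, b: 1}), rhs.subs({a: 1, b: 1}))  # 0 0

    # Formal cancellation of (a - b) leaves a + b = b, i.e. a = 0 -- a step
    # valid only when a - b != 0, which is exactly what a = b violates:
    print(cancel(lhs / (a - b)), cancel(rhs / (a - b)))    # a + b   b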



























      • OP was asking why $0/0$ can't be defined. The whole question is "how do you know you can't meaningfully define $0/0$?"
        – Ittay Weiss
        Nov 17 '14 at 9:41










      • @IttayWeiss Well, one can deduce easily from the aforementioned demonstration that dividing by $0$ is not possible and not defined, hence $\frac{0}{0}$ is not defined as well. But I agree that this is not using a possible $\frac{0}{0}=1$ assumption...
        – Martigan
        Nov 17 '14 at 9:49






      • 1




        exactly. The argument for why $a/0$ for $a\ne 0$ is not definable is different from the argument for why $0/0$ is not definable.
        – Ittay Weiss
        Nov 17 '14 at 9:58































Here are two algebraic proofs (plus some calculus).
\begin{align*}
\frac00 &= 0^0 \\
\log\left(\frac00\right) &= \log\left(0^0\right) \\
\log\left(\frac00\right) &= \log(0)\cdot 0
\end{align*}
In calculus, the limit of $\log(x)$ is equal to the limit of $-1/x$ as they approach $0^+$ (both tend to $-\infty$), so
\begin{align*}
\log\left(\frac00\right) &= -\left|\frac10\right|\cdot 0 \\
\log\left(\frac00\right) &= \frac00
\end{align*}

From $\log\left(\frac00\right)=\frac00$: if $\frac00=1$, then $0=1$.

\begin{align*}
\frac00 &= x \\
0\cdot\frac00 &= x\cdot 0 \\
\frac{0\cdot 0}{0} &= x\cdot 0 \\
\frac00 &= \frac00\cdot 0.
\end{align*}

If $\frac00=1$, then $1=0$.
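
As a hedged numeric aside (my addition, not the answer's): the sense in which the two limits are "equal" is that both $\log(x)$ and $-1/x$ diverge to $-\infty$ as $x\to 0^+$, though at very different rates.

    import math

    for x in (1e-1, 1e-3, 1e-6):
        print(f"x={x:g}  log(x)={math.log(x):.3f}  -1/x={-1 / x:.3f}")
    # Both columns head to -infinity as x -> 0+, but -1/x much faster.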















































The other answers are quite helpful. I only want to add that $\frac00$ is just not assigned any value: it is one of the several indeterminate forms of mathematics; see https://en.wikipedia.org/wiki/Indeterminate_form#List_of_indeterminate_forms
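
As a side note (my addition, not the answer's), IEEE 754 floating point makes this concrete: the indeterminate form $0/0$ evaluates to NaN ("not a number") rather than to any numeric value, while Python's exact division refuses it outright.

    import numpy as np

    # IEEE 754 floats: 0.0 / 0.0 is NaN, which compares unequal even to itself.
    with np.errstate(invalid='ignore'):
        r = np.float64(0.0) / np.float64(0.0)
    print(r, r == r)  # nan False

    # Python's exact division simply raises instead:
    try:
        0 / 0
    except ZeroDivisionError as e:
        print(e)  # division by zero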















































Here is my theory: if anything times zero is zero, then zero divided by anything is zero, and zero divided by zero is anything. When I say "anything" I mean any number that exists, even an imaginary number. 0*anything=0, so that should mean 0/anything=0, and 0/0=anything. So if I say 0/0=0, that would be true, and if I say 0/0=1, that would be true as well. 0/0=infinity would also be true, and 0/0=(-infinity) would be true. Think of it like this: nine divided by three is three, because three goes into nine three times. But how many times must zero be added to get to zero? If zero is added once, it'll be zero. If zero is added twice, it'll be zero. If it's added three times, four, five, six, seven, as many as infinity, a negative number, a decimal/fraction, a whole number, an integer, a rational number, an irrational number, a real number, an imaginary number, ALL NUMBERS, it'll still be zero. So basically zero divided by zero can be anything, in my theory.















































Here is an argument for the kindergarten notion of division, i.e., to compute $a/b$ for natural numbers $a,b$, pretend you take $a$ cookies and divide them among $b$ kids, and ask yourself how many cookies each child will get. Theorem: $a/0$ does not have a unique value, and therefore it does not equal anything. Proof: Take $a$ cookies and divide them among $0$ children. It is vacuously true that each child gets $5$ cookies. It is also vacuously true that each child gets $m$ cookies, for any value of $m$. QED.
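
The vacuous-truth step can even be checked mechanically; here is a throwaway Python illustration (my own, with the made-up helper `each_child_gets`):

    children = []  # dividing the cookies among 0 children

    def each_child_gets(m):
        # A universally quantified claim over an empty collection is vacuously True.
        return all(share == m for share in children)

    print(each_child_gets(5), each_child_gets(42))  # True True -- true for every m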




























              protected by Joffan Nov 10 '17 at 4:08



              Thank you for your interest in this question.
              Because it has attracted low-quality or spam answers that had to be removed, posting an answer now requires 10 reputation on this site (the association bonus does not count).



              Would you like to answer one of these unanswered questions instead?














              11 Answers
              11






              active

              oldest

              votes








              11 Answers
              11






              active

              oldest

              votes









              active

              oldest

              votes






              active

              oldest

              votes








              up vote
              21
              down vote













              If $0/0$ were equal to $1$, then $1=frac{0}{0}=frac{0+0}{0}=frac{0}{0}+frac{0}{0}=1+1=2$.






              share|cite|improve this answer





















              • Is there a similar (algebraic) reason why $frac{0}{0}$ doesn't equal $0$?
                – vuur
                Nov 17 '14 at 15:05






              • 2




                @vuur Sure, $1=0+1=tfrac00+tfrac11=tfrac{0+0}{0}=0.$
                – Hakim
                Nov 17 '14 at 16:31












              • @Hakim How do you go from $frac00+frac11$ to $frac{0+0}0$?
                – Jaycob Coleman
                Nov 17 '14 at 21:41












              • @JaycobColeman $tfrac{a}{b}+tfrac{c}{d}=tfrac{ad+bc}{bd}$.
                – Hakim
                Nov 17 '14 at 21:49






              • 1




                I really think you need to work with a definition of division on the whatever set of numbers you choose (natural, integer, rationals or reals). Then it should be immediately apparent that you can neither prove nor disprove that $0/0=1$.
                – Dan Christensen
                Nov 18 '14 at 5:18















              up vote
              21
              down vote













              If $0/0$ were equal to $1$, then $1=frac{0}{0}=frac{0+0}{0}=frac{0}{0}+frac{0}{0}=1+1=2$.






              share|cite|improve this answer





















              • Is there a similar (algebraic) reason why $frac{0}{0}$ doesn't equal $0$?
                – vuur
                Nov 17 '14 at 15:05






              • 2




                @vuur Sure, $1=0+1=tfrac00+tfrac11=tfrac{0+0}{0}=0.$
                – Hakim
                Nov 17 '14 at 16:31












              • @Hakim How do you go from $frac00+frac11$ to $frac{0+0}0$?
                – Jaycob Coleman
                Nov 17 '14 at 21:41












              • @JaycobColeman $tfrac{a}{b}+tfrac{c}{d}=tfrac{ad+bc}{bd}$.
                – Hakim
                Nov 17 '14 at 21:49






              • 1




                I really think you need to work with a definition of division on the whatever set of numbers you choose (natural, integer, rationals or reals). Then it should be immediately apparent that you can neither prove nor disprove that $0/0=1$.
                – Dan Christensen
                Nov 18 '14 at 5:18













              up vote
              21
              down vote










              up vote
              21
              down vote









              If $0/0$ were equal to $1$, then $1=frac{0}{0}=frac{0+0}{0}=frac{0}{0}+frac{0}{0}=1+1=2$.






              share|cite|improve this answer












              If $0/0$ were equal to $1$, then $1=frac{0}{0}=frac{0+0}{0}=frac{0}{0}+frac{0}{0}=1+1=2$.







              share|cite|improve this answer












              share|cite|improve this answer



              share|cite|improve this answer










              answered Nov 17 '14 at 9:40









              Ittay Weiss

              62.9k6101182




              62.9k6101182












              • Is there a similar (algebraic) reason why $frac{0}{0}$ doesn't equal $0$?
                – vuur
                Nov 17 '14 at 15:05






              • 2




                @vuur Sure, $1=0+1=tfrac00+tfrac11=tfrac{0+0}{0}=0.$
                – Hakim
                Nov 17 '14 at 16:31












              • @Hakim How do you go from $frac00+frac11$ to $frac{0+0}0$?
                – Jaycob Coleman
                Nov 17 '14 at 21:41












              • @JaycobColeman $tfrac{a}{b}+tfrac{c}{d}=tfrac{ad+bc}{bd}$.
                – Hakim
                Nov 17 '14 at 21:49






              • 1




                I really think you need to work with a definition of division on the whatever set of numbers you choose (natural, integer, rationals or reals). Then it should be immediately apparent that you can neither prove nor disprove that $0/0=1$.
                – Dan Christensen
                Nov 18 '14 at 5:18


















              • Is there a similar (algebraic) reason why $frac{0}{0}$ doesn't equal $0$?
                – vuur
                Nov 17 '14 at 15:05






              • 2




                @vuur Sure, $1=0+1=tfrac00+tfrac11=tfrac{0+0}{0}=0.$
                – Hakim
                Nov 17 '14 at 16:31












              • @Hakim How do you go from $frac00+frac11$ to $frac{0+0}0$?
                – Jaycob Coleman
                Nov 17 '14 at 21:41












              • @JaycobColeman $tfrac{a}{b}+tfrac{c}{d}=tfrac{ad+bc}{bd}$.
                – Hakim
                Nov 17 '14 at 21:49






              • 1




                I really think you need to work with a definition of division on the whatever set of numbers you choose (natural, integer, rationals or reals). Then it should be immediately apparent that you can neither prove nor disprove that $0/0=1$.
                – Dan Christensen
                Nov 18 '14 at 5:18
















              Is there a similar (algebraic) reason why $frac{0}{0}$ doesn't equal $0$?
              – vuur
              Nov 17 '14 at 15:05




              Is there a similar (algebraic) reason why $frac{0}{0}$ doesn't equal $0$?
              – vuur
              Nov 17 '14 at 15:05




              2




              2




              @vuur Sure, $1=0+1=tfrac00+tfrac11=tfrac{0+0}{0}=0.$
              – Hakim
              Nov 17 '14 at 16:31






              @vuur Sure, $1=0+1=tfrac00+tfrac11=tfrac{0+0}{0}=0.$
              – Hakim
              Nov 17 '14 at 16:31














              @Hakim How do you go from $frac00+frac11$ to $frac{0+0}0$?
              – Jaycob Coleman
              Nov 17 '14 at 21:41






              @Hakim How do you go from $frac00+frac11$ to $frac{0+0}0$?
              – Jaycob Coleman
              Nov 17 '14 at 21:41














              @JaycobColeman $tfrac{a}{b}+tfrac{c}{d}=tfrac{ad+bc}{bd}$.
              – Hakim
              Nov 17 '14 at 21:49




              @JaycobColeman $tfrac{a}{b}+tfrac{c}{d}=tfrac{ad+bc}{bd}$.
              – Hakim
              Nov 17 '14 at 21:49




              1




              1




              I really think you need to work with a definition of division on the whatever set of numbers you choose (natural, integer, rationals or reals). Then it should be immediately apparent that you can neither prove nor disprove that $0/0=1$.
              – Dan Christensen
              Nov 18 '14 at 5:18




              I really think you need to work with a definition of division on the whatever set of numbers you choose (natural, integer, rationals or reals). Then it should be immediately apparent that you can neither prove nor disprove that $0/0=1$.
              – Dan Christensen
              Nov 18 '14 at 5:18










              up vote
              6
              down vote













              In lay terms, evaluating 0/0 is asking "what number, when multiplied by zero, gives zero". Since the answer to this is "any number", it cannot be defined as a specific value.






              share|cite|improve this answer





















              • but perhaps there are compelling reasons then to choose one particular solution as the value of $0/0$. Your argument does not exclude $0/0=1$.
                – Ittay Weiss
                Nov 17 '14 at 20:04










              • @IttayWeiss if you reduce an expression, say f(x) to 0/0 then that does not tell you that f(x) = 1 nor that it is not 1. 0/0 is undefined and could have any value (or even a different value in each place it appears) in your formula, so it gives you no information about the value of f(x).
                – Joe Lee-Moyet
                Nov 17 '14 at 23:06










              • your answer only shows that treating $0/0$ as the solution to $xcdot 0 =0$ does not determine any unique value for $0/0$. That is correct, but does not answer OP's question. OP asked what is inconsistent with defining $0/0=1$. You did not answer that question.
                – Ittay Weiss
                Nov 17 '14 at 23:13










              • @IttayWeiss You could define 0/0 to be 1, but you then lose the very useful property that division is the inverse of multiplication. I believe that also loses you the equivalence $frac{a+b}{c}equivfrac{a}{c}+frac{b}{c}$ that you rely on in your answer.
                – Joe Lee-Moyet
                Nov 17 '14 at 23:29












              • I'm not trying to convince you that $0/0=1$ is a good idea. I know why it does not work. I'm just pointing out that your answer does not address OP's question.
                – Ittay Weiss
                Nov 18 '14 at 1:04















              up vote
              6
              down vote













              In lay terms, evaluating 0/0 is asking "what number, when multiplied by zero, gives zero". Since the answer to this is "any number", it cannot be defined as a specific value.






              share|cite|improve this answer





















              • but perhaps there are compelling reasons then to choose one particular solution as the value of $0/0$. Your argument does not exclude $0/0=1$.
                – Ittay Weiss
                Nov 17 '14 at 20:04










              • @IttayWeiss if you reduce an expression, say f(x) to 0/0 then that does not tell you that f(x) = 1 nor that it is not 1. 0/0 is undefined and could have any value (or even a different value in each place it appears) in your formula, so it gives you no information about the value of f(x).
                – Joe Lee-Moyet
                Nov 17 '14 at 23:06










              • your answer only shows that treating $0/0$ as the solution to $xcdot 0 =0$ does not determine any unique value for $0/0$. That is correct, but does not answer OP's question. OP asked what is inconsistent with defining $0/0=1$. You did not answer that question.
                – Ittay Weiss
                Nov 17 '14 at 23:13










              • @IttayWeiss You could define 0/0 to be 1, but you then lose the very useful property that division is the inverse of multiplication. I believe that also loses you the equivalence $frac{a+b}{c}equivfrac{a}{c}+frac{b}{c}$ that you rely on in your answer.
                – Joe Lee-Moyet
                Nov 17 '14 at 23:29












              • I'm not trying to convince you that $0/0=1$ is a good idea. I know why it does not work. I'm just pointing out that your answer does not address OP's question.
                – Ittay Weiss
                Nov 18 '14 at 1:04













              up vote
              6
              down vote










              up vote
              6
              down vote









              In lay terms, evaluating 0/0 is asking "what number, when multiplied by zero, gives zero". Since the answer to this is "any number", it cannot be defined as a specific value.






              share|cite|improve this answer












              In lay terms, evaluating 0/0 is asking "what number, when multiplied by zero, gives zero". Since the answer to this is "any number", it cannot be defined as a specific value.







              share|cite|improve this answer












              share|cite|improve this answer



              share|cite|improve this answer










              answered Nov 17 '14 at 16:22









              Joe Lee-Moyet

              1613




              1613












              • but perhaps there are compelling reasons then to choose one particular solution as the value of $0/0$. Your argument does not exclude $0/0=1$.
                – Ittay Weiss
                Nov 17 '14 at 20:04










              • @IttayWeiss if you reduce an expression, say f(x) to 0/0 then that does not tell you that f(x) = 1 nor that it is not 1. 0/0 is undefined and could have any value (or even a different value in each place it appears) in your formula, so it gives you no information about the value of f(x).
                – Joe Lee-Moyet
                Nov 17 '14 at 23:06










              • your answer only shows that treating $0/0$ as the solution to $xcdot 0 =0$ does not determine any unique value for $0/0$. That is correct, but does not answer OP's question. OP asked what is inconsistent with defining $0/0=1$. You did not answer that question.
                – Ittay Weiss
                Nov 17 '14 at 23:13










              • @IttayWeiss You could define 0/0 to be 1, but you then lose the very useful property that division is the inverse of multiplication. I believe that also loses you the equivalence $frac{a+b}{c}equivfrac{a}{c}+frac{b}{c}$ that you rely on in your answer.
                – Joe Lee-Moyet
                Nov 17 '14 at 23:29












              • I'm not trying to convince you that $0/0=1$ is a good idea. I know why it does not work. I'm just pointing out that your answer does not address OP's question.
                – Ittay Weiss
                Nov 18 '14 at 1:04


















              • but perhaps there are compelling reasons then to choose one particular solution as the value of $0/0$. Your argument does not exclude $0/0=1$.
                – Ittay Weiss
                Nov 17 '14 at 20:04










              • @IttayWeiss if you reduce an expression, say f(x) to 0/0 then that does not tell you that f(x) = 1 nor that it is not 1. 0/0 is undefined and could have any value (or even a different value in each place it appears) in your formula, so it gives you no information about the value of f(x).
                – Joe Lee-Moyet
                Nov 17 '14 at 23:06










              • your answer only shows that treating $0/0$ as the solution to $xcdot 0 =0$ does not determine any unique value for $0/0$. That is correct, but does not answer OP's question. OP asked what is inconsistent with defining $0/0=1$. You did not answer that question.
                – Ittay Weiss
                Nov 17 '14 at 23:13










              • @IttayWeiss You could define 0/0 to be 1, but you then lose the very useful property that division is the inverse of multiplication. I believe that also loses you the equivalence $frac{a+b}{c}equivfrac{a}{c}+frac{b}{c}$ that you rely on in your answer.
                – Joe Lee-Moyet
                Nov 17 '14 at 23:29












              • I'm not trying to convince you that $0/0=1$ is a good idea. I know why it does not work. I'm just pointing out that your answer does not address OP's question.
                – Ittay Weiss
                Nov 18 '14 at 1:04
















              but perhaps there are compelling reasons then to choose one particular solution as the value of $0/0$. Your argument does not exclude $0/0=1$.
              – Ittay Weiss
              Nov 17 '14 at 20:04




              but perhaps there are compelling reasons then to choose one particular solution as the value of $0/0$. Your argument does not exclude $0/0=1$.
              – Ittay Weiss
              Nov 17 '14 at 20:04












              @IttayWeiss if you reduce an expression, say f(x) to 0/0 then that does not tell you that f(x) = 1 nor that it is not 1. 0/0 is undefined and could have any value (or even a different value in each place it appears) in your formula, so it gives you no information about the value of f(x).
              – Joe Lee-Moyet
              Nov 17 '14 at 23:06




              @IttayWeiss if you reduce an expression, say f(x) to 0/0 then that does not tell you that f(x) = 1 nor that it is not 1. 0/0 is undefined and could have any value (or even a different value in each place it appears) in your formula, so it gives you no information about the value of f(x).
              – Joe Lee-Moyet
              Nov 17 '14 at 23:06












              your answer only shows that treating $0/0$ as the solution to $xcdot 0 =0$ does not determine any unique value for $0/0$. That is correct, but does not answer OP's question. OP asked what is inconsistent with defining $0/0=1$. You did not answer that question.
              – Ittay Weiss
              Nov 17 '14 at 23:13




              your answer only shows that treating $0/0$ as the solution to $xcdot 0 =0$ does not determine any unique value for $0/0$. That is correct, but does not answer OP's question. OP asked what is inconsistent with defining $0/0=1$. You did not answer that question.
              – Ittay Weiss
              Nov 17 '14 at 23:13












              @IttayWeiss You could define 0/0 to be 1, but you then lose the very useful property that division is the inverse of multiplication. I believe that also loses you the equivalence $frac{a+b}{c}equivfrac{a}{c}+frac{b}{c}$ that you rely on in your answer.
              – Joe Lee-Moyet
              Nov 17 '14 at 23:29






              @IttayWeiss You could define 0/0 to be 1, but you then lose the very useful property that division is the inverse of multiplication. I believe that also loses you the equivalence $frac{a+b}{c}equivfrac{a}{c}+frac{b}{c}$ that you rely on in your answer.
              – Joe Lee-Moyet
              Nov 17 '14 at 23:29














              I'm not trying to convince you that $0/0=1$ is a good idea. I know why it does not work. I'm just pointing out that your answer does not address OP's question.
              – Ittay Weiss
              Nov 18 '14 at 1:04




              I'm not trying to convince you that $0/0=1$ is a good idea. I know why it does not work. I'm just pointing out that your answer does not address OP's question.
              – Ittay Weiss
              Nov 18 '14 at 1:04










              up vote
              6
              down vote













              The accepted definition of division on the natural numbers is something like:




              For all natural numbers $x, y, z$ where $yne 0$, we have $x/y = z$ iff $x=ytimes z$. (Also works for the integers, rational numbers and reals.)




              Using this definition, you can neither prove nor disprove that $0/0=1$. You wouldn't be able to draw any inferences from your assumption that $0/0=1$. If $y$ (the divisor) is $0$, this definition tells you nothing.





              Suppose we did not have the restriction $yne 0$ and that, instead, we simply defined $x/y = z$ iff $x=ytimes z$ for any natural numbers $x, y$ and $z$.



              Then, consider two cases: $x=0$ and $xne 0$.



              If $x=0$, then the definition would be inconsistent with our definition of the natural numbers.



              $0/0$ could be $0$ because $0times 0 =0$



              $0/0$ could be $1$ because $0times 1 = 0$



              $0/0$ could be $256$ because $0times 256 = 0$



              All natural numbers would have to be equal (a contradiction). This alone would be enough to reject our restriction-free alternative definition. It is inconsistent.



              If $xne 0$, then no natural number would work for $x/0$. For any natural number $z$, we could not have $x=0times z$. Zero times any number is always zero.



              Either way, the alternative definition simply doesn't work.






              share|cite|improve this answer



























                up vote
                6
                down vote













                The accepted definition of division on the natural numbers is something like:




                For all natural numbers $x, y, z$ where $yne 0$, we have $x/y = z$ iff $x=ytimes z$. (Also works for the integers, rational numbers and reals.)




                Using this definition, you can neither prove nor disprove that $0/0=1$. You wouldn't be able to draw any inferences from your assumption that $0/0=1$. If $y$ (the divisor) is $0$, this definition tells you nothing.





                Suppose we did not have the restriction $yne 0$ and that, instead, we simply defined $x/y = z$ iff $x=ytimes z$ for any natural numbers $x, y$ and $z$.



                Then, consider two cases: $x=0$ and $xne 0$.



                If $x=0$, then the definition would be inconsistent with our definition of the natural numbers.



                $0/0$ could be $0$ because $0times 0 =0$



                $0/0$ could be $1$ because $0times 1 = 0$



                $0/0$ could be $256$ because $0times 256 = 0$



                All natural numbers would have to be equal (a contradiction). This alone would be enough to reject our restriction-free alternative definition. It is inconsistent.



                If $xne 0$, then no natural number would work for $x/0$. For any natural number $z$, we could not have $x=0times z$. Zero times any number is always zero.



                Either way, the alternative definition simply doesn't work.






                share|cite|improve this answer

























                  up vote
                  6
                  down vote










                  up vote
                  6
                  down vote









                  The accepted definition of division on the natural numbers is something like:




                  For all natural numbers $x, y, z$ where $yne 0$, we have $x/y = z$ iff $x=ytimes z$. (Also works for the integers, rational numbers and reals.)




                  Using this definition, you can neither prove nor disprove that $0/0=1$. You wouldn't be able to draw any inferences from your assumption that $0/0=1$. If $y$ (the divisor) is $0$, this definition tells you nothing.





                  Suppose we did not have the restriction $yne 0$ and that, instead, we simply defined $x/y = z$ iff $x=ytimes z$ for any natural numbers $x, y$ and $z$.



                  Then, consider two cases: $x=0$ and $xne 0$.



                  If $x=0$, then the definition would be inconsistent with our definition of the natural numbers.



                  $0/0$ could be $0$ because $0times 0 =0$



                  $0/0$ could be $1$ because $0times 1 = 0$



                  $0/0$ could be $256$ because $0times 256 = 0$



                  All natural numbers would have to be equal (a contradiction). This alone would be enough to reject our restriction-free alternative definition. It is inconsistent.



                  If $xne 0$, then no natural number would work for $x/0$. For any natural number $z$, we could not have $x=0times z$. Zero times any number is always zero.



                  Either way, the alternative definition simply doesn't work.






                  share|cite|improve this answer














                  The accepted definition of division on the natural numbers is something like:




                  For all natural numbers $x, y, z$ where $yne 0$, we have $x/y = z$ iff $x=ytimes z$. (Also works for the integers, rational numbers and reals.)




                  Using this definition, you can neither prove nor disprove that $0/0=1$. You wouldn't be able to draw any inferences from your assumption that $0/0=1$. If $y$ (the divisor) is $0$, this definition tells you nothing.





                  Suppose we did not have the restriction $yne 0$ and that, instead, we simply defined $x/y = z$ iff $x=ytimes z$ for any natural numbers $x, y$ and $z$.



                  Then, consider two cases: $x=0$ and $xne 0$.



                  If $x=0$, then the definition would be inconsistent with our definition of the natural numbers.



                  $0/0$ could be $0$ because $0times 0 =0$



                  $0/0$ could be $1$ because $0times 1 = 0$



                  $0/0$ could be $256$ because $0times 256 = 0$



                  All natural numbers would have to be equal (a contradiction). This alone would be enough to reject our restriction-free alternative definition. It is inconsistent.



                  If $xne 0$, then no natural number would work for $x/0$. For any natural number $z$, we could not have $x=0times z$. Zero times any number is always zero.



                  Either way, the alternative definition simply doesn't work.







                  share|cite|improve this answer














                  share|cite|improve this answer



                  share|cite|improve this answer








                  edited Nov 18 '14 at 5:17

























                  answered Nov 17 '14 at 15:34









                  Dan Christensen

                  8,46021832




                  8,46021832






















                      up vote
                      2
                      down vote













                      Let's view the problem within ring theory, i.e., an algebraic structure with addition and multiplication following familiar axioms. Usually when we write $0$ in this context, it means an additive neutral element, i.e. $x+0=0+x = x, forall x$. In any ring it follows from distributive laws that for any $x$ we have: $$xcdot 0 = xcdot(0+0) = xcdot 0 + xcdot 0implies xcdot 0 = 0$$ Now, when we write $frac xy = z$, in this context we mean that there is unique $z$ such that $x = zy$. If there are more than one $z$ meeting this condition, $frac xy$ wouldn't be well defined.



                      Let us assume that we have a ring $R$ with additive neutral element $0$, such that division by $0$ is well defined for some $x$, that is, there is unique $yin R$ such that $ycdot 0 = x$. First we note that $x = ycdot 0 = 0$, so only $frac 00$ might be defined. Now, assuming $frac 00 = y$, as we said before, it means that $y$ is the unique element in $R$ such that $ycdot 0 = 0$. But since for any $yin R$ we have that condition, we conclude that $R$ has exactly one element, namely $R={0}$. We call this ring zero ring.



                      TLDR: Division by $0$ is possible, but only in a trivial (zero) ring, and nowhere else.






                      share|cite|improve this answer





















                      • Why has this been downvoted?
                        – Karl
                        Feb 20 '16 at 16:47










                      • I agree it is important to start with assuming a structure such as a ring and deducing the fact from the given axioms. I'm honesty not sure why your answer has been downvoted.
                        – Karl
                        Feb 20 '16 at 16:55















                      up vote
                      2
                      down vote













                      Let's view the problem within ring theory, i.e., an algebraic structure with addition and multiplication following familiar axioms. Usually when we write $0$ in this context, it means an additive neutral element, i.e. $x+0=0+x = x, forall x$. In any ring it follows from distributive laws that for any $x$ we have: $$xcdot 0 = xcdot(0+0) = xcdot 0 + xcdot 0implies xcdot 0 = 0$$ Now, when we write $frac xy = z$, in this context we mean that there is unique $z$ such that $x = zy$. If there are more than one $z$ meeting this condition, $frac xy$ wouldn't be well defined.



                      Let us assume that we have a ring $R$ with additive neutral element $0$, such that division by $0$ is well defined for some $x$, that is, there is unique $yin R$ such that $ycdot 0 = x$. First we note that $x = ycdot 0 = 0$, so only $frac 00$ might be defined. Now, assuming $frac 00 = y$, as we said before, it means that $y$ is the unique element in $R$ such that $ycdot 0 = 0$. But since for any $yin R$ we have that condition, we conclude that $R$ has exactly one element, namely $R={0}$. We call this ring zero ring.



                      TLDR: Division by $0$ is possible, but only in a trivial (zero) ring, and nowhere else.
















answered Aug 8 '15 at 14:10 – Ennar












• Why has this been downvoted? – Karl, Feb 20 '16 at 16:47










• I agree it is important to start by assuming a structure such as a ring and deducing the fact from the given axioms. I'm honestly not sure why your answer has been downvoted. – Karl, Feb 20 '16 at 16:55









































Yes, there is an algebraic reason. In a field there is no reasonable way to divide by zero, because one cannot have both of the identities $(a/b)\times b = a$ and $c\times 0 = 0$ hold simultaneously if $b$ is allowed to be zero.



Note that the cancellation law depends on there being no zero divisors:



                      Proposition (Integers have no zero divisors). Let $a$ and $b$ be integers such that $ab=0$. Then either $a=0$ or $b=0$ (or both).



                      Corollary (Cancellation law for integers). If $a, b, c$ are integers such that $ac=bc$ and $c$ is non-zero, then $a=b$.





EDIT. On the other hand, it is possible to construct an algebraic structure in which $0/0=1$ (similar to a ring, but with additional axioms, perhaps $0/0=1$ itself). But in that case we must keep in mind that we are no longer working with the rationals $\Bbb Q$ or the reals $\Bbb R$, since those are fields, so their theorems need not remain true.
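To see the clash between the two identities concretely, here is the one-line derivation (a sketch, assuming only $(a/b)\times b = a$ and $c\times 0 = 0$): take any $a\neq 0$ and suppose $a/0$ were defined. Then

$$a \;=\; \frac{a}{0}\times 0 \;=\; 0,$$

contradicting $a\neq 0$. So at most $\frac{0}{0}$ could survive, and the uniqueness issue discussed in the ring-theoretic answer above rules that out in any structure with more than one element.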






answered Nov 17 '14 at 16:33, edited Nov 18 '14 at 16:22 – Cristhian Gz












• $0/0=1$ is consistent with the equalities you write. Moreover, it is incorrect to think of L'Hopital's rule as "dividing by a quantity that approaches zero". In fact, there is no such thing as "a quantity that approaches zero": limits are numbers, not approximations or anything vague such as quantities approaching anything. – Ittay Weiss, Nov 17 '14 at 18:53










• You're right, L'Hospital wasn't a good example. On the other hand, if we define $c:=a/b$ and $b:=0$, we have $(a/b)\times 0=0$ or $(a/b)\times 0=a$, but not both simultaneously. I don't say that $0/0=1$ is impossible; we can construct an algebraic structure with $0/0=1$. I'm saying that in that case we're not working with a "field", and so we are not working with the rationals $\mathbb Q$ or the reals $\mathbb R$. – Cristhian Gz, Nov 17 '14 at 19:33












• That is true only if $a\ne 0$. But $0/0=1$ is not inconsistent with the equalities you had written. – Ittay Weiss, Nov 17 '14 at 20:01










• As for your later edit, $0/0=1$ is not the case in any ring. – Ittay Weiss, Nov 17 '14 at 20:02










• OK. I meant "the ring axioms plus other axioms, maybe $0/0=1$ as an axiom". – Cristhian Gz, Nov 17 '14 at 20:13













































Assume $\frac{0}{0} = 1$ and define $x := \frac{0}{0^2}$. Then
$$
0 = 0\cdot x = 0 \cdot \frac{0}{0^2} = \frac{0}{0} = 1.
$$
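Spelled out, each equality uses one familiar rule (a sketch, assuming the usual fraction manipulations $0\cdot x = 0$, $c\cdot\frac{a}{b}=\frac{ca}{b}$, and $0^2=0$):

$$0 \;\overset{0\cdot x=0}{=}\; 0\cdot\frac{0}{0^2} \;\overset{c\cdot\frac{a}{b}=\frac{ca}{b}}{=}\; \frac{0\cdot 0}{0^2} \;=\; \frac{0^2}{0^2} \;\overset{0^2=0}{=}\; \frac{0}{0} \;\overset{\text{assumption}}{=}\; 1.$$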






answered Aug 8 '15 at 13:29 – Paglia



































Here is one of the famous "fake" proofs that $0=1$.



Let $a=b=1$ (but here you can choose another number if you want).



$a^2=ab$



$a^2-b^2=ab-b^2=b(a-b)$,



                              $(a-b)(a+b)=b(a-b)$



                              Then you divide each term by $(a-b)$ (here is the clincher, of course...) and you get



$a+b=b$, hence $a=0$ (but remember that $a=1$...).



Of course, in order to "get" this result you have to divide by $0$, which has NO meaning whatsoever... And you can get whatever result you want, which will be meaningless, since you did something meaningless.
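With $a=b=1$ the illegal step is visible at a glance (a worked instance of the chain above; the cancelled factor is $a-b=1-1=0$):

$$(1-1)(1+1)=1\cdot(1-1)\qquad\text{is the true statement } 0=0,$$

but cancelling the factor $(1-1)$ gives $1+1=1$, i.e. $2=1$, and subtracting $1$ from both sides then yields $1=0$.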






answered Nov 17 '14 at 9:18 – Martigan












• OP was asking why $0/0$ can't be defined. The whole question is "how do you know you can't meaningfully define $0/0$?" – Ittay Weiss, Nov 17 '14 at 9:41










• @IttayWeiss Well, one can easily deduce from the aforementioned demonstration that dividing by $0$ is not possible and not defined, hence $\frac{0}{0}$ is not defined as well. But I agree that this is not using a possible $\frac{0}{0}=1$ assumption... – Martigan, Nov 17 '14 at 9:49






• Exactly. The argument for why $a/0$ for $a\ne 0$ is not definable is different from the argument for why $0/0$ is not definable. – Ittay Weiss, Nov 17 '14 at 9:58









































Here are two algebraic proofs (plus some calculus).
\begin{align*}
\frac{0}{0}={}&0^0 \\
\log\left(\frac{0}{0}\right)={}&\log\left(0^0\right) \\
\log\left(\frac{0}{0}\right)={}&\log(0)\cdot 0
\end{align*}
In calculus, the limit of $\log(x)$ is equal to the limit of $-1/x$ as $x$ approaches $0^+$, so
\begin{align*}
\log\left(\frac{0}{0}\right)={}&-\left|\frac{1}{0}\right|\cdot 0 \\
\log\left(\frac{0}{0}\right)={}&\frac{0}{0}
\end{align*}



Given $\log\left(\frac{0}{0}\right)=\frac{0}{0}$: if $\frac{0}{0}=1$, then $0=\log(1)=\frac{0}{0}=1$, so $0=1$.



\begin{align*}
\frac{0}{0}={}&x \\
0\cdot\frac{0}{0}={}&x\cdot 0 \\
\frac{0\cdot 0}{0}={}&x\cdot 0 \\
\frac{0}{0}={}&\frac{0}{0}\cdot 0.
\end{align*}



If $\frac{0}{0}=1$, then $1=1\cdot 0=0$.






answered Jul 3 '16 at 2:06 – empCarnivore



































The other answers are quite helpful. I only want to add that $\frac{0}{0}$ is simply not assigned any value. It is one of the several indeterminate forms of mathematics: https://en.wikipedia.org/wiki/Indeterminate_form#List_of_indeterminate_forms






answered Jul 3 '16 at 2:11 – tatan



































Here is my theory: if anything times zero is zero, then zero divided by anything is zero, and zero divided by zero is anything. When I say "anything" I mean any number that exists, even an imaginary number. 0*anything=0, so that should mean 0/anything=0, and 0/0=anything. So if I say 0/0=0, that would be true, and if I say 0/0=1, that would be true as well. 0/0=infinity would also be true, and 0/0=(-infinity) would be true. Think of it like this: nine divided by three is three, because three goes into nine three times. But how many times must zero be added to get to zero? If zero is added once, it'll be zero. If zero is added twice, it'll be zero. If it's added three times, four, five, six, seven, as much as infinity, a negative number, a decimal/fraction, a whole number, an integer, a rational number, an irrational number, a real number, an imaginary number, ALL NUMBERS, it'll still be zero. So basically zero divided by zero can be anything, in my theory.
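In symbols, this observation is exactly the non-uniqueness point made in the other answers (a one-line restatement, assuming only that $m\cdot 0=0$ for every number $m$):

$$m\cdot 0 = 0 \text{ for every } m \;\Longrightarrow\; \text{every } m \text{ satisfies the defining equation } y\cdot 0 = 0 \text{ of } \frac{0}{0},$$

so no single value can be singled out as the quotient.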






answered Nov 10 '17 at 1:45 – Brandon



































                                                      Here is an argument for the kindergarten notion of division, i.e., to compute $a/b$ for natural numbers $a,b$, pretend you take $a$ cookies and divide them among $b$ kids, and ask yourself how many cookies will each child get. Theorem: $a/0$ does not have a unique value, and therefore it does not equal anything. Proof: Take $a$ cookies and divide them among $0$ children. It is vacuously true that each child gets $5$ cookies. It is also vacuously true that each child gets $m$ cookies, for any value of $m$. QED.
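The vacuous-truth step can be made explicit (a sketch; the point is that the set of children is empty, so the universal statement holds for every $m$ at once):

$$\forall\, c\in\varnothing:\ c\ \text{gets}\ m\ \text{cookies}\qquad\text{is true for every } m,$$

so every $m$ is an equally good candidate for $a/0$, and no unique quotient exists.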






answered yesterday, edited 18 hours ago – Ittay Weiss

















                                                              protected by Joffan Nov 10 '17 at 4:08


