Prove that $A$ is invertible iff $\det(A)\neq 0$ with the Cauchy-Binet theorem
Let $A$ be an $n\times n$ matrix over $\mathbb{R}$.
I'm trying to prove that $A$ is invertible if and only if $\det(A)\neq0$ using the Cauchy-Binet theorem.



I know that the Cauchy-Binet theorem gives $$\det(AB)=\det(A)\cdot \det(B).$$



But so far I haven't been able to find an approach that completes the proof.
Tags: linear-algebra, proof-explanation, determinant

asked Jan 9 at 19:33 by Kevin
edited Jan 9 at 20:26 by user376343

2 Answers

3 votes

Suppose that $A$ is invertible. Then there is a matrix $B$ such that $AB=I$. The Cauchy-Binet theorem then implies that
$$1=\det(I)=\det(AB)=\det(A)\det(B),$$
so that $\det(A)\neq 0\neq \det(B)$.
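
As a numerical sanity check of this multiplicativity (an illustration only, not part of the proof; numpy is assumed here):

```python
import numpy as np

rng = np.random.default_rng(42)
a = rng.standard_normal((5, 5))
b = rng.standard_normal((5, 5))

# det(AB) = det(A) det(B); with B = A^{-1} this forces det(A) det(A^{-1}) = 1
print(np.isclose(np.linalg.det(a @ b), np.linalg.det(a) * np.linalg.det(b)))  # True
print(np.isclose(np.linalg.det(a) * np.linalg.det(np.linalg.inv(a)), 1.0))    # True
```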



Conversely, suppose $\det(A)\neq 0$. Then $B=\frac{1}{\det A}\operatorname{adj}(A)$ satisfies $AB=I$, so that $A$ is invertible. Here $\operatorname{adj}(A)$ is the adjugate of the matrix $A$.
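
A minimal sketch of this construction (the `adjugate` helper is hypothetical, written here with naive cofactor expansion for illustration; numpy is assumed):

```python
import numpy as np

def adjugate(mat):
    """Transpose of the cofactor matrix, so mat @ adjugate(mat) == det(mat) * I."""
    n = mat.shape[0]
    adj = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(mat, i, axis=0), j, axis=1)
            adj[j, i] = (-1) ** (i + j) * np.linalg.det(minor)  # cofactor, transposed
    return adj

a = np.array([[2.0, 1.0],
              [5.0, 3.0]])              # det(a) = 1, so a is invertible
b = adjugate(a) / np.linalg.det(a)       # B = adj(A) / det(A)
print(np.allclose(a @ b, np.eye(2)))     # True
```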



Edit: Given the comments on my answer, I'm including a link to a thread about left/right inverses:

If $AB = I$ then $BA = I$

answered Jan 9 at 19:42 by pwerth (edited Jan 9 at 21:34)

• In my experience, some professors are incredibly pedantic with the definition of matrix inverse; you need to specify not only that there exists $B$ such that $AB = I$, but also that $BA = I$. That is, $AB = I = BA$.
  – Decaf-Math, Jan 9 at 19:58

• Either that, or append a proof that $AB=I\to BA=I$ for square matrices on finite-dimensional spaces.
  – J.G., Jan 9 at 21:12

3 votes


If $A$ is invertible, then $AA^{-1}=I$ (the identity matrix), so
$$
1=\det I=\det(AA^{-1})=\det A\det(A^{-1}),
$$
and therefore $\det A\ne0$.

The converse doesn't follow from Binet's theorem, but rather from the fact that the determinant is multilinear and alternating on the columns of a matrix.



Fact 1. If $A$ has a zero column, then $\det A=\det A+\det A$, so $\det A=0$.



Fact 2. If $A$ has two identical columns, then $\det A=-\det A$, by swapping them, so $\det A=0$.
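
Both facts, and the sign flip behind Fact 2, can be checked numerically; a quick illustration with numpy (assumed here), not a substitute for the algebraic argument:

```python
import numpy as np

rng = np.random.default_rng(0)
m = rng.standard_normal((4, 4))

zero_col = m.copy()
zero_col[:, 3] = 0.0                     # Fact 1: a zero column gives det = 0
print(np.isclose(np.linalg.det(zero_col), 0.0))   # True

repeated = m.copy()
repeated[:, 3] = repeated[:, 0]          # Fact 2: identical columns give det = 0
print(np.isclose(np.linalg.det(repeated), 0.0))   # True

swapped = m[:, [1, 0, 2, 3]]             # swapping two columns flips the sign
print(np.isclose(np.linalg.det(swapped), -np.linalg.det(m)))  # True
```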



Fact 3. If $A$ is not invertible, then $\det A=0$.



Since we want to show that $\det A=0$, after possibly swapping columns we can assume that the last column is a linear combination of the other $n-1$ columns. Say $A=[a_1\ \dots\ a_{n-1}\ a_n]$, with
$$
a_n=c_1a_1+\dots+c_{n-1}a_{n-1}.
$$

Then, by multilinearity and Facts 1 and 2, we have
$$
0=\det[a_1\ \dots\ a_{n-1}\ 0]=
\det A
-c_1\det[a_1\ \dots\ a_{n-1}\ a_1]
-\dots
-c_{n-1}\det[a_1\ \dots\ a_{n-1}\ a_{n-1}],
$$
and every determinant on the right other than $\det A$ has a repeated column, so it vanishes by Fact 2; hence $\det A=0$.
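
The expansion can be illustrated numerically: building the last column as a linear combination of the others makes $\det A$ vanish, and each determinant with a repeated column vanishes as well (numpy assumed, illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)
cols = rng.standard_normal((4, 3))        # columns a_1, a_2, a_3
c = np.array([2.0, -1.0, 0.5])
a_last = cols @ c                         # a_4 = 2 a_1 - a_2 + 0.5 a_3
a = np.column_stack([cols, a_last])

print(np.isclose(np.linalg.det(a), 0.0))  # True: dependent columns

# each det[a_1 ... a_3  a_i] has a repeated column, hence vanishes (Fact 2)
for i in range(3):
    rep = np.column_stack([cols, cols[:, i]])
    print(np.isclose(np.linalg.det(rep), 0.0))  # True
```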

answered Jan 9 at 22:58 by egreg

• A somewhat similar approach for the converse uses the fact that row and column operations can be achieved by multiplying (on the left or right) by elementary matrices.
  – Cheerful Parsnip, Jan 9 at 23:02

• @CheerfulParsnip Indeed, in my course I don't mention multilinearity, but rather define the determinant through its behavior under elementary column operations (or, equivalently, row operations): multiplying a column by $c$ multiplies the determinant by $c$; adding to a column another column multiplied by $d$ doesn't change the determinant; swapping two columns multiplies the determinant by $-1$. I find this better because the course puts great emphasis on elimination.
  – egreg, Jan 9 at 23:11