Determinant is alternating over a commutative ring with $1$

In Section 11.4 of Dummit and Foote, they introduce a determinant function $\det$ on the ring of $n\times n$ matrices over a commutative ring $R$ with $1$ as




1. Any $n$-multilinear alternating form, where the $n$-tuples are the $n$ columns of the matrices in $M_{n\times n}(R)$, and


2. $\det(I) = 1$ where $I$ is the $n\times n$ identity matrix.



They then define a function
$$
\det(\alpha_{ij}) = \sum_{\sigma\in S_n}\operatorname{sgn}(\sigma)\,\alpha_{\sigma(1)1}\dotsb\alpha_{\sigma(n)n}
$$
and show that the determinant is unique, but they leave it as an exercise to show that the function $\det$ defined above is actually a determinant function.
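
For concreteness, here is a quick Python sketch of this permutation-sum formula (just an illustration, not from the book; the names `leibniz_det` and `sgn` are made up), treating the entries as arbitrary objects supporting `+` and `*`:

```python
from itertools import permutations

def sgn(perm):
    """Sign of a permutation given as a tuple of 0-based images."""
    inversions = sum(1 for i in range(len(perm))
                       for j in range(i + 1, len(perm))
                       if perm[i] > perm[j])
    return -1 if inversions % 2 else 1

def leibniz_det(a):
    """Sum over sigma in S_n of sgn(sigma) * a[sigma(1)][1] * ... * a[sigma(n)][n]."""
    n = len(a)
    total = 0
    for perm in permutations(range(n)):
        term = sgn(perm)
        for col in range(n):
            term = term * a[perm[col]][col]
        total += term
    return total

# Sanity checks over the integers: det(I) = 1, and a matrix whose first
# two columns are equal comes out to 0.
assert leibniz_det([[1, 0], [0, 1]]) == 1
assert leibniz_det([[2, 2, 5], [3, 3, 7], [4, 4, 9]]) == 0
```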



Note that Dummit and Foote take alternating to mean that if two consecutive columns of the matrix $(\alpha_{ij})$ are equal, then the alternating form returns $0$ when applied to $(\alpha_{ij})$.



I am having trouble showing that $\det$ so-defined is alternating. I have managed to show that if a matrix $(\alpha_{ij})$ has two consecutive columns equal, say the $j$th and $(j+1)$st, then $\det(\alpha_{ij})=-\det(\alpha_{ij})$. I am not sure this is sufficient to show that $\det(\alpha_{ij}) = 0$, since we are in a commutative ring with $1$, which may have zero divisors.
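
To illustrate the worry with a tiny check (my own): in $\mathbb{Z}/2\mathbb{Z}$ every element $u$ satisfies $u = -u$, so there the relation $\det = -\det$ carries no information at all.

```python
# In Z/2Z, u = -u holds for every u, so u = -u does not force u = 0:
# in particular 1 = -1 there, yet 1 != 0.
assert all(u == (-u) % 2 for u in range(2))
```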



Is there an easy fix? I can supply my proof if need be. Thanks.

Tags: linear-algebra, abstract-algebra

asked Jul 4 '17 at 16:39 by Alex Ortiz

3 Answers

You are correct. In the field with two elements (say), the equation $\det(A) = -\det(A)$ yields no new information; it might be best to argue directly that the determinant is alternating.



Here's the idea for the $3\times 3$ case, assuming that the first two columns of the matrix are equal. Let $G = S_3$ and let $H$ be the (two-element) subgroup generated by the transposition $\tau := (1, 2)$. Then partition $G$ into the cosets $G/H$. In this specific example, $G/H = \{\{(1), \tau\}, \{(1, 2, 3), (1, 2, 3)\tau\}, \{(2,3), (2,3)\tau\}\}$. Note that each coset contains exactly two permutations: some permutation $\sigma$, together with $\sigma\tau$, that permutation multiplied by the transposition.



Then, recalling that the first two columns of the matrix are equal, you obtain that the determinant is
$$\begin{aligned}
\sum_{\sigma \in S_3} \operatorname{sgn}(\sigma)\,a_{\sigma(1)1}a_{\sigma(2)2}a_{\sigma(3)3}
&= \sum_{S \in G/H}\ \sum_{\sigma \in S}\operatorname{sgn}(\sigma)\,a_{\sigma(1)1}a_{\sigma(2)2}a_{\sigma(3)3} \\
&= \sum_{\substack{S \in G/H, \\ \text{choose } \sigma \in S}} \left(\pm\, a_{\sigma(1)1}a_{\sigma(2)2}a_{\sigma(3)3} \mp a_{\sigma\tau(1)1}a_{\sigma\tau(2)2}a_{\sigma\tau(3)3}\right) \\
&= \sum_{\substack{S \in G/H, \\ \text{choose } \sigma \in S}} \left(\pm\, a_{\sigma(1)1}a_{\sigma(2)2}a_{\sigma(3)3} \mp a_{\sigma(2)1}a_{\sigma(1)2}a_{\sigma(3)3}\right) \\
&\stackrel{*}{=} \sum_{\substack{S \in G/H, \\ \text{choose } \sigma \in S}} \left(\pm\, a_{\sigma(1)2}a_{\sigma(2)1}a_{\sigma(3)3} \mp a_{\sigma(2)1}a_{\sigma(1)2}a_{\sigma(3)3}\right) \\
&= \sum_{\substack{S \in G/H, \\ \text{choose } \sigma \in S}} 0 = 0
\end{aligned}$$



where $\stackrel{*}{=}$ uses the fact that the first two columns agree. (Note: the sum looks scary, but the idea is not nearly as scary.)
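
If it helps, here is a small sympy sketch of this pairing for $n = 3$ (just a sanity check of the cancellation, not part of the argument; the helper names are made up):

```python
from itertools import permutations
from sympy import symbols

# Generic 3x3 entries, then force the second column to equal the first.
a = [[symbols(f"a{i}{j}") for j in range(3)] for i in range(3)]
for i in range(3):
    a[i][1] = a[i][0]

def sgn(p):
    inv = sum(p[i] > p[j] for i in range(3) for j in range(i + 1, 3))
    return -1 if inv % 2 else 1

def term(p):
    # The Leibniz term sgn(sigma) * a_{sigma(1)1} a_{sigma(2)2} a_{sigma(3)3}.
    return sgn(p) * a[p[0]][0] * a[p[1]][1] * a[p[2]][2]

def right_mult_tau(p):
    # sigma -> sigma * tau for tau = (1 2): swaps the first two images.
    return (p[1], p[0], p[2])

seen = set()
for p in permutations(range(3)):
    if p in seen:
        continue
    seen.update({p, right_mult_tau(p)})
    # The two members of each coset {sigma, sigma*tau} cancel exactly.
    assert term(p) + term(right_mult_tau(p)) == 0
```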

– Tom Gannon, answered Jul 4 '17 at 17:34, edited Jul 22 '17 at 21:26

• Thanks for your answer, but I am not sure what you mean by suggesting to show that the determinant is alternating directly. In my mind, that is what I tried to show, but I came up with my problem. – Alex Ortiz, Jul 4 '17 at 17:41

• I mean you should show that after you switch two rows, each entry in that big sum is multiplied by $-1$ when computing the determinant of the new matrix. – Tom Gannon, Jul 4 '17 at 17:51

• Okay, but note that my definition of alternating is that when two adjacent columns (or rows) are equal, then the determinant function should evaluate to $0$. What you are suggesting doesn't seem to get me further in the right direction. – Alex Ortiz, Jul 4 '17 at 17:54

• You're absolutely right, my apologies! I've updated my answer accordingly. – Tom Gannon, Jul 4 '17 at 18:29

• Ah, beautiful; I was able to turn this into a proof of the general case with no problem. I didn't think of partitioning $G$ into its cosets; does this idea come up in any other proofs you know of? Thanks very much! – Alex Ortiz, Jul 4 '17 at 19:01

For a fully elaborated proof, I shall be lazy and just refer to Exercise 6.7 (e) in my Notes on the combinatorial fundamentals of algebra, version of 10 January 2019, or to the proof of property (iii) in §5.3.4 of Hartmut Laue, Determinants. The main idea is to split the sum $\sum\limits_{\sigma \in S_n} \ldots$ into a sum over all even permutations and a sum over all odd permutations, and to show that the addends in the two sums mutually cancel, as Tom Gannon suggests.



However, there is also a way to derive the result from your $\det\left(\alpha_{i,j}\right) = -\det\left(\alpha_{i,j}\right)$ observation. Namely, fix $n \in \mathbb{N}$ and $k \in \left\{1,2,\ldots,n-1\right\}$. Let a $k$-col-equal matrix be a matrix whose $k$-th column equals its $(k+1)$-st column. Your claim is that a $k$-col-equal matrix must have determinant $0$. Now, observe that you can derive $u = 0$ from $u = -u$ when $u$ is a polynomial over $\mathbb{Z}$ (for example). Thus, you can prove your claim whenever the entries of the $k$-col-equal matrix are polynomials over $\mathbb{Z}$ (because you have shown that each $k$-col-equal matrix $\left(\alpha_{i,j}\right)$ satisfies $\det\left(\alpha_{i,j}\right) = -\det\left(\alpha_{i,j}\right)$). In particular, your claim holds when the $k$-col-equal matrix is the "universal $k$-col-equal matrix", which is the matrix whose entries are indeterminates (in a polynomial ring over $\mathbb{Z}$) that are distinct except for the two columns that are supposed to be equal. (For example, the universal $2$-col-equal matrix for $n = 4$ is $\left(\begin{array}{cccc} x_{1,1} & x_{1,2} & x_{1,2} & x_{1,4} \\ x_{2,1} & x_{2,2} & x_{2,2} & x_{2,4} \\ x_{3,1} & x_{3,2} & x_{3,2} & x_{3,4} \\ x_{4,1} & x_{4,2} & x_{4,2} & x_{4,4} \end{array}\right)$, where the $x_{i,j}$ are distinct indeterminates in a polynomial ring over $\mathbb{Z}$.) But you can view an arbitrary $k$-col-equal matrix as the result of substituting concrete values for these indeterminates in the universal $k$-col-equal matrix. Therefore, since your claim holds for the latter matrix, it must also hold for the former.
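
If you want a machine check of the universal-matrix step, here is a small sympy sketch over $\mathbb{Z}[x_{i,j}]$ (just an illustration; the variable names are made up): the determinant of the universal $2$-col-equal matrix for $n = 3$ is the zero polynomial, so every specialization of its indeterminates into any commutative ring has determinant $0$ as well.

```python
from sympy import Matrix, symbols

# Universal 2-col-equal matrix for n = 3: distinct indeterminates x_ij,
# except that columns 2 and 3 carry the same variables x_{i,2}.
x = {(i, j): symbols(f"x{i}{j}") for i in range(1, 4) for j in range(1, 4)}
U = Matrix(3, 3, lambda i, j: x[(i + 1, 2)] if j == 2 else x[(i + 1, j + 1)])

# det(U) is the zero polynomial of Z[x_ij].
assert U.det().expand() == 0
```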

– darij grinberg, answered Jul 22 '17 at 22:36, edited Feb 3 at 6:29

• This is a very interesting take on it. One that I had not considered, but that is very nice. It's good to know that the original result can still get us to the desired conclusion. – Alex Ortiz, Jul 23 '17 at 3:25

Suppose we take a matrix $(\alpha_{i,j})$, and switch the $p$th and $q$th rows to form another matrix $(\beta_{i,j})$, which is to say,
$$\beta_{i,j} = \left\lbrace \begin{array}{ccc} \alpha_{i, p} & \mathrm{if} & j = q \\ \alpha_{i, q} & \mathrm{if} & j = p \\ \alpha_{i, j} & & \mathrm{otherwise}\end{array} \right.$$
Another way to put it is, if we let $\tau = (p ~ q) \in S_n$, then
$$\beta_{i,j} = \alpha_{i, \tau(j)}.$$
So we have
$$\det(\beta_{i,j}) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i \in \lbrace 1, \ldots, n \rbrace} \alpha_{\sigma(i),\tau(i)}.$$
If we reindex the above product in terms of $j = \tau(i) \iff i = \tau(j)$, then $j$ ranges over $\lbrace 1, \ldots, n \rbrace$ as well (note the importance of commutativity in this step), and we have
$$\det(\beta_{i,j}) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{j \in \lbrace 1, \ldots, n \rbrace} \alpha_{\sigma \circ \tau(j),j}.$$
We can now reindex the sum with $\psi = \sigma \circ \tau \iff \sigma = \psi \circ \tau$, giving us
$$\det(\beta_{i,j}) = \sum_{\psi \in S_n} \operatorname{sgn}(\psi \circ \tau) \prod_{j \in \lbrace 1, \ldots, n \rbrace} \alpha_{\psi(j),j}.$$
But, since composing a permutation with a transposition changes its sign, we thus have
$$\det(\beta_{i,j}) = -\sum_{\psi \in S_n} \operatorname{sgn}(\psi) \prod_{j \in \lbrace 1, \ldots, n \rbrace} \alpha_{\psi(j),j} = -\det(\alpha_{i,j}).$$
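
As a quick sanity check of this computation, here is a small sympy sketch (just an illustration) confirming that swapping the first two columns of a generic $3 \times 3$ matrix negates the determinant:

```python
from sympy import Matrix, symbols, expand

A = Matrix(3, 3, lambda i, j: symbols(f"a{i}{j}"))
swap = [1, 0, 2]                               # tau = (1 2) acting on column indices
B = Matrix(3, 3, lambda i, j: A[i, swap[j]])   # A with first two columns swapped

assert expand(B.det() + A.det()) == 0          # det(B) = -det(A)
```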

– Theo Bendit, answered Jul 4 '17 at 17:32

• Do you mean to say switch the two columns? My indexing convention is that if I have $a_{ij}$, then $i$ denotes the row, and $j$ the column. – Alex Ortiz, Jul 4 '17 at 17:42

• Also, you have apparently shown what I have, which is that switching two rows (or columns) amounts to changing the sign. I am trying to show that if two neighboring columns (or rows) are identical, then the determinant is zero. – Alex Ortiz, Jul 4 '17 at 17:44