Proving $\dim(E_0) \geq n - k$
I found a question on an old exam that I am not really able to wrap my head around. The question states:



Given $k<n$ and $v_1, v_2, ..., v_k \in \mathbb{R}^n$ non-zero vectors, pairwise orthogonal with respect to the standard dot product on $\mathbb{R}^n$, where we consider the vectors of $\mathbb{R}^n$ as column vectors. Given that $\lambda_1, \lambda_2, ..., \lambda_k \in \mathbb{R}$ and
$$A = \lambda_1 v_1 v_1^T + \lambda_2 v_2 v_2^T + ... + \lambda_k v_k v_k^T \in \mathbb{R}^{n \times n}$$



Part A asks to prove that $v_1, v_2, ..., v_k$ are eigenvectors of $A$. I managed to prove this by showing that $$Av_i = \left(\sum_{j=1}^k\lambda_jv_jv_j^T\right)v_i=\sum_{j=1}^k\lambda_jv_j(v_j^Tv_i)=\lambda_iv_i\|v_i\|^2=\left(\lambda_i\|v_i\|^2\right)v_i$$ for all $i = 1, 2, ..., k$, using that $v_j^Tv_i=0$ whenever $j\neq i$.
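As a quick sanity check of this identity, here is a minimal NumPy sketch (the vectors and coefficients are arbitrary choices of mine, not part of the exam question):

```python
import numpy as np

# Toy instance: n = 4, k = 2, with two non-zero orthogonal vectors.
v1 = np.array([1.0, 1.0, 0.0, 0.0])
v2 = np.array([1.0, -1.0, 2.0, 0.0])   # v1 . v2 = 0
lam = [3.0, -2.0]

# A = lambda_1 v1 v1^T + lambda_2 v2 v2^T
A = lam[0] * np.outer(v1, v1) + lam[1] * np.outer(v2, v2)

# Each v_i should satisfy A v_i = (lambda_i ||v_i||^2) v_i.
for lam_i, v in zip(lam, (v1, v2)):
    assert np.allclose(A @ v, lam_i * np.dot(v, v) * v)
```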



Part B asks to prove that $\dim(E_0) \geq n - k$, where $E_0$ is the eigenspace for the eigenvalue $0$. I don't really have a direct idea of how to get started with this part.



Part C asks to prove that $A$ is diagonalisable and to give an orthogonal basis of eigenvectors of $A$. Again I am not really sure how to get started with this.



Any help is appreciated.

linear-algebra matrices eigenvalues-eigenvectors orthogonality diagonalization
asked Jan 22 at 16:41 by Elliot S, edited Jan 23 at 8:49

I think $\Bbb R^{2\times2}$ should be $\Bbb R^{n\times n}$, which presumably denotes the space of $n\times n$ matrices.
– Marc van Leeuwen, Jan 22 at 17:12

2 Answers

For Part B, you can find an orthogonal system $$v_{k+1},v_{k+2},\ldots, v_n,$$ which is orthogonal to $\{v_1,v_2,\ldots, v_k\}$ (e.g. by Gram-Schmidt). We can see $Av_{i}=0$ for $k<i\le n$. From this, deduce that $\dim E_0\ge n-k$.
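Spelling out that deduction (a small expansion of the hint above, in the notation of the question): for $k < i \le n$,
$$Av_i=\sum_{j=1}^k\lambda_jv_j(v_j^Tv_i)=0,$$
since each $v_j$ with $j\le k$ is orthogonal to $v_i$. The non-zero vectors $v_{k+1},\ldots,v_n$ are mutually orthogonal, hence linearly independent, and they all lie in $E_0$, so $\dim E_0\ge n-k$.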



Combining Parts A and B, we can see immediately that there exists an orthogonal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$.



Note: the spectral decomposition theorem says that every real symmetric matrix can be written in the same form as $A$.
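To illustrate that note numerically (my own minimal sketch, not part of the original answer; the symmetric matrix below is an arbitrary example), `numpy.linalg.eigh` produces exactly such a decomposition:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
S = (B + B.T) / 2                  # an arbitrary real symmetric matrix

w, U = np.linalg.eigh(S)           # S = U diag(w) U^T, columns of U orthonormal

# Rebuild S as sum_i w_i u_i u_i^T -- the same form as A in the question.
S_rebuilt = sum(w[i] * np.outer(U[:, i], U[:, i]) for i in range(len(w)))
assert np.allclose(S, S_rebuilt)
```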






– Song, answered Jan 22 at 16:55, edited Jan 22 at 17:00

Part B: observe that "the eigenspace for the eigenvalue $0$" is just a fancy way of saying "null space". Both are the set of all vectors $v$ for which $Av=0$. I'm assuming you're familiar with computing null spaces?
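One concrete way to finish from this observation (a rank-nullity sketch, not spelled out in the answer): every column of $A=\sum_{j=1}^k\lambda_jv_jv_j^T$ lies in $\operatorname{span}\{v_1,\ldots,v_k\}$, so $\operatorname{rank}(A)\le k$. Since $E_0$ is exactly the null space of $A$, rank-nullity gives
$$\dim E_0 = n-\operatorname{rank}(A)\ge n-k.$$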



Part C: a matrix is diagonalizable if the sum of dimensions of its eigenspaces equals the number of columns. For example, a $3\times 3$ matrix will be diagonalizable if it has two eigenvalues, and the corresponding dimensions of the eigenspaces are $1$ and $2$. If we know that a matrix is diagonalizable, diagonalizing it amounts to finding all of the eigenvalue/eigenvector pairs. You can then use Gram-Schmidt to get an orthogonal basis.
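A numerical illustration of both parts (my own toy sketch, not part of this answer; $n$, $k$, the vectors and the coefficients below are arbitrary choices):

```python
import numpy as np

n, k = 4, 2
v1 = np.array([1.0, 1.0, 0.0, 0.0])
v2 = np.array([0.0, 0.0, 1.0, 1.0])          # orthogonal to v1
A = 3.0 * np.outer(v1, v1) - 2.0 * np.outer(v2, v2)

# Part B: E_0 is the null space of A, and rank(A) <= k.
dim_E0 = n - np.linalg.matrix_rank(A)
assert dim_E0 >= n - k

# Part C: A is real symmetric, so eigh returns an orthonormal eigenbasis directly.
# (If the eigenvectors were found another way, Gram-Schmidt -- e.g. np.linalg.qr
# applied within each eigenspace -- would orthogonalize them.)
w, U = np.linalg.eigh(A)
assert np.allclose(U @ np.diag(w) @ U.T, A)  # A = U diag(w) U^T, so A is diagonalizable
assert np.allclose(U.T @ U, np.eye(n))       # the eigenbasis is orthonormal
```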






– pwerth, answered Jan 22 at 16:49



