Decomposition of a matrix into a sum of outer products of vectors

If I have an $n \times n$ matrix $X$ that is symmetric and positive semi-definite, how do I prove that there exist vectors $v_1,\ldots,v_n$ such that



$$X = \sum_{i=1}^n v_i v_i^T$$



I know this is some form of factorization, and I know how to find the eigenvectors of a matrix, but I am a little stuck on how to approach this. Any hints would be great!

Tags: linear-algebra, eigenvalues-eigenvectors, matrix-decomposition

asked Jan 25 at 20:14 by Anthony
edited Jan 25 at 20:53 by mechanodroid

2 Answers

Since $V$ is symmetric and positive semi-definite, there exists an orthonormal basis $\{e_1, \ldots, e_n\}$ such that $Ve_i = \lambda_i e_i$ for some $\lambda_i \ge 0$.



Denote by $P_i$ the orthogonal projection onto $\operatorname{span}\{e_i\}$, i.e. $P_i x = \langle x, e_i\rangle e_i$.



Note that we have the equality
$$V = \sum_{i=1}^n \lambda_i P_i$$



Now verify that the matrix of $P_i$ w.r.t. the standard basis is $e_ie_i^T$. Hence
$$V = \sum_{i=1}^n \lambda_i e_ie_i^T = \sum_{i=1}^n \left(\sqrt{\lambda_i}\,e_i\right)\left(\sqrt{\lambda_i}\,e_i\right)^T$$
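To make the construction concrete, here is a minimal NumPy sketch (an editorial addition, not part of the original answer): it builds a random symmetric positive semi-definite matrix, forms $v_i = \sqrt{\lambda_i}\,e_i$ from its spectral decomposition, and checks that $\sum_i v_iv_i^T$ reconstructs the matrix.

```python
import numpy as np

# Editorial sketch: verify X = sum_i v_i v_i^T with v_i = sqrt(lambda_i) * e_i,
# where (lambda_i, e_i) come from the spectral decomposition of a randomly
# generated symmetric PSD matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
X = A @ A.T                         # symmetric and PSD by construction

lambdas, E = np.linalg.eigh(X)      # eigh is designed for symmetric matrices
lambdas = lambdas.clip(min=0)       # guard against tiny negative round-off

vs = [np.sqrt(lam) * E[:, i] for i, lam in enumerate(lambdas)]
X_rebuilt = sum(np.outer(v, v) for v in vs)

print(np.allclose(X, X_rebuilt))    # True
```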

answered Jan 25 at 20:50 by mechanodroid (edited Jan 26 at 21:45)

Comments:
• Thank you! This makes perfect sense. – Anthony, Jan 25 at 22:10

• Hi, sorry, one question: are the radicals placed correctly in your last line, or is one of the eigenvectors not square-rooted? – Anthony, Jan 26 at 20:59

• @Anthony It was a typo, thanks for noticing. – mechanodroid, Jan 26 at 21:45

• Sorry, one last question: $P_i$ is the orthogonal projection of what onto the span? – Anthony, Jan 27 at 16:51

• @Anthony $P_i$ is the linear map which sends a vector $x$ to its orthogonal projection onto the subspace $\operatorname{span}\{e_i\}$, which is equal to $\langle x, e_i\rangle e_i$. The matrix of $P_i$ is $e_ie_i^T$ because $(e_ie_i^T)x = e_i(e_i^Tx) = e_i\langle x, e_i\rangle$. – mechanodroid, Jan 27 at 17:25 (a quick numerical check of this identity follows below)
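The identity in that last comment is easy to verify numerically. A two-line NumPy check (an editorial sketch; the vectors $e$ and $x$ below are made-up examples):

```python
import numpy as np

# Check that the rank-one matrix e e^T acts as x -> <x, e> e for a unit vector e.
e = np.array([1.0, 2.0, 2.0]) / 3.0           # unit vector, since ||(1, 2, 2)|| = 3
x = np.array([0.5, -1.0, 4.0])

P = np.outer(e, e)                            # matrix e e^T of the projection
print(np.allclose(P @ x, np.dot(x, e) * e))   # True
```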

Hint: the matrix being symmetric, there exists an orthonormal basis consisting of eigenvectors of the matrix (see the spectral theorem).

answered Jan 25 at 20:43 by Scientifica
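As an editorial aside illustrating this hint: NumPy's `eigh` computes exactly such an orthonormal eigenbasis for a symmetric matrix. A minimal sketch, assuming a randomly generated symmetric input:

```python
import numpy as np

# For a symmetric matrix, np.linalg.eigh returns eigenvalues and an
# orthonormal basis of eigenvectors (the columns of E), as the spectral
# theorem promises.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
S = (A + A.T) / 2                        # symmetrize

lambdas, E = np.linalg.eigh(S)
print(np.allclose(E.T @ E, np.eye(3)))   # columns are orthonormal
print(np.allclose(S @ E, E * lambdas))   # S e_i = lambda_i e_i, columnwise
```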