Analytically finding minimum eigenvalue of a matrix


























Is there a way to find the minimum eigenvalue of a matrix analytically? (I know that the usual way of finding the eigenvalues is via the characteristic polynomial.)



One definition of the minimum eigenvalue is



$$
\min_{x \neq 0} \frac{\|A x\|^2}{\|x\|^2}
$$



For the simple matrix $\begin{bmatrix} 0 & 1 \\ -2 & -3 \end{bmatrix}$ I tried to find the smallest eigenvalue analytically by naming the components of $x$ as $x_1$ and $x_2$, which gives $D = ((-2x_1 - 3x_2)^2 + x_2^2)/(x_1^2 + x_2^2)$. From this I obtain the gradient $\frac{\partial D}{\partial x_1}$, $\frac{\partial D}{\partial x_2}$, set it to zero, and solve for $x_1$ and $x_2$.



Sure enough, this system of equations does not have a solution.
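A quick numerical check (a NumPy sketch added for illustration, not part of the original question) shows why this minimization cannot recover an eigenvalue of $A$: it yields the smallest eigenvalue of $A^T A$, i.e. the squared smallest singular value of $A$, and the critical points of $D$ are the eigenvectors of $A^T A$.

```python
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])

# Eigenvalues of A itself (roots of lambda^2 + 3*lambda + 2): -2 and -1.
print(sorted(np.linalg.eigvals(A).real))

# Minimizing ||A x||^2 / ||x||^2 instead gives the smallest eigenvalue of
# A^T A, i.e. the squared smallest singular value of A -- not an eigenvalue of A.
M = A.T @ A
print(np.linalg.eigvalsh(M).min())                 # ~0.2918 (= 7 - 3*sqrt(5))
print(np.linalg.svd(A, compute_uv=False).min() ** 2)

# The stationarity condition grad D = 0 reads (A^T A) x = D(x) x, so the
# critical points of D are exactly the eigenvectors of A^T A.
lam, V = np.linalg.eigh(M)
v = V[:, 0]                                # unit eigenvector, smallest eigenvalue
grad = 2 * (M @ v - (v @ M @ v) * v)       # gradient of D at a unit vector
print(np.allclose(grad, 0))                # True
```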

































  • For an eigenvector $x$ with eigenvalue $\lambda$, $x^TAx = \lambda\lVert x\rVert^2$. So it would seem that you need to divide that expression by $\lVert x\rVert^2$ before taking the min in order to find $\lambda$.
    – alex.jordan
    Jan 9 at 0:54










  • You are right. I added the normalization, but the system does not have a solution either. I will update my question.
    – divB
    Jan 9 at 1:03












  • In the most recent edit you replaced $x^TAx$ with $\lVert Ax\rVert^2$. These are not the same. $\lVert Ax\rVert^2$ is the same as $x^TA^TAx$.
    – alex.jordan
    Jan 9 at 1:47


















linear-algebra eigenvalues-eigenvectors convex-optimization






edited Jan 9 at 1:32







divB

















asked Jan 9 at 0:49









divB

























1 Answer













Your "definition" of minimum eigenvalue has several problems. For one thing, it would imply that every matrix has a real, nonnegative eigenvalue, which is not true in general.
Also, don't forget a simple sanity check: your proposed property should already hold for a $1\times 1$ matrix... and it clearly doesn't.



What you actually mean, I guess, is: if $M$ is Hermitian and positive definite (which implies that its eigenvalues are real and positive), then, letting $\lambda_1$ be its smallest eigenvalue, we have



$$\lambda_1 = \min_{x \ne 0} \frac{x^t M x}{x^t x}$$



Further, if we decompose $M = A^t A$, the above gives



$$\lambda_1 = \min_{x \ne 0} \frac{x^t A^t A x}{x^t x} = \min_{x \ne 0} \frac{\|A x\|^2}{\|x\|^2}$$



But bear in mind: here $\lambda_1$ is an eigenvalue of $M$, not of $A$.



To answer the question in the title: there cannot be a general analytical procedure for computing the smallest eigenvalue of a matrix, because the problem maps to finding the smallest root of a polynomial, and for degree five and above there is no analytical procedure (in the usual sense of the expression) for doing that (Abel–Ruffini theorem).



On the other hand, if the matrix is $2\times 2$, then it's trivial. But you wouldn't use the Rayleigh quotient for that; it is only useful for computing the smallest (or largest) eigenvalue numerically, by an iterative method.
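To make that last remark concrete, here is a sketch (added for illustration, not part of the original answer) of both points: the closed form for a symmetric $2\times 2$ matrix via its characteristic polynomial, and inverse power iteration as one iterative method where the Rayleigh quotient actually earns its keep. The test matrix $M = A^T A$ is built from the question's $A$.

```python
import numpy as np

def smallest_eig_2x2(M):
    """Closed form for a symmetric 2x2 matrix: the smaller root of
    lambda^2 - tr(M)*lambda + det(M) = 0."""
    tr, det = np.trace(M), np.linalg.det(M)
    return (tr - np.sqrt(tr ** 2 - 4 * det)) / 2

def smallest_eig_inverse_power(M, iters=200, seed=0):
    """Approximate the smallest eigenvalue of a symmetric positive definite
    matrix by inverse power iteration: each solve of M y = x amplifies the
    component along the smallest-eigenvalue direction, and the Rayleigh
    quotient of the limit vector is the smallest eigenvalue."""
    x = np.random.default_rng(seed).standard_normal(M.shape[0])
    for _ in range(iters):
        y = np.linalg.solve(M, x)
        x = y / np.linalg.norm(y)
    return x @ M @ x  # Rayleigh quotient x^T M x / x^T x, with ||x|| = 1

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
M = A.T @ A                                   # symmetric positive definite
print(smallest_eig_2x2(M))                    # 7 - 3*sqrt(5), ~0.2918
print(smallest_eig_inverse_power(M))          # same value, found iteratively
```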































        edited Jan 9 at 2:14

























        answered Jan 9 at 2:09









leonbloy






























