Euler ansatz for homogeneous ODEs with constant coefficients























Given a homogeneous ODE with constant coefficients:



$A_0y(x)+A_1y'(x)+A_2y''(x) + \dots + A_ny^{(n)}(x)=0 \tag{1}$



Now it's clear to me that the solution space $\mathbb L$ (the set of all solutions $y$) is a vector space.



We can solve (1) using the Ansatz: $y(x)=e^{kx}$. We get:



$\chi(k)=A_0 + A_1k + A_2k^2 + \dots + A_nk^n=0 \tag{2}$
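(Side note: the roots $k_i$ of $\chi$ can also be found numerically from the coefficient list. Below is a minimal sketch with numpy, assuming the $A_j$ are real and listed from $A_0$ up to $A_n$; numpy.roots expects the reverse order.)

```python
import numpy as np

# chi(k) = A_0 + A_1 k + ... + A_n k^n, here for y'' - 4y' + 4y = 0
A = [4.0, -4.0, 1.0]          # A_0, A_1, A_2

# np.roots wants coefficients from the highest degree down, hence the reversal
k = np.roots(A[::-1])
print(k)                      # approximately [2. 2.]: root k = 2 with multiplicity 2
```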



Now, with $k_i$ being a root of $\chi$ with multiplicity $m_i$, the general solution of the ODE is:



$y(x)=\sum_i y_i(x) \tag{3}$



with



$y_i(x)=\sum_{j=0}^{m_i-1} C_{i,j}\, x^j e^{k_i x}, \quad C_{i,j}\in\mathbb R \tag{4}$



Question: Since I expect $\mathbb L$ to be a vector space, (4) kind of makes sense. It doesn't look wrong, and I can work with it and solve such ODEs, but I can't derive it. So (1), (2), (3) are clear, but (4) isn't. How exactly do we get the $x^j$ factor in (4)?
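For what it's worth, a computer algebra system reproduces exactly this $x^j$ factor in the repeated-root case. Here is a minimal sanity check with sympy (the ODE $y''-4y'+4y=0$, with double root $k=2$, is just a convenient example I picked):

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# y'' - 4y' + 4y = 0 has chi(k) = (k - 2)^2, i.e. k = 2 with multiplicity 2
ode = sp.Eq(y(x).diff(x, 2) - 4*y(x).diff(x) + 4*y(x), 0)

print(sp.dsolve(ode, y(x)))    # y(x) = (C1 + C2*x)*exp(2*x): note the factor x

# check directly that x*e^{2x} solves the ODE
print(sp.checkodesol(ode, sp.Eq(y(x), x*sp.exp(2*x))))    # (True, 0)
```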










Tags: calculus, differential-equations






asked yesterday by xotix






















          1 Answer

































Defining $v = (y, y^{(1)}, \dots, y^{(n-1)})$, you can interpret your ODE as a first-order linear system of differential equations,



$\frac{dv}{dt} = Av$



          $v(0) = v_0$



          where $A$ is an appropriate constant matrix. Now, you can show that



$v(t) = \left(I + At + \frac{(At)^2}{2!} + \dots\right)v_0$



is a well-defined, analytic solution of this problem (in fact, the only solution). This infinite sum defines the matrix exponential $e^{At}$.
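(To make this concrete, here is a small numerical sketch, not part of the original argument, comparing a truncation of that series with scipy's built-in matrix exponential; the matrix and the truncation order are arbitrary choices.)

```python
import numpy as np
from math import factorial
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-3.0, 2.0]])    # any constant square matrix will do
t = 0.5

# partial sum I + At + (At)^2/2! + ... up to order 20
series = sum(np.linalg.matrix_power(A * t, j) / factorial(j) for j in range(21))

print(np.allclose(series, expm(A * t)))    # expected: True
```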



          Example: $y'' - 2y' + y = 0$. Defining $v = (y, y')$, we have



$v' = \begin{pmatrix} 0 & 1 \\ -1 & 2 \end{pmatrix} v$
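(Numerically you can check that this matrix has a single repeated eigenvalue; a quick sketch with numpy:)

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 2.0]])    # the matrix of the first-order system above

print(np.linalg.eigvals(A))    # both eigenvalues are (approximately) 1
print(np.poly(A))              # characteristic polynomial coefficients: [1, -2, 1]
```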



The characteristic polynomial of this matrix (and of the associated ODE) is $(k-1)^2$, so it has only one root, $k = 1$, with multiplicity $2$; for that reason the exponential of $At$ will be something of the form



$e^{At} = Q\,e^{t}\begin{pmatrix} 1 & 0 \\ t & 1 \end{pmatrix}Q^{-1}$, where $Q$ is some constant invertible matrix (look up the Jordan decomposition in the context of the matrix exponential). And that's where the $t^j$ factors come from :)
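(You can also see this symbolically; a minimal sketch with sympy, whose Matrix.exp handles the repeated eigenvalue internally:)

```python
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[0, 1],
               [-1, 2]])

E = sp.simplify((A * t).exp())    # symbolic e^{At}
print(E)
# every entry has the form (polynomial in t) * exp(t), e.g. t*exp(t):
# these are the t^j e^{k t} terms coming from the repeated root k = 1
```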






answered yesterday by M. Santos (edited yesterday; accepted)






























                     
