Solving a non-homogeneous system of differential equations



























If $X' = AX + b$, can I use the same principle as for a single first-order ODE?




What I mean by that: to solve a first-order linear ODE we multiply by the so-called integrating factor $e^{-\int a(t)\,dt}$. Can I apply the same idea here and solve the system this way?



So here is what I would do:



$$X'-AX=b$$



multiply both sides on the left by $e^{-At}$



$$(e^{-At}X)'=e^{-At}b$$



integrate both sides



$$e^{-At}X=\int e^{-At}b\,dt$$



Giving us:



$$X=e^{At}\int e^{-At}b\,dt$$



where $A \in \mathcal{M}_n(\mathbb{R})$.
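For instance, in the special case that $b$ is constant in $t$ and $A$ is invertible (just a sanity check of the formula, neither assumption is needed in general), the integral can be done explicitly:

$$\int e^{-At}b\,dt = -A^{-1}e^{-At}b + C,$$

so

$$X = e^{At}\left(-A^{-1}e^{-At}b + C\right) = -A^{-1}b + e^{At}C,$$

and indeed $X' = Ae^{At}C = A\left(X + A^{-1}b\right) = AX + b$.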



Is this derivation correct? How do I integrate a matrix or a vector? Do I integrate each component? Also, how do I raise $e$ to a matrix power? Do I use the fact that



$$e^{At}=\sum_{n=0}^{\infty}\frac{(At)^n}{n!}$$



And if $A$ is diagonalizable, I can write



$$e^{At}=P\left(\sum_{n=0}^{\infty}\frac{(Dt)^n}{n!}\right)P^{-1}$$



Now:



$$e^{At}=Pe^{Dt}P^{-1}$$



where $D$ is a diagonal matrix. But still, how do I compute $e^{Dt}$?



And what if $A$ is not diagonalizable, or its eigenvalues are complex conjugates?



How can I solve the system then?
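A rough numerical check of the formula above in Python (with arbitrary example data: $b$ taken constant, and the integral written as a definite integral from $0$ to $t$ so that an initial value $X(0)$ appears explicitly):

    # Numerical check of X(t) = e^{tA} ( X(0) + \int_0^t e^{-sA} b ds )
    # for constant b, compared against a generic ODE integrator.
    import numpy as np
    from scipy.linalg import expm
    from scipy.integrate import solve_ivp

    A = np.array([[0.0, 1.0],
                  [-2.0, -3.0]])       # arbitrary example matrix (invertible)
    b = np.array([1.0, 0.0])           # arbitrary constant inhomogeneity
    X0 = np.array([0.0, 1.0])          # arbitrary initial value X(0)

    def closed_form(t):
        # for constant b and invertible A: \int_0^t e^{-sA} b ds = A^{-1} (I - e^{-tA}) b
        integral = np.linalg.solve(A, (np.eye(2) - expm(-t * A)) @ b)
        return expm(t * A) @ (X0 + integral)

    numeric = solve_ivp(lambda t, x: A @ x + b, (0.0, 2.0), X0, rtol=1e-10, atol=1e-12)
    print(closed_form(2.0))            # closed-form value at t = 2
    print(numeric.y[:, -1])            # integrator value at t = 2 (should agree)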










  • Did you review something like math.berkeley.edu/~conway/Teaching/old/summer2016-2552bc/… and math.okstate.edu/people/binegar/4233/4233-l03.pdf and ndsu.edu/pubweb/~novozhil/Teaching/266%20Data/lecture_24.pdf?
    – Moo, Jan 15 at 13:53












  • No, but can you explain to me in more detail what really happens when you raise $e$ to a matrix?
    – C. Cristi, Jan 15 at 14:31










  • Sure, read this - cs.cornell.edu/cv/researchpdf/19ways+.pdf
    – Moo, Jan 15 at 16:07












  • Your proof is formally correct although it doesn't make much sense if you don't know about the matrix exponential. For the integral of a vector-valued function, you integrate each entry, yes. For the calculation of the matrix exponential see my answer below.
    – Christoph, Jan 16 at 5:50
















ordinary-differential-equations systems-of-equations






asked Jan 15 at 13:41









C. Cristi












1 Answer
In general you use the Jordan normal form $J$ of $A$, $A = P J P^{-1}$. The matrix $J$ can be written as $J = D + N$ with a diagonal matrix $D$ (containing the eigenvalues of $A$) and with a nilpotent matrix $N$ ($N=0$ if and only if $A$ is diagonalizable). From the power series definition of the exponential function which you already wrote down we obtain
\begin{equation}
e^{tA} = P e^{tJ} P^{-1} = P e^{t(D+N)} P^{-1} = P e^{tD} e^{tN} P^{-1}.
\end{equation}
(The last step uses that $D$ and $N$ commute, so the exponential of their sum factors.)

Now the matrix exponentials $e^{tD}$ and $e^{tN}$ are easy to evaluate:



1.
\begin{equation}
e^{tD} = \sum_{k=0}^{\infty} \frac{(tD)^k}{k!} = \sum_{k=0}^{\infty} \frac{t^k}{k!} D^k = \sum_{k=0}^{\infty} \frac{t^k}{k!} \left( \begin{array}{cccc}
d_1^k\\
& d_2^k\\
& & \ddots\\
& & & d_n^k
\end{array}
\right) = \left( \begin{array}{cccc}
\sum_{k=0}^{\infty} \frac{(td_1)^k}{k!}\\
& \sum_{k=0}^{\infty} \frac{(td_2)^k}{k!}\\
& & \ddots\\
& & & \sum_{k=0}^{\infty} \frac{(td_n)^k}{k!}
\end{array}
\right) = \left( \begin{array}{cccc}
e^{td_1}\\
& e^{td_2}\\
& & \ddots\\
& & & e^{td_n}
\end{array}
\right).
\end{equation}

2. For the matrix exponential of the nilpotent matrix $N$, which by definition satisfies $N^k = 0$, $k \geq m$, for some $m \in \mathbb{N}$, you simply obtain a finite sum of powers of $tN$,
\begin{equation}
e^{tN} = \sum_{k=0}^{m-1} \frac{(tN)^k}{k!}.
\end{equation}
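As a small illustration of how points 1 and 2 combine, take a single $2\times 2$ Jordan block with eigenvalue $\lambda$: then $J = D + N$ with $D = \lambda I$, $N = \left(\begin{array}{cc}0 & 1\\ 0 & 0\end{array}\right)$ and $N^2 = 0$, so
\begin{equation}
e^{tJ} = e^{tD} e^{tN} = e^{\lambda t}\left(I + tN\right) = \left(\begin{array}{cc} e^{\lambda t} & t\,e^{\lambda t}\\ 0 & e^{\lambda t}\end{array}\right).
\end{equation}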

Finally you compute the product $e^{tA} = P e^{tD} e^{tN} P^{-1}$.



By the way, all of this also works for complex-valued entries; there is no difference.
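If you want to check a concrete case by machine, here is a minimal SymPy sketch (the matrix below is just an arbitrary non-diagonalizable example; it relies on SymPy's `jordan_form` and the matrix `exp` method):

    # Sketch: compute e^{tA} via the Jordan decomposition A = P J P^{-1}
    # and compare with SymPy's direct matrix exponential.
    import sympy as sp

    t = sp.symbols('t')
    A = sp.Matrix([[2, 1],
                   [0, 2]])           # single Jordan block, eigenvalue 2 (not diagonalizable)

    P, J = A.jordan_form()            # A = P * J * P^{-1}
    via_jordan = P * (t * J).exp() * P.inv()
    direct = (t * A).exp()

    print(sp.simplify(via_jordan - direct))   # zero matrix: both computations agree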






        edited Jan 16 at 6:09

























        answered Jan 16 at 5:46









Christoph





























