Integral of matrix exponential














Let $A$ be an $n \times n$ matrix. Then the solution of the initial value problem
\begin{align*}
\dot{x}(t) = A x(t), \quad x(0) = x_0,
\end{align*}
is given by $x(t) = \mathrm{e}^{At} x_0$.

I am interested in the following matrix:
\begin{align*}
\int_{0}^T \mathrm{e}^{At}\, dt
\end{align*}
for some $T>0$. Can one write down a general formula for this without distinguishing cases (e.g. $A$ nonsingular)?

Is this matrix always invertible?










  • you can represent the matrix exponential by its power series
    – dato datuashvili, Jan 31 '14 at 9:52

  • en.wikipedia.org/wiki/Matrix_exponential
    – dato datuashvili, Jan 31 '14 at 9:52

  • The spectral mapping theorem says that if you take an analytic function $f$ and apply it to the matrix $A$ (its eigenvalues have to lie in the domain of analyticity of $f$), then the eigenvalues of $f(A)$ are $f(\text{eigenvalues of } A)$. The exponential is never zero, hence $\exp(A)$ is always invertible; moreover, $\exp(A)^{-1}=\exp(-A)$.
    – TZakrevskiy, Jan 31 '14 at 10:05
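The spectral-mapping claim in the last comment is easy to check numerically; the following is a minimal sketch assuming NumPy and SciPy (`expm` from `scipy.linalg`):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [0.0, 0.0]])   # a singular matrix
w = np.linalg.eigvals(A)                  # eigenvalues of A (both 0)
we = np.linalg.eigvals(expm(A))           # eigenvalues of exp(A) (both 1)

print(np.allclose(np.sort(we), np.sort(np.exp(w))))   # spectral mapping: eig(exp(A)) = exp(eig(A))
print(np.allclose(expm(A) @ expm(-A), np.eye(2)))     # exp(A)^{-1} = exp(-A)
```

So even though $A$ itself is singular, $\exp(A)$ is invertible, in line with the comment.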
















calculus linear-algebra matrices exponential-function matrix-calculus






asked Jan 31 '14 at 9:45 by samsa44; edited Oct 28 '14 at 22:48 by Yiorgos S. Smyrlis








4 Answers
Case I. If $A$ is nonsingular, then
$$
\int_0^T\mathrm{e}^{tA}\,dt=A^{-1}\big(\mathrm{e}^{TA}-I\big),
$$
where $I$ is the identity matrix.

Case II. If $A$ is singular, then using the Jordan form we can write $A$ as
$$
A=U^{-1}\left(\begin{matrix}B&0\\0&C\end{matrix}\right)U,
$$
where $C$ is nonsingular and $B$ is strictly upper triangular. Then
$$
\mathrm{e}^{tA}=U^{-1}\left(\begin{matrix}\mathrm{e}^{tB}&0\\0&\mathrm{e}^{tC}\end{matrix}\right)U,
$$
and
$$
\int_0^T\mathrm{e}^{tA}\,dt=U^{-1}\left(\begin{matrix}\int_0^T\mathrm{e}^{tB}\,dt&0\\0&C^{-1}\big(\mathrm{e}^{TC}-I\big)\end{matrix}\right)U.
$$
But $\int_0^T\mathrm{e}^{tB}\,dt$ may take different forms. For example, if
$$
B_1=\left(\begin{matrix}0&0\\0&0\end{matrix}\right), \quad
B_2=\left(\begin{matrix}0&1\\0&0\end{matrix}\right),
$$
then
$$
\int_0^T\mathrm{e}^{tB_1}\,dt=\left(\begin{matrix}T&0\\0&T\end{matrix}\right), \quad
\int_0^T\mathrm{e}^{tB_2}\,dt=\left(\begin{matrix}T&T^2/2\\0&T\end{matrix}\right).
$$
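The case distinction above can also be sidestepped numerically: the block-matrix ("Van Loan") identity $\exp\left(\left(\begin{smallmatrix}A&I\\0&0\end{smallmatrix}\right)T\right)=\left(\begin{smallmatrix}\mathrm{e}^{AT}&\int_0^T \mathrm{e}^{At}dt\\0&I\end{smallmatrix}\right)$ holds for singular $A$ as well. A minimal sketch, assuming SciPy's `expm`:

```python
import numpy as np
from scipy.linalg import expm

def integral_expm(A, T):
    """int_0^T expm(A t) dt via one larger matrix exponential (works for singular A)."""
    n = A.shape[0]
    M = np.zeros((2 * n, 2 * n))
    M[:n, :n] = A          # top-left block: A
    M[:n, n:] = np.eye(n)  # top-right block: I
    return expm(M * T)[:n, n:]   # top-right block of e^{MT} is the integral

# B2 from the answer: the integral should be [[T, T^2/2], [0, T]]
B2 = np.array([[0.0, 1.0], [0.0, 0.0]])
print(integral_expm(B2, 2.0))  # ≈ [[2., 2.], [0., 2.]]
```

For nonsingular $A$ the result agrees with $A^{-1}\big(\mathrm{e}^{TA}-I\big)$ from Case I.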






answered Jan 31 '14 at 10:06 by Yiorgos S. Smyrlis (edited Jan 15 '17 at 14:57)











  • What will we do if we use $A(t)$ instead of $tA$?
    – Nirvana, Oct 6 '14 at 8:06

  • There is nothing wrong with that.
    – Yiorgos S. Smyrlis, Oct 6 '14 at 10:22

  • @YiorgosS.Smyrlis So you are saying this will work even if $A(t)$ and $A'(t)$ do not commute?
    – Nirvana, Oct 6 '14 at 10:28

  • Here $A$ is not a constant matrix; it is a variable matrix.
    – Nirvana, Oct 6 '14 at 10:28

  • I am talking about $\int e^{A(t)}\, dt$.
    – Nirvana, Oct 6 '14 at 10:33



















The general formula is the power series

$$ \int_0^T e^{At}\, dt = T \left( I + \frac{AT}{2!} + \frac{(AT)^2}{3!} + \dots + \frac{(AT)^{n-1}}{n!} + \dots \right). $$

Note that

$$ \left(\int_0^T e^{At}\, dt \right) A + I = e^{AT} $$

is always satisfied.

A sufficient condition for this matrix to be nonsingular is the so-called Kalman–Ho–Narendra theorem, which states that the matrix $\int_0^T e^{At}\, dt$ is invertible if

$$ T(\mu - \lambda) \neq 2k \pi i $$

for every nonzero integer $k$, where $\lambda$ and $\mu$ are any pair of eigenvalues of $A$.

Note to the interested: this matrix also arises in the discretization of a continuous linear time-invariant system. It can also be shown that controllability is preserved under discretization if and only if this matrix has an inverse.
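As a quick numerical sanity check of the series and the identity above (a sketch with NumPy/SciPy; the matrix, the horizon $T$, and the truncation order 30 are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm
from math import factorial

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
T = 0.7

# Truncated power series: T * sum_{k>=0} (A T)^k / (k+1)!
S = T * sum(np.linalg.matrix_power(A * T, k) / factorial(k + 1) for k in range(30))

# The identity (int_0^T e^{At} dt) A + I = e^{AT}
residual = np.linalg.norm(S @ A + np.eye(4) - expm(A * T))
print(residual)  # tiny: the truncated series satisfies the identity essentially to machine precision
```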






answered Jan 31 '14 at 18:40 by obareey









  • Thank you. This answer looks very promising. Btw, I am looking at this matrix exactly because of what you said about the discretization!
    – samsa44, Feb 24 '14 at 11:19





















A Python numerical answer

It is surprisingly difficult to find a proper Python package for numerically integrating a matrix-valued function. I know this is not what the question asks for, but I cannot find anywhere else to publish it.

Here I provide a numerical solution. Calling the function

integral_result = compute_exp_matrix_integration(A, T)

will be enough:

import numpy as np
from scipy.linalg import expm  # matrix exponential

def compute_exp_matrix_integration(A, T, nbins=100):
    # Approximate int_0^T expm(A*t) dt with the trapezoidal rule.
    xv = np.linspace(0, T, nbins)
    # Stack expm(A*t) for each sample point t: shape (nbins, n, n)
    result = np.array([expm(A * x) for x in xv])
    # Integrate entrywise along the sample axis
    return np.trapz(result, xv, axis=0)





answered Jun 3 '18 at 21:13 by ArtificiallyIntelligence









  • I get a shape error in result = ... . Also, to whoever it may help: expm is from scipy.linalg.
    – anderstood, Jan 29 at 14:13



















Another Python implementation

Here is another implementation in Python, in case it is helpful to anyone (and since ArtificiallyIntelligence's answer returns an error in my setup).
The value of the integral is integral, and the last line verifies the equality $\int_0^t e^{As}\, ds = A^{-1}(e^{tA}-I)$, which holds provided $A$ is nonsingular (generically the case for randomly generated matrices).

import numpy as np
import scipy.linalg
from math import factorial

N = 5
t = 1
A = np.random.rand(N, N)
# Truncated series: int_0^t e^{As} ds = t * sum_k (A*t)^k / (k+1)!
taylor = t * np.array([np.linalg.matrix_power(A * t, k) / factorial(k + 1) for k in range(50)])
integral = taylor.sum(axis=0)

print(np.linalg.norm(integral - np.dot(np.linalg.inv(A), scipy.linalg.expm(t * A) - np.identity(N))))

Note that you should adjust the 50 in taylor = ... to check convergence.






    Your Answer





    StackExchange.ifUsing("editor", function () {
    return StackExchange.using("mathjaxEditing", function () {
    StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
    StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
    });
    });
    }, "mathjax-editing");

    StackExchange.ready(function() {
    var channelOptions = {
    tags: "".split(" "),
    id: "69"
    };
    initTagRenderer("".split(" "), "".split(" "), channelOptions);

    StackExchange.using("externalEditor", function() {
    // Have to fire editor after snippets, if snippets enabled
    if (StackExchange.settings.snippets.snippetsEnabled) {
    StackExchange.using("snippets", function() {
    createEditor();
    });
    }
    else {
    createEditor();
    }
    });

    function createEditor() {
    StackExchange.prepareEditor({
    heartbeatType: 'answer',
    autoActivateHeartbeat: false,
    convertImagesToLinks: true,
    noModals: true,
    showLowRepImageUploadWarning: true,
    reputationToPostImages: 10,
    bindNavPrevention: true,
    postfix: "",
    imageUploader: {
    brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
    contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
    allowUrls: true
    },
    noCode: true, onDemand: true,
    discardSelector: ".discard-answer"
    ,immediatelyShowMarkdownHelp:true
    });


    }
    });














    draft saved

    draft discarded


















    StackExchange.ready(
    function () {
    StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f658276%2fintegral-of-matrix-exponential%23new-answer', 'question_page');
    }
    );

    Post as a guest















    Required, but never shown

























    4 Answers
    4






    active

    oldest

    votes








    4 Answers
    4






    active

    oldest

    votes









    active

    oldest

    votes






    active

    oldest

    votes









    21












    $begingroup$

    Case I. If $A$ is nonsingular, then
    $$
    int_0^Tmathrm{e}^{tA},dt=A^{-1}big(mathrm{e}^{TA}-Ibig),
    $$
    where $I$ is the identity matrix.



    Case II. If $A$ is singular, then using the Jordan form we can write $A$ as
    $$
    A=U^{-1}left(begin{matrix}B&0\0&Cend{matrix}right)U,
    $$
    where $C$ is nonsingular, and $B$ is strictly upper triangular. Then
    $$
    mathrm{e}^{tA}=U^{-1}left(begin{matrix}mathrm{e}^{tB}&0\0&mathrm{e}^{tC}
    end{matrix}right)U,
    $$
    and
    $$
    int_0^Tmathrm{e}^{tA},dt=U^{-1}left(begin{matrix}int_0^Tmathrm{e}^{tB}dt&0\0&C^{-1}big(mathrm{e}^{TC}-Ibig)
    end{matrix}right)U
    $$
    But $int_0^Tmathrm{e}^{tB}dt$ may have different expressions. For example if
    $$
    B_1=left(begin{matrix}0&0\0&0end{matrix}right), quad
    B_2=left(begin{matrix}0&1\0&0end{matrix}right),
    $$
    then
    $$
    int_0^Tmathrm{e}^{tB_1}dt=left(begin{matrix}T&0\0&Tend{matrix}right), quad
    int_0^Tmathrm{e}^{tB_2}dt=left(begin{matrix}T&T^2/2\0&Tend{matrix}right).
    $$






    share|cite|improve this answer











    $endgroup$













    • $begingroup$
      What we will do if we use $A(t)$ instead of $tA$
      $endgroup$
      – Nirvana
      Oct 6 '14 at 8:06










    • $begingroup$
      There is nothing wrong with that.
      $endgroup$
      – Yiorgos S. Smyrlis
      Oct 6 '14 at 10:22










    • $begingroup$
      @ Yiorgos S. Smyrlis So you are saying even if $A(t)$ and $A^{'}(t)$ are not commutative,this will work
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:28












    • $begingroup$
      Here A is not a constant matrix. It is a variable matrix
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:28










    • $begingroup$
      I am talking about $int e^{A(t)} dt$
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:33
















    21












    $begingroup$

    Case I. If $A$ is nonsingular, then
    $$
    int_0^Tmathrm{e}^{tA},dt=A^{-1}big(mathrm{e}^{TA}-Ibig),
    $$
    where $I$ is the identity matrix.



    Case II. If $A$ is singular, then using the Jordan form we can write $A$ as
    $$
    A=U^{-1}left(begin{matrix}B&0\0&Cend{matrix}right)U,
    $$
    where $C$ is nonsingular, and $B$ is strictly upper triangular. Then
    $$
    mathrm{e}^{tA}=U^{-1}left(begin{matrix}mathrm{e}^{tB}&0\0&mathrm{e}^{tC}
    end{matrix}right)U,
    $$
    and
    $$
    int_0^Tmathrm{e}^{tA},dt=U^{-1}left(begin{matrix}int_0^Tmathrm{e}^{tB}dt&0\0&C^{-1}big(mathrm{e}^{TC}-Ibig)
    end{matrix}right)U
    $$
    But $int_0^Tmathrm{e}^{tB}dt$ may have different expressions. For example if
    $$
    B_1=left(begin{matrix}0&0\0&0end{matrix}right), quad
    B_2=left(begin{matrix}0&1\0&0end{matrix}right),
    $$
    then
    $$
    int_0^Tmathrm{e}^{tB_1}dt=left(begin{matrix}T&0\0&Tend{matrix}right), quad
    int_0^Tmathrm{e}^{tB_2}dt=left(begin{matrix}T&T^2/2\0&Tend{matrix}right).
    $$






    share|cite|improve this answer











    $endgroup$













    • $begingroup$
      What we will do if we use $A(t)$ instead of $tA$
      $endgroup$
      – Nirvana
      Oct 6 '14 at 8:06










    • $begingroup$
      There is nothing wrong with that.
      $endgroup$
      – Yiorgos S. Smyrlis
      Oct 6 '14 at 10:22










    • $begingroup$
      @ Yiorgos S. Smyrlis So you are saying even if $A(t)$ and $A^{'}(t)$ are not commutative,this will work
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:28












    • $begingroup$
      Here A is not a constant matrix. It is a variable matrix
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:28










    • $begingroup$
      I am talking about $int e^{A(t)} dt$
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:33














    21












    21








    21





    $begingroup$

    Case I. If $A$ is nonsingular, then
    $$
    int_0^Tmathrm{e}^{tA},dt=A^{-1}big(mathrm{e}^{TA}-Ibig),
    $$
    where $I$ is the identity matrix.



    Case II. If $A$ is singular, then using the Jordan form we can write $A$ as
    $$
    A=U^{-1}left(begin{matrix}B&0\0&Cend{matrix}right)U,
    $$
    where $C$ is nonsingular, and $B$ is strictly upper triangular. Then
    $$
    mathrm{e}^{tA}=U^{-1}left(begin{matrix}mathrm{e}^{tB}&0\0&mathrm{e}^{tC}
    end{matrix}right)U,
    $$
    and
    $$
    int_0^Tmathrm{e}^{tA},dt=U^{-1}left(begin{matrix}int_0^Tmathrm{e}^{tB}dt&0\0&C^{-1}big(mathrm{e}^{TC}-Ibig)
    end{matrix}right)U
    $$
    But $int_0^Tmathrm{e}^{tB}dt$ may have different expressions. For example if
    $$
    B_1=left(begin{matrix}0&0\0&0end{matrix}right), quad
    B_2=left(begin{matrix}0&1\0&0end{matrix}right),
    $$
    then
    $$
    int_0^Tmathrm{e}^{tB_1}dt=left(begin{matrix}T&0\0&Tend{matrix}right), quad
    int_0^Tmathrm{e}^{tB_2}dt=left(begin{matrix}T&T^2/2\0&Tend{matrix}right).
    $$






    share|cite|improve this answer











    $endgroup$



    Case I. If $A$ is nonsingular, then
    $$
    int_0^Tmathrm{e}^{tA},dt=A^{-1}big(mathrm{e}^{TA}-Ibig),
    $$
    where $I$ is the identity matrix.



    Case II. If $A$ is singular, then using the Jordan form we can write $A$ as
    $$
    A=U^{-1}left(begin{matrix}B&0\0&Cend{matrix}right)U,
    $$
    where $C$ is nonsingular, and $B$ is strictly upper triangular. Then
    $$
    mathrm{e}^{tA}=U^{-1}left(begin{matrix}mathrm{e}^{tB}&0\0&mathrm{e}^{tC}
    end{matrix}right)U,
    $$
    and
    $$
    int_0^Tmathrm{e}^{tA},dt=U^{-1}left(begin{matrix}int_0^Tmathrm{e}^{tB}dt&0\0&C^{-1}big(mathrm{e}^{TC}-Ibig)
    end{matrix}right)U
    $$
    But $int_0^Tmathrm{e}^{tB}dt$ may have different expressions. For example if
    $$
    B_1=left(begin{matrix}0&0\0&0end{matrix}right), quad
    B_2=left(begin{matrix}0&1\0&0end{matrix}right),
    $$
    then
    $$
    int_0^Tmathrm{e}^{tB_1}dt=left(begin{matrix}T&0\0&Tend{matrix}right), quad
    int_0^Tmathrm{e}^{tB_2}dt=left(begin{matrix}T&T^2/2\0&Tend{matrix}right).
    $$







    share|cite|improve this answer














    share|cite|improve this answer



    share|cite|improve this answer








    edited Jan 15 '17 at 14:57

























    answered Jan 31 '14 at 10:06









    Yiorgos S. SmyrlisYiorgos S. Smyrlis

    63.7k1385165




    63.7k1385165












    • $begingroup$
      What we will do if we use $A(t)$ instead of $tA$
      $endgroup$
      – Nirvana
      Oct 6 '14 at 8:06










    • $begingroup$
      There is nothing wrong with that.
      $endgroup$
      – Yiorgos S. Smyrlis
      Oct 6 '14 at 10:22










    • $begingroup$
      @ Yiorgos S. Smyrlis So you are saying even if $A(t)$ and $A^{'}(t)$ are not commutative,this will work
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:28












    • $begingroup$
      Here A is not a constant matrix. It is a variable matrix
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:28










    • $begingroup$
      I am talking about $int e^{A(t)} dt$
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:33


















    • $begingroup$
      What we will do if we use $A(t)$ instead of $tA$
      $endgroup$
      – Nirvana
      Oct 6 '14 at 8:06










    • $begingroup$
      There is nothing wrong with that.
      $endgroup$
      – Yiorgos S. Smyrlis
      Oct 6 '14 at 10:22










    • $begingroup$
      @ Yiorgos S. Smyrlis So you are saying even if $A(t)$ and $A^{'}(t)$ are not commutative,this will work
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:28












    • $begingroup$
      Here A is not a constant matrix. It is a variable matrix
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:28










    • $begingroup$
      I am talking about $int e^{A(t)} dt$
      $endgroup$
      – Nirvana
      Oct 6 '14 at 10:33
















    $begingroup$
    What we will do if we use $A(t)$ instead of $tA$
    $endgroup$
    – Nirvana
    Oct 6 '14 at 8:06




    $begingroup$
    What we will do if we use $A(t)$ instead of $tA$
    $endgroup$
    – Nirvana
    Oct 6 '14 at 8:06












    $begingroup$
    There is nothing wrong with that.
    $endgroup$
    – Yiorgos S. Smyrlis
    Oct 6 '14 at 10:22




    $begingroup$
    There is nothing wrong with that.
    $endgroup$
    – Yiorgos S. Smyrlis
    Oct 6 '14 at 10:22












    $begingroup$
    @ Yiorgos S. Smyrlis So you are saying even if $A(t)$ and $A^{'}(t)$ are not commutative,this will work
    $endgroup$
    – Nirvana
    Oct 6 '14 at 10:28






    $begingroup$
    @ Yiorgos S. Smyrlis So you are saying even if $A(t)$ and $A^{'}(t)$ are not commutative,this will work
    $endgroup$
    – Nirvana
    Oct 6 '14 at 10:28














    $begingroup$
    Here A is not a constant matrix. It is a variable matrix
    $endgroup$
    – Nirvana
    Oct 6 '14 at 10:28




    $begingroup$
    Here A is not a constant matrix. It is a variable matrix
    $endgroup$
    – Nirvana
    Oct 6 '14 at 10:28












    $begingroup$
    I am talking about $int e^{A(t)} dt$
    $endgroup$
    – Nirvana
    Oct 6 '14 at 10:33




    $begingroup$
    I am talking about $int e^{A(t)} dt$
    $endgroup$
    – Nirvana
    Oct 6 '14 at 10:33











    18












    $begingroup$

    The general formula is the power series



    $$ int_0^T e^{At} dt = T left( I + frac{AT}{2!} + frac{(AT)^2}{3!} + dots + frac{(AT)^{n-1}}{n!} + dots right) $$



    Note that also



    $$ left(int_0^T e^{At} dt right) A + I = e^{AT} $$



    is always satisfied.



    A sufficient condition for this matrix to be non-singular is the so-called Kalman-Ho-Narendra Theorem, which states that the matrix $int_0^T e^{At} dt$ is invertible if



    $$ T(mu - lambda) neq 2k pi i $$



    for any nonzero integer $k$, where $lambda$ and $mu$ are any pair of eigenvalues of $A$.



    Note to the interested: This matrix also comes from the discretization of a continuous linear time invariant system. It can also be said that controllability is preserved under discretization if and only if this matrix has an inverse.






    share|cite|improve this answer









    $endgroup$













    • $begingroup$
      Thank you. This answer looks very promising. Btw, I am looking at this matrix exactly because of what you said about the discretization!
      $endgroup$
      – samsa44
      Feb 24 '14 at 11:19


















    18












    $begingroup$

    The general formula is the power series



    $$ int_0^T e^{At} dt = T left( I + frac{AT}{2!} + frac{(AT)^2}{3!} + dots + frac{(AT)^{n-1}}{n!} + dots right) $$



    Note that also



    $$ left(int_0^T e^{At} dt right) A + I = e^{AT} $$



    is always satisfied.



    A sufficient condition for this matrix to be non-singular is the so-called Kalman-Ho-Narendra Theorem, which states that the matrix $int_0^T e^{At} dt$ is invertible if



    $$ T(mu - lambda) neq 2k pi i $$



    for any nonzero integer $k$, where $lambda$ and $mu$ are any pair of eigenvalues of $A$.



    Note to the interested: This matrix also comes from the discretization of a continuous linear time invariant system. It can also be said that controllability is preserved under discretization if and only if this matrix has an inverse.






    share|cite|improve this answer









    $endgroup$













    • $begingroup$
      Thank you. This answer looks very promising. Btw, I am looking at this matrix exactly because of what you said about the discretization!
      $endgroup$
      – samsa44
      Feb 24 '14 at 11:19
















    18












    18








    18





    $begingroup$

    The general formula is the power series



    $$ int_0^T e^{At} dt = T left( I + frac{AT}{2!} + frac{(AT)^2}{3!} + dots + frac{(AT)^{n-1}}{n!} + dots right) $$



    Note that also



    $$ left(int_0^T e^{At} dt right) A + I = e^{AT} $$



    is always satisfied.



    A sufficient condition for this matrix to be non-singular is the so-called Kalman-Ho-Narendra Theorem, which states that the matrix $int_0^T e^{At} dt$ is invertible if



    $$ T(mu - lambda) neq 2k pi i $$



    for any nonzero integer $k$, where $lambda$ and $mu$ are any pair of eigenvalues of $A$.



    Note to the interested: This matrix also comes from the discretization of a continuous linear time invariant system. It can also be said that controllability is preserved under discretization if and only if this matrix has an inverse.






    share|cite|improve this answer









    $endgroup$



    The general formula is the power series



    $$ int_0^T e^{At} dt = T left( I + frac{AT}{2!} + frac{(AT)^2}{3!} + dots + frac{(AT)^{n-1}}{n!} + dots right) $$



    Note that also



    $$ left(int_0^T e^{At} dt right) A + I = e^{AT} $$



    is always satisfied.



    A sufficient condition for this matrix to be non-singular is the so-called Kalman-Ho-Narendra Theorem, which states that the matrix $int_0^T e^{At} dt$ is invertible if



    $$ T(mu - lambda) neq 2k pi i $$



    for any nonzero integer $k$, where $lambda$ and $mu$ are any pair of eigenvalues of $A$.



    Note to the interested: This matrix also comes from the discretization of a continuous linear time invariant system. It can also be said that controllability is preserved under discretization if and only if this matrix has an inverse.







    share|cite|improve this answer












    share|cite|improve this answer



    share|cite|improve this answer










    answered Jan 31 '14 at 18:40









    obareeyobareey

    3,06911128




    3,06911128












    • $begingroup$
      Thank you. This answer looks very promising. Btw, I am looking at this matrix exactly because of what you said about the discretization!
      $endgroup$
      – samsa44
      Feb 24 '14 at 11:19




















    • $begingroup$
      Thank you. This answer looks very promising. Btw, I am looking at this matrix exactly because of what you said about the discretization!
      $endgroup$
      – samsa44
      Feb 24 '14 at 11:19


















    $begingroup$
    Thank you. This answer looks very promising. Btw, I am looking at this matrix exactly because of what you said about the discretization!
    $endgroup$
    – samsa44
    Feb 24 '14 at 11:19






    $begingroup$
    Thank you. This answer looks very promising. Btw, I am looking at this matrix exactly because of what you said about the discretization!
    $endgroup$
    – samsa44
    Feb 24 '14 at 11:19













    3












    $begingroup$

    A Python numerical answer



    It is surprisingly difficult to find a proper python package for numerical integration of matrix. I know it is not what the question want but I cannot find anywhere else to publish this.



    Here, I provide a numerical solution to it. Just call the function



    intergral_result = compute_exp_matrix_intergration(A,T)


    will be enough



    import numpy as np
    def compute_exp_matrix_intergration(A,T,nbins=100):
    f = lambda x: expm(A*x)
    xv = np.linspace(0,T,nbins)
    result = np.apply_along_axis(f,0,xv.reshape(1,-1))
    return np.trapz(result,xv)





    share|cite|improve this answer









    $endgroup$













    • $begingroup$
      I get a shape error in result = ... . Also, to whoever it may help: the expm is from scipy.linalg.
      $endgroup$
      – anderstood
      Jan 29 at 14:13
















    3












    $begingroup$

    A Python numerical answer



    It is surprisingly difficult to find a proper python package for numerical integration of matrix. I know it is not what the question want but I cannot find anywhere else to publish this.



    Here, I provide a numerical solution to it. Just call the function



    intergral_result = compute_exp_matrix_intergration(A,T)


    will be enough



    import numpy as np
    def compute_exp_matrix_intergration(A,T,nbins=100):
    f = lambda x: expm(A*x)
    xv = np.linspace(0,T,nbins)
    result = np.apply_along_axis(f,0,xv.reshape(1,-1))
    return np.trapz(result,xv)





    share|cite|improve this answer









    $endgroup$













    • $begingroup$
      I get a shape error in result = ... . Also, to whoever it may help: the expm is from scipy.linalg.
      $endgroup$
      – anderstood
      Jan 29 at 14:13














    3












    3








    3





    $begingroup$

    A Python numerical answer



    It is surprisingly difficult to find a proper python package for numerical integration of matrix. I know it is not what the question want but I cannot find anywhere else to publish this.



    Here, I provide a numerical solution to it. Just call the function



    intergral_result = compute_exp_matrix_intergration(A,T)


    will be enough



    import numpy as np
    def compute_exp_matrix_intergration(A,T,nbins=100):
    f = lambda x: expm(A*x)
    xv = np.linspace(0,T,nbins)
    result = np.apply_along_axis(f,0,xv.reshape(1,-1))
    return np.trapz(result,xv)





    share|cite|improve this answer









    $endgroup$



    A Python numerical answer



    It is surprisingly difficult to find a proper python package for numerical integration of matrix. I know it is not what the question want but I cannot find anywhere else to publish this.



    Here, I provide a numerical solution to it. Just call the function



    intergral_result = compute_exp_matrix_intergration(A,T)


    will be enough



    import numpy as np
    def compute_exp_matrix_intergration(A,T,nbins=100):
    f = lambda x: expm(A*x)
    xv = np.linspace(0,T,nbins)
    result = np.apply_along_axis(f,0,xv.reshape(1,-1))
    return np.trapz(result,xv)






    share|cite|improve this answer












    share|cite|improve this answer



    share|cite|improve this answer










    answered Jun 3 '18 at 21:13









    ArtificiallyIntelligenceArtificiallyIntelligence

    310112




    310112












    • $begingroup$
      I get a shape error in result = ... . Also, to whoever it may help: the expm is from scipy.linalg.
      $endgroup$
      – anderstood
      Jan 29 at 14:13






























    $begingroup$

    Another Python implementation

    Here is another implementation in Python, in case it is helpful to anyone (and since ArtificiallyIntelligence's answer returns an error in my setup).
    The value of the integral is integral, and the last line verifies the identity $\int_0^t \mathrm{e}^{As}\, ds = A^{-1}(\mathrm{e}^{tA}-I)$, valid provided $A$ is nonsingular (which is generically the case for randomly generated matrices).

    import math
    import numpy as np
    import scipy.linalg

    N = 5
    t = 1
    A = np.random.rand(N, N)
    # Partial sum of the Taylor series: int_0^t e^{As} ds = sum_k A^k t^{k+1} / (k+1)!
    taylor = t * np.array([np.linalg.matrix_power(A * t, k) / math.factorial(k + 1) for k in range(50)])
    integral = taylor.sum(axis=0)

    print(np.linalg.norm(integral - np.dot(np.linalg.inv(A), scipy.linalg.expm(t * A) - np.identity(N))))

    Note that you should adjust the 50 in taylor = ... to check convergence.
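    For completeness, here is a sketch (not part of the original answers) of an exact approach that needs no case distinction on $A$: a standard block-matrix identity says the top-right $n \times n$ block of expm of [[A, I], [0, 0]] scaled by $T$ equals $\int_0^T \mathrm{e}^{As}\, ds$, whether or not $A$ is singular. The function name integral_expm below is my own choice.

    ```python
    import numpy as np
    from scipy.linalg import expm

    def integral_expm(A, T):
        """Compute int_0^T expm(A*s) ds exactly via a block-matrix exponential.

        Works whether or not A is singular: the top-right n-by-n block of
        expm([[A, I], [0, 0]] * T) equals the desired integral.
        """
        n = A.shape[0]
        M = np.zeros((2 * n, 2 * n))
        M[:n, :n] = A
        M[:n, n:] = np.eye(n)
        return expm(M * T)[:n, n:]
    ```

    For $A = 0$ this returns $T \cdot I$, as the series expansion predicts, and for nonsingular $A$ it agrees with $A^{-1}(\mathrm{e}^{TA} - I)$.
    
    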






    $endgroup$


















        answered Jan 30 at 9:56









        anderstood





























