Euler-Ansatz for hom. ODEs with constant coefficients
Given is a homogeneous ODE with constant coefficients:
$A_0y(x)+A_1y'(x)+A_2y''(x) + \dots + A_ny^{(n)}(x)=0 \tag{1}$
It is clear to me that the solution space $\mathbb L$ is a vector space.
We can solve (1) using the Ansatz $y(x)=e^{kx}$. We get the characteristic polynomial:
$\chi(k)=A_0 + A_1k + A_2k^2 + \dots + A_nk^n=0 \tag{2}$
Now, with $k_i$ being a root of $\chi$ with multiplicity $m_i$, the general solution of the ODE is
$y(x)=\sum_i y_i(x) \tag{3}$
with
$y_i(x)=\sum_{j=0}^{m_i-1} C_{i,j}\, x^j e^{k_i x}, \quad C_{i,j}\in\mathbb R \tag{4}$
Question: Since I expect $\mathbb L$ to be a vector space, (4) makes some sense: it doesn't look wrong, and I can work with it and solve such ODEs, but I can't derive it. So (1), (2), (3) are clear, but (4) isn't. How exactly do we get the $x^j$ factors in (4)?
calculus differential-equations
asked yesterday
xotix
1 Answer
Defining $v = (y, y^{(1)}, \dots, y^{(n-1)})$, you can interpret your ODE as a first-order linear system of differential equations,
$\frac{dv}{dt} = Av,$
$v(0) = v_0,$
where $A$ is an appropriate constant matrix. Now, you can show that
$v(t) = \left(I + At + \frac{(At)^2}{2!} + \dots\right)v_0$
is a well-defined analytic solution of this problem (in fact, the only solution). This infinite sum defines the exponential of a matrix, $e^{At}$.
Example: $y'' - 2y' + y = 0$. Defining $v = (y, y')$, we have
$v' = \begin{pmatrix} 0 & 1 \\ -1 & 2 \end{pmatrix} v.$
The characteristic polynomial of this matrix (and of the associated ODE) has only one root, $k = 1$ with multiplicity $2$, and for that reason the exponential of $At$ will be something of the form
$e^{At} = Qe^{t}\begin{pmatrix} 1 & 0 \\ t & 1 \end{pmatrix}Q^{-1}$, where $Q$ is some constant invertible matrix (search for the Jordan decomposition, in the context of the exponential of a matrix). And that's where the $t^j$ will come from :)
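The example can be checked numerically: for $y'' - 2y' + y = 0$ the first component of $e^{At}v_0$ should match the closed form $(C_0 + C_1 t)e^t$ with $C_0 = y(0)$ and $C_1 = y'(0) - y(0)$. A minimal sketch, assuming NumPy and SciPy are available; the initial values $y(0)=1$, $y'(0)=3$ are arbitrary choices for illustration:

```python
import numpy as np
from scipy.linalg import expm

# Companion matrix of y'' - 2y' + y = 0, acting on v = (y, y')
A = np.array([[0.0, 1.0],
              [-1.0, 2.0]])
v0 = np.array([1.0, 3.0])  # y(0) = 1, y'(0) = 3

for t in [0.0, 0.5, 1.0, 2.0]:
    y_matrix = (expm(A * t) @ v0)[0]        # first component of e^{At} v0 is y(t)
    y_closed = (1.0 + 2.0 * t) * np.exp(t)  # C0 = y(0) = 1, C1 = y'(0) - y(0) = 2
    assert abs(y_matrix - y_closed) < 1e-8
```

The agreement at every $t$ reflects exactly the Jordan-block structure above: the double root $k=1$ forces the linear-in-$t$ factor into $e^{At}$.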
edited yesterday
answered yesterday
M. Santos