Is there a short proof of uniqueness of solutions of a linear two-dimensional non-autonomous system of ODEs?

I am looking for a short and simple proof, accessible to an economist with a calculus background, of the uniqueness of solutions of a linear two-dimensional non-autonomous system of ODEs.










  • How good is the economist with the Grönwall lemma? Its terms for a start and, more advanced, some of the proof? This is the shortest route to the uniqueness proof. The other route takes the full proof of the existence theorem and extracts uniqueness from the uniqueness part of the Banach fixed-point theorem used there.
    – LutzL, Jan 15 at 19:26










  • Not too happy with the Cauchy theorem of existence and uniqueness. We are jointly working on an economic problem (I do the math part) and he insists on understanding the proof of uniqueness of solutions of the linear system we are working on. I cite the Cauchy theorem, then prove that our system satisfies the conditions of the theorem. I guess the path cannot be straighter.
    – Zviad Khukhunashvili, Jan 15 at 19:39
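For reference, the integral form of Grönwall's inequality mentioned above, in its standard statement with a constant $\alpha$ and continuous $u$, $\beta\ge 0$ (not part of the original thread):
$$
u(t)\le \alpha+\int_{t_0}^{t}\beta(s)\,u(s)\,ds \quad\text{for } t\ge t_0
\qquad\Longrightarrow\qquad
u(t)\le \alpha\,\exp\!\left(\int_{t_0}^{t}\beta(s)\,ds\right).
$$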
















ordinary-differential-equations






asked Jan 15 at 19:16









Zviad Khukhunashvili



























1 Answer


















In a linear system $y'(t)=A(t)y(t)+b(t)$, the difference $u=y-x$ of two solutions $x,y$ is itself a solution of the homogeneous system
$$
u'(t)=y'(t)-x'(t)=A(t)(y(t)-x(t))=A(t)u(t).
$$

Now apply a vector norm and the associated matrix norm:
$$
\|u'(t)\|\le \|A(t)\|\,\|u(t)\|.
$$

By the Grönwall lemma, this yields the upper bound
$$
\|u(t)\|\le \exp\left(\int_{t_0}^t\|A(s)\|\,ds\right)\|u(t_0)\|.
$$

So if the two solutions agree at $t_0$, then $u(t_0)=0$, and the bound forces $u(t)=0$, i.e. the solutions agree for every $t>t_0$. A similar argument works for $t<t_0$.





To get that bound, first consider the exact equation $d'(t)=\|A(t)\|\,d(t)$, which has the solution $d(t)=e^{c(t)}d(t_0)$ with $c'(t)=\|A(t)\|$, $c(t_0)=0$. Now consider the difference $h_a(t)=\|u(t)\|-(\|u(t_0)\|+a)e^{c(t)}$ for some $a>0$, using the integral identities and inequalities
\begin{align}
\|u(t)\|&=\left\|u(t_0)+\int_{t_0}^t u'(s)\,ds\right\|
\le\|u(t_0)\|+\int_{t_0}^t\left\|u'(s)\right\|\,ds
\\
&\le \|u(t_0)\|+\int_{t_0}^t\|A(s)\|\,\|u(s)\|\,ds
\\
\\
e^{c(t)}&=1+\int_{t_0}^te^{c(s)}c'(s)\,ds
\\[1em]\hline
h_a(t)=\|u(t)\|-(\|u(t_0)\|+a)e^{c(t)}&
\le-a+\int_{t_0}^t\|A(s)\|\,\Bigl(\|u(s)\|-(\|u(t_0)\|+a)e^{c(s)}\Bigr)\,ds
\\&=-a+\int_{t_0}^t\|A(s)\|\,h_a(s)\,ds.
\end{align}

From this one concludes that there can be no $t$ with $h_a(t)\ge 0$: otherwise there would be a minimal such $t$ with $h_a(t)=0$, and since $h_a(s)<0$ for $s<t$, the last inequality would give $h_a(t)\le-a<0$, a contradiction.



Now as
$$
\|u(t)\|<(\|u(t_0)\|+a)e^{c(t)}
$$
for all $a>0$, it follows in the limit $a\to 0$ that
$$
\|u(t)\|\le\|u(t_0)\|e^{c(t)}.
$$
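As a sanity check, the bound can be tested numerically. The sketch below is illustrative only: the particular matrix $A(t)$ and the fixed-step RK4 integrator are invented choices, not part of the answer. It integrates $u'=A(t)u$, using the max norm on vectors and its induced row-sum norm on matrices, and verifies that $\|u(t_1)\|$ stays below the Grönwall bound:

```python
import math

def A(t):
    # Illustrative 2x2 non-autonomous coefficient matrix (an assumption).
    return [[0.0, 1.0], [-1.0, -0.5 * math.sin(t)]]

def matvec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def max_norm(v):
    return max(abs(v[0]), abs(v[1]))

def row_sum_norm(M):
    # Matrix norm induced by the max norm on vectors.
    return max(abs(M[0][0]) + abs(M[0][1]), abs(M[1][0]) + abs(M[1][1]))

def rk4(t0, t1, u0, steps):
    # Classical fixed-step Runge-Kutta integration of u' = A(t) u.
    h = (t1 - t0) / steps
    t, u = t0, list(u0)
    for _ in range(steps):
        k1 = matvec(A(t), u)
        k2 = matvec(A(t + h / 2), [u[i] + h / 2 * k1[i] for i in range(2)])
        k3 = matvec(A(t + h / 2), [u[i] + h / 2 * k2[i] for i in range(2)])
        k4 = matvec(A(t + h), [u[i] + h * k3[i] for i in range(2)])
        u = [u[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
             for i in range(2)]
        t += h
    return u

t0, t1 = 0.0, 2.0
u0 = [1e-3, 0.0]              # difference of the two solutions at t0
uT = rk4(t0, t1, u0, 2000)

# Trapezoidal estimate of  int_{t0}^{t1} ||A(s)|| ds.
m = 1000
h = (t1 - t0) / m
integral = sum(h / 2 * (row_sum_norm(A(t0 + i * h))
                        + row_sum_norm(A(t0 + (i + 1) * h)))
               for i in range(m))
bound = math.exp(integral) * max_norm(u0)

assert max_norm(uT) <= bound  # the Groenwall bound holds
```

With $u(t_0)=0$ the bound collapses to $0$, which is exactly the uniqueness statement.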



















  • Could you give a textbook citation for this?
    – Zviad Khukhunashvili, Jan 15 at 20:13










  • How do you define a matrix norm?
    – Zviad Khukhunashvili, Jan 15 at 20:31












  • You can take the max norm for the vectors; then the associated matrix norm is the row-sum norm. This should be simple enough.
    – LutzL, Jan 15 at 20:55










  • May I ask how to apply the Grönwall lemma to the normed inequality? In the sources I have seen, the lemma is stated for the one-dimensional real-valued case.
    – Zviad Khukhunashvili, Jan 23 at 19:12












  • Consider the function $d(t)=\|u(t)\|$; then $|d'(t)|\le \|u'(t)\|$ wherever $d$ is differentiable, so the inequality of the vector case reduces to the scalar case.
    – LutzL, Jan 23 at 19:20
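The norm comment above can be illustrated numerically. This sketch (random test matrices, nothing beyond the standard library) checks that the row-sum norm is exactly the operator norm induced by the max norm, i.e. the smallest constant $C$ with $\|Mv\|_\infty\le C\|v\|_\infty$:

```python
import random

def max_norm(v):
    return max(abs(c) for c in v)

def row_sum_norm(M):
    # Maximum over rows of the sum of absolute entries.
    return max(sum(abs(c) for c in row) for row in M)

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

random.seed(0)
M = [[random.uniform(-2.0, 2.0) for _ in range(2)] for _ in range(2)]

# Sub-multiplicativity: ||Mv||_inf <= ||M||_rowsum * ||v||_inf for all v.
for _ in range(1000):
    w = [random.uniform(-1.0, 1.0) for _ in range(2)]
    assert max_norm(matvec(M, w)) <= row_sum_norm(M) * max_norm(w) + 1e-12

# The constant is attained at the sign pattern of the dominant row,
# so no smaller constant works.
i = max(range(2), key=lambda r: sum(abs(c) for c in M[r]))
v = [1.0 if M[i][j] >= 0 else -1.0 for j in range(2)]
assert abs(max_norm(matvec(M, v)) - row_sum_norm(M)) < 1e-12
```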











edited Jan 24 at 13:06

























answered Jan 15 at 19:53









LutzL











