Is there a short proof of uniqueness of solutions of a linear two-dimensional non-autonomous system of ODEs?
I am looking for a short and simple proof (accessible to an economist with a calculus background) of the uniqueness of solutions of a linear two-dimensional non-autonomous system of ODEs.
ordinary-differential-equations
How comfortable is the economist with the Grönwall lemma? Its statement for a start and, more advanced, some of its proof? That is the shortest route to the uniqueness proof. The other route takes the full proof of the existence theorem and extracts uniqueness from the uniqueness part of the Banach fixed-point theorem used therein.
– LutzL
Jan 15 at 19:26
Not too happy with the Cauchy theorem of existence and uniqueness. We are jointly working on an economic problem (I do the math part), and he insists on understanding the proof of uniqueness of solutions of the linear system we are working on. I cite the Cauchy theorem, then prove that our system satisfies the conditions of the theorem. I guess the path cannot be straighter.
– Zviad Khukhunashvili
Jan 15 at 19:39
asked Jan 15 at 19:16
Zviad Khukhunashvili
1 Answer
In a linear system $y'(t)=A(t)y(t)+b(t)$, the difference $u$ of two solutions $x,y$ is itself a solution of the homogeneous system
$$
u'(t)=y'(t)-x'(t)=A(t)(y(t)-x(t))=A(t)u(t).
$$
Now apply a vector norm and its associated matrix norm:
$$
\|u'(t)\|\le \|A(t)\|\,\|u(t)\|.
$$
By the Grönwall lemma, this results in the upper bound
$$
\|u(t)\|\le \exp\left(\int_{t_0}^t\|A(s)\|\,ds\right)\|u(t_0)\|.
$$
So when the two solutions are equal at $t_0$, they also have to be equal for every $t>t_0$. A similar argument works for $t<t_0$.
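The Grönwall bound above can also be checked numerically. Below is a minimal sketch; the 2×2 coefficient matrix $A(t)$ and the initial values are made up for illustration, and forward Euler is used, for which the discrete analogue of the bound holds exactly (since $1+x\le e^x$).

```python
import numpy as np

# Hypothetical 2x2 non-autonomous coefficient matrix, chosen only for illustration.
def A(t):
    return np.array([[0.0, 1.0], [-np.cos(t), -0.5]])

def row_sum_norm(M):
    # Matrix norm associated with the max vector norm: largest absolute row sum.
    return np.max(np.sum(np.abs(M), axis=1))

# u = difference of two solutions; it satisfies u' = A(t) u.
# Integrate with forward Euler and accumulate the Riemann sum of ||A(s)||.
t0, t1, n = 0.0, 5.0, 20000
h = (t1 - t0) / n
u = np.array([1.0, -2.0])        # nonzero difference at t0
u0_norm = np.max(np.abs(u))
integral = 0.0
t = t0
for _ in range(n):
    integral += row_sum_norm(A(t)) * h
    u = u + h * (A(t) @ u)
    t += h

# Discrete Gronwall bound: ||u(t1)|| <= exp(sum of h*||A||) * ||u(t0)||,
# since ||u_{k+1}|| <= (1 + h*||A(t_k)||) ||u_k|| and 1 + x <= e^x.
bound = np.exp(integral) * u0_norm
print(np.max(np.abs(u)) <= bound)  # True

# A zero difference at t0 stays exactly zero: uniqueness.
z = np.zeros(2)
for k in range(n):
    z = z + h * (A(t0 + k * h) @ z)
print(np.max(np.abs(z)))  # 0.0
```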
To get that bound, first consider the exact equation $d'(t)=\|A(t)\|d(t)$, which has the solution $d(t)=e^{c(t)}d(t_0)$ with $c'(t)=\|A(t)\|$, $c(t_0)=0$. Now consider the difference $h_a(t)=\|u(t)\|-(\|u(t_0)\|+a)e^{c(t)}$ for some $a>0$, using the integral identities and inequalities
\begin{align}
\|u(t)\|&=\left\|u(t_0)+\int_{t_0}^t u'(s)\,ds\right\|
\le\|u(t_0)\|+\int_{t_0}^t\left\|u'(s)\right\|\,ds
\\
&\le \|u(t_0)\|+\int_{t_0}^t\|A(s)\|\,\|u(s)\|\,ds
\\
e^{c(t)}&=1+\int_{t_0}^te^{c(s)}c'(s)\,ds
\\[1em]\hline
h_a(t)=\|u(t)\|-(\|u(t_0)\|+a)e^{c(t)}&
\le-a+\int_{t_0}^t\|A(s)\|\,\Bigl(\|u(s)\|-(\|u(t_0)\|+a)e^{c(s)}\Bigr)\,ds
\\&=-a+\int_{t_0}^t\|A(s)\|\,h_a(s)\,ds.
\end{align}
From this one concludes that there can be no $t$ where $h_a(t)\ge 0$: otherwise there would be a minimal such $t$ with $h_a(t)=0$, while the last inequality would give $h_a(t)\le-a<0$, a contradiction.
Now as
$$
\|u(t)\|<(\|u(t_0)\|+a)e^{c(t)}
$$
for all $a>0$, it follows in the limit $a\to 0$ that
$$
\|u(t)\|\le\|u(t_0)\|e^{c(t)}.
$$
Could you give a textbook citation for this?
– Zviad Khukhunashvili
Jan 15 at 20:13
How do you define a matrix norm?
– Zviad Khukhunashvili
Jan 15 at 20:31
You can take the max norm for the vectors; then the associated matrix norm is the row-sum norm. This should be simple enough.
– LutzL
Jan 15 at 20:55
May I ask how to apply the Grönwall lemma to the normed inequality? In the sources that I saw, the lemma is stated for the one-dimensional real-valued case.
– Zviad Khukhunashvili
Jan 23 at 19:12
Consider the function $d(t)=\|u(t)\|$; then $|d'(t)|\le \|u'(t)\|$, so the inequality of the vector case reduces to the scalar case.
– LutzL
Jan 23 at 19:20
edited Jan 24 at 13:06
answered Jan 15 at 19:53
LutzL