Wronski-Test for linear ODE














To test solutions of linear ODEs for linear independence, you can compute the Wronskian determinant. The theorem says: if you have solutions of a linear ODE of the form
$$
\dot{\vec{x}}(t) = A \cdot \vec{x}(t)
$$

you can test for linear independence by showing
$$\exists t_0 \in I\colon\;
\det(W(t_0)) \neq 0
$$

with $W(t)$ being the Wronskian matrix:
$$
W(t) = \begin{pmatrix}
\vec{x}_1(t) & \cdots & \vec{x}_n(t)
\end{pmatrix}
$$
where $\vec{x}_i$ is the $i$-th solution vector.
If you find just one such $t_0$, it follows that
$$\forall t \in I\colon\;\det(W(t)) \neq 0$$
(i.e. the determinant is nowhere zero on $I$, not that it is constant).
I don't understand why this implication holds true.



Why can I conclude from the fact that the Wronskian determinant is nonzero at one point that it must be nonzero everywhere? After all, $\det(W(t))$ is not constant; it depends on $t$.
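A quick numerical illustration may help (an editorial sketch; the system and solutions below are chosen here for concreteness and are not from the question): for $A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$, two solutions of $\dot{\vec{x}} = A\vec{x}$ are $\vec{x}_1(t) = (\cos t, -\sin t)$ and $\vec{x}_2(t) = (\sin t, \cos t)$, and their Wronskian determinant never vanishes:

```python
import math

def det_W(t):
    # Wronskian matrix W(t) = (x1(t) | x2(t)) for the illustrative system
    # x' = A x with A = [[0, 1], [-1, 0]]; x1, x2 are two known solutions.
    x1 = (math.cos(t), -math.sin(t))
    x2 = (math.sin(t), math.cos(t))
    return x1[0] * x2[1] - x2[0] * x1[1]  # 2x2 determinant

# trace A = 0, so Liouville's formula predicts det W(t) = det W(0) = 1 for all t
samples = [det_W(t) for t in (0.0, 0.7, 1.5, 3.0)]
```

Here $\det W(t) = \cos^2 t + \sin^2 t = 1$ at every $t$: being nonzero at one point and being nonzero everywhere coincide, which is exactly the behaviour the theorem asserts.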



I'm happy about any answers!



Many Greetings,
Sebi2020










– D.B. (Jan 10 at 23:59): I think the issue is that if the Wronskian determinant is nonzero for some $t_0 \in I$, we are guaranteed that the solutions are linearly independent. If the solutions were not linearly independent, the Wronskian would always be zero (looking at the contrapositive).


















Tags: linear-algebra, ordinary-differential-equations, fundamental-solution






asked Jan 10 at 23:37, edited Jan 10 at 23:43 – Sebi2020











1 Answer
This matter may be resolved via Liouville's formula, which affirms that if $X(t)$ is an $n \times n$ matrix solution of the differential equation

$\dot X(t) = A(t)X(t), \tag 1$

then the determinant $\det(X(t))$ of $X(t)$ is given by

$\det(X(t)) = \det(X(t_0)) \exp \left( \displaystyle \int_{t_0}^t \operatorname{trace}(A(s)) \; ds \right); \tag 2$

since

$\exp \left( \displaystyle \int_{t_0}^t \operatorname{trace}(A(s)) \; ds \right) \ne 0, \; \forall t_0, t \in I, \tag 3$

we see that

$\det(X(t)) \ne 0 \Longleftrightarrow \det(X(t_0)) \ne 0. \tag 4$

Now we may conclude with the simple observation that we may take

$W(t) = \det(X(t)) = \det(\vec x_1(t), \vec x_2(t), \ldots, \vec x_n(t)). \tag 5$
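For concreteness, formula (2) can be checked numerically; this is a small editorial sketch with a constant matrix picked here for illustration (it is not part of the original answer):

```python
import math

def det_X(t):
    # det of X(t) = diag(e^t, e^{2t}), a matrix solution of X' = A X
    # for the illustrative constant matrix A = diag(1, 2)
    return math.exp(t) * math.exp(2 * t)

def liouville_rhs(t, t0=0.0):
    # right-hand side of formula (2): det(X(t0)) * exp(integral of trace A),
    # where trace A = 3 is constant, so the integral is 3 * (t - t0)
    return det_X(t0) * math.exp(3 * (t - t0))

# the two sides of (2) agree at every sample point, and neither is ever zero
checks = [abs(det_X(t) - liouville_rhs(t)) <= 1e-6 * liouville_rhs(t)
          for t in (0.0, 0.5, 1.0, 2.0)]
```

Since the exponential factor is strictly positive, $\det(X(t))$ either vanishes identically or vanishes nowhere, which is the dichotomy the formula encodes.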















– Sebi2020 (Jan 11 at 14:06): But is it only an implication of the Liouville formula, or is there another explanation? Our lecturer said that the reason for this is that the solutions form a vector room. But I cannot explain to myself why this should prove $\det(X(t)) \neq 0 \Leftrightarrow \det(X(t_0)) \neq 0$, whereas the Liouville formula does show that. I think the vector-room property only explains the need to look for solutions which form a fundamental system.












– Robert Lewis (Jan 11 at 16:14): @Sebi2020: is a "vector room" the same as a vector space?






– Sebi2020 (Jan 11 at 20:57): Yes. English is not my native language, so I used the wrong translation for "vector space". But I wanted to say vector space.








– Sebi2020 (Jan 11 at 22:07): I think I don't quite understand it. I struggle with this: "if the $x_i$ are linearly dependent at some point, so that $\sum a_i x_i(t_0) = 0$, then we must have the unique solution $x(t) = \sum a_i x_i(t) = 0$ everywhere". This is what I don't understand. Which property of linear equations or vector spaces proves that? I would have thought that $a_i x_i(t_0)$ may lie in $\operatorname{ker} x(t_0)$, but that this doesn't have to be true for $x(t)$, $t \neq 0$. Is it just the existence and uniqueness theorem saying that there can only be one solution with $x(t) = 0$?
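The step asked about in the preceding comment can be spelled out; the following is an editorial sketch resting on the existence-and-uniqueness theorem (it is not part of the original thread). Suppose $\sum_i a_i \vec x_i(t_0) = 0$ with not all $a_i$ zero, and set
$$
x(t) := \sum_i a_i \vec x_i(t).
$$
By linearity of the ODE, $x(t)$ is a solution, and it satisfies the initial condition $x(t_0) = 0$. The constant function $y(t) \equiv 0$ is also a solution with $y(t_0) = 0$. The uniqueness theorem allows only one solution through that initial value, so $x \equiv y \equiv 0$ on all of $I$. Hence the same nontrivial dependence relation holds at every $t$, and $\det(W(t)) = 0$ everywhere.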








– Sebi2020 (Jan 11 at 23:23): Thank you! Your explanations were very helpful.













answered Jan 10 at 23:59 – Robert Lewis







