Show $\operatorname P\left[\left|X^x_t\right|<r\right]\xrightarrow{|x|\to\infty}0$ for strong solutions of SDEs


























Let





  • $(\Omega,\mathcal A,\operatorname P)$ be a probability space and $(W_t)_{t\ge0}$ a Brownian motion on it


  • $b,\sigma:\mathbb R\to\mathbb R$ be Lipschitz continuous


  • $(X_t^x)_{t\ge0}$ be a continuous process on $(\Omega,\mathcal A,\operatorname P)$ with $$X^x_t=x+\int_0^tb(X^x_s)\:{\rm d}s+\int_0^t\sigma(X^x_s)\:{\rm d}W_s\;\;\;\text{for all }t\ge0\text{ almost surely}\tag1$$ for $x\in\mathbb R$



Fix $t\ge0$ and $r>0$. I want to show that $$\operatorname P\left[\left|X^x_t\right|<r\right]\xrightarrow{\left|x\right|\to\infty}0.\tag2$$




By Markov's inequality, $$\operatorname P\left[\left|X^x_t\right|<r\right]\le\operatorname P\left[\left|X^x_t-x\right|>\left|x\right|-r\right]\le\frac{\operatorname E\left[\left|X^x_t-x\right|^2\right]}{\left(\left|x\right|-r\right)^2}\tag3$$ for all $x\in\mathbb R$ with $\left|x\right|-r>0$.
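To spell out the first inequality in $(3)$: on the event $\left\{\left|X^x_t\right|<r\right\}$ the reverse triangle inequality gives $$\left|X^x_t-x\right|\ge\left|x\right|-\left|X^x_t\right|>\left|x\right|-r,$$ so $\left\{\left|X^x_t\right|<r\right\}\subseteq\left\{\left|X^x_t-x\right|>\left|x\right|-r\right\}$; the second inequality is Markov's inequality applied to $\left|X^x_t-x\right|^2$.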




If $b$ and $\sigma$ are bounded and $\lambda:=\sup_{x\in\mathbb R}\left|b(x)\right|^2+\sup_{x\in\mathbb R}\left|\sigma(x)\right|^2$, then $$\operatorname E\left[\left|X^x_t-x\right|^2\right]\le2\lambda t(t+4)\tag4$$ by Hölder's inequality and the Burkholder-Davis-Gundy inequality. Since the denominator on the right-hand side of $(3)$ tends to $\infty$, we're able to conclude $(2)$.
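For reference, one way to obtain $(4)$ (which matches the constants above): writing $X^x_t-x$ as drift plus stochastic integral and using $(a+b)^2\le2a^2+2b^2$, the Cauchy-Schwarz inequality and Doob's $L^2$ inequality (or the Burkholder-Davis-Gundy inequality with constant $4$),
$$\operatorname E\left[\left|X^x_t-x\right|^2\right]\le2\operatorname E\left[\left(\int_0^t\left|b(X^x_s)\right|{\rm d}s\right)^2\right]+2\operatorname E\left[\sup_{s\in[0,\:t]}\left|\int_0^s\sigma(X^x_u)\:{\rm d}W_u\right|^2\right]\le2t^2\sup_y|b(y)|^2+8t\sup_y|\sigma(y)|^2\le2\lambda t(t+4).$$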



Question: Are we able to prove $(2)$ without assuming boundedness of $b$ and $\sigma$?




By Lipschitz continuity, $$|b(x)|^2+|\sigma(x)|^2\le c(1+|x|^2)\;\;\;\text{for all }x\in\mathbb R\tag5$$ for some $c\ge0$. For simplicity of notation, write $|Y|_t^\ast:=\sup_{s\in[0,\:t]}|Y_s|$ for $t\ge0$ and any process $(Y_t)_{t\ge0}$. Letting $c_1:=\max(2,4c(t+4)t,4c(t+4))$, we obtain $$\operatorname E\left[{\left|X^x\right|_t^\ast}^2\right]\le c_1\left(x^2+1+\int_0^t\operatorname E\left[{\left|X^x\right|_s^\ast}^2\right]{\rm d}s\right)\tag6$$ by the same argument as for $(4)$ and hence $$\operatorname E\left[{\left|X^x\right|_t^\ast}^2\right]\le c_1\left(x^2+1\right)e^{c_1t}\tag7$$ by Grönwall's inequality.
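In detail, $(6)$ can be obtained as follows (one possible route, using $(a+b+c)^2\le2a^2+4b^2+4c^2$, the Cauchy-Schwarz inequality for the drift term, Doob's $L^2$ inequality for the stochastic integral and $(5)$):
$$\operatorname E\left[{\left|X^x\right|_t^\ast}^2\right]\le2x^2+4\operatorname E\left[\left(\int_0^t\left|b(X^x_s)\right|{\rm d}s\right)^2\right]+4\operatorname E\left[\sup_{s\in[0,\:t]}\left|\int_0^s\sigma(X^x_u)\:{\rm d}W_u\right|^2\right]\le2x^2+4c(t+4)t+4c(t+4)\int_0^t\operatorname E\left[{\left|X^x\right|_s^\ast}^2\right]{\rm d}s,$$
and taking the maximum of the three constants yields $(6)$.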




However, I'm not able to utilize $(7)$ to prove $(2)$.$^1$






$^1$ compare with my related question.
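For intuition (not part of the argument), here is a quick Euler-Maruyama sanity check of $(2)$ with an arbitrary Lipschitz example, $b(x)=-x$ and $\sigma(x)=1+\tfrac12\sin x$; the estimated probability should decay as $|x|$ grows:

```python
import numpy as np

# Illustration only (not a proof): Monte-Carlo estimate of P[|X_t^x| < r]
# for an arbitrary Lipschitz example b(x) = -x, sigma(x) = 1 + 0.5*sin(x).
rng = np.random.default_rng(0)

def b(x):
    return -x

def sigma(x):
    return 1.0 + 0.5 * np.sin(x)

def prob_in_ball(x0, t=1.0, r=2.0, n_steps=1000, n_paths=10000):
    """Estimate P[|X_t^x0| < r] for dX = b(X) dt + sigma(X) dW via Euler-Maruyama."""
    dt = t / n_steps
    x = np.full(n_paths, float(x0))
    for _ in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt), size=n_paths)
        x = x + b(x) * dt + sigma(x) * dw
    return np.mean(np.abs(x) < r)

for x0 in [0, 5, 10, 20, 50]:
    print(x0, prob_in_ball(x0))  # the estimate decreases towards 0 as |x0| grows
```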






















  • The assertion holds true if $b$, $\sigma$ are at most of linear growth; otherwise it can, in general, fail to hold true. A more general framework (SDEs driven by Lévy processes) is discussed in this paper.
    – saz
    Jan 30 at 7:29










  • @saz In the setting described in the question, they are of linear growth (since they are globally Lipschitz, see $(5)$), but I don't see how the assertion can be proved.
    – 0xbadf00d
    Jan 30 at 9:45












  • The proof which I know uses Lyapunov functions... I can write it up later.
    – saz
    Jan 30 at 10:26










  • @saz That would be great. Thank you.
    – 0xbadf00d
    Jan 30 at 11:17

















probability-theory stochastic-processes stochastic-calculus stochastic-analysis sde














edited Jan 31 at 8:25 by saz

asked Jan 29 at 23:56 by 0xbadf00d






















1 Answer



















It is straightforward to check that the function



$$f(x) := \frac{1}{x^2+1}$$



satisfies



$$|f'(x)| \leq 2 |x| f(x)^2 \quad \text{and} \quad |f''(x)| \leq 6 f(x)^2. \tag{1}$$
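Spelled out, the derivatives are
$$f'(x)=-\frac{2x}{(x^2+1)^2}=-2x\,f(x)^2, \qquad f''(x)=\frac{6x^2-2}{(x^2+1)^3},$$
so $|f'(x)|=2|x|f(x)^2$ and $|f''(x)|\le\frac{6(x^2+1)}{(x^2+1)^3}=6f(x)^2$.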



Applying Dynkin's formula (or Itô's formula) we find that



$$\mathbb{E}f(X_t^x)-f(x) = \mathbb{E} \left( \int_0^t Af(X_s^x) \, ds \right) \tag{2}$$



where



$$Af(x) := b(x) f'(x) + \frac{1}{2} \sigma^2(x) f''(x).$$



Because of $(1)$ and the at most linear growth of $b$ and $\sigma$, it follows that we can find a constant $c_1>0$ such that



$$|Af(x)| \leq c_1 f(x) \quad \text{for all $x \in \mathbb{R}$}.$$
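To spell this out: if $c_3>0$ is such that $|b(x)|\le c_3(1+|x|)$ and $\sigma^2(x)\le c_3(1+x^2)$, then, using $(1)$ and $f(x)^2=\frac{f(x)}{x^2+1}$,
$$|Af(x)|\le 2c_3(1+|x|)\,|x|\,f(x)^2+3c_3(1+x^2)\,f(x)^2\le 2c_3(1+|x|)^2f(x)^2+3c_3f(x)\le 7c_3\,f(x),$$
since $(1+|x|)^2\le2(1+x^2)$; so $c_1=7c_3$ works.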



Hence, by $(2)$,



$$\mathbb{E}f(X_t^x) \leq f(x) + c_1 \int_0^t \mathbb{E}f(X_s^x) \, ds.$$



Applying Gronwall's lemma we get



$$\mathbb{E}f(X_t^x) \leq f(x) e^{c_2 t}, \qquad t \geq 0,\ x \in \mathbb{R} \tag{3}$$



for a suitable constant $c_2>0$. Noting that, since $f$ is even and decreasing on $[0,\infty)$,



$$\{|X_t^x| \leq r\} = \{f(X_t^x) \geq f(r)\}$$



it follows from Markov's inequality and $(3)$ that



$$\begin{align*} \mathbb{P}(|X_t^x| \leq r) = \mathbb{P}(f(X_t^x) \geq f(r)) &\leq \frac{1}{f(r)} \mathbb{E}f(X_t^x) \\ &\leq \frac{f(x)}{f(r)} e^{c_2 t} \end{align*}$$



and the right-hand side converges to $0$ as $|x| \to \infty$.






edited Feb 2 at 6:57

answered Jan 31 at 8:23 by saz













  • How do you obtain $|Af(x)| \leq c_1 f(x)$? By linear growth, there is a $c_3$ with $|b(x)|\le c_3(1+|x|)$ and $\sigma^2(x)\le c_3(1+|x|^2)$. This yields $|(Af)(x)|\le 2c_3(1+|x|)|x|f^2(x)+3c_3(1+|x|^2)f^2(x)=2c_3(1+|x|)|x|f^2(x)+3c_3f(x)$. But I don't see how we can eliminate the $f^2$ in the first term.
    – 0xbadf00d
    Feb 2 at 20:13








  • @0xbadf00d $f(x)^2 = \frac{1}{x^2+1} f(x)$, and so $$(1+|x|) |x| f(x)^2 \leq (1+|x|)^2 f(x)^2 \leq c_2 f(x)$$ where $$c_2 := \sup_x \frac{(1+|x|)^2}{1+x^2}.$$
    – saz
    Feb 2 at 20:17












  • Intuitively it's clear to me that $|\text E[f(X^x_t)]|\xrightarrow{|x|\to\infty}0$ if $f\in C_0(\mathbb R)$, but how do we argue rigorously? Given $ε>0$, we know that there is an $r>0$ with $$|f(x)|<ε\;\;\;\text{for all }|x|\ge r.$$ So, \begin{equation}\begin{split}|\text E[f(X^x_t)]|&\le\text E[1_{\left\{\:|X^x_t|\:<\:r\:\right\}}|f(X^x_t)|]+\text E[1_{\left\{\:|X^x_t|\:\ge\:r\:\right\}}|f(X^x_t)|]\\&\le\sup_{|y|<r}|f(y)|\text P[|X^x_t|<r]+ε\text P[|X^x_t|\ge r]\end{split}\end{equation} Clearly, the first term tends to $0$ as $|x|\to\infty$ and $\text P[|X^x_t|\ge r]\le1$? Is that sufficient?
    – 0xbadf00d
    Feb 3 at 10:38












  • @0xbadf00d Well, yes... why should it not be sufficient?
    – saz
    Feb 3 at 11:19










  • I've seen that you've deleted your answer to this question. Could you tell me what you think about the following idea: The only "good" choice for $\alpha$ is $\alpha=1/2$. With this choice and under appropriate conditions, $\sum_{i=1}^dg'(X_i)(\sigma Z_i)$ converges in distribution (by the central limit theorem) and $\sum_{i=1}^d\frac{g''(X_i)}2(\sigma Z_i)^2$ converges almost surely. Do you think it's possible to show that the second sum tends almost surely to $-\infty$? (But I don't know what to do with the first sum.)
    – 0xbadf00d
    Mar 9 at 19:04













