On the infinitesimal generator when the volatility of Brownian motion is given as a function of time
Consider a process driven by a standard Brownian motion $B_t$,
$$dS_t=\mu\,dt+\sigma\,dB_t.$$
For this process, the infinitesimal generator is
$$Af(x)=\mu\frac{\partial f}{\partial x}+\frac{\sigma^{2}}{2}\frac{\partial^{2}f}{\partial x^{2}}+\frac{\partial f}{\partial t}.$$
I would like to consider the case where $\sigma$ is a function of time, for example $\sigma(t)=\cos(t)$, so the SDE becomes
$$dS_t=\mu\,dt+\sigma(t)\,dB_t.$$
In this case, is it correct to think that the generator becomes the following?
$$Af(x)=\mu\frac{\partial f}{\partial x}+\frac{\sigma(t)^{2}}{2}\frac{\partial^{2}f}{\partial x^{2}}+\frac{\partial f}{\partial t}$$
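For concreteness, here is a minimal Euler–Maruyama sketch (Python with NumPy; the drift value and helper name are just placeholders I chose for illustration) of the time-dependent SDE I have in mind:

```python
import numpy as np

def simulate_path(mu=0.1, sigma=np.cos, x0=0.0, t0=0.0, T=1.0, n_steps=1_000, rng=None):
    """Euler-Maruyama discretization of dS_t = mu dt + sigma(t) dB_t."""
    rng = np.random.default_rng() if rng is None else rng
    dt = (T - t0) / n_steps
    t, x = t0, x0
    path = [x0]
    for _ in range(n_steps):
        dB = rng.normal(0.0, np.sqrt(dt))  # Brownian increment ~ N(0, dt)
        x += mu * dt + sigma(t) * dB       # volatility evaluated at the current time
        t += dt
        path.append(x)
    return np.array(path)
```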
stochastic-processes stochastic-calculus brownian-motion infinitesimals
asked Jan 31 at 4:20 by Xminer, edited Jan 31 at 23:48
1 Answer
The answer to your question is yes (more or less). The short answer as to why is "because of Itô's lemma". A much more detailed explanation is given below, which I highly recommend taking the time to read and understand.
Assumption. $\sigma : [0, \infty) \rightarrow \mathbb{R}$ is a continuous function of time.
Let $f$ be any twice continuously differentiable, compactly supported real-valued function.
The infinitesimal generator at time $t$ (as applied to $f$) is defined as
$$
\mathcal{A}_{t}f(x)=\lim_{h\downarrow0}\frac{\mathbb{E}\left[f(X_{t+h}^{t,x})\right]-f(x)}{h}
$$
where the process $X^{t,x}$ satisfies the SDE you wrote down:
$$
X_{s}=x+\int_{t}^{s}\mu\,dr+\int_{t}^{s}\sigma(r)\,dB_{r}\qquad\text{for }s\geq t.
$$
Use Itô's lemma on $f$ to get
$$
f(X_{t+h}^{t,x})-f(x)=\int_{t}^{t+h}\mu\frac{\partial f}{\partial x}(X_{s}^{t,x})+\frac{(\sigma(s))^{2}}{2}\frac{\partial^{2}f}{\partial x^{2}}(X_{s}^{t,x})\,ds+\int_{t}^{t+h}\sigma(s)\frac{\partial f}{\partial x}(X_{s}^{t,x})\,dB_{s}.
$$
Take expectations of both sides to get
$$
\mathbb{E}\left[f(X_{t+h}^{t,x})\right]-f(x)=\mathbb{E}\left[\int_{t}^{t+h}\mu\frac{\partial f}{\partial x}(X_{s}^{t,x})+\frac{(\sigma(s))^{2}}{2}\frac{\partial^{2}f}{\partial x^{2}}(X_{s}^{t,x})\,ds\right]
$$
where we have used the fact that the expectation of the Itô integral is zero.
Using the mean value theorem for integrals we get
$$
\int_{t}^{t+h}\mu\frac{\partial f}{\partial x}(X_{s}^{t,x})+\frac{(\sigma(s))^{2}}{2}\frac{\partial^{2}f}{\partial x^{2}}(X_{s}^{t,x})\,ds=h\left(\mu\frac{\partial f}{\partial x}(X_{c}^{t,x})+\frac{(\sigma(c))^{2}}{2}\frac{\partial^{2}f}{\partial x^{2}}(X_{c}^{t,x})\right)
$$
where $c\equiv c(\omega)$, which depends on the sample path $\omega$, is a point between $t$ and $t+h$.
Since $f$ is compactly supported, the dominated convergence theorem gives
$$
\lim_{h\downarrow0}\frac{\mathbb{E}\left[f(X_{t+h}^{t,x})\right]-f(x)}{h}=\mathbb{E}\left[\lim_{h\downarrow0}\left\{ \mu\frac{\partial f}{\partial x}(X_{c}^{t,x})+\frac{(\sigma(c))^{2}}{2}\frac{\partial^{2}f}{\partial x^{2}}(X_{c}^{t,x})\right\} \right].
$$
Using continuity and the facts that $\sigma(c)\rightarrow\sigma(t)$ and $X_{c}^{t,x}\rightarrow x$ as $h\downarrow 0$, we can take the limit to conclude
$$
\mathcal{A}_{t}f(x)=\mu\frac{\partial f}{\partial x}(x)+\frac{(\sigma(t))^{2}}{2}\frac{\partial^{2}f}{\partial x^{2}}(x).
$$
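As a quick numerical sanity check (not part of the argument above), one can estimate the difference quotient $(\mathbb{E}[f(X_{t+h}^{t,x})]-f(x))/h$ by Monte Carlo for a small $h$ and compare it with $\mu f'(x)+\tfrac{\sigma(t)^{2}}{2}f''(x)$. Below is a minimal Python/NumPy sketch, using $f(x)=e^{-x^{2}}$ as the test function (rapidly decaying rather than literally compactly supported, which is fine for a numerical check) and placeholder values $\mu=0.1$, $\sigma(t)=\cos(t)$, $t=0.5$, $x=0.3$:

```python
import numpy as np

# Test function f(x) = exp(-x^2) and its first two derivatives.
f   = lambda x: np.exp(-x**2)
fx  = lambda x: -2.0 * x * np.exp(-x**2)
fxx = lambda x: (4.0 * x**2 - 2.0) * np.exp(-x**2)

mu, sigma = 0.1, np.cos            # drift and time-dependent volatility sigma(t) = cos(t)
t, x, h = 0.5, 0.3, 1e-2           # starting time, starting point, small horizon h
n_paths, n_steps = 400_000, 20     # Monte Carlo paths and Euler steps over [t, t+h]

rng = np.random.default_rng(0)
dt = h / n_steps
X = np.full(n_paths, x)
s = t
for _ in range(n_steps):
    dB = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    X += mu * dt + sigma(s) * dB   # Euler-Maruyama step, sigma frozen at the left endpoint
    s += dt

lhs = (f(X).mean() - f(x)) / h                  # difference quotient (E[f(X_{t+h})] - f(x)) / h
rhs = mu * fx(x) + 0.5 * sigma(t)**2 * fxx(x)   # mu f'(x) + sigma(t)^2/2 f''(x)
print(lhs, rhs)  # should be close, up to O(h) bias and Monte Carlo noise
```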
answered Jan 31 at 5:29 by parsiad, edited Jan 31 at 15:53