Discrete Time “Girsanov” with sub-Gaussian noise
Context
I seem to recall from a course in stochastic calculus a few years back that for a random vector $X_t$ with dynamics
$$dX_t = f(X_t)\,dt + g_t\, dW_t,$$
where $W$ is a $\mathbb{P}$-Wiener process, there exists another measure $\mathbb{Q}$ under which $X_t$ is a martingale (given suitable conditions on $f,g$).
Question
Currently, I find myself in a position where (super-)"martingalizing" a discrete-time process seems useful. The setting is as follows: I have given myself a filtered space $(\Omega, (\mathcal{F}_t), \mathbb{P})$ and I consider the process
$$
X_{t+1}=f(X_t)+\eta_t \quad\Longleftrightarrow\quad X_{t+1}-X_t=f(X_t)-X_t+\eta_t
$$
where $\eta_t$ is independent $\sigma^2$-sub-Gaussian noise, i.e. $\mathbb{E}[e^{\lambda\, p^\top \eta_t}] \leq e^{\lambda^2 \sigma^2/2}$ for every one-dimensional projection $p^\top \eta_t$ (unit vector $p$) and all $\lambda \in \mathbb{R}$. The function $f$ should be reasonable for the setting (I suppose measurable and satisfying some discrete analogue of Novikov's condition).
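For later use it may be worth recording the coordinate-free form of this bound: taking $p = \phi/\|\phi\|$ and $\lambda = \|\phi\|$ in the definition gives, for any nonzero $\phi$ that is deterministic (or, by independence, $\mathcal{F}_{t-1}$-measurable),

```latex
\mathbb{E}\left[ e^{\langle \phi,\, \eta_t \rangle} \right]
  = \mathbb{E}\left[ e^{\|\phi\| \, \langle \phi/\|\phi\|,\; \eta_t \rangle} \right]
  \le e^{\|\phi\|^2 \sigma^2 / 2},
```

which is exactly the estimate one would feed into the exponential factor appearing in the Progress computation.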
Now, my hunch is that I might not be able to find $\mathbb{Q}$ such that $Y_t = CX_t$ becomes a martingale (with a deterministic matrix $C$), but maybe at least a supermartingale (the end use will be large deviations, so I don't really care about it being a true martingale; a supermartingale will do). I saw the answer to
Discrete and continuous Girsanov
however, the setting in that answer seems considerably simpler (it uses predictability).
Progress
I was considering that if one defines, for some $\phi_t$, analogously to the continuous case,
$$
L_t = \exp\left(\sum_{k=1}^t \langle \phi_k, \eta_k \rangle - \frac{1}{2}\sum_{k=1}^t \langle \phi_k, \phi_k \rangle \right),
$$
the following computation could go through:
\begin{align*}
\mathbb{E}[CX_{t+1}L_{t+1}\mid\mathcal{F}_t] &= CL_t\,\mathbb{E}\left[X_{t+1}\,e^{\langle\phi_{t+1},\,\eta_{t+1}\rangle-\langle \phi_{t+1},\,\phi_{t+1}\rangle/2}\,\Big|\,\mathcal{F}_t\right] \\
&= \dots \\
&\leq CX_t L_t,
\end{align*}
for a clever choice of $\phi_t$, by using the sub-Gaussian property. But as for the "$\dots$" I am stuck on:
- Can I actually deal with the nonlinearity at all? We can in the continuous case, so I feel it should be possible.
- What is the correct choice of kernel $\phi_t$?
I would very much appreciate any help proving or disproving the above inequality. NB: I know I have been sloppy in defining $f$. If another property, such as convexity, is absolutely required to answer the question, I am willing to assume it (even though I would obviously prefer this not to be necessary).
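As a numerical sanity check (an illustration only, not a proof), one can simulate the Gaussian special case $\eta_t \sim N(0, I)$, the canonical $1$-sub-Gaussian noise, with a predictable, drift-cancelling kernel $\phi_t = -f(X_{t-1})$ (a hypothetical choice, just to have something concrete). There the candidate density process $L_t$ defined above is an exact martingale, so its expectation should stay at $1$; for merely sub-Gaussian noise the same computation would give $\mathbb{E}[L_t] \le 1$, i.e. a supermartingale. The drift $f$ and all parameters below are illustrative assumptions.

```python
import numpy as np

# Monte Carlo check that L_t = exp( sum_k <phi_k, eta_k> - (1/2) sum_k <phi_k, phi_k> )
# has mean 1 when eta_t ~ N(0, I) and phi_t is predictable.

rng = np.random.default_rng(0)
d, T, n_paths = 2, 10, 200_000

def f(x):
    # a hypothetical bounded drift, purely for illustration
    return 0.2 * np.tanh(x)

X = np.zeros((n_paths, d))
log_L = np.zeros(n_paths)
for _ in range(T):
    phi = -f(X)                                  # predictable: a function of X_t only
    eta = rng.standard_normal((n_paths, d))      # eta_t ~ N(0, I)
    log_L += (phi * eta).sum(axis=1) - 0.5 * (phi ** 2).sum(axis=1)
    X = f(X) + eta

mean_L = np.exp(log_L).mean()
print(mean_L)  # close to 1, as the martingale property predicts
```

Replacing the Gaussian draws with any other $1$-sub-Gaussian noise (e.g. Rademacher coordinates) should push the printed mean to at most $1$, which is the supermartingale direction the question is after.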
probability stochastic-processes stochastic-calculus
asked Jan 13 at 19:46 by sortofamathematician