$X_n \sim \text{Poisson}(n)$, $Y_n \sim \text{Geometric}(e^{-\frac{1}{n}})$
Let $X_n \sim \text{Poisson}(n)$ and $Y_n \sim \text{Geometric}(e^{-\frac{1}{n}})$, with everything independent. I want to find the limit in distribution of
$$ Z_n = \frac{1}{n}X_n + \beta Y_n, $$
where $\beta \in \mathbb{R}$.
My plan is to use the characteristic function of $Z_n$. This is what I have done:
$$\phi_{Z_n}(t) = \phi_{\frac{1}{n}X_n}(t)\,\phi_{\beta Y_n}(t),$$
$$\phi_{Z_n}(t) = \frac{ e^{-\frac{1}{n}}\, e^{i\beta t}}{1-\left(1-e^{-\frac{1}{n}}\right)e^{i\beta t}} \cdot e^{n(e^{it} - 1)}.$$
I hope my calculations make sense up until this point. My problem is now computing the limit as $n \to \infty$. Any idea how to solve this?
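A minimal simulation sketch, assuming NumPy and an illustrative choice of $\beta$ (neither is specified in the question), can show where $Z_n$ concentrates:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 2.0  # illustrative value; the question leaves beta as a general real parameter

for n in (10, 100, 10_000):
    m = 100_000                     # Monte Carlo sample size
    x = rng.poisson(lam=n, size=m)  # X_n ~ Poisson(n)
    p = np.exp(-1.0 / n)
    y = rng.geometric(p, size=m)    # Y_n ~ Geometric(p), supported on {1, 2, ...}
    z = x / n + beta * y
    # As n grows, the sample mean approaches 1 + beta and the spread shrinks.
    print(f"n={n:>6}  mean={z.mean():.4f}  std={z.std():.4f}")
```

For large $n$ the samples concentrate near $1 + \beta$, consistent with the answer below.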
probability weak-convergence
asked Jan 22 at 14:46 by qcc101, edited Jan 22 at 15:06
Is your fraction just 1? Is there an exponent missing? – Paul, Jan 22 at 14:50

@Paul Edited, thank you for catching the mistake. – qcc101, Jan 22 at 15:06
1 Answer
Note that if we define $(W_i)_{i \in \mathbb{N}}$, where the $W_i$ are independent and Poisson distributed with mean $1$, we get
$$
\frac{1}{n} X_n \overset{\mathcal{D}}{=} \frac{1}{n} \sum_{i=1}^n W_i,
$$
and by the law of large numbers the right-hand side converges to $1$ almost surely (hence in probability and in distribution).

Chebyshev's inequality can be used to prove that $\beta Y_n \to \beta$ in probability (for $\beta \neq 0$; the case $\beta = 0$ is trivial), since for any $\varepsilon > 0$
$$
P(|\beta Y_n - \beta| \geq \varepsilon) = P\left(|Y_n - 1| \geq \frac{\varepsilon}{|\beta|} \right) \leq \frac{\operatorname{Var}(Y_n)\,\beta^2}{\varepsilon^2},
$$
and since $\operatorname{Var}(Y_n) = \frac{1-e^{-\frac{1}{n}}}{\left(e^{-\frac{1}{n}}\right)^2} \to 0$ as $n \to \infty$, we are done. (This argument could also be used for the Poisson variables.)

Convergence in probability is stable under sums and implies convergence in distribution, so $Z_n \to 1 + \beta$ in distribution.

answered Jan 22 at 15:33 by Lundborg
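For reference, a short expansion of the moment computations behind this bound, assuming (as the characteristic function in the question indicates) that $Y_n$ is geometric on $\{1,2,\dots\}$ with success probability $p_n = e^{-\frac{1}{n}}$:
$$
\mathbb{E}[Y_n] = \frac{1}{p_n} = e^{\frac{1}{n}} \to 1,
\qquad
\operatorname{Var}(Y_n) = \frac{1-p_n}{p_n^2} = \left(1-e^{-\frac{1}{n}}\right)e^{\frac{2}{n}} \sim \frac{1}{n} \to 0,
$$
so $\mathbb{E}\big[(Y_n-1)^2\big] = \operatorname{Var}(Y_n) + \big(e^{\frac{1}{n}}-1\big)^2 \to 0$, i.e. $Y_n \to 1$ in $L^2$ and hence in probability.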
Thank you, super clear. What if I wanted to use the characteristic function though? – qcc101, Jan 22 at 15:42

@qcc101 I don't really see why you would want that, since this is easier and proves something stronger, but note that you're missing a $1/n$ in your expression for the characteristic function of $\frac{1}{n}X_n$: it should be $\exp\big(n(e^{\frac{it}{n}}-1)\big)$. I think this helps to show the same result as above by applying the usual limit laws. – Lundborg, Jan 22 at 16:25
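For completeness, a sketch of the characteristic-function route with the corrected exponent:
$$
\phi_{\frac{1}{n}X_n}(t) = \exp\!\left(n\left(e^{\frac{it}{n}}-1\right)\right) = \exp\!\left(it + O\!\left(\tfrac{1}{n}\right)\right) \to e^{it},
\qquad
\phi_{\beta Y_n}(t) = \frac{e^{-\frac{1}{n}}e^{i\beta t}}{1-\left(1-e^{-\frac{1}{n}}\right)e^{i\beta t}} \to e^{i\beta t},
$$
so $\phi_{Z_n}(t) \to e^{i(1+\beta)t}$, the characteristic function of the constant $1+\beta$. By Lévy's continuity theorem, $Z_n \xrightarrow{\mathcal{D}} 1+\beta$, in agreement with the argument above.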