Prove the central limit theorem for a sequence of i.i.d. Bernoulli($p$) random variables
Prove the central limit theorem for a sequence of i.i.d. Bernoulli($p$) random variables, where $p\in(0,1)$.
I am trying to do this by computing the moment generating function of the object whose limit I want and using Taylor's expansion to show that it converges to the moment generating function of a standard normal.
Attempt:
For a random variable $X$, we have the moment generating function $$M_{X}(t)=\mathbb{E}[e^{tX}],$$ and if we expand using the Taylor series of $e^{tX}$ we get $$M_X(t)=\sum_{n=0}^\infty \frac{\mathbb{E}[X^n]}{n!}t^n.$$ So, in particular, $$M_X^{(n)}(0)=\mathbb{E}[X^n].$$
Now for the proof: a Bernoulli random variable is $1$ with probability $p$ and $0$ with probability $1-p$. First we want the moment generating function of a Bernoulli random variable. In particular, we have $$M_{\mathrm{Bernoulli}(p)}(t)=(1-p)+pe^t=1+(e^t-1)p.$$
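(As a quick sanity check on this, not part of the argument: the short snippet below, assuming `sympy` is available, differentiates this MGF at $t=0$ and recovers the moments, which for a Bernoulli($p$) variable all equal $p$ since $X^n=X$.)

```python
# Sanity check, not part of the proof: differentiate the Bernoulli MGF
# M(t) = 1 + p*(e^t - 1) at t = 0 and confirm that the n-th derivative equals
# E[X^n], which is p for every n >= 1 (since X^n = X for a 0/1 variable).
import sympy as sp

t, p = sp.symbols('t p')
M = 1 + p * (sp.exp(t) - 1)        # MGF of Bernoulli(p)

for n in range(1, 5):
    print(n, sp.simplify(sp.diff(M, t, n).subs(t, 0)))   # prints p each time
```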
Then, a binomial random variable is the sum of $n$ independent Bernoulli variables, so
$$M_{\mathrm{Binomial}(n,p)}(t)=\big((1-p)+pe^t\big)^n=\big(1+(e^t-1)p\big)^n.$$
Suppose that $p=\lambda/n$ and notice that $$M_{\mathrm{Binomial}(n,\lambda/n)}(t)=\bigg(1+\frac{(e^t-1)\lambda}{n}\bigg)^n \rightarrow e^{\lambda(e^t-1)}.$$
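(Purely as a numerical illustration of this limit, not part of any CLT argument, one can evaluate both sides for a few values of $n$:)

```python
# Numerical check of the displayed limit: with p = lambda/n, so that np = lambda,
# the Binomial MGF (1 + lambda*(e^t - 1)/n)^n approaches the Poisson MGF
# exp(lambda*(e^t - 1)). The values of lambda and t are chosen arbitrarily.
import numpy as np

lam, t = 2.0, 0.5
for n in [10, 100, 1000, 10000]:
    print(n, (1 + lam * (np.exp(t) - 1) / n) ** n)

print("limit:", np.exp(lam * (np.exp(t) - 1)))
```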
I am not sure how to finish the proof, though, or whether I am even on the right track...
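(To at least see numerically that the intended limit is plausible, here is a rough sketch, not a proof; it assumes the object to standardize is $Z_n=(S_n-np)/\sqrt{np(1-p)}$ with $S_n\sim\mathrm{Binomial}(n,p)$, and checks that its MGF approaches $e^{t^2/2}$, the MGF of a standard normal.)

```python
# Rough numerical sketch (not a proof): the MGF of Z_n = (S_n - n*p)/sqrt(n*p*(1-p)),
# where S_n ~ Binomial(n, p), is exp(-t*n*p/s) * ((1-p) + p*exp(t/s))^n with
# s = sqrt(n*p*(1-p)); it should approach exp(t^2/2) as n grows.
import numpy as np

p, t = 0.3, 0.7
for n in [10, 100, 1000, 100000]:
    s = np.sqrt(n * p * (1 - p))                    # standard deviation of S_n
    log_mgf = -t * n * p / s + n * np.log((1 - p) + p * np.exp(t / s))
    print(n, np.exp(log_mgf))

print("target exp(t^2/2):", np.exp(t ** 2 / 2))
```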
probability probability-theory random-variables central-limit-theorem

asked Feb 2 at 1:18 by MathIsHard
$begingroup$
Prove the central limit theorem for a sequence of i.i.d.
Bernoulli($p$) random variables, where $pin(0,1)$.
I am trying to do this by computing the moment generating function of the object I want the limit of and use Taylor's expansion to show that it converges to the moment generating function of a standard normal.
Attempt:
For a random variable X, we have the moment generating function $$M_{X}(t)=mathbb{E}[e^{tX}]$$ and if we expand using the Taylor series of $e^{tX}$ we get $$M_X(t)=sum_{n=0}^infty frac{mathbb{E}[X^n]}{n!}t^n.$$ So, we have in particular $$M_x^{(n)}(0)=mathbb{E}[X^n].$$
Now for the proof, we have a Bernoulli random variable is 1 with probability $p$ and 0 with probability $(1-p)$. First we want the moment generating function for a Bernoulli random variable. In particular, we have $$M_{Bernoulli(p)}=(1-p)+pe^t=1+(e^t-1)p$$
Then, we have a binomial random variable as the sum of n independent Bernoulli variables. So,
$$M_{binomial(n,p)}=((1-p)+pe^t=(1+(e^t-1)p)^n$$
Suppose that $p=lambda/n$ and notice that $$M_{binomial(n,lambda/n)}=bigg(1+frac{(e^t-1)lambda}{n}bigg)^n rightarrow e^{lambda(e^t-1)}$$
I am not sure how to finish the proof though or if I am even on the right track...
probability probability-theory random-variables central-limit-theorem
$endgroup$
What you have done so far is merely deriving (correctly) Binomial becoming Poisson under the limit $n \to \infty,\ p \to 0$ with $np = \lambda$ held constant. The statement of the Central Limit Theorem involves $\frac{\bar X - \mu}{\sigma}$. Where's your analysis on $\bar X$?
– Lee David Chung Lin, Feb 2 at 2:05
Oh okay. Thank you. So I want to find the mean and the std deviation for the random variable next?
– MathIsHard, Feb 2 at 2:48
There are many ways to do this. Since you wanted to use the moment generating function, you should find the MGF $M_Y(t)$ where $Y \equiv \bar X = \frac1n \sum X_i$, and then further consider the MGF of $\sqrt{n}\,\frac{Y - \mu}{\sigma}$.
– Lee David Chung Lin, Feb 2 at 3:36
Oh okay. Thank you. I will give that a try. I appreciate your time.
– MathIsHard, Feb 2 at 4:03