Berry-Esseen function bound
By the Berry–Esseen theorem (as stated on Wikipedia) we know that
$$|F_n(x)-\Phi(x)|\le \frac{C\rho}{\sigma^3\sqrt{n}},$$
where $F_n$ is the cumulative distribution function given there.
However, in many important cases we expect $F_n(0)$ to be much closer to (or even equal to) $\Phi(0)$. For example, if $p=1/2$ and $n$ is odd, then
$$F_n(0)=\sum_{k=0}^{\lfloor n/2\rfloor} {n\choose k} p^k(1-p)^{n-k}=\frac{1}{2}=\Phi(0).$$
(By $F_n(0)$ above, I really mean the CDF of a slightly modified binomial distribution, but I hope this is clear.) Is there a better bound for $|F_n(x)-\Phi(x)|$ for the example above, in terms of a function $E(x)$ that goes to zero as $x\to 0$ and achieves a maximum less than or equal to $\frac{C\rho}{\sigma^3\sqrt{n}}$? Is there a more general error term $E(x)$ that works for other binomial distributions?
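As a quick sanity check of the $p=1/2$, odd-$n$ claim, here is a minimal sketch (the helper names `centered_binomial_cdf` and `Phi` are mine, not from the theorem statement) that evaluates the centered binomial CDF at $0$ and compares it with $\Phi(0)$:

```python
from math import comb, erf, floor, sqrt

def centered_binomial_cdf(x, n, p):
    """CDF at x of Y = X - n*p, where X ~ Binomial(n, p)."""
    # P(Y <= x) = P(X <= x + n*p)
    k_max = floor(x + n * p)
    if k_max < 0:
        return 0.0
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(min(k_max, n) + 1))

def Phi(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# For p = 1/2 and odd n, symmetry of the binomial about n/2 gives
# F_n(0) = 2^(n-1) / 2^n = 1/2 = Phi(0) exactly.
n, p = 11, 0.5
print(centered_binomial_cdf(0.0, n, p))  # 0.5
print(Phi(0.0))                          # 0.5
```

So at $x=0$ the Berry–Esseen error is exactly zero in this case, which is what motivates asking for an $x$-dependent bound.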
probability-theory probability-distributions
What do you mean by a "modified binomial distribution"? Does it have zero mean?
– d.k.o.
yesterday
Yes that's what I meant, so $F_n(0)=.5.$
– Maxim G.
yesterday
So what is $F_n(x)$ for $x\ne 0$?
– d.k.o.
yesterday
Just make the binomial distribution have zero mean by subtracting off the mean. The new $F_n$ adjusts accordingly. We can think of $B_{n,p}$ as a sum of Bernoulli random variables. Instead of summing $X_1+\cdots+X_n$, sum $(X_1-\mu)+\cdots+(X_n-\mu)$.
– Maxim G.
yesterday
By the way -- every definition I'm referencing is from Wikipedia. That is, $Y_n=(X_1-\mu)+\cdots+(X_n-\mu)$ and the rest is the same as wiki.
– Maxim G.
yesterday
asked yesterday, edited 20 hours ago by Maxim G.
1 Answer
For a sequence of zero-mean i.i.d. r.v.s $\{X_i\}$ with variance $\sigma^2$ and finite third absolute moment $\gamma_3$,
$$
|\mathsf{P}(S_n/(\sqrt{n}\sigma)\le x)-\Phi(x)|\le \frac{C\gamma_3}{\sigma^3\sqrt{n}}\times \frac{1}{1+|x|^3},
$$
where $S_n:=\sum_{i\le n}X_i$ and $C>0$ is an absolute constant (see, e.g., Chen and Shao, 2001). This bound is better than the uniform one for large values of $x$. However, if the distribution of $S_n$ is known, you may get better estimates.
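For the binomial example in the question, one can compare the actual error with the $\frac{1}{1+|x|^3}$ envelope numerically. The sketch below (helper names are mine; the constant $C$ is deliberately omitted, since only the shape of the envelope is being illustrated) uses centered Bernoulli summands, for which $\gamma_3=\mathsf{E}|X_1-p|^3=p(1-p)\bigl(p^2+(1-p)^2\bigr)$:

```python
from math import comb, erf, floor, sqrt

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def standardized_binomial_cdf(x, n, p):
    """P(S_n / (sqrt(n)*sigma) <= x) for S_n a centered Binomial(n, p)."""
    sigma = sqrt(p * (1 - p))
    k_max = floor(x * sqrt(n) * sigma + n * p)
    if k_max < 0:
        return 0.0
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(min(k_max, n) + 1))

n, p = 101, 0.5
sigma = sqrt(p * (1 - p))
gamma3 = p * (1 - p) * (p**2 + (1 - p)**2)  # E|X_1 - p|^3 for Bernoulli(p)

# Actual error vs. the non-uniform envelope with C omitted; the error
# vanishes at x = 0 here (odd n, p = 1/2) and decays for large |x|.
for x in (0.0, 0.5, 1.0, 2.0, 3.0):
    err = abs(standardized_binomial_cdf(x, n, p) - Phi(x))
    env = gamma3 / (sigma**3 * sqrt(n) * (1 + abs(x)**3))
    print(f"x={x:3.1f}  error={err:.5f}  envelope/C={env:.5f}")
```

At $x=0$ the envelope is still of order $n^{-1/2}$ while the true error is exactly zero, which is precisely the gap the question is asking about.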
Thank you, that's very helpful. Is there anything else known as $|x|\to 0$?
– Maxim G.
yesterday
Unless the distribution of $X_1$ is known, I doubt that such results exist...
– d.k.o.
yesterday
But in this case the distribution of the $X_i$'s is just Bernoulli. Is something known in that case?
– Maxim G.
20 hours ago
answered yesterday by d.k.o.