If $E[|\sum_{i < l \leq j} \xi_l|^\gamma] \leq (\sum_{i < l \leq j} u_l)^\alpha$ and $\sum u_l < \infty$ then...
Suppose $\xi_1, \xi_2, \ldots$ are random variables that satisfy
$$E\left[\left|\sum_{i < l \leq j} \xi_l \right|^\gamma\right] \leq \left(\sum_{i < l \leq j} u_l\right)^\alpha$$
for $\gamma \geq 0$, $\alpha > 1$, $u_l$ non-negative, and $\sum u_l < \infty$. I want to show that $\sum \xi_l$ converges almost surely.
Let $S_m = \sum_{i = 1}^m \xi_i$ and $M_m = \max_{1 \leq i \leq m} |S_i|$. This condition on the expectation can be used to show
$$P(|S_j - S_i| \geq \lambda) \leq \frac{1}{\lambda^\gamma} \left(\sum_{i < l \leq j} u_l\right)^\alpha.$$
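For the record, this follows from Markov's inequality applied to $|S_j - S_i|^\gamma$ (taking $\gamma > 0$ so the step is not vacuous, and noting $S_j - S_i = \sum_{i < l \leq j} \xi_l$):
$$P(|S_j - S_i| \geq \lambda) = P\left(|S_j - S_i|^\gamma \geq \lambda^\gamma\right) \leq \frac{E\left[|S_j - S_i|^\gamma\right]}{\lambda^\gamma} \leq \frac{1}{\lambda^\gamma}\left(\sum_{i < l \leq j} u_l\right)^\alpha.$$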
Because of this we can use a theorem from Billingsley's book Convergence of Probability Measures (first edition, 1968) to say
$$P(M_m \geq \lambda) \leq \frac{K}{\lambda^\gamma} (u_1 + \ldots + u_m)^\alpha,$$
where $K$ depends only on $\gamma$ and $\alpha$. (Billingsley says to use that theorem to reach the desired result.)
After this I'm not sure how to proceed. One could take $m \to \infty$ and conclude that the partial sums are bounded almost surely, but I don't see how boundedness alone shows that the sum converges, and I'm not seeing where to judiciously use the Borel–Cantelli lemma.
Basically, I don't see how the bound on $P(M_m \geq \lambda)$ helps prove almost-sure convergence, but supposedly it can (and in fact needs to be) used to get the desired result.
So what should I do next?
probability-theory convergence summation random-variables almost-everywhere
asked Jan 7 at 7:28
cgmil
Are you sure there is no independence assumption here? The theorems you are quoting are not true without independence. – Kavi Rama Murthy, Jan 7 at 7:49
There is an obvious counterexample with $\pm 1$-valued random variables $\xi_i$. – Kavi Rama Murthy, Jan 7 at 7:50
@KaviRamaMurthy Billingsley explicitly notes that the variables need not be independent, and the counterexample you mention does not satisfy the assumption mentioned; the tightest $u_l$ you could have would not be summable. – cgmil, Jan 10 at 22:44
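To make that last comment concrete, here is one reading of the proposed counterexample (the specific choice of $\xi_l$ is a guess at what was meant, not something spelled out in the thread): take $\xi_l = \xi$ for a single $\pm 1$-valued random variable $\xi$. Then
$$E\left[\left|\sum_{i < l \leq j} \xi_l\right|^\gamma\right] = (j-i)^\gamma,$$
which is unbounded in $j - i$ whenever $\gamma > 0$, while $\left(\sum_{i < l \leq j} u_l\right)^\alpha \leq \left(\sum_l u_l\right)^\alpha$ stays bounded for any summable $u_l$. So this sequence cannot satisfy the hypothesis with summable $u_l$, consistent with the reply above.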
1 Answer
I guess we need some kind of weak dependence assumption on $(\xi_j)$, but in any case this is only about how to get the convergence result from the maximal inequality for $M_n$. First, note that we can bound
$$
P\left(\max_{n\le i\le N}|S_i-S_n|>\lambda\right)\le\frac{K}{\lambda^\gamma}(u_{n+1}+\cdots+u_N)^\alpha
$$
and hence
$$
P\left(\max_{n\le i,j\le N}|S_i-S_j|>\lambda\right)\le\frac{K'}{\lambda^\gamma}(u_{n+1}+\cdots+u_N)^\alpha.
$$
(We can regard the $\xi_j$ as starting from index $j=n+1$, so the maximal inequality applies to the shifted partial sums $S_i-S_n$.)
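The step marked "and hence" is worth spelling out, since the constant is left implicit; $K'=2^\gamma K$ below is my reading of what is intended. For $n\le i,j\le N$, the triangle inequality gives
$$
|S_i-S_j|\le|S_i-S_n|+|S_j-S_n|\le 2\max_{n\le k\le N}|S_k-S_n|,
$$
so $\{\max_{n\le i,j\le N}|S_i-S_j|>\lambda\}\subseteq\{\max_{n\le k\le N}|S_k-S_n|>\lambda/2\}$, and applying the first bound with $\lambda/2$ in place of $\lambda$ yields the second bound with $K'=2^\gamma K$.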
To show that $S_n$ converges almost surely, i.e. that the sequence $(S_n)$ is almost surely Cauchy, it seems natural to control the oscillation
$$
\limsup_{i,j\to \infty}|S_i-S_j| =\lim_{n\to\infty}\sup_{i,j\ge n} |S_i-S_j|.
$$
Let $W_{n,N} = \sup_{n\le i,j\le N} |S_i-S_j|$ and $W_n =\sup_{i,j\ge n} |S_i-S_j|=\lim_{N\to\infty}W_{n,N}$. We have
$$
P(W_n>\lambda) =\lim_{N\to\infty}P(W_{n,N}>\lambda)
$$
by monotone convergence of $W_{n,N}$ (the events $\{W_{n,N}>\lambda\}$ increase to $\{W_n>\lambda\}$ as $N\to\infty$, so this is continuity from below). Hence we get the bound
$$
P(W_n>\lambda)\le\frac{K'}{\lambda^\gamma}\left(\sum_{j>n}u_{j} \right)^\alpha.\tag{*}
$$
Finally, we observe
$$
W := \lim_{n\to\infty} W_n = \limsup_{i,j\to\infty}|S_i-S_j|,
$$
and on the event $\{W=0\}$ the sequence $(S_n)$ is Cauchy and hence converges. Therefore
$$
\{S_n\text{ diverges}\} \subseteq \{W>0\} =\bigcup_{j\in\mathbb{N}}\{W\ge 1/j\}.
$$
Since the events $\{W_n\ge \lambda\}$ decrease to $\{W\ge \lambda\}$, the estimate $(*)$ together with $\sum_{j>n}u_j\to 0$ gives
$$
P(W\ge \lambda) =\lim_{n\to\infty} P(W_n\ge \lambda)\le \lim_{n\to\infty} P(W_n> \lambda/2)= 0
$$
for all $\lambda>0$. Hence $\{S_n\text{ diverges}\}$ is contained in a countable union of null events and has probability $0$.
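For what it is worth, here is one way to bring in the Borel–Cantelli lemma that the question asks about; the particular subsequence below is my own choice, used only to make the bound summable. Since $\sum_l u_l<\infty$, choose indices $n_1<n_2<\cdots$ with $\sum_{j>n_k}u_j\le 2^{-k}$. Then $(*)$ gives
$$
\sum_{k}P\left(W_{n_k}>\tfrac1k\right)\le\sum_{k}K'\,k^{\gamma}\,2^{-k\alpha}<\infty,
$$
so by Borel–Cantelli, almost surely $W_{n_k}\le \tfrac1k$ for all sufficiently large $k$. Since $W_n$ is nonincreasing in $n$, this already forces $W_n\to 0$ almost surely, which is the same conclusion as above.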
answered Jan 7 at 8:48
Song
Thank you for your answer. It was very helpful! I have a follow-up question of a similar nature; perhaps you could answer it too? math.stackexchange.com/q/3069263/360447 – cgmil, Jan 10 at 22:40