Show that $Y_n := (\prod_{i=1}^{n} X_i)^{1/n}$ converges with probability 1


























I'm dealing with a problem in stochastics and statistics and hope some of you can help me!




On $[0,1]$ we have a sequence of independent, identically distributed random variables $(X_n)_{n \in \mathbb{N}}$. I have to show:



a) $Y_n := (\prod_{i=1}^{n} X_i)^{1/n}$ converges with probability $1$.



b) Calculate the exact limit of $Y_n$.




I've already done some calculations, but I'm really not sure whether everything is fine.





Some pre-considerations:
To get rid of the product I took the logarithm: $\ln(Y_n) = \frac{1}{n} \sum_{i=1}^{n} \ln(X_i)$.



After taking the logarithm, the sequence $\ln(X_i)$ is still independent and identically distributed.



a) I found a theorem in my lecture notes which states that $\frac{S_n}{n}$ (where $S_n$ is the $n$-th partial sum of a sequence) converges to a finite limit with probability $1$ if the sequence is integrable.



It seems to me that this theorem might fit, but my concern is that $\ln(X_i)$ is not integrable at $X_i = 0$ (which is allowed, since the $X_i$ take values in $[0,1]$).



b) This part somehow "smells" to me like Kolmogorov's law, which states that for a sequence of independent and identically distributed random variables with finite expectation:
$$\lim_{n\rightarrow \infty} \frac{1}{n} \sum_{k=1}^{n}X_k = \mathbb{E}(X_1) \quad \text{a.s.}$$



So the limit would be $\lim_{n} \ln(Y_n) = \mathbb{E}(\ln(X_1))$ almost surely.
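As a quick numerical sanity check of this candidate limit, here is a minimal sketch under an assumption the problem does not make, namely $X_i \sim \mathrm{Uniform}[0,1]$ (for which $X_i = 0$ has probability zero and $\mathbb{E}[\ln X_1] = -1$):

```python
# Sanity check (my own sketch; the problem fixes no distribution, so I
# assume X_i ~ Uniform[0,1], for which E[ln X_1] = -1 and the candidate
# limit is exp(-1) ≈ 0.3679).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
X = rng.uniform(0.0, 1.0, size=n)

# Geometric mean via the log transform: Y_n = exp((1/n) * sum(ln X_i)).
Y_n = np.exp(np.mean(np.log(X)))
print(Y_n, np.exp(-1))  # both ≈ 0.368
```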



But again, I don't see why the expectation of $\ln(X_i)$ should be finite when $X_i = 0$ is possible.



So, due to these concerns at $X_i = 0$, I'm not sure whether I'm on the right track or whether the problem needs to be solved differently.



I would be very grateful if some of you can help me!



Thanks in advance!



pcalc










probability-theory convergence

asked Feb 1 at 15:10 by pcalc · edited Feb 1 at 15:40 by saz



















  • Note that if one of the $X_i$ is $0$ the whole product is $0$, thus $$Y_n=\left( \prod_{i=1}^n X_i \mathbf{1}_{X_i\in (0,1]}\right)^{1/n}$$ and the $X_i \mathbf{1}_{X_i\in (0,1]}$ are still i.i.d., so the SLLN still applies (provided $\log X$ is integrable).
    – Gabriel Romon, Feb 1 at 15:56












  • Hi, and thanks for your quick response! So, if I take this special case $X_i=0$ into account, is my approach suitable?
    – pcalc, Feb 1 at 16:19










  • @GabrielRomon As far as I understand, the OP's main problem is that there is no assumption that $\log(X \mathbf{1}_{\{X>0\}})$ is integrable.
    – saz, Feb 1 at 17:07
















1 Answer



















If $\mathbb{E}(-\log(X_1))<\infty$, then your reasoning works fine and we find that



$$Y_n \to \exp(\mathbb{E}\log(X_1)) \quad \text{almost surely}. \tag{1}$$



Now consider the case $\mathbb{E}(-\log(X_1))=\infty$. Define a sequence of truncated random variables by



$$Z_n^{(k)} := \min\{k, -\log(X_n)\} = \begin{cases} -\log(X_n), & 0 \leq -\log(X_n) \leq k, \\ k, & \text{otherwise}. \end{cases}$$
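In code, this truncation is just a componentwise minimum; a minimal numpy sketch (the helper name is mine, purely for illustration):

```python
import numpy as np

def truncated_neg_log(X, k):
    """Z^(k) = min(k, -log(X)): cap -log(X) at level k (convention: -log(0) = +inf, so it is capped at k)."""
    with np.errstate(divide="ignore"):  # allow log(0) -> -inf without a warning
        return np.minimum(k, -np.log(X))
```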



The sequence $(Z_n^{(k)})_{n \in \mathbb{N}}$ is independent and identically distributed. Since $\mathbb{E}|Z_n^{(k)}| \leq k < \infty$, the strong law of large numbers gives



$$\frac{1}{n} \sum_{j=1}^n Z_j^{(k)} \xrightarrow[n \to \infty]{} \mathbb{E}(Z_1^{(k)}) \tag{2}$$



almost surely. Since $Z_j^{(k)} \leq -\log(X_j)$ for each $j \in \mathbb{N}$, this implies



$$\liminf_{n \to \infty}\frac{1}{n} \sum_{j=1}^n \big(-\log(X_j)\big) \geq \mathbb{E}(Z_1^{(k)})$$



for all $k \in \mathbb{N}$. Since the monotone convergence theorem gives $\sup_k \mathbb{E}(Z_1^{(k)}) = \mathbb{E}(-\log(X_1))=\infty$, we get



$$\liminf_{n \to \infty} \frac{1}{n} \sum_{j=1}^n \big(-\log(X_j)\big) = \infty,$$
i.e.



$$\limsup_{n \to \infty} \frac{1}{n} \sum_{j=1}^n \log(X_j) = -\infty$$



almost surely. Hence, by the continuity of the exponential function,



$$Y_n = \exp\left( \frac{1}{n} \sum_{j=1}^n \log(X_j) \right) \xrightarrow[n \to \infty]{} 0$$



almost surely.





In summary, we get



$$Y_n \to \exp(\mathbb{E}\log(X_1)) \quad \text{a.s.}$$



with $\mathbb{E}\log(X_1)$ possibly equal to $-\infty$ (with the convention $\exp(-\infty) := 0$).
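To see the degenerate case numerically, here is a small simulation (my own sketch, not part of the proof) with a distribution for which $\mathbb{E}(-\log(X_1)) = \infty$: take $X = e^{-1/U}$ with $U \sim \mathrm{Uniform}(0,1)$, so that $-\log X = 1/U$ has infinite mean, and the geometric mean $Y_n$ drifts toward $0$:

```python
import numpy as np

rng = np.random.default_rng(1)

def geometric_mean_path(n):
    # X = exp(-1/U) with U uniform on (0,1); then -log(X) = 1/U and E[1/U] = inf.
    # (U == 0.0 has probability ~2^-53 per draw and is ignored here.)
    U = rng.uniform(0.0, 1.0, size=n)
    neg_log_X = 1.0 / U
    # Running values of Y_n = exp(-(1/n) * sum(1/U_i)) along one sample path.
    return np.exp(-np.cumsum(neg_log_X) / np.arange(1, n + 1))

path = geometric_mean_path(1_000_000)
print(path[[999, 9_999, 99_999, 999_999]])  # decreasing toward 0, though slowly
```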





Remark: We have actually proved the following converse of the strong law of large numbers:




Let $(U_j)_{j \in \mathbb{N}}$ be a sequence of independent, identically distributed, non-negative random variables. If $\mathbb{E}(U_1)=\infty$, then $$\liminf_{n \to \infty} \frac{1}{n} \sum_{j=1}^n U_j = \mathbb{E}(U_1) = \infty \quad \text{a.s.}$$







answered Feb 1 at 17:40 by saz · edited Feb 1 at 19:49
























  • Hey saz, I see that there is no need to split cases like I did ("events such that one of the $X_n$ is zero ..."). I'll delete my answer since yours is a good, neat one. (+1)
    – Shashi, Feb 1 at 17:55










  • Hi! Thank you very much for your very clear answer! That helped me a lot!
    – pcalc, Feb 2 at 13:03










  • @pcalc You are welcome.
    – saz, Feb 2 at 13:05











