Proving the strong law of large numbers?












I beg your pardon, but I am not going to write down the full statement of the law, its proof, and so on, because I think everyone knows what I am talking about, and since I am a hobby mathematician I do not think I would state it correctly anyway.
When I looked up the proof of the weak law, it ended like this:
$\Pr\left(\left|\overline{X}_n-\mu\right| \geq \varepsilon\right) \leq \frac{\sigma^2}{n\varepsilon^2}\rightarrow 0$, where the random variable $\overline{X}_n$ has variance $\frac{\sigma^2}{n}$.
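For completeness, that last bound is just Chebyshev's inequality applied to $\overline{X}_n$ (here I am assuming the $X_i$ are i.i.d. with mean $\mu$ and variance $\sigma^2$):
$$\Pr\left(\left|\overline{X}_n-\mu\right|\geq\varepsilon\right)\leq\frac{\operatorname{var}(\overline{X}_n)}{\varepsilon^2}=\frac{\sigma^2}{n\varepsilon^2}.$$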



An equivalent way to state the weak law of large numbers, as used in the Chebyshev proof, is:
a random variable $X_n$ with $\mu=0$ and $\operatorname{var}(X_n)=\sigma_n^2$
converges in probability to zero if
$\operatorname{var}(X_n)\rightarrow 0$.
Now I have the following idea:



$\operatorname{var}(X^n)=E[X^{2n}]-E[X^n]^2 \geq 0 \iff \frac{\operatorname{var}(X^n)}{E[X^{2n}]}=1-\frac{E[X^n]^2}{E[X^{2n}]} \geq 0 \implies \limsup \frac{E[X^n]^2}{E[X^{2n}]} \leq 1$.
Now, taking $n=2$ (i.e. looking at $\operatorname{var}(X^2)$) and using $\mu=0$, this gives $\frac{E[X^2]^2}{E[X^{4}]}=\frac{\operatorname{var}(X)^2}{E[X^{4}]}\leq 1$, i.e. $\operatorname{var}(X)^2 = O(E[X^{4}])$.



Now we use the same inequality and the same variance as in the proof of the weak law:
$\Pr(|X_n|>\epsilon)=\Pr(|X_n|^2>\epsilon^2) \leq \frac{E[X^{4}]}{\epsilon^4}$.
Now, summing up the probabilities,
$O\left(\sum_{n=1}^{\infty} \frac{E[X^{4}]}{\epsilon^4}\right) =\sum_{n=1}^{\infty}\frac{\operatorname{var}(x)^2}{\epsilon^4}=\sum_{n=1}^{\infty}\frac{\sigma^4}{n^2\epsilon^4}<\infty$ (the last sum converges because $\sum_{n=1}^{\infty}\frac{1}{n^2}<\infty$), which implies almost sure convergence.
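Spelled out, the step I am relying on at the end is (as I understand it) the first Borel–Cantelli lemma: if $\sum_{n=1}^{\infty}\Pr(|X_n|>\epsilon)<\infty$ for every $\epsilon>0$, then $X_n\rightarrow 0$ almost surely.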



I hope you can understand what I was writing, and I would like to know whether my idea is somehow correct. Thanks










Tagged: probability-theory






asked Jan 10 at 1:06









Markus Krumpl

    There is a very elementary proof of the strong law of large numbers under the assumption of finite fourth moments (as you seem to have assumed). However, your argument isn't intelligible to me... too many $\implies$'s and $\iff$'s and very few words, and no clear statement of the theorem and the assumptions. The fact that only $\sigma$ appears in the final sum shows that there is at least a minor error here. $E(X^4)$ is a fourth moment quantity, not a second moment quantity (perhaps you confused $\operatorname{var}(x^2)$ for $\operatorname{var}(x)^2$?)
    – spaceisdarkgreen
    Jan 10 at 4:18








    Squinting a little harder, assuming you used $\sigma^4$ as a notation for the fourth moment, and reading between the lines and typos a bit, it resembles a correct argument.
    – spaceisdarkgreen
    Jan 10 at 4:50














1 Answer

As I mentioned in the comments, I find your proof difficult to follow. I will give a quick reiteration of (what I think is) your proof for reference:



Let $X_i$ be i.i.d., with mean zero and finite fourth moment.



By Chebyshev, we have $$ P(|\bar X_n|^2>\epsilon^2) \le \frac{\operatorname{Var}(\bar X_n^2)}{\epsilon^4} = \frac{\operatorname{Var}(S_n^2)}{\epsilon^4 n^4}\le \frac{E(S_n^4)}{\epsilon^4 n^4}.$$



Since the $X_i$ have mean zero, we can multiply things out to get $$ E(S_n^4) = E((X_1+\ldots+X_n)^4)=nE(X^4)+3n(n-1)E(X^2)^2,$$ and by Jensen, $E(X^2)^2\le E(X^4),$ so we have $$ E(S_n^4)\le 3n^2E(X^4).$$
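If you want to double-check that expansion, here is a minimal symbolic sketch (illustrative only, not part of the proof; the stand-in symbols m2, m3, m4 for $E(X^2)$, $E(X^3)$, $E(X^4)$ and the use of sympy are my own choices):

    # Symbolic sanity check of E(S_n^4) = n*E(X^4) + 3n(n-1)*E(X^2)^2
    # for i.i.d. mean-zero X_i, verified here for one small n with sympy.
    import sympy as sp

    n = 4
    X = sp.symbols(f"X1:{n + 1}")                    # X1, ..., Xn
    m2, m3, m4 = sp.symbols("m2 m3 m4")              # stand-ins for E(X^2), E(X^3), E(X^4)
    moments = {0: sp.Integer(1), 1: sp.Integer(0), 2: m2, 3: m3, 4: m4}

    expectation = sp.Integer(0)
    for term in sp.expand(sum(X) ** 4).as_ordered_terms():
        coeff, monom = term.as_coeff_Mul()           # e.g. 12, X1**2*X2*X3
        powers = sp.Poly(monom, *X).monoms()[0]      # exponent of each X_i in this term
        # independence: the expectation of the product is the product of the moments
        expectation += coeff * sp.Mul(*[moments[p] for p in powers])

    print(sp.simplify(expectation - (n * m4 + 3 * n * (n - 1) * m2**2)))   # prints 0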



Plugging that into the first inequality, $$ P(|\bar X_n|^2>\epsilon^2)\le \frac{3E(X^4)}{\epsilon^4 n^2}.$$



Since this bound is summable in $n$, the Borel–Cantelli lemma implies that $\bar X_n\to 0$ almost surely.
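For intuition only (this is not part of the proof), here is a small simulation sketch; it assumes standard normal $X_i$, so $E(X^4)=3$, and compares the empirical frequency of $|\bar X_n|>\epsilon$ with the bound $3E(X^4)/(\epsilon^4 n^2)$:

    # Illustration only: empirical exceedance frequency of the sample mean
    # versus the fourth-moment bound, for i.i.d. standard normal X_i.
    import numpy as np

    rng = np.random.default_rng(0)
    eps = 0.2
    trials = 1000                                    # independent repetitions per n

    for n in [10, 100, 1000, 10000]:
        means = rng.standard_normal((trials, n)).mean(axis=1)   # one sample mean per trial
        freq = np.mean(np.abs(means) > eps)
        bound = 3 * 3 / (eps**4 * n**2)              # 3*E(X^4)/(eps^4 * n^2) with E(X^4) = 3
        print(f"n={n:6d}  empirical P(|mean|>eps) = {freq:.4f}  bound = {min(bound, 1.0):.4f}")

The bound is crude for small $n$ (it can exceed $1$), but its $1/n^2$ decay is exactly what makes the sum finite.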



This simple proof relies on the fourth moment assumption. The proof for the optimal case that only assumes $E(|X|)<\infty$ is a lot harder.






        answered Jan 10 at 5:05









        spaceisdarkgreen
