Why is the expectation of the Cauchy distribution not defined? (What is the intuition behind it?)

Let $X$ be a random variable with pdf $f_X(x) = \dfrac{1}{\pi(1+x^2)}$. I understand that, mathematically, the improper integral $\displaystyle\int_{-\infty}^{\infty}\dfrac{x}{\pi(1+x^2)}\,dx$ does not exist: $\displaystyle\int_{T_1}^{T_2}\dfrac{x}{\pi(1+x^2)}\,dx = \dfrac{\ln(1+T_2^2) - \ln(1+T_1^2)}{2\pi}$, which becomes the indeterminate form $\infty - \infty$ as $T_2\to\infty$ and $T_1\to-\infty$, so the expectation is undefined. However, I am unable to understand the intuition behind it: why is $E[X] \ne 0$, given that the pdf is an even function, so positive and negative values occur with equal probability? Shouldn't a large enough sample produce a mean close to $0$? Please help.

Tags: probability-distributions, improper-integrals, means

asked yesterday by sh10 (a new contributor), edited 21 hours ago
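
The divergence described in the question can also be checked numerically: the truncated integral over $[0,T]$ equals $\ln(1+T^2)/(2\pi)$ and keeps growing with $T$, and by symmetry the left half diverges to $-\infty$. A minimal sketch, assuming NumPy and SciPy are available:

    # The positive half of the would-be mean grows without bound in T.
    import numpy as np
    from scipy.integrate import quad

    def integrand(x):
        return x / (np.pi * (1.0 + x**2))

    for T in [1e2, 1e3, 1e4, 1e5, 1e6]:
        numeric, _ = quad(integrand, 0.0, T, limit=500)
        closed_form = np.log(1.0 + T**2) / (2.0 * np.pi)
        print(f"T = {T:9.0e}   quad = {numeric:8.4f}   ln(1+T^2)/(2*pi) = {closed_form:8.4f}")
    # Both columns keep increasing, so E[X] would be the indeterminate form inf - inf.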





  • According to your logic, large enough samples should agree with the central limit theorem, too. But that is not the case for the Cauchy distribution: $\frac{x}{x^2+1}\notin L^1(\mathbb{R})$, full stop. – Jack D'Aurizio, yesterday

  • It is a well-known fact that, for the Cauchy distribution, the arithmetic mean of independent samples $\frac{X_1+\ldots+X_n}{n}$ is also Cauchy distributed, with the same pdf as the summands. So the mean of a large sample is not close to zero; it behaves like a single sample from the distribution. – NCh, yesterday
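
A quick Monte Carlo check of NCh's point, as a minimal sketch assuming NumPy's standard_cauchy sampler: the quartiles of the mean of $n$ draws match those of a single draw (roughly $-1$, $0$, $+1$ for the standard Cauchy), so averaging does not concentrate the distribution.

    # Compare the spread of one Cauchy draw with the spread of the mean of 1000 draws.
    import numpy as np

    rng = np.random.default_rng(0)
    n, reps = 1000, 10_000

    single = rng.standard_cauchy(reps)                   # one draw per replication
    means = rng.standard_cauchy((reps, n)).mean(axis=1)  # mean of n draws per replication

    for name, sample in [("single draw", single), ("mean of 1000 draws", means)]:
        q25, q50, q75 = np.percentile(sample, [25, 50, 75])
        print(f"{name:>20}:  quartiles  {q25:+.3f}  {q50:+.3f}  {q75:+.3f}")
    # Both rows print quartiles near -1, 0, +1: the standard Cauchy quartiles.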
















1 Answer
There are several ways to look at it:




  • Let $f(x):=\frac{\pi^{-1}}{1+x^2}$, so $\int_{-\infty}^c xf(x)\,dx=-\infty$ and $\int_d^\infty xf(x)\,dx=\infty$ for any $c,\,d\in\mathbb{R}$. So if we choose $c<d$, you could argue the mean is $-\infty+\int_c^d xf(x)\,dx+\infty$. In theory, you can get any value you like if you think the infinities cancel (the first sketch after the answer illustrates this numerically).

  • "But I'm integrating an odd function! That has to give me $0$!" Yes, if the two pieces you're cancelling are both finite. But $\infty-\infty$ is an indeterminate form, so you can't use that theorem.

  • The characteristic function is $\varphi(t):=\exp(-\left|t\right|)$. If we average $n$ samples, the result has characteristic function $\varphi^n(t/n)=\varphi(t)$: it's immune to the CLT (the second sketch after the answer checks this numerically). Nor should you expect otherwise, without a finite and well-defined mean and variance. No $\mu$, no $(X-\mu)^2$, no variance. The characteristic function provides another way to look at it: we can't very well write $\mu=-i\varphi'(0)$, because the modulus has an undefined derivative at $0$ (the one-sided limits $\lim_{t\to 0^\pm}\frac{|t|}{t}$ differ).

  • It can be shown, however, that the median of $n$ samples is asymptotically Normal for large $n$. (The proof is a bit more involved than a standard CLT argument for means; you can get an overview here.) By contrast, if you compute the mean of a gradually growing sample, it will bounce around wildly, because (as shown above) it is itself Cauchy-distributed; see the last sketch after the answer.






answered 21 hours ago by J.G.
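
To illustrate the first bullet of the answer numerically: truncating the integral asymmetrically over $[-aT,\,T]$ and letting $T\to\infty$ gives $-\ln(a)/\pi$, so the "cancelled infinities" can be made to produce any value you like. A minimal sketch, assuming NumPy and SciPy:

    # Asymmetric truncation of the would-be mean: the limit depends entirely on
    # how the two infinite tails are paired off.
    import numpy as np
    from scipy.integrate import quad

    def integrand(x):
        return x / (np.pi * (1.0 + x**2))

    for a in [1.0, 0.5, np.exp(-np.pi)]:      # a = 1 is the symmetric (principal value) case
        for T in [1e3, 1e5]:
            value, _ = quad(integrand, -a * T, T, limit=500)
            print(f"a = {a:.4f}  T = {T:.0e}   integral = {value:+.4f}   -ln(a)/pi = {-np.log(a) / np.pi:+.4f}")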






















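A Monte Carlo check of the characteristic-function point (the third bullet), again a sketch assuming NumPy: the empirical characteristic function of the mean of $n$ Cauchy draws stays at $\exp(-|t|)$, the characteristic function of a single draw.

    # Empirical cf of the mean of n Cauchy draws vs. exp(-|t|).
    import numpy as np

    rng = np.random.default_rng(1)
    n, reps = 50, 100_000
    means = rng.standard_cauchy((reps, n)).mean(axis=1)

    for t in [0.5, 1.0, 2.0]:
        empirical = np.exp(1j * t * means).mean().real   # imaginary part ~ 0 by symmetry
        print(f"t = {t}:  empirical cf of the mean = {empirical:.4f}   exp(-|t|) = {np.exp(-abs(t)):.4f}")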










             

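Finally, a sketch of the contrast in the last bullet, using one growing Cauchy sample (NumPy assumed): the running median settles near $0$, while the running mean keeps lurching whenever an extreme draw arrives.

    # Running mean vs. running median of a single growing Cauchy sample.
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.standard_cauchy(200_000)

    for n in [100, 1_000, 10_000, 100_000, 200_000]:
        prefix = x[:n]
        print(f"n = {n:>7}   running mean = {prefix.mean():+10.3f}   running median = {np.median(prefix):+8.4f}")
    # The sample median is asymptotically Normal with variance 1/(4 n f(0)^2) = pi^2/(4 n)
    # (standard sample-median asymptotics), so it tightens around 0 as n grows.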

