Why is the expectation of cauchy distribution not defined? (What is the intuition behind it?)
Let $X$ be a random variable with pdf $f_X(x) = \dfrac{1}{\pi(1+x^2)}$. I understand that, mathematically, the improper integral $\displaystyle\int\limits_{-\infty}^{\infty}\dfrac{x}{\pi(1+x^2)}\,dx$ does not exist: $\displaystyle\lim_{T_1\to-\infty}\,\lim_{T_2\to\infty}\int\limits_{T_1}^{T_2}\dfrac{x}{\pi(1+x^2)}\,dx = \lim_{T_1\to-\infty}\,\lim_{T_2\to\infty}\frac{\ln(1+T_2^2) - \ln(1+T_1^2)}{2\pi} = \infty - \infty$, which is undefined. However, I am unable to understand the intuition for why $E[X] \ne 0$, since the pdf is an even function that takes positive and negative values with equal probability. Shouldn't a large enough sample produce a mean close to $0$? Please help.
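The $\infty - \infty$ issue can be made concrete numerically: the truncated integral has a closed-form antiderivative, and its value depends entirely on how the two cutoffs go to infinity. A small sketch (function name is mine, for illustration):

```python
import math

# Antiderivative of x / (pi*(1+x^2)) is ln(1+x^2) / (2*pi).
def truncated_mean_integral(t1, t2):
    """Integral of x*f(x) from t1 to t2 for the standard Cauchy pdf."""
    return (math.log(1 + t2**2) - math.log(1 + t1**2)) / (2 * math.pi)

# Symmetric cutoffs give exactly 0 ...
print(truncated_mean_integral(-1e6, 1e6))   # 0.0
# ... but letting the upper cutoff grow twice as fast gives ln(2)/pi,
# so the "value" of the integral depends on how the limits are taken.
print(truncated_mean_integral(-1e6, 2e6))
```

Because different truncation schemes give different answers, no single number can honestly be called $E[X]$.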
probability-distributions improper-integrals means
According to your logic, large enough samples should agree with the central limit theorem, too. But that is not the case for the Cauchy distribution: $\frac{x}{x^2+1}\notin L^1(\mathbb{R})$, full stop.
– Jack D'Aurizio
yesterday
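The $L^1$ failure in this comment is also easy to see numerically: $\int_{-T}^{T}|x|f(x)\,dx = \ln(1+T^2)/\pi$, which grows without bound as $T\to\infty$. A sketch (function name is mine):

```python
import math

# Integral of |x| * f(x) over [-T, T] for the standard Cauchy pdf,
# using the antiderivative ln(1+x^2)/(2*pi) on each half-line.
def truncated_absolute_moment(T):
    return math.log(1 + T**2) / math.pi

for T in (1e2, 1e4, 1e6, 1e8):
    print(T, truncated_absolute_moment(T))
# Each factor-of-100 increase in T adds about 2*ln(100)/pi ~ 2.93,
# so E|X| = infinity: x*f(x) is not in L^1(R).
```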
It is a well-known fact that for the Cauchy distribution, the arithmetic mean of independent samples, $\frac{X_1+\ldots+X_n}{n}$, is also Cauchy distributed with the same pdf as the summands. So the mean of a large sample is not close to zero; it behaves like a single draw from this distribution.
– NCh
yesterday
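This comment's claim can be checked by simulation: the spread of batch means does not shrink as the batch size $n$ grows, unlike for distributions with a finite mean. A sketch using NumPy (function name, seed, and batch counts are arbitrary choices of mine):

```python
import numpy as np

def batch_mean_iqr(n, batches=1000, seed=0):
    """Interquartile range of `batches` sample means, each over n Cauchy draws."""
    rng = np.random.default_rng(seed)
    means = rng.standard_cauchy((batches, n)).mean(axis=1)
    q25, q75 = np.percentile(means, [25, 75])
    return q75 - q25

# The IQR of a single standard Cauchy draw is 2 (quartiles at -1 and +1).
# If averaging helped, the IQR of the batch means would shrink as n grows;
# instead it stays near 2 for every n, as the comment predicts.
for n in (10, 100, 10000):
    print(n, batch_mean_iqr(n))
```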
edited 21 hours ago
asked yesterday
sh10
1 Answer
There are several ways to look at it:
- Let $f(x):=\frac{\pi^{-1}}{1+x^2}$, so $\int_{-\infty}^c xf(x)\,dx=-\infty$ and $\int_d^\infty xf(x)\,dx=\infty$ for any $c,\,d\in\mathbb{R}$. So if we choose $c<d$, you could argue the mean is $-\infty+\int_c^d xf(x)\,dx+\infty$. In theory, you can get any value you like if you think the infinities cancel.
- "But I'm integrating an odd function! That has to give me $0$!" Yes, if the two pieces you're cancelling are both finite. But $\infty-\infty$ is an indeterminate form, so you can't use that theorem.
- The characteristic function is $\varphi(t):=\exp\left(-\left|t\right|\right)$. If we average $n$ samples, the result has characteristic function $\varphi^n(t/n)=\varphi(t)$: it's immune to the CLT. Nor should you expect otherwise, without a finite and well-defined mean and variance. No $\mu$, no $E[(X-\mu)^2]$, no variance. The characteristic function provides another way to look at it: we can't very well write $\mu=-i\varphi'(0)$ because the modulus has no derivative at $0$ (the one-sided limits $\lim_{t\to 0^\pm}\frac{|t|}{t}$ differ).
- It can be shown, however, that the median of $n$ samples is asymptotically Normal for large $n$. (The proof is a bit more involved than a standard CLT argument for means; you can get an overview here.) By contrast, if you compute the mean of a gradually growing sample, it'll bounce around like crazy, because (as shown above) it's Cauchy-distributed.
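The contrast in the last bullet shows up clearly in a quick simulation: the running median of a growing Cauchy sample settles near the true median $0$, while the running mean keeps jumping around. A sketch using NumPy (seed and sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

# One growing sample of standard Cauchy draws.  At every checkpoint the
# sample median concentrates near 0 (asymptotically Normal with variance
# pi^2/(4n)), while the sample mean is itself standard Cauchy at every n,
# so it never settles.
x = rng.standard_cauchy(200000)
for n in (100, 10000, 200000):
    print(n, np.median(x[:n]), np.mean(x[:n]))
```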
answered 21 hours ago
J.G.