What does the value of a probability density function (PDF) at some x indicate?
I understand that the probability mass function of a discrete random variable $X$ is $y=g(x)$. This means $P(X=x_0) = g(x_0)$.
Now, the probability density function of a continuous random variable $X$ is $y=f(x)$. Wikipedia defines this function $y$ to mean
In probability theory, a probability density function (pdf), or density of a continuous random variable, is a function that describes the relative likelihood for this random variable to take on a given value.
I am confused about the meaning of 'relative likelihood' because it certainly does not mean probability! The probability $P(X<x_0)$ is given by an integral of the pdf.
So what does $f(x_0)$ indicate? It is a real number, but isn't the probability of any specific value of a continuous random variable always zero?
probability statistics random-variables
Let $f$ be the density function of $X$. Assume $f$ is continuous. Then if $h$ is small, the probability that $X$ lies in the interval $[a,a+h]$ is approximately $hf(a)$. By approximately I mean that the probability, divided by $h$, approaches $f(a)$ as $h$ approaches $0$. So the ratio $f(a)/f(b)$ measures, approximately, the ratio of the probability that $X$ is in $[a,a+h]$ to the probability $X$ is in $[b,b+h]$.
– André Nicolas
Oct 10 '12 at 19:28
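This approximation is easy to check numerically. A minimal sketch (the standard normal is an arbitrary example distribution, not one from the question): $P(X\in[a,a+h])$ is close to $hf(a)$ for small $h$, and the ratio of two such interval probabilities is close to $f(a)/f(b)$.

```python
import math

def norm_pdf(x):
    """Density f(x) of the standard normal distribution."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def norm_cdf(x):
    """CDF F(x) of the standard normal, via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

a, b, h = 0.0, 1.0, 1e-4
p_a = norm_cdf(a + h) - norm_cdf(a)   # P(X in [a, a+h]), close to h * f(a)
p_b = norm_cdf(b + h) - norm_cdf(b)   # P(X in [b, b+h]), close to h * f(b)
# p_a / p_b is then approximately f(a) / f(b)
```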
asked Oct 10 '12 at 19:20 by jesterII
5 Answers
'Relative likelihood' is indeed misleading. Look at it as a limit instead:
$$
f(x)=\lim_{h \to 0}\frac{F(x+h)-F(x)}{h}
$$
where $F(x) = P(X \leq x)$.
answered Oct 10 '12 at 19:28 by Alex
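This limit can be verified with a finite difference of the CDF. A quick sketch, again using the standard normal as an example distribution: the difference quotient of $F$ over a small $h$ should match the pdf value.

```python
import math

def norm_pdf(x):
    """Standard normal density f(x)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def norm_cdf(x):
    """Standard normal CDF F(x), via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

x, h = 1.0, 1e-6
# (F(x+h) - F(x)) / h approaches f(x) as h -> 0
finite_diff = (norm_cdf(x + h) - norm_cdf(x)) / h
```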
So you suggest looking at the pdf as being defined by the cumulative distribution function?
– jesterII
Oct 10 '12 at 19:39
This is essentially the definition of the pdf for CRVs.
– Alex
Oct 10 '12 at 20:12
A good way of thinking about it is $f(x) = \frac{dF}{dx}$, so it's the rate of change of the cdf at $x$.
– Jacob
Feb 27 '13 at 17:39
Hi Alex, sorry for asking a question about such an old answer. I know the pdf $f$ is the derivative of the cdf $F$, but what is the physical meaning of "the rate of change of the cdf at some point"? I mean, how can it be explained in terms of the continuous random variable $X$?
– Sam Wong
Dec 13 '18 at 8:17
I don't know much about physics, sorry.
– Alex
Dec 13 '18 at 10:55
In general, if $X$ is a random variable taking values in a measure space $(A,\mathcal A,\mu)$ and with pdf $f:A\to [0,\infty)$, then for every measurable set $S\in\mathcal A$,
$$P(X\in S) = \int_S f\,d\mu $$
So, if $A=\Bbb R$ (and $\mu=\lambda$, the Lebesgue measure), then
$$P(a<X<b)=\int_a^b f(x)\,dx$$
So, for example, $f(x) = \displaystyle\lim_{t\to 0} \frac1{2t}\int_{x-t}^{x+t} f =\lim_{t\to 0} \frac1{2t} P(|X-x|<t)$. We can call that a 'relative likelihood'.
answered Oct 10 '12 at 19:30 by Berci
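The symmetric-window limit above is also easy to check numerically. A sketch under an assumed example distribution ($X$ exponential with rate 2, not from the answer itself): $P(|X-x|<t)/(2t)$ approaches $f(x)$ as $t\to 0$.

```python
import math

lam = 2.0  # rate of an example exponential distribution

def exp_pdf(x):
    """pdf of Exp(lam): lam * e^(-lam x) for x >= 0."""
    return lam * math.exp(-lam * x)

def exp_cdf(x):
    """CDF of Exp(lam): 1 - e^(-lam x) for x >= 0."""
    return 1 - math.exp(-lam * x)

x, t = 0.5, 1e-5
# P(|X - x| < t) / (2t) should approach f(x) as t -> 0
window_estimate = (exp_cdf(x + t) - exp_cdf(x - t)) / (2 * t)
```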
This is a better answer than Alex's, but it doesn't explain the significance of the number $f(x)$. Does it have a meaning independent of a cdf? André's answer, that it is approximately $hf(a)$, is great, but he doesn't indicate whether there's more to $f(x)$ by itself.
– Jacob
Feb 27 '13 at 17:23
Intro statistics focuses on the PDF as the description of the population, but in fact it is the CDF (cumulative distribution function) that gives you a functional understanding of the population, as points on the CDF denote probabilities over a relevant range of the measure. If you look at all of statistics from this perspective, then the PDF is just the description of how probability changes around a point along the measure at hand. The values of the PDF therefore only give you a look at the spread. For example, given two normal distributions $N(\mu_1, \sigma_1^2)$ and $N(\mu_2, \sigma_2^2)$, if you choose any value of $x$ to get the points $p_n=\mu_n+x\cdot\sigma_n$ for the respective distributions and find that the density values satisfy $f_1(p_1) > f_2(p_2)$, then this just means $\sigma_1 < \sigma_2$. Similar relationships exist for other distributions.
answered Jun 28 '15 at 14:44 by Topher
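The normal-distribution claim can be checked directly. A sketch with arbitrary example parameters (not from the answer): at the same $x$ (z-score), the density values differ only by the factor $1/\sigma$, so the narrower distribution has the larger pdf value.

```python
import math

def norm_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-z * z / 2) / (sigma * math.sqrt(2 * math.pi))

# two example normal distributions with different spreads
mu1, s1 = 0.0, 1.0
mu2, s2 = 5.0, 3.0

x = 1.5                 # a common z-score
p1 = mu1 + x * s1       # the point p_1 = mu_1 + x * sigma_1
p2 = mu2 + x * s2       # the point p_2 = mu_2 + x * sigma_2

# f1(p1) > f2(p2) exactly because s1 < s2: at equal z-scores,
# sigma * f(mu + x * sigma) is the same for every normal distribution.
```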
Very interesting answer!
– information_interchange
Jun 7 '18 at 18:38
I am not sure if Jester is still interested, as it's been 5 years, but I think I found a less confusing answer than Wikipedia's.
In contrast to discrete random variables, if $X$ is continuous, $f(x)$ is a function whose value at any given point is not a probability; rather, it indicates how likely $X$ is to be near that point. For example, if the value of the PDF around a point $x$ is large, the random variable $X$ is more likely to take values close to $x$. If, on the other hand, $f(x)=0$ on some interval, then $X$ won't be in that interval.
Of course, a more practical way of thinking about it is that the probability of $X$ being in an interval is given by the integral of the PDF over that interval.
You might want to look at the link below for more details: http://mathinsight.org/probability_density_function_idea
answered Oct 24 '17 at 23:25 by ALEX.VAMVAS
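The "integral of the PDF over an interval" view can be sketched numerically. Assuming the standard normal as an example distribution, a trapezoidal-rule integral of the pdf over $[-1,1]$ should match $F(1)-F(-1)$ (about 0.6827, the one-sigma probability):

```python
import math

def norm_pdf(x):
    """Standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def norm_cdf(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# P(a < X < b) as the integral of the pdf, approximated by the trapezoidal rule
a, b, n = -1.0, 1.0, 10_000
h = (b - a) / n
area = sum((norm_pdf(a + i * h) + norm_pdf(a + (i + 1) * h)) * h / 2
           for i in range(n))
# area agrees with the exact interval probability F(b) - F(a)
```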
The ratio of the pdf $f(x)$ at two points, $r_x = f(x_0)/f(x_1)$, is not a measure of relative probability (or "relative likelihood") for the two outcomes of the random variable $X$. The ratio depends on the metric. That is, under a variable transformation $z=z(x)$, with the pdf of $Z$ given by $h(z)$, the ratio $r_z=h(z_0)/h(z_1)\neq r_x$ in general, even though the two ratios refer to the same two outcomes. For monotonic transformations, $f(x)\,dx = h(z)\,dz$.
Numerical values of the pdf have no meaning on their own. The metric, $dx$, is required for probability interpretations (i.e. $f(x)\,dx$). Wikipedia got this wrong, so I have corrected it.
answered Nov 27 '16 at 0:47 by Dean
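This metric-dependence can be demonstrated concretely. A sketch with an assumed example (not from the answer): $X \sim \mathrm{Exp}(1)$ and the monotonic transformation $Z = X^2$, whose density by change of variables is $h(z) = f(\sqrt z)/(2\sqrt z)$. The density ratio for the outcomes $x_0=1$ and $x_1=2$ changes from $e$ to $2e$ when the same outcomes are described on the $z$ scale.

```python
import math

def f(x):
    """pdf of X ~ Exp(1): e^(-x) for x >= 0."""
    return math.exp(-x)

def h(z):
    """pdf of Z = X**2, by change of variables: h(z) = f(sqrt(z)) / (2*sqrt(z))."""
    s = math.sqrt(z)
    return f(s) / (2 * s)

x0, x1 = 1.0, 2.0
r_x = f(x0) / f(x1)        # density ratio for the outcomes x0, x1:  e
r_z = h(x0**2) / h(x1**2)  # same two outcomes on the z scale:       2e
# r_x != r_z: the "relative likelihood" changed with the parametrization
```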