Independence of $(X,Y)$ with joint PDF $f_{X,Y}(x,y)=8xy$ on $0<y<x<1$
I know that, as a general property, if $f_{X,Y}(x,y)=g(x)h(y)$, then we say $X$ and $Y$ are independent random variables.
However, I am having trouble accepting this statement. Take the case
$$f_{X,Y}(x,y)=8xy,\qquad 0<y<x<1.$$
Then $f_X(x)=4x$ and $f_Y(y)=4y$, where $f_X(x)\cdot f_Y(y)\ne f_{X,Y}(x,y)$.
Maybe I am getting something wrong over the limits of integration; I often trip on simple things... Or maybe, in this case, since the variables are dependent in the limits of integration, they cannot be independent. But I haven't seen any book state that as a requirement, so I suppose I got something wrong in my calculations.
If someone could please clarify this for me, I would be grateful.
probability-theory probability-distributions independence
asked Aug 1 '13 at 16:56 by user191919, edited Jan 28 at 12:04 by Did
You are missing the point that $f(x,y)$ must equal $g(x)h(y)$ for all $x$ and $y$, $-\infty < x < \infty$, $-\infty < y < \infty$. For your $8xy$ function, the product rule does not hold for all $(x,y)$. For example, $f(0.6,0.4) = g(0.6)h(0.4)$ but $f(0.4,0.6) = 0 \neq g(0.4)h(0.6)$.
– Dilip Sarwate, Aug 2 '13 at 4:29
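A minimal numerical sketch of this point (the particular split of $8xy$ into $g$ and $h$ below is just one illustrative choice, not the only one):

```python
# Check that the factorization fails once the indicator of {0 < y < x < 1}
# is taken into account. g and h form one possible split of 8xy.

def f(x, y):
    """Joint PDF, including the indicator of the support 0 < y < x < 1."""
    return 8 * x * y if 0 < y < x < 1 else 0.0

def g(x):  # illustrative candidate factor in x
    return 8 * x

def h(y):  # illustrative candidate factor in y
    return y

print(f(0.6, 0.4), g(0.6) * h(0.4))  # 1.92 and 1.92: equal inside the support
print(f(0.4, 0.6), g(0.4) * h(0.6))  # 0.0 and 1.92: unequal outside it
```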
@MJS If $X$ and $Y$ are independent random variables then $f_{X,Y}(x,y)=g(x)h(y)$ is always true, BUT the converse may not be true, i.e. if $f_{U,V}(u,v)=g(u)h(v)$ then it doesn't guarantee that $U$ and $V$ are independent random variables.
– kaka, Aug 2 '13 at 14:47
@kaka This is wrong.
– Did, Aug 2 '13 at 15:00
@Did Thanks a lot for correcting me.
– kaka, Aug 2 '13 at 15:42
Thanks a lot, the comments were very useful.
– user191919, Aug 5 '13 at 17:45
3 Answers
Take the case: $f_{X,Y}(x,y)=8xy$, $0<y<x<1$.
Yet again an excellent example of the fact that densities should include restrictions on their domain.
Recall that any density of the couple of random variables $(X,Y)$ should be a function $f:\mathbb R\times\mathbb R\to\mathbb R_+$ such that what-you-know holds. In the present case, $f:\mathbb R\times\mathbb R\to\mathbb R_+$, $(x,y)\mapsto 8xy$ is (obviously) not a density. Rather, a density of $(X,Y)$ is $f:\mathbb R\times\mathbb R\to\mathbb R_+$, $(x,y)\mapsto 8xy\,\mathbf 1_{0<y<x<1}$ (for example $f(1,2)=0$, not $16$).
Now the result you mention is correct:
The random variables $X$ and $Y$ with density $f$ are independent if and only if there exist $g$ and $h$ such that $f(x,y)=g(x)h(y)$ for (almost) every $(x,y)$ in $\mathbb R\times\mathbb R$.
Which does not hold for the density $f$ in the example.
This remark is also useful when computing marginals. In general,
$$f_X(x)=\int_{\mathbb R}f(x,y)\,\mathrm dy,$$
hence in the present case,
$$f_X(x)=\mathbf 1_{0<x<1}\int_{\mathbb R}8xy\,\mathbf 1_{0<y<x}\,\mathrm dy=x\,\mathbf 1_{0<x<1}\int_0^x 8y\,\mathrm dy=4x^3\,\mathbf 1_{0<x<1}.$$
Note the absence of cases and the automaticity of the computations, thanks to adequate notations.
To sum up:
When a joint distribution is given by its PDF, a détour by the joint CDF is useless (and frankly often cumbersome) provided one uses the true PDF, which should include indicator functions if need be.
– Did, answered Aug 2 '13 at 14:25, edited Aug 2 '13 at 20:15
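As a quick sanity check of the indicator-function computation above, here is a minimal numerical sketch (assuming SciPy is available; the helper names are illustrative):

```python
from scipy.integrate import quad

def f(x, y):
    # The true PDF: 8xy times the indicator of {0 < y < x < 1}.
    return 8 * x * y if 0 < y < x < 1 else 0.0

def f_X(x):
    # Marginal of X: integrate y out; [0, 1] covers the support.
    return quad(lambda y: f(x, y), 0, 1)[0]

def f_Y(y):
    # Marginal of Y.
    return quad(lambda x: f(x, y), 0, 1)[0]

x, y = 0.7, 0.3
print(f_X(x), 4 * x**3)          # both ~1.372: matches 4x^3
print(f_Y(y), 4 * y - 4 * y**3)  # both ~1.092: matches 4y - 4y^3
print(f(x, y), f_X(x) * f_Y(y))  # 1.68 vs ~1.498: joint != product of marginals
```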
The complete, formal, heavy, official, no-joking definition of independence is: two random variables $X$ and $Y$ are independent if every function of $X$ is independent of every function of $Y$... Thankfully, it suffices to state what @Zelareth stated, namely that they are independent iff their joint probability density function can be written as the product of their marginal densities (what if they don't have densities?).
As for your example, it is one of the tricky ones. Indeed, the variables are not independent. To derive their density functions, the safest approach (but visibly longer) is to go all the way up to the joint distribution function, then back down to the marginal distribution functions and then to the marginal density functions. But this is safe, and illuminating, especially when the support sets of the variables are "linking" the variables together. In general, when the supports are not plus-minus infinity, care must always be exercised, even if they appear disjoint.
So by definition the joint distribution function is $F(x,y)= P(\min x\le X\le x,\ \min y\le Y\le y)$. In our case $\min X = y$, while $\min Y=0$. Using dummy variables of integration ($u$ for $x$ and $w$ for $y$), we have
$$F(x,y)=\int_{w=0}^y \int_{u=w}^x f_{XY}(u,w)\,du\,dw=8\int_{w=0}^y w \int_{u=w}^x u\,du\,dw=8\int_{w=0}^y w\,\frac12\left(x^2-w^2\right)dw$$
$$=4x^2\int_{w=0}^y w\, dw -4\int_{w=0}^y w^3\,dw=2x^2y^2-y^4\;,\; 0<y<x<1\qquad [1]$$
One critical detail to remember in the above is that any lower limits of integration that involve $x$ or $y$ must be expressed in terms of the dummy variables of integration, while the upper integration limits must be kept written in terms of the actual variables $x$ and $y$.
We turn now to the marginal distribution functions. By definition
$$F(x) = \lim_{y\to \max y}F(x,y)$$
In our case (this is another critical detail) $\max y = x$. And this holds although the marginal density of $y$ will have support $(0,1)$. Intuitively, we haven't yet "separated" the variables, so we must still respect their interrelation. Substituting in $F(x,y)$ we obtain
$$F(x) = \lim_{y\to x}\left(2x^2y^2-y^4\right) = 2x^4-x^4 = x^4 \;,\; x\in (0,1)\qquad [2]$$
Now that we have ousted $Y$, the variable $X$ can behave as though $Y$ doesn't exist, and so its support is $(0,1)$. The marginal density of $X$ is the derivative:
$$f_X(x)=\frac{d}{dx}F(x) = 4x^3\;,\; x\in (0,1) \qquad [3]$$
You can verify that it integrates to unity over its support.
For the $Y$ variable we have analogously
$$F(y) = \lim_{x\to \max x}F(x,y)$$
In our case $\max x = 1$. Substituting in $F(x,y)$ we obtain
$$F(y) = \lim_{x\to 1}\left(2x^2y^2-y^4\right) = 2y^2-y^4 \;,\; y\in (0,1)\qquad [4]$$
and the density of $Y$ is
$$f_Y(y)=\frac{d}{dy}F(y) = 4y-4y^3\;,\; y \in (0,1) \qquad [5]$$
It too integrates to unity. As you can see, the product of the marginal densities has nothing to do with the joint density, so the variables are not independent. I would suggest you work out the conditional distributions and densities, to complete the example.
– Alecos Papadopoulos, answered Aug 2 '13 at 3:59, edited Aug 2 '13 at 14:18
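The whole CDF route of this answer can be checked symbolically; here is a minimal sketch (assuming SymPy; the variable names are illustrative):

```python
import sympy as sp

x, y, u, w = sp.symbols('x y u w', positive=True)

# Joint CDF on the support: integrate 8uw over {w < u < x, 0 < w < y}.
F = sp.integrate(sp.integrate(8 * u * w, (u, w, x)), (w, 0, y))
print(sp.expand(F))            # 2*x**2*y**2 - y**4, equation [1]

# Marginal CDFs: push the other variable to the top of its range.
F_x = F.subs(y, x)             # max y = x  ->  x**4, equation [2]
F_y = F.subs(x, 1)             # max x = 1  ->  2*y**2 - y**4, equation [4]

# Marginal densities by differentiation, equations [3] and [5].
print(sp.diff(F_x, x))             # 4*x**3
print(sp.expand(sp.diff(F_y, y)))  # 4*y - 4*y**3 (up to term order)
```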
Your statement "To derive their density functions, we need to go all the way up to the joint distribution function, then back down to the marginal distribution functions and then to the marginal density functions." seems to imply that it is necessary to go through all these steps. In fact, we can use $$\begin{align}f_Y(y)&=\int_{-\infty}^\infty f_{X,Y}(x,y)\,\mathrm dx\\&=\int_y^1 8xy\,\mathrm dx\\&=4x^2y\bigr|{x=y}^1\\&=4y-4y^3\end{align}$$ and similarly $$f_X(x)=\int_0^x 8xy\,\mathrm dy=4x^3$$ to save ourselves a lot of extra work.
– Dilip Sarwate, Aug 2 '13 at 13:54
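A minimal symbolic check of this shortcut (again assuming SymPy):

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Integrate the joint directly over the support, as in the comment above.
f_Y = sp.integrate(8 * x * y, (x, y, 1))  # x runs over (y, 1) for fixed y
f_X = sp.integrate(8 * x * y, (y, 0, x))  # y runs over (0, x) for fixed x

print(sp.expand(f_Y))  # 4*y - 4*y**3 (up to term order)
print(f_X)             # 4*x**3
```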
"Need" was indeed an overstatement, and I just corrected it. "The safe (and educational) way to go" was the intended meaning, especially for less experienced density diggers, a category to which the OP clearly belongs, considering the pdf's (s)he gave in the question statement.
– Alecos Papadopoulos, Aug 2 '13 at 14:15
That should have been a $\displaystyle 4x^2y\Bigr|_{x=y}^1 = 4y-4y^3$.
– Dilip Sarwate, Aug 2 '13 at 14:43
I think you're slightly confused by this characterization of independence. If we assume that $X$ and $Y$ have density functions $f_X$ and $f_Y$ and joint density $f_{X,Y}$, then $X$ and $Y$ are independent iff $f_{X,Y} = f_X \cdot f_Y$. This is not the same as the property you listed above: the $g(x)$ and $h(y)$ must be the marginal density functions and not just any functions of $x$ and $y$.
– Zelareth, answered Aug 1 '13 at 17:06
That's my point, some books state a lemma that if the density function is separable, then the variables are independent. Maybe I am misreading the lemma. I'll copy it here: Let $(X,Y)$ be a bivariate random variable with joint pdf $f(x,y)$. Then $X$ and $Y$ are independent random variables if and only if there exist functions $g(x)$ and $h(y)$ such that, for every $x$ and $y$ in the reals, $f(x,y)=g(x)h(y)$.
– user191919, Aug 1 '13 at 17:52
The proof seems straightforward: they simply integrate each function from minus infinity to infinity and show that $g(x)$ is the same as the marginal except for a multiplicative constant...
– user191919, Aug 1 '13 at 18:08
This is wrong. $g$ and $h$ don't have to be the marginals. Factoring for any $g$ and $h$ implies the joint factors as a product of the marginals, and it's easy to see that any such $g$ and $h$ is necessarily proportional to the associated marginal. Did's answer is correct.
– guy, Aug 2 '13 at 16:01
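For completeness, the short argument behind this comment is worth spelling out (a sketch of the standard proof): if $f(x,y)=g(x)h(y)$ for almost every $(x,y)$, set $c_g=\int_{\mathbb R}g$ and $c_h=\int_{\mathbb R}h$. Then
$$f_X(x)=\int_{\mathbb R}g(x)h(y)\,\mathrm dy=c_h\,g(x),\qquad f_Y(y)=c_g\,h(y),\qquad c_gc_h=\iint f=1,$$
hence $f_X(x)\,f_Y(y)=c_gc_h\,g(x)h(y)=f(x,y)$, so the joint does factor as the product of the marginals and $X$, $Y$ are independent.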
Oops, I guess I need to review this myself then. Thank you for correcting me though.
– Zelareth, Aug 5 '13 at 0:28