Entropy of continuous and discrete random variables












If $N$ is a continuous random variable and $X$ is a discrete random variable, how can I calculate $H(X\mid Y)$ when $Y = X + N$?


  • $N$ has a triangular distribution between $-1$ and $1$.


  • $X$ takes the values $\pm 0.5$ with equal probability.


Assume known: $H(X)$, $h(N) = h(Y\mid X)$, $h(Y)$, and the pdfs of $X$, $N$ and $Y$. All of the entropy terms are finite.
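

For concreteness, two of the "known" quantities have simple closed forms (a quick side calculation, in nats; these values are not stated in the post): with $f_N(t) = 1 - |t|$ on $[-1, 1]$,

$$H(X) = \log 2, \qquad h(N) = h(Y\mid X) = -\int_{-1}^{1} (1-|t|)\log(1-|t|)\,dt = \tfrac{1}{2},$$

while $f_Y(y) = \tfrac{1}{2} f_N(y - \tfrac{1}{2}) + \tfrac{1}{2} f_N(y + \tfrac{1}{2})$, so $h(Y)$ can be computed from this piecewise-linear density (numerically, or piece by piece); the two shifted triangles overlap on $(-\tfrac{1}{2}, \tfrac{1}{2})$.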










probability · random-variables · information-theory · entropy


asked Jan 2 at 18:46, edited Jan 3 at 22:49 – missca












  • What exactly is denoted by $H(X\mid Y)$?
    – drhab
    Jan 2 at 18:49

  • The entropy of $X$ conditioned on $Y$.
    – missca
    Jan 2 at 18:54
















2 Answers


Answer by leonbloy (answered Jan 3 at 15:09, edited Jan 4 at 15:45)

It's true that you shouldn't mix up or confuse the "true" entropy with the differential entropy: differential entropy is not a true Shannon entropy, and for one thing, the true entropy of a non-degenerate continuous variable is infinite. But it's still true that the ("true") mutual information of any two random variables is well defined, and it can be expressed as a difference of either true entropies or differential entropies (see for example here or here).



So, if $X$ is discrete and $Y$ continuous, we are justified in writing $I(X;Y)=H(X)-H(X|Y)=h(Y)-h(Y|X)$ and hence



$$ H(X|Y) = H(X) - I(X;Y)=H(X) -h(Y) + h(Y|X) $$
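

For the specific setup in the question, this formula is straightforward to evaluate numerically. A minimal Python sketch (my own addition, not part of this answer; the grid resolution and natural-log units are arbitrary choices):

```python
import numpy as np

# Setup from the question: X = ±0.5 with equal probability, N ~ triangular on [-1, 1], Y = X + N.
# Everything is in nats (natural log).

def f_N(t):
    """Triangular pdf on [-1, 1] with peak at 0."""
    return np.maximum(1.0 - np.abs(t), 0.0)

# Grid covering the support of Y = X + N, which is [-1.5, 1.5].
y = np.linspace(-1.5, 1.5, 300001)
dy = y[1] - y[0]

# Density of Y: an equal-weight mixture of the noise density shifted by ±0.5.
f_Y = 0.5 * f_N(y - 0.5) + 0.5 * f_N(y + 0.5)

def diff_entropy(pdf_values, dx):
    """Riemann-sum approximation of -∫ p log p (zeros of the pdf contribute nothing)."""
    p = pdf_values[pdf_values > 0]
    return -np.sum(p * np.log(p)) * dx

H_X = np.log(2)                    # discrete entropy of X: 1 bit = log 2 nats
h_YgX = diff_entropy(f_N(y), dy)   # h(Y|X) = h(N), which is 1/2 nat for this triangle
h_Y = diff_entropy(f_Y, dy)

H_XgY = H_X - h_Y + h_YgX          # the formula above
print(f"H(X)   = {H_X:.4f} nats")
print(f"h(Y|X) = {h_YgX:.4f} nats")
print(f"h(Y)   = {h_Y:.4f} nats")
print(f"H(X|Y) = {H_XgY:.4f} nats")
```

The result should come out strictly between $0$ and $H(X) = \log 2$: $Y$ carries information about $X$, but the two shifted triangles overlap on $(-\tfrac{1}{2}, \tfrac{1}{2})$, so $X$ cannot always be recovered from $Y$.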






  • I understand what you say, but my teacher says we can not use $$H(X|Y)=H(X)-h(Y)+h(Y|X)$$ because we are mixing H with h. He says that we can only use it if both are discrete or both are continuous, and in my case X is discrete and Y continuous. That's why I do not know how to calculate $H(X|Y)$.
    – missca
    Jan 3 at 22:42

  • No matter how you solve it, the end result will be just mine (and Gautam Shenoy's). So, in the end you will be "mixing H and h" (which are your only data). You cannot escape that. You only need to justify how to do it. And that I have mentioned (and linked to the details).
    – leonbloy
    Jan 3 at 23:59



















Answer by Gautam Shenoy (answered Jan 2 at 18:55)

From the mutual information relation
$$H(X|Y) = H(X) - h(Y) + h(Y|X)$$



Use this to get your desired result.
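

For reference, the relation being invoked here is presumably the same identity spelled out in the other answer,

$$I(X;Y) = H(X) - H(X\mid Y) = h(Y) - h(Y\mid X),$$

which rearranges directly to the formula above.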






  • My teacher says that you can not mix entropies (H, discrete variables) with relative entropies (h, continuous variables).
    – missca
    Jan 2 at 19:18

  • Are all the entropy terms in your example finite?
    – Gautam Shenoy
    Jan 3 at 2:55

  • Yes! All the entropy terms are finite.
    – missca
    Jan 3 at 8:19










