Is $(0/0)\log(0/0)$ identified in information theory?
I encountered this question while studying the Theil index for income inequality, which was invented by Theil, who borrowed heavily from information theory.
My question is just as in the title: can we calculate $(0/0)\log(0/0)$? I know that in information theory we have $0\log(0)=0$ and $0\log(0/0)=0$. But what about $(0/0)\log(0/0)$?
Thanks!
logarithms information-theory
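As context for where such expressions arise: the Theil T index is $T = \frac{1}{n}\sum_i \frac{x_i}{\bar x}\log\frac{x_i}{\bar x}$, and zero incomes produce $0\log(0)$ terms that are conventionally set to $0$. A minimal Python sketch of that convention (the function name is ours, not from any library):

```python
import math

def theil_t(incomes):
    """Theil T index: (1/n) * sum((x_i/mean) * log(x_i/mean)),
    using the convention 0*log(0) = 0 for zero incomes."""
    n = len(incomes)
    mean = sum(incomes) / n
    total = 0.0
    for x in incomes:
        r = x / mean
        if r > 0:  # convention: the 0*log(0) term contributes 0
            total += r * math.log(r)
    return total / n

print(theil_t([1, 1, 1, 1]))  # -> 0.0 (perfect equality)
print(theil_t([0, 0, 0, 4]))  # -> ln(4), all income held by one person
```

The `if r > 0` guard is exactly where the $0\log(0)=0$ convention enters: the term is simply skipped rather than evaluated.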
I think the entropy of a zero-probability event is often treated as $0$ because of the standard limit $\lim_{x\to 0} x\log(x)=0$. I don't know what $0/0$ is supposed to mean.
– mathreadler
Jan 28 at 21:08
asked Jan 28 at 21:02
Ethan Shen
1 Answer
> I know in information theory we have $0\log(0)=0$

Actually, what we have is that in several formulas (especially the entropy) the term $p_i \log(p_i)$ appears. That expression is not defined when $p_i=0$, because $0\log(0)$ is not defined, but it is readily seen that the formulas remain valid if we adopt the convention $x\log(x)|_{x=0}=0$ (which is also reasonable by a limit argument). Written more concisely (but also more confusingly): $0\log(0)=0$. It must be understood, however, that this "rule" only applies to that kind of expression. Elsewhere, it leads to absurd consequences.

> and $0\log(0/0)=0$

... and this might be one. I'm not sure where you've read this, and it's not clear what it means. If you mean, as before, $x\log(x/x)|_{x=0}=0$ ... well, that is obviously true in the limit, because we get $0\log(1) = 0 \times 0 = 0$.

> what about $(0/0)\log(0/0)$?

Again, the question as stated makes little sense. Obviously, $(x/x)\log(x/x)$ tends to $1\log(1) = 1 \times 0 = 0$ as $x\to 0$.
But adopting a general "rule" that $(0/0)\log(0/0)=0$ would be as objectionable and error-prone as assuming $0/0=1$.
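The limit arguments above can be checked numerically; a small sketch, including a reminder of why $0/0$ has no canonical value:

```python
import math

# x*log(x) -> 0 as x -> 0 (the convention behind 0*log(0) = 0)
for x in [1e-3, 1e-6, 1e-9]:
    print(x * math.log(x))  # shrinks toward 0

# (x/x)*log(x/x) is exactly 0 for every x > 0, so its limit is 0 too
x = 1e-9
print((x / x) * math.log(x / x))  # -> 0.0

# But 0/0 depends on HOW numerator and denominator approach 0:
print((2 * x) / x)  # -> 2.0, not 1 -- so 0/0 = 1 is not a safe rule
```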
Your Answer
StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3091381%2fis-0-0log0-0-identified-in-information-theory%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
1 Answer
1
active
oldest
votes
1 Answer
1
active
oldest
votes
active
oldest
votes
active
oldest
votes
answered Jan 28 at 21:30
leonbloy
Hi leonbloy, thank you for your quick reply! Two quick follow-ups: 1. I read about $0\log(0/0)=0$ on page 5 of web.stanford.edu/~montanar/RESEARCH/BOOK/partA.pdf . 2. I am still not sure what the connection is between information theory and limits. It sounds like you use the limit to give the answer and then assume it would also hold "from an information theory perspective"? I will read more on information theory. Thank you!
– Ethan Shen
Jan 29 at 15:58
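For what it's worth, the convention $0\log(0/0)=0$ the comment cites typically appears in the definition of the Kullback–Leibler divergence $D(p\|q)=\sum_i p_i\log(p_i/q_i)$, where terms with $p_i=0$ are set to $0$ regardless of $q_i$. A sketch assuming that context (the function name is ours):

```python
import math

def kl_divergence(p, q):
    """D(p||q) = sum_i p_i * log(p_i/q_i), with the conventions
    0*log(0/q) = 0 and 0*log(0/0) = 0; a term with p_i > 0 and
    q_i = 0 makes the divergence infinite."""
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue          # conventions: 0*log(0/q) = 0*log(0/0) = 0
        if qi == 0:
            return math.inf   # p_i > 0 but q_i = 0
        total += pi * math.log(pi / qi)
    return total

print(kl_divergence([0.5, 0.5, 0.0], [0.5, 0.5, 0.0]))  # -> 0.0
```

Note that the convention is applied term-by-term inside the sum, which is precisely the answer's point: it is a rule about those expressions, not a general value for $0/0$.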