Find the MLE of $p$ where $f(y;p)=2p^2y^{-3}$
Find the MLE of $p$ where $f(y;p)=2p^2y^{-3}$.
Attempt:
Method: find the likelihood function, differentiate with respect to $p$, then set the result to zero and solve for $p$.
$L(p;y)=\prod\limits_{i=1}^n \left[2p^2y_i^{-3}\right] =\prod\limits_{i=1}^n[2]\,\prod\limits_{i=1}^n\left[p^2\right]\,\prod\limits_{i=1}^n \left[y_i^{-3}\right] = 2^n p^{2n} \prod\limits_{i=1}^n y_i^{-3}$
Differentiating the log-likelihood gives $2n/p$ (I think), and the $y_i$ all disappear. So what happens when you set this to zero? Because if $0=2n/p$, then surely $0=2n$ and $n=0$.
Have I misstated $L(p;y)$? Or is something going wrong with my differentiation? (Or both?)
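For reference, writing the log-likelihood out explicitly (still with the support of the density ignored, as in the attempt above) shows exactly the issue being described:

$$\ell(p)=\log L(p;y)=n\log 2+2n\log p-3\sum_{i=1}^n\log y_i,\qquad \frac{d\ell}{dp}=\frac{2n}{p}>0\quad\text{for all }p>0.$$

The derivative never vanishes, so setting it to zero cannot produce an estimator; the missing piece is the support constraint on $y$, which the answer below supplies via an indicator function.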
statistics statistical-inference maximum-likelihood
asked Jan 6 at 12:07 by Maths Barry, edited Jan 6 at 12:40 by Did
– StubbornAtom (Jan 6 at 14:16): Previously asked: math.stackexchange.com/questions/446142/pareto-distribution-mle. Without mentioning the support of the distribution, your $f(y;p)$ is incomplete. And when you do mention the support, say by using indicator functions, you will see that differentiation is not a valid way to derive the MLE, because the likelihood is not differentiable everywhere in the first place. See the linked threads there.

– Maths Barry (Jan 6 at 14:36): THANK YOU! I didn't realise this was a Pareto distribution, so I am extremely grateful for this.
$begingroup$
Find the MLE of $p$ where $f(y;p)=2p^2y^{-3}$.
Attempt:
Method: find the likelihood function, differentiate with respect to $p$ then set to zero and solve for $p$.
$L(p;y)=prodlimits_{i=1}^n [2p^2y^{-3}_i] =prodlimits_{i=1}^n[2] prodlimits_{i=1}^n[p^2] prodlimits_{i=1}^n [y^{-3}_i] = 2^n p^{2n} prodlimits_{i=1}^n [y^{-3}_i]$
Differentiating the log likelihood gives $2n/p$ (i think) and the $y_i$ all disappear. So what happens when you set this to zero? Because if $0=2n/p$ then surely $0=2n$ and $n=0$.
Have I misstated $L(p;y)$? Or is something going wrong with my differentiation? (Or both?)
statistics statistical-inference maximum-likelihood
$endgroup$
Find the MLE of $p$ where $f(y;p)=2p^2y^{-3}$.
Attempt:
Method: find the likelihood function, differentiate with respect to $p$ then set to zero and solve for $p$.
$L(p;y)=prodlimits_{i=1}^n [2p^2y^{-3}_i] =prodlimits_{i=1}^n[2] prodlimits_{i=1}^n[p^2] prodlimits_{i=1}^n [y^{-3}_i] = 2^n p^{2n} prodlimits_{i=1}^n [y^{-3}_i]$
Differentiating the log likelihood gives $2n/p$ (i think) and the $y_i$ all disappear. So what happens when you set this to zero? Because if $0=2n/p$ then surely $0=2n$ and $n=0$.
Have I misstated $L(p;y)$? Or is something going wrong with my differentiation? (Or both?)
statistics statistical-inference maximum-likelihood
statistics statistical-inference maximum-likelihood
edited Jan 6 at 12:40
Did
247k23223459
247k23223459
asked Jan 6 at 12:07
Maths BarryMaths Barry
458
458
1
$begingroup$
Previously asked: math.stackexchange.com/questions/446142/pareto-distribution-mle. Without mentioning support of the distribution, your $f(y;p)$ is incomplete. And when you mention the support, say by using indicator functions, you will see that differentiation is not valid to derive the MLE because the likelihood is not differentiable everywhere in the first place. See the linked threads here.
$endgroup$
– StubbornAtom
Jan 6 at 14:16
$begingroup$
THANK YOU! I didn't realise this was a Pareto distribution so I am extremely grateful for this.
$endgroup$
– Maths Barry
Jan 6 at 14:36
add a comment |
1
$begingroup$
Previously asked: math.stackexchange.com/questions/446142/pareto-distribution-mle. Without mentioning support of the distribution, your $f(y;p)$ is incomplete. And when you mention the support, say by using indicator functions, you will see that differentiation is not valid to derive the MLE because the likelihood is not differentiable everywhere in the first place. See the linked threads here.
$endgroup$
– StubbornAtom
Jan 6 at 14:16
$begingroup$
THANK YOU! I didn't realise this was a Pareto distribution so I am extremely grateful for this.
$endgroup$
– Maths Barry
Jan 6 at 14:36
1
1
$begingroup$
Previously asked: math.stackexchange.com/questions/446142/pareto-distribution-mle. Without mentioning support of the distribution, your $f(y;p)$ is incomplete. And when you mention the support, say by using indicator functions, you will see that differentiation is not valid to derive the MLE because the likelihood is not differentiable everywhere in the first place. See the linked threads here.
$endgroup$
– StubbornAtom
Jan 6 at 14:16
$begingroup$
Previously asked: math.stackexchange.com/questions/446142/pareto-distribution-mle. Without mentioning support of the distribution, your $f(y;p)$ is incomplete. And when you mention the support, say by using indicator functions, you will see that differentiation is not valid to derive the MLE because the likelihood is not differentiable everywhere in the first place. See the linked threads here.
$endgroup$
– StubbornAtom
Jan 6 at 14:16
$begingroup$
THANK YOU! I didn't realise this was a Pareto distribution so I am extremely grateful for this.
$endgroup$
– Maths Barry
Jan 6 at 14:36
$begingroup$
THANK YOU! I didn't realise this was a Pareto distribution so I am extremely grateful for this.
$endgroup$
– Maths Barry
Jan 6 at 14:36
add a comment |
1 Answer
Both the PDF and the likelihood in your post are incorrect, because you forgot to include the indicator functions.
In fact, the PDF is $$f(y;p)=2p^2y^{-3}\,\mathbf 1_{y\geqslant p}$$ hence the likelihood of an i.i.d. sample $\mathbf y=(y_i)_{1\leqslant i\leqslant n}$ from the PDF $f(\,\cdot\,;p)$ is $$L(p;\mathbf y)=2^np^{2n}\left(\prod_i y_i^{-3}\right)\mathbf 1_{m(\mathbf y)\geqslant p}$$ where $$m(\mathbf y)=\min_{1\leqslant i\leqslant n} y_i.$$ One sees readily that $$L(p;\mathbf y)=c(\mathbf y)\,p^{2n}\,\mathbf 1_{m(\mathbf y)\geqslant p}$$ for some positive constant $c(\mathbf y)$ independent of $p$, hence $L(\,\cdot\,;\mathbf y)$ is maximal at $$\hat p=m(\mathbf y).$$ No differentiation is involved, rather a precise understanding of the situation.
answered Jan 6 at 12:36 by Did, edited Jan 6 at 14:46
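For a quick numerical sanity check, here is a minimal Python sketch (parameter values arbitrary, not part of the answer itself) that samples from this density by inversion, using $F(y)=1-(p/y)^2$ for $y\geqslant p$, and confirms that the log-likelihood is largest at the sample minimum.

```python
import numpy as np

rng = np.random.default_rng(0)

p_true = 3.0   # hypothetical "true" scale parameter
n = 1_000      # sample size

# Inverse-CDF sampling from f(y; p) = 2 p^2 y^{-3}, y >= p:
# F(y) = 1 - (p/y)^2, so Y = p * U^{-1/2} with U ~ Uniform(0, 1).
u = rng.uniform(size=n)
y = p_true * u ** -0.5

# MLE from the answer: the sample minimum.
p_hat = y.min()
print(f"true p = {p_true}, MLE (sample minimum) = {p_hat:.4f}")

def log_likelihood(p, y):
    """Log of L(p; y) = 2^n p^(2n) (prod y_i^-3) 1{min(y) >= p}."""
    if p <= 0 or p > y.min():
        return -np.inf  # outside the support constraint the likelihood is 0
    n = len(y)
    return n * np.log(2) + 2 * n * np.log(p) - 3 * np.log(y).sum()

# The log-likelihood increases up to min(y) and drops to -inf just above it.
for p in (0.9 * p_hat, p_hat, 1.001 * p_hat):
    print(f"log L({p:.4f}) = {log_likelihood(p, y):.2f}")
```

Because every sample point satisfies $y_i\geqslant p$, the sample minimum can only overshoot the true $p$, and it does so by an amount that shrinks as $n$ grows.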
– Maths Barry (Jan 6 at 13:14): What if we assume $Y \geq p$?

– Did (Jan 6 at 13:22): Sorry, but what if what?

– Maths Barry (Jan 6 at 13:33): What if the range is $0<p\leq y<\infty$? The indicator function is not still needed then, is it? And differentiation would be the only way to get the MLE?

– Did (Jan 6 at 14:13): Yes, the range of each $f(\,\cdot\,;p)$ is $[p,\infty)$. This is exactly why the indicator function is (much) needed. "And different[i]ation would be the only way to get the MLE?" Sorry, but what are you talking about? If you can present a proof based on solving $$\frac{\partial L(p;\mathbf y)}{\partial p}=0$$ please do so (there are none...).