MLE coin toss problem

Given a coin with an unknown bias and the observation of $N$ heads and $0$ tails, what is the expected probability that the next flip is a head?



I want to solve this with MLE, not Bayesian analysis.



My attempt:



For any value of $p$, the probability of $k$ heads in $n$ tosses is given by



$\binom{n}{k} p^k \left ( 1-p \right )^{n-k}$



Consider the maximization problem:



$\frac{\partial}{\partial p}\left[\binom{n}{k} p^k (1-p)^{n-k}\right]=0$



$\hat{p}=\frac{k}{n}$



and I'm stuck here. Thank you.



Answer: $\frac{n+1}{n+2}$
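For a concrete check of the maximization above, here is a minimal pure-Python sketch (the grid search and the value $N = 10$ are illustrative assumptions, not part of the original post) confirming that the binomial likelihood is maximized at $\hat p = k/n$, which equals $1$ when all $N$ tosses are heads:

```python
# Grid-search check that C(n, k) p^k (1-p)^(n-k) peaks at p = k/n,
# i.e. at p = 1 when every one of the N tosses is a head.
# N = 10 and the grid resolution are illustrative assumptions.
from math import comb

n = k = 10                                  # N heads in N tosses
grid = [i / 10000 for i in range(10001)]
lik = [comb(n, k) * p**k * (1 - p)**(n - k) for p in grid]
p_hat = grid[lik.index(max(lik))]
print(p_hat)                                # 1.0, i.e. k/n
```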










probability probability-theory statistics probability-distributions

asked Jan 9 at 14:08 by jekyll
  • What does Ans mean?
    – Stockfish, Jan 9 at 14:11

  • Sorry, I edited.
    – jekyll, Jan 9 at 14:12

  • Not sure what you hope to get out of maximum likelihood. Clearly the value of $p$ with the highest probability of producing $N$ heads out of $N$ is $p=1$. So what?
    – lulu, Jan 9 at 14:19

  • $\frac{n+1}{n+2}$ is the mean of the Bayesian posterior distribution starting with a uniform prior and is not difficult. But you have excluded that approach.
    – Henry, Jan 9 at 14:37

  • jekyll - your solution has an error: you say the likelihood is proportional to $p^k(1-p)^{n-k}$, but then you take the derivative of $p^{k+1}(1-p)^{n-k+1}$.
    – Henry, Jan 10 at 1:14
2 Answers







Really not sure what you mean by maximum likelihood in your context, but here goes an attempt. Suppose the coin's bias can take one of the $k+1$ equally spaced values $0, \tfrac{1}{k}, \dots, 1$, each a priori equally likely, and let $A_i$ be the event that the bias is $\tfrac{i}{k}$. Let $Z_n$ be the event that the first $n$ flips are all heads and $H_n$ the event that the $n$th flip is a head. Then we are interested in $P(H_{n+1} \vert Z_n)$, which is given by
\begin{equation}
P(H_{n+1} \vert Z_n)
=
\sum_{i=0}^k
P(H_{n+1} \vert Z_n A_i)\,P(A_i \vert Z_n). \tag{1}
\end{equation}

Assuming that the flips are independent conditional on the $i$th coin being the chosen one,
\begin{equation}
P(H_{n+1} \vert Z_n A_i)
=
\frac{i}{k}.
\end{equation}

Using Bayes' theorem,
\begin{equation}
P(A_i \vert Z_n) = \frac{P(Z_n \vert A_i)P(A_i)}{P(Z_n)}
=
\frac{\frac{1}{k+1}\left(\frac{i}{k}\right)^n}{\frac{1}{k+1}\sum_{j=0}^k \left(\frac{j}{k}\right)^n}
=
\frac{\left(\frac{i}{k}\right)^n}{\sum_{j=0}^k \left(\frac{j}{k}\right)^n}.
\end{equation}

Substituting in $(1)$, we get
\begin{equation}
P(H_{n+1} \vert Z_n)
=
\sum_{i=0}^k
P(H_{n+1} \vert Z_n A_i)\,P(A_i \vert Z_n)
=
\sum_{i=0}^k
\frac{i}{k}\,\frac{\left(\frac{i}{k}\right)^n}{\sum_{j=0}^k \left(\frac{j}{k}\right)^n}
=
\frac{\sum_{i=0}^k \left(\frac{i}{k}\right)^{n+1}}{\sum_{i=0}^k \left(\frac{i}{k}\right)^n} \tag{2}
\end{equation}

and we're done.
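As a numerical sanity check of $(2)$ under the discrete-bias reading above, here is a small Python sketch (the values $n = 5$, $k = 4$ are illustrative assumptions): it computes $P(A_i \mid Z_n)$ by Bayes' rule and then $P(H_{n+1} \mid Z_n)$ by total probability, and compares the result with the closed form in $(2)$.

```python
# Discrete-bias model: the bias is i/k with prior 1/(k+1).
# n = 5 and k = 4 are illustrative values only.
n, k = 5, 4
prior = 1 / (k + 1)

# Posterior P(A_i | Z_n) via Bayes' rule after seeing n heads in a row.
unnorm = [prior * (i / k) ** n for i in range(k + 1)]
total = sum(unnorm)
post = [w / total for w in unnorm]

# P(H_{n+1} | Z_n) via total probability, and the closed form (2).
p_next = sum((i / k) * post[i] for i in range(k + 1))
closed = (sum((i / k) ** (n + 1) for i in range(k + 1))
          / sum((i / k) ** n for i in range(k + 1)))
print(p_next, closed)   # both are about 0.94
```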





For large $k$

As $k \rightarrow \infty$, the sums become Riemann sums, so
\begin{equation}
\lim_{k \rightarrow \infty}
\frac{1}{k}
\sum_{i=0}^k
\left(\frac{i}{k}\right)^{\beta}
=
\int_0^1
x^\beta \, dx
=
\frac{1}{1+\beta}.
\end{equation}

Taking $\beta = n+1$ for the numerator of $(2)$ and $\beta = n$ for the denominator, we get
\begin{equation}
P(H_{n+1} \vert Z_n)
=
\frac{n+1}{n+2}.
\end{equation}

As $n \rightarrow \infty$, the probability tends to $1$, which is intuitive.
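And a quick look at the convergence in $k$ (again with illustrative values; here $n = 5$, so $\frac{n+1}{n+2} = \frac{6}{7} \approx 0.857$):

```python
# Evaluate (2) on increasingly fine grids of possible biases; the limit
# (n+1)/(n+2) emerges as k grows.  n = 5 is an illustrative value.
def prob_next_head(n, k):
    num = sum((i / k) ** (n + 1) for i in range(k + 1))
    den = sum((i / k) ** n for i in range(k + 1))
    return num / den

for k in (10, 100, 10_000):
    print(k, prob_next_head(5, k))   # approaches 6/7 ≈ 0.8571
```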






answered Jan 9 at 14:21 by Ahmad Bazzi

  • Thanks, but we observed $n$ tosses of an unknown biased coin with probability of heads $\theta$. How can I calculate this parameter via maximum likelihood? How can I derive the log-likelihood and the maximum likelihood estimate of $\theta$? I was trying to find the likelihood function. @AhmadBazzi
    – jekyll, Jan 9 at 17:38






















We observe $k=N$ heads in $N$ trials and want to determine the unknown probability $p$ and the accuracy of the estimate. The maximum likelihood estimate is the value of $p$ giving the largest probability for the observed data, with

$0\leq p\leq 1.$

The prior is $\mathrm{Unif}(0, 1)$, which is the beta distribution with $\alpha = 1$, $\beta = 1$ in our case.

Let's find the posterior distribution of $p$ for a prior $\pi(p) \sim \mathrm{Beta}(\alpha, \beta)$:

$\pi (p)=\frac{1}{B(\alpha ,\beta )}p^{\alpha -1}\left ( 1-p \right )^{\beta -1}$

$f(k\mid p)=\binom{n}{k}p^{k}(1-p)^{n-k}$

$f(p\mid k)=\frac{f(k\mid p)}{f_{K}(k)}\pi(p)$

$\propto p^{k+\alpha -1}(1-p)^{n-k+\beta -1}$

Based on this we can see that $f(p\mid k)$ has a $\mathrm{Beta}(k+\alpha,\ n-k+\beta)$ distribution.



So $\alpha=1$ and $\beta=1$:

the likelihood is proportional to the beta distribution with parameters $k+1$ and $n-k+1$.

In our problem there are $N$ heads in $N$ tosses, so $N=k$.

This partial derivative is $0$ at the maximum likelihood estimate:

$\frac{\partial }{\partial p}\left[ p^{k+1} (1-p)^{n-k+1}\right]=0$

$p=\frac{k+1}{n+2}$

$n=k$ $\Rightarrow$ $p=\frac{n+1}{n+2}$

(Note that it isn't necessary to find $f_K(k)$ explicitly; we can ignore the normalizing constants of both the likelihood and the prior.)
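To see how this ties back to the quoted answer, here is a short Python sketch (the value $N = 10$ is an illustrative assumption) that computes the mean and mode of the $\mathrm{Beta}(k+1,\,n-k+1)$ posterior for the all-heads case: the mean reproduces $\frac{N+1}{N+2}$, while the mode (the estimate a flat-prior maximization gives) is $k/n = 1$.

```python
# Beta(k+1, n-k+1) posterior from a uniform prior (alpha = beta = 1),
# specialized to the all-heads case k = n = N.  N = 10 is illustrative.
N = 10
n, k = N, N
alpha_post, beta_post = k + 1, n - k + 1       # posterior parameters

posterior_mean = alpha_post / (alpha_post + beta_post)            # (N+1)/(N+2)
posterior_mode = (alpha_post - 1) / (alpha_post + beta_post - 2)  # k/n

print(posterior_mean)   # 0.9166..., i.e. 11/12
print(posterior_mode)   # 1.0
```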






answered Jan 10 at 13:50 by jekyll












