Express moments in terms of exponential series
























Consider a random variable $X$ with cumulative distribution function
$$ F(x)= [G(x)]^{\alpha}, $$
where $G(x)$ is a baseline distribution function and $R_{G}(x)=1-G(x)$ is its survival function.

Substituting $R_{G}(x)$ into $F(x)$ and differentiating, $f(x) = dF(x)/dx$, yields the density

$$ f(x)= \alpha\, u'(x)\, e^{-u(x)}\left(1 - e^{-u(x)} \right)^{\alpha-1}, $$

where $u(x)= -\ln R_G(x)$ and $u'(x)$ is the derivative $du(x)/dx$.

My question:

Show that the $r$-th moment of $X$ with the above pdf is given by
$$ E(X^{r})= r \sum_{j=1}^{\nu} c_{j}\, I_{j}(r), \qquad
c_{j}= (-1)^{j+1}\,\frac{\alpha (\alpha-1)\cdots (\alpha-j+1)}{j!}, \qquad
I_{j}(r) = \int_{0}^{\infty} x^{r-1} e^{-j u(x)}\, dx, $$

where $\alpha > 0$ may be non-integral and $\nu \in \{ \alpha, \infty \}$. (The sign $(-1)^{j+1}$ is forced by the case $\alpha = 1$, where the sum must reduce to $c_1 = 1$.)
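Not part of the original question, but a quick numerical check with an assumed concrete baseline (standard exponential, $G(x)=1-e^{-x}$, so $u(x)=x$ and $I_j(r)=\Gamma(r)/j^r$ in closed form) confirms the series, including the sign $(-1)^{j+1}$:

```python
# Numerical check of E(X^r) = r * sum_j c_j * I_j(r)  (illustrative addition;
# the standard-exponential baseline is an assumption, not from the post).
# With G(x) = 1 - exp(-x):  u(x) = x  and  I_j(r) = Gamma(r) / j**r.
import math

alpha, r = 2.5, 3  # non-integer alpha, so the sum is an infinite series

def density(x):
    # f(x) = alpha * u'(x) * exp(-u(x)) * (1 - exp(-u(x)))**(alpha - 1), u(x) = x
    return alpha * math.exp(-x) * (1.0 - math.exp(-x)) ** (alpha - 1)

# direct E[X^r] via composite Simpson's rule on [0, 50] (the tail is negligible)
n, b = 20000, 50.0
h = b / n
direct = sum(
    (1 if i in (0, n) else 4 if i % 2 else 2) * ((i * h) ** r) * density(i * h)
    for i in range(n + 1)
) * h / 3

# series with c_j = (-1)**(j+1) * alpha*(alpha-1)*...*(alpha-j+1) / j!
series, binom = 0.0, 1.0
for j in range(1, 400):
    binom *= (alpha - j + 1) / j        # running product: binom(alpha, j)
    series += (-1) ** (j + 1) * binom * math.gamma(r) / j ** r
series *= r

print(direct, series)  # the two values agree to several decimal places
```

With the $(-1)^{j}$ sign instead, the series would come out with the wrong overall sign, which is the quickest way to see the correction.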

































  • @LeeDavidChungLin: yes, $[u(x)]' = du(x)/dx$; it is the derivative of $u(x)$.
    – J.H
    Jan 29 at 9:08










  • $u(x) = -\ln R(x)$, where $R(x)$ is the survival function. @LeeDavidChungLin
    – J.H
    Jan 29 at 9:15










  • Ok, sorry about that. I have edited the question. @LeeDavidChungLin
    – J.H
    Jan 29 at 11:24










  • Yes, I am still interested in an answer. @LeeDavidChungLin
    – J.H
    Feb 2 at 9:38










  • @LeeDavidChungLin: our cdf is $F(x) = (1-e^{-u(x)})^{\alpha}$. I am also interested in an expansion of its survival function $1 - F(x)$, which may be used to obtain the moments.
    – J.H
    Feb 2 at 9:52
















integration sequences-and-series probability-distributions expected-value






edited Feb 2 at 13:25
Lee David Chung Lin

asked Jan 29 at 8:48
J.H






1 Answer
Allow me to reiterate the setting:


  1. $G$ is the baseline CDF of a non-negative random variable.

  2. $R_G \equiv 1 - G$ is the baseline survival function.

  3. Define $u \equiv -\log R_G$, or equivalently $R_G = e^{-u}$.

  4. Consider a random variable $X$ whose CDF is $F = G^{\alpha}$, where $\alpha > 0$ can be non-integer.

  5. When $\alpha \in \mathbb{N}$ is an integer, $X \overset{d}{=} \max\{ W_1,\, W_2,\, \ldots,\, W_{\alpha}\}$; that is, $X$ has the same distribution as the maximum of an iid sample with common CDF $G$. At any rate, $F$ is well-defined for any non-integer $\alpha > 0$.

  6. Taking $G = 1 - R_G = 1 - e^{-u}$ yields $F = (1 - e^{-u})^{\alpha}$, which upon differentiation gives the density $f$ shown in the question post.


The $r$-th moment of $X$, by the basic definition, is

$$E[X^r] = \int_0^{\infty} x^r f(x)\, dx$$



Meanwhile, there is a well-known relation between the expectation and the integral of the survival function (this identity is derived and visualized in other answers on this site and on the statistics site). In the current notation it reads

$$ E[X^r] = r \int_0^{\infty} x^{r-1} R_F(x)\, dx $$

where $R_F \equiv 1 - F$ is the survival function of $X$, so that, invoking item 6 above, we have

$$R_F(x) = 1 - \left(1 - e^{-u(x)}\right)^{\alpha} \tag*{Eq.(1)}$$
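As an aside (my illustration, not part of the original answer), the expectation–survival identity above is easy to verify numerically for an assumed simple case, $X \sim \mathrm{Exp}(1)$, where $R_F(x) = e^{-x}$ and $E[X^r] = \Gamma(r+1)$:

```python
# Verify E[X^r] = r * integral_0^inf x^(r-1) R_F(x) dx for X ~ Exp(1)
# (illustrative addition): here R_F(x) = exp(-x), and the exact moment
# is Gamma(r + 1).
import math

r = 4
n, b = 60000, 60.0                      # trapezoid rule; the tail past 60 is negligible
h = b / n
integral = 0.5 * (0.0 ** (r - 1) + b ** (r - 1) * math.exp(-b))  # endpoint terms
integral += sum((i * h) ** (r - 1) * math.exp(-i * h) for i in range(1, n))
integral *= h
lhs = r * integral
print(lhs, math.gamma(r + 1))  # both are approximately 24
```

The same identity is what lets the proof below work with $R_F$ instead of the density.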



Now apply the series expansion (generalized binomial theorem) to the quantity $1 - e^{-u(x)}$, whose magnitude is smaller than unity, raised to the $\alpha$-th power:



\begin{align}
\left(1 - e^{-u(x)}\right)^{\alpha} &= \sum_{k = 0}^{\infty} \binom{\alpha}{k} \left(- e^{-u(x)} \right)^k \\
&= \sum_{k = 0}^{\infty} (-1)^k \binom{\alpha}{k} e^{-k\,u(x)} \\
&= 1 - \alpha e^{-u(x)} + \frac{ \alpha (\alpha - 1) }{ 2! } e^{-2u(x)} - \frac{ \alpha (\alpha - 1)(\alpha - 2) }{ 3! } e^{-3u(x)} + \cdots \\
&= 1 + \sum_{j = 1}^{\infty~~\text{or}~~\alpha} (-1)^j \frac{ \alpha (\alpha - 1) \cdots (\alpha - j + 1) }{ j! } e^{-j\,u(x)} \tag*{Eq.(2)}
\end{align}

Here $\binom{\alpha}{k} = \alpha(\alpha-1)\cdots(\alpha-k+1)/k!$ is the generalized binomial coefficient; the factorial form $\alpha!/(k!\,(\alpha-k)!)$ makes sense only for integer $\alpha$. The leading $1$ is pulled out in anticipation of the next step, and the shift of the lower summation limit to $j = 1$ is automatic since the $j = 0$ term is exactly $1$.



There are two cases for the upper summation limit:


  • When $\alpha$ is non-integer, the upper limit is $\infty$, as the series never terminates.

  • When $\alpha \in \mathbb{N}$, the series is the ordinary binomial expansion, which terminates at $j = \alpha$ with a total of $\alpha + 1$ terms.


One can succinctly write the summation as $\displaystyle \sum_{j = 1}^{\nu}$, where $\nu \in \{\alpha, \infty\}$, meaning $\nu$ is either $\alpha$ or $\infty$.


Substituting Eq.(2) into Eq.(1), and noting that the subtraction from $1$ flips each sign $(-1)^j$ to $(-1)^{j+1}$, we obtain an expression that goes into the expectation integral:

$$R_F(x) = \sum_{j = 1}^{\nu} (-1)^{j+1} \frac{ \alpha (\alpha - 1) \cdots (\alpha - j + 1) }{ j! } e^{-j\,u(x)} \\
\implies \begin{aligned}[t]
E[X^r] &= r \int_0^{\infty} x^{r-1} \sum_{j = 1}^{\nu} (-1)^{j+1} \frac{ \alpha (\alpha - 1) \cdots (\alpha - j + 1) }{ j! } e^{-j\,u(x)}\, dx \\
&= r \sum_{j = 1}^{\nu} (-1)^{j+1} \frac{ \alpha (\alpha - 1) \cdots (\alpha - j + 1) }{ j! } \int_0^{\infty} x^{r-1} e^{-j\,u(x)}\, dx
\end{aligned}$$

which is the desired expression. $\quad$ Q.E.D.



In case you're wondering, the exchange of the integral and summation is justified: the coefficients $\binom{\alpha}{j}$ are absolutely summable and each integral is finite, so dominated convergence applies. See the relevant chapters of any textbook on real analysis.
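The sign bookkeeping in the expansion (and hence in $c_j$) can be double-checked numerically; this small sketch (my addition) compares the partial sums of the generalized binomial series against the exact value for an assumed $\alpha$ and $q = e^{-u(x)}$:

```python
# Check the generalized binomial expansion (illustrative addition):
#   (1 - q)^alpha = 1 + sum_{j>=1} (-1)^j * alpha*(alpha-1)*...*(alpha-j+1)/j! * q^j
# so R_F = 1 - (1 - q)^alpha carries the opposite sign (-1)^(j+1) on each term.
alpha, q = 2.5, 0.3   # q plays the role of e^{-u(x)}, so 0 < q < 1

partial, term = 1.0, 1.0
for j in range(1, 200):
    term *= (alpha - j + 1) / j          # running product: binom(alpha, j)
    partial += (-1) ** j * term * q ** j

exact = (1.0 - q) ** alpha
print(partial, exact)  # the partial sum matches the exact value
```

Because $|q| < 1$ and $\binom{\alpha}{j}$ decays polynomially, the series converges geometrically, which is why a few hundred terms suffice.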






























  • Thank you for all the detail in your answer, @Lee David Chung Lin. You explained things very nicely.
    – J.H
    Feb 4 at 10:29












edited Feb 2 at 13:26

answered Feb 2 at 12:44
Lee David Chung Lin










