Proving positivity of the exponential function


























Question. Without using the semigroup property ($\mathrm{e}^{x}\mathrm{e}^{y}=\mathrm{e}^{x+y}$),
how can we show that $\mathrm{e}^{x}>0$ for all $x\in\mathbb{R}$ only by using the series expansion?



Explanation.
From the series expansion $\mathrm{e}^{x}=\sum_{k=0}^{\infty}\frac{x^{k}}{k!}$ for $x\in\mathbb{R}$, we see that $\mathrm{e}^{x}>0$ for $x\geq0$.
Thus, if the series becomes negative, this can only happen for negative values of $x$.
So proving that $\mathrm{e}^{-x}>0$ for $x>0$ will complete the proof.
As the series converges uniformly on any compact interval $I\subset\mathbb{R}$, we can rearrange the terms of the series and write
$\mathrm{e}^{-x}=\lim_{n\to\infty}g_{n}(x)$ for $x\geq0$, where $g_{n}(x):=1+\sum_{k=1}^{n}\Big(\frac{x^{2k}}{(2k)!}-\frac{x^{2k-1}}{(2k-1)!}\Big)$ for $x\geq0$ and $n\in\mathbb{N}$.
Obviously, $g_{n}$ is decreasing on $[0,1]$ and $g_{n}(1)>\frac{1}{\mathrm{e}}$.



I need to prove the following.



Claim. There exists an increasing divergent sequence $\{\xi_{n}\}\subset(0,\infty)$ such that $g_{n}$ is decreasing on $[0,\xi_{n}]$ with $g_{n}(\xi_{n})>0$ for $n\in\mathbb{N}$.



Strengthened Claim. One may take $\xi_{n}:=\sum_{k=1}^{n}\frac{1}{k}$ for $n\in\mathbb{N}$.
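Before attempting a proof, the Strengthened Claim can at least be sanity-checked numerically. The sketch below is only a grid-based heuristic, not a proof, and the helper names (`g`, `check_claim`) are our own: it evaluates $g_n$ on a grid over $[0,\xi_n]$, with $\xi_n$ the $n$-th harmonic number, and tests that the sampled values decrease and end positive.

```python
from math import factorial

def g(n, x):
    # g_n(x) = 1 + sum_{k=1}^{n} ( x^{2k}/(2k)! - x^{2k-1}/(2k-1)! )
    return 1.0 + sum(x**(2 * k) / factorial(2 * k) - x**(2 * k - 1) / factorial(2 * k - 1)
                     for k in range(1, n + 1))

def check_claim(n, samples=400):
    # xi_n = H_n, the choice conjectured in the Strengthened Claim
    xi = sum(1.0 / k for k in range(1, n + 1))
    vals = [g(n, xi * i / samples) for i in range(samples + 1)]
    decreasing = all(a >= b for a, b in zip(vals, vals[1:]))
    return decreasing and vals[-1] > 0.0

print(all(check_claim(n) for n in range(1, 9)))  # -> True
```

Of course a finite grid cannot certify monotonicity, but a failure here would immediately falsify the conjectured $\xi_n$.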

































  • Do you have a special reason for not using that property? – PhoemueX, Apr 5 '15 at 6:39










  • @PhoemueX Exactly, the functions I am working with are of this series form and they do not have such nice properties. When I plot their graphs, they seem to be positive everywhere but I could not handle it. This pushed me back to the exponential function. – bkarpuz, Apr 5 '15 at 6:42












  • Then say what functions you are working on! What is the point of asking us to do something else that you don't actually want? – user21820, Apr 5 '15 at 6:44






  • With $p_n(x)=\sum_{k=0}^n \frac{x^k}{k!}$, it is true that $p_n$ is always positive when $n$ is even, while $p_n$ has a unique (negative) root when $n$ is odd (this can be proved by induction). If you can show that that negative root tends to $-\infty$, I think you're done. The identities $p_n(x) = \frac{x^n}{n!}+p_{n-1}(x)$ and $p'_n(x) = p_{n-1}(x)$ are useful here. – Greg Martin, Apr 5 '15 at 19:58






  • Fair point. But this isn't too hard: for a fixed negative $x$, the sequence $\{p_n(x)\colon n\text{ odd}\}$ is increasing for $n>|x|$. So if one of the terms is strictly positive, so is the limit. – Greg Martin, Apr 12 '15 at 19:23
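Greg Martin's observations about the partial sums $p_n$ can be probed numerically as well. The sketch below is a grid-based heuristic (helper names `p` and `sign_changes` are our own): it counts sign changes of $p_n$ on a stretch of the negative axis, and checks that for a fixed negative $x$ the odd-degree partial sums increase once the degree exceeds $|x|$.

```python
from math import factorial

def p(n, x):
    # p_n(x) = sum_{k=0}^{n} x^k / k!, the n-th Taylor partial sum of e^x
    return sum(x**k / factorial(k) for k in range(n + 1))

def sign_changes(n, lo=-40.0, hi=0.0, steps=4000):
    # count sign changes of p_n on a uniform grid over [lo, hi]
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    vals = [p(n, x) for x in xs]
    return sum(1 for a, b in zip(vals, vals[1:]) if (a < 0) != (b < 0))

print([sign_changes(n) for n in (2, 4, 6)])  # even degree: no negative root
print([sign_changes(n) for n in (1, 3, 5)])  # odd degree: exactly one

# for fixed negative x, the odd-indexed partial sums increase once n > |x|
x = -5.0
odd_vals = [p(n, x) for n in (7, 9, 11, 13)]
print(all(a < b for a, b in zip(odd_vals, odd_vals[1:])))  # -> True
```

The monotonicity follows from $p_{n+2}(x)-p_n(x)=\frac{x^{n+1}}{(n+1)!}\big(1+\frac{x}{n+2}\big)$, which is positive for odd $n$ with $n+2>|x|$.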
















calculus real-analysis power-series special-functions exponential-function






asked Apr 5 '15 at 6:37 by bkarpuz
edited Apr 12 '15 at 21:23 by Yiorgos S. Smyrlis












4 Answers































A hyperbolic trigonometry approach. Set
$$
C(x)=\sum_{k=0}^\infty\frac{x^{2k}}{(2k)!}\quad\text{and}\quad S(x)=\sum_{k=1}^\infty\frac{x^{2k-1}}{(2k-1)!}.
$$
Since $\mathrm{e}^{x}=C(x)+S(x)$, with $C$ even and $S$ odd, it suffices to show that $C(x)>S(x)$ for every $x\in\mathbb R$.

First observe that $C'(x)=S(x)$ and $S'(x)=C(x)$. Then observe that
$$
\big(C^2(x)-S^2(x)\big)'=2\big(C(x)C'(x)-S(x)S'(x)\big)=2\big(C(x)S(x)-S(x)C(x)\big)=0,
$$
and hence
$$
C^2(x)-S^2(x)=C^2(0)-S^2(0)=1.
$$
Thus, for every $x\in\mathbb R$,
$$
C(x)=\sqrt{S^2(x)+1}>S(x).
$$






– Yiorgos S. Smyrlis, answered Apr 5 '15 at 23:54
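A quick floating-point check of the key identity $C^2(x)-S^2(x)=1$ used in this answer, with the two series truncated at a length of our choosing:

```python
from math import factorial

def C(x, terms=40):
    # truncation of sum_{k>=0} x^{2k} / (2k)!
    return sum(x**(2 * k) / factorial(2 * k) for k in range(terms))

def S(x, terms=40):
    # truncation of sum_{k>=1} x^{2k-1} / (2k-1)!
    return sum(x**(2 * k - 1) / factorial(2 * k - 1) for k in range(1, terms + 1))

for x in (-3.0, -1.0, 0.0, 0.5, 2.0):
    assert abs(C(x)**2 - S(x)**2 - 1.0) < 1e-9  # C^2 - S^2 = 1
    assert C(x) > S(x)                          # C dominates S pointwise
print("ok")
```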

































Using termwise differentiation one finds that $\exp$ satisfies the linear differential equation $y'=y$, which obviously satisfies the assumptions of the existence and uniqueness theorem. The function $y_0(x)\equiv0$ is a solution, and no other solution can cross the graph of $y_0$. It follows that $x\mapsto e^x$, which is positive when $x=0$, is positive on its full domain $\mathbb{R}$.

– Christian Blatter, answered Apr 5 '15 at 11:15













  • That's perfectly correct. However, I cannot use this for my purpose, as my functions do not satisfy a homogeneous linear equation. But as I have mentioned before, the proof is correct. – bkarpuz, Apr 5 '15 at 11:18










  • In order to prove the uniqueness part of Picard–Lindelöf you NEED the semigroup properties of the exponential! – Yiorgos S. Smyrlis, Apr 7 '15 at 7:31










  • I think that we can use the uniqueness result by Peano, which requires the right-hand side function to be nonincreasing in $y$. As $\mathrm{e}^{x}\geq1$ is obvious for $x\geq0$, we need to show that $y:=\mathrm{e}^{-x}>0$ for $x\geq0$. Note that $y^{\prime}=-y$ (decreasing in $y$) and $y(0)=1$. – bkarpuz, Apr 7 '15 at 18:28


































Assuming uniform convergence of the series, you can show by termwise differentiation that $f(x) = e^x$ satisfies $f'(x) = f(x)$.

Clearly $e^x = \sum_k \frac{x^k}{k!}$ is strictly positive for all positive $x$, so it is an increasing function on $\mathbb{R}^+$. Consider the set $A = \{x < 0 : e^x \leq 0\}$ and assume that it is nonempty.

Let $(x_n)_{n\in\mathbb{N}}$ be a sequence in $A$ that converges to $L$. Then $L \in A$ by continuity of $f$:
$$ f(L) = f(\lim_{n \to \infty} x_n) = \lim_{n \to \infty} f(x_n) \leq 0.$$

Therefore $a := \sup A \in A$, $a < 0$, and $e^a \leq 0$.

If $e^a < 0$, then notice that $e^0 = 1$, and by the intermediate value theorem there exists $a < c < 0$ such that $e^c = 0$, so $c \in A$, which contradicts the maximality of $a$.

If $e^a = 0$, then consider $$C = \{c' \leq a \;\vert\; \forall x \in (c',a],\; f(x) = 0\}.$$ If $\inf C = k > -\infty$, then there exists $\delta > 0$ such that either $f > 0$ on $[k-\delta, k)$ or $f < 0$ on $[k-\delta, k)$.

Since $f$ is equal to its own derivative, it is either positive and increasing or negative and decreasing on $(k-\delta, k)$. In both cases, by the mean value theorem, there exists $\alpha \in (k-\delta/2, k)$ such that
$$ f'(\alpha) = \frac{f(k) - f(k-\delta/2)}{\delta/2} = -\frac{f(k-\delta/2)}{\delta/2}. $$
This is a contradiction, since $f'(\alpha) = f(\alpha)$ and $f(k-\delta/2)$ have the same sign and are both nonzero.

If $\inf C = -\infty$, then $e^x = 0$ for all $x \leq a$, and we must proceed differently.

Consider the function $F : \mathbb{R}^+ \rightarrow \mathbb{R} : x \mapsto F(x) = \int_{a+x}^0 e^t \,dt$.

Clearly $$F(x) = [e^t]_{a+x}^0 = 1 - e^{a+x}.$$

Using the change of variable $u(t) = t - x$ in the integral we get

$$ F(x) = \int_{x+a}^0 e^t \,dt = \int_{a}^{-x} e^t \,dt = [e^t]^{-x}_a = e^{-x} - e^a = e^{-x}.$$

Therefore, for all $x > 0$:

$$ 1 - e^{a + x} = e^{-x}.$$

Since $e^t > 0$ for all $t > a$, the exponential function is increasing on $(a, +\infty)$, so

$$a + x > 0 \Rightarrow e^{a + x} > e^{0} = 1 \iff F(x) = 1 - e^{a+x} < 0.$$

But $a + x > 0 \Rightarrow -x < a$ and $e^{-x} = 0$, so $$F(x) = e^{-x} = 0,$$
which is a contradiction.

We conclude that $A = \{x < 0 : e^x \leq 0\} = \varnothing$ and the exponential function is positive everywhere.








































The series expansion is
$$
e^x=\sum_{n=0}^\infty\frac{x^n}{n!}=1+x+\frac{x^2}{2!}+\frac{x^3}{3!}+\frac{x^4}{4!}+\cdots
$$
For $x\ge0$ we have $1$ and a bunch of nonnegative numbers, so the result is clearly positive.

For $x<0$, notice that
$$
\frac{1}{e^x}=e^{-x},
$$
so positivity of $e^{-x}$ clearly implies that $e^{x}$ is positive.















  • Put $x:=-20$ in your second displayed formula! – Christian Blatter, Apr 5 '15 at 9:41










  • @ChristianBlatter Right, okay, if $x<-1$ then you should group them with $1$ separate, good point. – Alice Ryhl, Apr 5 '15 at 9:45










  • I cannot say that the proof is rigorous. – bkarpuz, Apr 5 '15 at 12:37






  • Note that you have used the semigroup property $\mathrm{e}^{x}\mathrm{e}^{-x}=\mathrm{e}^{0}=1$, which is an infraction of the rule. – bkarpuz, Apr 5 '15 at 13:59








  • @bkarpuz It becomes difficult to rearrange the terms for $x<-1$. It would be easier to prove that $e^xe^{-x}=1$ using the series expansion. – Alice Ryhl, Apr 5 '15 at 14:02











      Your Answer





      StackExchange.ifUsing("editor", function () {
      return StackExchange.using("mathjaxEditing", function () {
      StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
      StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
      });
      });
      }, "mathjax-editing");

      StackExchange.ready(function() {
      var channelOptions = {
      tags: "".split(" "),
      id: "69"
      };
      initTagRenderer("".split(" "), "".split(" "), channelOptions);

      StackExchange.using("externalEditor", function() {
      // Have to fire editor after snippets, if snippets enabled
      if (StackExchange.settings.snippets.snippetsEnabled) {
      StackExchange.using("snippets", function() {
      createEditor();
      });
      }
      else {
      createEditor();
      }
      });

      function createEditor() {
      StackExchange.prepareEditor({
      heartbeatType: 'answer',
      autoActivateHeartbeat: false,
      convertImagesToLinks: true,
      noModals: true,
      showLowRepImageUploadWarning: true,
      reputationToPostImages: 10,
      bindNavPrevention: true,
      postfix: "",
      imageUploader: {
      brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
      contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
      allowUrls: true
      },
      noCode: true, onDemand: true,
      discardSelector: ".discard-answer"
      ,immediatelyShowMarkdownHelp:true
      });


      }
      });














      draft saved

      draft discarded


















      StackExchange.ready(
      function () {
      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f1220729%2fproving-positivity-of-the-exponential-function%23new-answer', 'question_page');
      }
      );

      Post as a guest















      Required, but never shown

























      4 Answers
      4






      active

      oldest

      votes








      4 Answers
      4






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      3












      $begingroup$

      A hyperbolic trigonometry approach. Set
      $$
      C(x)=sum_{k=0}^inftyfrac{x^{2k}}{(2k)!}quadtext{and}quad S(x)=sum_{k=1}^inftyfrac{x^{2k-1}}{(2k-1)!}
      $$
      It suffices to show that $C(x)>S(x)$, for every $xinmathbb R$.



      First observe that: $C'(x)=S(x)$ and $S'(x)=C(x)$. Then observe that
      $$
      big(C^2(x)-S^2(x)big)'=2big(C(x)C'(x)-S(x)S'(x)big)=2big(C(x)S(x)-S(x)C(x)big)=0,
      $$
      and hence
      $$
      C^2(x)-S^2(x)=C^2(0)-S^2(0)=1.
      $$
      Thus, for every $xinmathbb R$,
      $$
      C(x)=sqrt{S^2(x)+1}>S(x).
      $$






      share|cite|improve this answer









      $endgroup$


















        3












        $begingroup$

        A hyperbolic trigonometry approach. Set
        $$
        C(x)=sum_{k=0}^inftyfrac{x^{2k}}{(2k)!}quadtext{and}quad S(x)=sum_{k=1}^inftyfrac{x^{2k-1}}{(2k-1)!}
        $$
        It suffices to show that $C(x)>S(x)$, for every $xinmathbb R$.



        First observe that: $C'(x)=S(x)$ and $S'(x)=C(x)$. Then observe that
        $$
        big(C^2(x)-S^2(x)big)'=2big(C(x)C'(x)-S(x)S'(x)big)=2big(C(x)S(x)-S(x)C(x)big)=0,
        $$
        and hence
        $$
        C^2(x)-S^2(x)=C^2(0)-S^2(0)=1.
        $$
        Thus, for every $xinmathbb R$,
        $$
        C(x)=sqrt{S^2(x)+1}>S(x).
        $$






        share|cite|improve this answer









        $endgroup$
















          3












          3








          3





          $begingroup$

          A hyperbolic trigonometry approach. Set
          $$
          C(x)=sum_{k=0}^inftyfrac{x^{2k}}{(2k)!}quadtext{and}quad S(x)=sum_{k=1}^inftyfrac{x^{2k-1}}{(2k-1)!}
          $$
          It suffices to show that $C(x)>S(x)$, for every $xinmathbb R$.



          First observe that: $C'(x)=S(x)$ and $S'(x)=C(x)$. Then observe that
          $$
          big(C^2(x)-S^2(x)big)'=2big(C(x)C'(x)-S(x)S'(x)big)=2big(C(x)S(x)-S(x)C(x)big)=0,
          $$
          and hence
          $$
          C^2(x)-S^2(x)=C^2(0)-S^2(0)=1.
          $$
          Thus, for every $xinmathbb R$,
          $$
          C(x)=sqrt{S^2(x)+1}>S(x).
          $$






          share|cite|improve this answer









          $endgroup$



          A hyperbolic trigonometry approach. Set
          $$
          C(x)=sum_{k=0}^inftyfrac{x^{2k}}{(2k)!}quadtext{and}quad S(x)=sum_{k=1}^inftyfrac{x^{2k-1}}{(2k-1)!}
          $$
          It suffices to show that $C(x)>S(x)$, for every $xinmathbb R$.



          First observe that: $C'(x)=S(x)$ and $S'(x)=C(x)$. Then observe that
          $$
          big(C^2(x)-S^2(x)big)'=2big(C(x)C'(x)-S(x)S'(x)big)=2big(C(x)S(x)-S(x)C(x)big)=0,
          $$
          and hence
          $$
          C^2(x)-S^2(x)=C^2(0)-S^2(0)=1.
          $$
          Thus, for every $xinmathbb R$,
          $$
          C(x)=sqrt{S^2(x)+1}>S(x).
          $$







          share|cite|improve this answer












          share|cite|improve this answer



          share|cite|improve this answer










          answered Apr 5 '15 at 23:54









          Yiorgos S. SmyrlisYiorgos S. Smyrlis

          63.3k1385163




          63.3k1385163























              1












              $begingroup$

              Using termwise differentiation one finds that $exp$ satisfies the linear differential equation $y'=y$, which obvioulsy satisfies the assumptions of the existence and uniqueness theorem. The function $y_0(x):equiv0$ is a solution, and no other solution can cross the graph of $y_0$. It follows that $xmapsto e^x$, which is positive when $x=0$, is positive on its full domain ${mathbb R}$.






              share|cite|improve this answer









              $endgroup$













              • $begingroup$
                That's perfectly correct. However, I cannot use this for my purpose as my functions do not satisfy a homogeneous linear equation. But as I have mentioned before, the proof is correct.
                $endgroup$
                – bkarpuz
                Apr 5 '15 at 11:18










              • $begingroup$
                In order to prove the Uniqueness part of Picard-Lindelof you NEED the semigroup properties of the exponential!
                $endgroup$
                – Yiorgos S. Smyrlis
                Apr 7 '15 at 7:31










              • $begingroup$
                I think that we can use the Uniqueness result by Peano, which requires the right-hand side function to be nonincreasing in $y$. As $mathrm{e}^{x}geq1$ is obvious for $xgeq0$, we need to show that $y:=mathrm{e}^{-x}>0$ for $xgeq0$. Note that $y^{prime}=-y$ (decreasing in $y$) and $y(0)=1$.
                $endgroup$
                – bkarpuz
                Apr 7 '15 at 18:28


















              1












              $begingroup$

              Using termwise differentiation one finds that $exp$ satisfies the linear differential equation $y'=y$, which obvioulsy satisfies the assumptions of the existence and uniqueness theorem. The function $y_0(x):equiv0$ is a solution, and no other solution can cross the graph of $y_0$. It follows that $xmapsto e^x$, which is positive when $x=0$, is positive on its full domain ${mathbb R}$.






              share|cite|improve this answer









              $endgroup$













              • $begingroup$
                That's perfectly correct. However, I cannot use this for my purpose as my functions do not satisfy a homogeneous linear equation. But as I have mentioned before, the proof is correct.
                $endgroup$
                – bkarpuz
                Apr 5 '15 at 11:18










              • $begingroup$
                In order to prove the Uniqueness part of Picard-Lindelof you NEED the semigroup properties of the exponential!
                $endgroup$
                – Yiorgos S. Smyrlis
                Apr 7 '15 at 7:31










              • $begingroup$
                I think that we can use the Uniqueness result by Peano, which requires the right-hand side function to be nonincreasing in $y$. As $mathrm{e}^{x}geq1$ is obvious for $xgeq0$, we need to show that $y:=mathrm{e}^{-x}>0$ for $xgeq0$. Note that $y^{prime}=-y$ (decreasing in $y$) and $y(0)=1$.
                $endgroup$
                – bkarpuz
                Apr 7 '15 at 18:28
















              1












              1








              1





              $begingroup$

              Using termwise differentiation one finds that $exp$ satisfies the linear differential equation $y'=y$, which obvioulsy satisfies the assumptions of the existence and uniqueness theorem. The function $y_0(x):equiv0$ is a solution, and no other solution can cross the graph of $y_0$. It follows that $xmapsto e^x$, which is positive when $x=0$, is positive on its full domain ${mathbb R}$.






              share|cite|improve this answer









              $endgroup$



              Using termwise differentiation one finds that $exp$ satisfies the linear differential equation $y'=y$, which obvioulsy satisfies the assumptions of the existence and uniqueness theorem. The function $y_0(x):equiv0$ is a solution, and no other solution can cross the graph of $y_0$. It follows that $xmapsto e^x$, which is positive when $x=0$, is positive on its full domain ${mathbb R}$.







              share|cite|improve this answer












              share|cite|improve this answer



              share|cite|improve this answer










              answered Apr 5 '15 at 11:15









              Christian BlatterChristian Blatter

              173k7113326




              173k7113326












              • $begingroup$
                That's perfectly correct. However, I cannot use this for my purpose as my functions do not satisfy a homogeneous linear equation. But as I have mentioned before, the proof is correct.
                $endgroup$
                – bkarpuz
                Apr 5 '15 at 11:18










              • $begingroup$
                In order to prove the Uniqueness part of Picard-Lindelof you NEED the semigroup properties of the exponential!
                $endgroup$
                – Yiorgos S. Smyrlis
                Apr 7 '15 at 7:31










              • $begingroup$
                I think that we can use the Uniqueness result by Peano, which requires the right-hand side function to be nonincreasing in $y$. As $mathrm{e}^{x}geq1$ is obvious for $xgeq0$, we need to show that $y:=mathrm{e}^{-x}>0$ for $xgeq0$. Note that $y^{prime}=-y$ (decreasing in $y$) and $y(0)=1$.
                $endgroup$
                – bkarpuz
                Apr 7 '15 at 18:28

































              1












              $begingroup$

              Assuming uniform convergence of the series, you can show by termwise differentiation that $f(x) = e^x$ satisfies $f'(x) = f(x)$.

              Clearly $e^x = \sum_k \frac{x^k}{k!}$ is strictly positive for all positive $x$, so it is an increasing function on $\mathbb{R}^+$. Consider the set $A = \{x < 0 : e^x \leq 0\}$ and assume that it is nonempty.

              Let $(x_n)_{n\in \mathbb{N}}$ be a sequence in $A$ that converges to $L$. Then $L \in A$ by continuity of $f$:
              $$ f(L) = f(\lim_{n \to \infty} x_n) = \lim_{n \to \infty} f(x_n) \leq 0. $$

              Therefore $a := \sup A \in A$, with $a < 0$ and $e^a \leq 0$.

              If $e^a < 0$, then notice that $e^0 = 1$, so by the intermediate value theorem there exists $a < c < 0$ such that $e^c = 0$; then $c \in A$, which contradicts the maximality of $a$.

              If $e^a = 0$, then consider $$C = \{c' \leq a \;\vert\; \forall x \in (c',a],\;\; f(x) = 0\}.$$ If $\inf C = k > -\infty$, then there exists $\delta > 0$ such that $f > 0$ on $[k-\delta, k)$ or $f < 0$ on $[k-\delta, k)$.

              Since $f$ is equal to its own derivative, it is either positive and increasing or negative and increasing on $(k-\delta, k)$. In both cases, by the mean value theorem there exists $\alpha \in (k-\delta/2, k)$ such that
              $$ f'(\alpha) = \frac{f(k) - f(k-\delta/2)}{\delta/2} = -\frac{f(k-\delta/2)}{\delta/2}, $$
              using $f(k) = 0$. This is a contradiction, since $f'(\alpha) = f(\alpha)$ and $f(k-\delta/2)$ have the same sign and neither is zero.

              If $\inf C = -\infty$, then $e^x = 0$ for all $x \leq a$, and we must proceed differently.

              Consider the function $F : \mathbb{R}^+ \rightarrow \mathbb{R} : x \mapsto F(x) = \int_{a+x}^0 e^t \,dt.$

              Clearly $$F(x) = [e^t]_{a+x}^0 = 1 - e^{a+x}.$$

              Using the change of variable $u(t) = t - x$ in the integral, we get

              $$ F(x) = \int_{x+a}^0 e^t \,dt = \int_{a}^{-x} e^t \,dt = [e^t]^{-x}_a = e^{-x} - e^a = e^{-x}. $$

              Therefore, for all $x > 0$:

              $$ 1 - e^{a + x} = e^{-x}. $$

              Since $e^t > 0$ for all $t > a$, the exponential function is increasing on $(a, +\infty)$, so

              $$ a + x > 0 \Rightarrow e^{a + x} > e^{0} = 1 \iff F(x) = 1 - e^{a+x} < 0. $$

              But $a + x > 0 \Rightarrow -x < a$, and $e^{-x} = 0$, so $$F(x) = e^{-x} = 0,$$
              which is a contradiction.

              We conclude that $A = \{ x < 0 : e^x \leq 0 \} = \varnothing$, and the exponential function is positive everywhere.
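              A numerical sanity check of the first step above (a sketch only; `partial_sum` is a hypothetical helper, and this is not part of the proof): the partial sums $S_N(x)=\sum_{k=0}^{N}x^k/k!$ satisfy $S_N^{\prime}=S_{N-1}$ by termwise differentiation, and for moderate $|x|$ and large $N$ they are strictly positive.

```python
# Sketch: check numerically that S_N' = S_{N-1} (termwise differentiation)
# and that S_N stays positive on [-10, 10] for N = 60. Floating-point
# cancellation makes much larger |x| unreliable, so we stay on a modest range.
from math import factorial

def partial_sum(x, N):
    return sum(x**k / factorial(k) for k in range(N + 1))

N = 60
for x in [i / 10 for i in range(-100, 101)]:
    assert partial_sum(x, N) > 0  # positivity of the truncated series
    # central finite difference of S_N should agree with S_{N-1}
    h = 1e-6
    approx = (partial_sum(x + h, N) - partial_sum(x - h, N)) / (2 * h)
    assert abs(approx - partial_sum(x, N - 1)) < 1e-3 * (1 + abs(approx))
```

              Floating-point cancellation makes this check unreliable for large negative $x$, which is exactly why the series argument for $x<0$ is delicate.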






              share|cite|improve this answer











              $endgroup$


























                  edited Jan 11 at 14:10

























                  answered Jan 10 at 19:48









                  Digitalis
























                      0












                      $begingroup$

                      The series expansion is
                      $$
                      e^x=\sum_{n=0}^{\infty}\frac{x^n}{n!}=1+x+\frac{x^2}{2!}+\frac{x^3}{3!}+\frac{x^4}{4!}+\cdots
                      $$
                      For $x\ge 0$ this is $1$ plus a sum of nonnegative numbers, so the result is clearly positive.

                      For $x<0$ notice that
                      $$
                      \frac{1}{e^x}=e^{-x}.
                      $$
                      Since $-x>0$, positivity of $e^{-x}$ follows from the first case, and hence $e^x=1/e^{-x}$ is positive.






                      share|cite|improve this answer











                      $endgroup$









                      • 1




                        $begingroup$
                        Put $x:=-20$ in your second displayed formula!
                        $endgroup$
                        – Christian Blatter
                        Apr 5 '15 at 9:41










                      • $begingroup$
                        @ChristianBlatter Right, okay, if $x<-1$ then you should group them with the $1$ separate, good point.
                        $endgroup$
                        – Alice Ryhl
                        Apr 5 '15 at 9:45










                      • $begingroup$
                        I cannot say that the proof is rigorous.
                        $endgroup$
                        – bkarpuz
                        Apr 5 '15 at 12:37






                      • 1




                        $begingroup$
                        Note that you have used the semigroup property $\mathrm{e}^{x}\mathrm{e}^{-x}=\mathrm{e}^{0}=1$, which is an infraction of the rule.
                        $endgroup$
                        – bkarpuz
                        Apr 5 '15 at 13:59








                      • 1




                        $begingroup$
                        @bkarpuz It becomes difficult to rearrange the terms for $x<-1$. It would be easier to prove that $e^x e^{-x}=1$ using the series expansion.
                        $endgroup$
                        – Alice Ryhl
                        Apr 5 '15 at 14:02
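                        Following up on the suggestion in the last comment (a sketch; the `cauchy_coeff` helper is hypothetical): the identity $e^{x}e^{-x}=1$ drops out of the Cauchy product of the two series, because the $n$-th coefficient $\sum_{j=0}^{n}\frac{x^{j}(-x)^{n-j}}{j!\,(n-j)!}=\frac{(x-x)^{n}}{n!}$ equals $1$ for $n=0$ and $0$ otherwise.

```python
# Sketch: the n-th Cauchy-product coefficient of e^x and e^{-x} collapses to
# (x - x)^n / n! by the binomial theorem: 1 when n = 0, and 0 for n >= 1.
from math import factorial

def cauchy_coeff(x, n):
    return sum(x**j * (-x)**(n - j) / (factorial(j) * factorial(n - j))
               for j in range(n + 1))

for x in [0.5, 1.0, 3.0]:
    assert abs(cauchy_coeff(x, 0) - 1.0) < 1e-12
    for n in range(1, 15):
        assert abs(cauchy_coeff(x, n)) < 1e-9  # cancels up to rounding error
```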
























                      edited Apr 5 '15 at 13:49

























                      answered Apr 5 '15 at 8:14









                      Alice Ryhl


























