Proofs of the Cauchy-Schwarz Inequality?


























How many proofs of the Cauchy-Schwarz inequality are there? Is there some kind of reference that lists all of these proofs?

































  • I've slightly edited the body of your question in order to make it self-contained. I've also added the real-analysis tag. – t.b. Feb 24 '11 at 12:58












  • I would try the book "The Cauchy-Schwarz Master Class". – user3533 Feb 24 '11 at 13:03










  • @user3533: excellent. I got that book. It seems that this textbook also covers a few other inequalities that I was going to study. Do you know any other references that are basically about different kinds of inequalities and their proofs? Thanks – Vafa Khalighi Feb 24 '11 at 13:30










  • @Vafa: many Olympiad preparation books include sections on inequalities. There are also Kedlaya's notes: artofproblemsolving.com/Resources/Papers/… – Qiaochu Yuan Feb 24 '11 at 13:35










  • Your first question is essentially unanswerable, except, maybe, by "many"... – Mariano Suárez-Álvarez Feb 24 '11 at 22:52
















real-analysis inequality big-list inner-product-space

edited Jul 2 '12 at 9:55
community wiki: 5 revs, 2 users 67% – Vafa Khalighi


6 Answers
































Here's a simple proof:

$|\vec{x}\cdot\vec{y}| \leq \|\vec{x}\|\,\|\vec{y}\|$

Substitute $|\vec{x}\cdot\vec{y}| = \|\vec{x}\|\,\|\vec{y}\|\,|\cos\theta|$:

$\|\vec{x}\|\,\|\vec{y}\|\,|\cos\theta| \leq \|\vec{x}\|\,\|\vec{y}\|$

Divide both sides by $\|\vec{x}\|\,\|\vec{y}\|$:

$|\cos\theta| \leq 1$

"Hey, I was looking for a more serious proof!"

Then here you are!



Here's another simple proof:

This one projects one vector onto another. (The original answer embedded an animated GIF of the projection, not reproduced here.)

You drop the endpoint of one vector along a line perpendicular to the other vector. Then multiply the length of the new (projected) vector by the length of the other vector.

Do you know what that product equals? The dot product of the vectors.

When you project a vector, its norm (length) decreases, or stays the same if one of the vectors is a scalar multiple of the other.

^^ That was the proof. Think about it.

Source: 3Blue1Brown



"Wait, I'm looking for a really serious proof!"

Here you are.

Another proof:

Let $p(t)=\|t\vec{y}-\vec{x}\|^2$.

As a squared norm, it must be greater than or equal to $0$:

$p(t)=\|t\vec{y}-\vec{x}\|^2\geq 0$

$p(t)=(t\vec{y}-\vec{x})\cdot(t\vec{y}-\vec{x})\geq 0$

$p(t)=t^2(\vec{y}\cdot\vec{y})-2t(\vec{x}\cdot\vec{y})+\vec{x}\cdot\vec{x}\geq 0$

Let's substitute some things:

$p(t)=t^2\underbrace{(\vec{y}\cdot\vec{y})}_{a}+t\underbrace{(-2\vec{x}\cdot\vec{y})}_{b}+\underbrace{(\vec{x}\cdot\vec{x})}_{c}\geq 0$

$p(t)=at^2+bt+c\geq 0$

The minimum is attained at $t=\frac{-b}{2a}$. Substituting $t=\frac{-b}{2a}$:

$p\left(\frac{-b}{2a}\right)=a\left(\frac{-b}{2a}\right)^2+b\left(\frac{-b}{2a}\right)+c\geq 0$

$p\left(\frac{-b}{2a}\right)=a\left(\frac{b^2}{4a^2}\right)-\frac{b^2}{2a}+c\geq 0$

$p\left(\frac{-b}{2a}\right)=\frac{b^2}{4a}-\frac{b^2}{2a}+c\geq 0$

Keep only the inequality on the right-hand side:

$\frac{b^2}{4a}-\frac{b^2}{2a}+c\geq 0$

Multiply by $4a$ (note $a=\vec{y}\cdot\vec{y}>0$):

$b^2-2b^2+4ac\geq 0$

$-b^2+4ac\geq 0$

$4ac\geq b^2$

Undoing the substitution:

$4(\vec{y}\cdot\vec{y})(\vec{x}\cdot\vec{x})\geq(-2\vec{x}\cdot\vec{y})^2$

Using the identity $\vec{v}\cdot\vec{v}=\|\vec{v}\|^2$:

$4\|\vec{y}\|^2\|\vec{x}\|^2\geq(-2\vec{x}\cdot\vec{y})^2$

Using the identity $(f(x))^2=(|f(x)|)^2$ for real $f(x)$:

$4\|\vec{y}\|^2\|\vec{x}\|^2\geq|-2\vec{x}\cdot\vec{y}|^2$

As both sides are nonnegative, we can take square roots:

$2\|\vec{y}\|\,\|\vec{x}\|\geq|-2\vec{x}\cdot\vec{y}|=2|\vec{x}\cdot\vec{y}|$

$\|\vec{y}\|\,\|\vec{x}\|\geq|\vec{x}\cdot\vec{y}|$

This one was from Khan Academy.
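As a quick numerical sanity check of the final inequality (an illustration, not part of the original answer), one can sample random vectors and confirm it directly, using only the standard library:

```python
import random

def dot(u, v):
    # Dot product of two real vectors
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    # Euclidean norm ||u|| = sqrt(u . u)
    return dot(u, u) ** 0.5

random.seed(0)
for _ in range(1000):
    x = [random.uniform(-10, 10) for _ in range(5)]
    y = [random.uniform(-10, 10) for _ in range(5)]
    # |x . y| <= ||x|| ||y||, with a tiny tolerance for float rounding
    assert abs(dot(x, y)) <= norm(x) * norm(y) + 1e-9

print("Cauchy-Schwarz held on all samples")
```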


















    Here is one:

    Claim: $|\langle x,y\rangle| \leq \|x\|\,\|y\|$

    Proof: If one of the two vectors is zero then both sides are zero, so we may assume that both $x,y$ are non-zero. Let $t \in \mathbb{C}$. Then

    $$\begin{align}
    0 \leq \|x + ty\|^2 &= \langle x + ty, x + ty\rangle \\
    &= \langle x,x\rangle + \langle x,ty\rangle + \langle ty, x\rangle + \langle ty,ty\rangle \\
    &= \langle x,x\rangle + \bar{t}\langle x,y\rangle + t\,\overline{\langle x,y\rangle} + |t|^2\langle y,y\rangle \\
    &= \langle x,x\rangle + 2\,\mathrm{Re}\big(t\,\overline{\langle x,y\rangle}\big) + |t|^2\langle y,y\rangle
    \end{align}$$

    Now choose $t := -\frac{\langle x, y\rangle}{\langle y, y\rangle}$. Then we get
    $$ 0 \leq \langle x,x\rangle + 2\,\mathrm{Re}\left(-\frac{|\langle x,y\rangle|^2}{\langle y, y\rangle}\right) + \frac{|\langle x,y\rangle|^2}{\langle y, y\rangle} = \langle x, x\rangle - \frac{|\langle x,y\rangle|^2}{\langle y, y\rangle}$$

    And hence $|\langle x,y\rangle| \leq \|x\|\,\|y\|$.

    Note that if $y = \lambda x$ for $\lambda \in \mathbb{C}$ then equality holds:
    $$ |\lambda|^2 |\langle x, x\rangle| = |\lambda|^2 \|x\|\,\|x\| $$
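A small numerical illustration of this argument (made-up complex vectors, standard library only): with the minimizing choice $t=-\langle x,y\rangle/\langle y,y\rangle$, the quantity $\|x+ty\|^2$ equals $\langle x,x\rangle - |\langle x,y\rangle|^2/\langle y,y\rangle$ and is nonnegative.

```python
# Inner product <u, v> = sum(u_i * conj(v_i)): linear in the first slot,
# conjugate-linear in the second, matching the convention in the proof.
def inner(u, v):
    return sum(a * b.conjugate() for a, b in zip(u, v))

x = [1 + 2j, 3 - 1j, 0.5j]
y = [2 - 1j, 1j, 4.0 + 0j]

t = -inner(x, y) / inner(y, y)          # the minimizing choice of t
z = [a + t * b for a, b in zip(x, y)]   # z = x + t*y

lhs = inner(z, z).real                  # ||x + ty||^2
rhs = (inner(x, x) - abs(inner(x, y)) ** 2 / inner(y, y)).real

assert abs(lhs - rhs) < 1e-9            # the two expressions agree
assert lhs >= 0                         # hence |<x,y>|^2 <= <x,x><y,y>
assert abs(inner(x, y)) <= (inner(x, x).real ** 0.5) * (inner(y, y).real ** 0.5)
```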



















    • I think that a proof of the Cauchy-Schwarz inequality should also include a discussion of the equality case (which is also straightforward from this argument). – t.b. Jul 2 '12 at 9:55
    • @t.b. Like this? – Rudy the Reindeer Jul 2 '12 at 10:22
    • This is half of what I had in mind. More interesting is the fact that if equality $\lvert\langle x,y\rangle\rvert = \lVert x\rVert\,\lVert y\rVert$ holds then $y = \lambda x$ or $x = 0$. – t.b. Jul 2 '12 at 10:26
    • In other words: equality holds if and only if $x$ and $y$ are linearly dependent. – t.b. Jul 2 '12 at 10:33
    • It doesn't help to use the angle, since that is defined in terms of the inner product. It's just that $\|x+ty\|^2=0$ iff $x+ty=0$ iff $x=-ty$ (with your choice of $t$). – wildildildlife Jul 2 '12 at 11:10































    Here is a nice simple proof. Fix non-zero $X,Y\in \mathbb{R}^n$; we wish to show
    $$
    X\cdot Y \leq |X|\,|Y|
    $$
    The trick is to construct a suitable vector $Z\in \mathbb{R}^n$ and then use the property of the dot product $Z\cdot Z \geq 0$. Take
    $$
    Z = \frac{X}{|X|}-\frac{Y}{|Y|}
    $$
    then we compute $Z\cdot Z$:
    \begin{align}
    Z\cdot Z &= \frac{X\cdot X}{|X|^2}-2\frac{X\cdot Y}{|X|\,|Y|}+\frac{Y\cdot Y}{|Y|^2}\\
    &= 2 - 2\frac{X\cdot Y}{|X|\,|Y|}
    \end{align}
    then we use $Z\cdot Z \geq 0$ to write
    \begin{align}
    2-2\frac{X\cdot Y}{|X|\,|Y|}&\geq 0\\
    2&\geq 2\frac{X\cdot Y}{|X|\,|Y|}\\
    |X|\,|Y|&\geq X\cdot Y
    \end{align}
    and we are done.
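A quick check of the computation above on illustrative vectors (not part of the original answer): $Z\cdot Z$ really does come out as $2 - 2\,\frac{X\cdot Y}{|X|\,|Y|}$, and it is nonnegative.

```python
import math

def dot(u, v):
    # Dot product of two real vectors
    return sum(a * b for a, b in zip(u, v))

X = [1.0, 2.0, 2.0]          # |X| = 3
Y = [3.0, 0.0, 4.0]          # |Y| = 5
nX, nY = math.sqrt(dot(X, X)), math.sqrt(dot(Y, Y))

Z = [x / nX - y / nY for x, y in zip(X, Y)]

# Z.Z equals 2 - 2 (X.Y)/(|X||Y|), and is >= 0
assert abs(dot(Z, Z) - (2 - 2 * dot(X, Y) / (nX * nY))) < 1e-12
assert dot(Z, Z) >= 0
assert dot(X, Y) <= nX * nY   # the inequality itself
```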


















      Without loss of generality, assume $\|y\|=1$. Write $x=\langle x,y\rangle y+z$. Then $z$ is orthogonal to $y$, because
      $$\langle x,y\rangle=\langle \langle x,y\rangle y+z,\, y\rangle=\langle x,y\rangle\langle y,y\rangle+\langle z,y\rangle$$
      indeed yields $\langle z,y\rangle=0$. Hence
      $$\|x\|^2=\langle x,x\rangle=|\langle x,y\rangle|^2+\langle z,z\rangle\geq |\langle x,y\rangle|^2,$$
      with equality iff $z=0$, i.e. $x\in\mathbb{F}y$.
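The orthogonal decomposition used here is easy to verify numerically (illustrative vectors, standard library only): after normalizing $y$, the residual $z = x - \langle x,y\rangle y$ is orthogonal to $y$, and $\|x\|^2$ splits as claimed.

```python
import math

def dot(u, v):
    # Dot product of two real vectors
    return sum(a * b for a, b in zip(u, v))

x = [2.0, 1.0, -1.0]
y_raw = [1.0, 1.0, 0.0]
ny = math.sqrt(dot(y_raw, y_raw))
y = [c / ny for c in y_raw]                 # normalize so ||y|| = 1

c = dot(x, y)                               # <x, y>
z = [a - c * b for a, b in zip(x, y)]       # z = x - <x,y> y

assert abs(dot(z, y)) < 1e-12               # z is orthogonal to y
# ||x||^2 = <x,y>^2 + ||z||^2 >= <x,y>^2
assert abs(dot(x, x) - (c * c + dot(z, z))) < 1e-12
assert abs(c) <= math.sqrt(dot(x, x))
```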


















        I like this proof for real vectors a lot. Recall that an inner product for real vectors has the following properties:

        $\langle x,y\rangle=\langle y,x\rangle$

        $\langle ax+y,z\rangle=a\langle x,z\rangle+\langle y,z\rangle$

        $\langle x,x\rangle\geq 0$

        Then
        $0\leq\langle lx+y,\,lx+y\rangle=l^2\langle x,x\rangle+l\langle x,y\rangle+l\langle y,x\rangle+\langle y,y\rangle=l^2\langle x,x\rangle+2l\langle x,y\rangle+\langle y,y\rangle$

        Let $a=\langle x,x\rangle$, $b=2\langle x,y\rangle$, $c=\langle y,y\rangle$; then the inequality becomes

        $al^2+bl+c\geq 0$

        This quadratic in $l$ is nonnegative for every real $l$, so it has at most one real root. Therefore

        $b^2-4ac\leq 0$

        $\implies 4\langle x,y\rangle^2-4\langle x,x\rangle\langle y,y\rangle\leq 0$

        $\implies \langle x,y\rangle^2\leq\langle x,x\rangle\langle y,y\rangle$

        Not bad huh? Sadly it doesn't work out so nicely with complex vectors $:($
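The discriminant condition is easy to test numerically on sample vectors (an illustration, not part of the original answer): with $a=\langle x,x\rangle$, $b=2\langle x,y\rangle$, $c=\langle y,y\rangle$, we should have $b^2-4ac\leq 0$.

```python
def dot(u, v):
    # Real inner product <u, v>
    return sum(p * q for p, q in zip(u, v))

x = [1.0, -2.0, 3.0]
y = [4.0, 0.0, -1.0]

a, b, c = dot(x, x), 2 * dot(x, y), dot(y, y)

assert b * b - 4 * a * c <= 0
# Equivalent statement: <x,y>^2 <= <x,x><y,y>
assert dot(x, y) ** 2 <= dot(x, x) * dot(y, y)
```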


















          Here is the proof from "Introductory Real Analysis", Kolmogorov & Fomin, Silverman translation. Assume all sums are from $1$ to $n$.

          Lemma:
          $$
          \Big(\sum_i x_i y_i\Big)^2 = \Big(\sum_i x_i^2\Big)\Big(\sum_i y_i^2\Big)
          - \frac{1}{2}\sum_i\sum_j (x_iy_j - x_jy_i)^2
          $$

          Proof of Cauchy-Schwarz: The third term in the Lemma is always non-positive, so clearly $\big(\sum_i x_i y_i\big)^2 \leq \big(\sum_i x_i^2\big)\big(\sum_i y_i^2\big)$.

          Proof of Lemma: The left-hand side (LHS) and the right-hand side (RHS) should be shown to be equal. For the LHS write
          $$ \text{LHS} = \Big(\sum_i x_i y_i\Big)^2 = \Big(\sum_i x_i y_i\Big)\Big(\sum_j x_j y_j\Big) = \sum_i\sum_j x_iy_ix_jy_j. $$
          For the RHS write
          $$
          \text{RHS}=
          \frac{1}{2}\Big(\sum_i x_i^2\Big)\Big(\sum_j y_j^2\Big)
          +\frac{1}{2}\Big(\sum_j x_j^2\Big)\Big(\sum_i y_i^2\Big)
          - \frac{1}{2}\sum_i\sum_j (x_iy_j - x_jy_i)^2
          \\ =
          \frac{1}{2}\sum_i\sum_j\left(
          x_i^2 y_j^2 + x_j^2y_i^2 - x_i^2 y_j^2 - x_j^2y_i^2 + 2 x_i y_i x_j y_j
          \right)
          =
          \sum_i\sum_j x_iy_ix_jy_j .
          $$

          This shows that LHS $=$ RHS and finishes the proof.
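The Lagrange identity in the Lemma can be verified numerically on illustrative vectors (not part of the original answer):

```python
# Verify: (sum x_i y_i)^2 = (sum x_i^2)(sum y_i^2) - (1/2) sum_ij (x_i y_j - x_j y_i)^2
x = [1.0, 2.0, -1.0, 3.0]
y = [0.5, -1.0, 2.0, 1.0]
n = len(x)

lhs = sum(a * b for a, b in zip(x, y)) ** 2
rhs = sum(a * a for a in x) * sum(b * b for b in y) \
      - 0.5 * sum((x[i] * y[j] - x[j] * y[i]) ** 2
                  for i in range(n) for j in range(n))

assert abs(lhs - rhs) < 1e-9
```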



















            Your Answer





            StackExchange.ifUsing("editor", function () {
            return StackExchange.using("mathjaxEditing", function () {
            StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
            StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
            });
            });
            }, "mathjax-editing");

            StackExchange.ready(function() {
            var channelOptions = {
            tags: "".split(" "),
            id: "69"
            };
            initTagRenderer("".split(" "), "".split(" "), channelOptions);

            StackExchange.using("externalEditor", function() {
            // Have to fire editor after snippets, if snippets enabled
            if (StackExchange.settings.snippets.snippetsEnabled) {
            StackExchange.using("snippets", function() {
            createEditor();
            });
            }
            else {
            createEditor();
            }
            });

            function createEditor() {
            StackExchange.prepareEditor({
            heartbeatType: 'answer',
            autoActivateHeartbeat: false,
            convertImagesToLinks: true,
            noModals: true,
            showLowRepImageUploadWarning: true,
            reputationToPostImages: 10,
            bindNavPrevention: true,
            postfix: "",
            imageUploader: {
            brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
            contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
            allowUrls: true
            },
            noCode: true, onDemand: true,
            discardSelector: ".discard-answer"
            ,immediatelyShowMarkdownHelp:true
            });


            }
            });














            draft saved

            draft discarded


















            StackExchange.ready(
            function () {
            StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f23522%2fproofs-of-the-cauchy-schwarz-inequality%23new-answer', 'question_page');
            }
            );

            Post as a guest















            Required, but never shown

























            6 Answers
            6






            active

            oldest

            votes








            6 Answers
            6






            active

            oldest

            votes









            active

            oldest

            votes






            active

            oldest

            votes









            8












            $begingroup$

            $newcommand{bbx}[1]{,bbox[15px,border:1px groove navy]{displaystyle{#1}},}newcommand{i}{mathrm{i}}newcommand{text}[1]{mathrm{#1}}newcommand{root}[2]{^{#2}sqrt[#1]} newcommand{derivative}[3]{frac{mathrm{d}^{#1} #2}{mathrm{d} #3^{#1}}} newcommand{abs}[1]{leftvert,{#1},rightvert}newcommand{x}[0]{times}newcommand{summ}[3]{sum^{#2}_{#1}#3}newcommand{s}[0]{space}newcommand{i}[0]{mathrm{i}}newcommand{kume}[1]{mathbb{#1}}newcommand{bold}[1]{textbf{#1}}newcommand{italic}[1]{textit{#1}}newcommand{kumedigerBETA}[1]{rm #1!#1}$



            Here's a simple proof:



            $|vec{x}cdotvec{y}| leq |vec{x}||vec{y}| $



            Substitute $|vec{x}cdotvec{y}| = |vec{x}||vec{y}|cos theta$



            $|vec{x}||vec{y}|cos theta leq |vec{x}||vec{y}| $



            Divide both sides by $|vec{x}||vec{y}|$



            $cos theta leq 1$



            -Hey, I was looking for a "more serious" proof!



            Then here you are!



            Here's another simple proof:



            This is projecting a vector to another one (Click the gif if it doesn't load):





            You drag its end in a line that is perpendicular to the other vector. Then multiply the length of the new vector with the old vector.



            Do you know what the multiplication is equal to? The dot product of the vectors





            When you project that vector, its norm (length) becomes lower - or stays the same if one of them is a scalar multiple of the other one.



            ^^ That was the proof. Think about it.



            Source: $3$Blue$1$Brown



            Wait, I look for a "really serious" proof!



            Here you are.



            Another proof:



            Let $p(t)=||tvec{y}-vec{x}||^2$



            As there's an absolute value, it must be equal to or bigger than $0$.



            $p(t)=||tvec{y}-vec{x}||^2geq 0$



            $p(t)=(tvec{y}-vec{x})(tvec{y}-vec{x})geq 0$



            $p(t)=t^2(vec{y}cdot vec{y})-2t(vec{x}cdotvec{y})+vec{x}cdot vec{x}geq0$



            Let's substitute some things.



            $p(t)=t^2underbrace{(vec{y}cdot vec{y})}_color{blue}{large a}+tunderbrace{(-2vec{x}cdotvec{y})}_color{red}{large b}+underbrace{(vec{x}cdot vec{x})}_color{green}{large c}geq0$



            $p(t)=color{blue}{a}t^2+color{red}{b}t+color{green}{c}geq0$



            Its minimum value must be $large frac{-color{red}{b}}{2color{blue}{a}}$



            Substituting $large t= frac{-color{red}{b}}{2color{blue}{a}}$



            $p(frac{-color{red}{b}}{2color{blue}{a}})=color{blue}{a}(frac{-color{red}{b}}{2color{blue}{a}})^2+color{red}{b}(frac{-color{red}{b}}{2color{blue}{a}})+color{green}{c}geq0$



            $p(frac{-color{red}{b}}{2color{blue}{a}})=color{blue}{a}(frac{color{red}{b}^2}{4color{blue}{a}^2})+color{red}{b}(frac{-color{red}{b}}{2color{blue}{a}})+color{green}{c}geq0$



            $p(frac{-color{red}{b}}{2color{blue}{a}})=frac{color{red}{b}^2}{4color{blue}{a}}+frac{-color{red}{b}^2}{2color{blue}{a}}+color{green}{c}geq0$



            Forget the $large p(t)$ function side (LHS)



            $frac{color{red}{b}^2}{4color{blue}{a}}+frac{-color{red}{b}^2}{2color{blue}{a}}+color{green}{c}geq0$



            Multiply by $large 4color{blue}{a}$



            $color{red}{b}^2-2color{red}{b}^2+4color{blue}{a}color{green}{c}geq0$



            $-color{red}{b}^2+4color{blue}{a}color{green}{c}geq0$



            $4color{blue}{a}color{green}{c}geq color{red}{b}^2$



            De-substitute



            $p(t)=t^2underbrace{(vec{y}cdot vec{y})}_color{blue}{large a}+tunderbrace{(-2vec{x}cdotvec{y})}_color{red}{large b}+underbrace{(vec{x}cdot vec{x})}_color{green}{large c}geq0$



            $4color{blue}{(vec{y}cdot vec{y})}color{green}{(vec{x}cdot vec{x})}geq color{red}{(-2vec{x}cdotvec{y})}^2$



            Using the identity $large vec{v}cdotvec{v}=||vec{v}||^2$



            $4color{blue}{||vec{y}||^2}color{green}{||vec{x}||^2}geq color{red}{(-2vec{x}cdotvec{y})}^2$



            Using the identity $(f(x))^2=(|f(x)|)^2$ (where $f(x)inkume{R}$)



            $4color{blue}{||vec{y}||^2}color{green}{||vec{x}||^2}geq color{red}{(|-2vec{x}cdotvec{y}|)}^2$



            As the both sides are not negative, you can square root both sides.



            $2color{blue}{||vec{y}||}color{green}{||vec{x}||}geq color{red}{|-2vec{x}cdotvec{y}|}$



            $2color{blue}{||vec{y}||}color{green}{||vec{x}||}geq color{red}{2|vec{x}cdotvec{y}|}$



            $largecolor{blue}{||vec{y}||}color{green}{||vec{x}||}geq color{red}{|vec{x}cdotvec{y}|}$



            This one was from KhanAcademy






            share|cite|improve this answer











            $endgroup$


















              8












              $begingroup$

              $newcommand{bbx}[1]{,bbox[15px,border:1px groove navy]{displaystyle{#1}},}newcommand{i}{mathrm{i}}newcommand{text}[1]{mathrm{#1}}newcommand{root}[2]{^{#2}sqrt[#1]} newcommand{derivative}[3]{frac{mathrm{d}^{#1} #2}{mathrm{d} #3^{#1}}} newcommand{abs}[1]{leftvert,{#1},rightvert}newcommand{x}[0]{times}newcommand{summ}[3]{sum^{#2}_{#1}#3}newcommand{s}[0]{space}newcommand{i}[0]{mathrm{i}}newcommand{kume}[1]{mathbb{#1}}newcommand{bold}[1]{textbf{#1}}newcommand{italic}[1]{textit{#1}}newcommand{kumedigerBETA}[1]{rm #1!#1}$



              Here's a simple proof:



              $|vec{x}cdotvec{y}| leq |vec{x}||vec{y}| $



              Substitute $|vec{x}cdotvec{y}| = |vec{x}||vec{y}|cos theta$



              $|vec{x}||vec{y}|cos theta leq |vec{x}||vec{y}| $



              Divide both sides by $|vec{x}||vec{y}|$



              $cos theta leq 1$



              -Hey, I was looking for a "more serious" proof!



              Then here you are!



              Here's another simple proof:



              This is projecting a vector to another one (Click the gif if it doesn't load):





              You drag its end in a line that is perpendicular to the other vector. Then multiply the length of the new vector with the old vector.



              Do you know what the multiplication is equal to? The dot product of the vectors





              When you project that vector, its norm (length) becomes lower - or stays the same if one of them is a scalar multiple of the other one.



              ^^ That was the proof. Think about it.



              Source: $3$Blue$1$Brown



              Wait, I look for a "really serious" proof!



              Here you are.



              Another proof:



              Let $p(t)=||tvec{y}-vec{x}||^2$



              As there's an absolute value, it must be equal to or bigger than $0$.



              $p(t)=||tvec{y}-vec{x}||^2geq 0$



              $p(t)=(tvec{y}-vec{x})(tvec{y}-vec{x})geq 0$



              $p(t)=t^2(vec{y}cdot vec{y})-2t(vec{x}cdotvec{y})+vec{x}cdot vec{x}geq0$



              Let's substitute some things.



              $p(t)=t^2underbrace{(vec{y}cdot vec{y})}_color{blue}{large a}+tunderbrace{(-2vec{x}cdotvec{y})}_color{red}{large b}+underbrace{(vec{x}cdot vec{x})}_color{green}{large c}geq0$



              $p(t)=color{blue}{a}t^2+color{red}{b}t+color{green}{c}geq0$



              Its minimum value must be $large frac{-color{red}{b}}{2color{blue}{a}}$



              Substituting $large t= frac{-color{red}{b}}{2color{blue}{a}}$



              $p(frac{-color{red}{b}}{2color{blue}{a}})=color{blue}{a}(frac{-color{red}{b}}{2color{blue}{a}})^2+color{red}{b}(frac{-color{red}{b}}{2color{blue}{a}})+color{green}{c}geq0$



              $p(frac{-color{red}{b}}{2color{blue}{a}})=color{blue}{a}(frac{color{red}{b}^2}{4color{blue}{a}^2})+color{red}{b}(frac{-color{red}{b}}{2color{blue}{a}})+color{green}{c}geq0$



              $p(frac{-color{red}{b}}{2color{blue}{a}})=frac{color{red}{b}^2}{4color{blue}{a}}+frac{-color{red}{b}^2}{2color{blue}{a}}+color{green}{c}geq0$



              Forget the $large p(t)$ function side (LHS)



              $frac{color{red}{b}^2}{4color{blue}{a}}+frac{-color{red}{b}^2}{2color{blue}{a}}+color{green}{c}geq0$



              Multiply by $large 4color{blue}{a}$



              $color{red}{b}^2-2color{red}{b}^2+4color{blue}{a}color{green}{c}geq0$



              $-color{red}{b}^2+4color{blue}{a}color{green}{c}geq0$



              $4color{blue}{a}color{green}{c}geq color{red}{b}^2$



              De-substitute



              $p(t)=t^2underbrace{(vec{y}cdot vec{y})}_color{blue}{large a}+tunderbrace{(-2vec{x}cdotvec{y})}_color{red}{large b}+underbrace{(vec{x}cdot vec{x})}_color{green}{large c}geq0$



              $4color{blue}{(vec{y}cdot vec{y})}color{green}{(vec{x}cdot vec{x})}geq color{red}{(-2vec{x}cdotvec{y})}^2$



              Using the identity $large vec{v}cdotvec{v}=||vec{v}||^2$



              $4color{blue}{||vec{y}||^2}color{green}{||vec{x}||^2}geq color{red}{(-2vec{x}cdotvec{y})}^2$



              Using the identity $(f(x))^2=(|f(x)|)^2$ (where $f(x)inkume{R}$)



              $4color{blue}{||vec{y}||^2}color{green}{||vec{x}||^2}geq color{red}{(|-2vec{x}cdotvec{y}|)}^2$



              As the both sides are not negative, you can square root both sides.



              $2color{blue}{||vec{y}||}color{green}{||vec{x}||}geq color{red}{|-2vec{x}cdotvec{y}|}$



              $2color{blue}{||vec{y}||}color{green}{||vec{x}||}geq color{red}{2|vec{x}cdotvec{y}|}$



              $largecolor{blue}{||vec{y}||}color{green}{||vec{x}||}geq color{red}{|vec{x}cdotvec{y}|}$



              This one was from KhanAcademy






              share|cite|improve this answer











              $endgroup$
















                8












                8








                8






                edited Sep 4 '17 at 16:31 · community wiki · 2 revs · MCCCS

























                    Here is one:

                    Claim: $|\langle x,y \rangle| \leq \|x\|\,\|y\|$

                    Proof: If one of the two vectors is zero then both sides are zero, so we may assume that both $x,y$ are non-zero. Let $t \in \mathbb C$. Then

                    $$ \begin{align}
                    0 \leq \|x + ty \|^2 &= \langle x + ty, x + ty\rangle \\
                    &= \langle x,x\rangle + \langle x,t y\rangle + \langle ty, x\rangle + \langle ty,ty\rangle \\
                    &= \langle x,x\rangle + \bar{t} \langle x,y\rangle + t \overline{\langle x,y\rangle} + |t|^2 \langle y,y\rangle \\
                    &= \langle x,x\rangle + 2 \operatorname{Re}\bigl(t \overline{\langle x,y\rangle}\bigr) + |t|^2 \langle y,y\rangle
                    \end{align}$$

                    Now choose $t := -\frac{\langle x, y \rangle}{\langle y, y \rangle}$. Then we get
                    $$ 0 \leq \langle x,x\rangle - 2 \frac{|\langle x,y\rangle|^2}{\langle y, y \rangle} + \frac{|\langle x,y\rangle|^2}{\langle y, y \rangle} = \langle x, x \rangle - \frac{|\langle x,y\rangle|^2}{\langle y, y \rangle}$$

                    and hence $|\langle x,y \rangle| \leq \|x\|\,\|y\|$.

                    Note that if $y = \lambda x$ for $\lambda \in \mathbb C$ then equality holds:
                    $$ |\langle x, \lambda x \rangle| = |\lambda|\,\|x\|^2 = \|x\|\,\|\lambda x\| $$
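The choice of $t$ can be sanity-checked numerically (a sketch, not a proof; it assumes an inner product conjugate-linear in the second slot, matching the expansion above):

```python
def inner(u, v):
    # Hermitian inner product, conjugate-linear in the second slot,
    # matching <x, ty> = conj(t) <x, y> in the expansion above
    return sum(a * b.conjugate() for a, b in zip(u, v))

x = [1 + 2j, 3 - 1j]
y = [2 - 1j, 0 + 1j]
t = -inner(x, y) / inner(y, y)

# ||x + t y||^2 should equal <x,x> - |<x,y>|^2 / <y,y>, and be >= 0
z = [a + t * b for a, b in zip(x, y)]
lhs = inner(z, z).real
rhs = (inner(x, x) - abs(inner(x, y)) ** 2 / inner(y, y)).real
assert abs(lhs - rhs) < 1e-9 and lhs >= -1e-9
```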







                    • I think that a proof of the Cauchy-Schwarz inequality should also include a discussion of the equality case (which is also straightforward from this argument).
                      – t.b.
                      Jul 2 '12 at 9:55

                    • @t.b. Like this?
                      – Rudy the Reindeer
                      Jul 2 '12 at 10:22

                    • This is half of what I had in mind. More interesting is the fact that if equality $\lvert\langle x,y\rangle\rvert = \lVert x\rVert \lVert y \rVert$ holds then $y = \lambda x$ or $x = 0$.
                      – t.b.
                      Jul 2 '12 at 10:26

                    • In other words: equality holds if and only if $x$ and $y$ are linearly dependent.
                      – t.b.
                      Jul 2 '12 at 10:33

                    • It doesn't help to use the angle, since that is defined in terms of the inner product. It's just that $\|x+ty\|^2=0$ iff $x+ty=0$ iff $x=-ty$ (with your choice of $t$).
                      – wildildildlife
                      Jul 2 '12 at 11:10

















                    edited Jul 2 '12 at 10:22 · community wiki · 2 revs · Matt N.














                    Here is a nice simple proof. Fix non-zero $X,Y\in \mathbb{R}^n$; we wish to show
                    $$
                    X\cdot Y \leq |X|\,|Y|.
                    $$
                    The trick is to construct a suitable vector $Z\in \mathbb{R}^n$ and then use the property of the dot product $Z\cdot Z \geq 0$. Take
                    $$
                    Z = \frac{X}{|X|}-\frac{Y}{|Y|}
                    $$
                    then we compute $Z\cdot Z$:
                    \begin{align}
                    Z\cdot Z &= \frac{X\cdot X}{|X|^2}-2\frac{X\cdot Y}{|X|\,|Y|}+\frac{Y\cdot Y}{|Y|^2}\\
                    &=2 - 2\frac{X\cdot Y}{|X|\,|Y|}
                    \end{align}
                    then we use $Z\cdot Z \geq 0$ to write
                    \begin{align}
                    2-2\frac{X\cdot Y}{|X|\,|Y|}&\geq 0\\
                    2&\geq 2\frac{X\cdot Y}{|X|\,|Y|}\\
                    |X|\,|Y|&\geq X\cdot Y
                    \end{align}
                    and we are done. (If either vector is zero both sides vanish, and applying the same argument to $-X$ upgrades the conclusion to $|X\cdot Y| \leq |X|\,|Y|$.)
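As a numerical sanity check (a sketch only, not part of the argument), the identity $Z\cdot Z = 2 - 2\frac{X\cdot Y}{|X||Y|}$ can be verified on random vectors:

```python
import math
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

random.seed(1)
X = [random.uniform(-5, 5) for _ in range(4)]
Y = [random.uniform(-5, 5) for _ in range(4)]

# Z = X/|X| - Y/|Y|, as in the proof above
Z = [a / norm(X) - b / norm(Y) for a, b in zip(X, Y)]

# Z.Z collapses to 2 - 2 (X.Y)/(|X||Y|) and is non-negative
assert abs(dot(Z, Z) - (2 - 2 * dot(X, Y) / (norm(X) * norm(Y)))) < 1e-9
assert dot(Z, Z) >= 0
```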







                        answered May 8 '18 at 2:52 · community wiki · Eli Fonseca

























                            Without loss of generality, assume $\|y\|=1$. Write $x=\langle x,y\rangle y+z$. Then $z$ is orthogonal to $y$, because
                            $$\langle x,y\rangle=\langle \langle x,y\rangle y+z,\,y\rangle=\langle x,y\rangle\langle y,y\rangle+\langle z,y\rangle$$
                            indeed yields $\langle z,y\rangle=0$. Hence
                            $$\|x\|^2=\langle x,x\rangle=|\langle x,y\rangle|^2+\langle z,z\rangle\geq |\langle x,y\rangle|^2,$$
                            with equality iff $z= 0$, i.e. $x\in\mathbb{F}y$.
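The decomposition is easy to check numerically in the real case (a sketch with vectors I made up; not part of the proof):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Real-vector sketch of the decomposition x = <x,y> y + z with ||y|| = 1
x = [3.0, 1.0, -2.0]
y = [0.6, 0.8, 0.0]              # unit vector
c = dot(x, y)                    # <x, y>
z = [a - c * b for a, b in zip(x, y)]

assert abs(dot(y, y) - 1.0) < 1e-12        # ||y|| = 1
assert abs(dot(z, y)) < 1e-12              # z is orthogonal to y
# ||x||^2 = <x,y>^2 + ||z||^2 >= <x,y>^2
assert abs(dot(x, x) - (c * c + dot(z, z))) < 1e-12
assert dot(x, x) >= c * c
```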







                                answered Jul 2 '12 at 10:51 · community wiki · wildildildlife
























                                    $begingroup$

I like this proof for real vectors a lot. Recall that an inner product on a real vector space has the following properties:

$\langle x,y\rangle=\langle y,x\rangle$

$\langle ax+y,z\rangle=a\langle x,z\rangle+\langle y,z\rangle$

$\langle x,x\rangle\geq 0$

Then for any real $l$,
$0\leq\langle lx+y,lx+y\rangle=l^2\langle x,x\rangle+l\langle x,y\rangle+l\langle y,x\rangle+\langle y,y\rangle=l^2\langle x,x\rangle+2l\langle x,y\rangle+\langle y,y\rangle$

Let $a=\langle x,x\rangle$, $b=2\langle x,y\rangle$, $c=\langle y,y\rangle$; then the inequality becomes

$al^2+bl+c\geq 0$

This is a quadratic in $l$ with at most one real root, so its discriminant cannot be positive:

$b^2-4ac\leq 0$

$\implies 4{\langle x,y\rangle}^2-4\langle x,x\rangle\langle y,y\rangle\leq 0$

$\implies {\langle x,y\rangle}^2\leq\langle x,x\rangle\langle y,y\rangle$

Not bad huh? Sadly it doesn't work out so nicely with complex vectors $:($
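The discriminant argument can be illustrated numerically (my addition, not the answer's): for random real vectors, $q(l)=\langle lx+y,lx+y\rangle$ is nonnegative at every sample point, and the discriminant $b^2-4ac$ with $a=\langle x,x\rangle$, $b=2\langle x,y\rangle$, $c=\langle y,y\rangle$ comes out nonpositive, which is exactly Cauchy-Schwarz.

```python
import random

# Sketch: check q(l) = <lx+y, lx+y> >= 0 and the discriminant inequality
# b^2 - 4ac <= 0 for random real vectors.
def dot(u, v):
    return sum(p * q for p, q in zip(u, v))

random.seed(1)
x = [random.uniform(-2, 2) for _ in range(4)]
y = [random.uniform(-2, 2) for _ in range(4)]

a, b, c = dot(x, x), 2 * dot(x, y), dot(y, y)

# q(l) >= 0 at a few sample points, since it is a squared norm
for l in (-3.0, -1.0, 0.0, 0.5, 2.0):
    v = [l * xi + yi for xi, yi in zip(x, y)]
    assert dot(v, v) >= 0

assert b * b - 4 * a * c <= 1e-9          # discriminant test
assert dot(x, y) ** 2 <= a * c + 1e-9     # Cauchy-Schwarz
```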






                                    $endgroup$


















answered Jul 27 '14 at 20:42, community wiki, by Pauly B
























                                            $begingroup$

Here is the proof from "Introductory Real Analysis", Kolmogorov & Fomin, Silverman translation. Assume all sums run from $1$ to $n$.

Lemma:
$$
\Big( \sum_i x_i y_i \Big)^2 = \Big(\sum_i x_i^2\Big)\Big(\sum_i y_i^2\Big)
- \frac{1}{2} \sum_i \sum_j (x_iy_j - x_jy_i)^2
$$

Proof of Cauchy-Schwarz: the subtracted double sum in the Lemma is a sum of squares, hence non-negative, so clearly $\big( \sum_i x_i y_i \big)^2 \leq \big(\sum_i x_i^2\big)\big(\sum_i y_i^2\big)$.

Proof of Lemma: we show the left-hand side (LHS) and the right-hand side (RHS) are equal. For the LHS write
$$ \text{LHS} = \Big( \sum_i x_i y_i \Big)^2 = \Big(\sum_i x_i y_i\Big)\Big(\sum_j x_j y_j\Big) = \sum_i\sum_j x_iy_ix_jy_j. $$
For the RHS write
$$
\text{RHS}=
\frac{1}{2}\Big(\sum_i x_i^2\Big)\Big(\sum_j y_j^2\Big)
+\frac{1}{2}\Big(\sum_j x_j^2\Big)\Big(\sum_i y_i^2\Big)
- \frac{1}{2} \sum_i \sum_j (x_iy_j - x_jy_i)^2
\\ =
\frac{1}{2}\sum_i\sum_j\left(
x_i^2 y_j^2 + x_j^2 y_i^2 - x_i^2 y_j^2 - x_j^2 y_i^2 + 2 x_i y_i x_j y_j
\right)
=
\sum_i\sum_j x_iy_ix_jy_j .
$$

This shows that LHS $=$ RHS and finishes the proof.
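The Lemma (Lagrange's identity) is easy to confirm numerically; this sketch is my addition, not part of Kolmogorov & Fomin's text. Both sides agree for random vectors, and dropping the subtracted sum of squares gives the inequality.

```python
import random

# Sketch: check Lagrange's identity
#   (sum x_i y_i)^2 = (sum x_i^2)(sum y_i^2) - (1/2) sum_{i,j} (x_i y_j - x_j y_i)^2
# for random real vectors.
random.seed(2)
n = 6
x = [random.uniform(-1, 1) for _ in range(n)]
y = [random.uniform(-1, 1) for _ in range(n)]

lhs = sum(xi * yi for xi, yi in zip(x, y)) ** 2
rhs = (sum(xi * xi for xi in x) * sum(yi * yi for yi in y)
       - 0.5 * sum((x[i] * y[j] - x[j] * y[i]) ** 2
                   for i in range(n) for j in range(n)))

assert abs(lhs - rhs) < 1e-9   # the identity holds
# dropping the non-negative subtracted term yields Cauchy-Schwarz
assert lhs <= sum(xi * xi for xi in x) * sum(yi * yi for yi in y) + 1e-9
```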






                                            $endgroup$


















answered Jan 20 at 19:43, community wiki, by Hashimoto































                                                    Thanks for contributing an answer to Mathematics Stack Exchange!

