Suppose $v, w, v + w$ are all eigenvectors of the linear operator $\phi: V \to V$. Prove that $v, w, v + w$ all have the same eigenvalue.


























The Problem:



Just as it is in the title: Suppose $v, w, v + w$ are all eigenvectors of the linear operator $\phi: V \to V$. Prove that $v, w, v + w$ all have the same eigenvalue.



My Approach:



Let $\phi(v) = \alpha v$, $\phi(w) = \beta w$, and $\phi(v + w) = \gamma (v + w)$. We then have that
$$ \phi(v) + \phi(w) = \gamma v + \gamma w \implies \alpha v + \beta w = \gamma v + \gamma w \implies (\alpha - \gamma)v = (\gamma - \beta)w. $$
This means that $v$ and $w$ are scalar multiples of one another; say, $w = \lambda v$...



I feel like this is supposed to tell me something. My thought is that, since $v$ and $w$ are linearly dependent, they must occupy the same eigenspace; but I can't seem to prove that...




































      linear-algebra eigenvalues-eigenvectors






      asked Jan 22 at 14:22









      thisisourconcerndude

          4 Answers
























          From $(\alpha-\gamma)v=(\gamma-\beta)w$ you cannot necessarily conclude that $v$ and $w$ are scalar multiples of one another. There are two possibilities:

          1. If $\alpha-\gamma=0$, then the LHS is the zero vector. Since $w \neq 0$ (it is an eigenvector), it must also be that $\gamma-\beta=0$. So $\alpha=\beta=\gamma$.

          2. If $\alpha-\gamma \neq 0$, then the LHS is not zero, so the RHS is not zero either (in particular $\gamma-\beta \neq 0$). So $v=\frac{\gamma-\beta}{\alpha-\gamma} w = cw$ where $c=\frac{\gamma-\beta}{\alpha-\gamma} \neq 0$. Now rewrite the equation $\phi(v)=\alpha v$ using $v=cw$, and similarly rewrite $\phi(v+w)=\gamma(v+w)$. This will show you $\alpha=\beta=\gamma$.

          answered Jan 22 at 14:29 by kccu
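          The substitution in case 2 can be carried out explicitly; here is a worked completion of that step (an addition, not part of the original answer):

          ```latex
          % Case 2: v = cw with c \neq 0. Substitute into \phi(v) = \alpha v:
          \phi(v) = \phi(cw) = c\,\phi(w) = c\beta w,
          \qquad \alpha v = \alpha c w
          \;\Longrightarrow\; c\beta w = c\alpha w
          \;\Longrightarrow\; \beta = \alpha \quad (\text{since } c \neq 0,\ w \neq 0).
          % With \beta = \alpha, the identity (\alpha-\gamma)v = (\gamma-\beta)w becomes
          (\alpha-\gamma)v = (\gamma-\alpha)w
          \;\Longrightarrow\; (\alpha-\gamma)(v+w) = 0,
          % and since v + w is an eigenvector, v + w \neq 0, so \gamma = \alpha.
          ```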



























            Apply $\phi$ once more to $(\alpha-\gamma)v = (\gamma-\beta)w$ to obtain
            $$(\alpha-\gamma)\alpha v = (\gamma-\beta)\beta w.$$
            On the other hand, multiplying the first identity by $\alpha$ yields $$(\alpha-\gamma)\alpha v = (\gamma-\beta)\alpha w,$$
            so $$(\gamma-\beta)\alpha w = (\gamma-\beta)\beta w \implies (\gamma-\beta)(\alpha-\beta)w = 0.$$

            Since $w$ is an eigenvector, we have $w \ne 0$, so $(\gamma-\beta)(\alpha-\beta)=0$.

            Hence $\alpha = \beta$ or $\beta = \gamma$. From either of those it easily follows that $\alpha=\beta=\gamma$.

            answered Jan 22 at 14:32 by mechanodroid




























              If $v,w$ are not linearly independent, the result is trivial: $w=cv$, so $\phi(w)=\phi(cv)=c\phi(v)=c\alpha v=\alpha w$, and likewise $v+w=(1+c)v$ is an eigenvector for $\alpha$.

              Otherwise, $(\alpha-\gamma)v+(\beta-\gamma)w=0$, and since $v,w$ are linearly independent, $\alpha=\beta=\gamma$.

              edited Jan 22 at 14:26, answered Jan 22 at 14:25 by Tsemo Aristide



















              • You don't know that $v$ and $w$ are linearly independent. – kccu Jan 22 at 14:25




















              Suppose $\phi(v) = \lambda v$, $\phi(w) = \mu w$, and $\phi(v + w) = \kappa(v + w)$, where $\lambda$, $\mu$, and $\kappa$ are all scalars. Then we have
              $$ \lambda v + \mu w = \phi(v + w) = \kappa(v + w).$$
              We therefore deduce that
              $$ (\kappa - \lambda) v + (\kappa - \mu) w = 0.$$
              If $v$ and $w$ are linearly independent, then $\kappa=\lambda=\mu$. If not, then $v$ and $w$ are scalar multiples of one another; in this case they lie in the same eigenspace, and the result follows.

              answered Jan 22 at 14:34 by ncmathsadist
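              As a quick numerical sanity check of the statement (an addition, not part of any answer): pick a matrix with a two-dimensional eigenspace, take $v$, $w$, and $v+w$ inside it, and confirm that the three eigenvalues agree. The matrix and vectors below are arbitrary choices for illustration.

              ```python
              import numpy as np

              # A has eigenvalue 5 with a 2-dimensional eigenspace (first two axes)
              # and eigenvalue 7 on the third axis.
              A = np.diag([5.0, 5.0, 7.0])

              v = np.array([1.0, 0.0, 0.0])   # eigenvector for 5
              w = np.array([0.0, 2.0, 0.0])   # eigenvector for 5

              def eigenvalue_of(A, x):
                  """Return lam such that A x = lam x, checking x really is an eigenvector."""
                  x = np.asarray(x, dtype=float)
                  lam = (x @ (A @ x)) / (x @ x)          # Rayleigh quotient
                  assert np.allclose(A @ x, lam * x)     # fails if x is not an eigenvector
                  return lam

              # v, w, and v + w are all eigenvectors, and their eigenvalues coincide.
              print(eigenvalue_of(A, v), eigenvalue_of(A, w), eigenvalue_of(A, v + w))

              # Conversely, if w were an eigenvector for a *different* eigenvalue,
              # v + w would not be an eigenvector at all:
              u = np.array([0.0, 0.0, 3.0])   # eigenvector for 7
              x = v + u
              lam = (x @ (A @ x)) / (x @ x)
              print(np.allclose(A @ x, lam * x))  # False: v + u is not an eigenvector
              ```
              
              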


















