Extending $\{u_1, u_2\}$ to an orthonormal basis when finding an SVD












I've been working through my linear algebra textbook, and when finding an SVD there's just one thing I don't understand.

For example, take finding an SVD of a $3 \times 2$ matrix $A$. I will skip the steps of finding the eigenvectors, eigenvalues, and singular values; in any case, we find that



$$
V = \begin{bmatrix}\vec{v}_1 & \vec{v}_2\end{bmatrix} =
\begin{bmatrix}
1/\sqrt2 & -1/\sqrt2\\
1/\sqrt2 & 1/\sqrt2
\end{bmatrix}
$$



and we know that



$$
\vec{u}_n = \frac{1}{\sigma_n}A\vec{v}_n
$$



which gives



$$
\vec{u}_1 = \begin{bmatrix}2/\sqrt6\\1/\sqrt6\\1/\sqrt6\end{bmatrix}, \qquad
\vec{u}_2 = \begin{bmatrix}0\\-1/\sqrt2\\1/\sqrt2\end{bmatrix}
$$



but we know that $U$ is an $m \times m$ matrix, so it must be $3 \times 3$, and so we have to find $\vec{u}_3$. This is where I get stuck; the book says that one way to do this is to use the Gram-Schmidt process, but I can't seem to wrap my head around how to apply it to the vectors shown above.
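As a side note, numerical libraries handle this padding automatically. A minimal sketch with numpy (the question's actual $A$ is not shown, so the matrix below is a made-up $3 \times 2$ example): `full_matrices=True` returns a full $3 \times 3$ orthogonal $U$, i.e. it supplies the extra column $\vec{u}_3$ for you.

```python
import numpy as np

# Hypothetical 3x2 matrix (the question's A is not given).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

# full_matrices=True pads U out to a full 3x3 orthogonal matrix.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape)                        # (3, 3)

# Columns of U are orthonormal: U^T U = I.
print(np.allclose(U.T @ U, np.eye(3)))

# The padded column u3 plays no role in reconstructing A.
Sigma = np.zeros((3, 2))
Sigma[:2, :2] = np.diag(s)
print(np.allclose(U @ Sigma @ Vt, A))
```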































  • Choose a vector $x$ linearly independent of $u_1,u_2$ basically at random; one possible choice is $\begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}$. Then run Gram-Schmidt on $\{ u_1,u_2,x \}$.
    – Ian
    May 17 '16 at 20:36
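The comment's recipe can be sketched in a few lines of numpy (variable names are mine; $u_1, u_2$ are the unit vectors computed in the question, and $x = (0,1,0)$ is the comment's suggested choice). Since $u_1, u_2$ are already orthonormal, one Gram-Schmidt pass on $x$ suffices:

```python
import numpy as np

# The two orthonormal left singular vectors from the question.
u1 = np.array([2.0, 1.0, 1.0]) / np.sqrt(6.0)
u2 = np.array([0.0, -1.0, 1.0]) / np.sqrt(2.0)

# Ian's suggested third vector, linearly independent of u1, u2.
x = np.array([0.0, 1.0, 0.0])

# One Gram-Schmidt pass: subtract the components along u1 and u2
# (no division needed because u1, u2 are unit vectors).
w = x - (x @ u1) * u1 - (x @ u2) * u2
u3 = w / np.linalg.norm(w)
print(u3)   # (-1, 1, 1)/sqrt(3)
```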


















linear-algebra orthonormal svd






asked May 17 '16 at 20:31









user5368737

1 Answer







There are a few ways to approach this problem.



**Eyeball method**



Scrape off the distractions of normalization. The column vectors are
$$
\tilde{v}_{1} =
\left[ \begin{array}{c}
2 \\ 1 \\ 1
\end{array} \right], \qquad
\tilde{v}_{2} =
\left[ \begin{array}{r}
0 \\ -1 \\ 1
\end{array} \right].
$$
Find a vector perpendicular to both. One such solution is
$$
\tilde{v}_{3} =
\left[ \begin{array}{r}
-1 \\ 1 \\ 1
\end{array} \right].
$$
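In $\mathbb{R}^3$ the "eyeball" step can also be mechanized with a cross product, which always yields a vector perpendicular to its two arguments. A quick numpy check (variable names are mine, for the two unnormalized vectors above):

```python
import numpy as np

# The two unnormalized column vectors from the eyeball method.
v1 = np.array([2.0, 1.0, 1.0])
v2 = np.array([0.0, -1.0, 1.0])

# The cross product is perpendicular to both inputs.
v3 = np.cross(v1, v2)
print(v3)                 # [ 2. -2. -2.], proportional to (-1, 1, 1)
print(v1 @ v3, v2 @ v3)   # both dot products are 0
```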



**Systematic approach**



Start with the matrix whose columns are the two unnormalized vectors,
$$
\mathbf{A} =
\left[
\begin{array}{cr}
2 & 0 \\
1 & -1 \\
1 & 1 \\
\end{array}
\right].
$$
Find the nullspace $\mathcal{N}\left(\mathbf{A}^{*} \right)$. The row-reduced form is
$$
\begin{align}
\mathbf{A}^{T} &\mapsto \mathbf{E}_{\mathbf{A}^{T}} \\
\left[
\begin{array}{crc}
2 & 1 & 1 \\
0 & -1 & 1 \\
\end{array}
\right]
&\mapsto
\left[
\begin{array}{ccr}
1 & 0 & 1 \\
0 & 1 & -1 \\
\end{array}
\right]
\end{align}
$$
In terms of the basic variables,
$$
\begin{align}
x_{1} &= -x_{3}, \\
x_{2} &= x_{3}.
\end{align}
$$
Making the natural choice $x_{3}=1$ produces the column vector
$$
\left[
\begin{array}{r}
x_{1} \\
x_{2} \\
x_{3}
\end{array}
\right]
=
\left[
\begin{array}{r}
-1 \\
1 \\
1
\end{array}
\right].
$$
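The left null space computed above can be cross-checked numerically. One standard trick (a sketch, with names of my own choosing): take a full SVD of the matrix whose columns are the two unnormalized vectors; the last column of its $U$ spans the left null space, i.e. the solutions of $M^T x = 0$.

```python
import numpy as np

# Columns are the unnormalized u1, u2.
M = np.array([[2.0,  0.0],
              [1.0, -1.0],
              [1.0,  1.0]])

# The last column of U in a full SVD spans the left null space of M.
U, s, Vt = np.linalg.svd(M, full_matrices=True)
u3 = U[:, 2]

print(np.allclose(M.T @ u3, 0.0))  # True: orthogonal to both columns
print(u3 / u3[1])                  # proportional to (-1, 1, 1)
```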



**Gram-Schmidt**



Make any choice for the third vector and use the Gram-Schmidt process to turn it into an orthogonal vector. A wise choice to begin with is
$$
\tilde{v}_{3} =
\left[ \begin{array}{c}
0 \\ 0 \\ 1
\end{array} \right].
$$
Why is this a wise choice? It is rich in $0$s, which makes the manipulation easy.

Define the operator which projects the vector $v$ onto the vector $u$ as
$$
\operatorname{proj}_{u} v =
\frac{v\cdot u}{u \cdot u}\,u.
$$
The Gram-Schmidt process fixes $v_{3}$ using the prescription
$$
v_{GS} = v_{3} -
\frac{v_{3} \cdot v_{1}} {v_{1} \cdot v_{1}} v_{1} -
\frac{v_{3} \cdot v_{2}} {v_{2} \cdot v_{2}} v_{2},
$$
which evaluates to
$$
\frac{1}{3}
\left[ \begin{array}{r}
-1 \\ 1 \\ 1
\end{array} \right]
=
\left[ \begin{array}{c}
0 \\ 0 \\ 1
\end{array} \right]
-
\frac{1}{6}
\left[ \begin{array}{c}
2 \\ 1 \\ 1
\end{array} \right]
-
\frac{1}{2}
\left[ \begin{array}{r}
0 \\ -1 \\ 1
\end{array} \right].
$$

The normalized form is the column vector you want:
$$
\frac{1}{\sqrt{3}}
\left[ \begin{array}{r}
-1 \\ 1 \\ 1
\end{array} \right].
$$
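The prescription above translates directly into code. A sketch in numpy (function and variable names are mine; the single pass below is valid here because the two given vectors are already mutually orthogonal):

```python
import numpy as np

def gram_schmidt_step(x, basis):
    """Subtract from x its projection onto each (mutually orthogonal,
    not necessarily unit) basis vector."""
    for b in basis:
        x = x - (x @ b) / (b @ b) * b
    return x

v1 = np.array([2.0, 1.0, 1.0])    # unnormalized first vector
v2 = np.array([0.0, -1.0, 1.0])   # unnormalized second vector
v3 = np.array([0.0, 0.0, 1.0])    # the "rich in zeros" starting choice

v_gs = gram_schmidt_step(v3, [v1, v2])
print(v_gs)                        # (1/3) * (-1, 1, 1)

u3 = v_gs / np.linalg.norm(v_gs)   # (-1, 1, 1)/sqrt(3)
print(u3)
```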






        edited Apr 17 '17 at 3:33

























        answered Apr 15 '17 at 20:51









dantopa
