Find orthogonal projection of $[n,0,0,\dots,0]^T$ on subspace $V$
Let $n>1$. Given is $$V = \left\{ \vec{x} \in \mathbb{R}^n : x_1 + x_2 + \dots + x_n = 0 \right\}.$$
a) Find an orthogonal basis of $V^{\perp}$.
b) Find the orthogonal projection of $\vec{x} = [n,0,0,\dots,0]^T$ onto the subspace $V$.
For a):
$$\dim V^{\perp} = n - \dim V = n - (n-1) = 1,$$ so $V^{\perp}$ is the span of a single vector perpendicular to $V$.
Take $[1,1,\dots,1]^T$; it is perpendicular to $V$.
Applying the Gram–Schmidt process to a single vector just returns it, so $u_1 = [1,1,\dots,1]^T$ is an orthogonal basis of $V^{\perp}$.
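A quick numerical sanity check of this claim (a minimal sketch, assuming NumPy): any vector whose coordinates sum to zero has zero dot product with $[1,1,\dots,1]^T$.

```python
import numpy as np

n = 6
ones = np.ones(n)                      # candidate spanning vector of V^perp

# Build a vector in V by removing the mean from a random vector,
# which forces its coordinates to sum to zero.
rng = np.random.default_rng(0)
v = rng.standard_normal(n)
v = v - v.mean()

print(np.isclose(v.sum(), 0.0))        # True: v lies in V
print(np.isclose(ones @ v, 0.0))       # True: (1,...,1) is orthogonal to v
```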
b) This part seems very interesting and harder. I found a basis of $V$:
$$[-1,1,0,0,\dots,0] = \vec{v_1}$$
$$[-1,0,1,0,\dots,0] = \vec{v_2}$$
$$[-1,0,0,1,\dots,0] = \vec{v_3}$$
$$\vdots$$
$$[-1,0,0,0,\dots,1] = \vec{v_{n-1}}$$
Now I start the Gram–Schmidt process:
$$u_1 = v_1$$
$$u_2 = v_2 - \frac{1}{2} \cdot v_1$$
$$u_3 = v_3 - \frac{1}{4} \cdot v_2 + \frac{1}{8} \cdot v_1$$
$$u_4 = v_4 - \frac{7}{16} \cdot v_3 + \frac{7}{64} \cdot v_2 - \frac{7}{64} \cdot v_1$$
I am not even sure whether I have made a mistake somewhere.
Moreover, the calculations keep getting harder, and I still don't see any regular pattern in the coefficients. Can somebody help me with this task?
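For a small $n$, such a hand computation can be cross-checked numerically; below is a minimal sketch (assuming NumPy) of an unnormalised classical Gram–Schmidt run on the basis $v_1,\dots,v_{n-1}$ above.

```python
import numpy as np

n = 5
e = np.eye(n)
# Basis of V as above: v_k = -e_1 + e_{k+1}, for k = 1, ..., n-1
v = [e[k] - e[0] for k in range(1, n)]

def gram_schmidt(vectors):
    """Unnormalised classical Gram-Schmidt."""
    ortho = []
    for x in vectors:
        u = x.copy()
        for w in ortho:
            u -= (w @ x) / (w @ w) * w   # subtract projection of x onto w
        ortho.append(u)
    return ortho

us = gram_schmidt(v)
for k, u in enumerate(us, start=1):
    print(f"u_{k} =", np.round(u, 4))

# Check pairwise orthogonality of the result.
U = np.array(us)
G = U @ U.T
print(np.allclose(G, np.diag(np.diag(G))))
```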
linear-algebra orthogonality

asked Jan 3 at 14:10 by VirtualUser
Why do you need an orthogonal basis of $V$? One can find the projection if one follows the vector $(1, 1, \ldots, 1)$ until the line meets $V$.
– A.Γ., Jan 3 at 14:15
I need an orthogonal basis because I have this formula for the orthogonal projection: $P_z(x) = \sum_{j=1}^k \langle z_j, x\rangle\, z_j$, where $z_1,\dots,z_k$ is an orthogonal basis, and I don't have any other idea how to do this task. @A.Γ.
– VirtualUser, Jan 3 at 14:18
You can save yourself a lot of work by taking advantage of the fact that the orthogonal projection onto a subspace is what’s left after subtracting the orthogonal projection onto its complement.
– amd, Jan 3 at 21:40
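Both comments point to the same shortcut: project $\vec{x}$ onto $V^{\perp} = \operatorname{span}\{(1,\dots,1)^T\}$ and subtract. A minimal numerical sketch of this (assuming NumPy; variable names are only illustrative):

```python
import numpy as np

n = 7
x = np.zeros(n)
x[0] = n                                  # x = (n, 0, ..., 0)^T
ones = np.ones(n)                         # spans V^perp

# Projection of x onto V^perp, then subtract it to land in V.
proj_perp = (ones @ x) / (ones @ ones) * ones   # = (1, 1, ..., 1)^T
proj_V = x - proj_perp

print(proj_V)                             # [n-1, -1, ..., -1]
print(np.isclose(proj_V.sum(), 0.0))      # coordinates sum to zero, so it lies in V
```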
1 Answer
Let us call $\vec{u}$ the projection of $\vec{x} = [n,0,0,\dots,0]^T$ on $V$.
$\vec{x} - \vec{u}$ is orthogonal to $V$, i.e. $\vec{x} - \vec{u} = a \vec{v}$ with $\vec{v} = (1, \dots, 1)^T$, as you already showed.
Therefore $\vec{x} = a \vec{v} + \vec{u}$ with $\vec{u}$ satisfying $\sum_i u_i = 0$.
Then, since $u_i = x_i - a$ for every $i$,
$$ \sum_i (x_i - a) = n - na = 0. $$
Finally $a = 1$ and
$$\vec{u} = (n-1, -1, \dots, -1)^T.$$
– Damien, answered Jan 3 at 14:33 (edited Jan 3 at 14:38)
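For completeness, the result can also be checked against the projection formula mentioned in the comments, $P_V(\vec{x}) = \sum_j \langle z_j, \vec{x}\rangle\, z_j$, which holds once $z_1,\dots,z_{n-1}$ is an orthonormal basis of $V$. A minimal sketch (assuming NumPy; the QR factorisation is used only as a convenient stand-in for Gram–Schmidt):

```python
import numpy as np

n = 7
x = np.zeros(n)
x[0] = n                                   # x = (n, 0, ..., 0)^T

e = np.eye(n)
# Columns: the basis v_k = -e_1 + e_{k+1} of V from the question.
A = np.column_stack([e[k] - e[0] for k in range(1, n)])

# QR factorisation: the columns of Q form an orthonormal basis of V.
Q, _ = np.linalg.qr(A)
proj_V = Q @ (Q.T @ x)                     # sum_j <z_j, x> z_j

print(np.round(proj_V, 10))                # [n-1, -1, ..., -1]
print(np.allclose(proj_V, np.r_[n - 1.0, -np.ones(n - 1)]))
```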
Should the answer be $\vec{u} = (n-1, 1, \dots, 1)^T$ or $\vec{u} = (n-1, -1, \dots, -1)^T$?
– VirtualUser, Jan 3 at 14:36
Great, this is a much simpler way than my idea, thanks.
– VirtualUser, Jan 3 at 14:39
@VirtualUser Corrected. You read my answer before I had time to check it!
– Damien, Jan 3 at 14:39