Least squares solution to the system
I am trying to solve the following problem:

Let $u_1$ and $u_2$ be two orthogonal vectors in $\mathbb{R}^n$ and set $a_1 = u_1$, $a_2 = u_1 + \varepsilon u_2$ for $\varepsilon > 0$. Let $A$ be the matrix with columns $a_1$ and $a_2$, and let $b$ be a vector linearly independent of $a_1$ and $a_2$. Discuss the least squares solution to the system $Ax = b$ as $\varepsilon \to 0$.

(a) Find the matrix $A^\top A$, its inverse, and then $\hat{x} = (A^\top A)^{-1}A^\top b$ explicitly. Show that $\hat{x}$ explodes as $\varepsilon \to 0$.

(b) Find the projection $A\hat{x}$ of $b$ onto $\operatorname{col}(A)$ and check that it does not depend on $\varepsilon > 0$. Explain the result.

I have assumed that $A = \begin{pmatrix}u_1 & u_1+\varepsilon u_2\end{pmatrix}$, therefore:

$A^TA=\begin{pmatrix}u_1 \\ u_1+\varepsilon u_2\end{pmatrix}\begin{pmatrix}u_1 & u_1+\varepsilon u_2\end{pmatrix}=\begin{pmatrix}u_1^2 & u_1(u_1+\varepsilon u_2)\\u_1(u_1+\varepsilon u_2) & (u_1 + \varepsilon u_2)^2\end{pmatrix}$

But when I try to compute $(A^TA)^{-1}$, the determinant comes out as zero, which would mean the matrix is not invertible. What am I doing wrong? Thanks in advance for any hints!
least-squares
I guess that $A^\top A$ always has a zero determinant unless $A$ is invertible. – NoChance, yesterday
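As an aside, that claim is easy to probe numerically: for a tall (non-square) matrix $A$ with linearly independent columns, the Gram matrix $A^\top A$ is invertible even though $A$ itself has no inverse. A minimal sketch with a hypothetical $3\times 2$ example matrix:

```python
import numpy as np

# A 3x2 matrix with linearly independent columns; A itself is not square,
# so it has no inverse, yet the 2x2 Gram matrix A^T A does.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
G = A.T @ A                # G = [[2, 1], [1, 2]]
print(np.linalg.det(G))    # 3.0, nonzero
```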
asked yesterday by Michael
1 Answer
Bear in mind what it means to multiply vectors in the first place. Your determinant is $$(u_1\cdot u_1)(u_1\cdot u_1 + 2\varepsilon\, u_1\cdot u_2+\varepsilon^2\, u_2\cdot u_2)-(u_1\cdot u_1 + \varepsilon\, u_1\cdot u_2)^2=\varepsilon^2\bigl((u_1\cdot u_1)(u_2\cdot u_2)-(u_1\cdot u_2)^2\bigr).$$ Since $u_1$ and $u_2$ are orthogonal, $u_1\cdot u_2 = 0$, so this equals $\varepsilon^2\,\|u_1\|^2\|u_2\|^2 > 0$: the matrix is invertible for every $\varepsilon > 0$.
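Both the determinant formula and the behaviour claimed in parts (a) and (b) can be checked numerically. Below is a small sketch using NumPy; the vectors $u_1$, $u_2$, $b$ and the values of $\varepsilon$ are arbitrary hypothetical choices, not taken from the problem:

```python
import numpy as np

# Two orthogonal vectors in R^3 (u1 . u2 = 2 + 2 - 4 = 0).
u1 = np.array([1.0, 2.0, 2.0])
u2 = np.array([2.0, 1.0, -2.0])

eps = 1e-3
A = np.column_stack([u1, u1 + eps * u2])
G = A.T @ A  # the Gram matrix A^T A

# det(A^T A) should match eps^2 (||u1||^2 ||u2||^2 - (u1 . u2)^2).
det_formula = eps**2 * ((u1 @ u1) * (u2 @ u2) - (u1 @ u2)**2)
print(np.linalg.det(G), det_formula)  # both nonzero and equal

# b chosen outside span(u1, u2): xhat blows up as eps shrinks,
# while the projection A @ xhat stays put.
b = np.array([1.0, 0.0, 0.0])
for eps in (1e-2, 1e-6):
    A = np.column_stack([u1, u1 + eps * u2])
    xhat, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(eps, xhat, A @ xhat)
```

With these choices the second component of $\hat{x}$ grows like $1/\varepsilon$, while $A\hat{x}$ stays fixed at the orthogonal projection of $b$ onto $\operatorname{span}(u_1, u_2)$, consistent with part (b).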
answered yesterday by J.G.