Matrix associated with a projection mapping














From S. Lang's *Linear Algebra*:




Find the matrix associated with the following linear maps. The vectors
are written horizontally with a transpose sign for typographical
reasons.

(a) $F:\mathbb{R}^4 \rightarrow \mathbb{R}^2$ given by $F\left ((x_1, x_2, x_3, x_4)^T \right )=(x_1, x_2)^T$ (the projection)




Solution Attempt



I will use Theorem 2.1 from the book:




Let $L: K^n rightarrow K^m$ be a linear map. Then there exists a
unique matrix $A$ such that $L = L_A$.




Hence, for all $X$ we have $L(X)=AX$.



In this case, $A$ is the matrix associated with the linear map $F$, meaning that:



$$F(X^T)=AX^T$$
$$F\left ((x_1, x_2, x_3, x_4)^T \right )=A\left ((x_1, x_2, x_3, x_4)^T \right )$$
$$AX^T=A\left ((x_1, x_2, x_3, x_4)^T \right )=(x_1, x_2)^T$$



Solving for $A$:



$$A=\left ((x_1, x_2, x_3, x_4)^T \right )^{-1}(x_1, x_2)^T$$



Now this makes no sense: $X$ would have to be a non-degenerate square matrix in order to be invertible, but $X$ is just a vector of dimension $4$.



Perhaps I could invert $A$ instead (then I would have to show that $F$ is injective, i.e. has trivial kernel), but I don't see the point of doing this.



Is $A$ truly a matrix associated with $F$? If not, how do I find the matrix that truly is associated with $F$?





P.S.



I also found this in the book:




Let $F: \mathbb{R}^3 \rightarrow \mathbb{R}^2$ be the projection, in
other words the mapping such that $F(x_1, x_2, x_3) = (x_1, x_2)$.
Then the matrix associated with $F$ is:



$$\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}$$




I don't know exactly how it was calculated, but the matrix above contains only standard basis vectors, which I believe must mean something.
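One way to test my guess about the basis vectors (a sketch in plain Python, not from the book; `F3` is my own helper name): apply $F$ to each standard basis vector of $\mathbb{R}^3$ and use the images as the columns of the matrix.

```python
# Sketch (not from the book): recover the 2x3 matrix of the projection
# F(x1, x2, x3) = (x1, x2) by applying F to each standard basis vector;
# the images become the columns of the matrix.

def F3(v):
    """The projection R^3 -> R^2 from the book's example."""
    return [v[0], v[1]]

basis = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # e1, e2, e3
columns = [F3(e) for e in basis]            # [[1, 0], [0, 1], [0, 0]]
A = [list(row) for row in zip(*columns)]    # transpose: columns -> rows
print(A)  # [[1, 0, 0], [0, 1, 0]]
```

This reproduces exactly the matrix printed in the book.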



Thank you!































  • Hint: The columns of the matrix are the images of the basis vectors.
    – amd
    Jan 6 at 21:05
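The hint can be tried out concretely (a sketch in plain Python; the function name `F` mirrors the problem statement and is otherwise an assumption): compute $F(e_j)$ for each standard basis vector $e_j$ of $\mathbb{R}^4$ and collect the images as columns.

```python
# Sketch of the hint: for F(x1, x2, x3, x4) = (x1, x2), the j-th column of
# the associated matrix is F(e_j), where e_j is the j-th standard basis
# vector of R^4.

def F(v):
    """The projection R^4 -> R^2 from part (a)."""
    return [v[0], v[1]]

n = 4
basis = [[1 if i == j else 0 for i in range(n)] for j in range(n)]
A = [list(row) for row in zip(*[F(e) for e in basis])]  # columns -> rows
print(A)  # [[1, 0, 0, 0], [0, 1, 0, 0]]
```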
linear-algebra matrices linear-transformations projection
asked Jan 6 at 14:06









ShellRox
1 Answer

To begin with, the space of matrices $\mathscr{M}_{m \times n}(\mathbb{R})$ is not commutative under the matrix product, and a non-square matrix has no inverse at all! It is a fundamental error to move a matrix to the other side of an equation by "inverting" it.



And keep in mind: the image $\mathbb{R}^2$ can be embedded in $\mathbb{R}^4$ by extending vectors with zeros. That is why they indicated that the map is in fact a projection.



Now let's calculate.

The first step is to write down what the matrix operation must look like, and check when the product is well-defined: this already eliminates some big issues in your reasoning.



$$ F\begin{pmatrix}
x_1 \\
x_2 \\
x_3 \\
x_4
\end{pmatrix}
=\begin{pmatrix}
x_1 \\
x_2
\end{pmatrix}
$$



i.e.



$$ \begin{pmatrix}
a_{11} & a_{12} & a_{13} & a_{14} \\
a_{21} & a_{22} & a_{23} & a_{24}
\end{pmatrix}
\begin{pmatrix}
x_1 \\
x_2 \\
x_3 \\
x_4
\end{pmatrix}
=\begin{pmatrix}
x_1 \\
x_2
\end{pmatrix}
$$

where we introduce the matrix of elements $(a_{ij})$ as the matrix associated with the linear transformation. For this product to be well-defined, the matrix must be $2 \times 4$.



When you do the calculations, you'll find:



$$ \begin{cases}
a_{11}x_{1} + a_{12}x_{2} + a_{13}x_{3} + a_{14}x_{4} = x_1 \\
a_{21}x_{1} + a_{22}x_{2} + a_{23}x_{3} + a_{24}x_{4} = x_2
\end{cases} $$

i.e.

$$ \begin{cases}
(a_{11}-1)x_{1} + a_{12}x_{2} + a_{13}x_{3} + a_{14}x_{4} = 0 \\
a_{21}x_{1} + (a_{22}-1)x_{2} + a_{23}x_{3} + a_{24}x_{4} = 0
\end{cases} $$



But these equations must hold for every vector $(x_1, x_2, x_3, x_4)^T$, so every coefficient must vanish: $(a_{11}-1)=0$ and $(a_{22}-1)=0$, hence $a_{11}=1$ and $a_{22}=1$, and all the remaining $a_{ij}$ are zero.



Thus, we conclude that the matrix associated with this linear transformation is:



$$
\begin{pmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0
\end{pmatrix}$$
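As a quick numerical sanity check (a sketch using plain-Python lists; `matvec` is a hypothetical helper, not part of the answer), multiplying this matrix by an arbitrary vector does return its first two coordinates:

```python
# Sanity check: A x should equal (x1, x2) for any x in R^4.

A = [[1, 0, 0, 0],
     [0, 1, 0, 0]]

def matvec(M, v):
    """Plain-Python matrix-vector product."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

x = [7, -3, 5, 2]
print(matvec(A, x))  # [7, -3]
```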


answered Jan 6 at 17:44 by freehumorist

  • Wow, that makes a lot of sense. Thank you! My lack of familiarity with matrices is the primary reason for my confusion. Also, you mentioned that the linear transformation is an isomorphism; that means it is bijective and hence the kernel of $F$ is trivial, correct?
    – ShellRox
    Jan 6 at 20:59

  • I'm glad you see more clearly. But careful: I didn't say that every linear transformation is an isomorphism. Here it is surjective, because we have a projection of a vector of $\mathbb{R}^4$ onto $\mathbb{R}^2$. I wanted you to see the reverse: we could have extended the image by null vectors to get an isomorphism, so that you can visualize the fact that the associated matrix needn't be square, only well-defined.
    – freehumorist
    Jan 6 at 21:05

  • Yes, of course. I just wanted to further explore the properties of linear transformations that are defined by projection. Thank you for the help!
    – ShellRox
    Jan 6 at 21:10