What is the intuition behind why a rank deficient matrix does not have an inverse?
Suppose that we have a $p \times p$ square matrix $A$ whose rank is less than $p$. We know that such a matrix cannot have an inverse, and there are several different ways to prove that $A$ does not have an inverse.
However, I am struggling to obtain an intuition for why the inverse does not exist. I considered the following ideas to generate an intuition but failed to do so.
1. The matrix $A$ can be viewed as a transformation built from operations such as scaling, rotation, shearing, etc. Thus, when we apply $A$ to a vector in $p$ dimensions, it always maps the $p$-dimensional vector to a vector in the subspace spanned by the columns of $A$, which is a proper subspace if $A$ is less than full rank. Lack of an inverse implies that we cannot reverse the transformation. Why not?
2. The columns of $A$ span only a subspace of $\mathbb{R}^p$ if it is less than full rank. Thus, the transformation $y \mapsto Ay$ takes a vector from $\mathbb{R}^p$ to a vector that always belongs to that subspace. This viewpoint did not help in obtaining an intuition either.
Is there a way to obtain an intuition as to why a rank-deficient matrix does not have an inverse?
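For concreteness, here is a small numerical sketch of point 2; the particular rank-one matrix is just an illustrative choice, not part of the original question:

```python
import numpy as np

# A rank-one 2x2 matrix: both columns are multiples of (1, 2),
# so every output A @ x lies on the line spanned by (1, 2).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.matrix_rank(A))  # 1, i.e. less than full rank

for x in [np.array([1.0, 0.0]),
          np.array([3.0, -1.0]),
          np.array([0.0, 5.0])]:
    print(A @ x)  # always a multiple of (1, 2)
```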
linear-algebra inverse intuition matrix-rank
asked Jan 17 at 14:50 by jaggu
I think point 2 will give you the intuition if you consider the action on a basis. (The image of a basis is a basis of the image.) – saulspatz, Jan 17 at 14:57
To elaborate on the above comment, if $A$ does not have full rank, then a nontrivial subspace gets sent to the zero vector. What happens if we try to invert that? In other words, how can we find the inverse image of the zero vector if lots of vectors are mapped to it? (The answer: we can't!) – OldGodzilla, Jan 17 at 15:00
It doesn't have an inverse, but it does have a pseudo-inverse, a very useful notion. – Jean Marie, Jan 25 at 15:44
4 Answers
"Invertible" when talking about linear transformations means "reversible". In other words, a linear transformation (and the corresponding matrix in a given basis) is invertible iff it is possible, given an output, to figure out exactly what the input was.
A rank-deficient linear transformation collapses at least one dimension, meaning each output could be the result of any of a number of different inputs. Specifically, it has a non-trivial kernel, so multiple different inputs result in the output $\vec 0$.
answered Jan 17 at 15:07 by Arthur
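A minimal numerical sketch of this non-trivial kernel, using a hypothetical rank-one matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # rank 1: the second row is twice the first

# Two different inputs that A maps to the same output:
x1 = np.array([2.0, 0.0])
x2 = np.array([0.0, 1.0])
print(A @ x1, A @ x2)  # both print [2. 4.]

# Their difference lies in the kernel:
print(A @ (x1 - x2))   # [0. 0.]

# Given the output [2. 4.], there is no way to decide which input
# produced it, and NumPy refuses to invert the matrix:
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as e:
    print("no inverse:", e)  # "Singular matrix"
```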
The reduced row echelon form of your matrix will have one or more rows of zeros at the bottom, so it cannot be the identity matrix.
An invertible matrix, when reduced to its reduced row echelon form, becomes the identity matrix.
answered Jan 17 at 15:00 by Mohammad Riazi-Kermani
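This is easy to check with SymPy; the matrix below is just an illustrative example:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],   # twice the first row, so rank(A) = 2 < 3
            [1, 0, 1]])

rref, pivots = A.rref()
print(rref)    # Matrix([[1, 0, 1], [0, 1, 1], [0, 0, 0]]) -- a zero row
print(pivots)  # (0, 1): only two pivot columns, not the identity
```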
This stems from the fact that a mapping $f:V\to V$ cannot possess any left inverse if it is not injective, and it cannot possess any right inverse if it is not surjective:
- if $f$ is not injective, i.e. if $f(u)=f(v)$ for some $u\ne v$, then $f$ cannot have any left inverse, otherwise we would have $u=(f^{-1}\circ f)(u)=f^{-1}(f(u))=f^{-1}(f(v))=(f^{-1}\circ f)(v)=v$, which is a contradiction;
- if $f$ is not surjective, i.e. if there is some member $w$ of $V$ that lies outside $f(V)$, then $f$ cannot possess any right inverse, otherwise we would have $w=f(f^{-1}(w))\in f(V)$, which is a contradiction.
Now, if a square matrix $A$ is rank deficient, its columns are linearly dependent. Therefore $Au=0$ for some nonzero vector $u$. In other words, $A$ maps both $u$ and $0$ to $0$. Hence $A$ does not have any left inverse, because the mapping $x\mapsto Ax$ is not injective. (If you invert back, what should $A^{-1}0$ be? $u$ or $0$?)
Also, as $A$ is rank deficient, its column space is a proper subspace of the ambient space. Hence $A$ does not have any right inverse, because the mapping $x\mapsto Ax$ is not surjective. (If $w$ lies outside the column space of $A$ and it has an inverse image, then $w$ itself is the image of its own inverse image, hence $w$ also lies inside the column space of $A$. How paradoxical!)
answered Jan 17 at 15:58 by user1551
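Both failures can be observed numerically; the rank-one matrix below is a hypothetical example:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # rank 1

# Not injective: a nonzero u with A u = 0, so u and 0 share the image 0.
u = np.array([2.0, -1.0])
print(A @ u)  # [0. 0.]

# Not surjective: w = (0, 1) lies outside the column space of A
# (every column is a multiple of (1, 2)), so A x = w has no solution.
w = np.array([0.0, 1.0])
x, _, _, _ = np.linalg.lstsq(A, w, rcond=None)
print(A @ x)  # only the projection of w onto the column space, not w
```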
The way I learned it was like this:
You should see the determinant as sending an $n \times n$ matrix to the oriented $n$-dimensional volume its column vectors span. If this is zero, the image will be an $(n-1)$-dimensional hyper-surface, thus losing surjectivity into its image; we know injectivity and surjectivity are equivalent for finite-dimensional square matrices, so it can't be invertible.
answered Jan 17 at 15:13 by Aylon Pinto
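In the same spirit, a one-line numerical check, again with an arbitrary rank-deficient matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # parallel columns: degenerate parallelogram

# Zero oriented area (up to floating-point sign/rounding): no volume,
# hence no inverse.
print(np.linalg.det(A))
```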
All functions are surjective onto their image. Onto their codomains, on the other hand... – Arthur, Jan 17 at 15:17
Yes, of course; I meant onto its $n$-dimensional vector space. – Aylon Pinto, Jan 17 at 15:21