Prove that the rank of $(1-I)$ is $n$
The rank of $(1-I_n)$, where $1$ is the $n \times n$ all-ones matrix and $I_n$ is the $n \times n$ identity matrix, seems to be $n$. How can this be proved concisely?
linear-algebra matrices
asked Jun 20 '13 at 18:02 by ranky
edited Jun 20 '13 at 18:57 by Davide Giraudo
Well, if you look at an example: when you subtract a matrix filled with ones from a matrix with $1$s going down the diagonal, the zero entry in each column sits one position lower than in the previous column, from column $1$ to column $N$, which makes the columns linearly independent.
– user60887, Jun 20 '13 at 18:11
Maybe the easiest route is to take as given that $r(A + B) \leq r(A) + r(B)$ (subadditivity of rank). Choosing $A = 1$ and $B = -(1-I)$ then gives $r(I) \leq r(1) + r(-(1-I))$, i.e. $n \leq 1 + r(-(1-I))$, and hence $r(1-I) = r(-(1-I)) \geq n-1$.
– ranky, Jun 20 '13 at 18:24
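A quick numerical check of the claim for small $n$ (a minimal sketch; the use of NumPy is an assumption, not something from the thread):

```python
import numpy as np

# Check rank(J - I_n) for small n, where J is the all-ones matrix
# (written "1" in the question).
for n in range(1, 8):
    J = np.ones((n, n))
    print(n, np.linalg.matrix_rank(J - np.eye(n)))  # rank 0 for n = 1, rank n for every n >= 2
```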
4 Answers
Let's call $J$ the matrix with all coefficients equal to $1$. Its eigenvalues are $n$ and $0$: indeed, $J$ has rank $1$, so $0$ is an eigenvalue of multiplicity $n-1$, and clearly $Jv = nv$ where $v$ is the all-ones vector.
So, for $n > 1$, $1$ is not a root of the characteristic polynomial of $J$,
$$
\det(J - XI_n),
$$
which means that $\det(J - I_n) \neq 0$; hence $J - I_n$ is invertible and has rank $n$.
Of course we assume $n > 1$; otherwise the assertion is false.
answered Jun 20 '13 at 18:07 by egreg
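A minimal NumPy sketch of this eigenvalue argument (NumPy and the choice $n = 5$ are illustrative assumptions):

```python
import numpy as np

n = 5
J = np.ones((n, n))

# J is symmetric; its eigenvalues are n (once) and 0 (multiplicity n - 1).
print(np.linalg.eigvalsh(J))                 # approximately [0, 0, 0, 0, 5]

# Since 1 is not an eigenvalue of J (for n > 1), det(J - I_n) != 0,
# so J - I_n is invertible and has rank n.
print(np.linalg.det(J - np.eye(n)))          # (n - 1)(-1)^(n - 1) = 4 here
print(np.linalg.matrix_rank(J - np.eye(n)))  # 5
```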
If the rank were not $n$, there would exist a non-zero vector $x$ such that
$$(ee^T - I)x = 0,$$
i.e.,
$$\sum_{\substack{i=1 \\ i \neq j}}^{n} x_i = 0$$
for all $j \in \{1,2,\ldots,n\}$. Can this be true for $x \neq 0$?
Also, as an aside, the inverse of $(ee^T - I)$ is given by the Sherman–Morrison formula (or the more general Woodbury formula) as
$$-\left(I + \dfrac{ee^T}{1-e^Te}\right) = -\left(I - \dfrac{ee^T}{n-1}\right).$$
In general, if $A$ is invertible, then $A + uv^T$ is invertible when $1 + v^TA^{-1}u \neq 0$. In your case, this corresponds to the condition $n-1 \neq 0$, i.e., $n \neq 1$.
answered Jun 20 '13 at 18:08 by user17762
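A small NumPy check of the stated Sherman–Morrison inverse (the size $n = 6$ and the variable names are illustrative choices):

```python
import numpy as np

n = 6
e = np.ones((n, 1))              # the all-ones column vector
A = e @ e.T - np.eye(n)          # e e^T - I

# Sherman-Morrison gives A^{-1} = -(I - e e^T / (n - 1)) for n != 1.
A_inv = -(np.eye(n) - (e @ e.T) / (n - 1))

print(np.allclose(A @ A_inv, np.eye(n)))     # True
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```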
If $A$ is any $n \times n$ matrix of rank $1$, then it has an eigenspace for $0$ of dimension $n-1$, and the eigenvalue $0$ has at least that (algebraic) multiplicity; the final eigenvalue is $\operatorname{tr} A$ (as that is the sum of the eigenvalues). Therefore $A - \lambda I$ with $\lambda \neq 0$ is invertible unless $\lambda = \operatorname{tr} A$, in which case $\operatorname{rk}(A - \lambda I) = n-1$. In your case, if $A$ is the matrix you confusingly call $1$, you have $\operatorname{tr} A = n$, so for $\lambda = 1$ you get that $A - I$ is invertible unless $n = 1$, in which case it has rank $n-1 = 0$.
answered Jun 20 '13 at 19:26 by Marc van Leeuwen, edited Jun 20 '13 at 19:34
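A quick NumPy illustration of the general rank-one statement, using an arbitrary (but deterministic) rank-one matrix rather than the all-ones one; the particular matrix here is just an example:

```python
import numpy as np

n = 5
u = np.arange(1.0, n + 1)            # (1, 2, ..., n)
A = np.outer(u, np.ones(n))          # a rank-1 matrix; the question's case is np.outer(ones, ones)

t = np.trace(A)                      # = 15, the only possibly nonzero eigenvalue of a rank-1 matrix

print(np.linalg.matrix_rank(A - t * np.eye(n)))    # n - 1 = 4, since lambda = tr(A)
print(np.linalg.matrix_rank(A - 1.0 * np.eye(n)))  # n = 5, since lambda = 1 is neither 0 nor tr(A)
```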
Hint: an $n \times n$ matrix $A = [a_{ij}]$ has full column rank ($\operatorname{rank}(A) = n$) if and only if it is invertible, and
$$\det(1-I_n) = (n-1)(-1)^{n-1} \neq 0 \quad (n > 1).$$
More generally, use the row and column operations
$$C_1 - C_i, \qquad R_1 + R_i \qquad (2 \le i \le n),$$
where $R$ stands for row and $C$ for column, to compute
$$\det(B) = \det\begin{pmatrix}
r & a & a & \cdots & a \\
a & r & a & \cdots & a \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
a & a & a & \cdots & r
\end{pmatrix} = (r+(n-1)a)(r-a)^{n-1};$$
the matrix $1 - I_n$ is the case $r = 0$, $a = 1$.
answered Jun 20 '13 at 18:08 by M.H, edited Jan 30 at 19:36 by Martin Sleziak
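The closed-form determinant can be checked symbolically for small $n$; a minimal sketch, assuming SymPy is available:

```python
import sympy as sp

r, a = sp.symbols('r a')

for n in range(2, 6):
    # Matrix with r on the diagonal and a everywhere else.
    B = sp.Matrix(n, n, lambda i, j: r if i == j else a)
    closed_form = (r + (n - 1) * a) * (r - a) ** (n - 1)
    assert sp.expand(B.det() - closed_form) == 0

    # The question's matrix 1 - I_n is the case r = 0, a = 1.
    assert closed_form.subs({r: 0, a: 1}) == (n - 1) * (-1) ** (n - 1)
```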