A Representation Theory Problem in Putnam Competition
The following was problem B6 of the 1985 Putnam Competition: Suppose $G$ is a finite group (under matrix multiplication) of real $n\times n$ matrices $\{M_i\}$, $1\leq i\leq r$. Suppose that $$\sum_{i=1}^r \operatorname{tr}(M_i)=0.$$ Prove that $$\sum_{i=1}^r M_i=0.$$
Here is an official proof from the committee which I didn't understand:
Lemma: Let $G$ be a finite group of order $r$. Let $\rho\colon G\rightarrow GL(V)$ be a representation of $G$ on some finite-dimensional vector space $V$. Then $$\sum_{g\in G}\operatorname{tr}\rho_g$$ is a non-negative integer divisible by $r$, and it is zero iff $$\sum_{g\in G}\rho_g=0.$$
Proof: Let $\chi_1,\dots,\chi_s$ be the irreducible characters of $G$, and let $\chi=\sum_{i=1}^s a_i\chi_i$ and $\psi=\sum_{i=1}^s b_i\chi_i$ be arbitrary characters. Then by the orthogonality relations for characters, we have
$$\frac{1}{|G|}\sum_{g\in G}\chi(g)\overline{\psi(g)}=\sum_{i=1}^s a_i b_i.$$
Applying this to the character of $\rho$ and the trivial character $\mathbb{1}$ shows that $\frac{1}{|G|}\sum_{g\in G}\operatorname{tr}\rho_g$ equals the multiplicity of $\mathbb{1}$ in $\rho$, which is a non-negative integer.
Now suppose that the matrix $S=\sum_{g\in G}\rho_g$ is non-zero. Choose $v\in V$ with $Sv\neq 0$. The relation $\rho_h S=S$ shows that $Sv$ is fixed by $\rho_h$ for all $h\in G$. In other words, $Sv$ spans a trivial subrepresentation of $\rho$, so the non-negative integer of the previous paragraph is positive. QED
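As a numerical sanity check on the lemma (my own illustration, not part of the quoted proof): the rotation group of order $3$ acting on $\mathbb{R}^2$ has trace sum $2(1+\cos 120^\circ+\cos 240^\circ)=0$, so the lemma predicts $\sum_g \rho_g = 0$; a group with a trivial subrepresentation, by contrast, has trace sum equal to $|G|$ times the multiplicity of $\mathbb{1}$.

```python
import numpy as np

def rotation(theta):
    """Real 2x2 rotation matrix through angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# The cyclic group of order 3 acting on R^2 by rotations.
G = [rotation(2 * np.pi * k / 3) for k in range(3)]

trace_sum = sum(np.trace(M) for M in G)  # 2(1 + cos 120 + cos 240) = 0
S = sum(G)                               # the matrix sum of the rho_g

print(round(trace_sum, 10))  # 0.0 -> the lemma predicts S = 0
print(np.allclose(S, 0))     # True

# A group whose trace sum is nonzero: {I, diag(1,-1)} fixes the x-axis,
# so the trivial representation appears with multiplicity 1.
H = [np.eye(2), np.diag([1.0, -1.0])]
print(sum(np.trace(M) for M in H))  # 2.0 = |H| * (multiplicity of trivial rep)
```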
We now return to the problem at hand. "Unfortunately the $M_i$ do not necessarily define a representation of $G$, since the $M_i$ need not be invertible." Instead we need to apply the lemma to the action of $G$ on $\mathbb{C}^n/K$ for some subspace $K$ ...
I do not understand the sentence in quotation marks. Doesn't the set of $M_i$'s form a group under multiplication? Why need they not be invertible? The above proof is copied from Kedlaya, Poonen, and Vakil's *The William Lowell Putnam Mathematical Competition 1985–2000*. Thanks for helping.
linear-algebra representation-theory contest-math
Wow, this is evil. I would argue that it is asking to be misunderstood, since most reasonable mathematicians would understand "group under matrix multiplication" as "group whose multiplication is matrix multiplication and whose identity is the identity of matrix multiplication". Apparently the problem did not mean to require the identity of the group to be the identity of matrix multiplication, and so you do not know if the matrices are actually invertible as matrices. – darij grinberg, Feb 23 '15 at 3:45

Anyway there is no need to involve orthogonality of group characters here. Enough to observe that $\dfrac{1}{r}\left(M_1+M_2+\ldots+M_r\right)$ is idempotent (because of the $M_i$ forming a group), and the trace of an idempotent matrix equals its rank. – darij grinberg, Feb 23 '15 at 3:47

@whacka: Done, with some more detail. – darij grinberg, Feb 23 '15 at 4:29
asked Feb 23 '15 at 3:40 by Gru
1 Answer
It seems to me that the proposers of this problem went an extra mile to be misunderstood: Most reasonable mathematicians would read "group under matrix multiplication" as "group whose multiplication is matrix multiplication and whose identity is the identity of matrix multiplication". Under this interpretation, the $M_i$ do define a representation of $G$. But apparently the problem did not mean to require the identity of the group to be the identity of matrix multiplication, and so you do not know if the matrices are actually invertible as matrices.
But ambiguity is not a one-player game. Consider $r = 3$, $M_1 = I_n$, $M_2 = \operatorname{diag}\left(1, -1\right)$ and $M_3 = \operatorname{diag}\left(1, -1\right)$. Oh, $M_1, M_2, \ldots, M_r$ are supposed to be distinct? Good to know. I wonder how often this came up on appeal.
Anyway the solution you quoted is overkill. The problem straightforwardly generalizes to matrices over any field of characteristic $0$ instead of real matrices; good luck defining Hermitian forms over arbitrary fields. A solution that generalizes (and is a lot shorter and more elementary than the one in the original post) proceeds as follows (very roughly sketched):
We have
\begin{align}
\left(M_1 + M_2 + \cdots + M_r\right)^2 = \sum_{i, j} M_i M_j = \sum_{k} \sum_{\substack{i, j ; \\ M_i M_j = M_k}} M_k
\end{align}
(since the $M_i$ form a group, so each $M_i M_j$ equals some $M_k$),
where all indices in sums range over $\left\{1,2,\ldots,r\right\}$. Now, for every $1 \leq k \leq r$, there exist precisely $r$ pairs $\left(i, j\right)$ such that $M_k = M_i M_j$ (again since the $M_i$ form a group). Hence, for every $1 \leq k \leq r$, we have
\begin{align}
\sum_{\substack{i, j ; \\ M_i M_j = M_k}} M_k = r M_k .
\end{align}
Thus,
\begin{align}
\left(M_1 + M_2 + \cdots + M_r\right)^2
&= \sum_{k} \underbrace{\sum_{\substack{i, j ; \\ M_i M_j = M_k}} M_k}_{=r M_k} = \sum_{k} r M_k \\
&= r\left(M_1 + M_2 + \cdots + M_r\right) .
\end{align}
This readily yields that $\dfrac{1}{r}\left(M_1 + M_2 + \cdots + M_r\right)$ is an idempotent. But the trace of an idempotent matrix equals its rank (this is a well-known fact), and this particular idempotent matrix $\dfrac{1}{r}\left(M_1 + M_2 + \cdots + M_r\right)$ has trace $0$. How many matrices with rank $0$ are there?
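A quick numerical check of this argument (my own example, not part of the answer): with $E = \operatorname{diag}(1, 0)$, the set $\{E, -E\}$ is a group of order $2$ under matrix multiplication whose identity element $E$ is *not* the identity matrix — exactly the subtlety discussed above. Its traces sum to $0$, and indeed the matrices sum to $0$. A second group, with nonzero sum, illustrates the trace-equals-rank fact for idempotents.

```python
import numpy as np

# A group of NON-invertible matrices: the identity element is E = diag(1, 0),
# not the 2x2 identity matrix (the subtlety in the problem statement).
E = np.diag([1.0, 0.0])
G = [E, -E]  # closed: E*E = E, E*(-E) = -E, (-E)*(-E) = E

print(sum(np.trace(M) for M in G))  # 0.0: the trace hypothesis holds
print(np.allclose(sum(G), 0))       # True: the matrices sum to 0

# The idempotent-trace argument on a group whose sum is NOT zero:
H = [np.eye(2), np.diag([1.0, -1.0])]
P = sum(H) / len(H)                  # the averaged sum (1/r) * sum M_i
print(np.allclose(P @ P, P))         # True: P is idempotent
print(np.trace(P))                   # 1.0, which equals...
print(np.linalg.matrix_rank(P))      # ...the rank, 1
```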
answered Feb 23 '15 at 4:27, edited Nov 20 '18 at 20:48 – darij grinberg