$SP_{2n}(\mathbb{R})$ acts transitively on $\mathbb{R}^{2n}$
I am trying to prove that $SP_{2n}(\mathbb{R})$ acts transitively on $\mathbb{R}^{2n}$ by $(A,x)\mapsto Ax$, where
$SP_{2n}(\mathbb{R})=\{A\in GL_{2n}(\mathbb{R}) \mid A^{T}J_{2n}A=J_{2n}\}$ is the real symplectic group.
To do so, I need to show that if $x,y\in\mathbb{R}^{2n}$ are nonzero, then there exists $A\in SP_{2n}(\mathbb{R})$ such that $Ax=y$ (the origin is fixed by every $A$, so the transitivity is really on $\mathbb{R}^{2n}\setminus\{0\}$).
In the $n=1$ case I was able to explicitly construct such an $A$ by solving $Ax=y$, but I have been unable to scale up and generalise the calculation. Any hints would be much appreciated!
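For concreteness, one possible explicit construction in the $n=1$ case (just an illustrative sketch, not necessarily the calculation referred to above): since $A^TJ_2A = (\det A)\,J_2$ for any $2\times 2$ matrix, $SP_2(\mathbb{R}) = SL_2(\mathbb{R})$, so it suffices to find a determinant-one matrix sending $x$ to $y$. For nonzero $x=(a,b)^T$ and $y=(c,d)^T$, take
$$A = \begin{pmatrix} c & -d/(c^2+d^2) \\ d & c/(c^2+d^2) \end{pmatrix}\begin{pmatrix} a & -b/(a^2+b^2) \\ b & a/(a^2+b^2) \end{pmatrix}^{-1};$$
both factors have determinant $1$, and $A$ sends $x$ to $y$ because the first columns of the two factors are exactly $x$ and $y$.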
abstract-algebra group-theory lie-groups symplectic-linear-algebra
asked Jan 24 at 3:46 by CoffeeCrow (edited Jan 24 at 7:13)
1 Answer
Via the symplectic Gram-Schmidt process, we can extend any nonzero vector $v$ to a symplectic basis $\{x_1 = v, x_2,\ldots, x_n, y_1,\ldots, y_n\}$. Given another symplectic basis $\{x_1' = v', x_2',\ldots,x_n',y_1',\ldots,y_n'\}$, the linear transformation $x_i \mapsto x_i',\ y_i\mapsto y_i'$ is symplectic and sends $v \mapsto v'$. It preserves the symplectic form because both bases are symplectic.
In case it's unfamiliar, the symplectic Gram-Schmidt process works like this:
Start with any non-zero vector $x_1 = v$. Then there exists a vector $w$ such that $(x_1,w)\neq 0$, because of non-degeneracy (and such an element must be independent from $x_1$). Now rescale $y_1 = \lambda w$ such that $(x_1,y_1) = -1$, e.g. $\lambda = -1/(x_1,w)$.
Given a partial symplectic basis $\{x_1,\ldots,x_k,y_1,\ldots,y_k\}$, next pick any vector $u$ independent of $x_1,y_1,\ldots,x_k,y_k$. If you set $u' := u + (x_1,u)y_1$ then $$(x_1,u') = (x_1,u) + (x_1,u)(-1) = 0.$$ Similarly, setting $u'' := u' + (u',y_1)x_1$ we still have $(x_1,u'') = 0$, and also $(u'', y_1) = (u',y_1) + (u',y_1)(x_1,y_1) = 0$. Keep going until you kill off all the pairings: $$x_{k+1} = u + \sum_{i\leq k} \bigl((x_i,u)\,y_i + (u,y_i)\,x_i\bigr).$$
Next, since $(x_{k+1},t) = 0$ for $t = x_1,\ldots, x_k,y_1,\ldots,y_k$, by nondegeneracy there must be some independent vector $q$ such that $(x_{k+1},q) \neq 0$. First remove the components of $q$ which pair nontrivially with $x_i,y_i$ for $i \leq k$: $$q' := q + \sum_{i\leq k} \bigl((x_i,q)\,y_i + (q,y_i)\,x_i\bigr).$$ This doesn't change the pairing with $x_{k+1}$, so $(x_{k+1},q') = (x_{k+1},q) \neq 0$, and we can rescale $y_{k+1} := \lambda q'$ where $\lambda = -1/(x_{k+1},q)$. We now have a symplectic basis $\{x_1,\ldots,x_{k+1},y_1,\ldots,y_{k+1}\}$ spanning a $2(k+1)$-dimensional subspace.
So, you can see it is possible to extend any initial vector $v$ to a symplectic basis, which is the only fact we used above.
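To make the recursion concrete, here is a short numerical sketch in Python/NumPy (an editorial illustration, not part of the original argument). The helper names `omega` and `symplectic_gram_schmidt` are made up for this sketch, and candidate vectors are simply drawn from the standard basis - nondegeneracy guarantees a usable one always exists. It follows the conventions above: $(v,w) = v^TJw$, with each pair normalised so that $(x_i,y_i) = -1$.

```python
import numpy as np

def omega(J, v, w):
    """The symplectic form (v, w) = v^T J w."""
    return v @ J @ w

def symplectic_gram_schmidt(v, J):
    """Extend a nonzero v in R^{2n} to a symplectic basis
    {x_1 = v, ..., x_n, y_1, ..., y_n} with (x_i, y_j) = -delta_ij
    and (x_i, x_j) = (y_i, y_j) = 0, following the steps above."""
    dim = len(v)
    n = dim // 2
    xs, ys = [], []

    def project_out(u):
        # u -> u + sum_i (x_i, u) y_i + (u, y_i) x_i : removes the pairings
        # with the part of the basis built so far (uses (x_i, y_i) = -1).
        for xi, yi in zip(xs, ys):
            u = u + omega(J, xi, u) * yi + omega(J, u, yi) * xi
        return u

    x = np.asarray(v, dtype=float)
    for k in range(n):
        xs.append(x)
        # Nondegeneracy: some standard basis vector pairs nontrivially with x;
        # take the one with the largest pairing, then strip its x_i/y_i parts.
        q = project_out(max(np.eye(dim), key=lambda e: abs(omega(J, x, e))))
        ys.append(-q / omega(J, x, q))      # rescale so that (x_{k+1}, y_{k+1}) = -1
        if k == n - 1:
            break
        # Pick a standard basis vector outside the current span and project it.
        span = np.column_stack(xs + ys)
        e_new = next(e for e in np.eye(dim)
                     if np.linalg.matrix_rank(np.column_stack([span, e]))
                     > span.shape[1])
        x = project_out(e_new)
    return xs, ys
```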
ADDED: (Elaborating on why the above linear transformation is given by a symplectic matrix.)
Responding to your comment: I'm not sure which of points 1-3 is most helpful, so hopefully the format makes it easy to skip anything that isn't useful.
1. Why the above transformation preserves the symplectic form.
Any linear transformation sending a symplectic basis to another symplectic basis is symplectic; let's explain why. A symplectic basis has the property $(x_i,y_j) = -\delta_{ij}$ and $(x_i,x_j) = (y_i,y_j) = 0$. So if $x_i,y_i \mapsto x_i',y_i'$ via a linear transformation $C$, and both are symplectic bases, then for instance $$(x_i,y_j) = -\delta_{ij} = (x_i', y_j') = (Cx_i, Cy_j),$$ similarly $$(y_i, y_j) = 0 = (y_i',y_j') = (Cy_i, Cy_j),$$ and likewise $(Cx_i,Cx_j) = (x_i,x_j)$. So the form is preserved: by bilinearity, checking on a basis is enough to conclude $(Cv,Cw) = (v,w)$ for all $v,w$. (A quick numerical sanity check is sketched below.)
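The promised sanity check (an editorial sketch reusing the hypothetical `symplectic_gram_schmidt` helper above, with random vectors purely for illustration): build one symplectic basis starting from $x$ and another starting from $y$, form the change-of-basis matrix $C$, and verify both $Cx = y$ and $C^TJC = J$.

```python
import numpy as np

n = 3
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])

rng = np.random.default_rng(0)
x, y = rng.standard_normal(2 * n), rng.standard_normal(2 * n)

# Columns are x_1, ..., x_n, y_1, ..., y_n for each basis.
xs1, ys1 = symplectic_gram_schmidt(x, J)
xs2, ys2 = symplectic_gram_schmidt(y, J)
B1, B2 = np.column_stack(xs1 + ys1), np.column_stack(xs2 + ys2)

C = B2 @ np.linalg.inv(B1)   # sends x_i -> x_i' and y_i -> y_i', hence x -> y

print(np.allclose(C @ x, y))         # True
print(np.allclose(C.T @ J @ C, J))   # True: C is a symplectic matrix
```

This is exactly the basis-checking argument: both bases have the same pairing table, so $B_1^TJB_1 = B_2^TJB_2$, and that identity rearranges to $C^TJC = J$.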
2. Why preserving the standard symplectic form is the same as being represented by a symplectic matrix, with respect to the standard symplectic basis.
Your working definition of a symplectic matrix is a matrix $A$ such that $A^TJA = J$, where $J = \begin{pmatrix}0&I\\-I&0\end{pmatrix}$. First I'll recall why that is the same as saying the linear transformation corresponding to $A$ preserves the standard symplectic form. The matrix $J$ is the Gram matrix (Gramian matrix) of the standard symplectic form on $\mathbb{R}^{2n}$, meaning the form is given by $$(v,w) = v^TJw.$$ (In general, any bilinear form can be written as $(v,w) = v^TMw$ for some matrix $M$.) The matrices which preserve a form $(\cdot,\cdot)$ are those $C$ with $(v,w) = (Cv,Cw)$ for all $v,w$. Using the Gram matrix of the standard symplectic form, this equation reads $$v^TJw = v^TC^TJCw,$$ and since it holds for all $v,w$ it implies $J = C^TJC$, which is exactly the definition of a symplectic matrix above.
So, the basis-invariant definition of $SP$ is the automorphism group of a symplectic vector space. Above we have constructed an invertible linear transformation which preserves the form, thus an automorphism of a symplectic vector space. By the above discussion, if you write this matrix down with respect to the standard symplectic basis, it will be a symplectic matrix.
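Here is a minimal numerical illustration of this equivalence (again an editorial sketch; block matrices $\begin{pmatrix}I&S\\0&I\end{pmatrix}$ with $S$ symmetric are just one convenient family of symplectic matrices, and the random $v,w$ are only for the check):

```python
import numpy as np

n = 2
J = np.block([[np.zeros((n, n)), np.eye(n)],
              [-np.eye(n), np.zeros((n, n))]])

S = np.array([[1.0, 2.0],
              [2.0, -3.0]])                 # any symmetric S works here
C = np.block([[np.eye(n), S],
              [np.zeros((n, n)), np.eye(n)]])

rng = np.random.default_rng(1)
v, w = rng.standard_normal(2 * n), rng.standard_normal(2 * n)

print(np.allclose(C.T @ J @ C, J))                    # matrix condition C^T J C = J
print(np.isclose(v @ J @ w, (C @ v) @ J @ (C @ w)))   # form preserved: (Cv, Cw) = (v, w)
```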
3. Why we found a symplectic transformation instead of a symplectic matrix.
The reason we didn't do that is that the construction above isn't very constructive. Every time we say "the form is nondegenerate, so we can find some vector pairing nontrivially with $x_k$", we are being nonconstructive: in general there are many choices, leading to many different linear transformations, all of which send $v\mapsto v'$. If you wanted an explicit matrix you would have to make these choices explicitly, which would be fairly tedious, since they depend on the initial pair of vectors $v,v'$ (or, in the OP's notation, $x,y$).
answered Jan 24 at 5:42 by Ben (edited Jan 24 at 10:23)
Yes, but the definition of $SP_{2n}$ already presumes you've done this - it's the set of matrices preserving the form. If you pick a different form it will be a different set of matrices. Usually one takes the form for which the standard basis is symplectic. – Ben, Jan 24 at 8:00

Okay, that makes sense. So we have two symplectic bases, say $B$ and $B'$, with $v\in B$ and $v'\in B'$, and we can write $Cv=v'$ where $C$ is the change-of-basis matrix between $B$ and $B'$. So if $C$ is symplectic we are done. You've mentioned that this transformation is symplectic because the bases are symplectic, but could you possibly elaborate on why this is so? Sorry if this is a bit trivial; this is the first time I've dealt with the symplectic group. – CoffeeCrow, Jan 24 at 8:46

@CoffeeCrow Sure, no problem - unfortunately it got a little long-winded, though. Let me know if you can't find what you're looking for! – Ben, Jan 24 at 10:31

@CoffeeCrow You're right that it's a bit trivial, but you should compare it with the situation of an inner product: any linear transformation sending an orthonormal basis to an orthonormal basis is an orthogonal matrix. – Ben, Jan 24 at 10:35

Thanks, that explanation helps a lot! – CoffeeCrow, Jan 27 at 13:19