Orthogonal matrices only defined for standard inner product?
$\newcommand{\tp}[1]{#1^\mathrm{T}} \newcommand{\Id}{\mathrm{Id}} \newcommand{\n}{\{1,\ldots,n\}} \newcommand{\siff}{\quad\Leftrightarrow\quad} \newcommand{\ijth}[2][\tp{Q}Q]{[#1]_{#2}} \newcommand{\K}{\mathbb{K}}$
Let orthogonal matrices be defined as follows.

A matrix $Q\in\mathcal{M}_{m\times n}(\mathbb{K})$, where $\mathbb{K}$ is a field, is said to be orthogonal if $$Q^\mathrm{T}Q = \mathrm{Id}_n.$$
I'm not sure I'm fully understanding the following fact:
A matrix $Q\in\mathcal{M}_{m\times n}(\K)$ is orthogonal iff the columns of $Q$ form an orthonormal set in $\K^m$.
Proof
Let $q_i$ denote the $i$-th column of $Q$ for all $i\in\n$, and let $\ijth[A]{ij}$ denote the $(i,j)$-th element of $A$ for any matrix $A$. Then $Q$ being an orthogonal matrix is equivalent to $$\tp{Q}Q = \Id_n \siff \ijth{ij} = \delta_{ij}\,,$$ where $\delta_{ij}$ is the Kronecker delta. On the other hand, by the definition of matrix multiplication, $$\ijth{ij} = \sum_{k=1}^{m} \ijth[\tp{Q}]{ik}\ijth[{Q}]{kj} = \sum_{k=1}^{m} \ijth[Q]{ki}\ijth[{Q}]{kj} \stackrel{\color{red}*}{=} \langle q_i, q_j\rangle\,.$$ Thus $Q$ is orthogonal iff $$\langle q_i, q_j\rangle = \delta_{ij} \qquad\forall\,(i,j)\in\n\times\n\,,$$ which is true iff $(q_i)_{i\in\n}$ form an orthonormal set.
In particular, I'm suspicious of the equality marked with the red asterisk. Isn't that true only for the standard inner product (i.e., the dot product), defined as $\langle u, v \rangle = \sum_i u_i v_i$? So, are orthogonal matrices only treated in the context of the standard inner product? If so, is there a "generalization" of orthogonal matrices for general inner product spaces?
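For what it's worth, here is a quick numerical sanity check of the equivalence for the standard inner product — a minimal sketch assuming NumPy, where the reduced QR factorization is used only as a convenient source of orthonormal columns:

```python
# Minimal numerical sketch of the equivalence above (assumes NumPy).
import numpy as np

rng = np.random.default_rng(0)

# Reduced QR of a random 5x3 matrix gives Q with orthonormal columns.
m, n = 5, 3
Q, _ = np.linalg.qr(rng.standard_normal((m, n)))

# Q^T Q = Id_n ...
assert np.allclose(Q.T @ Q, np.eye(n))

# ... and, entrywise, [Q^T Q]_{ij} is exactly the dot product <q_i, q_j>.
for i in range(n):
    for j in range(n):
        assert np.isclose((Q.T @ Q)[i, j], Q[:, i] @ Q[:, j])
```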
linear-algebra inner-product-space orthogonal-matrices
asked Jan 9 at 12:12 by Anakhand
2 Answers
It might be instructive here to start with the corresponding invariant (i.e., basis-free) description of orthogonality.
On a finite-dimensional inner product space $(\Bbb V, \langle\,\cdot\,,\,\cdot\,\rangle)$, a linear transformation $T : \Bbb V \to \Bbb V$ is said to be orthogonal if it preserves the inner product, that is, if $\langle T({\bf x}), T({\bf y}) \rangle = \langle {\bf x}, {\bf y}\rangle$ for all ${\bf x}, {\bf y} \in \Bbb V$. Any basis $({\bf e}_a)$ of $\Bbb V$ determines matrix representations $[T]$ of $T$ and $[\Sigma]$ of the inner product: these are characterized by
$$T({\bf e}_a) = \sum_b [T]_{ba} {\bf e}_b, \qquad
[\Sigma]_{ab} = \langle {\bf e}_a, {\bf e}_b \rangle .$$
Unwinding all of this, we see that $T$ is orthogonal iff
$$[T]^{\top} [\Sigma] [T] = [\Sigma] .$$
In the special case that the basis $({\bf e}_a)$ is orthonormal, substituting $[\Sigma] = I$ simplifies the condition to the familiar definition of an orthogonal matrix:
$$[T]^{\top} [T] = I .$$
Over a real inner product space we can always choose an orthonormal basis, so the more general construction might seem like an unnecessary formalism. But such bases are not always the most convenient in applications, and if we extend our attention to nondegenerate symmetric bilinear forms (that is, drop the positive-definiteness condition from the definition of an inner product), orthonormal bases no longer exist, but we still care about the notion of orthogonality.
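To see the general condition in action, here is a minimal numerical sketch (assuming NumPy; the matrices $\Sigma$ and $U$ below are made up for illustration). It constructs a $\Sigma$-orthogonal matrix that is not orthogonal in the usual sense:

```python
# Sketch: a matrix T with T^T Sigma T = Sigma but T^T T != I (assumes NumPy).
import numpy as np

rng = np.random.default_rng(1)

# A symmetric positive-definite Gram matrix Sigma, i.e. a non-standard
# inner product <x, y> = x^T Sigma y on R^3.
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + 3 * np.eye(3)

# Cholesky: Sigma = L L^T.  If U is orthogonal in the usual sense, then
# T = L^{-T} U L^T satisfies T^T Sigma T = Sigma, so T preserves <.,.>.
L = np.linalg.cholesky(Sigma)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # ordinary orthogonal matrix
T = np.linalg.solve(L.T, U @ L.T)                 # L^{-T} (U L^T)

assert np.allclose(T.T @ Sigma @ T, Sigma)  # T is Sigma-orthogonal ...
assert not np.allclose(T.T @ T, np.eye(3))  # ... but (generically) not orthogonal
```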
answered Jan 9 at 12:48 by Travis
Yep, just as you expected. However, all of this heavily depends on your choice of basis. Since every basis $B$ of a vector space $V$ is really a choice of isomorphism $\varphi_B : k^{\dim(V)} \to V$, we may interpret any homomorphism $f: V \to W$ as a matrix (heavily dependent on the choice of bases). If you pick a basis that is orthonormal w.r.t. an arbitrary inner product $\langle\,\cdot\,,\,\cdot\,\rangle_V$, this isomorphism even becomes compatible with the inner product, i.e. $\langle \varphi_B(x), \varphi_B(y)\rangle_V = \langle x, y \rangle_{\mathrm{eucl}}$, so you can carry over all the constructions. In general, however, you can define an orthogonal matrix as one that preserves the inner product; using the above identification, this becomes equivalent to the definition you know.
However, I prefer doing the general case first (i.e. preservation of the inner product) and then specializing by choosing a basis.
I hope this is a satisfactory answer.
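A minimal sketch of this compatibility, assuming NumPy (the Gram matrix below is an arbitrary example, not something from the answer): a Cholesky factor yields a basis orthonormal w.r.t. $\langle\,\cdot\,,\,\cdot\,\rangle_V$, and the resulting $\varphi_B$ carries the Euclidean inner product over to $\langle\,\cdot\,,\,\cdot\,\rangle_V$.

```python
# Sketch: a basis orthonormal w.r.t. <x, y>_V = x^T Sigma y (assumes NumPy).
import numpy as np

rng = np.random.default_rng(2)

# Non-standard inner product <x, y>_V = x^T Sigma y on V = R^3.
A = rng.standard_normal((3, 3))
Sigma = A @ A.T + 3 * np.eye(3)

# Sigma = L L^T.  The columns of B = L^{-T} form a basis of V that is
# orthonormal w.r.t. <.,.>_V:  B^T Sigma B = I.
L = np.linalg.cholesky(Sigma)
B = np.linalg.inv(L.T)
assert np.allclose(B.T @ Sigma @ B, np.eye(3))

# phi_B(x) = B x then carries the Euclidean inner product to <.,.>_V.
x, y = rng.standard_normal(3), rng.standard_normal(3)
assert np.isclose((B @ x) @ Sigma @ (B @ y), x @ y)
```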
answered Jan 9 at 12:26 by Enkidu