Can you take the dot product of a column vector and a row vector (i.e. a vector and a dual vector)?
I recently learned the definition of work as a path integral, $W = \int \mathbf F \cdot d\mathbf r$. Here $\mathbf F$ is a vector field and $d\mathbf r$ is a small segment of the path $\mathbf r$ being traversed. However, $d\mathbf r$ can also be interpreted as a covector field, which assigns a covector (dual vector) to every point. So, essentially, the line integral sums up the dot products between the force vectors and the dual vectors of the path. My question, then, is how you can take the dot product of a vector and a dual vector when the dot product is defined as the transpose of the first vector matrix-multiplied by the second vector. It seems this cannot be done, because you would be matrix-multiplying a $1 \times n$ matrix by another $1 \times n$ matrix. In short: can one take the dot product of a vector and a dual vector?
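To make the shape issue concrete, here is a small NumPy sketch of what I mean (the numbers are made up, purely for illustration):

```python
import numpy as np

# Force at a point, written as a column vector: an element of K^(3 x 1).
F = np.array([[1.0], [2.0], [3.0]])

# A displacement covector dr at the same point, written as a row vector: K^(1 x 3).
dr = np.array([[0.1, 0.0, 0.2]])

# Dot product of two column vectors: transpose the first, then matrix-multiply.
v = np.array([[1.0], [0.0], [1.0]])
print(F.T @ v)     # (1, 3) @ (3, 1) -> a 1x1 matrix holding the scalar 4.0

# F.T and dr are both (1, 3), so F.T @ dr is not a valid matrix product:
# print(F.T @ dr)  # would raise a ValueError because of the shape mismatch

# The covector acts on the vector directly, with no transpose at all:
print(dr @ F)      # (1, 3) @ (3, 1) -> the scalar dr(F) = 0.1*1 + 0.0*2 + 0.2*3 = 0.7
```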
vectors tensors dual-spaces
asked Jan 17 at 23:30 by John Wick (172)
I thought a dot product was a vector operation that works as long as the two vectors have the same number of entries. I didn't think it had anything to do with matrices or whether they're row/column vectors... right?
– Zubin Mukerjee, Jan 17 at 23:37

As it happens, and most likely not by accident, the dot product of two vectors $v$ and $w$, written $v \cdot w$, is equivalent to $v^T w$ ($v$ transposed, then matrix-multiplied by $w$).
– John Wick, Jan 17 at 23:42

Just dub it "pairing".
– janmarqz, Jan 20 at 16:07
1 Answer
The dual space consists of linear functionals: functions that eat a vector and spit out a scalar. For a finite-dimensional vector space over the field $\mathbb K$ and a suitable choice of bases for it and its dual space, evaluating the covector (a.k.a. dual vector, a.k.a. functional) $\mathbf\alpha$ at the vector $\mathbf v$ can be expressed as the product of a row vector, an element of $\mathbb K^{1\times n}$, with a column vector, an element of $\mathbb K^{n\times 1}$. You could call that their dot product, but I myself wouldn't. I generally reserve that term for a product of two elements of the same space, whether it's $\mathbb K^{n\times 1}$ or $\mathbb K^{1\times n}$, which happens to be expressible as a matrix product that looks identical to $\mathbf\alpha(\mathbf v)$.
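If it helps, here is a small NumPy sketch of the distinction (the entries of $\mathbf\alpha$, $\mathbf v$ and $\mathbf w$ are arbitrary, chosen just for illustration):

```python
import numpy as np

alpha = np.array([[2.0, -1.0, 0.5]])     # covector: an element of K^(1 x n), here n = 3
v = np.array([[1.0], [4.0], [2.0]])      # vector: an element of K^(n x 1)
w = np.array([[0.0], [3.0], [1.0]])      # another vector in the same space K^(n x 1)

# Evaluating the functional alpha at v: a (1 x n) @ (n x 1) product, no transpose needed.
print(alpha @ v)   # [[-1.]]  -> the scalar alpha(v) = 2*1 - 1*4 + 0.5*2

# Dot product of two elements of the same space K^(n x 1): transpose the first one.
print(v.T @ w)     # [[14.]]  -> the same-looking matrix product, but v had to be transposed
```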
answered Jan 18 at 0:10 by amd (30.6k)