Integral of differences of a vector
I have a vector function $f: \mathbb{R}^n \to \mathbb{R}^n$ defined componentwise by
$$
f_i(a) = \sum_{j=1}^n \sin(a_i - a_j),
$$
which I want to integrate from ${\bf \alpha}^0$ to ${\bf \alpha}^1$, where ${\bf \alpha}^k = [\alpha^k_1, \ldots, \alpha^k_n]$ for $k \in \{0,1\}$.
So the problem looks like
$$
\int_{{\bf \alpha}^0}^{{\bf \alpha}^1} f(a)^{\top} \, {\rm d}a.
$$
I thought that I could integrate as below,
$$
\int_{\alpha^0}^{\alpha^1} \sum_{i=1}^n \left\{ \sum_{j=1}^n \sin(a_i - a_j)\right\} {\rm d}a_i,
$$
by expanding the inner product in the integrand. I think that I can then write the integral as
$$
\int_{(\alpha_1^0, \ldots, \alpha_n^0)}^{(\alpha_1^1, \ldots, \alpha_n^1)} \sum_{i=1}^n \left\{ \sum_{j=1}^n \sin(a_i - a_j)\right\} {\rm d}a_i = \sum_{i=1}^n \int_{\hat{\alpha}_i^0}^{\hat{\alpha}_i^1} \left\{ \sum_{j=1}^n \sin(a_i - a_j)\right\} {\rm d}a_i,
$$
where $\hat{\alpha}_i^0$ means that every component $a_j$ for $j \neq i$ is held fixed at $\alpha_j^0$. I think this would give
$$
\sum_{i=1}^n \sum_{j=1}^n \left[ \cos(\alpha_i^1 - \alpha_j^1) - \cos(\alpha_i^0 - \alpha_j^0) \right].
$$
Is this correct?
calculus integration
The problem here is in the original expression, which is not well-defined. You have $a_i$ as the dummy variable for a definite integral (in which case it should disappear), but then you're expecting to be able to sum over $i$ after everything else is done. I think if you provide more context, we might be able to help you sort it out.
– Adrian Keister
Jan 28 at 19:06
@AdrianKeister, thank you, I added the full problem.
– dunno
Jan 28 at 19:20
@AdrianKeister, I guess there is a third summation over $k$ because of the inner product?
– dunno
Jan 28 at 19:30
It really depends on the correct interpretation of the original integral. If it's a dot product, then each differential multiplies one component of $f$ that itself depends on all the $a_i$; that's problematic. I don't see how you're going to get a single number out of that unless it's a line integral. If it's a line integral, then parametrization is the way to go, assuming we have a well-defined one we can use. Otherwise, if the integral is really more of a volume integral, then every component of $f$ gets multiplied by all the differentials: you can get a single number out of that.
– Adrian Keister
Jan 29 at 2:17
All right! I think I've got it. Please check over my answer for errors.
– Adrian Keister
Jan 29 at 13:50
edited Jan 28 at 20:19
asked Jan 28 at 18:59
dunno
1038
1 Answer
We are asked to compute the line integral
$$\int_{\alpha^0}^{\alpha^1}f(a)\cdot da, $$
where
$$f_i(a)=\sum_{j=1}^{n}\sin(a_i-a_j).$$
Ideally, this line integral is path-independent. We would need $f=\nabla g$ for some scalar field $g(a).$ That is, we would need
$$\sum_{j=1}^{n}\sin(a_i-a_j)=\frac{\partial}{\partial a_i}\,g(a).$$
This would force
$$\int\sum_{j=1}^{n}\sin(a_i-a_j)\,da_i=g(a),$$
or
$$g(a)=-\sum_{j=1}^{n}\cos(a_i-a_j).$$
But this needs to be true for all the $a_i,$ so let's modify this to
$$g(a)=-\sum_{i=1}^n\sum_{j=1}^n\cos(a_i-a_j).$$
When $i=j,$ we're going to pick up a number of $-1$'s, but that constant offset is immaterial. The point is that if we differentiate this $g(a)$ with respect to $a_i,$ we'll get $f_i.$ Let's double-check that this works by computing:
\begin{align*}
\frac{\partial g(a)}{\partial a_1}&=-\frac{\partial}{\partial a_1}\sum_{i=1}^n\sum_{j=1}^n\cos(a_i-a_j)\\
&=-\sum_{i=1}^n\sum_{j=1}^n\frac{\partial}{\partial a_1}\,\cos(a_i-a_j).
\end{align*}
Now if neither $i$ nor $j$ is $1,$ the derivative annihilates the $\cos$. So which terms have $a_1$ in them? If $i=1$ and $j\not=1,$ those will contribute. Also, if $i\not=1$ but $j=1,$ those will also contribute. If $i=j,$ then, as mentioned before, the term is $\cos(0)=1,$ which disappears under differentiation. Therefore, the expression above becomes
\begin{align*}
\frac{\partial g(a)}{\partial a_1}&=
-\sum_{j=2}^n\frac{\partial}{\partial a_1}\,\cos(a_1-a_j)-\sum_{i=2}^n\frac{\partial}{\partial a_1}\,\cos(a_i-a_1).
\end{align*}
Since $\cos$ is even, these sums will actually turn out to be identical, which means we double-counted initially. That is:
\begin{align*}
\frac{\partial g(a)}{\partial a_1}&=
-\sum_{j=2}^n\frac{\partial}{\partial a_1}\,\cos(a_j-a_1)-\sum_{i=2}^n\frac{\partial}{\partial a_1}\,\cos(a_i-a_1) \\
&=-2\sum_{i=2}^n\frac{\partial}{\partial a_1}\,\cos(a_i-a_1) \\
&=-2\sum_{i=2}^n\sin(a_i-a_1) \\
&=2\sum_{i=2}^n\sin(a_1-a_i).
\end{align*}
This is twice $f_1(a),$ which means we need to adjust our $g$ by a factor of $1/2$:
$$g(a)=-\frac12 \sum_{i=1}^n\sum_{j=1}^n\cos(a_i-a_j).$$
Once we do this, the computation above goes through, and it is indeed the case that $\partial_{a_1}g(a)=f_1,$ since in $f_1$ the $j=1$ term is $\sin(a_1-a_1)=\sin(0)=0.$
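(Not part of the original answer, but the factor-of-$1/2$ claim is easy to check numerically. Here is a minimal plain-Python sketch, with helper names `f_i` and `g` of my own choosing, that compares central finite differences of $g$ against $f$ at an arbitrary point.)

```python
import math

# Check numerically that g(a) = -(1/2) * sum_{i,j} cos(a_i - a_j)
# has partial derivatives dg/da_i = f_i(a) = sum_j sin(a_i - a_j).

def f_i(a, i):
    # i-th component of the vector field f
    return sum(math.sin(a[i] - a_j) for a_j in a)

def g(a):
    # candidate scalar potential with the 1/2 correction
    return -0.5 * sum(math.cos(a_i - a_j) for a_i in a for a_j in a)

a = [0.3, -1.2, 2.5, 0.9]   # an arbitrary test point
h = 1e-6
for i in range(len(a)):
    ap = a.copy(); ap[i] += h
    am = a.copy(); am[i] -= h
    numeric = (g(ap) - g(am)) / (2 * h)   # central difference
    assert abs(numeric - f_i(a, i)) < 1e-6
print("gradient check passed")
```

Central differences are used because their $O(h^2)$ error makes the comparison reliable at tolerance $10^{-6}$.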
Whew! We have path independence on account of the form of $f$. It remains to calculate the line integral itself, but we've actually done all the hard work because we know what $g$ is. Since the integral is path-independent, we'll choose the straight line from $\alpha^0$ to $\alpha^1,$ parametrized as
$$\gamma(t)=t\alpha^1+(1-t)\alpha^0,\quad t\in[0,1],$$
as it has all the nice properties we need to use the Fundamental Theorem for Line Integrals:
\begin{align*}
\int_{\alpha^0}^{\alpha^1}f(a)\cdot da &=\int_{\alpha^0}^{\alpha^1}(\nabla g(a))\cdot da \\
&=g(\alpha^1)-g(\alpha^0) \\
&=\frac12 \sum_{i=1}^n\sum_{j=1}^n\cos(\alpha_i^0-\alpha_j^0)-\frac12 \sum_{i=1}^n\sum_{j=1}^n\cos(\alpha_i^1-\alpha_j^1) \\
&=\frac12 \sum_{i=1}^n\sum_{j=1}^n[\cos(\alpha_i^0-\alpha_j^0)-\cos(\alpha_i^1-\alpha_j^1)].
\end{align*}
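(Again not part of the original answer: as a sanity check, one can integrate $f(\gamma(t))\cdot\gamma'(t)$ along the straight segment with the trapezoid rule and compare against the closed form $\frac12\sum_{i,j}[\cos(\alpha_i^0-\alpha_j^0)-\cos(\alpha_i^1-\alpha_j^1)]$. All names below are illustrative.)

```python
import math

# Numerically integrate f along gamma(t) = t*alpha1 + (1-t)*alpha0
# and compare with the closed-form value from the potential g.

def f(a):
    n = len(a)
    return [sum(math.sin(a[i] - a[j]) for j in range(n)) for i in range(n)]

alpha0 = [0.1, -0.7, 1.9]   # arbitrary endpoints
alpha1 = [2.2, 0.4, -1.1]
d = [x1 - x0 for x0, x1 in zip(alpha0, alpha1)]   # gamma'(t) = alpha1 - alpha0

def integrand(t):
    gamma = [t * x1 + (1 - t) * x0 for x0, x1 in zip(alpha0, alpha1)]
    return sum(fi * di for fi, di in zip(f(gamma), d))  # f(gamma(t)) . gamma'(t)

N = 40000
vals = [integrand(k / N) for k in range(N + 1)]
line_integral = (sum(vals) - 0.5 * (vals[0] + vals[-1])) / N  # trapezoid rule

closed_form = 0.5 * sum(
    math.cos(a0i - a0j) - math.cos(a1i - a1j)
    for a0i, a1i in zip(alpha0, alpha1)
    for a0j, a1j in zip(alpha0, alpha1)
)

assert abs(line_integral - closed_form) < 1e-5
print("line integral matches closed form")
```

Because the field is conservative, any other path between the same endpoints would give the same value, which is exactly what path independence asserts.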
You could use
$$\cos(\theta)-\cos(\varphi)=-2\sin\left(\frac{\theta+\varphi}{2}\right)\sin\left(\frac{\theta-\varphi}{2}\right)$$
to combine the cosines in the last expression, but that might or might not be simpler.
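(A quick numerical spot-check of this product-to-sum identity, illustrative only:)

```python
import math

# Verify cos(theta) - cos(phi) = -2 sin((theta+phi)/2) sin((theta-phi)/2)
# at a few arbitrary angle pairs.
for theta, phi in [(0.7, -1.3), (2.0, 2.0), (-0.4, 3.1)]:
    lhs = math.cos(theta) - math.cos(phi)
    rhs = -2 * math.sin((theta + phi) / 2) * math.sin((theta - phi) / 2)
    assert abs(lhs - rhs) < 1e-12
print("identity holds at all test points")
```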
$\int f(a) \cdot da$ for $a$ a vector
– dunno
Jan 28 at 19:38
So that's going to look like $\sum_{i=1}^n\int_{\alpha_j^0}^{\alpha_j^1}\sum_{j=1}^n\cos(a_i-a_j)\,da_i,$ I think, which is what you had above.
– Adrian Keister
Jan 28 at 19:44
Oh, right, definitely $\sin$. I think it's OK, because fundamentally it's a sum of integrals (that's the dot product). I don't think my answer is correct. The final result should be a number involving only $\alpha_i^0$ and $\alpha_i^1$.
– Adrian Keister
Jan 28 at 19:52
It just hit me: I think this is a line integral, right? Going back to the original expression you're asked to evaluate: $\displaystyle\int_{\alpha^0}^{\alpha^1}f(a)\cdot da.$ Aren't $\alpha^0$ and $\alpha^1$ vectors of the same dimension as $f$ and $a?$ If it's a line integral, then we need to know what path (if it matters) you mean to take from the starting point to the finishing point. Then we should parameterize the expression and compute like this: $\displaystyle\int_a^b f(a(t))\cdot \dot{a}(t)\,dt.$
– Adrian Keister
Jan 29 at 2:03
Ah yeah, you're right, it is a line integral. But somehow I think it is supposed to be path-independent?
– dunno
Jan 29 at 3:02
Your Answer
StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3091258%2fintegral-of-differences-of-vector%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
1 Answer
1
active
oldest
votes
1 Answer
1
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
We are asked to compute the line integral
$$int_{alpha^0}^{alpha^1}f(a)cdot da, $$
where
$$f_i(a)=sum_{j=1}^{n}sin(a_i-a_j).$$
Ideally, this line integral is path-independent. We would need $f=nabla g$ for some scalar field $g(a).$ That is, we would need
$$sum_{j=1}^{n}sin(a_i-a_j)=frac{partial}{partial a_i},g(a).$$
This would force
$$intsum_{j=1}^{n}sin(a_i-a_j),da_i=g(a),$$
or
$$g(a)=-sum_{j=1}^{n}cos(a_i-a_j).$$
But this needs to be true of all the $a_i,$ so let's modify this to
$$g(a)=-sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j).$$
When $i=j,$ we're going to pick up a number of $-1$'s, but that should be immaterial. The point is, that if we differentiate this $g(a)$ w.r.t. $a_i,$ we'll get $f_i.$ Let's double-check that this works by computing:
begin{align*}
frac{partial g(a)}{partial a_1}&=-frac{partial}{partial a_1}sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j)\
&=-sum_{i=1}^nsum_{j=1}^nfrac{partial}{partial a_1},cos(a_i-a_j).
end{align*}
Now we see here that if neither $i$ nor $j$ is $1,$ the derivative annihilates the $cos$. So, which terms have $a_1$ in them? Well, we have a number of terms. If $i=1,$ and $jnot=1,$ those will contribute. Also, if $inot=1$ but $j=1,$ those will also contribute. If $i=j,$ then, as before mentioned, the term is $cos(0)=1,$ which disappears under differentiation. Therefore, the expression above becomes the following:
begin{align*}
frac{partial g(a)}{partial a_1}&=
-sum_{j=2}^nfrac{partial}{partial a_1},cos(a_1-a_j)-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1).
end{align*}
Since $cos$ is even, these sums will actually turn out to be identical, which means we double-counted initially. That is:
begin{align*}
frac{partial g(a)}{partial a_1}&=
-sum_{j=2}^nfrac{partial}{partial a_1},cos(a_j-a_1)-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1) \
&=-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1)-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1) \
&=-2sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1) \
&=2sum_{i=2}^nsin(a_i-a_1).
end{align*}
This means we need to adjust our $g$ by a factor of $1/2:$
$$g(a)=-frac12 sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j).$$
Once we do this, we can see that the proof above goes through, and it is indeed the case that $partial_{a_1}g(a)=f_1,$ since in $f_1,$ when $j=1,$ we see that $sin(a_1-a_1)=sin(0)=0.$
Whew! We have path independence on account of the form of $f$. It remains to calculate the line integral itself, but we've actually done all the hard work because we know what $g$ is. Since the integral is path-independent, we'll choose the straight line from $alpha^0$ to $alpha^1,$ parametrized as
$$gamma(t)=talpha^1+(1-t)alpha^0,quad tin[0,1],$$
as it has all the nice properties we need to use the Fundamental Theorem for Line Integrals:
begin{align*}
int_{alpha^0}^{alpha^1}f(a)cdot da &=int_{alpha^0}^{alpha^1}(nabla g(a))cdot da \
&=g(alpha^1)-g(alpha^0) \
&=frac12 sum_{i=1}^nsum_{j=1}^ncos(alpha_i^0-alpha_j^0)-frac12 sum_{i=1}^nsum_{j=1}^ncos(alpha_i^1-alpha_j^1) \
&=frac12 sum_{i=1}^nsum_{j=1}^n[cos(alpha_i^0-alpha_j^0)-cos(alpha_i^1-alpha_j^1)].
end{align*}
You could use
$$cos(theta)-cos(varphi)=-2sinleft(frac{theta+varphi}{2}right)sinleft(frac{theta-varphi}{2}right)$$
to combine the cosines in the last expression, but that might or might not be simpler.
$endgroup$
1
$begingroup$
$int f(a) cdot da$ for $a$ a vector
$endgroup$
– dunno
Jan 28 at 19:38
1
$begingroup$
So that's going to look like $sum_{i=1}^nint_{alpha_j^0}^{alpha_j^1}sum_{j=1}^ncos(a_i-a_j),da_i,$ I think, which is what you had above.
$endgroup$
– Adrian Keister
Jan 28 at 19:44
1
$begingroup$
Oh, right - definitely sin. I think it's ok, because fundamentally it's a sum of integrals (that's the dot product). I don't think my answer is correct. The final result should be a number involving only $alpha_i^0$ and $alpha_i^1$.
$endgroup$
– Adrian Keister
Jan 28 at 19:52
1
$begingroup$
It just hit me: I think this is a line integral, right? Going back to the original expression you're asked to evaluate: $displaystyleint_{alpha^0}^{alpha^1}f(a)cdot da.$ Aren't $alpha^0$ and $alpha^1$ vectors of the same dimension as $f$ and $a?$ If it's a line integral, then we need to know what path (if it matters) you mean to take from the starting point to the finishing point. Then we should parameterize the expression and compute like this: $displaystyleint_a^b f(a(t))cdot dot{a}(t),dt.$
$endgroup$
– Adrian Keister
Jan 29 at 2:03
1
$begingroup$
ah yeah youre right, it is a line integral. But somehow i think it is supposed to be path-independent?
$endgroup$
– dunno
Jan 29 at 3:02
|
show 14 more comments
$begingroup$
We are asked to compute the line integral
$$int_{alpha^0}^{alpha^1}f(a)cdot da, $$
where
$$f_i(a)=sum_{j=1}^{n}sin(a_i-a_j).$$
Ideally, this line integral is path-independent. We would need $f=nabla g$ for some scalar field $g(a).$ That is, we would need
$$sum_{j=1}^{n}sin(a_i-a_j)=frac{partial}{partial a_i},g(a).$$
This would force
$$intsum_{j=1}^{n}sin(a_i-a_j),da_i=g(a),$$
or
$$g(a)=-sum_{j=1}^{n}cos(a_i-a_j).$$
But this needs to be true of all the $a_i,$ so let's modify this to
$$g(a)=-sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j).$$
When $i=j,$ we're going to pick up a number of $-1$'s, but that should be immaterial. The point is, that if we differentiate this $g(a)$ w.r.t. $a_i,$ we'll get $f_i.$ Let's double-check that this works by computing:
begin{align*}
frac{partial g(a)}{partial a_1}&=-frac{partial}{partial a_1}sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j)\
&=-sum_{i=1}^nsum_{j=1}^nfrac{partial}{partial a_1},cos(a_i-a_j).
end{align*}
Now we see here that if neither $i$ nor $j$ is $1,$ the derivative annihilates the $cos$. So, which terms have $a_1$ in them? Well, we have a number of terms. If $i=1,$ and $jnot=1,$ those will contribute. Also, if $inot=1$ but $j=1,$ those will also contribute. If $i=j,$ then, as before mentioned, the term is $cos(0)=1,$ which disappears under differentiation. Therefore, the expression above becomes the following:
begin{align*}
frac{partial g(a)}{partial a_1}&=
-sum_{j=2}^nfrac{partial}{partial a_1},cos(a_1-a_j)-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1).
end{align*}
Since $cos$ is even, these sums will actually turn out to be identical, which means we double-counted initially. That is:
begin{align*}
frac{partial g(a)}{partial a_1}&=
-sum_{j=2}^nfrac{partial}{partial a_1},cos(a_j-a_1)-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1) \
&=-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1)-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1) \
&=-2sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1) \
&=2sum_{i=2}^nsin(a_i-a_1).
end{align*}
This means we need to adjust our $g$ by a factor of $1/2:$
$$g(a)=-frac12 sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j).$$
Once we do this, we can see that the proof above goes through, and it is indeed the case that $partial_{a_1}g(a)=f_1,$ since in $f_1,$ when $j=1,$ we see that $sin(a_1-a_1)=sin(0)=0.$
Whew! We have path independence on account of the form of $f$. It remains to calculate the line integral itself, but we've actually done all the hard work because we know what $g$ is. Since the integral is path-independent, we'll choose the straight line from $alpha^0$ to $alpha^1,$ parametrized as
$$gamma(t)=talpha^1+(1-t)alpha^0,quad tin[0,1],$$
as it has all the nice properties we need to use the Fundamental Theorem for Line Integrals:
begin{align*}
int_{alpha^0}^{alpha^1}f(a)cdot da &=int_{alpha^0}^{alpha^1}(nabla g(a))cdot da \
&=g(alpha^1)-g(alpha^0) \
&=frac12 sum_{i=1}^nsum_{j=1}^ncos(alpha_i^0-alpha_j^0)-frac12 sum_{i=1}^nsum_{j=1}^ncos(alpha_i^1-alpha_j^1) \
&=frac12 sum_{i=1}^nsum_{j=1}^n[cos(alpha_i^0-alpha_j^0)-cos(alpha_i^1-alpha_j^1)].
end{align*}
You could use
$$cos(theta)-cos(varphi)=-2sinleft(frac{theta+varphi}{2}right)sinleft(frac{theta-varphi}{2}right)$$
to combine the cosines in the last expression, but that might or might not be simpler.
$endgroup$
1
$begingroup$
$int f(a) cdot da$ for $a$ a vector
$endgroup$
– dunno
Jan 28 at 19:38
1
$begingroup$
So that's going to look like $sum_{i=1}^nint_{alpha_j^0}^{alpha_j^1}sum_{j=1}^ncos(a_i-a_j),da_i,$ I think, which is what you had above.
$endgroup$
– Adrian Keister
Jan 28 at 19:44
1
$begingroup$
Oh, right - definitely sin. I think it's ok, because fundamentally it's a sum of integrals (that's the dot product). I don't think my answer is correct. The final result should be a number involving only $alpha_i^0$ and $alpha_i^1$.
$endgroup$
– Adrian Keister
Jan 28 at 19:52
1
$begingroup$
It just hit me: I think this is a line integral, right? Going back to the original expression you're asked to evaluate: $displaystyleint_{alpha^0}^{alpha^1}f(a)cdot da.$ Aren't $alpha^0$ and $alpha^1$ vectors of the same dimension as $f$ and $a?$ If it's a line integral, then we need to know what path (if it matters) you mean to take from the starting point to the finishing point. Then we should parameterize the expression and compute like this: $displaystyleint_a^b f(a(t))cdot dot{a}(t),dt.$
$endgroup$
– Adrian Keister
Jan 29 at 2:03
1
$begingroup$
ah yeah youre right, it is a line integral. But somehow i think it is supposed to be path-independent?
$endgroup$
– dunno
Jan 29 at 3:02
|
show 14 more comments
$begingroup$
We are asked to compute the line integral
$$int_{alpha^0}^{alpha^1}f(a)cdot da, $$
where
$$f_i(a)=sum_{j=1}^{n}sin(a_i-a_j).$$
Ideally, this line integral is path-independent. We would need $f=nabla g$ for some scalar field $g(a).$ That is, we would need
$$sum_{j=1}^{n}sin(a_i-a_j)=frac{partial}{partial a_i},g(a).$$
This would force
$$intsum_{j=1}^{n}sin(a_i-a_j),da_i=g(a),$$
or
$$g(a)=-sum_{j=1}^{n}cos(a_i-a_j).$$
But this needs to be true of all the $a_i,$ so let's modify this to
$$g(a)=-sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j).$$
When $i=j,$ we're going to pick up a number of $-1$'s, but that should be immaterial. The point is, that if we differentiate this $g(a)$ w.r.t. $a_i,$ we'll get $f_i.$ Let's double-check that this works by computing:
begin{align*}
frac{partial g(a)}{partial a_1}&=-frac{partial}{partial a_1}sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j)\
&=-sum_{i=1}^nsum_{j=1}^nfrac{partial}{partial a_1},cos(a_i-a_j).
end{align*}
Now we see here that if neither $i$ nor $j$ is $1,$ the derivative annihilates the $cos$. So, which terms have $a_1$ in them? Well, we have a number of terms. If $i=1,$ and $jnot=1,$ those will contribute. Also, if $inot=1$ but $j=1,$ those will also contribute. If $i=j,$ then, as before mentioned, the term is $cos(0)=1,$ which disappears under differentiation. Therefore, the expression above becomes the following:
begin{align*}
frac{partial g(a)}{partial a_1}&=
-sum_{j=2}^nfrac{partial}{partial a_1},cos(a_1-a_j)-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1).
end{align*}
Since $cos$ is even, these sums will actually turn out to be identical, which means we double-counted initially. That is:
begin{align*}
frac{partial g(a)}{partial a_1}&=
-sum_{j=2}^nfrac{partial}{partial a_1},cos(a_j-a_1)-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1) \
&=-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1)-sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1) \
&=-2sum_{i=2}^nfrac{partial}{partial a_1},cos(a_i-a_1) \
&=2sum_{i=2}^nsin(a_i-a_1).
end{align*}
This means we need to adjust our $g$ by a factor of $1/2:$
$$g(a)=-frac12 sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j).$$
Once we do this, we can see that the proof above goes through, and it is indeed the case that $partial_{a_1}g(a)=f_1,$ since in $f_1,$ when $j=1,$ we see that $sin(a_1-a_1)=sin(0)=0.$
Whew! We have path independence on account of the form of $f$. It remains to calculate the line integral itself, but we've actually done all the hard work because we know what $g$ is. Since the integral is path-independent, we'll choose the straight line from $alpha^0$ to $alpha^1,$ parametrized as
$$gamma(t)=talpha^1+(1-t)alpha^0,quad tin[0,1],$$
as it has all the nice properties we need to use the Fundamental Theorem for Line Integrals:
begin{align*}
int_{alpha^0}^{alpha^1}f(a)cdot da &=int_{alpha^0}^{alpha^1}(nabla g(a))cdot da \
&=g(alpha^1)-g(alpha^0) \
&=frac12 sum_{i=1}^nsum_{j=1}^ncos(alpha_i^0-alpha_j^0)-frac12 sum_{i=1}^nsum_{j=1}^ncos(alpha_i^1-alpha_j^1) \
&=frac12 sum_{i=1}^nsum_{j=1}^n[cos(alpha_i^0-alpha_j^0)-cos(alpha_i^1-alpha_j^1)].
end{align*}
You could use
$$cos(theta)-cos(varphi)=-2sinleft(frac{theta+varphi}{2}right)sinleft(frac{theta-varphi}{2}right)$$
to combine the cosines in the last expression, but that might or might not be simpler.
$endgroup$
We are asked to compute the line integral
$$int_{alpha^0}^{alpha^1}f(a)cdot da, $$
where
$$f_i(a)=sum_{j=1}^{n}sin(a_i-a_j).$$
Ideally, this line integral is path-independent. We would need $f=nabla g$ for some scalar field $g(a).$ That is, we would need
$$sum_{j=1}^{n}sin(a_i-a_j)=frac{partial}{partial a_i},g(a).$$
This would force
$$intsum_{j=1}^{n}sin(a_i-a_j),da_i=g(a),$$
or
$$g(a)=-sum_{j=1}^{n}cos(a_i-a_j).$$
But this needs to be true of all the $a_i,$ so let's modify this to
$$g(a)=-sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j).$$
When $i=j,$ we're going to pick up a number of $-1$'s, but that should be immaterial. The point is, that if we differentiate this $g(a)$ w.r.t. $a_i,$ we'll get $f_i.$ Let's double-check that this works by computing:
begin{align*}
frac{partial g(a)}{partial a_1}&=-frac{partial}{partial a_1}sum_{i=1}^nsum_{j=1}^ncos(a_i-a_j)\
&=-sum_{i=1}^nsum_{j=1}^nfrac{partial}{partial a_1},cos(a_i-a_j).
end{align*}
Now we see here that if neither $i$ nor $j$ is $1,$ the derivative annihilates the $cos$. So, which terms have $a_1$ in them? Well, we have a number of terms. If $i=1,$ and $jnot=1,$ those will contribute. Also, if $inot=1$ but $j=1,$ those will also contribute. If $i=j,$ then, as before mentioned, the term is $cos(0)=1,$ which disappears under differentiation. Therefore, the expression above becomes the following:
\begin{align*}
\frac{\partial g(a)}{\partial a_1}&=
-\sum_{j=2}^n\frac{\partial}{\partial a_1}\,\cos(a_1-a_j)-\sum_{i=2}^n\frac{\partial}{\partial a_1}\,\cos(a_i-a_1).
\end{align*}
Since $\cos$ is even, these two sums are actually identical, which means we double-counted initially. That is:
\begin{align*}
\frac{\partial g(a)}{\partial a_1}&=
-\sum_{j=2}^n\frac{\partial}{\partial a_1}\,\cos(a_j-a_1)-\sum_{i=2}^n\frac{\partial}{\partial a_1}\,\cos(a_i-a_1) \\
&=-\sum_{i=2}^n\frac{\partial}{\partial a_1}\,\cos(a_i-a_1)-\sum_{i=2}^n\frac{\partial}{\partial a_1}\,\cos(a_i-a_1) \\
&=-2\sum_{i=2}^n\frac{\partial}{\partial a_1}\,\cos(a_i-a_1) \\
&=2\sum_{i=2}^n\sin(a_1-a_i).
\end{align*}
This means we need to adjust our $g$ by a factor of $1/2$:
$$g(a)=-\frac12\sum_{i=1}^n\sum_{j=1}^n\cos(a_i-a_j).$$
Once we do this, the computation above shows that $\partial_{a_1}g(a)=f_1,$ since in $f_1,$ the $j=1$ term is $\sin(a_1-a_1)=\sin(0)=0.$
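As a quick numerical sanity check (my own sketch, not part of the original answer), we can compare a central finite difference of the candidate potential $g(a)=-\frac12\sum_{i,j}\cos(a_i-a_j)$ against $f_i(a)=\sum_j\sin(a_i-a_j)$ at a random point:

```python
import math
import random

def f_i(a, i):
    # f_i(a) = sum_j sin(a_i - a_j); the j = i term is sin(0) = 0
    return sum(math.sin(a[i] - aj) for aj in a)

def g(a):
    # candidate potential: g(a) = -(1/2) * sum_{i,j} cos(a_i - a_j)
    n = len(a)
    return -0.5 * sum(math.cos(a[i] - a[j])
                      for i in range(n) for j in range(n))

def dg_dai(a, i, h=1e-6):
    # central finite-difference approximation of dg/da_i
    ap, am = list(a), list(a)
    ap[i] += h
    am[i] -= h
    return (g(ap) - g(am)) / (2 * h)

random.seed(0)
a = [random.uniform(-math.pi, math.pi) for _ in range(5)]
for i in range(5):
    # the gradient of g should match f componentwise
    assert abs(dg_dai(a, i) - f_i(a, i)) < 1e-5
```

If the factor of $1/2$ is dropped, these assertions fail: the finite difference then comes out twice as large as $f_i$, which is exactly the double-counting found above.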
Whew! We have path independence on account of the form of $f$. It remains to calculate the line integral itself, but we've actually done all the hard work, because we know what $g$ is. Since the integral is path-independent, we'll choose the straight line from $\alpha^0$ to $\alpha^1,$ parametrized as
$$\gamma(t)=t\alpha^1+(1-t)\alpha^0,\quad t\in[0,1],$$
as it has all the nice properties we need to use the Fundamental Theorem for Line Integrals:
\begin{align*}
\int_{\alpha^0}^{\alpha^1}f(a)\cdot da &=\int_{\alpha^0}^{\alpha^1}(\nabla g(a))\cdot da \\
&=g(\alpha^1)-g(\alpha^0) \\
&=\frac12\sum_{i=1}^n\sum_{j=1}^n\cos(\alpha_i^0-\alpha_j^0)-\frac12\sum_{i=1}^n\sum_{j=1}^n\cos(\alpha_i^1-\alpha_j^1) \\
&=\frac12\sum_{i=1}^n\sum_{j=1}^n\left[\cos(\alpha_i^0-\alpha_j^0)-\cos(\alpha_i^1-\alpha_j^1)\right].
\end{align*}
You could use
$$\cos(\theta)-\cos(\varphi)=-2\sin\left(\frac{\theta+\varphi}{2}\right)\sin\left(\frac{\theta-\varphi}{2}\right)$$
to combine the cosines in the last expression, but that might or might not be simpler.
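To see the whole result end to end, here is a small numerical check (my own sketch, not from the original answer): it integrates $f(\gamma(t))\cdot\gamma'(t)$ along the straight-line path with the midpoint rule and compares against the closed form $\frac12\sum_{i,j}[\cos(\alpha_i^0-\alpha_j^0)-\cos(\alpha_i^1-\alpha_j^1)]$:

```python
import math
import random

def f(a):
    # f_i(a) = sum_j sin(a_i - a_j)
    n = len(a)
    return [sum(math.sin(a[i] - a[j]) for j in range(n)) for i in range(n)]

def closed_form(a0, a1):
    # (1/2) * sum_{i,j} [cos(a0_i - a0_j) - cos(a1_i - a1_j)]
    n = len(a0)
    return 0.5 * sum(math.cos(a0[i] - a0[j]) - math.cos(a1[i] - a1[j])
                     for i in range(n) for j in range(n))

def line_integral(a0, a1, steps=20000):
    # midpoint rule along gamma(t) = t*a1 + (1-t)*a0, so gamma'(t) = a1 - a0
    n = len(a0)
    d = [a1[k] - a0[k] for k in range(n)]
    total = 0.0
    for s in range(steps):
        t = (s + 0.5) / steps
        at = [t * a1[k] + (1 - t) * a0[k] for k in range(n)]
        ft = f(at)
        total += sum(ft[k] * d[k] for k in range(n)) / steps
    return total

random.seed(1)
a0 = [random.uniform(-math.pi, math.pi) for _ in range(4)]
a1 = [random.uniform(-math.pi, math.pi) for _ in range(4)]
# numerical line integral should agree with the potential-difference formula
assert abs(line_integral(a0, a1) - closed_form(a0, a1)) < 1e-4
```

Replacing the straight line with any other path between the same endpoints (e.g. perturbing the parametrization) leaves the numerical value unchanged up to discretization error, which is path independence in action.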
edited Jan 29 at 14:43
answered Jan 28 at 19:23
Adrian Keister
$begingroup$
$\int f(a)\cdot da$ for $a$ a vector
$endgroup$
– dunno
Jan 28 at 19:38
$begingroup$
So that's going to look like $\sum_{i=1}^n\int_{\alpha_j^0}^{\alpha_j^1}\sum_{j=1}^n\cos(a_i-a_j)\,da_i,$ I think, which is what you had above.
$endgroup$
– Adrian Keister
Jan 28 at 19:44
$begingroup$
Oh, right - definitely $\sin$. I think it's OK, because fundamentally it's a sum of integrals (that's the dot product). I don't think my answer is correct: the final result should be a number involving only $\alpha_i^0$ and $\alpha_i^1$.
$endgroup$
– Adrian Keister
Jan 28 at 19:52
$begingroup$
It just hit me: I think this is a line integral, right? Going back to the original expression you're asked to evaluate: $\displaystyle\int_{\alpha^0}^{\alpha^1}f(a)\cdot da.$ Aren't $\alpha^0$ and $\alpha^1$ vectors of the same dimension as $f$ and $a$? If it's a line integral, then we need to know what path (if it matters) you mean to take from the starting point to the finishing point. Then we should parametrize the expression and compute like this: $\displaystyle\int_a^b f(a(t))\cdot\dot{a}(t)\,dt.$
$endgroup$
– Adrian Keister
Jan 29 at 2:03
$begingroup$
Ah yeah, you're right, it is a line integral. But somehow I think it is supposed to be path-independent?
$endgroup$
– dunno
Jan 29 at 3:02
show 14 more comments
$begingroup$
The problem here is in the original expression, which is not well-defined. You have $a_i$ as the dummy variable for a definite integral (in which case it should disappear), but then you're expecting to be able to sum over $i$ after everything else is done. I think if you provide more context, we might be able to help you sort it out.
$endgroup$
– Adrian Keister
Jan 28 at 19:06
$begingroup$
@AdrianKeister, thank you, I added the full problem.
$endgroup$
– dunno
Jan 28 at 19:20
$begingroup$
@AdrianKeister, I guess there is a third summation over $k$ because of the inner product?
$endgroup$
– dunno
Jan 28 at 19:30
$begingroup$
It really depends on the correct interpretation of the original integral. If it's a dot product, then each differential multiplies one component of $f$ that itself depends on all the $a_i$ - that's problematic. I don't see how you're going to get a single number out of that unless it's a line integral. If it's a line integral, then parametrization is the way to go, assuming we have a well-defined one we can use. Otherwise, if the integral is really more of a volume integral, then every component of $f$ gets multiplied by all the differentials: you can get a single number out of that.
$endgroup$
– Adrian Keister
Jan 29 at 2:17
$begingroup$
All right! I think I've got it. Please check over my answer for errors.
$endgroup$
– Adrian Keister
Jan 29 at 13:50