Maximizing and minimizing dot products
Given two vectors $u, v \in \mathbb{R}^n$ such that $\|u\| = 1$ and $\sum_{i=1}^n v_i = c$ where $c < 1$, I would like to maximize $$\sum_{i=1}^n u_i v_i \log (v_i)$$ and minimize
$$\sum_{i=1}^n u_i v_i \log (u_i).$$
The answer, in my opinion, is $-c\log(c)$ if $c < 1/e$ (and $c\log(c)$ for the minimum). I think so because $x\log x$ is monotonic on $[-1/e, 1/e]$ (I'm not sure of that). However, I can't find a formal proof of this, and I'd be glad for help.
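One observation worth checking numerically: for any fixed $v$ with $v_i > 0$, the inner maximization over the unit vector $u$ has a closed form by Cauchy–Schwarz — writing $w_i = v_i \log v_i$, the maximizer is $u = w/\|w\|$, with value $\|w\|$. A small sketch illustrating this (the values of $n$, $c$, and the random test data are my own choices, not from the question; $v_i > 0$ is assumed so the logarithm is defined):

```python
import numpy as np

rng = np.random.default_rng(0)
n, c = 5, 0.2

# Arbitrary positive v with sum c (assumption: v_i > 0 so log(v_i) is defined).
v = rng.random(n)
v *= c / v.sum()

w = v * np.log(v)                 # the objective is u . w for unit u
u_opt = w / np.linalg.norm(w)     # Cauchy-Schwarz maximizer on the unit sphere

# No other unit vector does better:
for _ in range(1000):
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)
    assert u @ w <= u_opt @ w + 1e-12
```

This reduces the maximization problem to optimizing $\|w(v)\|$ over $v$ alone, though it says nothing yet about the outer optimization.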
optimization vector-spaces inner-product-space lagrange-multiplier
Hi. Which are knowns and which are unknowns, and what are we optimizing with respect to? Also, what have you tried so far in trying to solve the problem?
– mathreadler
Mar 30 '16 at 8:24
The vectors $u$ and $v$ are unknown; I know the norm of $u$ and the sum of the coordinates of $v$. I tried Lagrange multipliers, but it didn't work and got confusing, so I didn't write the equations here.
– James F.
Mar 30 '16 at 8:29
What function is monotonic? Anyway, I agree that this is a Lagrange multipliers problem. Rewrite the constraint on $u$ as $u_1^2 + u_2^2 + \ldots + u_n^2 = 1$; it is easier to differentiate.
– Giuseppe Negro
Mar 30 '16 at 8:41
$begingroup$
Xlogx on [-1/e, 1/e] is monotonic. I tried to rewrite the constraint as you wrote but it doesn't work as well
$endgroup$
– James F.
Mar 30 '16 at 8:47
So is $x$ known or unknown?
– mathreadler
Mar 30 '16 at 8:49
asked Mar 30 '16 at 8:11, edited Mar 30 '16 at 8:53
James F.
1 Answer
A hint could be to write the expressions as functions $f$ and $g$:
$$f(\mathbf{u}, \mathbf{v}) = \sum_{i=1}^n u_i v_i \log (v_i), \qquad g(\mathbf{u}, \mathbf{v}) = \sum_{i=1}^n u_i v_i \log (u_i)$$
and then use partial differentiation, set the gradients of both to $0$, and see where that leads. If necessary (and possible), add the condition that the Hessians (the matrices of second-order partial derivatives) be positive resp. negative definite.
For the equality constraints, as Giuseppe proposed, we can add Lagrange-multiplier terms $$\lambda_1 |\mathbf{v}^t \mathbf{1} - c|, \qquad \lambda_2 |\mathbf{u}^t \mathbf{u} - 1|,$$ and maybe a barrier function for the $c < 1$ constraint if $c$ is also an unknown.
As we want the gradients to be zero vectors, a candidate total function to minimize would be
$$\min_{\mathbf{u}, \mathbf{v}} \left\{ \|\nabla f(\mathbf{u}, \mathbf{v})\| + \|\nabla g(\mathbf{u}, \mathbf{v})\| + \lambda_1 |\mathbf{v}^t \mathbf{1} - c| + \lambda_2 |\mathbf{u}^t \mathbf{u} - 1| \right\}$$
for some suitable norm(s).
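For what it's worth, the constrained stationary points described above can also be probed numerically. Below is a minimal sketch, assuming illustration values $n = 3$, $c = 0.2$ and SciPy's SLSQP solver (my choices; the answer itself prescribes no particular method). It maximizes $f$ subject to the two equality constraints by minimizing $-f$, keeping $v_i > 0$ so the logarithm stays defined:

```python
import numpy as np
from scipy.optimize import minimize

n, c = 3, 0.2  # assumed illustration values

def neg_f(x):
    # x packs u (first n entries) and v (last n); minimize -f to maximize f
    u, v = x[:n], x[n:]
    return -(u @ (v * np.log(v)))

constraints = [
    {"type": "eq", "fun": lambda x: x[:n] @ x[:n] - 1.0},  # u_1^2+...+u_n^2 = 1
    {"type": "eq", "fun": lambda x: np.sum(x[n:]) - c},    # sum v_i = c
]
# Keep each v_i strictly positive so log(v_i) is defined.
bounds = [(None, None)] * n + [(1e-9, None)] * n

# Feasible start: u along -1 (since v_i log v_i < 0), v spread evenly.
x0 = np.concatenate([np.full(n, -1 / np.sqrt(n)), np.full(n, c / n)])
res = minimize(neg_f, x0, method="SLSQP", bounds=bounds, constraints=constraints)

u_star, v_star = res.x[:n], res.x[n:]
```

The attained value `-res.fun` can then be compared against the conjectured $-c\log(c) \approx 0.322$; note this is only a local search from one starting point, not a proof.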
This ignores the constraints (which make it harder; without them, this question would be about maximizing entropy, which is easy to do). As I wrote in the previous comment, I tried solving it with Lagrange multipliers, but it doesn't work (the Lagrange variables don't cancel out).
– James F.
Mar 30 '16 at 8:40
This approach disregards the constraints.
– Giuseppe Negro
Mar 30 '16 at 8:40
Yes, sorry, I forgot the constraints; I've added an approach for them now.
– mathreadler
Mar 30 '16 at 8:59
It wasn't me who proposed the Lagrange multipliers, actually! :-) James F. already mentioned them in his original post.
– Giuseppe Negro
Mar 30 '16 at 9:05
How can I derive those expressions?
– James F.
Mar 30 '16 at 11:02
answered Mar 30 '16 at 8:34, edited Mar 30 '16 at 11:12
mathreadler
Thanks for contributing an answer to Mathematics Stack Exchange!