Compare different optimization techniques.
Usually, the more you know about the function (gradients, Hessians, etc.) and the higher the order of the optimization technique (interpolation methods, quasi-Newton methods, Newton's method), the fewer function evaluations are needed to find an optimum.
Let's say I have a function that is very costly to evaluate.
What technique should be used to find an optimum with as few function evaluations as possible?
Is there any chart on the internet that compares different optimization methods in terms of "number of objective function evaluations" vs. "complexity and cost of the intermediate steps"?
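To make the comparison concrete, here is a minimal sketch (assuming SciPy is available; the Rosenbrock test function stands in for the costly objective) of how the evaluation count can be measured for a derivative-free method versus a quasi-Newton method:

```python
# Minimal sketch: count objective evaluations for a derivative-free method
# versus a quasi-Newton method on the Rosenbrock test function (a stand-in
# for the expensive objective). Assumes SciPy is installed.
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([-1.2, 1.0])

# Derivative-free: only function values are used, so many evaluations are needed.
res_nm = minimize(rosen, x0, method="Nelder-Mead")

# Quasi-Newton (BFGS) with an analytic gradient: far fewer objective evaluations.
res_bfgs = minimize(rosen, x0, method="BFGS", jac=rosen_der)

print("Nelder-Mead objective evaluations:", res_nm.nfev)
print("BFGS objective evaluations:       ", res_bfgs.nfev)
```

On this example Nelder-Mead typically needs several times more objective evaluations than BFGS, which is exactly the trade-off being asked about: cheaper intermediate steps, but many more function calls.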
optimization computational-complexity
Which optimization algorithm to use depends on the structure of your problem: is it convex? Is the objective function differentiable? Are there any constraints? Is the objective function a sum of a large number of terms?
– littleO
19 hours ago
I don't have any particular problem; the question is theoretical. How should I choose an optimization technique to minimize the number of objective function evaluations, both for convex and general functions, and for differentiable and non-differentiable ones?
– Alex Ozerov
19 hours ago
There's not a simple answer to this question. I think the answer would require giving an overview of all optimization algorithms and describing the types of problems for which the algorithms are most effective. For small or medium-sized unconstrained problems where the objective function is smooth, Newton's method is often a good choice. For very large-scale smooth unconstrained problems, accelerated gradient descent is often a good choice. If the objective function is a sum of many terms, some variant of stochastic gradient descent might be good. The list of problem types goes on.
– littleO
18 hours ago
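For the small, smooth, unconstrained case littleO mentions, here is a rough sketch of a damped Newton iteration. The quadratic test problem and the simple backtracking rule below are illustrative assumptions, not anything taken from the thread:

```python
# Rough sketch of a damped Newton iteration for a smooth unconstrained problem,
# illustrating how second-order information keeps the number of objective
# evaluations low. The test problem and step rule are illustrative only.
import numpy as np

def newton_minimize(f, grad, hess, x0, tol=1e-8, max_iter=50):
    """Minimize f with Newton's method and a very simple backtracking step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Newton direction: solve H p = -g.
        p = np.linalg.solve(hess(x), -g)
        # Crude backtracking on the objective value (no sufficient-decrease test).
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx and t > 1e-10:
            t *= 0.5
        x = x + t * p
    return x

# Example: minimize the quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
hess = lambda x: A
print(newton_minimize(f, grad, hess, np.zeros(2)))  # one Newton step suffices here
```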
Various surrogate-function approaches, such as kriging.
– Johan Löfberg
11 hours ago
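A hypothetical sketch of the surrogate (kriging) idea Johan Löfberg points to: fit a Gaussian process to a handful of expensive evaluations, then pick the next evaluation point by minimizing a cheap lower-confidence-bound on the surrogate rather than the costly objective itself. The toy objective, kernel, and acquisition rule below are assumptions chosen for illustration; scikit-learn and SciPy are assumed to be available.

```python
# Hypothetical surrogate-model (kriging) loop: each expensive evaluation is
# chosen by optimizing a cheap Gaussian-process surrogate instead of the
# costly objective. Assumes scikit-learn and SciPy are installed.
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_objective(x):          # stand-in for the costly function
    return np.sin(3.0 * x[0]) + 0.5 * x[0] ** 2

bounds = [(-2.0, 2.0)]
X = np.array([[-1.5], [0.0], [1.5]])                  # initial design points
y = np.array([expensive_objective(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(10):                                    # 10 expensive evaluations
    gp.fit(X, y)

    def lcb(x):                                        # lower confidence bound
        mu, sigma = gp.predict(x.reshape(1, -1), return_std=True)
        return float(mu[0] - 2.0 * sigma[0])

    # Minimize the cheap surrogate to propose the next expensive evaluation
    # (a single start is used here; in practice one would use several restarts).
    res = minimize(lcb, x0=np.array([0.0]), bounds=bounds, method="L-BFGS-B")
    X = np.vstack([X, res.x])
    y = np.append(y, expensive_objective(res.x))

print("best point:", X[np.argmin(y)], "best value:", y.min())
```

The point of the sketch is that the expensive function is only called once per outer iteration; all the search effort is spent on the inexpensive surrogate.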