Why is the runtime of Fibonacci Θ(1.6^N)?
Why is the runtime closer to O(1.6^N) and not O(2^N)? I think it has something to do with the call stack, but it's still a little unclear to me.
int Fibonacci(int n)
{
    if (n <= 1)
        return n;
    else
        return Fibonacci(n - 1) + Fibonacci(n - 2);
}
Tags: runtime, big-o, fibonacci
asked Jan 3 at 2:37 by ajc
Where did you get the idea that it is O(1.6^N)?
– CodingLab
Jan 3 at 6:38
The precise expression is O(ϕ^N), where ϕ = ½(1+√5) ≈ 1.618... is the Golden Ratio. You can obtain this by making a substitution T(N) = a^N into the recurrence relation T(N) = T(N - 1) + T(N - 2) + O(1) and solving for the base a. See this page for more details.
– meowgoesthedog
Jan 3 at 9:34
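For reference, a brief sketch of the substitution the comment describes (not part of the original comment): plugging T(N) = a^N into the recurrence and dropping the O(1) term gives

% Characteristic-equation sketch for T(N) = T(N-1) + T(N-2) + O(1)
\[
a^N = a^{N-1} + a^{N-2}
\;\Longrightarrow\;
a^2 = a + 1
\;\Longrightarrow\;
a = \frac{1 \pm \sqrt{5}}{2}.
\]
% The positive root is phi ~ 1.618; the negative root has magnitude < 1,
% so it does not affect the asymptotic growth, giving T(N) = O(phi^N).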
2 Answers
The number ϕ = (1 + sqrt(5))/2 is characterized by the following two properties:

1. ϕ >= 1
2. ϕ^2 = ϕ + 1

Multiplying the second equation by ϕ^{n-1} we get

3. ϕ^{n+1} = ϕ^n + ϕ^{n-1}

Since f(0) = 0, f(1) = 1 and f(n+1) = f(n) + f(n-1), using 1 and 3 it is easy to see by induction on n that

f(n) <= ϕ^n

Thus f(n) is O(ϕ^n).

A similar inductive argument shows that

f(n) >= ϕ^{n-3} = ϕ^{-3}·ϕ^n   (n >= 1)

and thus f(n) = Θ(ϕ^n).
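For completeness, the induction step behind "it is easy to see by induction" might be written out as follows (a sketch, not part of the original answer):

% Upper bound: assume f(n-1) <= phi^{n-1} and f(n) <= phi^n.
% Base cases: f(0) = 0 <= 1 and f(1) = 1 <= phi.
\[
f(n+1) = f(n) + f(n-1) \le \varphi^{n} + \varphi^{n-1} = \varphi^{n+1}.
\]
% The lower bound f(n) >= phi^{n-3} works the same way, starting from
% f(1) = 1 >= phi^{-2} and f(2) = 1 >= phi^{-1}.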
– Leandro Caniglia, answered Jan 4 at 2:03 (edited Jan 4 at 2:21)
For starters, remember that big-O notation always provides an upper bound. That is, if a function is O(n), it's also O(n^2), O(n^3), O(2^n), etc. So in that sense, you are not incorrect if you say that the runtime is O(2^n). You just don't have a tight bound.
To see where the tight bound of Θ(φ^n) comes from, it might help to look at how many recursive calls end up getting made when evaluating Fibonacci(n). Notice that

- Fibonacci(0) requires one call, the call to Fibonacci(0).
- Fibonacci(1) requires one call, the call to Fibonacci(1).
- Fibonacci(2) requires three calls: one to Fibonacci(2), and one each to Fibonacci(0) and Fibonacci(1).
- Fibonacci(3) requires five calls: one to Fibonacci(3), then the three calls generated by invoking Fibonacci(2) and the one call generated by invoking Fibonacci(1).
- Fibonacci(4) requires nine calls: one to Fibonacci(4), then five from the call to Fibonacci(3) and three from the call to Fibonacci(2).
More generally, the pattern seems to be that if you're calling Fibonacci(n) for some n ≥ 2, the number of calls made is one (for the call itself), plus the number of calls needed to evaluate Fibonacci(n-1) and Fibonacci(n-2). If we let L_n denote the number of calls made, this means that we have

- L_0 = L_1 = 1
- L_{n+2} = 1 + L_n + L_{n+1}.
So now the question is how fast this sequence grows. Evaluating the first few terms gives us
1, 1, 3, 5, 9, 15, 25, 41, ...
which definitely gets bigger and bigger, but it's not clear how much bigger that is.
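If you want to check these counts empirically, one option is to instrument the function with a counter (a sketch in C; the global counter calls and the loop in main are introduced here for illustration, not from the original code):

#include <stdio.h>

static long calls;                      /* incremented once per invocation */

static int Fibonacci(int n)
{
    calls++;
    if (n <= 1)
        return n;
    return Fibonacci(n - 1) + Fibonacci(n - 2);
}

int main(void)
{
    for (int n = 0; n <= 7; n++) {
        calls = 0;
        Fibonacci(n);
        /* Expected output: 1, 1, 3, 5, 9, 15, 25, 41 */
        printf("L(%d) = %ld\n", n, calls);
    }
    return 0;
}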
Something you might notice here is that L_n kinda sorta ish looks like the Fibonacci numbers. That is, it's defined in terms of the sum of the two previous terms, but it has an extra +1 term. So maybe we might want to look at the difference between L_n and F_n, since that might show us how much "faster" the L series grows. You might notice that the first two values of the L series are 1, 1 and the first two values of the Fibonacci series are 0, 1, so we'll shift things over by one term to make things line up a bit more nicely:
L(n) 1 1 3 5 9 15 25 41
F(n+1) 1 1 2 3 5 8 13 21
Diff: 0 0 1 2 4 7 12 20
And wait, hold on a second. What happens if we add one to each term of the difference? That gives us
L(n) 1 1 3 5 9 15 25 41
F(n+1) 1 1 2 3 5 8 13 21
Diff+1 1 1 2 3 5 8 13 21
Whoa! Looks like L_n − F_{n+1} + 1 = F_{n+1}. Rearranging, we see that

L_n = 2F_{n+1} − 1.

Wow! So the actual number of calls made by the Fibonacci recursive function is very closely related to the actual value returned. So we could say that the runtime of the Fibonacci function is Θ(F_{n+1}) and we'd be correct.
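That identity can also be proved directly; here is one possible induction sketch (not part of the original answer):

% Claim: L_n = 2F_{n+1} - 1 for all n >= 0.
% Base cases: L_0 = 1 = 2F_1 - 1 and L_1 = 1 = 2F_2 - 1.
% Step, using L_{n+2} = 1 + L_n + L_{n+1}:
\[
L_{n+2} = 1 + (2F_{n+1} - 1) + (2F_{n+2} - 1)
        = 2(F_{n+1} + F_{n+2}) - 1
        = 2F_{n+3} - 1.
\]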
But now the question is where φ comes in. There's a lovely mathematical result called Binet's formula that says that F_n = Θ(φ^n). There are many ways to prove this, but they all essentially boil down to the observation that

- the Fibonacci numbers seem to grow exponentially quickly;
- if they grow exponentially with a base of x, then F_{n+2} = F_n + F_{n+1} can be rewritten as x^2 = x + 1; and
- φ is a solution to x^2 = x + 1.

From this, we can see that since the runtime of Fibonacci is Θ(F_{n+1}), the runtime is also Θ(φ^{n+1}) = Θ(φ^n).
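To see the growth factor of roughly 1.6 (rather than 2) numerically, you could print the ratio of consecutive Fibonacci numbers, since the runtime is proportional to F_{n+1} (a small sketch, not from the original answer):

#include <stdio.h>

int main(void)
{
    /* Iteratively compute Fibonacci numbers and print the growth factor
       F(n+1)/F(n), which approaches phi = 1.618..., not 2. */
    double prev = 1.0, curr = 1.0;      /* F(1), F(2) */
    for (int n = 2; n <= 20; n++) {
        double next = prev + curr;      /* F(n+1) */
        printf("F(%d)/F(%d) = %.6f\n", n + 1, n, next / curr);
        prev = curr;
        curr = next;
    }
    return 0;
}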
– templatetypedef, answered Jan 4 at 3:05