I once learned a way to figure out the big-O* for recursive algorithms, and in today's post I just wanted to go back and learn it again. Apparently it's called the master theorem.
Basically you take the recursive algorithm and let a be the number of sub-problems in the recursion, b the factor by which the input shrinks (so each sub-problem has size n/b), and f(n) the work done outside the recursive calls.
Formulate T(n) as aT(n/b) + f(n). T(n) can't be something like sin n (it must be monotonic), and this simplified version requires f(n) to be polynomial.
If f(n) ∈ O(n^d) then the big-O for T is:
- O(n^d) if a < b^d
- O(n^d log n) if a = b^d
- O(n^(log_b a)) if a > b^d
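
To make the three cases concrete, here's a tiny helper I sketched (the function name and output format are mine, not part of the theorem) that picks the case from a, b, and d:

```python
import math

def master_theorem_big_o(a, b, d):
    # T(n) = a*T(n/b) + O(n^d): compare a against b^d to pick the case.
    if a < b ** d:
        return f"O(n^{d})"
    if a == b ** d:
        return f"O(n^{d} log n)"
    # Exponent is log_b(a); shown as a float for readability.
    return f"O(n^{math.log(a, b):.2f})"

print(master_theorem_big_o(2, 2, 1))  # O(n^1 log n), i.e. O(n log n)
print(master_theorem_big_o(1, 2, 0))  # O(n^0 log n), i.e. O(log n)
```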
As an example, let's apply this to merge sort. There are two sub-problems, so a = 2. The size of each sub-problem is half of n, so b = 2. Outside the recursive calls there's one linear merge pass through the data, so d = 1. That gives T(n) = 2T(n/2) + n, and since a = b^d (2 = 2^1), the middle case applies: O(n log n).
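
Here's a minimal merge sort sketch with those pieces labeled, assuming the usual top-down formulation:

```python
def merge_sort(xs):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    # a = 2: two recursive calls, each on half the input (b = 2).
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    # f(n): one linear merge pass outside the recursion, so d = 1.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```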
For binary search, a = 1, b = 2, and d = 0. Again a = b^d (1 = 2^0), so the middle case gives O(n^0 log n) = O(log n).
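
And a recursive binary search sketch along the same lines (returning -1 when the target is missing is just my convention here):

```python
def binary_search(xs, target, lo=0, hi=None):
    # a = 1: at most one recursive call per level.
    # b = 2: the search range is halved each time.
    # d = 0: only constant work outside the recursion.
    if hi is None:
        hi = len(xs)
    if lo >= hi:
        return -1  # not found
    mid = (lo + hi) // 2
    if xs[mid] == target:
        return mid
    if xs[mid] < target:
        return binary_search(xs, target, mid + 1, hi)
    return binary_search(xs, target, lo, mid)
```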
* Technically I think this should be big-theta, but informally people only seem to use big-O, so I'm sticking with that.