2011-01-31

Big Oh notation explained

Assume a hypothetical computer called the Random Access Machine (RAM), in which:
  • Each simple operation takes exactly one time step.
  • Loops and subroutines are not simple operations; they are compositions of many single-step operations.
  • Each memory access takes exactly one time step.
  • Memory is unlimited.
The run time of an algorithm is measured by counting the number of steps it takes on a given problem instance. Taking the maximum, average, and minimum step counts over all instances of a given size then gives the worst, average, and best case complexity functions (see the sketch below).
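
A minimal sketch of this idea, not from the book: counting each comparison in a linear search as one "simple operation" shows how the same algorithm yields best, worst, and average case step counts depending on the input instance.

def linear_search_steps(items, target):
    """Return (found, steps), where steps counts each comparison as one time step."""
    steps = 0
    for value in items:
        steps += 1                 # one comparison = one simple operation
        if value == target:
            return True, steps     # best case: target is first (1 step)
    return False, steps            # worst case: target absent (n steps)

data = list(range(10))
print(linear_search_steps(data, 0))   # best case: (True, 1)
print(linear_search_steps(data, 9))   # worst case among hits: (True, 10)
print(linear_search_steps(data, 99))  # worst case overall: (False, 10)

# Averaging the step count over all possible targets gives the average-case
# complexity; the max and min over all instances give worst and best case.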


The Big Oh notation further simplifies function classification by disregarding multiplicative constants and defining an upper bound.

f(n) = O(g(n)) means that c · g(n) is an upper bound on f(n).
That is, there exists some constant c such that f(n) <= c · g(n) for all n >= n0, for some constant n0.
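
A small numeric check of the definition, using my own example rather than one from the post: f(n) = 3n² + 10n is O(n²), because with c = 4 and n0 = 10 we have f(n) <= c · g(n) for every n >= n0.

# f(n) = 3n^2 + 10n, g(n) = n^2; choose c = 4, n0 = 10 (example values, not from the post)
f = lambda n: 3 * n**2 + 10 * n
g = lambda n: n**2
c, n0 = 4, 10

# 3n^2 + 10n <= 4n^2 reduces to 10n <= n^2, i.e. n >= 10, so the check passes.
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
print("3n^2 + 10n <= 4 * n^2 for all tested n >= 10")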


Reference: The Algorithm Design Manual by Steven Skiena (book and website).
