Section 2.2 Asymptotic Order of Growth
An algorithm's worst-case running time can be characterized by establishing asymptotic upper and lower bounds on its growth. To prove that an algorithm has an asymptotic upper bound of O(n²), imagine one whose worst-case running time is T(n) = pn² + qn + r, where p, q, and r are positive constants. For all n ≥ 1 we have qn ≤ qn² and r ≤ rn², so T(n) = pn² + qn + r ≤ pn² + qn² + rn² = (p + q + r)n². Thus T(n) ≤ cn² for all n ≥ 1, where c = p + q + r, and T(n) is O(n²).
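As a quick sanity check rather than a proof, a few lines of Python can confirm the inequality over a range of inputs. The constants p, q, and r below are illustrative values, not taken from any particular algorithm.

<code python>
# Check T(n) <= c*n^2 with c = p + q + r, for 1 <= n <= 10000.
# p, q, r are illustrative constants, not from a real algorithm.
p, q, r = 3, 2, 5
c = p + q + r  # the constant from the argument above

for n in range(1, 10001):
    T = p * n**2 + q * n + r
    assert T <= c * n**2, f"bound fails at n={n}"
print(f"T(n) <= {c}*n^2 holds for 1 <= n <= 10000")
</code>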
An algorithm's asymptotic lower bound can be proven in a similar fashion. For the same worst-case running time T(n) = pn² + qn + r, every term is nonnegative, so T(n) = pn² + qn + r ≥ pn² for all n ≥ 0. Thus T(n) ≥ cn² with c = p, and T(n) is Ω(n²).
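The same style of numeric check works in the lower-bound direction, again with illustrative constants:

<code python>
# Check T(n) >= p*n^2 for 0 <= n <= 10000; q*n + r >= 0 makes this hold.
p, q, r = 3, 2, 5

for n in range(0, 10001):
    T = p * n**2 + q * n + r
    assert T >= p * n**2, f"bound fails at n={n}"
print(f"T(n) >= {p}*n^2 holds for 0 <= n <= 10000")
</code>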
Additionally, we can combine these two bounds to arrive at an asymptotically tight bound: when T(n) = O(f(n)) and T(n) = Ω(f(n)), it is safe to say that T(n) = Θ(f(n)). Another way to establish an asymptotically tight bound is to take the limit of f(n)/g(n) as n goes to infinity: if that limit exists and equals some constant c > 0, then f(n) = Θ(g(n)).
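The limit test can also be illustrated numerically. In this sketch the ratio T(n)/n² settles toward p, a positive constant, which is consistent with T(n) = Θ(n²):

<code python>
# Watch T(n)/n^2 approach p = 3 as n grows (illustrative constants).
p, q, r = 3, 2, 5

for n in (10, 100, 1000, 10000, 100000):
    T = p * n**2 + q * n + r
    print(n, T / n**2)  # ratio tends to 3.0, a positive constant
</code>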
Finally, asymptotic growth rates have two useful properties. They are transitive: if f is bounded by g and g is bounded by h, then f is bounded by h, provided all the bounds run in the same direction (all O, all Ω, or all Θ). They also respect sums: if f = O(h) and g = O(h), then f + g = O(h). For example, since pn² = O(n²) and qn + r = O(n²), the sum property gives T(n) = pn² + qn + r = O(n²) directly.
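A minimal sketch of the transitivity property, using the illustrative functions f(n) = 5n log₂ n, g(n) = n², and h(n) = n³:

<code python>
import math

# f(n) = 5*n*log2(n) is O(n^2): take c1 = 5, n0 = 2 (log2(n) <= n there).
# g(n) = n^2 is O(n^3):         take c2 = 1, n0 = 1.
# Transitivity then predicts f(n) <= c1*c2*n^3 = 5*n^3 for n >= 2.
for n in range(2, 10001):
    f = 5 * n * math.log2(n)
    assert f <= 5 * n**3, f"transitivity check fails at n={n}"
print("f(n) = 5*n*log2(n) stays within 5*n^3 for 2 <= n <= 10000")
</code>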
