

2.2 Asymptotic Order of Growth

Asymptotic Upper Bounds

  • Let T(n) be a function (for instance, the worst-case running time of a certain algorithm on an input of size n).
  • Given another function f(n), we say that T(n) is O(f(n)) (read: T(n) is order f(n)) if, for sufficiently large n, the function T(n) is bounded above by a constant multiple of f(n). We write this relation as T(n) = O(f(n)). To be precise, T(n) is O(f(n)) if there exist constants c > 0 and n₀ ≥ 0 such that for all n ≥ n₀, we have T(n) ≤ c·f(n). The constant c is fixed and does not depend on n. In this case, we say that T is asymptotically upper-bounded by f. To find such an upper bound f, we inflate the terms of T(n).

For instance, suppose T(n) = pn^2 + qn + r, where p, q, r are positive constants. For all n ≥ 1, we have n ≤ n^2 and 1 ≤ n^2, so
T(n) = pn^2 + qn + r ≤ pn^2 + qn^2 + rn^2
pn^2 + qn^2 + rn^2 = (p + q + r)·n^2 = c·n^2
⇒ T(n) ≤ c·n^2, where c = p + q + r
Thus, T(n) = O(n^2)
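
As an illustration (not from the book's text), here is a small Python spot-check of this bound, using hypothetical constants p = 3, q = 5, r = 7:

  # Minimal sketch: verify T(n) = p*n^2 + q*n + r <= c*n^2
  # for c = p + q + r, over a range of n (hypothetical constants).
  p, q, r = 3, 5, 7
  c = p + q + r
  for n in range(1, 10001):
      T = p * n**2 + q * n + r
      assert T <= c * n**2, f"bound fails at n = {n}"
  print(f"T(n) <= {c}*n^2 held for n = 1..10000")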

Asymptotic Lower Bounds

  • Let T(n) be a function (for instance, the worst-case running time of a certain algorithm on an input of size n).
  • To show that an upper bound is the best one possible, we need to express the notion that for arbitrarily large input sizes n, the function T(n) is at least a constant multiple of some specific function f(n). Thus we write T(n) = Ω(f(n)) (T(n) is Ω(f(n))) if there exist constants ε > 0 and n₀ ≥ 0 such that for all n ≥ n₀, we have T(n) ≥ ε·f(n). In this case, we say that T is asymptotically lower-bounded by f. The constant ε is fixed and independent of n. This time, to find the lower bound, we deflate the terms of the polynomial T instead of inflating them.

For instance, suppose T(n) = pn^2 + qn + r, where p, q, r are positive constants.

For all n ≥ 0, T(n) = pn^2 + qn + r ≥ pn^2
⇒ T(n) ≥ ε·n^2, where ε = p > 0

⇒ T(n) = Ω(n^2)
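
Again as an illustration (not from the book's text), the matching lower-bound check with the same hypothetical constants:

  # Minimal sketch: verify T(n) = p*n^2 + q*n + r >= eps*n^2
  # for eps = p, over a range of n (hypothetical constants).
  p, q, r = 3, 5, 7
  eps = p
  for n in range(0, 10001):
      T = p * n**2 + q * n + r
      assert T >= eps * n**2, f"bound fails at n = {n}"
  print(f"T(n) >= {eps}*n^2 held for n = 0..10000")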

Asymptotic Tight Bounds

  • If we can show that a running time T(n) is both O(f(n)) and Ω(f(n)), then we have found the right bound: T(n) grows exactly like f(n) to within a constant factor.
  • If a function T(n) is both O(f(n)) and Ω(f(n)), we say that T(n) is Θ(f(n)). In this case, we say that f(n) is an asymptotically tight bound for T(n).
  • If we let f and g be two functions such that lim (n → ∞) f(n)/g(n) exists and is equal to some number c > 0, then f(n) = Θ(g(n)).
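
To illustrate the limit rule (an example of mine, not from the book's text), take f(n) = 3n^2 + 5n and g(n) = n^2; the ratio tends to c = 3 > 0, so f(n) = Θ(n^2):

  # Minimal sketch: watch f(n)/g(n) approach a positive constant,
  # suggesting f(n) = Theta(g(n)) by the limit rule above.
  def f(n): return 3 * n**2 + 5 * n
  def g(n): return n**2
  for n in [10, 100, 1000, 10000]:
      print(n, f(n) / g(n))   # tends toward 3.0 as n grows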

Properties of Asymptotic Growth Rates

  • Transitivity:
    • If f = O(g) and g = O(h) then f = O(h)
    • If f = Ω(g) and g = Ω(h) then f = Ω(h)
    • If f = Θ(g) and g = Θ(h) then f = Θ(h)
  • Sums of functions:
    • If f = O(h) and g = O(h) then f + g = O(h)
    • If f = Ω(h) and g = Ω(h) then f + g = Ω(h)
    • If f = Θ(h) and g = Θ(h) then f + g = Θ(h)
    • Let k be a fixed constant, and let f_1, f_2, …, f_k and h be functions such that f_i = O(h) for all i. Then f_1 + f_2 + … + f_k = O(h).
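
As a small numerical illustration (not from the book's text) of the two-function case: if f ≤ c1·h and g ≤ c2·h beyond some n₀, then f + g ≤ (c1 + c2)·h beyond that same n₀.

  # Minimal sketch with hypothetical functions: f = O(n^2) with
  # c1 = 3 (for n >= 1), g = O(n^2) with c2 = 5, so f + g <= 8*n^2.
  def f(n): return 2 * n**2 + n
  def g(n): return 5 * n**2
  def h(n): return n**2
  c1, c2 = 3, 5
  for n in range(1, 10001):
      assert f(n) + g(n) <= (c1 + c2) * h(n)
  print("f + g = O(h) with constant c1 + c2 =", c1 + c2)
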
Asymptotic Bounds for Some Common Functions

  • Polynomials: For a polynomial T(n) = a_0 + a_1·n + … + a_d·n^d of degree d in which the coefficient a_d is positive, T(n) = O(n^d).
  • Logarithms: log_b n is the number x such that b^x = n. Rounded down to the nearest integer, it is one less than the number of digits in the base-b representation of n (for example, log_10 1000 = 3, and 1000 has 4 digits).
    • For every b > 1 and every x > 0, log_b n = O(n^x).
  • Exponentials: Let f(n) = r^n. For every r > 1 and every d > 0, n^d = O(r^n).

These three families are the functions most frequently encountered when analyzing running times: logarithms grow more slowly than polynomials, and polynomials grow more slowly than exponentials.
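
To see this ordering concretely (an illustration of mine, not from the book's text), compare log2 n, n^2, and 2^n at a few sizes:

  # Minimal sketch: log n grows more slowly than a polynomial,
  # which grows more slowly than an exponential.
  import math
  for n in [10, 20, 50, 100]:
      print(n, math.log2(n), n**2, 2**n)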
