courses:cs211:winter2018:journals:holmesr:section_2.2 [2018/01/21 00:58] (current) – holmesr
====== Section 2.2 Asymptotic Order of Growth ======

An algorithm's worst-case running time can be characterized quantitatively by proving asymptotic upper and lower bounds. To see how an asymptotic upper bound of n<sup>2</sup> is proven, imagine an algorithm whose worst-case running time is T(n) = pn<sup>2</sup>+qn+r for positive constants p, q, and r. This function is O(n<sup>2</sup>) because, for all n >= 1, T(n) = pn<sup>2</sup>+qn+r <= pn<sup>2</sup>+qn<sup>2</sup>+rn<sup>2</sup> = (p+q+r)n<sup>2</sup>; thus T(n) <= cn<sup>2</sup> where c = p+q+r.
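A quick numerical sanity check of this upper-bound argument can be sketched in Python (the values of p, q, and r are arbitrary illustrative constants, not from the text):

```python
# Check that T(n) = p*n^2 + q*n + r <= (p+q+r)*n^2 for all n >= 1.
# p, q, r are arbitrary positive constants chosen for illustration.
p, q, r = 3.0, 5.0, 2.0
c = p + q + r  # the constant from the proof

for n in range(1, 1000):
    T = p * n**2 + q * n + r
    assert T <= c * n**2, f"bound fails at n={n}"
```

The check passes for every tested n, matching the inequality in the proof.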

An algorithm's asymptotic lower bound can be proven in a similar fashion. First, a note about notation: asymptotic lower bounds are conventionally represented by an omega, which cannot be represented in this markup, so <OMEGA> will be its stand-in. An algorithm whose worst-case running time is T(n) = pn<sup>2</sup>+qn+r is <OMEGA>(n<sup>2</sup>) because, for all n >= 0, T(n) = pn<sup>2</sup>+qn+r >= pn<sup>2</sup>; thus T(n) >= cn<sup>2</sup> where c = p.
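The lower-bound inequality can be checked the same way (again with arbitrary illustrative constants):

```python
# Check that T(n) = p*n^2 + q*n + r >= p*n^2 for all n >= 0,
# which holds because q*n + r is nonnegative when q, r > 0.
p, q, r = 3.0, 5.0, 2.0

for n in range(0, 1000):
    T = p * n**2 + q * n + r
    assert T >= p * n**2, f"bound fails at n={n}"
```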

Additionally, we can combine both of these bounds to arrive at an asymptotically tight bound: when T(n) = O(f(n)) and T(n) = <OMEGA>(f(n)), we can say that T(n) = <THETA>(f(n)). Another way to establish an asymptotically tight bound is to take the limit of f(n)/g(n) as n goes to infinity: if that limit exists and equals a constant c > 0, then f(n) = <THETA>(g(n)).
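The limit test can be illustrated numerically with the same hypothetical T(n) = pn<sup>2</sup>+qn+r against g(n) = n<sup>2</sup>:

```python
# The ratio T(n)/n^2 approaches p, a positive constant, as n grows,
# so T(n) = <THETA>(n^2). Constants are arbitrary illustrative values.
p, q, r = 3.0, 5.0, 2.0

def ratio(n):
    return (p * n**2 + q * n + r) / n**2

# For large n the ratio is within a small distance of p.
assert abs(ratio(10**6) - p) < 1e-3
```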

It is also important to note that asymptotic growth rates are transitive (if f = O(g) and g = O(h), then f = O(h), and likewise for <OMEGA> and <THETA>, so long as both bounds run in the same direction) and can be summed (if f = O(h) and g = O(h), then f+g = O(h)).
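The sum property can be spot-checked with two hypothetical functions that are each O(n<sup>2</sup>) (the functions and constants below are arbitrary illustrative choices):

```python
import math

# If f(n) <= c1*h(n) and g(n) <= c2*h(n), then f(n) + g(n) <= (c1+c2)*h(n).
def f(n): return 4 * n            # O(n^2) with constant c1 = 4
def g(n): return n * math.log(n)  # O(n^2) with constant c2 = 1
def h(n): return n**2

for n in range(2, 10000):
    assert f(n) <= 4 * h(n)
    assert g(n) <= 1 * h(n)
    # the sum is bounded using the summed constant 4 + 1 = 5
    assert f(n) + g(n) <= 5 * h(n)
```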

Finally, the section includes a brief treatment of asymptotic bounds for common functions. Summarily, it states that for every constant d > 0 and every constant r > 1, log n = O(n<sup>d</sup>) and n<sup>d</sup> = O(r<sup>n</sup>); informally, O(log n) < O(n<sup>d</sup>) < O(r<sup>n</sup>), so logarithms grow more slowly than any polynomial, which in turn grows more slowly than any exponential with base greater than 1.
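This ordering holds even for a small polynomial exponent and a base barely above 1, as a quick check shows (d, r, and n are arbitrary illustrative values; the comparison is done on a log scale because r<sup>n</sup> itself would overflow a float):

```python
import math

d, r = 0.5, 1.01   # small exponent, base barely above 1
n = 10**6

# Compare on a log scale: if log a < log b, then a < b.
log_of_log = math.log(math.log(n))  # log(log n)
log_of_poly = d * math.log(n)       # log(n^d)
log_of_exp = n * math.log(r)        # log(r^n)

assert log_of_log < log_of_poly < log_of_exp
```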