Chapter 2
Section 2.1 Computational Tractability
The first attempt to describe efficiency is that "an algorithm is efficient if, when implemented, it runs quickly on real input instances." This is a reasonable start, but it fails to specify the type of machine on which the algorithm is run, how well it is implemented, and how its running time scales as the input instances grow.
We can consider worst-case running times and brute-force search on the way to a better definition of efficiency. Worst-case running time is the right measure for comparing algorithms because the alluring alternative, average-case analysis, immediately raises a debate over how the "random" inputs to the function are to be generated. A good benchmark against which to compare an algorithm's worst-case running time is the running time of the brute-force method, which traverses the entire search space of possible solutions (for stable matching, for example, all n! perfect matchings). These two components lead us to the next definition of efficiency: "an algorithm is efficient if it achieves qualitatively better worst-case performance, at an analytical level, than brute-force search." A qualm we may have with this is how to define "qualitatively better" running time.
Polynomial running time gives us a way to make that comparison quantitative. The desirable scaling property states that there exist constants c > 0 and d > 0 such that, on every input of size N, the running time is bounded by cN^d; such an algorithm slows down by only a constant factor of 2^d when N increases to 2N. This yields a third definition of efficiency: "an algorithm is efficient if it has a polynomial running time." One last, unexpected benefit of this definition is that it can be tested quantitatively and proven or disproven for a given algorithm.
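One way to see where the constant factor 2^d comes from (a short worked inequality, not in the original notes): if the running time satisfies T(N) <= cN^d, then

  T(2N) <= c(2N)^d = 2^d * cN^d

so doubling the input size increases the bound by only the fixed factor 2^d, independent of N.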
Section 2.2 Asymptotic Order of Growth
An algorithm's worst-case running time can be analyzed quantitatively by establishing asymptotic upper and lower bounds. To see how to prove an asymptotic upper bound of n^2, imagine an algorithm whose worst-case running time is T(n) = pn^2 + qn + r for positive constants p, q, and r. For all n >= 1 we have T(n) = pn^2 + qn + r <= pn^2 + qn^2 + rn^2 = (p+q+r)n^2, so T(n) <= cn^2 with c = p+q+r, and therefore T(n) = O(n^2).
An algorithm's asymptotic lower bound can be proven in a similar fashion. With the same worst-case running time T(n) = pn^2 + qn + r, we have T(n) = pn^2 + qn + r >= pn^2 for all n >= 0, so T(n) >= pn^2 with the constant p > 0, and therefore T(n) = Ω(n^2).
Additionally, we can combine these two bounds to arrive at an asymptotically tight bound: when T(n) = O(f(n)) and T(n) = Ω(f(n)), it is safe to say that T(n) = Θ(f(n)). Another way to establish an asymptotically tight bound is to take the limit of f(n)/g(n) as n goes to infinity: if that limit exists and equals a constant c > 0, then f(n) = Θ(g(n)).
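For the running time above, this limit is easy to compute (a short worked example, not in the original notes):

  lim (pn^2 + qn + r) / n^2 = lim (p + q/n + r/n^2) = p > 0   as n goes to infinity,

which again shows that pn^2 + qn + r = Θ(n^2).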
Finally, asymptotic growth rates have the property of transitivity (if f is bounded by g and g is bounded by h, then f is bounded by h, provided the same kind of bound, O, Ω, or Θ, is used in both cases) and they behave well under sums (if f is upper bounded by h and g is upper bounded by h, then f+g is upper bounded by h).
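A useful consequence of the sum rule, tying it back to the example above: each of the terms pn^2, qn, and r is O(n^2), so their sum pn^2 + qn + r is O(n^2) as well.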
Section 2.3 Implementing the G-S Algorithm
Analytically, we have found that the Gale-Shapley stable matching algorithm terminates after at most n^2 iterations of the loop that drives its progress, so it runs in O(n^2) time provided each step within the loop runs in constant time. Our task, then, is to determine which data structures to implement in order to support this constant-time execution of the steps within the loop.
The first step in deciding which data structures to implement is identifying what must happen in each instruction within the loop:
1) identify a free man;
2) for a man m, find the highest-ranked woman to whom he has not yet proposed;
3) for a woman w, determine whether she is engaged and, if so, identify her current partner;
4) for a woman w and two men m and m', determine which of the two she prefers.
A sketch of data structures that support each of these operations in constant time follows.
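Here is a minimal sketch in Python of one way to get each of these four operations down to O(1); the variable names (free_men, next_proposal, current_partner, ranking) and the tiny example preference lists are illustrative choices, not taken from the text:

# Hypothetical setup: n men and n women, identified by integers 0..n-1.
# man_pref[m] lists the women in man m's order of preference (best first);
# woman_pref[w] lists the men in woman w's order of preference (best first).
n = 3
man_pref = [[0, 1, 2], [1, 0, 2], [0, 2, 1]]
woman_pref = [[1, 0, 2], [0, 1, 2], [2, 1, 0]]

free_men = list(range(n))      # list used as a stack of free men: pop/append are O(1)  -- step 1
next_proposal = [0] * n        # next_proposal[m]: index into man_pref[m] of the next woman to try  -- step 2
current_partner = [None] * n   # current_partner[w]: man w is engaged to, or None  -- step 3
# ranking[w][m]: position of man m on woman w's list, precomputed in O(n^2) time,
# so "does w prefer m to m'?" becomes a single array comparison  -- step 4
ranking = [[0] * n for _ in range(n)]
for w in range(n):
    for rank, m in enumerate(woman_pref[w]):
        ranking[w][m] = rank

# Main loop: at most n*n iterations, each doing O(1) work.
while free_men:
    m = free_men.pop()                        # choose a free man
    w = man_pref[m][next_proposal[m]]         # highest-ranked woman he has not yet proposed to
    next_proposal[m] += 1
    if current_partner[w] is None:            # w is free: she becomes engaged to m
        current_partner[w] = m
    elif ranking[w][m] < ranking[w][current_partner[w]]:
        free_men.append(current_partner[w])   # w prefers m: her former partner becomes free
        current_partner[w] = m
    else:
        free_men.append(m)                    # w rejects m: he remains free

print(current_partner)   # current_partner[w] is the stable partner of woman w

The key design choice is the ranking array: spending O(n^2) time up front to record every man's position on every woman's list turns step 4 into one array lookup instead of a scan of w's preference list, which is what keeps each loop iteration constant time.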
