

Chapter 2

Section 2.1 Computational Tractability

The authors begin by stating the importance of efficient algorithms for discrete problems. They then attempt to define the term efficiency, concluding that a concrete definition of efficiency is “platform-independent, instance-independent, and of predictive value with respect to increasing size.” The authors then discuss Worst-Case Running Times, explaining that it is best to analyze the worst case of an algorithm because the worst case does a reasonably effective job of truly capturing an algorithm's efficiency. (Average-case analysis is too opaque, since there is rarely a quantifiable "average" input; best-case analysis is obviously too unrealistic.) The authors also explain that Brute-Force Search, the process of trying all possibilities for a problem, is almost always the slowest and least effective algorithm; thus, it can be used as a benchmark that more sophisticated algorithms should beat. Finally, the authors explain the concept of Polynomial Time: an algorithm runs in polynomial time if its running time is bounded by c * N^d for some constants c and d, where N is the input size. This is still a deliberately loose definition, and I will need a firmer one later on. The authors conclude that the best working definition of efficiency is: an algorithm is efficient if it has a polynomial running time. They note, however, that efficiency is ultimately relative. Couldn't a non-polynomial-time algorithm be the most “efficient” or appropriate choice for a given problem? The overall readability of this section is 6/10.
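To convince myself that the polynomial cutoff is a sensible definition, here is a small sketch of my own (not from the text): doubling the input size multiplies an N^d bound by only the constant 2^d, while it squares a 2^N bound.

```python
# My own illustration (not from the book): how step counts scale
# when the input size doubles.

def steps_poly(n, d=2):
    """A polynomial bound, n**d."""
    return n ** d

def steps_exp(n):
    """An exponential bound, 2**n."""
    return 2 ** n

n = 10
# Polynomial: doubling n multiplies the bound by the constant 2**d = 4.
print(steps_poly(2 * n) // steps_poly(n))      # 4
# Exponential: doubling n squares the bound, since 2**(2n) = (2**n)**2.
print(steps_exp(2 * n) == steps_exp(n) ** 2)   # True
```

So a polynomial-time algorithm degrades gracefully as inputs grow, which is exactly the "predictive value with respect to increasing size" the authors ask for.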

Section 2.2 Asymptotic Order of Growth

The authors note that discussing an algorithm's worst-case running time requires some function f(n) to which the running time can be related as n increases. They also note that running times need not be expressed exactly in terms of case-specific constants; simplifying them is acceptable. Asymptotic Upper Bounds: the worst-case running time T(n) is O(f(n)) if there exist constants c > 0 and n_0 >= 0 so that for all n >= n_0, we have T(n) <= c * f(n). The authors explain that in this case, T is asymptotically upper-bounded by f. Asymptotic Lower Bounds: T(n) is Ω(f(n)) if there exist constants ε > 0 and n_0 >= 0 so that for all n >= n_0, we have T(n) >= ε * f(n). In this case, T is asymptotically lower-bounded by f. Asymptotically Tight Bounds: if a running time is both O(f(n)) and Ω(f(n)), then we have found the best possible bound, where T(n) grows like f(n) to within a constant factor; we then say that T(n) is Θ(f(n)). Some important properties of growth rates are that they are transitive (e.g. if f = O(g) and g = O(h), then f = O(h)) and that they are closed under addition (e.g. if f = O(h) and g = O(h), then f + g = O(h)). This section was definitely mathematically dense and I will need some practice. Overall the readability is 5/10.
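To practice the definitions, here is a worked example of my own (the constants c, ε, and n_0 are hand-picked, not from the text), checking that T(n) = 3n^2 + 5n is Θ(n^2):

```python
# My own worked example of the O / Omega definitions above.
def T(n):
    return 3 * n * n + 5 * n

def f(n):
    return n * n

# Upper bound: T(n) <= c*f(n) with c = 4 and n_0 = 5,
# since 3n^2 + 5n <= 4n^2  iff  5n <= n^2  iff  n >= 5.
assert all(T(n) <= 4 * f(n) for n in range(5, 1000))

# Lower bound: T(n) >= eps*f(n) with eps = 3 and n_0 = 1,
# since 3n^2 + 5n >= 3n^2 for all n >= 1.
assert all(T(n) >= 3 * f(n) for n in range(1, 1000))

# Both bounds hold, so T(n) is Theta(n^2).
print("T(n) = 3n^2 + 5n is Theta(n^2)")
```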

Section 2.3 Implementing the Stable Matching Problem Using Lists and Arrays

Whenever we implement an algorithm, we almost always use specific data structures. The Gale-Shapley algorithm can be implemented using only lists and arrays, two of the simplest data structures. The authors then discuss arrays and lists. Some key pros and cons for each are:

   Arrays: O(1) indexing, not dynamic 
   Lists: dynamic, O(i) indexing

Question We tend to measure efficiency in the worst case; however, why do we measure list indexing as O(i)? If we are trying to find an element in a list, and we are traversing across nodes, wouldn't that be O(n) in the worst case and therefore be considered O(n) in general?
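To make the O(i) cost concrete, here is a sketch of my own (not from the text). As I understand it, O(i) is the parameterized cost of reaching position i specifically; the worst case, i = n - 1, does collapse to O(n), so the two statements are consistent.

```python
# My own sketch: arrays give O(1) access by position, while a
# singly linked list must follow i pointers to reach position i.

class Node:
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def list_index(head, i):
    """O(i): follow i next-pointers from the head."""
    node = head
    for _ in range(i):
        node = node.next
    return node.value

# Build the linked list 0 -> 1 -> 2 -> 3 by prepending.
head = None
for v in [3, 2, 1, 0]:
    head = Node(v, head)

arr = [0, 1, 2, 3]
assert arr[2] == list_index(head, 2) == 2  # same answer, different cost
```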

Back to the algorithm: the men's and women's preference lists are implemented as arrays, one set for the men and one for the women. To identify a free man, we use a linked list, because once a man becomes engaged he must be removed from this list (which requires a dynamic structure). To identify the highest-ranked woman remaining on a man's preference list, we use a separate array called Next. Current engagements are stored in an array called Current, and any new matches are recorded by updating Current. Overall, the running time of the G-S algorithm is O(n^2).
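As a sanity check on this description, here is a sketch of my own in Python (not the book's code) using these structures. The Next and Current names follow the text; the Ranking array, which lets a woman compare two proposers in O(1), and the use of a deque as the "linked list" of free men are my additions.

```python
from collections import deque

def gale_shapley(man_pref, woman_pref):
    """Sketch of G-S with arrays and a list of free men.

    man_pref[m] / woman_pref[w] are preference orders (best first).
    Returns current, where current[w] is the man matched to woman w.
    """
    n = len(man_pref)
    # Ranking[w][m] = rank of man m on woman w's list (O(1) comparisons).
    ranking = [[0] * n for _ in range(n)]
    for w in range(n):
        for rank, m in enumerate(woman_pref[w]):
            ranking[w][m] = rank

    next_prop = [0] * n          # "Next": index of m's next woman to try
    current = [None] * n         # "Current": woman w's current partner
    free_men = deque(range(n))   # stand-in for the linked list of free men

    while free_men:
        m = free_men[0]
        w = man_pref[m][next_prop[m]]    # highest-ranked woman not yet tried
        next_prop[m] += 1
        if current[w] is None:           # w is free: engage
            current[w] = m
            free_men.popleft()
        elif ranking[w][m] < ranking[w][current[w]]:
            free_men.popleft()           # w trades up; old partner is free
            free_men.append(current[w])
            current[w] = m
        # else: w rejects m, who stays free and proposes again next round
    return current
```

Each man proposes to each woman at most once, so the while loop runs at most n^2 times with O(1) work per iteration, matching the O(n^2) bound above.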

courses/cs211/winter2018/journals/donohuem/chapter2.1516669334.txt.gz · Last modified: by donohuem
CC Attribution-Noncommercial-Share Alike 4.0 International