courses:cs211:winter2018:journals:holmesr:section_2.1 [2018/01/29 19:21] (current) – holmesr

Finally, asymptotic growth rates have properties of transitivity (if f is bounded by g and g is bounded by h, then f is bounded by h, so long as f and g are bounded in the same direction) and the ability to be summed (if f is upper bounded by h and g is upper bounded by h, then f+g is upper bounded by h).

===== Section 2.3 Implementing the G-S Algorithm =====
Analytically, the G-S algorithm terminates after at most n<sup>2</sup> iterations of its while loop, so if every step inside the loop can be implemented in constant time, the whole algorithm runs in O(n<sup>2</sup>) time.
The first step in deciding which data structures to use is identifying what happens in each instruction within the loop: 1) choose a free man; 2) for a man m, find the highest-ranked woman to whom he has not yet proposed; 3) for a woman w, determine whether she is engaged and, if so, identify her current partner; 4) determine which of two men, m or m', a woman w prefers.

For step 1, the algorithm will be choosing a single man in no specific order. If he becomes engaged, he will be deleted from the list, and a man m' will be added to the list if m just replaced him as woman w's fiance. Since the size of this list will be continually changing, it is better to implement it as a linked list, since items can be added to and removed from the front of a linked list in constant time.

Step 2 requires tracking the position of the next woman on each man m's preference list. These positions are all initialized to 1, and every time m needs to propose to a woman, he chooses w = ManPref[m, Next[m]] and Next[m] is incremented, so that finding the next woman to propose to takes constant time. An array is the natural choice here, since the number of men is fixed.

The third step inside the loop is concerned with deciding whether a woman is engaged and, if so, identifying the partner to whom she is engaged. The value Current[w] is m' when w is engaged to m', and null when she is not engaged. This step is an excellent fit for an array: an array holds the same amount of data throughout the running of the algorithm and supports constant-time indexing and replacement.

The last thing that the algorithm must do in constant time is compare a woman w's rankings of her current man m and a new proposer m'. Since the length of each preference list is fixed and the algorithm needs constant-time access to any position in order to keep the overall running time O(n<sup>2</sup>), an array is again the right choice: at the start, the algorithm builds an n x n array Ranking, where Ranking[w, m] contains the rank of man m in woman w's preference list, so that any comparison takes constant time.

I found this section easy to read, and I appreciated how it began with a review of the unique characteristics of each data structure and then discussed the steps that occur inside the loop, including details that ushered the reader towards one implementation or the other. The final part, with rigorous explanations of why one data structure was preferred over the other for each step, helped show what to look for in components of other algorithms when trying to decide which data structures to use.

===== Section 2.4 A Survey of Common Running Times =====

The first common running time treated in this section is O(n) running time. This is sometimes described as "linear" time, because the running time is at most a constant factor times the size of the input. Classic examples include computing the maximum of n numbers and merging two sorted lists, both of which process the input in a single pass.

Another common running time is O(n log n). This is the running time of any algorithm that splits its input in half, recursively solves each half, and then recombines the two solutions in linear time. Mergesort is the standard example of an algorithm that works this way.

Quadratic and cubic times are also common, since they are the product of a loop within a loop, or a loop within a loop within a loop, respectively. Quadratic time can also arise when an algorithm must search over all pairs of input items and spend constant time on each pair. One algorithm that takes O(n<sup>2</sup>) time this way is the brute-force approach to finding the closest pair of points in the plane, which computes the distance between every pair.

Some algorithms have worst-case running times that are worse than any polynomial, such as exponential and factorial growth rates. Exponentials arise in algorithms that consider all subsets of the input, such as the brute-force approach to Independent Set, a computationally hard problem that asks for an independent set of maximum size. Factorials grow even faster than exponentials; they arise in problems that search over all perfect matchings or all orderings, such as the Traveling Salesman Problem.

Sublinear time is a special treat that arises when an input can be "queried" rather than read in its entirety. The classic example is binary search on a sorted array, which runs in O(log n) time because each probe discards half of the remaining input.

This chapter was very straightforward and easy to read due to its discussion of running times, which I have studied before.

===== Section 2.5 Priority Queues =====

Statement 2.11 from the book says that "a sequence of O(n) priority queue operations can be used to sort a set of n numbers." Since comparison-based sorting is known to require O(n log n) time in general, this suggests aiming for an implementation in which each priority queue operation takes O(log n) time.

Regardless, we will use a heap to implement our priority queue. It helps to visualize a heap as a balanced binary tree with a root node, in which each node has two or fewer children. The heap is represented in an array, with an object at position i, its left child at position 2i, and its right child at position 2i+1. Importantly, the heap must maintain heap order: the key of every element is at least as large as the key of its parent, so the minimum element always sits at the root.

Maintaining heap order in the array during additions and deletions requires special algorithms that rearrange the nodes and allow for insertion and deletion in O(log n) time. First, to add an item to the heap, it is inserted at position n + 1 (for a heap of size n) and then Heapify-up is called. Heapify-up checks whether the recently inserted value is less than the value of its parent node and, if so, swaps the two and recursively calls itself. This maintains heap order by pushing lower values towards the top of the array. To delete an item while maintaining heap order, the item at position n is moved into the position where the deleted item had been, and then Heapify-down (or, if the moved item is smaller than its new parent, Heapify-up) is called. As one would expect, Heapify-down works in the opposite direction of Heapify-up, comparing a parent to its children and swapping the parent with its smaller child when necessary. Each routine calls itself recursively until heap order has been restored.

Some important running times to note when implementing priority queues with heaps:

  * ''StartHeap(N)'' – O(N) time to set up an empty heap with capacity N
  * ''Insert(v)'' – O(log n) time to insert a new element v
  * ''FindMin()'' – O(1) time to identify the minimum element, which sits at the root
  * ''Delete(i)'' – O(log n) time to delete the element at position i
  * ''ExtractMin()'' – O(log n) time to return and remove the minimum element
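These operations map closely onto Python's ''heapq'' module, sketched below; note that ''heapq'' exposes no arbitrary-position Delete(i), so only the remaining operations are shown.

```python
import heapq

numbers = [9, 4, 7, 1]
heap = list(numbers)
heapq.heapify(heap)              # StartHeap-style setup in O(N)

heapq.heappush(heap, 3)          # Insert(v): O(log n)
smallest = heap[0]               # FindMin(): O(1) peek at the root
extracted = heapq.heappop(heap)  # ExtractMin(): O(log n)
```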
