courses:cs211:winter2018:journals:holmesr:section_2.1 (holmesr, last revised 2018/01/29 19:21)
I found this section easy to read, and I appreciated how it began with a review of the unique characteristics of each data structure and then discussed the steps that occur inside the loop, including details that guided the reader toward implementing one data structure or the other. The final part, with its rigorous explanations of why one structure was preferred over the other for each step, showed what to look for in the components of other algorithms when trying to decide which data structures to implement.
| + | |||
| + | ===== Section 2.4 A Survey or Common Running Times ===== | ||
| + | |||
| + | The first common running time treated in this section is O(n) running time. This is sometimes described as a " | ||
| + | |||
| + | Another common run time is O(n log n). This is the run time of any algorithm that cuts an input in half, recursively solves each half and recombines them. An example of an algorithm that works like this is mergesort. | ||
| + | |||
| + | Quadratic and Cubic times are also seen a lot, due to the fact that they are the product of a loop within a loop or a loop within a loop within a loop, respectively. Quadratic time can also arise from a situation where an algorithm must perform a search over all input items and then expend constant time operating on each pair. One algorithm that uses an O(n< | ||
| + | |||
| + | Some algorithms can run in the worst case at times that are worse than polynomial. These include running time such as exponentials and factorial growth rates. Exponentials arise in algorithms that consider all subsets, such as Independent Set, which is a computationally hard problem that requires an algorithm to find the independent set of maximum size. Factorials grow even faster than exponentials, | ||
| + | |||
| + | Sublinear time is a special treat that arises when an input can be " | ||
| + | |||
| + | This chapter was very straightforward and easy to read due to it discussion of running times, which I have studied before. | ||
| + | |||
| + | ===== Section 2.5 Priority Queues ===== | ||
| + | |||
| + | Statement 2.11 from the book says that "a sequence of O(n) priority queue operations can be used to sort a set of n numbers" | ||
| + | |||
| + | Regardless, we will use a heap to implement our priority queue. It helps to visualize a heap as a balanced binary tree with a root node and each node having two or fewer child nodes. The heap will be represented in an array with an object at position i, its left child at 2i and its right child at 2i+1. Importantly, | ||
| + | |||
| + | Maintaining the heap order in an array during additions and deletions requires special algorithms that rearrange the nodes and allow for insertion and deletion in O(log n) time. First, to add an item to the heap, it is inserted into position n + 1 (for a heap of size n) and then Heapify-up is called. Heapify-up checks whether the recently inserted value is less than the value of it parent node and if so, it swaps the two and recursively calls itself. This maintains heap order by pushing lower values towards the top of the array. To delete an item in a heap and maintain heap order, the item at position n is placed in the array where the item to be deleted had been, and then Heapify-down is called. As one would expect, Heapify-down works in the opposite direction of Heapify-up by comparing a parent to its children and swapping the parent with the lower child if possible. This algorithm also calls itself recursively until heap order has been restored. | ||
| + | |||
| + | Some important running times to note when implementing priority queues with heaps: | ||
| + | |||
| + | * '' | ||
| + | |||
| + | * '' | ||
| + | |||
| + | * '' | ||
| + | |||
| + | * '' | ||
| + | |||
| + | * '' | ||
