I generally find data structures and implementation extremely important, so I give this section a 7/10.

===== 2.4: A Survey of Common Running Times =====

True to its name, this section discusses what gives different algorithms their characteristic run times, so that we can understand why algorithms end up in these common bounds. The most common source of linear run time is passing over each element of the input once and performing a constant amount of work per element. The section also shows how a clever algorithm can take a problem like merging two sorted lists, which at first glance looks like it would take n squared time, and solve it in just order n. Algorithms running in n log n often split the input in half, solve each half recursively, and merge the halves back together in linear time. Quadratic run time arises from algorithms that act on all pairs of input elements with constant work per pair; nested loops are the typical source. Cubic run time usually comes from more elaborate nested loops, and at this point many algorithms stop being practical for inputs of reasonable size. We get greater powers of n from considering all subsets of size k. A running time of 2 to the n can come from iterating over every single subset of a set, and n! can come from checking every possible ordering of the elements in the input. Lastly, sublinear run times usually occur when a constant amount of computation throws out a constant fraction of the input, as in binary search.
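The linear-time merge mentioned above can be sketched as follows. This is a minimal sketch of my own, not the book's code: one pointer walks through each sorted list, and each comparison permanently consumes one element, so the total work is order n.

```python
# Hypothetical sketch: merge two already-sorted lists in O(n) time
# by advancing one pointer through each list.
def merge_sorted(a, b):
    merged = []
    i = j = 0
    while i < len(a) and j < len(b):
        # Each comparison moves exactly one element into the output,
        # so the loop runs at most len(a) + len(b) times.
        if a[i] <= b[j]:
            merged.append(a[i])
            i += 1
        else:
            merged.append(b[j])
            j += 1
    # One list is exhausted; the rest of the other is already in order.
    merged.extend(a[i:])
    merged.extend(b[j:])
    return merged
```

This is also exactly the combine step that gives mergesort its n log n bound: halve, recurse, merge in linear time.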

I found this section quite exhaustive, as it provided both examples and logic. As such, I do not really have any questions about it. It serves to put a face and a reason to the typical names we hear for common run times.
 +
 +This section did feel as if it dragged on a bit, so I give it a 5/10.
 +
 +===== 2.5: A More Complex Data Structure: Priority Queues =====
 +
The section goes over what a priority queue is: a set in which each element has an associated key that determines its priority, with lower keys meaning higher priority. Managing computer processes is one example of when this is useful. It goes on to discuss an implementation that allows elements to be added, removed, and the minimum element extracted, each in logarithmic time; with those bounds, priority queue operations can sort a set in n log n time. To implement the priority queue, the book uses a heap data structure. A heap inserts an element by placing it at the end and running Heapify-Up, which takes logarithmic time. To remove the minimum element of the heap, we use ExtractMin, which replaces the root with the last element; if the replacement element's key is too small for its new position, we use Heapify-Up, and if it is too big, we use Heapify-Down. Both of these procedures fix the heap in log n time, so we can also delete an element in log n time. The section then goes on to explain how to implement the priority queue with a heap and discusses the necessary methods for doing so. Finally, to add extra functionality to the queue, specifically the ability to access nodes by position, we keep a separate array organized in a manner that lets us track each element's position.
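The heap operations above can be sketched roughly as follows. This is my own minimal sketch, not the book's code, and the method names (insert, extract_min) are my own: the heap lives in a list, with the children of index i at 2i+1 and 2i+2, so each Heapify-Up or Heapify-Down pass climbs or descends one level per step, giving the log n bounds.

```python
# Hypothetical min-heap sketch: a list-backed binary heap where the
# children of index i sit at 2*i + 1 and 2*i + 2.
class MinHeap:
    def __init__(self):
        self.items = []

    def insert(self, key):
        # Append at the end, then Heapify-Up: swap with the parent
        # while the new key is smaller. O(log n) levels at most.
        self.items.append(key)
        i = len(self.items) - 1
        while i > 0:
            parent = (i - 1) // 2
            if self.items[i] < self.items[parent]:
                self.items[i], self.items[parent] = self.items[parent], self.items[i]
                i = parent
            else:
                break

    def extract_min(self):
        # Move the last element into the root, then Heapify-Down:
        # swap with the smaller child until the heap order holds.
        items = self.items
        minimum = items[0]
        last = items.pop()
        if items:
            items[0] = last
            i, n = 0, len(items)
            while True:
                left, right = 2 * i + 1, 2 * i + 2
                smallest = i
                if left < n and items[left] < items[smallest]:
                    smallest = left
                if right < n and items[right] < items[smallest]:
                    smallest = right
                if smallest == i:
                    break
                items[i], items[smallest] = items[smallest], items[i]
                i = smallest
        return minimum
```

Inserting n elements and then repeatedly extracting the minimum is precisely the n log n sort the section describes (heapsort, in effect).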

This section is important in giving an example of how to use an appropriate data structure to efficiently implement a more complicated idea. I would like to see more examples of this throughout the text, as I consider it integral to developing algorithms that are useful in the real world, not just ones that look good on paper.

Overall, while important, the reading felt a little slow since we have already learned all of this in class. It earns a 4/10.