Chapter 2

2.3: Implementing the Stable Matching Algorithm Using Lists and Arrays

*Arrays are simple to implement
*Constant-time retrieval of the ith element in the list
*O(n) time to find a given element in an unsorted array, since you may have to scan the whole thing
*If the array is sorted you can use binary search (O(log n)) to find an element
*Arrays do not work well if the data is constantly changing through insertions and deletions, as is the case with the set of free men in the stable matching problem
*Linked lists are much more efficient than arrays for adding and removing elements, but lack the same efficiency for finding the ith element (that takes O(i) time)
*A better solution is to use a combination of these two structures, converting between them (an O(n) preprocessing step) when convenient; a rough sketch of this idea applied to Gale-Shapley appears below
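
As a concrete illustration, here is a minimal Python sketch of that combination, not the book's code: the free men sit in a list-like structure (a deque) so adding and removing is O(1), while preferences, rankings, and current engagements sit in arrays for O(1) lookup. The names (gale_shapley, next_proposal, current_partner, etc.) are my own.

  from collections import deque

  def gale_shapley(men_prefs, women_prefs):
      """men_prefs[m] and women_prefs[w] are lists of indices, most preferred first."""
      n = len(men_prefs)
      # Array: rank[w][m] = position of man m in woman w's list, for O(1) comparisons
      rank = [[0] * n for _ in range(n)]
      for w in range(n):
          for pos, m in enumerate(women_prefs[w]):
              rank[w][m] = pos
      next_proposal = [0] * n          # array: next woman each man will propose to
      current_partner = [None] * n     # array: woman w's current fiance
      free_men = deque(range(n))       # list-like structure holding the free men
      while free_men:
          m = free_men.popleft()
          w = men_prefs[m][next_proposal[m]]
          next_proposal[m] += 1
          if current_partner[w] is None:
              current_partner[w] = m
          elif rank[w][m] < rank[w][current_partner[w]]:
              free_men.append(current_partner[w])   # old partner becomes free
              current_partner[w] = m
          else:
              free_men.append(m)                    # w rejects m; he stays free
      return current_partner

Each man proposes at most n times, and every step inside the loop costs O(1) thanks to the arrays, which is where the O(n^2) bound for the algorithm comes from.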

2.4: A Survey of Common Running Times

*Linear Time (O(n)): the running time is at most a constant factor times the size of the input (c * n steps for some constant c)

  • Linear time is illustrated by the two algorithms on pages 48 and 49 (finding the maximum of n numbers and merging two sorted lists); a sketch of the merge appears below
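
For instance, merging two sorted lists takes a single pass over both inputs; a rough sketch (function and variable names are mine):

  def merge(a, b):
      """Merge two sorted lists in O(len(a) + len(b)) time."""
      result, i, j = [], 0, 0
      while i < len(a) and j < len(b):
          if a[i] <= b[j]:
              result.append(a[i]); i += 1
          else:
              result.append(b[j]); j += 1
      result.extend(a[i:])   # at most one of these two still has elements left
      result.extend(b[j:])
      return result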

*O(n log n) Time: the running time of any algorithm that splits its input into two equal-sized pieces, solves each piece recursively, and then combines the two solutions in linear time

  • An example of this is the recursive sorting algorithm Merge Sort; see the sketch below.
  • Merge Sort follows the principle that “sorting's hard, so don't do it.” It simply divides the problem into smaller and smaller pieces and then merges everything back together into a sorted list.
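
A minimal Merge Sort sketch, assuming the input is a Python list (names are mine):

  def merge_sort(lst):
      """O(n log n): split in half, sort each half recursively, merge in linear time."""
      if len(lst) <= 1:
          return lst
      mid = len(lst) // 2
      left = merge_sort(lst[:mid])
      right = merge_sort(lst[mid:])
      # linear-time merge, same idea as the merge sketch above
      merged, i, j = [], 0, 0
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:
              merged.append(left[i]); i += 1
          else:
              merged.append(right[j]); j += 1
      return merged + left[i:] + right[j:]

Each level of the recursion does O(n) work in the merge, and there are O(log n) levels, which is exactly where the O(n log n) bound comes from.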

*Quadratic Time (O(n^2)): the “brute force” solution to a problem often has quadratic running time

  • Typically caused by a pair of nested loops, each running over the n input items; see the sketch below
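
For example, a brute-force closest-pair-of-points search tries every pair with two nested loops. A sketch, assuming points are (x, y) tuples (names are mine):

  import math

  def closest_pair_brute_force(points):
      """Try every pair: two nested loops over n points -> O(n^2) time."""
      best = math.inf
      best_pair = None
      for i in range(len(points)):
          for j in range(i + 1, len(points)):
              d = math.dist(points[i], points[j])
              if d < best:
                  best, best_pair = d, (points[i], points[j])
      return best_pair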

*Cubic Time (O(n^3)): arises from three levels of nested loops

  • For an example of this, look to pages 52 and 53 of the text; a sketch of a triply nested loop follows
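
One way cubic time shows up is three nested loops, e.g. checking whether some pair among n sets (each stored as a plain list of up to n elements) is disjoint. A rough sketch, with names of my choosing:

  def some_pair_disjoint(sets):
      """For each pair of sets, scan one set and test membership in the other.
      With n sets of size up to n, the three nested scans give O(n^3) time."""
      n = len(sets)
      for i in range(n):
          for j in range(i + 1, n):
              disjoint = True
              for x in sets[i]:
                  if x in sets[j]:        # membership test on a list is itself an O(n) scan
                      disjoint = False
                      break
              if disjoint:
                  return (i, j)
      return None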

*O(n^k) Time: this running time arises when you search over all subsets of size k of an n-element set, since there are roughly n^k such subsets (for constant k)

  • An example can be found on page 53; see the sketch below
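
For instance, checking whether a graph has an independent set of size k by trying every size-k subset of the nodes. A sketch, assuming edges are given as pairs of nodes (names are mine):

  from itertools import combinations

  def has_independent_set_of_size_k(nodes, edges, k):
      """Check every size-k subset: there are O(n^k) of them, and each check
      costs at most O(k^2) edge lookups, so the total is O(n^k) for constant k."""
      edge_set = {frozenset(e) for e in edges}
      for subset in combinations(nodes, k):
          if all(frozenset((u, v)) not in edge_set
                 for u, v in combinations(subset, 2)):
              return subset           # found k mutually non-adjacent nodes
      return None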

*Beyond Polynomial Time (2^n, n!): we want to avoid these running times whenever possible, since they grow prohibitively fast; 2^n arises when searching over all subsets of a set, and n! when searching over all orderings or matchings

  • Examples can be found on page 54; a brute-force 2^n sketch follows
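
As an illustration of a 2^n search, a brute-force maximum independent set that tries every subset of the nodes, largest first; a sketch (names mine), workable only for tiny inputs:

  from itertools import combinations

  def max_independent_set_brute_force(nodes, edges):
      """Try all 2^n subsets of the nodes, from largest to smallest."""
      edge_set = {frozenset(e) for e in edges}
      for size in range(len(nodes), 0, -1):
          for subset in combinations(nodes, size):
              if all(frozenset((u, v)) not in edge_set
                     for u, v in combinations(subset, 2)):
                  return subset       # first hit is a largest independent set
      return ()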

*Sublinear Time (O(log n)): the best example of this is binary search on a sorted array, which never has to look at the whole input; a sketch follows
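
A minimal binary search sketch on a sorted Python list (names are mine):

  def binary_search(sorted_list, target):
      """O(log n): each probe throws away half of the remaining range."""
      lo, hi = 0, len(sorted_list) - 1
      while lo <= hi:
          mid = (lo + hi) // 2
          if sorted_list[mid] == target:
              return mid
          elif sorted_list[mid] < target:
              lo = mid + 1
          else:
              hi = mid - 1
      return -1   # not found

Because the remaining range halves with every comparison, only O(log n) elements are ever probed.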

2.5: A More Complex Data Structure

*A priority queue is designed for applications in which elements have priority values; each time an element is selected from the queue, it should be the element with the highest priority
*PQs can be very useful for rationing memory, time, etc. among different processes on a computer. Processes may not arrive at the same time, but the ones with the highest priorities always jump the line and are executed before those with lower priorities
*We aim for O(log n) time per operation when using priority queues
*Heaps are the standard way to implement priority queues, since they let us perform all of the PQ operations in O(log n) time
*Heaps “combine the advantages of a sorted array and a list”
*A heap can be thought of, conceptually, as a balanced binary tree; a sketch of a heap-backed priority queue follows
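
A minimal sketch of a priority queue backed by a binary heap, using Python's heapq module (heapq is a min-heap, so a smaller number means higher priority here). The class and method names are my own, not the book's heap operations:

  import heapq

  class PriorityQueue:
      """Binary-heap priority queue: insert and extract-min both run in O(log n)."""
      def __init__(self):
          self._heap = []

      def insert(self, priority, item):
          heapq.heappush(self._heap, (priority, item))   # O(log n)

      def extract_min(self):
          return heapq.heappop(self._heap)[1]            # O(log n)

  # Example: processes jump the line according to their priority value
  pq = PriorityQueue()
  pq.insert(3, "low-priority job")
  pq.insert(1, "urgent job")
  print(pq.extract_min())   # -> "urgent job"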

Ending Thoughts

Yet again the book does a good job in this chapter of shoring up any leaks I may have had in my own mind about the information we have covered in class on this topic. While Prof. Sprenkle does a much better job of teaching the material and making it learnable, the extra readings in the textbook allow me to gain a better understanding of the subject matter.

Readability: 7/10
