====== Chapter 2 ======

==== 2.1: Computational Tractability ====

The goal of this course is to write correct and efficient algorithms, and this section goes through how we define efficiency.

Initial attempt at defining efficiency:
  * Too specific
  
==== 2.2 Asymptotic Order of Growth ====
This section discusses how to measure the order of an algorithm and looks at how to find a lower bound, an upper bound, and a tight bound. Our goal is to express the growth rate of running times of algorithms.
  
  
Helpful to have the bounds for common functions, but I am wondering how we will find them in the future.

==== 2.3: Implementing the Stable Matching Algorithm Using Lists and Arrays ====
Moving from the general discussion of the GS algorithm to an actual implementation. First we need to figure out what data structures to use.

Arrays:
  * Arrays have length n, and A[i] is the ith element in the list.
  * Give direct access to the ith element - therefore O(1) time to get the value A[i]
  * Checking if e is in A (e = A[i] for some i) takes O(n) time
  * If the elements are sorted, we can check for e in O(log n) time using binary search.
  * "An array is less good for dynamically maintaining a list of elements that changes over time, such as a list of free men"
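A quick sketch of those array operations (my own Python, not from the book): direct access is one step, while finding an element in a sorted array halves the search range each iteration, which is where the O(log n) comes from.

```python
def binary_search(a, e):
    """Return an index i with a[i] == e, or -1 if e is absent.
    Assumes a is sorted in increasing order; runs in O(log n)."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == e:
            return mid
        elif a[mid] < e:
            lo = mid + 1   # e can only be in the right half
        else:
            hi = mid - 1   # e can only be in the left half
    return -1

a = [2, 5, 7, 11, 13]
print(a[2])                  # direct access: O(1)
print(binary_search(a, 11))  # sorted lookup: O(log n)
```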

Lists:
  * can delete or add elements
  * should use a linked list
  * a possible option is a doubly linked list:
    * deletion: constant time
    * insertion: constant time
    * ith access: O(i) time. No direct access.
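To make those costs concrete, here is a minimal doubly linked list sketch of my own (the class and method names are made up for illustration): given a reference to a node, deleting it or inserting at the front is a constant number of pointer updates, but reaching the ith element requires walking the list.

```python
class Node:
    """A node in a doubly linked list."""
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

class DoublyLinkedList:
    def __init__(self):
        self.head = None

    def push_front(self, value):
        """Insertion at the front: O(1) pointer updates."""
        node = Node(value)
        node.next = self.head
        if self.head:
            self.head.prev = node
        self.head = node
        return node

    def delete(self, node):
        """Deletion: O(1), given a reference to the node."""
        if node.prev:
            node.prev.next = node.next
        else:
            self.head = node.next
        if node.next:
            node.next.prev = node.prev

    def to_list(self):
        """Walking the list is O(n) - there is no direct ith access."""
        out, cur = [], self.head
        while cur:
            out.append(cur.value)
            cur = cur.next
        return out
```

This is exactly the shape needed for the list of free men: a man can be removed when he gets engaged, and re-added when he becomes free, without shifting anything.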

We will need to use both in the stable matching algorithm:
  * identify a free man - use a linked list
  * for a man m, identify the highest-ranked woman to whom he has not yet proposed - use an array
  * for a woman w, decide if w is currently engaged and who her current partner is - use an array
  * for a woman w and two men m and m', check in constant time which of m and m' w prefers - initialize an n x n array of preferences, then we have constant-time access throughout the algorithm

This gives an O(n^2) running time.
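Putting the four structures together, Gale-Shapley might be sketched like this (my own Python; I use a `deque` to stand in for the linked list of free men, since it gives constant-time adds and removes at the ends). Each of the n men proposes at most n times, and every step inside the loop is O(1), giving the O(n^2) bound.

```python
from collections import deque

def stable_matching(man_pref, woman_pref):
    """Gale-Shapley in O(n^2). man_pref[m] lists women in m's order of
    preference (best first); woman_pref[w] likewise lists men.
    Men and women are numbered 0..n-1. Returns partner[w] = matched man."""
    n = len(man_pref)
    # n x n array: rank[w][m] = position of m on w's list, for O(1) comparisons
    rank = [[0] * n for _ in range(n)]
    for w in range(n):
        for i, m in enumerate(woman_pref[w]):
            rank[w][m] = i
    next_proposal = [0] * n    # array: index of next woman on m's list
    partner = [None] * n       # array: partner[w], or None if w is free
    free_men = deque(range(n)) # "linked list" of free men
    while free_men:
        m = free_men[0]
        w = man_pref[m][next_proposal[m]]  # highest-ranked not yet proposed to
        next_proposal[m] += 1
        if partner[w] is None:
            partner[w] = m                 # w was free: engage
            free_men.popleft()
        elif rank[w][m] < rank[w][partner[w]]:  # O(1): w prefers m
            free_men.popleft()
            free_men.append(partner[w])    # old partner becomes free
            partner[w] = m
        # otherwise w rejects m, and m proposes again next iteration
    return partner
```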

Getting into this stuff is more difficult for me. I don't remember running times as well, and I try to think it through logically and actually go through the process, but it doesn't always work. Any step-by-step process would be helpful, but I am guessing it's something you get better at with time.

==== 2.4: A Survey of Common Running Times ====
This section was easier to read since we just spent so much time on it, and I felt pretty comfortable with it.

Linear time
  * O(n)
  * takes a constant times n amount of time
  * computing the maximum - have to go through all n objects
  * Merging Two Sorted Lists - basically check the minimum of each list against the other's minimum, pop off the smaller one, and add it to the new list. The book has more info on why this is O(n); look it over if confused: p. 50
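The merge step above can be sketched as follows (my own Python, not the book's pseudocode). Each comparison moves one element to the output, and n elements total get moved, so the whole thing is O(n).

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list in O(n) time:
    repeatedly compare the front elements and take the smaller."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i])
            i += 1
        else:
            out.append(b[j])
            j += 1
    out.extend(a[i:])  # at most one of these two has anything left
    out.extend(b[j:])
    return out
```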

O(n log n) time
  * very common
  * the Mergesort algorithm splits the set of input numbers into two equal-sized pieces, solves each half recursively, and then merges the sorted halves in linear time.
  * common because in many algorithms the most expensive step is sorting.
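As a sketch (my own Python), mergesort is exactly that description: split, recurse on each half, then do the same linear-time merge as with two sorted lists. There are O(log n) levels of splitting and O(n) work of merging per level.

```python
def mergesort(a):
    """Sort a list in O(n log n): split in half, sort each half
    recursively, merge the two sorted halves in linear time."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = mergesort(a[:mid])
    right = mergesort(a[mid:])
    # linear-time merge of the two sorted halves
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]
```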

Quadratic time
  * O(n^2)
  * trying to find all the pairs
  * two nested loops
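A typical "all the pairs" example is the brute-force closest pair of points (my own sketch): two nested loops enumerate every pair once, which is n(n-1)/2 = O(n^2) distance computations.

```python
def closest_pair_bruteforce(points):
    """Find the closest pair of 2D points by checking all pairs: O(n^2)."""
    best = None
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            (x1, y1), (x2, y2) = points[i], points[j]
            # squared distance is enough for comparing, and avoids sqrt
            d = (x1 - x2) ** 2 + (y1 - y2) ** 2
            if best is None or d < best[0]:
                best = (d, points[i], points[j])
    return best[1], best[2]
```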

Cubic time
  * O(n^3)
  * trying to find all triples (three-element subsets)
  * "elaborate sets of nested loops"

O(n^k) time
  * finding all k-element subsets - similar to the quadratic and cubic cases

Beyond polynomial time
  * finding a maximum independent set in a graph of n nodes
    * talked about this in class
  * n! is huge, and usually there is a way to find smaller bounds

Sublinear time
  * binary search = O(log n)
  * O(log n) arises when we do a constant amount of work in order to throw away a fraction of the input.

This all makes sense to me. I think the biggest struggle will be when I need to design an algorithm with a certain running time, but these are the notes I should look at!

==== 2.5: A More Complex Data Structure: Priority Queues ====
Priority queue:
  * Designed for applications in which elements have a priority value, or key, and each time we need to select an element from S, we want to take the one with the highest priority.
  * A priority queue is a data structure that maintains a set of elements S, where each element v in S has an associated value key(v) that denotes the priority of element v; smaller keys represent higher priorities.

Use a heap to implement a priority queue:
  * gets us O(log n) operations
  * heap order: for every element v at a node i, the element w at i's parent satisfies key(w) <= key(v)
  * leftchild(i) = 2i, rightchild(i) = 2i+1
  * Heapify-Up and Heapify-Down allow the O(log n) bounds
Use these operations:
  * StartHeap(N)
  * Insert(H, v)
  * FindMin(H)
  * Delete(H, i)
  * ExtractMin(H)
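A partial sketch of that interface in my own Python (only Insert, FindMin, and ExtractMin; StartHeap and Delete are omitted). The heap lives in an array, 1-indexed so that leftchild(i) = 2i and rightchild(i) = 2i+1 work out; Insert does Heapify-Up and ExtractMin does Heapify-Down, each walking at most the O(log n) height of the tree.

```python
class MinHeap:
    """Array-based min-heap, 1-indexed: leftchild(i) = 2i,
    rightchild(i) = 2i + 1, parent(i) = i // 2."""
    def __init__(self):
        self.h = [None]  # dummy at index 0 so the index arithmetic works

    def insert(self, key):
        """Insert(H, v): append at the end, then Heapify-Up. O(log n)."""
        self.h.append(key)
        i = len(self.h) - 1
        while i > 1 and self.h[i // 2] > self.h[i]:
            self.h[i], self.h[i // 2] = self.h[i // 2], self.h[i]
            i //= 2

    def find_min(self):
        """FindMin(H): the root holds the smallest key. O(1)."""
        return self.h[1]

    def extract_min(self):
        """ExtractMin(H): move the last element to the root,
        then Heapify-Down. O(log n)."""
        root = self.h[1]
        self.h[1] = self.h[-1]
        self.h.pop()
        i, n = 1, len(self.h) - 1
        while 2 * i <= n:
            c = 2 * i
            if c + 1 <= n and self.h[c + 1] < self.h[c]:
                c += 1                    # pick the smaller child
            if self.h[i] <= self.h[c]:
                break                     # heap order restored
            self.h[i], self.h[c] = self.h[c], self.h[i]
            i = c
        return root
```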

I think heaps are soooo fun! I enjoyed this chapter because it laid out the code in a pretty readable way.