In class, we especially emphasized the concept of asymptotic order of growth, so the key points are not unfamiliar. However, due to time constraints, we skipped much of the detailed proofs, which I think are not very hard to follow but are good to go over anyway. So the readability score is also 6.
  
===== Section 3: Implementing the Stable Matching Algorithm Using Lists and Arrays =====
This section introduces the advantages and disadvantages of lists and arrays in implementations. First, it covers implementation with arrays: the advantage is constant-time access to any element by index; the disadvantage is the fixed size. Second, it discusses implementation with lists, particularly linked lists: insertion and deletion take constant time (provided the element's position is known), but access takes linear time. The book then suggests converting between the array and list representations as needed. The last part of the section describes how to implement the stable matching algorithm using both arrays and lists to achieve O(n^2) running time, explaining in detail when and why to use an array or a list; a sketch of that implementation is given below. We covered this carefully and in detail in class, so the readability score is 6.
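Below is a minimal Python sketch of the Gale-Shapley proposal loop under the array/list scheme the section describes. The names next_proposal, current, and ranking are my own stand-ins for the book's Next, Current, and Ranking arrays, and a plain Python list plays the role of the linked list of free men.

<code python>
def stable_matching(man_pref, woman_pref):
    """Gale-Shapley in O(n^2) using the array/list scheme from the section.

    man_pref[m]   : list of women in m's order of preference
    woman_pref[w] : list of men in w's order of preference
    Returns current, where current[w] is the man matched to woman w.
    """
    n = len(man_pref)
    # ranking[w][m] = position of m on w's list, so a woman can compare
    # two proposers in constant time.
    ranking = [[0] * n for _ in range(n)]
    for w in range(n):
        for pos, m in enumerate(woman_pref[w]):
            ranking[w][m] = pos

    next_proposal = [0] * n      # index of the next woman m will propose to
    current = [None] * n         # man woman w is engaged to, or None
    free_men = list(range(n))    # stands in for the linked list of free men

    while free_men:
        m = free_men.pop()                    # constant-time delete at the end
        w = man_pref[m][next_proposal[m]]     # constant-time array access
        next_proposal[m] += 1
        if current[w] is None:
            current[w] = m
        elif ranking[w][m] < ranking[w][current[w]]:
            free_men.append(current[w])       # old partner becomes free again
            current[w] = m
        else:
            free_men.append(m)                # w rejects m; m stays free
    return current

# Tiny example with n = 2: both men rank woman 0 first, both women rank man 0 first.
print(stable_matching([[0, 1], [0, 1]], [[0, 1], [0, 1]]))  # [0, 1]
</code>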

===== Section 4: A Survey of Common Running Times =====
This section introduces several common running times and provides examples. Examples of linear running time are computing a maximum and merging two sorted lists; the n log n example is mergesort; the quadratic example is the brute-force algorithm for finding the closest pair of points; an example of O(n^k) time is the brute-force algorithm for testing whether a graph on n nodes contains an independent set of size k. It then gives examples of running times beyond polynomial, such as 2^n and n!, and at the end it shows how binary search achieves O(log n) time, smaller than linear (a small sketch follows below). I found parts of the section very interesting, namely the Independent Set problem and the Traveling Salesman Problem, especially because both are NP-hard and their running times, as well as ways to reduce the running time in special cases, are very interesting topics. The rest I encountered in CS112, and it was discussed quite clearly in class. The readability is 7.
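As a small illustration of the O(log n) bound discussed at the end of the section, here is a generic iterative binary search in Python (my own sketch, not taken from the book): each probe halves the remaining range, so a sorted array of n elements needs at most about log2(n) comparisons.

<code python>
def binary_search(a, target):
    """Return an index of target in the sorted list a, or -1 if absent.

    Each iteration halves the search range, so the loop runs O(log n) times.
    """
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 11))  # 4
</code>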
  
===== Section 5: A More Complex Data Structure: Priority Queues =====
This section focuses on implementing priority queues using a heap. The motivation is to achieve sorting in O(n log n) time. The book defines what a heap is, then illustrates how Heapify-up and Heapify-down work; next, the running time is analyzed. At the end, there is a summary as well as an improvement using an additional Position array. This section basically records, in great detail, what we talked about in class regarding heaps and priority queues built on the heap data structure (a small sketch is included below). I can see it being useful when the memory from lecture becomes rusty in the future, but right now it is a little lacking in color. The readability is 5.
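Here is a minimal Python sketch of the heap operations the section describes: a 0-indexed, list-based min-heap with Heapify-up and Heapify-down. The class and method names are my own; only the operation names follow the book. Both fix-up routines walk a single root-to-leaf path, so insert and extract-min each cost O(log n), and sorting by n inserts followed by n extract-mins gives the O(n log n) mentioned above.

<code python>
class MinHeap:
    """Array-based binary min-heap; children of index i are 2*i+1 and 2*i+2."""

    def __init__(self):
        self.h = []

    def insert(self, key):
        self.h.append(key)
        self._heapify_up(len(self.h) - 1)

    def extract_min(self):
        # Move the last element to the root, then fix the heap order downward.
        self.h[0], self.h[-1] = self.h[-1], self.h[0]
        smallest = self.h.pop()
        if self.h:
            self._heapify_down(0)
        return smallest

    def _heapify_up(self, i):
        # While the new key is smaller than its parent, swap it upward.
        while i > 0 and self.h[i] < self.h[(i - 1) // 2]:
            parent = (i - 1) // 2
            self.h[i], self.h[parent] = self.h[parent], self.h[i]
            i = parent

    def _heapify_down(self, i):
        # While some child is smaller, swap the key down toward the leaves.
        n = len(self.h)
        while True:
            left, right, smallest = 2 * i + 1, 2 * i + 2, i
            if left < n and self.h[left] < self.h[smallest]:
                smallest = left
            if right < n and self.h[right] < self.h[smallest]:
                smallest = right
            if smallest == i:
                return
            self.h[i], self.h[smallest] = self.h[smallest], self.h[i]
            i = smallest

# Sorting with the heap: n inserts plus n extract-mins, i.e. O(n log n) overall.
hp = MinHeap()
for x in [5, 1, 4, 2, 3]:
    hp.insert(x)
print([hp.extract_min() for _ in range(5)])  # [1, 2, 3, 4, 5]
</code>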