====== Chapter 2 ======
  
  
==== 2.1: Computational Tractability ====
  
The goal of this course is to write correct and efficient algorithms, and this section goes through how we define efficiency.
  * Too specific
  
==== 2.2 Asymptotic Order of Growth ====
  
This section discusses how to measure the order of an algorithm and looks at how to find a lower bound, an upper bound, and a tight bound. Our goal is to express the growth rate of running times of algorithms.
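
A quick worked example for myself (my own numbers, not from the book): take T(n) = 3n^2 + 2n.
  * Upper bound: T(n) <= 3n^2 + 2n^2 = 5n^2 for all n >= 1, so T(n) = O(n^2).
  * Lower bound: T(n) >= 3n^2 for all n >= 1, so T(n) = Omega(n^2).
  * Since n^2 works as both an upper and a lower bound, the bound is tight: T(n) = Theta(n^2).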
This gives O(n^2) running time.
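
Since I always have to convince myself where the n^2 comes from, here is a tiny made-up sketch of my own (not the algorithm from the book): two nested loops over n items, each doing constant work per pair, give n * n = O(n^2) steps.

<code python>
def count_bigger_pairs(items):
    """Count ordered pairs (i, j) where items[i] > items[j]."""
    n = len(items)
    count = 0
    for i in range(n):            # outer loop runs n times
        for j in range(n):        # inner loop runs n times for each i
            if items[i] > items[j]:
                count += 1        # constant work per pair, so n * n = O(n^2) total
    return count

print(count_bigger_pairs([3, 1, 2]))   # prints 3
</code>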
  
Getting into this stuff is more difficult for me. I don't remember running times as well, and when I try to think it through logically and actually go through the process, it doesn't always work. Any step-by-step process would be helpful, but I am guessing it's something you get better at with time.
==== 2.4: A Survey of Common Running Times ====
This section was easier to read since we just spent so much time on it and I felt pretty comfortable with it.
  * Binary search = logn
  * O(logn) arises when we are doing a constant amount of work in order to throw away a fraction of the input (see the sketch below).
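
Here is a minimal binary search sketch of my own (not copied from the book) to remind myself why the O(logn) shows up: every pass does a constant amount of work and throws away half of what is left.

<code python>
def binary_search(a, target):
    """Return an index of target in the sorted list a, or -1 if it is not there."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # constant work per pass
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1           # throw away the lower half
        else:
            hi = mid - 1           # throw away the upper half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # prints 3
</code>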

This all makes sense to me. I think the biggest struggle will be when I need to design an algorithm with a certain running time, but these are the notes I should look at!
  
==== 2.5: A More Complex Data Structure: Priority Queues ====
  * ExtractMin(H) (see the sketch below)
  
I think heaps are soooo fun! I enjoyed this chapter because it laid out the code in a pretty readable way.
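
To make sure I actually remember how ExtractMin works, here is a rough sketch of my own (based on the Heapify-down idea, not the book's actual code) on an array-based min-heap. It is O(logn) because Heapify-down only follows one root-to-leaf path.

<code python>
def extract_min(h):
    """Remove and return the smallest element of the array-based min-heap h."""
    smallest = h[0]
    h[0] = h[-1]                   # move the last element up to the root
    h.pop()
    i, n = 0, len(h)
    while True:                    # Heapify-down: swap with the smaller child
        left, right = 2 * i + 1, 2 * i + 2
        child = i
        if left < n and h[left] < h[child]:
            child = left
        if right < n and h[right] < h[child]:
            child = right
        if child == i:             # heap property restored
            break
        h[i], h[child] = h[child], h[i]
        i = child
    return smallest

heap = [2, 4, 3, 9, 5, 8]          # already satisfies the min-heap property
print(extract_min(heap))           # prints 2, leaving [3, 4, 8, 9, 5]
</code>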
  
  
    