This chapter introduces the basic mathematical tools needed for algorithm analysis.
This section introduces several ways of defining “efficiency”. The first is the literal meaning: the algorithm runs quickly. The second is defined by comparison: if an algorithm can be designed to run faster than the most basic brute-force search, then it is efficient. The third is that an algorithm is efficient if it runs in polynomial time. The author then explains why the third is the most workable definition. This section gives an evolutionary view of the concept of efficiency, which helps me understand and remember the actual definition, though it overlaps heavily with the lecture. So the readability score is 6.
This section introduces the concept of asymptotic order of growth. First, it defines the asymptotic upper bound O, the lower bound Ω, and the tight bound Θ. Then it states several basic properties of these notations and gives detailed proofs. Last, it lists several common functions as examples and describes their asymptotic bounds. In class, we especially emphasized the concept of asymptotic order of growth, so the key points are not unfamiliar. However, due to time constraints, we skipped much of the detailed proof, which I think is not very hard to follow, but good to go over anyway. So the readability score is also 6.
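To keep the notation straight while reading, I find it helpful to restate the three bounds the section defines (my own paraphrase, not the book's exact wording):

```latex
f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 \ge 0 : f(n) \le c \cdot g(n) \text{ for all } n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 \ge 0 : f(n) \ge c \cdot g(n) \text{ for all } n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))
```

For example, 3n^2 + 5n = Θ(n^2): take c = 4 for the upper bound (once n ≥ 5) and c = 3 for the lower bound.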
This section introduces the advantages and disadvantages of lists and arrays in implementations. First, it covers implementation by arrays: the advantage is constant-time access to an element by index; the disadvantage is the fixed size. Second, the section discusses list implementations, particularly the linked list: it supports constant-time deletion and insertion (provided the element's position is known), but accessing an element takes linear time. The book then suggests converting between the array and list representations as needed. The last part of the section describes how to implement the algorithm for the Stable Matching Problem using both arrays and lists to achieve O(n^2) running time. This section gives a detailed explanation of the stable matching algorithm and when and why to use an array or a list. We covered these carefully and in detail, so the readability score is 6.
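To make the trade-off concrete, here is a minimal sketch (my own illustration, not the book's code): an array gives constant-time access by index, while a doubly linked list gives constant-time insertion and deletion once you already hold a reference to the node.

```python
class Node:
    """Node of a doubly linked list."""
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

class DoublyLinkedList:
    def __init__(self):
        self.head = None

    def push_front(self, value):
        # O(1) insertion at the front; returns the node so the caller
        # can later delete it in O(1) as well.
        node = Node(value)
        node.next = self.head
        if self.head is not None:
            self.head.prev = node
        self.head = node
        return node

    def delete(self, node):
        # O(1): just unlink the node, no shifting as an array would need.
        if node.prev is not None:
            node.prev.next = node.next
        else:
            self.head = node.next
        if node.next is not None:
            node.next.prev = node.prev

    def to_list(self):
        # O(n) traversal: this is the price lists pay for flexible updates.
        out, node = [], self.head
        while node is not None:
            out.append(node.value)
            node = node.next
        return out

lst = DoublyLinkedList()
lst.push_front(3)
middle = lst.push_front(2)
lst.push_front(1)          # list is now 1 -> 2 -> 3
lst.delete(middle)         # unlink 2 in constant time
print(lst.to_list())       # -> [1, 3]
```

The same deletion in an array representation would take linear time, since all later elements must shift down; conversely, finding the k-th element of this list takes linear time, while an array does it in O(1).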
This section introduces several common running times and provides examples. Examples of linear time are computing a maximum and merging two sorted lists; the O(n log n) example is Mergesort; the quadratic example is the brute-force algorithm for finding the closest pair of points; an example of O(n^k) time is the brute-force algorithm for testing whether a graph on n nodes contains an independent set of size k. The section then gives examples of algorithms whose running times grow beyond polynomial, and at the end it illustrates how binary search achieves O(log n) time, which is smaller than linear. I found parts of the section very interesting, namely the Independent Set Problem and the Traveling Salesman Problem, especially because both are NP-hard, and their running times, as well as ways to reduce them in special cases, are very interesting topics. The rest was encountered in 112 and discussed quite clearly in class. The readability is 7.
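As a quick illustration of the sub-linear case, here is a standard binary search sketch (my own code, not from the book): each step halves the remaining interval, so a sorted array of n elements needs only O(log n) comparisons.

```python
def binary_search(a, target):
    """Return an index of target in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2       # probe the middle of the interval
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1           # discard the lower half
        else:
            hi = mid - 1           # discard the upper half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # -> 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # -> -1
```

Note that this only beats linear time because the input is sorted; on unsorted data a linear scan is the best a comparison-based search can do.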
This section focuses on implementing priority queues using a heap. The motivation is to achieve sorting in O(n log n) time. The book defines what a heap is, then illustrates how Heapify-Up and Heapify-Down work. Next, the running time is analyzed. At the end, there is a summary as well as an improvement using an additional position array. This section basically records, in great detail, what we discussed in class about heaps and priority queues built on the heap data structure. I can see it being useful when my memory of the lecture gets rusty in the future, but for now it is a little lacking in color. The readability is 5.
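Here is a rough sketch of the two heap operations as I remember them from lecture (my own array-based min-heap, not the book's implementation). With the children of index i stored at 2i+1 and 2i+2, both Heapify-Up and Heapify-Down move along one root-to-leaf path, so each takes O(log n) swaps, and n inserts followed by n extractions sort in O(n log n).

```python
def heapify_up(h, i):
    # Bubble h[i] up while it is smaller than its parent.
    while i > 0:
        parent = (i - 1) // 2
        if h[i] < h[parent]:
            h[i], h[parent] = h[parent], h[i]
            i = parent
        else:
            break

def heapify_down(h, i):
    # Sink h[i] down while some child is smaller.
    n = len(h)
    while True:
        left, right = 2 * i + 1, 2 * i + 2
        smallest = i
        if left < n and h[left] < h[smallest]:
            smallest = left
        if right < n and h[right] < h[smallest]:
            smallest = right
        if smallest == i:
            break
        h[i], h[smallest] = h[smallest], h[i]
        i = smallest

def insert(h, x):
    h.append(x)                    # add at the last position...
    heapify_up(h, len(h) - 1)      # ...then restore the heap order

def extract_min(h):
    h[0], h[-1] = h[-1], h[0]      # move the last element to the root
    minimum = h.pop()
    if h:
        heapify_down(h, 0)         # restore the heap order from the root
    return minimum

h = []
for x in [5, 3, 8, 1, 4]:
    insert(h, x)
print([extract_min(h) for _ in range(5)])   # -> [1, 3, 4, 5, 8]
```

The position array mentioned at the end of the section would map each element to its current index in h, so that operations like Delete or ChangeKey can find an arbitrary element in O(1) before heapifying; I have not included it in this sketch.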