====== Chapter 2: Basics of Algorithm Analysis ======
| - | ==== 2.1 Computational Tractability ==== | + | ===== 2.1 Computational Tractability |
Several factors are key in the design of algorithms, including understanding the discrete nature of many problems as well as addressing the efficiency and space requirements of an algorithm. The text gives one proposed definition of efficiency: "An algorithm is efficient if, when implemented, it runs quickly on real input instances." This definition, however, leaves open how the algorithm behaves as inputs grow.
To be more specific, we can argue that an algorithm is efficient if, at the bare minimum, it performs better in the worst case than a brute-force search would. To narrow the definition even further, we could classify an 'efficient' algorithm as one whose running time is polynomial in the size of the input.
| - | ==== 2.2 Asymptotic Order of Growth ==== | + | ===== 2.2 Asymptotic Order of Growth |
For the purposes of simplicity and clarity, classifying running times should be done up to constant factors; for example, instead of calculating a running time exactly as an<sup>2</sup> + bn + c, we express it as O(n<sup>2</sup>). More formally, a function //T(n)// is //O(f(n))// if:
  * there exists a constant //c// > 0
  * and a constant n<sub>0</sub> ≥ 0 such that for all n ≥ n<sub>0</sub>,
  * //T(n)// ≤ //c • f(n)//.
Suppose instead that //T(n)// ≥ //d • f(n)// for all sufficiently large //n//, for some fixed constant //d// > 0; then //T// is Ω(//f(n)//), meaning //f// is an asymptotic lower bound for //T//. When //T(n)// is both //O(f(n))// and Ω(//f(n)//), we say //T(n)// is θ(//f(n)//), an asymptotically tight bound.

The growth rates //O//, Ω, and θ share some basic properties:
  * transitivity: if //f// = //O//(//g//) and //g// = //O//(//h//), then //f// = //O//(//h//); the same holds for Ω and θ
  * if //f// = θ(//g//) and //g// = θ(//h//), then //f// = θ(//h//)
  * additivity: if //f// = //O//(//h//) and //g// = //O//(//h//), then //f// + //g// = //O//(//h//)
  * if //g// = //O//(//f//), then //f// + //g// = θ(//f//)

Logarithms, polynomials, and exponentials serve as useful benchmark functions: every logarithm grows more slowly than every polynomial n<sup>d</sup> with d > 0, and every polynomial grows more slowly than every exponential r<sup>n</sup> with r > 1.

| + | ===== 2.3 Implementing the Stable Matching Problem Using Lists and Arrays ===== | ||
Deciding how data will be represented in an algorithm is critical in the calculation of running time. In the Gale-Shapley algorithm, the only data structures needed are lists and arrays. Using arrays has several benefits, including constant-time access to the i<sup>th</sup> element and a constant-time check of whether a given position holds a particular element; their drawback is that inserting or deleting an element takes O(//n//) time. Linked lists have the opposite trade-off: insertion and deletion take constant time, but reaching the i<sup>th</sup> element requires traversing the list.

In order to attain a preferable running time for the Gale-Shapley algorithm, we must choose the data structures that best fit how each piece of data is used. For example, we will store the preference lists of the men and of the women in arrays, since this structure is ideal for accessing the i<sup>th</sup> element. The set of free men is kept in a linked list, since we only ever need some free man and must be able to add and remove men in constant time. For each man we keep a pointer to the next woman on his list to whom he has not yet proposed, and for each woman we keep her current partner along with a Ranking array that lets her compare two men in constant time. With these choices each iteration runs in constant time, giving an O(//n//<sup>2</sup>) implementation overall.

I did not find this section as easy to follow as our discussion in class about the same decisions. If we had not worked through these questions prior to this reading, I would have had a very hard time understanding the logic, since it was weakly justified. I rate this section a 5 for readability. I would also have liked a more in-depth explanation of the method used to ensure the O(//n//<sup>2</sup>) running time.

| + | ===== 2.4 A Survey of Common Running Times ===== | ||
As we learned earlier, when creating algorithms a goal is to achieve a more efficient running time than a brute-force algorithm would have. There are several running times that appear frequently in algorithms, including linear time, O(//n// log //n//) time, quadratic time, cubic time, and O(//n//<sup>k</sup>) time. Linear time typically comes from processing the input in a single pass, O(//n// log //n//) is characteristic of divide-and-conquer algorithms such as mergesort, quadratic and cubic times usually come from nested loops over pairs or triples of input items, and O(//n//<sup>k</sup>) arises from searching over all subsets of size //k//.

There are some running times that grow faster than any polynomial, including 2<sup>//n//</sup> and //n//!. These often arise from brute-force search: 2<sup>//n//</sup> is the number of subsets of an //n//-element set, and //n//! is the number of ways to order //n// items or to match //n// items with //n// others.

| + | ===== 2.5 A More Complex Data Structure: Priority Queues ===== | ||
A priority queue is considered one of the most applicable data structures for use in algorithms. These structures are useful when a group of elements each have a priority, which we call a key, and we seek to obtain the element with the highest priority (the smallest key). Ideally, we would implement a priority queue that can perform insertion and deletion in O(log //n//) time and can therefore sort //n// numbers in O(//n// log //n//) time. The heap meets this goal: it is a balanced binary tree, usually stored in an array, in which every node's key is at least as large as its parent's key, so the minimum always sits at the root.

In order to maintain a heap's structure after the insertion of an element, the algorithm Heapify-up is useful. It can be roughly sketched as follows:

<code>
Heapify-up(H, i):
  If i > 1 then
    let j = parent(i) = ⌊i/2⌋
    If key[H[i]] < key[H[j]] then
      swap the array entries H[i] and H[j]
      Heapify-up(H, j)
    Endif
  Endif
</code>

Deleting an element must also consider how to maintain the order in the heap. The algorithm Heapify-down is useful for this, and is sketched as follows:

<code>
Heapify-down(H, i):
  Let n = length(H)
  If 2i > n then
    Terminate with H unchanged
  Else if 2i < n then
    Let left = 2i and right = 2i + 1
    Let j be the index, of left or right, that minimizes key[H[j]]
  Else if 2i = n then
    Let j = 2i
  Endif
  If key[H[j]] < key[H[i]] then
    swap the array entries H[i] and H[j]
    Heapify-down(H, j)
  Endif
</code>
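Similarly, Heapify-down in 0-indexed Python (the children of position i sit at 2i + 1 and 2i + 2), together with the ExtractMin operation it supports; the function names are my own:

```python
# Heapify-down on a 0-indexed min-heap: children of i are 2i + 1 and 2i + 2.
def heapify_down(h, i):
    n = len(h)
    while True:
        left, right = 2 * i + 1, 2 * i + 2
        j = i                               # index of the smallest of i and its children
        if left < n and h[left] < h[j]:
            j = left
        if right < n and h[right] < h[j]:
            j = right
        if j == i:                          # heap order restored
            return
        h[i], h[j] = h[j], h[i]
        i = j                               # continue sinking from the child

def extract_min(h):
    top = h[0]
    h[0] = h[-1]          # move the last element to the root...
    h.pop()
    if h:
        heapify_down(h, 0)  # ...and sink it back down
    return top

h = [1, 3, 2, 4, 6, 5]
print(extract_min(h))  # -> 1 (the minimum key)
print(h)               # -> [2, 3, 5, 4, 6]
```

Like Heapify-up, the element sinks at most the height of the tree, so each deletion costs O(log //n//).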
