===== 2.1: Computational Tractability =====
An algorithm has a //polynomial// running time if:
  * there exist constants //c// > 0 and //d// > 0 such that for sufficiently large input instances //N//, the running time is bounded by //cN<sup>d</sup>// steps.
Put simply, if the input size were to increase by a constant factor (e.g. double), then the running time should also only increase by a constant factor. Examples of polynomial run time bounds are //n, n log<sub>2</sub> n, n<sup>2</sup>,// and //n<sup>3</sup>//.

One of the primary benefits of the above specific definition is its objectivity and therefore its exact application from abstraction to implementation. Additionally, it is negatable: it gives us a concrete way to express that a problem has //no// efficient algorithm.

I would rank this section a 9; it was very easy to read and very much complemented the class discussion.
===== 2.2: Asymptotic Order of Growth =====
To move further into the discussion of algorithm running times, a few clarifications and simplifications are in order. First, for the most part we will consider //steps// in our pseudo-code algorithms to be anything such as assigning a value to a variable, looking up an entry in an array, following a pointer, or performing an arithmetic operation on a fixed-size integer. Additionally, rather than counting exact steps, we will express running times at a coarser level of granularity that ignores constant factors and lower-order terms.

=== Asymptotic Upper Bounds: O ===
Given a function //T(n)//, we say that //T(n) is O(f(n))// (read //T(n)// is order //f(n)//) if, for sufficiently large //n//, the function //T(n)// is bounded above by a constant multiple of //f(n)//. Mathematically put, that is if there exist constants //c > 0// and //n<sub>0</sub> >= 0// such that //T(n) <= c * f(n)// for all //n >= n<sub>0</sub>//.

Note that there are //many// asymptotic upper bounds for a function; e.g. //T(n) = n// is both //O(n)// and //O(n<sup>2</sup>)//, since either function eventually bounds //T// from above.

=== Asymptotic Lower Bounds: Omega ===
The definition of an asymptotic //lower// bound is the exact same as the upper bound, except that we say //T(n) is Omega(f(n))// if, for sufficiently large //n//, //T(n)// is bounded //below// by a constant multiple of //f(n)//; that is, there exist constants //e > 0// and //n<sub>0</sub> >= 0// such that //T(n) >= e * f(n)// for all //n >= n<sub>0</sub>//.

=== Asymptotically Tight Bounds: Theta ===
If a running time //T(n)// is both //O(f(n))// and //Omega(f(n))//, then we say //T(n) is Theta(f(n))//, and we call //f(n)// an asymptotically //tight// bound for //T(n)//.

For example, consider //T(n) = pn<sup>2</sup> + qn + r// for positive constants //p, q,// and //r//:
  * //T(n) <= (p + q + r)n<sup>2</sup>// for all //n >= 1//, so //T(n)// is //O(n<sup>2</sup>)//
  * //T(n) >= pn<sup>2</sup>// for all //n >= 1//, so //T(n)// is //Omega(n<sup>2</sup>)//
  * Therefore //T(n)// is //Theta(n<sup>2</sup>)//
=== Fun stuff about bounds ===
  - Let //f// and //g// be two functions such that the limit of //f(n)/g(n)// as //n// grows exists and is equal to some constant //c > 0//. Then //f(n)// is //Theta(g(n))//.
  - **Transitivity** holds for //O, Omega,// and //Theta//. E.g. if //f = O(g)// and //g = O(h)//, then //f = O(h)//.
  - **Sums of functions**: if //f<sub>1</sub>, ..., f<sub>k</sub>// are each //O(h)// for some fixed //k//, then //f<sub>1</sub> + ... + f<sub>k</sub> = O(h)//.
  - If //g = O(f)//, then //f + g = Theta(f)//
=== Asymptotic bounds for common functions ===
  * **Polynomials**:
    * Let //f// be a polynomial of degree //d// in which the coefficient //a<sub>d</sub>// on the highest-order term is positive. Then //f = O(n<sup>d</sup>)//.
  * **Logarithms**:
    * For every //b > 1// and every //x > 0//, we have //log<sub>b</sub> n = O(n<sup>x</sup>)//, i.e. logarithms grow slower than every polynomial.
    * Can translate between bases, but often the base is left out because //log<sub>a</sub> n = Theta(log<sub>b</sub> n)// for any constants //a, b > 1//; changing the base only changes the constant factor.
  * **Exponentials**:
    * For every //r > 1// and every //d > 1//, we have //n<sup>d</sup> = O(r<sup>n</sup>)//, i.e. every exponential grows faster than every polynomial.
    * For the most part, this growth renders exponential-time algorithms useless in practice.
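To make the polynomial-versus-exponential comparison concrete, here is a tiny Python sketch (my own illustration, not from the book); the choices //d = 3// and //r = 2// are arbitrary.

<code python>
# Tiny illustration (not from the book): exponential r^n eventually
# dominates polynomial n^d. Here d = 3 and r = 2 are arbitrary choices.
for n in [1, 5, 10, 20, 30]:
    print(f"n={n:2d}  n^3={n ** 3:6d}  2^n={2 ** n:12d}")
</code>

Even for small //n//, //2<sup>n</sup>// blows past //n<sup>3</sup>// right around //n = 10//.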
This was a nice section, also a 9 out of 10. Very readable, and I particularly like how they are getting down into the mathematical foundations a bit more now.
===== 2.3: Implementation of Stable Matching with Lists and Arrays =====
Our previous analysis of the G-S Stable Matching Algorithm showed that it terminates after at most //n<sup>2</sup>// iterations, so to achieve an //O(n<sup>2</sup>)// overall running time we need to implement each iteration in constant time.

=== Implementing ===
Importantly, we have to choose data structures whose trade-offs fit each operation: arrays give constant-time lookups by index, while linked lists give constant-time insertion and deletion. To make each iteration constant time, we must be able to do each of the following four things in constant time (see the sketch after this list):

  - Identify a free man (maintain the set of free men as a linked list; then access to the first element, deletions, and additions are all in constant time)
  - For man //m//, identify the highest-ranked woman to whom he hasn't yet proposed (single array, with one index for each man; each value initialized to the first woman on his preference list and incremented after each proposal)
  - For woman //w//, check if she is engaged, and if so, who her current partner is (similar to above: single array with an index for each woman; values initialized to a //null// symbol, and then updated to be the index of the man to whom she is engaged)
  - For woman //w//, see whether she prefers //m// or //m'// (at the start of the algorithm, create an //n x n// array which contains a row for each woman whose position indices correspond to the IDs of each man; the values at those indices contain the numerical ranking that the woman gave to the corresponding man; then, given two men, we can access their respective rankings and compare them in constant time)

The last 2-d array is essentially the preference lists of the women, but indexed by the men's IDs themselves, not by ranking. Note that we can create this at the outset with a single pass through the women's preference lists, which takes //O(n<sup>2</sup>)// time in total.
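To make this concrete, here is a minimal Python sketch of building those four structures (my own; names like ''next_proposal'' and ''ranking'' are illustrative, not the book's notation):

<code python>
# Sketch (my own) of the four constant-time structures for Gale-Shapley.
# Men and women are assumed to be identified by indices 0..n-1, and
# men_prefs[m] / women_prefs[w] are their preference lists of indices.
from collections import deque

def build_structures(n, men_prefs, women_prefs):
    free_men = deque(range(n))       # free men: O(1) add/remove at the ends
    next_proposal = [0] * n          # next position on m's list to propose to
    current_partner = [None] * n     # woman w's current fiance (None = free)
    # ranking[w][m] = position of man m on woman w's list, built in one
    # O(n^2) pass so that later comparisons are O(1).
    ranking = [[0] * n for _ in range(n)]
    for w in range(n):
        for pos, m in enumerate(women_prefs[w]):
            ranking[w][m] = pos
    return free_men, next_proposal, current_partner, ranking
</code>

With ''ranking'' in hand, "does //w// prefer //m1// to //m2//" is just the constant-time comparison ''ranking[w][m1] < ranking[w][m2]''.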
I found this section //very// readable, a 9/10. Their walkthrough of the simple array and list structures made the constant-time argument easy to follow.
===== 2.4: Survey of Common Running Times =====
Over the course of analyzing algorithms, a number of running times come up quite frequently, such as //O(n), O(n log n), O(n<sup>2</sup>),// and //O(n<sup>3</sup>)//.

=== Linear Time ===
Algorithms that run in //O(n)// time take time that is at most a constant factor times the input size. Many of these end up being "one-pass" algorithms that process the input in a single scan, such as computing the maximum of //n// numbers.

However, as we learned in class, these can be more nuanced, as in the case of merging two sorted lists. Here, we always compare the smallest remaining items of the two lists, remove the smaller, and add it to the end of our third, merged list. The important part of proving this linear is noting that on each comparison, one of the total //2n// items (supposing two lists of //n// length) is added to the new list. Therefore, there are only //2n// iterations total.
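Here is a short Python sketch of that merge (my own rendering of the idea):

<code python>
def merge_sorted(a, b):
    """Merge two sorted lists in O(len(a) + len(b)) time."""
    merged, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        # Each comparison moves exactly one item into merged, so this
        # loop runs at most len(a) + len(b) times in total.
        if a[i] <= b[j]:
            merged.append(a[i])
            i += 1
        else:
            merged.append(b[j])
            j += 1
    merged.extend(a[i:])  # at most one of these leftovers is nonempty
    merged.extend(b[j:])
    return merged
</code>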
=== O(n log n) Time ===
Running time of //O(n log n)// is also very common. Specifically, it is the running time of any algorithm that splits its input into two equal-sized halves, solves each half recursively, and then combines the solutions in linear time; the canonical example is Mergesort.
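A sketch of that pattern, reusing ''merge_sorted'' from the previous sketch:

<code python>
def merge_sort(items):
    # Split in half, sort each half recursively, merge in linear time;
    # the recurrence T(n) = 2T(n/2) + O(n) solves to O(n log n).
    # Assumes merge_sorted from the earlier sketch is in scope.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    return merge_sorted(merge_sort(items[:mid]), merge_sort(items[mid:]))
</code>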
=== Quadratic Time ===
Quadratic time arises naturally from examining all //pairs// of input items, as in the brute-force algorithm for finding the closest pair among //n// points in the plane: we compute the distance between each pair in constant time and keep the smallest, which takes //O(n<sup>2</sup>)// time over the //n(n-1)/2// pairs.

Importantly, quadratic time also arises from a pair of nested loops, where each loop runs over //n// items and does constant work per step.
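For example, here is a brute-force closest-pair sketch (mine, not the book's pseudo-code), where the nested loops walk all //n(n-1)/2// pairs:

<code python>
from math import dist, inf

def closest_pair(points):
    """Brute-force O(n^2): constant work for each of the n(n-1)/2 pairs."""
    best, best_pair = inf, None
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = dist(points[i], points[j])   # constant-time distance check
            if d < best:
                best, best_pair = d, (points[i], points[j])
    return best_pair
</code>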
=== Cubic and Higher Order Polynomial Time ===
We get higher orders of polynomial time from similar situations. For example, we arrive at //O(n<sup>3</sup>)// time with three nested loops, as when deciding whether any pair among //n// sets (each of size at most //n//) is disjoint.

Both examples quickly generalize to //O(n<sup>k</sup>)// time for searching over all subsets of size //k//, as in the brute-force search for an independent set of //k// nodes in a graph (see the sketch below).
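Here is a sketch of that //O(n<sup>k</sup>)//-style search for an independent set of size //k// (my own illustration; the graph is assumed to be given as a list of nodes and a list of edges):

<code python>
from itertools import combinations

def has_independent_set_of_size(nodes, edges, k):
    # Roughly n^k / k! subsets of size k, with O(k^2) edge checks each,
    # giving the O(n^k)-style running time described above.
    edge_set = {frozenset(e) for e in edges}
    for subset in combinations(nodes, k):
        if all(frozenset(pair) not in edge_set
               for pair in combinations(subset, 2)):
            return True
    return False
</code>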
=== Beyond Polynomial Time ===
The Independent Set problem gets much more complicated much more quickly if we instead try to find the independent set of //maximal size//. In this case we arrive at an exponential //O(n<sup>2</sup>2<sup>n</sup>)// running time, since the brute-force search must examine all //2<sup>n</sup>// subsets and spend up to //O(n<sup>2</sup>)// time checking each one.
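A bitmask sketch of that brute force (again my own illustration): enumerate all //2<sup>n</sup>// subsets and check each one.

<code python>
def max_independent_set(nodes, edges):
    """Brute force: all 2^n subsets, with an O(n^2) pair check per subset."""
    edge_set = {frozenset(e) for e in edges}
    best = []
    for mask in range(1 << len(nodes)):            # all 2^n subsets
        subset = [v for i, v in enumerate(nodes) if mask >> i & 1]
        independent = all(
            frozenset((u, w)) not in edge_set
            for i, u in enumerate(subset) for w in subset[i + 1:])
        if independent and len(subset) > len(best):
            best = subset
    return best
</code>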
Lastly, the fastest growing of those that we consider is the //O(n!)// runtime. This factorial time arises from the natural search space of matching //n// items with //n// others (e.g. the Stable Matching search space), as well as from all possible ways of arranging //n// items in order. An example of the latter is the famous Traveling Salesman problem.
=== Sublinear Time ===
Back in the realm of comfortable running times, the most common sublinear time is the logarithmic //O(log n)// time, of which the most famous example is binary search. Crucially, such algorithms typically require certain pre-existing knowledge of the data set, just as binary search requires that the data be sorted ahead of time. So, algorithms like this often require preprocessing.
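The classic sketch, assuming ''items'' is already sorted:

<code python>
def binary_search(items, target):
    """O(log n) lookup; requires items to be sorted in advance."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # halve the search interval each step
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1                         # not present
</code>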
This was a good section, 9/10. Definitely interesting, and it explains everything quite well.
===== 2.5: A More Complex Data Structure: Priority Queues =====
As we know, a priority queue (PQ) is a structure that maintains a set of elements, each of which has an associated numeric value, or key; the smaller the key, the higher the priority. A PQ supports insertion, deletion, and selection of the element with the smallest key. As is shown in the book, a sequence of //O(n)// priority queue operations can be used to sort a list of //n// numbers. So, if we can make each PQ operation take //O(log n)// time, we will have an implementation of sorting in //O(n log n)// time.
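Using Python's built-in ''heapq'' module as the PQ, that sorting recipe looks like this (a sketch of the idea, not the book's implementation):

<code python>
import heapq

def pq_sort(numbers):
    """n inserts + n extract-mins, each O(log n), for O(n log n) overall."""
    heap = []
    for x in numbers:
        heapq.heappush(heap, x)                    # insert: O(log n)
    return [heapq.heappop(heap) for _ in numbers]  # extract-min: O(log n)
</code>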
This PQ implementation takes the form of a //heap//. A heap is a binary tree whose keys are in //heap order//, meaning that the values of each node's children are greater than or equal to the parent node's value. An easy implementation of this is an array, where we call the first index 1 (which is the root of the heap), and then define, for each parent node at index //i//, its left child to be at index //2i// and its right child at //2i + 1//.

To implement the heap operations, we define ''Heapify-up'' and ''Heapify-down'', which restore heap order after an insertion or a deletion by repeatedly swapping an out-of-place element with its parent (or its smaller child) until heap order holds; since the heap has depth //O(log n)//, each runs in //O(log n)// time.
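Here is a minimal sketch of the two operations on the 1-indexed array described above (my own rendering; ''h[0]'' is unused padding so that the parent/child arithmetic matches the text):

<code python>
def heapify_up(h, i):
    """Bubble h[i] up while it is smaller than its parent: O(log n)."""
    while i > 1 and h[i] < h[i // 2]:
        h[i], h[i // 2] = h[i // 2], h[i]
        i //= 2

def heapify_down(h, i):
    """Push h[i] down while it is larger than a child: O(log n)."""
    n = len(h) - 1                            # number of real elements
    while 2 * i <= n:
        child = 2 * i
        if child + 1 <= n and h[child + 1] < h[child]:
            child += 1                        # pick the smaller child
        if h[i] <= h[child]:
            break
        h[i], h[child] = h[child], h[i]
        i = child
</code>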
=== Implementing PQs with Heaps ===
Initializing the PQ array takes at most //O(N)// time, where //N// is an upper bound on the number of elements that will ever be in the heap at once. Thanks to the Heapify operations above, we can then insert or delete any element in //O(log n)// time. Finding the minimum value takes //O(1)// time because it is simply the root of the heap--always the first element in our array. Extracting the minimum means deleting that root element and restoring heap order, and therefore takes //O(log n)// time.

So, as our goal required, we have found an implementation of the PQ in which each operation is //O(log n)//, and we can then use it to implement a sort in //O(n log n)// time.
This was a good section which complemented class very well. The book seemed to get a little more caught up in notation, which led to some confusion at times, but it is still an 8/10 in my book.