====== Entry One ======

====== Chapter 1.1 ======

==== The Stable Matching Problem - Gale and Shapley ====
  * by way of contradiction, prove that S' is stable
  * in this stable matching, each woman is paired with her //worst// valid partner
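The proposal algorithm these proofs are about can be sketched in a few lines. This is my own minimal Python sketch; the data layout and names are assumptions, not the book's:

```python
# Sketch of the Gale-Shapley algorithm: men propose, women tentatively accept.
# Preference lists are dicts mapping each person to an ordered list of partners.
def gale_shapley(men_prefs, women_prefs):
    # rank[w][m] = position of m in w's list; lower is better
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    next_choice = {m: 0 for m in men_prefs}  # index of next woman to propose to
    engaged_to = {}                          # woman -> man
    free_men = list(men_prefs)
    while free_men:
        m = free_men.pop()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged_to:
            engaged_to[w] = m                # w was free: tentatively accept
        elif rank[w][m] < rank[w][engaged_to[w]]:
            free_men.append(engaged_to[w])   # w trades up; old partner is free again
            engaged_to[w] = m
        else:
            free_men.append(m)               # w rejects m; he stays free
    return {m: w for w, m in engaged_to.items()}

men = {"a": ["x", "y"], "b": ["y", "x"]}
women = {"x": ["a", "b"], "y": ["b", "a"]}
print(gale_shapley(men, women))  # here each man ends with his first choice
```

Each man proposes in decreasing order of preference, and each woman only ever trades up, which is exactly what the "worst valid partner for women" claim above exploits.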
| + | |||
| + | ====== CHAPTER TWO ====== | ||
| + | |||
| + | |||
| + | ==== Computational Tractability ==== | ||
| + | |||
  * our goal is to find efficient algorithms for computational problems! We will focus on running-time efficiency but will also note an algorithm's efficiency in its use of space (memory)
  * What does efficiency mean?
    * //an algorithm is efficient if, when implemented, it runs quickly on real input instances//
    * things this definition is missing: where and how well the algorithm is implemented, and what counts as a "real" input instance
    * we want efficiency to be platform-independent, instance-independent, and predictive as input sizes grow
  * Worst-Case Running Times
    * we analyze the worst-case running time by finding the largest possible running time the algorithm could have over all inputs of a given size
    * Why do we use this instead of the average case?
      * it is hard to decide how to randomly generate input instances
      * one set of random inputs could perform very poorly compared to another set -- this tells us more about the random generation than about the algorithm
    * How do we decide whether a worst-case running-time bound is impressive?
      * by comparison with brute-force search over the space of possible solutions
      * there is usually a natural brute-force algorithm that checks every possible solution
      * since this takes exponential time, it is unacceptable
  * this suggests a second definition of efficiency
    * //an algorithm is efficient if it achieves qualitatively better worst-case performance, at an analytical level, than brute-force search//
    * the vagueness of "qualitatively better" is a weakness of this definition
  * Polynomial Time
    * we want a good algorithm to have a good scaling property
    * **desirable scaling property**
      * for brute-force search, increasing the input size by one multiplies the number of possibilities. We want better than this!! When the input size increases by a constant factor, the algorithm should only slow down by some constant factor C
    * this gives our third definition of efficiency
      * //an algorithm is efficient if it has a polynomial running time//
    * this concrete definition is beneficial because it becomes negatable
      * it is possible that there is //no// efficient algorithm for a particular problem
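The scaling property above can be checked numerically. A small sketch (the particular functions are just illustrative):

```python
# Compare how running-time functions scale when the input size doubles.
# A polynomial slows down by a constant factor; an exponential does not.
def slowdown(f, n):
    """Factor by which f grows when n doubles."""
    return f(2 * n) / f(n)

poly = lambda n: n ** 2    # polynomial: doubling n always costs a factor of 4
expo = lambda n: 2.0 ** n  # exponential: the slowdown factor itself grows with n

for n in (10, 20, 40):
    print(n, slowdown(poly, n), slowdown(expo, n))
```

For n^2 the slowdown is the constant 4 at every n; for 2^n it is 2^n, which is exactly the "multiplies the number of possibilities" behavior we want to avoid.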
| + | |||
| + | Asymptotic Order of Growth | ||
| + | * an algorithm' | ||
| + | * we want to express growth rate of running times in a way that is insensitive to constant factors and low-order terms | ||
  * Asymptotic Upper Bounds
    * T(n) is O(f(n)) if there are constants c > 0 and n0 >= 0 such that T(n) <= c * f(n) for all n >= n0. T is asymptotically upper-bounded by f
  * Asymptotic Lower Bounds
    * we want to show that a given upper bound is the best one possible
    * T(n) is Ω(f(n)) if there are constants c > 0 and n0 >= 0 such that T(n) >= c * f(n) for all n >= n0. T is asymptotically lower-bounded by f
  * Asymptotic Tight Bounds
    * if T(n) is both O(f(n)) and Ω(f(n)), then T(n) is Θ(f(n)): T(n) grows exactly like f(n) to within a constant factor
    * T(n) = pn^2 + qn + r is O(n^2) and Ω(n^2), hence Θ(n^2)
    * tight bounds characterize the worst-case performance of an algorithm precisely up to constant factors
  * properties of asymptotic growth rates
    * **transitivity**
      * if a function f is asymptotically upper-bounded by a function g, and g is asymptotically upper-bounded by a function h, then f is asymptotically upper-bounded by h
    * **sums of functions**
      * if f = O(h) and g = O(h), then f + g = O(h)
      * when an overall running time is the sum of the running times of two pieces, these bounds on sums of functions are what make the analysis work
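The tight bound on T(n) = pn^2 + qn + r can be checked numerically. The coefficients and witness constants below (p=3, q=5, r=7, c1=3, c2=15, n0=1) are just one example of my own choosing:

```python
# Check that T(n) = 3n^2 + 5n + 7 is squeezed between c1*n^2 and c2*n^2,
# i.e. T(n) is Theta(n^2), for all n past a small threshold n0.
def T(n):
    return 3 * n**2 + 5 * n + 7

c1, c2, n0 = 3, 15, 1  # witnesses: 3*n^2 <= T(n) <= 15*n^2 for all n >= 1
for n in range(n0, 1000):
    assert c1 * n**2 <= T(n) <= c2 * n**2
print("Theta(n^2) witnesses hold for n = 1..999")
```

The lower witness is immediate (the qn and r terms are nonnegative); the upper one holds because 5n + 7 <= 12n^2 once n >= 1.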
==== Asymptotic Bounds for Some Common Functions ====

  * **Polynomials**
    * their asymptotic rate of growth is determined by their highest-order term, the one that gives the degree
    * algorithms with running-time bounds like O(n^2) or O(n^3) are polynomial-time
    * O(n log n) running times are also polynomial-time, since n log n <= n^2
  * **Logarithms**
    * logarithms grow more slowly than every polynomial: for every base b and every d > 0, log_b n is O(n^d)
  * **Exponentials**
    * exponentials grow faster than every polynomial: for every r > 1 and every d > 0, n^d is O(r^n)
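A quick numerical illustration of this ordering, log < polynomial < exponential (the sample sizes are arbitrary):

```python
import math

# Compare log n, a polynomial, and an exponential at growing input sizes:
# log2(n) is dwarfed by n^2, which in turn is dwarfed by 2^n.
for n in (8, 16, 32, 64):
    print(n, math.log2(n), n**2, 2**n)

# The exponential overtakes the polynomial early and stays ahead:
assert all(2**n > n**2 for n in range(5, 100))
```

At n = 64 the three values are already 6 versus 4096 versus about 1.8 * 10^19, which is why only the first two are tolerable running times.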
