courses:cs211:winter2014:journals:alyssa:home (2014/01/20 18:07, created by hardnetta)
====== Introduction ======
Algorithmic ideas and notions are not just limited to long-standing, well-known computational problems; they show up whenever we need a precise, step-by-step procedure for solving a problem. I found this section pretty interesting.
====== Chapter 1.1: Stable Matching Problem ======
The Stable Matching Problem arose from a question posed by two economists, Gale and Shapley: Could one design a college admissions process, or a job recruiting process, that was self-enforcing?
Algorithm Outline:
• Initially everyone is single.
• A single man m proposes to the woman w who ranks highest on his preference list among the women he has not yet proposed to.
• If the woman w is single, she accepts. If not, she consults her own preference list:
o If m ranks higher than her current partner m', she breaks that engagement (leaving m' single) and becomes engaged to m.
o Otherwise, she rejects m and stays with m'.
• The algorithm terminates when no man is single, at which point every woman is engaged as well.
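The outline above can be turned into code. This is a minimal Python sketch (not from the book); the function name and the representation of preference lists as best-first rankings of indices 0..n-1 are my own choices:

```python
def gale_shapley(men_prefs, women_prefs):
    """men_prefs[m] ranks the women best-first; women_prefs[w] ranks the men."""
    n = len(men_prefs)
    # Precompute each woman's ranking of the men so comparisons are O(1).
    rank = [{m: i for i, m in enumerate(prefs)} for prefs in women_prefs]
    next_choice = [0] * n       # index into each man's list: next woman to try
    fiance = [None] * n         # fiance[w] = man currently engaged to woman w
    free_men = list(range(n))   # initially everyone is single

    while free_men:             # some man is still single
        m = free_men.pop()
        w = men_prefs[m][next_choice[m]]   # best woman not yet proposed to
        next_choice[m] += 1
        if fiance[w] is None:              # w is single: she accepts
            fiance[w] = m
        elif rank[w][m] < rank[w][fiance[w]]:  # w prefers m: she trades up
            free_men.append(fiance[w])     # her old partner becomes single
            fiance[w] = m
        else:                              # w rejects m; he remains single
            free_men.append(m)
    return {fiance[w]: w for w in range(n)}  # man -> woman
```

Each pass through the loop is one proposal, and no man proposes to the same woman twice, so the loop runs at most n × n times, matching the n² bound discussed below.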
This algorithm returns a perfect matching, one in which no one is single and each individual is paired with exactly one individual of the opposite gender. It terminates after at most n² iterations, since in the worst case every man proposes to every woman (n × n = n²) and no man proposes to the same woman twice.

We can prove that the G-S algorithm returns a perfect, stable matching by contradiction. Suppose the matching isn't perfect; i.e., there is a man m who is single at the end of the algorithm. For the algorithm to terminate, m must have proposed to every woman. But since there are equally many men and women, some woman w must also be single, and a woman who has been proposed to becomes engaged and never returns to being single, a contradiction.

Now suppose the matching isn't stable: there are pairs (m, w) and (m', w') such that m prefers w' to w and w' prefers m to m'. Since m ended up with w, either m never proposed to w', in which case w ranks higher than w' on his list and m does not prefer w' to w, or m did propose to w' and she rejected him (immediately or later) for someone she ranks higher, in which case w' does not prefer m to m'. Either way we reach a contradiction, so the G-S algorithm does provide a perfect, stable matching. It is also important to note that all executions of this algorithm yield the same matching.

Although this algorithm provides a clean solution, it does not take fairness into account. A man starts out proposing to his top-choice woman and, as the algorithm proceeds, can bounce between being engaged and single, so his prospects only worsen. A woman, on the other hand, is engaged the entire time after her first proposal, and her prospects only improve. Yet in the end, each man is paired with his best valid partner and each woman with her worst valid partner. So, in general, the side that does the proposing in the G-S algorithm ends up with the best possible stable matching (from its perspective) and the other side ends up with the worst possible stable matching.
I found this section pretty redundant. We covered all of this in class (very thoroughly), so I had a pretty good grasp of the problem and what the algorithm looks like. Had I been confused at all, this definitely would have cleared things up. It was very easy to understand, so I'd rate it a 10.
====== Chapter 2.1: Computational Tractability ======
A simple definition of efficiency when it comes to algorithms is: an algorithm is efficient if, when implemented, it runs quickly on real input instances. Since "quickly" and "real input instances" are hard to pin down, the chapter refines this: we measure the worst-case running time as a function of the input size n, and call an algorithm efficient if that worst-case running time is bounded by a polynomial in n.
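To see why polynomial bounds are treated as the dividing line, a quick comparison of growth rates helps; the step counts below are illustrative, not measurements of any real algorithm:

```python
# Contrast polynomial growth (n^2, n^3) with exponential growth (2^n).
# Even modest n makes the exponential column explode past the others.
for n in [10, 20, 30, 40]:
    print(f"n={n:2d}  n^2={n**2:6d}  n^3={n**3:7d}  2^n={2**n:14d}")
```

At n = 40 the exponential count is already in the trillions, while n³ is still only 64,000; this gap is why exponential-time algorithms stop being usable almost immediately.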
I think this section was pretty difficult and confusing, but I think I may have gotten the general idea. I understand why we use worst-case running times, and I think I see why polynomial time is the best (it's definitely better than many of the alternatives, e.g., exponential time). I would rate this a 6.
====== Chapter 2.2: Asymptotic Order of Growth ======
If we can express an algorithm's worst-case running time on inputs of size n as growing at a rate proportional to some function f(n), then that function becomes a bound on the running time of the algorithm. Counting the exact number of steps an algorithm executes is essentially meaningless, both because we will often be working from pseudo-code and because we want to classify broad classes of algorithms, so we need a more general way of expressing running time. We therefore express the growth rate of running times in a way that is insensitive to constant factors and low-order terms.
• T(n): the worst-case running time of a certain algorithm on an input of size n
• O(f(n)): an asymptotic upper bound, the order of f(n); T(n) is O(f(n)) if T(n) ≤ c · f(n) for some constant c > 0 and all sufficiently large n
• Ω(f(n)): an asymptotic lower bound; T(n) is Ω(f(n)) if T(n) ≥ ε · f(n) for some constant ε > 0 and all sufficiently large n
• Θ(f(n)): an asymptotically tight bound; T(n) is Θ(f(n)) when T(n) is both O(f(n)) and Ω(f(n))
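As a concrete, made-up example of these definitions, take T(n) = 3n² + 5n. The constants below witness that T(n) is both O(n²) and Ω(n²), and therefore Θ(n²):

```python
# T(n) = 3n^2 + 5n is O(n^2) with c = 8, since 5n <= 5n^2 for n >= 1,
# and Omega(n^2) with epsilon = 3, since 3n^2 + 5n >= 3n^2 always.
# Together these give T(n) = Theta(n^2).
def T(n):
    return 3 * n**2 + 5 * n

for n in range(1, 1001):
    assert T(n) <= 8 * n**2   # upper bound: c = 8, n0 = 1
    assert T(n) >= 3 * n**2   # lower bound: epsilon = 3
```

Note how the constant factor 3 and the low-order term 5n disappear in the final classification, which is exactly the insensitivity the notation is designed for.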
Some properties of asymptotic growth rates:
Transitivity: if one function is asymptotically bounded by a second, and the second by a third, then the first is bounded by the third
o If f = O(g) and g = O(h), then f = O(h)
o If f = Ω(g) and g = Ω(h), then f = Ω(h)
Sums of functions: if an asymptotic upper bound applies to each of two functions f and g, then it applies to their sum
o If f = O(h) and g = O(h), then f + g = O(h)
o More generally, let k be a fixed constant and let f1, f2, …, fk and h be functions such that fi = O(h) for each i. Then f1 + f2 + … + fk = O(h).
o If f and g are two functions (taking nonnegative values) such that g = O(f), then f + g = Θ(f); i.e., f is an asymptotically tight bound for the combined function f + g.
This was definitely the most difficult part of the reading for me. Maybe it is because I missed Monday's class. Hopefully hearing this again in class will clear things up for me, because I was completely lost in all of the variables and their relationships to one another. I'd rate this section a 3.