Chapter 5

5.1

This section introduces the idea of divide and conquer and recurrence relations. Divide and conquer algorithms usually take a problem that already has a polynomial-time brute-force solution and reduce it to a faster polynomial running time (say, from O(n^2) down to O(n log n)). Mergesort is the running example; its recurrence relation is T(n) ≤ 2T(n/2) + cn. You can assume, a little imprecisely, that n is even (or a power of 2) if it helps. You can "solve" a recurrence either by unrolling it level by level, or by guessing a solution, substituting it in, and then proving the guess correct by induction. There's also a fancier version of substitution: you plug in a general form with unknown constants instead of concrete values, and the induction tells you what the constants have to be.
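The book stops at the recurrence, but here's a little mergesort in Python (my own sketch, not from the text) just to see where the 2T(n/2) and the +cn come from:

<code python>
def mergesort(a):
    # Base case: lists of size 0 or 1 are already sorted.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = mergesort(a[:mid])    # T(n/2)
    right = mergesort(a[mid:])   # T(n/2)
    # Merge step: the "+ cn" part -- linear work to recombine.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(mergesort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
</code>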

Readability 8/10

5.2

This section fancies up the analysis a little by allowing q subproblems in the recurrence rather than just 2. It took me a bit to realize that having more than two subproblems doesn't have to coincide with splitting the input into more than two parts: sometimes, when you recur, you generate, say, three smaller problems, but each one is half the size of the original input rather than a third of it, giving T(n) ≤ 3T(n/2) + cn. You can also have just one subproblem, T(n) ≤ T(n/2) + cn. There's also the example T(n) ≤ 2T(n/2) + cn^2, which grows less quickly than you might guess (it's O(n^2), not O(n^2 log n)), because the recombination work shrinks geometrically as you go down the levels: (n/2)^2 is much bigger than (n/4)^2, which is much bigger than (n/8)^2, and so on.
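To convince myself that the recombination work really does shrink geometrically, I unrolled T(n) = 2T(n/2) + n^2 numerically in Python (my own little check, not from the book):

<code python>
def unroll(n):
    """Numerically unroll T(n) = 2*T(n/2) + n^2 with T(1) = 1."""
    if n <= 1:
        return 1
    return 2 * unroll(n // 2) + n * n

for n in [2**k for k in range(4, 12)]:
    t = unroll(n)
    # The ratio settles near 2, i.e. T(n) is roughly 2*n^2 = O(n^2),
    # because each level does half the recombination work of the level above it.
    print(n, t, round(t / n**2, 3))
</code>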

Readability 8/10

5.3

This section applies the divide and conquer strategy to counting inversions. Inversions are useful things to count in, for instance, web services that do "collaborative filtering" by comparing two people's rankings of the same items. There are at most (n choose 2) inversions, and the divide and conquer algorithm counts them in O(n log n) time, whereas brute force would be O(n^2). The secret of the algorithm: split the list into two halves, recursively count (and sort) each half, then do a fancy step that works like merge. You keep a pointer into each half and walk through them; whenever you take an element from the right half, you add the number of elements still remaining in the left half, since each of those forms an inversion with it. Both halves have to be sorted for that counting trick to work, which is why the recursion sorts as it counts.
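Here's how I'd sketch the merge-and-count step in Python (my own code, following the book's idea of sorting while you count):

<code python>
def sort_and_count(a):
    """Return (sorted copy of a, number of inversions in a)."""
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, inv_left = sort_and_count(a[:mid])
    right, inv_right = sort_and_count(a[mid:])
    merged, i, j, cross = [], 0, 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            # right[j] jumps ahead of everything still left in `left`:
            # that's len(left) - i new inversions.
            cross += len(left) - i
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, inv_left + inv_right + cross

print(sort_and_count([2, 4, 1, 3, 5]))  # ([1, 2, 3, 4, 5], 3)
</code>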

Readability 7/10

5.4

This section exposits the divide and conquer algorithm for finding the closest pair of points in a set of points in the plane. First you prepare two sorted lists of the points, one by x-coordinate and one by y-coordinate (like we talked about in class). Then you divide the set in half by x and solve each half recursively. To combine the two halves, let the radius be the shortest distance found within either half; you only have to check points that lie within that radius of the dividing line. You scan that strip vertically, comparing each point to the points just above it, and it's guaranteed that each point only needs to be checked against a constant number of neighbors (something like six) before you can move on. Mind-blowing. Runs in O(n log n). The text hints that later there's a solution to the problem with an even better bound using "randomization." Can't wait for that (I hope we do it?). Mind blowage x2?
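I tried sketching the whole thing in Python to make sure I follow it (my code, not the book's; I scan 7 points ahead in the strip to be safe, and I assume all the points are distinct):

<code python>
import math

def closest_pair(points):
    """Return the distance between the closest pair of 2D points."""
    px = sorted(points)                               # sorted by x
    py = sorted(points, key=lambda p: (p[1], p[0]))   # sorted by y
    return _rec(px, py)

def _dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def _rec(px, py):
    n = len(px)
    if n <= 3:  # brute-force the tiny cases
        best = math.inf
        for i in range(n):
            for j in range(i + 1, n):
                best = min(best, _dist(px[i], px[j]))
        return best
    mid = n // 2
    mid_x = px[mid][0]
    left_set = set(px[:mid])
    qy = [p for p in py if p in left_set]       # y-sorted left half
    ry = [p for p in py if p not in left_set]   # y-sorted right half
    delta = min(_rec(px[:mid], qy), _rec(px[mid:], ry))
    # Strip: points within delta of the dividing line, still in y order.
    strip = [p for p in py if abs(p[0] - mid_x) < delta]
    for i, p in enumerate(strip):
        # Only a constant number of following points can possibly be closer.
        for q in strip[i + 1:i + 8]:
            delta = min(delta, _dist(p, q))
    return delta

pts = [(0, 0), (3, 4), (1, 1), (7, 2), (2, 2), (5, 5)]
print(closest_pair(pts))  # sqrt(2) ~ 1.414...
</code>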

Readability 7/10

5.5

This section gives a divide and conquer algorithm for multiplying integers. It starts by reviewing the regular multiplication algorithm that I learned in third grade, which takes O(n^2) steps on n-digit numbers. Then it does something clever and points out that you can split each number into a high half and a low half and recombine, like 2233*1144 = (22*11)*10000 + (22*44 + 33*11)*100 + 33*44. But that uses four half-size multiplications, so the recurrence is T(n) ≤ 4T(n/2) + cn, which just gets you O(n^2) in a fancier way, with no extra efficiency. Then it turns out there's a trick: you can recover both cross terms from a single extra product, (22+33)*(11+44), so you only need three half-size multiplications. That gives T(n) ≤ 3T(n/2) + cn, which solves to O(n^(log2 3)), roughly O(n^1.59). Not bad. Not quite as mind-blowing as the closest points.
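Here's the three-multiplication trick as a Python sketch (mine, not the book's; it splits by decimal digits, just like the 2233*1144 example):

<code python>
def karatsuba(x, y):
    """Multiply nonnegative integers using three recursive products
    instead of four (the trick from this section)."""
    if x < 10 or y < 10:
        return x * y
    m = max(len(str(x)), len(str(y))) // 2   # split roughly in half, by digits
    high_x, low_x = divmod(x, 10 ** m)
    high_y, low_y = divmod(y, 10 ** m)
    a = karatsuba(high_x, high_y)                   # high * high
    b = karatsuba(low_x, low_y)                     # low * low
    c = karatsuba(high_x + low_x, high_y + low_y)   # (high+low)*(high+low)
    # The two cross terms fall out of c - a - b, so only 3 multiplications total.
    return a * 10 ** (2 * m) + (c - a - b) * 10 ** m + b

print(karatsuba(2233, 1144), 2233 * 1144)  # both print 2554552
</code>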

Readability 5/10
