Chapter 5.1: Mergesort

Mergesort works by dividing a list of numbers into two halves of equal size, sorting each half recursively, and then merging the two sorted halves back together into one sorted list.

Common Divide and Conquer Algorithm

Divide the input into two pieces of equal size; solve the two subproblems on these pieces separately by recursion; and then combine the two results into an overall solution, spending only linear time for the initial division and final recombining.

For any algorithm that fits this style, we need a base case for the recursion, typically having it “bottom out” on inputs of some constant size. In the case of Mergesort, we will assume that once the input has been reduced to size 2, we stop the recursion and sort the two elements by simply comparing them to each other.
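Here is a rough sketch of Mergesort in Python following this pattern (my own sketch; the function names merge_sort and merge are mine, not the book's). The recursion here also bottoms out at lists of size 2, which are sorted with a single comparison, and the merge step does the linear-time combining.

<code python>
def merge_sort(a):
    # Base case: lists of size 0 or 1 are already sorted.
    if len(a) <= 1:
        return a
    # Base case from the text: a list of size 2 is sorted by one comparison.
    if len(a) == 2:
        return a if a[0] <= a[1] else [a[1], a[0]]
    # Divide: split into two halves and sort each recursively.
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Combine: merge the two sorted halves in linear time.
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    # Repeatedly take the smaller front element of the two halves.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One of the halves is exhausted; append whatever is left of the other.
    result.extend(left[i:])
    result.extend(right[j:])
    return result
</code>

For example, merge_sort([5, 2, 9, 1]) returns [1, 2, 5, 9].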

Consider any algorithm that fits the pattern described above, and let T(n) denote its worst-case running time on inputs of size n. Supposing that n is even, the algorithm spends O(n) time to divide the input into two pieces of size n/2 each, spends T(n/2) time to solve each of the two subproblems, and then spends O(n) time to combine the solutions from the recursive calls. So, for some constant c, T(n) ≤ 2T(n/2) + cn when n > 2, and T(2) ≤ c.
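As a sanity check on the recurrence, here is a small Python snippet (my own, not from the book) that evaluates T(n) = 2T(n/2) + cn with T(2) = c for powers of two and compares it to n·log_2(n); the choice c = 1 is arbitrary.

<code python>
import math

def T(n, c=1):
    # Recurrence from above: T(2) = c, T(n) = 2*T(n/2) + c*n.
    if n <= 2:
        return c
    return 2 * T(n // 2, c) + c * n

for n in [4, 16, 64, 256, 1024]:
    print(f"n={n}: T(n)={T(n)}, n*log2(n)={n * math.log2(n):.0f}")
</code>

The recurrence values stay below n·log_2(n), which matches the O(n log n) bound derived below.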

Approaches to Solving Recurrences

  1. The most intuitive way is to “unroll” the recursion, accounting for the running time across the first few levels and identifying a pattern that can be continued as the recursion expands. Then you can sum the running times over all levels of the recursion to get the total running time.
  2. You can also start with a guess for the solution, substitute it into the recurrence relation, and check that it works. This method can be justified using an argument by induction on n (see the worked check after this list). NOTE: This method probably isn't the best starting point unless you already have a good guess for the solution.
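Here is what I think the substitution check would look like for the Mergesort recurrence above: guess that T(n) ≤ cn log_2(n) for all n ≥ 2. The base case works since T(2) ≤ c ≤ 2c = c·2·log_2(2). For the inductive step, plugging the guess into the right-hand side of the recurrence gives T(n) ≤ 2T(n/2) + cn ≤ 2c(n/2)log_2(n/2) + cn = cn(log_2(n) − 1) + cn = cn log_2(n), which is exactly the guessed bound. So by induction on n, T(n) = O(n log n).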

Method 1: Using figure 5.1 in the book, we can see that at the first level of recursion, we have a single problem of size n, which takes at most cn time plus the time spent in all subsequent recursive calls. At the next level, we have 2 problems, each of size n/2. Each of these takes at most cn/2 time, for a total of at most cn, plus subsequent calls. The third level has 4 problems of size n/4, each taking at most cn/4 time, again for a total of at most cn. Now we can see a pattern: at level j of the recursion, there are 2^j problems, each of size n/2^j, and each takes at most cn/2^j time. So level j contributes at most 2^j(cn/2^j) = cn to the total running time. In general, each level requires at most cn work. The number of times the input must be halved in order to reduce its size from n to 2 is log_2(n), so summing the cn work over the log_2(n) levels gives a total running time of O(n log n).
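To see the level-by-level picture concretely, here is a small Python experiment (my own sketch, not from the book) that reconstructs the recursion tree and adds up the work done at each level; for n = 64 every level should report 64 = n, and there are about log_2(n) levels in total.

<code python>
from collections import defaultdict

def count_levels(n, level=0, work=None):
    # Mimics the recursion tree: a problem of size n contributes ~n work
    # at its level, then spawns two subproblems of size n/2 (stopping at size 2).
    if work is None:
        work = defaultdict(int)
    work[level] += n
    if n > 2:
        count_levels(n // 2, level + 1, work)
        count_levels(n // 2, level + 1, work)
    return work

for level, total in sorted(count_levels(64).items()):
    print(f"level {level}: total work {total}")
</code>

Each line of output shows the same total (64), one line per level, which is exactly the “cn work per level, about log_2(n) levels” pattern from the unrolling argument.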

I would rate this section a 5. The first half of it was very easy to read, but I still don't find the substitution method for solving the Mergesort recurrence as intuitive. Maybe it is because we didn't really talk about it in class. However, I understand the “unrolling” method, so I'll just use that.
