

Section 5.1: A First Recurrence: The Mergesort Algorithm

In this chapter, we break the input into pieces, solve each piece recursively, and combine the results. To compute the running time of algorithms that do this, we solve "recurrence relations". The mergesort algorithm works by repeatedly dividing the input into two halves, sorting each half recursively, and merging the sorted halves back together. For instance, a sequence of 8 is split into two sequences of 4, which are split into four pairs of 2; each pair is sorted, the pairs are merged into sorted groups of 4, and finally those are combined into a sorted list of 8. Mergesort requires a base case to halt the recursion, such as stopping once a list has at most 2 elements and sorting it directly. The recurrence is T(n) <= 2T(n/2) + cn, where the cn term is the cost of merging; each level of the recursion does about cn total work, and there are at most log2(n) levels, so the total running time is O(n log n). Generally, we can solve recurrence relations either by unrolling them or by guessing a solution and checking whether it works. To unroll, we write out the work done at the first few levels, look for a pattern, and then sum over all the levels. We may also guess the form of the solution and use partial substitution to pin down the constants, which I found too mathematically dense to grasp. This section gets a 4/10 because I found it very confusing. I understand the basics but found it hard to achieve a thorough understanding.
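The divide-merge scheme described above can be sketched in a few lines of Python (a minimal sketch with my own helper names, not the book's pseudocode): split the list in half, sort each half recursively, then merge the two sorted halves.

```python
def merge(left, right):
    """Merge two already-sorted lists in O(len(left) + len(right)) time."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One of the two lists is exhausted; append the remainder of the other.
    result.extend(left[i:])
    result.extend(right[j:])
    return result

def mergesort(seq):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(seq) <= 1:
        return list(seq)
    mid = len(seq) // 2
    # Recursively sort each half, then merge the sorted halves.
    return merge(mergesort(seq[:mid]), mergesort(seq[mid:]))
```

The merge step is what contributes the cn term in the recurrence; the two recursive calls contribute the 2T(n/2).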

Section 5.2: Further Recurrence Relations

Now we look at cases where the recursion tree is not binary. First, suppose the subproblems still have size n/2 each time, but each instance now spawns q = 3 recursive calls instead of 2. The work grows as levels are added, so the running time is worse than O(n log n). Unrolling the recurrence T(n) <= qT(n/2) + cn gives O(n^log2(q)); for q = 3 this is about O(n^1.59), which is more than linear but still less than quadratic. Here q is the number of children per node in the recursion tree, and the bound applies for any q > 2. Of course, we can verify the guess by partial substitution, just as we could for the smaller values of q. Another case is q = 1, where each level of the recursion has only one subproblem. The work at level l is cn/2^l, so summing this geometric series over all levels gives at most 2cn = O(n). The interesting thing is that the amount of work actually decreases as the level number increases. In contrast, when q > 2 it increases per level, and when q = 2 it stays the same (about cn work per level). Some recurrence relations also have per-level work that forms a geometric sum with a larger starting term: for instance, in T(n) <= 2T(n/2) + cn^2, the work at level l is cn^2/2^l, which shrinks so quickly that the sum is dominated by the top level and the total is O(n^2). Some parts of this chapter were clearer, but others (such as the geometric sums) were still very confusing. I would rate it at 6/10.
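The three regimes (work shrinking, constant, or growing per level) can be checked numerically. This is a small sketch of my own (with c = 1 chosen for simplicity): level l of the tree for T(n) <= qT(n/2) + cn has q^l subproblems of size n/2^l, so it does (q/2)^l * n work, and we just sum over the log2(n) + 1 levels.

```python
import math

def total_work(n, q):
    """Sum the per-level work (q/2)^l * n over all levels of the
    recursion tree for T(n) <= q*T(n/2) + n, with n a power of 2."""
    levels = int(math.log2(n)) + 1  # levels l = 0, 1, ..., log2(n)
    return sum((q / 2) ** l * n for l in range(levels))

n = 1024
print(total_work(n, 1))  # geometric, shrinking: bounded by 2n
print(total_work(n, 2))  # constant n per level: n * (log2(n) + 1)
print(total_work(n, 3))  # geometric, growing: on the order of n^log2(3)
```

For q = 1 the total stays below 2n no matter how large n is; for q = 2 it is exactly n(log2(n) + 1); for q = 3 the bottom level dominates and the total tracks n^log2(3) ≈ n^1.59, matching the three cases above.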

Section 5.3: Counting Inversions
