halt the recursion, such as there being only two elements left in each list. The recurrence relation of merge sort requires approximately n operations at each level, and there are at most about log(n) levels. Thus, the total running time is O(nlog(n)). Generally, we can solve recurrence relations either by unrolling them or by guessing a solution and checking whether it works. To unroll, we examine the first few levels and look for a pattern, then sum the work over all levels. We may also use partial substitution, which is too mathematically dense for me to grasp. This section gets a 4/10 because I found it very confusing. I understand the basics but found it hard to achieve a thorough understanding.
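The merge sort recursion described above can be sketched in a few lines of Python (a minimal version for illustration, not the textbook's code — the base case halts the recursion, and the merge does the O(n) work at each level):

```python
def merge_sort(a):
    """Split the list in half, sort each half recursively, then merge."""
    if len(a) <= 1:          # base case: halts the recursion
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merging is O(n) work; with ~log2(n) levels, the total is O(n log n).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```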

==== Section 5.2: Further Recurrence Relations ====

Now we look at cases where the recursion tree is not binary. First, we assume that the size of the input is still halved each time, but each parent node has 3 branches instead of 2. We must note that **q means the number of children per parent in the recurrence relation.** More branches produce a larger increase in work as levels are added: since 3>2, the running time in this situation is greater than linear, though still below quadratic. Overall, this change gives O(n^log2(q)) — about O(n^1.59) for q=3 — and any recursive algorithm with more than 2 branches per call is bounded by this time. Of course, we can apply partial substitution to this just as we could for lower values of q. Another case is q=1. That is, at each level of the recurrence there is only one computation occurring, and the computation at level l runs in cn/2^l time. Summed over all levels, the total time required is at most 2cn=O(n). The interesting thing about this is that the amount of work taking place actually decreases as the level number increases. In contrast, when q>2 it increases and when q=2 it stays the same (for q=2, about n work is being done per level). Some recurrence relations will also have running times that are "geometric sums". For instance, in T(n)<=2T(n/2) + cn^2 the work per level shrinks geometrically, so the sum is dominated by the top level and the total is O(n^2). Some parts of this chapter were clearer but others (such as the geometric sums) were still very confusing. I would rate it at 6/10.
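The three regimes (q=1, q=2, q>2) can be checked numerically by unrolling T(n) = q*T(n/2) + cn level by level. This is a small sketch assuming n is a power of two and c=1; the helper name `unroll` is my own, not from the text:

```python
def unroll(q, n, c=1.0):
    """Sum the work of T(n) = q*T(n/2) + c*n level by level.

    Level l has q**l subproblems of size n / 2**l, so the work at
    level l is c * n * (q/2)**l.
    """
    total, level_size, subproblems = 0.0, float(n), 1
    while level_size >= 1:
        total += subproblems * c * level_size
        subproblems *= q
        level_size /= 2
    return total

# q=1: per-level work halves  -> geometric sum bounded by 2cn = O(n)
# q=2: per-level work constant -> cn per level, O(n log n) overall
# q=3: per-level work grows by 3/2 each level -> O(n^log2(3))
```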

==== Section 5.3: Counting Inversions ====

Here, we apply the building blocks from the other sections to solve several related problems. For instance, we want a way to count the number of inversions between two orderings of the same values — pairs of elements that appear in opposite relative order. Complete agreement between the two orderings means no inversions are needed to make them match, and complete disagreement means the maximum number of inversions, n(n-1)/2. For the algorithm, suppose we have two lists A and B, where A!=B but the elements in A and B are the same. We split B into two halves, a and b, and recursively sort and count inversions within each, as in typical merge sort. Finally, we merge a and b while counting: whenever we take an element from b while elements remain in a, each remaining element of a forms one inversion with it, and we add that count to a running total. This merge step takes only O(n) time, so the algorithm runs in O(nlogn) time, just as merge sort does. A brute-force search for inversions would require O(n^2) time.
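The merge-and-count idea above can be sketched as follows (a minimal version counting inversions against sorted order, which is the standard formulation; not the textbook's code verbatim):

```python
def count_inversions(a):
    """Return (sorted list, #inversions) via merge-and-count.

    An inversion is a pair i < j with a[i] > a[j].
    """
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, inv_left = count_inversions(a[:mid])
    right, inv_right = count_inversions(a[mid:])
    merged, i, j = [], 0, 0
    inv = inv_left + inv_right
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            # Taking from the right half while elements remain in the
            # left half: each remaining left element forms an inversion.
            inv += len(left) - i
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, inv
```

On a fully reversed list this reports the maximum n(n-1)/2 inversions, and on a sorted list it reports zero.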
courses/cs211/winter2018/journals/mccaffreyk/home/5.1520802101.txt.gz · Last modified: by mccaffreyk