**__Chapter 5.2: Further Recurrence Relations__**

There is a more general class of divide and conquer algorithms that create recursive calls on q subproblems of size n/2 each and then combine the results in O(n) time. This gives the recurrence T(n) <= qT(n/2) + cn, which is (5.3) in the book. We treat the cases q > 2, q = 1, and q = 2 separately.

**The Case of q > 2 Subproblems**

At the first level of recursion, we have a single problem of size n. At the next level, we have q problems, each of size n/2, for a total of (q/2)cn work. The next level yields q^2 problems of size n/4 each, for a total of (q^2/4)cn. In general, the total work performed at level j is ((q/2)^j)cn, so the work per level is //increasing// as we proceed through the recursion. There are log2(n) levels of recursion --> see the math on p.216 for the proof that any function T(.) satisfying (5.3) with q > 2 is bounded by O(n^log2(q)). The running time here is more than linear, but still polynomial in n.

**The Case of q = 1**

At any level j, we still have just one instance, of size n/2^j, and it contributes cn/2^j to the running time (math on p.218-19). Any function T(.) satisfying (5.3) with q = 1 is bounded by O(n). The point here is that a geometric series with a decaying exponent is a powerful thing: fully half the work performed by the algorithm is being done at the top level of the recursion.

**Another Related Recurrence**

The recurrence T(n) <= 2T(n/2) + O(n^2) solves to O(n^2). Judging by the earlier cases we might assume a running time of O(n^2 logn), but the idea here is that the per-level work n^2 decreases very quickly as we replace it with (n/2)^2, (n/4)^2, (n/8)^2, etc. This means we again get a geometric sum, rather than one that grows by a fixed amount over each of the log2(n) levels.

This is a very interesting chapter --> a ton of unexpected maths happens here. 9/10.
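The geometric sums in all three cases can be checked numerically. A minimal sketch (my own, not from the book): `total_work` sums the per-level work c * n^k * (q/2^k)^j for the generalized recurrence T(n) <= qT(n/2) + cn^k, which covers q > 2 and q = 1 with k = 1, and the related recurrence with q = 2, k = 2.

```python
import math

def total_work(n, q, k=1, c=1.0):
    """Total work for T(n) <= q*T(n/2) + c*n^k, summed level by level.

    Level j has q^j subproblems of size n/2^j, each costing c*(n/2^j)^k
    to combine, so level j contributes c * n^k * (q/2^k)^j in total.
    Assumes n is a power of 2.
    """
    levels = int(math.log2(n))  # the recursion bottoms out after log2(n) halvings
    return sum(c * n**k * (q / 2**k) ** j for j in range(levels + 1))
```

For example, with q = 4 the sum grows geometrically and comes out near n^log2(4) = n^2; with q = 1 it is at most 2cn (half of it at the top level); and for 2T(n/2) + cn^2 it is at most 2cn^2, matching the bounds above.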