

Chapter 6 – Dynamic Programming

My notes on the assigned sections of Chapter 6 of Algorithm Design by Jon Kleinberg and Éva Tardos. This chapter details dynamic programming. Dynamic programming involves “operating dangerously close to the edge of a brute-force search: although it's systematically working through the exponentially large set of possible solutions to the problem, it does this without ever examining them all explicitly.” This is usually achieved with memoization.

6.1 – Weighted Interval Scheduling: A Recursive Procedure

The weighted interval scheduling problem can be solved with a simple recursive approach: OPT(j) = max(vj + OPT(p(j)), OPT(j-1)). Request j belongs to an optimal solution on the set {1, 2, …, j} if and only if vj + OPT(p(j)) >= OPT(j-1). Compute-Opt(j) correctly computes OPT(j) for each j = 1, 2, …, n, but the plain recursion takes exponential time because it recomputes the same subproblems over and over. Adding memoization (storing the result of each Compute-Opt(j) the first time it is computed and returning the stored value afterward) removes this redundancy. This memoized version marks the dynamic programming solution to weighted interval scheduling.

Input: n jobs (associated start time sj, finish time fj, and value vj)

Sort jobs by finish times so that f1 ≤ f2 ≤ ... ≤ fn
Compute p(1), p(2), …, p(n), where p(j) is the largest index i < j such that job i is compatible with job j (i.e., fi ≤ sj), or 0 if no such job exists

for j = 1 to n
    M[j] = empty
M[0] = 0

M-Compute-Opt(j):
    if M[j] is empty:
        M[j] = max(vj + M-Compute-Opt(p(j)), M-Compute-Opt(j-1))
    return M[j]
    
M-Compute-Opt(n)
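The pseudocode above can be sketched as a runnable Python version. This is my own translation, not from the book: the `(start, finish, value)` tuple format and the function name are assumptions, and `bisect` is used to compute p(j) in O(log n) per job since the finish times are sorted.

```python
import bisect
from functools import lru_cache

def weighted_interval_scheduling(jobs):
    """Maximum total value of mutually compatible jobs.

    jobs: list of (start, finish, value) tuples; jobs i and j are
    compatible when one finishes no later than the other starts.
    """
    # Sort jobs by finish time so that f1 <= f2 <= ... <= fn.
    jobs = sorted(jobs, key=lambda job: job[1])
    finishes = [f for s, f, v in jobs]

    # p[j-1] (0-indexed list, 1-indexed value) = number of jobs whose
    # finish time is <= job j's start time; this is exactly the largest
    # compatible index i < j, or 0 if none.
    p = [bisect.bisect_right(finishes, jobs[j][0]) for j in range(len(jobs))]

    @lru_cache(maxsize=None)
    def m_compute_opt(j):
        # OPT(j) for the first j jobs, with OPT(0) = 0 as the base case.
        if j == 0:
            return 0
        start, finish, value = jobs[j - 1]
        # Either take job j (plus the best over jobs compatible with it)
        # or skip it; this is the recurrence from the section above.
        return max(value + m_compute_opt(p[j - 1]), m_compute_opt(j - 1))

    return m_compute_opt(len(jobs))
```

After the initial sort, each of the n subproblems is solved once and each call does constant work besides the cached recursion, so the whole procedure runs in O(n log n) time.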

6.2 – Principles of Dynamic Programming: Memoization or Iteration over Subproblems

6.3 – Segmented Least Squares: Multi-way Choices

courses/cs211/winter2018/journals/bairdc/chapter6.1522118118.txt.gz · Last modified: by bairdc
CC Attribution-Noncommercial-Share Alike 4.0 International