

6.1 Weighted Interval Scheduling

Designing a Recursive Algorithm
  • No natural greedy algorithm is known for this problem
  • n requests labeled 1…n with each request i specifying a start time si and finish time fi; each interval i also has a value (or weight) vi; two intervals are compatible if they don’t overlap
  • goal: select a subset of mutually compatible intervals to maximize the sum of the values of the selected intervals
  • p(j): for an interval j, it is the largest index i < j such that intervals i and j are disjoint (i.e., i is the rightmost interval that ends before j begins)
  • optimal solution Oj: either j is in Oj, in which case OPT(j) = vj + OPT(p(j)), or j is not in Oj in which case OPT(j) = OPT(j-1)
  • Therefore, OPT(j) = max(vj+OPT(p(j)), OPT(j-1))
  • Request j belongs to an optimal solution on the set {1, 2, …, j} if and only if the first of the options above is at least as good as the second, i.e., vj+OPT(p(j)) >= OPT(j-1)
  • Recursive algorithm to compute OPT(n), assuming we have already sorted the requests by finish time and computed the values of p(j) for each j
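The p(j) values defined above can be precomputed once the requests are sorted by finish time. A minimal sketch, assuming intervals are given as (start, finish, value) tuples and using 1-based indices as in the notes (the function name and representation are assumptions, not from the text):

```python
# Sketch (not from the text): precomputing p(j) for intervals sorted by
# finish time. Indices are 1-based to match the notes; p[j] = 0 means no
# earlier interval is disjoint from j.
import bisect

def compute_p(intervals):
    """intervals: list of (start, finish, value) sorted by finish time."""
    finishes = [f for (_, f, _) in intervals]
    p = [0] * (len(intervals) + 1)  # p[0] is a dummy entry
    for j, (s, _, _) in enumerate(intervals, start=1):
        # count of intervals finishing by time s_j = largest 1-based
        # index i with f_i <= s_j; any such i automatically satisfies i < j
        p[j] = bisect.bisect_right(finishes, s)
    return p
```

Binary search keeps the precomputation at O(n log n), matching the sort.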

Compute Opt Algorithm

   Compute-Opt(j):
     If j=0:
	     Return 0
     Else:
	     Return max(vj+Compute-Opt(p(j)), Compute-Opt(j-1))
     Endif
  • Compute-OPT(j) correctly computes OPT(j) for each j = 1, 2, …, n
  • The above algorithm for Compute-Opt as written would take exponential time in the worst case
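The pseudocode above translates directly into a recursive function; as a sketch, assuming 1-based lists v and p with a dummy entry at index 0:

```python
# Direct transcription of Compute-Opt. Exponential in the worst case,
# because the same subproblems are recomputed many times.
def compute_opt(j, v, p):
    if j == 0:
        return 0
    # either j is in the optimal solution (take v[j] plus the optimum on
    # the intervals compatible with j), or j is left out
    return max(v[j] + compute_opt(p[j], v, p),
               compute_opt(j - 1, v, p))
```

For example, with v = [0, 2, 4, 4, 7, 2, 1] and p = [0, 0, 0, 1, 1, 2, 3], compute_opt(6, v, p) returns 9.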
Memoizing the Recursion
  • The algorithm as written takes exponential time because of redundancy: it issues many separate recursive calls for the same subproblems
  • Solution: store the value of Compute-Opt in a globally accessible place the first time we compute it and then simply use this precomputed value in place of all future recursive calls
  • Memoization: the technique of saving values that have already been computed
  • Memoized Compute-Opt makes use of an array M[0…n]; M[j] starts with the value “empty” but holds the value of Compute-Opt(j) as soon as it is first determined

Compute Opt with Memoization

   M-Compute-Opt(j):
     If j=0:
	     Return 0
     Elif M[j] is not empty:
	     Return M[j]
     Else:
	     Define M[j] = max(vj+M-Compute-Opt(p(j)), M-Compute-Opt(j-1))
	     Return M[j]
     Endif
Analyzing the Memoized Version
  • The runtime of M-Compute-Opt(n) is O(n) (assuming the input intervals are sorted by their finish time)
    • A single call takes O(1) time apart from its recursive calls, and each call that does real work fills in a new entry of M; since M has only n+1 entries, there can be at most O(n) calls to M-Compute-Opt
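A sketch of the memoized version in the same 1-based representation, with M passed in as a list of size n+1 initialized to [0] + [None] * n, where None plays the role of “empty” (this concrete representation is an assumption, not from the text):

```python
# Memoized Compute-Opt: each entry of M is filled in at most once, so
# the total number of recursive calls is O(n).
def m_compute_opt(j, v, p, M):
    if j == 0:
        return 0
    if M[j] is not None:   # already computed: reuse the stored value
        return M[j]
    M[j] = max(v[j] + m_compute_opt(p[j], v, p, M),
               m_compute_opt(j - 1, v, p, M))
    return M[j]
```

After the call, M holds the optimal values of all subproblems, which is exactly what Find-Solution needs later.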
Computing a Solution in Addition to Its Value

  • Extend the memoized solution to keep track of an optimal solution in addition to its value: we could maintain an additional array S so that S[i] contains an optimal set of intervals among {1, 2, …, i}
  • Avoid this runtime blow-up by not explicitly computing S, but rather recovering the optimal solution from the values saved in the array M after the optimum value has been computed

Retroactively Computing Optimal Solution

   Find-Solution(j):
     If j=0:
	     Output nothing
     Else:	
	     If vj+M[p(j)] >= M[j-1]:
		     Output j together with the result of Find-Solution(p(j))
	     Else:
		     Output the result of Find-Solution(j-1)
	     Endif
     Endif
  • Total of O(n) recursive calls
  • Given the array M of the optimal values of the sub-problems, Find-Solution returns an optimal solution in O(n) time
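Find-Solution can be sketched the same way; it assumes the filled array M from the memoized computation (with M[0] = 0) and returns the chosen indices:

```python
# Trace back through M to recover an optimal set of intervals.
def find_solution(j, v, p, M):
    if j == 0:
        return []
    if v[j] + M[p[j]] >= M[j - 1]:
        # j belongs to an optimal solution on {1, ..., j}
        return find_solution(p[j], v, p, M) + [j]
    return find_solution(j - 1, v, p, M)
```

With v = [0, 2, 4, 4, 7, 2, 1], p = [0, 0, 0, 1, 1, 2, 3], and M = [0, 2, 4, 6, 9, 9, 9], this returns [1, 4], a compatible pair of total value 9.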

6.2 Principles of Dynamic Programming

6.3 Segmented Least Squares

courses/cs211/winter2018/journals/nasona/chapter6.1521977948.txt.gz · Last modified: by nasona
CC Attribution-Noncommercial-Share Alike 4.0 International