=======6.1 Weighted Interval Scheduling=======
==Summary==
There is no known greedy algorithm that solves this problem, and a straightforward recursive algorithm takes exponential time. To decrease the runtime and the redundancy of the computations, we save the value of each subproblem the first time it is computed and reuse it thereafter (memoization).
==Designing a Recursive Algorithm==
  * The fact that the algorithm as written takes exponential time is due to the redundancy in the number of times it issues each of these recursive calls
  * Solution: store the value of Compute-Opt in a globally accessible place the first time we compute it, and then simply use this precomputed value in place of all future recursive calls
  * Memoization: the technique of saving values that have already been computed
  * Memoized Compute-Opt makes use of an array M[0…n]; M[j] starts with the value "empty" but holds the value of Compute-Opt(j) as soon as it is first determined
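The memoized recursion above can be sketched in Python. This is a minimal sketch, not the chapter's pseudocode: it assumes the intervals are sorted by finishing time, that v[j] holds the value of interval j, and that p[j] (the largest index i < j whose interval is compatible with interval j, or 0 if none) has been precomputed; the small instance at the bottom is hypothetical.

```python
def compute_opt(j, v, p, M):
    """Optimal value over intervals 1..j, memoized in M (None = "empty")."""
    if j == 0:
        return 0
    if M[j] is None:
        # Either take interval j together with the best solution over the
        # intervals compatible with it (indices 1..p[j]), or skip j entirely.
        M[j] = max(v[j] + compute_opt(p[j], v, p, M),
                   compute_opt(j - 1, v, p, M))
    return M[j]

# Hypothetical instance; index 0 is a placeholder so intervals are 1-indexed.
v = [0, 2, 4, 4, 7, 2, 1]
p = [0, 0, 0, 1, 0, 3, 3]
M = [None] * len(v)
best = compute_opt(len(v) - 1, v, p, M)   # optimal total value
```

Because each M[j] is filled in at most once, only O(n) distinct calls do real work, which is what brings the running time down from exponential.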
=======6.2 Principles of Dynamic Programming =======
==Summary==
We can find the optimal solution to the Weighted Interval Scheduling problem by using an iterative algorithm instead of a recursive algorithm. The two versions run in the same asymptotic time, but in general we would use the iterative version because the recursive version is typically more difficult to comprehend. Iterative-Compute-Opt runs in O(n) time. There are three basic informal principles of dynamic programming: there must be only a polynomial number of subproblems; the solution to the original problem must be easy to compute from the solutions to the subproblems; and there must be a natural ordering on the subproblems, with an easy-to-compute recurrence relating them.
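Iterative-Compute-Opt can be sketched in Python as follows. As before this is only a sketch under our own naming: it assumes intervals sorted by finishing time, values v[1..n], and precomputed compatibility indices p[1..n] (0 if no earlier interval is compatible).

```python
def iterative_compute_opt(v, p):
    """Fill M[0..n] left to right; M[n] is the optimal value. O(n) time."""
    n = len(v) - 1
    M = [0] * (n + 1)                      # base case: M[0] = 0
    for j in range(1, n + 1):
        # Same recurrence as the recursive version, but each entry is
        # computed in O(1) from entries already filled in.
        M[j] = max(v[j] + M[p[j]], M[j - 1])
    return M

# Hypothetical instance; index 0 is a placeholder so intervals are 1-indexed.
M = iterative_compute_opt([0, 2, 4, 4, 7, 2, 1], [0, 0, 0, 1, 0, 3, 3])
```

The single loop with constant work per entry is what makes the iterative version O(n).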
==Designing the Algorithm==
=======6.3 Segmented Least Squares=======
==Summary==
The problem can be defined as finding the partition of the points into segments that minimizes the penalty. In other words, we want to minimize the function E + cL, where E is the total squared error of the segments, c is the cost constant, and L is the number of lines. The value of the optimal solution satisfies the recurrence OPT(j) = min over 1 <= i <= j of (e_i,j + c + OPT(i-1)), where e_i,j is the minimum squared error of a single line fit to points pi through pj.
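This recurrence can be sketched in Python. A rough sketch under our own naming, not the chapter's pseudocode: seg_error uses the standard closed-form least-squares fit, and for brevity it recomputes each segment error on demand instead of precomputing all e_i,j values into a table first, so it is slower than the tabulated version.

```python
def segmented_least_squares(xs, ys, C):
    """Minimum penalty E + C*L over partitions of the points into segments.

    xs must be sorted in increasing order; C is the per-segment cost
    (the constant c in the text).
    """
    n = len(xs)

    def seg_error(i, j):
        # Least-squares error of fitting one line to points i..j (0-indexed).
        m = j - i + 1
        sx, sy = sum(xs[i:j + 1]), sum(ys[i:j + 1])
        sxx = sum(x * x for x in xs[i:j + 1])
        sxy = sum(x * y for x, y in zip(xs[i:j + 1], ys[i:j + 1]))
        denom = m * sxx - sx * sx
        a = (m * sxy - sx * sy) / denom if denom else 0.0  # slope
        b = (sy - a * sx) / m                              # intercept
        return sum((y - (a * x + b)) ** 2
                   for x, y in zip(xs[i:j + 1], ys[i:j + 1]))

    # OPT[j] = min over 1 <= i <= j of e(i, j) + C + OPT[i-1]; OPT[0] = 0.
    OPT = [0.0] * (n + 1)
    for j in range(1, n + 1):
        OPT[j] = min(seg_error(i - 1, j - 1) + C + OPT[i - 1]
                     for i in range(1, j + 1))
    return OPT[n]
```

For example, three collinear points fit on one line with zero error, so the minimum penalty is just the single segment cost C.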
==The Problem==
  * Suppose our data consists of a set P of n points in the plane denoted (x1, y1), …, (xn, yn); suppose x1 < x2 < … < xn
