===== 6.2: Principles of Dynamic Programming: Memoization or Iteration over Subproblems =====
  
We can now convert the memoized recursive solution into a general dynamic-programming algorithm that iterates over the subproblems. The key is an array M that records the optimal value of each subproblem: M[j] holds the value of an optimal solution for the first j intervals. We can compute the entries of M with an iterative algorithm rather than a recursive one; since it runs for n iterations and spends constant time on each, it runs in O(n) time.
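The iterative fill of the array M can be sketched as follows. This is a minimal sketch, assuming the intervals are already sorted by finish time, that ''value[j]'' is the weight of interval j (1-indexed, with a padding entry at index 0), and that ''p[j]'' is the largest index i < j compatible with interval j (0 if none); those names are illustrative, not from the text.

```python
def compute_opt(value, p):
    """Iteratively fill M for weighted interval scheduling."""
    n = len(value) - 1          # value[0] is unused padding
    M = [0] * (n + 1)           # M[j] = optimal value on intervals 1..j
    for j in range(1, n + 1):
        # Either include interval j (and jump back to p[j]) or skip it.
        M[j] = max(value[j] + M[p[j]], M[j - 1])
    return M

# Small hypothetical instance: 3 intervals with weights 2, 4, 4,
# where interval 3 is compatible only with interval 1.
M = compute_opt([0, 2, 4, 4], [0, 0, 0, 1])
```

Each entry is computed once from earlier entries, which is why the loop runs in O(n) time once the p[j] values are known.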
  
Problems solved by dynamic programming share the following properties: there are only a polynomial number of subproblems; the solution to the original problem can easily be computed from the solutions to the subproblems; and there is a natural ordering on the subproblems from "smallest" to "largest." In practice it may be difficult to decide whether to first figure out how the subproblems link together or first figure out how to solve the subproblems themselves.
===== 6.3: Segmented Least Squares: Multi-way Choices =====
  
The previous problems in this chapter used recurrences built on binary choices; it is also possible to build a recurrence on multi-way choices. This problem resembles finding a line of best fit through a set of points in the plane, where the goal is to minimize the error. When the points appear to lie along more than one line, a natural idea is to fit a set of line segments through the points so as to minimize the error. On its own, however, this is not a good problem formulation: with no limit on the number of segments, the error can be driven to zero trivially, and trying every possible partition by brute force is far too expensive.
  
This problem is known as the Segmented Least Squares Problem. It is the backbone of change detection: given a set of data points, we want to identify the points at which a change occurs. In the Segmented Least Squares Problem, we partition the points into contiguous segments and charge each segment a penalty, using a fixed multiplier, in addition to its squared error; the goal is to minimize the total of the penalties and the errors.
  
To do so, we want a polynomial number of subproblems obtained by partitioning the n points: in the subproblem on the first j points, the last segment ends at point j and begins at some earlier point i, leaving the subproblem on the first i-1 points. Minimizing over all choices of i gives a recurrence, and the resulting algorithm runs in O(n^3) time.
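The recurrence above can be sketched in code. This is a hedged sketch, not the book's pseudocode: ''points'' is assumed to be a list of (x, y) pairs sorted by x, and ''C'' is the fixed per-segment penalty; both names are illustrative. Computing every segment error with an explicit residual sum gives the O(n^3) running time mentioned above.

```python
def segmented_least_squares(points, C):
    """Minimum total cost (squared errors + C per segment) of a segmentation."""
    n = len(points)
    # err[i][j] = squared error of the best-fit line through points i..j
    err = [[0.0] * n for _ in range(n)]
    for i in range(n):
        sx = sy = sxx = sxy = 0.0
        for j in range(i, n):
            x, y = points[j]
            sx += x; sy += y; sxx += x * x; sxy += x * y
            m = j - i + 1
            denom = m * sxx - sx * sx
            # Standard least-squares slope/intercept; denom == 0 when the
            # segment has a single point (error is then 0 with a = 0, b = y).
            a = (m * sxy - sx * sy) / denom if denom else 0.0
            b = (sy - a * sx) / m
            err[i][j] = sum((points[k][1] - a * points[k][0] - b) ** 2
                            for k in range(i, j + 1))
    # OPT[j] = minimum cost for the first j points; the last segment
    # covers points i..j-1 and charges err + C, leaving OPT[i].
    OPT = [0.0] * (n + 1)
    for j in range(1, n + 1):
        OPT[j] = min(err[i][j - 1] + C + OPT[i] for i in range(j))
    return OPT[n]
```

With a large penalty C, the optimum uses few segments; with a small C, it pays for extra segments wherever the data changes direction, which is exactly the change-detection behavior described above.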
  
courses/cs211/winter2018/journals/cohene/home/chapter6.1522018570.txt.gz · Last modified: by cohene