courses:cs211:winter2018:journals:holmesr:section_6.3 (last edited 2018/03/27 16:00 by holmesr)

====== 6.3 Segmented Least Squares: Multi-way Choices ======

What we will call the error of a line of best fit is the sum of each point's squared vertical distance from the line. A single line can still be a poor summary of the data, though: if the points seem to follow a sequence of two or more different lines, any one line through all of them will have a large error.

This dilemma leads us to explore the issue of change detection, which helps us identify the few points at which a change occurs. Such a technique will help tell us how many linear approximations to use. To assist us in solving this problem, we define the penalty of a partition as the number of segments times a given multiplier C, plus the sum over segments of the error of the optimal line through that segment's points; we want to minimize the penalty. As we increase the number of segments into which the data is partitioned, the total error goes down but the segment term goes up, so minimizing the penalty trades off accuracy against the complexity of the partition.

To design the algorithm, we can begin by observing that the final point p<sub>n</sub> belongs to exactly one segment, and that segment must begin at some point p<sub>i</sub>. If we can figure out what these two points are, then we will have found the composition of the final segment and we can remove those points from the data, leaving the smaller subproblem on p<sub>1</sub>, ..., p<sub>i-1</sub>. Letting e<sub>i,j</sub> denote the minimum error of any single line through p<sub>i</sub>, ..., p<sub>j</sub>, and OPT(i) the optimal penalty for the first i points (with OPT(0) = 0), this gives the recurrence OPT(j) = min<sub>1≤i≤j</sub> (e<sub>i,j</sub> + C + OPT(i-1)), and the segment p<sub>i</sub>, ..., p<sub>j</sub> is used in an optimum solution for the first j points if and only if the minimum is achieved at index i.

To be completely honest, I am not completely sure what statement (6.7) in the book is saying, so I will look back over it at a later time.

The algorithm uses O(n<sup>2</sup>) space for the e<sub>i,j</sub> values. Computing all of them takes O(n<sup>3</sup>) time in the straightforward way (there are O(n<sup>2</sup>) pairs and each takes O(n) work), and once they are known, filling in the OPT values takes O(n<sup>2</sup>) time, so the overall running time is O(n<sup>3</sup>).
