

Chapter 4

This chapter, as its descriptive title indicates, talks about greedy algorithms. The introduction of the chapter first points out the philosophy behind the different greedy algorithms: making a locally optimal decision at each small step and thereby constructing an overall optimal solution. Then the introduction summarizes the contents of the chapter, listing examples to give us a feeling for why greedy algorithms are greedy.

Section 1: Interval Scheduling: The Greedy Algorithm Stays Ahead

This section illustrates the first example of greedy algorithms in detail: interval scheduling. The core of this algorithm is finding the right criterion of being greedy to rule each small step. There are three seemingly appealing criteria that do not work: earliest starting time, smallest interval of time, and fewest conflicts (most compatible). We did not manage to come up with a counterexample in class to disprove the fewest-conflicts case; the book gives a neat example, so that is good. The criterion that actually works is the earliest finishing time. To show that this works, we compare the solution A resulting from the earliest-finishing-time criterion with any optimal solution O. It is not hard to see that "for all indices r ≤ k, the finishing time of the r-th piece of A is earlier than or equal to that of the r-th piece of O" (where k is the number of pieces in A); that is, the greedy algorithm stays ahead. Hence it follows that the greedy algorithm indeed returns an optimal solution. The running time of the algorithm is O(n log n) (coming from sorting; the rest costs O(n)).
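
To make the earliest-finishing-time rule concrete, here is a minimal sketch in Python (my own, not the book's pseudocode); it assumes intervals are given as (start, finish) pairs.

  def interval_scheduling(intervals):
      # Sort by finishing time; this O(n log n) step dominates the running time.
      ordered = sorted(intervals, key=lambda iv: iv[1])
      chosen = []
      last_finish = float('-inf')
      for start, finish in ordered:
          # Greedy step: keep an interval only if it starts no earlier than
          # the finishing time of the last interval we chose.
          if start >= last_finish:
              chosen.append((start, finish))
              last_finish = finish
      return chosen

  # Example: the rule picks (0, 3) and (4, 7) and skips the overlapping (2, 5).
  print(interval_scheduling([(0, 3), (2, 5), (4, 7)]))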

The book then introduces some extensions of the interval scheduling problem that I think are very interesting, such as choosing intervals without knowledge of future input (like a philosophy problem from Socrates) and intervals with different values.

A related problem, scheduling all intervals, is discussed next in the section. Clearly, the number of resources we need is at least the depth of the set of intervals (the maximum number of intervals that overlap at a single point). The book then shows a greedy algorithm that guarantees the number of resources used equals the depth. This time the criterion is the earliest starting time: we go through the intervals in order of start time and, instead of abandoning an interval that overlaps ones already chosen, we put it on a resource that is free, opening a new resource only when none is free. A (d+1)-st resource is never needed: because of the way the intervals are ordered, at most d-1 of the already-placed intervals overlap the one we are trying to place, so there has to be a resource whose last interval finishes before the current one starts. The running time is still O(n log n), coming mainly from sorting.
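
Below is a rough sketch of the scheduling-all-intervals idea, again in Python and again my own rather than the book's: it assumes intervals are (start, finish) tuples, and the min-heap of resource finishing times is just an implementation detail I chose for finding a free resource quickly.

  import heapq

  def schedule_all_intervals(intervals):
      ordered = sorted(intervals, key=lambda iv: iv[0])  # earliest starting time first
      free = []        # min-heap of (finish time of the resource's last interval, resource label)
      assignment = []  # (interval, resource label) pairs
      resources = 0
      for start, finish in ordered:
          if free and free[0][0] <= start:
              # Some resource's last interval finishes before this one starts: reuse it.
              _, label = heapq.heappop(free)
          else:
              # Every open resource is still busy at time `start`, so open a new one;
              # this can happen at most depth-many times.
              resources += 1
              label = resources
          assignment.append(((start, finish), label))
          heapq.heappush(free, (finish, label))
      return assignment, resources

  # Example: the depth is 2, and indeed only 2 resources are used.
  print(schedule_all_intervals([(0, 4), (1, 2), (3, 5)]))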

I like this section's reading. Since it is my first time learning about greedy algorithms, the logic behind them attracts my attention. I'd like to see how the "choosing intervals without knowledge of future input" extension works.

Readability is 9.

Section 2: Scheduling to Minimize Lateness: An Exchange Argument

The section introduces the second example that makes use of the idea of being greedy. It first states the problem: we have some jobs, each with a deadline and a length, and we need to schedule all of them so that the maximum lateness is minimized.

There are some appealing criteria that do not really meet our goal: the length of the jobs and the slack time of the jobs. The criterion that works is Earliest Deadline First. The algorithm is written based on this criterion and the running time is O(n log n). We can easily see that there should be no idle time, that is, the machine works continuously. The book defines an inversion to be a pair of jobs in which the one with the later deadline is finished first. The book proves that all schedules with no inversions and no idle time have the same maximum lateness, which is easy to see (such schedules can differ only where two jobs have the same deadline). Then the book proves that there is an optimal schedule that has no inversions and no idle time. The hardest part of the proof, which is presented separately, is that if we swap an adjacent inversion, the new schedule is still optimal.
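
Here is a small sketch of the Earliest Deadline First rule in Python (my own rendering, not the book's pseudocode); it assumes each job is a (length, deadline) pair and that all jobs are available starting at time 0.

  def minimize_lateness(jobs):
      ordered = sorted(jobs, key=lambda job: job[1])  # Earliest Deadline First
      t = 0                                           # current time; no idle time is ever inserted
      max_lateness = 0
      schedule = []
      for length, deadline in ordered:
          start, finish = t, t + length
          schedule.append((start, finish, deadline))
          max_lateness = max(max_lateness, finish - deadline)
          t = finish
      return schedule, max_lateness

  # Example: jobs of lengths 2 and 1 with deadlines 4 and 2; the maximum lateness is 0.
  print(minimize_lateness([(2, 4), (1, 2)]))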

One extension of the minimizing-lateness problem is to add an earliest possible starting time (release time) to each job. This makes the original proof no longer valid, because we can no longer swap arbitrary inversions.

This section involves a challenging proof, which is worth more attention. So the readability is 8.

Section 4: Shortest Paths in a Graph
