Chapter 4
Section 4.1
Section 4.1 covers the introduction to greedy algorithms using the basic “staying ahead” style of argument. Two specific algorithms are introduced in this section. The first is the interval scheduling algorithm, which returns an optimal schedule of intervals from some given set of intervals. The algorithm works by finding the interval that has the earliest finishing time, removing all intervals that conflict with the chosen interval, and then recursing through the remaining intervals until there are none left. The run-time of this algorithm is O(n log n).
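The earliest-finish-time rule can be sketched in a few lines of Python. This is an illustrative sketch, not the book's pseudocode; the function name and the (start, finish) pair representation are my own:

```python
def interval_schedule(intervals):
    """Greedy interval scheduling: repeatedly take the interval with the
    earliest finish time, then skip everything that overlaps it.
    `intervals` is a list of (start, finish) pairs."""
    # Sorting by finish time dominates the cost, giving O(n log n) overall.
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:  # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen

selected = interval_schedule([(0, 3), (2, 5), (3, 7), (1, 8), (7, 9)])
```

Sorting once and scanning left to right is equivalent to the "remove all conflicts and recurse" description, since every conflicting interval is simply passed over by the `start >= last_finish` test.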
The second algorithm, interval partitioning, schedules all intervals in an optimal fashion such that the number of resources used is no more than the depth of the set of intervals. The algorithm works by going through the intervals in order of their start times and assigning each one to a resource not in use by any conflicting interval, opening a new resource only when every existing one conflicts.
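Assuming this is the interval partitioning algorithm, one common way to realize the sweep is with a min-heap of busy resources keyed by finish time. Everything below (names, representation, the heap trick) is my own sketch, not the text's code:

```python
import heapq

def interval_partition(intervals):
    """Assign each interval to a resource so no resource holds two
    overlapping intervals; the count of resources used equals the depth.
    `intervals` is a list of distinct (start, finish) pairs."""
    busy = []        # min-heap of (finish_time, resource_id) for busy resources
    assignment = {}  # interval -> resource id
    resources = 0
    for start, finish in sorted(intervals):      # sweep by start time
        if busy and busy[0][0] <= start:         # earliest-freed resource is open
            _, rid = heapq.heappop(busy)
        else:                                    # all resources conflict: open one
            rid = resources
            resources += 1
        assignment[(start, finish)] = rid
        heapq.heappush(busy, (finish, rid))
    return assignment, resources

schedule, rooms = interval_partition([(0, 4), (1, 3), (2, 5), (4, 7)])
```

In the example, intervals (0, 4), (1, 3), and (2, 5) all overlap around time 2, so the depth is 3 and three resources are opened; (4, 7) then reuses the resource freed by (1, 3).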
I give this chapter a 6 for readability, since the proofs were sometimes hard to understand, and a 7 on the interesting scale.
Section 4.2
Section 4.2 covers the design and creation of an algorithm that minimizes maximum lateness in job scheduling. The algorithm itself is surprisingly simple: it arranges jobs in order of their deadlines, earliest first. Even though this algorithm completely ignores the other half of the data, the length of each job, it is shown in the rest of the section through various proofs that the algorithm always produces an optimal solution.
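The earliest-deadline-first rule is short enough to sketch directly. This is a minimal illustration; the function name and the (length, deadline) pair representation are assumptions of mine:

```python
def schedule_by_deadline(jobs):
    """Earliest Deadline First: order jobs by deadline, ignoring length.
    `jobs` is a list of (length, deadline) pairs; returns the schedule
    and the maximum lateness it incurs."""
    order = sorted(jobs, key=lambda job: job[1])    # sort by deadline only
    t = max_lateness = 0
    for length, deadline in order:
        t += length                                 # job finishes at time t
        max_lateness = max(max_lateness, t - deadline)
    return order, max_lateness

order, worst = schedule_by_deadline([(3, 2), (1, 4)])
```

Note that the sort key uses only the deadline, never the length, which is exactly the "ignores the other half of the data" point above.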
This section was a much tougher read than many of the other sections, since so much of it was taken up by the proofs, which I personally find much harder to read. As such, I give this section a 4 on readability and a 6 on the interesting scale.
Section 4.4
This section covers Dijkstra's shortest path algorithm that we discussed today in class (Monday, Feb. 26th). I was able to understand the algorithm much better in class, as it was easier to grasp with a real-time explanation, but the section definitely helped to supplement my knowledge of the algorithm. For example, the algorithm actually runs in O(m log n) time, not O(n log n) time. In class we just sort of said n log n since it is what we are used to, while m log n is the correct runtime.
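A heap-based sketch makes the O(m log n) bound concrete: each of the m edges can push at most one entry onto the heap, and each heap operation costs O(log n). The Python below is illustrative, not the book's pseudocode, and the adjacency-list representation is my own choice:

```python
import heapq

def dijkstra(graph, source):
    """Heap-based Dijkstra. `graph` maps node -> list of (neighbor, weight)
    pairs with non-negative weights; returns shortest distances from source."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry; u already settled
        for v, w in graph[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w           # relax the edge (u, v)
                heapq.heappush(heap, (d + w, v))
    return dist

dist = dijkstra({"s": [("a", 1), ("b", 4)], "a": [("b", 2)], "b": []}, "s")
```

Since every edge relaxation does at most one push, the heap sees O(m) entries in total, which is where the m in O(m log n) comes from.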
I give this section a 7 for readability and a 7 on the interesting scale.
Section 4.5
Section 4.5 covers three ways to create a minimum spanning tree from some connected graph. The three algorithms used are Prim's Algorithm, Kruskal's Algorithm, and the Reverse-Delete Algorithm. All three are greedy by nature, and all three were covered in class beforehand. The section goes into various proofs that are used to assert the correctness of the three algorithms, but the most important points are the proofs of the cut and cycle properties. The cut property identifies edges that must be part of the minimum spanning tree, while the cycle property proves that the most expensive edge on a cycle cannot be in the tree, which allows the deletion of edges in the Reverse-Delete Algorithm.
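As one example of the three, Prim's algorithm can be sketched with a heap of edges crossing the cut between the growing tree and the rest of the graph; the cut property is exactly what justifies taking the cheapest such edge. The names and graph representation below are my own, not the book's:

```python
import heapq

def prim_mst(graph, root):
    """Prim's algorithm: grow a tree from `root`, repeatedly adding the
    cheapest edge that crosses the cut between the tree and the rest.
    `graph` maps node -> list of (neighbor, weight) pairs."""
    in_tree = {root}
    crossing = [(w, root, v) for v, w in graph[root]]  # edges leaving the tree
    heapq.heapify(crossing)
    mst = []
    while crossing:
        w, u, v = heapq.heappop(crossing)
        if v in in_tree:
            continue                      # edge no longer crosses the cut
        in_tree.add(v)
        mst.append((u, v, w))
        for x, wx in graph[v]:
            if x not in in_tree:
                heapq.heappush(crossing, (wx, v, x))
    return mst

mst = prim_mst({"a": [("b", 1), ("c", 3)],
                "b": [("a", 1), ("c", 1)],
                "c": [("a", 3), ("b", 1)]}, "a")
```

Each popped edge is the minimum-weight edge leaving the current tree, so by the cut property it is safe to add.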
Although this chapter was informative, the fact that we had covered it in class for multiple days made the reading a bit redundant as the material had already been presented. I give this section a 4 on the interesting scale and a 6 on the readability scale.
Section 4.6
Section 4.6 delves into the implementation of Kruskal's algorithm. The new data type introduced in the section is the Union-Find data type. Implemented using pointers, the Union-Find data structure performs three operations: MakeUnionFind(S), which creates a Union-Find data structure out of some set of nodes S; Find(u), which returns the name of the set containing node u and can be implemented in O(log n) for our purposes; and Union(A, B), which combines two sets A and B into a single set and can be implemented in O(1). The overall runtime for Kruskal's algorithm is O(m log n).
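The three operations and their use in Kruskal's algorithm can be sketched as follows. This uses a simple parent dictionary with path halving rather than the book's pointer-based union-by-size structure, so it is an illustration of the idea, not the text's implementation:

```python
def make_union_find(nodes):
    """MakeUnionFind(S): every node starts as its own one-element set."""
    return {u: u for u in nodes}

def find(parent, u):
    """Find(u): follow parent links to the set's representative.
    Path halving keeps the trees shallow as a side effect."""
    while parent[u] != u:
        parent[u] = parent[parent[u]]     # shortcut every other link
        u = parent[u]
    return u

def union(parent, a, b):
    """Union(A, B): merge the sets containing a and b."""
    parent[find(parent, a)] = find(parent, b)

def kruskal(nodes, edges):
    """Kruskal: scan edges by increasing weight, keeping an edge only when
    its endpoints lie in different sets. Sorting the m edges dominates,
    giving the O(m log n) overall bound. `edges` holds (w, u, v) triples."""
    parent = make_union_find(nodes)
    mst = []
    for w, u, v in sorted(edges):
        if find(parent, u) != find(parent, v):   # edge joins two components
            union(parent, u, v)
            mst.append((u, v, w))
    return mst

mst = kruskal(["a", "b", "c"],
              [(1, "a", "b"), (1, "b", "c"), (3, "a", "c")])
```

The Find test is what detects that an edge would close a cycle, which is why the heavier edge (3, "a", "c") is rejected in the example.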
The in-depth implementation of Kruskal's algorithm was interesting; it helped cement the in-class lecture. I give this section a 7 on readability and a 7 on the interesting scale.
