Onye's Journal
Preface:
Algorithms are important. They help us solve problems and understand computation. This book is about algorithms.
Section 1.1:
Section 1.1 talked about the stable matching problem and the Gale-Shapley solution to it. A typical example of the stable matching problem is n men and n women looking to be matched, with each man and woman having a strict order of preferences. For a given matching, m and w form an unstable pair if they each prefer the other to their current partner. For n men and n women with strict preferences, it is always possible to find a matching with no unstable pairs, and the Gale-Shapley algorithm does just that. It works by taking any single man and having him propose to the highest-ranked woman he has not yet proposed to. If she is free, or prefers him to her current match, the proposal is successful: she is paired with him, and her previous partner (if any) becomes single again. Repeat this until no man is free (in the worst case, every man proposes to every woman).
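The proposal loop above can be sketched in Python. This is a minimal sketch: the names (`gale_shapley`, `next_proposal`, and so on) are my own, and preferences are assumed to be dicts mapping each person to a list of the other side, most preferred first.

```python
def gale_shapley(men_prefs, women_prefs):
    """Stable matching. Returns a dict mapping woman -> man."""
    # rank[w][m] = position of man m in woman w's list, for O(1) comparisons
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    free_men = list(men_prefs)                  # every man starts out free
    next_proposal = {m: 0 for m in men_prefs}   # index of next woman to try
    engaged_to = {}                             # woman -> man
    while free_men:
        m = free_men.pop()
        w = men_prefs[m][next_proposal[m]]
        next_proposal[m] += 1
        if w not in engaged_to:
            engaged_to[w] = m                   # w was free: pair them
        elif rank[w][m] < rank[w][engaged_to[w]]:
            free_men.append(engaged_to[w])      # w trades up; old partner is free
            engaged_to[w] = m
        else:
            free_men.append(m)                  # w rejects m; he stays free
    return engaged_to
```

Each man proposes at most n times, so the loop runs at most n^2 times in total.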
Section 2.1:
Chapter 2 discusses computational tractability and the running time of algorithms. It also discusses asymptotic order of growth and Big O. Algorithms that run in polynomial time are O(n^k) for some k. These are said to be efficient. In contrast, some algorithms may be O(k^n) for some k > 1, for example. These algorithms are incredibly computationally intensive and quickly become impossible to run in a reasonable amount of time. For example, an exponential algorithm may finish in under a second for small n, but take billions of years once n grows only modestly.
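A quick back-of-the-envelope comparison of hypothetical step counts, just to illustrate the gap:

```python
# Hypothetical step counts for a cubic vs. an exponential algorithm.
# At, say, 10^9 steps per second, 100^3 steps is a millisecond, while
# 2^100 steps is astronomically longer than the age of the universe.
for n in (10, 20, 50, 100):
    print(f"n={n:3d}  n^3={n**3:<8d}  2^n={2**n}")
```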
Section 2.3
In order to get the Gale-Shapley algorithm running in O(n^2) steps, we need each iteration of the loop to run in constant time.
Because we need to both add and remove elements from the set of free men, it is preferable to use a linked list rather than an array. (Though it is entirely feasible for this problem to use arrays without increasing the complexity. You can, for example, implement a stack with an array, and since the maximum number of men is fixed there is no fear of running out of space in the array, so all stack operations will be constant time.)
To keep track of the woman a man will next propose to, you can use an array called Next, where Next[m] is the number of the woman man m will propose to next. Alternatively, you could keep each man's preference list as a linked list and delete women as he proposes to them.
Finally, we need an array Ranking[w,m] that gives the ranking of man m in w's list. This lets us compare two men in constant time to see which one she prefers. Building this array takes O(n^2) time, but it is done once, outside the loop, so the overall bound is unaffected.
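The precomputation described above might look like this in Python (a sketch; the name `build_ranking` is my own, and the Ranking table is represented as nested dicts rather than a 2D array):

```python
def build_ranking(women_prefs):
    # Ranking[w][m] gives man m's position in woman w's preference list,
    # so "does w prefer m to m2?" becomes a constant-time lookup.
    return {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}

ranking = build_ranking({'x': ['A', 'B'], 'y': ['B', 'A']})
print(ranking['x']['A'] < ranking['x']['B'])  # x prefers A: prints True
```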
Section 2.4
This section surveys common running times, with example algorithms for each and explanations of why they run in that time.
Linear Time: An algorithm can be linear if it processes the input in a single pass, spending a constant amount of time on each part of the input. An example of this is the most direct algorithm for computing the maximum of a list of numbers.
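That single-pass maximum might look like:

```python
def maximum(numbers):
    # One pass, constant work per element: O(n).
    best = numbers[0]
    for x in numbers[1:]:
        if x > best:
            best = x
    return best

print(maximum([3, 1, 4, 1, 5, 9, 2, 6]))  # prints 9
```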
Another method is a sort of accounting scheme: charge the cost of each iteration to an element, so that the total cost is bounded by the number of elements. An example where this approach works is analyzing the running time of merging two sorted lists.
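A sketch of the merge, with the charging argument in the comments:

```python
def merge(a, b):
    # Each iteration permanently consumes one element of a or b, so the
    # total cost can be charged to the elements: O(len(a) + len(b)).
    result, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            result.append(a[i]); i += 1
        else:
            result.append(b[j]); j += 1
    return result + a[i:] + b[j:]   # append whichever list has leftovers
```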
N log N:
N log N is common to “divide and conquer” algorithms, where the input is cut into pieces, each piece is solved separately, and the results are then combined. Examples include sorting algorithms such as mergesort and quicksort. It also appears as the running time whenever sorting is the most expensive component of an algorithm.
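A minimal mergesort sketch (my own naming) illustrating the divide-and-conquer pattern: splitting in half and merging gives the recurrence T(n) = 2T(n/2) + O(n), which solves to O(n log n).

```python
def mergesort(xs):
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = mergesort(xs[:mid]), mergesort(xs[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```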
Quadratic Time: Quadratic running time appears when processing is done on every possible pair from a set of n elements. For example, computing the distance between every pair of points in a set to find the closest pair. It can also arise from nested loops.
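A brute-force closest-pair sketch. Using squared distances avoids the square root without changing which pair is closest; the name is my own.

```python
from itertools import combinations

def closest_pair_distance_sq(points):
    # Examine every pair: n*(n-1)/2 pairs, hence O(n^2).
    return min((px - qx) ** 2 + (py - qy) ** 2
               for (px, py), (qx, qy) in combinations(points, 2))
```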
Cubic Time:
Cubic time arises from triply nested loops, or from selecting a pair of elements and then performing a linear amount of work for each pair.
O(n^k) Time:
O(n^k) time arises from k nested loops, or from examining every subset of k elements drawn from n (there are O(n^k) such subsets).
Beyond Polynomial Time: Some algorithms take even more than polynomial time. Some examples of running times like this are O(2^n) and O(n!). O(2^n) might arise in cases where we look at every possible subset of n elements. O(n!) might arise when looking at all the possible permutations of n items or matching n items to n other items.
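A small illustration of both sources of super-polynomial growth (names are my own): enumerating all 2^n subsets via bitmasks, and all n! permutations via the standard library.

```python
from itertools import permutations

def all_subsets(items):
    # One subset per bitmask from 0 to 2^n - 1: inherently exponential work.
    n = len(items)
    return [[items[i] for i in range(n) if mask >> i & 1]
            for mask in range(1 << n)]

subsets = all_subsets(['a', 'b', 'c'])
perms = list(permutations(['a', 'b', 'c']))
print(len(subsets), len(perms))  # prints 8 6
```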
Sublinear Time: Sometimes algorithms take less than linear time. Since just reading the input takes linear time, these algorithms typically query the input indirectly rather than reading it completely. An example of such an algorithm is binary search for finding a value in a sorted array.
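A binary search sketch: each probe halves the remaining range, so only O(log n) elements of the input are ever examined.

```python
def binary_search(sorted_list, target):
    # Returns True iff target appears in sorted_list.
    lo, hi = 0, len(sorted_list)
    while lo < hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] < target:
            lo = mid + 1        # target can only be in the upper half
        else:
            hi = mid            # target is at mid or in the lower half
    return lo < len(sorted_list) and sorted_list[lo] == target
```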
Section 2.5
This section uses a heap, which is a tree where each node has a key and the keys of a node's children are no smaller than the key of that node. Using the operations HeapifyUp and HeapifyDown, which restore the heap property when it is violated at one particular location, it is possible to implement insertion and deletion in logarithmic time. The heap is a natural choice for implementing a priority queue because the highest-priority element is always at the root, and it beats other implementations, which require at least one operation to run in O(n).
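A minimal sketch of an array-backed min-heap with both fix-up operations (class and method names are my own; the children of index i sit at 2i+1 and 2i+2):

```python
class MinHeap:
    def __init__(self):
        self.a = []

    def insert(self, key):
        self.a.append(key)
        self._heapify_up(len(self.a) - 1)       # O(log n)

    def extract_min(self):
        top, last = self.a[0], self.a.pop()
        if self.a:
            self.a[0] = last                    # move last element to root
            self._heapify_down(0)               # O(log n)
        return top

    def _heapify_up(self, i):
        # Swap upward while the node is smaller than its parent.
        while i > 0 and self.a[i] < self.a[(i - 1) // 2]:
            parent = (i - 1) // 2
            self.a[i], self.a[parent] = self.a[parent], self.a[i]
            i = parent

    def _heapify_down(self, i):
        # Swap downward while some child is smaller than the node.
        n = len(self.a)
        while True:
            smallest = i
            for child in (2 * i + 1, 2 * i + 2):
                if child < n and self.a[child] < self.a[smallest]:
                    smallest = child
            if smallest == i:
                return
            self.a[i], self.a[smallest] = self.a[smallest], self.a[i]
            i = smallest
```

Both fix-up operations walk one root-to-leaf path, which is why insertion and deletion are logarithmic.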
Section 3.4
Breadth-first search can be used to test bipartiteness. A graph is bipartite if and only if it contains no odd cycles. Color each vertex based on the parity of its distance from an arbitrary starting node; the graph is bipartite exactly when no edge joins two vertices of the same color.
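The BFS coloring test might be sketched like this (assuming the graph is given as a dict mapping each node to its list of neighbors):

```python
from collections import deque

def is_bipartite(adj):
    color = {}
    for start in adj:                   # handle disconnected graphs too
        if start in color:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]   # opposite parity
                    queue.append(v)
                elif color[v] == color[u]:
                    return False              # same-colored edge: odd cycle
    return True
```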
Section 3.5
A directed graph is a graph whose edges are directed arrows pointing from one vertex to another. The concept of a connected component changes somewhat in directed graphs, since a path from vertex a to vertex b does not imply a path from b to a. A graph is strongly connected if every vertex is reachable from every other vertex.
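One way to sketch a strong-connectivity check (names are my own): pick any vertex s, then verify that every vertex is reachable from s and that s is reachable from every vertex, the latter by searching the reversed graph. If both hold, any two vertices connect through s.

```python
def strongly_connected(adj):
    # adj: dict mapping each node to its list of successors.
    def reachable(graph, s):
        seen, stack = {s}, [s]
        while stack:
            u = stack.pop()
            for v in graph[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        return seen

    nodes = list(adj)
    s = nodes[0]
    reversed_adj = {u: [] for u in adj}     # flip every edge
    for u in adj:
        for v in adj[u]:
            reversed_adj[v].append(u)
    return (len(reachable(adj, s)) == len(nodes)
            and len(reachable(reversed_adj, s)) == len(nodes))
```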
Section 3.6
Directed Acyclic Graphs (DAGs) are directed graphs with no cycles. Any DAG has at least one topological ordering: an ordering of the vertices such that if a has an arrow going to b, then a comes before b. To find such an ordering, build it by repeatedly taking out a “least” element of the DAG, that is, one with no arrow pointing to it. Such a vertex always exists: otherwise you could follow incoming arrows backward forever, and in a finite graph that would eventually revisit a vertex and form a cycle.
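The "repeatedly remove a least element" idea is exactly Kahn's algorithm; a sketch (assuming a dict of successor lists):

```python
from collections import deque

def topological_order(adj):
    # Count incoming arrows for each vertex.
    indegree = {u: 0 for u in adj}
    for u in adj:
        for v in adj[u]:
            indegree[v] += 1
    # Start with the vertices that have no arrow pointing to them.
    queue = deque(u for u in adj if indegree[u] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj[u]:                # removing u frees up its successors
            indegree[v] -= 1
            if indegree[v] == 0:
                queue.append(v)
    return order    # contains every vertex exactly when the graph is a DAG
```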