Michael's Wiki
Chapter 1.1 Summary:
Stable matching problem
Self-enforcing recruitment/matching process
Outcome is stable if, for every employer E and applicant A not matched to each other, either: E prefers every one of its accepted applicants to A; or
A prefers her current situation over working for employer E.
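The stability condition above is achieved by the Gale-Shapley proposal algorithm from this chapter. A minimal sketch, assuming one-to-one matching (each employer hires one applicant); the function and variable names are illustrative, not from the book:

```python
from collections import deque

def gale_shapley(emp_prefs, app_prefs):
    """Employers propose in preference order; returns a stable matching.

    emp_prefs / app_prefs: dicts mapping each side to a ranked preference list.
    """
    # rank[a][e] = position of employer e in applicant a's preference list
    rank = {a: {e: i for i, e in enumerate(prefs)} for a, prefs in app_prefs.items()}
    free = deque(emp_prefs)                    # employers not yet matched
    next_proposal = {e: 0 for e in emp_prefs}  # index of next applicant to try
    match = {}                                 # applicant -> employer

    while free:
        e = free[0]
        a = emp_prefs[e][next_proposal[e]]
        next_proposal[e] += 1
        if a not in match:                       # applicant is free: accept
            match[a] = e
            free.popleft()
        elif rank[a][e] < rank[a][match[a]]:     # applicant prefers e: switch
            free.popleft()
            free.append(match[a])
            match[a] = e
        # otherwise a rejects e; e stays free and tries its next choice
    return {e: a for a, e in match.items()}
```

The self-enforcing character of the process shows up in the loop: an applicant only trades up, so no employer-applicant pair would both prefer to deviate from the result.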
Chapter 2
Chapter 2.1 Computational Tractability:
Important to think about how resource requirements – the amount of time and space an algorithm will use – will scale with increasing input size
Algorithms should be fundamentally discrete.
An algorithm is efficient if, when implemented, it runs quickly on real input instances
→
Leaves out input size and platform: a bad algorithm can pass this test on small instances, and a good algorithm can fail it if coded poorly
An algorithm is efficient if it achieves qualitatively better worst-case performance, at an analytical level, than brute-force search.
-too vague
→
An algorithm is efficient if it has a polynomial running time
Helps to see that there might not be an efficient algorithm for a particular problem
Chapter 2.2 Asymptotic Order of Growth:
O() expresses an upper bound, not the exact growth rate
Upper (O), lower (Ω), and tight (Θ) bounds
Transitivity: if f = O(g) and g = O(h), then f = O(h); the same holds for Ω and Θ
Polynomials - asymptotic rate of growth is determined by their “high-order term”
Logarithms- slow growing
Exponentials- fast growing
Every exponential grows faster than every polynomial
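That last claim is easy to see numerically. A throwaway sketch (values chosen just for illustration) comparing a logarithmic, a polynomial, and an exponential function:

```python
import math

# For n large enough, 2**n dominates any fixed polynomial n**d,
# even though the polynomial can win for small and medium n.
for n in (10, 20, 40):
    print(f"n={n:>2}  n*log2(n)={n * math.log2(n):>8.0f}  "
          f"n**5={n**5:>12}  2**n={2**n:>15}")
```

At n = 20 the polynomial n**5 is still ahead of 2**n, but by n = 40 the exponential has overtaken it by several orders of magnitude.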
Chapter 2.4 - Survey of Common Running Times
Summary: Goes over various common running times and common problems associated with those running times. Search space: the set of all possible solutions to a problem. We look for algorithms whose performance is more efficient than a brute-force enumeration of this search space; one of the goals of the book is to improve on brute-force algorithms
Linear Time
-At most a constant factor times the size of the input
Ex.
Computing the Maximum
Merging Two Sorted Lists
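Merging two sorted lists is linear because each comparison permanently consumes one element from one of the lists. A minimal sketch (names illustrative):

```python
def merge(a, b):
    """Merge two sorted lists in O(len(a) + len(b)) time."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:          # take the smaller front element
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])             # at most one of these is non-empty
    out.extend(b[j:])
    return out
```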
O(n log n) Time:
Running time of any algorithm that splits its input into two equal-sized pieces, solves each piece recursively, and then combines the two solutions in linear time
Ex.
Sorting Algorithms like Mergesort
Algorithms whose most expensive step is to sort the input
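The split-recurse-combine pattern above can be sketched as Mergesort; the recurrence T(n) = 2T(n/2) + O(n) solves to O(n log n). A compact illustrative version:

```python
def mergesort(xs):
    """Split in half, sort each half recursively, merge in linear time."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = mergesort(xs[:mid]), mergesort(xs[mid:])
    # linear-time merge of the two sorted halves
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```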
Quadratic Time:
Performing a search over all pairs of inputs and spending constant time per pair
Also arises naturally from nested loops.
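A standard instance of the all-pairs pattern is the brute-force closest pair of points; a sketch with illustrative names:

```python
def closest_pair(points):
    """Examine all O(n^2) pairs, spending constant time per pair."""
    best = None
    for i in range(len(points)):
        for j in range(i + 1, len(points)):       # each unordered pair once
            (x1, y1), (x2, y2) = points[i], points[j]
            d = (x1 - x2) ** 2 + (y1 - y2) ** 2   # squared distance suffices
            if best is None or d < best[0]:
                best = (d, points[i], points[j])
    return best[1], best[2]
```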
Cubic Time:
More elaborate sets of nested loops.
Ex.
Given n sets, determining whether some pair of them is disjoint
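A sketch of that disjoint-pair check, assuming n sets of size O(n) over a small universe so membership tests are O(1) (Python sets give this): O(n²) pairs times an O(n) scan per pair is O(n³). Names are illustrative:

```python
def find_disjoint_pair(sets):
    """Three nested loops: for every pair of sets, scan one set's elements."""
    for i in range(len(sets)):
        for j in range(i + 1, len(sets)):
            # check whether any element of sets[i] also lies in sets[j]
            if not any(x in sets[j] for x in sets[i]):
                return i, j          # sets[i] and sets[j] are disjoint
    return None
```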
O(n^k) Time
Arises for any constant k when we search over all subsets of size k
Ex.
Finding independent sets of size k in a graph
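The brute-force version of that search can be sketched as follows: try all C(n, k) = O(n^k) subsets of size k, and check each one in O(k²) time (constant for fixed k). Function names are illustrative:

```python
from itertools import combinations

def has_independent_set(edges, nodes, k):
    """Search all O(n^k) size-k subsets for one with no internal edge."""
    edge_set = {frozenset(e) for e in edges}
    for subset in combinations(nodes, k):
        # independent iff no pair inside the subset is joined by an edge
        if all(frozenset(p) not in edge_set for p in combinations(subset, 2)):
            return True
    return False
```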
Chapter 2.5 - More Complex Data Structure: Priority Queue
Summary: A Priority Queue can be implemented well using a Heap data structure, which can be viewed as a balanced binary tree. The Heap data structure with Heapify-Down and Heapify-Up operations can efficiently implement a priority queue that is constrained to hold at most N elements at any point in time, with O(log N) time per operation
We seek algorithms that improve qualitatively on brute-force search and in general we use polynomial-time solvability as the concrete formulation of this.
Priority Queue – designed for applications in which elements have a priority value, or key, and each time we need to select an element from S, we want to take the one with the highest priority
-Smaller keys represent higher priorities
-Supports addition and deletion of elements from the set and also the selection of the element with smallest key
Heap data structure- Useful data structure for implementing a priority queue
Combines the benefits of a sorted array and list
Useful to think of as a balanced binary tree
Tree will have a root, and each node can have up to two children, a left and a right child
Heap Order: for every element v at a node i, the element w at i's parent satisfies key(w) ≤ key(v)
Common Operations:
Accessing Min- O(1) time because it will be the root
Heapify-Up – Used to 'fix' the heap after an element is added (or its key is decreased) and its key is smaller than its parent's: the element is repeatedly swapped with its parent until heap order is restored
Heapify-Down – Inverse of Heapify-Up: used to 'fix' the heap when, after a deletion (or a key increase), an element's key is larger than a child's: the element is repeatedly swapped with its smaller child until heap order is restored.
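Both operations can be sketched on an array-based min-heap, where the children of index i sit at 2i+1 and 2i+2 (a common convention; function names here are illustrative):

```python
def heapify_up(h, i):
    """Move h[i] up while it is smaller than its parent (min-heap order)."""
    while i > 0:
        parent = (i - 1) // 2
        if h[i] < h[parent]:
            h[i], h[parent] = h[parent], h[i]
            i = parent
        else:
            break

def heapify_down(h, i):
    """Move h[i] down while some child holds a smaller key."""
    n = len(h)
    while True:
        left, right, smallest = 2 * i + 1, 2 * i + 2, i
        if left < n and h[left] < h[smallest]:
            smallest = left
        if right < n and h[right] < h[smallest]:
            smallest = right
        if smallest == i:
            break
        h[i], h[smallest] = h[smallest], h[i]
        i = smallest
```

Insertion appends at the end and calls heapify_up on the last index; extract-min swaps the last element into the root and calls heapify_down on index 0. Each follows one root-to-leaf path, hence O(log N).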
Chapter 3.1 Graphs- Basic Definitions and Applications:
Summary: Goes over some basic graph types and some uses for a graph
Graph – a collection of nodes (vertices), with some pairs of them joined by edges
Examples/Uses:
- Transportation Networks
- Communication Networks
- Information Networks
- Social Networks
Simple Path- all vertices are distinct from one another
Cycle – a path v1, v2, …, vk-1, vk in which k > 2, the first k-1 nodes are all distinct, and v1 = vk. In other words, the sequence of nodes "cycles back" to where it began.
Tree – an undirected graph that is connected and does not contain a cycle; the simplest kind of connected graph.
-Rooted trees help to explain the concept of hierarchy in computer science.
Theorem 3.2: Let G be an undirected graph on n nodes. Any two of the following statements implies the third:
- G is connected
- G does not contain a cycle
- G has n-1 edges
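The theorem gives a cheap tree test: count the edges and check connectivity (here with BFS), and the absence of cycles follows for free. A sketch assuming nodes are labeled 0..n-1; names are illustrative:

```python
from collections import deque

def is_tree(n, edges):
    """By Theorem 3.2, connected + exactly n-1 edges implies acyclic."""
    if n == 0 or len(edges) != n - 1:
        return False
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen = {0}                       # BFS from node 0 to test connectivity
    q = deque([0])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                q.append(w)
    return len(seen) == n
```

Note how the third example in the test below has n-1 edges but fails: it is disconnected, and (consistent with the theorem) it must therefore contain a cycle.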
