Chapter 2

Basics of Algorithm Analysis

This chapter discusses the efficiency of algorithms. Not all algorithms are efficient in terms of their memory usage and speed. “Even bad algorithms can run quickly when applied to small test cases on extremely fast processors; even good algorithms can run slowly when they are coded sloppily.”

Computational Tractability

The natural input size for the Stable Matching problem from the previous chapter is about 2n^2, since there are two sets, M and W, each of size n, and each of the 2n preference lists has length n. The Gale-Shapley algorithm terminates after at most n^2 iterations of its proposal loop, whereas a Brute-Force Search would have to examine all n! possible perfect matchings, which grows far faster than any polynomial in n.

NOTE: The book gives several candidate definitions of efficiency and explains why each one is a good or a bad choice. There is also a table with the running times of different algorithms on inputs of increasing size.

Asymptotic Order of Growth

Asymptotic Upper Bounds

Defined as T(n) = O(f(n)), meaning T(n) is O(f(n)) if there exist constants c > 0 and n0 >= 0 such that for all n >= n0, we have T(n) <= c · f(n).

Asymptotic Lower Bounds

Defined as T(n) = Ω(f(n)), meaning T(n) is Ω(f(n)) if there exist constants ε > 0 and n0 >= 0 such that for all n >= n0, we have T(n) >= ε · f(n).

Asymptotic Tight Bounds

Defined as T(n) = Θ(f(n)): T(n) is Θ(f(n)) if it is both O(f(n)) and Ω(f(n)), i.e., f(n) is an asymptotically tight bound on T(n).
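
As a worked example (mine, not the book's): take T(n) = 3n^2 + 2n. Since 3n^2 + 2n <= 5n^2 for all n >= 1, T(n) is O(n^2) with c = 5 and n0 = 1; and since 3n^2 + 2n >= 3n^2 for all n >= 0, T(n) is Ω(n^2) with ε = 3. Because both bounds hold, T(n) = Θ(n^2).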

Properties of Asymptotic Growth Rates

  • They are transitive: if f = O(g) and g = O(h), then f = O(h).
  • They are summable: if f = O(h) and g = O(h), then f + g = O(h).

Asymptotic Bounds for Some Common Functions

  • Polynomials: every polynomial of degree d is O(n^d)
  • Logarithms: log_b(n) = O(n^x) for every base b and every x > 0, i.e., logarithms grow more slowly than any polynomial
  • Exponentials: n^d = O(r^n) for every d > 0 and every r > 1, i.e., every polynomial grows more slowly than any exponential

Implementing the Stable Matching Algorithm Using Lists and Arrays

We do not need to actually write a program for every algorithm, but we do need to decide how the data are to be represented and manipulated, so that we can bound the number of computational steps. For example, we need to think about how to represent the men and women and their preference rankings in the Stable Matching algorithm.

Arrays and Lists

Arrays
  • Find the ith element: O(1)
  • Find a given element: O(n)
  • Find a given element (sorted array, via binary search): O(log n)
Doubly Linked List (sketch below)
  • Deletion: O(1)
  • Insertion: O(1)
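
A minimal Python sketch (mine, not the book's) of why insertion and deletion are O(1) in a doubly linked list, assuming we already hold a reference to the node in question:

  class Node:
      def __init__(self, value):
          self.value = value
          self.prev = None
          self.next = None

  def delete(node):
      # Unlink the node by rewiring its two neighbours: a constant amount of work.
      if node.prev is not None:
          node.prev.next = node.next
      if node.next is not None:
          node.next.prev = node.prev
      node.prev = node.next = None

  def insert_after(node, new_node):
      # Splice new_node in immediately after node: again a constant amount of work.
      new_node.prev = node
      new_node.next = node.next
      if node.next is not None:
          node.next.prev = new_node
      node.next = new_node

(Finding the node to delete, or the position to insert at, is a separate cost; only the pointer surgery itself is constant time.)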

Implementation of the Stable Matching Algorithm

  • The free men are kept in a linked list.
  • For each man, the women he will propose to, in order of his preference, are stored in an array, along with a pointer to the next woman on his list.
  • Each woman's current partner is stored in an array.
  • Each woman's preferences are stored in an array, along with a Ranking array so that she can compare two proposers in constant time.

With this, we would be able to run the algorithm in O(n^2) time.
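
A minimal Python sketch of the Gale-Shapley algorithm using these structures (the names man_pref, woman_pref, woman_rank, next_proposal, and current are mine, not the book's; plain Python lists stand in for the arrays):

  from collections import deque

  def stable_matching(man_pref, woman_pref):
      """man_pref[m] and woman_pref[w] are preference lists over indices 0..n-1."""
      n = len(man_pref)
      # woman_rank[w][m] = position of man m in woman w's list, so comparisons are O(1).
      woman_rank = [[0] * n for _ in range(n)]
      for w in range(n):
          for rank, m in enumerate(woman_pref[w]):
              woman_rank[w][m] = rank
      next_proposal = [0] * n      # index of the next woman each man will propose to
      current = [None] * n         # current[w] = the man w is engaged to, or None
      free_men = deque(range(n))   # the list of free men
      while free_men:              # at most n^2 iterations: each man proposes at most n times
          m = free_men[0]
          w = man_pref[m][next_proposal[m]]
          next_proposal[m] += 1
          if current[w] is None:                               # w is free: engage
              current[w] = m
              free_men.popleft()
          elif woman_rank[w][m] < woman_rank[w][current[w]]:   # w trades up to m
              free_men.popleft()
              free_men.append(current[w])
              current[w] = m
          # otherwise w rejects m and he stays at the front of the free list
      return {current[w]: w for w in range(n)}                 # man -> woman

For example, stable_matching([[0, 1], [0, 1]], [[0, 1], [0, 1]]) returns {0: 0, 1: 1}.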

A Survey of Common Running Times

Linear

  • Computing the maximum (Algorithm on page 48)
  • Merging two sorted lists (Algorithm on page 49)

An O(n) running time basically means doing a constant amount of work for each element of the data set, i.e., a single pass through the input.
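
A minimal sketch of the merge step mentioned above (my code, not the book's pseudocode): two pointers walk the lists once, so the work is linear in the total length:

  def merge(a, b):
      """Merge two sorted lists a and b into one sorted list in O(len(a) + len(b)) time."""
      result = []
      i = j = 0
      while i < len(a) and j < len(b):
          if a[i] <= b[j]:              # take the smaller front element
              result.append(a[i])
              i += 1
          else:
              result.append(b[j])
              j += 1
      result.extend(a[i:])              # one of these two is empty; append the leftovers
      result.extend(b[j:])
      return result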

O(n log n) Time

O(n log n) time typically arises from algorithms that split the data set into halves, solve each half recursively, and merge the solutions back together in linear time; Mergesort is the standard example.
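
A minimal Mergesort sketch following that pattern (using Python's built-in heapq.merge for the linear-time combine step):

  import heapq

  def merge_sort(data):
      """Split in half, sort each half recursively, merge in linear time: O(n log n)."""
      if len(data) <= 1:
          return list(data)
      mid = len(data) // 2
      left = merge_sort(data[:mid])
      right = merge_sort(data[mid:])
      return list(heapq.merge(left, right))   # combine the two sorted halves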

Quadratic Time

  • Finding the closest pair among a set of points in the plane, by computing the distance between every pair of points (Algorithm on page 52)

Quadratic time typically comes from a pair of nested loops: for each element of the data set, we scan over all the remaining elements.
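
A sketch of the brute-force closest-pair computation (my code; the book gives pseudocode): the two nested loops over the points are what make it O(n^2):

  from math import dist   # Euclidean distance, available since Python 3.8

  def closest_pair(points):
      """Return the closest pair among a list of (x, y) points by brute force: O(n^2)."""
      best_distance = None
      best_pair = None
      for i in range(len(points)):
          for j in range(i + 1, len(points)):      # every pair is examined exactly once
              d = dist(points[i], points[j])
              if best_distance is None or d < best_distance:
                  best_distance, best_pair = d, (points[i], points[j])
      return best_pair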

Cubic Time

  • Given a collection of subsets of a set, checking whether some pair of them is disjoint (Algorithm on page 52)

For each pair of sets, we need to check whether any element of one set also belongs to the other; the three nested loops give the cubic running time.
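
A sketch of that check (my code; I assume the sets are given as lists of elements, and convert each to a Python set so the innermost membership test is O(1), as the book's array representation allows):

  def has_disjoint_pair(sets):
      """Return True if some pair of the given collections has no element in common: O(n^3)."""
      for i in range(len(sets)):
          for j in range(i + 1, len(sets)):
              members_j = set(sets[j])     # O(1) membership tests inside the third loop
              disjoint = True
              for x in sets[i]:            # the third nested loop
                  if x in members_j:
                      disjoint = False
                      break
              if disjoint:
                  return True
      return False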

O(n^k) Time

We search over all subsets of size k of an n-element set, in the same spirit as the quadratic- and cubic-time examples; there are O(n^k) such subsets, and we spend polynomial time checking each one, e.g., testing whether a k-node subset of a graph is an independent set (Algorithm on page 53).
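
A sketch of the size-k independent set check (my code; the graph is assumed to be given as a list of nodes and a collection of two-element edges):

  from itertools import combinations

  def has_independent_set_of_size(nodes, edges, k):
      """Check every k-node subset: roughly n^k / k! subsets, O(k^2) edge tests for each."""
      edge_set = {frozenset(e) for e in edges}
      for subset in combinations(nodes, k):              # all subsets of size k
          if all(frozenset(pair) not in edge_set         # no edge joins two chosen nodes
                 for pair in combinations(subset, 2)):
              return True
      return False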

Beyond Polynomial Time

  • Finding an independent set of maximum size in a graph, by trying all 2^n subsets of nodes (Algorithm on page 54)
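
A brute-force sketch of the maximum version (my code, with the same assumed edge representation as the previous sketch); trying all 2^n subsets of nodes is what pushes this beyond polynomial time:

  from itertools import combinations

  def maximum_independent_set(nodes, edges):
      """Try subsets from the largest size downward and return the first independent one."""
      edge_set = {frozenset(e) for e in edges}
      for k in range(len(nodes), 0, -1):
          for subset in combinations(nodes, k):          # up to 2^n subsets in total
              if all(frozenset(pair) not in edge_set
                     for pair in combinations(subset, 2)):
                  return subset                          # first hit at this size is maximum
      return ()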

Sublinear Time

  • Binary search: O(log n) Time
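
A standard binary search sketch on a sorted array: each probe halves the remaining range, so only O(log n) probes are needed:

  def binary_search(sorted_array, target):
      """Return the index of target in sorted_array, or -1 if it is absent: O(log n)."""
      lo, hi = 0, len(sorted_array) - 1
      while lo <= hi:
          mid = (lo + hi) // 2
          if sorted_array[mid] == target:
              return mid
          elif sorted_array[mid] < target:
              lo = mid + 1          # target can only be in the upper half
          else:
              hi = mid - 1          # target can only be in the lower half
      return -1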

A More Complex Data Structure: Priority Queues

Priority Queues are one of the most broadly useful nontrivial data structures, and they will be helpful later for describing graph algorithms. The set of free men in the Stable Matching Algorithm could be kept in one: once a man is engaged he is removed from the queue, and if he becomes single again he is added back, with each such operation taking O(log n) time in a heap-based implementation. A priority queue can also be used to sort a list of n numbers in O(n log n) time (Proof on page 58).

Heap: To represent a Priority Queue

A heap is a balanced binary tree in which each node's key is less than or equal to the keys of its children (for a min-heap, i.e., ascending order of numbers). It can be stored in an array: the first element is the root, and for the node at position i the left and right children are at positions 2i and 2i+1 respectively, where “i” is the node's (1-based) position in the array.

When adding an element, it is placed in the first empty position of the heap (the next leaf). If it is smaller than its parent, it is swapped with its parent, and this is repeated up the tree until the heap order is restored. This procedure is called Heapify-up (Algorithm on page 61. Proof on page 62).
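
A sketch of Heapify-up on a 0-indexed Python list (the book's pseudocode is 1-indexed, so here the parent of position i sits at (i - 1) // 2):

  def heapify_up(heap, i):
      """Move the element at index i up until its parent is no larger (min-heap)."""
      while i > 0:
          parent = (i - 1) // 2
          if heap[i] < heap[parent]:
              heap[i], heap[parent] = heap[parent], heap[i]   # swap with the parent
              i = parent
          else:
              break

  def heap_insert(heap, value):
      heap.append(value)                  # add at the first empty position (the next leaf)
      heapify_up(heap, len(heap) - 1)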

When deleting an element, the last element of the heap is moved into the vacated position; the moved element may then need to travel up or down the tree as required, so a Heapify-down procedure is also needed (Algorithm on page 63. Proof on page 64).
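
A matching sketch of Heapify-down and deletion on a 0-indexed list (so the children of position i sit at 2i + 1 and 2i + 2); heap_delete reuses heapify_up from the sketch above:

  def heapify_down(heap, i):
      """Move the element at index i down until neither child is smaller (min-heap)."""
      n = len(heap)
      while True:
          left, right, smallest = 2 * i + 1, 2 * i + 2, i
          if left < n and heap[left] < heap[smallest]:
              smallest = left
          if right < n and heap[right] < heap[smallest]:
              smallest = right
          if smallest == i:
              break
          heap[i], heap[smallest] = heap[smallest], heap[i]
          i = smallest

  def heap_delete(heap, i):
      heap[i] = heap[-1]                  # move the last element into the vacated position
      heap.pop()
      if i < len(heap):
          heapify_up(heap, i)             # the moved element may need to go up ...
          heapify_down(heap, i)           # ... or down, depending on its value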

Implementing Priority Queues with Heaps

  • StartHeap(N): O(N) time to initialize the array to hold the heap
  • Insert(H, v): Takes O(log n) time at the most
  • FindMin(H): O(1) time, since the minimum is the first element in the heap
  • Delete(H, i): O(log n) time at the most, since after the last element is moved into position i it only needs one pass of Heapify-up or Heapify-down
  • ExtractMin(H): removes the minimum (the root) and restores the heap order, taking O(log n) time
  • Delete(H, v): O(log n) time at the most, provided an extra Position array is kept so that element v can be located in O(1) time
  • ChangeKey(H, v, a): O(log n) time, since we first locate element v (via the Position array), change its key to a, and then run Heapify-up or Heapify-down as required
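
For comparison, Python's built-in heapq module provides some of the same operations on a plain list (it has no ChangeKey or delete-by-position, so this only covers part of the table above):

  import heapq

  h = []
  for key in [5, 1, 4, 2, 3]:
      heapq.heappush(h, key)     # Insert: O(log n) per element
  print(h[0])                    # FindMin: O(1), the minimum sits at index 0
  print(heapq.heappop(h))        # ExtractMin: O(log n)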

Questions and Comments

This chapter was easily readable and the class discussions made it very clear, so I have no questions on this chapter. Also, since most of the material in this chapter was already covered in CSCI 112, it wasn't anything new; maybe that was why I thought it wasn't “interesting” and was easy to understand.
