Chapter 2

This chapter explains how to analyze algorithms by measuring how their resource requirements scale with increasing input size.

Section 1: Computational Tractability

We want to find efficient algorithms, but what does "efficient" mean? We need to be more specific than just saying "it runs fast" or "it's better than brute-force search," so we will call an algorithm efficient if it has a polynomial worst-case running time. This means that for an input of size N, the worst-case running time is bounded by cN^d, where c and d are positive constants.
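One useful consequence of a polynomial bound cN^d is that doubling the input size multiplies the running time by only a constant factor, about 2^d. A small sketch of that idea, using a hypothetical quadratic workload (not an algorithm from the chapter):

```python
# Empirical check of polynomial scaling: for a running time of roughly
# c * N**d, doubling N should scale the time by about 2**d.
# quadratic_work is a hypothetical O(n^2) stand-in workload.
import time


def quadratic_work(n):
    """O(n^2) stand-in: count the pairs (i, j) with i < j < n."""
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1
    return count


def timed(f, n):
    start = time.perf_counter()
    f(n)
    return time.perf_counter() - start


t1 = timed(quadratic_work, 1000)
t2 = timed(quadratic_work, 2000)
print(f"doubling n scaled the time by about {t2 / t1:.1f}x")  # near 4 for d = 2
```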

Readability: 8

Section 2: Asymptotic Order of Growth

This section explains how to express the growth rate of running times.

Definitions of O, Ω, and Θ

Upper bounds: T(n) is O(f(n)) if T is bounded above by a constant multiple of f, that is, T(n) ≤ c·f(n) for some constant c > 0 and all sufficiently large n.

Lower bounds: T(n) is Ω(f(n)) if T is bounded below by a constant multiple of f, that is, T(n) ≥ ε·f(n) for some constant ε > 0 and all sufficiently large n.

Tight bounds: T(n) is Θ(f(n)) if T is O(f(n)) and Ω(f(n)).
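The three definitions can be checked numerically for a concrete function. As an illustration (the function and the witness constants below are chosen here, not taken from the text), T(n) = 2n² + 3n is O(n²) with c = 3 once n ≥ 3, and Ω(n²) with ε = 2 for all n, so it is Θ(n²):

```python
# Numerically verifying the O / Omega witnesses for T(n) = 2n^2 + 3n,
# an example function chosen for illustration.

def T(n):
    return 2 * n**2 + 3 * n


def f(n):
    return n**2


# Upper bound: T(n) <= 3 * f(n) for all n >= 3, since 3n <= n^2 once n >= 3.
assert all(T(n) <= 3 * f(n) for n in range(3, 10_000))

# Lower bound: T(n) >= 2 * f(n) for every n >= 0.
assert all(T(n) >= 2 * f(n) for n in range(0, 10_000))

print("T(n) = 2n^2 + 3n is Theta(n^2)")
```

A finite check like this is not a proof, but it is a quick way to sanity-check candidate constants before writing the inequality argument.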

Properties of O, Ω, and Θ

Transitivity: If f is O(g) and g is O(h), then f is O(h). This property also holds for lower and tight bounds.

Sums of Functions: If k is a constant, f1, f2, …, fk, h are functions, and each fi = O(h), then f1 + f2 + … + fk = O(h).
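The reason the sum rule holds fits in one line: take the largest of the individual thresholds as a common n0, and suppose fi(n) ≤ ci·h(n) for all n ≥ n0. Then

```latex
f_1(n) + f_2(n) + \cdots + f_k(n)
  \le c_1 h(n) + c_2 h(n) + \cdots + c_k h(n)
  = (c_1 + c_2 + \cdots + c_k)\, h(n),
```

so the sum is O(h) with witness constant c = c1 + c2 + … + ck.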

Common Functions

Polynomial, logarithmic, and exponential functions are commonly seen. A polynomial's rate of growth is determined by its highest-order term. For every x > 0, log n is eventually bounded above by n^x. Every exponential function grows faster than every polynomial.

Readability: 7

Section 3: Implementing the Stable Matching Algorithm

Section 4: A Survey of Common Running Times

Section 5: Priority Queues

courses/cs211/winter2012/journals/virginia/chapter2.1327354373.txt.gz · Last modified: 2012/01/23 21:32 by lovellv