Chapter 2

2.1: Computational Tractability The goal of this course is to write correct and efficient algorithms, and this section goes through how we define efficiency.

Initial attempt at defining efficiency: “An algorithm is efficient if, when implemented, it runs quickly on real input instances.” What works:

  • Good initial intro to what we want. Need to get things done quickly!

What doesn't work:

  • Too vague: it doesn't say where the algorithm must run quickly or how well it must do
    • Will it be fast on more than just small cases?
    • What are real input instances?

Worst-Case Running Times and Brute-Force Search: “An algorithm is efficient if it achieves qualitatively better worst-case performance, at an analytical level, than brute-force search” Why it works:

  • brute-force search just tries every option, so improving on it is the real goal (see the sketch after these lists)
  • doing better than brute force shows we have found intrinsic structure in the problem and learned something about its computational tractability

Why it doesn't work:

  • qualitatively better performance is too vague
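
A minimal Python sketch (my own illustration, not from the book) of what “qualitatively better than brute-force” can look like: deciding whether a value appears in a sorted list by scanning everything versus by binary search, which exploits the sorted structure.

  # Hypothetical example: brute force vs. a qualitatively better algorithm.
  def brute_force_contains(sorted_list, target):
      # Brute force: try every option; worst case examines all n elements.
      for x in sorted_list:
          if x == target:
              return True
      return False

  def binary_search_contains(sorted_list, target):
      # Exploits the sorted structure: halve the search range each step,
      # so the worst case is about log2(n) comparisons instead of n.
      lo, hi = 0, len(sorted_list) - 1
      while lo <= hi:
          mid = (lo + hi) // 2
          if sorted_list[mid] == target:
              return True
          elif sorted_list[mid] < target:
              lo = mid + 1
          else:
              hi = mid - 1
      return False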

Polynomial Time as a Definition of Efficiency: “An algorithm is efficient if it has a polynomial running time.” Why it works:

  • very specific
  • makes it easier to prove there is no efficient algorithm

Why it doesn't work:

  • Too specific: a polynomial running time with a huge exponent or enormous constants can still be slow in practice (see the example below)
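
A back-of-the-envelope check (my own numbers, not from the text) of both sides of this definition: n^2 stays tiny while 2^n explodes, which is why polynomial time is a good cutoff, but a “polynomial” like n^20 is already astronomical at n = 1000, which is why the cutoff can feel too rigid.

  # Hypothetical comparison of growth rates in Python.
  for n in (10, 100, 1000):
      # n^2 is manageable, n^20 is "polynomial" but impractical, 2^n is hopeless.
      print(f"n={n}: n^2={n**2}, n^20={n**20:.2g}, 2^n={2**n:.2g}")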

2.2 Asymptotic Order of Growth This section discusses how to measure the order of growth of an algorithm's running time and looks at how to find an upper bound, a lower bound, and a tight bound. Our goal is to express the growth rate of running times of algorithms.

Upper bounds:

  • If T(n) <= c*f(n) for some constant c > 0 and all sufficiently large n, then T is asymptotically bounded above by f.
  • We will denote an upper bound as O. There can be multiple upper bounds.

Lower bound:

  • If T(n) >= ε*f(n) for some constant ε > 0 and all sufficiently large n, then T is asymptotically bounded below by f.
  • We will denote this as Ω

Tight bounds:

  • If T(n) is both O(f(n)) and Ω(f(n)), then we've found an asymptotically tight bound
  • T is bounded tightly by f(n)
  • our tight bound is denoted as Θ
  • another way to find Θ is through the limit: if T(n)/f(n) converges to a positive constant as n grows, then T = Θ(f) (worked example below)
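
A worked example (my own, using the definitions above): take T(n) = 3n^2 + 2n and f(n) = n^2. For n >= 1 we have T(n) <= 3n^2 + 2n^2 = 5n^2, so T = O(n^2); and T(n) >= 3n^2 for all n, so T = Ω(n^2); together, T = Θ(n^2). The limit route agrees: T(n)/f(n) = 3 + 2/n, which approaches the positive constant 3 as n grows.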

Properties of Asymptotic Growth Rates

1. Transitivity: 
  * If f = O(g) and g = O(h) then f = O(h)
  * If f = Ω(g) and g = Ω(h) then f = Ω(h)
  * If f = Θ(g) and g = Θ(h) then f = Θ(h)
2. Sum of Functions
  * Suppose that f and g are two functions such that, for some other function h, we have f = O(h) and g = O(h). Then f + g = O(h)
  * The same holds for the sum of any finite number of functions
  * If g = O(f), then f + g = Θ(f) (worked example below)
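
A quick worked example (my own) of these properties: for transitivity, n = O(n^2) and n^2 = O(n^3), hence n = O(n^3). For sums, take f(n) = n^2 and g(n) = 4n + 1; both are O(n^2), so f + g = O(n^2), and since g = O(f) we in fact get f + g = n^2 + 4n + 1 = Θ(n^2).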

Asymptotic bounds for some common functions

  • polynomials: if f is a polynomial of degree d, then f = O(n^d)
  • logs: log_b(n) = O(n^x) for every x > 0 and every base b > 1, so logarithms grow more slowly than any polynomial
  • Exponentials: for every r > 1 and every d > 0, n^d = O(r^n), so exponentials outgrow every polynomial (numeric check below)
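
A small numeric check in Python (my own, not from the text) of this ordering: even a slowly growing exponential like 1.1^n eventually dominates n^3, which in turn dominates log_2(n).

  import math
  # Hypothetical illustration: logs << polynomials << exponentials
  # (the exponential overtakes n^3 only once n is large enough).
  for n in (10, 100, 1000):
      print(f"n={n}: log2(n)={math.log2(n):.1f}, n^3={n**3}, 1.1^n={1.1**n:.3g}")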

Helpful to have the bounds for common functions, but I am wondering: how will we find them in the future?
