===Asymptotic Upper Bounds===
  * Let T(//n//) be a function (let's say, the worst-case running time of a certain algorithm on an input of size //n//).
  * Given another function //f(n)//, we say that T(//n//) is O(//f(n)//) (T(//n//) is order //f(n)//) if, for sufficiently large //n//, the function T(//n//) is bounded above by a constant multiple of //f(n)//. We write this relation as T(//n//) = O(//f(n)//). More precisely, T(//n//) is O(//f(n)//) if there exist constants c > 0 and n<sub>0</sub> ≥ 0 such that T(n) ≤ c·f(n) for all n ≥ n<sub>0</sub>; in this case, //f// is an asymptotic upper bound for T.

For instance, if T(n) = pn<sup>2</sup> + qn + r, where p, q and r are positive constants, then:\\
For all n ≥ 1,\\
T(n) = pn<sup>2</sup> + qn + r ≤ pn<sup>2</sup> + qn<sup>2</sup> + rn<sup>2</sup> = (p + q + r)n<sup>2</sup>\\
=> T(n) ≤ cn<sup>2</sup>, where c = p + q + r\\
Thus, T(n) = O(n<sup>2</sup>).
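As a quick sanity check (my own addition, not from the text), the following minimal Python sketch verifies the inequality T(n) ≤ cn<sup>2</sup> numerically; the sample values of p, q and r are arbitrary choices for illustration.

<code python>
# Numeric sanity check for the upper-bound derivation above.
# T(n) = p*n^2 + q*n + r and c = p + q + r come from the derivation;
# the sample constants p, q, r are arbitrary illustrative choices.
p, q, r = 3.0, 5.0, 2.0
c = p + q + r  # the constant produced by the derivation

def T(n):
    return p * n**2 + q * n + r

# The derivation claims T(n) <= c*n^2 for all n >= 1 (i.e., n0 = 1).
for n in [1, 10, 100, 1000, 10**6]:
    assert T(n) <= c * n**2, f"bound fails at n={n}"
print("T(n) <= c*n^2 holds for every sampled n >= 1")
</code>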
===Asymptotic Lower Bounds===

  * To show that an upper bound is the best one possible, we need to express the notion that for arbitrarily large input sizes //n//, the function T(//n//) is at least a constant multiple of some specific function //f(n)//. Thus we write T(//n//) = Ω(//f(n)//) if there exist constants ε > 0 and n<sub>0</sub> ≥ 0 such that T(n) ≥ ε·f(n) for all n ≥ n<sub>0</sub>; in this case, //f// is an asymptotic lower bound for T.

For instance, if T(n) = pn<sup>2</sup> + qn + r, where p, q and r are positive constants, then:\\
For all n ≥ 0,\\
T(n) = pn<sup>2</sup> + qn + r ≥ pn<sup>2</sup>\\
=> T(n) ≥ εn<sup>2</sup>, where ε = p > 0\\
=> T(n) = Ω(n<sup>2</sup>)
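Symmetrically (again my own illustration, with the same arbitrary sample constants), a short Python sketch can check the lower-bound inequality T(n) ≥ εn<sup>2</sup> with ε = p.

<code python>
# Numeric sanity check for the lower-bound derivation above.
# T(n) = p*n^2 + q*n + r as before; eps = p per the derivation.
# The sample constants are arbitrary illustrative choices.
p, q, r = 3.0, 5.0, 2.0
eps = p  # the constant produced by the derivation

def T(n):
    return p * n**2 + q * n + r

# The derivation claims T(n) >= eps*n^2 for all n >= 0 (i.e., n0 = 0).
for n in [0, 1, 10, 100, 10**6]:
    assert T(n) >= eps * n**2, f"bound fails at n={n}"
print("T(n) >= eps*n^2 holds for every sampled n >= 0")
</code>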
===Asymptotic Tight Bounds===

  * If we can show that a running time T(//n//) is both O(//f(n)//) and Ω(//f(n)//), then in a natural sense we have found the "right" bound: //f(n)// is an asymptotically tight bound for T(//n//).
  * If a function T(//n//) is both O(//f(n)//) and Ω(//f(n)//), we say that T(//n//) is Θ(//f(n)//).
  * If we let //f// and //g// be two functions such that the limit as n tends to infinity of //f(n)// / //g(n)// exists and is equal to some number c > 0, then //f(n)// = Θ(//g(n)//) (see the sketch below).
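The limit criterion in the last bullet can be watched numerically. In this sketch (my own illustration, not from the text), f(n) = 3n<sup>2</sup> + 5n + 2 and g(n) = n<sup>2</sup>, so the ratio f(n)/g(n) should approach c = 3.

<code python>
# Watch f(n)/g(n) approach a positive constant c; by the limit
# criterion this implies f(n) = Theta(g(n)).
# f and g are illustrative choices, not from the text.
def f(n):
    return 3 * n**2 + 5 * n + 2

def g(n):
    return n**2

for n in [10, 100, 1000, 10**5]:
    print(n, f(n) / g(n))  # the ratio tends to c = 3 > 0
</code>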
===Properties of Asymptotic Growth Rates===

  * Transitivity:
    * If f = O(g) and g = O(h), then f = O(h)
    * If f = Ω(g) and g = Ω(h), then f = Ω(h)
    * If f = Θ(g) and g = Θ(h), then f = Θ(h)
  * Sums of functions (illustrated in the sketch below):
    * If f = O(h) and g = O(h), then f + g = O(h)
    * If f = Ω(h) and g = Ω(h), then f + g = Ω(h)
    * If f = Θ(h) and g = Θ(h), then f + g = Θ(h)
    * Let k be a fixed constant, and let f<sub>1</sub>, f<sub>2</sub>, ..., f<sub>k</sub> and h be functions such that f<sub>i</sub> = O(h) for all i. Then f<sub>1</sub> + f<sub>2</sub> + ... + f<sub>k</sub> = O(h)
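To make the sums-of-functions property concrete, here is a small sketch (my own illustration; the concrete functions and constants are arbitrary) checking that if f ≤ c<sub>1</sub>·h and g ≤ c<sub>2</sub>·h, then f + g ≤ (c<sub>1</sub> + c<sub>2</sub>)·h, which is exactly why f + g = O(h).

<code python>
# Illustration of the sums-of-functions property: if f <= c1*h and
# g <= c2*h for large n, then f + g <= (c1 + c2)*h, so f + g = O(h).
# The concrete functions and constants are illustrative choices.
def h(n):
    return n**2

def f(n):
    return 2 * n**2 + n  # f = O(h) with c1 = 3, valid for n >= 1

def g(n):
    return 5 * n**2      # g = O(h) with c2 = 5

c1, c2 = 3, 5
for n in [1, 10, 1000]:
    assert f(n) <= c1 * h(n) and g(n) <= c2 * h(n)
    assert f(n) + g(n) <= (c1 + c2) * h(n)
print("f + g stays below (c1 + c2)*h, as the property predicts")
</code>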
===Asymptotic Bounds for Some Common Functions===

  * __Polynomials__: A polynomial of degree //d// is O(n<sup>d</sup>). An algorithm has a polynomial running time if its running time T(//n//) is O(n<sup>d</sup>) for some constant //d//.
  * __Logarithms__: log<sub>b</sub> //n// is the number //x// such that b<sup>x</sup> = //n//. Logarithms grow more slowly than every polynomial: for every //b// > 1 and every //x// > 0,\\ log<sub>b</sub> n = O(n<sup>x</sup>)
  * __Exponentials__: Let //f(n)// = r<sup>n</sup>. Every polynomial is bounded by an exponential: for every //r// > 1 and every //d// > 0,\\ n<sup>d</sup> = O(r<sup>n</sup>)
These three functions are the ones most frequently encountered when analyzing running times. Logarithms grow more slowly than polynomials, and polynomials grow more slowly than exponentials.\\
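This growth ordering can be observed numerically. The sketch below (my own illustration, with arbitrary parameter choices b = 2, x = 0.5, d = 3, r = 2) prints ratios that shrink toward 0 as n grows.

<code python>
import math

# Ratios illustrating the growth ordering: log_b n grows more slowly
# than any polynomial n^x, and n^d more slowly than any exponential r^n.
# The parameters b, x, d, r are arbitrary illustrative choices.
b, x, d, r = 2, 0.5, 3, 2

for n in [10, 100, 1000]:
    log_over_poly = math.log(n, b) / n**x  # tends to 0: log_b n = O(n^x)
    poly_over_exp = n**d / r**n            # tends to 0: n^d = O(r^n)
    print(f"n={n}: log/poly = {log_over_poly:.4f}, poly/exp = {poly_over_exp:.3e}")
</code>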
There are several motivations for studying bounds on the running times of algorithms. Knowing an algorithm's running time lets one refine the algorithm as much as he/she can. Running times also serve as a measure of comparison when deciding which algorithm is more efficient for a given problem. Thus a programmer should always aim for the most efficient algorithm he/she can possibly design, and, thanks to this knowledge of running times, be able to convince himself/herself that it is more efficient than the one previously implemented.\\
I don't have a specific question about this section, as most of the material makes sense. After rereading this section and seeing how it was presented in class, I can now see the importance of tight bounds. Tight bounds are a new concept for me when it comes to running times of algorithms.\\
This section was interesting too; I give it a 9/10.