===Asymptotic Upper Bounds===

  *Let T(//n//) be a function (let's assume it is the worst-case running time of a certain algorithm on an input of size //n//).
  *Given another function //f(n)//, we say that T(//n//) is O(//f(n)//) (T(//n//) is order //f(n)//) if for sufficiently large //n//, the function T(//n//) is bounded above by a constant multiple of //f(n)//. We can write this relation as T(//n//) = O(//f(n)//). To be precise, T(//n//) is O(//f(n)//) if there exist constants //c// > 0 and //n<sub>0</sub>// ≥ 0 such that for all //n// ≥ //n<sub>0</sub>//, we have T(//n//) ≤ //c// · //f(n)//. The constant //c// is fixed and does not depend on //n//. In this case, we say that T is //asymptotically upper-bounded by f//. To find the upper bound //f//, we inflate the terms in the equation of T(//n//).

For instance, if T(n) = pn<sup>2</sup> + qn + r where p, q, r are positive constants,\\
For all n ≥ 1,\\
T(n) = pn<sup>2</sup> + qn + r\\
=> T(n) ≤ pn<sup>2</sup> + qn<sup>2</sup> + rn<sup>2</sup>\\
pn<sup>2</sup> + qn<sup>2</sup> + rn<sup>2</sup> = (p+q+r)n<sup>2</sup> = cn<sup>2</sup>\\
=> T(n) ≤ cn<sup>2</sup>, where c = p+q+r\\
Thus, T(n) = O(n<sup>2</sup>).

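As a quick sanity check on this derivation, the bound T(n) ≤ cn<sup>2</sup> with c = p+q+r can be tested numerically. This is only a sketch; the values of p, q, r below are arbitrary illustrative choices, not from the text:

```python
# Numerical sanity check of T(n) <= c*n^2 with c = p + q + r.
# p, q, r are arbitrary positive constants chosen for illustration.
p, q, r = 3, 5, 7
c = p + q + r  # the constant from the derivation

def T(n):
    return p * n**2 + q * n + r

# The bound is claimed for all n >= n0 = 1.
assert all(T(n) <= c * n**2 for n in range(1, 1000))
print("T(n) <= c*n^2 holds for n = 1..999 with c =", c)
```

Of course, a finite check proves nothing by itself; the algebra above is what shows the bound holds for all n ≥ 1.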
===Asymptotic Lower Bounds===

  *Let T(//n//) be a function (let's assume it is the worst-case running time of a certain algorithm on an input of size //n//).
  *To show that an upper bound we found is the best one possible, we need to express the notion that for arbitrarily large input sizes //n//, the function T(//n//) is at least a constant multiple of some specific function //f(n)//. Thus we write T(//n//) = Ω(//f(n)//) (T(//n//) is Ω(//f(n)//)) if there exist constants ε > 0 and n<sub>0</sub> ≥ 0 such that for all n ≥ n<sub>0</sub>, we have T(n) ≥ ε · //f(n)//. In this case, T is //asymptotically lower-bounded by f//. The constant ε is fixed, and is independent of //n//. This time, we deflate the terms of the polynomial T instead of inflating them to find the lower bound.

For instance, if T(n) = pn<sup>2</sup> + qn + r where p, q, r are positive constants,\\

For all n ≥ 0,\\
T(n) = pn<sup>2</sup> + qn + r ≥ pn<sup>2</sup>\\
=> T(n) ≥ εn<sup>2</sup>, where ε = p > 0\\
=> T(n) = Ω(n<sup>2</sup>)\\

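The lower bound can be checked numerically the same way; the constants p, q, r below are again arbitrary illustrative choices:

```python
# Numerical sanity check of T(n) >= eps*n^2 with eps = p.
# p, q, r are arbitrary positive constants chosen for illustration.
p, q, r = 3, 5, 7
eps = p  # the constant from the derivation

def T(n):
    return p * n**2 + q * n + r

# The bound is claimed for all n >= 0, since q*n + r >= 0 here.
assert all(T(n) >= eps * n**2 for n in range(0, 1000))
print("T(n) >= eps*n^2 holds for n = 0..999 with eps =", eps)
```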
===Asymptotic Tight Bounds===

  * If we can show that a running time T(n) is both O(//f(n)//) and Ω(//f(n)//), then we have found the //right// bound: T(//n//) grows exactly like //f(n)// to within a constant factor.
  * If T(//n//) is both O(//f(n)//) and Ω(//f(n)//), we say that T(//n//) is Θ(//f(n)//). In this case, we say that //f(n)// is an //asymptotically tight bound// for T(n).
  * If we let //f// and //g// be two functions such that the limit as n tends to infinity of //f(n)// / //g(n)// exists and is equal to some number c > 0, then //f(n)// = Θ(//g(n)//).

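The limit criterion in the last bullet can be illustrated numerically: if the ratio f(n)/g(n) settles toward a positive constant, that suggests f = Θ(g). The functions f and g below are hypothetical examples chosen so the ratio converges to c = 3:

```python
# Estimating lim f(n)/g(n) numerically; convergence to a positive
# constant suggests f = Theta(g).
def f(n):
    return 3 * n**2 + 5 * n + 7  # illustrative example function

def g(n):
    return n**2

for n in (10, 100, 10**4, 10**6):
    print(n, f(n) / g(n))  # ratios approach c = 3 > 0
```

Here f(n)/g(n) = 3 + 5/n + 7/n², so the limit is exactly 3, matching the tight bound f(n) = Θ(n²) from the earlier derivations.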
=== Properties of Asymptotic Growth Rates===

  * Transitivity:
      *If f = O(g) and g = O(h) then f = O(h)
      *If f = Ω(g) and g = Ω(h) then f = Ω(h)
      *If f = Θ(g) and g = Θ(h) then f = Θ(h)
  *Sums of functions:
      * If f = O(h) and g = O(h) then f + g = O(h)
      * If f = Ω(h) and g = Ω(h) then f + g = Ω(h)
      * If f = Θ(h) and g = Θ(h) then f + g = Θ(h)
      * Let //k// be a fixed constant and let //f<sub>1</sub>, f<sub>2</sub>, ..., f<sub>k</sub>// and //h// be functions such that f<sub>i</sub> = O(//h//) for all //i//.\\ Then f<sub>1</sub> + f<sub>2</sub> + ... + f<sub>k</sub> = O(//h//).\\

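The sums-of-functions property can be sketched concretely: if each f<sub>i</sub>(n) ≤ c<sub>i</sub> · h(n), then their sum is bounded by (c<sub>1</sub>+...+c<sub>k</sub>) · h(n). The functions and constants below are hypothetical examples, each O(n<sup>2</sup>):

```python
# If each f_i(n) <= c_i * h(n) for n >= 1, then
# sum_i f_i(n) <= (sum_i c_i) * h(n), so the sum is also O(h).
# The functions and constants below are illustrative assumptions.
def h(n):
    return n**2

fs = [lambda n: 2 * n**2,   # O(n^2) with c1 = 2
      lambda n: n**2 + n,   # O(n^2) with c2 = 2 (for n >= 1)
      lambda n: 5 * n]      # O(n^2) with c3 = 5 (for n >= 1)
c_total = 2 + 2 + 5

assert all(sum(f(n) for f in fs) <= c_total * h(n) for n in range(1, 1000))
print("f1 + f2 + f3 = O(n^2) with constant", c_total)
```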
===Asymptotic Bounds for some Common Functions ===

  * __Polynomials__: For a polynomial T(n) = a<sub>0</sub> + a<sub>1</sub>n + … + a<sub>d</sub>n<sup>d</sup> of degree //d//, in which the coefficient a<sub>d</sub> is positive, T(n) = O(n<sup>d</sup>).
  * __Logarithms__: log<sub>b</sub> n is the number //x// such that //b//<sup>//x//</sup> = //n//. If we round log<sub>b</sub> n to the nearest integer, it is one less than the number of digits in the base-b representation of the number //n//.
      * For every //b// > 1 and every //x// > 0,\\ log<sub>b</sub> n = O(n<sup>//x//</sup>).
  * __Exponentials__: Let //f(n)// = r<sup>n</sup>. For every //r// > 1 and every //d// > 0, n<sup>d</sup> = O(r<sup>n</sup>).

These three families of functions are the ones encountered most frequently when analyzing running times. Logarithms grow more slowly than polynomials, and polynomials grow more slowly than exponentials.\\

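That ordering can be seen numerically; the base b = 2, degree d = 3, and ratio r = 1.5 below are arbitrary illustrative choices:

```python
import math

# Comparing growth rates: log_2(n) vs n^3 vs 1.5^n for increasing n.
# Base, degree, and ratio are arbitrary illustrative choices.
for n in (10, 100, 1000):
    print(f"n={n}: log2(n)={math.log2(n):.1f}, n^3={n**3:.3g}, 1.5^n={1.5**n:.3g}")

# For large enough n the ordering log < polynomial < exponential holds.
assert math.log2(1000) < 1000**3 < 1.5**1000
```

Note that the ordering only kicks in for large n: at n = 10, for example, n<sup>3</sup> = 1000 is still bigger than 1.5<sup>10</sup> ≈ 57.7. Asymptotic statements are about sufficiently large inputs.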
There are several motivations for studying bounds on the running times of algorithms. Knowing the running time of an algorithm, one can refine the algorithm as much as he/she can. Running times also serve as a measure of comparison when deciding which algorithm is more efficient for a given problem. Thus a programmer should always aim for the most efficient algorithm he/she can possibly design, and, thanks to the knowledge of running times, be convinced it is more efficient than the one he/she had previously implemented.\\

I don't have a specific question about this section, as most of the material makes sense. After rereading this section and seeing how it was presented in class, I can now see the importance of tight bounds. Tight bounds are a new concept for me when it comes to running times of algorithms, but I was able to see why they are useful in analyzing an algorithm: they give more precise limits on the running time. I would like to remember the properties of the running times of algorithms; they look very useful and I'm convinced they can help a lot in analyzing running times of algorithms.\\

This section was interesting too; I give it a 9/10.