  * If we can show that a running time T(//n//) is both O(//f(n)//) and Ω(//f(n)//), then we have found the //right// bound: T(//n//) grows exactly like //f(n)// to within a constant factor.
  * If a function T(//n//) is both O(//f(n)//) and Ω(//f(n)//), we say that T(//n//) is Θ(//f(n)//). In this case, we say that //f(n)// is an //asymptotically tight bound// for T(//n//).
  * If we let //f// and //g// be two functions such that the limit as //n// tends to infinity of //f(n)// / //g(n)// exists and is equal to some number c > 0, then //f(n)// = Θ(//g(n)//).
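A quick numerical sketch of this limit criterion, with f(n) = 3n<sup>2</sup> + 5n and g(n) = n<sup>2</sup> chosen arbitrarily as an illustration:

```python
# Limit test for Theta: if f(n)/g(n) -> c > 0 as n grows,
# then f(n) = Theta(g(n)). Here f(n)/g(n) = 3 + 5/n -> 3.
def f(n):
    return 3 * n**2 + 5 * n

def g(n):
    return n**2

for n in (10, 1_000, 100_000):
    print(n, f(n) / g(n))  # the ratio settles toward the constant 3
```

Since the ratio converges to 3 > 0, the test confirms f(n) = Θ(n<sup>2</sup>).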
  
  
      * If f = Ω(h) and g = Ω(h), then f + g = Ω(h)
      * If f = Θ(h) and g = Θ(h), then f + g = Θ(h)
      * Let //k// be a fixed constant and let f<sub>1</sub>, f<sub>2</sub>, ..., f<sub>k</sub> and //h// be functions such that f<sub>i</sub> = O(//h//) for all //i//. Then f<sub>1</sub> + f<sub>2</sub> + ... + f<sub>k</sub> = O(//h//).\\
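The additivity property can be checked on a small example; here is a sketch with functions I chose arbitrarily (f and g both O(n<sup>2</sup>)):

```python
# Sketch of the sum rule: if f = O(h) and g = O(h), then f + g = O(h).
# With f(n) = 2n^2 and g(n) = 5n^2 + n, the sum 7n^2 + n is bounded
# by c * h(n) for the witness constant c = 8 and all n >= 1.
def f(n):
    return 2 * n**2

def g(n):
    return 5 * n**2 + n

def h(n):
    return n**2

c = 8
assert all(f(n) + g(n) <= c * h(n) for n in range(1, 10_000))
print("f + g = O(h) holds with witness constant c =", c)
```

The constant c is not unique; any c ≥ 8 works here, which is all the definition of O requires.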
  
  
=== Asymptotic Bounds for Some Common Functions ===
  
  
  
  * __Polynomials__: For a polynomial T(//n//) = a<sub>0</sub> + a<sub>1</sub>n + ... + a<sub>d</sub>n<sup>d</sup> of degree //d//, in which the coefficient a<sub>d</sub> is positive, T(//n//) = O(n<sup>d</sup>).
  * __Logarithms__: log<sub>b</sub> n is the number //x// such that //b//<sup>//x//</sup> = //n//. If we round log<sub>b</sub> n to the nearest integer, it is one less than the number of digits in the base-//b// representation of //n//.
      * For every //b// > 1 and every //x// > 0, log<sub>b</sub> n = O(n<sup>//x//</sup>).
  * __Exponentials__: Let //f(n)// = r<sup>n</sup>. For every //r// > 1 and every //d// > 0, n<sup>d</sup> = O(r<sup>n</sup>).
  
These three functions are the ones most frequently encountered when analyzing running times. Logarithms grow more slowly than polynomials, and polynomials grow more slowly than exponentials.\\
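A small numeric illustration of this growth hierarchy (the sample sizes and exponents are my own choices):

```python
import math

# log2(n) vs n^3 vs 2^n at a few sizes: the logarithm lags far behind
# the polynomial, which in turn lags far behind the exponential.
for n in (16, 64, 256):
    digits = len(str(2**n))  # number of decimal digits in 2^n
    print(f"n={n:4d}  log2(n)={math.log2(n):5.1f}  n^3={n**3:10d}  2^n has {digits} digits")
```

Already at n = 256, log<sub>2</sub> n is 8, n<sup>3</sup> is about 1.7 × 10<sup>7</sup>, while 2<sup>n</sup> has 78 decimal digits.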
  
There are several motivations for studying bounds on the running times of algorithms. Knowing the running time of an algorithm, one can refine that algorithm as much as possible. Running times also serve as a measure of comparison when deciding which algorithm is more efficient for a given problem. Thus a programmer should always aim for the most efficient algorithm he/she can design, and knowledge of running times is what lets him/her verify that it really is more efficient than the one previously implemented.\\

I don't have a specific question about this section, as most of the material makes sense. After rereading this section and seeing how it was presented in class, I can now see the importance of tight bounds. Tight bounds are a new concept for me when it comes to running times of algorithms, but I was able to see why they are useful in analysis: they give more precise limits on the running time of an algorithm. I would like to remember the properties of running times; they look very useful, and I'm convinced they can help a lot in analyzing the running times of algorithms.\\

This section was interesting too; I give it a 9/10.
  
  
  