courses:cs211:winter2012:journals:jeanpaul:chaptter2section2 (mugabej, last modified 2012/01/18 02:47)
  * If we can show that a running time T(//n//) is both O(//f(n)//) and also Ω(//f(n)//), then we have found an asymptotically tight bound on T(//n//).
  * If a function is both O(//f(n)//) and also Ω(//f(n)//), we say that T(//n//) is Θ(//f(n)//).
  * If we let //f// and //g// be two functions such that the limit as n tends to infinity of //f(n)// / //g(n)// exists and is equal to some number c > 0, then //f(n)// = Θ(//g(n)//).
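The limit rule above can be checked numerically for a concrete pair of functions. The following sketch is my own illustration (the pair f(n) = 3n² + 2n and g(n) = n² is not from the text): the ratio f(n)/g(n) tends to the constant c = 3, so f(n) = Θ(n²).

```python
# Limit rule: if f(n)/g(n) -> c > 0 as n -> infinity, then f = Theta(g).
# Illustrative pair: f(n) = 3n^2 + 2n and g(n) = n^2, with ratio -> 3.

def f(n):
    return 3 * n**2 + 2 * n

def g(n):
    return n**2

# The ratio settles toward 3 as n grows, so f(n) = Theta(n^2).
for n in [10, 1000, 100000]:
    print(n, f(n) / g(n))
```

At n = 100000 the ratio is already within 0.0001 of 3, matching the claim that the limit exists and is positive.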
  * If f = Ω(h) and g = Ω(h) then f + g = Ω(h)
  * If f = Θ(h) and g = Θ(h) then f + g = Θ(h)
  * Let k be a fixed constant and let f<sub>1</sub>, f<sub>2</sub>, ..., f<sub>k</sub> and h be functions such that f<sub>i</sub> = O(h) for all i. Then f<sub>1</sub> + f<sub>2</sub> + ... + f<sub>k</sub> = O(h).
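A quick sanity check of the sum property, as a sketch of my own (the functions f1, f2, h below are chosen for illustration, not taken from the text): each summand is O(n²) with its own witnessing constant, and the sum is again O(n²) with the constants added together.

```python
# Sum property: if f1 = O(h) and f2 = O(h), then f1 + f2 = O(h).

def h(n):
    return n**2

def f1(n):
    return 4 * n**2 - n   # f1 = O(h): f1(n) <= 4*h(n) for n >= 0

def f2(n):
    return 2 * n**2 + n   # f2 = O(h): f2(n) <= 3*h(n) for n >= 1, since n <= n^2

# f1(n) + f2(n) = 6*n^2 exactly, so the sum is O(h) with constant 6.
assert all(f1(n) + f2(n) <= 6 * h(n) for n in range(1, 10000))
print("sum property verified on n = 1 .. 9999")
```

The same bookkeeping with constants generalizes to k summands, which is exactly the statement in the last bullet above.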
=== Asymptotic Bounds for some Common Functions ===
  * __Polynomials__: Let //f// be a polynomial of degree //d// in which the coefficient of the highest-order term is positive. Then //f// = O(n<sup>d</sup>).
  * __Logarithms__: log<sub>b</sub> n is the number //x// such that b<sup>x</sup> = n.
    * For every //b// > 1 and every //x// > 0,\\ log<sub>b</sub> n = O(n<sup>x</sup>).
  * __Exponentials__: Let //r// > 1 and //d// > 0. Then n<sup>d</sup> = O(r<sup>n</sup>).
These three functions are the ones most frequently encountered when analyzing running times. Logarithms grow more slowly than polynomials, and polynomials grow more slowly than exponentials.\\
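The ordering logarithm < polynomial < exponential can be seen concretely by tabulating one function of each kind. This is a small sketch of my own; the specific choices log<sub>2</sub>n, n<sup>3</sup>, and 2<sup>n</sup> are illustrative:

```python
import math

# Compare the growth of a logarithm, a polynomial, and an exponential.
# For each n below, log2(n) < n**3 < 2**n, matching the ordering in the text.
for n in [10, 20, 40]:
    print(n, math.log2(n), n**3, 2**n)

# Already at n = 40, the exponential (about 1.1e12) dwarfs the polynomial (64000).
assert math.log2(40) < 40**3 < 2**40
```

For a small polynomial like n³ against 2ⁿ the crossover happens early; for larger degrees the exponential still wins, just at a larger n.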
There are several motivations for studying bounds on the running times of algorithms. Knowing an algorithm's running time lets one refine the algorithm as much as possible, and running times serve as a measure of comparison when deciding which algorithm is more efficient for a given problem. A programmer should therefore aim for the most efficient algorithm he/she can design, and knowledge of running times makes it possible to verify that it really is more efficient than the one previously implemented.\\
+ | |||
I don't have a specific question about this section, as most of the material makes sense. After rereading this section and seeing how it was presented in class, I can now see the importance of tight bounds. Tight bounds are a new concept for me when it comes to running times of algorithms.
+ | |||
This section was interesting too; I give it a 9/10.