====== 2.2 Asymptotic Order of Growth ======
===Asymptotic Upper Bounds===
* Let T(//n//) be a function (say, the worst-case running time of a certain algorithm on an input of size //n//).
* Given another function //f(n)//, we say that T(//n//) is O(//f(n)//) (read "T(//n//) is order //f(n)//") if, for sufficiently large //n//, the function T(//n//) is bounded above by a constant multiple of //f(n)//. We write this relation as T(//n//) = O(//f(n)//). To be precise, T(//n//) is O(//f(n)//) if there exist constants //c// > 0 and n<sub>0</sub> ≥ 0 such that for all //n// ≥ n<sub>0</sub>, we have T(//n//) ≤ //c// · //f(n)//. The constant //c// is fixed and does not depend on //n//. In this case, we say that T is //asymptotically upper-bounded by f//. To find an upper bound //f//, we inflate the terms in the expression for T(//n//).
For instance, suppose T(n) = pn<sup>2</sup> + qn + r, where p, q, r are positive constants. \\
For all n ≥ 1, \\
T(n) = pn<sup>2</sup> + qn + r \\
=> T(n) ≤ pn<sup>2</sup> + qn<sup>2</sup> + rn<sup>2</sup> \\
Since pn<sup>2</sup> + qn<sup>2</sup> + rn<sup>2</sup> = (p + q + r)n<sup>2</sup> = cn<sup>2</sup>, \\
=> T(n) ≤ cn<sup>2</sup>, where c = p + q + r \\
Thus, T(n) = O(n<sup>2</sup>)
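As a quick sanity check of this bound, here is a minimal Python sketch (the values of p, q, r below are arbitrary examples, not from the text) that verifies T(n) ≤ cn<sup>2</sup> with c = p + q + r for a range of n ≥ 1:
<code python>
# Numerically check the upper bound T(n) <= c*n^2 for n >= 1,
# using arbitrary example constants p, q, r (not from the text).
p, q, r = 3, 5, 7          # any positive constants work
c = p + q + r              # the constant from the inflation argument

def T(n):
    return p * n**2 + q * n + r

# The inequality should hold for every n >= n0 = 1.
assert all(T(n) <= c * n**2 for n in range(1, 10_000))
print("T(n) <= c*n^2 holds for 1 <= n < 10000 with c =", c)
</code>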
===Asymptotic Lower Bounds===
* Let T(//n//) again be a function (the worst-case running time of a certain algorithm on an input of size //n//).
* To show that an upper bound is the best one possible, we need to express the notion that for arbitrarily large input sizes //n//, the function T(//n//) is at least a constant multiple of some specific function //f(n)//. Thus we write T(//n//) = Ω(//f(n)//) (T(//n//) is Ω(//f(n)//)) if there exist constants ε > 0 and n<sub>0</sub> ≥ 0 such that for all //n// ≥ n<sub>0</sub>, we have T(//n//) ≥ ε · //f(n)//. In this case, T is //asymptotically lower-bounded by f//. The constant ε is fixed and is independent of //n//. This time, to find the lower bound, we deflate the terms of the polynomial T instead of inflating them.
For instance, suppose T(n) = pn<sup>2</sup> + qn + r, where p, q, r are positive constants. \\
For all n ≥ 0, \\
T(n) = pn<sup>2</sup> + qn + r ≥ pn<sup>2</sup> \\
=> T(n) ≥ εn<sup>2</sup>, where ε = p > 0 \\
=> T(n) = Ω(n<sup>2</sup>)
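Symmetrically, a small Python sketch (again with arbitrary example constants) confirms the lower bound T(n) ≥ εn<sup>2</sup> with ε = p:
<code python>
# Numerically check the lower bound T(n) >= eps*n^2 for n >= 0,
# with the same arbitrary example constants.
p, q, r = 3, 5, 7
eps = p                    # the constant from the deflation argument

def T(n):
    return p * n**2 + q * n + r

assert all(T(n) >= eps * n**2 for n in range(0, 10_000))
print("T(n) >= eps*n^2 holds for 0 <= n < 10000 with eps =", eps)
</code>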
===Asymptotic Tight Bounds===
* If we can show that a running time T(//n//) is both O(//f(n)//) and Ω(//f(n)//), then we have found the //right// bound: T(//n//) grows exactly like //f(n)// to within a constant factor.
* If T(//n//) is both O(//f(n)//) and Ω(//f(n)//), we say that T(//n//) is Θ(//f(n)//). In this case, we say that //f(n)// is an //asymptotically tight bound// for T(//n//).
* If we let //f// and //g// be two functions such that the limit as n tends to infinity of //f(n)// / //g(n)// exists and is equal to some number c > 0, then //f(n)// = Θ(//g(n)//).
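To see this limit rule in action, the following sketch uses the same hypothetical quadratic from above and computes T(n)/n<sup>2</sup> for growing n; the ratio approaches p > 0, consistent with T(n) = Θ(n<sup>2</sup>):
<code python>
# Illustrate the limit rule with f(n) = p*n^2 + q*n + r and g(n) = n^2:
# the ratio f(n)/g(n) tends to p > 0, so f(n) = Theta(n^2).
p, q, r = 3, 5, 7

def f(n):
    return p * n**2 + q * n + r

for n in (10, 100, 1000, 10**6):
    print(n, f(n) / n**2)  # approaches p = 3 as n grows
</code>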
===Properties of Asymptotic Growth Rates===
* Transitivity:
  * If f = O(g) and g = O(h), then f = O(h)
  * If f = Ω(g) and g = Ω(h), then f = Ω(h)
  * If f = Θ(g) and g = Θ(h), then f = Θ(h)
* Sums of functions:
  * If f = O(h) and g = O(h), then f + g = O(h)
  * If f = Ω(h) and g = Ω(h), then f + g = Ω(h)
  * If f = Θ(h) and g = Θ(h), then f + g = Θ(h)
* Let //k// be a fixed constant, and let f<sub>1</sub>, f<sub>2</sub>, ..., f<sub>k</sub> and //h// be functions such that f<sub>i</sub> = O(//h//) for all //i//. Then f<sub>1</sub> + f<sub>2</sub> + ... + f<sub>k</sub> = O(//h//), as justified in the sketch below.
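To see why the sum properties hold (a sketch following directly from the definitions above): if f(n) ≤ c<sub>1</sub> · h(n) for all n ≥ n<sub>1</sub> and g(n) ≤ c<sub>2</sub> · h(n) for all n ≥ n<sub>2</sub>, then for all n ≥ max(n<sub>1</sub>, n<sub>2</sub>) we have \\
f(n) + g(n) ≤ (c<sub>1</sub> + c<sub>2</sub>) · h(n), \\
so f + g = O(h) with constant c<sub>1</sub> + c<sub>2</sub>. The Ω and Θ cases, and the extension to //k// functions, follow the same pattern.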
===Asymptotic Bounds for some Common Functions===
* __Polynomials__: For a polynomial T(n) = a<sub>0</sub> + a<sub>1</sub>n + … + a<sub>d</sub>n<sup>d</sup> of degree //d//, in which the coefficient a<sub>d</sub> is positive, T(n) = O(n<sup>d</sup>).
* __Logarithms__: log<sub>b</sub> n is the number //x// such that b<sup>x</sup> = //n//. If we round log<sub>b</sub> n down to the nearest integer, it is one less than the number of digits in the base-//b// representation of the number //n//.
* For every //b// > 1 and every //x// > 0, log<sub>b</sub> n = O(n<sup>x</sup>).
* __Exponentials__: Let //f(n)// = r<sup>n</sup>. For every //r// > 1 and every //d// > 0, n<sup>d</sup> = O(r<sup>n</sup>).
These are the three most frequently encountered functions when analyzing running times. Logarithms grow more slowly than polynomials, and polynomials grow more slowly than exponentials, as the sketch below illustrates.\\
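A small Python sketch makes this ordering concrete (the exponent x = 0.5, degree d = 3, and base r = 1.1 are arbitrary illustrative choices): both ratios shrink toward 0 as n grows.
<code python>
import math

# Compare growth rates: logarithm vs. polynomial vs. exponential.
# x = 0.5, d = 3, and r = 1.1 are arbitrary illustrative choices.
for n in (10, 100, 500, 1000):
    log_vs_poly = math.log(n) / n**0.5  # shrinks toward 0: log n = O(n^0.5)
    poly_vs_exp = n**3 / 1.1**n         # shrinks toward 0: n^3 = O(1.1^n)
    print(n, log_vs_poly, poly_vs_exp)
</code>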
There are several motivations for studying bounds on the running times of algorithms. Knowing the running time of an algorithm, one can refine that algorithm as much as possible. Running times also serve as a measure of comparison when deciding which algorithm is more efficient for a given problem. Thus a programmer should always aim for the most efficient algorithm he/she can possibly design, and, thanks to this knowledge of running times, can be convinced it is more efficient than the one he/she had previously implemented.\\
I don't have a specific question about this section, as most of the material makes sense. After rereading this section and seeing how it was presented in class, I can now see the importance of tight bounds. Tight bounds are a new concept for me when it comes to running times of algorithms, but I was able to see why they are useful in analyzing an algorithm: they pin down the running time more precisely than an upper or lower bound alone. I would like to remember the properties of asymptotic growth rates; they look very useful, and I'm convinced they can help a lot in analyzing running times of algorithms.\\
This section was interesting too; I give it a 9/10.