Chapter 2: Basics of Algorithm Analysis
2.1 Computational Tractability
Several factors are key in the design of algorithms, including understanding the discrete nature of many problems and addressing an algorithm's running time and space usage. The text gives one initial definition of efficiency: “An algorithm is efficient if, when implemented, it runs quickly on real input instances.” Unfortunately, this definition fails to account for where and how well the algorithm is implemented.
To be more specific, we can say an algorithm is efficient if, at a bare minimum, its worst-case performance is better than that of a brute-force search. To narrow the definition even further, we can classify an 'efficient algorithm' as one with a polynomial running time, meaning its running time on every input instance of size N is bounded by cN^d, where c and d are constants > 0. I found it beneficial to learn more about what exactly makes an algorithm efficient, and I would rate this section a 10 for its ability to be clearly understood, since the concepts were explained very straightforwardly.
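As an illustration of the brute-force comparison above, here is a small sketch (my own example, not from the text) of two ways to detect a duplicate value in a list: a brute-force search over all pairs, and an asymptotically faster alternative.

```python
# Hypothetical example: brute-force search vs. a faster algorithm
# for the same problem (detecting a duplicate value in a list).

def has_duplicate_brute_force(items):
    """Compare every pair: roughly n^2 / 2 comparisons, i.e. O(n^2)."""
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_fast(items):
    """Track values already seen in a set: O(n) expected time."""
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```

Both functions answer the same question, but only the second would be called efficient in the stronger, polynomial-with-small-exponent sense: its running time is bounded by cN^1 rather than cN^2.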
2.2 Asymptotic Order of Growth
For the purposes of simplicity and clarity, running times are classified up to constant factors and lower-order terms; for example, instead of writing a running time as an^2 + bn + c, we would consider it to be simply n^2. Assume T(n) is the worst-case running time of an algorithm on inputs of size n, and let f(n) be a function. T is asymptotically upper-bounded by f, and T(n) is O(f(n)), if:
- there exists a constant c > 0, and
- there exists a constant n0 ≥ 0, such that
- for all n ≥ n0, T(n) ≤ c • f(n).
Suppose instead that T(n) ≥ d • f(n) for all n ≥ n0, for some fixed constant d > 0; then T is asymptotically lower-bounded by f and T(n) is Ω(f(n)). Finally, if T(n) is both O(f(n)) and Ω(f(n)), then f(n) is an asymptotically tight bound for T(n) and T(n) is θ(f(n)).
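The definitions above can be made concrete by exhibiting witness constants for a specific running time. The following sketch (my own worked example, with constants I chose myself) checks numerically that T(n) = 3n^2 + 5n + 7 is both O(n^2) and Ω(n^2), and hence θ(n^2).

```python
# Worked example (not from the text): witness constants for
# T(n) = 3n^2 + 5n + 7 being O(n^2) and Omega(n^2).

def T(n):
    return 3 * n**2 + 5 * n + 7

# Upper bound: with c = 4 and n0 = 7, T(n) <= c * n^2 for all n >= n0,
# since 4n^2 - T(n) = n^2 - 5n - 7 >= 0 once n >= 7.
c, n0 = 4, 7
assert all(T(n) <= c * n**2 for n in range(n0, 10_000))

# Lower bound: with d = 3, T(n) >= d * n^2 for every n >= 1,
# because the remaining terms 5n + 7 are nonnegative.
d = 3
assert all(T(n) >= d * n**2 for n in range(1, 10_000))
```

A finite check like this is of course not a proof, but it is a useful sanity test that the chosen c, d, and n0 actually satisfy the inequalities in the definitions.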
The growth rates O, Ω, and θ share some basic properties:
- transitivity: if a function f is asymptotically upper-bounded by a function g and g is asymptotically upper-bounded by a function h, then f is asymptotically upper-bounded by h.
- if f = θ(g) and g = θ(h) then f = θ(h)
- if f = O(h) and g = O(h) then f + g = O(h)
- if g = O(f) then f + g = θ(f)
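The last property above can be checked on a concrete pair of functions. Here is a small sketch (my own example, with f and g chosen by me) using f(n) = n^2 and g(n) = n log n, where g = O(f): the sum f + g stays pinned between 1 • f(n) and 2 • f(n), as the θ(f) bound requires.

```python
import math

# Illustrative check (not from the text) of the property
# "if g = O(f), then f + g = theta(f)", with f(n) = n^2
# and g(n) = n * log2(n).

f = lambda n: n * n
g = lambda n: n * math.log2(n)

for n in [2, 16, 256, 4096]:
    total = f(n) + g(n)
    # Since g(n) <= f(n) for n >= 2, the sum is squeezed between
    # one and two copies of f(n).
    assert 1 * f(n) <= total <= 2 * f(n)
```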
Logarithms, polynomials, and exponential functions are common classes of running times, with logarithms growing the slowest and exponentials growing the fastest. It was interesting to learn more deeply about run-time orders, for they have been frequently discussed throughout many of my courses. This section provided a deeper understanding of the variety of run-time functions, though I would have liked more information on how algorithms' running times are calculated. The readability of this section was around a 7, because the various variables, rules, and terms were somewhat complicated to keep straight.
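The gap between these three growth classes can be seen by evaluating one representative of each at a few input sizes. This is a quick sketch of my own (the chosen functions log2(n), n^2, and 2^n are my examples, not the text's):

```python
import math

# Illustrative comparison (not from the text) of the three growth
# classes: logarithmic, polynomial, and exponential. Even at modest
# n, the exponential dwarfs the other two.

for n in [10, 100, 1000]:
    print(f"n={n:5d}  log2(n)={math.log2(n):8.2f}  "
          f"n^2={n**2:10d}  2^n has {len(str(2**n))} digits")
```

Even at n = 1000, log2(n) is still under 10 and n^2 is a million, while 2^n already has over 300 decimal digits, which is why exponential-time algorithms are considered impractical for all but tiny inputs.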
