

Chapter 2: Basics of Algorithm Analysis

2.1: Computational Tractability

This section starts off by noting that algorithmic problems are fundamentally discrete. That is, they “involve an implicit search over a large set of combinatorial possibilities,” with the goal of efficiently finding a solution to the problem at hand. The rest of the section discusses the term “efficiency” and possible definitions for it. It settles on saying that an algorithm is efficient if it has a polynomial running time, and calls this definition an “absolute notion”: it gives us a strict criterion for whether or not a problem has an efficient solution. One of my main takeaways was that the best way to measure and report running time is by the worst-case running time. That way there are no disagreements over what “average” inputs mean, and since the best-case running time is rarely achieved in practice (unless the bound is tight), you would not want to use that either.
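To make the polynomial-versus-exponential distinction concrete, here is a small sketch comparing step counts; the function name and the specific choice of n^2 versus 2^n are my own illustration, not from the text:

```python
def steps(n):
    """Step counts at input size n for a hypothetical algorithm that
    takes ~n^2 steps versus one that takes ~2^n steps."""
    return n ** 2, 2 ** n

# Polynomial growth stays manageable as n grows; exponential growth explodes.
for n in (10, 20, 30):
    poly, expo = steps(n)
    print(f"n={n}: n^2={poly}, 2^n={expo}")
```

Already at n = 30 the exponential algorithm needs over a billion steps while the polynomial one needs 900, which is why polynomial time makes a workable cutoff for “efficient.”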

2.2: Asymptotic Order of Growth

Here the different notations for bounding a running time are introduced. The upper bound is denoted by big-O notation: a running time that is O(f(n)) grows no faster than f(n). Ω represents the lower bound: a running time that is Ω(f(n)) grows at least as fast as f(n). A tight bound is represented by Θ. I had not previously known what a tight bound was, but I now know that if a running time is both O(f(n)) and Ω(f(n)), then it is Θ(f(n)). Also, though the base does not matter when stating logarithmic running times (changing the base only changes a constant factor), it matters greatly for exponential running times.
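The definitions above can be checked empirically: f is O(g) if f(n) ≤ c·g(n) for some constant c once n is large enough, and a tight bound holds when the matching lower bound holds too. The helper below is my own illustrative sketch, a finite-range sanity check rather than a proof:

```python
def appears_big_o(f, g, c, n0, n_max=1000):
    """Empirically check f(n) <= c * g(n) for all n0 <= n <= n_max.
    An illustrative sanity check over a finite range, not a proof."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n ** 2 + 5 * n   # example running time f(n) = 3n^2 + 5n
g = lambda n: n ** 2               # candidate bound g(n) = n^2

upper = appears_big_o(f, g, c=4, n0=5)  # f is O(n^2): 3n^2 + 5n <= 4n^2 once n >= 5
lower = appears_big_o(g, f, c=1, n0=1)  # f is Ω(n^2): n^2 <= 3n^2 + 5n for all n >= 1
print(upper and lower)  # both bounds hold, so f is Θ(n^2)
```

Since the same g(n) works as both an upper and a lower bound here, f(n) = 3n^2 + 5n is Θ(n^2), matching the tight-bound definition from the section.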

2.3: Implementing the Stable Matching Algorithm Using Lists and Arrays
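This section of the book covers implementing the Gale-Shapley algorithm in O(n^2) time using arrays and lists. As a sketch of the data structures it describes (the 0-indexed inputs and identifier names are my own assumptions): a precomputed ranking array makes “does w prefer m to m'?” an O(1) lookup, a list holds the free men, and two arrays track each man's next proposal and each woman's current partner:

```python
from collections import deque

def gale_shapley(men_prefs, women_prefs):
    """Stable matching via Gale-Shapley in O(n^2) time.
    men_prefs[m] and women_prefs[w] are 0-indexed preference lists."""
    n = len(men_prefs)
    # ranking[w][m] = position of man m in woman w's list, so preference
    # comparisons become O(1) array lookups instead of list scans.
    ranking = [[0] * n for _ in range(n)]
    for w in range(n):
        for pos, m in enumerate(women_prefs[w]):
            ranking[w][m] = pos
    next_proposal = [0] * n      # index of the next woman on each man's list
    current = [None] * n         # current[w] = w's partner, or None if free
    free_men = deque(range(n))   # list of currently free men
    while free_men:
        m = free_men[0]
        w = men_prefs[m][next_proposal[m]]  # highest-ranked woman not yet proposed to
        next_proposal[m] += 1
        if current[w] is None:
            current[w] = m                  # w was free: engage immediately
            free_men.popleft()
        elif ranking[w][m] < ranking[w][current[w]]:
            free_men.append(current[w])     # w trades up; her old partner is free again
            current[w] = m
            free_men.popleft()
        # otherwise w rejects m, who stays free and proposes again next iteration
    return {w: current[w] for w in range(n)}
```

Each man proposes to each woman at most once, so the loop runs at most n^2 times, and every operation inside it is O(1) thanks to the array representations.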

courses/cs211/winter2018/journals/shermanc/chapter2.1516668700.txt.gz · Last modified: by shermanc
CC Attribution-Noncommercial-Share Alike 4.0 International