When approaching a problem, it helps to think about two kinds of bounds: one on the running time you hope to achieve, and one on the size of the problem's natural search space (and hence on the running time of a brute-force search algorithm for the problem).

What are the common running times, and which ones are fastest?

Linear Time. An algorithm runs in linear time when its running time is at most a constant factor times the size of the input. Processing the input in a single pass yields such a running time. Computing the maximum of n numbers can be done in this "one-pass" style: using an array or list, start with the value at index 0 as the current maximum, compare every later element to the current maximum, and update the current maximum whenever the element we are at is larger. Every element is looked at once, so this takes O(n) time.
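
A minimal Python sketch of this one-pass maximum (my own code, assuming a non-empty list):

<code python>
def find_max(numbers):
    # Start with the value at index 0 as the current maximum.
    current_max = numbers[0]
    # Compare every later element to the current maximum and
    # update it whenever we see something larger: one pass, O(n).
    for x in numbers[1:]:
        if x > current_max:
            current_max = x
    return current_max
</code>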

Merging two lists. Given two sorted lists, how can we combine them into one sorted list? We keep a pointer to the first remaining element of each list, repeatedly insert the smaller of the two front elements into the output, and advance the corresponding pointer. (The algorithm is given in detail on page 49.) In the worst case an element can be compared to many elements in the other list, which might suggest O(n^2); but we can charge each iteration to the element it appends to the output, and each element is appended only once, so there are at most 2n iterations. This gives a linear running time.
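
A rough Python sketch of this merge step (my own names; the book's version is the one on page 49):

<code python>
def merge(a, b):
    # i and j point at the first remaining element of each sorted list.
    i, j = 0, 0
    merged = []
    # Each iteration appends exactly one element to the output, so there
    # are at most len(a) + len(b) = 2n iterations: linear time.
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            merged.append(a[i])
            i += 1
        else:
            merged.append(b[j])
            j += 1
    # One list may still have leftover elements; append them all.
    merged.extend(a[i:])
    merged.extend(b[j:])
    return merged
</code>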

O(n log n) Time, aka linearithmic time. This running time arises for any algorithm that splits its input into two equal-sized pieces, solves each piece recursively (giving O(log n) levels of recursion), and then combines the two solutions in linear time (O(n) per level).
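
A mergesort-style sketch of this split/recurse/combine pattern (my own code; it reuses the merge() sketch from the merging paragraph above):

<code python>
def merge_sort(numbers):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(numbers) <= 1:
        return numbers
    # Split into two equal-sized halves and solve each recursively.
    mid = len(numbers) // 2
    left = merge_sort(numbers[:mid])
    right = merge_sort(numbers[mid:])
    # Combine the two sorted halves in linear time using merge().
    return merge(left, right)
</code>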

Quadratic Time. Finding the closest pair among n given points in the plane: the brute-force algorithm computes the distance between every pair and chooses the pair whose distance is smallest. The number of pairs is n(n-1)/2 = O(n^2), and we spend constant time on each pair, so the overall running time is O(n^2). Quadratic time also arises naturally from a pair of nested loops. The brute-force algorithm for finding the closest pair of points is on page 51; Chapter 5 gives a better solution to this problem, with running time O(n log n).

Cubic Time is similar to quadratic time, except that it arises from three nested loops.
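
A Python sketch of the brute-force closest-pair search (my own code, assuming at least two points; the book's version is the one on page 51):

<code python>
from math import hypot, inf

def closest_pair_brute_force(points):
    # points is a list of (x, y) tuples.
    best_dist = inf
    best_pair = None
    # A pair of nested loops over all n(n-1)/2 pairs: O(n^2) pairs,
    # constant work per pair.
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            (x1, y1), (x2, y2) = points[i], points[j]
            d = hypot(x1 - x2, y1 - y2)
            if d < best_dist:
                best_dist = d
                best_pair = (points[i], points[j])
    return best_pair
</code>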

O(n^k) Time. More generally, a running time of O(n^k) arises when we search over all subsets of size k (still polynomial time for any fixed k). An example: given a graph, can we find an independent set of size k, that is, k nodes with no edges between them?
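
A brute-force sketch of that search in Python (my own code; edges are assumed to be stored as a set of frozenset pairs):

<code python>
from itertools import combinations

def has_independent_set_of_size_k(nodes, edges, k):
    # nodes: list of node labels; edges: set of frozenset({u, v}) pairs.
    # There are O(n^k) subsets of size k, and checking each one takes
    # only polynomial time, so the whole search is polynomial for fixed k.
    for subset in combinations(nodes, k):
        if all(frozenset((u, v)) not in edges
               for u, v in combinations(subset, 2)):
            return True
    return False
</code>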

Beyond Polynomial Time. We will look at 2^n (exponential) and n! (factorial) time in particular. 2^n arises naturally as the running time of a search algorithm that must consider all subsets. The function n! grows even faster than 2^n, and is thus even more menacing as a bound on the performance of an algorithm. We have seen n! before: it is the number of possible perfect matchings of n men with n women in the Stable Matching problem (the first man has n choices, the next has n-1, and so on). Despite this huge search space, we were able to solve the Stable Matching problem in O(n^2) time. The function n! also arises in problems where the search space consists of all ways to arrange n items in order.
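
To illustrate the "consider all subsets" pattern that gives 2^n, here is my own sketch (not an algorithm from the book) that searches every subset of a graph's nodes for the largest independent set, reusing the edge representation from the previous sketch:

<code python>
from itertools import combinations

def largest_independent_set(nodes, edges):
    # Consider every subset of the nodes, from largest to smallest,
    # and return the first one with no edge inside it.  Across all
    # sizes this examines up to 2^n subsets: exponential time.
    for size in range(len(nodes), 0, -1):
        for subset in combinations(nodes, size):
            if all(frozenset((u, v)) not in edges
                   for u, v in combinations(subset, 2)):
                return set(subset)
    return set()
</code>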

Finally, there is sublinear time; for example, the binary search algorithm runs in O(log n) time, since the size of the remaining input is cut in half with every probe.
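
A minimal binary search sketch in Python (my own code, assuming the input list is sorted):

<code python>
def binary_search(sorted_list, target):
    lo, hi = 0, len(sorted_list) - 1
    # Each probe cuts the remaining interval in half,
    # so there are O(log n) iterations.
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        if sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # target is not in the list
</code>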

Is it possible to achieve a running time even better than O(log n)?

This section was harder than 2.3, so I would rate it a 7/10.
