=== Asymptotic Order of Growth ===
If we are analyzing the time behavior of an algorithm, it would be nice if we could provide some useful general characteristics about it that could be used to relate it to other algorithms with a similar time behavior.  A good way to do this is to put bounds on the time characteristic of the algorithm: an upper bound and a lower bound on the algorithm's time complexity.  The upper bound of an algorithm's time complexity is notated as ''O(f(n))'' and is known as **"Big-O" notation**.  Determining the "Big-O" function of an algorithm means that, beyond some problem size, the algorithm will never take more computation than a constant multiple of the given function as the size of the problem grows to infinity.  Similarly, the lower bound of an algorithm's time complexity is notated as ''Ω(f(n))'' and is known as **"Big-Omega" notation**.  "Big-Omega" notation is useful because it is indicative of the baseline "best-case" growth of an algorithm's running time for any problem size.
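As a small illustration of these bounds (not from the textbook itself — the function name and constants here are made up for the example), one could count the basic operations of a hypothetical all-pairs loop and check that the count is sandwiched between constant multiples of ''n^2'':

```python
def count_pair_operations(n):
    """Count the basic operations done by a simple all-pairs loop."""
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1  # one "step" of work per pair (i, j)
    return ops

# The loop does exactly n*n steps, so with constants c = c' = 1:
#   upper bound: ops(n) <= c  * n^2  (the routine is O(n^2))
#   lower bound: ops(n) >= c' * n^2  (the routine is Omega(n^2))
for n in (10, 100, 1000):
    ops = count_pair_operations(n)
    assert ops <= 1 * n**2
    assert ops >= 1 * n**2
```

Since the same ''n^2'' works as both an upper and a lower bound here, this routine's running time is tightly bounded (what the book writes as ''Θ(n^2)'').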
~~DISCUSSION~~
courses/cs211/winter2012/journals/garrett/entries/week_1.1326862588.txt.gz · Last modified: 2012/01/18 04:56 by garrettheath4
CC Attribution-Noncommercial-Share Alike 4.0 International