Chapter 2: Basics of Algorithm Analysis

2.3 - Implementing the Stable Matching Algorithm Using Lists and Arrays

Arrays and Lists

An array can maintain a list of elements in sorted order (numerically or alphabetically), which allows O(log n) lookup of any value via binary search. Arrays are not good for maintaining dynamically changing lists, since inserting or deleting an element in the middle requires shifting the elements after it.

Linked lists sequence elements by having each one point to the next in the list. Traversing to the i-th element takes time proportional to its position, i.e., O(i). Insertion and deletion in a linked list only require updating the pointers of the neighboring elements.
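
To make the contrast concrete, here is a small sketch of my own (not the book's): the sorted-array side uses Python's bisect module for binary search, and the Node class on the linked-list side is just an illustration.

  import bisect

  # Sorted array: O(log n) membership test via binary search, but
  # inserting into the middle shifts later elements, which costs O(n).
  arr = [3, 8, 15, 23, 42]
  idx = bisect.bisect_left(arr, 15)
  found = idx < len(arr) and arr[idx] == 15     # True
  arr.insert(bisect.bisect_left(arr, 10), 10)   # keeps sorted order, but O(n) shift

  # Linked list: reaching position i costs O(i), but splicing a node in
  # after a known node only rewires pointers, which is O(1).
  class Node:
      def __init__(self, value, nxt=None):
          self.value = value
          self.next = nxt

  head = Node(3, Node(8, Node(15)))
  head.next = Node(10, head.next)   # insert 10 right after the head in O(1)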

Implementing the Stable Matching Algorithm

In order for the G-S algorithm to run in O(n^2) time, each iteration must perform the following steps in constant time:

  • Identify a free man, m
  • Identify m’s most preferred woman, w
  • Decide if w is engaged and, if so, identify her current partner
  • Decide if w prefers m to her current partner

This is accomplished by maintaining two arrays of preference lists, ManPref and WomanPref. An array Next keeps track of the next woman each man will propose to, in order of his preference list. A Current array tracks each woman's current engagement. Finally, an n x n array Ranking stores the rank of each man m in the sorted order of w's preferences, so two suitors can be compared in constant time.

Using these arrays allows us to complete each step in constant time, O(1), and hence to run the G-S algorithm in O(n^2) time.
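
Here is a sketch of that bookkeeping in Python (my own, not the book's pseudocode): men and women are 0-indexed, ManPref[m] and WomanPref[w] list partners from most to least preferred, and a plain Python list stands in for the set of free men.

  def stable_matching(ManPref, WomanPref):
      n = len(ManPref)
      # Ranking[w][m] = rank of man m in woman w's preference list,
      # so "does w prefer m to m'?" becomes a constant-time comparison.
      Ranking = [[0] * n for _ in range(n)]
      for w in range(n):
          for rank, m in enumerate(WomanPref[w]):
              Ranking[w][m] = rank
      Next = [0] * n          # Next[m]: position in ManPref[m] of the next woman to propose to
      Current = [None] * n    # Current[w]: man currently engaged to w, or None
      free_men = list(range(n))
      while free_men:
          m = free_men.pop()              # identify a free man in O(1)
          w = ManPref[m][Next[m]]         # his highest-ranked woman not yet proposed to
          Next[m] += 1
          if Current[w] is None:                           # w is free: engage
              Current[w] = m
          elif Ranking[w][m] < Ranking[w][Current[w]]:     # w prefers m to her partner
              free_men.append(Current[w])
              Current[w] = m
          else:                                            # w rejects m; he stays free
              free_men.append(m)
      return {Current[w]: w for w in range(n)}   # map each man to his matched woman

  ManPref   = [[0, 1], [0, 1]]
  WomanPref = [[1, 0], [0, 1]]
  print(stable_matching(ManPref, WomanPref))   # {1: 0, 0: 1}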

Discussion

The information in this section is straightforward and calls for little debate. I would find it interesting if the authors offered alternative solutions that illustrate a poor design choice, followed by an explanation of why it is poor.

2.4 - A Survey of Common Running Times

Linear O(n) time - the running time is at most a constant factor times the size of the input.

Linear time is found in algorithms such as computing the maximum element of a list or merging two sorted lists, which do a constant amount of work per element.
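
For example, a sketch of merging two sorted lists with constant work per element (my own code, not the book's pseudocode):

  def merge(a, b):
      """Merge two sorted lists into one sorted list in O(len(a) + len(b)) time."""
      result, i, j = [], 0, 0
      while i < len(a) and j < len(b):
          if a[i] <= b[j]:              # constant work per element appended
              result.append(a[i]); i += 1
          else:
              result.append(b[j]); j += 1
      result.extend(a[i:])              # at most one of these two is non-empty
      result.extend(b[j:])
      return result

  print(merge([1, 4, 7], [2, 3, 9]))    # [1, 2, 3, 4, 7, 9]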

O(n log n) time - running time of any algorithm that splits its input into two equal pieces, solves each piece recursively, and then combines the solutions in linear time

This can be found in the Mergesort algorithm and also when the most expensive step of an algorithm is to sort the input.
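
A minimal Mergesort sketch showing the split / recurse / linear-time combine pattern (kept self-contained, so it repeats the merge step inline):

  def mergesort(xs):
      if len(xs) <= 1:                   # base case: already sorted
          return xs
      mid = len(xs) // 2
      left = mergesort(xs[:mid])         # solve each half recursively
      right = mergesort(xs[mid:])
      merged, i, j = [], 0, 0            # combine the halves in linear time
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:
              merged.append(left[i]); i += 1
          else:
              merged.append(right[j]); j += 1
      return merged + left[i:] + right[j:]

  print(mergesort([5, 2, 9, 1, 7]))      # [1, 2, 5, 7, 9]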

Quadratic O(n^2) time - the running time when a loop with O(n) iterations does O(n) work per iteration. Common in nested loops, such as examining all pairs of elements.
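
For instance, a brute-force search over all pairs of points for the closest pair takes O(n^2) time; a sketch of my own (not the book's pseudocode):

  def closest_pair_distance(points):
      """Brute force over all pairs: O(n) outer iterations, O(n) work each."""
      best = float("inf")
      for i in range(len(points)):
          for j in range(i + 1, len(points)):
              (x1, y1), (x2, y2) = points[i], points[j]
              d = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
              best = min(best, d)
      return best

  print(closest_pair_distance([(0, 0), (3, 4), (1, 1)]))   # 1.414...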

Cubic O(n^3) time - the running time of more elaborate sets of nested loops, e.g., three nested loops that each run O(n) times.
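
One classic cubic-time example is deciding whether some pair among n sets is disjoint, where each set has at most n elements; a sketch of the three-loop version (not taken from the text):

  def some_pair_disjoint(sets):
      """For each of the O(n^2) pairs of sets, scan one set's O(n) elements."""
      for i in range(len(sets)):
          for j in range(i + 1, len(sets)):
              disjoint = True
              for x in sets[i]:             # inner O(n) loop
                  if x in sets[j]:          # assumes sets[j] is a Python set, so this test is O(1)
                      disjoint = False
                      break
              if disjoint:
                  return True
      return False

  print(some_pair_disjoint([{1, 2}, {2, 3}, {4, 5}]))   # True: {1, 2} and {4, 5}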

O(n^k) time - the running time of searching over all subsets of size k of an n-element set, since there are O(n^k) such subsets.
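
A common example of this kind of search is looking for an independent set of size k in a graph by trying every k-element subset of nodes. A sketch; the adjacency-set dictionary representation is just this example's assumption:

  from itertools import combinations

  def has_independent_set(adj, k):
      """Try all O(n^k) subsets of size k; each check costs at most O(k^2) edge lookups."""
      for subset in combinations(list(adj), k):
          if all(v not in adj[u] for u, v in combinations(subset, 2)):
              return True                   # no edge inside the subset
      return False

  graph = {0: {1}, 1: {0, 2}, 2: {1}, 3: set()}
  print(has_independent_set(graph, 3))      # True, e.g. {0, 2, 3}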

Running times of 2^n and n! grow faster than any polynomial function. A 2^n running time typically arises when searching over all subsets of an n-element set, while n! algorithms typically arise when searching over all the ways n items can be matched with n other items (or over all orderings of n items).
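
As a sketch of where n! comes from (my own example, not the book's): a brute force that matches n workers to n jobs by trying every permutation examines n! candidate matchings.

  from itertools import permutations

  def min_cost_assignment(cost):
      """Brute force over all n! ways to match n workers to n jobs."""
      n = len(cost)
      best = float("inf")
      for perm in permutations(range(n)):   # n! candidate matchings
          best = min(best, sum(cost[w][perm[w]] for w in range(n)))
      return best

  print(min_cost_assignment([[4, 1], [2, 3]]))   # 3: worker 0 -> job 1, worker 1 -> job 0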

Sublinear time covers cases where the running time is asymptotically smaller than linear. For example, the binary search algorithm runs in O(log n) time because a constant amount of work discards a constant fraction of the input.
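
A standard binary search sketch (again my own code): each comparison halves the remaining range, giving O(log n) comparisons, without ever reading the whole input.

  def binary_search(sorted_list, target):
      """O(log n): each comparison discards half of the remaining input."""
      lo, hi = 0, len(sorted_list) - 1
      while lo <= hi:
          mid = (lo + hi) // 2
          if sorted_list[mid] == target:
              return mid
          elif sorted_list[mid] < target:
              lo = mid + 1
          else:
              hi = mid - 1
      return -1                             # not found

  print(binary_search([3, 8, 15, 23, 42], 23))   # 3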

Discussion

Upon further reading, I found it easier to compute running times by identifying smaller processes within an algorithm and relating them back to other algorithms I have already analyzed.

2.5 - A More Complex Data Structure: Priority Queues

The Problem

A priority queue is a data structure that maintains a set of elements S, where each element v in S has an associated value key(v) that denotes the priority of element v. It supports adding and deleting elements and selecting the element with highest priority, and is useful for handling events in priority order. Adding, deleting, and selecting the highest-priority element each take O(log n) time.

A sequence of O(n) priority queue operations can be used to sort a set of n numbers in O(n log n) time.
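
A sketch of that sorting bound using Python's heapq module as a stand-in priority queue (the book builds its own heap instead): n insertions followed by n extract-mins, each O(log n), give O(n log n) overall.

  import heapq

  def pq_sort(numbers):
      pq = []
      for x in numbers:                 # n insertions, O(log n) each
          heapq.heappush(pq, x)
      return [heapq.heappop(pq) for _ in range(len(pq))]   # n extract-mins

  print(pq_sort([5, 2, 9, 1]))          # [1, 2, 5, 9]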

A Data Structure for Implementing a Priority Queue

A priority queue can be implemented with a heap. A heap is a balanced binary tree with a root and nodes containing up to two children each. The heap order property requires that the key of each element is at least as large as the key of the element at its parent node in the tree.
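
The heap is stored compactly in an array H (the array referred to below). A small sketch of that encoding with 0-indexed positions; the index arithmetic is the standard array layout, which differs slightly from the book's 1-indexed version.

  def parent(i):       return (i - 1) // 2
  def left_child(i):   return 2 * i + 1
  def right_child(i):  return 2 * i + 2

  def is_heap(H):
      """Heap order: every element's key is >= the key at its parent."""
      return all(H[i] >= H[parent(i)] for i in range(1, len(H)))

  print(is_heap([1, 3, 2, 7, 4]))   # True
  print(is_heap([5, 3, 2]))         # False: 3 is smaller than its parent 5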

Implementing the Heap Operations

The procedure Heapify-up fixes the heap order property in O(log i) time, assuming that the array H is almost a heap with the key of H[i] too small. Using Heapify-up we can insert a new element into a heap of n elements in O(log n) time.

The procedure Heapify-down fixes the heap order property in O(log n) time, assuming that H is almost a heap with the key of H[i] too big. Using Heapify-up or Heapify-down we can delete an element from a heap of n elements in O(log n) time.
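
A sketch of both procedures on a 0-indexed Python list, plus Insert and an ExtractMin built from them (the book's pseudocode is 1-indexed, and its delete operation is more general than the ExtractMin shown here):

  def heapify_up(H, i):
      """H is almost a heap with H[i] possibly too small: swap it up toward the root."""
      while i > 0 and H[i] < H[(i - 1) // 2]:
          parent = (i - 1) // 2
          H[i], H[parent] = H[parent], H[i]
          i = parent

  def heapify_down(H, i):
      """H is almost a heap with H[i] possibly too big: swap it down toward the leaves."""
      n = len(H)
      while True:
          left, right, smallest = 2 * i + 1, 2 * i + 2, i
          if left < n and H[left] < H[smallest]:
              smallest = left
          if right < n and H[right] < H[smallest]:
              smallest = right
          if smallest == i:
              return
          H[i], H[smallest] = H[smallest], H[i]
          i = smallest

  def insert(H, key):                  # O(log n)
      H.append(key)
      heapify_up(H, len(H) - 1)

  def extract_min(H):                  # O(log n)
      H[0], H[-1] = H[-1], H[0]        # move the last element to the root
      minimum = H.pop()
      if H:
          heapify_down(H, 0)
      return minimum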

Discussion

The heap is a simple tree structure that relies on the Heapify-up/-down procedures to keep its elements in order. I am most curious to see how the heap (and other more complex structures in the future) affects the difficulty of the programming.
