A More Complex Data Structure: Priority Queues

Qualitative improvements over brute-force algorithms usually require overcoming higher-level conceptual obstacles. The running time of a polynomial-time algorithm, however, can often be improved further through implementation details and the use of more complex data structures. Some of these data structures are specialized for a particular kind of problem, while others are more broadly applicable. This section describes and analyzes one broadly useful data structure, the priority queue. In a priority queue, each element is inserted with a key (or value) that identifies the element and determines its priority for the various operations. A priority queue must do non-trivial work each time it is invoked, and it is usually a much better alternative to plain arrays and lists.

Problem

The following are situations where we may need to use priority queues:

  • The Stable Matching Problem: The set of free men is dynamically changing, and the priority queue will help us maintain this set in a much more efficient way.
  • Managing real-time events: for instance, scheduling processes on a computer, where each process has a priority; processes don't arrive in order of priority, and our goal is to select and run the one with the currently highest priority.
  • Sorting n elements in O(n log n) time.

Implementing the Priority Queue

We could implement a priority queue using:

  • An unsorted list: we maintain a pointer to the minimum element, so finding the minimum and adding elements are both easy.
    • Problem: extracting the minimum is expensive, since updating the Min pointer afterwards requires an O(n) scan.
  • We might try to fix this by keeping the elements in sorted order.
    • Problem: with a doubly-linked list we cannot binary search, so finding the insertion point takes O(n) time (although the insertion itself is then constant time).
    • With an array, we can find the position in O(log n) time by binary search, but we then have to shift all of the later elements one position to the right, which takes up to O(n) time.

The heap resolves these issues, as it combines the benefits of a sorted array and a list. A heap is a balanced binary tree whose elements are arranged in heap order: the key of any parent element is no bigger than the key of either of its children.

In a heap implemented with an array (a small indexing sketch follows this list):

  • Conceptually, each node has its key and three pointers: two to its left and right children and one to its parent; in the array representation these pointers are not stored but computed from the index.
  • For a heap H, H[1] is the root
  • leftChild(i) = 2i
  • rightChild(i) = 2i + 1
  • parent(i) = ⌊i/2⌋ (integer division)
  • length(H) = the number of elements currently in the heap (its logical size)
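
A small sketch of this index arithmetic (my own example, not from the textbook; H[0] is left unused so the 1-based formulas above apply directly):

<code python>
# Sketch of the array-based index arithmetic (1-based, H[0] unused).
H = [None, 2, 5, 3, 9, 6, 11, 4]   # an example array already in heap order

def left_child(i):
    return 2 * i

def right_child(i):
    return 2 * i + 1

def parent(i):
    return i // 2   # integer division

# Heap order: every parent's key is no bigger than its children's keys.
n = len(H) - 1                      # length(H), the logical size
assert all(H[parent(i)] <= H[i] for i in range(2, n + 1))
</code>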

Implementing the heap operations

  • Identifying the Min element takes O(1) time, since the smallest key is at the root
  • To add a new element, we use a procedure called Heapify-up to restore the heap property (algorithm: see Kleinberg and Tardos, Algorithm Design, p. 61; a sketch of both Heapify procedures appears after this list)
  • Heapify-up fixes the heap in O(log i) time, assuming the array is almost a heap with only the key of H[i] too small
  • Inserting a new element into a heap of length n therefore takes O(log n) time using Heapify-up
  • Deleting the element at position i is a little trickier
  • Since deleting the element at position i leaves a hole in the heap, we move the last element of the heap into that hole
  • Thus we encounter two scenarios:
    • If the element moved to position i is too small, we can use Heapify-up to fix the heap
    • If the element moved to position i is too big, we use a similar procedure called Heapify-down (for the algorithm, see Kleinberg and Tardos, p. 63)
  • Heapify-down fixes the heap in O(log n) time, assuming the array is almost a heap with only the key of H[i] too big
  • Using Heapify-up or Heapify-down, deletion is completed in O(log n) time

(For the precise definitions of "too big" and "too small", see Kleinberg and Tardos, pp. 61-64.)
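
Here is a minimal sketch of both procedures on a 1-based array of keys (my own rendering of the idea, not the textbook's exact pseudocode; H[0] is left unused so the index formulas above apply):

<code python>
# Sketch of Heapify-up / Heapify-down on a 1-based array of keys (H[0] unused).
# Each assumes H is "almost a heap": only the key at position i may violate heap order.

def heapify_up(H, i):
    # H[i] may be too small: swap it upward until its parent is no bigger.
    while i > 1 and H[i] < H[i // 2]:
        H[i], H[i // 2] = H[i // 2], H[i]
        i //= 2

def heapify_down(H, i, n):
    # H[i] may be too big: swap it downward with its smaller child.
    while 2 * i <= n:
        j = 2 * i
        if j + 1 <= n and H[j + 1] < H[j]:
            j += 1                      # j is the smaller child
        if H[i] <= H[j]:
            break                       # heap order restored
        H[i], H[j] = H[j], H[i]
        i = j

# Example: insert key 1 by appending it and heapifying up from the last position.
H = [None, 2, 5, 3, 9, 6, 11, 4]
H.append(1)
heapify_up(H, len(H) - 1)
assert H[1] == 1                        # the new key is now the root
</code>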

Implementing Priority Queues with Heaps

Operations supported by a heap that can hold up to N elements and currently contains n ≤ N of them (a usage sketch follows the list):

  • StartHeap(N): creates an empty heap that can hold N elements. Takes O(N) time.
  • Insert(v): inserts item v into the heap. Takes O(log n) time if the heap currently has n elements.
  • FindMin(): identifies the minimum element in the heap but does not remove it. Takes O(1) time.
  • Delete(i): deletes the element at position i of the heap. Takes O(log n) time if the heap currently has n elements.
  • ExtractMin(): identifies and deletes an element with minimum key from the heap. Takes O(log n) time if the heap currently has n elements.
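
As a usage sketch, Python's standard heapq module provides a 0-based binary min-heap with exactly these costs for Insert, FindMin, and ExtractMin (it has no Delete(i), so that operation is omitted here):

<code python>
import heapq

pq = []                      # StartHeap: an empty list works as an empty heap

for key in [9, 3, 7, 1, 5]:
    heapq.heappush(pq, key)  # Insert(v): O(log n)

print(pq[0])                 # FindMin(): O(1) -> 1
print(heapq.heappop(pq))     # ExtractMin(): O(log n) -> 1

# Sorting n elements in O(n log n): insert everything, then extract repeatedly.
print([heapq.heappop(pq) for _ in range(len(pq))])   # -> [3, 5, 7, 9]
</code>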

If we want to operate on a particular node, regardless of where it is positioned in the heap, we simply maintain an additional array Position that stores the current position of each element in the heap.

In this case (a sketch of this bookkeeping follows the list),

  • To delete a node v, we use Delete(H, Position[v]). This takes O(log n) time
  • To change the key of v to w, we use ChangeKey(H, v, w). This also takes O(log n) time
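
A sketch of this bookkeeping, assuming element names are unique and hashable (my own illustration, not the textbook's code; a Python dict stands in for the Position array):

<code python>
# Sketch: a min-heap that also maintains a Position map, so Delete and
# ChangeKey can locate an element in O(1) and then fix the heap in O(log n).

class PositionHeap:
    def __init__(self):
        self.H = [None]          # 1-based array of (key, name) pairs; H[0] unused
        self.pos = {}            # name -> current index in H (the Position array)

    def _swap(self, i, j):
        self.H[i], self.H[j] = self.H[j], self.H[i]
        self.pos[self.H[i][1]] = i
        self.pos[self.H[j][1]] = j

    def _heapify_up(self, i):
        while i > 1 and self.H[i][0] < self.H[i // 2][0]:
            self._swap(i, i // 2)
            i //= 2

    def _heapify_down(self, i):
        n = len(self.H) - 1
        while 2 * i <= n:
            j = 2 * i
            if j + 1 <= n and self.H[j + 1][0] < self.H[j][0]:
                j += 1
            if self.H[i][0] <= self.H[j][0]:
                break
            self._swap(i, j)
            i = j

    def insert(self, name, key):
        self.H.append((key, name))
        self.pos[name] = len(self.H) - 1
        self._heapify_up(len(self.H) - 1)

    def delete(self, name):
        i = self.pos.pop(name)
        last = self.H.pop()
        if i < len(self.H):          # the deleted element was not the last one
            self.H[i] = last         # move the last element into the hole
            self.pos[last[1]] = i
            self._heapify_up(i)      # one of these fixes the heap;
            self._heapify_down(i)    # the other is a no-op

    def change_key(self, name, new_key):
        # ChangeKey(H, v, w): delete v and reinsert it with the new key.
        self.delete(name)
        self.insert(name, new_key)
</code>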

I definitely need to remember the positional relationships between a parent and its children in a heap. I also need to remember the two procedures, namely Heapify-up and Heapify-down.

This section was a very important and interesting reading, but I give it an 8/10 since it required additional attention compared to the other sections.
