

Mike's Journal

Preface:

Algorithms are used all over the place in tons of applications, but most of the time there is no direct proof that one is better than another, and comparisons become muddled by the specifics of each individual application. Designing a great algorithm is an elegant dance of logic, creativity, and language. Learning how to create good ones is an art.

Chapter 1

The Matching Problem

The goal is to make an efficient algorithm for matching two groups together. The simplest form of this problem, and thus the easiest to explain, is where each item matches with exactly one item and there are an identical number of items on each side. For this example the book uses men and women getting engaged. The book presents Gale and Shapley's definitions of stable and unstable matchings: a matching is unstable if it contains two people, not paired with each other, who would each rather be with the other than with their current partners. There is then an algorithm presented to match men with women, followed by several fairly straightforward proofs about the algorithm.
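A minimal Python sketch of the Gale-Shapley proposal algorithm the book presents (the data layout, with preference lists keyed by name, is my own choice):

```python
def stable_matching(men_prefs, women_prefs):
    """Gale-Shapley: men propose in preference order, women
    tentatively accept and trade up when a better proposal arrives.

    men_prefs / women_prefs map each person to a list of the other
    group's members, most preferred first.
    """
    # rank[w][m] = position of m in w's list (lower is better)
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    free_men = list(men_prefs)                 # men not yet engaged
    next_proposal = {m: 0 for m in men_prefs}  # next woman m will propose to
    engaged_to = {}                            # woman -> man

    while free_men:
        m = free_men.pop()
        w = men_prefs[m][next_proposal[m]]
        next_proposal[m] += 1
        if w not in engaged_to:
            engaged_to[w] = m                  # w was free: accept
        elif rank[w][m] < rank[w][engaged_to[w]]:
            free_men.append(engaged_to[w])     # w trades up; old partner freed
            engaged_to[w] = m
        else:
            free_men.append(m)                 # w rejects m; he stays free
    return {m: w for w, m in engaged_to.items()}
```

Every man proposes to each woman at most once, so the loop runs at most n² times, which matches the bound the book proves.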

Five Representative Problems

Graphs are good for representing pair-wise relationships. The book then introduces five separate problems to give some experience with algorithms.

Interval Scheduling

Problem: how do you schedule individual use of a popular resource, given requests that each occupy an interval of time? This problem is pretty simple to solve.
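The standard simple solution is a greedy rule: sort requests by finish time and take each one that doesn't overlap what's already chosen. A sketch (the interval representation is my own):

```python
def max_nonoverlapping(intervals):
    """Earliest-finish-time greedy for interval scheduling.

    intervals: list of (start, finish) pairs. Returns a largest
    set of mutually compatible intervals.
    """
    chosen, last_finish = [], float('-inf')
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:   # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen
```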

Weighted Interval Scheduling

Problem: how do you schedule a resource which is in demand, given that each request has a value associated with it, so as to maximize the total return? This is not as easy to solve as the non-weighted problem, but you can use dynamic programming.
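The dynamic-programming idea can be sketched like this: for each job (in finish-time order), the optimum either skips it or takes it plus the optimum over jobs that end before it starts. The job format below is my own choice:

```python
import bisect

def max_weight_schedule(jobs):
    """Weighted interval scheduling via dynamic programming.

    jobs: list of (start, finish, weight) tuples. Returns the
    maximum total weight of a set of mutually compatible jobs.
    """
    jobs = sorted(jobs, key=lambda j: j[1])        # order by finish time
    finishes = [f for _, f, _ in jobs]
    best = [0] * (len(jobs) + 1)                   # best[i] = optimum over first i jobs
    for i, (s, f, w) in enumerate(jobs, 1):
        # p = number of earlier jobs finishing by time s (binary search)
        p = bisect.bisect_right(finishes, s, 0, i - 1)
        best[i] = max(best[i - 1],                 # skip job i
                      best[p] + w)                 # or take it
    return best[-1]
```

With the binary search, the whole thing runs in O(n log n).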

Bipartite Matching

Problem: when the matching problem is posed on a bipartite graph, where each node may have several acceptable partners, it gets more difficult. You build up your matching and then backtrack, rerouting earlier choices to make room for new ones. This process is called augmentation, and it is also applicable in network flow problems.
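The backtracking idea can be sketched as a simple augmenting-path search (a classic DFS-based matching routine, not the book's exact presentation; the adjacency format is my own):

```python
def max_bipartite_matching(adj):
    """Bipartite matching by repeated augmenting-path search.

    adj maps each left-side vertex to the right-side vertices it can
    be matched with. Returns (matching size, right -> left matches).
    """
    matched_right = {}

    def try_augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            # take v if it's free, or if its current partner can be
            # re-matched elsewhere -- this is the backtracking step
            if v not in matched_right or try_augment(matched_right[v], seen):
                matched_right[v] = u
                return True
        return False

    size = sum(try_augment(u, set()) for u in adj)
    return size, matched_right
```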

Independent Set

Problem: given a graph, find the largest subset of nodes such that no two of them are joined by an edge. Finding a maximum independent set and testing whether a given set is independent are very different tasks, with the first being very difficult and the second rather simple.
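The "testing is easy" half can be seen in a couple of lines (edge-list format is my own choice):

```python
def is_independent_set(edges, nodes):
    """Check that no edge has both endpoints in `nodes`.

    This runs in time linear in the number of edges -- in sharp
    contrast to *finding* a maximum independent set.
    """
    nodes = set(nodes)
    return not any(u in nodes and v in nodes for u, v in edges)
```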

Competitive Facility Location

Problem: two players compete to own a certain sum of weighted nodes, with rules about which claimed nodes can be adjacent, and each picks nodes in alternation with the other. What is the best possible way to play the game so as to end with the largest sum?
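A brute-force game-tree search gives a feel for why this is hard: the answer depends on every sequence of alternating moves. The sketch below uses my own simplified reading of the rules (players alternate claiming nodes, and any node adjacent to a claimed node is blocked), not the book's exact formulation:

```python
def player1_value(weights, adj, taken=frozenset(), p1_turn=True):
    """Exhaustive minimax for a toy competitive facility location game.

    weights: node -> weight; adj: node -> set of neighbors.
    Returns the total weight player 1 can guarantee with best play.
    """
    blocked = set(taken) | {v for u in taken for v in adj[u]}
    moves = [u for u in weights if u not in blocked]
    if not moves:
        return 0
    if p1_turn:  # player 1 maximizes their own haul
        return max(weights[u] + player1_value(weights, adj, taken | {u}, False)
                   for u in moves)
    # player 2 plays to minimize what player 1 can still collect
    return min(player1_value(weights, adj, taken | {u}, True)
               for u in moves)
```

The search tree is exponential in the number of nodes, which matches the book's point that even *verifying* a claimed strategy is hard for games like this.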

Chapter 2

Basics of Algorithm Analysis

Computational Tractability

Defining efficiency is difficult. You can talk about speed, the finite nature of operations, and the memory used by the data structures. The book first simply says something is efficient if it works quickly on real-world datasets. Every algorithm has a worst-case scenario in which a maximum number of instructions is needed to accomplish its goal. Since you don't know what your data is going to be, you want the worst case of your algorithm to be better than the worst case of a brute-force approach to the same problem. Exponential running times grow staggeringly fast, so try to avoid them as best you can…

Asymptotic Order of Growth

When discussing algorithm complexity, what matters most is how the number of instructions grows relative to the amount of data/information being handled. This is represented by functions. Big-O notation gives an asymptotic upper bound: f(n) is O(g(n)) if there exist constants c > 0 and n₀ ≥ 0 such that f(n) ≤ c·g(n) for all n ≥ n₀. Omega (Ω) notation gives an asymptotic lower bound in the same way. If f(n) is both O(g(n)) and Ω(g(n)), then we say that g(n) is the asymptotically tight bound for f(n), written f(n) = Θ(g(n)); a tight bound means the upper and lower bounds grow at the same rate. All one needs in order to understand asymptotic orders and compare them is a basic understanding of how different functions grow, and at which rates they grow.
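As a concrete check of the definitions, here is a tiny Python illustration that f(n) = 3n² + 10n is Θ(n²); the constants are chosen by hand, and the range check is an illustration of the definition, not a proof:

```python
# f(n) = 3n^2 + 10n is Theta(n^2): witness constants for both bounds.
f = lambda n: 3 * n**2 + 10 * n
g = lambda n: n**2

c_upper, n0 = 4, 10      # upper bound: f(n) <= 4*g(n) once n >= 10
assert all(f(n) <= c_upper * g(n) for n in range(n0, 5000))

c_lower = 3              # lower bound: f(n) >= 3*g(n) for every n >= 0
assert all(f(n) >= c_lower * g(n) for n in range(0, 5000))
```

Note the lower-order term 10n and the constant factor 3 both disappear into the choice of c and n₀, which is exactly why asymptotic notation ignores them.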

Chapter 4

Section 7: Clustering

The Problem

Maximum Spacing

Design of the Algorithm

Analyzing the Algorithm

Section 8: Huffman Codes and Data Compression

The Problem

Variable-Length Encoding Schemes
Prefix Codes
Optimal Prefix Codes

Design

Prefix Codes as Binary Trees
Top-Down Approach

Analyzing

Chapter 5

Section 1: Mergesort

Approaches

Unrolling the Mergesort Recurrence
Substituting a Solution into the Mergesort Recurrence
Partial Substitution
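The recurrence these subsections unroll, T(n) ≤ 2T(n/2) + cn, comes from the algorithm itself: two half-size recursive calls plus a linear merge. A minimal sketch of that structure (my own code, not the book's pseudocode):

```python
def mergesort(a):
    """Classic divide and conquer: two recursive halves plus an
    O(n) merge, giving T(n) = 2*T(n/2) + O(n) = O(n log n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = mergesort(a[:mid]), mergesort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # the O(n) merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```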
courses/cs211/winter2012/journals/mike/home.1330577176.txt.gz · Last modified: 2012/03/01 04:46 by whitem12
CC Attribution-Noncommercial-Share Alike 4.0 International