===== 3.2: Graph Connectivity and Graph Traversal =====

The second method for determining connectivity is the Depth-First Search (DFS). The way I am able to best distinguish BFS from DFS is to think of DFS as the way to solve a maze: you start at s and take the first edge leading out, continuing until a dead end is reached. In general, DFS explores G by going as deep as possible and only retracing when necessary. Because of this behavior, DFS must maintain global knowledge of which nodes have already been explored.

===== 3.3: Implementing Graph Traversal Using Queues and Stacks =====
This section addresses the methodology of using lists and arrays to represent graphs. There are two basic ways to represent a graph: an adjacency matrix and an adjacency list. Every graph G = (V,E) has two natural size parameters: the number of nodes |V| and the number of edges |E|. We write n = |V| and m = |E| for simplicity. The number of edges can be at most n^2, and a connected graph has at least m >= n-1 edges. The search algorithms run in O(m+n), which is linear time. For connected graphs, O(m+n) = O(m) since m >= n-1.

An adjacency matrix is an n x n matrix A where A[u,v] is 1 if the graph contains the edge (u,v). While this allows us to check whether a given edge is present in constant time, there are two major setbacks. First, the adjacency matrix takes Θ(n^2) space, even when the graph has far fewer edges. Second, examining all the edges incident to a node means scanning an entire row of the matrix, which takes Θ(n) time.

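As a quick sketch of this trade-off (the small example graph is my own, not from the book): looking up one entry is constant time, but listing a node's neighbors means walking a whole row.

```python
# Sketch: adjacency matrix for a small undirected graph (example is made up).
n = 4
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

# Theta(n^2) space, no matter how few edges the graph has.
A = [[0] * n for _ in range(n)]
for u, v in edges:
    A[u][v] = 1
    A[v][u] = 1  # undirected: record the edge in both directions

# Checking one edge is O(1)...
print(A[0][1])  # 1

# ...but finding all neighbors of a node scans a full row: Theta(n).
neighbors_of_2 = [v for v in range(n) if A[2][v] == 1]
print(neighbors_of_2)  # [0, 1, 3]
```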
An adjacency list presents a better option for representing graphs. The adjacency list keeps a record for each node v, containing a list of the nodes to which v has edges. In an undirected graph each edge appears on two adjacency lists, once for each endpoint. This representation requires only O(n+m) space, since the total length of all the lists is 2m = O(m).

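A small sketch of the adjacency-list representation (the example edges are made up for illustration); note how each undirected edge lands on exactly two lists, so the lists total 2m entries.

```python
# Sketch: adjacency list for a small undirected graph (example is made up).
n = 4
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
m = len(edges)

adj = [[] for _ in range(n)]
for u, v in edges:
    adj[u].append(v)  # each undirected edge appears on two lists...
    adj[v].append(u)  # ...once for each endpoint

# Total length of all lists is 2m, so space is O(n + m).
total_entries = sum(len(lst) for lst in adj)
print(total_entries == 2 * m)  # True
print(adj[2])                  # [0, 1, 3]
```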
There are also two data structures we can use to help implement graph traversal over an adjacency list. The first is a queue, which releases items in first-in, first-out (FIFO) order. The other is a stack, which releases items in last-in, first-out (LIFO) order. BFS is more efficient to manage with a queue, while DFS is more efficient with a stack; this reflects the distinction between discovering nodes in the order they are first seen (BFS) and exploring a node by pushing deeper before returning (DFS).

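A sketch of both traversals over a made-up adjacency list, with BFS driven by a queue and DFS by a stack; the two orders differ on the same graph, which is the discovery-versus-exploration distinction above.

```python
from collections import deque

# Example graph (made up): a path-like undirected graph as an adjacency list.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

def bfs(s):
    order, discovered = [], {s}
    q = deque([s])
    while q:
        u = q.popleft()          # FIFO: process the oldest discovered node
        order.append(u)
        for v in adj[u]:
            if v not in discovered:
                discovered.add(v)
                q.append(v)
    return order

def dfs(s):
    order, explored = [], set()
    stack = [s]
    while stack:
        u = stack.pop()          # LIFO: go as deep as possible first
        if u in explored:
            continue
        explored.add(u)
        order.append(u)
        for v in adj[u]:
            if v not in explored:
                stack.append(v)
    return order

print(bfs(0))  # [0, 1, 2, 3]
print(dfs(0))  # [0, 2, 3, 1]
```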
Overall this section helped to clarify the use of adjacency lists and was also about a 7 on readability.

===== 3.4: Testing Bipartiteness: An Application of Breadth-First Search =====
A bipartite graph is one where the set of nodes can be partitioned into sets X and Y in such a way that every edge has one of its ends in X and the other in Y. As we discussed in class, it helps to imagine that the nodes in set X are red and the nodes in set Y are blue. This allows us to picture a bipartite graph as one with red and blue nodes where every edge has one red end and one blue end.

In this chapter we look for natural examples of non-bipartite graphs, where no such partition of V (the set of nodes) is possible. It is easy to see that a triangle isn't bipartite, and neither is any cycle of odd length: every odd-numbered node would be colored red and every even-numbered node blue, but since the cycle starts and ends with an odd-numbered node, some edge would end up with two red ends.

We can test for bipartiteness by doing the following: first, assume that the graph G is connected. Then pick any node s ∈ V and color it red. All of its neighbors are colored blue, then all of the uncolored neighbors of those nodes are colored red, and so on until the entire graph has been colored. This allows us to check whether or not we have a bipartite graph: G is bipartite exactly when no edge ends up with both ends the same color. This description is the same as the process for BFS, and just like BFS, the coloring algorithm runs in O(n+m).

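The coloring procedure above can be sketched as a BFS that alternates colors layer by layer; the two example graphs (an even cycle and a triangle) are my own.

```python
from collections import deque

# Sketch of the BFS 2-coloring test for bipartiteness.
def is_bipartite(adj, s=0):
    color = {s: "red"}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in color:
                # Neighbors always get the opposite color.
                color[v] = "blue" if color[u] == "red" else "red"
                q.append(v)
            elif color[v] == color[u]:
                return False  # an edge with both ends the same color
    return True

square = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}  # even cycle
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}           # odd cycle
print(is_bipartite(square))    # True
print(is_bipartite(triangle))  # False
```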
===== 3.5: Connectivity in Directed Graphs =====
This section tackles directed graphs. To represent a directed graph, we still use an adjacency list, but instead of each node having a single list of neighbors, each node has two lists: one for the nodes to which it has edges, and one for the nodes from which it has edges. BFS and DFS are very similar to their undirected versions. To use BFS as an example: first we start at a node s. Then we define a layer of nodes to consist of all those to which s has an edge, and a second layer for the additional nodes to which those first-layer nodes have edges. In this way nodes are discovered layer by layer in an outward search from s, and layer j consists of the nodes whose shortest path from s has exactly j edges. This results in a running time of O(m+n). DFS also still runs in linear time and computes the same set of reachable nodes; it works the same as before except that it can only travel in the direction of the edges.

A directed graph is strongly connected if, for every two nodes u and v, there is a path from u to v and a path from v to u; in that case u and v are said to be mutually reachable. Several properties follow from this. First, if u and v are mutually reachable, and v and w are mutually reachable, then u and w are mutually reachable. We can check in linear time whether a directed graph is strongly connected. We define the strong component containing a node s in a directed graph to be the set of all v such that s and v are mutually reachable. Furthermore, for any two nodes s and t in a directed graph, their strong components are either identical or disjoint.

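One way to sketch the linear-time strong-connectivity check: run BFS from some node s in G and again in the reversed graph; G is strongly connected exactly when both searches reach every node. The example graphs are made up.

```python
from collections import deque

# Sketch: strong connectivity via one search in G and one in reversed G.
def reachable(adj, s):
    seen, q = {s}, deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                q.append(v)
    return seen

def strongly_connected(adj):
    nodes = list(adj)
    rev = {u: [] for u in nodes}    # build the reversed graph
    for u in nodes:
        for v in adj[u]:
            rev[v].append(u)
    s = nodes[0]
    # Everything reachable from s, in both directions, means one strong component.
    return len(reachable(adj, s)) == len(nodes) == len(reachable(rev, s))

cycle = {0: [1], 1: [2], 2: [0]}  # strongly connected
chain = {0: [1], 1: [2], 2: []}   # not: no path from 2 back to 0
print(strongly_connected(cycle))  # True
print(strongly_connected(chain))  # False
```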
===== 3.6: Directed Acyclic Graphs and Topological Ordering =====
If a directed graph has no cycles, it is called a directed acyclic graph (DAG). DAGs can be used to model precedence relations or dependencies: given a set of tasks with dependencies, it is natural to look for an order in which the tasks should be performed. This calls for a topological ordering, in which all edges point forward in the ordering.

Here we look to show that every DAG has a topological ordering. To do so, we need to find a node that can go at the beginning of a topological ordering. In every DAG there must be a node with no incoming edges, and any such node can serve as the starting node. Once we remove the starting node, we find the next node with no incoming edges, and by repeating this pattern we eventually build the whole ordering. This section in the textbook really helps to clarify our discussion of this topic in class. In this algorithm, finding and deleting a node v with no incoming edges takes O(n), and the algorithm runs n times, so the total running time is O(n^2). By maintaining the set of nodes with no incoming edges as nodes are deleted, we can improve the running time to O(m+n).

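The faster O(m+n) version can be sketched by tracking in-degrees, so that "deleting" a node just decrements counters instead of rescanning the graph. The task graph here is made up for illustration.

```python
from collections import deque

# Sketch of the O(m + n) topological ordering: repeatedly remove a node
# with no incoming edges, maintaining in-degrees as edges are "deleted".
def topological_order(adj):
    indeg = {u: 0 for u in adj}
    for u in adj:
        for v in adj[u]:
            indeg[v] += 1
    q = deque(u for u in adj if indeg[u] == 0)  # nodes with no incoming edges
    order = []
    while q:
        u = q.popleft()
        order.append(u)
        for v in adj[u]:
            indeg[v] -= 1           # "delete" the edge (u, v)
            if indeg[v] == 0:
                q.append(v)         # v can now go next in the ordering
    return order  # contains fewer than n nodes exactly when G has a cycle

dag = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(topological_order(dag))  # ['a', 'b', 'c', 'd']
```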