** The Problem **
\\
--> Let's consider a set U of n objects {p<sub>1</sub>,...,p<sub>n</sub>}\\
--> For each pair of objects p<sub>i</sub> and p<sub>j</sub>, we have a numerical distance d(p<sub>i</sub>,p<sub>j</sub>)\\
--> For this distance:\\
--> d(p<sub>i</sub>,p<sub>i</sub>) = 0; d(p<sub>i</sub>,p<sub>j</sub>) > 0 for distinct p<sub>i</sub> and p<sub>j</sub>; and distances are symmetric: d(p<sub>i</sub>,p<sub>j</sub>) = d(p<sub>j</sub>,p<sub>i</sub>)\\
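\\
As a concrete illustration of this setup (my own hypothetical example, not from the book), the small Python sketch below stores such a distance function as a symmetric matrix over four made-up objects and checks the three properties just listed.\\
<code python>
# Hypothetical example: a distance function over a small set U,
# stored as a symmetric matrix indexed by object position.
U = ["p1", "p2", "p3", "p4"]
d = [
    [0.0, 2.0, 5.0, 9.0],
    [2.0, 0.0, 4.0, 7.0],
    [5.0, 4.0, 0.0, 3.0],
    [9.0, 7.0, 3.0, 0.0],
]

n = len(U)
for i in range(n):
    assert d[i][i] == 0.0              # d(p_i, p_i) = 0
    for j in range(n):
        if i != j:
            assert d[i][j] > 0.0       # distinct objects are at positive distance
        assert d[i][j] == d[j][i]      # symmetry: d(p_i, p_j) = d(p_j, p_i)
</code>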
--> Each time we add an edge that spans two distinct components, it's like merging the two corresponding clusters: this iterative merging of clusters is called **single-link clustering**\\
--> Single-link clustering is a special case of //hierarchical agglomerative clustering//\\
\\
--> This algorithm is precisely an application of Kruskal's Minimum Spanning Tree algorithm\\
--> Only difference: we stop once we obtain k connected components. So we're running Kruskal's algorithm but stopping it just before it adds its last k-1 edges\\
\\
--> So it's basically equivalent to taking the full Minimum Spanning Tree T produced by Kruskal's algorithm and deleting the k-1 most expensive edges, as the sketch below illustrates\\
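\\
To make this concrete, here is a minimal Python sketch of the idea (my own illustration, not code from the textbook): it runs Kruskal-style merging with a simple union-find structure over a distance matrix ''d'' like the one above, and stops as soon as only k components remain.\\
<code python>
def single_link_clusters(d, k):
    """Group n objects (indexed 0..n-1) into k clusters by Kruskal-style merging:
    repeatedly add the cheapest edge between two different components and stop
    once only k components remain."""
    n = len(d)
    parent = list(range(n))              # union-find structure over the objects

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving keeps trees shallow
            x = parent[x]
        return x

    # All pairwise edges, sorted by increasing distance (as in Kruskal's algorithm).
    edges = sorted((d[i][j], i, j) for i in range(n) for j in range(i + 1, n))

    components = n
    for dist, i, j in edges:
        if components == k:              # stop before adding the last k-1 MST edges
            break
        ri, rj = find(i), find(j)
        if ri != rj:                     # edge spans two distinct components: merge clusters
            parent[ri] = rj
            components -= 1

    # Collect the resulting clusters.
    clusters = {}
    for x in range(n):
        clusters.setdefault(find(x), []).append(x)
    return list(clusters.values())

# Example with the hypothetical 4-point distance matrix above and k = 2:
# single_link_clusters(d, 2) returns the clusters {p1, p2} and {p3, p4}.
</code>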
\\
** Analyzing the Algorithm **
\\
\\
--> The components C<sub>1</sub>, C<sub>2</sub>,...,C<sub>k</sub> formed by deleting the k-1 most expensive edges of the Minimum Spanning Tree T constitute a k-clustering of maximum spacing\\
--> The idea behind the claim is that the spacing of any other k-clustering can be no larger than that of the clustering found by the single-link algorithm: any other partition into k groups must separate two points that lie in the same component C<sub>i</sub>, and the path between them inside C<sub>i</sub> uses only edges of length at most the single-link spacing, so one of those edges crosses between two groups of the other clustering.\\
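\\
To experiment with this claim on tiny instances, the short sketch below (again my own illustration, reusing the hypothetical ''d'' and ''single_link_clusters'' from above) computes the spacing of a clustering, i.e. the minimum distance between objects placed in different clusters.\\
<code python>
from itertools import combinations

def spacing(d, clusters):
    """Spacing of a clustering: the minimum distance between any two objects
    that lie in *different* clusters."""
    best = float("inf")
    for a, b in combinations(range(len(clusters)), 2):
        for i in clusters[a]:
            for j in clusters[b]:
                best = min(best, d[i][j])
    return best

# On the hypothetical 4-point example, spacing(d, single_link_clusters(d, 2))
# returns 4.0; in that toy instance no other 2-clustering achieves a larger spacing.
</code>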
\\
This section was interesting in that it presented a very important idea in as concise a way as it can get. I give it an 8/10.