Talk:Kruskal's algorithm

Confusion on algorithm

Someone should explain all the variable names in it, particularly what "u" refers to. — Preceding unsigned comment added by 141.219.232.115 (talk) 00:51, 24 October 2013 (UTC)Reply

Untitled

Many problems with this proof. Confusion between vertices and edges, etc., etc.. --zero 05:56, 13 Oct 2003 (UTC)

There was actually also a problem with the running time. Kruskal's can run in O(|E| log |V|) where E is the set of edges and V the set of vertices. It should be obvious that |E| >= |V| in any graph that connects all vertices (though not necessarily fully connected).

Hell, using a better UNION-FIND paradigm, Kruskal's algorithm will run in expected time O(|E| α(|V|)), where α is the inverse Ackermann function. (If memory serves right. It's certainly better than O(|E| log |V|)). Grendelkhan 08:46, 2004 Apr 20 (UTC)

A simple oversight I'm sure, but it's certainly not obvious that |E| >= |V| in any connected graph because that is false. This fails precisely on trees, where |E| = |V| - 1. 129.128.211.134 (talk) 21:17, 25 November 2011 (UTC)Reply

Merge the min spanning tree algorithms into one article

The three pages Kruskal's algorithm, Boruvka's algorithm and Prim's algorithm should be merged into one article (possibly named minimum weight spanning tree algorithm), because they are all very similar greedy algorithms (the underlying concept is the same; they differ, if at all, only in their use of data structures), which were discovered independently. Also, there are some other confusions. Jarnik, mentioned in Prim's algorithm, cooperated on this problem with Boruvka, so they should be mentioned on the same page. Also, originally, their definition of "graph" (for this problem) was a set of points in Euclidean space, the weights were the distances among the points, and the tree was called a skeleton (the lightest possible construction needed to rigidly connect the points), so, due to this tradition, the problem is called the "problem of the minimum skeleton" in Czech. So, if no one objects, I'll try to merge them. Samohyl Jan 17:29, 11 Jan 2005 (UTC)

I don't think they need to be merged. Each is already mentioned in the Minimum spanning tree article and they seem different enough to have their own articles. Cockroachbill
It may be worthwhile merging Kruskal's with Prim's, as these are pretty similar, but Boruvka's is (I think) different. After all, textbooks often treat them simultaneously. Even for those two, I could go either way. The "minimum skeleton" you refer to is now called the Euclidean minimum spanning tree, which has its own article, and in fact has considerably more efficient specialized algorithms than any of these three. Consider adding a redirect. Deco 18:33, 1 December 2005 (UTC)Reply
I, too, see no reason for merging. grendel|khan 19:15, 1 December 2005 (UTC)Reply
Agreed, I see no reason for merging. As a student, the separate articles have helped me understand the differences in a way one article would never have. 129.59.157.168 (talk) 06:11, 10 February 2009 (UTC)Reply

caption on figure

The caption should not describe the figure as a complete network - in a complete network, all edges are present. The preceding unsigned comment was added by 69.159.78.115 (talk • contribs) 21:15, 4 February 2006 (UTC)

I've gone ahead and fixed it up. It said "complete network", which might be someone's jargon for "a graph in which all vertices form a single well-connected component". A quick Google search suggested it is not. In the future, be bold and fix things as you see them! --Mgreenbe 19:31, 4 February 2006 (UTC)Reply

Minimum Spanning Forest

A minimum spanning forest is a subset of edges that "covers" all vertices, not a set of minimum spanning trees of each connected component. The preceding unsigned comment was added by 201.129.158.57 (talk • contribs) 23:13, 12 February 2006 (UTC)

You are fundamentally correct, but watch: minimal such subsets of edges in a connected graph induce trees. In a disconnected graph, each component treated individually is spanned by a tree; the union of these is a forest. Thus a minimal spanning forest is a forest of minimal spanning trees. --Mgreenbe 21:47, 12 February 2006 (UTC)Reply

What is the difference between Kruskal's and Borůvka's algorithms?

What is the difference between Kruskal's and Borůvka's algorithms? Some study material I have describes some algorithm which is labeled as Borůvka/Kruskal. And I see no difference between the two. Can someone point to the difference?--Alvin-cs 13:20, 8 September 2007 (UTC)Reply

Proof of correctness

Maybe I'm just stupid, but I found the proof unclear. --Weedrat (talk) 14:21, 10 March 2008 (UTC)Reply

You're not - we'd appreciate specific feedback on what parts confused you. Dcoetzee 20:03, 10 March 2008 (UTC)Reply
I also found the argument (edit: that is, the argument before Mikkalai fixed it) confusing. I think I've managed to recast the argument in a way which makes its key ideas more explicit. Specifically,
  1. subforests of a graph form a matroid (the graphic matroid), so we have the exchange property, and
  2. by appropriately changing the edge order, one can produce any desired spanning tree (although the spanning tree is only minimal if edges with smaller weights appear before edges with larger weights).
This suggests that we can move one step at a time from a spanning tree produced by the algorithm to a given minimum spanning tree while maintaining the weight at each step. Here is my attempted revision:
Let T be the spanning tree produced by the algorithm for a given edge order, and let M be a minimum spanning tree. Assume that T contains k > 0 edges which do not occur in M, and let e be the smallest such edge. By the exchange property, there is an edge f which occurs in M but not in T such that T \ e ∪ f is a spanning tree. Since M is a minimum spanning tree, the weight of f is less than or equal to the weight of e. Let F be the forest constructed by the algorithm just prior to encountering the edge e. If f has already been encountered and rejected, then F ∪ f contains a cycle, contradicting the claim that T \ e ∪ f is a tree. Hence e is encountered before f, implying that the weight of e is less than or equal to the weight of f. Consequently, the spanning tree T \ e ∪ f has the same weight as T, but contains only k − 1 edges which do not occur in M. The algorithm can be coaxed to produce the spanning tree T \ e ∪ f by swapping the positions of e and f in the edge order. Repeatedly applying this procedure eventually produces M. Hence T and M have the same weight. Michael Slone (talk) 06:04, 11 March 2008 (UTC)Reply
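(For anyone skimming the argument above, the swap step boils down to two inequalities, both argued in that paragraph:
  1. w(f) ≤ w(e), since M is a minimum spanning tree;
  2. w(e) ≤ w(f), since the algorithm encounters e before f and edges are considered in order of nondecreasing weight;
so w(e) = w(f), and therefore w(T \ e ∪ f) = w(T).)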

I made some changes and corrections to the proof - it would be good if someone else could check these. The sentence in parentheses used to say "at least one vertex of e is not in the set of vertices incident on the edges of F." But I don't think this is necessarily the case, so I wrote a different reason why C has to contain another edge (as well as e) that is not in F. Abc45624 (talk) 11:39, 4 March 2011 (UTC)Reply

More data on the union/find operations

I have now had three teachers use the union-find approach to both understanding and analyzing Kruskal's algorithm. Could we possibly get a section on this? 129.59.157.168 (talk) 06:14, 10 February 2009 (UTC)Reply
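Until such a section exists, here is a minimal sketch (in Python; function and variable names are purely illustrative, not taken from the article) of how a disjoint-set / union-find structure is typically combined with Kruskal's algorithm:

def find(parent, x):
  # follow parent pointers to the set representative, halving the path as we go
  while parent[x] != x:
    parent[x] = parent[parent[x]]  # path halving (a simple form of path compression)
    x = parent[x]
  return x

def kruskal_mst(nodes, edges):
  # nodes: iterable of vertex names; edges: iterable of (weight, u, v) tuples
  parent = {v: v for v in nodes}
  rank = {v: 0 for v in nodes}
  mst = []
  for w, u, v in sorted(edges):  # consider edges in order of nondecreasing weight
    ru, rv = find(parent, u), find(parent, v)
    if ru == rv:
      continue  # u and v are already connected; this edge would close a cycle
    if rank[ru] < rank[rv]:  # union by rank: attach the shallower tree to the deeper one
      ru, rv = rv, ru
    parent[rv] = ru
    if rank[ru] == rank[rv]:
      rank[ru] += 1
    mst.append((u, v, w))
  return mst

For example, kruskal_mst(['a', 'b', 'c'], [(1, 'a', 'b'), (2, 'b', 'c'), (3, 'a', 'c')]) returns [('a', 'b', 1), ('b', 'c', 2)]. With path compression and union by rank the union-find operations take near-constant amortized time (the inverse-Ackermann bound mentioned above), so the initial sort dominates the overall O(|E| log |V|) running time.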

Pseudocode

The Kruskal pseudocode could quite easily be called obscurocode. After all, if it were so clear, why all the lines of comments? My feeling is that a more useful text would be pure Python (by pure, I mean no third-party libraries needed). For example, the following public-domain search (by Guido) has no comments and should be clear to most readers.

def find_path(graph, start, end, path=[]):
  path = path + [start]
  if start == end:
    return path
  if start not in graph:
    return None
  for node in graph[start]:
    if node not in path:
      newpath = find_path(graph, node, end, path)
      if newpath: return newpath
  return None


If I could figure out the use of the cluster and which node it should be assigned to after merging the clusters for each node, I would try to put it into Python.

As an alternative, this would be my suggestion, using the above search.

def Kruskal(graph):
  # graph maps an edge name to a (node1, node2, weight) tuple
  tree = {}  # edges selected so far, in the same format as graph
  adj = {}   # adjacency dict of the growing forest, in the form find_path expects
  nodes = set()
  for v in graph.values():
    nodes.update(v[:2])
  n = len(nodes)
  edges = sorted(graph, key=lambda e: graph[e][2])  # edge names, lightest first
  for e in edges:
    a, b = graph[e][0], graph[e][1]
    # the edge closes a cycle iff its endpoints are already connected in the forest
    if not find_path(adj, a, b):
      tree[e] = graph[e]
      adj.setdefault(a, []).append(b)
      adj.setdefault(b, []).append(a)
      if len(tree) > n - 2:
        break  # a spanning tree on n nodes has n - 1 edges
  return tree

41.246.48.90 (talk) 10:31, 11 April 2009 (UTC)Reply
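For concreteness, here is how the edge-dict format assumed in the suggestion above might be exercised (edge names, node names and weights are purely illustrative):

example = {
  'ab': ('a', 'b', 1),
  'cd': ('c', 'd', 1),
  'bc': ('b', 'c', 2),
  'ac': ('a', 'c', 3),
}
print(Kruskal(example))  # keeps 'ab', 'cd' and 'bc' (total weight 4); 'ac' would close a cycle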

Negative edge weights?

Why is there no discussion on negative edge weights? Does the algorithm still work for negative edge weights? 129.67.128.29 (talk) 23:51, 1 June 2009 (UTC)Reply

It does work for negative edge weights. Note that every spanning tree has the same number of edges (|V| − 1), so if we increase the weights of all edges by the same amount Δ, the weights of all spanning trees increase by Δ(|V| − 1). If we increase the weights so that all edges have non-negative weight, we can conclude from the above that the minimum spanning tree found by Kruskal's in the modified graph corresponds to the minimum spanning tree in the original.
In fact there is no need to actually modify the weights, since Kruskal's will perform the same operations in either case (verify this yourself). 129.128.211.134 (talk) 21:26, 25 November 2011 (UTC)Reply
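To make the last point concrete: adding the same constant to every weight does not change the sorted order of the edges, so Kruskal's inspects and accepts exactly the same edges. A tiny illustrative check (edge names, weights and the shift are made up):

edges = [('a', 'b', -4), ('b', 'c', 2), ('a', 'c', -1), ('c', 'd', 0)]
delta = 10  # any constant large enough to make all weights non-negative
by_weight = sorted(edges, key=lambda e: e[2])
by_shifted_weight = sorted(edges, key=lambda e: e[2] + delta)
assert by_weight == by_shifted_weight  # same processing order, so the same choices are made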

Animation

I've made an animation to show the process of finding a minimal spanning tree. Do you have any comments on the animation? Are there any objections to including it in the article?

 
Visualization of Kruskal's algorithm

Schulllz (talk) 15:18, 30 January 2014 (UTC)Reply