Information theory measures
Although we strongly recommend using the Wallace coefficients to compare partitions, several other coefficients have also been used. Here we present alternatives based on information theory. All the coefficients presented can be calculated with the Online Tool, but they are not calculated by default.

Mutual Information and Normalized Mutual Information
The mutual information between partitions A and B is

I(A,B) = \sum_{i} \sum_{j} p_{ij} \log \frac{p_{ij}}{p_i \, p_j}

where p_{ij} is the proportion of elements assigned simultaneously to cluster i of A and cluster j of B, and p_i and p_j are the proportions of elements assigned to cluster i of A and to cluster j of B, respectively.
A normalized mutual information has also been proposed as a measure of similarity between partitions A and B; a commonly used form normalizes by the average of the two entropies:

NMI(A,B) = \frac{2 \, I(A,B)}{H(A) + H(B)}

which ranges from 0 (independent partitions) to 1 (identical partitions).
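As an illustration, the mutual information and a normalized version can be computed directly from two label assignments. This is a minimal sketch, not the implementation used by the Online Tool, and the function names are ours:

```python
from collections import Counter
from math import log

def entropy(labels):
    """Shannon entropy (in nats) of a partition given as a list of cluster labels."""
    n = len(labels)
    return -sum((c / n) * log(c / n) for c in Counter(labels).values())

def mutual_information(a, b):
    """I(A,B) = sum_ij p_ij * log(p_ij / (p_i * p_j)), from the joint label counts."""
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    return sum(
        (nij / n) * log((nij / n) / ((pa[i] / n) * (pb[j] / n)))
        for (i, j), nij in pab.items()
    )

def normalized_mutual_information(a, b):
    """NMI(A,B) = 2*I(A,B) / (H(A) + H(B)), in [0, 1]."""
    return 2 * mutual_information(a, b) / (entropy(a) + entropy(b))
```

For two identical partitions the mutual information equals the entropy of either one, so the normalized value is 1; for independent partitions it tends to 0.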

Variation of Information
This coefficient establishes how much information is contained in each of the clusterings and how much information one clustering gives about the other (Meila, 2007). It is defined as

VI(A,B) = H(A) + H(B) - 2 \, I(A,B)

where H is the entropy (see Shannon's Index of Diversity). The previous expression can be rewritten in terms of conditional entropies as:

VI(A,B) = H(A|B) + H(B|A)
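A convenient way to compute this coefficient from two label assignments uses the identity I(A,B) = H(A) + H(B) - H(A,B), so that VI reduces to 2*H(A,B) - H(A) - H(B). A minimal sketch (the function name is ours, not part of the Online Tool):

```python
from collections import Counter
from math import log

def variation_of_information(a, b):
    """VI(A,B) = H(A) + H(B) - 2*I(A,B), computed via the joint entropy H(A,B)."""
    n = len(a)
    def entropy(labels):
        return -sum((c / n) * log(c / n) for c in Counter(labels).values())
    # Since I(A,B) = H(A) + H(B) - H(A,B), VI reduces to 2*H(A,B) - H(A) - H(B).
    return 2 * entropy(list(zip(a, b))) - entropy(a) - entropy(b)
```

VI is 0 exactly when the two partitions coincide up to a relabelling of the clusters, and it grows as the partitions diverge.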