SplitEntropy function

Entropy of two splits

Calculate the entropy, joint entropy, entropy distance and information content of two splits, treating each split as a division of n leaves into two groups. Further details are available in a vignette, and in MacKay (2003) and Meilă (2007).
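
For intuition, these quantities can be reproduced from first principles by tabulating how the two splits jointly partition the leaves. The sketch below is illustrative only and does not use the package's internals; Entropy is a hypothetical helper defined inline, not part of TreeDist's API.

Entropy <- function(counts) {
  # Shannon entropy, in bits, of a vector of counts
  p <- counts[counts > 0] / sum(counts)
  -sum(p * log2(p))
}
split1 <- c(TRUE, TRUE, TRUE, FALSE, FALSE, FALSE)
split2 <- c(TRUE, TRUE, FALSE, FALSE, FALSE, FALSE)
h1 <- Entropy(table(split1))           # entropy of split 1
h2 <- Entropy(table(split2))           # entropy of split 2
h12 <- Entropy(table(split1, split2))  # joint entropy of both splits
i <- h1 + h2 - h12                     # mutual information
hd <- h12 - i                          # entropy distance (variation of information)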

SplitEntropy(split1, split2 = split1)

Arguments

  • split1, split2: Logical vectors listing leaves in a consistent order, identifying each leaf as a member of the ingroup (TRUE) or outgroup (FALSE) of the split in question.

Returns

A numeric vector listing, in bits:

  • H1: the entropy of split 1;
  • H2: the entropy of split 2;
  • H12: the joint entropy of both splits;
  • I: the mutual information of the splits;
  • Hd: the entropy distance (variation of information) of the splits.
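
These values are linked by standard information-theoretic identities: the mutual information satisfies I = H1 + H2 - H12, and the entropy distance satisfies Hd = H12 - I.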

Examples

A <- TRUE
B <- FALSE
SplitEntropy(c(A, A, A, B, B, B), c(A, A, B, B, B, B))
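
For these inputs, the returned values should be approximately H1 = 1, H2 = 0.918, H12 = 1.459, I = 0.459 and Hd = 1 bit: split 1 divides the six leaves evenly, and the two splits differ only in the placement of a single leaf.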

References

MacKay, D. J. C. (2003). Information Theory, Inference, and Learning Algorithms. Cambridge University Press, Cambridge.

Meilă, M. (2007). Comparing clusterings: an information based distance. Journal of Multivariate Analysis, 98(5), 873-895.

See Also

Other information functions: SplitSharedInformation(), TreeInfo

Author(s)

Martin R. Smith (martin.smith@durham.ac.uk)