[R] entropy package: how to compute mutual information?

Sam Steingold sds at gnu.org
Mon Feb 13 22:23:42 CET 2012


> * Sam Steingold <fqf at tah.bet> [2012-02-13 16:14:36 -0500]:
>
> suppose I have two factor vectors:
>  x <- as.factor(c("a","b","a","c","b","c"))
>  y <- as.factor(c("b","a","a","c","c","b"))
> I can compute their entropies:
>  entropy(table(x))
> [1] 1.098612
> using
>  library(entropy)
> but it is not clear how to compute their mutual information directly.
> I can compute the joint entropy as
>   entropy(table(paste(x,y,sep="")))

this can be simplified to entropy(table(x,y)), which builds the 2-d
contingency table directly and avoids the collisions that paste(x,y,sep="")
can produce when level names concatenate ambiguously (e.g. "a"+"bc"
vs. "ab"+"c"); see the worked example below the quoted text.

> [1] 1.791759
> and then mutual information will be h(x) + h(y) - h(x,y) =
> 1.098612 + 1.098612 - 1.791759
> 0.405465
>
> but I was wondering whether there was a better way (without creating a
> fresh factor vector and a fresh factor class, both of which are
> immediately discarded).
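
Putting the pieces above together (same numbers as in the quote):

  library(entropy)
  x <- as.factor(c("a","b","a","c","b","c"))
  y <- as.factor(c("b","a","a","c","c","b"))
  H.x  <- entropy(table(x))    # 1.098612
  H.y  <- entropy(table(y))    # 1.098612
  H.xy <- entropy(table(x,y))  # 1.791759, same as the paste() version here
  H.x + H.y - H.xy             # mutual information: 0.405465

As for a more direct route: if I remember the entropy package correctly,
it also ships mi.plugin() (mutual information from a 2-d joint frequency
matrix) and mi.empirical() (the same from a 2-d count table); check
help(package="entropy") on your installed version before relying on this:

  # assumes mi.empirical()/mi.plugin() exist in your version of the package
  mi.empirical(table(x,y))          # should give 0.405465
  mi.plugin(table(x,y)/length(x))   # same, from joint relative frequencies

either of which skips the intermediate paste() vector entirely.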

-- 
Sam Steingold (http://sds.podval.org/) on Ubuntu 11.10 (oneiric) X 11.0.11004000
http://www.childpsy.net/ http://ffii.org http://mideasttruth.com
http://thereligionofpeace.com http://americancensorship.org http://memri.org
If money were measured in piles, I would have had a pit of it.


