[R] a series of 1's and -1's

McGehee, Robert Robert.McGehee at geodecapital.com
Thu Jan 12 00:29:45 CET 2006

I would compare the Shannon entropy of your test vector with the entropy
of your expected probability distribution to see if they are close. That
is, if your binary probability distribution is half 1 and half -1, and
your string is long, you would expect about half the numbers in your
vector to be 1 and half to be -1, i.e. H(s)=1. Moreover, you should
also look at the entropy of the subsequences of the vector and compare
that to your distribution as well. For instance, does the pair (1, 1)
show up just as often as (1, -1), (-1, 1) and (-1, -1)? As this problem
is specific to a certain random process, I doubt there is a canned test
in R.
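To make the idea concrete, here is a minimal sketch of the entropy comparison in base R. The function names are my own invention, and I use overlapping pairs for the length-2 subsequences:

```r
# Shannon entropy (in bits) of a probability vector
shannon <- function(p) {
  p <- p[p > 0]
  -sum(p * log2(p))
}

# entropy of the single-symbol distribution of a +/-1 vector
entropy1 <- function(x) {
  shannon(table(x) / length(x))
}

# entropy of the overlapping length-2 subsequences,
# i.e. (1,1), (1,-1), (-1,1), (-1,-1)
entropy2 <- function(x) {
  pairs <- paste(head(x, -1), tail(x, -1))
  shannon(table(pairs) / length(pairs))
}

set.seed(1)
x <- sample(c(-1, 1), 1000, replace = TRUE)
entropy1(x)  # should sit near 1 bit for a fair coin
entropy2(x)  # should sit near 2 bits for independent fair flips
```

A detectable pattern in the series pulls these numbers below their maxima (1 bit and 2 bits respectively), which is what you would compare against the hypothesized distribution.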

Also, the sample entropy should converge to the entropy of the
underlying process as the sample size increases, for all subsequences of
the sample, with an approximately normal sampling distribution (Central
Limit Theorem), although I'd need to noodle on this a bit more. You can
then construct a test of significance if you know the sample size and
how far the sample entropy is from the entropy of the hypothesized
process. Unfortunately, it's been a while since I've done information
theory, but hopefully this gets you started.
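If you don't want to pin down the sampling distribution analytically, a Monte Carlo comparison gets you a p-value directly. A sketch, with all names my own (this is not a canned R test):

```r
# entropy (in bits) of the overlapping length-2 subsequences
pair_entropy <- function(x) {
  pairs <- paste(head(x, -1), tail(x, -1))
  p <- table(pairs) / length(pairs)
  -sum(p * log2(p))
}

# Monte Carlo test: compare the observed pair entropy against its
# distribution under the null of independent fair +/-1 draws
entropy_test <- function(x, nsim = 2000) {
  obs  <- pair_entropy(x)
  null <- replicate(nsim,
    pair_entropy(sample(c(-1, 1), length(x), replace = TRUE)))
  # one-sided p-value: structure in x pushes entropy below the null
  mean(null <= obs)
}

set.seed(2)
random_x  <- sample(c(-1, 1), 500, replace = TRUE)
pattern_x <- rep(c(1, -1), 250)   # perfectly alternating series
entropy_test(random_x)            # p-value under the null
entropy_test(pattern_x)           # should be essentially zero
```

The same scheme extends to longer subsequences by replacing pair_entropy with a k-gram version.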

You can read up on informational entropy here:

And if you do find a test in R, I would be interested as well.
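One classical candidate that does fit this question is the Wald-Wolfowitz runs test; a hand-rolled sketch is below (the function name is mine, and if I recall correctly the tseries package offers a packaged runs.test as well):

```r
# Wald-Wolfowitz runs test for a vector of +1's and -1's.
# Returns a two-sided p-value from the normal approximation.
runs_test <- function(x) {
  r  <- sum(diff(x) != 0) + 1   # number of runs in the series
  n1 <- sum(x ==  1)
  n2 <- sum(x == -1)
  n  <- n1 + n2
  mu <- 2 * n1 * n2 / n + 1     # expected runs under randomness
  v  <- 2 * n1 * n2 * (2 * n1 * n2 - n) / (n^2 * (n - 1))
  z  <- (r - mu) / sqrt(v)
  2 * pnorm(-abs(z))
}

set.seed(3)
runs_test(sample(c(-1, 1), 400, replace = TRUE))  # p-value for a random series
runs_test(rep(c(1, -1), 200))                     # alternating: far too many runs
```

Too many runs flags alternation; too few flags clumping. The normal approximation is only reasonable once both counts n1 and n2 are moderately large.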


-----Original Message-----
From: Mark Leeds [mailto:Mleeds at kellogggroup.com] 
Sent: Wednesday, January 11, 2006 4:46 PM
To: R-Stat Help
Subject: [R] a series of 1's and -1's

Does anyone know of a simple test in any R package that, given a series
of negative ones and positive ones (no other values are possible in the
series), returns a test of whether the series is random or not? (A test
at each point would be good, but I can use the apply function to
implement that.)

R-help at stat.math.ethz.ch mailing list
PLEASE do read the posting guide!