[R] problem with nnet
madhurima bhattacharjee
madhurima_b at persistent.co.in
Thu Feb 2 17:33:50 CET 2006
Hello All,
I am working with the samr and nnet packages.
I am following the steps given below:
1> I take an input file with signal values for 9506 genes and 36 chips,
belonging to two classes.
2> I perform samr analysis on 80% of the chip data from both
classes (selected by random sampling).
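For reference, the 80/20 chip split I describe above looks roughly like the
following sketch (plain base R; the seed and variable names are just for
illustration, not my actual code):

```r
# Sketch of the random 80/20 chip split described in steps 1-2.
set.seed(1)                                   # reproducible split (illustration only)
nchips    <- 36                               # 36 chips, as in the data
train_idx <- sample(nchips, size = round(0.8 * nchips))  # 80% of chips for samr/training
test_idx  <- setdiff(seq_len(nchips), train_idx)         # remaining 20% held out
length(train_idx)  # 29 chips
length(test_idx)   # 7 chips
```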
3> I then use the data for only the significant genes from this samr
analysis to train nnet.
4> The parameters I am currently using for nnet are:
result <- nnet(traindata[sortsamp, ], targets[sortsamp, ], size = nnetsize,
               rang = 0.00000003, decay = 0.00009, maxit = 100, MaxNWts = 100000)
Here traindata is the significant genes' data, sortsamp is a set of
randomly sampled indices into those genes, and targets is the class
indicator for the significant genes.
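As a side note on the rang argument: per the nnet documentation it sets the
range of the initial random weights, drawn uniformly on [-rang, rang]. A small
base-R sketch (no nnet required; the seed is arbitrary) of what the value
above implies for the starting weights:

```r
# nnet draws initial weights uniformly on [-rang, rang] (see ?nnet).
rang <- 0.00000003
set.seed(1)
w0 <- runif(10, min = -rang, max = rang)  # stand-in for nnet's weight initialisation
max(abs(w0)) < 1e-7  # TRUE: all starting weights are vanishingly small
```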
5> Then I use the data from the chips left out of the samr step (the
remaining 20% of the chip data) to test the nnet.
I use the following command for this:
pred <- predict(result, testdata)  # result is the nnet fit given above;
                                   # testdata is the data for the remaining chips
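In case it helps to see what I do with pred afterwards: I take the larger of
the two columns as the predicted class, along the lines of this sketch (the
matrix values here are made up to mimic the shape of pred, not real output):

```r
# Sketch: convert the two-column nnet output into class labels.
pred <- matrix(c(0.499, 0.501,    # made-up probabilities for illustration
                 0.700, 0.300),
               ncol = 2, byrow = TRUE,
               dimnames = list(NULL, c("cer", "noncer")))
labels <- colnames(pred)[max.col(pred)]  # pick the higher-probability class per chip
labels  # "noncer" "cer"
```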
6> I run the nnet part for 100 runs, using the same significant genes in
all runs.
The problem is that in every run pred returns nearly identical values for
all chips, with both classes hovering around 0.5.
Example:
[1] "pred----->"
cer noncer
[1,] 0.4990032 0.5009930
[2,] 0.4990032 0.5009930
[3,] 0.4990030 0.5009933
[4,] 0.4989994 0.5009968
[5,] 0.4990030 0.5009932
[6,] 0.4990032 0.5009930
[7,] 0.4990032 0.5009930
This is a really strange result.
I have checked the data and code several times but couldn't figure out
the problem.
I am really stuck on this.
Can anyone help me as soon as possible?
Thanks in advance,
Madhurima.