[R] Issues with nnet.default for regression/classification
Jude Ryan
jryan at marketsharepartners.com
Mon Nov 29 19:51:26 CET 2010
Good to know that you solved your problem. I did not realize that the default decay parameter of 0 was the cause. Since I have the MASS book, I have always set this parameter in my own work as the book indicates, and had no reason to change it; this is probably the first time I have left it out! I am not sure the effect of leaving out the decay parameter is documented anywhere. I will have to dig out the book and check, but it is rather terse and to the point, and it would not surprise me if there is no mention of when to override the default of decay = 0.
Jude Ryan
MarketShare Partners
1270 Avenue of the Americas, Suite # 2702
New York, NY 10020
http://www.marketsharepartners.com
Work: (646)-745-9916 ext: 222
Cell: (973)-943-2029
-----Original Message-----
From: Georg Ruß [mailto:research at georgruss.de]
Sent: Monday, November 29, 2010 10:37 AM
To: Jude Ryan; R-help at r-project.org
Subject: Re: [R] Issues with nnet.default for regression/classification
On 29/11/10 11:57:31, Jude Ryan wrote:
> Hi Georg,
>
>
> The documentation (?nnet) says that y should be a matrix or data frame,
> but in your case it is a vector. This is most likely the problem, if
> you do not have other data issues going on. Convert y to a matrix (or
> data frame) using ‘as.matrix’ and see if this solves your problem.
> Library ‘nnet’ can do both classification and regression. I was able to
> replicate your problem, using an example from Modern Applied Statistics
> with S (Venables and Ripley, pages 246 and 247), by turning y into a
> vector and verifying that all the predicted values are the same when y
> is a vector. This is not the case when y is part of a data frame. You
> can see this by running the code below. I tried about 4 neural network
> packages in the past, including AMORE, but found ‘nnet’ to be the best
> for my needs.
Hi Jude,
thanks for the hint. I recently experimented with both the nnet(x, y, ...)
and the nnet(formula, data, ...) interfaces to nnet, and both yielded the
same results. So changing the format of y from a vector to a matrix or a
data frame didn't change anything at all. However, what _did_ change the
outcome was introducing the "decay" parameter (which I hadn't set at all
before). By default it is 0, which doesn't seem appropriate in my case.
Setting decay=1e-3 magically turned my output into an acceptable
regression response instead of spitting out fixed values.
I really love the predict interface for regression in each of the models
I'm using. Clear code :-)
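To illustrate for the archives, here is a minimal sketch on made-up data
(x, y and the size/decay values are placeholders, not my real setup):

library(nnet)
set.seed(42)
x <- matrix(runif(600), ncol = 3)                # 200 cases, 3 inputs
y <- x[, 1] + 2 * x[, 2] + rnorm(200, sd = 0.1)  # noisy linear response

# vector y vs. matrix y: identical fits from the same starting weights
set.seed(1)
fit.v <- nnet(x, y, size = 20, linout = TRUE, decay = 1e-3, trace = FALSE)
set.seed(1)
fit.m <- nnet(x, as.matrix(y), size = 20, linout = TRUE, decay = 1e-3,
              trace = FALSE)
all.equal(c(fitted(fit.v)), c(fitted(fit.m)))    # TRUE

# what does matter is decay: without it the fit can degenerate
fit0 <- nnet(x, y, size = 20, linout = TRUE, decay = 0, trace = FALSE)
sd(c(fitted(fit0)))   # near zero if the output collapses to a constant
sd(c(fitted(fit.v)))  # clearly positive: the fit tracks the response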
So, for the record, the call for nnet for the regression problem is as
follows:
net.fitted <- nnet(formula, data = sppdf@data[-testset,], decay = 1e-3, size = 20, linout = TRUE)
(where sppdf@data is the data slot of a SpatialPointsDataFrame. And yes,
in selecting the [-testset,] data points I'm taking into account the existing
spatial autocorrelation.)
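For completeness, prediction on the held-out rows is then just the
following (net.pred is an illustrative name; testset is the same index
vector as above):

net.pred <- predict(net.fitted, newdata = sppdf@data[testset, ])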
> # Neural Network model in Modern Applied Statistics with S, Venables
> # and Ripley, pages 246 and 247
Thanks for your help and the reference; I'm likely to order the book now
:-) Leaving out the "decay" parameter changes the fitted.values in the
"rock" example you mentioned as well, although not by much. Convergence
speed changes as expected, so the parameter is clearly doing something. I
guess my problem is solved now; the rest is down to the peculiarities of
my data sets.
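In case someone finds this in the archives, my rock re-run looked roughly
like this (a sketch; the 1/10000 scaling and the size/skip/maxit settings
are my recollection of the book's example, so check the pages cited above):

library(nnet)
data(rock)   # permeability data shipped with R
rock1 <- data.frame(perm = rock$perm, area = rock$area/10000,
                    peri = rock$peri/10000, shape = rock$shape)
rock.nn <- nnet(log(perm) ~ area + peri + shape, data = rock1,
                size = 3, decay = 1e-3, linout = TRUE, skip = TRUE,
                maxit = 1000, trace = FALSE)
rock.nn0 <- nnet(log(perm) ~ area + peri + shape, data = rock1,
                 size = 3, decay = 0, linout = TRUE, skip = TRUE,
                 maxit = 1000, trace = FALSE)
# fitted values differ with vs. without decay, though not by much here
summary(c(fitted(rock.nn) - fitted(rock.nn0)))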
Georg.
--
Research Assistant
Otto-von-Guericke-Universität Magdeburg
research at georgruss.de
http://research.georgruss.de