[R] Backpropagation to adjust weights in a neural net when receiving new training examples
jude.ryan at ubs.com
Fri May 29 16:24:14 CEST 2009
You can figure out which weights go with which connections from
summary(nnet.object) and the weight vector nnet.object$wts. Sample code
from Venables and Ripley is below:
# Neural network model from Modern Applied Statistics with S,
# Venables and Ripley, pages 246 and 247
> library(nnet)
> attach(rock)
> dim(rock)
[1] 48 4
> area1 <- area/10000; peri1 <- peri/10000
> rock1 <- data.frame(perm, area = area1, peri = peri1, shape)
> dim(rock1)
[1] 48 4
> head(rock1)
perm area peri shape
1 6.3 0.4990 0.279190 0.0903296
2 6.3 0.7002 0.389260 0.1486220
3 6.3 0.7558 0.393066 0.1833120
4 6.3 0.7352 0.386932 0.1170630
5 17.1 0.7943 0.394854 0.1224170
6 17.1 0.7979 0.401015 0.1670450
> rock.nn <- nnet(log(perm) ~ area + peri + shape, rock1, size=3,
+   decay=1e-3, linout=T, skip=T, maxit=1000, Hess=T)
# weights: 19
initial value 1196.787489
iter 10 value 32.400984
iter 20 value 31.664545
...
iter 280 value 14.230077
iter 290 value 14.229809
final value 14.229785
converged
> summary(rock.nn)
a 3-3-1 network with 19 weights
options were - skip-layer connections linear output units decay=0.001
b->h1 i1->h1 i2->h1 i3->h1
-0.51 -9.33 14.59 3.85
b->h2 i1->h2 i2->h2 i3->h2
0.93 3.35 6.09 -5.86
b->h3 i1->h3 i2->h3 i3->h3
0.80 -10.93 -4.58 9.53
b->o h1->o h2->o h3->o i1->o i2->o i3->o
1.89 -14.62 7.35 8.77 -3.00 -4.25 4.44
> sum((log(perm) - predict(rock.nn))^2)
[1] 13.20451
> rock.nn$wts
 [1]  -0.5064848  -9.3288410  14.5859255   3.8521844   0.9266730   3.3524267
 [7]   6.0900909  -5.8628448   0.8026366 -10.9345352  -4.5783516   9.5311123
[13]   1.8866734 -14.6181959   7.3466236   8.7655882  -2.9988287  -4.2508948
[19]   4.4397158
>
In the output from summary(rock.nn), b is the bias or intercept, h1 is
the 1st hidden neuron, i1 is the 1st input (area, the first term in the
model formula; i2 is peri and i3 is shape) and o is the (linear) output.
So b->h1 is the bias or intercept into the first hidden neuron, i1->h1
is the 1st input (area) into the first hidden neuron (there are 3 hidden
neurons in this example), h1->o is the 1st hidden neuron into the
output, and i1->o is the first input directly into the output (since
skip=T, this is a skip-layer network). The weights printed by summary()
(b->h1, ...) are rounded; rock.nn$wts gives the un-rounded values. If
you compare the output from summary(rock.nn) with rock.nn$wts you will
see that the first row of weights from summary() is listed first in
rock.nn$wts, followed by the 2nd row of weights from summary(), and so
on.
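To make that ordering concrete, the 19-element vector can be split into
the same groups summary() prints. This is a sketch using the weights
from the run above (with a different random start the numbers will
differ); the names hidden and output are mine, not part of nnet:

```r
# rock.nn$wts from the run above, copied as a literal vector
wts <- c(-0.5064848, -9.3288410, 14.5859255, 3.8521844, 0.9266730,
         3.3524267, 6.0900909, -5.8628448, 0.8026366, -10.9345352,
         -4.5783516, 9.5311123, 1.8866734, -14.6181959, 7.3466236,
         8.7655882, -2.9988287, -4.2508948, 4.4397158)

# first 3 groups of 4: (bias, i1, i2, i3) into each hidden unit
hidden <- matrix(wts[1:12], nrow = 3, byrow = TRUE,
                 dimnames = list(paste0("h", 1:3),
                                 c("b", "i1", "i2", "i3")))

# last 7: (bias, h1, h2, h3, i1, i2, i3) into the output
output <- setNames(wts[13:19], c("b", "h1", "h2", "h3", "i1", "i2", "i3"))

hidden["h1", "i2"]  # i2->h1, the 14.59 printed by summary()
```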
You can construct the neural network equations manually (this is not in
the Venables and Ripley book) and check the results against the
predict() function to verify that the weights are listed in the order I
described. The code to do this is:
# manually calculate the neural network predictions from the neural
# network equations (weights taken from rock.nn$wts above); note the
# trailing operators, which let R continue each expression to the next line
rock1$h1 <- -0.5064848 - 9.3288410 * rock1$area + 14.5859255 * rock1$peri +
  3.8521844 * rock1$shape
rock1$logistic_h1 <- exp(rock1$h1) / (1 + exp(rock1$h1))
rock1$h2 <- 0.9266730 + 3.3524267 * rock1$area + 6.0900909 * rock1$peri -
  5.8628448 * rock1$shape
rock1$logistic_h2 <- exp(rock1$h2) / (1 + exp(rock1$h2))
rock1$h3 <- 0.8026366 - 10.9345352 * rock1$area - 4.5783516 * rock1$peri +
  9.5311123 * rock1$shape
rock1$logistic_h3 <- exp(rock1$h3) / (1 + exp(rock1$h3))
rock1$pred1 <- 1.8866734 - 14.6181959 * rock1$logistic_h1 +
  7.3466236 * rock1$logistic_h2 + 8.7655882 * rock1$logistic_h3 -
  2.9988287 * rock1$area - 4.2508948 * rock1$peri + 4.4397158 * rock1$shape
rock1$nn.pred <- predict(rock.nn)
head(rock1)
  perm   area     peri     shape         h1 logistic_h1       h2 logistic_h2
1  6.3 0.4990 0.279190 0.0903296 -0.7413656   0.3227056 3.770238   0.9774726
2  6.3 0.7002 0.389260 0.1486220 -0.7883026   0.3125333 4.773323   0.9916186
3  6.3 0.7558 0.393066 0.1833120 -1.1178398   0.2464122 4.779515   0.9916699
4  6.3 0.7352 0.386932 0.1170630 -1.2703391   0.2191992 5.061506   0.9937039
5 17.1 0.7943 0.394854 0.1224170 -1.6854993   0.1563686 5.276490   0.9949156
6 17.1 0.7979 0.401015 0.1670450 -1.4573040   0.1888800 5.064433   0.9937222
         h3  logistic_h3    pred1  nn.pred
1 -5.070985 0.0062370903 2.122910 2.122910
2 -7.219361 0.0007317343 1.514820 1.514820
3 -7.514112 0.0005450367 2.451231 2.451231
4 -7.892204 0.0003735057 2.656199 2.656199
5 -8.523675 0.0001986684 3.394902 3.394902
6 -8.165892 0.0002841023 3.072776 3.072776
The first 6 records show that the manual equations and the predict()
function give the same numbers (the last 2 columns). As Venables and
Ripley point out in their book, there are several local solutions and
the fit starts from random weights, so if you run the same example your
results may differ (call set.seed() before nnet() to make a run
reproducible).
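The element-by-element equations above can also be written once, in
matrix form, for any fit with this architecture. A sketch (nn_forward
is my own name, not a nnet function; it assumes logistic hidden units,
a linear output, and skip-layer connections, i.e. linout=T, skip=T, and
the weight order described above):

```r
# manual forward pass for a 3-3-1 skip-layer net, weights ordered as:
# 3 x (b, i1, i2, i3) into the hidden layer, then
# (b, h1, h2, h3, i1, i2, i3) into the output
nn_forward <- function(wts, X) {
  X   <- as.matrix(X)
  W_h <- matrix(wts[1:12], nrow = 3, byrow = TRUE)  # hidden-layer weights
  w_o <- wts[13:19]                                 # output-layer weights
  H   <- plogis(cbind(1, X) %*% t(W_h))             # logistic hidden activations
  drop(cbind(1, H, X) %*% w_o)                      # linear output with skip layer
}

# nn_forward(rock.nn$wts, rock1[, c("area", "peri", "shape")])
# should reproduce predict(rock.nn)
```

plogis(x) is base R's logistic function exp(x) / (1 + exp(x)), so this
is the same computation as the column-by-column version above.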
Hope this helps.
Jude Ryan
Filipe Rocha wrote:
I want to create a neural network, and then every time it receives new
data, instead of creating a new nnet, I want to use a backpropagation
algorithm to adjust the weights in the already created net.
I'm using the nnet package. I know that nn$wts gives the weights, but I
can't find out which weights belong to which connections, so that I
could implement the backpropagation algorithm myself.
But if anyone knows a function to do this, that would be even better.
In any case, thank you!
Filipe Rocha
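Once the weight order is known, a single online gradient
(backpropagation) step for one new example (x, y) can be sketched as
below. This is my own illustration, not a nnet function: backprop_step
is a hypothetical name, and it assumes squared-error loss, the 3-3-1
skip-layer architecture above, and ignores the weight-decay term.

```r
# one gradient-descent update of the 19-weight vector on a single new
# example x (inputs) and y (target), learning rate lr
backprop_step <- function(wts, x, y, lr = 0.01) {
  W_h <- matrix(wts[1:12], nrow = 3, byrow = TRUE)  # (b, i1, i2, i3) per hidden unit
  w_o <- wts[13:19]                                 # (b, h1, h2, h3, i1, i2, i3)
  H    <- plogis(drop(W_h %*% c(1, x)))             # hidden activations
  yhat <- sum(w_o * c(1, H, x))                     # linear skip-layer output
  err  <- yhat - y                                  # d(loss)/d(yhat) for 0.5*err^2
  g_o  <- err * c(1, H, x)                          # output-layer gradient
  # chain rule through the logistic units: dH/d(input) = H * (1 - H)
  g_h  <- (err * w_o[2:4] * H * (1 - H)) %o% c(1, x)
  # return the updated weights in the same order as nnet's wts vector
  c(as.vector(t(W_h - lr * g_h)), w_o - lr * g_o)
}
```

For repeated updates you would loop this over incoming examples; with a
small lr each step nudges the prediction for that example toward y.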
___________________________________________
Jude Ryan
Director, Client Analytical Services
Strategy & Business Development
UBS Financial Services Inc.
1200 Harbor Boulevard, 4th Floor
Weehawken, NJ 07086-6791
Tel. 201-352-1935
Fax 201-272-2914
Email: jude.ryan at ubs.com
More information about the R-help mailing list