# [R] SVM coefficients

Bernd Bischl bernd_bischl at gmx.net
Mon Aug 31 15:00:53 CEST 2009

```
Noah Silverman wrote:
> Steve,
>
> That doesn't work.
>
> I just trained an SVM with 80 variables.
> svm_model$coefs gives me a list of 10,000 items.  My training set is
> 30,000 examples of 80 variables, so I have no idea what the 10,000
> items represent.
>
> There should be some attribute that lists the "weights" for each of
> the 80 variables.
>

Hi Noah,
does this help?

library(e1071)  # provides svm()

# make a binary problem from iris
mydata <- iris[1:100, ]
mydata$Species <- mydata$Species[, drop = TRUE]  # drop the unused "virginica" level
str(mydata)

# 'data.frame':   100 obs. of  5 variables:
# $ Sepal.Length: num  5.1 4.9 4.7 4.6 5 5.4 4.6 5 4.4 4.9 ...
# $ Sepal.Width : num  3.5 3 3.2 3.1 3.6 3.9 3.4 3.4 2.9 3.1 ...
# $ Petal.Length: num  1.4 1.4 1.3 1.5 1.4 1.7 1.4 1.5 1.4 1.5 ...
# $ Petal.Width : num  0.2 0.2 0.2 0.2 0.2 0.4 0.3 0.2 0.2 0.1 ...
# $ Species     : Factor w/ 2 levels "setosa","versicolor": 1 1 1 1 1 1 1 1 1 1 ...
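# (aside, not in the original mail: droplevels() is an equivalent way to
# drop the unused factor level, i.e.
#   mydata$Species <- droplevels(iris$Species[1:100])
# a quick sanity check that the problem really is binary and balanced:)
table(mydata$Species)
# 50 cases of each class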

# inputs
X <- as.matrix(mydata[, -5])

# train an svm with a linear kernel;
# to make the later calculations easier we don't scale the inputs
m <- svm(Species ~ ., data = mydata, kernel = "linear", scale = FALSE)

# print(m) shows (abridged):
# ....
# Number of Support Vectors:  3

# we get 3 support vectors; these are weights for training cases,
# or in SVM theory speak: our dual variables alpha
# (strictly, libsvm stores alpha_i multiplied by the label y_i = +/-1)
m$coefs[, 1]
#   0.67122500  0.07671148 -0.74793648
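# (aside: because coefs holds alpha_i * y_i, the dual constraint
# sum_i alpha_i y_i = 0 says these must sum to zero -- and indeed the
# three numbers above do, up to floating point)
sum(m$coefs)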

# these are the indices of the cases to which the alphas belong
m$index
#  24 42 99
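# (aside: the support vectors are just those rows of the training data;
# since we set scale=FALSE this should match the stored m$SV exactly)
X[m$index, ]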

# let's calculate the primal variables from the dual ones;
# SVM theory says
# w = sum_i alpha_i y_i x_i
w <- t(m$coefs) %*% X[m$index, ]
#    Sepal.Length Sepal.Width Petal.Length Petal.Width
# [1,]  -0.04602689   0.5216377    -1.003002  -0.4641042

# test whether the above was nonsense.....
# e1071's predict
p1 <- predict(m, newdata = mydata, decision.values = TRUE)
p1 <- attr(p1, "decision.values")
# do it manually with w: a simple linear predictor with intercept -m$rho
p2 <- X %*% t(w) - m$rho

# puuuh, lucky....
max(abs(p1 - p2))
#  6.439294e-15
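# (caveat, my addition, sketch only: with the default scale=TRUE, both
# m$SV and m$coefs live in the *scaled* input space; to get weights on
# the original variable scale, divide by the stored scale factors)
ms <- svm(Species ~ ., data = mydata, kernel = "linear")  # scale defaults to TRUE
ws <- t(ms$coefs) %*% ms$SV                 # weight vector in scaled space
w.orig <- ws / ms$x.scale$"scaled:scale"    # back on the original scale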

Bernd

```