[R] using contrasts on matrix regressions (using gmodels, perhaps): 2 Solutions

Ranjan Maitra maitra at iastate.edu
Thu Jul 26 20:33:29 CEST 2007


Dear list,

I got two responses to my post: one from Soren, with a follow-up by personal e-mail, and another that I leave anonymous, since the sender contacted me by personal e-mail. Anyway, here we go:

The first (Soren):

library(doBy)

Y <- as.data.frame(Y)

## regress each column of Y on X and test the contrast X3 - X5 = 0
## (coefficient order: intercept, X1, ..., X5)
lapply(Y, function(y) { reg <- lm(y ~ X); esticon(reg, c(0, 0, 0, 1, 0, -1)) })
 
Confidence interval ( WALD ) level = 0.95 
Confidence interval ( WALD ) level = 0.95 
Confidence interval ( WALD ) level = 0.95 
Confidence interval ( WALD ) level = 0.95 
Confidence interval ( WALD ) level = 0.95 
$V1
  beta0  Estimate Std.Error  t.value DF  Pr(>|t|)  Lower.CI Upper.CI
1     0 0.6701771  0.517921 1.293976  4 0.2653302 -0.767802 2.108156
$V2
  beta0   Estimate Std.Error    t.value DF  Pr(>|t|)  Lower.CI Upper.CI
1     0 -0.2789954   0.64481 -0.4326784  4 0.6875555 -2.069275 1.511284
$V3
  beta0   Estimate Std.Error    t.value DF  Pr(>|t|)  Lower.CI Upper.CI
1     0 -0.7677927 0.9219688 -0.8327751  4 0.4518055 -3.327588 1.792003
$V4
  beta0   Estimate Std.Error   t.value DF Pr(>|t|)  Lower.CI  Upper.CI
1     0 -0.6026635 0.4960805 -1.214850  4  0.29123 -1.980004 0.7746768
$V5
  beta0 Estimate Std.Error  t.value DF Pr(>|t|)  Lower.CI Upper.CI
1     0 2.001558  1.004574 1.992444  4 0.117123 -0.787587 4.790703

 

One thing I do not know how to handle is the output "Confidence interval ( WALD ) level = 0.95"  which shows up for every regression. When I do millions of regressions, this seriously slows it all down. Any idea how I can suppress that?
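
One workaround that occurs to me (untested with this version of doBy): if that line is written with cat()/print() rather than message(), wrapping the call in capture.output() and discarding the captured text should silence it. A rough sketch:

## rough sketch, untested: assumes esticon() writes the WALD line to stdout
library(doBy)

quiet_esticon <- function(reg, cvec) {
  res <- NULL
  junk <- capture.output(res <- esticon(reg, cvec))  # capture and discard printed text
  res
}

lapply(Y, function(y) quiet_esticon(lm(y ~ X), c(0, 0, 0, 1, 0, -1)))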



The second solution uses gmodels, with a lucid explanation that I reproduce below. Thanks!
 

The second (anon):

For a standard (non-matrix) regression, you could test the hypothesis
X3 = X5 (i.e. X3 - X5 = 0) using

	estimable(reg, c("(Intercept)"=0, X1=0, X2=0, X3=1, X4=0, X5=-1) )

but this won't currently work with the mlm object created by a matrix  
regression.
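
A quick check makes the issue visible: with a matrix response, lm() returns an object of class c("mlm", "lm"), which estimable() does not know how to handle.

## why estimable() fails here: a matrix response produces an "mlm" fit
X <- matrix(rnorm(50), 10, 5)
Y <- matrix(rnorm(50), 10, 5)
class(lm(Y ~ X))
## [1] "mlm" "lm"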

The best way to solve this problem is to write an estimable.mlm()  
function that simply extracts the individual regressions from the mlm  
object and then calls estimable on each of these, pasting the results  
back together appropriately.

Something like this should do the trick:

`estimable.mlm` <-
   function (object, ...)
{
   ## split the mlm fit into its per-response pieces and call
   ## estimable() on each one, collecting the results in a list
   coef <- coef(object)
   ny <- ncol(coef)
   effects <- object$effects
   resid <- object$residuals
   fitted <- object$fitted
   ynames <- colnames(coef)
   if (is.null(ynames)) {
     ## recover response names from a cbind() left-hand side, else invent them
     lhs <- object$terms[[2]]
     if (mode(lhs) == "call" && lhs[[1]] == "cbind")
       ynames <- as.character(lhs)[-1]
     else ynames <- paste("Y", seq(ny), sep = "")
   }
   value <- vector("list", ny)
   names(value) <- paste("Response", ynames)
   ## drop the "mlm" class so that estimable() dispatches to the lm method
   cl <- oldClass(object)
   class(object) <- cl[match("mlm", cl):length(cl)][-1]
   for (i in seq(ny)) {
     ## overwrite the fit with the pieces belonging to the i-th response
     object$coefficients <- coef[, i]
     object$residuals <- resid[, i]
     object$fitted.values <- fitted[, i]
     object$effects <- effects[, i]
     object$call$formula[[2]] <- object$terms[[2]] <- as.name(ynames[i])
     value[[i]] <- estimable(object, ...)
   }
   class(value) <- "listof"
   value
}

Now this all works:

 > X <- matrix(rnorm(50),10,5)
 > Y <- matrix(rnorm(50),10,5)
 > reg <- lm(Y~X)
 > estimable(reg, c("(Intercept)"=0, X1=0, X2=0, X3=1, X4=0, X5=-1) )  

Response Y1 :
                  Estimate Std. Error   t value DF  Pr(>|t|)
(0 0 0 1 0 -1) -0.9024065  0.4334235 -2.082043  4 0.1057782

Response Y2 :
                  Estimate Std. Error   t value DF   Pr(>|t|)
(0 0 0 1 0 -1) -0.7017988  0.2199234 -3.191106  4 0.03318115

Response Y3 :
                 Estimate Std. Error  t value DF  Pr(>|t|)
(0 0 0 1 0 -1) 0.5412863  0.2632527 2.056147  4 0.1089276

Response Y4 :
                  Estimate Std. Error    t value DF Pr(>|t|)
(0 0 0 1 0 -1) -0.1028162  0.5973959 -0.1721073  4  0.87171

Response Y5 :
                 Estimate Std. Error  t value DF  Pr(>|t|)
(0 0 0 1 0 -1) 0.2493330  0.2024061 1.231845  4 0.2854716
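
If you need the results in one table rather than a list (handy when looping over many fits), something like this should do it, assuming each element returned by estimable() is a one-row data frame or matrix:

## sketch: collapse the list of per-response results into one table
res <- estimable(reg, c("(Intercept)"=0, X1=0, X2=0, X3=1, X4=0, X5=-1))
tab <- do.call(rbind, res)       # stack the one-row results
rownames(tab) <- names(res)      # label rows "Response Y1", ..., "Response Y5"
tab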

On Wed, 25 Jul 2007 18:30:36 -0500 Ranjan Maitra <maitra at iastate.edu>
wrote:

> Hi, 
> 
> I want to test for a contrast from a regression where I am regressing the columns of a matrix. In short, the following.
> 
> X <- matrix(rnorm(50),10,5)
> Y <- matrix(rnorm(50),10,5)
> lm(Y~X)  
> 
> Call:
> lm(formula = Y ~ X)
> 
> Coefficients:
>              [,1]     [,2]     [,3]     [,4]     [,5]   
> (Intercept)   0.3350  -0.1989  -0.1932   0.7528   0.0727
> X1            0.2007  -0.8505   0.0520   0.1501   0.3248
> X2            0.3212   0.7008  -0.0963  -0.2584   0.6711
> X3            0.3781  -0.7321   0.1907  -0.1721   0.3073
> X4           -0.1778   0.2822  -0.0644  -0.2649  -0.4140
> X5           -0.1079  -0.0475   0.6047  -0.8369  -0.5928
> 
> 
> I want to test for c'b = 0 where c is (let's say) the contrast (0, 0, 1, 0, -1). Is it possible to do so, in one shot, using gmodels or something else?
> 
> Many thanks and best wishes,
> Ranjan
> 


