[R] cbind in a loop...better way? | summary
Evan Cooch
evan.cooch at gmail.com
Thu Oct 9 14:36:59 CEST 2014
Two solutions were proposed -- not entirely orthogonal, but both do the
trick. Instead of nesting cbind in a loop (as I did originally -- OP,
below), use either
(1) do.call(cbind, lapply(mat_list, as.vector))

or

(2) sapply(mat_list, function(x) as.vector(x))
Both work fine. Thanks to Jeff Laake (2) + David Carlson (1) for their
suggestions.
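
For the archives, here is a quick sketch (using the two example
matrices from my original post, below) showing both suggestions side
by side -- each should return the same 9 x 2 matrix:

a <- matrix(c(0, 20, 50, 0.05, 0, 0, 0, 0.1, 0), 3, 3, byrow = TRUE)
b <- matrix(c(0, 15, 45, 0.15, 0, 0, 0, 0.2, 0), 3, 3, byrow = TRUE)
mat_list <- list(a, b)

# (1) vectorize each matrix, then bind the vectors together as columns
res1 <- do.call(cbind, lapply(mat_list, as.vector))

# (2) sapply vectorizes and simplifies to a matrix in one step
res2 <- sapply(mat_list, function(x) as.vector(x))

identical(res1, res2)   # should be TRUE

One caveat worth noting: sapply only simplifies the result to a matrix
when all the vectorized matrices have the same length; if the input
matrices differ in size, it returns a list instead.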
On 10/8/2014 3:12 PM, Evan Cooch wrote:
> ...or some such. I'm trying to work up a function wherein the user
> passes a list of matrices to the function, which then (1) takes each
> matrix, (2) performs an operation to 'vectorize' the matrix (i.e.,
> given an (m x n) matrix x, this produces the vector Y of length m*n
> that contains the columns of the matrix x, stacked below each other),
> and then (3) cbinds them together.
>
> Here is an example using the case where I know how many matrices I
> need to cbind together. For this example, 2 square (3x3) matrices:
>
> a <- matrix(c(0,20,50,0.05,0,0,0,0.1,0),3,3,byrow=T)
> b <- matrix(c(0,15,45,0.15,0,0,0,0.2,0),3,3,byrow=T)
>
> I want to vec them, and then cbind them together. So,
>
> result <- cbind(matrix(a, nrow = 9), matrix(b, nrow = 9))
>
> which yields the following:
>
> [,1] [,2]
> [1,] 0.00 0.00
> [2,] 0.05 0.15
> [3,] 0.00 0.00
> [4,] 20.00 15.00
> [5,] 0.00 0.00
> [6,] 0.10 0.20
> [7,] 50.00 45.00
> [8,] 0.00 0.00
> [9,] 0.00 0.00
>
> Easy enough. But I want to put it in a function where the number and
> dimensions of the matrices are not specified in advance. Something
> like the following.
>
> Using matrices (a) and (b) from above, let
>
> env <- list(a, b)
>
> Now, a function (or attempt at same) to perform the desired operations:
>
> vec <- function(matlist) {
>
>   n_mat <- length(matlist)
>   # size_mat^2 rows assumes square matrices, as in the example above
>   size_mat <- dim(matlist[[1]])[1]
>
>   # start from an empty (NULL) result and grow it column by column
>   result <- cbind()
>
>   for (i in 1:n_mat) {
>     result <- cbind(result, matrix(matlist[[i]], nrow = size_mat^2))
>   }
>
>   return(result)
> }
>
>
> When I run vec(env), I get the *right answer*, but I am wondering if
> there is a *better* way to get there from here than the approach I use
> (above). I'm not so much interested in 'computational efficiency' as I
> am in stability, and flexibility.
>
> Thanks...
>
>