[R] how "big" (in RAM and/or disk storage) is each of these objects in a list?
murdoch.duncan at gmail.com
Sat Nov 26 19:55:52 CET 2011
On 11-11-26 1:41 PM, Paul Johnson wrote:
> We generated a bunch of results and saved them in an RData file. We
> can open, use, all is well, except that the size of the saved file is
> quite a bit larger than we expected. I suspect there's something
> floating about in there that one of the packages we are using puts in,
> such as a spare copy of a data frame that is saved in some subtle way
> that has escaped my attention.
> Consider a list of objects. Are there ways to do these things:
> 1. ask R how much memory is used by the things inside the list?
You can use object.size, but read the man page: it is not a completely
reliable measure (storage shared between objects may be counted more
than once, and environments are not followed).
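As a hedged sketch of the idea (the list contents here are invented for
illustration), you can apply object.size to each element of a list to see
where the bulk is:

```r
# Per-element sizes of a list via object.size().  Per ?object.size these
# numbers are approximate: shared components can be counted more than
# once, and environments are not followed.
x <- list(a = rnorm(1e5), b = matrix(0, 100, 100), c = letters)
sizes <- sapply(x, object.size)
sort(sizes, decreasing = TRUE)        # largest elements first
format(object.size(x), units = "Mb")  # rough total for the whole list
```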
> 2. Does "as.expression(anObject)" print everything in there? Or, is
> there a better way to convert each thing to text or some other format
> that you can actually read line by line to see what is in there, to
> "see" everything?
No, as.expression won't necessarily work. save(..., ascii=TRUE) will
show you everything, but it's not designed to be readable. Probably the
most useful function is str().
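For instance (the fitted model here is just a stand-in for your saved
results), str() gives a compact line-by-line summary of what an object
actually contains:

```r
# str() prints a compact, line-by-line view of an object's structure,
# which is much easier to scan than save(..., ascii = TRUE) output.
fit <- lm(mpg ~ wt, data = mtcars)
results <- list(fit = fit, coefs = coef(fit))
str(results, max.level = 2)   # limit nesting depth for readability
```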
> If there's no giant hidden data frame floating about, I figure I'll
> have to convert symmetric matrices to lower triangles or such to save
> space. Unless R already is automatically saving a matrix in that way
> but just showing me the full matrix, which I suppose is possible. If
> you have other ideas about general ways to make saved objects smaller,
> I'm open for suggestions.
You could try different compression methods (see ?save), but probably
the best idea is to identify the things that you didn't mean to include,
and don't include them. A common source of unexpected bloat is objects
such as functions or formulas, which carry their environment (and
everything in it) with them.
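A small sketch of how that happens (make_formula and the 'big' vector are
invented for the example; resetting the environment is only safe if the
formula doesn't actually need anything from it):

```r
# A formula keeps a reference to the environment where it was created,
# so saving it can drag along every object in that environment.
make_formula <- function() {
  big <- rnorm(1e6)   # ~8 Mb of local data captured with the formula
  y ~ x               # the formula's environment still contains 'big'
}
f <- make_formula()
ls(environment(f))    # shows "big" is still reachable from the formula

# Comparing serialized sizes reveals the hidden payload:
length(serialize(f, NULL))        # large: includes 'big'
environment(f) <- globalenv()     # strip it, if the formula allows
length(serialize(f, NULL))        # small afterwards
```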