[BioC] how to solve the memory problem
Sean Davis
sdavis2 at mail.nih.gov
Sat Jan 14 17:12:44 CET 2012
On Sat, Jan 14, 2012 at 10:17 AM, wang peter <wng.peter at gmail.com> wrote:
> I have already got the answer from my friends.
>
> When you load big data into memory, you often hit a memory problem
> even though you have enough physical memory to load the data.
> That is because Linux uses much of your free memory to cache files
> while you do calculations. You can run the following command to
> free it:
>
> echo 3 > /proc/sys/vm/drop_caches
Hi, Shao Gan.
Thank you for sharing, but I do not think this is either necessary or
relevant for successful use of Bioconductor. There is no need to do
this on modern Linux systems, as the cache is automatically downsized
when RAM is allocated to running programs. I would not expect the
command above to affect the amount of memory available to
R/Bioconductor in any way. In fact, emptying the cache may, in some
cases, be detrimental to performance if one is, for example, reading
or writing files.
I just wanted to be clear that what you suggest is not a recommended
approach to dealing with memory issues in R and is not likely to be
successful (since the cache is not the problem).
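To illustrate the point (a minimal sketch, not part of the original
exchange): on Linux kernels 3.14 and later, /proc/meminfo reports a
MemAvailable figure that already accounts for reclaimable page cache,
so memory "used" for caching files is still effectively available to a
new R session without dropping the cache by hand.

```shell
# Compare "free" memory with "available" memory on Linux.
# MemAvailable estimates how much memory a new process (e.g. an
# R session) can use without swapping; it includes reclaimable
# page cache, so a large Cached value does not mean less memory
# for R.
grep -E '^(MemTotal|MemFree|MemAvailable|Cached):' /proc/meminfo
```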
Sean