[R] parallel number of cores according to memory?
Jeff Newmiller
jdnewmil at dcn.davis.ca.us
Wed Jul 8 06:40:50 CEST 2020
Use an operating system that supports forking, like Linux or macOS, and use the parallel package's mclapply function or similar to share memory for read operations. [1]
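For example, here is a minimal untested sketch of the forking approach (assuming a Unix-alike; 'big', the group-mean computation, and mc.cores = 4 are all just placeholders standing in for your real workload):

library(parallel)

## stand-in for the large read-only object in the question
big <- data.frame(x = rnorm(1e6), g = sample(1:100, 1e6, replace = TRUE))

## forked workers inherit 'big' by copy-on-write: memory pages are
## shared until written to, so reading it does not duplicate it in
## each child process
res <- mclapply(unique(big$g),
                function(grp) mean(big$x[big$g == grp]),
                mc.cores = 4)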
And stop posting in HTML here.
[1] https://cran.r-project.org/web/views/HighPerformanceComputing.html
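As for guessing mc.cores from memory (your question below): I am not aware of a packaged function for that, but something along these lines could work. This is an untested, Linux-only sketch; guess_cores is a made-up name, the gc() figure is only a crude lower bound on what R actually holds, and the MemAvailable parsing assumes Linux's /proc/meminfo:

guess_cores <- function(max_cores = parallel::detectCores()) {
  ## rough lower bound on memory held by R, in MB
  ## (sum of the Ncells and Vcells "(Mb)" used column from gc())
  used_mb <- sum(gc()[, 2])
  ## available system RAM in MB, Linux-only: parse MemAvailable (in kB)
  meminfo  <- readLines("/proc/meminfo")
  avail_kb <- as.numeric(sub("^MemAvailable:[[:space:]]+([0-9]+) kB$", "\\1",
                             grep("^MemAvailable:", meminfo, value = TRUE)))
  avail_mb <- avail_kb / 1024
  ## conservative under forking, since copy-on-write pages are shared
  max(1L, min(max_cores, floor(avail_mb / used_mb)))
}

## e.g.: mclapply(X, FUN, mc.cores = guess_cores())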
On July 7, 2020 9:20:39 PM PDT, ivo welch <ivowel at gmail.com> wrote:
>if I understand correctly, R makes a copy of the full environment for
>each process. thus, even if I have 32 processors, if I only have 64GB
>of RAM and my R process holds about 10GB, I should probably not spawn
>32 processes.
>
>has anyone written a function that guesses an appropriate number of
>cores (for mclapply) from memory requirements (e.g.,
>"amount-of-RAM"/"RAM held by R")?
>
>(it would be even nicer if I could declare my 8GB data frame to be
>read-only and to be shared among my processes, but this is presumably
>technically very difficult.)
>
>pointers appreciated.
>
>/iaw
>
> [[alternative HTML version deleted]]
--
Sent from my phone. Please excuse my brevity.