[R] parallel number of cores according to memory?
jdnewmil sending from dcn.davis.ca.us
Wed Jul 8 06:40:50 CEST 2020
Use an operating system that supports forking, like Linux or macOS, and use the parallel package's mclapply function (or similar) to share memory for read operations.
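The advice above can be sketched as follows. This is a minimal illustration, not code from the thread: on Linux or macOS, mclapply forks the parent R process, so a large object created before the fork is visible to every worker through copy-on-write pages rather than being serialized and copied.

```r
library(parallel)  # ships with base R; mclapply forks on Linux/macOS

big <- rnorm(1e7)  # ~80 MB vector created once in the parent process

# Forked children inherit `big` via copy-on-write: memory pages are
# shared with the parent until a child writes to them, so purely
# read-only access costs (almost) no extra RAM per worker.
res <- mclapply(1:4,
                function(i) mean(big[seq(i, length(big), by = 4)]),
                mc.cores = 4)
```

Note that on Windows mclapply runs serially (mc.cores must be 1); a PSOCK cluster with parLapply works there, but it does copy the data to each worker.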
And stop posting in HTML here.
On July 7, 2020 9:20:39 PM PDT, ivo welch <ivowel using gmail.com> wrote:
>if I understand correctly, R makes a copy of the full environment for
>each process. thus, even if I have 32 processors, if I only have 64GB of
>RAM and my R process holds about 10GB, I should probably not spawn 32
>processes.
>has anyone written a function that sets the number of cores for use (in
>mclapply) to be guessed at by appropriate memory requirements (e.g.,
>"amount-of-RAM"/"RAM held by R")?
>(it would be even nicer if I could declare my 8GB data frame to be
>read-only and to be shared among my processes, but this is presumably
>technically very difficult.)
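A helper along the lines the question asks for could be sketched like this. Everything here is hypothetical, not an existing function: it assumes Linux (it reads /proc/meminfo for MemAvailable), and object.size() can underestimate objects that hold external pointers, so the result is only a rough guess.

```r
# Hypothetical helper: guess a safe mc.cores value from memory headroom.
# Worst case, each forked worker eventually duplicates the object
# (copy-on-write pages are copied as soon as a child modifies them),
# so budget one full copy per worker.
guess_cores <- function(obj_bytes) {
  meminfo  <- readLines("/proc/meminfo")           # Linux only
  avail_kb <- as.numeric(sub("[^0-9]*([0-9]+).*", "\\1",
                             grep("^MemAvailable:", meminfo, value = TRUE)))
  avail_bytes <- avail_kb * 1024
  max(1L, min(parallel::detectCores(),
              floor(avail_bytes / obj_bytes)))
}

df_bytes <- as.numeric(object.size(mtcars))  # stand-in for the 8GB frame
guess_cores(df_bytes)
```

As for the read-only shared data frame: forking already gives approximately that for free, as long as the workers only read the object and the garbage collector does not touch its pages.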
Sent from my phone. Please excuse my brevity.