[R] Exception while using NeweyWest function with doMC

David Winsemius dwinsemius at comcast.net
Tue Aug 30 17:46:41 CEST 2011


On Aug 30, 2011, at 11:29 AM, Simon Zehnder wrote:

> Hi David,
>
> thank you very much for your advice! I updated R and all my
> packages. Unfortunately it still doesn't work. But I do think the
> parallel processing (using 32-bit) improves run time, especially
> in higher dimensions:
>
> system.time(simuFunctionSeq(0.03, 0.015, 1, 5, 1000, 100, "/Users/simon/Documents/R/BigMTest"))
> system.time(simuFunctionPar(0.03, 0.015, 1, 5, 1000, 100, "/Users/simon/Documents/R/BigMTest"))
> [1] "Sequential Processing with N =  1000  and K =  100"
>   user  system elapsed
>  5.157   0.086   5.587
> [1] "Parallel Processing with N =  1000  and K =  100"
>   user  system elapsed
>  6.069   0.220   3.895
>
> system.time(simuFunctionSeq(0.03, 0.015, 1, 5, 10000, 100, "/Users/simon/Documents/R/BigMTest"))
> system.time(simuFunctionPar(0.03, 0.015, 1, 5, 10000, 100, "/Users/simon/Documents/R/BigMTest"))
> [1] "Sequential Processing with N =  10000  and K =  100"
>   user  system elapsed
>  8.129   0.689  12.747
> [1] "Parallel Processing with N =  10000  and K =  100"
>   user  system elapsed
>  8.387   0.772  12.005
>
> system.time(simuFunctionSeq(0.03, 0.015, 1, 5, 10000, 1000, "/Users/simon/Documents/R/BigMTest"))
> system.time(simuFunctionPar(0.03, 0.015, 1, 5, 10000, 1000, "/Users/simon/Documents/R/BigMTest"))
> [1] "Sequential Processing with N =  10000  and K =  1000"
>   user  system elapsed
> 71.295   6.330 109.656
> [1] "Parallel Processing with N =  10000  and K =  1000"
>   user  system elapsed
> 50.943   6.347  89.115
>
> Or are the times negligible?

I would think that for most applications, an efficiency gain of 20%
would be considered unworthy of the effort of setting up and
maintaining a parallel workflow. I suppose if a simulation ran for
18 hours in sequential mode, and you would be happier leaving it
overnight and finding in the morning that it had completed in 15
hours, it might be worth the effort.

> What happens if I use a supercomputer with several cores and much  
> more memory?

Or even a Mac Pro with 4 or 8 cores and 32-64 GB? Generally you hope
to see run times halved or quartered when you apply these techniques.
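As a rough illustration of the kind of comparison being discussed, here is a minimal sketch (not the poster's code) using base R's `parallel` package, which uses the same fork-based mechanism as doMC; the task and sizes are made up:

```r
# Minimal sketch, not the poster's code: compare a sequential lapply()
# with a fork-based mclapply() on a toy CPU-bound task.
library(parallel)

# Made-up stand-in for one simulation replication.
one_rep <- function(n) sum((1:n)^2)

sizes <- rep(2e5, 8)

t_seq <- system.time(res_seq <- lapply(sizes, one_rep))
t_par <- system.time(res_par <- mclapply(sizes, one_rep, mc.cores = 2))

# Both variants must agree; the elapsed times show the (often modest)
# gain on a two-core machine.
identical(res_seq, res_par)
```

On a dual-core machine the parallel elapsed time is at best about half the sequential one, and per-task overhead usually eats into even that, which is consistent with the modest gains reported above.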

-- 
David.

>
> Thanks again!
>
> Simon
>
>
>
> On Aug 29, 2011, at 6:59 PM, David Winsemius wrote:
>
>>
>> On Aug 27, 2011, at 3:37 PM, Simon Zehnder wrote:
>>
>>> Dear R users,
>>>
>>> I am using R right now for a simulation of a model that needs a  
>>> lot of
>>> memory. Therefore I use the *bigmemory* package and - to make it  
>>> faster -
>>> the *doMC* package. See my code posted on http://pastebin.com/dFRGdNrG
>>>
>>> Now, if I use the foreach loop with %do% (for a sequential run),
>>> I have no problems at all - only occasional singularities in the
>>> regressor matrices, which should be fine.
>>> BUT if I run the loop on multiple cores, I very often get a bad
>>> exception. I have posted the exception on
>>> http://pastebin.com/eMWF4cu0 The exception comes from the
>>> NeweyWest function loaded from the sandwich package.
>>>
>>> I have no clue what it is trying to tell me, or why it is printed
>>> so strangely to the terminal. I am used to getting errors here
>>> and there, but the messages never look like this.
>>>
>>> Does anyone have a useful suggestion as to where to look for the
>>> cause of this weird error?
>>>
>>> Here some additional information:
>>>
>>> Hardware: MacBook Pro, 2.66 GHz Intel Core Duo, 4 GB 1067 MHz
>>> DDR3 memory
>>> Software System: Mac OS X Lion 10.7.1 (11B26)
>>> Software App: R64 version 2.11.1, run via the Mac terminal
>>
>> Using the R64 version in a 4 GB environment will reduce the
>> effective memory capacity, since the larger pointers take up more
>> space, and parallel methods are unlikely to improve performance
>> very much with only two cores. It also seems likely that there
>> have been several bug fixes in the couple of years since that
>> version of R was released, so the package authors are unlikely to
>> be very interested in segfault errors thrown by outdated software.
>>
>>> I hope someone has a good suggestion!
>>
>> Update R. Don't use features that only reduce performance and
>> destabilize a machine with limited resources.
>>
>> -- 
>>
>> David Winsemius, MD
>> West Hartford, CT
>>
>
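One practical way to make the garbled output from parallel workers readable, sketched here as a hypothetical example (the `stop()` only simulates a failure such as a singular regressor matrix; it is not the thread's actual code):

```r
# Sketch: wrap each iteration in tryCatch() so a worker's error is
# returned as a value, instead of several processes printing to the
# terminal at once and interleaving their messages.
worker <- function(i) {
  tryCatch({
    if (i == 2) stop("singular regressor matrix")  # simulated failure
    i^2
  }, error = function(e) {
    # Return the message, tagged so it can be found afterwards.
    structure(conditionMessage(e), class = "worker_error")
  })
}

results <- lapply(1:4, worker)
failed  <- vapply(results, inherits, logical(1), what = "worker_error")
```

With foreach, the `.errorhandling = "pass"` argument achieves something similar: errors are passed back in the result list rather than printed from the workers.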

David Winsemius, MD
West Hartford, CT


