[R-SIG-Finance] R-SIG-Finance Digest, Vol 52, Issue 2

Rosenthal, Dale W.R. daler at uic.edu
Tue Sep 2 18:00:40 CEST 2008


While not the most user-friendly process, you could also work with your
sysadmin to get a version of R compiled with 64-bit libraries.  That was
my method for beating the 2GB limit, and it worked very well.
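Once the new build is installed, a quick sanity check from inside R can
confirm you are actually running a 64-bit binary (a minimal sketch; the
exact platform string will vary by system):

    # On a 64-bit build pointers are 8 bytes; on a 32-bit build, 4.
    .Machine$sizeof.pointer

    # sessionInfo() also reports the platform, e.g. "x86_64-pc-linux-gnu".
    sessionInfo()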

While you're at it, you might also ask your sysadmin to compile and link
in GotoBLAS for the linear algebra routines.  That way you would get
multithreaded performance on some calculations.
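A rough way to see whether the threaded BLAS is actually being picked up
is to time a large dense matrix multiply before and after the switch (a
minimal sketch; the matrix size here is arbitrary):

    # With a threaded BLAS such as GotoBLAS this should run noticeably
    # faster and use several cores; with the reference BLAS it will not.
    n <- 2000
    m <- matrix(rnorm(n * n), n, n)
    system.time(m %*% m)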

Dale
-- 
Dale W.R. Rosenthal
Assistant Professor, Department of Finance
University of Illinois at Chicago
http://tigger.uic.edu/~daler
SSRN: http://ssrn.com/author=906862

On Tue, September 2, 2008 05:00, r-sig-finance-request at stat.math.ethz.ch
wrote:
> Message: 2
> Date: Mon, 01 Sep 2008 22:18:36 +0100
> From: Rory Winston <rory.winston at gmail.com>
> Subject: Re: [R-SIG-Finance] Is there a way to overcome 2 gigabytes,
> 	data set	limit in R?
> To: r-sig-finance at stat.math.ethz.ch
> Message-ID: <48BC5C2C.9020507 at gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> There has been a lot of work in this area over the last year or more.
> ff is a well-known package designed to handle this, but there is also
> a comparable package called bigmemory, which may be slightly easier to
> use.
>
> Cheers
> Rory
>> I am doing a survival analysis of a large group of loans and would
>> like to know if there is a way to do it in R with a dataset of more
>> than 2 gigabytes, specifically on a 64-bit OS.
>
>
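For what it's worth, the bigmemory route Rory mentions looks roughly
like this (a minimal sketch; the file name loans.csv and its column
layout are hypothetical, and big.matrix objects hold numeric data only,
so you would still need a survival routine that can work on the data in
chunks):

    library(bigmemory)

    # Read a large CSV into a file-backed big.matrix, so the data live
    # on disk rather than in RAM.
    loans <- read.big.matrix("loans.csv", header = TRUE,
                             type = "double",
                             backingfile = "loans.bin",
                             descriptorfile = "loans.desc")

    dim(loans)        # dimensions without loading everything into memory
    head(loans[, 1])  # columns can be pulled into RAM piecemeal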







