[Rd] huge data?

Jay Emerson jayemerson at gmail.com
Wed Jun 25 14:59:11 CEST 2008


I'm afraid I don't have much to add to this discussion.  I don't
know enough (or anything) about "huge pages" to assess the possible
performance gains, but my instinct is that they will be negligible
compared to other costs (e.g. a copy-on-write kicking in on a big R
object), in which case this would not be a good way to spend your
time.  I would encourage you to make sure you understand the other
aspects of handling large objects in R before investing too much time
and effort in this one direction.
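To make that cost concrete: under R's copy-on-modify semantics, the
first write to a shared vector duplicates the whole object.  A toy C
sketch (the vec type and vec_write helper are inventions for
illustration, not R's internals) of why that copy dwarfs any
page-size effect:

    #include <stdlib.h>
    #include <string.h>

    /* Toy copy-on-write vector; illustrative only, not R's API. */
    typedef struct { double *data; size_t n; int refs; } vec;

    /* Writing to a shared vector forces a full O(n) duplication:
       for a 1 GB object that is ~1 GB of memcpy traffic, far more
       than any plausible TLB saving from 1 MB or 16 MB pages. */
    static vec *vec_write(vec *v, size_t i, double x)
    {
        if (v->refs > 1) {
            vec *w = malloc(sizeof *w);
            w->data = malloc(v->n * sizeof *w->data);
            memcpy(w->data, v->data, v->n * sizeof *w->data);
            w->n = v->n;
            w->refs = 1;
            v->refs--;
            v = w;
        }
        v->data[i] = x;
        return v;
    }

    int main(void)
    {
        vec a = { calloc(1000, sizeof(double)), 1000, 2 }; /* shared */
        vec *b = vec_write(&a, 0, 3.14);  /* triggers the full copy */
        free(b->data); free(b); free(a.data);
        return 0;
    }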

Jay


<<previous message copied below>>

Hi Jay Emerson,
     Our intention is primarily to optimize R to exploit the parallel
processing capabilities of the Cell BE processor.  (Has any work been
done in this area?)

We have huge pages (of sizes 1 MB and 16 MB) available on the system,
and as you pointed out, our data is also in the GB range.  The idea is
that if vectors of this size are allocated from huge pages, performance
will naturally increase.  How would we implement that, and how should
we proceed?
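For illustration, one Linux route is to back an allocation with a file
on a hugetlbfs mount; a minimal sketch follows.  The /mnt/huge mount
point, the huge_alloc helper, and the 16 MB page size are assumptions
about the setup, not a tested Cell configuration.

    #include <fcntl.h>
    #include <stdlib.h>
    #include <sys/mman.h>
    #include <unistd.h>

    /* Sketch: carve an allocation out of huge pages via hugetlbfs.
       Assumes hugetlbfs is mounted at /mnt/huge and enough pages
       are reserved (echo N > /proc/sys/vm/nr_hugepages). */
    static void *huge_alloc(size_t bytes)
    {
        const size_t hpsz = 16UL << 20;        /* assume 16 MB pages */
        size_t len = (bytes + hpsz - 1) & ~(hpsz - 1);
        int fd = open("/mnt/huge/rvec", O_CREAT | O_RDWR, 0600);
        if (fd < 0) return NULL;
        void *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                       MAP_SHARED, fd, 0);
        unlink("/mnt/huge/rvec");  /* pages freed when unmapped */
        close(fd);                 /* the mapping keeps them alive */
        return (p == MAP_FAILED) ? NULL : p;
    }

Whether the TLB win shows up at all would have to be measured, and
R's allocVector would still need to route large requests through such
a hook, which is a nontrivial change to memory.c.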

Also, the upper limit of node class 6 is specified as 128 bytes.
Would there be any side effects from increasing this to 512 bytes or
so (depending on the average size of the application's data)?
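For context: R rounds small vectors up to fixed node-class sizes and
sends anything above the largest class to malloc individually, so
raising the cap trades malloc traffic for internal fragmentation;
every request between 128 and 512 bytes would be rounded up to a
class size.  A toy sketch of the rounding, with made-up class sizes
rather than R's actual table in src/main/memory.c:

    #include <stdio.h>

    /* Hypothetical size classes (bytes of vector data per node);
       the real values live in R's src/main/memory.c. */
    static const size_t classes[] = { 8, 16, 32, 64, 128 };
    #define NCLASSES (sizeof classes / sizeof classes[0])

    /* Requests at or under the cap are rounded up to a class;
       anything larger is allocated exactly via malloc. */
    static size_t rounded_size(size_t need)
    {
        for (size_t i = 0; i < NCLASSES; i++)
            if (need <= classes[i])
                return classes[i];
        return need;                   /* "large vector" path */
    }

    int main(void)
    {
        /* With a 128-byte cap, a 136-byte request is exact (136);
           raising the cap to 512 with classes 256 and 512 would
           round it to 256, wasting 120 bytes per node. */
        printf("136 -> %zu bytes\n", rounded_size(136));
        return 0;
    }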




Thanks in advance and regards,
R.Subramanian

-- 
John W. Emerson (Jay)
Assistant Professor of Statistics
Department of Statistics
Yale University
http://www.stat.yale.edu/~jay


