[R] Tip for performance improvement while handling huge data?

Suresh_FSFM suresh.ghalsasi at gmail.com
Sun Feb 8 18:39:20 CET 2009


Hello All,

For certain calculations, I have to handle a data frame with, say, 10 million
rows and multiple columns of different data types.
When I try to perform calculations on certain elements of each row, the
program just stays in "busy" mode for a very long time.
To avoid this, I split the data frame into subsets of 10,000 rows, and the
calculation then finished quickly, within a reasonable time.
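
To illustrate, here is a rough sketch of the chunked approach I used. The
data frame big_df and the function calc_row() below are just placeholders
for my actual data and calculation, and the data size is reduced for
illustration:

## Placeholder data and calculation, standing in for the real ones
set.seed(1)
big_df <- data.frame(x = rnorm(1e5), y = rnorm(1e5))
calc_row <- function(x, y) sqrt(x^2 + y^2)

## Process the data frame in chunks of 10,000 rows
chunk_size <- 10000
n <- nrow(big_df)
starts <- seq(1, n, by = chunk_size)

results <- vector("list", length(starts))
for (i in seq_along(starts)) {
  rows <- starts[i]:min(starts[i] + chunk_size - 1, n)
  chunk <- big_df[rows, ]
  ## here the calculation is vectorised within each chunk
  results[[i]] <- calc_row(chunk$x, chunk$y)
}
out <- unlist(results)

(In this sketch the per-chunk calculation is vectorised; my real calculation
is more involved and works on certain elements of each row.)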

Are there any other tips to improve the performance?

Regards,
Suresh
 



