[R] classification for huge datasets: SVM runs into memory trouble

Christoph Lehmann christoph.lehmann at gmx.ch
Mon Dec 13 13:27:11 CET 2004


Hi
I have a matrix with 30 observations and roughly 30000 variables; each
observation belongs to one of two groups. With svm and slda I run into
memory trouble ('cannot allocate vector of size' roughly 2 GB). PCA
followed by LDA runs fine. Is there any way to work around the memory
issue with SVMs? Or can you recommend another classification method
for such huge datasets?
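
In case it clarifies what I mean by the PCA route: here is an untested
sketch of reducing the data with PCA before feeding it to svm (this
assumes the e1071 package; the matrix and labels below are just
placeholders standing in for my real data):

library(e1071)

## placeholder data with the same shape as my real matrix
set.seed(1)
x <- matrix(rnorm(30 * 30000), nrow = 30)   # 30 obs x 30000 variables
y <- factor(rep(c("a", "b"), each = 15))    # two groups

## with only 30 (centered) observations there are at most 29
## informative components, so PCA shrinks the problem from
## 30 x 30000 down to 30 x 30 scores
pc  <- prcomp(x)
fit <- svm(pc$x, y, kernel = "linear")

## new samples would have to be projected onto the same
## components first, e.g.:
## pred <- predict(fit, predict(pc, newdata))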


P.S. I'm running SUSE 9.1 on a Pentium IV machine with 2 GB of RAM.
Thanks for any hints

Christoph



