[R] predictive modeling and extremely large data

Divyam divyamurali13 at gmail.com
Wed Sep 7 11:25:54 CEST 2011


Hi,

I am new to R, and here is what I am doing with it now. I am using a machine
learning technique (SVM) for predictive modeling. The data I am working with
is bound to grow perpetually. Suppose I initially feed a data set of 5000
points to the SVM. The algorithm derives a certain intelligence (i.e., a
fitted model) from these 5000 points. Today I have an additional 10000
points. If I now remove the first 5000 points and feed in only the new
10000, I want the algorithm to make use of the intelligence derived from the
initial 5000 points while evaluating the 10000 new ones, and the end result
to be an aggregated model over the full 15000 points. This is important to
me from an efficiency point of view. If there are any other packages in R
that do the same (i.e., that enable statistical models to learn continuously
from past experience while the prior data from which that intelligence was
derived is deleted), kindly post about them. It would be of immense help to
me.
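
For concreteness, here is a rough sketch of the kind of workflow I am
imagining, using the e1071 package. The package choice, the toy data, and
the idea of keeping only the support vectors between batches are all my own
assumptions; I do not know whether this approximation is statistically
sound.

library(e1071)  # provides svm(); this package choice is an assumption

# Toy stand-ins for the real data: 5000 initial points, two features,
# and a binary class label.
set.seed(1)
batch1 <- data.frame(x1 = rnorm(5000), x2 = rnorm(5000))
batch1$y <- factor(ifelse(batch1$x1 + batch1$x2 > 0, "up", "down"))

fit1 <- svm(y ~ x1 + x2, data = batch1, scale = FALSE)

# Keep only the support vectors (fit1$index holds their row numbers).
# They summarise what the model learned, so the remaining rows of the
# old batch can be deleted.
keep <- batch1[fit1$index, ]

# A new batch of 10000 points arrives later.
batch2 <- data.frame(x1 = rnorm(10000), x2 = rnorm(10000))
batch2$y <- factor(ifelse(batch2$x1 + batch2$x2 > 0, "up", "down"))

# Retrain on the retained support vectors plus the new batch, rather
# than on all 15000 original rows.
fit2 <- svm(y ~ x1 + x2, data = rbind(keep, batch2), scale = FALSE)

The hope is that fit2 behaves close to a model trained on all 15000 points,
without ever storing them all at once.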

Thanks in advance.

divya 
