[R] memory usage upon web-query using try function

cir p cirkus01 at yahoo.com
Sun Jun 26 09:20:57 CEST 2011


Dear Community,
my program below runs quite slowly, and I'm not sure whether the HTTP requests 
are to blame. While it runs, it also increases memory usage enormously, and the 
memory is not freed after the program finishes. Can someone point out a problem 
in the code? Sorry for the basic question, but I am totally new to R 
programming...

Many thanks for your time,
Cyrus

require(XML)

row <- 0
URL <- "http://de.finance.yahoo.com/lookup?s="
## 100000:200000 covers 100001 tickers, so reserve 100001 rows
df <- matrix(ncol = 6, nrow = 100001)

for (Ticker in 100000:200000) {
  URLTicker <- paste(URL, Ticker, sep = "")
  ## try() returns a "try-error" object when the request or the parse fails
  query <- try(readHTMLTable(URLTicker,
                             which = 2,
                             header = TRUE,
                             colClasses = rep("character", 6),
                             stringsAsFactors = FALSE)[1, ],  # keep first row only
               silent = TRUE)
  if (inherits(query, "data.frame")) {  # skip failed requests
    row <- row + 1
    df[row, ] <- as.character(query)
  }
}
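
In case it is the parsed documents that pile up in memory, a variant like the 
following might behave better. This is only a sketch: it assumes the XML 
package's htmlParse() and free() work as documented (free() should release the 
C-level document memory that gc() cannot reach on its own), the fetch_row() 
helper is just a name I made up, and the periodic gc() call is a guess.

require(XML)

URL <- "http://de.finance.yahoo.com/lookup?s="
df  <- matrix(ncol = 6, nrow = 100001)
row <- 0

fetch_row <- function(url) {
  doc <- htmlParse(url)   # parse once and keep a handle to the document
  on.exit(free(doc))      # release the C-level document even if the table extraction fails
  readHTMLTable(doc, which = 2, header = TRUE,
                colClasses = rep("character", 6),
                stringsAsFactors = FALSE)[1, ]
}

for (Ticker in 100000:200000) {
  query <- try(fetch_row(paste(URL, Ticker, sep = "")), silent = TRUE)
  if (inherits(query, "data.frame")) {
    row <- row + 1
    df[row, ] <- as.character(query)
  }
  if (Ticker %% 1000 == 0) gc()   # occasionally trigger garbage collection (assumption: may help)
}

No idea whether this actually stops the growth, but it at least makes the 
lifetime of each parsed page explicit.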


