[R] Memory getting eaten up with XML

Andrew Gormley GormleyA at landcareresearch.co.nz
Tue Dec 6 04:01:39 CET 2011


Hi all. I have an issue that I cannot resolve. I am trying to read in a large amount of data stored in XML files. After I read each file in and copy out the relevant data, I remove the document, but the memory is not freed: when I monitor the process in the Windows Task Manager, memory usage climbs with each iteration until R crashes. I can reproduce the problem with this small example:
        library(XML)                       # needs the XML package loaded
        file.name <- "C:\\MyData.xml.gz"
        TEMPP <- xmlParse(file.name)       # parses into an internal (C-level) document
        xx <- xmlRoot(TEMPP)
        rm(xx)
        rm(TEMPP)
        gc()                               # memory reported by Task Manager does not drop

Even though I remove the root node xx and the document TEMPP, memory usage stays where it was right after the file was read in... Any ideas or solutions?
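
For reference, the real job follows roughly this pattern inside a loop over many files; the directory, file pattern and the extraction step below are just placeholders for what I actually do, but memory grows the same way on each pass:

        library(XML)
        files <- list.files("C:\\MyXmlData", pattern = "\\.xml\\.gz$", full.names = TRUE)
        results <- vector("list", length(files))
        for (i in seq_along(files)) {
            doc <- xmlParse(files[i])
            root <- xmlRoot(doc)
            # copy the values I need out into plain R objects (placeholder step)
            results[[i]] <- xmlSApply(root, xmlValue)
            rm(root, doc)                  # remove the document and node objects
            gc()                           # memory in Task Manager keeps climbing anyway
        }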
I am using 32-bit R 2.14.0 on Windows XP with the latest version of the XML package (3.6.1).
Many thanks
Andrew



