[R] reading multiple text files from web
Tom Wright
tom at maladmin.com
Wed Mar 18 20:37:56 CET 2015
I think you need to use a loop to iterate over each of the items in getlinks, e.g.:
# collect one result per link instead of overwriting 'output' each time
results <- list()
for(link in getlinks)
{
    url <- paste0('http://spec.org/jEnterprise2010/results/', link)
    results[[link]] <- readfiles(url)
}
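If you'd rather not manage the output list by hand, lapply() does the same job in one call (this assumes getlinks is a character vector of the result-page paths):

results <- lapply(getlinks, function(link) {
    # build the full URL for each link and parse it
    readfiles(paste0('http://spec.org/jEnterprise2010/results/', link))
})
# keep track of which link produced which result
names(results) <- getlinks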
You're probably also going to need to add some error handling for the case
where your search string doesn't exist on a page.
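Untested, but something along these lines might do as a starting point. I've
re-typed your sm string from the quoted mail (it wrapped there, so check it
against the actual pages), and the warning/NULL on a miss is just one way to
handle it. Note also that your original function only returned the "OS Name"
line, because only the last expression of a function is returned, so I've
wrapped both results in c():

readfiles <- function(x) {
    a <- readLines(x)
    sm <- "Java EE AppServer & Database Server HW (SUT hardware)"
    s <- grep(sm, a, fixed=TRUE)
    if (length(s) == 0) {
        # search string not on this page: warn and return NULL instead of erroring
        warning("search string not found in ", x)
        return(NULL)
    }
    s <- s[1]
    e <- grep("^\\S", a[-(1:s)])[1]
    block <- a[(s+1):(s+e-1)]
    # return both lines, not just the last expression
    c(grep("OS Vendor", block, fixed=TRUE, value=TRUE)[1],
      grep("OS Name", block, fixed=TRUE, value=TRUE)[1])
}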
On Wed, 2015-03-11 at 23:08 -0700, Kruti Pandya wrote:
> readfiles=function(x) {
>     a<-readLines(x)
>     sm <- "Java EE AppServer & Database Server HW (SUT hardware)"
>     s<-grep(sm, a, fixed=TRUE)
>     e<-grep("^\\S", a[-(1:s)])[1]
>     grep("OS Vendor", a[(s+1):(s+e-1)], fixed=T, value=T)[1]
>     grep("OS Name", a[(s+1):(s+e-1)], fixed=T, value=T)[1]
> }