[R] running a scraping code in parallel...
akshay kulkarni
akshay_e4 at hotmail.com
Thu Jul 14 20:32:40 CEST 2022
Dear members,
Please feel free to ignore this mail if you feel that it is not about base R.
I have the following web scraping code (I have 500 stocks to iterate over):
getFirmsDates <- function(i) {
  rD <- rsDriver(browser = "chrome")  # rsDriver() from RSelenium: starts a Selenium server and a Chrome session
  remDr <- rD$client
  ## { scrape for stock i }
}
Will the following code work?
DATES <- mclapply(1:500, getFirmsDates, mc.cores = 48)
Basically, this would need 500 Chrome instances, and rD and remDr would be the same for all iterations. If this won't work, do you have any suggestions on how to accomplish the task?
I am using RSelenium and rvest packages.
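To make my question concrete, here is a rough sketch of the alternative I imagine, where each iteration starts (and stops) its own Selenium server and Chrome instance on its own port instead of sharing rD and remDr. The port scheme (4444 + i) and the cleanup are just my assumptions; the scraping body is still a placeholder:

library(parallel)
library(RSelenium)

## Assumed per-iteration setup: a separate port per index, so no two
## iterations share a Selenium server or a Chrome session.
getFirmsDates <- function(i) {
  port <- as.integer(4444 + i)                  # placeholder port scheme
  rD <- rsDriver(browser = "chrome", port = port, verbose = FALSE)
  remDr <- rD$client
  on.exit({
    remDr$close()       # close this iteration's browser session
    rD$server$stop()    # stop this iteration's Selenium server
  }, add = TRUE)
  ## { scrape for stock i }
}

DATES <- mclapply(1:500, getFirmsDates, mc.cores = 48)

Would something along these lines be the right way to go, or is there a better approach?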
Thanking you,
yours sincerely,
AKSHAY M KULKARNI