[R] aws.s3::s3sync Error in curl::curl_fetch_disk(url, x$path, handle = handle) : Failed to open file

nevil amos
Thu Jul 16 11:28:11 CEST 2020


I am trying to use aws.s3::s3sync to sync the contents of a bucket to an
AWS EC2 instance (via RStudio Server).

My script is as follows. I cannot provide a reproducible example since it
would need to be run from the instance with my account key and secret
(replaced with xxxxxxxxxxxxxx below).

All the files I am trying to sync are under the S3 prefix FAME_FMR, which
is the only top-level "directory" in the bucket.
Have I missed a setting?
If not, any suggestions on this error?
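
For reference, this is roughly how I have been checking that everything sits
under that prefix (just a minimal sketch; it assumes get_bucket_df() accepts
max = Inf to page through the full listing, as its documentation describes):

library(aws.s3)

## list every object under the FAME_FMR prefix (the only top-level "directory")
objs <- get_bucket_df(bucket = "ecological-risk-analysis",
                      prefix = "FAME_FMR/",
                      max    = Inf)
head(objs$Key)
nrow(objs)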

thanks

Nevil Amos



library(aws.s3)
Sys.setenv("AWS_ACCESS_KEY_ID" = "xxxxxxxxxxxxxxxxxxxxxxxxx",
           "AWS_SECRET_ACCESS_KEY" =
"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
           "AWS_DEFAULT_REGION" = "ap-southeast-2")
myBucket<-"ecological-risk-analysis"
myPath<-"~/ShinyApps"

> bucket_exists(myBucket)[1]
[1] TRUE
> dir.exists(myPath)
[1] TRUE

> s3sync(path = myPath, bucket = myBucket)

 [998] "FAME_FMR/HDMS/75m/BinaryThresholded/Satin_Bowerbird_Spp10679_Thresholded_Binary.tif"
 [999] "FAME_FMR/HDMS/75m/BinaryThresholded/Satin_Flycatcher_Spp10366_Thresholded_Binary.tif"
[1000] "FAME_FMR/HDMS/75m/BinaryThresholded/Scaly_breasted_Lorikeet_Spp10256_Thresholded_Binary.tif"
 [ reached getOption("max.print") -- omitted 191 entries ]
1191 bucket objects not found in local directory
<== Saving object 'FAME_FMR/' to '~/ShinyApps/FAME_FMR/'
Error in curl::curl_fetch_disk(url, x$path, handle = handle) :
  Failed to open file /home/rstudio/ShinyApps/FAME_FMR/.
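
The last object mentioned before the error is the key 'FAME_FMR/' itself, so
as a possible workaround I am considering downloading objects individually
with save_object() and skipping any keys that end in "/" (again just a sketch,
under the assumption that those zero-byte "folder" placeholder keys are what
curl_fetch_disk() cannot write to disk):

objs <- get_bucket_df(bucket = myBucket, prefix = "FAME_FMR/", max = Inf)

for (key in objs$Key) {
  ## skip the "directory" placeholder keys such as "FAME_FMR/"
  if (grepl("/$", key)) next

  local_file <- file.path(myPath, key)
  dir.create(dirname(local_file), recursive = TRUE, showWarnings = FALSE)
  save_object(object = key, bucket = myBucket, file = local_file)
}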
