[BioC] EBS volumes with the Bioconductor AMI: how to change default behaviour
Quin Wills
qilin at quinwills.net
Tue Aug 16 12:31:30 CEST 2011
Thanks a stack, Dan
Duly noted this side that only one EBS volume now launches.
And if the instance's shutdown behaviour is set to "terminate", the volume
detaches when I shut down from within R. Brilliant.
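In case it's useful to anyone else, here is a rough sketch of that set-up
(the instance ID is a placeholder, and it assumes the EC2 API tools and
credentials are configured on the local machine):

    ## On the local machine, before starting the job: make an
    ## instance-initiated shutdown terminate the instance rather than
    ## stop it (placeholder instance ID).
    system("ec2-modify-instance-attribute i-12345678 --instance-initiated-shutdown-behavior terminate")

    ## At the end of the R script running on the instance: queue a halt,
    ## which now terminates the instance and releases its volumes.
    shutdown <- function(time = 0)
        return(system(paste("echo 'sudo halt' | at now + ", time, " min", sep = "")))
    shutdown(1)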
Btw, the new AMI only seems to be available in the US East region.
Thanks again,
Quin
On 15 August 2011 21:10, Dan Tenenbaum <dtenenba at fhcrc.org> wrote:
>
> On Mon, Aug 15, 2011 at 1:40 AM, Quin Wills <qilin at quinwills.net> wrote:
> > Thanks Dan
> > I did think it slightly unusual that so many volumes are created and stay
> > attached. I look forward to hearing from you!
>
>
> BTW, this no longer happens. Be sure to use the latest AMI ID from:
> http://bioconductor.org/help/bioconductor-cloud-ami/
>
> Thanks,
> Dan
>
>
> > If you do have any advice on what you think would be the most appropriate
> > way for R to terminate an instance once a job is done, I'd also appreciate
> > it enormously.
> > Thanks,
> > Quin
> >
> >
> > On 12 August 2011 18:46, Dan Tenenbaum <dtenenba at fhcrc.org> wrote:
> >>
> >> On Fri, Aug 12, 2011 at 12:32 AM, Quin Wills <qilin at quinwills.net> wrote:
> >> > Thanks for the advice Dan.
> >> >
> >> > The reason I like to use S3 is that I like to run jobs, log out and
> >> > have them automatically
> >> > shut down when done. At the moment I'm just running the following
> >> > function for
> >> > automated shutdown of my instances from within my R script:
> >> >
> >> > shutdown <- function(time = 0)
> >> >     return(system(paste("echo 'sudo halt' | at now + ", time, " min", sep = "")))
> >> >
> >> > Even if I set my instance's shutdown behaviour to "terminate" (in the
> >> > AWS management console),
> >> > those EBS volumes seem to persist when I automate termination this way.
> >>
> >> Hi,
> >>
> >> I just tried running the Bioconductor AMI as a different user and I
> >> notice that those volumes are created and stay "attached" to the
> >> instance even after the instance is terminated. I'm not sure why they
> >> are created in the first place; I'll look into that and report back to
> >> you. You can safely detach and delete them after your instance is
> >> halted. You can do that in the EC2 web console, with the EC2 command-line
> >> tools, or by calling the EC2 API from a programming language. I can send
> >> you an example of the latter if you are interested.
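> >> For example, here is a rough sketch from R using the EC2 command-line
> >> (API) tools; the volume ID is a placeholder, and it assumes the tools and
> >> your AWS credentials are already configured:
> >>
> >>   ## after the instance has halted, detach and then delete a leftover volume
> >>   vol <- "vol-12345678"                    # placeholder volume ID
> >>   system(paste("ec2-detach-volume", vol))
> >>   system(paste("ec2-delete-volume", vol))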
> >>
> >>
> >> >
> >> > Do you perhaps have a recommendation on how better to make sure my
> >> > instance shuts
> >> > down once the job is done? Ideally it would be great if it could fire
> >> > off a quick
> >> > email too, but this doesn't seem so easy to do unless I create my own
> >> > AMI, I think.
> >> >
> >>
> >> You might look into Amazon's Simple Email Service.
> >> http://aws.amazon.com/ses/
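> >>
> >> As a very rough sketch only (assuming a recent AWS command-line interface
> >> is installed on the instance and the sender and recipient addresses have
> >> been verified with SES), a completion mail could be fired off from R:
> >>
> >>   ## hypothetical addresses; replace with your own verified SES identities
> >>   system(paste(
> >>       "aws ses send-email",
> >>       "--from you@example.com",
> >>       "--destination ToAddresses=you@example.com",
> >>       "--message 'Subject={Data=job-done},Body={Text={Data=done}}'"))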
> >>
> >> Dan
> >>
> >>
> >> > Thanks a ton,
> >> > Quin
> >> >
> >> >
> >> >>> On Thu, Aug 11, 2011 at 6:11 AM, Quin Wills <qilin at quinwills.net> wrote:
> >> >>> Hello Bioconductor AMI gurus
> >> >>>
> >> >>> Delighted that Bioconductor has an AMI with pre-loaded bells and
> >> >>> whistles.
> >> >>> I'm hardly an AWS guru (yet?), and in particular feel like all the
> >> >>> dots
> >> >>> aren't connecting in my brain regarding EBS.
> >> >>>
> >> >>> So I see that the Bioconductor AMI automatically creates 1 x 20 GiB
> >> >>> root EBS volume and 3 x 30 GiB extra volumes, correct?
> >> >>> What if I don't want
> >> >>> these? Presumably just detaching and deleting them in the AWS
> >> >>> management
> >> >>> console is one way to do it? Is this the only (reasonably easy) way?
> >> >>
> >> >>
> >> >>The AMI "lives" on these EBS volumes so you don't want to delete them.
> >> >>You may find you don't even own them.
> >> >>
> >> >>
> >> >>
> >> >>> For the moment I'm just using AWS for CPU-intensive work that I need
> >> >>> to
> >> >>> speed up. I have an S3 bucket and am using the omegahat RAmazonS3
> >> >>> library to
> >> >>> access and save data on a semi-permanent basis. Does this seem like a
> >> >>> reasonable tactic? For the moment, the sizes of the data objects in my
> >> >>> S3
> >> >>> bucket are manageable.
> >> >>
> >> >>If it works for you, it is reasonable. The reason we don't use S3 is
> >> >>that we find it slow, and it is a two-step process: push files to S3
> >> >>from your instance, then pull them from S3 to your local machine, as
> >> >>opposed to using scp to copy files directly in one step.
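> >> >>For instance, something like this run from your local machine pulls
> >> >>results straight off the instance in one step (key file, user name and
> >> >>host are placeholders for your own):
> >> >>
> >> >>  system("scp -i ~/.ssh/mykey.pem ubuntu@ec2-12-34-56-78.compute-1.amazonaws.com:~/results.rda .")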
> >> >>
> >> >>But if you find that S3 works for you, there's no reason not to use it.
> >> >>Dan
> >> >>
> >> >>> Perhaps there's a link to an idiot's guide on "EBS vs S3" options and
> >> >>> suggestions when using the Bioconductor AMI?
> >> >>>
> >> >>> Thanks in advance for any wisdom,
> >> >>> Quin
> >> >
> >> > _______________________________________________
> >> > Bioconductor mailing list
> >> > Bioconductor at r-project.org
> >> > https://stat.ethz.ch/mailman/listinfo/bioconductor
> >> > Search the archives:
> >> > http://news.gmane.org/gmane.science.biology.informatics.conductor
> >> >
> >
> >
> >
> > --
> >
> > Quin Wills
> > Live the kind of life that when you die half the world mourns the loss of a
> > great sentience, whilst the other half are just grateful that it's over.
> >
> > Brasenose College
> > Oxford
> > OX1 4AJ
> > tel: +44 (0)7951 335 714
> > inet: www.quinwills.net
> >
--
Quin Wills
Live the kind of life that when you die half the world mourns the loss
of a great sentience, whilst the other half are just grateful that
it's over.
Brasenose College
Oxford
OX1 4AJ
tel: +44 (0)7951 335 714
inet: www.quinwills.net