[R] Proposal: Archive UseR conference presentations at www.r-project.org/useR-yyyy
hadley wickham
h.wickham at gmail.com
Tue Sep 25 16:17:23 CEST 2007
> > but I can understand your desire to do
> > that. Perhaps just taking a static snapshot using something like
> > wget, and hosting that on the R-project website would be a good
> > compromise.
>
> Hmm, wouldn't it be easier if the hosting institution would make a tgz
> file? wget over HTTP is rather bad in resolving links etc
Really? I've always found it to be rather excellent.
The reason I suggest it is that unless you have some way to generate a
static copy of the site, you'll need to ensure that the R-project
server supports any dynamic content. For example, the useR! 2008 site
uses some (fairly vanilla) PHP for including the header and footer.
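For reference, a static snapshot of that kind can be produced with wget's
mirroring options; this is only a sketch, and the URL below is a
placeholder rather than the actual conference site:

```shell
# Mirror a site into a self-contained static local copy.
# --mirror           : recursive download with timestamping
# --convert-links    : rewrite links so the copy browses locally
# --page-requisites  : also fetch the CSS, images, etc. each page needs
# --html-extension   : save dynamically generated pages (e.g. PHP) as .html
# --no-parent        : stay within the conference site's directory
wget --mirror --convert-links --page-requisites \
     --html-extension --no-parent \
     http://www.example.org/useR-2008/
```

--convert-links addresses the link-resolution concern above, and
--html-extension means PHP-generated pages end up as plain .html files that
need no server-side support.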
> we could include a note on the top page that this is only a snapshot
> copy and have a link to the original site (in case something changes
> there).
That's reasonable, although it would be even better to have it on every page.
> > The one problem is setting up a redirect so that existing links and
> > google searches aren't broken. This would need to be put in place at
> > least 6 months before the old website closed.
>
> Yes, very good point, I didn't think about that. But the R site is
> searched very often, so material there appears rather quickly on
> Google searches. Regarding bookmarks: I don't want to remove the old
> site, just have an archive copy at a central location.
In that case, should it be marked noindex, since it's just a cache of
material that should be available elsewhere? We would also need some
machine-readable way of indicating where the canonical resource lives.
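One machine-readable convention for this would be a robots noindex
directive combined with a canonical link pointing back to the original
site. A sketch (the URLs are hypothetical) that the archiving script could
inject into each page's head:

```html
<head>
  <!-- keep search engines from indexing the archived copy -->
  <meta name="robots" content="noindex">
  <!-- point them at the original, canonical page instead -->
  <link rel="canonical" href="http://www.example.org/useR-2008/program.html">
</head>
```

This keeps the archive out of search results while still telling crawlers
where the live, authoritative copy is.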
It's always frustrated me a little that when googling for R
documentation, you find hundreds of copies of the same page hosted at
different sites.
Hadley
--
http://had.co.nz/
More information about the R-help mailing list