From nakama at ki.rim.or.jp Wed Jul 1 07:20:08 2015 From: nakama at ki.rim.or.jp (Ei-ji Nakama) Date: Wed, 1 Jul 2015 14:20:08 +0900 Subject: [Rd] additional leap second Message-ID: hi, Index: leap_second/src/library/base/R/zdatetime.R =================================================================== --- leap_second/src/library/base/R/zdatetime.R (revision 68608) +++ leap_second/src/library/base/R/zdatetime.R (working copy) @@ -24,7 +24,8 @@ "1979-12-31", "1981-6-30", "1982-6-30", "1983-6-30", "1985-6-30", "1987-12-31", "1989-12-31", "1990-12-31", "1992-6-30", "1993-6-30", "1994-6-30","1995-12-31", - "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", "2012-6-30") + "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", + "2012-6-30", "2015-6-30") .leap.seconds <- strptime(paste(.leap.seconds , "23:59:60"), "%Y-%m-%d %H:%M:%S") c(as.POSIXct(.leap.seconds, "GMT")) # lose the timezone Best Regards, -- Eiji NAKAMA "\u4e2d\u9593\u6804\u6cbb" From murdoch.duncan at gmail.com Wed Jul 1 07:36:28 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Wed, 01 Jul 2015 07:36:28 +0200 Subject: [Rd] additional leap second In-Reply-To: References: Message-ID: <55937C5C.7030805@gmail.com> On 01/07/2015 7:20 AM, Ei-ji Nakama wrote: > hi, > > Index: leap_second/src/library/base/R/zdatetime.R > =================================================================== > --- leap_second/src/library/base/R/zdatetime.R (revision 68608) > +++ leap_second/src/library/base/R/zdatetime.R (working copy) > @@ -24,7 +24,8 @@ > "1979-12-31", "1981-6-30", "1982-6-30", "1983-6-30", > "1985-6-30", "1987-12-31", "1989-12-31", "1990-12-31", > "1992-6-30", "1993-6-30", "1994-6-30","1995-12-31", > - "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", "2012-6-30") > + "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", > + "2012-6-30", "2015-6-30") > .leap.seconds <- strptime(paste(.leap.seconds , "23:59:60"), > "%Y-%m-%d %H:%M:%S") > c(as.POSIXct(.leap.seconds, "GMT")) # lose the timezone 
> > Best Regards, > -- > Eiji NAKAMA > "\u4e2d\u9593\u6804\u6cbb" Thanks, I'll add it to R-devel and R-patched. Duncan Murdoch From edd at debian.org Wed Jul 1 09:53:02 2015 From: edd at debian.org (Dirk Eddelbuettel) Date: Wed, 1 Jul 2015 02:53:02 -0500 Subject: [Rd] additional leap second In-Reply-To: <55937C5C.7030805@gmail.com> References: <55937C5C.7030805@gmail.com> Message-ID: <21907.40030.56834.77596@max.nulle.part> On 1 July 2015 at 07:36, Duncan Murdoch wrote: | On 01/07/2015 7:20 AM, Ei-ji Nakama wrote: | > hi, | > | > Index: leap_second/src/library/base/R/zdatetime.R | > =================================================================== | > --- leap_second/src/library/base/R/zdatetime.R (revision 68608) | > +++ leap_second/src/library/base/R/zdatetime.R (working copy) | > @@ -24,7 +24,8 @@ | > "1979-12-31", "1981-6-30", "1982-6-30", "1983-6-30", | > "1985-6-30", "1987-12-31", "1989-12-31", "1990-12-31", | > "1992-6-30", "1993-6-30", "1994-6-30","1995-12-31", | > - "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", "2012-6-30") | > + "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", | > + "2012-6-30", "2015-6-30") | > .leap.seconds <- strptime(paste(.leap.seconds , "23:59:60"), | > "%Y-%m-%d %H:%M:%S") | > c(as.POSIXct(.leap.seconds, "GMT")) # lose the timezone | > | > Best Regards, | > -- | > Eiji NAKAMA | > "\u4e2d\u9593\u6804\u6cbb" | | Thanks, I'll add it to R-devel and R-patched. I just launched a Debian build of R 3.2.1 as well. Dirk -- http://dirk.eddelbuettel.com | @eddelbuettel | edd at debian.org From ripley at stats.ox.ac.uk Wed Jul 1 10:13:47 2015 From: ripley at stats.ox.ac.uk (Prof Brian Ripley) Date: Wed, 1 Jul 2015 09:13:47 +0100 Subject: [Rd] additional leap second In-Reply-To: References: Message-ID: <5593A13B.9080505@stats.ox.ac.uk> Thanks, I was working on this. 
There are other changes needed in src/main/datetime.c and ?.leap.seconds which I will commit shortly, and the example in hist.POSIXt() needed alteration (it seems DJM did not run 'make check'!). On 01/07/2015 06:20, Ei-ji Nakama wrote: > hi, > > Index: leap_second/src/library/base/R/zdatetime.R > =================================================================== > --- leap_second/src/library/base/R/zdatetime.R (revision 68608) > +++ leap_second/src/library/base/R/zdatetime.R (working copy) > @@ -24,7 +24,8 @@ > "1979-12-31", "1981-6-30", "1982-6-30", "1983-6-30", > "1985-6-30", "1987-12-31", "1989-12-31", "1990-12-31", > "1992-6-30", "1993-6-30", "1994-6-30","1995-12-31", > - "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", "2012-6-30") > + "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", > + "2012-6-30", "2015-6-30") > .leap.seconds <- strptime(paste(.leap.seconds , "23:59:60"), > "%Y-%m-%d %H:%M:%S") > c(as.POSIXct(.leap.seconds, "GMT")) # lose the timezone > > Best Regards, > -- > Eiji NAKAMA > "\u4e2d\u9593\u6804\u6cbb" > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel > -- Brian D. Ripley, ripley at stats.ox.ac.uk Emeritus Professor of Applied Statistics, University of Oxford 1 South Parks Road, Oxford OX1 3TG, UK From nakama at ki.rim.or.jp Wed Jul 1 11:02:00 2015 From: nakama at ki.rim.or.jp (Ei-ji Nakama) Date: Wed, 1 Jul 2015 18:02:00 +0900 Subject: [Rd] additional leap second In-Reply-To: <5593A13B.9080505@stats.ox.ac.uk> References: <5593A13B.9080505@stats.ox.ac.uk> Message-ID: sorry, i didnt watch src/main/datetime.c ... Index: leap_second/src/main/datetime.c =================================================================== --- leap_second/src/main/datetime.c (revision 68608) +++ leap_second/src/main/datetime.c (working copy) 
@@ -303,14 +303,15 @@ } #ifndef HAVE_POSIX_LEAPSECONDS -/* There have been 25 leapseconds: see .leap.seconds in R +/* There have been many leapseconds: see .leap.seconds in R + * (as.Date(.leap.seconds)-as.Date("1970-01-01"))*86400 */ -static int n_leapseconds = 25; static const time_t leapseconds[] = { 78796800, 94694400,126230400,157766400,189302400,220924800,252460800, 283996800,315532800,362793600,394329600,425865600,489024000,567993600, 631152000,662688000,709948800,741484800,773020800,820454400,867715200, - 915148800,1136073600,1230768000,1341100800}; + 915148800,1136073600,1230768000,1341100800,1435708800}; +static int n_leapseconds = sizeof(leapseconds)/sizeof(time_t); #endif static double guess_offset (stm *tm) 2015-07-01 17:13 GMT+09:00 Prof Brian Ripley : > Thanks, I was working on this. > > There are other changes needed in src/main/datetime.c and ?.leap.seconds > which I will commit shortly, and the example in hist.POSIXt() needed > alteration (it seems DJM did not run 'make check'!). 
> > > > On 01/07/2015 06:20, Ei-ji Nakama wrote: >> >> hi, >> >> Index: leap_second/src/library/base/R/zdatetime.R >> =================================================================== >> --- leap_second/src/library/base/R/zdatetime.R (revision 68608) >> +++ leap_second/src/library/base/R/zdatetime.R (working copy) >> @@ -24,7 +24,8 @@ >> "1979-12-31", "1981-6-30", "1982-6-30", "1983-6-30", >> "1985-6-30", "1987-12-31", "1989-12-31", "1990-12-31", >> "1992-6-30", "1993-6-30", "1994-6-30","1995-12-31", >> - "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", >> "2012-6-30") >> + "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", >> + "2012-6-30", "2015-6-30") >> .leap.seconds <- strptime(paste(.leap.seconds , "23:59:60"), >> "%Y-%m-%d %H:%M:%S") >> c(as.POSIXct(.leap.seconds, "GMT")) # lose the timezone >> >> Best Regards, >> -- >> Eiji NAKAMA >> "\u4e2d\u9593\u6804\u6cbb" >> >> ______________________________________________ >> R-devel at r-project.org mailing list >> https://stat.ethz.ch/mailman/listinfo/r-devel >> > > > -- > Brian D. Ripley, ripley at stats.ox.ac.uk > Emeritus Professor of Applied Statistics, University of Oxford > 1 South Parks Road, Oxford OX1 3TG, UK -- Best Regards, -- Eiji NAKAMA "\u4e2d\u9593\u6804\u6cbb" From murdoch.duncan at gmail.com Wed Jul 1 12:54:20 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Wed, 01 Jul 2015 12:54:20 +0200 Subject: [Rd] additional leap second In-Reply-To: <5593A13B.9080505@stats.ox.ac.uk> References: <5593A13B.9080505@stats.ox.ac.uk> Message-ID: <5593C6DC.4090609@gmail.com> On 01/07/2015 10:13 AM, Prof Brian Ripley wrote: > Thanks, I was working on this. > > There are other changes needed in src/main/datetime.c and ?.leap.seconds > which I will commit shortly, and the example in hist.POSIXt() needed > alteration (it seems DJM did not run 'make check'!). That's true. Sometimes simple changes aren't so simple. 
Duncan Murdoch > > > On 01/07/2015 06:20, Ei-ji Nakama wrote: >> hi, >> >> Index: leap_second/src/library/base/R/zdatetime.R >> =================================================================== >> --- leap_second/src/library/base/R/zdatetime.R (revision 68608) >> +++ leap_second/src/library/base/R/zdatetime.R (working copy) >> @@ -24,7 +24,8 @@ >> "1979-12-31", "1981-6-30", "1982-6-30", "1983-6-30", >> "1985-6-30", "1987-12-31", "1989-12-31", "1990-12-31", >> "1992-6-30", "1993-6-30", "1994-6-30","1995-12-31", >> - "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", "2012-6-30") >> + "1997-6-30", "1998-12-31", "2005-12-31", "2008-12-31", >> + "2012-6-30", "2015-6-30") >> .leap.seconds <- strptime(paste(.leap.seconds , "23:59:60"), >> "%Y-%m-%d %H:%M:%S") >> c(as.POSIXct(.leap.seconds, "GMT")) # lose the timezone >> >> Best Regards, >> -- >> Eiji NAKAMA >> "\u4e2d\u9593\u6804\u6cbb" >> >> ______________________________________________ >> R-devel at r-project.org mailing list >> https://stat.ethz.ch/mailman/listinfo/r-devel >> > > From winstonchang1 at gmail.com Fri Jul 3 06:36:45 2015 From: winstonchang1 at gmail.com (Winston Chang) Date: Thu, 2 Jul 2015 23:36:45 -0500 Subject: [Rd] Are downstream dependencies rebuilt when a package is updated on CRAN? Message-ID: I was wondering: are the downstream dependencies of a package rebuilt when a package is updated on CRAN? (I'm referring to the binary packages, of course.) The reason I ask is because there are cases where this can cause problems. Suppose that when pkgB is built, it calls pkgA::makeClosure(), which returns a closure that refers to a function in pkgA. Suppose this code is in pkgA 1.0: makeClosure <- function() { function() { funA() } } funA <- function() "funA was called!" And this is the code in pkgB: foo <- pkgA::makeClosure() After building and installing pkgB, you can do this, which gives the expected result: > pkgB::foo() [1] "funA was called!" 
Now suppose pkgA is upgraded to 2.0, and it contains this: makeClosure <- function() { function() { newFunA() } } newFunA <- function() "newFunA was called!" After this upgrade, when you run the pkgB::foo(), you'll get an error: > pkgB::foo() Error in foo() : could not find function "funA" This is because the environment for pkgB::foo is the pkgA namespace, but the contents of that namespace have changed underneath it. The solution is to rebuild and reinstall pkgB against pkgA 2.0. This is why I'm asking about downstream dependencies being rebuilt on CRAN. If pkgB's binary packages are not automatically rebuilt when pkgA is updated on CRAN, then there would be a broken combination of pkgA and pkgB on CRAN. (Note that source packages aren't vulnerable to this problem) This is not a purely academic exercise -- something similar to this has actually occurred. When R went from version 3.0.0 to 3.0.1, there was a new function in the methods package named .setDummyField, and reference class objects that were built against methods 3.0.1 received closures that included calls to .setDummyField(). If these refclass objects were saved in a package at build-time, then they would not work on systems with R (and methods) 3.0.0. Because CRAN built binaries on R 3.0.1 after its release, users who were on R 3.0.0 simply could not run some binary packages from CRAN. Their possible solutions were to upgrade R, or install the packages from source. I know this happened with Shiny, lme4, ROAuth, among others. https://groups.google.com/forum/#!msg/shiny-discuss/poNAAtyYuS4/n6iAVMI3mb0J https://github.com/lme4/lme4/issues/54 http://stackoverflow.com/questions/18258087/twitter-error-when-authorizing-token It happens that the methods package is included with the base R distribution, but this kind of problem can occur with other packages as well. Many of us have encountered hard-to-diagnose problems that are solved by reinstalling R and all packages. 
I suspect that this is the root cause in many of those cases. At the very least, it would be nice to be able to specify in a pkgB that it depends on pkgA in such a way that it must be rebuilt when pkgA is upgraded. There are two important consequences of this: CRAN would need to rebuild pkgB when a new pkgA is accepted; and when users upgrade pkgA, they would need to receive a rebuilt pkgB as well. -Winston From ligges at statistik.tu-dortmund.de Fri Jul 3 08:35:51 2015 From: ligges at statistik.tu-dortmund.de (Uwe Ligges) Date: Fri, 03 Jul 2015 08:35:51 +0200 Subject: [Rd] Are downstream dependencies rebuilt when a package is updated on CRAN? In-Reply-To: References: Message-ID: <55962D47.8000703@statistik.tu-dortmund.de> Winston, see far below. ;-) On 03.07.2015 06:36, Winston Chang wrote: > I was wondering: are the downstream dependencies of a package rebuilt > when a package is updated on CRAN? (I'm referring to the binary > packages, of course.) > > The reason I ask is because there are cases where this can cause > problems. Suppose that when pkgB is built, it calls > pkgA::makeClosure(), which returns a closure that refers to a function > in pkgA. Suppose this code is in pkgA 1.0: > makeClosure <- function() { > function() { > funA() > } > } > funA <- function() "funA was called!" > > And this is the code in pkgB: > > foo <- pkgA::makeClosure() > > > After building and installing pkgB, you can do this, which gives the > expected result: >> pkgB::foo() > [1] "funA was called!" > > > Now suppose pkgA is upgraded to 2.0, and it contains this: > makeClosure <- function() { > function() { > newFunA() > } > } > newFunA <- function() "newFunA was called!" > > > After this upgrade, when you run the pkgB::foo(), you'll get an error: >> pkgB::foo() > Error in foo() : could not find function "funA" > > > This is because the environment for pkgB::foo is the pkgA namespace, > but the contents of that namespace have changed underneath it. 
The > solution is to rebuild and reinstall pkgB against pkgA 2.0. > > This is why I'm asking about downstream dependencies being rebuilt on > CRAN. If pkgB's binary packages are not automatically rebuilt when > pkgA is updated on CRAN, then there would be a broken combination of > pkgA and pkgB on CRAN. (Note that source packages aren't vulnerable to > this problem) > > > This is not a purely academic exercise -- something similar to this > has actually occurred. When R went from version 3.0.0 to 3.0.1, there > was a new function in the methods package named .setDummyField, and > reference class objects that were built against methods 3.0.1 received > closures that included calls to .setDummyField(). If these refclass > objects were saved in a package at build-time, then they would not > work on systems with R (and methods) 3.0.0. Because CRAN built > binaries on R 3.0.1 after its release, users who were on R 3.0.0 > simply could not run some binary packages from CRAN. Their possible > solutions were to upgrade R, or install the packages from source. I > know this happened with Shiny, lme4, ROAuth, among others. > https://groups.google.com/forum/#!msg/shiny-discuss/poNAAtyYuS4/n6iAVMI3mb0J > https://github.com/lme4/lme4/issues/54 > http://stackoverflow.com/questions/18258087/twitter-error-when-authorizing-token > > It happens that the methods package is included with the base R > distribution, but this kind of problem can occur with other packages > as well. > > Many of us have encountered hard-to-diagnose problems that are solved > by reinstalling R and all packages. I am aware of this problem. Another one, as with the environments, happens if S4 stuff is changed. > > At the very least, it would be nice to be able to specify in a pkgB > that it depends on pkgA in such a way that it must be rebuilt when > pkgA is upgraded. 
There are two important consequences of this: CRAN > would need to rebuild pkgB when a new pkgA is accepted; This happens for Windows binary packages on CRAN. > and when users > upgrade pkgA, they would need to receive a rebuilt pkgB as well. This will not happen automatically, since the R package mechanism is based on version numbers that are not changed on CRAN. It is very hard to know when such a rebuild is really needed. In very few cases only, actually. Currently we always rebuild which causes a lot of overhead. But we do not always want to trigger updates since these are not needed in I guess > 99% of all package updates. Best wishes, Uwe > -Winston From edd at debian.org Fri Jul 3 10:54:28 2015 From: edd at debian.org (Dirk Eddelbuettel) Date: Fri, 3 Jul 2015 03:54:28 -0500 Subject: [Rd] additional leap second In-Reply-To: References: <5593A13B.9080505@stats.ox.ac.uk> Message-ID: <21910.19908.267421.627243@max.nulle.part> On 1 July 2015 at 18:02, Ei-ji Nakama wrote: | sorry, i didnt watch src/main/datetime.c ... Thanks again -- I also made that update for a Debian build 3.2.1-4. Dirk | Index: leap_second/src/main/datetime.c | =================================================================== | --- leap_second/src/main/datetime.c (revision 68608) | +++ leap_second/src/main/datetime.c (working copy) 
| @@ -303,14 +303,15 @@ | } | | #ifndef HAVE_POSIX_LEAPSECONDS | -/* There have been 25 leapseconds: see .leap.seconds in R | +/* There have been many leapseconds: see .leap.seconds in R | + * (as.Date(.leap.seconds)-as.Date("1970-01-01"))*86400 | */ | -static int n_leapseconds = 25; | static const time_t leapseconds[] = | { 78796800, 94694400,126230400,157766400,189302400,220924800,252460800, | 283996800,315532800,362793600,394329600,425865600,489024000,567993600, | 631152000,662688000,709948800,741484800,773020800,820454400,867715200, | - 915148800,1136073600,1230768000,1341100800}; | + 915148800,1136073600,1230768000,1341100800,1435708800}; | +static int n_leapseconds = sizeof(leapseconds)/sizeof(time_t); | #endif | | static double guess_offset (stm *tm) | -- http://dirk.eddelbuettel.com | @eddelbuettel | edd at debian.org From Tom.Wenseleers at bio.kuleuven.be Sat Jul 4 19:17:09 2015 From: Tom.Wenseleers at bio.kuleuven.be (Tom Wenseleers) Date: Sat, 4 Jul 2015 17:17:09 +0000 Subject: [Rd] Support for transparency in metafile export & support for export to Powerpoint Message-ID: <37EFC97028F3E44082ACC5CBEC00563011544EAB@ICTS-S-MBX13.luna.kuleuven.be> Dear all, Further to my previous message I now made a one-line convenience function to export your currently active graphics window/plot to either Word or Powerpoint in Office-native vector-based DrawingML format using either export2ppt(file="plot.pptx") or export2doc(file="plot.docx") : see http://stackoverflow.com/questions/31212659/r-function-to-capture-r-plot-in-current-graphics-device-and-export-it-to-powerp/31221813#31221813 (analogous in syntax to function dev.copy2pdf in grDevices) The code of the function is:

export2office = function(file = "plot", type = "PPT", scaling = 90, aspectr = NULL,
                         vector.graphic = TRUE, fontname = "Arial", pointsize = 20) {
  file = sub("^(.*)[.].*", "\\1", file)
  if (type == "PPT" | type == "PPTX") { ext = ".pptx"; type = "PPT" } else { ext = ".docx"; type = "DOC" }
  require(ReporteRs)
  captureplot = function() {
    p = invisible(recordPlot())
    dev.copy()
    return(p)
  }
  p = captureplot()
  plotsize = dev.size()
  plotaspectr = plotsize[[1]]/plotsize[[2]]
  if (!is.null(aspectr)) plotaspectr = aspectr
  myplot = function(pl = p) print(pl)
  if (type == "PPT") {
    doc = pptx()
    doc = addSlide(doc, slide.layout = "Blank")
    pagesize = dim(doc)$slide.dim
  } else {
    doc = docx()
    pagesize = dim(doc)$page - dim(doc)$margins[c(4, 3)]
  }
  pageaspectr = pagesize["width"]/pagesize["height"]
  if (pageaspectr > plotaspectr) { xf = plotaspectr/pageaspectr; yf = 1 } else { xf = 1; yf = pageaspectr/plotaspectr }
  w = (scaling/100)*pagesize["width"]*xf
  h = (scaling/100)*pagesize["height"]*yf
  if (type == "PPT") {
    doc = addPlot(doc, myplot, vector.graphic = vector.graphic, fontname = fontname,
                  pointsize = pointsize, offx = (pagesize["width"] - w)/2,
                  offy = (pagesize["height"] - h)/2, width = w, height = h)
  } else {
    doc = addPlot(doc, myplot, vector.graphic = vector.graphic, fontname = fontname,
                  pointsize = pointsize, width = w, height = h)
  }
  writeDoc(doc, paste0(file, ext))
}
export2ppt = function(type = "PPT", ...) export2office(type = type, ...)
export2doc = function(type = "DOC", ...) export2office(type = type, ...)

# Examples:
require(ggplot2)
qplot(Sepal.Length, Petal.Length, data = iris, color = Species,
      size = Petal.Width, alpha = I(0.7))
export2ppt(file = "plot.pptx")
export2ppt(file = "plot.pptx", aspectr = 1.7, fontname = "Times New Roman")
heatmap(as.matrix(eurodist))
export2ppt(file = "heatmap.pptx")

In Powerpoint you can then right click on the graph and Ungroup it, thereby allowing you to make minor changes to the layout if need be, before saving it as PDF from PPT. The quality is much better than what you get if you try to do the editing in the PDF version using Inkscape. It works with ggplot2 and lattice plots as well as base R plots and also fully supports transparency (unlike e.g. 
EPS or EMF export in R - EMF in the meantime I found out does not support transparency at all, and can only deal with it by rasterizing all semi-transparent graphics elements). Given the widespread use of Office/LibreOffice/OpenOffice I think it would be very handy if this kind of functionality were provided as part of base R at one stage or another (as would Excel import and export, for that matter). So if anyone on this list thinks it would be a good idea to incorporate this function in grDevices or something, please do! (would be handy e.g. if powerpoint export also showed in the File...Save as... interactive graphics devices, like windows() ) Otherwise I'll be in touch with the ReporteRs author to try to convince him to add it there. cheers, Tom [[alternative HTML version deleted]] From richierocks at gmail.com Sun Jul 5 18:53:40 2015 From: richierocks at gmail.com (Richard Cotton) Date: Sun, 5 Jul 2015 18:53:40 +0200 Subject: [Rd] Are import-reexport-only packages allowed on CRAN? Message-ID: One piece of feedback that I received at useR was that the assertive package is getting too big, and should be broken down into smaller pieces. I want to split the functionality into assertive.base, assertive.types, and a few others, then have the assertive package as a virtual package (suggestions for better terminology welcomed) that just imports and reexports the contents of the underlying pieces. That way end-users can still type library(assertive) and have the same behaviour as before, and package developers who worry about having lightweight dependencies can just use the parts that they need. Before I do the refactoring, I wanted to check that it is OK to have a package without any of its own content (other than vignettes) on CRAN. Is it OK? 
-- Regards, Richie Learning R 4dpiecharts.com From murdoch.duncan at gmail.com Sun Jul 5 19:05:46 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Sun, 05 Jul 2015 13:05:46 -0400 Subject: [Rd] Are import-reexport-only packages allowed on CRAN? In-Reply-To: References: Message-ID: <559963EA.4040205@gmail.com> On 05/07/2015 12:53 PM, Richard Cotton wrote: > One piece of feedback that I received at useR was that the assertive > package is getting too big, and should be broken down into smaller > pieces. > > I want to split the functionality into assertive.base, > assertive.types, and a few others, then have the assertive package as > a virtual package (suggestions for better terminology welcomed) that > just imports and reexports the contents of the underlying pieces. > > That way end-users can can still type library(assertive) and have the > same behaviour as before, and package developers who worry about > having lightweight dependencies can just use the parts that they need. > > Before I do the refactoring, I wanted to check that it is OK to have a > package without any of its own content (other than vignettes) on CRAN. > Is it OK? > I think you're writing to the wrong place. This sounds like a question for the R-pkg-devel list, or if you don't get an answer there, CRAN at r-project.org. Duncan Murdoch From renaud at mancala.cbio.uct.ac.za Tue Jul 7 11:05:02 2015 From: renaud at mancala.cbio.uct.ac.za (Renaud Gaujoux) Date: Tue, 7 Jul 2015 11:05:02 +0200 Subject: [Rd] List S3 methods and defining packages Message-ID: Hi, from the man page ?methods, I expected to be able to build pairs (class,package) for a given S3 method, e.g., print, using attr(methods(print), 'info'). However all the methods, except the ones defined in base or S4 methods, get the 'from' value "registered S3method for print", instead of the actual package name (see below for the first rows). Is this normal behaviour? 
If so, is there a way to get what I want: a character vector mapping class to package (ideally in loading order, but this I can re-order from search()). Thank you. Bests, Renaud visible from generic isS4 print.abbrev FALSE registered S3method for print print FALSE print.acf FALSE registered S3method for print print FALSE print.AES FALSE registered S3method for print print FALSE print.agnes FALSE registered S3method for print print FALSE print.anova FALSE registered S3method for print print FALSE print.Anova FALSE registered S3method for print print FALSE print.anova.loglm FALSE registered S3method for print print FALSE print,ANY-method TRUE base print TRUE print.aov FALSE registered S3method for print print FALSE From csardi.gabor at gmail.com Tue Jul 7 12:58:25 2015 From: csardi.gabor at gmail.com (=?UTF-8?B?R8OhYm9yIENzw6FyZGk=?=) Date: Tue, 7 Jul 2015 12:58:25 +0200 Subject: [Rd] dead links to manuals Message-ID: E.g. here http://cran.r-project.org/manuals.html the link to http://cran.r-project.org/doc/manuals/r-release/R-intro.html gives a 404. FYI, Gabor From pdalgd at gmail.com Tue Jul 7 13:32:14 2015 From: pdalgd at gmail.com (peter dalgaard) Date: Tue, 7 Jul 2015 13:32:14 +0200 Subject: [Rd] dead links to manuals In-Reply-To: References: Message-ID: <601EFDEF-5CD9-4343-8DFA-CBFE96812C70@gmail.com> The configure was broken for texinfo 6.0. Fixed in r-patched, but trickier to fix for r-release without r-release no longer being r-release... -pd On 07 Jul 2015, at 12:58 , G?bor Cs?rdi wrote: > E.g. here http://cran.r-project.org/manuals.html the link to > http://cran.r-project.org/doc/manuals/r-release/R-intro.html gives a > 404. 
> > FYI, > Gabor > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel -- Peter Dalgaard, Professor, Center for Statistics, Copenhagen Business School Solbjerg Plads 3, 2000 Frederiksberg, Denmark Phone: (+45)38153501 Office: A 4.23 Email: pd.mes at cbs.dk Priv: PDalgd at gmail.com From marc_schwartz at me.com Tue Jul 7 13:20:18 2015 From: marc_schwartz at me.com (Marc Schwartz) Date: Tue, 07 Jul 2015 06:20:18 -0500 Subject: [Rd] dead links to manuals In-Reply-To: References: Message-ID: <7CFE1138-BAF6-4E1F-A423-3AF14278F390@me.com> > On Jul 7, 2015, at 5:58 AM, Gábor Csárdi wrote: > > E.g. here http://cran.r-project.org/manuals.html the link to > http://cran.r-project.org/doc/manuals/r-release/R-intro.html gives a > 404. > > FYI, > Gabor Gabor, this was reported yesterday on R-Help: https://stat.ethz.ch/pipermail/r-help/2015-July/430130.html Regards, Marc Schwartz From mtmorgan at fredhutch.org Tue Jul 7 21:01:20 2015 From: mtmorgan at fredhutch.org (Martin Morgan) Date: Tue, 07 Jul 2015 12:01:20 -0700 Subject: [Rd] List S3 methods and defining packages In-Reply-To: References: Message-ID: <559C2200.4070305@fredhutch.org> On 07/07/2015 02:05 AM, Renaud Gaujoux wrote: > Hi, > > from the man page ?methods, I expected to be able to build pairs > (class,package) for a given S3 method, e.g., print, using > > attr(methods(print), 'info'). > > However all the methods, except the ones defined in base or S4 > methods, get the 'from' value "registered S3method for print", instead > of the actual package name (see below for the first rows). > > Is this normal behaviour? If so, is there a way to get what I want: a > character vector mapping class to package (ideally in loading order, > but this I can re-order from search()). It's the way it has always been, so normal in that sense. 
There could be two meanings of 'from' -- the namespace in which the generic to which the method belongs is defined, and the namespace in which the method is defined. I think the former is what you're interested in, but the latter likely what methods() might be modified return. For your use case, maybe something like .S3methodsInNamespace <- function(envir, pattern) { mtable <- get(".__S3MethodsTable__.", envir = asNamespace(envir)) methods <- ls(mtable, pattern = pattern) env <- vapply(methods, function(x) { environmentName(environment(get(x, mtable))) }, character(1)) setNames(names(env), unname(env)) } followed by nmspc = loadedNamespaces() lapply(setNames(nmspc, nmspc), .S3methodsInNamespace, "^plot.") which reveals the different meanings of 'from', e.g., > lapply(setNames(nmspc, nmspc), .S3methodsInNamespace, "^plot.")["graphics"] $graphics stats graphics stats "plot.acf" "plot.data.frame" "plot.decomposed.ts" graphics stats stats "plot.default" "plot.dendrogram" "plot.density" stats graphics graphics "plot.ecdf" "plot.factor" "plot.formula" graphics stats graphics "plot.function" "plot.hclust" "plot.histogram" stats stats stats "plot.HoltWinters" "plot.isoreg" "plot.lm" stats stats stats "plot.medpolish" "plot.mlm" "plot.ppr" stats stats stats "plot.prcomp" "plot.princomp" "plot.profile.nls" graphics stats stats "plot.raster" "plot.spec" "plot.stepfun" stats graphics stats "plot.stl" "plot.table" "plot.ts" stats stats "plot.tskernel" "plot.TukeyHSD" Also this is for loaded, rather than attached, namespaces. Martin Morgan > Thank you. 
> > Bests, > Renaud > > visible > from generic isS4 > print.abbrev FALSE registered > S3method for print print FALSE > print.acf FALSE registered > S3method for print print FALSE > print.AES FALSE registered > S3method for print print FALSE > print.agnes FALSE registered > S3method for print print FALSE > print.anova FALSE registered > S3method for print print FALSE > print.Anova FALSE registered > S3method for print print FALSE > print.anova.loglm FALSE registered > S3method for print print FALSE > print,ANY-method TRUE > base print TRUE > print.aov FALSE registered > S3method for print print FALSE > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel > -- Computational Biology / Fred Hutchinson Cancer Research Center 1100 Fairview Ave. N. PO Box 19024 Seattle, WA 98109 Location: Arnold Building M1 B861 Phone: (206) 667-2793 From mick.jordan at oracle.com Wed Jul 8 00:33:24 2015 From: mick.jordan at oracle.com (Mick Jordan) Date: Tue, 07 Jul 2015 15:33:24 -0700 Subject: [Rd] [ on call Message-ID: <559C53B4.2060804@oracle.com> I'm rather puzzled by this behavior: e export("caption<-", "caption", "label", "label<-", "align<-", "align", "digits<-", "digits", "display<-", "display", "xtable", "print.xtable", "toLatex.xtable") > e[[1L]] e[[1L]] export > e[-1L] e[-1L] "caption<-"("caption", "label", "label<-", "align<-", "align", "digits<-", "digits", "display<-", "display", "xtable", "print.xtable", "toLatex.xtable") I'm not at all clear what should result from removing the first element, i.e. the 'export', but I would not expect the first argument to be promoted into the function position. I guess I would expect a coercion to list or pairlist first, after which the [-1L] would produce a meaningful result on that coercion. 
In any event I do observe that as.character(e[-1L]) produces the expected result:

as.character(e[-1L])
 [1] "caption<-"      "caption"        "label"          "label<-"
 [5] "align<-"        "align"          "digits<-"       "digits"
 [9] "display<-"      "display"        "xtable"         "print.xtable"
[13] "toLatex.xtable"

This code is from parseNamespaceFile (on the xtable package).

Mick Jordan

From murdoch.duncan at gmail.com Wed Jul 8 00:40:52 2015
From: murdoch.duncan at gmail.com (Duncan Murdoch)
Date: Tue, 07 Jul 2015 18:40:52 -0400
Subject: [Rd] [ on call
In-Reply-To: <559C53B4.2060804@oracle.com>
References: <559C53B4.2060804@oracle.com>
Message-ID: <559C5574.1070108@gmail.com>

On 07/07/2015 6:33 PM, Mick Jordan wrote:
> I'm rather puzzled by this behavior:
>
> e
> export("caption<-", "caption", "label", "label<-", "align<-",
>     "align", "digits<-", "digits", "display<-", "display", "xtable",
>     "print.xtable", "toLatex.xtable")
>
> e[[1L]]
> e[[1L]]
> export
>
> e[-1L]
> e[-1L]
> "caption<-"("caption", "label", "label<-", "align<-", "align",
>     "digits<-", "digits", "display<-", "display", "xtable",
>     "print.xtable",
>     "toLatex.xtable")
>
> I'm not at all clear what should result from removing the first element,
> i.e. the 'export', but I would not expect the first argument to be
> promoted into the function position. I guess I would expect a coercion
> to list or pairlist first, after which the [-1L] would produce a
> meaningful result on that coercion. In any event I do observe that
> as.character(e[-1L]) produces the expected result:
>
> as.character(e[-1L])
>  [1] "caption<-"      "caption"        "label"          "label<-"
>  [5] "align<-"        "align"          "digits<-"       "digits"
>  [9] "display<-"      "display"        "xtable"         "print.xtable"
> [13] "toLatex.xtable"
>
> This code is from parseNamespaceFile (on the xtable package).

In R, language objects are specially marked pairlists. The first element is the function to call, the second is its first argument, etc.
So if you delete the first element and do nothing else, that promotes the first argument to the function to call.

Duncan Murdoch

From renaud at mancala.cbio.uct.ac.za Wed Jul 8 12:09:44 2015
From: renaud at mancala.cbio.uct.ac.za (Renaud Gaujoux)
Date: Wed, 8 Jul 2015 12:09:44 +0200
Subject: [Rd] List S3 methods and defining packages
In-Reply-To: <559C2200.4070305@fredhutch.org>
References: <559C2200.4070305@fredhutch.org>
Message-ID:

Thank you for your reply Martin.
Your code made me realize that S3 methods are added to the .__S3MethodsTable__. of the package that defines the generic, not to the ones defining the method itself.
How do things work in the case of a method from one package B masking the one from another package A? I don't get any warning message and there seems to be only one entry in the relevant .__S3MethodsTable__.
Aren't these tables updated when the masking package B is detached?

On 7 July 2015 at 21:01, Martin Morgan wrote:
> On 07/07/2015 02:05 AM, Renaud Gaujoux wrote:
>>
>> Hi,
>>
>> from the man page ?methods, I expected to be able to build pairs
>> (class,package) for a given S3 method, e.g., print, using
>>
>> attr(methods(print), 'info').
>>
>> However all the methods, except the ones defined in base or S4
>> methods, get the 'from' value "registered S3method for print", instead
>> of the actual package name (see below for the first rows).
>>
>> Is this normal behaviour? If so, is there a way to get what I want: a
>> character vector mapping class to package (ideally in loading order,
>> but this I can re-order from search()).
>
>
> It's the way it has always been, so normal in that sense.
>
> There could be two meanings of 'from' -- the namespace in which the generic
> to which the method belongs is defined, and the namespace in which the
> method is defined. I think the former is what you're interested in, but the
> latter is likely what methods() might be modified to return.
> > For your use case, maybe something like > > .S3methodsInNamespace <- function(envir, pattern) { > mtable <- get(".__S3MethodsTable__.", envir = asNamespace(envir)) > methods <- ls(mtable, pattern = pattern) > env <- vapply(methods, function(x) { > environmentName(environment(get(x, mtable))) > }, character(1)) > setNames(names(env), unname(env)) > } > > > followed by > > nmspc = loadedNamespaces() > lapply(setNames(nmspc, nmspc), .S3methodsInNamespace, "^plot.") > > which reveals the different meanings of 'from', e.g., > >> lapply(setNames(nmspc, nmspc), .S3methodsInNamespace, >> "^plot.")["graphics"] > $graphics > stats graphics stats > "plot.acf" "plot.data.frame" "plot.decomposed.ts" > graphics stats stats > "plot.default" "plot.dendrogram" "plot.density" > stats graphics graphics > "plot.ecdf" "plot.factor" "plot.formula" > graphics stats graphics > "plot.function" "plot.hclust" "plot.histogram" > stats stats stats > "plot.HoltWinters" "plot.isoreg" "plot.lm" > stats stats stats > "plot.medpolish" "plot.mlm" "plot.ppr" > stats stats stats > "plot.prcomp" "plot.princomp" "plot.profile.nls" > graphics stats stats > "plot.raster" "plot.spec" "plot.stepfun" > stats graphics stats > "plot.stl" "plot.table" "plot.ts" > stats stats > "plot.tskernel" "plot.TukeyHSD" > > Also this is for loaded, rather than attached, namespaces. > > Martin Morgan > >> Thank you. 
>> >> Bests, >> Renaud >> >> visible >> from generic isS4 >> print.abbrev FALSE registered >> S3method for print print FALSE >> print.acf FALSE registered >> S3method for print print FALSE >> print.AES FALSE registered >> S3method for print print FALSE >> print.agnes FALSE registered >> S3method for print print FALSE >> print.anova FALSE registered >> S3method for print print FALSE >> print.Anova FALSE registered >> S3method for print print FALSE >> print.anova.loglm FALSE registered >> S3method for print print FALSE >> print,ANY-method TRUE >> base print TRUE >> print.aov FALSE registered >> S3method for print print FALSE >> >> ______________________________________________ >> R-devel at r-project.org mailing list >> https://stat.ethz.ch/mailman/listinfo/r-devel >> > > > -- > Computational Biology / Fred Hutchinson Cancer Research Center > 1100 Fairview Ave. N. > PO Box 19024 Seattle, WA 98109 > > Location: Arnold Building M1 B861 > Phone: (206) 667-2793 > From renaud at mancala.cbio.uct.ac.za Wed Jul 8 15:57:49 2015 From: renaud at mancala.cbio.uct.ac.za (Renaud Gaujoux) Date: Wed, 8 Jul 2015 15:57:49 +0200 Subject: [Rd] List S3 methods and defining packages In-Reply-To: References: <559C2200.4070305@fredhutch.org> Message-ID: Thanks Kevin, this indeed clarifies a bit the S3 method registration universe. On 8 July 2015 at 15:48, Kevin Wright wrote: > Not sure if this answers your question, but you can't unregister a method. See > > http://tolstoy.newcastle.edu.au/R/help/06/07/30791.html > > Kevin Wright > > > On Wed, Jul 8, 2015 at 5:09 AM, Renaud Gaujoux > wrote: >> Thank you for your reply Martin. >> Your code made me realize that S3 methods are added to the >> .__S3MethodsTable__. of the package that defines the generic, not to >> the ones defining the method itself. >> How does things work in the case of a method from one package B >> masking the one from another package A? 
I don't get any warning >> message and there seems to be only one entry in the relevant >> .__S3MethodsTable__. >> Aren't these tables updated when the masking package B is detached? >> >> On 7 July 2015 at 21:01, Martin Morgan wrote: >>> On 07/07/2015 02:05 AM, Renaud Gaujoux wrote: >>>> >>>> Hi, >>>> >>>> from the man page ?methods, I expected to be able to build pairs >>>> (class,package) for a given S3 method, e.g., print, using >>>> >>>> attr(methods(print), 'info'). >>>> >>>> However all the methods, except the ones defined in base or S4 >>>> methods, get the 'from' value "registered S3method for print", instead >>>> of the actual package name (see below for the first rows). >>>> >>>> Is this normal behaviour? If so, is there a way to get what I want: a >>>> character vector mapping class to package (ideally in loading order, >>>> but this I can re-order from search()). >>> >>> >>> It's the way it has always been, so normal in that sense. >>> >>> There could be two meanings of 'from' -- the namespace in which the generic >>> to which the method belongs is defined, and the namespace in which the >>> method is defined. I think the former is what you're interested in, but the >>> latter likely what methods() might be modified return. 
>>> >>> For your use case, maybe something like >>> >>> .S3methodsInNamespace <- function(envir, pattern) { >>> mtable <- get(".__S3MethodsTable__.", envir = asNamespace(envir)) >>> methods <- ls(mtable, pattern = pattern) >>> env <- vapply(methods, function(x) { >>> environmentName(environment(get(x, mtable))) >>> }, character(1)) >>> setNames(names(env), unname(env)) >>> } >>> >>> >>> followed by >>> >>> nmspc = loadedNamespaces() >>> lapply(setNames(nmspc, nmspc), .S3methodsInNamespace, "^plot.") >>> >>> which reveals the different meanings of 'from', e.g., >>> >>>> lapply(setNames(nmspc, nmspc), .S3methodsInNamespace, >>>> "^plot.")["graphics"] >>> $graphics >>> stats graphics stats >>> "plot.acf" "plot.data.frame" "plot.decomposed.ts" >>> graphics stats stats >>> "plot.default" "plot.dendrogram" "plot.density" >>> stats graphics graphics >>> "plot.ecdf" "plot.factor" "plot.formula" >>> graphics stats graphics >>> "plot.function" "plot.hclust" "plot.histogram" >>> stats stats stats >>> "plot.HoltWinters" "plot.isoreg" "plot.lm" >>> stats stats stats >>> "plot.medpolish" "plot.mlm" "plot.ppr" >>> stats stats stats >>> "plot.prcomp" "plot.princomp" "plot.profile.nls" >>> graphics stats stats >>> "plot.raster" "plot.spec" "plot.stepfun" >>> stats graphics stats >>> "plot.stl" "plot.table" "plot.ts" >>> stats stats >>> "plot.tskernel" "plot.TukeyHSD" >>> >>> Also this is for loaded, rather than attached, namespaces. >>> >>> Martin Morgan >>> >>>> Thank you. 
>>>> >>>> Bests, >>>> Renaud >>>> >>>> visible >>>> from generic isS4 >>>> print.abbrev FALSE registered >>>> S3method for print print FALSE >>>> print.acf FALSE registered >>>> S3method for print print FALSE >>>> print.AES FALSE registered >>>> S3method for print print FALSE >>>> print.agnes FALSE registered >>>> S3method for print print FALSE >>>> print.anova FALSE registered >>>> S3method for print print FALSE >>>> print.Anova FALSE registered >>>> S3method for print print FALSE >>>> print.anova.loglm FALSE registered >>>> S3method for print print FALSE >>>> print,ANY-method TRUE >>>> base print TRUE >>>> print.aov FALSE registered >>>> S3method for print print FALSE >>>> >>>> ______________________________________________ >>>> R-devel at r-project.org mailing list >>>> https://stat.ethz.ch/mailman/listinfo/r-devel >>>> >>> >>> >>> -- >>> Computational Biology / Fred Hutchinson Cancer Research Center >>> 1100 Fairview Ave. N. >>> PO Box 19024 Seattle, WA 98109 >>> >>> Location: Arnold Building M1 B861 >>> Phone: (206) 667-2793 >>> >> >> ______________________________________________ >> R-devel at r-project.org mailing list >> https://stat.ethz.ch/mailman/listinfo/r-devel > > > > -- > Kevin Wright > From vthokiesvt at vt.edu Wed Jul 8 20:43:17 2015 From: vthokiesvt at vt.edu (vthokienj) Date: Wed, 8 Jul 2015 11:43:17 -0700 (PDT) Subject: [Rd] Graphical User Interface (GUI) Message-ID: <1436380997159-4709583.post@n4.nabble.com> I'd like to create a user interface for my R code and have only seen mostly older posts on the subject. I'm not looking for an IDE for development, but something that the end user of the software would use. So something that would involve displaying buttons, listboxes, and drop down menus to a user that will facilitate various actions. I doubt there is anything Visual Studio-like that provides a toolbox to drag and drop, but what would be the simplest way to approach designing an interface? 
The basic goal of my program is to have the user enter data which would be saved to a database. Then, through the interface, various buttons will draw various graphs on that data. Thanks in advance for any help on this. -- View this message in context: http://r.789695.n4.nabble.com/Graphical-User-Interface-GUI-tp4709583.html Sent from the R devel mailing list archive at Nabble.com. From daattali at gmail.com Wed Jul 8 22:28:01 2015 From: daattali at gmail.com (Dean Attali) Date: Wed, 8 Jul 2015 13:28:01 -0700 Subject: [Rd] Graphical User Interface (GUI) In-Reply-To: <1436380997159-4709583.post@n4.nabble.com> References: <1436380997159-4709583.post@n4.nabble.com> Message-ID: Isn't this exactly what Shiny is meant for? http://shiny.rstudio.com/ --- http://deanattali.com On 8 July 2015 at 11:43, vthokienj wrote: > I'd like to create a user interface for my R code and have only seen mostly > older posts on the subject. > I'm not looking for an IDE for development, but something that the end user > of the software would use. > So something that would involve displaying buttons, listboxes, and drop > down > menus to a user that will facilitate various actions. > > I doubt there is anything Visual Studio-like that provides a toolbox to > drag > and drop, but what would be the simplest way to approach designing an > interface? The basic goal of my program is to have the user enter data > which > would be saved to a database. Then, through the interface, various buttons > will draw various graphs on that data. > > Thanks in advance for any help on this. > > > > -- > View this message in context: > http://r.789695.n4.nabble.com/Graphical-User-Interface-GUI-tp4709583.html > Sent from the R devel mailing list archive at Nabble.com. 
>
> ______________________________________________
> R-devel at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
>

[[alternative HTML version deleted]]

From retep.meissner at gmail.com Wed Jul 8 23:26:10 2015
From: retep.meissner at gmail.com (Peter Meissner)
Date: Wed, 8 Jul 2015 23:26:10 +0200
Subject: [Rd] Graphical User Interface (GUI)
In-Reply-To:
References: <1436380997159-4709583.post@n4.nabble.com>
Message-ID:

Have a look at gWidgets (John Verzani).

Best, Peter

On 08.07.2015 22:29, "Dean Attali" wrote:

> Isn't this exactly what Shiny is meant for?
> http://shiny.rstudio.com/
>
> ---
> http://deanattali.com
>
> On 8 July 2015 at 11:43, vthokienj wrote:
>
> > I'd like to create a user interface for my R code and have only seen
> > mostly older posts on the subject.
> > I'm not looking for an IDE for development, but something that the end
> > user of the software would use.
> > So something that would involve displaying buttons, listboxes, and drop
> > down menus to a user that will facilitate various actions.
> >
> > I doubt there is anything Visual Studio-like that provides a toolbox to
> > drag and drop, but what would be the simplest way to approach designing an
> > interface? The basic goal of my program is to have the user enter data
> > which would be saved to a database. Then, through the interface, various
> > buttons will draw various graphs on that data.
> >
> > Thanks in advance for any help on this.
> >
> >
> >
> > --
> > View this message in context:
> > http://r.789695.n4.nabble.com/Graphical-User-Interface-GUI-tp4709583.html
> > Sent from the R devel mailing list archive at Nabble.com.
> > > > ______________________________________________ > > R-devel at r-project.org mailing list > > https://stat.ethz.ch/mailman/listinfo/r-devel > > > > [[alternative HTML version deleted]] > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel > [[alternative HTML version deleted]] From jfox at mcmaster.ca Wed Jul 8 23:52:24 2015 From: jfox at mcmaster.ca (John Fox) Date: Wed, 8 Jul 2015 17:52:24 -0400 Subject: [Rd] Graphical User Interface (GUI) In-Reply-To: <1436380997159-4709583.post@n4.nabble.com> References: <1436380997159-4709583.post@n4.nabble.com> Message-ID: <004101d0b9c8$6053b240$20fb16c0$@mcmaster.ca> Dear vthokienj, One approach would be to write a plug-in package for the Rcmdr package. An advantage is that data input (at least of rectangular data sets), data management, etc., are already taken care of. There are about 40 such plug-ins currently on CRAN, most with names beginning with "RcmdrPlugin.". The GUI toolkit that's used is Tcl/Tk and the tcltk package is part of the standard R distribution. Some convenience functions for creating dialogs are exported by the Rcmdr package for use in plug-ins. A couple of relevant -- if somewhat dated -- articles are in R News < http://www.r-project.org/doc/Rnews/Rnews_2007-3.pdf > and the Journal of Statistical Software . Best, John ----------------------------------------------- John Fox, Professor McMaster University Hamilton, Ontario, Canada http://socserv.socsci.mcmaster.ca/jfox/ > -----Original Message----- > From: R-devel [mailto:r-devel-bounces at r-project.org] On Behalf Of > vthokienj > Sent: July-08-15 2:43 PM > To: r-devel at r-project.org > Subject: [Rd] Graphical User Interface (GUI) > > I'd like to create a user interface for my R code and have only seen > mostly > older posts on the subject. > I'm not looking for an IDE for development, but something that the end > user > of the software would use. 
> So something that would involve displaying buttons, listboxes, and drop > down > menus to a user that will facilitate various actions. > > I doubt there is anything Visual Studio-like that provides a toolbox to > drag > and drop, but what would be the simplest way to approach designing an > interface? The basic goal of my program is to have the user enter data > which > would be saved to a database. Then, through the interface, various > buttons > will draw various graphs on that data. > > Thanks in advance for any help on this. > > > > -- > View this message in context: http://r.789695.n4.nabble.com/Graphical- > User-Interface-GUI-tp4709583.html > Sent from the R devel mailing list archive at Nabble.com. > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel --- This email has been checked for viruses by Avast antivirus software. https://www.avast.com/antivirus From vthokiesvt at vt.edu Thu Jul 9 04:02:05 2015 From: vthokiesvt at vt.edu (vthokienj) Date: Wed, 8 Jul 2015 19:02:05 -0700 (PDT) Subject: [Rd] Graphical User Interface (GUI) In-Reply-To: <004101d0b9c8$6053b240$20fb16c0$@mcmaster.ca> References: <1436380997159-4709583.post@n4.nabble.com> <004101d0b9c8$6053b240$20fb16c0$@mcmaster.ca> Message-ID: <1436407325763-4709601.post@n4.nabble.com> Thanks all for the replies both here and in email. It seems like Shiny is what I am looking for, I guess I just ignored anything R Studio thinking they only made an IDE. I'm going to look into the other suggestions as well. I've only been using R for a few months but look forward to trying these out. -- View this message in context: http://r.789695.n4.nabble.com/Graphical-User-Interface-GUI-tp4709583p4709601.html Sent from the R devel mailing list archive at Nabble.com. 
From lawrence.michael at gene.com Thu Jul 9 06:01:08 2015 From: lawrence.michael at gene.com (Michael Lawrence) Date: Wed, 8 Jul 2015 21:01:08 -0700 Subject: [Rd] Graphical User Interface (GUI) In-Reply-To: <1436380997159-4709583.post@n4.nabble.com> References: <1436380997159-4709583.post@n4.nabble.com> Message-ID: On Wed, Jul 8, 2015 at 11:43 AM, vthokienj wrote: > I'd like to create a user interface for my R code and have only seen mostly > older posts on the subject. > I'm not looking for an IDE for development, but something that the end user > of the software would use. > So something that would involve displaying buttons, listboxes, and drop down > menus to a user that will facilitate various actions. > > I doubt there is anything Visual Studio-like that provides a toolbox to drag > and drop, Actually, there are a couple of options. QtDesigner is such a tool, and its output is compatible with the qtbase R package. There's also Glade, which is compatible with RGtk2. > but what would be the simplest way to approach designing an > interface? The basic goal of my program is to have the user enter data which > would be saved to a database. Then, through the interface, various buttons > will draw various graphs on that data. > > Thanks in advance for any help on this. > > > > -- > View this message in context: http://r.789695.n4.nabble.com/Graphical-User-Interface-GUI-tp4709583.html > Sent from the R devel mailing list archive at Nabble.com. > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel From Phillip.Schermerhorn at austintexas.gov Thu Jul 9 16:44:50 2015 From: Phillip.Schermerhorn at austintexas.gov (Schermerhorn, Phillip) Date: Thu, 9 Jul 2015 14:44:50 +0000 Subject: [Rd] Compiling on AIX 6.1 Message-ID: I'm trying to compile R on AIX 6.1. gcc -v Using built-in specs. 
Target: powerpc-ibm-aix6.1.0.0
Configured with: ../gcc-4.4.5/configure --with-as=/usr/bin/as --with-ld=/usr/bin/ld --enable-languages=c,c++,fortran --prefix=/opt/freeware --enable-threads --enable-version-specific-runtime-libs --disable-nls --enable-decimal-float=dpd --host=powerpc-ibm-aix6.1.0.0
Thread model: aix
gcc version 4.4.5 (GCC)

gfortran -v
Using built-in specs.
Target: powerpc-ibm-aix6.1.0.0
Configured with: ../gcc-4.4.5/configure --with-as=/usr/bin/as --with-ld=/usr/bin/ld --enable-languages=c,c++,fortran --prefix=/opt/freeware --enable-threads --enable-version-specific-runtime-libs --disable-nls --enable-decimal-float=dpd --host=powerpc-ibm-aix6.1.0.0
Thread model: aix
gcc version 4.4.5 (GCC)

No matter what I try, I keep getting the following errors.

ld: 0711-224 WARNING: Duplicate symbol: R_ReadItemDepth
ld: 0711-345 Use the -bloadmap or -bnoquiet option to obtain more information.
ld: 0711-317 ERROR: Undefined symbol: .__nearbyintl128
ld: 0711-317 ERROR: Undefined symbol: .__log1pl128
collect2: ld returned 8 exit status
make: 1254-004 The error code from the last command is 1.

Any ideas on how to correct this?

Thanks
Phil

Phillip Schermerhorn
Programmer/Analyst
Communication Technology Management -- City of Austin
Office: (512) 974-1433

The latest survey shows that three out of four people make up 75% of the population.

[[alternative HTML version deleted]]

From therneau at mayo.edu Thu Jul 9 19:26:40 2015
From: therneau at mayo.edu (Therneau, Terry M., Ph.D.)
Date: Thu, 09 Jul 2015 12:26:40 -0500
Subject: [Rd] R CMD build failure
Message-ID: <2f3a88$105o53@ironport10.mayo.edu>

I have a local library 'dart' that imports "httr". It has routines that access central patient data such as birth date, so it is heavily used locally but of no interest to anyone else.

The httr library (and 300 others) are in a shared directory, referenced by everyone in the biostatistics group via adding this location to the .libPaths in their default .Rprofile.
(Actually, their .Rprofile starts by running material from a central one; the libPaths is there.)

When I run R (3.2.0) all is fine, but R CMD build fails with the text below

* creating vignettes ... ERROR
Error: processing vignette 'dart.Rnw' failed with diagnostics:
chunk 2 (label = auth1)
Error : package 'httr' required by 'dart' could not be found
Execution halted

If I add the requisite directory to my R_LIBS_USER environment variable then all is fine. However, that's a nuisance since the location changes over time (e.g. R releases). The system admins have a whole process that keeps .bashrc, .Rprofile and other dot-file references up to date. Plugging into this is why we use .Rprofile. They are quite willing to make select changes in the central file, but with >1000 users any suggested changes in the overall process do not get a warm welcome.

Any ideas? There is no mention in the Writing R Extensions manual that it ignores the Rprofile file. If "suck it up and use R_LIBS_USER" is the answer, well, there are only a few who build packages.

PS, I can't use RHOME:/etc/Rprofile.site since the biostat group is not the only set of R users. Some other groups, for instance, cannot even see our central area.

Terry T.

From h.wickham at gmail.com Thu Jul 9 19:33:27 2015
From: h.wickham at gmail.com (Hadley Wickham)
Date: Thu, 9 Jul 2015 10:33:27 -0700
Subject: [Rd] R CMD build failure
In-Reply-To: <2f3a88$105o53@ironport10.mayo.edu>
References: <2f3a88$105o53@ironport10.mayo.edu>
Message-ID:

What field is httr in the DESCRIPTION?

Hadley

On Thu, Jul 9, 2015 at 10:26 AM, Therneau, Terry M., Ph.D. wrote:
> I have a local library 'dart' that imports "httr". It has routines that
> access central patient data such as birth date, so it is heavily used
> locally but of no interest to anyone else.
(Actually, their .Rprofile starts by > running material from a central one, the libPaths is there). > > When I run R (3.2.0) all is fine, but R CMD build fails with the text below > > * creating vignettes ... ERROR > Error: processing vignette 'dart.Rnw' failed with diagnostics: > chunk 2 (label = auth1) > Error : package ?httr? required by ?dart? could not be found > Execution halted > > > If I add the requiste directory to my R_LIBS_USER environment variable then > all is fine. However, that's a nuisance since the location changes over > time (e.g. R releases). The system admins have a whole process that keeps > .bashrc, .Rprofle and etc dot references up to date. Plugging into this is > why we use .Rprofile. They are quite willing to make select changes in the > central file, but with >1000 users any suggested changes in the overall > process do not get a warm welcome. > > Any ideas? There is no mention in the Writing R Extentions manual that it > ignores the Rprofile file. If "suck it up and use R_LIBS_USER" is the > answer, well, there are only a few who build packages. > > PS, I can't use RHOME:/etc/Rprofile.site since the biostat group is not the > only set of R users. Some other groups, for instance, cannot even see our > central area. > > Terry T. > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel -- http://had.co.nz/ From therneau at mayo.edu Thu Jul 9 19:51:12 2015 From: therneau at mayo.edu (Therneau, Terry M., Ph.D.) Date: Thu, 09 Jul 2015 12:51:12 -0500 Subject: [Rd] R CMD build failure In-Reply-To: References: <2f3a88$105o53@ironport10.mayo.edu> Message-ID: <2f3a88$1060i1@ironport10.mayo.edu> "Depends: httr (>= 0.6), XML" in the DESCRIPTION file "import(httr, XML)" in NAMESPACE Per your question I just tried changing "Depends" to "Imports", it didn't make a change. Terry T. On 07/09/2015 12:33 PM, Hadley Wickham wrote: > What field is httr in the DESCRIPTION? 
> Hadley > > On Thu, Jul 9, 2015 at 10:26 AM, Therneau, Terry M., Ph.D. > wrote: >> I have a local library 'dart' that imports "httr". It has routines that >> access central patient data such as birth date, so it is heavily used >> locally but of no interest to anyone else. >> >> The httr library (and 300 others) are in a shared directory, referenced by >> everyone in the biostatistics group via adding this location to the >> .libPaths in their default .Rprofile. (Actually, their .Rprofile starts by >> running material from a central one, the libPaths is there). >> >> When I run R (3.2.0) all is fine, but R CMD build fails with the text below >> >> * creating vignettes ... ERROR >> Error: processing vignette 'dart.Rnw' failed with diagnostics: >> chunk 2 (label = auth1) >> Error : package ?httr? required by ?dart? could not be found >> Execution halted >> >> >> If I add the requiste directory to my R_LIBS_USER environment variable then >> all is fine. However, that's a nuisance since the location changes over >> time (e.g. R releases). The system admins have a whole process that keeps >> .bashrc, .Rprofle and etc dot references up to date. Plugging into this is >> why we use .Rprofile. They are quite willing to make select changes in the >> central file, but with >1000 users any suggested changes in the overall >> process do not get a warm welcome. >> >> Any ideas? There is no mention in the Writing R Extentions manual that it >> ignores the Rprofile file. If "suck it up and use R_LIBS_USER" is the >> answer, well, there are only a few who build packages. >> >> PS, I can't use RHOME:/etc/Rprofile.site since the biostat group is not the >> only set of R users. Some other groups, for instance, cannot even see our >> central area. >> >> Terry T. 
>>
>> ______________________________________________
>> R-devel at r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-devel
>
>
>

From ccberry at ucsd.edu Thu Jul 9 20:31:21 2015
From: ccberry at ucsd.edu (Charles C. Berry)
Date: Thu, 9 Jul 2015 11:31:21 -0700
Subject: [Rd] R CMD build failure
In-Reply-To: <2f3a88$105o53@ironport10.mayo.edu>
References: <2f3a88$105o53@ironport10.mayo.edu>
Message-ID:

On Thu, 9 Jul 2015, Therneau, Terry M., Ph.D. wrote:

> I have a local library 'dart' that imports "httr".

[snip `R CMD build' can't find dart]

> Any ideas? There is no mention in the Writing R Extensions manual that it
> ignores the Rprofile file.

Terry,

From WRE:

1.3 Checking and building packages
...
Note: R CMD check and R CMD build run R processes with --vanilla in which none of the user's startup files are read. If you need R_LIBS set (to find packages in a non-standard library) you can set it in the environment: also you can use the check and build environment files (as specified by the environment variables R_CHECK_ENVIRON and R_BUILD_ENVIRON; if unset, files ~/.R/check.Renviron and ~/.R/build.Renviron are used) to set environment variables when using these utilities.

And from ?Startup

The command-line option --vanilla implies --no-site-file, --no-init-file, --no-environ and (except for R CMD) --no-restore

HTH,

Chuck

From josh.m.ulrich at gmail.com Thu Jul 9 20:36:35 2015
From: josh.m.ulrich at gmail.com (Joshua Ulrich)
Date: Thu, 9 Jul 2015 13:36:35 -0500
Subject: [Rd] R CMD build failure
In-Reply-To: <2f3a88$105o53@ironport10.mayo.edu>
References: <2f3a88$105o53@ironport10.mayo.edu>
Message-ID:

On Thu, Jul 9, 2015 at 12:26 PM, Therneau, Terry M., Ph.D. wrote:
> I have a local library 'dart' that imports "httr". It has routines that
> access central patient data such as birth date, so it is heavily used
> locally but of no interest to anyone else.
> > The httr library (and 300 others) are in a shared directory, referenced by > everyone in the biostatistics group via adding this location to the > .libPaths in their default .Rprofile. (Actually, their .Rprofile starts by > running material from a central one, the libPaths is there). > > When I run R (3.2.0) all is fine, but R CMD build fails with the text below > > * creating vignettes ... ERROR > Error: processing vignette 'dart.Rnw' failed with diagnostics: > chunk 2 (label = auth1) > Error : package ?httr? required by ?dart? could not be found > Execution halted > > > If I add the requiste directory to my R_LIBS_USER environment variable then > all is fine. However, that's a nuisance since the location changes over > time (e.g. R releases). The system admins have a whole process that keeps > .bashrc, .Rprofle and etc dot references up to date. Plugging into this is > why we use .Rprofile. They are quite willing to make select changes in the > central file, but with >1000 users any suggested changes in the overall > process do not get a warm welcome. > > Any ideas? There is no mention in the Writing R Extentions manual that it > ignores the Rprofile file. If "suck it up and use R_LIBS_USER" is the > answer, well, there are only a few who build packages. > It is mentioned in ?Startup: "'R CMD check' and 'R CMD build' do not always read the standard startup files, but they do always read specific 'Renviron' files. The location of these can be controlled by the environment variables 'R_CHECK_ENVIRON' and 'R_BUILD_ENVIRON'. If these are set their value is used as the path for the 'Renviron' file; otherwise, files '~/.R/check.Renviron' or '~/.R/build.Renviron' or sub-architecture-specific versions are employed." Maybe one of those options could work for you? > PS, I can't use RHOME:/etc/Rprofile.site since the biostat group is not the > only set of R users. Some other groups, for instance, cannot even see our > central area. > > Terry T. 
> > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel -- Joshua Ulrich | about.me/joshuaulrich FOSS Trading | www.fosstrading.com From gs at statlab.uni-heidelberg.de Fri Jul 10 21:22:26 2015 From: gs at statlab.uni-heidelberg.de (G. Sawitzki) Date: Fri, 10 Jul 2015 21:22:26 +0200 Subject: [Rd] colour palettes in biplot Message-ID: <55A01B72.6020608@statlab.uni-heidelberg.de> biplot could benefit from allowing colour palettes, e.g. allowing col=list(xcol= xxpal, ycol= yypal) where xxpal and yypal are palettes. For an example, see the robust principal components in the principal component section of The required change is to replace col[1L] resp. col[2L] by col[[1L]] resp. col[[2L]] for example in plot(x, type = "n", xlim = xlim, ylim = ylim, col = col[[1L]], ## << xlab = xlab, ylab = ylab, sub = sub, main = main, ...) Existing code should not be broken by this change. Yours, gs. >From valentin.todorov at chello.at: > library(rrcov) > > pc <- PcaProj(iris[,1:4], 3, scale=TRUE) > windows(8,5) > par(mfrow=c(1,2)) > biplot(pc) > > col <- list(xcol=as.numeric(iris[,5])+2, ycol="red") > biplot(pc, col=col) > -- G. Sawitzki StatLab Heidelberg Im Neuenheimer Feld 294 D 69120 Heidelberg Tel. (+49) 62 21 - 54 89 79 Fax (+49) 62 21 - 54 53 31 GPG Fingerprint: FEA4 3F2A 88B2 2629 6CE2 0429 CC3C E49E D159 99AE From glennmschultz at me.com Sat Jul 11 04:03:47 2015 From: glennmschultz at me.com (Glenn Schultz) Date: Sat, 11 Jul 2015 02:03:47 +0000 (GMT) Subject: [Rd] User Input Message-ID: All, I have a package BondLab; all variables are passed to the models via Cusip objects. The only variables input by the user are settlement date, price or yield or spread. Further, price may be passed as 100.125, 100-8 (1/8), or 100-2 (2/64) or 100-2+ (5/128). Once passed, these variables must have a steady state (in decimal).
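[Editorial note: the price notations listed above reduce to a small normalisation step. A minimal R sketch of what that conversion could look like — the function name is hypothetical, and the arithmetic simply follows the fractions quoted in the message (digits after the dash are 64ths, a trailing "+" adds 1/128), which is not necessarily BondLab's actual convention:]

```r
## Sketch: normalise the quoted price notations to decimal.
## Ticks after the dash are treated as 64ths; a trailing "+" adds 1/128.
priceToDecimal <- function(price) {
  price <- as.character(price)
  plus  <- grepl("\\+$", price)          # trailing "+" means an extra 1/128
  price <- sub("\\+$", "", price)
  parts <- strsplit(price, "-", fixed = TRUE)[[1L]]
  whole <- as.numeric(parts[1L])
  ticks <- if (length(parts) > 1L) as.numeric(parts[2L]) else 0
  whole + ticks / 64 + if (plus) 1 / 128 else 0
}

priceToDecimal("100.125")  # 100.125
priceToDecimal("100-8")    # 100.125  (1/8)
priceToDecimal("100-2+")   # 100.0390625  (5/128)
```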
After reading the R documentation and Hadley's book I think the way to do this is via a class, but I can't quite get my head around how to do it. A nominal model is: Analytics <- function(bondid = cusip, tradedate = trade.date, settledate = settle.date, price = 100.125){ BondAnalytics <- BondAnalytics(tradedate = tradedate, settledate = settledate, price = price) New("BondAnalytics ...) } So, price is an input to a constructor function which calls other functions to create the class object BondAnalytics. If price is an object with validity, how do I make that part of the function? If I do make it a class, how does the function input inherit? I read John Chambers's book but I am not sure... maybe it does not have to be a class but something else, or is it something like the below: BondAnalytics <- function(bond.id = "character", tradedate = "character", settledate = "character", price = .Object){ do stuff...} Advice appreciated, -glenn From thierry.onkelinx at inbo.be Sun Jul 12 11:58:47 2015 From: thierry.onkelinx at inbo.be (Thierry Onkelinx) Date: Sun, 12 Jul 2015 11:58:47 +0200 Subject: [Rd] User Input In-Reply-To: References: Message-ID: If you use S4 objects then have a look at ?setValidity Best regards, Thierry On 11-Jul-2015 04:11, "Glenn Schultz" wrote: > All, > > I have a package BondLab, all variables are passed to the models via Cusip > objects. > > The only variables input by the user are > settlement date, > price or yield or spread. > > Further price may be passed as 100.125, 100-8 (1/8), or 100-2 (2/64) or > 100-2+ (5/128). > > Once passed these variables must have a steady state (in decimal). After > reading the R documentation and Hadley's book I think the way to do this is > via a class but I can't quite get my head around how to do it.
A nominal > model is: > > Analytics <- function(bondid = cusip, tradedate = trade.date, settledate = > settle.date, price = 100.125){ > BondAnalytics <- BondAnalytics(tradedate = tradedate, settledate = > settledate, price = price) > New("BondAnalytics ...) > } > > So, price is an input to a constructor function which calls other > functions to create the class object BondAnalytics. If price is an object > with validity how do I make that part of the function? > > If I do make it class how does it function input inherit? I read John > Chamber's book but I am not sure... maybe it does not have to be a class > but something else or is it something like the below: > > BondAnalytics <- function(bond.id = "character", tradedate = "character", > settledate = "character", price = .Object){ > do stuff...} > > Advice appreciated, > -glenn > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel > [[alternative HTML version deleted]] From daattali at gmail.com Sun Jul 12 21:51:06 2015 From: daattali at gmail.com (Dean Attali) Date: Sun, 12 Jul 2015 12:51:06 -0700 Subject: [Rd] Why no support for 3-digit HEX colours? Message-ID: When specifying an RGB colour in R, is there a strong reason not to accept 3-character HEX codes? In CSS and many other languages, a colour of "#ABC" is automatically converted to "#AABBCC", and I was wondering if R could support that as well, or if it was a conscious decision to not support it. --- http://deanattali.com [[alternative HTML version deleted]] From murdoch.duncan at gmail.com Sun Jul 12 22:32:03 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Sun, 12 Jul 2015 16:32:03 -0400 Subject: [Rd] Why no support for 3-digit HEX colours? In-Reply-To: References: Message-ID: <55A2CEC3.6000308@gmail.com> On 12/07/2015 3:51 PM, Dean Attali wrote: > When specifying an RGB colour in R, is there a strong reason not to accept > 3-character HEX codes? 
In CSS and many other languages, a colour of "#ABC" > is automatically converted to "#AABBCC", and I was wondering if R could > support that as well, or if it was a conscious decision to not support it. This is on the wish list. See https://bugs.r-project.org/bugzilla3/show_bug.cgi?id=16426 Duncan Murdoch From daattali at gmail.com Sun Jul 12 22:41:58 2015 From: daattali at gmail.com (Dean Attali) Date: Sun, 12 Jul 2015 13:41:58 -0700 Subject: [Rd] Why no support for 3-digit HEX colours? In-Reply-To: <55A2CEC3.6000308@gmail.com> References: <55A2CEC3.6000308@gmail.com> Message-ID: And very recent as well, good to know, thanks! --- http://deanattali.com On 12 July 2015 at 13:32, Duncan Murdoch wrote: > On 12/07/2015 3:51 PM, Dean Attali wrote: > > When specifying an RGB colour in R, is there a strong reason not to > accept > > 3-character HEX codes? In CSS and many other languages, a colour of > "#ABC" > > is automatically converted to "#AABBCC", and I was wondering if R could > > support that as well, or if it was a conscious decision to not support > it. > > This is on the wish list. 
See > > https://bugs.r-project.org/bugzilla3/show_bug.cgi?id=16426 > > Duncan Murdoch > > [[alternative HTML version deleted]] From joysofpi at gmail.com Sun Jul 12 23:15:07 2015 From: joysofpi at gmail.com (Arthur Vigil) Date: Sun, 12 Jul 2015 14:15:07 -0700 Subject: [Rd] suggestion: better support for https CRAN mirrors Message-ID: <8C4E93DE-0655-4D8B-961E-5CD5906E4697@gmail.com> Hello, I see that https CRAN mirrors are starting to pop up, as per this post at https://support.rstudio.com/hc/en-us/articles/206827897-Secure-Package-Downloads-for-R However, trying to use one of these mirrors without changing the default download.file.method option gives me errors > Warning: unable to access index for repository https://cran.rstudio.com/src/contrib > Warning: unable to access index for repository https://cran.rstudio.com/bin/macosx/mavericks/contrib/3.2 I don't mind just choosing an http mirror but when prompted to select a mirror, there is no distinction made in the mirror listing between http and https mirrors. It would be nice if https mirrors were hidden when the current environment doesn't support them. Alternatively, at least make a distinction between http and https in the mirror listing. They currently just look like duplicates. I think this is a pretty trivial change but would help a lot with usability. As an example, when using install.packages: > --- Please select a CRAN mirror for use in this session --- > CRAN mirror > > 1: 0-Cloud 2: 0-Cloud > 3: Algeria 4: Argentina (La Plata) > 5: Australia (Canberra) 6: Australia (Melbourne) > 7: Austria 8: Austria > 9: Belgium 10: Brazil (BA) > 11: Brazil (PR) 12: Brazil (RJ) > 13: Brazil (SP 1) 14: Brazil (SP 2) In the listing above there are duplicate entries for 0-cloud and Austria pointing to http and https versions of the same mirror.
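[Editorial note: one quick way to test whether a session can use an https mirror at all is to try fetching an https URL directly. A diagnostic sketch only — the helper name is made up, and the exact failure mode depends on the download method in use:]

```r
## Returns TRUE if the session's default download method can read an
## https URL, FALSE otherwise (errors and warnings both count as failure).
canUseHttpsMirror <- function(u = "https://cran.r-project.org/") {
  tryCatch(
    length(readLines(url(u), n = 1L, warn = FALSE)) > 0L,
    error = function(e) FALSE,
    warning = function(w) FALSE
  )
}
```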
From murdoch.duncan at gmail.com Mon Jul 13 00:54:34 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Sun, 12 Jul 2015 18:54:34 -0400 Subject: [Rd] suggestion: better support for https CRAN mirrors In-Reply-To: <8C4E93DE-0655-4D8B-961E-5CD5906E4697@gmail.com> References: <8C4E93DE-0655-4D8B-961E-5CD5906E4697@gmail.com> Message-ID: <55A2F02A.9070707@gmail.com> On 12/07/2015 5:15 PM, Arthur Vigil wrote: > Hello, > I see that https CRAN mirrors are starting to pop up, as per this post at https://support.rstudio.com/hc/en-us/articles/206827897-Secure-Package-Downloads-for-R > > However, trying to use one of these mirrors without changing the default download.file.method option gives me errors > >> Warning: unable to access index for repository https://cran.rstudio.com/src/contrib >> Warning: unable to access index for repository https://cran.rstudio.com/bin/macosx/mavericks/contrib/3.2 > > I don't mind just choosing an http mirror but when prompted to select a mirror, there is no distinction made in the mirror listing between http and https mirrors. It would be nice if https mirrors were hidden when the current environment doesn't support them. Alternatively, at least make a distinction between http and https in the mirror listing. They currently just look like duplicates. I think this is a pretty trivial change but would help a lot with usability. > > As an example, when using install.packages: If you update to R-patched, you'll see the difference. With current releases of R, you'll see some mirrors twice: the first one is https, the second is http. The other issue you may have is that your system may be configured to default to a method that doesn't support https. Try download.file("https://cran.rstudio.com") and readLines(url("https://cran.rstudio.com")) to confirm that's not a problem for you.
Duncan Murdoch > >> --- Please select a CRAN mirror for use in this session --- >> CRAN mirror >> >> 1: 0-Cloud 2: 0-Cloud >> 3: Algeria 4: Argentina (La Plata) >> 5: Australia (Canberra) 6: Australia (Melbourne) >> 7: Austria 8: Austria >> 9: Belgium 10: Brazil (BA) >> 11: Brazil (PR) 12: Brazil (RJ) >> 13: Brazil (SP 1) 14: Brazil (SP 2) > > In the listing above there are duplicate entries for 0-cloud and Austria pointing to http and https versions of the same mirror. > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel > From murdoch.duncan at gmail.com Mon Jul 13 00:58:01 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Sun, 12 Jul 2015 18:58:01 -0400 Subject: [Rd] suggestion: better support for https CRAN mirrors In-Reply-To: <55A2F02A.9070707@gmail.com> References: <8C4E93DE-0655-4D8B-961E-5CD5906E4697@gmail.com> <55A2F02A.9070707@gmail.com> Message-ID: <55A2F0F9.4070104@gmail.com> On 12/07/2015 6:54 PM, Duncan Murdoch wrote: > On 12/07/2015 5:15 PM, Arthur Vigil wrote: >> Hello, >> I see that https CRAN mirrors are starting to pop up, as per this post at https://support.rstudio.com/hc/en-us/articles/206827897-Secure-Package-Downloads-for-R >> >> However, trying to use one of these mirrors without changing the default download.file.method option gives me errors >> >>> Warning: unable to access index for repository https://cran.rstudio.com/src/contrib >>> Warning: unable to access index for repository https://cran.rstudio.com/bin/macosx/mavericks/contrib/3.2 >> >> I don't mind just choosing an http mirror but when prompted to select a mirror, there is no distinction made in the mirror listing between http and https mirrors. It would be nice if https mirrors were hidden when the current environment doesn't support them. Alternatively, at least make a distinction between http and https in the mirror listing. They currently just look like duplicates.
I think this is a pretty trivial change but would help a lot with usability. >> >> As an example, when using install.packages: > > If you update to R-patched, you'll see the difference. With current > releases of R, you'll see some mirrors twice: the first one is https, > the second is http. > > The other issue you may have is that your system may be configured to > default to a method that doesn't support https. Try > download.file("https://cran.rstudio.com") and Oops, you need to give a destination. Try download.file("https://cran.rstudio.com", destfile = (f <- tempfile())) > readLines(url("https://cran.rstudio.com")) to confirm that's not a > problem for you. > > Duncan Murdoch > >> >>> --- Please select a CRAN mirror for use in this session --- >>> CRAN mirror >>> >>> 1: 0-Cloud 2: 0-Cloud >>> 3: Algeria 4: Argentina (La Plata) >>> 5: Australia (Canberra) 6: Australia (Melbourne) >>> 7: Austria 8: Austria >>> 9: Belgium 10: Brazil (BA) >>> 11: Brazil (PR) 12: Brazil (RJ) >>> 13: Brazil (SP 1) 14: Brazil (SP 2) >> >> In the listing above there are duplicate entries for 0-cloud and Austria pointing to http and https versions of the same mirror. 
>> ______________________________________________ >> R-devel at r-project.org mailing list >> https://stat.ethz.ch/mailman/listinfo/r-devel >> > From Tom.Wenseleers at bio.kuleuven.be Tue Jul 14 13:03:57 2015 From: Tom.Wenseleers at bio.kuleuven.be (Tom Wenseleers) Date: Tue, 14 Jul 2015 11:03:57 +0000 Subject: [Rd] Use cairo fallback resolution greater than 72dpi in cairo_pdf and cairo_ps in grDevices Message-ID: <37EFC97028F3E44082ACC5CBEC00563011547D07@ICTS-S-MBX13.luna.kuleuven.be> Dear all, The documentation for the grDevices functions cairo_pdf and cairo_ps mentions that when transparency (alpha channels) is used in vector output, the exported PDF or PostScript graph is rasterized at a resolution of 72 dpi: https://stat.ethz.ch/R-manual/R-devel/library/grDevices/html/cairo.html You can see the problem if you try library(ggplot2) cairo_ps(file = "test.eps", onefile = FALSE) qplot(Sepal.Length, Petal.Length, data = iris, color = Species, size = Petal.Width, alpha = I(0.7)) dev.off() as the plot symbols in the output (here zoomed in) are then heavily pixelated, showing it is only using 72 dpi. I was wondering how the fallback resolution could be increased to 600 dpi. In library(RGtk2) there is a command cairoSurfaceSetFallbackResolution, which I think is what is relevant here, but I would not know how to make grDevices use that parameter. Any thoughts? Using postscript() btw also doesn't work, since that doesn't support transparency, and returns the error "semi-transparency is not supported on this device: reported only once per page". I know I can use svg or pdf instead and that this would solve the problem, but the problem is that the journal (PLoS) I would like to submit to only accepts EPS. So is there any solution to increase the quality in EPS, without having to rasterize everything to PNG?
best regards, Tom Wenseleers [[alternative HTML version deleted]] From radford at cs.toronto.edu Wed Jul 15 00:08:35 2015 From: radford at cs.toronto.edu (Radford Neal) Date: Tue, 14 Jul 2015 18:08:35 -0400 Subject: [Rd] Two bugs showing up mostly on SPARC systems Message-ID: <20150714220835.GA28360@cs.toronto.edu> In testing pqR on Solaris SPARC systems, I have found two bugs that are also present in recent R Core versions. You can see the bugs and fixes at the following URLs: https://github.com/radfordneal/pqR/commit/739a4960a4d8f3a3b20cfc311518369576689f37 https://github.com/radfordneal/pqR/commit/339b7286c7b43dcc6b00e51515772f1d7dce7858 The first bug, in nls, is most likely to occur on a 64-bit big-endian system, but will occur with low probability on most platforms. The second bug, in readBin, may occur on systems in which unaligned access to data more than one byte in size is an error, depending on details of the compiler. It showed up with gcc 4.9.2 on a SPARC system. The fix slightly changes the error behaviour, signaling an error on inappropriate reads before reading any data, rather than after reading one (but not all) items as before. Radford Neal From murdoch.duncan at gmail.com Wed Jul 15 01:52:56 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Tue, 14 Jul 2015 19:52:56 -0400 Subject: [Rd] Two bugs showing up mostly on SPARC systems In-Reply-To: <20150714220835.GA28360@cs.toronto.edu> References: <20150714220835.GA28360@cs.toronto.edu> Message-ID: <55A5A0D8.6070504@gmail.com> On 14/07/2015 6:08 PM, Radford Neal wrote: > In testing pqR on Solaris SPARC systems, I have found two bugs that > are also present in recent R Core versions. You can see the bugs and > fixes at the following URLs: > > https://github.com/radfordneal/pqR/commit/739a4960a4d8f3a3b20cfc311518369576689f37 Thanks for the report. Just one followup on this one: There are two sections of code that are really similar.
Your patch applies to the code in port_nlsb(), but there's a very similar test in port_nlminb(), which is called from nlminb() in R. Do you think it would be a good idea to apply the same patch there as well? It doesn't look like it would hurt, but I don't know this code at all, so it might be unnecessary. Duncan Murdoch > > https://github.com/radfordneal/pqR/commit/339b7286c7b43dcc6b00e51515772f1d7dce7858 > > The first bug, in nls, is most likely to occur on a 64-bit big-endian > system, but will occur with low probability on most platforms. > > The second bug, in readBin, may occur on systems in which unaligned > access to data more than one byte in size is an error, depending on > details of the compiler. It showed up with gcc 4.9.2 on a SPARC > system. The fix slightly changes the error behaviuor, signaling an > error on inappropriate reads before reading any data, rather than > after reading one (but not all) items as before. > > Radford Neal > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel > From radford at cs.toronto.edu Wed Jul 15 03:29:36 2015 From: radford at cs.toronto.edu (Radford Neal) Date: Tue, 14 Jul 2015 21:29:36 -0400 Subject: [Rd] Two bugs showing up mostly on SPARC systems In-Reply-To: <55A5A0D8.6070504@gmail.com> References: <20150714220835.GA28360@cs.toronto.edu> <55A5A0D8.6070504@gmail.com> Message-ID: <20150715012936.GA18664@cs.toronto.edu> On Tue, Jul 14, 2015 at 07:52:56PM -0400, Duncan Murdoch wrote: > On 14/07/2015 6:08 PM, Radford Neal wrote: > > In testing pqR on Solaris SPARC systems, I have found two bugs that > > are also present in recent R Core versions. You can see the bugs and > > fixes at the following URLs: > > > > https://github.com/radfordneal/pqR/commit/739a4960a4d8f3a3b20cfc311518369576689f37 > > Thanks for the report. Just one followup on this one: > > There are two sections of code that are really similar. 
Your patch > applies to the code in port_nlsb(), but there's a very similar test in > port_nlminb(), which is called from nlminb() in R. Do you think it > would be a good idea to apply the same patch there as well? It doesn't > look like it would hurt, but I don't know this code at all, so it might > be unnecessary. Looking at nlminb, it seems that this bug doesn't exist there. The R code sets low <- upp <- NULL, as in nls, but later there is an "else low <- upp <- numeric()" that ensures that low and upp are never actually NULL. This may have been a fix for the bug showing up in nlminb that was not applied to nls as well (and of course, the fix didn't delete the now pointless low <- upp <- NULL). The nlminb code might be a better fix, stylistically, after removing "low <- upp <- NULL", though it seems that both it and my fix for nls should work. Of course, both assume that the call of the C function is done only from this R code, so no other types for low and upp are possible. And really the whole thing ought to be rewritten, since the .Call functions modify variables without any regard for whether or not their values are shared. Radford Neal From georgi.boshnakov at manchester.ac.uk Wed Jul 15 13:44:42 2015 From: georgi.boshnakov at manchester.ac.uk (Georgi Boshnakov) Date: Wed, 15 Jul 2015 11:44:42 +0000 Subject: [Rd] add .emacs.desktop and .emacs.desktop.lock to files ignored by R CMD build? Message-ID: <438D2EC9EAFE5946B2D5864670EA468E015EA2EABB@MBXP01.ds.man.ac.uk> Is it possible to consider adding .emacs.desktop and .emacs.desktop.lock to files ignored by R CMD build? 
Thanks, Georgi -- Dr Georgi Boshnakov tel: (+44) (0)161 306 3684 School of Mathematics fax: (+44) (0)161 306 3669 Alan Turing Building 1.125 The University of Manchester email: Georgi.Boshnakov at manchester.ac.uk Oxford Road Manchester M13 9PL UK From edd at debian.org Wed Jul 15 14:24:34 2015 From: edd at debian.org (Dirk Eddelbuettel) Date: Wed, 15 Jul 2015 07:24:34 -0500 Subject: [Rd] add .emacs.desktop and .emacs.desktop.lock to files ignored by R CMD build? In-Reply-To: <438D2EC9EAFE5946B2D5864670EA468E015EA2EABB@MBXP01.ds.man.ac.uk> References: <438D2EC9EAFE5946B2D5864670EA468E015EA2EABB@MBXP01.ds.man.ac.uk> Message-ID: <21926.20738.919996.314767@max.nulle.part> On 15 July 2015 at 11:44, Georgi Boshnakov wrote: | Is it possible to consider adding .emacs.desktop and .emacs.desktop.lock to files ignored by R CMD build? You do that at your end via a file .Rbuildignore. From Section 1.3.2 of WRE: To exclude files from being put into the package, one can specify a list of exclude patterns in file '.Rbuildignore' in the top-level source directory. These patterns should be Perl-like regular expressions (see the help for 'regexp' in R for the precise details), one per line, to be matched case-insensitively(1) against the file and directory names relative to the top-level package source directory. In addition, directories from source control systems(2) or from 'eclipse'(3), directories with names ending '.Rcheck' or 'Old' or 'old' and files 'GNUMakefile'(4), 'Read-and-delete-me' or with base names starting with '.#', or starting and ending with '#', or ending in '~', '.bak' or '.swp', are excluded by default. In addition, those files in the 'R', 'demo' and 'man' directories which are flagged by 'R CMD check' as having invalid names will be excluded. 
Also, this would have been a splendid question for the new mailing r-package-devel: https://stat.ethz.ch/mailman/listinfo/r-package-devel Hth, Dirk -- http://dirk.eddelbuettel.com | @eddelbuettel | edd at debian.org From murdoch.duncan at gmail.com Wed Jul 15 19:05:33 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Wed, 15 Jul 2015 13:05:33 -0400 Subject: [Rd] Two bugs showing up mostly on SPARC systems In-Reply-To: <20150715012936.GA18664@cs.toronto.edu> References: <20150714220835.GA28360@cs.toronto.edu> <55A5A0D8.6070504@gmail.com> <20150715012936.GA18664@cs.toronto.edu> Message-ID: <55A692DD.5060404@gmail.com> On 14/07/2015 9:29 PM, Radford Neal wrote: > On Tue, Jul 14, 2015 at 07:52:56PM -0400, Duncan Murdoch wrote: >> On 14/07/2015 6:08 PM, Radford Neal wrote: >>> In testing pqR on Solaris SPARC systems, I have found two bugs that >>> are also present in recent R Core versions. You can see the bugs and >>> fixes at the following URLs: >>> >>> https://github.com/radfordneal/pqR/commit/739a4960a4d8f3a3b20cfc311518369576689f37 >> >> Thanks for the report. Just one followup on this one: >> >> There are two sections of code that are really similar. Your patch >> applies to the code in port_nlsb(), but there's a very similar test in >> port_nlminb(), which is called from nlminb() in R. Do you think it >> would be a good idea to apply the same patch there as well? It doesn't >> look like it would hurt, but I don't know this code at all, so it might >> be unnecessary. > > Looking at nlminb, it seems that this bug doesn't exist there. The > R code sets low <- upp <- NULL, as in nls, but later there is an > "else low <- upp <- numeric()" that ensures that low and upp are never > actually NULL. This may have been a fix for the bug showing up in > nlminb that was not applied to nls as well (and of course, the fix > didn't delete the now pointless low <- upp <- NULL). 
> > The nlminb code might be a better fix, stylistically, after removing > "low <- upp <- NULL", though it seems that both it and my fix for nls > should work. Of course, both assume that the call of the C function > is done only from this R code, so no other types for low and upp are > possible. And really the whole thing ought to be rewritten, since the > .Call functions modify variables without any regard for whether or not > their values are shared. Thanks. I think I'll make the mods to the R code rather than the C code, to make the two functions more consistent, and so as not to suggest that port.c is actually okay. The modification of variables without checking for sharing is definitely something that should be fixed. It looks to me as though we're likely to get away with it here (both m and iv are allocated by functions called by nls(), not by the user), but it's still really bad practice. Duncan Murdoch From dayne.filer at gmail.com Wed Jul 15 20:49:27 2015 From: dayne.filer at gmail.com (Dayne Filer) Date: Wed, 15 Jul 2015 14:49:27 -0400 Subject: [Rd] bquote/evalq behavior changed in R-3.2.1 Message-ID: Hello, I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that worked as I intended previously. Briefly, I am using bquote to generate expressions to modify data.table objects within a function, so I need the changes to actually be stored in the given environment. Previously, I used code like the following: test <- list(bquote(x <- 10)) fenv <- environment() rapply(test, evalq, envir = fenv) Although the code in the example above is much simpler, it shows the problem. On 3.1.2 the expression is evaluated and x is stored as 10 in the given environment, fenv. In 3.2.1 the code throws an error: Error in eval(substitute(expr), envir, enclos) : object 'X' not found I could not find anything in the release notes that would explain this change. Changing evalq to eval works in 3.2.1, but eval does not store x in the given environment in 3.1.2. 
Thanks, Dayne [[alternative HTML version deleted]] From murdoch.duncan at gmail.com Wed Jul 15 21:18:12 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Wed, 15 Jul 2015 15:18:12 -0400 Subject: [Rd] bquote/evalq behavior changed in R-3.2.1 In-Reply-To: References: Message-ID: <55A6B1F4.6010908@gmail.com> On 15/07/2015 2:49 PM, Dayne Filer wrote: > Hello, > > I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that worked > as I intended previously. Briefly, I am using bquote to generate > expressions to modify data.table objects within a function, so I need the > changes to actually be stored in the given environment. Previously, I used > code like the following: > > test <- list(bquote(x <- 10)) > fenv <- environment() > rapply(test, evalq, envir = fenv) > > Although the code in the example above is much simpler, it shows the > problem. On 3.1.2 the expression is evaluated and x is stored as 10 in the > given environment, fenv. In 3.2.1 the code throws an error: > > Error in eval(substitute(expr), envir, enclos) : object 'X' not found > > I could not find anything in the release notes that would explain this > change. Changing evalq to eval works in 3.2.1, but eval does not store x in > the given environment in 3.1.2. Please submit this as a bug report (at bugs.r-project.org). I don't know for sure that it's a bug, but it looks like one: there's no 'X' in your code, so that message is coming from something internal. It would be helpful to include results that work from 3.1.2 as well as what you're seeing in 3.2.1. I'm still seeing the error you reported in R-devel, so I think the bug is still there... Duncan Murdoch From wdunlap at tibco.com Wed Jul 15 21:29:14 2015 From: wdunlap at tibco.com (William Dunlap) Date: Wed, 15 Jul 2015 12:29:14 -0700 Subject: [Rd] bquote/evalq behavior changed in R-3.2.1 In-Reply-To: References: Message-ID: I am curious why you used evalq instead of eval in this code. 
Bill Dunlap TIBCO Software wdunlap tibco.com On Wed, Jul 15, 2015 at 11:49 AM, Dayne Filer wrote: > Hello, > > I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that worked > as I intended previously. Briefly, I am using bquote to generate > expressions to modify data.table objects within a function, so I need the > changes to actually be stored in the given environment. Previously, I used > code like the following: > > test <- list(bquote(x <- 10)) > fenv <- environment() > rapply(test, evalq, envir = fenv) > > Although the code in the example above is much simpler, it shows the > problem. On 3.1.2 the expression is evaluated and x is stored as 10 in the > given environment, fenv. In 3.2.1 the code throws an error: > > Error in eval(substitute(expr), envir, enclos) : object 'X' not found > > I could not find anything in the release notes that would explain this > change. Changing evalq to eval works in 3.2.1, but eval does not store x in > the given environment in 3.1.2. > > Thanks, > > Dayne > > [[alternative HTML version deleted]] > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel > [[alternative HTML version deleted]] From dayne.filer at gmail.com Wed Jul 15 21:35:19 2015 From: dayne.filer at gmail.com (Dayne Filer) Date: Wed, 15 Jul 2015 15:35:19 -0400 Subject: [Rd] bquote/evalq behavior changed in R-3.2.1 In-Reply-To: References: Message-ID: In 3.1.2 eval does not store the result of the bquote-generated call in the given environment. Interestingly, in 3.2.1 eval does store the result of the bquote-generated call in the given environment. In other words if I run the given example with eval rather than evalq, on 3.1.2 "x" is never stored in "fenv," but it is when I run the same code on 3.2.1. However, the given example stores "x" in "fenv" on 3.1.2, but throws the error I gave when run on 3.2.1. 
To give credit, I received the idea for using evalq from SO: http://stackoverflow.com/a/22559385 Dayne On Wed, Jul 15, 2015 at 3:29 PM, William Dunlap wrote: > I am curious why you used evalq instead of eval in this code. > > Bill Dunlap > TIBCO Software > wdunlap tibco.com > > On Wed, Jul 15, 2015 at 11:49 AM, Dayne Filer > wrote: > >> Hello, >> >> I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that worked >> as I intended previously. Briefly, I am using bquote to generate >> expressions to modify data.table objects within a function, so I need the >> changes to actually be stored in the given environment. Previously, I used >> code like the following: >> >> test <- list(bquote(x <- 10)) >> fenv <- environment() >> rapply(test, evalq, envir = fenv) >> >> Although the code in the example above is much simpler, it shows the >> problem. On 3.1.2 the expression is evaluated and x is stored as 10 in the >> given environment, fenv. In 3.2.1 the code throws an error: >> >> Error in eval(substitute(expr), envir, enclos) : object 'X' not found >> >> I could not find anything in the release notes that would explain this >> change. Changing evalq to eval works in 3.2.1, but eval does not store x >> in >> the given environment in 3.1.2. >> >> Thanks, >> >> Dayne >> >> [[alternative HTML version deleted]] >> >> ______________________________________________ >> R-devel at r-project.org mailing list >> https://stat.ethz.ch/mailman/listinfo/r-devel >> > > [[alternative HTML version deleted]] From kevinushey at gmail.com Wed Jul 15 21:43:49 2015 From: kevinushey at gmail.com (Kevin Ushey) Date: Wed, 15 Jul 2015 12:43:49 -0700 Subject: [Rd] bquote/evalq behavior changed in R-3.2.1 In-Reply-To: References: Message-ID: My best guess is that it could be related to this commit: https://github.com/wch/r-source/commit/14f904c32a44010d4dfb8a829805648a88c22f53, since that's the only change that's touched `rapply` lately. 
On Wed, Jul 15, 2015 at 12:35 PM, Dayne Filer wrote: > In 3.1.2 eval does not store the result of the bquote-generated call in the > given environment. Interestingly, in 3.2.1 eval does store the result of > the bquote-generated call in the given environment. > > In other words if I run the given example with eval rather than evalq, on > 3.1.2 "x" is never stored in "fenv," but it is when I run the same code on > 3.2.1. However, the given example stores "x" in "fenv" on 3.1.2, but throws > the error I gave when run on 3.2.1. > > To give credit, I received the idea for using evalq from SO: > http://stackoverflow.com/a/22559385 > > Dayne > > > On Wed, Jul 15, 2015 at 3:29 PM, William Dunlap wrote: > >> I am curious why you used evalq instead of eval in this code. >> >> Bill Dunlap >> TIBCO Software >> wdunlap tibco.com >> >> On Wed, Jul 15, 2015 at 11:49 AM, Dayne Filer >> wrote: >> >>> Hello, >>> >>> I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that worked >>> as I intended previously. Briefly, I am using bquote to generate >>> expressions to modify data.table objects within a function, so I need the >>> changes to actually be stored in the given environment. Previously, I used >>> code like the following: >>> >>> test <- list(bquote(x <- 10)) >>> fenv <- environment() >>> rapply(test, evalq, envir = fenv) >>> >>> Although the code in the example above is much simpler, it shows the >>> problem. On 3.1.2 the expression is evaluated and x is stored as 10 in the >>> given environment, fenv. In 3.2.1 the code throws an error: >>> >>> Error in eval(substitute(expr), envir, enclos) : object 'X' not found >>> >>> I could not find anything in the release notes that would explain this >>> change. Changing evalq to eval works in 3.2.1, but eval does not store x >>> in >>> the given environment in 3.1.2. 
>>>
>>> Thanks,
>>>
>>> Dayne
>>>
>>> [[alternative HTML version deleted]]
>>>
>>> ______________________________________________
>>> R-devel at r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-devel
>>>
>>
>>
>
> [[alternative HTML version deleted]]
>
> ______________________________________________
> R-devel at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

From murdoch.duncan at gmail.com  Wed Jul 15 21:50:16 2015
From: murdoch.duncan at gmail.com (Duncan Murdoch)
Date: Wed, 15 Jul 2015 15:50:16 -0400
Subject: [Rd] Two bugs showing up mostly on SPARC systems
In-Reply-To: <20150714220835.GA28360@cs.toronto.edu>
References: <20150714220835.GA28360@cs.toronto.edu>
Message-ID: <55A6B978.7090004@gmail.com>

On 14/07/2015 6:08 PM, Radford Neal wrote:
> In testing pqR on Solaris SPARC systems, I have found two bugs that
> are also present in recent R Core versions. You can see the bugs and
> fixes at the following URLs:
>
> https://github.com/radfordneal/pqR/commit/739a4960a4d8f3a3b20cfc311518369576689f37
>
> https://github.com/radfordneal/pqR/commit/339b7286c7b43dcc6b00e51515772f1d7dce7858
>
> The first bug, in nls, is most likely to occur on a 64-bit big-endian
> system, but will occur with low probability on most platforms.
>
> The second bug, in readBin, may occur on systems in which unaligned
> access to data more than one byte in size is an error, depending on
> details of the compiler. It showed up with gcc 4.9.2 on a SPARC
> system. The fix slightly changes the error behaviour, signaling an
> error on inappropriate reads before reading any data, rather than
> after reading one (but not all) items as before.

I've now taken a look at the second bug. It's a little harder to apply
your patch, since your code is based on an old version of R; current R
has more error checking and allows long vector results from this
function. But it's basically just a matter of being careful not to make
changes to the newer code.
So this one should go into R-patched more or less as-is.

Duncan Murdoch

From wdunlap at tibco.com  Wed Jul 15 21:51:39 2015
From: wdunlap at tibco.com (William Dunlap)
Date: Wed, 15 Jul 2015 12:51:39 -0700
Subject: [Rd] bquote/evalq behavior changed in R-3.2.1
In-Reply-To: 
References: 
Message-ID: 

I think rapply() was changed to act like lapply() in this respect.

In R-3.1.3 we got
rapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17)))
#[1] 18
rapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17)))
#Error in (function (expr, envir = parent.frame(), enclos = if
(is.list(envir) || :
  object 'myNumber' not found
lapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17)))
#Error in eval(substitute(expr), envir, enclos) : object 'X' not found
lapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17)))
#[[1]]
#[1] 18

while in R-3.2.0 we get
rapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17)))
#Error in eval(substitute(expr), envir, enclos) : object 'X' not found
rapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17)))
#[1] 18
lapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17)))
#Error in eval(substitute(expr), envir, enclos) : object 'X' not found
lapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17)))
#[[1]]
#[1] 18

Make the FUN argument function(arg)sys.call() to see some details of the
change.

Bill Dunlap
TIBCO Software
wdunlap tibco.com

On Wed, Jul 15, 2015 at 12:35 PM, Dayne Filer wrote:

> In 3.1.2 eval does not store the result of the bquote-generated call in
> the given environment. Interestingly, in 3.2.1 eval does store the result
> of the bquote-generated call in the given environment.
>
> In other words if I run the given example with eval rather than evalq, on
> 3.1.2 "x" is never stored in "fenv," but it is when I run the same code on
> 3.2.1.
However, the given example stores "x" in "fenv" on 3.1.2, but throws > the error I gave when run on 3.2.1. > > To give credit, I received the idea for using evalq from SO: > http://stackoverflow.com/a/22559385 > > Dayne > > > On Wed, Jul 15, 2015 at 3:29 PM, William Dunlap wrote: > >> I am curious why you used evalq instead of eval in this code. >> >> Bill Dunlap >> TIBCO Software >> wdunlap tibco.com >> >> On Wed, Jul 15, 2015 at 11:49 AM, Dayne Filer >> wrote: >> >>> Hello, >>> >>> I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that >>> worked >>> as I intended previously. Briefly, I am using bquote to generate >>> expressions to modify data.table objects within a function, so I need >>> the >>> changes to actually be stored in the given environment. Previously, I >>> used >>> code like the following: >>> >>> test <- list(bquote(x <- 10)) >>> fenv <- environment() >>> rapply(test, evalq, envir = fenv) >>> >>> Although the code in the example above is much simpler, it shows the >>> problem. On 3.1.2 the expression is evaluated and x is stored as 10 in >>> the >>> given environment, fenv. In 3.2.1 the code throws an error: >>> >>> Error in eval(substitute(expr), envir, enclos) : object 'X' not found >>> >>> I could not find anything in the release notes that would explain this >>> change. Changing evalq to eval works in 3.2.1, but eval does not store x >>> in >>> the given environment in 3.1.2. 
>>>
>>> Thanks,
>>>
>>> Dayne
>>>
>>> [[alternative HTML version deleted]]
>>>
>>> ______________________________________________
>>> R-devel at r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-devel
>>>
>>
>>
> [[alternative HTML version deleted]]

From dwinsemius at comcast.net  Wed Jul 15 22:25:11 2015
From: dwinsemius at comcast.net (David Winsemius)
Date: Wed, 15 Jul 2015 13:25:11 -0700
Subject: [Rd] bquote/evalq behavior changed in R-3.2.1
In-Reply-To: 
References: 
Message-ID: 

On Jul 15, 2015, at 12:51 PM, William Dunlap wrote:

> I think rapply() was changed to act like lapply() in this respect.
>

When I looked at the source of the difference, it was that typeof()
returned 'language' in 3.2.1, while it returned 'list' in the earlier
version of R. The first check in rapply's code in both versions was:

    if (typeof(object) != "list")
        stop("'object' must be a list")

Wrapping list() around the first argument and switching to using eval with
an expression-object rather than a call-object seemed to solve the problem
when this was posed as a question on StackOverflow, but Dayne was not happy
with that solution for other reasons that he is not describing.

-- 
David.
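[The typeof() point above can be checked directly; a minimal sketch:]

```r
cl <- quote(x <- 10)

typeof(cl)        # "language" -- a call object is not a list, which is
                  # why rapply()'s `typeof(object) != "list"` check matters
typeof(list(cl))  # "list" -- wrapping the call in list() passes that check
```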
> In R-3.1.3 we got > rapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17))) > #[1] 18 > rapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) > #Error in (function (expr, envir = parent.frame(), enclos = if > (is.list(envir) || : > object 'myNumber' not found > lapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17))) > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found > lapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) > #[[1]] > #[1] 18 > while in R-3.2.0 we get > rapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17))) > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found > rapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) > #[1] 18 > lapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17))) > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found > lapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) > #[[1]] > #[1] 18 > > Make the FUN argument function(arg)sys.call() to see some details of the > change. > > > Bill Dunlap > TIBCO Software > wdunlap tibco.com > > On Wed, Jul 15, 2015 at 12:35 PM, Dayne Filer wrote: > >> In 3.1.2 eval does not store the result of the bquote-generated call in >> the given environment. Interestingly, in 3.2.1 eval does store the result >> of the bquote-generated call in the given environment. >> >> In other words if I run the given example with eval rather than evalq, on >> 3.1.2 "x" is never stored in "fenv," but it is when I run the same code on >> 3.2.1. However, the given example stores "x" in "fenv" on 3.1.2, but throws >> the error I gave when run on 3.2.1. >> >> To give credit, I received the idea for using evalq from SO: >> http://stackoverflow.com/a/22559385 >> >> Dayne >> >> >> On Wed, Jul 15, 2015 at 3:29 PM, William Dunlap wrote: >> >>> I am curious why you used evalq instead of eval in this code. 
>>> >>> Bill Dunlap >>> TIBCO Software >>> wdunlap tibco.com >>> >>> On Wed, Jul 15, 2015 at 11:49 AM, Dayne Filer >>> wrote: >>> >>>> Hello, >>>> >>>> I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that >>>> worked >>>> as I intended previously. Briefly, I am using bquote to generate >>>> expressions to modify data.table objects within a function, so I need >>>> the >>>> changes to actually be stored in the given environment. Previously, I >>>> used >>>> code like the following: >>>> >>>> test <- list(bquote(x <- 10)) >>>> fenv <- environment() >>>> rapply(test, evalq, envir = fenv) >>>> >>>> Although the code in the example above is much simpler, it shows the >>>> problem. On 3.1.2 the expression is evaluated and x is stored as 10 in >>>> the >>>> given environment, fenv. In 3.2.1 the code throws an error: >>>> >>>> Error in eval(substitute(expr), envir, enclos) : object 'X' not found >>>> >>>> I could not find anything in the release notes that would explain this >>>> change. Changing evalq to eval works in 3.2.1, but eval does not store x >>>> in >>>> the given environment in 3.1.2. 
> >>>>
> >>>> Thanks,
> >>>>
> >>>> Dayne
> >>>>
> >>>> [[alternative HTML version deleted]]
> >>>>
> >>>> ______________________________________________
> >>>> R-devel at r-project.org mailing list
> >>>> https://stat.ethz.ch/mailman/listinfo/r-devel
> >>>>
> >>>
> >>>
> >>
> >
> > [[alternative HTML version deleted]]
> >
> > ______________________________________________
> > R-devel at r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel

David Winsemius
Alameda, CA, USA

From wdunlap at tibco.com  Wed Jul 15 22:40:50 2015
From: wdunlap at tibco.com (William Dunlap)
Date: Wed, 15 Jul 2015 13:40:50 -0700
Subject: [Rd] bquote/evalq behavior changed in R-3.2.1
In-Reply-To: 
References: 
Message-ID: 

Another aspect of the change is (using TERR's RinR package):

> options(REvaluators=list(makeREvaluator("R-3.1.3"),
    makeREvaluator("R-3.2.0")))
> RCompare(rapply(list(quote(function(x)x),list(quote(pi),quote(7-4))),
    function(arg)typeof(arg)))
     R version 3.1.3 (2015-03-09)   R version 3.2.0 (2015-04-16)
[1,] [1] "closure" "double"         [1] "language" "symbol"
[2,] [3] "double"                   [3] "language"
$all.equal
$all.equal$`R version 3.1.3 (2015-03-09) vs. R version 3.2.0 (2015-04-16)`
[1] "3 string mismatches"

I prefer the new semantics, but it is a change.

Bill Dunlap
TIBCO Software
wdunlap tibco.com

On Wed, Jul 15, 2015 at 1:25 PM, David Winsemius wrote:

>
> On Jul 15, 2015, at 12:51 PM, William Dunlap wrote:
>
> > I think rapply() was changed to act like lapply() in this respect.
> >
>
> When I looked at the source of the difference, it was that typeof()
> returned 'language' in 3.2.1, while it returned 'list' in the earlier
> version of R.
The first check in rapply's code in both version was: > > if (typeof(object) != "list") > stop("'object' must be a list") > > Wrapping list() around the first argument and switching to using eval with > an expression-object rather than a call-object seemed to solve the problem > when this was posed as a question on StackOverflow, but Dayne was not happy > with that solution for other reasons that he is not describing. > > -- > David. > > > In R-3.1.3 we got > > rapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17))) > > #[1] 18 > > rapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) > > #Error in (function (expr, envir = parent.frame(), enclos = if > > (is.list(envir) || : > > object 'myNumber' not found > > lapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17))) > > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found > > lapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) > > #[[1]] > > #[1] 18 > > while in R-3.2.0 we get > > rapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17))) > > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found > > rapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) > > #[1] 18 > > lapply(list(quote(1+myNumber)), evalq, envir=list2env(list(myNumber=17))) > > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found > > lapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) > > #[[1]] > > #[1] 18 > > > > Make the FUN argument function(arg)sys.call() to see some details of the > > change. > > > > > > Bill Dunlap > > TIBCO Software > > wdunlap tibco.com > > > > On Wed, Jul 15, 2015 at 12:35 PM, Dayne Filer > wrote: > > > >> In 3.1.2 eval does not store the result of the bquote-generated call in > >> the given environment. Interestingly, in 3.2.1 eval does store the > result > >> of the bquote-generated call in the given environment. 
> >> > >> In other words if I run the given example with eval rather than evalq, > on > >> 3.1.2 "x" is never stored in "fenv," but it is when I run the same code > on > >> 3.2.1. However, the given example stores "x" in "fenv" on 3.1.2, but > throws > >> the error I gave when run on 3.2.1. > >> > >> To give credit, I received the idea for using evalq from SO: > >> http://stackoverflow.com/a/22559385 > >> > >> Dayne > >> > >> > >> On Wed, Jul 15, 2015 at 3:29 PM, William Dunlap > wrote: > >> > >>> I am curious why you used evalq instead of eval in this code. > >>> > >>> Bill Dunlap > >>> TIBCO Software > >>> wdunlap tibco.com > >>> > >>> On Wed, Jul 15, 2015 at 11:49 AM, Dayne Filer > >>> wrote: > >>> > >>>> Hello, > >>>> > >>>> I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that > >>>> worked > >>>> as I intended previously. Briefly, I am using bquote to generate > >>>> expressions to modify data.table objects within a function, so I need > >>>> the > >>>> changes to actually be stored in the given environment. Previously, I > >>>> used > >>>> code like the following: > >>>> > >>>> test <- list(bquote(x <- 10)) > >>>> fenv <- environment() > >>>> rapply(test, evalq, envir = fenv) > >>>> > >>>> Although the code in the example above is much simpler, it shows the > >>>> problem. On 3.1.2 the expression is evaluated and x is stored as 10 in > >>>> the > >>>> given environment, fenv. In 3.2.1 the code throws an error: > >>>> > >>>> Error in eval(substitute(expr), envir, enclos) : object 'X' not found > >>>> > >>>> I could not find anything in the release notes that would explain this > >>>> change. Changing evalq to eval works in 3.2.1, but eval does not > store x > >>>> in > >>>> the given environment in 3.1.2. 
> >>>> > >>>> Thanks, > >>>> > >>>> Dayne > >>>> > >>>> [[alternative HTML version deleted]] > >>>> > >>>> ______________________________________________ > >>>> R-devel at r-project.org mailing list > >>>> https://stat.ethz.ch/mailman/listinfo/r-devel > >>>> > >>> > >>> > >> > > > > [[alternative HTML version deleted]] > > > > ______________________________________________ > > R-devel at r-project.org mailing list > > https://stat.ethz.ch/mailman/listinfo/r-devel > > David Winsemius > Alameda, CA, USA > > [[alternative HTML version deleted]] From dayne.filer at gmail.com Wed Jul 15 22:44:25 2015 From: dayne.filer at gmail.com (Dayne Filer) Date: Wed, 15 Jul 2015 16:44:25 -0400 Subject: [Rd] bquote/evalq behavior changed in R-3.2.1 In-Reply-To: References: Message-ID: David, If you are referring to the solution that would be: rapply(list(test), eval, envir = fenv) I thought I explained in the question that the above code does not work. It does not throw an error, but the behavior is no different (at least in the output or result). Using the above code still results in the x object not being stored in fenv on 3.1.2. Dayne On Wed, Jul 15, 2015 at 4:40 PM, William Dunlap wrote: > Another aspect of the change is (using TERR's RinR package): > > options(REvaluators=list(makeREvaluator("R-3.1.3"), > makeREvaluator("R-3.2.0"))) > > RCompare(rapply(list(quote(function(x)x),list(quote(pi),quote(7-4))), > function(arg)typeof(arg))) > R version 3.1.3 (2015-03-09) R version 3.2.0 (2015-04-16) > [1,] [1] "closure" "double" [1] "language" "symbol" > [2,] [3] "double" [3] "language" > $all.equal > $all.equal$`R version 3.1.3 (2015-03-09) vs. R version 3.2.0 > (2015-04-16)` > [1] "3 string mismatches" > > I prefer the new semantics, but it is a change. 
> > > > Bill Dunlap > TIBCO Software > wdunlap tibco.com > > On Wed, Jul 15, 2015 at 1:25 PM, David Winsemius > wrote: > >> >> On Jul 15, 2015, at 12:51 PM, William Dunlap wrote: >> >> > I think rapply() was changed to act like lapply() in this respect. >> > >> >> When I looked at the source of the difference, it was that typeof() >> returned 'language' in 3.2.1, while it returned 'list' in the earlier >> version of R. The first check in rapply's code in both version was: >> >> if (typeof(object) != "list") >> stop("'object' must be a list") >> >> Wrapping list() around the first argument and switching to using eval >> with an expression-object rather than a call-object seemed to solve the >> problem when this was posed as a question on StackOverflow, but Dayne was >> not happy with that solution for other reasons that he is not describing. >> >> -- >> David. >> >> > In R-3.1.3 we got >> > rapply(list(quote(1+myNumber)), evalq, >> envir=list2env(list(myNumber=17))) >> > #[1] 18 >> > rapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) >> > #Error in (function (expr, envir = parent.frame(), enclos = if >> > (is.list(envir) || : >> > object 'myNumber' not found >> > lapply(list(quote(1+myNumber)), evalq, >> envir=list2env(list(myNumber=17))) >> > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found >> > lapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) >> > #[[1]] >> > #[1] 18 >> > while in R-3.2.0 we get >> > rapply(list(quote(1+myNumber)), evalq, >> envir=list2env(list(myNumber=17))) >> > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found >> > rapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) >> > #[1] 18 >> > lapply(list(quote(1+myNumber)), evalq, >> envir=list2env(list(myNumber=17))) >> > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found >> > lapply(list(quote(1+myNumber)), eval, envir=list2env(list(myNumber=17))) >> > #[[1]] >> > #[1] 18 
>> > >> > Make the FUN argument function(arg)sys.call() to see some details of the >> > change. >> > >> > >> > Bill Dunlap >> > TIBCO Software >> > wdunlap tibco.com >> > >> > On Wed, Jul 15, 2015 at 12:35 PM, Dayne Filer >> wrote: >> > >> >> In 3.1.2 eval does not store the result of the bquote-generated call in >> >> the given environment. Interestingly, in 3.2.1 eval does store the >> result >> >> of the bquote-generated call in the given environment. >> >> >> >> In other words if I run the given example with eval rather than evalq, >> on >> >> 3.1.2 "x" is never stored in "fenv," but it is when I run the same >> code on >> >> 3.2.1. However, the given example stores "x" in "fenv" on 3.1.2, but >> throws >> >> the error I gave when run on 3.2.1. >> >> >> >> To give credit, I received the idea for using evalq from SO: >> >> http://stackoverflow.com/a/22559385 >> >> >> >> Dayne >> >> >> >> >> >> On Wed, Jul 15, 2015 at 3:29 PM, William Dunlap >> wrote: >> >> >> >>> I am curious why you used evalq instead of eval in this code. >> >>> >> >>> Bill Dunlap >> >>> TIBCO Software >> >>> wdunlap tibco.com >> >>> >> >>> On Wed, Jul 15, 2015 at 11:49 AM, Dayne Filer >> >>> wrote: >> >>> >> >>>> Hello, >> >>>> >> >>>> I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that >> >>>> worked >> >>>> as I intended previously. Briefly, I am using bquote to generate >> >>>> expressions to modify data.table objects within a function, so I >> need >> >>>> the >> >>>> changes to actually be stored in the given environment. Previously, I >> >>>> used >> >>>> code like the following: >> >>>> >> >>>> test <- list(bquote(x <- 10)) >> >>>> fenv <- environment() >> >>>> rapply(test, evalq, envir = fenv) >> >>>> >> >>>> Although the code in the example above is much simpler, it shows the >> >>>> problem. On 3.1.2 the expression is evaluated and x is stored as 10 >> in >> >>>> the >> >>>> given environment, fenv. 
In 3.2.1 the code throws an error: >> >>>> >> >>>> Error in eval(substitute(expr), envir, enclos) : object 'X' not found >> >>>> >> >>>> I could not find anything in the release notes that would explain >> this >> >>>> change. Changing evalq to eval works in 3.2.1, but eval does not >> store x >> >>>> in >> >>>> the given environment in 3.1.2. >> >>>> >> >>>> Thanks, >> >>>> >> >>>> Dayne >> >>>> >> >>>> [[alternative HTML version deleted]] >> >>>> >> >>>> ______________________________________________ >> >>>> R-devel at r-project.org mailing list >> >>>> https://stat.ethz.ch/mailman/listinfo/r-devel >> >>>> >> >>> >> >>> >> >> >> > >> > [[alternative HTML version deleted]] >> > >> > ______________________________________________ >> > R-devel at r-project.org mailing list >> > https://stat.ethz.ch/mailman/listinfo/r-devel >> >> David Winsemius >> Alameda, CA, USA >> >> > [[alternative HTML version deleted]] From dayne.filer at gmail.com Wed Jul 15 22:46:53 2015 From: dayne.filer at gmail.com (Dayne Filer) Date: Wed, 15 Jul 2015 16:46:53 -0400 Subject: [Rd] bquote/evalq behavior changed in R-3.2.1 In-Reply-To: References: Message-ID: Bill, Is your conclusion to just update the code and enforce using the most recent version of R? Dayne On Wed, Jul 15, 2015 at 4:44 PM, Dayne Filer wrote: > David, > > If you are referring to the solution that would be: > > rapply(list(test), eval, envir = fenv) > > I thought I explained in the question that the above code does not work. > It does not throw an error, but the behavior is no different (at least in > the output or result). Using the above code still results in the x object > not being stored in fenv on 3.1.2. 
> > Dayne > > On Wed, Jul 15, 2015 at 4:40 PM, William Dunlap wrote: > >> Another aspect of the change is (using TERR's RinR package): >> > options(REvaluators=list(makeREvaluator("R-3.1.3"), >> makeREvaluator("R-3.2.0"))) >> > RCompare(rapply(list(quote(function(x)x),list(quote(pi),quote(7-4))), >> function(arg)typeof(arg))) >> R version 3.1.3 (2015-03-09) R version 3.2.0 (2015-04-16) >> [1,] [1] "closure" "double" [1] "language" "symbol" >> [2,] [3] "double" [3] "language" >> $all.equal >> $all.equal$`R version 3.1.3 (2015-03-09) vs. R version 3.2.0 >> (2015-04-16)` >> [1] "3 string mismatches" >> >> I prefer the new semantics, but it is a change. >> >> >> >> Bill Dunlap >> TIBCO Software >> wdunlap tibco.com >> >> On Wed, Jul 15, 2015 at 1:25 PM, David Winsemius >> wrote: >> >>> >>> On Jul 15, 2015, at 12:51 PM, William Dunlap wrote: >>> >>> > I think rapply() was changed to act like lapply() in this respect. >>> > >>> >>> When I looked at the source of the difference, it was that typeof() >>> returned 'language' in 3.2.1, while it returned 'list' in the earlier >>> version of R. The first check in rapply's code in both version was: >>> >>> if (typeof(object) != "list") >>> stop("'object' must be a list") >>> >>> Wrapping list() around the first argument and switching to using eval >>> with an expression-object rather than a call-object seemed to solve the >>> problem when this was posed as a question on StackOverflow, but Dayne was >>> not happy with that solution for other reasons that he is not describing. >>> >>> -- >>> David. 
>>> >>> > In R-3.1.3 we got >>> > rapply(list(quote(1+myNumber)), evalq, >>> envir=list2env(list(myNumber=17))) >>> > #[1] 18 >>> > rapply(list(quote(1+myNumber)), eval, >>> envir=list2env(list(myNumber=17))) >>> > #Error in (function (expr, envir = parent.frame(), enclos = if >>> > (is.list(envir) || : >>> > object 'myNumber' not found >>> > lapply(list(quote(1+myNumber)), evalq, >>> envir=list2env(list(myNumber=17))) >>> > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found >>> > lapply(list(quote(1+myNumber)), eval, >>> envir=list2env(list(myNumber=17))) >>> > #[[1]] >>> > #[1] 18 >>> > while in R-3.2.0 we get >>> > rapply(list(quote(1+myNumber)), evalq, >>> envir=list2env(list(myNumber=17))) >>> > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found >>> > rapply(list(quote(1+myNumber)), eval, >>> envir=list2env(list(myNumber=17))) >>> > #[1] 18 >>> > lapply(list(quote(1+myNumber)), evalq, >>> envir=list2env(list(myNumber=17))) >>> > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found >>> > lapply(list(quote(1+myNumber)), eval, >>> envir=list2env(list(myNumber=17))) >>> > #[[1]] >>> > #[1] 18 >>> > >>> > Make the FUN argument function(arg)sys.call() to see some details of >>> the >>> > change. >>> > >>> > >>> > Bill Dunlap >>> > TIBCO Software >>> > wdunlap tibco.com >>> > >>> > On Wed, Jul 15, 2015 at 12:35 PM, Dayne Filer >>> wrote: >>> > >>> >> In 3.1.2 eval does not store the result of the bquote-generated call >>> in >>> >> the given environment. Interestingly, in 3.2.1 eval does store the >>> result >>> >> of the bquote-generated call in the given environment. >>> >> >>> >> In other words if I run the given example with eval rather than >>> evalq, on >>> >> 3.1.2 "x" is never stored in "fenv," but it is when I run the same >>> code on >>> >> 3.2.1. However, the given example stores "x" in "fenv" on 3.1.2, but >>> throws >>> >> the error I gave when run on 3.2.1. 
>>> >> >>> >> To give credit, I received the idea for using evalq from SO: >>> >> http://stackoverflow.com/a/22559385 >>> >> >>> >> Dayne >>> >> >>> >> >>> >> On Wed, Jul 15, 2015 at 3:29 PM, William Dunlap >>> wrote: >>> >> >>> >>> I am curious why you used evalq instead of eval in this code. >>> >>> >>> >>> Bill Dunlap >>> >>> TIBCO Software >>> >>> wdunlap tibco.com >>> >>> >>> >>> On Wed, Jul 15, 2015 at 11:49 AM, Dayne Filer >> > >>> >>> wrote: >>> >>> >>> >>>> Hello, >>> >>>> >>> >>>> I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that >>> >>>> worked >>> >>>> as I intended previously. Briefly, I am using bquote to generate >>> >>>> expressions to modify data.table objects within a function, so I >>> need >>> >>>> the >>> >>>> changes to actually be stored in the given environment. Previously, >>> I >>> >>>> used >>> >>>> code like the following: >>> >>>> >>> >>>> test <- list(bquote(x <- 10)) >>> >>>> fenv <- environment() >>> >>>> rapply(test, evalq, envir = fenv) >>> >>>> >>> >>>> Although the code in the example above is much simpler, it shows the >>> >>>> problem. On 3.1.2 the expression is evaluated and x is stored as 10 >>> in >>> >>>> the >>> >>>> given environment, fenv. In 3.2.1 the code throws an error: >>> >>>> >>> >>>> Error in eval(substitute(expr), envir, enclos) : object 'X' not >>> found >>> >>>> >>> >>>> I could not find anything in the release notes that would explain >>> this >>> >>>> change. Changing evalq to eval works in 3.2.1, but eval does not >>> store x >>> >>>> in >>> >>>> the given environment in 3.1.2. 
>>> >>>>
>>> >>>> Thanks,
>>> >>>>
>>> >>>> Dayne
>>> >>>>
>>> >>>> [[alternative HTML version deleted]]
>>> >>>>
>>> >>>> ______________________________________________
>>> >>>> R-devel at r-project.org mailing list
>>> >>>> https://stat.ethz.ch/mailman/listinfo/r-devel
>>> >>>>
>>> >>>
>>> >>>
>>> >>
>>> >
>>> > [[alternative HTML version deleted]]
>>> >
>>> > ______________________________________________
>>> > R-devel at r-project.org mailing list
>>> > https://stat.ethz.ch/mailman/listinfo/r-devel
>>>
>>> David Winsemius
>>> Alameda, CA, USA
>>>
>>
> [[alternative HTML version deleted]]

From wdunlap at tibco.com  Wed Jul 15 23:05:25 2015
From: wdunlap at tibco.com (William Dunlap)
Date: Wed, 15 Jul 2015 14:05:25 -0700
Subject: [Rd] bquote/evalq behavior changed in R-3.2.1
In-Reply-To: 
References: 
Message-ID: 

You could test for the version of R when using rapply.

> getRversion() >= "3.2.0"
[1] TRUE

I rarely use rapply(). I often find that writing my own purpose-built
recursive function is easier than fitting my problem into rapply's
framework.

Bill Dunlap
TIBCO Software
wdunlap tibco.com

On Wed, Jul 15, 2015 at 1:46 PM, Dayne Filer wrote:

> Bill,
>
> Is your conclusion to just update the code and enforce using the most
> recent version of R?
>
> Dayne
>
> On Wed, Jul 15, 2015 at 4:44 PM, Dayne Filer
> wrote:
>
>> David,
>>
>> If you are referring to the solution that would be:
>>
>> rapply(list(test), eval, envir = fenv)
>>
>> I thought I explained in the question that the above code does not work.
>> It does not throw an error, but the behavior is no different (at least in
>> the output or result). Using the above code still results in the x object
>> not being stored in fenv on 3.1.2.
>> >> Dayne >> >> On Wed, Jul 15, 2015 at 4:40 PM, William Dunlap >> wrote: >> >>> Another aspect of the change is (using TERR's RinR package): >>> > options(REvaluators=list(makeREvaluator("R-3.1.3"), >>> makeREvaluator("R-3.2.0"))) >>> > RCompare(rapply(list(quote(function(x)x),list(quote(pi),quote(7-4))), >>> function(arg)typeof(arg))) >>> R version 3.1.3 (2015-03-09) R version 3.2.0 (2015-04-16) >>> [1,] [1] "closure" "double" [1] "language" "symbol" >>> [2,] [3] "double" [3] "language" >>> $all.equal >>> $all.equal$`R version 3.1.3 (2015-03-09) vs. R version 3.2.0 >>> (2015-04-16)` >>> [1] "3 string mismatches" >>> >>> I prefer the new semantics, but it is a change. >>> >>> >>> >>> Bill Dunlap >>> TIBCO Software >>> wdunlap tibco.com >>> >>> On Wed, Jul 15, 2015 at 1:25 PM, David Winsemius >> > wrote: >>> >>>> >>>> On Jul 15, 2015, at 12:51 PM, William Dunlap wrote: >>>> >>>> > I think rapply() was changed to act like lapply() in this respect. >>>> > >>>> >>>> When I looked at the source of the difference, it was that typeof() >>>> returned 'language' in 3.2.1, while it returned 'list' in the earlier >>>> version of R. The first check in rapply's code in both version was: >>>> >>>> if (typeof(object) != "list") >>>> stop("'object' must be a list") >>>> >>>> Wrapping list() around the first argument and switching to using eval >>>> with an expression-object rather than a call-object seemed to solve the >>>> problem when this was posed as a question on StackOverflow, but Dayne was >>>> not happy with that solution for other reasons that he is not describing. >>>> >>>> -- >>>> David. 
>>>> >>>> > In R-3.1.3 we got >>>> > rapply(list(quote(1+myNumber)), evalq, >>>> envir=list2env(list(myNumber=17))) >>>> > #[1] 18 >>>> > rapply(list(quote(1+myNumber)), eval, >>>> envir=list2env(list(myNumber=17))) >>>> > #Error in (function (expr, envir = parent.frame(), enclos = if >>>> > (is.list(envir) || : >>>> > object 'myNumber' not found >>>> > lapply(list(quote(1+myNumber)), evalq, >>>> envir=list2env(list(myNumber=17))) >>>> > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found >>>> > lapply(list(quote(1+myNumber)), eval, >>>> envir=list2env(list(myNumber=17))) >>>> > #[[1]] >>>> > #[1] 18 >>>> > while in R-3.2.0 we get >>>> > rapply(list(quote(1+myNumber)), evalq, >>>> envir=list2env(list(myNumber=17))) >>>> > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found >>>> > rapply(list(quote(1+myNumber)), eval, >>>> envir=list2env(list(myNumber=17))) >>>> > #[1] 18 >>>> > lapply(list(quote(1+myNumber)), evalq, >>>> envir=list2env(list(myNumber=17))) >>>> > #Error in eval(substitute(expr), envir, enclos) : object 'X' not found >>>> > lapply(list(quote(1+myNumber)), eval, >>>> envir=list2env(list(myNumber=17))) >>>> > #[[1]] >>>> > #[1] 18 >>>> > >>>> > Make the FUN argument function(arg)sys.call() to see some details of >>>> the >>>> > change. >>>> > >>>> > >>>> > Bill Dunlap >>>> > TIBCO Software >>>> > wdunlap tibco.com >>>> > >>>> > On Wed, Jul 15, 2015 at 12:35 PM, Dayne Filer >>>> wrote: >>>> > >>>> >> In 3.1.2 eval does not store the result of the bquote-generated call >>>> in >>>> >> the given environment. Interestingly, in 3.2.1 eval does store the >>>> result >>>> >> of the bquote-generated call in the given environment. >>>> >> >>>> >> In other words if I run the given example with eval rather than >>>> evalq, on >>>> >> 3.1.2 "x" is never stored in "fenv," but it is when I run the same >>>> code on >>>> >> 3.2.1. 
However, the given example stores "x" in "fenv" on 3.1.2, but >>>> throws >>>> >> the error I gave when run on 3.2.1. >>>> >> >>>> >> To give credit, I received the idea for using evalq from SO: >>>> >> http://stackoverflow.com/a/22559385 >>>> >> >>>> >> Dayne >>>> >> >>>> >> >>>> >> On Wed, Jul 15, 2015 at 3:29 PM, William Dunlap >>>> wrote: >>>> >> >>>> >>> I am curious why you used evalq instead of eval in this code. >>>> >>> >>>> >>> Bill Dunlap >>>> >>> TIBCO Software >>>> >>> wdunlap tibco.com >>>> >>> >>>> >>> On Wed, Jul 15, 2015 at 11:49 AM, Dayne Filer < >>>> dayne.filer at gmail.com> >>>> >>> wrote: >>>> >>> >>>> >>>> Hello, >>>> >>>> >>>> >>>> I upgraded from 3.1.2 to 3.2.1 and am receiving errors on code that >>>> >>>> worked >>>> >>>> as I intended previously. Briefly, I am using bquote to generate >>>> >>>> expressions to modify data.table objects within a function, so I >>>> need >>>> >>>> the >>>> >>>> changes to actually be stored in the given environment. >>>> Previously, I >>>> >>>> used >>>> >>>> code like the following: >>>> >>>> >>>> >>>> test <- list(bquote(x <- 10)) >>>> >>>> fenv <- environment() >>>> >>>> rapply(test, evalq, envir = fenv) >>>> >>>> >>>> >>>> Although the code in the example above is much simpler, it shows >>>> the >>>> >>>> problem. On 3.1.2 the expression is evaluated and x is stored as >>>> 10 in >>>> >>>> the >>>> >>>> given environment, fenv. In 3.2.1 the code throws an error: >>>> >>>> >>>> >>>> Error in eval(substitute(expr), envir, enclos) : object 'X' not >>>> found >>>> >>>> >>>> >>>> I could not find anything in the release notes that would explain >>>> this >>>> >>>> change. Changing evalq to eval works in 3.2.1, but eval does not >>>> store x >>>> >>>> in >>>> >>>> the given environment in 3.1.2. 
>>>> >>>> >>>> >>>> Thanks, >>>> >>>> >>>> >>>> Dayne >>>> >>>> >>>> >>>> [[alternative HTML version deleted]] >>>> >>>> >>>> >>>> ______________________________________________ >>>> >>>> R-devel at r-project.org mailing list >>>> >>>> https://stat.ethz.ch/mailman/listinfo/r-devel >>>> >>>> >>>> >>> >>>> >>> >>>> >> >>>> > >>>> > [[alternative HTML version deleted]] >>>> > >>>> > ______________________________________________ >>>> > R-devel at r-project.org mailing list >>>> > https://stat.ethz.ch/mailman/listinfo/r-devel >>>> >>>> David Winsemius >>>> Alameda, CA, USA >>>> >>>> >>> >> > [[alternative HTML version deleted]] From csardi.gabor at gmail.com Thu Jul 16 13:18:00 2015 From: csardi.gabor at gmail.com (=?UTF-8?B?R8OhYm9yIENzw6FyZGk=?=) Date: Thu, 16 Jul 2015 07:18:00 -0400 Subject: [Rd] Building r-devel fails on Ubuntu (old and new as well) Message-ID: ... making array.d from array.c array.c:33:23: fatal error: duplicate.h: No such file or directory compilation terminated. ... https://travis-ci.org/metacran/r-builder/builds/71225331#L7405 fyi. Gabor From maechler at stat.math.ethz.ch Thu Jul 16 14:08:17 2015 From: maechler at stat.math.ethz.ch (Martin Maechler) Date: Thu, 16 Jul 2015 14:08:17 +0200 Subject: [Rd] Building r-devel fails on Ubuntu (old and new as well) In-Reply-To: References: Message-ID: <21927.40625.106548.815652@stat.math.ethz.ch> >>>>> G?bor Cs?rdi >>>>> on Thu, 16 Jul 2015 07:18:00 -0400 writes: > ... > making array.d from array.c > array.c:33:23: fatal error: duplicate.h: No such file or directory > compilation terminated. > ... > https://travis-ci.org/metacran/r-builder/builds/71225331#L7405 Yes. That's what R-devel is: "in development" and so, human errors do happen. ... and are corrected. ... already more than half an hour ago. No need to trumpet to the world... I think (R-core would have been sufficient (and unnecessary this time)). 
Martin From maechler at stat.math.ethz.ch Fri Jul 17 18:00:28 2015 From: maechler at stat.math.ethz.ch (Martin Maechler) Date: Fri, 17 Jul 2015 18:00:28 +0200 Subject: [Rd] Improvements (?) in stats::poly and stats::polym. In-Reply-To: <5F22AFBADFE10342ABECF0281DE992181824CEF3@EXCH001.campden.co.uk> References: <5F22AFBADFE10342ABECF0281DE992181824CEF3@EXCH001.campden.co.uk> Message-ID: <21929.9884.443290.579419@stat.math.ethz.ch> Dear Keith, >>>>> >>>>> on Thu, 16 Jul 2015 08:58:11 +0000 writes: > Dear R Core Team, > Last week I made a post to the R-help mailing list > 'predict.poly for multivariate data' > > but it has had no responses so I'm sending this to the > email address of package:stats maintainer. Please feel > free to tell me that this is inappropriate. Asking R Core in your case is ok ... { though still slightly "sub optimal" (but *not* "inappropriate"!): Ideally you'd have followed the posting guide (http://www.r-project.org/posting-guide.html) here, namely to send your original post to R-devel instead of R-help. Then it would have been noticed by me and most probably several other R core members ... } > IMHO the reproducible code I presented in that post: > ############# > library(datasets) > alm <- lm(stack.loss ~ poly(Air.Flow, Water.Temp, degree=3), stackloss) > alm$fitted.values[1:10] # "correct" prediction values [1:10] > predict(alm, stackloss)[1:10] # gives correct values > predict(alm, stackloss[1:10,]) # gives wrong values > ######### > ... clearly demonstrates something wrong, the two predicts should not differ. > I hesitate to call it a bug, it might be viewed as inappropriate usage. But it's easy to get wrong answers, fairly small changes to poly and polym correct the wrongness, and I think the changes are backwards compatible. Perhaps appending the altered codes made the R-help post too long for easy comprehension, I attach them to this email. Thank you! 
I had started to look at your R-help post and noticed that you changed the *printout* of the R functions, instead of the source. The current development version of that part of the R source code is always at https://svn.r-project.org/R/trunk/src/library/stats/R/contr.poly.R and if you look carefully, you see that there are comments in the sources that are lost in the process (of parsing, byte-compiling, saving in binary, ....), but never mind: you've marked your changes well and I can use your version to modify the sources. From what I've understood, the changes make much sense and look good; and if no problem surfaces should make it into R - with an acknowledgement to you, of course. Thank you, indeed, Martin ------- Part 2: > While I'm writing, for didactic purposes I've sometimes > wanted to express the orthogonal polynomials in the > conventional form: > b0.x^0 + b1.x^1 + b2.x^2 + ... > ... rather than in terms of " the centering and > normalization constants used in constructing the > orthogonal polynomials". > Over the years in the various R forums others have made > similar requests to be told that one can: > (a) calculate the polynomials using poly.predict(), so the > conventional form coefficients aren't needed; > (b) see the algorithm in the code and/or Kennedy & Gentle (1980, pp. 343-4). > (a) doesn't meet my didactic needs. > With respect to (b) I could see how to calculate polynomials for given x value but I just got lost in the algebra trying to deduce the conventional form coefficients :-{ > Kennedy & Gentle refer to "solving for x in p(x)" which to > my simple mind suggested `lm' and led to the truly > horrible approach implemented in the attached > poly-orth2raw.R. > I fully accept that there must be a better, more direct, > way to deduce the conventional form coefficients from the > centering and normalisation constants but I can't work it > out, and this approach seems to work. 
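For what it's worth, a more direct route than an lm() fit is possible: each column of an orthogonal polynomial basis is itself a polynomial of degree <= d, so evaluating the basis at d+1 distinct points and solving a small Vandermonde system recovers the conventional-form coefficients exactly. A sketch (`orth2raw_vand` is an illustrative name, not the deleted attachment):

```r
## Sketch: recover b0 + b1*x + ... + bd*x^d coefficients for each column
## of poly(x, d) by solving a (d+1) x (d+1) Vandermonde system.
orth2raw_vand <- function(pol) {
  d  <- ncol(pol)
  xs <- seq(0, 1, length.out = d + 1)   # any d+1 distinct points work
  V  <- outer(xs, 0:d, `^`)             # Vandermonde matrix
  solve(V, predict(pol, xs))            # column j: coefs of basis poly j
}

pol <- poly(1:10, degree = 3)
cf  <- orth2raw_vand(pol)
## check: reconstructed polynomials agree with predict() at a new point
max(abs(outer(4.321, 0:3, `^`) %*% cf - predict(pol, 4.321)))  # effectively zero
```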
I'm not really commenting on the above issue, and notably your orth2raw implementation at all. My vague recollection would be that indeed, it has been sometimes desirable, if only for didactical reasons, to better explain the meaning of the orthogonal polynomial basis. OTOH, notably if you think of high degree polynomials (10 already may be "high" here), it can even be "dangerous" to hand down the coefficients wrt the (1, x, x^2, .., x^p) basis to the end user, because even using them in prediction may be numerically quite unstable {but then one, me included, would argue that using degree 10 polynomials is typically nonsense and should be replaced by using regression/smoothing splines}... > With best regards, > Keith Jewell - Head of Statistics Group > Campden BRI Group > Tel + 44(0)1386 842055 > Email keith.jewell at campdenbri.co.uk > Web www.campdenbri.co.uk > Site Station Road, Chipping Campden, Gloucestershire, GL55 6LD, UK > [............ ca 20 lines of legal gibberish + anti-virus ad .. ] > ____________________________________________________________ > x[DELETED ATTACHMENT poly.R, Untyped binary data] > x[DELETED ATTACHMENT polym.R, Untyped binary data] > x[DELETED ATTACHMENT orth2raw.R, Untyped binary data] From ripley at stats.ox.ac.uk Sat Jul 18 10:09:09 2015 From: ripley at stats.ox.ac.uk (Prof Brian Ripley) Date: Sat, 18 Jul 2015 09:09:09 +0100 Subject: [Rd] Use cairo fallback resolution greater than 72dpi in cairo_pdf and cairo_ps in grDevices In-Reply-To: <37EFC97028F3E44082ACC5CBEC00563011547D07@ICTS-S-MBX13.luna.kuleuven.be> References: <37EFC97028F3E44082ACC5CBEC00563011547D07@ICTS-S-MBX13.luna.kuleuven.be> Message-ID: <55AA09A5.3060808@stats.ox.ac.uk> The 'at a minimum' information requested by the posting guide is missing. According to their documentation the cairo default fallback resolution is now 300dpi, and when I run your example on Fedora 21 that is what the emitted postscript says it is. 
You can easily alter the R code to set it to something different: see http://cairographics.org/manual/cairo-cairo-surface-t.html for the call you would need to add. However, I would suggest that you generate a bitmap directly and use that, as PostScript does not support semi-transparency. On 14/07/2015 12:03, Tom Wenseleers wrote: > Dear all, > In grDevices R functions cairo_pdf and cairo_ps it is mentioned that when transparency (alpha channels) are used in vector output, it will rasterize the PDF or postscript exported graph at a resolution of 72 dpi : https://stat.ethz.ch/R-manual/R-devel/library/grDevices/html/cairo.html > > You can see the problem if you try > > library(ggplot2) > cairo_ps(file = "test.eps",onefile = FALSE) > qplot(Sepal.Length, Petal.Length, data = iris, color = Species, size = Petal.Width, alpha = I(0.7)) > dev.off() > > as in the output (here zoomed in) the plot symbols are heavily pixelated then, showing it is only using 72 dpi. > > I was wondering how the fallback resolution could be increased to 600 dpi? In library(RGtk2) there is a command cairoSurfaceSetFallbackResolution, which I think is what is relevant here, but I would not know how to make grDevices use that parameter. Any thoughts? > > Using postscript() btw also doesn't work, since that doesn't support transparency, and returns the error "semi-transparency is not supported on this device: reported only once per page". > > > I know I can use svg or pdf instead and that this would solve the problem, but problem is the journal (PloS) I would like to submit to only accepts EPS. So is there any solution to increase the quality in EPS, without having to rasterize everything to PNG? > > best regards, > Tom Wenseleers > > > > > > > [[alternative HTML version deleted]] > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel > -- Brian D. 
Ripley, ripley at stats.ox.ac.uk Emeritus Professor of Applied Statistics, University of Oxford 1 South Parks Road, Oxford OX1 3TG, UK From joseclaudio.faria at gmail.com Sun Jul 19 06:05:06 2015 From: joseclaudio.faria at gmail.com (Jose Claudio Faria) Date: Sun, 19 Jul 2015 01:05:06 -0300 Subject: [Rd] update.packages(checkBuilt=TRUE, ask=FALSE): possible bug Message-ID: Hello, > version _ platform x86_64-w64-mingw32 arch x86_64 os mingw32 system x86_64, mingw32 status Patched major 3 minor 2.1 year 2015 month 07 day 16 svn rev 68681 language R version.string R version 3.2.1 Patched (2015-07-16 r68681) nickname World-Famous Astronaut I have the options below: options(pkgType='binary') options(install.packages.check.source='no') For some, but not allI repositories I get the error message below: Error in install.packages(update[instlib == l, "Package"], l, contriburl = contriburl, : specifying 'contriburl' or 'available' requires a single type, not type = "both" Is it a bug? Best, ///\\\///\\\///\\\///\\\///\\\///\\\///\\\///\\\ Jose Claudio Faria Estatistica UESC/DCET/Brasil joseclaudio.faria at gmail.com Telefones: 55(73)3680.5545 - UESC 55(73)9966.9100 - VIVO 55(73)9100.7351 - TIM 55(73)8817.6159 - OI 55(73)8129.9942 - CLARO ///\\\///\\\///\\\///\\\///\\\///\\\///\\\///\\\ [[alternative HTML version deleted]] From rhurlin at gwdg.de Sun Jul 19 09:42:53 2015 From: rhurlin at gwdg.de (Rainer Hurling) Date: Sun, 19 Jul 2015 09:42:53 +0200 Subject: [Rd] Building r-devel fails on Ubuntu (old and new as well) In-Reply-To: <21927.40625.106548.815652@stat.math.ethz.ch> References: <21927.40625.106548.815652@stat.math.ethz.ch> Message-ID: <55AB54FD.2010805@gwdg.de> Am 16.07.2015 um 14:08 schrieb Martin Maechler: >>>>>> G?bor Cs?rdi >>>>>> on Thu, 16 Jul 2015 07:18:00 -0400 writes: > > > ... > > making array.d from array.c > > array.c:33:23: fatal error: duplicate.h: No such file or directory > > compilation terminated. > > ... 
> > > https://travis-ci.org/metacran/r-builder/builds/71225331#L7405 > > Yes. That's what R-devel is: "in development" and so, human errors do happen. > ... and are corrected. ... already more than half an hour ago. Hmm, I'm a bit confused. Now we can find duplicate.h in trunk, but it still seems to be missing in the devel tarballs. (At least until R-devel_2015-07.18.tar.gz) > > No need to trumpet to the world... I think > (R-core would have been sufficient (and unnecessary this time)). ?? I also would have reported it on R-devel. Is this a special case with special handling? Sorry, when I have missed something. Best wishes, Rainer Hurling > > Martin From pdalgd at gmail.com Sun Jul 19 09:50:54 2015 From: pdalgd at gmail.com (peter dalgaard) Date: Sun, 19 Jul 2015 09:50:54 +0200 Subject: [Rd] update.packages(checkBuilt=TRUE, ask=FALSE): possible bug In-Reply-To: References: Message-ID: <21FF8325-EE50-480E-9729-E0440D6587EE@gmail.com> > On 19 Jul 2015, at 06:05 , Jose Claudio Faria wrote: > > Hello, > >> version > _ > platform x86_64-w64-mingw32 > arch x86_64 > os mingw32 > system x86_64, mingw32 > status Patched > major 3 > minor 2.1 > year 2015 > month 07 > day 16 > svn rev 68681 > language R > version.string R version 3.2.1 Patched (2015-07-16 r68681) > nickname World-Famous Astronaut > > I have the options below: > options(pkgType='binary') > options(install.packages.check.source='no') > > For some, but not allI repositories I get the error message below: > Error in install.packages(update[instlib == l, "Package"], l, contriburl = > contriburl, : > specifying 'contriburl' or 'available' requires a single type, not type = > "both" > > Is it a bug? Possibly, but I think you need to give a more reproducible example. In particular, which repositories and which packages? There is a fair amount of combinations to try... 
-- Peter Dalgaard, Professor, Center for Statistics, Copenhagen Business School Solbjerg Plads 3, 2000 Frederiksberg, Denmark Phone: (+45)38153501 Email: pd.mes at cbs.dk Priv: PDalgd at gmail.com From pdalgd at gmail.com Sun Jul 19 10:01:57 2015 From: pdalgd at gmail.com (peter dalgaard) Date: Sun, 19 Jul 2015 10:01:57 +0200 Subject: [Rd] Building r-devel fails on Ubuntu (old and new as well) In-Reply-To: <55AB54FD.2010805@gwdg.de> References: <21927.40625.106548.815652@stat.math.ethz.ch> <55AB54FD.2010805@gwdg.de> Message-ID: <2B983FE8-401B-4DE5-B833-35CCCDB92091@gmail.com> Looks like someone forgot to update the DISTFILES (specifically the HEADERS) in src/main/Makefile.in... -pd > On 19 Jul 2015, at 09:42 , Rainer Hurling wrote: > > Hmm, I'm a bit confused. Now we can find duplicate.h in trunk, but it > still seems to be missing in the devel tarballs. (At least until > R-devel_2015-07.18.tar.gz) -- Peter Dalgaard, Professor, Center for Statistics, Copenhagen Business School Solbjerg Plads 3, 2000 Frederiksberg, Denmark Phone: (+45)38153501 Email: pd.mes at cbs.dk Priv: PDalgd at gmail.com From pdalgd at gmail.com Sun Jul 19 10:19:25 2015 From: pdalgd at gmail.com (peter dalgaard) Date: Sun, 19 Jul 2015 10:19:25 +0200 Subject: [Rd] Building r-devel fails on Ubuntu (old and new as well) In-Reply-To: <2B983FE8-401B-4DE5-B833-35CCCDB92091@gmail.com> References: <21927.40625.106548.815652@stat.math.ethz.ch> <55AB54FD.2010805@gwdg.de> <2B983FE8-401B-4DE5-B833-35CCCDB92091@gmail.com> Message-ID: An updated tarball should be available in $CRAN/src/base/pre-release soon. (For CRAN=https://cran.r-project.org, immediately. Other mirrors need mirroring.) -pd > On 19 Jul 2015, at 10:01 , peter dalgaard wrote: > > Looks like someone forgot to update the DISTFILES (specifically the HEADERS) in src/main/Makefile.in... > > -pd > >> On 19 Jul 2015, at 09:42 , Rainer Hurling wrote: >> >> Hmm, I'm a bit confused. 
Now we can find duplicate.h in trunk, but it >> still seems to be missing in the devel tarballs. (At least until >> R-devel_2015-07.18.tar.gz) > > -- > Peter Dalgaard, Professor, > Center for Statistics, Copenhagen Business School > Solbjerg Plads 3, 2000 Frederiksberg, Denmark > Phone: (+45)38153501 > Email: pd.mes at cbs.dk Priv: PDalgd at gmail.com > > > > > > > > -- Peter Dalgaard, Professor, Center for Statistics, Copenhagen Business School Solbjerg Plads 3, 2000 Frederiksberg, Denmark Phone: (+45)38153501 Email: pd.mes at cbs.dk Priv: PDalgd at gmail.com From rhurlin at gwdg.de Sun Jul 19 10:28:14 2015 From: rhurlin at gwdg.de (Rainer Hurling) Date: Sun, 19 Jul 2015 10:28:14 +0200 Subject: [Rd] Building r-devel fails on Ubuntu (old and new as well) In-Reply-To: References: <21927.40625.106548.815652@stat.math.ethz.ch> <55AB54FD.2010805@gwdg.de> <2B983FE8-401B-4DE5-B833-35CCCDB92091@gmail.com> Message-ID: <55AB5F9E.5070308@gwdg.de> Am 19.07.2015 um 10:19 schrieb peter dalgaard: > An updated tarball should be available in $CRAN/src/base/pre-release soon. (For CRAN=https://cran.r-project.org, immediately. Other mirrors need mirroring.) Wow, that's fast! Many thanks for the update. Greetings, Rainer Hurling > > -pd > >> On 19 Jul 2015, at 10:01 , peter dalgaard wrote: >> >> Looks like someone forgot to update the DISTFILES (specifically the HEADERS) in src/main/Makefile.in... >> >> -pd >> >>> On 19 Jul 2015, at 09:42 , Rainer Hurling wrote: >>> >>> Hmm, I'm a bit confused. Now we can find duplicate.h in trunk, but it >>> still seems to be missing in the devel tarballs. 
(At least until >>> R-devel_2015-07.18.tar.gz) >> >> -- >> Peter Dalgaard, Professor, >> Center for Statistics, Copenhagen Business School >> Solbjerg Plads 3, 2000 Frederiksberg, Denmark >> Phone: (+45)38153501 >> Email: pd.mes at cbs.dk Priv: PDalgd at gmail.com From ripley at stats.ox.ac.uk Sun Jul 19 10:33:06 2015 From: ripley at stats.ox.ac.uk (Prof Brian Ripley) Date: Sun, 19 Jul 2015 09:33:06 +0100 Subject: [Rd] Use cairo fallback resolution greater than 72dpi in cairo_pdf and cairo_ps in grDevices In-Reply-To: <55AA09A5.3060808@stats.ox.ac.uk> References: <37EFC97028F3E44082ACC5CBEC00563011547D07@ICTS-S-MBX13.luna.kuleuven.be> <55AA09A5.3060808@stats.ox.ac.uk> Message-ID: <55AB60C2.8000301@stats.ox.ac.uk> On 18/07/2015 09:09, Prof Brian Ripley wrote: > The 'at a minimum' information requested by the posting guide is missing. > > According to their documentation the cairo default fallback resolution > is now 300dpi, and when I run your example on Fedora 21 that is what the > emitted postscript says it is. > > You can easily alter the R code to set it to something different: see > http://cairographics.org/manual/cairo-cairo-surface-t.html for the call > you would need to add. I've added the ability to set the fallback resolution in R-devel, mainly because it seems to be system-specific (I still recommend using a bitmap format such as PNG directly). > > However, I would suggest that you generate a bitmap directly and use > that, as PostScript does not support semi-transparency. 
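Following that advice, the original example can emit a high-resolution bitmap directly (a sketch; the 600 dpi value and filename are illustrative):

```r
## Render to PNG at 600 dpi instead of relying on cairo's fallback
## rasterization of semi-transparent EPS output.
library(ggplot2)
png("test.png", width = 7, height = 5, units = "in", res = 600)
print(qplot(Sepal.Length, Petal.Length, data = iris,
            color = Species, size = Petal.Width, alpha = I(0.7)))
dev.off()
```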
> > On 14/07/2015 12:03, Tom Wenseleers wrote: >> Dear all, >> In grDevices R functions cairo_pdf and cairo_ps it is mentioned that >> when transparency (alpha channels) are used in vector output, it will >> rasterize the PDF or postscript exported graph at a resolution of 72 >> dpi : >> https://stat.ethz.ch/R-manual/R-devel/library/grDevices/html/cairo.html >> >> You can see the problem if you try >> >> library(ggplot2) >> cairo_ps(file = "test.eps",onefile = FALSE) >> qplot(Sepal.Length, Petal.Length, data = iris, color = Species, size = >> Petal.Width, alpha = I(0.7)) >> dev.off() >> >> as in the output (here zoomed in) the plot symbols are heavily >> pixelated then, showing it is only using 72 dpi. >> >> I was wondering how the fallback resolution could be increased to 600 >> dpi? In library(RGtk2) there is a command >> cairoSurfaceSetFallbackResolution, which I think is what is relevant >> here, but I would not know how to make grDevices use that parameter. >> Any thoughts? >> >> Using postscript() btw also doesn't work, since that doesn't support >> transparency, and returns the error "semi-transparency is not >> supported on this device: reported only once per page". >> >> >> I know I can use svg or pdf instead and that this would solve the >> problem, but problem is the journal (PloS) I would like to submit to >> only accepts EPS. So is there any solution to increase the quality in >> EPS, without having to rasterize everything to PNG? >> >> best regards, >> Tom Wenseleers >> >> >> >> >> >> >> [[alternative HTML version deleted]] >> >> ______________________________________________ >> R-devel at r-project.org mailing list >> https://stat.ethz.ch/mailman/listinfo/r-devel >> > > -- Brian D. 
Ripley, ripley at stats.ox.ac.uk Emeritus Professor of Applied Statistics, University of Oxford 1 South Parks Road, Oxford OX1 3TG, UK From joseclaudio.faria at gmail.com Sun Jul 19 13:51:23 2015 From: joseclaudio.faria at gmail.com (Jose Claudio Faria) Date: Sun, 19 Jul 2015 08:51:23 -0300 Subject: [Rd] update.packages(checkBuilt=TRUE, ask=FALSE): possible bug In-Reply-To: <21FF8325-EE50-480E-9729-E0440D6587EE@gmail.com> References: <21FF8325-EE50-480E-9729-E0440D6587EE@gmail.com> Message-ID: Dear Peter, I managed to reproduce the error in multiple repositories ( http://cran.at.r-project.org/, http://cran-r.c3sl.ufpr.br/, etc.). However, after rebooting the computer it was no longer possible to reproduce it. I'll be monitoring ... 
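One thing worth trying while narrowing it down (a sketch, not a confirmed fix): pass a single explicit type through to install.packages(), so it is never asked to resolve the implicit type = "both" against a 'contriburl':

```r
## Pin the package type explicitly; the error message complains about
## type = "both" combined with 'contriburl'.
options(pkgType = "binary")
update.packages(repos = "https://cran.r-project.org",
                checkBuilt = TRUE, ask = FALSE, type = "binary")
```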
Best, ///\\\///\\\///\\\///\\\///\\\///\\\///\\\///\\\ Jose Claudio Faria Estatistica UESC/DCET/Brasil joseclaudio.faria at gmail.com Telefones: 55(73)3680.5545 - UESC 55(73)9966.9100 - VIVO 55(73)9100.7351 - TIM 55(73)8817.6159 - OI 55(73)8129.9942 - CLARO ///\\\///\\\///\\\///\\\///\\\///\\\///\\\///\\\ On Sun, Jul 19, 2015 at 4:50 AM, peter dalgaard wrote: > > > On 19 Jul 2015, at 06:05 , Jose Claudio Faria < > joseclaudio.faria at gmail.com> wrote: > > > > Hello, > > > >> version > > _ > > platform x86_64-w64-mingw32 > > arch x86_64 > > os mingw32 > > system x86_64, mingw32 > > status Patched > > major 3 > > minor 2.1 > > year 2015 > > month 07 > > day 16 > > svn rev 68681 > > language R > > version.string R version 3.2.1 Patched (2015-07-16 r68681) > > nickname World-Famous Astronaut > > > > I have the options below: > > options(pkgType='binary') > > options(install.packages.check.source='no') > > > > For some, but not allI repositories I get the error message below: > > Error in install.packages(update[instlib == l, "Package"], l, contriburl > = > > contriburl, : > > specifying 'contriburl' or 'available' requires a single type, not type > = > > "both" > > > > Is it a bug? > > Possibly, but I think you need to give a more reproducible example. In > particular, which repositories and which packages? There is a fair amount > of combinations to try... 
> > -- > Peter Dalgaard, Professor, > Center for Statistics, Copenhagen Business School > Solbjerg Plads 3, 2000 Frederiksberg, Denmark > Phone: (+45)38153501 > Email: pd.mes at cbs.dk Priv: PDalgd at gmail.com > > > > > > > > > [[alternative HTML version deleted]] From suimong at msn.com Mon Jul 20 09:13:45 2015 From: suimong at msn.com (suimong) Date: Mon, 20 Jul 2015 00:13:45 -0700 (PDT) Subject: [Rd] Errors on Windows with grep(fixed=TRUE) on UTF-8 strings In-Reply-To: <1425435174.17350.26.camel@bigboy> References: <1425435174.17350.26.camel@bigboy> Message-ID: <1437376425699-4710059.post@n4.nabble.com> Thank you Winston for the solution! The only workaround I come up with is to set options(encoding = "UTF-8"), which is generally undesirable. I'm wondering is there any chance this patch will be included in future R version? I have been running into this problem from time to time and the latest R 3.2.1 still hasn't handled this issue properly. Winston Chang wrote > After a bit more investigation, I think I've found the cause of the bug, > and I have a patch. > > This bug happens with grep(), when: > * Running on Windows. > * The search uses fixed=TRUE. > * The search pattern is a single byte. > * The current locale has a multibyte encoding. > > ======================= > Here's an example that demonstrates the bug: > > # First, create a 3-byte UTF-8 character > y <- rawToChar(as.raw(c(0xe6, 0xb8, 0x97))) > Encoding(y) <- "UTF-8" > y > # [1] "?" > > # In my default locale, grep with a single-char pattern and fixed=TRUE > # returns integer(0), as expected. > Sys.getlocale("LC_CTYPE") > # [1] "English_United States.1252" > grep("a", y, fixed = TRUE) > # integer(0) > > # When the using a multibyte locale, grep with a single-char > # pattern and fixed=TRUE results in an error. 
> Sys.setlocale("LC_CTYPE", "chinese") > grep("a", y, fixed = TRUE) > # Error in grep("a", y, fixed = TRUE) : invalid multibyte string at '<97>' > > > ======================= > > I believe the problem is in the main/grep.c file, in the fgrep_one > function. It tests for a multi-byte character string locale > `mbcslocale`, and then for the `use_UTF8`, like so: > > if (!useBytes && mbcslocale) { > ... > } else if (!useBytes && use_UTF8) { > ... > } else ... > > This can be seen at > https://github.com/wch/r-source/blob/e92b4c1cba05762480cd3898335144e5dd111cb7/src/main/grep.c#L668-L692 > > A similar pattern occurs in the fgrep_one_bytes function, at > https://github.com/wch/r-source/blob/e92b4c1cba05762480cd3898335144e5dd111cb7/src/main/grep.c#L718-L736 > > > I believe that the test order should be reversed; it should test first > for `use_UTF8`, and then for `mbcslocale`. This pattern occurs in a few > places in grep.c. It looks like this: > > if (!useBytes && use_UTF8) { > ... > } else if (!useBytes && mbcslocale) { > ... > } else ... > > > ======================= > This patch does what I described; it simply tests for `use_UTF8` first, > and then `mbcslocale`, in both fgrep_one and fgrep_one_bytes. I made > this patch against the 3.1.2 sources, and tested the example code above. > In both cases, grep() returned integer(0), as expected. > > (The reason I made this change against 3.1.2 is because I had problems > getting the current trunk to compile on both Linux or Windows.) 
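Until a fix lands, a possible workaround for the reported case (a sketch; note that byte-wise matching has its own caveats for multi-byte patterns) is to request byte-level comparison explicitly:

```r
## useBytes = TRUE skips the locale-dependent multibyte path entirely,
## so the single-byte fixed pattern no longer triggers the error.
y <- rawToChar(as.raw(c(0xe6, 0xb8, 0x97)))
Encoding(y) <- "UTF-8"
grep("a", y, fixed = TRUE, useBytes = TRUE)   # integer(0), no error
```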
> > > diff --git src/main/grep.c src/main/grep.c > index 6e6ec3e..348c63d 100644 > --- src/main/grep.c > +++ src/main/grep.c > @@ -664,27 +664,27 @@ static int fgrep_one(const char *pat, const char > *target, > } > return -1; > } > - if (!useBytes && mbcslocale) { /* skip along by chars */ > - mbstate_t mb_st; > + if (!useBytes && use_UTF8) { > int ib, used; > - mbs_init(&mb_st); > for (ib = 0, i = 0; ib <= len-plen; i++) { > if (strncmp(pat, target+ib, plen) == 0) { > if (next != NULL) *next = ib + plen; > return i; > } > - used = (int) Mbrtowc(NULL, target+ib, MB_CUR_MAX, &mb_st); > + used = utf8clen(target[ib]); > if (used <= 0) break; > ib += used; > } > - } else if (!useBytes && use_UTF8) { > + } else if (!useBytes && mbcslocale) { /* skip along by chars */ > + mbstate_t mb_st; > int ib, used; > + mbs_init(&mb_st); > for (ib = 0, i = 0; ib <= len-plen; i++) { > if (strncmp(pat, target+ib, plen) == 0) { > if (next != NULL) *next = ib + plen; > return i; > } > - used = utf8clen(target[ib]); > + used = (int) Mbrtowc(NULL, target+ib, MB_CUR_MAX, &mb_st); > if (used <= 0) break; > ib += used; > } > @@ -714,21 +714,21 @@ static int fgrep_one_bytes(const char *pat, const > char *target, int len, > if (*p == pat[0]) return i; > return -1; > } > - if (!useBytes && mbcslocale) { /* skip along by chars */ > - mbstate_t mb_st; > + if (!useBytes && use_UTF8) { /* not really needed */ > int ib, used; > - mbs_init(&mb_st); > for (ib = 0, i = 0; ib <= len-plen; i++) { > if (strncmp(pat, target+ib, plen) == 0) return ib; > - used = (int) Mbrtowc(NULL, target+ib, MB_CUR_MAX, &mb_st); > + used = utf8clen(target[ib]); > if (used <= 0) break; > ib += used; > } > - } else if (!useBytes && use_UTF8) { /* not really needed */ > + } else if (!useBytes && mbcslocale) { /* skip along by chars */ > + mbstate_t mb_st; > int ib, used; > + mbs_init(&mb_st); > for (ib = 0, i = 0; ib <= len-plen; i++) { > if (strncmp(pat, target+ib, plen) == 0) return ib; > - used = utf8clen(target[ib]); > + 
used = (int) Mbrtowc(NULL, target+ib, MB_CUR_MAX, &mb_st); > if (used <= 0) break; > ib += used; > } > > > -Winston > > ______________________________________________ > R-devel@ > mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel -- View this message in context: http://r.789695.n4.nabble.com/Errors-on-Windows-with-grep-fixed-TRUE-on-UTF-8-strings-tp4704073p4710059.html Sent from the R devel mailing list archive at Nabble.com. From luke-tierney at uiowa.edu Mon Jul 20 22:29:25 2015 From: luke-tierney at uiowa.edu (luke-tierney at uiowa.edu) Date: Mon, 20 Jul 2015 15:29:25 -0500 Subject: [Rd] Defining a `show` function breaks the print-ing of S4 object -- bug or expected? In-Reply-To: <5592F8D7.5020200@gmail.com> References: <5592898B.5000103@gmail.com> <5592B6E0.30009@gmail.com> <5592CC34.20309@gmail.com> <5592F8D7.5020200@gmail.com> Message-ID: This fixed in R-devel in r68702; in R-3-2-branch in r68705. Best, luke On Tue, 30 Jun 2015, Duncan Murdoch wrote: > On 30/06/2015 7:04 PM, Paul Gilbert wrote: >> >> >> On 06/30/2015 11:33 AM, Duncan Murdoch wrote: >>> On 30/06/2015 5:27 PM, Lorenz, David wrote: >>>> There is something I'm really missing here. The function show is a >>>> standardGeneric function, so the correct way to write it as method like >>>> this: >>> >>> That describes methods::show. The problem is that the default print >>> mechanism isn't calling methods::show() (or base::print() as Luke says), >>> it's calling show() or print() in the global environment, so the user's >>> function overrides the generic, and you get the error. >> >> These are two different problems aren't they? I can see that you might >> want to ensure that base::print() calls methods::show(), but forcing the >> default print to go to base::print(), rather than whatever print() is >> first on the search path, would seem like a real change of philosophy. >> What about all the other base functions that can be overridden by >> something in the global environment? 
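For reference, the method-based definition that auto-printing is meant to dispatch to looks like this (a sketch; the "Person" class mirrors the thread's example):

```r
## Defining show() as an S4 method, rather than a plain function in the
## global environment, is the supported way to customize auto-printing.
setClass("Person", representation(name = "character"))
setMethod("show", "Person",
          function(object) cat("Person:", object@name, "\n"))
new("Person", name = "Tom")   # prints: Person: Tom
```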
> > I'd guess it's a minority of R users who know that print() or show() is > being called when you just evaluate an expression. Most would think R > just shows you the value of the expression. That's why they'd be > surprised when their local function suddenly stops the display of > variables from working. > > On the other hand, if someone defined a print or show *method* in the > global environment, I think that one should override one defined in a > package namespace. It does now, and I wouldn't change that. The > difference is that I'd expect someone defining a method to know what > they're doing, but just defining a function doesn't imply that. > > Duncan Murdoch > >> >> Paul >>> >>> Luke, are you going to look at this, or should I? >>> >>> Duncan Murdoch >>> >>>> >>>> setMethod("show", "Person", function(object) { >>>> >>>> for an object of class "Person" for example. >>> >>> >>>> Dave >>>> >>>> On Tue, Jun 30, 2015 at 10:11 AM, wrote: >>>> >>>>> Same thing happens with S3 if you redefine print(). I thought that >>>>> code was actually calculating the function to call rather than the >>>>> symbol to use, but apparently not. Shouldn't be too hard to fix. >>>>> >>>>> luke >>>>> >>>>> On Tue, 30 Jun 2015, Hadley Wickham wrote: >>>>> >>>>> On Tue, Jun 30, 2015 at 2:20 PM, Duncan Murdoch >>>>>> wrote: >>>>>> >>>>>>> On 30/06/2015 1:57 PM, Hadley Wickham wrote: >>>>>>> >>>>>>>> A slightly simpler formulation of the problem is: >>>>>>>> >>>>>>>> show <- function(...) stop("My show!") >>>>>>>> methods::setClass("Person", slots = list(name = "character")) >>>>>>>> methods::new("Person", name = "Tom") >>>>>>>> #> Error in (function (...) : My show! >>>>>>>> >>>>>>> >>>>>>> Just to be clear: the complaint is that the auto-called show() is not >>>>>>> methods::show? I.e. after >>>>>>> >>>>>>> x <- methods::new("Person", name = "Tom") >>>>>>> >>>>>>> you would expect >>>>>>> >>>>>>> show(x) >>>>>>> >>>>>>> to give the error, but not >>>>>>> >>>>>>> x >>>>>>> >>>>>>> ?? 
>>>>>>> >>>>>> >>>>>> Correct - I'd expect print() to always call methods::show(), not >>>>>> whatever show() is first on the search path. >>>>>> >>>>>> Hadley >>>>>> >>>>>> >>>>>> >>>>> -- >>>>> Luke Tierney >>>>> Ralph E. Wareham Professor of Mathematical Sciences >>>>> University of Iowa Phone: 319-335-3386 >>>>> Department of Statistics and Fax: 319-335-3017 >>>>> Actuarial Science >>>>> 241 Schaeffer Hall email: luke-tierney at uiowa.edu >>>>> Iowa City, IA 52242 WWW: http://www.stat.uiowa.edu >>>>> >>>>> >>>>> ______________________________________________ >>>>> R-devel at r-project.org mailing list >>>>> https://stat.ethz.ch/mailman/listinfo/r-devel >>>>> >>>> >>>> [[alternative HTML version deleted]] >>>> >>>> ______________________________________________ >>>> R-devel at r-project.org mailing list >>>> https://stat.ethz.ch/mailman/listinfo/r-devel >>>> >>> >>> ______________________________________________ >>> R-devel at r-project.org mailing list >>> https://stat.ethz.ch/mailman/listinfo/r-devel >>> > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel > -- Luke Tierney Ralph E. Wareham Professor of Mathematical Sciences University of Iowa Phone: 319-335-3386 Department of Statistics and Fax: 319-335-3017 Actuarial Science 241 Schaeffer Hall email: luke-tierney at uiowa.edu Iowa City, IA 52242 WWW: http://www.stat.uiowa.edu From mauricio.zambrano at ufrontera.cl Tue Jul 21 05:21:04 2015 From: mauricio.zambrano at ufrontera.cl (MAURICIO ZAMBRANO BIGIARINI) Date: Tue, 21 Jul 2015 00:21:04 -0300 Subject: [Rd] ' --enable-R-shlib' problem when setting up R-devel in Linux Mint 17.1 64-bit Message-ID: Dear list, This is my first time trying to set up the development version of R (R-devel ) in my local machine to test some packages before submitting them to CRAN. I'm using Linux Mint 17.1 64-bit, which is an Ubuntu-based distro. 
However, I'm not able to correctly set up r-devel, apparently due to a problem with enabling the shared library support. I ran the first script given by Dirk Eddelbuettel (using '~/SVN/R-devel/trunk' instead of '~/svn/r-devel' as my local copy of r-devel) on: https://stat.ethz.ch/pipermail/r-sig-debian/2012-August/001935.html After running that script, the error message I get is: `ls ../unix/*.o ../appl/*.o ../nmath/*.o` ../extra/tre/libtre.a -lblas -lgfortran -lm -lquadmath -lreadline -lpcre -llzma -lbz2 -lz -lrt -ldl -lm -licuuc -licui18n /usr/bin/ld: ../appl/dchdc.o: relocation R_X86_64_32 against `.rodata' can not be used when making a shared object; recompile with -fPIC ../appl/dchdc.o: error adding symbols: Bad value collect2: error: ld returned 1 exit status make[3]: *** [libR.so] Error 1 make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/main' make[2]: *** [R] Error 2 make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/main' make[1]: *** [R] Error 1 make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src' make: *** [R] Error 1 *** Done -- now run 'make install' and the complete output of the make part of that script is below: R is now configured for x86_64-unknown-linux-gnu Source directory: . 
Installation directory: /usr/local/lib/R-devel C compiler: ccache gcc -ggdb -pipe -std=gnu99 -Wall -pedantic Fortran 77 compiler: ccache gfortran -g -O2 C++ compiler: ccache g++ -ggdb -pipe -Wall -pedantic C++11 compiler: ccache g++ -std=c++11 -ggdb -pipe -Wall -pedantic Fortran 90/95 compiler: ccache gfortran -g -O2 Obj-C compiler: Interfaces supported: X11, tcltk External libraries: readline, BLAS(generic), LAPACK(generic), curl Additional capabilities: PNG, JPEG, TIFF, NLS, cairo, ICU Options enabled: shared R library, R profiling Capabilities skipped: Options not enabled: shared BLAS, memory profiling Recommended packages: no make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/doc/manual' make[1]: Nothing to be done for `front-matter'. make[1]: Nothing to be done for `html-non-svn'. make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/doc/manual' SVN-REVISION is unchanged make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/m4' make[1]: Nothing to be done for `R'. make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/m4' make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/tools' make[1]: Nothing to be done for `R'. make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/tools' make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/doc' make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/doc/html' make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/doc/html' make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/doc/manual' make[2]: Nothing to be done for `R'. 
make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/doc/manual' make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/doc' make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/etc' make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/etc' make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/share' make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/share' make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src' make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/scripts' creating src/scripts/R.fe make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/scripts' make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/scripts' make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/scripts' make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/include' config.status: creating src/include/config.h config.status: src/include/config.h is unchanged Rmath.h is unchanged make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/include/R_ext' make[3]: Nothing to be done for `R'. make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/include/R_ext' make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/include' make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/extra' make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' make[4]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' make[4]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' make[4]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' make[4]: `libtre.a' is up to date. 
make[4]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/extra' make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/appl' make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/appl' make[3]: `libappl.a' is up to date. make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/appl' make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/appl' make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/nmath' make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/nmath' make[3]: `libnmath.a' is up to date. make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/nmath' make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/nmath' make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/unix' make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/unix' make[3]: `libunix.a' is up to date. make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/unix' make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/unix' ccache gcc -I. 
-I../../src/include -I../../src/include -I/usr/local/include -DHAVE_CONFIG_H -fopenmp -fpic -ggdb -pipe -std=gnu99 -Wall -pedantic -L/usr/local/lib -DR_HOME='"/home/hzambran/SVN/R-devel/trunk"' \ -o Rscript ./Rscript.c make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/unix' make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/unix' make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/main' make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/main' ccache gcc -shared -fopenmp -L/usr/local/lib -o libR.so CommandLineArgs.o Rdynload.o Renviron.o RNG.o agrep.o apply.o arithmetic.o array.o attrib.o bind.o builtin.o character.o coerce.o colors.o complex.o connections.o context.o cum.o dcf.o datetime.o debug.o deparse.o devices.o dotcode.o dounzip.o dstruct.o duplicate.o edit.o engine.o envir.o errors.o eval.o format.o gevents.o gram.o gram-ex.o graphics.o grep.o identical.o inlined.o inspect.o internet.o iosupport.o lapack.o list.o localecharset.o logic.o main.o mapply.o match.o memory.o names.o objects.o options.o paste.o platform.o plot.o plot3d.o plotmath.o print.o printarray.o printvector.o printutils.o qsort.o random.o raw.o registration.o relop.o rlocale.o saveload.o scan.o seq.o serialize.o sort.o source.o split.o sprintf.o startup.o subassign.o subscript.o subset.o summary.o sysutils.o times.o unique.o util.o version.o g_alab_her.o g_cntrlify.o g_fontdb.o g_her_glyph.o xxxpr.o `ls ../unix/*.o ../appl/*.o ../nmath/*.o` ../extra/tre/libtre.a -lblas -lgfortran -lm -lquadmath -lreadline -lpcre -llzma -lbz2 -lz -lrt -ldl -lm -licuuc -licui18n /usr/bin/ld: ../appl/dchdc.o: relocation R_X86_64_32 against `.rodata' can not be used when making a shared object; recompile with -fPIC ../appl/dchdc.o: error adding symbols: Bad value collect2: error: ld returned 1 exit status make[3]: *** [libR.so] Error 1 make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/main' make[2]: *** [R] Error 2 make[2]: Leaving 
directory `/home/hzambran/SVN/R-devel/trunk/src/main' make[1]: *** [R] Error 1 make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src' make: *** [R] Error 1 *** Done -- now run 'make install' I would highly appreciate any advice on how to solve this issue or how to correctly set up r-devel. Thanks in advance, Mauricio Zambrano-Bigiarini, PhD ===================================== Dept. of Civil Engineering Faculty of Engineering and Sciences Universidad de La Frontera PO Box 54-D, Temuco, Chile ===================================== mailto : mauricio.zambrano at ufrontera.cl work-phone : +56 45 259 2812 http://ingenieriacivil.ufro.cl/ ===================================== "When the pupil is ready, the master arrives." (Zen proverb) ===================================== Linux user #454569 -- Linux Mint user From mauricio.zambrano at ufrontera.cl Tue Jul 21 07:15:35 2015 From: mauricio.zambrano at ufrontera.cl (MAURICIO ZAMBRANO BIGIARINI) Date: Tue, 21 Jul 2015 02:15:35 -0300 Subject: [Rd] ' --enable-R-shlib' problem when setting up R-devel in Linux Mint 17.1 64-bit In-Reply-To: References: Message-ID: Regarding the previous problem with setting up r-devel in Linux Mint 17.1 64-bit, I deleted all the r-devel files I previously had, and then I did the checkout to the '~/svn/r-devel' directory originally mentioned by Dirk on https://stat.ethz.ch/pipermail/r-sig-debian/2012-August/001935.html, instead of the customised location I previously chose: '~/SVN/R-devel/trunk' and now everything worked perfectly fine. However, I'm still wondering what I should change in the first script proposed by Dirk on https://stat.ethz.ch/pipermail/r-sig-debian/2012-August/001935.html if I want to use a user-defined location for the development files (e.g., '~/SVN/R-devel/trunk'). I mean, in addition to the line: cd ~/svn/r-devel Kind regards, Mauricio ===================================== "When the pupil is ready, the master arrives." 
(Zen proverb) ===================================== Linux user #454569 -- Linux Mint user On 21 July 2015 at 00:21, MAURICIO ZAMBRANO BIGIARINI wrote: > Dear list, > > This is my first time trying to set up the development version of R > (R-devel ) in my local machine to test some packages before submitting > them to CRAN. I'm using Linux Mint 17.1 64-bit, which is an > Ubuntu-based distro. > > However, I'm not able to correctly set up r-devel, apparently due to a > problem with enabling the shared library support. > > > I run the first script given by Dirk Eddelbuettel (changing > '~/SVN/R-devel/trunk' by '~/svn/r-devel' as my local copy of r-devel) > on: > https://stat.ethz.ch/pipermail/r-sig-debian/2012-August/001935.html > > > After running that script, the error message I get is: > > `ls ../unix/*.o ../appl/*.o ../nmath/*.o` ../extra/tre/libtre.a > -lblas -lgfortran -lm -lquadmath -lreadline -lpcre -llzma -lbz2 -lz > -lrt -ldl -lm -licuuc -licui18n > /usr/bin/ld: ../appl/dchdc.o: relocation R_X86_64_32 against `.rodata' > can not be used when making a shared object; recompile with -fPIC > ../appl/dchdc.o: error adding symbols: Bad value > collect2: error: ld returned 1 exit status > make[3]: *** [libR.so] Error 1 > make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/main' > make[2]: *** [R] Error 2 > make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/main' > make[1]: *** [R] Error 1 > make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src' > make: *** [R] Error 1 > *** Done -- now run 'make install' > > > and the complete output of the make part of that script is below: > > R is now configured for x86_64-unknown-linux-gnu > > Source directory: . 
> Installation directory: /usr/local/lib/R-devel > > C compiler: ccache gcc -ggdb -pipe -std=gnu99 -Wall -pedantic > Fortran 77 compiler: ccache gfortran -g -O2 > > C++ compiler: ccache g++ -ggdb -pipe -Wall -pedantic > C++11 compiler: ccache g++ -std=c++11 -ggdb -pipe -Wall -pedantic > Fortran 90/95 compiler: ccache gfortran -g -O2 > Obj-C compiler: > > Interfaces supported: X11, tcltk > External libraries: readline, BLAS(generic), LAPACK(generic), curl > Additional capabilities: PNG, JPEG, TIFF, NLS, cairo, ICU > Options enabled: shared R library, R profiling > > Capabilities skipped: > Options not enabled: shared BLAS, memory profiling > > Recommended packages: no > > make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/doc/manual' > make[1]: Nothing to be done for `front-matter'. > make[1]: Nothing to be done for `html-non-svn'. > make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/doc/manual' > SVN-REVISION is unchanged > make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/m4' > make[1]: Nothing to be done for `R'. > make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/m4' > make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/tools' > make[1]: Nothing to be done for `R'. > make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/tools' > make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/doc' > make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/doc/html' > make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/doc/html' > make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/doc/manual' > make[2]: Nothing to be done for `R'. 
> make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/doc/manual' > make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/doc' > make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/etc' > make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/etc' > make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/share' > make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/share' > make[1]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src' > make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/scripts' > creating src/scripts/R.fe > make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/scripts' > make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/scripts' > make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/scripts' > make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/include' > config.status: creating src/include/config.h > config.status: src/include/config.h is unchanged > Rmath.h is unchanged > make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/include/R_ext' > make[3]: Nothing to be done for `R'. > make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/include/R_ext' > make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/include' > make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/extra' > make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' > make[4]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' > make[4]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' > make[4]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' > make[4]: `libtre.a' is up to date. 
> make[4]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' > make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/extra/tre' > make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/extra' > make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/appl' > make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/appl' > make[3]: `libappl.a' is up to date. > make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/appl' > make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/appl' > make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/nmath' > make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/nmath' > make[3]: `libnmath.a' is up to date. > make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/nmath' > make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/nmath' > make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/unix' > make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/unix' > make[3]: `libunix.a' is up to date. > make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/unix' > make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/unix' > ccache gcc -I. 
-I../../src/include -I../../src/include > -I/usr/local/include -DHAVE_CONFIG_H -fopenmp -fpic -ggdb -pipe > -std=gnu99 -Wall -pedantic -L/usr/local/lib > -DR_HOME='"/home/hzambran/SVN/R-devel/trunk"' \ > -o Rscript ./Rscript.c > make[3]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/unix' > make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/unix' > make[2]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/main' > make[3]: Entering directory `/home/hzambran/SVN/R-devel/trunk/src/main' > ccache gcc -shared -fopenmp -L/usr/local/lib -o libR.so > CommandLineArgs.o Rdynload.o Renviron.o RNG.o agrep.o apply.o > arithmetic.o array.o attrib.o bind.o builtin.o character.o coerce.o > colors.o complex.o connections.o context.o cum.o dcf.o datetime.o > debug.o deparse.o devices.o dotcode.o dounzip.o dstruct.o duplicate.o > edit.o engine.o envir.o errors.o eval.o format.o gevents.o gram.o > gram-ex.o graphics.o grep.o identical.o inlined.o inspect.o internet.o > iosupport.o lapack.o list.o localecharset.o logic.o main.o mapply.o > match.o memory.o names.o objects.o options.o paste.o platform.o plot.o > plot3d.o plotmath.o print.o printarray.o printvector.o printutils.o > qsort.o random.o raw.o registration.o relop.o rlocale.o saveload.o > scan.o seq.o serialize.o sort.o source.o split.o sprintf.o startup.o > subassign.o subscript.o subset.o summary.o sysutils.o times.o unique.o > util.o version.o g_alab_her.o g_cntrlify.o g_fontdb.o g_her_glyph.o > xxxpr.o `ls ../unix/*.o ../appl/*.o ../nmath/*.o` > ../extra/tre/libtre.a -lblas -lgfortran -lm -lquadmath -lreadline > -lpcre -llzma -lbz2 -lz -lrt -ldl -lm -licuuc -licui18n > /usr/bin/ld: ../appl/dchdc.o: relocation R_X86_64_32 against `.rodata' > can not be used when making a shared object; recompile with -fPIC > ../appl/dchdc.o: error adding symbols: Bad value > collect2: error: ld returned 1 exit status > make[3]: *** [libR.so] Error 1 > make[3]: Leaving directory 
`/home/hzambran/SVN/R-devel/trunk/src/main' > make[2]: *** [R] Error 2 > make[2]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src/main' > make[1]: *** [R] Error 1 > make[1]: Leaving directory `/home/hzambran/SVN/R-devel/trunk/src' > make: *** [R] Error 1 > *** Done -- now run 'make install' > > > I would highly appreciate any advice on how to solve this issue or how > to correctly set up r-devel. > > > Thanks in advance, > > Mauricio Zambrano-Bigiarini, PhD > > ===================================== > Dept. of Civil Engineering > Faculty of Engineering and Sciences > Universidad de La Frontera > PO Box 54-D, Temuco, Chile > ===================================== > mailto : mauricio.zambrano at ufrontera.cl > work-phone : +56 45 259 2812 > http://ingenieriacivil.ufro.cl/ > ===================================== > "When the pupil is ready, the master arrives." > (Zen proverb) > ===================================== > Linux user #454569 -- Linux Mint user From edd at debian.org Tue Jul 21 13:13:40 2015 From: edd at debian.org (Dirk Eddelbuettel) Date: Tue, 21 Jul 2015 06:13:40 -0500 Subject: [Rd] ' --enable-R-shlib' problem when setting up R-devel in Linux Mint 17.1 64-bit In-Reply-To: References: Message-ID: <21934.10596.6975.789890@max.nulle.part> On 21 July 2015 at 02:15, MAURICIO ZAMBRANO BIGIARINI wrote: | Regarding the previous problem with setting up r-devel in Linux Mint | 17.1 64-bit, I deleted all the r-devel files I previously had, and | then I did the check out to the | | '~/svn/r-devel' | | directory originally mentioned by by Dirk on | https://stat.ethz.ch/pipermail/r-sig-debian/2012-August/001935.html, | instead of the customised location I previously chose: | | '~/SVN/R-devel/trunk' | | and now I everything worked perfectly fine. 
| | However, I'm still wondering what should I change in the first script | proposed by Dirk on | https://stat.ethz.ch/pipermail/r-sig-debian/2012-August/001935.html if | i want to use a user-defined location for the development files (e.g., | '~/SVN/R-devel/trunk'). I meant, in addition to the line: | | cd ~/svn/r-devel That is most likely spurious. Your errors may be due to, say, you not having run 'make clean' or 'make distclean' in the previous repo checkout you had so that the effect of the _initial_ configure call still lingered. There is no magic here. Lots of people have built R this way, and even more people use configure regularly. These things work. You somehow boxed yourself into a corner, out of which you now seem to have escaped, which is good. The proper sequence (svn checkout ...; configure ...; make ...) will work no matter what name you give the directory you work in. Regards, Dirk -- http://dirk.eddelbuettel.com | @eddelbuettel | edd at debian.org From pdalgd at gmail.com Tue Jul 21 14:16:52 2015 From: pdalgd at gmail.com (peter dalgaard) Date: Tue, 21 Jul 2015 14:16:52 +0200 Subject: [Rd] ' --enable-R-shlib' problem when setting up R-devel in Linux Mint 17.1 64-bit In-Reply-To: <21934.10596.6975.789890@max.nulle.part> References: <21934.10596.6975.789890@max.nulle.part> Message-ID: <91FF6478-328A-41D0-BE8B-59569CD8DF82@gmail.com> > On 21 Jul 2015, at 13:13 , Dirk Eddelbuettel wrote: > > > You somehow boxed yourself into a corner, out of which you now seem to have > escaped, which is good. The proper sequence (svn checkout ...; configure > ...; make ...) will work no matter what name you give the directory you work > in. > You may want to avoid building in the source directory, though. We don't generally do that when testing, so you may bump into something unexpected. 
I.e., rather do export REPOS=https://svn.r-project.org/R export RTOP=~ #adjust as necessary cd $RTOP svn co $REPOS/trunk r-devel/R mkdir r-devel/BUILD #--- cd r-devel/R svn up cd ../BUILD ../R/configure make For configure options, I generally maintain a config.site in the r-devel directory and copy it to BUILD before running configure. Presumably, regular R already exists as a package on Linux Mint. You might want to investigate how R is built from sources, see for instance: http://community.linuxmint.com/tutorial/view/1822 -- Peter Dalgaard, Professor, Center for Statistics, Copenhagen Business School Solbjerg Plads 3, 2000 Frederiksberg, Denmark Phone: (+45)38153501 Email: pd.mes at cbs.dk Priv: PDalgd at gmail.com From mauricio.zambrano at ufrontera.cl Tue Jul 21 16:53:23 2015 From: mauricio.zambrano at ufrontera.cl (MAURICIO ZAMBRANO BIGIARINI) Date: Tue, 21 Jul 2015 11:53:23 -0300 Subject: [Rd] ' --enable-R-shlib' problem when setting up R-devel in Linux Mint 17.1 64-bit In-Reply-To: <91FF6478-328A-41D0-BE8B-59569CD8DF82@gmail.com> References: <21934.10596.6975.789890@max.nulle.part> <91FF6478-328A-41D0-BE8B-59569CD8DF82@gmail.com> Message-ID: Thank you very much Dirk, Peter and Johannes for all your help. Now, I removed all the previous files and repeated the whole procedure using my own SVN directory, instead of the '~/svn/r-devel' included in the original mail of Dirk, and everything worked fine as well. So I can confirm that very likely I suffered the effect of the _initial_ configure, because I didn't run 'make clean' or 'make distclean' in the first repo I checked out. Finally, I slightly modified the first script of Dirk on https://stat.ethz.ch/pipermail/r-sig-debian/2012-August/001935.html, to avoid building in the source directory (following the advice of Peter Dalgaard) and finally I'm able to set up R-devel on a daily basis! Thanks to all of you and I hope this thread might be useful for somebody else.... 
Kind regards, Mauricio Zambrano-Bigiarini, PhD ===================================== Dept. of Civil Engineering Faculty of Engineering and Sciences Universidad de La Frontera PO Box 54-D, Temuco, Chile ===================================== mailto : mauricio.zambrano at ufrontera.cl work-phone : +56 45 259 2812 http://ingenieriacivil.ufro.cl/ ===================================== "When the pupil is ready, the master arrives." (Zen proverb) ===================================== Linux user #454569 -- Linux Mint user On 21 July 2015 at 09:16, peter dalgaard wrote: > >> On 21 Jul 2015, at 13:13 , Dirk Eddelbuettel wrote: >> >> >> You somehow boxed yourself into a corner, out of which you now seem to have >> escaped, which is good. The proper sequence (svn checkout ...; configure >> ...; make ...) will work no matter what name you give the directory you work >> in. >> > > You may want to avoid building in the source directory, though. We don't generally do that when testing, so you may bump into something unexpected. > > I.e., rather do > > export REPOS=https://svn.r-project.org/R > export RTOP=~ #adjust as necessary > cd $RTOP > svn co $REPOS/trunk r-devel/R > mkdir r-devel/BUILD > #--- > cd r-devel/R > svn up > cd ../BUILD > ../R/configure > make > > For configure optione, I generally maintain a config.site in the r-devel directory and copy it to BUILD before running configure. > > Presumably, regular R already exists as a package on Linux Mint. 
You might want to investigate how R is built from sources, see for instance: http://community.linuxmint.com/tutorial/view/1822 > > > -- > Peter Dalgaard, Professor, > Center for Statistics, Copenhagen Business School > Solbjerg Plads 3, 2000 Frederiksberg, Denmark > Phone: (+45)38153501 > Email: pd.mes at cbs.dk Priv: PDalgd at gmail.com > > > > > > > > From maechler at stat.math.ethz.ch Thu Jul 23 16:25:30 2015 From: maechler at stat.math.ethz.ch (Martin Maechler) Date: Thu, 23 Jul 2015 16:25:30 +0200 Subject: [Rd] Improvements (?) in stats::poly and stats::polym. In-Reply-To: <21929.9884.443290.579419@stat.math.ethz.ch> References: <5F22AFBADFE10342ABECF0281DE992181824CEF3@EXCH001.campden.co.uk> <21929.9884.443290.579419@stat.math.ethz.ch> Message-ID: <21936.63834.876737.501247@stat.math.ethz.ch> >>>>> "MM" == Martin Maechler >>>>> on Fri, 17 Jul 2015 18:00:28 +0200 writes: MM> Dear Keith, >>>>> >>>>> on Thu, 16 Jul 2015 08:58:11 +0000 writes: >> Dear R Core Team, >> Last week I made a post to the R-help mailing list >> 'predict.poly for multivariate data' >> >> but it has had no responses so I'm sending this to the >> email address of the package:stats maintainer. Please feel >> free to tell me that this is inappropriate. MM> Asking R Core in your case is ok ... MM> { though still slightly "sub optimal" (but *not* "inappropriate"!): MM> Ideally you'd have followed the posting guide MM> (http://www.r-project.org/posting-guide.html) here, MM> namely to send your original post to R-devel instead of R-help. MM> Then it would have been noticed by me and most probably MM> several other R core members ... MM> } >> IMHO the reproducible code I presented in that post: >> ############# >> library(datasets) >> alm <- lm(stack.loss ~ poly(Air.Flow, Water.Temp, degree=3), stackloss) >> alm$fitted.values[1:10] # "correct" prediction values [1:10] >> predict(alm, stackloss)[1:10] # gives correct values >> predict(alm, stackloss[1:10,]) # gives wrong values >> ######### >> ... 
clearly demonstrates something wrong: the two predicts should not differ. >> I hesitate to call it a bug; it might be viewed as inappropriate usage. But it's easy to get wrong answers, fairly small changes to poly and polym correct the wrongness, and I think the changes are backwards compatible. Perhaps appending the altered codes made the R-help post too long for easy comprehension; I attach them to this email. MM> Thank you! MM> I had started to look at your R-help post and noticed that you MM> changed the *printout* of the R functions, instead of the MM> source MM> The current development version of that part of the R MM> source code is always at MM> https://svn.r-project.org/R/trunk/src/library/stats/R/contr.poly.R MM> and if you look carefully, you see that there are comments in MM> the sources that are lost in the process (of parsing, MM> byte-compiling, saving in binary, ....), MM> but never mind: MM> you've marked your changes well and I can use your version to MM> modify the sources. MM> From what I've understood, the changes make much sense and look MM> good; and if no problem surfaces they should make it into R - with an MM> acknowledgement to you, of course. I've now committed corresponding changes to R-devel, changes which have indeed evolved from your (Keith) contributions, thank you very much. My additional changes try to slightly simplify the code logic (and add a new argument 'simple' to gain some speed). If the changes do not have visible negative effects on existing CRAN/Bioconductor code (which *is* possible; after all, the results now sometimes differ in the attributes), we may consider porting the changes to 'R 3.2.1 patched', which will become R 3.2.2 in three weeks. Thank you again, Martin Maechler > [............................] 
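[The fix discussed in the thread above hinges on poly() recording the orthogonalization constants of the training data, so that predict() can rebuild the *same* basis from a subset of rows. A minimal sketch of that mechanism, illustrative only and not part of the original thread:

```r
## poly() stores the centering/normalization constants of the training
## data in the "coefs" attribute (a list with components alpha and norm2).
p <- poly(1:10, degree = 2)
str(attr(p, "coefs"))

## predict.poly() re-uses those constants, so a basis built from a subset
## of the data should match the corresponding rows of the original basis
## -- exactly the invariant that failed in the reported example.
p5 <- predict(p, newdata = 1:5)
all.equal(unclass(p)[1:5, ], unclass(p5), check.attributes = FALSE)
## should be TRUE when the coefs are propagated correctly
```

When the coefs are *not* propagated (as happened via the model-fitting path reported above), the basis is re-orthogonalized from the subset alone, and the predictions silently change.]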
From milena.stat at gmail.com Fri Jul 24 17:21:57 2015
From: milena.stat at gmail.com (SuzukiBlue)
Date: Fri, 24 Jul 2015 08:21:57 -0700 (PDT)
Subject: [Rd] Rcartogram package - error message
Message-ID: <1437751317926-4710313.post@n4.nabble.com>

I am trying to install two R packages to produce cartograms like in the work of Gastner and Newman: http://www.pnas.org/content/101/20/7499.full.pdf but I have a problem installing the Rcartogram and rdyncall packages. Neither is available on CRAN, so both have to be installed from archives, and both produce errors:

> install.packages("C:/Users/Milena/Downloads/*Rcartogram*_0.2-2.tar.gz",
> repos = NULL, type = "source")
Installing package into 'C:/Users/Milena/Documents/R/win-library/3.2' (as 'lib' is unspecified)
* installing *source* package 'Rcartogram' ...
**********************************************
WARNING: this package has a configure script
It probably needs manual configuration
**********************************************
** libs
*** arch - i386
Warning: running command 'make -f "Makevars" -f "C:/PROGRA~1/R/R-3.2.0/etc/i386/Makeconf" -f "C:/PROGRA~1/R/R-3.2.0/share/make/winshlib.mk" SHLIB="Rcartogram.dll" OBJECTS="Rcart.o cart.o"' had status 127
ERROR: compilation failed for package 'Rcartogram'
* removing 'C:/Users/Milena/Documents/R/win-library/3.2/Rcartogram'
Warning in install.packages :
running command '"C:/PROGRA~1/R/R-3.2.0/bin/x64/R" CMD INSTALL -l "C:\Users\Milena\Documents\R\win-library\3.2" "C:/Users/Milena/Downloads/Rcartogram_0.2-2.tar.gz"' had status 1
Warning in install.packages :
installation of package 'C:/Users/Milena/Downloads/Rcartogram_0.2-2.tar.gz' had non-zero exit status

> install.packages("C:/Users/Milena/Downloads/*rdyncall*_0.7.5.tar.gz",
> repos = NULL, type = "source")
Installing package into 'C:/Users/Milena/Documents/R/win-library/3.2' (as 'lib' is unspecified)
* installing *source* package 'rdyncall' ...
** package 'rdyncall' successfully unpacked and MD5 sums checked
** libs
*** arch - i386
Warning: running command 'make -f "Makevars.win" -f "C:/PROGRA~1/R/R-3.2.0/etc/i386/Makeconf" -f "C:/PROGRA~1/R/R-3.2.0/share/make/winshlib.mk" SHLIB="rdyncall.dll" OBJECTS="rcallback.o rdyncall.o rdynload.o rpack.o rpackage.o rutils.o rutils_float.o rutils_str.o"' had status 127
ERROR: compilation failed for package 'rdyncall'
* removing 'C:/Users/Milena/Documents/R/win-library/3.2/rdyncall'
Warning in install.packages :
running command '"C:/PROGRA~1/R/R-3.2.0/bin/x64/R" CMD INSTALL -l "C:\Users\Milena\Documents\R\win-library\3.2" "C:/Users/Milena/Downloads/rdyncall_0.7.5.tar.gz"' had status 1
Warning in install.packages :
installation of package 'C:/Users/Milena/Downloads/rdyncall_0.7.5.tar.gz' had non-zero exit status

*rdyncall* finally loaded on a friend's machine but not on mine. Is there something wrong with my installation? My sessionInfo() follows:

R version 3.2.0 (2015-04-16)
Platform: *x86_64-w64-mingw32/x64 (64-bit)*
Running under: *Windows 8 x64 (build 9200)*

locale:
[1] LC_COLLATE=English_United Kingdom.1252  LC_CTYPE=English_United Kingdom.1252
    LC_MONETARY=English_United Kingdom.1252 LC_NUMERIC=C
[5] LC_TIME=English_United Kingdom.1252

attached base packages:
[1] stats graphics grDevices utils datasets methods base

other attached packages:
[1] sp_1.1-1

loaded via a namespace (and not attached):
[1] tools_3.2.0 grid_3.2.0 lattice_0.20-31

--
View this message in context: http://r.789695.n4.nabble.com/Rcartogram-package-error-message-tp4710313.html
Sent from the R devel mailing list archive at Nabble.com.

From jgbradley1 at gmail.com Fri Jul 24 22:21:03 2015
From: jgbradley1 at gmail.com (Joshua Bradley)
Date: Fri, 24 Jul 2015 16:21:03 -0400
Subject: [Rd] Memory limitations for parallel::mclapply
Message-ID: 

Hello,

I have been having issues using parallel::mclapply in a memory-efficient way and would like some guidance.
I am using a 40 core machine with 96 GB of RAM. I've tried to run mclapply with 20, 30, and 40 mc.cores and it has practically brought the machine to a standstill each time, to the point where I do a hard reset.

When running mclapply with 10 mc.cores, I can see that each process takes 7.4% (~7 GB) of memory. My use-case for mclapply is the following: run mclapply over a list of 150000 names; for each process I refer to a larger pre-computed data.table to compute some stats with the name, and return those stats. Ideally I want to use the large data.table as shared memory, but the number of mc.cores I can use is being limited because each one requires 7 GB. Someone posted this exact same issue on stackoverflow a couple of years ago but it never got answered.

Do I have to manually tell mclapply to use shared memory (if so, how?)? Is this type of job better suited to the doParallel package and foreach approach?

Josh Bradley

[[alternative HTML version deleted]]

From istazahn at gmail.com Fri Jul 24 23:49:14 2015
From: istazahn at gmail.com (Ista Zahn)
Date: Fri, 24 Jul 2015 17:49:14 -0400
Subject: [Rd] Memory limitations for parallel::mclapply
In-Reply-To: 
References: 
Message-ID: 

Hi Josh,

I think we need some more details, including code, and information about your operating system. My machine has only 12 GB of RAM, but I can run this quite comfortably (no swap, other processes using memory, etc.):

library(parallel)
library(data.table)

d <- data.table(a = rnorm(50000000),
                b = runif(50000000),
                c = sample(letters, 50000000, replace = TRUE),
                d = 1:50000000,
                g = rep(letters[1:10], each = 5000000))

system.time(means <- mclapply(unique(d$g),
                              function(x) sapply(d[g == x, list(a, b, d)], mean),
                              mc.cores = 5))

In other words, I don't think there is anything inherent in the kind of operation you describe that requires the large data object to be copied. So, as usual, the devil is in the details, which you haven't yet described.
Best, Ista On Fri, Jul 24, 2015 at 4:21 PM, Joshua Bradley wrote: > Hello, > > I have been having issues using parallel::mclapply in a memory-efficient > way and would like some guidance. I am using a 40 core machine with 96 GB > of RAM. I've tried to run mclapply with 20, 30, and 40 mc.cores and it has > practically brought the machine to a standstill each time to the point > where I do a hard reset. > > When running mclapply with 10 mc.cores, I can see that each process takes > 7.4% (~7 GB) of memory. My use-case for mclapply is the following: run > mclapply over a list of 150000 names, for each process I refer to a larger > pre-computed data.table to compute some stats with the name, and return > those stats . Ideally I want to use the large data.table as shared-memory > but the number of mc.cores I can use are being limited because each one > requires 7 GB. Someone posted this exact same issue > > on > stackoverflow a couple years ago but it never got answered. > > Do I have to manually tell mclapply to use shared memory (if so, how?)? Is > this type of job better with the doParallel package and foreach approach? > > Josh Bradley > > [[alternative HTML version deleted]] > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel From radford at cs.toronto.edu Sun Jul 26 20:14:02 2015 From: radford at cs.toronto.edu (Radford Neal) Date: Sun, 26 Jul 2015 14:14:02 -0400 Subject: [Rd] R-devel Digest, Vol 149, Issue 22 In-Reply-To: References: Message-ID: <20150726181402.GA27570@cs.toronto.edu> > From: Joshua Bradley > > I have been having issues using parallel::mclapply in a memory-efficient > way and would like some guidance. I am using a 40 core machine with 96 GB > of RAM. I've tried to run mclapply with 20, 30, and 40 mc.cores and it has > practically brought the machine to a standstill each time to the point > where I do a hard reset. 
When mclapply forks to start a new process, the memory is initially shared with the parent process. However, a memory page has to be copied whenever either process writes to it. Unfortunately, R's garbage collector writes to each object to mark and unmark it whenever a full garbage collection is done, so it's quite possible that every R object will be duplicated in each process, even though many of them are not actually changed (from the point of view of the R programs).

One thing on my near-term to-do list for pqR is to re-implement R's garbage collector in a way that will avoid this (as well as having various other advantages, including less memory overhead per object).

Radford Neal

From bt at datak.fr Mon Jul 27 15:16:32 2015
From: bt at datak.fr (Benoit Thieurmel)
Date: Mon, 27 Jul 2015 15:16:32 +0200
Subject: [Rd] parallel performance inline code vs using function ?
Message-ID: 

Hi,

I am really trying to understand why, with the parallel package, code seems to be slower when run inside a function. For example:

# data
don <- lapply(1:150, function(x){data.frame(a = rnorm(100000), b = rnorm(100000))})

# inline test
t0 <- Sys.time()
require(parallel)
cl <- makeCluster(4)
res <- parLapplyLB(cl, don, function(x){1})
stopCluster(cl)
Sys.time()-t0
# 3.5 sec, each thread up to 90 MB

# using a function
parF <- function(data){
  require(parallel)
  cl <- makeCluster(4)
  result <- parLapply(cl, data, function(x){1})
  stopCluster(cl)
}
system.time(res2 <- parF(don))
# 9.5 sec, each thread up to 320 MB ...!

It seems that, when run inside a function:
- it is 3x slower...
- more data is loaded into each thread...!

Thanks.

--
Benoit Thieurmel
+33 6 69 04 06 11
10 place de la Madeleine - 75008 Paris

[[alternative HTML version deleted]]

From ignacio82 at gmail.com Mon Jul 27 20:06:54 2015
From: ignacio82 at gmail.com (Ignacio Martinez)
Date: Mon, 27 Jul 2015 18:06:54 +0000
Subject: [Rd] R package with Fortran module on Windows?
undefined reference to `__stack_chk_fail' Message-ID: Hi, I created a R library that uses a Fortran module. Everything works like a charm on linux. Now I'm trying to make it work on Windows. I cloned my git repository on a windows computer, and when I press the build and reload button on Rstudio I get these errors: ==> Rcmd.exe INSTALL --no-multiarch --with-keep.source MyPi * installing to library 'C:/Users/IMartinez/Documents/R/R-3.2.1/library'* installing *source* package 'MyPi' ...** libs gfortran -m64 -shared -s -static-libgcc -o MyPi.dll tmp.def Fpi.o -Ld:/RCompile/r-compiling/local/local320/lib/x64 -Ld:/RCompile/r-compiling/local/local320/lib -LC:/Users/IMARTI~1/DOCUME~1/R/R-32~1.1/bin/x64 -lR Fpi.o: In function `__fortranpi_MOD_dboard': Fpi.f90:(.text+0xd7): undefined reference to `__stack_chk_fail' Fpi.o: In function `pi_': Fpi.f90:(.text+0x249): undefined reference to `__stack_chk_fail' collect2: ld returned 1 exit status no DLL was created ERROR: compilation failed for package 'MyPi'* removing 'C:/Users/IMartinez/Documents/R/R-3.2.1/library/MyPi' Exited with status 1. This is the Fortran code: Module Fortranpi IMPLICIT NONE contains subroutine dboard(darts, dartsscore) integer, intent(in) :: darts double precision, intent(out) :: dartsscore double precision :: x_coord, y_coord integer :: score, n score = 0 do n = 1, darts call random_number(x_coord) call random_number(y_coord) if ((x_coord**2 + y_coord**2) <= 1.0d0) then score = score + 1 end if end do dartsscore = 4.0d0*score/darts end subroutine dboard subroutine pi(avepi, DARTS, ROUNDS) bind(C, name="pi_") use, intrinsic :: iso_c_binding, only : c_double, c_int real(c_double), intent(out) :: avepi integer(c_int), intent(in) :: DARTS, ROUNDS integer :: MASTER, rank, i, n integer, allocatable :: seed(:) double precision :: pi_est, homepi, pirecv, pisum ! we set it to zero in the sequential run rank = 0! initialize the random number generator! 
we make sure the seed is different for each task call random_seed() call random_seed(size = n) allocate(seed(n)) seed = 12 + rank*11 call random_seed(put=seed(1:n)) deallocate(seed) avepi = 0 do i = 0, ROUNDS-1 call dboard(darts, pi_est) ! calculate the average value of pi over all iterations avepi = ((avepi*i) + pi_est)/(i + 1) end do end subroutine pi end module Fortranpi I tried adding -fno-stack-protector -lssp but it did not help. I also tried doing this "by hand" and I get these errors: > system("R CMD SHLIB -fno-stack-protector -lssp ./src/Fpi.f90") gfortran -m64 -shared -s -static-libgcc -o src/Fpi.dll tmp.def ./src/Fpi.o -fno-stack-protector -lssp -Ld:/RCompile/r-compiling/local/local320/lib/x64 -Ld:/RCompile/r-compiling/local/local320/lib -LC:/Users/IMARTI~1/DOCUME~1/R/R-32~1.1/bin/x64 -lR> dyn.load("./src/Fpi.dll") Error in inDL(x, as.logical(local), as.logical(now), ...) : unable to load shared object 'C:/Users/IMartinez/Projects/MyPi/./src/Fpi.dll': LoadLibrary failure: %1 is not a valid Win32 application. 'C:/Users/IMartinez/Projects/MyPi/./src/Fpi.dll': LoadLibrary failure: %1 is not a valid Win32 application. Thanks for the help! ? PS: I posted this question in stackoverflow with no luck. PPS: I also sent this to r-help but they told me to try this list [[alternative HTML version deleted]] From kevinushey at gmail.com Mon Jul 27 21:12:45 2015 From: kevinushey at gmail.com (Kevin Ushey) Date: Mon, 27 Jul 2015 12:12:45 -0700 Subject: [Rd] R package with Fortran module on Windows? undefined reference to `__stack_chk_fail' In-Reply-To: References: Message-ID: You should be able to set PKG_FCFLAGS="-fno-stack-protector" when compiling to ensure that the stack protector is not used. (Trying that out on a Windows VM, with a simple `R CMD build` + `R CMD INSTALL`, compilation of your package succeeded but linking failed saying the DLL 'Fpi' was not found; I imagine that's a separate issue.) 
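In a package, that flag would typically live in `src/Makevars.win` rather than being set per build. A minimal sketch (hypothetical file for this package; it assumes gfortran and that no other flags are needed — PKG_FCFLAGS covers free-form F9x sources, PKG_FFLAGS fixed-form ones):

```make
# src/Makevars.win (sketch): disable stack-protector instrumentation,
# which the toolchain used by this R build cannot link.
PKG_FCFLAGS = -fno-stack-protector
PKG_FFLAGS  = -fno-stack-protector
```

For a one-off command-line build, the same effect can be had via the environment, e.g. `PKG_FCFLAGS=-fno-stack-protector R CMD INSTALL MyPi`.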
As an aside, be sure to check out what R-exts has to say on the topic of Fortran code in R packages (especially F90 and above): https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Using-F95-code. Cheers, Kevin On Mon, Jul 27, 2015 at 11:06 AM, Ignacio Martinez wrote: > Hi, > > I created a R library that uses a Fortran module. Everything works like a > charm on linux. > > Now I'm trying to make it work on Windows. I cloned my git repository > on a windows computer, and when I press > the build and reload button on Rstudio I get these errors: > > ==> Rcmd.exe INSTALL --no-multiarch --with-keep.source MyPi > * installing to library > 'C:/Users/IMartinez/Documents/R/R-3.2.1/library'* installing *source* > package 'MyPi' ...** libs > gfortran -m64 -shared -s -static-libgcc -o MyPi.dll tmp.def Fpi.o > -Ld:/RCompile/r-compiling/local/local320/lib/x64 > -Ld:/RCompile/r-compiling/local/local320/lib > -LC:/Users/IMARTI~1/DOCUME~1/R/R-32~1.1/bin/x64 -lR > Fpi.o: In function `__fortranpi_MOD_dboard': > Fpi.f90:(.text+0xd7): undefined reference to `__stack_chk_fail' > Fpi.o: In function `pi_': > Fpi.f90:(.text+0x249): undefined reference to `__stack_chk_fail' > collect2: ld returned 1 exit status > no DLL was created > ERROR: compilation failed for package 'MyPi'* removing > 'C:/Users/IMartinez/Documents/R/R-3.2.1/library/MyPi' > > Exited with status 1. 
> > > This is the Fortran code: > > > Module Fortranpi > IMPLICIT NONE > contains > subroutine dboard(darts, dartsscore) > integer, intent(in) :: darts > double precision, intent(out) :: dartsscore > double precision :: x_coord, y_coord > integer :: score, n > > score = 0 > do n = 1, darts > call random_number(x_coord) > call random_number(y_coord) > > if ((x_coord**2 + y_coord**2) <= 1.0d0) then > score = score + 1 > end if > end do > > dartsscore = 4.0d0*score/darts > > end subroutine dboard > > subroutine pi(avepi, DARTS, ROUNDS) bind(C, name="pi_") > use, intrinsic :: iso_c_binding, only : > c_double, c_int > real(c_double), intent(out) :: avepi > integer(c_int), intent(in) :: DARTS, ROUNDS > integer :: MASTER, rank, i, n > integer, allocatable :: seed(:) > double precision :: pi_est, homepi, pirecv, pisum > ! we set it to zero in the sequential run > rank = 0! initialize the random number generator! we make sure the > seed is different for each task > call random_seed() > call random_seed(size = n) > allocate(seed(n)) > seed = 12 + rank*11 > call random_seed(put=seed(1:n)) > deallocate(seed) > > avepi = 0 > do i = 0, ROUNDS-1 > call dboard(darts, pi_est) > ! calculate the average value of pi over all iterations > avepi = ((avepi*i) + pi_est)/(i + 1) > end do > end subroutine pi > > end module Fortranpi > > > I tried adding -fno-stack-protector > -lssp but it did not help. > > I also tried doing this "by hand" and > I get these errors: > > >> system("R CMD SHLIB -fno-stack-protector -lssp ./src/Fpi.f90") > gfortran -m64 -shared -s -static-libgcc -o src/Fpi.dll tmp.def > ./src/Fpi.o -fno-stack-protector -lssp > -Ld:/RCompile/r-compiling/local/local320/lib/x64 > -Ld:/RCompile/r-compiling/local/local320/lib > -LC:/Users/IMARTI~1/DOCUME~1/R/R-32~1.1/bin/x64 -lR> > dyn.load("./src/Fpi.dll") > Error in inDL(x, as.logical(local), as.logical(now), ...) 
: > unable to load shared object 'C:/Users/IMartinez/Projects/MyPi/./src/Fpi.dll': > LoadLibrary failure: %1 is not a valid Win32 application. > 'C:/Users/IMartinez/Projects/MyPi/./src/Fpi.dll': > LoadLibrary failure: %1 is not a valid Win32 application. > > > Thanks for the help! > ? > > > PS: I posted this question in stackoverflow with no luck. > > > > PPS: I also sent this to r-help but they told me to try this list > > [[alternative HTML version deleted]] > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel From ignacio82 at gmail.com Mon Jul 27 21:37:25 2015 From: ignacio82 at gmail.com (Ignacio Martinez) Date: Mon, 27 Jul 2015 19:37:25 +0000 Subject: [Rd] R package with Fortran module on Windows? undefined reference to `__stack_chk_fail' In-Reply-To: References: Message-ID: Thanks, I just got the answer on stackoverflow!! On Mon, Jul 27, 2015 at 3:12 PM Kevin Ushey wrote: > You should be able to set PKG_FCFLAGS="-fno-stack-protector" when > compiling to ensure that the stack protector is not used. > > (Trying that out on a Windows VM, with a simple `R CMD build` + `R CMD > INSTALL`, compilation of your package succeeded but linking failed > saying the DLL 'Fpi' was not found; I imagine that's a separate > issue.) > > As an aside, be sure to check out what R-exts has to say on the topic > of Fortran code in R packages (especially F90 and above): > https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Using-F95-code > . > > Cheers, > Kevin > > On Mon, Jul 27, 2015 at 11:06 AM, Ignacio Martinez > wrote: > > Hi, > > > > I created a R library that uses a Fortran module. Everything works like a > > charm on linux. > > > > Now I'm trying to make it work on Windows. 
I cloned my git repository > > on a windows computer, and when I > press > > the build and reload button on Rstudio I get these errors: > > > > ==> Rcmd.exe INSTALL --no-multiarch --with-keep.source MyPi > > * installing to library > > 'C:/Users/IMartinez/Documents/R/R-3.2.1/library'* installing *source* > > package 'MyPi' ...** libs > > gfortran -m64 -shared -s -static-libgcc -o MyPi.dll tmp.def Fpi.o > > -Ld:/RCompile/r-compiling/local/local320/lib/x64 > > -Ld:/RCompile/r-compiling/local/local320/lib > > -LC:/Users/IMARTI~1/DOCUME~1/R/R-32~1.1/bin/x64 -lR > > Fpi.o: In function `__fortranpi_MOD_dboard': > > Fpi.f90:(.text+0xd7): undefined reference to `__stack_chk_fail' > > Fpi.o: In function `pi_': > > Fpi.f90:(.text+0x249): undefined reference to `__stack_chk_fail' > > collect2: ld returned 1 exit status > > no DLL was created > > ERROR: compilation failed for package 'MyPi'* removing > > 'C:/Users/IMartinez/Documents/R/R-3.2.1/library/MyPi' > > > > Exited with status 1. > > > > > > This is the Fortran code: > > > > > > Module Fortranpi > > IMPLICIT NONE > > contains > > subroutine dboard(darts, dartsscore) > > integer, intent(in) :: darts > > double precision, intent(out) :: dartsscore > > double precision :: x_coord, y_coord > > integer :: score, n > > > > score = 0 > > do n = 1, darts > > call random_number(x_coord) > > call random_number(y_coord) > > > > if ((x_coord**2 + y_coord**2) <= 1.0d0) then > > score = score + 1 > > end if > > end do > > > > dartsscore = 4.0d0*score/darts > > > > end subroutine dboard > > > > subroutine pi(avepi, DARTS, ROUNDS) bind(C, name="pi_") > > use, intrinsic :: iso_c_binding, only : > > c_double, c_int > > real(c_double), intent(out) :: avepi > > integer(c_int), intent(in) :: DARTS, ROUNDS > > integer :: MASTER, rank, i, n > > integer, allocatable :: seed(:) > > double precision :: pi_est, homepi, pirecv, > pisum > > ! we set it to zero in the sequential run > > rank = 0! initialize the random number generator! 
we make sure the > > seed is different for each task > > call random_seed() > > call random_seed(size = n) > > allocate(seed(n)) > > seed = 12 + rank*11 > > call random_seed(put=seed(1:n)) > > deallocate(seed) > > > > avepi = 0 > > do i = 0, ROUNDS-1 > > call dboard(darts, pi_est) > > ! calculate the average value of pi over all iterations > > avepi = ((avepi*i) + pi_est)/(i + 1) > > end do > > end subroutine pi > > > > end module Fortranpi > > > > > > I tried adding -fno-stack-protector > > -lssp but it did not help. > > > > I also tried doing this "by hand" > and > > I get these errors: > > > > > >> system("R CMD SHLIB -fno-stack-protector -lssp ./src/Fpi.f90") > > gfortran -m64 -shared -s -static-libgcc -o src/Fpi.dll tmp.def > > ./src/Fpi.o -fno-stack-protector -lssp > > -Ld:/RCompile/r-compiling/local/local320/lib/x64 > > -Ld:/RCompile/r-compiling/local/local320/lib > > -LC:/Users/IMARTI~1/DOCUME~1/R/R-32~1.1/bin/x64 -lR> > > dyn.load("./src/Fpi.dll") > > Error in inDL(x, as.logical(local), as.logical(now), ...) : > > unable to load shared object > 'C:/Users/IMartinez/Projects/MyPi/./src/Fpi.dll': > > LoadLibrary failure: %1 is not a valid Win32 application. > > 'C:/Users/IMartinez/Projects/MyPi/./src/Fpi.dll': > > LoadLibrary failure: %1 is not a valid Win32 application. > > > > > > Thanks for the help! > > ? > > > > > > PS: I posted this question in stackoverflow with no luck. 
> > < http://stackoverflow.com/questions/31638934/r-package-with-fortran-module-on-windows-undefined-reference-to-stack-chk-fa >
> >
> > PPS: I also sent this to r-help but they told me to try this list
> >
> > [[alternative HTML version deleted]]
> >
> > ______________________________________________
> > R-devel at r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel

[[alternative HTML version deleted]]

From jon.clayden at gmail.com Tue Jul 28 12:58:06 2015
From: jon.clayden at gmail.com (Jon Clayden)
Date: Tue, 28 Jul 2015 11:58:06 +0100
Subject: [Rd] all.equal: possible mismatch between behaviour and documentation
Message-ID: 

Dear all,

The documentation for `all.equal.numeric` says

    Numerical comparisons for 'scale = NULL' (the default) are done by
    first computing the mean absolute difference of the two numerical
    vectors. If this is smaller than 'tolerance' or not finite,
    absolute differences are used, otherwise relative differences
    scaled by the mean absolute difference.

But the actual behaviour of the function is to use relative differences if the mean value of the first argument is greater than `tolerance`:

    all.equal(0.1, 0.102, tolerance=0.01)
    # [1] "Mean relative difference: 0.02"

It seems to me that this example should produce `TRUE`, because abs(0.1-0.102) < 0.01, but it does not, because abs(0.1) > 0.01. The relevant section in the source seems to be

    what <- if (is.null(scale)) {
        xn <- mean(abs(target))
        if (is.finite(xn) && xn > tolerance) {
            xy <- xy/xn
            "relative"
        }
        else "absolute"
    }

I think `xy`, not `xn`, should be tested here.

The last line of the documentation, indicating that relative differences are "scaled by the mean absolute difference", also seems not to match the code, but in this aspect the code is surely right, i.e., the relative difference is relative to the mean value, not the mean difference.
All the best, Jon From jon.clayden at gmail.com Tue Jul 28 13:14:48 2015 From: jon.clayden at gmail.com (Jon Clayden) Date: Tue, 28 Jul 2015 12:14:48 +0100 Subject: [Rd] all.equal: possible mismatch between behaviour and documentation In-Reply-To: References: Message-ID: Sorry; minor clarification. The actual test criterion in the example I gave is of course abs((0.1-0.102)/0.1) < 0.01, not abs(0.1) < 0.01. In any case, this does not match (my reading of) the docs, and the result is not `TRUE`. Regards, Jon On 28 July 2015 at 11:58, Jon Clayden wrote: > Dear all, > > The documentation for `all.equal.numeric` says > > Numerical comparisons for ?scale = NULL? (the default) are done by > first computing the mean absolute difference of the two numerical > vectors. If this is smaller than ?tolerance? or not finite, > absolute differences are used, otherwise relative differences > scaled by the mean absolute difference. > > But the actual behaviour of the function is to use relative > differences if the mean value of the first argument is greater than > `tolerance`: > > all.equal(0.1, 0.102, tolerance=0.01) > # [1] "Mean relative difference: 0.02" > > It seems to me that this example should produce `TRUE`, because > abs(0.1-0.102) < 0.01, but it does not, because abs(0.1) > 0.01. The > relevant section in the source seems to be > > what <- if (is.null(scale)) { > xn <- mean(abs(target)) > if (is.finite(xn) && xn > tolerance) { > xy <- xy/xn > "relative" > } > else "absolute" > } > > I think `xy`, not `xn`, should be tested here. > > The last line of the documentation, indicating that relative > differences are "scaled by the mean absolute difference" also seems > not to match the code, but in this aspect the code is surely right, > i.e., the relative difference is relative to the mean value, not the > mean difference. 
> > All the best,
> > Jon

From friendly at yorku.ca Tue Jul 28 15:53:00 2015
From: friendly at yorku.ca (Michael Friendly)
Date: Tue, 28 Jul 2015 09:53:00 -0400
Subject: [Rd] Installing/updating packages on a lab network
Message-ID: <55B7893C.3010903@yorku.ca>

I'm the faculty member in my department who advises our IT staff on the details of installing R for students and faculty on our Windows 7 lab network. We are about to upgrade from R 3.1.1 to R 3.2.1, and once that is done, this version of R and all installed packages will be frozen in the image that appears in C:/Program Files/R/R-3.2.1/ on every lab computer, and this image is refreshed on each computer at each new login.

Every user has a persistent home directory across all university lab computers, typically their F:/ drive (students) and/or X:/ drive (faculty). I need to give instructions to our IT staff for how to arrange for a personal library, say
    F:/~R/library
to be set automatically, and included in .libPaths(), so that install.packages() and update.packages() will work using that path.

At present, there is no file R_HOME/etc/Renviron.site, and R_HOME/etc/Rprofile.site is the vanilla version with everything commented out, including

    # set a site library
    # .Library.site <- file.path(chartr("\\", "/", R.home()), "site-library")

Environment variables set for users include
    HOMEDRIVE=F:
    HOMEPATH=\
    HOMESHARE=\\vfadmin.yorku.yorku.ca\HH\HOME\friendly
with the latter mapped to the X: drive.

What is the code to be added to Rprofile.site to make this work?

thanks,
-Michael

--
Michael Friendly     Email: friendly AT yorku DOT ca
Professor, Psychology Dept.
& Chair, Quantitative Methods York University Voice: 416 736-2100 x66249 Fax: 416 736-5814 4700 Keele Street Web:http://www.datavis.ca Toronto, ONT M3J 1P3 CANADA From ligges at statistik.tu-dortmund.de Tue Jul 28 19:32:21 2015 From: ligges at statistik.tu-dortmund.de (Uwe Ligges) Date: Tue, 28 Jul 2015 19:32:21 +0200 Subject: [Rd] Installing/updating packages on a lab network In-Reply-To: <55B7893C.3010903@yorku.ca> References: <55B7893C.3010903@yorku.ca> Message-ID: <55B7BCA5.4080102@statistik.tu-dortmund.de> Just add the line R_LIBS_SITE=F:/R/library to the file R_HOME/etc/x64/Renviron.site Best, Uwe Ligges On 28.07.2015 15:53, Michael Friendly wrote: > I'm the faculty member in my department who advises our IT staff on the > details of installing R for > students and faculty on our Windows 7 lab network. We are about to > upgrade from R 3.1.1 to > R 3.2.1, and once that is done, this version of R and all installed > packages will be frozen in the > image that appears in C:/Program Files/R/R-3.2.1/ on every lab computer, > and this image is > refreshed on each computer at each new login. > > Every user has a persistent home directory across all university lab > computers, typically their > F:/ drive (students) and/or X:/ drive (faculty). I need to give > instructions to our IT staff for how to > arrange for a personal library, say > F:/~R/library > to be be set automatically, and included in .libPaths() so that > install.packages() and update.packages() > will work using that path. 
> > At present, there is no file R_HOME/etc/Renviron.site and > R_HOME/etc/Rprofile.site is the vanilla > version with everything commented out, including > # set a site library > # .Library.site <- file.path(chartr("\\", "/", R.home()), "site-library") > > Environmental variables set for users include > HOMEDRIVE=F: > HOMEPATH=\ > HOMESHARE=\\vfadmin.yorku.yorku.ca\HH\HOME\friendly > with the latter mapped to the X: drive > > What is the code to be added to Rprofile.site to make this work? > > thanks, > -Michael > > > From friendly at yorku.ca Wed Jul 29 17:11:10 2015 From: friendly at yorku.ca (Michael Friendly) Date: Wed, 29 Jul 2015 11:11:10 -0400 Subject: [Rd] Installing/updating packages on a lab network In-Reply-To: <55B7BCA5.4080102@statistik.tu-dortmund.de> References: <55B7893C.3010903@yorku.ca> <55B7BCA5.4080102@statistik.tu-dortmund.de> Message-ID: <55B8ED0E.6070701@yorku.ca> On 7/28/2015 1:32 PM, Uwe Ligges wrote: > Just add the line > > R_LIBS_SITE=F:/R/library > > to the file R_HOME/etc/x64/Renviron.site Thanks, Uwe I have no way to test this and our IT people who do the installation know nothing of R, so follow-up questions: * There is no R_HOME/etc/x64/Renviron.site file, but there is a R_HOME/etc/Rprofile.site I can modify and ask them to install. Would the equivalent in Rprofile.site be Sys.setenv(R_LIBS_SITE="F:/R/library") * Does this require that the R_LIBS_SITE folder exist and be writable by the user? Here is what I see in the current version: .libPaths(), the library trees where R looks for packages gives: [1] "G:/uitapps/HEBB_R_LIB2/library" [2] "C:/Program Files/R/R/R-3.1.1/library" If I try to install a new package, e.g., install.packages("rockchalk") I get "G:/uitapps/HEBB_R_LIB2/library is not writeable. Use a personal library instead?", followed by a popup window, "Create ~R/library? (y/n)" This then installs the package, but to "C:/Users/friendly/Desktop/~R/library". 
In the current login session, .libPaths() will then have this path, so in a new R session, library(rockchalk) will work.

However, once I logout, all that disappears, so I have to start over again to install or update a package.

--
Michael Friendly     Email: friendly AT yorku DOT ca
Professor, Psychology Dept. & Chair, Quantitative Methods
York University      Voice: 416 736-2100 x66249 Fax: 416 736-5814
4700 Keele Street    Web: http://www.datavis.ca
Toronto, ONT M3J 1P3 CANADA

From ligges at statistik.tu-dortmund.de Wed Jul 29 17:30:01 2015
From: ligges at statistik.tu-dortmund.de (Uwe Ligges)
Date: Wed, 29 Jul 2015 17:30:01 +0200
Subject: [Rd] Installing/updating packages on a lab network
In-Reply-To: <55B8ED0E.6070701@yorku.ca>
References: <55B7893C.3010903@yorku.ca> <55B7BCA5.4080102@statistik.tu-dortmund.de> <55B8ED0E.6070701@yorku.ca>
Message-ID: <55B8F179.7080704@statistik.tu-dortmund.de>

On 29.07.2015 17:11, Michael Friendly wrote:
> On 7/28/2015 1:32 PM, Uwe Ligges wrote:
>> Just add the line
>>
>> R_LIBS_SITE=F:/R/library
>>
>> to the file R_HOME/etc/x64/Renviron.site
> Thanks, Uwe
>
> I have no way to test this and our IT people who do the installation
> know nothing of R, so follow-up questions:
>
> * There is no R_HOME/etc/x64/Renviron.site file,

Just create it.

> but there is a
> R_HOME/etc/Rprofile.site I can modify and
> ask them to install. Would the equivalent in Rprofile.site be
> Sys.setenv(R_LIBS_SITE="F:/R/library")

It may be too late there, not sure.

> * Does this require that the R_LIBS_SITE folder exist and be writable by
> the user?

Well, you can add several paths there, separated by ";"; I'd put a standard path for a per-user library first that is writable by the user.
> > Here is what I see in the current version: > > .libPaths(), the library trees where R looks for packages gives: > [1] "G:/uitapps/HEBB_R_LIB2/library" > [2] "C:/Program Files/R/R/R-3.1.1/library" > > If I try to install a new package, e.g., > install.packages("rockchalk") > > I get "G:/uitapps/HEBB_R_LIB2/library is not writeable. Use a > personal library instead?", followed by a popup window, > "Create ~R/library? (y/n)" Unless you have a writeable library in the first place as mentioned above. > > This then installs the package, but to > "C:/Users/friendly/Desktop/~R/library". In the current login session, > .libPaths() will > then have this path, so in a new R session, load(rockchalk) will work. > > However, once I logout, all that disappears, so I have to start over > again to install or update a package. That depends on your setup. On a non-tweaked Windows machine that directory should be persistent. Best, Uwe From james.f.hester at gmail.com Wed Jul 29 18:13:57 2015 From: james.f.hester at gmail.com (Jim Hester) Date: Wed, 29 Jul 2015 12:13:57 -0400 Subject: [Rd] Mapping parse tree elements to tokens Message-ID: I would like to map the parsed tokens obtained from utils::getParseData() to the parse tree and elements obtained by base::parse(). It looks like back when this code was in the parser package the parse() function annotated the elements in the tree with their id, which would allow you to perform this mapping. However, when the code was included in R this functionality was removed. ?getParseData states The 'id' values are not attached to the elements of the parse tree, they are only retained in the table returned by 'getParseData'. Is there another way you can map between the getParseData() tokens and elements of the parse tree that makes this additional annotation unnecessary? Or is this simply not possible?
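The gap Jim describes is easy to demonstrate (a short R sketch; the comments state expectations under R 3.x rather than captured output):

```r
p  <- parse(text = "x + 1", keep.source = TRUE)
pd <- utils::getParseData(p)       # table with id and parent columns
pd[, c("id", "parent", "token", "text")]
attributes(p[[1]])                 # the call itself carries no id attribute
```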
[[alternative HTML version deleted]] From murdoch.duncan at gmail.com Wed Jul 29 18:43:29 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Wed, 29 Jul 2015 12:43:29 -0400 Subject: [Rd] Mapping parse tree elements to tokens In-Reply-To: References: Message-ID: <55B902B1.1090204@gmail.com> On 29/07/2015 12:13 PM, Jim Hester wrote: > I would like to map the parsed tokens obtained from utils::getParseData() > to the parse tree and elements obtained by base::parse(). > > It looks like back when this code was in the parser package the parse() > function annotated the elements in the tree with their id, which would > allow you to perform this mapping. However when the code was included in R > this functionality was removed. Yes, not all elements of the parse tree can legally have attributes attached. > > ?getParseData states > The 'id' values are not attached to the elements of the parse > tree, they are only retained in the table returned by > 'getParseData'. > > Is there another way you can map between the getParseData() tokens and > elements of the parse tree that makes this additional annotation > unnecessary? Or is this simply not possible? I think you can't get to it, though you can get close by looking at the id & parent values in the table. For example,

code <- "x + (y + 1)"
p <- parse(text = code)
getParseData(p)

   line1 col1 line2 col2 id parent     token terminal text
15     1    1     1   11 15      0      expr    FALSE
1      1    1     1    1  1      3    SYMBOL     TRUE    x
3      1    1     1    1  3     15      expr    FALSE
2      1    3     1    3  2     15       '+'     TRUE    +
13     1    5     1   11 13     15      expr    FALSE
4      1    5     1    5  4     13       '('     TRUE    (
11     1    6     1   10 11     13      expr    FALSE
5      1    6     1    6  5      7    SYMBOL     TRUE    y
7      1    6     1    6  7     11      expr    FALSE
6      1    8     1    8  6     11       '+'     TRUE    +
8      1   10     1   10  8      9 NUM_CONST     TRUE    1
9      1   10     1   10  9     11      expr    FALSE
10     1   11     1   11 10     13       ')'     TRUE    )

Now p is an expression, with the parse tree in p[[1]]. From the table, we can see that the root node has id 15, and 3 nodes have that as a parent. Those would be p[[c(1,1)]], p[[c(1,2)]], p[[c(1,3)]].
The tricky part is the re-ordering: those correspond to `+`, x, and (y+1) respectively, not the order they appear in the original source or in the table. Generally the function call appears first in the parse tree, but I'm not sure you could always recognize which is the function call by looking at the table. Duncan Murdoch From friendly at yorku.ca Wed Jul 29 18:55:28 2015 From: friendly at yorku.ca (Michael Friendly) Date: Wed, 29 Jul 2015 12:55:28 -0400 Subject: [Rd] update.packages(checkBuilt=TRUE, ask=FALSE): possible bug In-Reply-To: <21FF8325-EE50-480E-9729-E0440D6587EE@gmail.com> References: <21FF8325-EE50-480E-9729-E0440D6587EE@gmail.com> Message-ID: <55B90580.4010207@yorku.ca> On 7/19/2015 3:50 AM, peter dalgaard wrote: >> For some, but not all repositories I get the error message below: >> >Error in install.packages(update[instlib == l, "Package"], l, contriburl = >> >contriburl, : >> > specifying 'contriburl' or 'available' requires a single type, not type = >> >"both" >> > >> >Is it a bug? I think that what Jose is referring to is this error message from install.packages(), even though he set options(pkgType='binary') Isn't that a bug? Moreover, in a fresh R 3.2.1 session [Platform: x86_64-w64-mingw32/x64 (64-bit)], I see > getOption("pkgType") [1] "both" I think this is somehow related to the fact that install.packages() will now tell you if a more current source package exists and ask if you want to install from source. But this would require RTools to be installed to work. IMHO, this is a major infelicity or design flaw that will cause problems for naive and even experienced users.
install.packages("foo") should always work unless package "foo" cannot be found in getOption("repos") -Michael From murdoch.duncan at gmail.com Wed Jul 29 19:46:43 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Wed, 29 Jul 2015 13:46:43 -0400 Subject: [Rd] update.packages(checkBuilt=TRUE, ask=FALSE): possible bug In-Reply-To: <55B90580.4010207@yorku.ca> References: <21FF8325-EE50-480E-9729-E0440D6587EE@gmail.com> <55B90580.4010207@yorku.ca> Message-ID: <55B91183.6000306@gmail.com> On 29/07/2015 12:55 PM, Michael Friendly wrote: > On 7/19/2015 3:50 AM, peter dalgaard wrote: > >> For some, but not all repositories I get the error message below: > >> >Error in install.packages(update[instlib == l, "Package"], l, contriburl = > >> >contriburl, : > >> > specifying 'contriburl' or 'available' requires a single type, not type = > >> >"both" > >> > > >> >Is it a bug? > > I think that what Jose is referring to is this error message from > install.packages(), even though he set > options(pkgType='binary') > Isn't that a bug? > > Moreover, in a fresh R 3.2.1 session [Platform: x86_64-w64-mingw32/x64 > (64-bit)], I see > > getOption("pkgType") > [1] "both" > > I think this is somehow related to the fact that install.packages() will > now tell you if a more current source package exists and ask if you want > to install from source. But this would require RTools to > be installed to work. > > IMHO, this is a major infelicity or design flaw that will cause > problems for naive and even experienced users. install.packages("foo") > should always work unless package "foo" cannot > be found in getOption("repos") I'd like to see a reproducible example before I declare it to be a bug that needs fixing. Do note the news item in R-patched (soon to be R 3.2.2): - download.packages() failed for type equal to either "both" or "binary". (Reported by Dan Tenenbaum.)
install.packages() can call download.packages(), so this might have been fixed now. Duncan Murdoch From lawrence.michael at gene.com Wed Jul 29 20:30:57 2015 From: lawrence.michael at gene.com (Michael Lawrence) Date: Wed, 29 Jul 2015 11:30:57 -0700 Subject: [Rd] Mapping parse tree elements to tokens In-Reply-To: <55B902B1.1090204@gmail.com> References: <55B902B1.1090204@gmail.com> Message-ID: Probably need a generic tree based on "ParseNode" objects that associate the line information with the symbol (for leaf nodes). As Duncan notes, it should be possible to gather that from the table. But it would be nice if there was an "expr" column in the parse data column in addition to "text". It would contain the parsed object. Otherwise, to use the table, one is often reparsing the text, which just seems redundant and inconvenient. Michael On Wed, Jul 29, 2015 at 9:43 AM, Duncan Murdoch wrote: > On 29/07/2015 12:13 PM, Jim Hester wrote: >> >> I would like to map the parsed tokens obtained from utils::getParseData() >> to the parse tree and elements obtained by base::parse(). >> >> It looks like back when this code was in the parser package the parse() >> function annotated the elements in the tree with their id, which would >> allow you to perform this mapping. However when the code was included in >> R >> this functionality was removed. > > > Yes, not all elements of the parse tree can legally have attributes > attached. >> >> >> ?getParseData states >> The ?id? values are not attached to the elements of the parse >> tree, they are only retained in the table returned by >> ?getParseData?. >> >> Is there another way you can map between the getParseData() tokens and >> elements of the parse tree that makes this additional annotation >> unnecessary? Or is this simply not possible? > > > I think you can't get to it, though you can get close by looking at the id & > parent values in the table. 
For example, > > code <- "x + (y + 1)" > p <- parse(text=code) > > getParseData(p) > line1 col1 line2 col2 id parent token terminal text > 15 1 1 1 11 15 0 expr FALSE > 1 1 1 1 1 1 3 SYMBOL TRUE x > 3 1 1 1 1 3 15 expr FALSE > 2 1 3 1 3 2 15 '+' TRUE + > 13 1 5 1 11 13 15 expr FALSE > 4 1 5 1 5 4 13 '(' TRUE ( > 11 1 6 1 10 11 13 expr FALSE > 5 1 6 1 6 5 7 SYMBOL TRUE y > 7 1 6 1 6 7 11 expr FALSE > 6 1 8 1 8 6 11 '+' TRUE + > 8 1 10 1 10 8 9 NUM_CONST TRUE 1 > 9 1 10 1 10 9 11 expr FALSE > 10 1 11 1 11 10 13 ')' TRUE ) > > > Now p is an expression, with the parse tree in p[[1]]. From the table, we > can see that the root node has id 15, and 3 nodes have that as a parent. > Those would be p[[c(1,1)]], p[[c(1,2)]], p[[c(1,3)]]. The tricky part is > the re-ordering: those correspond to `+`, x, and (y+1) respectively, not > the order they appear in the original source or in the table. Generally the > function call appears first in the parse tree, but I'm not sure you could > always recognize which is the function call by looking at the table. > > Duncan Murdoch > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel From murdoch.duncan at gmail.com Wed Jul 29 20:47:40 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Wed, 29 Jul 2015 14:47:40 -0400 Subject: [Rd] Mapping parse tree elements to tokens In-Reply-To: References: <55B902B1.1090204@gmail.com> Message-ID: <55B91FCC.6040201@gmail.com> On 29/07/2015 2:30 PM, Michael Lawrence wrote: > Probably need a generic tree based on "ParseNode" objects that > associate the line information with the symbol (for leaf nodes). As > Duncan notes, it should be possible to gather that from the table. > > But it would be nice if there was an "expr" column in the parse data > column in addition to "text". It would contain the parsed object. 
> Otherwise, to use the table, one is often reparsing the text, which > just seems redundant and inconvenient. Can you (both Jim and Michael) describe the uses you might have for this? There are lots of possible changes that could make this information available: - attach to each item in the parse tree, as the parser package did. (Bad idea for general use which is why I dropped it, but it could be done as a special option to parse, if you aren't planning to evaluate the expression.) - give the index into the parse tree of each item, i.e. c(1,1), c(1,2), c(1,3) in the example below, or just 1,2,3 along with a function to reconstruct the full path. - give a copy of the branch of the parse tree, as Michael suggests. etc. Which is best for your purposes? Duncan Murdoch > > Michael > > On Wed, Jul 29, 2015 at 9:43 AM, Duncan Murdoch > wrote: > > On 29/07/2015 12:13 PM, Jim Hester wrote: > >> > >> I would like to map the parsed tokens obtained from utils::getParseData() > >> to the parse tree and elements obtained by base::parse(). > >> > >> It looks like back when this code was in the parser package the parse() > >> function annotated the elements in the tree with their id, which would > >> allow you to perform this mapping. However when the code was included in > >> R > >> this functionality was removed. > > > > > > Yes, not all elements of the parse tree can legally have attributes > > attached. > >> > >> > >> ?getParseData states > >> The ?id? values are not attached to the elements of the parse > >> tree, they are only retained in the table returned by > >> ?getParseData?. > >> > >> Is there another way you can map between the getParseData() tokens and > >> elements of the parse tree that makes this additional annotation > >> unnecessary? Or is this simply not possible? > > > > > > I think you can't get to it, though you can get close by looking at the id & > > parent values in the table. 
For example, > > > > code <- "x + (y + 1)" > > p <- parse(text=code) > > > > getParseData(p) > > line1 col1 line2 col2 id parent token terminal text > > 15 1 1 1 11 15 0 expr FALSE > > 1 1 1 1 1 1 3 SYMBOL TRUE x > > 3 1 1 1 1 3 15 expr FALSE > > 2 1 3 1 3 2 15 '+' TRUE + > > 13 1 5 1 11 13 15 expr FALSE > > 4 1 5 1 5 4 13 '(' TRUE ( > > 11 1 6 1 10 11 13 expr FALSE > > 5 1 6 1 6 5 7 SYMBOL TRUE y > > 7 1 6 1 6 7 11 expr FALSE > > 6 1 8 1 8 6 11 '+' TRUE + > > 8 1 10 1 10 8 9 NUM_CONST TRUE 1 > > 9 1 10 1 10 9 11 expr FALSE > > 10 1 11 1 11 10 13 ')' TRUE ) > > > > > > Now p is an expression, with the parse tree in p[[1]]. From the table, we > > can see that the root node has id 15, and 3 nodes have that as a parent. > > Those would be p[[c(1,1)]], p[[c(1,2)]], p[[c(1,3)]]. The tricky part is > > the re-ordering: those correspond to `+`, x, and (y+1) respectively, not > > the order they appear in the original source or in the table. Generally the > > function call appears first in the parse tree, but I'm not sure you could > > always recognize which is the function call by looking at the table. > > > > Duncan Murdoch > > > > ______________________________________________ > > R-devel at r-project.org mailing list > > https://stat.ethz.ch/mailman/listinfo/r-devel From lawrence.michael at gene.com Wed Jul 29 21:06:57 2015 From: lawrence.michael at gene.com (Michael Lawrence) Date: Wed, 29 Jul 2015 12:06:57 -0700 Subject: [Rd] Mapping parse tree elements to tokens In-Reply-To: <55B91FCC.6040201@gmail.com> References: <55B902B1.1090204@gmail.com> <55B91FCC.6040201@gmail.com> Message-ID: I have two use cases in mind: 1) Code indexing/searching, where the table gets me almost all of the way there, except I ask for all of the text (including the calls) and then parse that, because it's nice to get back an actual code object when you are searching code (in addition to where the code lives). The extra parsing step is just a minor inconvenience. 
2) Code analysis, which I'm pretty sure is also Jim's use case, where the analysis is implemented most easily as a parse tree traversal, while you also want to point back to the original source location. Here's where one would want a reference from parse node to location. So neither of those involves code evaluation at first glance, though I guess one could use some sort of evaluation during analysis. On Wed, Jul 29, 2015 at 11:47 AM, Duncan Murdoch wrote: > On 29/07/2015 2:30 PM, Michael Lawrence wrote: >> >> Probably need a generic tree based on "ParseNode" objects that >> associate the line information with the symbol (for leaf nodes). As >> Duncan notes, it should be possible to gather that from the table. >> >> But it would be nice if there was an "expr" column in the parse data >> column in addition to "text". It would contain the parsed object. >> Otherwise, to use the table, one is often reparsing the text, which >> just seems redundant and inconvenient. > > > Can you (both Jim and Michael) describe the uses you might have for this? > There are lots of possible changes that could make this information > available: > > - attach to each item in the parse tree, as the parser package did. (Bad > idea for general use which is why I dropped it, but > it could be done as a special option to parse, if you aren't planning to > evaluate the expression.) > - give the index into the parse tree of each item, i.e. c(1,1), c(1,2), > c(1,3) in the example below, or just 1,2,3 along with a function to > reconstruct the full path. > - give a copy of the branch of the parse tree, as Michael suggests. > > etc. Which is best for your purposes? > > Duncan Murdoch > >> >> Michael >> >> On Wed, Jul 29, 2015 at 9:43 AM, Duncan Murdoch >> wrote: >> > On 29/07/2015 12:13 PM, Jim Hester wrote: >> >> >> >> I would like to map the parsed tokens obtained from >> >> utils::getParseData() >> >> to the parse tree and elements obtained by base::parse(). 
>> >> >> >> It looks like back when this code was in the parser package the parse() >> >> function annotated the elements in the tree with their id, which would >> >> allow you to perform this mapping. However when the code was included >> >> in >> >> R >> >> this functionality was removed. >> > >> > >> > Yes, not all elements of the parse tree can legally have attributes >> > attached. >> >> >> >> >> >> ?getParseData states >> >> The ?id? values are not attached to the elements of the parse >> >> tree, they are only retained in the table returned by >> >> ?getParseData?. >> >> >> >> Is there another way you can map between the getParseData() tokens and >> >> elements of the parse tree that makes this additional annotation >> >> unnecessary? Or is this simply not possible? >> > >> > >> > I think you can't get to it, though you can get close by looking at the >> > id & >> > parent values in the table. For example, >> > >> > code <- "x + (y + 1)" >> > p <- parse(text=code) >> > >> > getParseData(p) >> > line1 col1 line2 col2 id parent token terminal text >> > 15 1 1 1 11 15 0 expr FALSE >> > 1 1 1 1 1 1 3 SYMBOL TRUE x >> > 3 1 1 1 1 3 15 expr FALSE >> > 2 1 3 1 3 2 15 '+' TRUE + >> > 13 1 5 1 11 13 15 expr FALSE >> > 4 1 5 1 5 4 13 '(' TRUE ( >> > 11 1 6 1 10 11 13 expr FALSE >> > 5 1 6 1 6 5 7 SYMBOL TRUE y >> > 7 1 6 1 6 7 11 expr FALSE >> > 6 1 8 1 8 6 11 '+' TRUE + >> > 8 1 10 1 10 8 9 NUM_CONST TRUE 1 >> > 9 1 10 1 10 9 11 expr FALSE >> > 10 1 11 1 11 10 13 ')' TRUE ) >> > >> > >> > Now p is an expression, with the parse tree in p[[1]]. From the table, >> > we >> > can see that the root node has id 15, and 3 nodes have that as a parent. >> > Those would be p[[c(1,1)]], p[[c(1,2)]], p[[c(1,3)]]. The tricky part >> > is >> > the re-ordering: those correspond to `+`, x, and (y+1) respectively, >> > not >> > the order they appear in the original source or in the table. 
Generally >> > the >> > function call appears first in the parse tree, but I'm not sure you >> > could >> > always recognize which is the function call by looking at the table. >> > >> > Duncan Murdoch >> > >> > ______________________________________________ >> > R-devel at r-project.org mailing list >> > https://stat.ethz.ch/mailman/listinfo/r-devel From luke-tierney at uiowa.edu Wed Jul 29 21:15:51 2015 From: luke-tierney at uiowa.edu (luke-tierney at uiowa.edu) Date: Wed, 29 Jul 2015 14:15:51 -0500 Subject: [Rd] Mapping parse tree elements to tokens In-Reply-To: References: <55B902B1.1090204@gmail.com> <55B91FCC.6040201@gmail.com> Message-ID: Both codetools and compiler get by without this. codetools uses source refs to generate messages; I don't recall if compiler does but it could easily do so. I would be wary about committing to this sort of implementation-specific stuff -- we might want to go to completely different parser technology at some point, which would be harder if we committed to these sorts of details. Best, luke On Wed, 29 Jul 2015, Michael Lawrence wrote: > I have two use cases in mind: > > 1) Code indexing/searching, where the table gets me almost all of the > way there, except I ask for all of the text (including the calls) and > then parse that, because it's nice to get back an actual code object > when you are searching code (in addition to where the code lives). The > extra parsing step is just a minor inconvenience. > > 2) Code analysis, which I'm pretty sure is also Jim's use case, where > the analysis is implemented most easily as a parse tree traversal, > while you also want to point back to the original source location. > Here's where one would want a reference from parse node to location. > > So neither of those involves code evaluation at first glance, though I > guess one could use some sort of evaluation during analysis.
> > On Wed, Jul 29, 2015 at 11:47 AM, Duncan Murdoch > wrote: >> On 29/07/2015 2:30 PM, Michael Lawrence wrote: >>> >>> Probably need a generic tree based on "ParseNode" objects that >>> associate the line information with the symbol (for leaf nodes). As >>> Duncan notes, it should be possible to gather that from the table. >>> >>> But it would be nice if there was an "expr" column in the parse data >>> column in addition to "text". It would contain the parsed object. >>> Otherwise, to use the table, one is often reparsing the text, which >>> just seems redundant and inconvenient. >> >> >> Can you (both Jim and Michael) describe the uses you might have for this? >> There are lots of possible changes that could make this information >> available: >> >> - attach to each item in the parse tree, as the parser package did. (Bad >> idea for general use which is why I dropped it, but >> it could be done as a special option to parse, if you aren't planning to >> evaluate the expression.) >> - give the index into the parse tree of each item, i.e. c(1,1), c(1,2), >> c(1,3) in the example below, or just 1,2,3 along with a function to >> reconstruct the full path. >> - give a copy of the branch of the parse tree, as Michael suggests. >> >> etc. Which is best for your purposes? >> >> Duncan Murdoch >> >>> >>> Michael >>> >>> On Wed, Jul 29, 2015 at 9:43 AM, Duncan Murdoch >>> wrote: >>> > On 29/07/2015 12:13 PM, Jim Hester wrote: >>> >> >>> >> I would like to map the parsed tokens obtained from >>> >> utils::getParseData() >>> >> to the parse tree and elements obtained by base::parse(). >>> >> >>> >> It looks like back when this code was in the parser package the parse() >>> >> function annotated the elements in the tree with their id, which would >>> >> allow you to perform this mapping. However when the code was included >>> >> in >>> >> R >>> >> this functionality was removed. 
>>> > >>> > >>> > Yes, not all elements of the parse tree can legally have attributes >>> > attached. >>> >> >>> >> >>> >> ?getParseData states >>> >> The ?id? values are not attached to the elements of the parse >>> >> tree, they are only retained in the table returned by >>> >> ?getParseData?. >>> >> >>> >> Is there another way you can map between the getParseData() tokens and >>> >> elements of the parse tree that makes this additional annotation >>> >> unnecessary? Or is this simply not possible? >>> > >>> > >>> > I think you can't get to it, though you can get close by looking at the >>> > id & >>> > parent values in the table. For example, >>> > >>> > code <- "x + (y + 1)" >>> > p <- parse(text=code) >>> > >>> > getParseData(p) >>> > line1 col1 line2 col2 id parent token terminal text >>> > 15 1 1 1 11 15 0 expr FALSE >>> > 1 1 1 1 1 1 3 SYMBOL TRUE x >>> > 3 1 1 1 1 3 15 expr FALSE >>> > 2 1 3 1 3 2 15 '+' TRUE + >>> > 13 1 5 1 11 13 15 expr FALSE >>> > 4 1 5 1 5 4 13 '(' TRUE ( >>> > 11 1 6 1 10 11 13 expr FALSE >>> > 5 1 6 1 6 5 7 SYMBOL TRUE y >>> > 7 1 6 1 6 7 11 expr FALSE >>> > 6 1 8 1 8 6 11 '+' TRUE + >>> > 8 1 10 1 10 8 9 NUM_CONST TRUE 1 >>> > 9 1 10 1 10 9 11 expr FALSE >>> > 10 1 11 1 11 10 13 ')' TRUE ) >>> > >>> > >>> > Now p is an expression, with the parse tree in p[[1]]. From the table, >>> > we >>> > can see that the root node has id 15, and 3 nodes have that as a parent. >>> > Those would be p[[c(1,1)]], p[[c(1,2)]], p[[c(1,3)]]. The tricky part >>> > is >>> > the re-ordering: those correspond to `+`, x, and (y+1) respectively, >>> > not >>> > the order they appear in the original source or in the table. Generally >>> > the >>> > function call appears first in the parse tree, but I'm not sure you >>> > could >>> > always recognize which is the function call by looking at the table. 
>>> > >>> > Duncan Murdoch >>> > >>> > ______________________________________________ >>> > R-devel at r-project.org mailing list >>> > https://stat.ethz.ch/mailman/listinfo/r-devel >> >> > > ______________________________________________ > R-devel at r-project.org mailing list > https://stat.ethz.ch/mailman/listinfo/r-devel -- Luke Tierney Ralph E. Wareham Professor of Mathematical Sciences University of Iowa Phone: 319-335-3386 Department of Statistics and Fax: 319-335-3017 Actuarial Science 241 Schaeffer Hall email: luke-tierney at uiowa.edu Iowa City, IA 52242 WWW: http://www.stat.uiowa.edu From lawrence.michael at gene.com Wed Jul 29 22:10:50 2015 From: lawrence.michael at gene.com (Michael Lawrence) Date: Wed, 29 Jul 2015 13:10:50 -0700 Subject: [Rd] Mapping parse tree elements to tokens In-Reply-To: References: <55B902B1.1090204@gmail.com> <55B91FCC.6040201@gmail.com> Message-ID: I agree that we don't want to depend on implementation details. Some sort of abstraction that is higher resolution than srcrefs would be nice. Right now, it would be inconvenient using srcrefs to get to the exact column range of a symbol, for example, but an IDE wants that to highlight the symbol. Maybe looking at how other parsers represent this information would be helpful. On Wed, Jul 29, 2015 at 12:15 PM, wrote: > Both codetools and compiler get by without this. codetools uses source > refs to generate messages; I don't recall if compiler does but it > could easily do so. I would be wary about committing to this sort of > implementation specific stuff -- we might want to go to completely > different parser technology at tome point, which would be harder if we > committed to these sort of details. 
> > Best, > > luke > > On Wed, 29 Jul 2015, Michael Lawrence wrote: > >> I have two use cases in mind: >> >> 1) Code indexing/searching, where the table gets me almost all of the >> way there, except I ask for all of the text (including the calls) and >> then parse that, because it's nice to get back an actual code object >> when you are searching code (in addition to where the code lives). The >> extra parsing step is just a minor inconvenience. >> >> 2) Code analysis, which I'm pretty sure is also Jim's use case, where >> the analysis is implemented most easily as a parse tree traversal, >> while you also want to point back to the original source location. >> Here's where one would want a reference from parse node to location. >> >> So neither of those involves code evaluation at first glance, though I >> guess one could use some sort of evaluation during analysis. >> >> On Wed, Jul 29, 2015 at 11:47 AM, Duncan Murdoch >> wrote: >>> >>> On 29/07/2015 2:30 PM, Michael Lawrence wrote: >>>> >>>> >>>> Probably need a generic tree based on "ParseNode" objects that >>>> associate the line information with the symbol (for leaf nodes). As >>>> Duncan notes, it should be possible to gather that from the table. >>>> >>>> But it would be nice if there was an "expr" column in the parse data >>>> column in addition to "text". It would contain the parsed object. >>>> Otherwise, to use the table, one is often reparsing the text, which >>>> just seems redundant and inconvenient. >>> >>> >>> >>> Can you (both Jim and Michael) describe the uses you might have for this? >>> There are lots of possible changes that could make this information >>> available: >>> >>> - attach to each item in the parse tree, as the parser package did. >>> (Bad >>> idea for general use which is why I dropped it, but >>> it could be done as a special option to parse, if you aren't planning to >>> evaluate the expression.) >>> - give the index into the parse tree of each item, i.e. 
c(1,1), c(1,2), >>> c(1,3) in the example below, or just 1,2,3 along with a function to >>> reconstruct the full path. >>> - give a copy of the branch of the parse tree, as Michael suggests. >>> >>> etc. Which is best for your purposes? >>> >>> Duncan Murdoch >>> >>>> >>>> Michael >>>> >>>> On Wed, Jul 29, 2015 at 9:43 AM, Duncan Murdoch >>>> wrote: >>>> > On 29/07/2015 12:13 PM, Jim Hester wrote: >>>> >> >>>> >> I would like to map the parsed tokens obtained from >>>> >> utils::getParseData() >>>> >> to the parse tree and elements obtained by base::parse(). >>>> >> >>>> >> It looks like back when this code was in the parser package the >>>> >> parse() >>>> >> function annotated the elements in the tree with their id, which >>>> >> would >>>> >> allow you to perform this mapping. However when the code was >>>> >> included >>>> >> in >>>> >> R >>>> >> this functionality was removed. >>>> > >>>> > >>>> > Yes, not all elements of the parse tree can legally have attributes >>>> > attached. >>>> >> >>>> >> >>>> >> ?getParseData states >>>> >> The ?id? values are not attached to the elements of the parse >>>> >> tree, they are only retained in the table returned by >>>> >> ?getParseData?. >>>> >> >>>> >> Is there another way you can map between the getParseData() tokens >>>> >> and >>>> >> elements of the parse tree that makes this additional annotation >>>> >> unnecessary? Or is this simply not possible? >>>> > >>>> > >>>> > I think you can't get to it, though you can get close by looking at >>>> > the >>>> > id & >>>> > parent values in the table. 
For example, >>>> > >>>> > code <- "x + (y + 1)" >>>> > p <- parse(text=code) >>>> > >>>> > getParseData(p) >>>> > line1 col1 line2 col2 id parent token terminal text >>>> > 15 1 1 1 11 15 0 expr FALSE >>>> > 1 1 1 1 1 1 3 SYMBOL TRUE x >>>> > 3 1 1 1 1 3 15 expr FALSE >>>> > 2 1 3 1 3 2 15 '+' TRUE + >>>> > 13 1 5 1 11 13 15 expr FALSE >>>> > 4 1 5 1 5 4 13 '(' TRUE ( >>>> > 11 1 6 1 10 11 13 expr FALSE >>>> > 5 1 6 1 6 5 7 SYMBOL TRUE y >>>> > 7 1 6 1 6 7 11 expr FALSE >>>> > 6 1 8 1 8 6 11 '+' TRUE + >>>> > 8 1 10 1 10 8 9 NUM_CONST TRUE 1 >>>> > 9 1 10 1 10 9 11 expr FALSE >>>> > 10 1 11 1 11 10 13 ')' TRUE ) >>>> > >>>> > >>>> > Now p is an expression, with the parse tree in p[[1]]. From the >>>> > table, >>>> > we >>>> > can see that the root node has id 15, and 3 nodes have that as a >>>> > parent. >>>> > Those would be p[[c(1,1)]], p[[c(1,2)]], p[[c(1,3)]]. The tricky part >>>> > is >>>> > the re-ordering: those correspond to `+`, x, and (y+1) respectively, >>>> > not >>>> > the order they appear in the original source or in the table. >>>> > Generally >>>> > the >>>> > function call appears first in the parse tree, but I'm not sure you >>>> > could >>>> > always recognize which is the function call by looking at the table. >>>> > >>>> > Duncan Murdoch >>>> > >>>> > ______________________________________________ >>>> > R-devel at r-project.org mailing list >>>> > https://stat.ethz.ch/mailman/listinfo/r-devel >>> >>> >>> >> >> ______________________________________________ >> R-devel at r-project.org mailing list >> https://stat.ethz.ch/mailman/listinfo/r-devel > > > -- > Luke Tierney > Ralph E. 
Wareham Professor of Mathematical Sciences > University of Iowa Phone: 319-335-3386 > Department of Statistics and Fax: 319-335-3017 > Actuarial Science > 241 Schaeffer Hall email: luke-tierney at uiowa.edu > Iowa City, IA 52242 WWW: http://www.stat.uiowa.edu From james.f.hester at gmail.com Wed Jul 29 23:33:11 2015 From: james.f.hester at gmail.com (Jim Hester) Date: Wed, 29 Jul 2015 17:33:11 -0400 Subject: [Rd] Mapping parse tree elements to tokens In-Reply-To: <55B91FCC.6040201@gmail.com> References: <55B902B1.1090204@gmail.com> <55B91FCC.6040201@gmail.com> Message-ID: As Michael guessed, my main use case was code analysis. A concrete example where this would help is with my test code coverage tool covr. There is currently a bug when tracking coverage for if / else statements when the clauses do not contain brackets (https://github.com/jimhester/covr/issues/39). Because only one source reference is generated in this case (because it is parsed as a single expression), it is not possible to track each of the clauses separately. While I can get the source reference for the entire statement, in order to extract the if/else clauses I need to either use the tokenized information from getParseData(), or re-parse the entire if / else expression by hand (which seems prone to error to me). Another example of where this would help is linking comments to expressions. While I know this topic has been discussed previously ( https://stat.ethz.ch/pipermail/r-devel/2009-March/052731.html) and I am fine with the default parser dropping comments, having the ability to map the more detailed tokens back to the parse tree would allow the comments to be annotated to their closest expression. Of the three options you propose I think simply supplying the index as an additional column from the getParseData() output would be the most straightforward to implement and use.
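Jim's covr case can be made concrete. An unbraced if/else parses as a single expression, so it carries a single srcref, yet getParseData() still exposes each clause's tokens with their own column ranges (an R sketch; comments describe expected behavior, not captured output):

```r
src <- "if (x > 0) a else b"
p   <- parse(text = src, keep.source = TRUE)
length(p)                        # one expression, hence one srcref
pd  <- utils::getParseData(p)
pd[pd$terminal, c("col1", "col2", "token", "text")]
## The tokens for the 'a' and 'b' clauses appear with distinct column
## ranges -- the location detail covr would need to track each clause.
```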
While it is true that you can get most of the way there with the current source references as Michael mentions in some cases having more fine grained location information is useful and there is no great way to get there currently without re-parsing the full expressions from the source reference. The current getParseData output is already very implementation specific so I don't think it would be a great additional support burden to add the indexing information. Likely the whole function would have to be removed if a different parsing method was used. Regardless I am glad others have shown some interest in this issue, thank you for taking the time to read and respond! Jim On Wed, Jul 29, 2015 at 2:47 PM, Duncan Murdoch wrote: > On 29/07/2015 2:30 PM, Michael Lawrence wrote: > >> Probably need a generic tree based on "ParseNode" objects that >> associate the line information with the symbol (for leaf nodes). As >> Duncan notes, it should be possible to gather that from the table. >> >> But it would be nice if there was an "expr" column in the parse data >> column in addition to "text". It would contain the parsed object. >> Otherwise, to use the table, one is often reparsing the text, which >> just seems redundant and inconvenient. >> > > Can you (both Jim and Michael) describe the uses you might have for this? > There are lots of possible changes that could make this information > available: > > - attach to each item in the parse tree, as the parser package did. (Bad > idea for general use which is why I dropped it, but > it could be done as a special option to parse, if you aren't planning to > evaluate the expression.) > - give the index into the parse tree of each item, i.e. c(1,1), c(1,2), > c(1,3) in the example below, or just 1,2,3 along with a function to > reconstruct the full path. > - give a copy of the branch of the parse tree, as Michael suggests. > > etc. Which is best for your purposes? 
> > Duncan Murdoch > > >> Michael >> >> On Wed, Jul 29, 2015 at 9:43 AM, Duncan Murdoch >> wrote: >> > On 29/07/2015 12:13 PM, Jim Hester wrote: >> >> >> >> I would like to map the parsed tokens obtained from >> utils::getParseData() >> >> to the parse tree and elements obtained by base::parse(). >> >> >> >> It looks like back when this code was in the parser package the parse() >> >> function annotated the elements in the tree with their id, which would >> >> allow you to perform this mapping. However when the code was included >> in >> >> R >> >> this functionality was removed. >> > >> > >> > Yes, not all elements of the parse tree can legally have attributes >> > attached. >> >> >> >> >> >> ?getParseData states >> >> The ?id? values are not attached to the elements of the parse >> >> tree, they are only retained in the table returned by >> >> ?getParseData?. >> >> >> >> Is there another way you can map between the getParseData() tokens and >> >> elements of the parse tree that makes this additional annotation >> >> unnecessary? Or is this simply not possible? >> > >> > >> > I think you can't get to it, though you can get close by looking at the >> id & >> > parent values in the table. For example, >> > >> > code <- "x + (y + 1)" >> > p <- parse(text=code) >> > >> > getParseData(p) >> > line1 col1 line2 col2 id parent token terminal text >> > 15 1 1 1 11 15 0 expr FALSE >> > 1 1 1 1 1 1 3 SYMBOL TRUE x >> > 3 1 1 1 1 3 15 expr FALSE >> > 2 1 3 1 3 2 15 '+' TRUE + >> > 13 1 5 1 11 13 15 expr FALSE >> > 4 1 5 1 5 4 13 '(' TRUE ( >> > 11 1 6 1 10 11 13 expr FALSE >> > 5 1 6 1 6 5 7 SYMBOL TRUE y >> > 7 1 6 1 6 7 11 expr FALSE >> > 6 1 8 1 8 6 11 '+' TRUE + >> > 8 1 10 1 10 8 9 NUM_CONST TRUE 1 >> > 9 1 10 1 10 9 11 expr FALSE >> > 10 1 11 1 11 10 13 ')' TRUE ) >> > >> > >> > Now p is an expression, with the parse tree in p[[1]]. From the table, >> we >> > can see that the root node has id 15, and 3 nodes have that as a parent. 
>> > Those would be p[[c(1,1)]], p[[c(1,2)]], p[[c(1,3)]]. The tricky part >> is >> > the re-ordering: those correspond to `+`, x, and (y+1) respectively, >> not >> > the order they appear in the original source or in the table. >> Generally the >> > function call appears first in the parse tree, but I'm not sure you >> could >> > always recognize which is the function call by looking at the table. >> > >> > Duncan Murdoch >> > >> > ______________________________________________ >> > R-devel at r-project.org mailing list >> > https://stat.ethz.ch/mailman/listinfo/r-devel >> > > [[alternative HTML version deleted]] From murdoch.duncan at gmail.com Thu Jul 30 01:24:35 2015 From: murdoch.duncan at gmail.com (Duncan Murdoch) Date: Wed, 29 Jul 2015 19:24:35 -0400 Subject: [Rd] Mapping parse tree elements to tokens In-Reply-To: References: <55B902B1.1090204@gmail.com> <55B91FCC.6040201@gmail.com> Message-ID: <55B960B3.1@gmail.com> On 29/07/2015 5:33 PM, Jim Hester wrote: > As Michael guessed my main use cases was code analysis. A concrete > example where this would help is with my test code coverage tool covr. > There is currently a bug when tracking coverage for if / else statements > when the clauses do not contain brackets > (https://github.com/jimhester/covr/issues/39). Because only one source > reference is generated in this case (because it is parsed as a single > expression), it is not possible to track each of the clauses > separately. While I can get the source reference for the entire > statement, in order to extract the if/else clauses I need to either use > the tokenized information from getParseData(), or re-parse the entire if > / else expression by hand (which seems prone to error to me). Re-parsing may be inefficient, but it isn't error prone. Just use getSrcLines (or as.character(srcref), with a bit more work) to get the source code from the original file, and re-parse it. > > Another example of where this would help is linking comments to > expressions. 
While I know this topic has been discussed previously > (https://stat.ethz.ch/pipermail/r-devel/2009-March/052731.html) and I am > fine with the default parser dropping comments, having the ability to > map the more detailed tokens back to the parse tree would allow the > comments to be annotated to their closest expression. getParseData does record comments. > > Of the three options you propose I think simply supplying the index as > an additional column from the getParseData() output would be the most > straightforward to implement and use. > > While it is true that you can get most of the way there with the current > source references as Michael mentions in some cases having more fine > grained location information is useful and there is no great way to get > there currently without re-parsing the full expressions from the source > reference. That's duplication of effort, but it's not really that slow. > The current getParseData output is already very implementation specific > so I don't think it would be a great additional support burden to add > the indexing information. Likely the whole function would have to be > removed if a different parsing method was used. Luke may wish to comment, but I think the issue is that the parse tree is not uniquely defined by the source, there are some arbitrary decisions being made. Our parser groups things in a certain way (especially comments, which are not part of the parse tree, but are recorded nonetheless), and a different implementation would necessarily be different. The more we put into getParseData, the harder it becomes to change those arbitrary decisions without breaking other people's code. We do break things sometimes, but we don't like to do it. Duncan Murdoch > > Regardless I am glad others have shown some interest in this issue, > thank you for taking the time to read and respond! 
> > Jim > > On Wed, Jul 29, 2015 at 2:47 PM, Duncan Murdoch > > wrote: > > On 29/07/2015 2:30 PM, Michael Lawrence wrote: > > Probably need a generic tree based on "ParseNode" objects that > associate the line information with the symbol (for leaf nodes). As > Duncan notes, it should be possible to gather that from the table. > > But it would be nice if there was an "expr" column in the parse data > column in addition to "text". It would contain the parsed object. > Otherwise, to use the table, one is often reparsing the text, which > just seems redundant and inconvenient. > > > Can you (both Jim and Michael) describe the uses you might have for > this? There are lots of possible changes that could make this > information available: > > - attach to each item in the parse tree, as the parser package > did. (Bad idea for general use which is why I dropped it, but > it could be done as a special option to parse, if you aren't > planning to evaluate the expression.) > - give the index into the parse tree of each item, i.e. c(1,1), > c(1,2), c(1,3) in the example below, or just 1,2,3 along with a > function to reconstruct the full path. > - give a copy of the branch of the parse tree, as Michael suggests. > > etc. Which is best for your purposes? > > Duncan Murdoch > > > Michael > > On Wed, Jul 29, 2015 at 9:43 AM, Duncan Murdoch > > wrote: > > On 29/07/2015 12:13 PM, Jim Hester wrote: > >> > >> I would like to map the parsed tokens obtained from > utils::getParseData() > >> to the parse tree and elements obtained by base::parse(). > >> > >> It looks like back when this code was in the parser package > the parse() > >> function annotated the elements in the tree with their id, > which would > >> allow you to perform this mapping. However when the code was > included in > >> R > >> this functionality was removed. > > > > > > Yes, not all elements of the parse tree can legally have > attributes > > attached. > >> > >> > >> ?getParseData states > >> The ?id? 
values are not attached to the elements of the parse > >> tree, they are only retained in the table returned by > >> ?getParseData?. > >> > >> Is there another way you can map between the getParseData() > tokens and > >> elements of the parse tree that makes this additional annotation > >> unnecessary? Or is this simply not possible? > > > > > > I think you can't get to it, though you can get close by > looking at the id & > > parent values in the table. For example, > > > > code <- "x + (y + 1)" > > p <- parse(text=code) > > > > getParseData(p) > > line1 col1 line2 col2 id parent token terminal text > > 15 1 1 1 11 15 0 expr FALSE > > 1 1 1 1 1 1 3 SYMBOL TRUE x > > 3 1 1 1 1 3 15 expr FALSE > > 2 1 3 1 3 2 15 '+' TRUE + > > 13 1 5 1 11 13 15 expr FALSE > > 4 1 5 1 5 4 13 '(' TRUE ( > > 11 1 6 1 10 11 13 expr FALSE > > 5 1 6 1 6 5 7 SYMBOL TRUE y > > 7 1 6 1 6 7 11 expr FALSE > > 6 1 8 1 8 6 11 '+' TRUE + > > 8 1 10 1 10 8 9 NUM_CONST TRUE 1 > > 9 1 10 1 10 9 11 expr FALSE > > 10 1 11 1 11 10 13 ')' TRUE ) > > > > > > Now p is an expression, with the parse tree in p[[1]]. From > the table, we > > can see that the root node has id 15, and 3 nodes have that as > a parent. > > Those would be p[[c(1,1)]], p[[c(1,2)]], p[[c(1,3)]]. The > tricky part is > > the re-ordering: those correspond to `+`, x, and (y+1) > respectively, not > > the order they appear in the original source or in the table. > Generally the > > function call appears first in the parse tree, but I'm not > sure you could > > always recognize which is the function call by looking at the > table. 
> > > > Duncan Murdoch > > > > ______________________________________________ > > R-devel at r-project.org mailing list > > https://stat.ethz.ch/mailman/listinfo/r-devel > > > From maechler at stat.math.ethz.ch Thu Jul 30 11:39:36 2015 From: maechler at stat.math.ethz.ch (Martin Maechler) Date: Thu, 30 Jul 2015 11:39:36 +0200 Subject: [Rd] all.equal: possible mismatch between behaviour and documentation In-Reply-To: References: Message-ID: <21945.61656.918871.226701@stat.math.ethz.ch> Dear Jon, thank you for raising the issue, >>>>> Jon Clayden >>>>> on Tue, 28 Jul 2015 12:14:48 +0100 writes: > Sorry; minor clarification. The actual test criterion in the example I > gave is of course abs((0.1-0.102)/0.1) < 0.01, not abs(0.1) < 0.01. In > any case, this does not match (my reading of) the docs, and the result > is not `TRUE`. > Regards, > Jon > On 28 July 2015 at 11:58, Jon Clayden wrote: > > Dear all, > > > > The documentation for `all.equal.numeric` says > > > > Numerical comparisons for ?scale = NULL? (the default) are done by > > first computing the mean absolute difference of the two numerical > > vectors. If this is smaller than ?tolerance? or not finite, > > absolute differences are used, otherwise relative differences > > scaled by the mean absolute difference. > > > > But the actual behaviour of the function is to use relative > > differences if the mean value of the first argument is greater than > > `tolerance`: > > > > all.equal(0.1, 0.102, tolerance=0.01) > > # [1] "Mean relative difference: 0.02" > > > > It seems to me that this example should produce `TRUE`, because > > abs(0.1-0.102) < 0.01, but it does not, because abs(0.1) > 0.01. Irrespective of the documentation, the above example should continue to produce what it does now. These numbers are not close to zero (compared to tol), and so relative error should be used. 
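The branch choice is easy to see directly. A small illustration; the first call mirrors the example above, and the second pair of numbers is chosen to trigger the absolute branch:

```r
## all.equal.numeric() uses *relative* differences unless
## mean(abs(target)) is itself <= tolerance, in which case it
## falls back to absolute differences.
all.equal(0.1, 0.102, tolerance = 0.01)
## mean(abs(0.1)) = 0.1 > 0.01, so the relative difference
## |0.1 - 0.102| / 0.1 = 0.02 is reported:
## "Mean relative difference: 0.02"

all.equal(0.001, 0.003, tolerance = 0.01)
## mean(abs(0.001)) = 0.001 <= 0.01, so the absolute difference
## |0.001 - 0.003| = 0.002 < 0.01 is used, and the result is TRUE
```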
The whole idea of all.equal.numeric() is to use *relative* error/difference __unless__ that is not sensible anymore, namely when the denominator of the ratio which defines the relative error becomes too close to zero (and hence has to be seen as "unstable" / "unreliable").

The exact behavior of all.equal.numeric() has __ I'm pretty sure, but can no longer easily prove __ been inherited from the original S implementation in most parts, and (if that's correct) has been in place for about 30 years [ If not, it has "only" been in place about 17 years... ] Notably, the code below has been unchanged for a long time, and been in use in too many places to be changed now.

So it is about the *documentation* only we should discuss changing.

> > The relevant section in the source seems to be
> >
> >     what <- if (is.null(scale)) {
> >         xn <- mean(abs(target))
> >         if (is.finite(xn) && xn > tolerance) {
> >             xy <- xy/xn
> >             "relative"
> >         }
> >         else "absolute"
> >     }
> >
> > I think `xy`, not `xn`, should be tested here.

as I said above, no such change is acceptable {but I don't see *why* either }

> > The last line of the documentation, indicating that relative
> > differences are "scaled by the mean absolute difference" also seems
> > not to match the code, but in this aspect the code is surely right,
> > i.e., the relative difference is relative to the mean value, not the
> > mean difference.

Indeed... interestingly, 'target' at that point in the function contains only those values of the original target which are not NA (fine, of course) but also which are not equal to 'current'. This is a bit more surprising, but also makes sense, say, if we have two long numeric vectors which are mostly zero (and hence often both equal to zero); in other cases, however, this may be more problematic (think of a case where both vectors are mostly (exactly) equal, but then there are a few outliers...).
But in spite of such cases, as said above, I'm almost sure this won't be changed in the code.

My current conclusion would be to change only

    scaled by the mean absolute difference

to

    scaled by the mean absolute target value

or (less easy, but more precise)

    scaled by the mean absolute value of those finite entries of
    \code{target} where it differs from \code{current}

or yet another version which is both precise and understandable?

Martin Maechler,
ETH Zurich

> >
> > All the best,
> > Jon

From jon.clayden at gmail.com Thu Jul 30 14:38:37 2015
From: jon.clayden at gmail.com (Jon Clayden)
Date: Thu, 30 Jul 2015 13:38:37 +0100
Subject: [Rd] all.equal: possible mismatch between behaviour and documentation
In-Reply-To: <21945.61656.918871.226701@stat.math.ethz.ch>
References: <21945.61656.918871.226701@stat.math.ethz.ch>
Message-ID:

Dear Martin,

Thank you for following up. I appreciate that this is entrenched behaviour and that changing the documentation may be preferable to changing the code in practice, and accordingly I filed this as a documentation bug earlier today (#16493). But I don't agree that the current behaviour makes sense.

Firstly, the case where the magnitude of `tolerance` is greater than that of the test vectors must surely be pretty rare. Who wants to test whether 1 and 2 are equal with a tolerance of 10? Secondly, absolute error is (IMHO) more intuitive, and since the docs don't emphasise that the function prefers relative error, I would think that many users, like me, would expect absolute error to be used. (My assumption, which the docs do not coherently contradict, has been that absolute error is used to decide whether or not to return `TRUE`, but if the vectors are not considered equal then relative error is used in the return string.)

Finally, if the decision is about numerical precision in the comparison then comparing `xn` to `tolerance` doesn't seem sensible.
Maybe it should be something like `xn * tolerance > .Machine$double.eps`, i.e., to check whether the test criterion under relative error would be within machine precision? Note that that would make

    all.equal(0.3, 0.1+0.2, tolerance=1e-16)
    # [1] "Mean relative difference: 1.850372e-16"

test TRUE (on my system), since 0.3-(0.1+0.2) is approximately -5.6e-17 (i.e., less in magnitude than 1e-16), while 0.3*1e-16 is less than .Machine$double.eps of 2.2e-16 (so absolute error would be chosen).

However, if the code will not be changed, I think the documentation should (i) make clear that relative error is preferred where appropriate; (ii) correct the line 2 mistake where it is stated that the choice of relative or absolute error is determined by comparing mean absolute difference to `tolerance`; and (iii) correct the final line mistake where it is stated that relative errors are scaled by the difference (which you have suggested alternatives for).

All the best,
Jon

On 30 July 2015 at 10:39, Martin Maechler wrote:
> Dear Jon,
>
> thank you for raising the issue,
>
>>>>>> Jon Clayden
>>>>>>     on Tue, 28 Jul 2015 12:14:48 +0100 writes:
>
>> Sorry; minor clarification. The actual test criterion in the example I
>> gave is of course abs((0.1-0.102)/0.1) < 0.01, not abs(0.1) < 0.01. In
>> any case, this does not match (my reading of) the docs, and the result
>> is not `TRUE`.
>
>> Regards,
>> Jon
>
>> On 28 July 2015 at 11:58, Jon Clayden wrote:
>> > Dear all,
>> >
>> > The documentation for `all.equal.numeric` says
>> >
>> >     Numerical comparisons for 'scale = NULL' (the default) are done by
>> >     first computing the mean absolute difference of the two numerical
>> >     vectors. If this is smaller than 'tolerance' or not finite,
>> >     absolute differences are used, otherwise relative differences
>> >     scaled by the mean absolute difference.
>> > >> > But the actual behaviour of the function is to use relative >> > differences if the mean value of the first argument is greater than >> > `tolerance`: >> > >> > all.equal(0.1, 0.102, tolerance=0.01) >> > # [1] "Mean relative difference: 0.02" >> > >> > It seems to me that this example should produce `TRUE`, because >> > abs(0.1-0.102) < 0.01, but it does not, because abs(0.1) > 0.01. > > Irrespective of the documentation, > the above example should continue to produce what it does now. > These numbers are not close to zero (compared to tol), and so > relative error should be used. > > The whole idea of all.equal.numeric() is to use *relative* error/difference > __unless__ that is not sensible anymore, namely when the > denominator of the ratio which defines the relative error > becomes too close to zero (and hence has to be seen as > "unstable" / "unreliable"). > > The exact behavior of all.equal.numeric() has __ I'm pretty sure, but > can no longer easily prove __ been inherited from the original S > implementation in most parts, and (if that's correct) has been > in place for about 30 years [ If not, it has "only" been in > place about 17 years... ] > notably the code below has been unchanged for a long time, and been in use > in too many places to be changed now. > > So it is about the *documentation* only we should discuss changing. > > >> > The relevant section in the source seems to be >> > >> > what <- if (is.null(scale)) { >> > xn <- mean(abs(target)) >> > if (is.finite(xn) && xn > tolerance) { >> > xy <- xy/xn >> > "relative" >> > } >> > else "absolute" >> > } >> > >> > I think `xy`, not `xn`, should be tested here. 
> > as I said above, no such change is acceptable > {but I don't see *why* either } > >> > The last line of the documentation, indicating that relative >> > differences are "scaled by the mean absolute difference" also seems >> > not to match the code, but in this aspect the code is surely right, >> > i.e., the relative difference is relative to the mean value, not the >> > mean difference. > > Indeed... interestingly, 'target' at the point in the function > is containing only those values of the original target > which are not NA (fine, of course) but also which are not equal to > 'current'. This is a bit more surprising, also does make sense, > let's say if we have too long numeric vectors which are mostly zero > (and hence often both equal to zero), but in other cases, this > may be more problematic (think of a case when also both vectors > are mostly (exactly) equal, but then there are a few > outliers...). > But in spite of such cases, as said above, I'm almost sure this > won't be changed in the code. > > My current conclusion would be to change only > > scaled by the mean absolute difference > to > scaled by the mean absolute target value > > or (less easy, but more precise) > > scaled by the mean absolute value of those finite entries of > \code{target} where it differs from \code{current} > > or yet another version which is both precise and understandable > ? > > Martin Maechler, > ETH Zurich > >> > >> > All the best, >> > Jon From profjcnash at gmail.com Fri Jul 31 14:18:51 2015 From: profjcnash at gmail.com (ProfJCNash) Date: Fri, 31 Jul 2015 08:18:51 -0400 Subject: [Rd] equality testing, was all.equal.... Message-ID: <55BB67AB.8060406@gmail.com> These issues have been around for many years. I've created some upset among some programmers by using equality tests for reals (in R doubles). However, there's a "but", and it is that I test using if ( (a + offset) == (b + offset) ) { } where offset is a number like 100.0. 
This is really "equality to a scale" defined by the offset. It also seems to inhibit those who don't know what is going on from changing a tolerance. It will, of course, be defeated by some optimizing compilers.

I started doing this on very small computers (<4K bytes for program and data) where I wanted to avoid lots of checks on whether a and b were small. Then I realized it simplified code and is suitable for most tests of equality. It may be that an all.equal.offset() function would be useful.

Best, JN
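A minimal sketch of what such a function might look like. The name, argument and default below are assumptions (note that a function literally named all.equal.offset would be dispatched as an S3 method of all.equal for objects of class "offset", so a different name is used here):

```r
## Equality "to a scale" set by an additive offset (sketch only;
## name, signature and default offset are hypothetical).
equal_to_scale <- function(a, b, offset = 100) {
  ## Adding the offset absorbs differences far below the offset's
  ## last binary digit, giving equality at the scale of the offset.
  isTRUE(all((a + offset) == (b + offset)))
}

equal_to_scale(0.3, 0.1 + 0.2)       # TRUE: both sums round to the same double
equal_to_scale(0.3, 0.1 + 0.2, 0)    # FALSE: exact comparison fails
```

The offset plays the role that `tolerance` plays in all.equal(), but expressed as an absolute scale rather than a relative one.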