[Rd] Why does the lexical analyzer drop comments?
Romain Francois
romain.francois at dbmail.com
Tue Mar 31 16:39:04 CEST 2009
hadley wickham wrote:
>> At the moment I am concentrating my efforts deep down in the parser code,
>> but there are other challenges:
>> - once the expressions are parsed, we will need something that inspects
>> them for evidence about function calls, to get an idea of where each
>> function is defined (by the user, in a package, ...). This is tricky, and
>> unless you actually evaluate the code, some errors will be made.
>>
>
> Are you aware of Luke Tierney's codetools package? That would seem to
> be the place to start.
>
Yep. The plan is to combine the more detailed information from the modified
parser with the same guessing machinery that checkUsage uses.
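For instance, codetools' findGlobals() exposes the names that this machinery
treats as global, and utils::find() gives a coarse hint of where each name
would resolve on the search path (a minimal sketch; the output shown is
indicative):

> library(codetools)
> f <- function(x = 2) y + 2
> findGlobals(f)
[1] "+" "y"
> find("y")
character(0)
> find("+")
[1] "package:base"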
Another side effect is that we could imagine linking error patterns
identified by checkUsage ("no visible binding for global variable 'y'",
...) to actual locations in the file (in that case, the place where the
variable y is used), which is not possible at the moment because the
parser only locates entire expressions (semantic groupings), not
individual tokens. For example:
> f <- function(x = 2) {
+   y + 2
+ }
> checkUsage(f)
<anonymous>: no visible binding for global variable ‘y’
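To make the limitation concrete: with keep.source enabled, the parser records
one srcref per complete expression, so the finest location available for the
warning above is the whole definition of f, not the line and column where y
appears (a minimal sketch of what the current parser gives you):

> options(keep.source = TRUE)
> p <- parse(text = "f <- function(x = 2) {\n  y + 2\n}")
> attr(p, "srcref")
[[1]]
f <- function(x = 2) {
  y + 2
}

The whole three-line expression is a single location; there is no record for
the token y itself, which is the gap the modified parser is meant to fill.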
> Hadley
--
Romain Francois
Independent R Consultant
+33(0) 6 28 91 30 30
http://romainfrancois.blog.free.fr