[Rd] multi-threaded R current status?
Duncan Temple Lang
duncan@research.bell-labs.com
Fri, 12 Apr 2002 15:26:18 -0400
I plan to attack this in mid May unless Luke or others get there
first. As I have mentioned before, making the R engine reentrant
and/or thread-safe will probably not be all that is needed for your
purposes; fixing the packages, especially those with native (C,
C++, and Fortran) code, is also necessary. That is why I have been
working on a tool that aids in the task of removing the global
variables.
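To make the problem concrete, here is a minimal sketch (hypothetical
code, not taken from any real package) of the kind of file-level
global that such a tool would need to flag, together with one way to
rewrite the routine so the state is owned by the caller rather than
shared by every session in the process:

  #include <R.h>

  /* Not thread-safe: every R engine in the process shares `total`. */
  static double total = 0;

  void accumulate_global(double *x, int *n)
  {
      int i;
      for (i = 0; i < *n; i++)
          total += x[i];
      x[0] = total;              /* state leaks across calls/threads */
  }

  /* Reentrant version: the caller allocates and passes the state,
     so each session (or thread) works on its own copy. */
  void accumulate_local(double *x, int *n, double *state)
  {
      int i;
      for (i = 0; i < *n; i++)
          *state += x[i];
      x[0] = *state;
  }

From R the second form can be driven through the usual .C() interface,
e.g. .C("accumulate_local", as.double(x), as.integer(length(x)),
state = double(1)), with each caller holding its own state vector.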
Also, it may be prudent to put an adequate security model in place
in your application before allowing concurrent intrusions :-)
D.
Warnes, Gregory R wrote:
>
> Hi All,
>
> What is the current status of removing the global variables, etc., that is
> required to permit multi-threading in R?
>
> I'm developing a web application tool for/using R, Python (www.python.org),
> and Zope (www.zope.org), and it would be really convenient if I could use
> something like RPy to communicate with several concurrent R sessions,
> preferably within the same process space.
>
> -Greg
>
--
_______________________________________________________________
Duncan Temple Lang duncan@research.bell-labs.com
Bell Labs, Lucent Technologies office: (908)582-3217
700 Mountain Avenue, Room 2C-259 fax: (908)582-3340
Murray Hill, NJ 07974-2070
http://cm.bell-labs.com/stat/duncan
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-devel mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !) To: r-devel-request@stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._