[R] Numerical Derivative / Numerical Differentiation of unknown function

Berton Gunter gunter.berton at gene.com
Fri May 6 00:34:06 CEST 2005


But...

See ?numericDeriv, which already does this via a C call and hence is much
faster (and probably more accurate, too).
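
For example, a minimal sketch of how it can be called (the cubic
expression and the environment name "myenv" below are just illustrative):

  ## derivative of x^3 at x = 2; the exact value is 3*2^2 = 12
  myenv <- new.env()
  assign("x", 2, envir = myenv)
  numericDeriv(quote(x^3), "x", myenv)
  ## returns the value of the expression (8) with the derivative
  ## stored in its "gradient" attribute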



-- Bert Gunter
Genentech Non-Clinical Statistics
South San Francisco, CA
 
"The business of the statistician is to catalyze the scientific learning
process."  - George E. P. Box
 
 

> -----Original Message-----
> From: r-help-bounces at stat.math.ethz.ch 
> [mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of Uzuner, Tolga
> Sent: Thursday, May 05, 2005 3:21 PM
> To: 'r-help at stat.math.ethz.ch'
> Subject: [R] Numerical Derivative / Numerical Differentiation of unknown function
> 
> Hi,
> 
> I have been trying to do numerical differentiation using R. 
> 
> I found some old S code using Richardson Extrapolation which I managed
> to get to work.
> 
> I am posting it here in case anyone needs it.
> 
> 
> ########################################################################
> richardson.grad <- function(func, x, d=0.01, eps=1e-4, r=6, show=F){
> # This function calculates a numerical approximation of the first
> #   derivative of func at the point x. The calculation is done by
> #   Richardson's extrapolation (see e.g. G.R. Linfield and J.E.T. Penny,
> #   "Microcomputers in Numerical Analysis"). The method should be used
> #   if accuracy, as opposed to speed, is important.
> #
> #  *  modified by Paul Gilbert from original code by XINGQIAO LIU.
> # CALCULATES THE FIRST ORDER DERIVATIVE
> #     VECTOR using a Richardson extrapolation.
> #
> #  GENERAL APPROACH
> #     --  GIVEN THE FOLLOWING INITIAL VALUES:
> #             INTERVAL VALUE D, NUMBER OF ITERATIONS R, AND
> #             REDUCTION FACTOR V.
> #      - THE FIRST ORDER approximation to the DERIVATIVE WITH RESPECT
> #        TO Xi IS
> #
> #           F'(Xi) = {F(X1,...,Xi+D,...,Xn) - F(X1,...,Xi-D,...,Xn)}/(2*D)
> #
> #     --  REPEAT r TIMES with successively smaller D and
> #         then apply Richardson extrapolation
> #
> #  INPUT
> #       func    Name of the function.
> #       x       The parameters of func.
> #       d       Initial interval value (real); by default the step is
> #               0.01*x, or eps where x is 0.0.
> #       eps     Step used for parameters that are exactly 0.0.
> #       r       The number of Richardson improvement iterations.
> #       show    If TRUE, show intermediate results.
> #  OUTPUT
> #
> #       The gradient vector.
> 
>   v <- 2               # reduction factor.
>   n <- length(x)       # Integer, number of variables.
>   a.mtr <- matrix(1, r, n) 
>   b.mtr <- matrix(1, (r - 1), n)
> #--------------------------------------------------------------------------
> # 1 Calculate the derivative formula given in the 'GENERAL APPROACH'
> #   section.
> #   --  The first order derivatives are stored in the matrix a.mtr[k,i],
> #       where the indexing variables are k for rows (1 to r) and i for
> #       columns (1 to n); r is the number of iterations and n is the
> #       number of variables.
> #--------------------------------------------------------------------------
> 
>   h <- abs(d*x)+eps*(x==0.0)
>   for(k in 1:r)  { # successively reduce h                
>      for(i in 1:n)  {
>          x1.vct <- x2.vct <- x
>          x1.vct[i]  <- x[i] + h[i]
>          x2.vct[i]  <- x[i] - h[i]
>          if(k == 1) a.mtr[k,i] <- (func(x1.vct) - func(x2.vct))/(2*h[i])
>          else{
>            if(abs(a.mtr[(k-1),i])>1e-20)
>                  # some functions are unstable near 0.0              
>                  a.mtr[k,i] <- (func(x1.vct)-func(x2.vct))/(2*h[i])
>             else  a.mtr[k, i] <- 0
>           }
>       }
>      h <- h/v     # Reduced h by 1/v.
>     }	
>    if(show)  {
>         cat("\n","first order approximations", "\n")		
>         print(a.mtr, 12)
>     }
> 
> #--------------------------------------------------------------------------
> # 1 Apply Richardson Extrapolation to improve the accuracy of
> #   the first order derivatives. The algorithm is as follows:
> #
> #   --  For each column of the first order derivative matrix a.mtr,
> #       say, A1, A2, ..., Ar, use Richardson Extrapolation to calculate
> #       a new sequence of approximations B1, B2, ..., Br with the formula
> #
> #          B(i) = ( A(i+1)*4^m - A(i) ) / (4^m - 1),  i = 1, 2, ..., r-m
> #
> #             N.B. This formula assumes v=2.
> #
> #   -- Initially m is taken as 1 and then the process is repeated,
> #      restarting with the latest improved values and increasing the
> #      value of m by one each time, until m equals r-1.
> #
> # 2 Display the improved derivatives for each
> #   m from 1 to r-1 if the argument show=TRUE.
> #
> # 3 Return the final improved derivative vector.
> #--------------------------------------------------------------------------
> 
>   for(m in 1:(r - 1)) {
>      for(i in 1:(r - m))
>         b.mtr[i,] <- (a.mtr[(i+1),]*(4^m)-a.mtr[i,])/(4^m-1)
>      a.mtr <- b.mtr   # carry the improved values into the next round
>      if(show & m!=(r-1) )  {
>         cat("\n","Richardson improvement group No. ", m, "\n")
>         print(a.mtr[1:(r-m),], 12)
>       }
>    }
>   a.mtr[1,]           # the fully extrapolated gradient vector
> }
> 
> ## try it out
> richardson.grad(function(x){x^3},2)
> ########################################################################
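> 
> As a quick sanity check (a minimal sketch; the quadratic test function
> below is only illustrative), the gradient of sum(x^2) is 2*x, so this
> call should return approximately c(2, 4, 6):
> 
> richardson.grad(function(x) sum(x^2), c(1, 2, 3))
> 
> The cubic example above should likewise return approximately 12, the
> exact value of 3*x^2 at x = 2.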
> 
> 
> Regards,
> Tolga Uzuner
> 
> 



