[R-SIG-Finance] sufficient n for a binomial option pricing model

J Toll jctoll at gmail.com
Fri Sep 7 03:08:09 CEST 2012


On Thu, Sep 6, 2012 at 9:40 AM, Smith, Dale <Dale.Smith at fiserv.com> wrote:
> One way to terminate is to look at the consecutive differences between
> the averages and stop once the difference falls below your tolerance.
> However, you should guard against the case where the consecutive
> differences never drop below the tolerance. In that case, just impose a
> maximum number of steps n and log the last average, the number of steps
> used, and a message, so the user can decide whether to accept or reject
> the result.

Dale,

Thanks for the suggested method.  It's probably overkill for what I'm
trying to do, but it definitely puts the process in perspective and
gives me some insight into how I could improve the reliability of my
pricing.  Thanks again.
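
For what it's worth, here's a rough sketch of how I might implement the
stopping rule you describe, using the x(n) pricing function from my
original message below (the tolerance and step cap are just placeholder
values):

converge_price <- function(price_fun, tol = 1e-4, n_max = 500) {
  # Average the prices from consecutive values of n and stop once
  # successive averages change by less than tol; cap the number of
  # steps at n_max in case the tolerance is never reached.
  prev_avg <- NA
  for (n in 2:n_max) {
    avg <- mean(c(price_fun(n - 1), price_fun(n)))
    if (!is.na(prev_avg) && abs(avg - prev_avg) < tol) {
      return(list(price = avg, n = n, converged = TRUE))
    }
    prev_avg <- avg
  }
  message("Tolerance not reached by n = ", n_max, "; returning last average")
  list(price = prev_avg, n = n_max, converged = FALSE)
}

converge_price(x)    # using the x() pricing function defined below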

Best,

James

> -----Original Message-----
> From: r-sig-finance-bounces at r-project.org
> [mailto:r-sig-finance-bounces at r-project.org] On Behalf Of J Toll
> Sent: Thursday, September 06, 2012 10:25 AM
> To: r-sig-finance at r-project.org
> Subject: [R-SIG-Finance] sufficient n for a binomial option pricing
> model
>
> Hi,
>
> I have a question regarding the selection of n, the number of time
> steps, in a binomial option pricing model.  I suppose my question is not
> strictly related to R.  Larger values of n should be more accurate, and
> what I've read on the subject simply suggests using a value that is
> sufficiently large for your purposes.  So I've been trying to work out
> what counts as sufficiently large for mine.  Is there any rule of thumb
> for choosing n?
>
> When I vary n in the fOptions package's CRRBinomialTreeOption function,
> the price oscillates as it converges toward a limiting value.  This is
> easy to see in a plot.
>
> require(fOptions)
>
> # Price an at-the-money, one-month American call for a given
> # number of time steps n
> x <- function(n) {
>
>   CRRBinomialTreeOption(TypeFlag = "ca",
>                         S = 50,
>                         X = 50,
>                         Time = 1/12,
>                         r = 0.02,
>                         b = 0.02,
>                         sigma = 0.18,
>                         n = n)@price
> }
>
> y <- sapply(1:100, x)               # mean(y) == 1.079693
> plot(y)
>
> Given this oscillation, my question is whether it would be "better" to
> average two prices computed with smaller, consecutive values of n rather
> than use a single large value.  Or is there some other, better way?
>
> For example, with n = 1000 or 1001 the option prices agree to within
> five hundredths of a cent, but either calculation is extremely slow.
>
> x(1000)                                    # 1.077408
> x(1001)                                    # 1.077926
>
> mean(sapply(1000:1001, x))      # 1.077667
>
> By comparison, taking the mean of the prices for n = 40 and 41 yields a
> value very close to the middle of that range, yet is much faster to
> compute.
>
> mean(sapply(40:41, x))             # 1.0776
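>
> A quick way to check the speed difference directly (just a sketch;
> actual timings will vary from machine to machine):
>
> system.time(p_big   <- x(1000))                 # one large tree, slow
> system.time(p_small <- mean(sapply(40:41, x)))  # two small trees, fast
> abs(p_big - p_small)                            # about 0.0002 from the numbers above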
>
> It seems like averaging the prices from two smaller, consecutive values
> of n is essentially as accurate as, and far faster than, using a large
> value of n.  I was hoping someone might have some insight into why this
> is or is not a valid approach.  Thanks.
>
>
> James
>
> _______________________________________________
> R-SIG-Finance at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-sig-finance
> -- Subscriber-posting only. If you want to post, subscribe first.
> -- Also note that this is not the r-help list where general R questions
> should go.
