ar {stats}        R Documentation

Fit Autoregressive Models to Time Series


Description

Fit an autoregressive time series model to the data, by default selecting the complexity by AIC.


Usage

ar(x, aic = TRUE, order.max = NULL,
   method = c("yule-walker", "burg", "ols", "mle", "yw"),
   na.action, series, ...)

ar.burg(x, ...)
## Default S3 method:
ar.burg(x, aic = TRUE, order.max = NULL,
        na.action = na.fail, demean = TRUE, series,
        var.method = 1, ...)
## S3 method for class 'mts'
ar.burg(x, aic = TRUE, order.max = NULL,
        na.action = na.fail, demean = TRUE, series,
        var.method = 1, ...)

ar.yw(x, ...)
## Default S3 method:
ar.yw(x, aic = TRUE, order.max = NULL,
      na.action = na.fail, demean = TRUE, series, ...)
## S3 method for class 'mts'
ar.yw(x, aic = TRUE, order.max = NULL,
      na.action = na.fail, demean = TRUE, series,
      var.method = 1, ...)

ar.mle(x, aic = TRUE, order.max = NULL, na.action = na.fail,
       demean = TRUE, series, ...)

## S3 method for class 'ar'
predict(object, newdata, n.ahead = 1, = TRUE, ...)



Arguments

x: A univariate or multivariate time series.


aic: Logical flag. If TRUE then the Akaike Information Criterion is used to choose the order of the autoregressive model. If FALSE, the model of order order.max is fitted.


order.max: Maximum order (or order) of model to fit. Defaults to the smaller of N-1 and 10*log10(N), where N is the number of observations, except for method = "mle" where it is the minimum of this quantity and 12.


method: Character string giving the method used to fit the model. Must be one of the strings in the default argument (the first few characters are sufficient). Defaults to "yule-walker".


na.action: function to be called to handle missing values.


demean: should a mean be estimated during fitting?


series: names for the series. Defaults to deparse(substitute(x)).


var.method: the method to estimate the innovations variance (see ‘Details’).


...: additional arguments for specific methods.


object: a fit from ar.


newdata: data to which to apply the prediction.


n.ahead: number of steps ahead at which to predict.

 logical: return estimated standard errors of the prediction error?


Details

For definiteness, note that the AR coefficients have the sign in

x[t] - m = a[1]*(x[t-1] - m) + … + a[p]*(x[t-p] - m) + e[t]
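
As an illustrative sketch (this example and the object names in it are not from the original page), the convention can be checked by rebuilding one in-sample prediction from the fitted coefficients and comparing it with x[t] minus the corresponding residual:

## Illustration (assumed, not from the original examples): verify the sign convention
fit <- ar(lh, aic = FALSE, order.max = 3)   # lh is a built-in univariate series
m <- fit$x.mean
a <- fit$ar
i <- 10                                     # any index greater than the order
by.hand <- m + sum(a * (lh[i - seq_along(a)] - m))
by.hand                                     # equals lh[i] - fit$resid[i]
lh[i] - fit$resid[i]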

ar is just a wrapper for the functions ar.yw, ar.burg, ar.ols and ar.mle.

Order selection is done by AIC if aic is true. This is problematic, as of the methods here only ar.mle performs true maximum likelihood estimation. The AIC is computed as if the variance estimate were the MLE, omitting the determinant term from the likelihood. Note that this is not the same as the Gaussian likelihood evaluated at the estimated parameter values. In ar.yw the variance matrix of the innovations is computed from the fitted coefficients and the autocovariance of x.
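
For instance (an illustrative sketch, not part of the original examples), the selected order and the reported AIC differences can be inspected on a fitted object:

## Illustration (assumed): AIC-based order selection
fit <- ar(sunspot.year)            # aic = TRUE is the default
fit$order                          # order that minimised the AIC
fit$aic                            # AIC for each order, relative to the minimum
ar(sunspot.year, aic = FALSE, order.max = 2)$order   # fixed order: always 2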

ar.burg allows two methods to estimate the innovations variance and hence AIC. Method 1 is to use the update given by the Levinson-Durbin recursion (Brockwell and Davis, 1991, (8.2.6) on page 242), and follows S-PLUS. Method 2 is the mean of the sum of squares of the forward and backward prediction errors (as in Brockwell and Davis, 1996, page 145). Percival and Walden (1998) discuss both. In the multivariate case the estimated coefficients will depend (slightly) on the variance estimation method.
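
A brief sketch (illustrative only; not from the original examples) comparing the two estimators:

## Illustration (assumed): the two innovations-variance estimators of ar.burg
fit1 <- ar.burg(lh, var.method = 1)   # Levinson-Durbin update, as in S-PLUS
fit2 <- ar.burg(lh, var.method = 2)   # mean squared forward/backward errors
c(fit1$var.pred, fit2$var.pred)       # the two variance estimates generally differ
c(fit1$order, fit2$order)             # so the AIC-chosen order may differ too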

Remember that ar includes by default a constant in the model, by removing the overall mean of x before fitting the AR model, or (ar.mle) estimating a constant to subtract.
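
As a small sketch (illustrative only), the mean correction can be suppressed in the method-specific fitters via demean = FALSE:

## Illustration (assumed): suppressing the mean correction
fit  <- ar.yw(lh)                   # default: mean removed before fitting
fit$x.mean                          # estimated mean, reused by predict()
fit0 <- ar.yw(lh, demean = FALSE)   # no mean is subtracted
fit0$x.mean                         # no mean was removed before fitting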


Value

For ar and its methods a list of class "ar" with the following elements:


order: The order of the fitted model. This is chosen by minimizing the AIC if aic = TRUE, otherwise it is order.max.


ar: Estimated autoregression coefficients for the fitted model.


var.pred: The prediction variance: an estimate of the portion of the variance of the time series that is not explained by the autoregressive model.


x.mean: The estimated mean of the series used in fitting and for use in prediction.


x.intercept: (ar.ols only.) The intercept in the model for x - x.mean.


aic: The differences in AIC between each model and the best-fitting model. Note that the latter can have an AIC of -Inf.


n.used: The number of observations in the time series.


order.max: The value of the order.max argument.


partialacf: The estimate of the partial autocorrelation function up to lag order.max.


resid: residuals from the fitted model, conditioning on the first order observations. The first order residuals are set to NA. If x is a time series, so is resid.


method: The value of the method argument.


series: The name(s) of the time series.


frequency: The frequency of the time series.


call: The matched call.


asy.var.coef: (univariate case, order > 0.) The asymptotic-theory variance matrix of the coefficient estimates.

For, a time series of predictions, or if = TRUE, a list with components pred, the predictions, and se, the estimated standard errors. Both components are time series.
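
An illustrative sketch (not part of the original page) of reading these components and the structure returned by predict:

## Illustration (assumed): components of the fit and of its predictions
fit <- ar(lh)
fit$order                          # order selected by AIC
fit$ar                             # autoregression coefficients
fit$var.pred                       # innovations variance estimate
p <- predict(fit, n.ahead = 12)    # = TRUE is the default
p$pred + 2 * p$se                  # rough upper prediction bounds
p$pred - 2 * p$se                  # rough lower prediction bounds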


Note

Only the univariate case of ar.mle is implemented.

Fitting by method = "mle" to long series can be very slow.


Author(s)

Martyn Plummer. Univariate case of ar.yw, ar.mle and C code for univariate case of ar.burg by B. D. Ripley.


References

Brockwell, P. J. and Davis, R. A. (1991) Time Series: Theory and Methods. Second edition. Springer, New York. Section 11.4.

Brockwell, P. J. and Davis, R. A. (1996) Introduction to Time Series and Forecasting. Springer, New York. Sections 5.1 and 7.6.

Percival, D. B. and Walden, A. T. (1998) Spectral Analysis for Physical Applications. Cambridge University Press.

Whittle, P. (1963) On the fitting of multivariate autoregressions and the approximate canonical factorization of a spectral density matrix. Biometrika 50, 129–134.

See Also

ar.ols; arima for ARMA models; acf2AR for AR construction from the ACF.

arima.sim for simulation of AR processes.


ar(lh, method = "burg")
ar(lh, method = "ols")
ar(lh, FALSE, 4) # fit ar(4)

( <- ar(sunspot.year))
predict(, n.ahead = 25)
## try the other methods too

ar(ts.union(BJsales, BJsales.lead))
## Burg is quite different here, as is OLS (see ar.ols)
ar(ts.union(BJsales, BJsales.lead), method = "burg")

[Package stats version 3.2.2 Index]