In econometrics, the autoregressive conditional heteroskedasticity (ARCH) model is a statistical model for time series data that describes the variance of the current error term or innovation as a function of the actual sizes of the previous time periods' error terms; often the variance is related to the squares of the previous innovations. The ARCH model is appropriate when the error variance in a time series follows an autoregressive (AR) model; if an autoregressive moving average (ARMA) model is assumed for the error variance, the model is a generalized autoregressive conditional heteroskedasticity (GARCH) model. ARCH models are commonly employed in modeling financial time series that exhibit time-varying volatility and volatility clustering, i.e. periods of swings interspersed with periods of relative calm. ARCH-type models are sometimes considered to be in the family of stochastic volatility models, although this is strictly incorrect since at time ''t'' the volatility is completely pre-determined (deterministic) given previous values.


Model specification

To model a time series using an ARCH process, let \epsilon_t denote the error terms (return residuals, with respect to a mean process), i.e. the series terms. These \epsilon_t are split into a stochastic piece z_t and a time-dependent standard deviation \sigma_t characterizing the typical size of the terms, so that
: \epsilon_t = \sigma_t z_t
The random variable z_t is a strong white noise process. The series \sigma_t^2 is modeled by
: \sigma_t^2 = \alpha_0 + \alpha_1 \epsilon_{t-1}^2 + \cdots + \alpha_q \epsilon_{t-q}^2 = \alpha_0 + \sum_{i=1}^{q} \alpha_i \epsilon_{t-i}^2 ,
:where \alpha_0 > 0 and \alpha_i \ge 0,~ i > 0.
An ARCH(''q'') model can be estimated using ordinary least squares. A method for testing whether the residuals \epsilon_t exhibit time-varying heteroskedasticity using the Lagrange multiplier test was proposed by Engle (1982). This procedure is as follows:
# Estimate the best fitting autoregressive model AR(''q''): y_t = a_0 + a_1 y_{t-1} + \cdots + a_q y_{t-q} + \epsilon_t = a_0 + \sum_{i=1}^{q} a_i y_{t-i} + \epsilon_t .
# Obtain the squares of the error \hat\epsilon^2 and regress them on a constant and ''q'' lagged values:
#: \hat\epsilon_t^2 = \hat\alpha_0 + \sum_{i=1}^{q} \hat\alpha_i \hat\epsilon_{t-i}^2
#: where ''q'' is the length of ARCH lags.
# The null hypothesis is that, in the absence of ARCH components, \alpha_i = 0 for all i = 1, \cdots, q . The alternative hypothesis is that, in the presence of ARCH components, at least one of the estimated \alpha_i coefficients must be significant. In a sample of ''T'' residuals under the null hypothesis of no ARCH errors, the test statistic ''T'R²'' follows a \chi^2 distribution with ''q'' degrees of freedom, where ''T''' is the number of equations in the model which fits the residuals vs the lags (i.e. T' = T - q ). If ''T'R²'' is greater than the chi-square table value, we ''reject'' the null hypothesis and conclude there is an ARCH effect in the ARMA model. If ''T'R²'' is smaller than the chi-square table value, we do not reject the null hypothesis.
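The procedure above can be sketched in pure Python for the one-lag case. This is an illustrative sketch, not Engle's original implementation: the function names, the ARCH(1) parameter values, and the use of a simple two-variable regression (sufficient for a single lag) are all choices made here.

```python
import math
import random

def simulate_arch1(n, a0=0.2, a1=0.5, seed=42):
    """Simulate an ARCH(1) series eps_t = sigma_t * z_t with
    sigma_t^2 = a0 + a1 * eps_{t-1}^2 and z_t standard normal."""
    rng = random.Random(seed)
    eps = [0.0]
    for _ in range(n - 1):
        sigma2 = a0 + a1 * eps[-1] ** 2
        eps.append(math.sqrt(sigma2) * rng.gauss(0, 1))
    return eps

def arch_lm_stat(eps):
    """Engle-style LM statistic with one lag: regress eps_t^2 on a
    constant and eps_{t-1}^2; under H0 (no ARCH) T' * R^2 is
    asymptotically chi-square with 1 degree of freedom."""
    y = [e * e for e in eps[1:]]   # squared residuals
    x = [e * e for e in eps[:-1]]  # lagged squared residuals
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    r2 = (sxy * sxy) / (sxx * syy)  # R^2 of the auxiliary regression
    return n * r2

stat = arch_lm_stat(simulate_arch1(2000))
# The 95% critical value of chi-square(1) is about 3.84; a genuine
# ARCH(1) series should exceed it by a wide margin.
print(stat > 3.84)  # True
```

For ''q'' > 1 the auxiliary regression needs ''q'' regressors and a full OLS solve; the single-lag version keeps the arithmetic transparent.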


GARCH

If an autoregressive moving average (ARMA) model is assumed for the error variance, the model is a generalized autoregressive conditional heteroskedasticity (GARCH) model. In that case, the GARCH(''p'', ''q'') model (where ''p'' is the order of the GARCH terms \sigma^2 and ''q'' is the order of the ARCH terms \epsilon^2 ), following the notation of the original paper, is given by
: y_t = x'_t b + \epsilon_t
: \epsilon_t \mid \psi_{t-1} \sim \mathcal{N}(0, \sigma^2_t)
: \sigma_t^2 = \omega + \alpha_1 \epsilon_{t-1}^2 + \cdots + \alpha_q \epsilon_{t-q}^2 + \beta_1 \sigma_{t-1}^2 + \cdots + \beta_p \sigma_{t-p}^2 = \omega + \sum_{i=1}^{q} \alpha_i \epsilon_{t-i}^2 + \sum_{i=1}^{p} \beta_i \sigma_{t-i}^2
Generally, when testing for heteroskedasticity in econometric models, the best test is the White test. However, when dealing with time series data, this means testing for ARCH and GARCH errors. The exponentially weighted moving average (EWMA) is an alternative model in a separate class of exponential smoothing models. As an alternative to GARCH modelling it has some attractive properties, such as a greater weight upon more recent observations, but also drawbacks, such as an arbitrary decay factor that introduces subjectivity into the estimation.
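The GARCH(1,1) variance recursion can be simulated in a few lines. This is a minimal pure-Python sketch with illustrative parameter values (omega = 0.1, alpha = 0.1, beta = 0.8, chosen here so that alpha + beta < 1 and the process is covariance stationary); it is not tied to any particular estimation package.

```python
import math
import random

def simulate_garch11(n, omega=0.1, alpha=0.1, beta=0.8, seed=7):
    """Simulate a GARCH(1,1) process with
    sigma_t^2 = omega + alpha * eps_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = random.Random(seed)
    # Start the recursion at the unconditional variance omega / (1 - alpha - beta).
    sigma2 = omega / (1 - alpha - beta)
    eps = []
    for _ in range(n):
        e = math.sqrt(sigma2) * rng.gauss(0, 1)
        eps.append(e)
        sigma2 = omega + alpha * e * e + beta * sigma2
    return eps

eps = simulate_garch11(20_000)
sample_var = sum(e * e for e in eps) / len(eps)
# For these parameters the unconditional variance is 0.1 / (1 - 0.9) = 1.0,
# so the sample variance of a long path should be close to 1.
print(abs(sample_var - 1.0) < 0.15)  # True
```

The simulated path exhibits the volatility clustering described above: quiet stretches punctuated by bursts of large moves.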


GARCH(''p'', ''q'') model specification

The lag length ''p'' of a GARCH(''p'', ''q'') process is established in three steps:
# Estimate the best fitting AR(''q'') model
#: y_t = a_0 + a_1 y_{t-1} + \cdots + a_q y_{t-q} + \epsilon_t = a_0 + \sum_{i=1}^{q} a_i y_{t-i} + \epsilon_t .
# Compute and plot the autocorrelations of \epsilon^2 by
#: \rho(i) = \frac{\sum_{t=i+1}^{T} (\hat\epsilon_t^2 - \hat\sigma_t^2)(\hat\epsilon_{t-i}^2 - \hat\sigma_{t-i}^2)}{\sum_{t=1}^{T} (\hat\epsilon_t^2 - \hat\sigma_t^2)^2}
# The asymptotic (that is, for large samples) standard deviation of \rho(i) is 1/\sqrt{T} . Individual values that are larger than this indicate GARCH errors. To estimate the total number of lags, use the Ljung–Box test until the values are no longer significant at, say, the 10% level. The Ljung–Box Q-statistic follows a \chi^2 distribution with ''n'' degrees of freedom if the squared residuals \epsilon^2_t are uncorrelated. It is recommended to consider up to T/4 values of ''n''. The null hypothesis states that there are no ARCH or GARCH errors. Rejecting the null thus means that such errors exist in the conditional variance.
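The Ljung–Box Q-statistic used in the last step is simple to compute directly. A stdlib-only sketch (the function name and the choice of test data are illustrative):

```python
import random

def ljung_box_q(series, max_lag):
    """Ljung-Box Q-statistic on the first `max_lag` sample autocorrelations,
    here intended for squared residuals. Under the null of no autocorrelation,
    Q is asymptotically chi-square with `max_lag` degrees of freedom."""
    n = len(series)
    mean = sum(series) / n
    denom = sum((x - mean) ** 2 for x in series)
    q = 0.0
    for k in range(1, max_lag + 1):
        rho_k = sum((series[t] - mean) * (series[t - k] - mean)
                    for t in range(k, n)) / denom
        q += rho_k ** 2 / (n - k)
    return n * (n + 2) * q

rng = random.Random(0)
squared = [rng.gauss(0, 1) ** 2 for _ in range(1000)]  # i.i.d.: no (G)ARCH
q10 = ljung_box_q(squared, 10)
# The 95% critical value of chi-square(10) is about 18.31; for i.i.d.
# squared residuals Q should usually stay below it, while for a series
# with ARCH/GARCH effects it will typically be far above it.
```

In practice one computes Q for increasing lag lengths and stops once the statistics are no longer significant.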


NGARCH


NAGARCH

Nonlinear Asymmetric GARCH(1,1) (NAGARCH) is a model with the specification:
: \sigma_t^2 = \omega + \alpha (\epsilon_{t-1} - \theta \sigma_{t-1})^2 + \beta \sigma_{t-1}^2 ,
:where \alpha \geq 0 , \beta \geq 0 , \omega > 0 and \alpha (1 + \theta^2) + \beta < 1 , which ensures the non-negativity and stationarity of the variance process. For stock returns, the parameter \theta is usually estimated to be positive; in this case, it reflects a phenomenon commonly referred to as the "leverage effect", signifying that negative returns increase future volatility by a larger amount than positive returns of the same magnitude. This model should not be confused with the NARCH model, together with the NGARCH extension, introduced by Higgins and Bera in 1992.
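The asymmetry induced by \theta is easy to see in one step of the recursion. A small sketch with illustrative parameter values chosen to satisfy the stationarity condition:

```python
import math

def nagarch_next_var(sigma2, eps, omega=0.05, alpha=0.08, beta=0.85, theta=0.5):
    """One step of the NAGARCH(1,1) recursion (illustrative parameters):
    sigma_t^2 = omega + alpha * (eps_{t-1} - theta * sigma_{t-1})^2
                + beta * sigma_{t-1}^2."""
    sigma = math.sqrt(sigma2)
    return omega + alpha * (eps - theta * sigma) ** 2 + beta * sigma2

# These parameters satisfy the stationarity condition:
# alpha * (1 + theta^2) + beta = 0.08 * 1.25 + 0.85 = 0.95 < 1.
up = nagarch_next_var(1.0, +1.5)
down = nagarch_next_var(1.0, -1.5)
# With theta > 0, a negative shock raises next-period variance more than
# a positive shock of the same magnitude (the "leverage effect").
print(down > up)  # True
```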


IGARCH

Integrated Generalized Autoregressive Conditional Heteroskedasticity (IGARCH) is a restricted version of the GARCH model, where the persistence parameters sum to one, which introduces a unit root in the GARCH process. The condition for this is
: \sum^p_{i=1} \beta_i + \sum_{i=1}^q \alpha_i = 1 .


EGARCH

The exponential generalized autoregressive conditional heteroskedastic (EGARCH) model by Nelson (1991) is another form of the GARCH model. Formally, an EGARCH(''p'', ''q''):
: \log\sigma_t^2 = \omega + \sum_{k=1}^{q} \beta_k g(Z_{t-k}) + \sum_{k=1}^{p} \alpha_k \log\sigma_{t-k}^2
where g(Z_t) = \theta Z_t + \lambda(|Z_t| - E(|Z_t|)) , \sigma_t^2 is the conditional variance, and \omega , \beta , \alpha , \theta and \lambda are coefficients. Z_t may be a standard normal variable or come from a generalized error distribution. The formulation for g(Z_t) allows the sign and the magnitude of Z_t to have separate effects on the volatility. This is particularly useful in an asset pricing context. Since \log\sigma_t^2 may be negative, there are no sign restrictions for the parameters.
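One step of an EGARCH(1,1) log-variance recursion can be sketched as follows; the parameter values are illustrative, not estimates from any dataset. For standard normal Z, E|Z| = sqrt(2/pi).

```python
import math

E_ABS_Z = math.sqrt(2 / math.pi)  # E|Z| for a standard normal Z

def egarch_next_logvar(log_sigma2, z, omega=-0.1, alpha=0.95, beta=0.1,
                       theta=-0.05, lam=0.2):
    """One step of an EGARCH(1,1) recursion (illustrative parameters):
    log sigma_t^2 = omega + beta * g(Z_{t-1}) + alpha * log sigma_{t-1}^2,
    where g(Z) = theta * Z + lam * (|Z| - E|Z|)."""
    g = theta * z + lam * (abs(z) - E_ABS_Z)
    return omega + beta * g + alpha * log_sigma2

# The log formulation needs no positivity constraints: the variance
# exp(log sigma^2) is positive even when the log-variance is negative.
lv = 0.0
for z in [-2.0, 0.5, 1.0, -0.3]:
    lv = egarch_next_logvar(lv, z)
print(math.exp(lv) > 0)  # True
```

With \theta < 0, a negative shock contributes more to g and hence to future log-variance than a positive shock of the same size, capturing the sign effect separately from the magnitude effect.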


GARCH-M

The GARCH-in-mean (GARCH-M) model adds a heteroskedasticity term into the mean equation. It has the specification:
: y_t = \beta x_t + \lambda \sigma_t + \epsilon_t
The residual \epsilon_t is defined as:
: \epsilon_t = \sigma_t z_t


QGARCH

The Quadratic GARCH (QGARCH) model by Sentana (1995) is used to model asymmetric effects of positive and negative shocks. In the example of a GARCH(1,1) model, the residual process is
: \epsilon_t = \sigma_t z_t
where z_t is i.i.d., and
: \sigma_t^2 = K + \alpha \epsilon_{t-1}^2 + \beta \sigma_{t-1}^2 + \phi \epsilon_{t-1}
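The linear \phi \epsilon_{t-1} term is what breaks the symmetry: with \phi < 0, negative shocks push variance up more than positive ones. A one-step sketch with illustrative parameter values:

```python
def qgarch_next_var(sigma2, eps, K=0.1, alpha=0.1, beta=0.8, phi=-0.05):
    """One step of the QGARCH(1,1) recursion (illustrative parameters):
    sigma_t^2 = K + alpha * eps_{t-1}^2 + beta * sigma_{t-1}^2 + phi * eps_{t-1}.
    With phi < 0 a negative shock raises variance more than a positive one."""
    return K + alpha * eps ** 2 + beta * sigma2 + phi * eps

up = qgarch_next_var(1.0, +1.0)    # 0.1 + 0.1 + 0.8 - 0.05 = 0.95
down = qgarch_next_var(1.0, -1.0)  # 0.1 + 0.1 + 0.8 + 0.05 = 1.05
print(down > up)  # True
```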


GJR-GARCH

Similar to QGARCH, the Glosten–Jagannathan–Runkle GARCH (GJR-GARCH) model by Glosten, Jagannathan and Runkle (1993) also models asymmetry in the ARCH process. The suggestion is to model
: \epsilon_t = \sigma_t z_t
where z_t is i.i.d., and
: \sigma_t^2 = K + \delta \sigma_{t-1}^2 + \alpha \epsilon_{t-1}^2 + \phi \epsilon_{t-1}^2 I_{t-1}
where I_{t-1} = 0 if \epsilon_{t-1} \ge 0 , and I_{t-1} = 1 if \epsilon_{t-1} < 0 .
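Here the asymmetry enters through the indicator rather than a linear term: the extra \phi \epsilon_{t-1}^2 loading switches on only after negative shocks. A one-step sketch with illustrative parameter values:

```python
def gjr_next_var(sigma2, eps, K=0.05, delta=0.85, alpha=0.05, phi=0.08):
    """One step of the GJR-GARCH recursion (illustrative parameters):
    sigma_t^2 = K + delta * sigma_{t-1}^2 + alpha * eps_{t-1}^2
                + phi * eps_{t-1}^2 * I(eps_{t-1} < 0)."""
    indicator = 1.0 if eps < 0 else 0.0
    return K + delta * sigma2 + alpha * eps ** 2 + phi * eps ** 2 * indicator

# The indicator term adds phi * eps^2 only after negative shocks:
up = gjr_next_var(1.0, +1.0)
down = gjr_next_var(1.0, -1.0)
print(down > up)  # True
```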


TGARCH model

The Threshold GARCH (TGARCH) model by Zakoian (1994) is similar to GJR-GARCH. The specification is in terms of the conditional standard deviation instead of the conditional variance:
: \sigma_t = K + \delta \sigma_{t-1} + \alpha_1^{+} \epsilon_{t-1}^{+} + \alpha_1^{-} \epsilon_{t-1}^{-}
where \epsilon_{t-1}^{+} = \epsilon_{t-1} if \epsilon_{t-1} > 0 , and \epsilon_{t-1}^{+} = 0 if \epsilon_{t-1} \le 0 . Likewise, \epsilon_{t-1}^{-} = \epsilon_{t-1} if \epsilon_{t-1} \le 0 , and \epsilon_{t-1}^{-} = 0 if \epsilon_{t-1} > 0 .
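A one-step sketch of the TGARCH recursion on the conditional standard deviation, with illustrative parameters (note that the coefficient on the negative part is chosen negative, since \epsilon_{t-1}^{-} itself is non-positive):

```python
def tgarch_next_sd(sigma, eps, K=0.05, delta=0.85, a_plus=0.05, a_minus=-0.1):
    """One step of the TGARCH recursion on the conditional standard
    deviation (illustrative parameters):
    sigma_t = K + delta * sigma_{t-1} + a_plus * eps+_{t-1} + a_minus * eps-_{t-1},
    where eps+ keeps only positive shocks and eps- only non-positive ones."""
    eps_plus = eps if eps > 0 else 0.0
    eps_minus = eps if eps <= 0 else 0.0
    return K + delta * sigma + a_plus * eps_plus + a_minus * eps_minus

up = tgarch_next_sd(1.0, +1.0)    # 0.05 + 0.85 + 0.05          = 0.95
down = tgarch_next_sd(1.0, -1.0)  # 0.05 + 0.85 + (-0.1)*(-1.0) = 1.00
print(down > up)  # True
```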


fGARCH

Hentschel's fGARCH model, also known as Family GARCH, is an omnibus model that nests a variety of other popular symmetric and asymmetric GARCH models including APARCH, GJR, AVGARCH, NGARCH, etc.


COGARCH

In 2004, Claudia Klüppelberg, Alexander Lindner and Ross Maller proposed a continuous-time generalization of the discrete-time GARCH(1,1) process. The idea is to start with the GARCH(1,1) model equations
: \epsilon_t = \sigma_t z_t,
: \sigma_t^2 = \alpha_0 + \alpha_1 \epsilon^2_{t-1} + \beta_1 \sigma^2_{t-1} = \alpha_0 + \alpha_1 \sigma_{t-1}^2 z_{t-1}^2 + \beta_1 \sigma^2_{t-1},
and then to replace the strong white noise process z_t by the infinitesimal increments \mathrm{d}L_t of a Lévy process (L_t)_{t \geq 0} , and the squared noise process z^2_t by the increments \mathrm{d}[L,L]^{\mathrm{d}}_t , where
: [L,L]^{\mathrm{d}}_t = \sum_{0 < s \leq t} (\Delta L_s)^2, \quad t \geq 0,
is the purely discontinuous part of the quadratic variation process of L . The result is the following system of stochastic differential equations:
: \mathrm{d}G_t = \sigma_{t-} \,\mathrm{d}L_t,
: \mathrm{d}\sigma_t^2 = (\beta - \eta \sigma^2_t)\,\mathrm{d}t + \varphi \sigma_{t-}^2 \,\mathrm{d}[L,L]^{\mathrm{d}}_t,
where the positive parameters \beta , \eta and \varphi are determined by \alpha_0 , \alpha_1 and \beta_1 . Now given some initial condition (G_0, \sigma^2_0) , the system above has a pathwise unique solution (G_t, \sigma^2_t)_{t \geq 0} , which is then called the continuous-time GARCH (COGARCH) model.


ZD-GARCH

Unlike the GARCH model, the Zero-Drift GARCH (ZD-GARCH) model by Li, Zhang, Zhu and Ling (2018) sets the drift term \omega = 0 in the first-order GARCH model. The ZD-GARCH model assumes \epsilon_t = \sigma_t z_t , where z_t is i.i.d., and
: \sigma_t^2 = \alpha_0 \epsilon_{t-1}^2 + \beta_0 \sigma_{t-1}^2 .
The ZD-GARCH model does not require \alpha_0 + \beta_0 = 1 , and hence it nests the exponentially weighted moving average (EWMA) model in "RiskMetrics". Since the drift term \omega = 0 , the ZD-GARCH model is always non-stationary, and its statistical inference methods are quite different from those for the classical GARCH model. Based on the historical data, the parameters \alpha_0 and \beta_0 can be estimated by the generalized QMLE method.
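The nesting of the EWMA scheme is visible directly in the recursion: when \alpha_0 + \beta_0 = 1 the update is an exponentially weighted average of past squared shocks. A one-step sketch (the decay factor 0.94 is the value commonly quoted for daily RiskMetrics updates, used here purely as an illustration):

```python
def zd_garch_next_var(sigma2, eps, alpha0=0.06, beta0=0.94):
    """One step of the ZD-GARCH recursion (no drift term):
    sigma_t^2 = alpha0 * eps_{t-1}^2 + beta0 * sigma_{t-1}^2.
    With alpha0 + beta0 = 1 this is the EWMA variance update
    with decay factor beta0."""
    return alpha0 * eps ** 2 + beta0 * sigma2

v = 1.0
for e in [0.5, -1.2, 0.3]:
    v = zd_garch_next_var(v, e)
print(v > 0)  # True: the recursion keeps the variance positive
```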


Spatial GARCH

Spatial GARCH processes by Otto, Schmid and Garthoff (2018) are considered the spatial equivalent of the temporal generalized autoregressive conditional heteroscedasticity (GARCH) models. In contrast to the temporal ARCH model, in which the distribution is known given the full information set for the prior periods, the distribution is not straightforward in the spatial and spatiotemporal setting due to the interdependence between neighboring spatial locations. The spatial model is given by \epsilon(s_i) = \sigma(s_i) z(s_i) and
: \sigma(s_i)^2 = \alpha_i + \sum_{v=1}^{n} \rho w_{iv} \epsilon(s_v)^2,
where s_i denotes the ''i''-th spatial location, w_{iv} refers to the ''iv''-th entry of a spatial weight matrix, and w_{ii} = 0 for i = 1, \ldots, n . The spatial weight matrix defines which locations are considered to be adjacent.
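The spatial variance equation can be evaluated directly for a toy configuration. The sketch below uses a hypothetical three-location line with row-normalized neighbor weights and illustrative values of \alpha_i and \rho; it only computes the conditional variances given observed shocks, not the full simultaneous spatial model.

```python
def spatial_arch_var(alpha, rho, W, eps):
    """Conditional variances of a spatial ARCH field on n locations:
    sigma(s_i)^2 = alpha_i + rho * sum_v W[i][v] * eps[v]^2,
    where W has a zero diagonal (no location is its own neighbor)."""
    n = len(eps)
    return [alpha[i] + rho * sum(W[i][v] * eps[v] ** 2 for v in range(n))
            for i in range(n)]

# Three locations on a line; each interior location splits weight
# equally between its two neighbors.
W = [[0.0, 1.0, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 1.0, 0.0]]
alpha = [0.1, 0.1, 0.1]
eps = [2.0, 0.0, 0.0]  # a large shock at the first location only
var = spatial_arch_var(alpha, 0.3, W, eps)
# The shock at s_1 raises the variance at its neighbor s_2 but leaves
# the non-adjacent location s_3 at its baseline.
print([round(x, 6) for x in var])  # [0.1, 0.7, 0.1]
```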


Gaussian process-driven GARCH

In a different vein, the machine learning community has proposed the use of Gaussian process regression models to obtain a GARCH-like scheme. This results in a nonparametric modelling approach that allows for: (i) robustness to overfitting, since the model marginalises over its parameters to perform inference, under a Bayesian rationale; and (ii) capturing highly nonlinear dependencies without increasing model complexity.

