In statistics, the Hannan–Quinn information criterion (HQC) is a criterion for model selection. It is an alternative to the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). It is given as

: \mathrm{HQC} = -2 L_{\max} + 2 k \ln(\ln(n)),

where:
* ''L_{\max}'' is the log-likelihood,
* ''k'' is the number of parameters, and
* ''n'' is the number of observations.
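
As a minimal sketch (not part of the original article), the criterion can be computed directly from a model's maximized log-likelihood; the function name ''hqc'' and the toy input values below are illustrative assumptions.

<syntaxhighlight lang="python">
import math

def hqc(log_likelihood, k, n):
    """Hannan-Quinn information criterion: -2*L_max + 2*k*ln(ln(n)).

    log_likelihood -- maximized log-likelihood L_max of the fitted model
    k              -- number of estimated parameters
    n              -- number of observations (needs n > e so that ln(ln(n)) > 0)
    """
    return -2.0 * log_likelihood + 2.0 * k * math.log(math.log(n))

# Usage sketch: pick the candidate with the smallest HQC value.
# The log-likelihoods below are illustrative placeholders, not real data.
candidates = {"AR(1)": (-512.3, 2, 200), "AR(2)": (-510.9, 3, 200)}
scores = {name: hqc(*args) for name, args in candidates.items()}
best_model = min(scores, key=scores.get)
</syntaxhighlight>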
According to Burnham and Anderson, HQIC, "while often cited, seems to have seen little use in practice" (p. 287). They also note that HQIC, like BIC, but unlike AIC, is not an estimator of Kullback–Leibler divergence.

Claeskens and Hjort note that HQC, like BIC, but unlike AIC, is not asymptotically efficient; however, it misses the optimal estimation rate by only a very small \ln(\ln(n)) factor (ch. 4). They further point out that whatever method is used for fine-tuning the criterion matters more in practice than the \ln(\ln(n)) term, since this number is small even for very large n; however, the \ln(\ln(n)) term ensures that, unlike AIC, HQC is strongly consistent. It follows from the law of the iterated logarithm that any strongly consistent method must miss efficiency by at least a \ln(\ln(n)) factor, so in this sense HQC is asymptotically very well behaved. Van der Pas and Grünwald prove that model selection based on a modified Bayesian estimator, the so-called switch distribution, in many cases behaves asymptotically like HQC, while retaining the advantages of Bayesian methods such as the use of priors.
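
The remark about the size of \ln(\ln(n)) can be made concrete with a short numerical comparison (an illustrative sketch, not from the original article): the per-parameter penalty of AIC is the constant 2, that of HQC is 2\ln(\ln(n)), and that of BIC is \ln(n).

<syntaxhighlight lang="python">
import math

# Per-parameter penalty of each criterion (standard forms):
#   AIC: 2              constant, hence not consistent
#   HQC: 2*ln(ln(n))    diverges, but extremely slowly -> strong consistency
#   BIC: ln(n)          diverges much faster
for n in (100, 10_000, 1_000_000, 10**9):
    hqc_pen = 2.0 * math.log(math.log(n))
    bic_pen = math.log(n)
    print(f"n={n:>13,}  AIC=2.00  HQC={hqc_pen:5.2f}  BIC={bic_pen:6.2f}")
</syntaxhighlight>

Even at n = 10^9 the HQC penalty per parameter is only about 6, compared with roughly 21 for BIC, which illustrates why the \ln(\ln(n)) term barely affects finite-sample behaviour while still growing without bound.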


See also

* Akaike information criterion
* Bayesian information criterion
* Deviance information criterion
* Focused information criterion
* Shibata information criterion


References



Further reading

* Aznar Grasa, A. (1989). ''Econometric Model Selection: A New Approach''. Springer. ISBN 978-0-7923-0321-3.
* Chen, C., et al. (1993). "Order Determination for Autoregressive Processes Using Resampling Methods". ''Statistica Sinica'' 3. http://www3.stat.sinica.edu.tw/statistica/oldpdf/A3n214.pdf