Indirect Inference
Indirect inference is a simulation-based method for estimating the parameters of economic models. It is a computational method for determining acceptable macroeconomic model parameters in circumstances where the available data are too voluminous or unsuitable for formal modeling. Approximate Bayesian computation can be understood as a kind of Bayesian version of indirect inference (Drovandi, Christopher C., "ABC and indirect inference", Handbook of Approximate Bayesian Computation (2018): 179-209, https://arxiv.org/abs/1803.01999).

Core idea

Suppose we have a dataset of real observations and a generative model with parameters \theta for which no likelihood function can easily be provided. We can then ask which choice of parameters \theta could have generated the observations. Since maximum likelihood estimation cannot be performed, indirect inference proposes to fit a (possibly misspecified) auxiliary model with parameters \beta to the observations, which yields a set of auxiliary parameter estimates; the same auxiliary model is then fitted to data simulated from the generative model, and \theta is chosen so that the auxiliary estimates from the simulated data match those from the observations as closely as possible.
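The core idea above can be sketched numerically. The following is a minimal, self-contained illustration, not a definitive implementation: the MA(1) generative model, the AR(1) auxiliary regression, the sample sizes, and the grid search are all illustrative choices introduced here, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ma1(theta, n, rng):
    # Generative model: MA(1) process y_t = e_t + theta * e_{t-1}.
    e = rng.standard_normal(n + 1)
    return e[1:] + theta * e[:-1]

def auxiliary_stat(y):
    # Auxiliary model: fit an AR(1) regression by OLS and return the slope.
    # This is deliberately misspecified relative to the MA(1) generator.
    return np.dot(y[1:], y[:-1]) / np.dot(y[:-1], y[:-1])

# "Observed" data, generated here with true parameter theta = 0.5.
y_obs = simulate_ma1(0.5, 5000, rng)
beta_obs = auxiliary_stat(y_obs)

# Indirect inference: choose theta whose simulated data best reproduce
# the auxiliary estimate obtained on the observed data.
def distance(theta):
    sims = [auxiliary_stat(simulate_ma1(theta, 5000, np.random.default_rng(s)))
            for s in range(1, 6)]          # average over 5 simulated paths
    return (np.mean(sims) - beta_obs) ** 2

grid = np.linspace(0.0, 0.95, 96)
theta_hat = min(grid, key=distance)
print(theta_hat)
```

A grid search keeps the sketch transparent; in practice one would use a numerical optimizer and many more simulated paths.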
Economic Model
In economics, a model is a theoretical construct representing economic processes by a set of variables and a set of logical and/or quantitative relationships between them. The economic model is a simplified, often mathematical, framework designed to illustrate complex processes. Frequently, economic models posit structural parameters. A model may have various exogenous variables, and those variables may change to create various responses by economic variables. Methodological uses of models include investigation, theorizing, and fitting theories to the world.

Overview

In general terms, economic models have two functions: first as a simplification of and abstraction from observed data, and second as a means of selection of data based on a paradigm of econometric study. ''Simplification'' is particularly important for economics given the enormous complexity of economic processes. This complexity can be attributed to the diversity of factors that determine economic activity.
Macroeconomic Model
A macroeconomic model is an analytical tool designed to describe the operation of the economy of a country or a region. These models are usually designed to examine the comparative statics and dynamics of aggregate quantities such as the total amount of goods and services produced, total income earned, the level of employment of productive resources, and the level of prices. Macroeconomic models may be logical, mathematical, and/or computational; the different types of macroeconomic models serve different purposes and have different advantages and disadvantages. Macroeconomic models may be used to clarify and illustrate basic theoretical principles; they may be used to test, compare, and quantify different macroeconomic theories; they may be used to produce "what if" scenarios (usually to predict the effects of changes in monetary, fiscal, or other macroeconomic policies); and they may be used to generate economic forecasts. Thus, macroeconomic models are widely used in academia and in policymaking.
Approximate Bayesian Computation
Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics that can be used to estimate the posterior distributions of model parameters. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support the data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive, or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed.
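The simplest ABC scheme, rejection sampling, can be sketched as follows. This is an illustrative toy example under assumed choices (a Normal model with known unit variance, a flat prior on [0, 10], the sample mean as summary statistic, and a tolerance of 0.05), none of which come from the text.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" data: 100 draws from Normal(mu=3, sigma=1); mu is the unknown.
y_obs = rng.normal(3.0, 1.0, size=100)
s_obs = y_obs.mean()                       # summary statistic of the data

# ABC rejection: draw mu from the prior, simulate a dataset, and keep mu
# only if the simulated summary is within epsilon of the observed one.
# The likelihood is never evaluated.
prior_draws = rng.uniform(0.0, 10.0, size=20000)   # flat prior on [0, 10]
epsilon = 0.05                                     # tolerance
accepted = []
for mu in prior_draws:
    s_sim = rng.normal(mu, 1.0, size=100).mean()
    if abs(s_sim - s_obs) < epsilon:
        accepted.append(mu)

posterior = np.array(accepted)             # approximate posterior sample
print(len(posterior), posterior.mean())
```

Shrinking epsilon makes the approximation tighter at the cost of more rejections; with a sufficient summary statistic and epsilon tending to zero, the accepted draws converge to the exact posterior.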
Statistical Model Specification
In statistics, model specification is part of the process of building a statistical model: specification consists of selecting an appropriate functional form for the model and choosing which variables to include. For example, given personal income y together with years of schooling s and on-the-job experience x, we might specify a functional relationship y = f(s,x) as follows:

: \ln y = \ln y_0 + \rho s + \beta_1 x + \beta_2 x^2 + \varepsilon

where \varepsilon is the unexplained error term, assumed to consist of independent and identically distributed Gaussian variables. The statistician Sir David Cox has said, "How the translation from subject-matter problem to statistical model is done is often the most critical part of an analysis".

Specification error and bias

Specification error occurs when the functional form or the choice of independent variables poorly represents relevant aspects of the true data-generating process. In particular, bias (the expected value of the difference between an estimator and the true parameter value) can result from a misspecified model.
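The specified earnings equation above can be estimated by ordinary least squares once data are available. The sketch below uses synthetic data with illustrative coefficient values (1.5, 0.08, 0.04, -0.0006) chosen here purely for demonstration; the design of the regressor matrix follows the functional form in the equation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Synthetic data following the specified functional form.
# All coefficient values below are illustrative assumptions.
s = rng.integers(8, 21, size=n).astype(float)     # years of schooling
x = rng.uniform(0.0, 40.0, size=n)                # years of experience
eps = rng.normal(0.0, 0.3, size=n)                # i.i.d. Gaussian error
ln_y = 1.5 + 0.08 * s + 0.04 * x - 0.0006 * x**2 + eps

# OLS estimation of (ln y0, rho, beta1, beta2): regress ln y on a
# constant, s, x, and x^2, exactly as the specification dictates.
X = np.column_stack([np.ones(n), s, x, x**2])
coef, *_ = np.linalg.lstsq(X, ln_y, rcond=None)
print(np.round(coef, 3))
```

Note that estimating y rather than ln y on the same right-hand side would itself be a specification error: the functional form, not just the variable list, is part of the specification.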
Method Of Simulated Moments
In econometrics, the method of simulated moments (MSM), also called the simulated method of moments, is a structural estimation technique introduced by Daniel McFadden. It extends the generalized method of moments (GMM) to cases where theoretical moment functions cannot be evaluated directly, such as when moment functions involve high-dimensional integrals. MSM's earliest and principal applications have been to research in industrial organization, after its development by Ariel Pakes, David Pollard, and others, though applications in consumption are emerging. Although the method requires the user to specify the distribution from which the simulations are to be drawn, this requirement can be relaxed through the use of an entropy-maximizing distribution.

GMM vs. MSM

* \hat{\beta}_{GMM} = \arg\min_\beta \, m(x,\beta)' W m(x,\beta), where m(x,\beta) is the moment condition and W is a weighting matrix. Using the optimal W leads to an efficient estimator.
* \hat{\beta}_{MSM} = \arg\min_\beta \, \hat{m}(x,\beta)' W \hat{m}(x,\beta), where \hat{m}(x,\beta) is a simulation-based (Monte Carlo) approximation of the moment condition m(x,\beta).
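The contrast above can be illustrated with a toy model where the moments could in fact be computed analytically but are evaluated by simulation instead, which is the MSM step. The model (y = theta + e with standard normal e), the two moments used, the identity weighting matrix, and the grid search are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Observed data from the model y = theta + e, with true theta = 2.
y = rng.normal(2.0, 1.0, size=1000)

# Empirical moments of the data: mean and variance.
m_data = np.array([y.mean(), y.var()])

def simulated_moments(theta, n_sim=20000, seed=42):
    # The MSM step: moments evaluated by Monte Carlo simulation instead
    # of analytically. Fixing the seed across theta values (common random
    # numbers) keeps the objective function smooth in theta.
    sim = np.random.default_rng(seed).normal(theta, 1.0, size=n_sim)
    return np.array([sim.mean(), sim.var()])

W = np.eye(2)   # identity weighting; the efficient W uses the moment covariance
def objective(theta):
    g = m_data - simulated_moments(theta)
    return g @ W @ g

grid = np.linspace(0.0, 4.0, 401)
theta_hat = min(grid, key=objective)
print(theta_hat)
```

Replacing `simulated_moments` with the analytical moments (theta, 1) would turn this sketch into plain GMM; MSM differs only in how the moment functions are evaluated.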
Estimation Methods
Estimation (or estimating) is the process of finding an estimate or approximation, which is a value that is usable for some purpose even if input data may be incomplete, uncertain, or unstable. The value is nonetheless usable because it is derived from the best information available (C. Lon Enloe, Elizabeth Garnett, Jonathan Miles, ''Physical Science: What the Technology Professional Needs to Know'' (2000), p. 47). Typically, estimation involves "using the value of a statistic derived from a sample to estimate the value of a corresponding population parameter" (Raymond A. Kent, "Estimation", ''Data Construction and Data Analysis for Survey Research'' (2001), p. 157). The sample provides information that can be projected, through various formal or informal processes, to determine a range most likely to describe the missing information. An estimate that turns out to be incorrect will be an overestimate if the estimate exceeds the actual result and an underestimate if the estimate falls short of the actual result.