John Muth
John Fraser Muth (September 27, 1930 – October 23, 2005) was an American economist. He is known as "the father of the rational expectations revolution in economics", primarily due to his 1961 article "Rational Expectations and the Theory of Price Movements". Muth earned his PhD in mathematical economics from Carnegie Mellon University and in 1954 was the first recipient of the Alexander Henderson Award. He was affiliated with Carnegie Mellon as a research associate from 1956 until 1959, as an assistant professor from 1959 to 1962, and as an associate professor without tenure from 1962 to 1964. He was a full professor at Michigan State University from 1964 to 1969 and a full professor at Indiana University from 1969 until his retirement in 1994. Muth asserted that expectations "are essentially the same as the predictions of the relevant economic theory." Although he formulated the rational expectations principle in the context of microeconomics, it has subsequently become associated with macroeconomics ...
Carnegie School
The Carnegie School is a school of economic thought originally formed at the Graduate School of Industrial Administration (GSIA), the current Tepper School of Business, of Carnegie Institute of Technology, the current Carnegie Mellon University, especially during the 1950s to 1970s. Faculty at the Graduate School of Industrial Administration are known for formulating two "seemingly incompatible" concepts: bounded rationality and rational expectations. The former was developed by Herbert A. Simon, along with James March, Richard Cyert and Oliver Williamson. The latter was developed by John F. Muth and later popularized by Robert Lucas Jr., Thomas Sargent, Leonard Rapping, and others. Depending on author and context, the term "Carnegie School" can refer to either both branches or only the bounded rationality branch, sometimes with the qualifier "Carnegie School of organization theory". The commonality between both branches is the use of dynamic optimization and forecasting techniques ...
Graduate School Of Industrial Administration
The Tepper School of Business is the business school of Carnegie Mellon University. It is located on the university's campus in Pittsburgh, Pennsylvania, US. The school offers degrees from the undergraduate through doctoral levels, in addition to executive education programs. The Tepper School of Business, originally known as the Graduate School of Industrial Administration (GSIA), was founded in 1949 by William Larimer Mellon. In March 2004, the school received a record $55 million gift from alumnus David Tepper and was renamed the "David A. Tepper School of Business at Carnegie Mellon". A number of Nobel Prize–winning economists have been affiliated with the school, including Herbert A. Simon, Franco Modigliani, Merton Miller, Robert Lucas, Edward Prescott, Finn Kydland, Oliver Williamson, Dale Mortensen, and Lars Peter Hansen.

History
In 1946, economist George Leland Bach was hired by the Carnegie Institute of Technology (predecessor of Carnegie Mellon University) ...
Milton Friedman
Milton Friedman (July 31, 1912 – November 16, 2006) was an American economist and statistician who received the 1976 Nobel Memorial Prize in Economic Sciences for his research on consumption analysis, monetary history and theory, and the complexity of stabilization policy. With George Stigler and others, Friedman was among the intellectual leaders of the Chicago school of economics, a neoclassical school of economic thought associated with the work of the faculty at the University of Chicago that rejected Keynesianism in favor of monetarism until the mid-1970s, when it turned to new classical macroeconomics heavily based on the concept of rational expectations. Several students, young professors and academics who were recruited or mentored by Friedman at Chicago went on to become leading economists, including Gary Becker, Robert Fogel, Thomas Sowell and Robert Lucas Jr. Friedman's challenges to what he called "naive Keynesian theory" began with his interpretation ...
Phillip D
Philip, also Phillip, is a male given name, derived from the Greek Φίλιππος (''Philippos'', lit. "horse-loving" or "fond of horses"), from a compound of φίλος (''philos'', "dear", "loved", "loving") and ἵππος (''hippos'', "horse"). Prominent Philips who popularized the name include kings of Macedonia and one of the apostles of early Christianity. ''Philip'' has many alternative spellings. One derivation often used as a surname is Phillips. It was also found during ancient Greek times with two Ps, as Philippides and Philippos. It has many diminutive (or even hypocoristic) forms, including Phil, Philly, Lip, Pip, Pep or Peps. There are also feminine forms, such as Philippine and Philippa.

Antiquity
Kings of Macedon
* Philip I of Macedon
* Philip II of Macedon, father of Alexander the Great
* Philip III of Macedon, half-brother of Alexander the Great
* Philip IV of Macedon
* Philip V of Macedon

New Testament
* Philip the Apostle
* Philip the Evangelist

Others
* Philippus of Croton (c. 6th century ...)
Stochastic Variable
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events. It is a mapping or a function from possible outcomes (e.g., the possible upper sides of a flipped coin, such as heads H and tails T) in a sample space (e.g., the set {H, T}) to a measurable space, often the real numbers (e.g., the set {−1, 1}, in which 1 corresponds to H and −1 corresponds to T). Informally, randomness typically represents some fundamental element of chance, such as in the roll of a die; it may also represent uncertainty, such as measurement error. However, the interpretation of probability is philosophically complicated, and even in specific cases is not always straightforward. The purely mathematical analysis of random variables is independent of such interpretational difficulties, and can be based upon a rigorous axiomatic setup. In the formal mathematical language of measure theory, a random variable is defined as a measurable function ...
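The coin-flip mapping described above can be made concrete in a few lines of code. This is an illustrative sketch only (the sample space, mapping, and simulation size are chosen for the example): the random variable is just a function from outcomes {H, T} to the reals {1, −1}.

```python
import random

# A random variable as a function from a sample space to the reals:
# the coin-flip sample space {"H", "T"} is mapped to {1, -1}.
sample_space = ["H", "T"]
X = {"H": 1, "T": -1}  # the random variable as an explicit mapping

random.seed(0)
outcomes = [random.choice(sample_space) for _ in range(10_000)]
values = [X[w] for w in outcomes]

# For a fair coin, the sample mean of X should be close to 0.
mean = sum(values) / len(values)
print(round(mean, 2))
```

Note that the randomness lives entirely in the draw of outcomes; the random variable `X` itself is a deterministic mapping, exactly as the measure-theoretic definition requires.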
Certainty Equivalence
Stochastic control or stochastic optimal control is a subfield of control theory that deals with the existence of uncertainty either in observations or in the noise that drives the evolution of the system. The system designer assumes, in a Bayesian probability-driven fashion, that random noise with known probability distribution affects the evolution and observation of the state variables. Stochastic control aims to design the time path of the controlled variables that performs the desired control task with minimum cost, however defined, despite the presence of this noise. The context may be either discrete time or continuous time.

Certainty equivalence
An extremely well-studied formulation in stochastic control is that of linear quadratic Gaussian control. Here the model is linear, the objective function is the expected value of a quadratic form, and the disturbances are purely additive. A basic result for discrete-time centralized systems with only additive uncertainty is the certainty equivalence property ...
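Certainty equivalence in the linear quadratic Gaussian setting can be sketched in the scalar case: the optimal feedback gain solves the deterministic linear quadratic problem via the Riccati equation and does not depend on the noise variance at all. The parameter values below are illustrative, not from any particular source.

```python
# Scalar LQG sketch: dynamics x' = a*x + b*u + w, stage cost q*x^2 + r*u^2.
# Under certainty equivalence, the optimal gain k ignores the noise w.
a, b = 1.1, 0.5   # open-loop dynamics (unstable, since a > 1)
q, r = 1.0, 1.0   # state and control cost weights

# Iterate the discrete-time Riccati equation to its fixed point p.
p = q
for _ in range(1000):
    p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)

# Optimal feedback u = -k*x; k depends only on (a, b, q, r).
k = a * b * p / (r + b * b * p)
closed_loop = a - b * k  # stable if |a - b*k| < 1
print(round(k, 4), round(closed_loop, 4))
```

The same gain would be computed whatever the variance of the additive noise, which is precisely the certainty equivalence property: the controller acts as if the disturbances were absent.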
Moment (mathematics)
In mathematics, the moments of a function are certain quantitative measures related to the shape of the function's graph. If the function represents mass density, then the zeroth moment is the total mass, the first moment (normalized by total mass) is the center of mass, and the second moment is the moment of inertia. If the function is a probability distribution, then the first moment is the expected value, the second central moment is the variance, the third standardized moment is the skewness, and the fourth standardized moment is the kurtosis. The mathematical concept is closely related to the concept of moment in physics. For a distribution of mass or probability on a bounded interval, the collection of all the moments (of all orders, from 0 to ∞) uniquely determines the distribution (Hausdorff moment problem). The same is not true on unbounded intervals (Hamburger moment problem). In the mid-nineteenth century, Pafnuty Chebyshev became the first person to think systematically ...
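The chain of moments named above (mean, variance, skewness, kurtosis) can be computed directly from their definitions. A minimal sketch, using a small made-up sample and population (divide-by-n) conventions:

```python
# First four moments of a sample, per the definitions in the text:
# mean = first moment, variance = second central moment,
# skewness = third standardized moment, kurtosis = fourth standardized moment.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

n = len(data)
mean = sum(data) / n                                    # first raw moment
var = sum((x - mean) ** 2 for x in data) / n            # second central moment
std = var ** 0.5
skew = sum(((x - mean) / std) ** 3 for x in data) / n   # third standardized
kurt = sum(((x - mean) / std) ** 4 for x in data) / n   # fourth standardized

print(mean, var, skew, kurt)
```

Central moments subtract the mean before raising to a power; standardized moments additionally divide by the standard deviation, which is what makes skewness and kurtosis scale-free shape measures.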
Expected Value
In probability theory, the expected value (also called expectation, expectancy, mathematical expectation, mean, average, or first moment) is a generalization of the weighted average. Informally, the expected value is the arithmetic mean of a large number of independently selected outcomes of a random variable. The expected value of a random variable with a finite number of outcomes is a weighted average of all possible outcomes. In the case of a continuum of possible outcomes, the expectation is defined by integration. In the axiomatic foundation for probability provided by measure theory, the expectation is given by Lebesgue integration. The expected value of a random variable ''X'' is often denoted by E(''X''), E[''X''], or E''X'', with E also often stylized as '''E''' or \mathbb{E}.

History
The idea of the expected value originated in the middle of the 17th century from the study of the so-called problem of points, which seeks to divide the stakes ''in a fair way'' between two players, who have to end their game before it is properly finished ...
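The finite-outcome case above is just a probability-weighted sum. A minimal sketch with a fair six-sided die:

```python
# Expected value of a fair six-sided die: a weighted average of the outcomes,
# E[X] = sum of x * P(X = x) over all outcomes x.
outcomes = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6  # uniform probabilities for a fair die

ev = sum(x * p for x, p in zip(outcomes, probs))
print(ev)  # 3.5
```

Note that 3.5 is not itself a possible outcome; the expected value is the long-run average of many rolls, not a value any single roll can take.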
Linear Functions
Linearity is the property of a mathematical relationship ('' function'') that can be graphically represented as a straight line. Linearity is closely related to '' proportionality''. Examples in physics include rectilinear motion, the linear relationship of voltage and current in an electrical conductor (Ohm's law), and the relationship of mass and weight. By contrast, more complicated relationships are ''nonlinear''. Generalized for functions in more than one dimension, linearity means the property of a function of being compatible with addition and scaling, also known as the superposition principle. The word linear comes from Latin ''linearis'', "pertaining to or resembling a line".

In mathematics
In mathematics, a linear map or linear function ''f''(''x'') is a function that satisfies the two properties:
* Additivity: ''f''(''x'' + ''y'') = ''f''(''x'') + ''f''(''y'').
* Homogeneity of degree 1: ''f''(α''x'') = α''f''(''x'') for all α.
These properties are known as the superposition principle. In this definition, ''x'' is not necessarily a real number ...
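The two defining properties can be checked numerically on a handful of sample points. This is only a spot-check sketch (the test points and tolerance are arbitrary), contrasting a genuinely linear map with a quadratic one:

```python
# Numerically spot-check additivity f(x + y) = f(x) + f(y) and
# homogeneity f(a*x) = a*f(x) on a few sample points.
def f(x):
    return 3.0 * x   # linear: passes both checks

def g(x):
    return x * x     # quadratic: fails both, hence nonlinear

def looks_linear(h, samples=((1.0, 2.0), (-0.5, 4.0), (3.0, -7.0)), alpha=2.5):
    additive = all(abs(h(x + y) - (h(x) + h(y))) < 1e-9 for x, y in samples)
    homogeneous = all(abs(h(alpha * x) - alpha * h(x)) < 1e-9 for x, _ in samples)
    return additive and homogeneous

print(looks_linear(f), looks_linear(g))  # True False
```

Passing a finite spot-check does not prove linearity, but a single failure (e.g. g(1 + 2) = 9 while g(1) + g(2) = 5) does prove nonlinearity.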
Decision Rule
In decision theory, a decision rule is a function which maps an observation to an appropriate action. Decision rules play an important role in the theory of statistics and economics, and are closely related to the concept of a strategy in game theory. In order to evaluate the usefulness of a decision rule, it is necessary to have a loss function detailing the outcome of each action under different states.

Formal definition
Given an observable random variable ''X'' over the probability space (\mathcal{X}, \Sigma, P_\theta), determined by a parameter ''θ'' ∈ ''Θ'', and a set ''A'' of possible actions, a (deterministic) decision rule is a function ''δ'' : \mathcal{X} → ''A''.

Examples of decision rules
* An estimator is a decision rule used for estimating a parameter. In this case the set of actions is the parameter space, and a loss function details the cost of the discrepancy between the true value of the parameter and the estimate ...
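The estimator-as-decision-rule example can be sketched concretely. Everything below is hypothetical illustration (the true parameter, the sample, and the two competing rules are invented): each rule maps the observation (a sample) to an action (an estimate), and squared-error loss scores the actions.

```python
# Two decision rules delta : observation -> action, where the action is an
# estimate of the unknown mean, scored by squared-error loss.
def rule_mean(sample):
    return sum(sample) / len(sample)   # the sample-mean estimator

def rule_zero(sample):
    return 0.0                         # a (bad) constant rule, for contrast

def loss(theta, estimate):
    return (theta - estimate) ** 2     # squared-error loss

theta = 2.0                            # true (normally unknown) parameter
sample = [1.8, 2.3, 1.9, 2.1]          # the observation X

print(loss(theta, rule_mean(sample)), loss(theta, rule_zero(sample)))
```

Comparing rules by their loss (or, more properly, by expected loss over the sampling distribution) is exactly how decision theory ranks estimators.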
Probability Distribution
In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of different possible outcomes for an experiment. It is a mathematical description of a random phenomenon in terms of its sample space and the probabilities of events (subsets of the sample space). For instance, if ''X'' is used to denote the outcome of a coin toss ("the experiment"), then the probability distribution of ''X'' would take the value 0.5 (1 in 2 or 1/2) for ''X'' = heads, and 0.5 for ''X'' = tails (assuming that the coin is fair). Examples of random phenomena include the weather conditions at some future date, the height of a randomly selected person, the fraction of male students in a school, the results of a survey to be conducted, etc.

Introduction
A probability distribution is a mathematical description of the probabilities of events, subsets of the sample space. The sample space, often denoted by \Omega, is the set of all possible outcomes of a random phenomenon ...
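The coin-toss distribution from the paragraph above can be written down and sampled directly. A minimal sketch (the dictionary representation and simulation size are choices made for the example):

```python
import random

# The fair-coin distribution from the text: P(X = heads) = P(X = tails) = 0.5.
dist = {"heads": 0.5, "tails": 0.5}

# A valid probability distribution assigns nonnegative mass summing to 1.
assert all(p >= 0 for p in dist.values())
assert abs(sum(dist.values()) - 1.0) < 1e-12

# Sampling reproduces the assigned probabilities in the long run.
random.seed(1)
draws = random.choices(list(dist), weights=list(dist.values()), k=10_000)
freq = draws.count("heads") / len(draws)
print(round(freq, 2))
```

The distinction the article draws is visible here: `dist` is the fixed mathematical description, while `draws` is one realization of the random phenomenon it describes.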
Quadratic Relation
In mathematics, the term quadratic describes something that pertains to squares, to the operation of squaring, to terms of the second degree, or to equations or formulas that involve such terms. ''Quadratus'' is Latin for ''square''.

Mathematics
Algebra (elementary and abstract)
* Quadratic function (or quadratic polynomial), a polynomial function that contains terms of at most second degree
** Complex quadratic polynomials, which are particularly interesting for their sometimes chaotic properties under iteration
* Quadratic equation, a polynomial equation of degree 2 (reducible to 0 = ''ax''² + ''bx'' + ''c'')
* Quadratic formula, a calculation to solve a quadratic equation for the independent variable (''x'')
* Quadratic field, an algebraic number field of degree two over the field of rational numbers
* Quadratic irrational or "quadratic surd", an irrational number that is a root of a quadratic polynomial

Calculus
* Quadratic integral, the integral of the reciprocal of a second-degree polynomial ...
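The quadratic formula listed above is short enough to sketch directly. Using complex square roots means a negative discriminant still yields the two (complex-conjugate) roots rather than an error:

```python
import cmath

# Solve a*x^2 + b*x + c = 0 via the quadratic formula
# x = (-b +/- sqrt(b^2 - 4ac)) / (2a), allowing complex roots.
def solve_quadratic(a, b, c):
    d = cmath.sqrt(b * b - 4 * a * c)  # discriminant's square root
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

print(solve_quadratic(1, -3, 2))  # roots of x^2 - 3x + 2 are 2 and 1
```

For real coefficients with a positive discriminant both roots come back with zero imaginary part; a production implementation would also guard against a = 0, which degenerates to a linear equation.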