New Keynesian economics is a school of
macroeconomics
that strives to provide
microeconomic foundations for
Keynesian economics
. It developed partly as a response to criticisms of Keynesian macroeconomics by adherents of
new classical macroeconomics.
Two main assumptions define the New Keynesian approach to macroeconomics. Like the New Classical approach, New Keynesian macroeconomic analysis usually assumes that households and firms have
rational expectations. However, the two schools differ in that New Keynesian analysis usually assumes a variety of
market failures. In particular, New Keynesians assume that there is
imperfect competition
In economics, imperfect competition refers to a situation where the characteristics of an economic market do not fulfil all the necessary conditions of a perfectly competitive market. Imperfect competition causes market inefficiencies, resulting in ...
in price and wage setting to help explain why prices and wages can become "
sticky", which means they do not adjust instantaneously to changes in economic conditions.
Wage and price stickiness, and the other market failures described in New Keynesian models, imply that the economy may fail to attain
full employment. Therefore, New Keynesians argue that macroeconomic stabilization by the government (using
fiscal policy
) and the
central bank
(using
monetary policy
) can lead to a more
efficient macroeconomic outcome than a ''
laissez faire'' policy would.
New Keynesianism became part of the
new neoclassical synthesis
that incorporated parts of both it and
new classical macroeconomics, and forms the theoretical basis of mainstream macroeconomics today.
Woodford, Michael (January 2008). ''Convergence in Macroeconomics: Elements of the New Synthesis''.
Mankiw, N. Gregory (May 2006). ''The Macroeconomist as Scientist and Engineer''. pp. 14–15.
Goodfriend, Marvin and King, Robert G. (June 1997). ''The New Neoclassical Synthesis and the Role of Monetary Policy''. Federal Reserve Bank of Richmond. Working papers. No. 98–5.
Development of New Keynesian economics
1970s
The first wave of New Keynesian economics developed in the late 1970s. The first model of ''sticky information'' was developed by
Stanley Fischer in his 1977 article, ''Long-Term Contracts, Rational Expectations, and the Optimal Money Supply Rule''. He adopted a "staggered" or "overlapping" contract model. Suppose that there are two unions in the economy, who take turns to choose wages. When it is a union's turn, it chooses the wages it will set for the next two periods. This contrasts with
John B. Taylor's model where the nominal wage is constant over the contract life, as was subsequently developed in his two articles: one in 1979, "Staggered wage setting in a macro model", and one in 1980, "Aggregate Dynamics and Staggered Contracts". Both Taylor and Fischer contracts share the feature that only the unions setting the wage in the current period are using the latest information: wages in half of the economy still reflect old information. The Taylor model had sticky nominal wages in addition to the sticky information: nominal wages had to be constant over the length of the contract (two periods). These early new Keynesian theories were based on the basic idea that, given fixed nominal wages, a monetary authority (central bank) can control the employment rate. Since wages are fixed at a nominal rate, the monetary authority can control the
real wage (wage values adjusted for inflation) by changing the money supply and thus affect the employment rate.
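The staggering mechanism can be sketched in a deliberately simplified simulation (an illustration of the overlapping-contracts idea, not either paper's full model with expectations: here the resetting cohort simply matches the current money supply). After a permanent money-supply shock, half of the wages in the economy are still set under old contracts, so real balances and output rise until all contracts have reset.

```python
# Toy simulation of two-period staggered (Taylor-style) wage contracts.
# Illustrative assumptions: the resetting cohort sets its contract wage
# equal to the current (log) money supply; output rises with real balances.

def simulate(periods=6, shock_at=2):
    m = [0.0] * periods            # log money supply
    for t in range(shock_at, periods):
        m[t] = 1.0                 # permanent money-supply shock at t = 2
    x = [0.0] * periods            # contract wage chosen by the resetting cohort
    w = [0.0] * periods            # aggregate (average) wage
    y = [0.0] * periods            # output proxy: real balances m - w
    for t in range(periods):
        x[t] = m[t]                # resetting cohort matches current money
        prev = x[t - 1] if t > 0 else 0.0
        w[t] = 0.5 * (x[t] + prev)  # half of wages are last period's contracts
        y[t] = m[t] - w[t]
    return w, y

w, y = simulate()
print(w)   # [0.0, 0.0, 0.5, 1.0, 1.0, 1.0]
print(y)   # [0.0, 0.0, 0.5, 0.0, 0.0, 0.0]
```

In this toy version the monetary shock raises output only while stale contracts persist; once every cohort has reset, money is neutral again, which is the sense in which fixed nominal wages give the monetary authority short-run leverage.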
1980s
Menu costs and imperfect competition
In the 1980s the key concept of using menu costs in a framework of
imperfect competition
to explain price stickiness was developed. The concept of a lump-sum cost (menu cost) to changing the price was originally introduced by Sheshinski and Weiss (1977) in their paper looking at the effect of inflation on the frequency of price-changes. The idea of applying it as a general theory of
nominal price rigidity was simultaneously put forward by several economists in 1985–86.
George Akerlof and
Janet Yellen put forward the idea that due to
bounded rationality
firms will not want to change their price unless the benefit is more than a small amount. This
bounded rationality
leads to inertia in nominal prices and wages which can lead to output fluctuating at constant nominal prices and wages.
Gregory Mankiw took the menu-cost idea and focused on the welfare effects of changes in output resulting from
sticky prices. Michael Parkin also put forward the idea. Although the approach initially focused mainly on the rigidity of nominal prices, it was extended to wages and prices by
Olivier Blanchard and
Nobuhiro Kiyotaki in their influential article "Monopolistic Competition and the Effects of Aggregate Demand".
Huw Dixon and Claus Hansen showed that even if menu costs applied to a small sector of the economy, this would influence the rest of the economy and lead to prices in the rest of the economy becoming less responsive to changes in demand.
While some studies suggested that menu costs are too small to have much of an aggregate impact,
Laurence M. Ball and
David Romer showed in 1990 that
real rigidities could interact with nominal rigidities to create significant disequilibrium. Real rigidities occur whenever a firm is slow to adjust its real prices in response to a changing economic environment. For example, a firm can face real rigidities if it has market power or if its costs for inputs and wages are locked in by a contract. Ball and Romer argued that real rigidities in the labor market keep a firm's costs high, which makes firms hesitant to cut prices and lose revenue. The expense created by real rigidities combined with the menu cost of changing prices makes it less likely that a firm will cut prices to a market-clearing level.
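The menu-cost decision can be illustrated with a minimal sketch (the quadratic loss function and the numbers are illustrative assumptions, not from any of the papers above): because profit is flat near its maximum, the loss from keeping a stale price is second-order in the price gap, so even a small fixed cost of repricing can rationalize non-adjustment.

```python
# Illustrative menu-cost rule: keep the old price unless the profit loss
# from the stale price exceeds the fixed cost of changing it. The quadratic
# loss is a second-order approximation of profit around the optimum.

def adjusts(old_price, optimal_price, curvature=1.0, menu_cost=0.05):
    loss = curvature * (old_price - optimal_price) ** 2  # second-order loss
    return loss > menu_cost

# A 10% gap from the optimum: loss ~0.01 < 0.05, so the firm stays put.
print(adjusts(1.0, 1.1))   # False
# A 30% gap: loss ~0.09 > 0.05, so the firm pays the menu cost and adjusts.
print(adjusts(1.0, 1.3))   # True
```

The asymmetry highlighted by Akerlof, Yellen, and Mankiw is visible here: the firm's private loss from not adjusting is tiny for small gaps, while the aggregate consequences of many firms leaving prices unchanged can be first-order.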
Even if prices are perfectly flexible, imperfect competition can affect the influence of fiscal policy in terms of the multiplier. Huw Dixon and Gregory Mankiw independently developed simple general equilibrium models showing that the fiscal multiplier could be increasing with the degree of imperfect competition in the output market. The reason for this is that
imperfect competition
in the output market tends to reduce the
real wage, leading to the household substituting away from
consumption towards
leisure
. When
government spending
is increased, the corresponding increase in
lump-sum taxation causes both leisure and consumption to decrease (assuming that both are normal goods). The greater the degree of imperfect competition in the output market, the lower the
real wage and hence the more the reduction falls on leisure (i.e. households work more) and less on consumption. Hence the
fiscal multiplier
is less than one, but increasing in the degree of imperfect competition in the output market.
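This result can be illustrated in a toy static model (a sketch with assumed functional forms, not the exact specification of either paper): households with log utility over consumption and leisure, output produced one-for-one from labor, firms charging markup mu so the real wage is 1/mu, profits rebated to households, and government spending financed by a lump-sum tax. Solving the labor-supply condition gives dY/dG = 1/(1 + k) with k = a/(mu*(1 - a)), which is below one and increasing in the markup.

```python
# Toy static model of the fiscal multiplier under imperfect competition.
# Illustrative assumptions: utility a*ln(C) + (1-a)*ln(1-N), production
# Y = N, real wage 1/mu, profits rebated, lump-sum tax T = G.

def multiplier(mu, a=0.5):
    # Labor-supply condition C = a*(1-N)/(mu*(1-a)) combined with the
    # resource constraint C = N - G gives N = (G + k)/(1 + k), where
    # k = a/(mu*(1-a)), so dY/dG = 1/(1 + k) < 1.
    k = a / (mu * (1.0 - a))
    return 1.0 / (1.0 + k)

for mu in (1.0, 1.25, 1.5, 2.0):   # mu = 1 is perfect competition
    print(mu, round(multiplier(mu), 3))
```

With a = 0.5 the multiplier is mu/(1 + mu): it equals 0.5 under perfect competition and rises toward (but never reaches) one as the markup grows, matching the statement above.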
Calvo staggered contracts model
In 1983
Guillermo Calvo wrote "Staggered Prices in a Utility-Maximizing Framework". The original article was written in a
continuous time mathematical framework, but the model is nowadays mostly used in its
discrete time version. The Calvo model has become the most common way to model nominal rigidity in new Keynesian models. There is a probability ''h'' that the firm can reset its price in any one period (the hazard rate), or equivalently the probability (1 − ''h'') that the price will remain unchanged in that period (the survival rate). The probability ''h'' is sometimes called the "Calvo probability" in this context. In the Calvo model the crucial feature is that the price-setter does not know how long the nominal price will remain in place, in contrast to the Taylor model where the length of contract is known ''ex ante''.
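Writing ''h'' for the per-period probability that a firm resets its price, a constant hazard implies that the length of a price spell is geometrically distributed with mean 1/''h'': with a quarterly reset probability of 0.25, prices last four quarters on average. A short numerical check of this implication:

```python
# Under Calvo pricing a price survives each period with probability (1 - h)
# and is reset with probability h, so spell lengths are geometric with
# mean 1/h. Verify by summing k * P(spell lasts exactly k periods).

def expected_duration(h, horizon=10_000):
    return sum(k * h * (1.0 - h) ** (k - 1) for k in range(1, horizon))

print(expected_duration(0.25))   # close to 4.0 = 1/0.25
print(expected_duration(0.5))    # close to 2.0 = 1/0.5
```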
Coordination failure
Coordination failure was another important new Keynesian concept developed as another potential explanation for recessions and unemployment. In recessions a factory can go idle even though there are people willing to work in it, and people willing to buy its production if they had jobs. In such a scenario, economic downturns appear to be the result of coordination failure: The invisible hand fails to coordinate the usual, optimal, flow of production and consumption.
Russell Cooper
and Andrew John's 1988 paper "Coordinating Coordination Failures in Keynesian Models" expressed a general form of coordination as models with multiple equilibria where agents could coordinate to improve (or at least not harm) each of their respective situations.
Cooper and John based their work on earlier models including
Peter Diamond's 1982
coconut model, which demonstrated a case of coordination failure involving
search and matching theory. In Diamond's model producers are more likely to produce if they see others producing. The increase in possible trading partners increases the likelihood of a given producer finding someone to trade with. As in other cases of coordination failure, Diamond's model has multiple equilibria, and the welfare of one agent is dependent on the decisions of others. Diamond's model is an example of a "thick-market
externality
" that causes markets to function better when more people and firms participate in them. Other potential sources of coordination failure include
self-fulfilling prophecies. If a firm anticipates a fall in demand, it might cut back on hiring. A lack of job vacancies might worry workers, who then cut back on their consumption. This fall in demand meets the firm's expectations, but it is entirely due to the firm's own actions.
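The multiple-equilibria logic behind coordination failure can be sketched in a toy participation game (illustrative assumptions in the spirit of, but much simpler than, Diamond's model): each agent's payoff from producing rises with the fraction of others producing, so both "everyone produces" and "no one produces" are self-fulfilling outcomes, and best-response dynamics converge to whichever equilibrium beliefs start near.

```python
# Toy coordination game: produce (payoff f - c, where f is the expected
# fraction of others producing and c is the production cost) or stay idle
# (payoff 0). Best-response dynamics from different initial beliefs reach
# different self-fulfilling equilibria.

def best_response_dynamics(f0, cost=0.4, steps=50):
    f = f0
    for _ in range(steps):
        f = 1.0 if f > cost else 0.0   # everyone produces iff f - c > 0
    return f

print(best_response_dynamics(0.9))   # optimistic beliefs -> 1.0 (boom)
print(best_response_dynamics(0.1))   # pessimistic beliefs -> 0.0 (slump)
```

Both outcomes are equilibria of the same fundamentals; which one obtains depends only on what agents expect others to do, which is the sense in which the invisible hand can fail to coordinate production.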
Labor market failures: Efficiency wages
New Keynesians offered explanations for the failure of the labor market to clear. In a Walrasian market, unemployed workers bid down wages until the demand for workers meets the supply. If markets are Walrasian, the ranks of the unemployed would be limited to workers transitioning between jobs and workers who choose not to work because wages are too low to attract them. They developed several theories explaining why markets might leave willing workers unemployed. The most important of these theories was the
efficiency wage theory used to explain
long-term effects of previous unemployment, where short-term increases in unemployment become permanent and lead to higher levels of unemployment in the long-run.

In efficiency wage models, workers are paid at levels that maximize productivity instead of clearing the market. For example, in developing countries, firms might pay more than a market rate to ensure their workers can afford enough nutrition to be productive. Firms might also pay higher wages to increase loyalty and morale, possibly leading to better productivity. Firms can also pay higher than market wages to forestall shirking. Shirking models were particularly influential.
Carl Shapiro
and
Joseph Stiglitz's 1984 paper "Equilibrium Unemployment as a Worker Discipline Device" created a model where employees tend to avoid work unless firms can monitor worker effort and threaten slacking employees with unemployment. If the economy is at full employment, a fired shirker simply moves to a new job. Individual firms pay their workers a premium over the market rate to ensure their workers would rather work and keep their current job instead of shirking and risk having to move to a new job. Since each firm pays more than market clearing wages, the aggregated labor market fails to clear. This creates a pool of unemployed laborers and adds to the expense of getting fired. Workers not only risk a lower wage, they risk being stuck in the pool of unemployed. Keeping wages above market clearing levels creates a serious disincentive to shirk that makes workers more efficient even though it leaves some willing workers unemployed.
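The incentive logic can be compressed into a one-period sketch (a simplification of Shapiro and Stiglitz's dynamic model, with illustrative numbers): a worker who shirks saves the effort cost ''e'' but is caught with probability ''q'' and falls to the unemployment payoff ''u'', so working is incentive-compatible only when the wage carries a premium, w ≥ u + e/q, over the outside option. No market-clearing wage (w = u) can satisfy this condition.

```python
# One-period sketch of the no-shirking condition. A worker compares
#   working:  w - e            (wage minus effort cost)
#   shirking: (1-q)*w + q*u    (keep the wage unless caught, prob. q)
# Working is chosen iff w - e >= (1-q)*w + q*u, i.e. w >= u + e/q.

def no_shirk_wage(u, e, q):
    return u + e / q          # minimum wage that deters shirking

def worker_shirks(w, u, e, q):
    return w - e < (1 - q) * w + q * u

u, e, q = 1.0, 0.2, 0.5       # illustrative numbers
print(no_shirk_wage(u, e, q))           # a premium over the outside option u = 1.0
print(worker_shirks(1.0, u, e, q))      # True: the market-clearing wage fails
print(worker_shirks(1.4, u, e, q))      # False: the premium deters shirking
```

Because every firm must pay this premium, the economy-wide wage sits above the market-clearing level, and the resulting pool of unemployed workers is itself part of the discipline device.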
1990s
New neoclassical synthesis
In the early 1990s, economists began to combine the elements of new Keynesian economics developed in the 1980s and earlier with
Real Business Cycle Theory. RBC models were dynamic but assumed perfect competition; new Keynesian models were primarily static but based on imperfect competition. The
new neoclassical synthesis
essentially combined the dynamic aspects of RBC with imperfect competition and nominal rigidities of new Keynesian models. Tack Yun was one of the first to do this, in a model that used the
Calvo pricing model. Goodfriend and King proposed a list of four elements that are central to the new synthesis: intertemporal optimization, rational expectations, imperfect competition, and costly price adjustment (menu costs). Goodfriend and King also find that the consensus models produce certain policy implications: monetary policy can affect real output in the short run, but there is no long-run trade-off: money is not
neutral in the short run, though it is in the long run. Inflation has negative welfare effects. It is important for central banks to maintain credibility through rules-based policy such as inflation targeting.
Taylor Rule
In 1993, John B. Taylor formulated the idea of a
Taylor rule, which is a reduced form approximation of the responsiveness of the
nominal interest rate, as set by the
central bank
, to changes in inflation,
output
, or other economic conditions. In particular, the rule describes how, for each one-percent increase in inflation, the central bank tends to raise the nominal interest rate by more than one percentage point. This aspect of the rule is often called the Taylor principle. Although such rules provide concise, descriptive proxies for central bank policy, they are not, in practice, followed as explicit prescriptions by central banks when setting nominal rates.
Taylor's original version of the rule describes how the nominal interest rate responds to divergences of actual inflation rates from ''target'' inflation rates and of actual gross domestic product (GDP) from ''potential'' GDP:

i_t = \pi_t + r_t^* + a_\pi (\pi_t - \pi_t^*) + a_y (y_t - \bar{y}_t)

In this equation, i_t is the target short-term nominal interest rate (e.g. the federal funds rate in the US, the Bank of England base rate in the UK), \pi_t is the rate of inflation as measured by the GDP deflator, \pi_t^* is the desired rate of inflation, r_t^* is the assumed equilibrium real interest rate, a_\pi and a_y are the policy response coefficients (both set to 0.5 in Taylor's 1993 paper), y_t is the logarithm of real GDP, and \bar{y}_t is the logarithm of potential output, as determined by a linear trend.
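With Taylor's original illustrative parameter values, a coefficient of 0.5 on both gaps, a 2% inflation target, and a 2% equilibrium real rate, the rule reduces to a small function (these numbers are Taylor's 1993 illustration, not any central bank's actual reaction function):

```python
# Taylor (1993) rule with his original illustrative coefficients:
#   i = pi + r_star + 0.5*(pi - pi_star) + 0.5*(y - y_potential)

def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0):
    return inflation + r_star + 0.5 * (inflation - pi_star) + 0.5 * output_gap

print(taylor_rate(2.0, 0.0))   # on target, zero gap: nominal rate 4.0%
print(taylor_rate(3.0, 0.0))   # inflation 1 point higher: rate rises 1.5 points
```

The second call illustrates the Taylor principle: a one-point rise in inflation raises the nominal rate by 1.5 points, so the real interest rate rises and policy leans against the inflation.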
New Keynesian Phillips curve
The New Keynesian Phillips curve was originally derived by Roberts in 1995, and has since been used in most state-of-the-art New Keynesian DSGE models. The new Keynesian Phillips curve says that this period's inflation depends on current output and the expectations of next period's inflation. The curve is derived from the dynamic Calvo model of pricing and in mathematical terms is:
The current period expectations of next period's inflation are incorporated as
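Iterating the curve forward (a standard textbook manipulation, under the assumption that the discounted expected inflation term vanishes in the limit) shows that inflation today is the discounted sum of all expected future output gaps, which is why forward-looking expectations are central to New Keynesian policy analysis:

```latex
% Solving the New Keynesian Phillips curve forward, assuming
% \lim_{T \to \infty} \beta^{T} E_t[\pi_{t+T}] = 0:
\begin{aligned}
\pi_t &= \beta E_t[\pi_{t+1}] + \kappa y_t \\
      &= \kappa y_t + \beta E_t\!\left[\kappa y_{t+1} + \beta E_{t+1}[\pi_{t+2}]\right] \\
      &= \kappa \sum_{k=0}^{\infty} \beta^{k} E_t[y_{t+k}]
\end{aligned}
```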