On Refereeing: Do we have Confidence in our Economic Institutions?

referee.jpg

Like most academics, I spend much of my time asking for money from research councils. So it is a welcome change for me to sit on the other side of the table in my role on the management team of Rebuilding Macroeconomics. This is an initiative located at the National Institute of Economic and Social Research in the UK and funded by the Economic and Social Research Council. Our remit is to act as gatekeepers, distributing approximately £2.4 million over the next four years to projects that have the potential to transform macroeconomics back into a truly policy-relevant social science. We are seeking risky projects, combining insights from different disciplines, that would not normally be funded, and we expect that not all of them will succeed. It is our hope that one or more of the projects we fund will lead to academic advances and new solutions to the pressing policy issues of our time.

In addition to my role on the management team of Rebuilding Macroeconomics, I am Research Director at NIESR. I have already learnt a great deal from discussions with the other members of the team. The committee consists of myself, Angus Armstrong of Lloyds Bank, Laura Bear of LSE, Doyne Farmer of Oxford University, and David Tuckett of UCL. Laura is a Professor of Anthropology who has worked extensively on the anthropology of the urban economy and she brings a refreshing perspective to the sometimes-insular world of economics. Doyne Farmer, no relation, is a complexity theorist who runs the INET Complexity Economics Centre at the Oxford Martin School. Doyne was trained as a physicist and he has a long association with the Santa Fe Institute in New Mexico. And last, but by no means least, David Tuckett is a psychologist at University College London, where he directs the UCL Centre for the Study of Decision Making Under Uncertainty. As you might imagine, conversations among this diverse group have been eye-opening for all of us.

We have chosen to allocate funds by identifying a number of ‘hubs’ that are loosely based around a set of pressing public issues. So far, we have identified three: 1) Can globalisation benefit all? 2) Why are economies unstable? and 3) Do we have confidence in economic institutions? In this post, I want to focus on the third of these questions, which evolved from conversations among the management team and which Laura and I have spent quite a bit of time refining.

We can break institutions into two broad groups: academic institutions that shape the culture of economists, and government and policy institutions that transmit this culture to the wider public sphere. Research on academic institutions involves the organisation of economic education in universities, the journal structure, the rules for promotion and tenure in academic departments, and the socialisation and seminar culture of the tribe of the Econ. Research on policy-making institutions like the Bank of England, the Treasury and the IMF involves the way that insular thinking, learned in graduate schools, is transmitted to society at large.

Insular thinking is reflected, for example, in economic journal publishing, a process that is highly centralised around five leading journals: the American Economic Review, the Quarterly Journal of Economics, the Review of Economic Studies, Econometrica and the Journal of Political Economy. For a young newly appointed lecturer, publishing a paper in one of these top five journals is a prerequisite for promotion in a leading economics department in the United States, the United Kingdom and many of the top Continental European departments. This process is more often than not depressingly slow. Even for a well-established leading economist, publication in a top five journal is never guaranteed. And when a paper is finally published, it is often after rejection from three or more other journals and the collective efforts of a coterie of referees. This experience, as I learned from Doyne, is not characteristic of the natural sciences.

From the Redpen/Blackpen Twitter feed

In economics, the expected time from writing a working paper to publication in a journal is around four years. That assumes that the researcher is shooting for a top journal and is prepared to accept several rejections along the way. When a paper is finally accepted it must, more often than not, be extensively rewritten to meet the proclivities of the referees. In my experience, not all of the referee reports lead to improvements. Sometimes, the input of dedicated referees can improve the final product. At other times, referee comments lead to monstrous additions as the editor incorporates the inconsistent approaches of referees with conflicting views of what the paper is about. 

It is not like that in other disciplines. I will paraphrase from my memory of a conversation with Doyne, so if you are a physicist or a biologist with new information, please feel free to let me know in the comment section of this blog. In physics, a researcher is rightfully upset if she does not receive feedback within a month, and that feedback involves short comments and an up-or-down decision. In physics there is far less of a hierarchy of journals: publication is swift and many journals carry equal weight in promotion and tenure decisions.

I do not know why economics and physics are so different but I suspect that it is related to the fact that economics is not an experimental science. In macroeconomics, in particular, there are often many competing explanations for the same limited facts and it would be destructive to progress if every newly minted graduate student were to propose their own new theory to explain those facts. Instead, internal discipline is maintained by a priestly caste who monitor what can and cannot be published. 

The internal discipline of macroeconomics enables most of us to engage in what Thomas Kuhn calls ‘normal science’. But occasionally there are large events, like the Great Depression of the 1930s, the Great Stagflation of the 1970s or the Great Recession of 2008, that cause us to re-evaluate our preconceived ideas. A journal culture that works well in normal times can, in periods of revolution, become deeply suppressive and destructive of creative thought. Now would be a very good time to re-evaluate our culture and perhaps, just perhaps, we can learn something from physics.

Where's the Inflation? Where's the Beef?

In a 1984 advertising campaign, Wendy’s Hamburgers featured the character actress Clara Peller. Clara peers disappointedly at a burger from a rival chain that, while well stocked with bread, has remarkably little meat. Her rallying cry, “Where’s the beef?”, was taken up as a political slogan by presidential candidate Walter Mondale and it captured the imagination of a generation.

beef.jpg

Today, as we stare at a Fed balance sheet of $4.5 trillion and rates of price change at or below 2%, one can envisage a millennial Clara Peller metaphorically peering at a bloated Fed balance sheet and pleading: Where’s the inflation?

Greg Hill pointed out that, in my 2009 review of Akerlof and Shiller’s book ‘Animal Spirits’, I made the following claim: “History has taught that a massive expansion of liquidity will lead to inflation”. My review was designed to be critical of slavish applications of 1950s Keynesian remedies to twenty-first century problems. I stand by that critique. There is a reason we rejected Keynesian economics in the 1970s: it didn’t work the way it was supposed to. In particular, Keynesian economics had nothing to say about the most important economic issue of the 1960s and 1970s, the simultaneous appearance of inflation and unemployment, for which the British politician Iain Macleod coined the term ‘stagflation’.

In the 1960s, the U.S. government borrowed to pay for the Vietnam war and, rather than raise politically unpopular taxes, it paid for new military expenditures by printing money. Milton Friedman pointed out, correctly, that printing money would eventually lead to inflation. If printing money leads to inflation, why has a more than fivefold expansion of the Fed balance sheet, from roughly $800 billion in 2006 to $4.5 trillion in 2017, not been accompanied by an increase in prices?

Modern theories of inflation are based on Milton Friedman’s celebrated restatement of the quantity theory of money. (Aside: if you are a student of macroeconomics and you have not read Friedman’s essay, you are being short-changed by your professor.) Friedman was building on the earlier work of quantity theorists (see, for example, Hume’s essay ‘Of Money’), who built a theory of inflation around the definition of the velocity of circulation, v, as the ratio of nominal GDP to the stock of money:

(1)      v = (P x Y)/M

Here, P is a price index, Y is real GDP and M is the quantity of money. According to the Quantity Theory of Money, Y is equal to potential GDP, Y*:

(2)      Y = Y* 

and v is constant. If Y is growing at the growth rate of potential GDP and v is constant, then the rate of price inflation is, mechanically, equal to the rate of money creation minus the growth rate of potential GDP. It was that fact that led Friedman to proclaim that “inflation is always and everywhere a monetary phenomenon”. But what if the velocity of circulation is not a constant?
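To make that arithmetic concrete, here is a minimal sketch in Python of the quantity-theory identity in growth-rate form; the growth rates used are hypothetical numbers chosen only for illustration.

```python
# A minimal sketch of the quantity-theory arithmetic in growth-rate form.
# Taking logs of v = (P x Y)/M and differencing gives
#   inflation = money growth + velocity growth - growth of potential GDP.
# All numbers below are hypothetical, chosen only for illustration.

def implied_inflation(money_growth, potential_growth, velocity_growth=0.0):
    return money_growth + velocity_growth - potential_growth

# Friedman's case: constant velocity, so inflation tracks excess money growth.
print(implied_inflation(money_growth=0.07, potential_growth=0.03))
# 0.04, i.e. 4% inflation

# If velocity falls while money expands, the same money growth need not
# produce any inflation at all.
print(implied_inflation(money_growth=0.07, potential_growth=0.03, velocity_growth=-0.04))
# 0.0
```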

Friedman’s restatement of the quantity theory of money improved over earlier versions of the quantity theory by recognizing formally that the velocity of circulation is a function of a spectrum of interest rates on alternative assets. In its simplest form, Friedman’s restatement implies that money is like a hot potato that is passed from hand to hand more quickly when the interest rate increases.

Figure 1. The velocity of circulation and the interest rate on three-month Treasuries

Figure 1 plots the velocity of circulation on the horizontal axis against the interest rate on three month treasuries on the vertical axis. This graph is upward sloping as long as the interest rate is positive. It is horizontal when the interest rate is zero, a feature that Keynes referred to as ‘the liquidity trap’.

The graph of velocity against the interest rate flattens out as the interest rate approaches zero because, at zero rates, money and bonds become perfect substitutes. Like a glutton who has eaten so much that he cannot stomach one more hamburger, at zero interest rates people are satiated with liquidity and have no further use for cash for day-to-day transactions. If the Fed buys T-bills and replaces them with dollar bills, people will be content to hold the extra cash rather than spend it. This observation leads me to remark that what I should have said in my 2009 review of Akerlof and Shiller was: “History has taught that a massive expansion of liquidity will lead to inflation: [except when the interest rate is zero]”.
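Here is a minimal sketch of that hot-potato logic; the functional form and the parameter values are invented for illustration and are not estimated from the data behind Figure 1.

```python
# A stylised velocity function: increasing in the opportunity cost of holding
# money and flat at a satiation level when that cost reaches zero. The
# functional form and parameters are illustrative assumptions only.

def velocity(opportunity_cost, v_floor=1.5, slope=80.0):
    """Velocity rises with the cost of holding money and flattens out at
    v_floor once money and bonds become perfect substitutes."""
    return v_floor + slope * max(opportunity_cost, 0.0)

# With a 5% opportunity cost, money is a hot potato and velocity is high.
print(velocity(0.05))   # 5.5

# At the zero lower bound the extra cash is simply held: velocity sits at its
# floor, so a larger Fed balance sheet need not show up in prices.
print(velocity(0.0))    # 1.5
```

In this sketch the opportunity cost of holding money is simply the T-bill rate; as the next paragraph explains, once reserves pay interest the relevant cost becomes the T-bill rate minus the reserve rate.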

A final word of caution. When reserves of private banks at the Fed pay interest, as they do now, the opportunity cost of holding money is not the T-bill rate; it is the T-bill rate minus the reserve rate. If the Fed raises the interest rate and continues to pay interest on excess reserves, the connection between velocity and the interest rate will remain permanently broken. In that case, the graph that I plotted in Figure 1 will not continue to characterize future data, even if the T-bill rate increases above zero. I wrote about that issue here, where I pointed out that the impact of monetary tightening on inflation will depend very much on how central banks tighten. Stay tuned to this spot and don’t trust your favourite interpreter of the doctrine of Keynes. When the Keynesian prophets call for more of the same without explaining why their policies failed us in the Great Stagflation, take your cue from Clara Peller and ask them loudly: Where’s the beef?

Indeterminacy, the Belief Function and Reinventing IS-LM

This is my final post featuring research presented at the Applications of Behavioural Economics and Multiple Equilibrium Models to Macroeconomic Policy conference, held at the Bank of England on July 3rd and 4th 2017.

Today I will talk about the work of two of my graduate students and co-authors, Giovanni Nicolò and Konstantin Platonov. Both of them gave presentations at the conference.

multeq.jpg

Giovanni is in his final year of the Ph.D. programme at UCLA and he will be looking for a job this coming January at the annual ASSA meetings, which this year will be held in Philadelphia. He has already published one paper in the Journal of Economic Dynamics and Control, co-authored with myself and Vadim Khramov. He has a co-authored paper with Francesco Bianchi that is under revision for Quantitative Economics, and a third paper co-authored with me, Keynesian Economics without the Phillips Curve, that we wrote for an upcoming conference at Gerzensee, Switzerland, in October. At the Bank of England Conference, Giovanni presented a fourth paper. This is his single-authored job-market paper “Monetary Policy, Expectations and Business Cycles in the U.S. Postwar Period”.


Giovanni’s research is on the empirics of models with multiple equilibria and sunspots. He began working on this topic when Vadim and I invited him to join us on the project, “Solving and estimating indeterminate DSGE models” (Farmer, Khramov and Nicolò, FKN), that now appears in the JEDC. In that paper, we showed how to use standard software packages, such as Chris Sims’s MATLAB code GENSYS and the computational package DYNARE, to solve models in which the steady state of the model is indeterminate. This has been a hot topic for empirical macroeconomics ever since Thomas Lubik and Frank Schorfheide showed, in 2004, that the Federal Reserve Board, prior to 1979, followed a policy in which the equilibrium of the economy was indeterminate and subject to non-fundamental belief shocks, aka sunspot fluctuations.

In order to estimate a model driven by sunspots, the researcher must make distributional assumptions about the nature of non-fundamental uncertainty and how it co-varies with the fundamental shocks to demand and supply. The parameters of this distribution are part of what I call the belief function. Before we wrote our paper (FKN 2015), researchers who wanted to estimate an indeterminate model by applying the Lubik-Schorfheide method were faced with a complicated programming problem. We showed how to side-step this computational problem and instead estimate an indeterminate model using the widely-used software package DYNARE.

In his paper with Francesco Bianchi, Giovanni took this agenda one step further. To estimate an indeterminate DSGE model using the FKN method, the researcher needed first to know if a particular parameterization of the model is determinate or indeterminate. For a simple model such as the three equation New-Keynesian model, it is possible to partition the parameter space into determinate and indeterminate regions analytically. For more complicated models, no such analytic partition is possible. Bianchi and Nicolò (BN 2017) develop a computational method for which no analytic expression is needed. Their work allows researchers to estimate medium to large scale models without imposing the assumption, a priori, that all of the shocks to the model are fundamental.
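For readers who have not met the determinacy question before, here is a minimal numerical sketch of the standard Blanchard-Kahn eigenvalue count that separates determinate from indeterminate parameterizations of a linearised model. It is emphatically not the Bianchi-Nicolò algorithm, and the matrix below is a made-up illustration.

```python
import numpy as np

# The Blanchard-Kahn count for a linearised model E_t x_{t+1} = A x_t with
# n_forward non-predetermined (forward-looking) variables: the solution is
# determinate when the number of eigenvalues of A outside the unit circle
# equals n_forward, indeterminate when there are fewer, and there is no
# stable solution when there are more.

def determinacy(A, n_forward):
    n_unstable = int(np.sum(np.abs(np.linalg.eigvals(A)) > 1.0))
    if n_unstable == n_forward:
        return "determinate"
    if n_unstable < n_forward:
        return "indeterminate"
    return "no stable solution"

# A hypothetical 2x2 system with one forward-looking variable.
A = np.array([[1.2, 0.3],
              [0.0, 0.7]])
print(determinacy(A, n_forward=1))   # 'determinate': exactly one root outside the unit circle
```

Sweeping a check like this over a grid of parameter values is the brute-force way to map out the determinacy regions that, for the three-equation model, can still be characterised with pencil and paper.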

Models of indeterminacy are identified in data by the fact that they have richer propagation mechanisms than models with a unique determinate steady state. Giovanni’s independent work takes off from the observation (Beyer and Farmer 2004) that it may in practice be difficult to tell the difference between models with an indeterminate steady state and models with a unique steady state but richer internal propagation mechanisms. To put his method through its paces, he estimates a complete medium scale DSGE model of the type constructed by Frank Smets and Raf Wouters. He finds that the Lubik-Schorfheide result carries over to the complete Smets-Wouters model, a result that could not have been discovered without the method that Giovanni developed with Francesco. This is a very nice piece of work and if you are looking to hire an exceptionally smart young macroeconomist with strong theoretical and empirical skills, Giovanni comes highly recommended!  You can hear him discuss his research in the attached video clip.


The final conference paper that I will discuss in this series, “Animal Spirits in a Monetary Economy”, was co-authored by myself and Konstantin Platonov. Konstantin presented our paper at the conference and we wrote about our work for VOX here.

I have been critical of the IS-LM model in several of my posts. My paper with Konstantin fixes some of the more salient problems of IS-LM by reintroducing two key ideas from Keynes: 1. The confidence fairy is real. 2. If confidence remains depressed, high unemployment can exist forever. Our Vox piece presents the key findings of the paper in simple language. Here are some excerpts...

islmnac.png
“Larry Summers has argued that market economies may get stuck in permanently inefficient equilibria. He calls this 'secular stagnation' (Summers 2014). In this equilibrium, unemployment may be permanently ‘too high’ and output may remain permanently below potential, because private investors are pessimistic about the prospects for future growth. Our most recent research attempts to explain why secular stagnation occurs and how economic policy may be used to escape it (Farmer and Platonov 2016).
In the wake of the Great Recession, macroeconomic orthodoxy is under attack. Paul Krugman (2011) has called for a return to the IS-LM model, an approach that was developed by Sir John Hicks (1937). We are sympathetic to that call but we believe that the IS-LM model needs to be redesigned. We suggest a different way of thinking about the effect of monetary policy that we call the 'IS-LM-NAC' model. It is part of a broader research agenda (Farmer 2010, 2012, 2014, 2016a, 2016b) that studies models in which beliefs independently influence outcomes...” continue reading

Konstantin has another year at UCLA but he will be on the market in January of 2019. He is an exceptionally talented young economist who also comes highly recommended. In addition to his co-authored paper with me, he has some exciting new work of his own that extends Farmer’s Keynesian search model to an international framework with two or more countries. Konstantin presented his single-authored paper at the European Economic Association meetings in Lisbon last summer.

This brings me full circle and ties together my own research with the other pieces you have heard in the series of linked video clips. Stay tuned to this spot; the home for new approaches to macroeconomics!

Beliefs, Networks, History and the Housing Premium Puzzle

This is week five of my posts featuring research presented at the Applications of Behavioural Economics and Multiple Equilibrium Models to Macroeconomic Policy conference, held at the Bank of England on July 3rd and 4th 2017. Today’s post features the coauthored work of Héctor Calvo-Pardo and a series of coauthored papers by Alan Taylor, a co-organizer of the conference.

network.png

Héctor Calvo-Pardo, from the University of Southampton, presented his paper on social networks, coauthored with Luc Arrondel, Research Director at CNRS in Paris, Chryssi Giannitsarou of Cambridge University and Michael Haliassos of Goethe University. Alan Taylor, a Professor at UC Davis, helped organize the conference. In a linked video, he discusses an amazing new data set developed jointly with Òscar Jordà, Vice President of the Federal Reserve Bank of San Francisco, and Moritz Schularick, Professor of Economics at the University of Bonn.


Arrondel, Calvo-Pardo, Giannitsarou and Haliassos (ACGH) conduct work on social networks that fits beautifully into the theme of this conference. In the opening Macro Memo I explained how economic models with multiple equilibria could be combined with psychological models of belief propagation to advance our understanding of financial panics. The ACGH paper “Informative Social Interactions” is a very nice example of the use of network analysis to explain the propagation of ideas.

Quoting from the paper [page 2]:

“…we design, field and exploit novel survey data that provide measures of stock market participation (relative to individuals’ financial wealth), connectedness, but also of subjective expectations and perceptions of stock market returns via probabilistic elicitation techniques.  Our empirical analysis exploits cross-sectional variation for a representative sample by age, asset classes and wealth of the population of France, collected in two stages, in December 2014 and May 2015.  
… the questionnaire [also] contains a rich set of covariates for socioeconomic and demographic controls, preferences, constraints and access and frequency of consultation of information sources, typically absent from social network empirical studies.

And this is where networks come in:

[our data set] … contains specific questions designed to obtain quantitative measures of relevant network characteristics that enable identification of information network effects on financial decisions from individual answers…

ACGH break up a person’s network into a small inner circle of contacts who are financially knowledgeable and a separate, larger set of contacts who are not. They find significant peer effects from contacts in the financial network to a person’s beliefs about future financial variables. The paper provides evidence of how beliefs spread through social networks, a mechanism that may help to promote the spread of asset price bubbles. I recommend listening to Héctor’s explanation in his own words, linked above.
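The identification strategy in the paper is far richer than anything I can reproduce here, but as a caricature of what a peer-effect estimate looks like, here is a sketch that regresses a person's expected stock return on the average expectation in her financially knowledgeable inner circle, with simulated data standing in for the survey.

```python
import numpy as np

# A caricature of a peer-effects regression, with simulated data standing in
# for the ACGH survey. By construction, the true effect of the inner circle's
# average belief on a person's own belief is 0.6.
rng = np.random.default_rng(0)
n = 1000
peer_belief = rng.normal(0.05, 0.02, n)                            # average expected return in the inner circle
own_belief = 0.02 + 0.6 * peer_belief + rng.normal(0.0, 0.01, n)   # own expected return

X = np.column_stack([np.ones(n), peer_belief])                     # intercept + peer average
beta, *_ = np.linalg.lstsq(X, own_belief, rcond=None)              # ordinary least squares
print(beta)   # approximately [0.02, 0.6]: the regression recovers the effect we built in
```

In real survey data, of course, separating genuine peer influence from the fact that people choose similar friends is the hard part, which is why the design of the ACGH questionnaire matters.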

This brings me to a fascinating series of papers by Òscar Jordà, Moritz Schularick and Alan Taylor (JST). To my knowledge, they have so far produced five papers: “When Credit Bites Back: Leverage, Business Cycles, and Crises” (2011), “Sovereigns versus Banks: Credit, Crises and Consequences” (2013), “The Great Mortgaging: Housing Finance, Crises, and Business Cycles” (2014), “Betting the House” (2015), and the paper I know best, “The Rate of Return on Everything” (2017), which I saw presented at the NBER Summer Institute in July of 2017. This paper has two additional co-authors, Katharina Knoll and Dmitry Kuvshinov, who are or were both students of Moritz at the University of Bonn.


The unifying theme in all of these papers is a new data set that the authors have painstakingly assembled. Here I quote from JKKST (2017):

Our paper introduces, for the first time, a large dataset on the rates of return on all major asset classes in advanced economies, annually since 1870. Our data provide new empirical foundations of long-run macro-financial research. Along the way, we uncover new and somewhat unexpected stylized facts.
… Notably, housing wealth is on average roughly one half of national wealth in a typical economy, and can fluctuate significantly over time… But there is no previous rate of return database which contains any information on housing returns.

Table 1. Coverage of the dataset: sixteen advanced economies over 145 years

Table 1 shows the extent of the coverage, which includes sixteen advanced economies over 145 years.

There are many nuggets to be mined here, some of which have already been dug out by the authors. For example, it is well known that the return to equity has been 5% higher than the return to T-bills in a century of US data, a fact that Rajnish Mehra and Ed Prescott labeled the equity premium puzzle. JKKST provide evidence that the equity premium puzzle is universal across all sixteen economies and that, in addition, there is a second ‘housing premium puzzle’, documented in Table 2. This second puzzle is all the more intriguing since the variance of returns to housing is significantly lower than that of returns to equities.

Table 2. The housing premium puzzle
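To see what the two puzzles amount to in accounting terms, here is a sketch with made-up return series; the JKKST data are annual, by country, from 1870, and nothing below is drawn from them.

```python
import numpy as np

# What the equity and housing premia look like in accounting terms.
# The return series below are invented for illustration only.
rng = np.random.default_rng(1)
years = 100
bills   = rng.normal(0.01, 0.02, years)   # safe short rate
equity  = rng.normal(0.07, 0.20, years)   # high mean, very volatile
housing = rng.normal(0.07, 0.08, years)   # similar mean, far less volatile

for name, r in [("equity", equity), ("housing", housing)]:
    premium = np.mean(r - bills)
    print(f"{name}: premium over bills = {premium:.3f}, std of returns = {np.std(r):.3f}")

# The puzzle is that both premia are large, yet housing earns its premium
# with a much smaller standard deviation of returns than equities.
```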

JKKST focus on the relationship between average stock market returns, “r”, and average growth rates, “g”. The connection between r and g has gained widespread notoriety since Thomas Piketty revived the Marxist claim that capitalism will self-destruct as the rich grow richer in his acclaimed tome, Capital in the Twenty-First Century. I suggested at the recent 2017 NBER Summer Institute that Alan and his coauthors look instead at the connection between the safe rate of return and the growth rate. If, as I suspect, the safe rate of return is roughly equal to the growth rate across these data, it suggests (to me) that Samuelson’s biological rate of interest is at play. It’s time to start teaching our graduate students more about the Overlapping Generations Model, just one of the topics I’ll be covering in a series of ten lectures on Indeterminacy and Sunspots in Macroeconomics, as part of the Swiss Doctoral Programme at Gerzensee this week.

Short Sharp Shocks

This is week four of my posts featuring research presented at the Applications of Behavioural Economics and Multiple Equilibrium Models to Macroeconomic Policy conference, held at the Bank of England on July 3rd and 4th 2017.

money.jpg

Today’s memo features two economists working on models of multiple equilibria from different perspectives. George Evans is a pioneer in models of adaptive learning, a topic he has worked on for more than thirty years. George presented his joint work with Seppo Honkapohja, Deputy Governor of the Bank of Finland, and Kaushik Mitra, Professor of Economics at the University of Birmingham. Patrick Pintus, a Researcher at the Banque de France, presented a paper co-authored with Yi Wen, an Assistant Vice-President at the Federal Reserve Bank of St. Louis, and Xiaochuan Xing of Yale University. The papers presented by both of these authors make small changes to a relatively conventional monetary Dynamic Stochastic General Equilibrium (DSGE) model. Both of them reach non-mainstream conclusions by exploiting the fact that monetary equilibrium models always contain multiple equilibria.


George Evans began his work on adaptive learning in his Ph.D. dissertation at Berkeley in the early 1980s. When the rest of the profession was swept up by the rational expectations revolution, George persevered with the important idea that perfectly correct beliefs about the future cannot be plucked from the air; they must be learned. For an introduction to George’s work, I highly recommend the book co-authored with his long-time co-author, Seppo Honkapohja.


The paper of Evans, Honkapohja and Mitra (EHM) begins with a theme we met in post two, where I discussed the fact that the standard New Keynesian model, in which the central bank follows a Taylor Rule, has two steady state equilibria. The intellectual foundation for that idea comes from the work of Jess Benhabib, Stephanie Schmitt-Grohé and Martín Uribe (BSU). BSU pointed out that the money interest rate cannot be negative. It follows from that observation that the Taylor Rule must be non-linear.

Figure 1. The Fisher equation and an estimated non-linear Taylor Rule

The evidence for multiple steady states is presented in Figure 1, which is taken from George's paper. The dashed line is called the Fisher equation, after the American economist Irving Fisher. It graphs the relationship we would expect to hold between the money interest rate and the inflation rate if the real interest rate is constant. The solid line is an estimated Taylor Rule that takes account of the non-linearity in the central bank’s response to inflation that arises from the existence of the lower bound. A steady state is an inflation rate and an interest rate that satisfy both of these equations. Notice that the two curves intersect twice: once at an interest rate of roughly 2.5% and once at an interest rate close to zero.
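Here is a minimal sketch of why the truncation at zero delivers two steady states, using illustrative parameter values rather than the rule estimated in Figure 1: a steady state is an inflation rate at which the truncated Taylor rule and the Fisher equation prescribe the same interest rate.

```python
# Two steady states from the Fisher equation and a Taylor rule truncated at
# zero. Parameter values are illustrative, not the estimated rule in Figure 1.
r_star, pi_target, phi = 0.01, 0.02, 1.5   # real rate, inflation target, Taylor coefficient (> 1)

def fisher(pi):
    # Fisher equation: nominal rate = real rate + inflation.
    return r_star + pi

def taylor(pi):
    # Taylor rule truncated at the zero lower bound.
    return max(0.0, r_star + pi_target + phi * (pi - pi_target))

# Steady state 1: away from the bound the active rule crosses the Fisher line
# at the inflation target. Steady state 2: at the bound the nominal rate is
# zero, so the Fisher equation pins inflation down at minus the real rate.
for pi in (pi_target, -r_star):
    print(f"inflation {pi:+.3f}: Taylor rate {taylor(pi):.3f}, Fisher rate {fisher(pi):.3f}")
# inflation +0.020: Taylor rate 0.030, Fisher rate 0.030
# inflation -0.010: Taylor rate 0.000, Fisher rate 0.000
```

With a Taylor coefficient greater than one, the active rule is steeper than the Fisher line, which is what forces the second, unintended, crossing near zero.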

Evans, Honkapohja and Mitra (EHM) build on this idea by adding a theory of adaptive learning. I am often asked how my own work on the belief function is related to George’s work on adaptive learning. They are very closely linked. I agree with George that expectations, aka beliefs, are not plucked from the air. They must be learned. In models where there is a continuum of equilibria, like the ones I work with, it is beliefs that select which equilibrium will prevail. George’s work on adaptive learning provides a micro-foundation for what I have called the belief function.

Previous work has shown that, in the basic New Keynesian model, the upper steady state is stable under adaptive learning but the lower steady state is not. EHM modify the basic model by adding the assumption that the rate at which prices and output can fall has a lower bound. They show that this assumption implies that there exists a third steady state in which recessions can be persistent and deep.

Figure 2

Figure 2 illustrates the dynamics that arise from adaptive learning in their model. The three steady states, A, B and C, are, respectively, the target 2% inflation steady state, the zero-lower-bound steady state and the deflation steady state that arises from EHM’s assumption that deflation is bounded below. Importantly, steady states A and C are stable; steady state B is not. Most of the time, the economy is hit by shocks that keep it in region A. But occasionally a large shock, like the Great Recession, knocks it over into region C and, when that happens, it may be very difficult to escape.
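The sketch below is not the EHM model; it is a toy in which expected inflation is updated by a constant-gain learning rule and the map from expected to realized inflation is invented so as to have three steady states with the stability pattern of Figure 2. It is meant only to illustrate how a large enough shock to beliefs can tip the economy from the basin of A into the basin of C.

```python
# A toy, NOT the EHM model: constant-gain learning over expected inflation
# with a map chosen to have three steady states -- A (2%, stable),
# B (0.5%, unstable) and C (-1%, stable) -- standing in for the three
# regimes in Figure 2. All functional forms and parameters are invented.
A_ss, B_ss, C_ss = 0.02, 0.005, -0.01
k, gain = -300.0, 0.5

def realized_inflation(expected):
    # A cubic with fixed points at C_ss, B_ss and A_ss; the negative k makes
    # A and C stable under learning and B unstable.
    return expected + k * (expected - C_ss) * (expected - B_ss) * (expected - A_ss)

def learn(shock, periods=200):
    expected = A_ss + shock                # a one-off shock to beliefs at the target steady state
    for _ in range(periods):
        # constant-gain update: move beliefs part way towards realized inflation
        expected += gain * (realized_inflation(expected) - expected)
    return expected

print(f"small shock: beliefs return to A, expected inflation = {learn(-0.010):+.4f}")
print(f"large shock: beliefs converge to C, expected inflation = {learn(-0.020):+.4f}")
```

In this toy, a shock that leaves beliefs above the unstable steady state B is self-correcting, while one that pushes them below B is self-reinforcing, which is the sense in which escaping region C requires a deliberate jolt to expectations.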

George and his co-authors use their analysis to argue that a large fiscal intervention, a short, sharp shock, can knock the economy out of region C and back into region A. Readers of this blog will know that I have expressed scepticism of that idea in the past, largely because I am not a big fan of the basic NK model. However, this is the most convincing rationale in favour of a large fiscal stimulus that I have yet seen. The mechanism that EHM propose works by permanently shifting expectations; that mechanism is likely to be at work in many economic models and I am pleased to see that George’s agenda is once again getting exposure. If you are a young researcher who is thinking of working in macroeconomic theory and policy, consider working on models of expectations formation.


Next, I will turn to the work of Patrick Pintus, Yi Wen and Xiaochuan Xing (PWX). I have been a fan of Patrick’s work since he was a graduate student in Paris working with Jean-Michel Grandmont, and I have followed Yi Wen’s papers closely since we first met at a conference in New York many years ago. Yi wrote the state-of-the-art paper on why models of increasing returns to scale should be taken seriously and it is no surprise that a collaboration that involves the two of them would produce ideas worth listening to. This is my first exposure to Xiaochuan Xing, and I am sure we will hear much more of him in the coming years.


PWX take up a puzzle that has long been known to plague the equilibrium real business cycle (RBC) model that has dominated macroeconomic theory for more than thirty years. That model predicts that when interest rates are high, the economy will soon enter an expansion. The reality is different. High interest rates are an omen that a recession is coming down the road. What are the features of the real world that are missed by the classical RBC paradigm?

PWX relax the RBC model in two ways. First, they introduce the realistic assumption that it is difficult or impossible to borrow in large amounts without providing collateral. Second, they recognize that many loan contracts are arranged with variable rates as opposed to fixed rates. By combining these two assumptions with an otherwise standard business cycle model, they arrive at a model where many different outcomes can occur in equilibrium. The alert reader will, by now, have picked up the theme: this is a model with multiple equilibria where outcomes are driven by beliefs.

The fact that PWX are able to construct a theoretical model with multiple equilibria is a first step: but does this model help to explain data? Until recently, much of the work on multiple equilibria consisted of esoteric calibrated theoretical models. The reason for that was twofold. First, most graduate students were not exposed to the potential for multiple equilibrium models to explain data. And second, the techniques that were available to confront those models with data had not been developed. That began to change when Thomas Lubik and Frank Schorfheide showed in 2004 how to estimate a model with a set of indeterminate equilibria. That agenda was advanced further when Farmer, Khramov and Nicolò (FKN) developed a simple method for implementing their idea using standard software packages. PWX use the FKN technique to estimate their model using US data and they find that roughly 25% of the variance in GDP is caused by animal spirits. You can hear Patrick discuss these ideas in the video linked above.
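As a reminder of what a statement like "25% of the variance in GDP is caused by animal spirits" means operationally, here is a toy unconditional variance decomposition for a linear process driven by one fundamental and one sunspot shock. The process and its parameters are invented and have nothing to do with the PWX estimates.

```python
import numpy as np

# A toy variance decomposition: simulate the contribution of each shock to a
# linear AR(1) process and measure the sunspot's share of the total variance.
# The process and parameter values are invented for illustration only.
rng = np.random.default_rng(2)
T, rho = 200_000, 0.9
sigma_fundamental, sigma_sunspot = 1.0, 0.6   # chosen so the sunspot share is about a quarter

def ar1(shocks):
    y = np.zeros(len(shocks))
    for t in range(1, len(shocks)):
        y[t] = rho * y[t - 1] + shocks[t]
    return y

fundamental = ar1(sigma_fundamental * rng.standard_normal(T))
sunspot     = ar1(sigma_sunspot * rng.standard_normal(T))
output      = fundamental + sunspot            # in a linear model the two contributions simply add

share = np.var(sunspot) / np.var(output)
print(f"share of output variance due to the sunspot shock: {share:.2f}")
# about a quarter: 0.36 / 1.36 is roughly 0.26
```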