Moving the Overton Window: Let the Debate Continue


The NIESR Rebuilding Macroeconomics project is stirring a great deal of controversy. And that’s a good thing. I am part of the management team, but I can only speak for myself here. In my opinion, we should be generating dialogue between heterodox groups and conventional macroeconomists. Mainstream and heterodox economists alike should stop being so certain that they are guardians of the sole correct approach.

Our goal is not to fund a set of projects that has no intersection with the mainstream. It is to move the Overton Window. We seek risky projects that would be unlikely to be funded by mainstream funding agencies. Sometimes this means that the projects we fund will overlap with existing agendas but originate from communities of scholars outside the mainstream. Sometimes it means they follow approaches that originate in other social sciences and are not currently part of economics.

We have funded two projects so far. One is an example of the first kind: it involves research that has some overlap with existing mainstream research. The other is an example of a project that originates from social psychology and has little or no overlap with existing research. Our management team includes a complexity theorist, an anthropologist and a social psychologist. Economists have historically maintained an ideological purity that borders on arrogance. I think it’s time for cross-fertilization of ideas.

The internet discussion of the project we are funding by Özlem Onaran is, by this criterion, already showing signs of success. There is a range of heterodox economics that stretches from Post Keynesians through agent-based models, Modern Monetary Theory and institutional economics to approaches using indeterminacy and sunspots of the kind I have personally worked with. This latter example, sunspot theory, is an influential building block of modern macroeconomics, but one only occasionally invited to the table at NBER macro conferences. Other approaches, like behavioural macroeconomics, were once fringe but are now mainstream.

It would be naive, and obviously false, to think that by including a Post Keynesian group in the dialogue, Rebuilding Macroeconomics will lead to research on topics that are entirely excluded by the mainstream. But the approach followed by a Post Keynesian economist like Özlem is unlikely to mirror that followed by mainstream economists like Matthias Doepke and Michèle Tertilt.

In the mainstream, it would be difficult (but admittedly not impossible) to bring in ideas from mainstream sociology. We are still, after all, wedded to the nineteenth-century notion that individual preferences are independent of social trends. The economics of the family, as Doepke and Tertilt refer to it, or feminist economics, the label preferred by the heterodox, will, I hope, help break down some of the barriers that economists erected in the nineteenth century when we formalized the welfare theorems.

I have no illusion that all of the projects we fund will succeed in leading to a new ‘policy-relevant macroeconomics’. Nor is that my expectation: our remit is to take risks. By funding people and groups that inject new genes into the pool, I hope we will seed the development of ideas that might otherwise take much longer to emerge. Many of these projects will likely fail. But there is a widespread feeling amongst the general public, those who pay for our research, that mainstream macroeconomics has itself failed in a spectacular way. While I do not fully subscribe to that view, I do feel that the existing institutional structures have led to the dominance of a small subset of ideas whose success has owed more to historical circumstance than to any obvious claim to truth.

I see it as one of my major responsibilities, in my roles as a member of the Rebuilding Macroeconomics management team, co-leader of the RM Instability Hub, and Research Director at NIESR, to contribute meaningfully to moving the Overton Window. We are engaged in a non-experimental science that sometimes needs to be shaken up a little by bringing in fresh approaches. Let the debate continue.

Standing on the Shoulders of Giants


I have rarely read anything as unhelpful to the task of Rebuilding Macroeconomics as Howard Reed’s poisonous attack on the foundations of the dismal science. I have a lot of sympathy for Howard’s position. But if you are young and committed, with a desire to rebuild economics, your first job is to understand the edifice you are rebuilding. 

Diane Coyle has penned an admirable defence of neoclassical economics in response to Howard’s rhetoric, and that is a good place to start. But there is much more to be said, particularly if you are reading this post as a young uncommitted student with a desire to change the world.

Perhaps the most depressing part of Reed’s criticism is that he has an understanding of what he is attacking. But a young mind, taking his words at face value, will be deterred from studying neoclassical economics. Instead, our bright young soon-to-be academic will set herself the task of reconstructing economics from a blank slate. So where should our budding reformer begin? Howard suggests that the classical economists, Smith, Ricardo and Marx, may have something valuable to contribute.

I disagree. Smith, Ricardo and Marx were privileged members of the bourgeoisie who injected eighteenth- and early nineteenth-century moral values into economics. They were students of history, familiar with a western philosophical tradition stretching back to Plato and Aristotle, both of whom were European slave owners. Marx may have had some enlightened views on the role of class, but he foolishly tried to formalize his system using elementary mathematics.

No. I believe that our reconstruction must dismiss not just the mistaken neoclassical ideas of Walras, Pareto and Marshall. To truly reconstruct economics as a relevant social science, we must first tear down every part of the existing patriarchal structure, which serves as a tool of the ruling elite to suppress the legitimate desires of freedom-loving peoples throughout the world to receive their entitled share of global wealth. We must begin, instead, by studying Latour, Derrida and Baudrillard.

And I have a bridge to sell you.

The divergence of neoclassical economics from classical ideas does not have to do with mathematical formalism. It occurred when Walras and Pareto introduced us to homo economicus, a human being who springs fully formed into the world at the age of eighteen with a complete understanding of his preferences over every conceivable outcome in his extensive choice set. That step enabled us to understand why markets are better ways of organizing economic activity than any other known form of social organization. If you disagree with that statement, go ask Xi Jinping or any one of the other 1.4 billion human beings in China who have been lifted from abject poverty in the space of a few decades by the adoption of a market economy.

Homo economicus brought understanding that was central to the neoclassical project. But his introduction to economics came at the cost of splitting economics off from sociology, which retained the idea that our preferences are formed through social interaction. There is room for both ideas in the social sciences, and economists and sociologists have much to learn from each other. But to engage in genuine dialogue we must first learn each other’s language.

If you are a young, idealistic, smart, dedicated student with a desire to do good in the world, there are many ways to accomplish that goal. If economics is your chosen route, do not be fooled by snake-oil salesmen who offer you an easy path. There is much that is wrong with existing economics. But to contribute to our subject, you must first understand how we got here. Neoclassical economics was constructed by young, idealistic, smart, dedicated people, just like you, who built on the ideas of those who came before. Take a page from the book of those who preceded you. We are all standing on the shoulders of giants.

Large Scale Econometric Models: Do they Have a Future?

Here is an intriguing question: How is the Large Hadron Collider like the National Institute Global Economic Model? Read on!

It was a great pleasure to organize a session on econometric models at the Royal Economic Society Conference at the University of Sussex. In my new role as Research Director at the National Institute of Economic and Social Research (NIESR), I have inherited responsibility for research and development of the National Institute Global Economic Model (NiGEM), the preeminent model of the world economy. As you might expect, given my role at NIESR, my answer to the question posed in this session is a resounding yes!


For the session at Sussex, in addition to my own presentation, I assembled three outstanding speakers: Tony Garratt from Warwick Business School (WBS), Marco Del Negro of the Federal Reserve Bank of New York and Garry Young, Director of Macroeconomic Modelling and Forecasting at NIESR.

Tony kicked off the session with a description of the work he has been engaged in at WBS along with his co-authors Ana Galvao and James Mitchell. The University of Warwick is collaborating with NIESR in a partnership that gives Warwick graduate students access to the expertise of the applied economists at NIESR, while NIESR gains from the academic expertise of Warwick economists. As part of that partnership, the WBS team has agreed to publish its forecasts each quarter in the National Institute Review as a benchmark against which to measure the performance of the NiGEM team. Tony gave us a fascinating account of what the WBS team does.

Their approach is reduced-form and eclectic. WBS maintains a stable of more than twenty-five models whose forecasts are averaged with weights that are updated in real time according to past forecast performance. Tony showed us how the WBS forecasts had performed in the past relative to the Bank of England and the Bank of England’s Survey of External Forecasters. He described different ways of evaluating forecasts of output growth and inflation, comparing both point forecasts and density forecasts. Perhaps the most interesting result, for me, was that judgmental forecasts often outperform econometric models at short horizons.
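To fix ideas, here is a minimal sketch of one common performance-based weighting scheme, inverse mean-squared-error weights. It is my own illustration, not the WBS methodology, which is considerably more sophisticated:

```python
import numpy as np

def combine_forecasts(forecasts, past_errors):
    """Combine model forecasts with weights based on past performance.

    forecasts   : shape (n_models,)            current-period forecasts
    past_errors : shape (n_models, n_periods)  past forecast errors

    Weights are proportional to the inverse mean squared error of each
    model, so models that forecast well in the past count for more.
    """
    mse = np.mean(past_errors ** 2, axis=1)
    weights = (1.0 / mse) / np.sum(1.0 / mse)
    return weights, float(np.dot(weights, forecasts))

# Toy example: three hypothetical models forecasting quarterly inflation.
forecasts = np.array([2.1, 2.4, 1.9])
past_errors = np.array([
    [0.2, -0.1, 0.3],   # small past errors -> large weight
    [0.5, 0.6, -0.4],   # large past errors -> small weight
    [0.1, 0.2, -0.2],
])
weights, combined = combine_forecasts(forecasts, past_errors)
print(weights, combined)
```

Rewarding models that have forecast well recently, and re-weighting as new data arrive, is one reason real-time model averages of this general kind are hard to beat.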

Tony’s talk was followed by Marco Del Negro from the New York Fed, who described the behaviour of a medium-scale Dynamic Stochastic General Equilibrium (DSGE) model that the NY Fed has been running since 2008. DSGE models have received quite a bit of bad press lately as a result of the failure of almost all of the experts to predict the 2008 financial crisis. Marco gave a spirited defence of DSGE models by showing us the forecast performance of the NY Fed’s DSGE model from 2008 to the present. The model is written in a relatively new computer language, Julia. The code is open source, blindingly fast and widely used in research publications in leading journals. For the MATLAB users out there: perhaps it’s time to switch?

In the third presentation of the day, we were treated to an entertaining interlude when the projection facility malfunctioned and Garry Young ad-libbed for ten minutes with a cricketing anecdote. When he resumed, Garry gave us an account of the use of NiGEM to forecast the effects of Brexit. NiGEM has more than 5,000 equations, covers 60 countries and is widely used by central banks and national treasuries around the world for scenario analysis. NiGEM has a lot more in common with the NY Fed’s DSGE model than most people realize.

In the final presentation of the day, I tied the three talks together by recounting the history of econometric modelling, beginning with Klein Model 1 in the 1940s and ending with the NY Fed’s DSGE model and NIESR’s NiGEM. For me, the main story is continuity. With the publication of Robert Lucas’s celebrated critique of econometric modelling in 1976, large-scale models disappeared from the halls of academia. But they never disappeared from central banks, treasuries and research institutes where, as Garry reminded us, they have been used as story-telling devices for more than fifty years.

The version of NiGEM we work with today has come a long way from the backward-looking equations of Klein Model 1. It has been lovingly tended and developed by the distinguished teams of researchers who have passed through the National Institute over the years. Past NIESR researchers include some of the leading applied economists and econometricians in the UK, and the model they developed embodies state-of-the-art features, including the ability to add forward-looking elements and rational expectations to solution scenarios.

Large-scale econometric models are here to stay. Policy makers use models like NiGEM to analyse policy alternatives, and that is unlikely to change soon. In my presentation, I argued for a closer dialogue between economic theorists and applied economists, similar to the dialogue that currently exists between theoretical physicists and applied physicists. I argued that NiGEM, located at NIESR, is to economics as the Large Hadron Collider (LHC), located at CERN, is to physics. Just as physicists use the LHC to test new theories of subatomic particles, so economists should use NiGEM to test new theories of macroeconomics. I hope to put that idea into practice at the National Institute.

In a separate presentation at the Royal Economic Society Conference this year, I discussed work I am engaged in with a research team at UCLA where we have developed a new theory of belief formation. This is an example of one of the theories we hope to test using NiGEM as a laboratory. 

According to Forbes, the operating budget of the Large Hadron Collider is approximately one billion US dollars a year. NiGEM is entirely funded from subscriptions and the operating budget is well south of half a million US dollars. Funding agencies take note: we could make some pretty cool improvements for a billion a year.

What Does it Mean to Have Rational Expectations?

This is a follow-up to my ergodicity post from last week. Both posts are inspired by conversations I had with my co-hub-leader Jean-Philippe Bouchaud (for the Rebuilding Macroeconomics hub: Why Are Economies Unstable?) on the role of the ergodicity assumption in science. Content warning: this post is more technical than many of my posts, and I make no apologies for that. It is a technical subject.

Figure 1: The Tent Map

I became interested in Chaos Theory in the early 1980s when I attended a conference in Paris organized by Jean Michel Grandmont. Jean Michel had been working on non-linear cycle theories, as had I, and the conference was an opportunity to explore the idea that plain vanilla general equilibrium models with rational agents, each of whom held rational expectations, might lead to complicated dynamic paths for observable variables. As I pointed out here, many of us at the conference were persuaded by the work of Buzz Brock, who argued that even if the economic data live on a complicated non-linear attracting set, we don’t have enough data to establish that fact.

The simplest example of a complicated non-linear attracting set is generated by the tent map displayed in Figure 1. The map F(x) (plotted as the red triangle) maps the interval [0,1] into itself. The map has a steady state at 0 and a steady state at XS, but both steady states are unstable: trajectories that start close to either steady state move away from it. However, all paths that start in the [0,1] interval stay there. The tent map is a perpetual motion machine.
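For readers who want to experiment, here is a minimal sketch of the slope-two tent map. It is my own illustration rather than the exact map in the figure; with slope 2 the interior steady state XS sits at 2/3:

```python
def tent(x, mu=2.0):
    """One step of the tent map F(x) = mu * min(x, 1 - x)."""
    return mu * min(x, 1.0 - x)

# With mu = 2 the interior steady state is x = 2/3 (in general,
# x = mu / (1 + mu)). Both steady states are unstable: a tiny
# perturbation is roughly doubled at every step, yet every orbit
# that starts in [0, 1] remains in [0, 1].
x = 2.0 / 3.0 + 1e-10
for t in range(40):
    x = tent(x)
    assert 0.0 <= x <= 1.0  # the perpetual motion machine never escapes
print(x)  # the orbit has long since wandered away from 2/3
```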

While these facts are interesting, my eventual response was: so what? If you generate random numbers on a computer, those numbers are produced by more sophisticated versions of the tent map. If we lived in a world where the shocks to GDP were generated by a chaotic deterministic system, that fact should not influence our behaviour. It would simply explain the origin of what we treat as random variables. Data generated by the tent map have predictable behaviour: they obey statistical laws. If there is a small degree of uncertainty about the value of x at date 1, that uncertainty is magnified the further you move into the future. In the limit, as T gets large, x(T) is a random variable with an invariant distribution, and the best guess of where you would expect to see x(T) is the mean of x with respect to that invariant distribution.
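Both properties, sensitive dependence and well-behaved time averages, are easy to see numerically. One caveat for anyone who tries: in binary floating point the exact slope-two tent map collapses every orbit to zero within roughly 53 steps, so the sketch below (mine, not from the post) injects a tiny jitter to stand in for the true dynamics:

```python
import numpy as np

rng = np.random.default_rng(0)

def tent_orbit(x0, T):
    """Iterate the tent map, jittering each step to defeat the
    floating-point collapse of the exact slope-2 map."""
    xs = np.empty(T)
    x = x0
    for t in range(T):
        x = 2.0 * min(x, 1.0 - x)
        x = min(max(x + rng.uniform(-1e-12, 1e-12), 0.0), 1.0)
        xs[t] = x
    return xs

a = tent_orbit(0.2, 100_000)
b = tent_orbit(0.2 + 1e-9, 100_000)

# Sensitive dependence: the two orbits decorrelate within ~30 steps...
print(abs(a[30] - b[30]))
# ...yet each obeys a statistical law: the time average converges to 0.5,
# the mean of the (uniform) invariant distribution on [0, 1].
print(a.mean(), b.mean())
```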

Jean-Philippe introduced me to the work of Philip Anderson, a Nobel Laureate in physics who worked on solid-state electronics and wrote a series of illuminating papers on phenomena known as spin glasses. Without getting too far into the details, the basic idea is that for a large class of physical phenomena, it is not just the state variables that describe the world that are random. It is the probabilities that those variables will live in any particular state.

Here is a question for all of you out there who have thought about these ideas. Imagine that you are offered a sequence of gambles in which you may bet on the outcome of a coin toss, where the coin comes up heads with probability p(t) and tails with probability 1 − p(t), for t = 1, 2, …, and where p(t) is generated by the tent map. Suppose we allocate the value 0 to heads and 1 to tails. I conjecture that the sample mean of the random variable that takes the value 0 with probability p(t) and 1 with probability 1 − p(t) does not converge to a number as the sample size T gets large. If the world is like this, and I believe there is a sense in which financial market data are very much like this: What Does it Mean to Have Rational Expectations?
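Anyone who wants to probe the conjecture could simulate the gamble along the following lines (my sketch, with the same floating-point caveat as above; whether the printed sequence of sample means settles down is precisely what the conjecture is about):

```python
import numpy as np

rng = np.random.default_rng(42)

T = 1_000_000
p = 0.3                     # p(t) follows the tent map
running_sum = 0.0
checkpoints = {10**k for k in range(2, 7)}

for t in range(1, T + 1):
    # Heads (value 0) with probability p(t); tails (value 1) otherwise.
    outcome = 0.0 if rng.random() < p else 1.0
    running_sum += outcome
    if t in checkpoints:
        print(t, running_sum / t)  # does this sequence settle down?
    # Update p(t) with the tent map, jittered against the
    # floating-point collapse of the exact slope-2 map.
    p = 2.0 * min(p, 1.0 - p)
    p = min(max(p + rng.uniform(-1e-12, 1e-12), 0.0), 1.0)
```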

The Household Fallacy

My new working paper, joint with Pawel Zabczyk, is out now as an NBER working paper, a CEPR discussion paper and a NIESR discussion paper. Here is the abstract: 


We refer to the idea that government must ‘tighten its belt’ as a necessary policy response to higher indebtedness as the household fallacy. We provide a reason to be skeptical of this claim that holds even if the economy always operates at full employment and all markets clear. Our argument rests on the fact that, in an overlapping-generations (OLG) model, changes in government debt cause changes in the real interest rate that redistribute the burden of repayment across generations. We do not rely on the assumption that the equilibrium is dynamically inefficient, and our argument holds in a version of the OLG model where the real interest rate is always positive.

Figure 1: The dynamics of debt adjustment in a two-generation OLG model

Figure 1 will be helpful if you know something about difference equations. What it illustrates is the dynamics of debt adjustment in a model with two generations where preferences are relatively standard. The picture depicts a case where the interest rate is positive and where governments do not need to actively balance their budgets. Unlike some examples where this happens, we are not relying on the idea that forward-looking agents select the only equilibrium that uniquely pins down the price level and prevents debt from exploding. In other words, we do not appeal to what the literature refers to as the Fiscal Theory of the Price Level (FTPL). We claim that this situation is not a crazy way to think about the world; it is a generic and common property of a large class of overlapping-generations models. That fact is an embarrassment for the FTPL, since it implies that, in monetary models whose dynamics are described by Figure 1, the FTPL is incapable of selecting a unique equilibrium. Stay tuned for two more papers coming soon on this topic with more realistic preference and endowment structures.
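For readers who want a concrete object to attach to Figure 1, here is a stylized debt-dynamics equation of the kind such figures plot. It is my own sketch, not the system in the paper:

```latex
% A stylized sketch, not the paper's actual system: debt dynamics
% when the equilibrium real interest rate responds to the debt stock.
\[
  b_{t+1} = R(b_t)\,b_t - s
\]
% b_t  : real government debt at date t
% s    : a fixed primary surplus
% R(b) : gross real interest rate, debt-dependent in an OLG economy
%
% A steady state b* solves b* = R(b*) b* - s, and it is locally
% stable when
\[
  \bigl|\,R(b^{*}) + R'(b^{*})\,b^{*}\,\bigr| < 1 ,
\]
% a condition that can hold even when R(b*) > 1, because the
% endogenous movement of the interest rate does part of the
% adjustment that would otherwise have to come from the surplus.
```

In a representative-agent benchmark, by contrast, R is tied down by the discount factor, so R′ = 0 and the stability condition fails whenever R > 1; it is the OLG interest-rate channel that breaks the belt-tightening logic.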

Let’s also be clear about what we are NOT saying. We do not claim that governments do not face constraints: they do. In our model, the government runs a primary surplus on average as a fraction of potential GDP, just as it does in the real world. What we claim is that the government does not need to actively alter the fiscal surplus in response to booms or recessions.