Economics is not the kind of science in which there could ever be one true model that works best in all contexts. The point is not “to reach a consensus about which model is right,” as Romer puts it, but to figure out which model applies best in a given setting. And doing that will always remain a craft, not a science, especially when the choice has to be made in real time.
The social world differs from the physical world because it is man-made and hence almost infinitely malleable. So, unlike the natural sciences, economics advances scientifically not by replacing old models with better ones, but by expanding its library of models, with each shedding light on a different social contingency.
Friday, September 11, 2015
Rodrik on Economic Models
Tuesday, August 18, 2015
Stories from the Macro Wars
That may be an apt description of macroeconomics in the 1970s and 1980s. On his website, Paul Romer has offered an interesting take on the Methodenstreit between the dynamic general equilibrium approach (so-called "freshwater" macro, championed by Robert Lucas) and Keynesian macro-econometric models (the "saltwater" camp). Romer is particularly critical of Robert Solow, arguing that his dismissive attitude towards Lucas et al. contributed to a counterproductive hardening of differences. He writes:
Solow also seemed to be motivated to attack harshly because he was concerned that the type of model Lucas was developing might undermine political support for active countercyclical policy. To his credit, there was a legitimate basis for this concern. The new Chicago school of macro eventually did oppose an active response to the financial crisis and its aftermath. But the type of response that Solow exemplified may actually have contributed to the emergence of this new Chicago school. In retrospect, if the goal was to maintain support for active macro policy, the better course would have been to take seriously what the rebel group that was forming around Lucas was saying. This might have kept the rebels from cutting off contact with all outsiders, even those who were taking seriously the issues they were raising.

Brad DeLong and Paul Krugman responded in defense of Solow. DeLong writes:
And, at this point, Romer ought to say that Solow's and Hahn's criticisms were (a) no more biting in their rhetoric than the criticisms that Stigler, Friedman, and company had been inflicting on their victims at Chicago for a generation, and (b) correct and accurate.

Romer has more interesting detail in his response, including this summary of the main points:
In the summer of 1978, Lucas and Sargent were making three claims [Romer uses "SAGE" to refer to general equilibrium models]:

(a) Existing multi-equation macro simulation models were not identified. That is, these models summarized correlations in the data but did not yield reliable statements of the form "if the government does X, this will cause Y to happen."

(b) It was time to use SAGE models to address such fundamental questions about economic fluctuations as why changes in the supply of money influence economic activity; and

(c) SAGE models will imply that an active monetary policy cannot stabilize economic fluctuations.

Solow thought that Lucas and Sargent were wrong about the policy ineffectiveness claim (c). DeLong, Krugman, and I all agree. In the 2013 introduction to his collected papers, Lucas uses some asides about the Great Depression and the Great Recession to admit that now even he agrees. Claim (c) is what DeLong and Krugman have in mind when they say that Solow was right and Lucas was wrong.

Yet all macroeconomists now agree that Lucas and Sargent were correct about the fatal problems with the large simulation models. Much of Solow's response amounted to an implausible denial that there was anything wrong with them. So on this point, the roles are reversed: Lucas and Sargent were right and Solow was wrong.

See also: this from Krugman, and this from DeLong. David Glasner has a thoughtful post putting things in a broader context.
In his post, Romer cites several papers, including Lucas and Sargent's "After Keynesian Macroeconomics," from the 1978 Boston Fed conference. Perhaps it should be known as "the throwdown in Edgartown."
Fascinating stuff... but fortunately for contemporary macroeconomists - particularly those of us with conflict-averse midwestern temperaments - things aren't nearly so rancorous now. There certainly are differences of inclination and opinion, and economists can be blunt in expressing their differences, but the "saltwater" vs. "freshwater" cleavage is largely a thing of the past, as this Stephen Williamson post explains. Since the wars of the 1970s and 80s, there has been some convergence: macroeconomists have developed a class of models - sometimes called "New Keynesian" - which respond to Lucas's methodological critique but also allow for a stabilizing role for macroeconomic policy. That's not to suggest we've figured it all out, of course; this recent Mark Thoma column highlights some of the weak points of contemporary theory.
Thursday, May 21, 2015
A Theory of Production
Using C to denote capital, L for labor and P for production, the production function makes its first appearance:
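P' = b L^k C^(1-k)

Cobb and Douglas estimated b = 1.01 and k = 3/4 for U.S. manufacturing over 1899-1922, with the labor exponent k roughly matching labor's share of income.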
Although the description of technology is a theoretical contribution, much of the article is empirical in nature, as they construct indexes of capital and labor in order to test their model. They compare the production implied by their function and estimates of capital and labor, P', with a measure of actual production.
To a contemporary macroeconomist reader, the striking thing about the article is the extent to which it anticipates how we analyze business cycles today. Cobb and Douglas separate out cyclical and trend components (using 3-year moving averages) and show that the deviations of actual production from the production implied by changes in capital and labor are procyclical.
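Their procedure translates directly into a few lines of code. Here is a minimal sketch in Python - with made-up index numbers standing in for their data - of computing the implied output P', detrending both series with 3-year moving averages, and taking the gap between the deviations:

```python
import numpy as np

# A minimal sketch of the Cobb-Douglas exercise, with made-up index
# numbers standing in for their 1899-1922 data (illustrative only).
P = np.array([100., 104., 112., 120., 118., 99., 121., 130., 135.])   # actual production
L = np.array([100., 103., 108., 114., 112., 101., 116., 122., 126.])  # labor index
C = np.array([100., 107., 114., 122., 131., 140., 149., 158., 167.])  # capital index

b, k = 1.01, 0.75                       # Cobb and Douglas's estimates
P_implied = b * L**k * C**(1 - k)       # P', output implied by the factors

def ma3(x):
    """Centered 3-year moving average (drops the endpoints)."""
    return np.convolve(x, np.ones(3) / 3, mode="valid")

# Percent deviations from trend for each series; the gap between them is
# the precursor of the modern "Solow residual."
dev_actual = P[1:-1] / ma3(P) - 1
dev_implied = P_implied[1:-1] / ma3(P_implied) - 1
residual = dev_actual - dev_implied     # Cobb and Douglas found this gap
print(np.round(residual, 3))            # to be procyclical
```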
The article includes a chronology of business cycles which aligns with the NBER chronology; the NBER recessions during this period are:
- June 1899 - Dec. 1900
- Sept. 1902 - Aug. 1904
- May 1907 - June 1908
- Jan. 1910 - Jan. 1912
- Jan. 1913 - Dec. 1914
- Aug. 1918 - Mar. 1919
- Jan. 1920 - July 1921
Today these deviations of actual output from the amount implied by changes in factors of production are known as "Solow residuals" after work by Robert Solow in the 1950s and interpreted as measures of technological progress (i.e., our ability to wring more output out of given amounts of capital and labor). Although Solow was mainly concerned with long-run growth trends, in the 1980's, Real Business Cycle theorists interpreted short-run fluctuations as "technology shocks". In Real Business Cycle models these shocks drive economic fluctuations, and the same pattern identified by Cobb and Douglas - using postwar data and newer detrending techniques - was cited in support of this theory. One weakness of this argument is that short run movements in the Solow residual are at least partly due to utilization - "factor hoarding" - rather than changes in technology. This, too, was anticipated by Cobb and Douglas:
The index does not of course measure the short-time fluctuations in the amount of capital used. Thus, no allowance is made for the capital which is allowed to be idle during periods of business depression nor for the greater than normal intensity of use in the form of second shifts etc., which characterizes the periods of prosperity.

Overall, this article would fit very well into a syllabus for a current course on business cycle theory. Hmm...
Tuesday, March 31, 2015
Minimum Wages and Economics
The UK minimum wage took effect 16 years ago this week, on April 1 1999. As with the Equal Pay Act, economically literate commentators feared trouble, and for much the same reason: the minimum wage would destroy jobs and harm those it was intended to help. We would face the tragic situation of employers who would only wish to hire at a low wage, workers who would rather have poorly paid work than no work at all, and the government outlawing the whole affair.

At its best, economics is a fruitful dialogue between theory and empirical (data) work. All economic models are, by nature, simplifications. One of the judgments we have to make is whether some of the simplifications we've made are inappropriate. Testing our models against the data helps us do that.
And yet, the minimum wage does not seem to have destroyed many jobs — or at least, not in a way that can be discerned by slicing up the aggregate data. (One exception: there is some evidence that in care homes, where large numbers of people are paid the minimum wage, employment has been dented.)
The general trend seems a puzzling suspension of the law of supply and demand. One explanation of the puzzle is that higher wages may attract more committed workers, with higher morale, better attendance and lower turnover. On this view, the minimum wage pushed employers into doing something they might have been wise to do anyway. To the extent that it imposed net costs on employers, they were small enough to make little difference to their appetite for hiring.
An alternative response is that the data are noisy and don’t tell us much, so we should stick to basic economic reasoning. But do we give the data a fair hearing?
The first tool an economist will reach for in trying to analyze a market is supply and demand; in that context, a minimum wage is a price floor, which creates an excess supply of labor (i.e., unemployment):
(the equilibrium wage and quantity of labor are labelled with superscript e's, and the m's mark the minimum wage and corresponding amount of labor).
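To put rough numbers on that diagram, here is a minimal sketch with hypothetical linear supply and demand curves (my own illustrative parameters, not estimates of any actual labor market):

```python
# Hypothetical linear labor demand and supply (illustrative parameters,
# not estimates of any actual labor market).
def labor_demand(w):     # firms hire fewer workers at higher wages
    return 100 - 4 * w

def labor_supply(w):     # more people seek work at higher wages
    return 20 + 6 * w

w_e = (100 - 20) / (4 + 6)     # market clearing: demand = supply -> w_e = 8
w_m = 10                       # a minimum wage set above the equilibrium

employment = labor_demand(w_m)                         # 60, demand-determined
excess_supply = labor_supply(w_m) - labor_demand(w_m)  # 80 - 60 = 20 unemployed
print(w_e, employment, excess_supply)
```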
We like supply and demand because it is simple and works well in many contexts; but the labor market is one case where its simplicity can lead us astray. As Paul Krugman recently put it:
[B]ecause workers are people, wages are not, in fact, like the price of butter, and how much workers are paid depends as much on social forces and political power as it does on simple supply and demand.

Indeed, some empirical research has demonstrated that minimum wages do not have the effects implied by the supply and demand framework. This NYT Magazine piece by Annie Lowrey summarized David Card and Alan Krueger's classic paper on the subject and some of the subsequent dispute.
Saturday, December 20, 2014
Economics Navel-Gazing, Curricular Edition
The criticisms seem to me to be based on a somewhat unfair caricature of economics and economists, that we're head-in-the-sand apologists for "neoliberalism" who use mathematics as a form of obscurantism and have little useful to say about the "real world," particularly in the wake of the financial crisis.
Some of this may be rooted in the fact that the "economics" articulated by politicians, government officials and the press - what Simon Wren-Lewis has called "mediamacro" - does not reflect the views of most of mainstream academic economics. In particular, the obsession with government budget deficits is not based on textbook economics (I discussed an example of this misconception in a European context a couple of years ago).
Markets are at the heart of economics - this may be where the view that economists are "free market fundamentalists" comes from. In introducing markets, though, there are really two main points to make:
- The gains from exchange and specialization possible from voluntary trade (i.e., Adam Smith's "Invisible Hand"), and the ability of markets to make adjustments based on dispersed information about what Hayek called "the particular circumstances of time and place," which would be unknowable to any central planner.
- While economists need to make our students aware of the hidden and under-appreciated role markets play in organizing society and in lifting humanity out of subsistence-level poverty, we also devote a considerable amount of attention to how they fail. In particular, problems of monopoly power, externalities, public goods and asymmetric information are standard subjects for introductory economics courses. (2a: There are also reasons to be skeptical in practice of the ability of our political system to effectively correct market failures.)
We typically introduce markets with the model of "supply and demand," and the exercise of thinking in terms of models provides much of the lasting value of studying economics. Working with economic models can sharpen students' logical and critical thinking skills immensely. As John Cochrane nicely put it recently, "economic models are quantitative parables, not explicit and complete descriptions of reality." The criticism that models are "simplifications" is a cheap one - writing down a set of assumptions in mathematical form and working out the implications (and then testing them against data), is where the insight comes from. The discipline of doing this cultivates an ability to think intelligently about tradeoffs and hidden costs, and to trace conclusions back to underlying assumptions and consider how changing assumptions lead to different conclusions. Since models are, by necessity, very stylized descriptions of the world, students of economics must not only learn how to work with them, but also how to judge which simplifications are appropriate for a given circumstance or question. As Keynes said, "Economics is the science of thinking in terms of models joined to the art of choosing models which are relevant to the contemporary world."
So I think the core of what we try to do in our introductory economics courses - introducing markets (both their successes and failures), and teaching students how to think in terms of models - is extremely worthwhile. Of course, this does not cover everything that we possibly would like to do in a course (or in a small set of courses). Much of economics is concerned with the allocation of scarce resources, and the time that our students can spend on a course in a semester (both in and out of the classroom) is very limited, forcing some difficult choices on instructors. Some of the criticisms made by the UK students seem to be about what we're leaving out, though I think what we're doing in our introductory courses is pretty important, and laying some groundwork in the economic way of thinking will help the students tackle issues like understanding the financial crisis, either in later classes or independently. While many of the debates in the news are about macroeconomic policy (and as a macroeconomist, I'm happy to see the revival of interest in the topic, even if it arises from unfortunate sources), the core microeconomic concepts are very important and not to be skipped. While it can be exciting to be teaching a subject that is relevant to contemporary events, we should not be seduced into bringing "news" into the classroom in a way that interferes with developing an understanding of the fundamentals.
There is sometimes a bit of a muddle in these navel-gazing discussions, too, between what should be in our undergraduate curriculum and the separate, but not wholly unrelated, issue of our research agenda and graduate curriculum. I'm not entirely unsympathetic to the calls for "methodological pluralism" though I wouldn't go as far as the UK students would like. I have argued for graduate study of the history of economic thought, and I have emphasized it in my undergraduate teaching, using it as an organizing principle for my intermediate macroeconomics class (and also making my intro students read some Smith, Hayek, Friedman and Keynes). As a field, I do think macroeconomics is at a point where we should be open to reconsiderations of some of the standard tools (though I don't think that is ever not the case), and I worry that the "publish or perish" incentives we all face mean that we do too little of that.
Karl Whelan of University College Dublin has written a nice essay, "Teaching Economics 'After the Crash'," with a more detailed response to the UK students' criticisms; it is well worth reading.
Wednesday, July 23, 2014
DSGE Failing the Market Test?
As I see it, the main characteristic of "equilibrium" models Lucas and Sargent inaugurated is that they put people, time, and economics into macro.

Keynesian models model aggregates. Consumption depends on income. Investment depends on interest rates. Labor supply and demand depend on wages. Money demand depends on income and interest rates. "Consumption" and "investment" and so forth are the fundamental objects to be modeled.

"Equilibrium" models (using Lucas and Sargent's word) model people and technology. People make simultaneous decisions across multiple goods, constrained by budget constraints -- if you consume more and save more, you must work more, or hold less money. Firms make decisions across multiple goods constrained by technology.

Putting people and their simultaneous decisions back to the center of the model generates Lucas and Sargent's main econometric conclusion -- Sims' "incredible" identifying restrictions. When people simultaneously decide consumption, saving, labor supply, then the variables describing each must spill over into the other. There is no reason for leaving (say) wages out of the consumption equation. But the only thing distinguishing one equation from another is which variables get left out.

People make decisions thinking about the future. I think "static" vs. "intertemporal" are good words to use. That observation goes back to Friedman: consumption depends on permanent income, including expected future income, not today's income. Decisions today are inevitably tied to expectations -- rational or not -- about the future.

A Bloomberg View column by Noah Smith nicely summarizes the methodological shift, which gained momentum from the apparent breakdown of the Phillips curve relationship between inflation and unemployment in the 1970s. Smith writes:

Lucas showed that trying to boost gross domestic product by raising inflation might be like the tail trying to wag the dog. To avoid that kind of mistake, he and his compatriots declared, macroeconomists needed to base their models on things that wouldn't change when government policy changed -- things like technology, or consumer preferences. And so DSGE was born. (DSGE also gave macroeconomists a chance to use a lot of cool new math tricks, which probably increased its appeal.)

OK, history lesson over. So why is this important now?

Well, for one thing, the finance industry has ignored DSGE models. That could be a big mistake! Suppose you're a macro investor. If all you want to do is make unconditional forecasts -- say, GDP next quarter -- then you can go ahead and use an old-style SEM model, because you only care about correlation, not causation. But suppose you want to make a forecast of the effect of a government policy change -- for example, suppose you want to know how the Fed's taper will affect growth. In that case, you need to understand causation -- you need to know whether quantitative easing is actually changing people's behavior in a predictable way, and how.

This is what DSGE models are supposed to do. This is why academic macroeconomists use these models. So why doesn't anyone in the finance industry use them? Maybe industry is just slow to catch on. But with so many billions upon billions of dollars on the line, and so many DSGE models to choose from, you would think someone at some big bank or macro hedge fund somewhere would be running a DSGE model. And yet after asking around pretty extensively, I can't find anybody who is.

That's an interesting question -- when thinking about issues like this, I often come back to the divide between "science" and "engineering" put forward by Greg Mankiw. While academic macroeconomics has gone down the path marked out by Lucas and Sargent, the policymaking "engineers" in Washington often still find the older-style models more useful. It sounds like Wall Street's economists do too.

The question is whether academic macroeconomics is on track to produce models that are more useful for the policymakers and moneymakers. The DSGE method is still fairly new, and, until recently, we've been constrained by the limitations of our computers as well as our minds (a point Narayana Kocherlakota made here), so maybe we're just not quite there yet. But we should be open to the possibility that we're on the wrong track entirely.
Thursday, June 12, 2014
Lucas on Keynes
I think that in writing the General Theory, Keynes was viewing himself as a spokesman for a discredited profession. That's why he doesn't cite anyone but crazies like Hobson. He knows about Wicksell and all the "classics," but he is at pains to disassociate his views from theirs, to overemphasize the differences. He's writing in a situation where people are ready to throw in the towel on capitalism and liberal democracy and go with fascism or corporatism, protectionism, socialist planning. Keynes's first objective is to say, "Look, there's got to be a way to respond to depressions that's consistent with capitalist democracy." What he hits on is that the government should take some new responsibilities, but the responsibilities are for stabilizing overall spending flows. You don't have to plan the economy in detail in order to meet this objective. And in that sense, I think for everybody in the postwar period—I'm talking about Keynesians and monetarists both—that's the agreed-upon view: We should stabilize spending flows, and the question is really one of the details about how best to do it. Friedman's approach involved slightly less government involvement than a Keynesian approach, but I say slightly.

This is consistent with the "Neoclassical Synthesis" view that Keynes himself presaged in chapter 24 of the General Theory, which Brad DeLong discussed on his WCEG blog yesterday and Krugman commented on today (the quote from Lucas makes me wonder if perhaps Lucas and Krugman aren't quite as far apart as they think after all).
So I think this was a great political achievement. It gave us a lasting image of what we need economists for. I’ve been talking about the internal mainstream of economics, that’s what we researchers live on, but as a group we have to earn our living by helping people diagnose situations that arise and helping them understand what is going on and what we can do about it. That was Keynes’s whole life. He was a political activist from beginning to end. What he was concerned about when he wrote the General Theory was convincing people that there was a way to deal with the Depression that was forceful and effective but didn’t involve scrapping the capitalist system. Maybe we could have done it without him, but I’m glad we didn’t have to try.
Sunday, February 23, 2014
Fighting the last Methodenstreit
The reason they might seem so is that the methodological underpinnings of DSGE (Dynamic Stochastic General Equilibrium) models - "micro-founded" macroeconomic models derived from the optimizing behavior of individuals (or, often, a "representative agent") - were brought into macroeconomics by Robert Lucas, Ed Prescott, and others who were seeking to overturn "Keynesian" macroeconomics (see, e.g., Lucas and Sargent, 1979, "After Keynesian Macroeconomics").
The first generation of models of this type - "Real Business Cycle" (RBC) models, where "real" means non-monetary - implied that economic fluctuations could be optimal, and that monetary and fiscal policy were either useless or harmful (this JEP article by Charles Plosser is a good primer).

While these models failed to convince as explanations of economic fluctuations overall (as Larry Summers explained, though they can still be a useful part of the macro toolkit, as Chris House argues), the methods introduced by the RBC theorists have become nearly universal in macroeconomic modelling under the broader moniker "DSGE". The last couple of decades have shown us that a number of "Keynesian" features, such as "sticky" prices, can be incorporated into such models, which then go by the name "New Keynesian."
So the "New Keynesians" are using methods that were introduced by a cohort of macroeconomists that were explicitly anti-Keynesian. That is, Lucas et al. won the methodological war about how to build macroeconomic models, but their anti-Keynesian view of the economy itself did not prevail.
Wren-Lewis' answer to the question posed in the title of his post is "no." Paul Krugman summarizes and responds:
Wren-Lewis's answer is no, because New Keynesians were only doing what they would have wanted to do even if there hadn't been a de facto blockade of the journals against anything without rational-actor microfoundations. He has a point: long before anyone imagined doing anything like real business cycle theory, there had been a steady trend in macro toward grounding ideas in more or less rational behavior. The life-cycle model of consumption, for example, was clearly a step away from the Keynesian ad hoc consumption function toward modeling consumption choices as the result of rational, forward-looking behavior.

But I think we need to be careful about defining what, exactly, the bargain was. I would agree that being willing to use models with hyperrational, forward-looking agents was a natural step even for Keynesians. The Faustian bargain, however, was the willingness to accept the proposition that only models that were microfounded in that particular sense would be considered acceptable. It's one thing to accept that models with an Euler condition at their core can sometimes be useful; it's quite different to restrict your discourse to models with that characteristic, while ruling out everything else.
For more interesting thoughts on this see: Brad DeLong, Roger Farmer, Steve Williamson's response to the Krugman post quoted above, another post by Krugman.
Thursday, January 23, 2014
Rodrik on Our Science-ish-ness
While the question "is economics a science?" is a little pedantic - the answer depends on how one defines science - raising it does sometimes lead to some useful reflections on what it is that we actually do.
In an essay for the Institute for Advanced Study's Institute Letter, "Economics: Science, Craft or Snake Oil?," Dani Rodrik offers a number of characteristically interesting thoughts on the topic, including:
Economics, unlike the natural sciences, rarely yields cut-and-dried results. Economics is really a toolkit with multiple models—each a different, stylized representation of some aspect of reality. The contextual nature of its reasoning means that there are as many conclusions as potential real-world circumstances. All economic propositions are "if-then" statements. One's skill as an economic analyst depends on the ability to pick and choose the right model for the situation. Accordingly, figuring out which remedy works best in a particular setting is a craft rather than a science.

Or, as Keynes said: "Economics is the science of thinking in terms of models joined to the art of choosing models which are relevant to the contemporary world."
One reaction I get when I say this is the following: “how can economics be useful if you have a model for every possible outcome?” Well, the world is complicated, and we understand it by simplifying it. A market behaves differently when there are many sellers than when there are a few. Even when there are a few sellers, the outcomes differ depending on the nature of strategic interactions among them. When we add imperfect information, we get even more possibilities. The best we can do is to understand the structure of behavior in each one of these cases, and then have an empirical method that helps us apply the right model to the particular context we are interested in.
See also related thoughts from Mark Thoma and Chris Dillow.
Friday, February 15, 2013
Stanley Fischer
"He was not fundamentally a rat-exian," Bernanke said, invoking the derogatory slang that Keynesians used to describe Lucas and his theory of "rational expectations." "He was basically a Keynesian in his instincts, so he got along just fine with Samuelson and [fellow MIT professor Robert] Solow."

The profile includes some speculation that Fischer might succeed his student as Federal Reserve chair (Bernanke's term ends in Jan. 2014). If he were nominated, it would be interesting to see whether the fact that he is from outside the US - he was born in Zambia (when it was Northern Rhodesia) and came to the US for grad school at MIT - and served as head of another country's (Israel's) central bank would cause trouble during the Senate confirmation process. It seems likely that some in the Senate would make trouble for whoever President Obama might nominate (which may be an argument for trying to keep Bernanke on), but I would guess opponents would be more likely to latch on to the fact that Fischer also held a high-ranking job at Citigroup for several years.
The fruit of Fischer’s effort to integrate the two approaches is known today as “New Keynesian” economics. It is the dominant approach in most leading economics departments, with Mankiw, Bernanke, IMF chief economist Olivier Blanchard and many others contributing to the movement.
But Fischer was arguably first out of the gate. He helped originate the argument that "sticky prices" — that is, practical impediments to changing prices for goods, such as the expense of printing a new restaurant menu — mean that even rational, self-interested businesses and consumers can make choices that add up to an economy much like the one Keynesians describe.
Fischer, Bernanke said, wrote “one of the very first papers that had both sticky prices and rational expectations in it.” By doing this, Fischer had in effect united the two sides of economics. “I still think Keynesian economics is extremely important, and if anybody didn’t think so, this crisis should have made them rethink,” Fischer said in an interview.
Update (2/17): David Warsh's Economic Principals also discussed Fischer as a potential Fed candidate a couple of weeks ago.
Tuesday, May 29, 2012
Flo Doesn't Know the Lucas Critique
Dyer explains:
According to the rules, Snapshot can generate a discount but not a surcharge -- unless you live in Rhode Island. The device logs your speed, but that's not a factor in the calculations because Progressive doesn't know where you are -- you might be doing 65 mph in a 70 zone or 45 mph through a car wash (although one wonders if a few trips into the triple digits would disqualify you from a safe-driver discount). The deciding factors are what time of day you drive, how far you drive, and how forcefully you brake.

The reason for the braking criterion is that "gentle braking apparently correlates to low insurance claims". Anyone familiar with the Lucas critique will spot the problem with this, as Dyer does:
If you're approaching a yellow light, Snapshot is an incentive to risk running the red rather than hitting the brakes. If a deer jumps out in front of you, Snapshot would prefer that you swerve into the oncoming lane rather than mash that brake pedal.

That is, the past relationship between braking behavior and driving safety reflects the behavior of agents under one set of incentives - if you change their incentives, their behavior will change, and the previous relationship between braking and safe driving will no longer be valid.
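A toy simulation makes the point vivid. This is my own illustration, not Progressive's actual model: a reduced-form correlation estimated under one incentive regime evaporates once the policy itself changes behavior:

```python
import numpy as np

# Toy illustration of the Lucas critique (my own sketch, not Progressive's
# model): a correlation estimated under one incentive regime breaks down
# once the policy changes behavior.
rng = np.random.default_rng(0)
n = 10_000
risk = rng.uniform(0, 1, n)              # unobserved riskiness of each driver

# Before Snapshot: risky drivers tailgate and brake hard, so hard braking
# is a good proxy for expected claims.
braking_before = risk + rng.normal(0, 0.2, n)
claims = risk + rng.normal(0, 0.2, n)
print(np.corrcoef(braking_before, claims)[0, 1])   # strongly positive

# After Snapshot: every driver, whatever their risk, avoids braking to earn
# the discount (e.g., by running yellow lights). The proxy goes dead even
# though underlying riskiness -- and claims -- are unchanged.
braking_after = rng.normal(0.1, 0.05, n)           # uniformly gentle braking
print(np.corrcoef(braking_after, claims)[0, 1])    # near zero
```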
What Progressive really needs is a "structural" model that embodies the underlying preferences of their customers, which will be invariant to the policy change. Of course that's a much harder thing to do (as macroeconomists have discovered over the past several decades...).
Sunday, October 16, 2011
The Structure of Macroeconomic Revolutions?
So I liked this post by Matthew Yglesias, where he invokes Kuhn and draws an analogy between macroeconomics and the Copernican revolution in astronomy. He recounts how Copernican (the Earth revolves around the sun) astronomy eventually supplanted Ptolemaic (the Earth is the center of the universe) astronomy, but initially the Ptolemaic system made much better predictions, and concludes:
My view, with both all due respect and all due derision, is that the Robert Lucas types are like the early Copernicans here. There's something admirable in their insistence that it ought to all work out to an easily modeled system grounded in compelling theoretical considerations. The New Keynesian model is a mess, like late-Ptolemaic astronomy, thrown together to account for observed reality. But you don't fly to a moon with an elegant model that delivers mistaken predictions about where the moon's going to be. And what we actually need is a Kepler to give us an elegant model that actually predicts the phenomena, and then a Newton who can explain what that model means.

Hmmm... I'm more inclined to place the users of old Keynesian models, including the IS-LM-based macroeconometric models used by policymakers, in the "late-Ptolemaic" role, but, in any case, the Kuhnian approach helps explain why I simultaneously agree with Brad DeLong, Paul Krugman and Greg Mankiw that the IS-LM model remains a very useful tool, while being a little more optimistic than Krugman about the state of macroeconomics.
Also, I'm not sure that Lucas and others (including new Nobel laureate Tom Sargent) who have pushed macroeconomics towards "structural" or "micro-founded" models are leading us to an "easily modeled system." What counts for "elegance" in modern macro is consistency between the macroeconomic model and micro-economically optimal behavior on the part of consumers and firms (I suppose the obvious retort to that is to invoke Emerson: "A foolish consistency is the hobgoblin of little minds").
Wednesday, June 22, 2011
Krugman on Keynes
The brand of economics I use in my daily work – the brand that I still consider by far the most reasonable approach out there – was largely established by Paul Samuelson back in 1948, when he published the first edition of his classic textbook. It's an approach that combines the grand tradition of microeconomics, with its emphasis on how the invisible hand leads to generally desirable outcomes, with Keynesian macroeconomics, which emphasises the way the economy can develop what Keynes called "magneto trouble", requiring policy intervention. In the Samuelsonian synthesis, one must count on the government to ensure more or less full employment; only once that can be taken as given do the usual virtues of free markets come to the fore.

I share Krugman's view that the "textbook" Keynesian apparatus remains a useful way of thinking about the economy. However, I think his portrayal of the turn macroeconomics has taken over the past forty or so years is a bit unfair. As Krugman notes, contemporary macroeconomic models are grounded in microeconomic optimization. Although a foolish desire for consistency can be the hobgoblin of our little economist minds, there is more to the story - the shift in methodology was also motivated by real deficiencies in the Keynesian framework identified by Friedman and Lucas, as well as the "stagflation" of the 1970's, which appeared to contradict Keynesian theory.
It’s a deeply reasonable approach – but it’s also intellectually unstable. For it requires some strategic inconsistency in how you think about the economy. When you’re doing micro, you assume rational individuals and rapidly clearing markets; when you’re doing macro, frictions and ad hoc behavioural assumptions are essential.
So what? Inconsistency in the pursuit of useful guidance is no vice. The map is not the territory, and it’s OK to use different kinds of maps depending on what you’re trying to accomplish. If you’re driving, a road map suffices. If you’re going hiking, you really need a topographic survey.
But economists were bound to push at the dividing line between micro and macro – which in practice has meant trying to make macro more like micro, basing more and more of it on optimisation and market-clearing. And if the attempts to provide “microfoundations” fell short? Well, given human propensities, plus the law of diminishing disciples, it was probably inevitable that a substantial part of the economics profession would simply assume away the realities of the business cycle, because they didn’t fit the models.
The result was what I’ve called the Dark Age of macroeconomics, in which large numbers of economists literally knew nothing of the hard-won insights of the 30s and 40s – and, of course, went into spasms of rage when their ignorance was pointed out.
Monday, March 21, 2011
DeLong on Friedman
This reveals a contradiction between how Milton Friedman is perceived and what his ideas really meant, as Brad DeLong explains:
In the 1950s and 1960s and 1970s Milton Friedman faced a rhetorical problem. He was a laissez-faire libertarian. But he also believed that macroeconomic stabilization required that the central bank be always in the market, buying and selling government bonds in order to match the supply of liquid cash money to the demand, and so make Say's Law true in practice even though it was false in theory.
Such a policy of constant government intervention to continually rebalance aggregate demand is hardly a laissez-faire hands-off libertarian policy, is it?
Friedman, however, set about trying to maximize the rhetorical distance between his position--which was merely the "neutral," passive policy of maintaining the money stock growth rate at a constant--and the position of other macroeconomists, which was an "activist," interventionist policy of having the government disturb the natural workings of the free market. Something went wrong, Friedman claimed, only when a government stepped away from the "neutral" monetary policy of the constant growth rate rule and did something else.
It was, I think, that description of optimal monetary policy--not "the central bank has to be constantly intervening in order to offset shocks to cash demand by households and businesses, shocks to desired reserves on the part of banks, and shocks to the financial depth of the banking system" but "the central bank needs to keep its nose out of the economy, sit on its hands, and do nothing but maintain a constant growth rate for the money stock"--that set the stage for what was to follow in Chicago.
First, Friedman's rhetorical doctrine eliminated the cognitive dissonance between normal laissez-faire policies and optimal macro policy: both were "neutral" in the sense of the government "not interfering" with the natural equilibrium of the market. Second, Friedman's rhetorical doctrine eliminated all interesting macroeconomic questions: if the government followed the proper "neutral" policy, then there could be no macroeconomic problems. Third, generations of Chicago economists that had been weaned on this diet turned out to know nothing about macro and monetary issues when they became important again.
It is in this sense, I think, that I blame Milton Friedman: he sold the Chicago School an interventionist, technocratic, managerial optimal monetary policy under the pretense that it was something--laissez-faire--that it was not.
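DeLong's first point is easy to make concrete. A minimal sketch, with made-up numbers: even the "neutral" k-percent rule requires continual open-market operations, because the monetary base must move against every shock to the money multiplier:

```python
import numpy as np

# Sketch of DeLong's point (my own illustration, made-up numbers): even a
# "neutral" k-percent rule requires constant intervention. The money stock
# is M = m * B, where m is the money multiplier (buffeted by banks' and
# households' choices) and B is the base the central bank controls through
# open-market operations.
rng = np.random.default_rng(1)
k = 0.03                                # target money growth rate
M_target, B = 100.0, 50.0
for t in range(5):
    M_target *= 1 + k                   # the k-percent path for M
    m = 2.0 + rng.normal(0, 0.1)        # multiplier shock this period
    B_needed = M_target / m             # base the central bank must supply
    print(f"t={t}: buy/sell {B_needed - B:+.2f} in bonds")
    B = B_needed                        # "doing nothing" means trading every period
```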
Friday, March 4, 2011
Scientists vs Engineers?
The House Republicans' proposal would reduce 2011 real GDP growth by 0.5% and 2012 growth by 0.2 percentage points. This would mean some 400,000 fewer jobs created by the end of 2011 and 700,000 fewer jobs by the end of 2012.

This shouldn't come as a surprise to macroeconomics students, who know that a decrease in government purchases reduces aggregate demand and - outside of the special "classical" case of vertical aggregate supply - output.
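The mechanics are simple Keynesian-cross multiplier arithmetic; here is a sketch with hypothetical numbers (not the figures behind the estimate above):

```python
# Keynesian-cross arithmetic with hypothetical numbers (not the figures
# behind the Goldman or Zandi estimates).
c = 0.6                        # assumed marginal propensity to consume
dG = -60e9                     # a $60 billion cut in government purchases
multiplier = 1 / (1 - c)       # = 2.5 in this simple closed-economy case
dY = multiplier * dG
print(f"Change in output: {dY / 1e9:.0f} billion")   # -150 billion
```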
John Taylor disagrees, however. Ezra Klein explains:
Mark Zandi says the GOP's proposed spending cuts will cost about 700,000 jobs. John Taylor says they will "increase economic growth and employment." Both are respected economists who immerse themselves in data, research and theory. So how can they disagree so sharply?

The dispute comes down to how much weight you give to "expectations" about future deficits. Taylor's argument is that Zandi's model -- which you can read more about here -- doesn't account for the upside of deficit reduction -- namely, that when the government spends less, the private sector will spend more. Taylor thinks individuals and businesses are hoarding their money because they're afraid of the high taxes, sharp spending cuts and assorted other nastiness that deficit reduction will eventually require. "The high unemployment we are experiencing now is due to low private investment rather than low government spending," he writes. "By reducing some uncertainty and the threats of exploding debt, the House spending proposal will encourage private investment."

A similar disagreement is playing out over monetary policy. In a recent NY Times column, Christina Romer wrote:

The debate is between what I would describe as empiricists and theorists.

Empiricists, as the name suggests, put most weight on the evidence. Empirical analysis shows that the main determinants of inflation are past inflation and unemployment. Inflation rises when unemployment is below normal and falls when it is above normal.

Though there is much debate about what level of unemployment is now normal, virtually no one doubts that at 9 percent, unemployment is well above it. With core inflation running at less than 1 percent, empiricists are therefore relatively unconcerned about inflation in the current environment.

Theorists, on the other hand, emphasize economic models that assume people are highly rational in forming expectations of future inflation. In these models, Fed actions that call its commitment to low inflation into question can cause inflation expectations to spike, leading to actual increases in prices and wages.

She sides with the "empiricists" and argues that the influence of the "theorists" has held the Fed back from taking bolder, more effective action. Stephen Williamson begs to differ:
Romer says some things about economic history in her piece, but of course she is very selective, and seems to want to ignore the period in US economic history and in macroeconomic thought that runs from about 1968 to 1985. Let's review that. (i) Samuelson/Solow and others think that the Phillips curve is a structural relationship - a stable relationship between unemployment and inflation that represents a policy choice for the Fed. (ii) Friedman (in words) says that this is not so. There is no long-run tradeoff between unemployment and inflation. It is possible to have high inflation and high unemployment. (iii) Macroeconomic events play out in a way consistent with what Friedman stated. We have high inflation and high unemployment. (iv) Lucas writes down a theory that makes rigorous what Friedman said. There are parts of the theory that we don't like so much now, but Lucas's work sets off a methodological revolution that changes how we do macroeconomics.

The divides between Goldman/Zandi and Taylor over fiscal policy and between Romer and Williamson over monetary policy both reminded me of Greg Mankiw's distinction between "scientific" and "engineering" macroeconomics. The models used by the "engineers" - the people in Washington and on Wall Street who need to make practical, quantitative assessments of the impact of policy alternatives on the economy - are more elaborate versions of the "textbook" Keynesian IS-LM aggregate supply and demand framework that most of us (still) teach our macroeconomics students. As Williamson points out, the models used by academics - Mankiw's "scientists" - in our research are fundamentally different.
The engineering models are built on relationships among aggregate macroeconomic variables like the Phillips curve, which relates inflation and unemployment, and the consumption function, which connects consumption and disposable income. As Williamson alludes to, Robert Lucas and others won a methodological war in the profession (or at least the academic branch of it) in the 1970s and 1980s. The result of their victory is that the macroeconomic models published in leading journals today are expected to be grounded in the optimizing, forward-looking behavior of rational individuals.
Such individuals might believe, for example, that a reduction in government spending today implies that their future taxes will be lower (because the government will be servicing a smaller debt burden). The resulting increase in their lifetime disposable income means that they will immediately increase their consumption. So any negative impact of a cut in government purchases is offset by an increase in consumption. Rational, forward-looking optimizers might also recognize that any monetary expansion will erode their real wages and demand an offsetting increase in nominal wages. This means that employment will remain unchanged (because the real cost to the firms of a worker is the same) even as inflation rises.
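That offset argument reduces to two-period arithmetic. A stylized sketch with hypothetical numbers:

```python
# Two-period sketch of the offset logic (hypothetical numbers; assumes
# lump-sum taxes, no borrowing constraints, fully forward-looking consumers).
r = 0.05                                    # real interest rate
g_cut = 10.0                                # cut in government purchases today

future_tax_cut = g_cut * (1 + r)            # smaller debt: principal plus
                                            # interest saved next period
pv_of_tax_cut = future_tax_cut / (1 + r)    # discounted back to today

# The present value of the tax savings equals today's spending cut, so the
# household can raise consumption one-for-one, leaving demand unchanged.
print(pv_of_tax_cut)                        # 10.0 (up to rounding)
```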
At its most extreme, the assumption of dynamic optimization under rational expectations was once believed to imply the Lucas-Sargent "Policy Ineffectiveness" proposition, which Bennett McCallum explained in a 1980 Challenge article:
Macroeconomic policies - sustained patterns of action or reaction - will have no influence because they are perceived and taken into account by private decision-making agents. Thus, the adoption of a policy to maintain "full employment" will not, according to the present argument, result in values of the unemployment rate that are smaller (or less variable) on average than those that would be experienced in the absence of such a policy.

Of course, in a world of rationally optimizing people, where prices adjust to clear markets, it is hard to explain how we could get to such large deviations from the natural rate of unemployment in the first place...
More generally, while macroeconomic science has continued on the methodological path established by Lucas, many of its practitioners have worked to re-incorporate real effects of monetary policy. This is a large part of the "New Keynesian" project, which is arguably now the reigning paradigm and best hope for reuniting "science" and "engineering" (and arguably is as much "monetarist" as it is "Keynesian").
Fiscal policy has received less attention - the implausibility of managing aggregate demand through the slow, cumbersome and messy budget process means that, in general, the focus has been on the Fed.
That has started to change as the global slump has pushed conventional monetary policy to its limits (and beyond, into unknown worlds of unconventional policy), and governments around the world have made fitful attempts at fiscal policy. For example, recent papers by Christiano, Eichenbaum and Rebelo, and by Gauti Eggertsson and Michael Woodford, have shown that, in New Keynesian models, fiscal policy can have significant multiplier effects when monetary policy is at the zero lower bound (as it is today).
So, while, at a superficial level, it appears that the split between "scientists" and "engineers" persists, some of the "scientific" work being done today is finding that the remedies proposed by the "engineers" are not wholly inconsistent with forward-looking rational behavior after all.
Sunday, December 5, 2010
What We Believe and the Tools We Use
What we research, and what we believe, aren't necessarily the same thing. What gets published in the journals is a survey of what we are currently researching. It isn't an accurate survey of what we currently believe. The whole point of a journal is not to publish what everybody already believes. The journals are a map of where we are currently exploring for gold. They are not a map of existing gold deposits. They are not a map of where we think gold might be found in places we can't currently explore.

To which I might add, economic models can be thought of as tools. Using a particular tool (model) to do a job (normal science) shouldn't be taken to imply a belief that the model is the right one for all economic phenomena. In many cases, it's much easier to make progress (and get papers accepted) if one uses existing tools. For instance, I've used a real business cycle model in my own research - it turned out to be an effective device to implement an idea I had about real exchange rate volatility. But it does not mean that I believe that real business cycle theory is a correct explanation of economic fluctuations.
Tuesday, August 24, 2010
Keynes on the Science and Art of Economics
Economics is the science of thinking in terms of models joined to the art of choosing models which are relevant to the contemporary world.

That comes to my attention via David Colander, "The Death of Neoclassical Economics" (HT: Frances Woolley).
Sunday, May 30, 2010
DeLong Answers Kocherlakota
I believe that during the last financial crisis, macroeconomists (and I include myself among them) failed the country, and indeed the world. In September 2008, central bankers were in desperate need of a playbook that offered a systematic plan of attack to deal with fast-evolving circumstances. Macroeconomics should have been able to provide that playbook. It could not.

DeLong argues that a better understanding of how to respond to economic crises exists, even if it is outside of Kocherlakota's realm of "modern macroeconomics." He goes back to John Stuart Mill:
Let me briefly set out what the macro playbook is, and how it has been developed by economists and policymakers over the past 185 years. Start with Say's or Walras's Law: the circular flow principle that everybody's expenditure is someone else's income -- and everyone's income is somebody else's expenditure. It has to be that way: for every buyer there is a seller: and for every seller who is disappointed because they sell for less than their cost plus normal profit because of excess supply there must be another who is exuberant from selling at more than cost plus normal profit.

How, then, can you have a depression -- a "general glut," a situation in which there is excess supply of not one or a few but all commodity goods and services? How can you have a situation in which workers laid off from shrinking industries where demand is less than was expected and thus less than supply are not rapidly hired into industries where demand is more than was expected and hence more than supply?
Moral philosopher, libertarian, colonial bureaucrat, feminist, public intellectual, and economist John Stuart Mill put his finger on the answer in a piece he published in 1844:
[T]hose who have... affirmed that there was an excess of all commodities, never pretended that money was one of these commodities.... [P]ersons in general, at that particular time, from a general expectation of being called upon to meet sudden demands, liked better to possess money than any other commodity. Money, consequently, was in request, and all other commodities were in comparative disrepute. In extreme cases, money is collected in masses, and hoarded; in the milder cases, people merely defer parting with their money, or coming under any new engagements to part with it. But the result is, that all commodities fall in price, or become unsaleable...
DeLong puts the problem in terms of a shortage of "safe" assets. The policy response of creating more of them - issuing more government bonds - is the flip side of the traditional Keynesian remedy of deficit spending (or deficit financed tax cuts), as well as of an aggressive "lender of last resort" central bank policy. See also DeLong's related project syndicate column, and this Vox piece by Ricardo Caballero.
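The Mill passage is an early statement of what we would now put in Walras's Law terms: summed across all markets, including the market for money, the values of excess demands are zero, so an excess demand for money must have as its counterpart an excess supply of goods. A toy calculation (my own illustration):

```python
# A toy three-good economy plus money (my own illustration): the market
# values of all excess demands sum to zero, so an excess demand for money
# is the mirror image of a "general glut" of goods.
p = [1.0, 2.0, 3.0]                        # goods prices
excess_demand_goods = [-4.0, -2.0, -1.0]   # every good in excess supply
excess_demand_money = -sum(pi * edi for pi, edi in zip(p, excess_demand_goods))
print(excess_demand_money)                 # 11.0: the hoarding Mill describes
```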
So, is this further evidence that we are living in what Krugman called the "dark age of macroeconomics"? Yes and no. As DeLong notes, policymakers have largely been following his playbook (though there are ominous signs they are pulling back too soon). However, academic models employing the reigning methodology of "dynamic stochastic general equilibrium" (DSGE) have generally not been very helpful. That paradigm is still relatively young - it remains to be seen if it will develop in a direction that makes it more useful for policy, or whether it will be supplanted in a more fundamental shift.
Monday, May 17, 2010
The State of Macro
Kocherlakota argues that macro has largely gotten beyond the "saltwater" - "freshwater" schism that has, I think, been overplayed in much of the conversation about macroeconomics and the recession (including Krugman's widely noted NYT magazine article, which I responded to in this post). His picture is of a field that is more pragmatic than ideological. For example, he suggests the use of "social planner" solutions in dynamic stochastic general equilibrium models has been more a matter of convenience than of a rigid belief that perfectly competitive market conditions hold at all times. He writes:
My own idiosyncratic view is that the division was a consequence of the limited computing technologies and techniques that were available in the 1980s. To solve a generic macro model, a vast array of time- and state-dependent quantities and prices must be computed. These quantities and prices interact in potentially complex ways, and so the problem can be quite daunting.

However, this complicated interaction simplifies greatly if the model is such that its implied quantities maximize a measure of social welfare. Given the primitive state of computational tools, most researchers could only solve models of this kind. But—almost coincidentally—in these models, all government interventions (including all forms of stabilization policy) are undesirable.
With the advent of better computers, better theory, and better programming, it is possible to solve a much wider class of modern macro models. As a result, the freshwater-saltwater divide has disappeared. Both camps have won (and I guess lost). On the one hand, the freshwater camp won in terms of its modeling methodology. Substantively, too, there is a general recognition that some nontrivial fraction of aggregate fluctuations is actually efficient in nature.
On the other hand, the saltwater camp has also won, because it is generally agreed that some forms of stabilization policy are useful. As I will show, though, these stabilization policies take a different form from that implied by the older models (from the 1960s and 1970s).
Saturday, January 30, 2010
New Keynesian Bastards?
Following the work of Friedman in the 1950's and 60's and Lucas in the 70's, there was an increasing emphasis on establishing microeconomic foundations for macroeconomics, and thinking seriously about expectations, with the idea of "rational expectations" becoming the new benchmark. This methodological revolution led to Real Business Cycle (RBC) models in the 1980's that explained economic fluctuations as the optimal responses of a forward-looking representative agent to productivity shocks. The models were "real" because monetary variables played no role. That ultimately proved too implausible for much of the profession to swallow (see e.g., Lawrence Summers (pdf)).
The real business cycle theorists did not succeed in taking over the field, but they did win the methodological war. It is now standard practice to derive macroeconomic models from the microeconomic foundations of optimizing behavior of agents with rational expectations. Much of the "New Keynesian" macroeconomics which is widespread today is essentially Real Business Cycle models with "sticky" prices grafted on (and the sticky prices mean that monetary policy has real effects).
In the FT, Roger Farmer argues that the "New Keynesians" aren't very good Keynesians, either:
For 30 years, macroeconomists have been of two stripes: new-classical and new-Keynesian. Neither has anything interesting to say about the current crisis.

In new-classical and new-Keynesian economics, all unemployment is temporary and unemployed workers will quickly find jobs. According to the Keynes of The General Theory, very high unemployment can persist forever. Nobody has taken this Keynesian idea seriously in respectable academic circles since the 1950s. But given the current jobless recovery, it's an idea that makes sense and needs to be reconsidered.
Keynesian economics as we know it today is a watered down version of The General Theory given to us by American Keynesians like Paul Samuelson. Samuelson turned Keynesian economics into a digestible series of bite-sized pieces that the Cambridge economist and contemporary of Keynes, Joan Robinson, has referred to as “bastard Keynesianism”. Samuelson’s interpretation of Keynes evolved into a modern incarnation - new-Keynesian economics.
According to new-Keynesians, recessions occur because some firms are stubbornly unwilling to lower their prices in the face of a fall in demand. Workers quit their jobs and choose to take a prolonged vacation. This is not the main theme of The General Theory. But the idea that some firms are slow to change prices is central to new-Keynesian economics. To explain why firms don’t change prices, the new-Keynesians assume that a firm must wait until it’s randomly chosen to be given the privilege to change its price. This option is facetiously referred to as a ‘visit from the Calvo fairy’ after a paper by economist Guillermo Calvo who first introduced the idea into macroeconomics. I don’t believe in fairies.
The Calvo fairy is not the only unrealistic feature of new-Keynesian economics. Perhaps more damning is the fact that there is no unemployment in the benchmark new-Keynesian model. Instead, all variations in the employment rate occur as rational maximizing households choose to vary their hours in response to changes in the real wage. It is hard to take this model seriously as an explanation for the Great Depression or the current financial crisis. But it continues to dominate the discussion at academic conferences because - until now - there has been no good theoretical alternative.
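For readers who have not met the fairy: in the standard Calvo device, a firm gets to reset its price in any period only with a fixed probability, independent of how misaligned the price is, which makes average price stickiness a single parameter. A minimal sketch (illustrative parameter value):

```python
import random

# The "Calvo fairy" (standard New Keynesian device, illustrative parameter):
# each period a firm resets its price only with probability 1 - theta,
# independent of how misaligned its current price is.
theta = 0.75                            # chance the price stays stuck
print(1 / (1 - theta))                  # expected wait between resets: 4 periods

random.seed(0)
spells = []
for _ in range(5):
    t = 1
    while random.random() < theta:      # no visit from the fairy this period
        t += 1
    spells.append(t)
print(spells)                           # geometric waiting times, mean of 4
```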
While Farmer has his own alternative that involves more fundamental change, there is some progress being made on the labor market front within the existing New Keynesian paradigm. Several recent papers (which happen to be sitting on my desk right now) by Carl Walsh, Antonella Trigari, and Olivier Blanchard and Jordi Gali incorporate involuntary unemployment by integrating search and matching processes into the model.
It remains to be seen whether good "normal science" like this will save the New Keynesian framework in the wake of the global slump, or whether it is time for a scientific revolution...