Saturday, December 20, 2014

The Birth of Inflation Targeting

Inflation targeting, where monetary policy is directed to aim for a specific level or range for the inflation rate, has become a widespread practice.  The Times' Neil Irwin looks back at its first implementation by New Zealand; he writes:
Sometimes, decisions that shape the world’s economic future are made with great pomp and gain widespread attention. Other times, they are made through a quick, unanimous vote by members of the New Zealand Parliament who were eager to get home for Christmas.

That is what happened 25 years ago this Sunday, when New Zealand became the first country to set a formal target for how much prices should rise each year — zero to 2 percent in its initial action. The practice was so successful in making the high inflation of the 1970s and ’80s a thing of the past that all of the world’s most advanced nations have emulated it in one form or another. A 2 percent inflation target is now the norm across much of the world, having become virtually an economic religion.
Irwin goes on to provide a nice description of how and why New Zealand adopted this policy. Although it initially seemed quite successful in achieving the goal of avoiding a repeat of the high inflation of the 1970s (which continued well into the '80s in some countries) while maintaining reasonable growth rates, it has been tested by the experience of the past seven years.  One question is whether the level of 2 percent is the right one.  Irwin describes how Janet Yellen successfully argued against those at the Fed who wanted to go for zero inflation in the mid-1990s, but even 2 percent may be too low:
Starting in the late 1990s, Japan found itself stuck in a pattern of falling prices, or deflation, even after it cut interest rates all the way to zero. The United States suffered a mild recession in 2001, and the Fed cut interest rates to 1 percent to help spur a recovery. Then came the global financial crisis of 2007 to 2009, spurring a steep downturn across the planet and causing central banks to slash interest rates.

All of this has quite a few smart economists wondering whether the central bankers got the target number wrong. If they had set it a bit higher, perhaps at 3 or 4 percent, they might have been better able to combat the Great Recession because they could cut inflation-adjusted interest rates by more.
Both the apparent initial success and the reasons for recent doubts can be seen in the experience of the UK, which adopted inflation targeting in 1992:

(The chart data is from the OECD, via FRED.  The UK's target was initially 2.5%, but expressed in terms of a different price index measure; when it switched to using the CPI in 2003, it moved the target to 2% to account for differences between the measures.)

While the UK generally has had low and (relatively) stable inflation since the early 1990s, it did overshoot its target considerably in 2007-2012, and it may now be in danger of undershooting it (as the Fed is) - inflation in November was 1% (this is not evident in the chart because the chart plots the percentage change in the price index from the year before).
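To be concrete about the calculation behind the chart - year-over-year inflation is just the percentage change in the price index from twelve months earlier. A quick sketch, using made-up index values rather than actual UK CPI data:

```python
# Year-over-year inflation from a monthly price index.
# These index values are hypothetical, for illustration only.
index = {"2013-11": 99.0, "2014-11": 100.0}

# Inflation is the percentage change in the index from a year earlier.
inflation = 100 * (index["2014-11"] / index["2013-11"] - 1)
print(round(inflation, 2))  # about 1.01 percent
```

Note that a monthly reading like November's 1% can sit well below the annual-average figure a chart like this shows.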

Although I think the Bank of England deserves credit for not tightening in the face of inflation which ultimately proved transitory, this does call the inflation targeting framework into question.  Arguably, it may have helped keep inflation expectations "anchored" even as inflation deviated from target.  However, at some point, one would expect such deviations to undermine the credibility of the regime, and it was the idea of establishing credibility that made it attractive to academic economists in the first place (the underlying intuition for this was nicely described in this speech by Philadelphia Fed President Charles Plosser).

The other question - whether a higher target, or a different one, such as a target for the price level (inflation is the rate of change of the price level) or for nominal GDP, would be better - is an interesting and important one.  The difficulty now is that, having established one monetary policy rule, switching to another could itself diminish the credibility of any rule.
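The distinction between an inflation target and a price-level target matters most after a miss. A stylized calculation (my own hypothetical numbers, not from Irwin's article) shows how a price-level target, unlike an inflation target, requires making up past shortfalls:

```python
# Compare a 2% inflation target with a 2% price-level path after one
# year in which inflation came in at 0%. All numbers are hypothetical.
target_rate = 0.02
p0 = 100.0       # initial price level
p1 = p0 * 1.00   # year 1: inflation undershoots at 0%

# Inflation targeting lets bygones be bygones: next year's goal is still 2%.
inflation_target_next = target_rate

# Price-level targeting requires returning to the 2% path, so the miss
# must be made up: the year-2 goal for the price level is p0 * 1.02**2.
pl_goal = p0 * (1 + target_rate) ** 2
price_level_target_next = pl_goal / p1 - 1

print(round(100 * inflation_target_next, 1))    # 2.0 percent
print(round(100 * price_level_target_next, 1))  # 4.0 percent
```

The catch-up property is one argument made for price-level (or nominal GDP level) targets at the zero lower bound: a period of undershooting automatically implies a promise of faster inflation later.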

Economics Navel-Gazing, Curricular Edition

A group of economics students in the UK has launched a movement to reform the economics curriculum.  I'm a little surprised that I haven't run into similar sentiments at Wesleyan - I can't decide whether I'm disappointed or relieved by this.

The criticisms seem to me to be based on a somewhat unfair caricature of economics and economists, that we're head-in-the-sand apologists for "neoliberalism" who use mathematics as a form of obscurantism and have little useful to say about the "real world," particularly in the wake of the financial crisis.

Some of this may be rooted in the fact that the "economics" articulated by politicians, government officials and the press - what Simon Wren-Lewis has called "mediamacro" - does not reflect the views of most of mainstream academic economics.  In particular the obsession with government budget deficits is not based on textbook economics (I discussed an example of this misconception in a European context a couple of years ago).

Markets are at the heart of economics - this may be where the view that economists are "free market fundamentalists" comes from.  In introducing markets, though, there are really two main points to make:
  1. The gains from exchange and specialization possible from voluntary trade (i.e., Adam Smith's "Invisible Hand"), and the ability of markets to make adjustments based on dispersed information about what Hayek called "the particular circumstances of time and place," which would be un-knowable to any central planner.
  2. While economists need to make our students aware of the hidden and under-appreciated role markets play in organizing society and in lifting humanity out of subsistence-level poverty, we also devote a considerable amount of attention to how they fail.  In particular, problems of monopoly power, externalities, public goods and asymmetric information are standard subjects for introductory economics courses.  (2a., There are also reasons to be skeptical in practice of the ability of our political system to effectively correct market failures).
We do suffer from an excess of libertarians who mistakenly believe that economic theory validates their views - I usually think of these people as students who stopped listening after they learned about point #1 in the first several weeks of their principles courses (or put too much weight on #2a).  But these folks are a minority in economics, though perhaps a vocal enough one that students and other outsiders might believe they are more representative than they really are.

We typically introduce markets with the model of "supply and demand," and the exercise of thinking in terms of models provides much of the lasting value of studying economics.  Working with economic models can sharpen students' logical and critical thinking skills immensely.  As John Cochrane nicely put it recently, "economic models are quantitative parables, not explicit and complete descriptions of reality." The criticism that models are "simplifications" is a cheap one - writing down a set of assumptions in mathematical form and working out the implications (and then testing them against data) is where the insight comes from.  The discipline of doing this cultivates an ability to think intelligently about tradeoffs and hidden costs, and to trace conclusions back to underlying assumptions and consider how changing assumptions lead to different conclusions.  Since models are, by necessity, very stylized descriptions of the world, students of economics must not only learn how to work with them, but also how to judge which simplifications are appropriate for a given circumstance or question.  As Keynes said, "Economics is the science of thinking in terms of models joined to the art of choosing models which are relevant to the contemporary world."

So I think the core of what we try to do in our introductory economics courses - introducing markets (both their successes and failures), and teaching students how to think in terms of models - is extremely worthwhile.  Of course, this does not cover everything that we possibly would like to do in a course (or in a small set of courses).  Much of economics is concerned with the allocation of scarce resources, and the time that our students can spend on a course in a semester (both in and out of the classroom) is very limited, forcing some difficult choices on instructors.  Some of the criticisms made by the UK students seem to be about what we're leaving out, though I think what we're doing in our introductory courses is pretty important, and laying some groundwork in the economic way of thinking will help the students tackle issues like understanding the financial crisis, either in later classes or independently.  While many of the debates in the news are about macroeconomic policy (and as a macroeconomist, I'm happy to see the revival of interest in the topic, even if it arises from unfortunate sources), the core microeconomic concepts are very important and not to be skipped.  While it can be exciting to be teaching a subject that is relevant to contemporary events, we should not be seduced into bringing "news" into the classroom in a way that interferes with developing an understanding of the fundamentals.

There is sometimes a bit of a muddle in these navel-gazing discussions, too, between what should be in our undergraduate curriculum and the separate, but not wholly unrelated, issue of our research agenda and graduate curriculum.  I'm not entirely unsympathetic to the calls for "methodological pluralism," though I wouldn't go as far as the UK students would like.  I have argued for graduate study of the history of economic thought, and I have emphasized it in my undergraduate teaching, using it as an organizing principle for my intermediate macroeconomics class (and also making my intro students read some Smith, Hayek, Friedman and Keynes).  As a field, I do think macroeconomics is at a point where we should be open to reconsiderations of some of the standard tools (though I don't think that is ever not the case), and I worry that the "publish or perish" incentives we all face mean that we do too little of that.

Karl Whelan of University College Dublin has written a nice essay, "Teaching Economics 'After the Crash'," with a more detailed response to the UK students' criticisms; it is well worth reading.

Friday, November 28, 2014

Gray Matter and the US Current Account

After increasing steadily from the mid-1990s through the mid-2000s, the rate at which the US is borrowing from the rest of the world - our current account deficit - has come down considerably.
The current account deficit peaked at around 6% of GDP in 2006 and has hovered around 2.5-3% over the past several years.

During the period when it looked like the US current account deficit was growing inexorably, there was quite a bit of discussion of "global imbalances" and whether or not the US' borrowing was sustainable.

Despite years of borrowing, the US tends to earn a positive balance on income - i.e., the US receives more in payments on assets it owns abroad than it makes to foreign owners of US assets.  Ricardo Hausmann and Federico Sturzenegger argued that national accounts understate the true value of US foreign assets.  They called the gap between the accounting value and the true value "dark matter", which they largely attributed to the know-how exported (but not properly measured) with US FDI.  (I revisited this idea in a previous post).

Writing at Project Syndicate, Jeffrey Frankel offers another idea on why our balance of payments accounts may make the US deficit look worse than it really is:
Every year, US residents take some of what they earn in overseas investment income – interest on bonds, dividends on equities, and repatriated profits on direct investment – and reinvest it then and there. For example, corporations plow overseas profits back into their operations, often to avoid paying the high US corporate income tax implied by repatriating those earnings. Technically, this should be recorded as a bigger surplus on the investment-income account, matched by greater acquisition of assets overseas. Often it is counted correctly. But there is reason to think that this is not always the case...

...For example, US multinational corporations sometimes over-invoice import bills or under-report export earnings to reduce their tax obligations. Again, this would work to overstate the recorded current-account deficit.
While the efforts that US multinationals make to reduce their tax obligations are probably technically legal in most cases, they go against the spirit of US tax law, which taxes US corporations based on their global earnings (whether it should be this way is another matter).  Since it arises from a gray area in our tax code, perhaps we should call the resulting gap between our measured and true foreign assets "gray matter."

Update: Empirical evidence from Gabriel Zucman (QJE, 2013) that some of the "dark matter" is in tax havens...

Friday, November 7, 2014

October Employment

A good report today from the BLS on employment in October:  the unemployment rate fell to 5.8% (from 5.9%) and employers' payrolls rose by 218,000.
The payroll figure comes from a survey of firms, while the unemployment rate is based on a survey of households (which has a smaller sample than the employer survey).  The household survey figures look even better: the number of people employed rose by 683,000, and the number unemployed fell by 267,000.  The labor force (i.e., people who are working or looking for work) rose by 416,000, which put the labor force participation rate at 62.8%, an increase from last month's historic low of 62.7%. 
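The household survey numbers fit together by definition - the labor force is just the employed plus the unemployed, so the reported changes have to add up:

```python
# Household-survey arithmetic for October 2014, in thousands
# (the three figures are from the BLS release cited above).
d_employed = 683
d_unemployed = -267

# Labor force = employed + unemployed, so the change in the labor
# force is the sum of the two changes.
d_labor_force = d_employed + d_unemployed
print(d_labor_force)  # 416, matching the reported labor force increase
```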

The decline in labor force participation (which was at 66% in late 2007) has been one of the worrying trends of the past several years.  It partly reflects demographics, though, as the population ages and a growing share of it reaches retirement age.  Looking at the employment-population ratio for 25-54 year olds gives a picture of the labor market that takes out some of the guesswork in interpreting participation:
This ratio increased from 76.7 to 76.9 in October.  Overall, it shows some recovery over the past three years, but also gives an indication of why many Americans remain unhappy with the state of the economy - it is still less than halfway back from its low point to its pre-recession level.

Moreover, while employment is improving, wages are still growing slowly - the BLS reports that average hourly wages have increased 2% over the past year.  This suggests that there is still plenty of "slack" in the labor market.

The BLS' broader measure of un- and under-employment, 'U-6', which includes the "marginally attached" and people working part-time who want to be full-time, is at 11.5%, down from 11.8% last month (it peaked at 17.2% in April 2010).

Wednesday, September 17, 2014

Information Overload

We've arrived at the point in Econ 302 this semester where we're reading Paul David's "The Dynamo and the Computer: An Historical Perspective on the Modern Productivity Paradox," which means that I find myself again marveling at the prescience of this:
That's a pretty amazing thing to have written in 1989! (The paper was published in the May 1990 American Economic Review, which contains papers presented at the meetings in early January 1990.)

For a nice (and un-gated) summary of David's argument, see this Tim Harford piece in Slate.

Friday, September 5, 2014

August Employment

According to the BLS, employment rose by 142,000 in August and the unemployment rate ticked down to 6.1%.
That's consistent with the picture of a continuing, but painfully slow, recovery that has predominated over the past several years, though this particular report was a little on the disappointing side.

The employment figure comes from a survey of firms, while the unemployment rate is based on a survey of households, which has a smaller sample.  According to the household survey, 80,000 fewer people were unemployed, but only 16,000 more were working - the difference is accounted for by 64,000 departures from the labor force (i.e., adults who are working or looking for work).  Such decreases in labor force participation are not an encouraging sign.

However, labor force participation is a little bit difficult to interpret because demographic change (more people reaching retirement age, etc.) plays a role as well.  My preferred measure of the state of the labor market is the share of 25-54 year-olds who are working - this takes out the guesswork about demographics and participation.  This measure rose in August, to 76.8% (from 76.4%)
that's up from a low of 74.8% in November 2010, but still well below pre-recession levels.  Employment is continuing to crawl out of the hole we dug in 2008-09, but we're less than halfway there.  Any talk of returning to "normal" monetary policy seems premature to me - things may be getting slightly better, but the situation is still quite bad.
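To put a rough number on "less than halfway": using the low and current values above, and assuming a pre-recession level of roughly 80% (my approximation for illustration, not a figure from the BLS release):

```python
# Fraction of the decline in the 25-54 employment-population ratio
# that has been recovered. The low (74.8%) and current (76.8%) values
# are from the post; the pre-recession level of 80% is an assumption.
low, current, pre_recession = 74.8, 76.8, 80.0

fraction_recovered = (current - low) / (pre_recession - low)
print(round(fraction_recovered, 2))  # about 0.38 - less than halfway
```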

Friday, August 22, 2014

Europe in Depression

In a post back in 2012, when things were looking pretty hairy for the euro, I said it would be "a real human disaster if the euro cracked up in a crisis."  Two years later, fears of a calamitous exit by the "peripheral" Eurozone countries have eased (as evidenced by reduced bond yields).  The euro appears to have been saved - and it has been a real human disaster nonetheless.

At wonkblog, Matt O'Brien writes:
As I've said before, the euro is the gold standard with moral authority. And that last part is the problem. Europeans don't think the euro represents civilization, but rather the defense of it. It's a paper monument to peace and prosperity that's made the latter impossible. So the eurocrats who have spent their lives building it are never going to tear it down, despite the fact that, as it's currently constructed, the euro is standing between them and recovery.

Just like the 1930s, Europe is stuck with a fixed exchange system that doesn't let them print, spend, or devalue their way out of a crisis. But, unlike then, Europe might never give it up. It's a fidelity to failure that even the gold bloc couldn't have imagined.
Unemployment rates are above 20% in Spain and Greece, and above 10% in Portugal, Italy, Ireland, France, Cyprus, Slovakia and Slovenia:
Ambrose Evans-Pritchard spoke to several economics Nobel laureates:
An array of Nobel economists have launched a blistering attack on the eurozone's economic strategy, warning that contractionary policies risk years of depression and a fresh eruption of the debt crisis.
"Historians are going to tar and feather Europe's central bankers," said Professor Peter Diamond, the world's leading expert on unemployment. "Young people in Spain and Italy who hit the job market in this recession are going to be affected for decades. It is a terrible outcome, and it is surprising how little uproar there has been over policies that are so stunningly destructive," he told The Telegraph at a gathering of Nobel laureates at Lake Constance...
Professor Joseph Stiglitz said austerity policies had been a "disastrous failure" and are directly responsible for the failed recovery over the first half of this year, with Italy falling into a triple-dip recession, France registering zero growth and even Germany contracting in the second quarter.
"There is a risk of a depression lasting years, leaving even Japan's Lost Decade in the shade. The eurozone economy is 20pc below its trend growth rate," he said...
Professor Christopher Sims, a US expert on monetary policy, said EMU policy makers had not sorted out the basic design flaws in monetary union, and are driving Club Med nations into deeper trouble by imposing pro-cyclical austerity.
"If I were advising Greece, Portugal or even Spain, I would tell them to prepare contingency plans to leave the euro. There is no point being in EMU if all that happens when you are hit with a shock is that the shock gets worse," he said.
"It would be very costly to leave the euro, a form of default, but staying in the euro is also very costly for these countries. The Europeans have created a system that is worse than the Gold Standard. Countries are in the same position as Latin American states that borrowed in dollars," he said.
It may be a slightly hopeful sign that Francois Hollande is coming to a recognition of the problem, the Times' Liz Alderman reports:
After months of insisting that a recovery from Europe’s long debt crisis was at hand, President François Hollande on Wednesday delivered a far bleaker message. He indicated that the austerity policies France had been compelled to adopt to meet the eurozone’s budget deficit targets were making growth impossible.
Paris officials say that France — the eurozone’s second-largest economy after Germany — will no longer try to meet this year’s deficit-reduction targets, to avoid making economic matters worse. Even in abandoning those targets, they indicated that France was unlikely to recover soon from its long period of stagnation or quickly reduce its unemployment rate, which exceeds 10 percent.
“The diagnosis is clear,” Mr. Hollande said in an interview published Wednesday in the French daily Le Monde. “Due to the austerity policies of the last several years, there is a problem of demand throughout Europe, and a growth rate that is not reducing employment.”
To really make a difference, though, a more inflationary monetary policy is needed, and there is no sign of that on the horizon.

A euro breakup in 2010, 2011 or 2012 would have been disastrous for sure, but I'm beginning to wonder if it would have been worse than what we've actually seen.

Wednesday, July 23, 2014

DSGE Failing the Market Test?

The prevailing methodology of macroeconomic theory these days is "Dynamic Stochastic General Equilibrium" (DSGE) modelling.  Although many contemporary DSGE models, including the ones I'm working on, include "Keynesian" elements such as sticky prices, unemployment and financial frictions, they represent a methodological break with an older style of "Keynesian" models based on relationships among aggregate variables.  The shift in method followed from the work of Lucas and Sargent (most prominently among others) -- which John Cochrane summarized on his blog:
As I see it, the main characteristic of "equilibrium" models Lucas and Sargent inaugurated is that they put people, time, and economics into macro.

Keynesian models model aggregates. Consumption depends on income. Investment depends on interest rates. Labor supply and demand depend on wages. Money demand depends on income and interest rates. "Consumption" and "investment" and so forth are the fundamental objects to be modeled.

"Equilibrium" models (using Lucas and Sargent's word) model people and technology. People make simultaneous decisions across multiple goods, constrained by budget constraints -- if you consume more and save more, you must work more, or hold less money.  Firms  make decisions across multiple goods constrained by technology.

Putting people and their simultaneous decisions back to the center of the model generates Lucas and Sargent's main econometric conclusion -- Sims' "incredible" identifying restrictions. When people simultaneously decide consumption, saving, labor supply, then the variables describing each must spill over in to the other. There is no reason for leaving (say) wages out of the consumption equation. But the only thing distinguishing one equation from another is which variables get left out.

People make decisions thinking about the future. I think "static" vs. "intertemporal" are good words to use.  That observation goes back to Friedman: consumption depends on permanent income, including expected future income, not today's income. Decisions today are inevitably tied to expectations --rational or not -- about the future.
A Bloomberg View column by Noah Smith nicely summarizes the methodological shift, which gained momentum from the apparent breakdown of the Phillips curve relationship between inflation and unemployment in the 1970s.  Smith writes:
Lucas showed that trying to boost gross domestic product by raising inflation might be like the tail trying to wag the dog. To avoid that kind of mistake, he and his compatriots declared, macroeconomists needed to base their models on things that wouldn’t change when government policy changed -- things like technology, or consumer preferences. And so DSGE was born. (DSGE also gave macroeconomists a chance to use a lot of cool new math tricks, which probably increased its appeal.)

OK, history lesson over. So why is this important now?

Well, for one thing, the finance industry has ignored DSGE models. That could be a big mistake! Suppose you’re a macro investor. If all you want to do is make unconditional forecasts -- say, GDP next quarter – then you can go ahead and use an old-style SEM model, because you only care about correlation, not causation. But suppose you want to make a forecast of the effect of a government policy change -- for example, suppose you want to know how the Fed’s taper will affect growth. In that case, you need to understand causation -- you need to know whether quantitative easing is actually changing people’s behavior in a predictable way, and how.

This is what DSGE models are supposed to do. This is why academic macroeconomists use these models. So why doesn’t anyone in the finance industry use them? Maybe industry is just slow to catch on. But with so many billions upon billions of dollars on the line, and so many DSGE models to choose from, you would think someone at some big bank or macro hedge fund somewhere would be running a DSGE model. And yet after asking around pretty extensively, I can’t find anybody who is.
That's an interesting question -- when thinking about issues like this, I often come back to the divide between "science" and "engineering" put forward by Greg Mankiw.  While academic macroeconomics has gone down the path marked out by Lucas and Sargent, the policymaking "engineers" in Washington often still find the older-style models more useful.  It sounds like Wall Street's economists do too.

The question is whether academic macroeconomics is on track to produce models that are more useful for the policymakers and moneymakers. The DSGE method is still fairly new, and, until recently, we've been constrained by the limitations of our computers as well as our minds (a point Narayana Kocherlakota made here), so maybe we're just not quite there yet.  But we should be open to the possibility that we're on the wrong track entirely.

Saturday, July 5, 2014

Efficiency Wages

The New York Times has a story about several restaurants that have decided to pay above-market wages.  One of them is Shake Shack, which is starting employees at $9.50/hr:
“The No. 1 reason we pay our team well above the minimum wage is because we believe that if we take care of the team, they will take care of our customers,” said Randy Garutti, the chief executive of Shake Shack.
That, and other anecdotes in the article, are consistent with the "efficiency wage" theory, in which firms can induce more effort by paying a higher real wage.  This might arise if firms have a less than perfect ability to monitor individual employees' productivity - paying an above-market wage creates a stronger incentive not to "shirk".
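The shirking logic can be made concrete with a toy calculation - all the numbers here are hypothetical, chosen only to illustrate the mechanism, and this is a simplification of the formal models in the literature:

```python
# Toy efficiency-wage calculation: a worker compares the payoff from
# working (wage minus effort cost) with the expected payoff from
# shirking (no effort cost, but probability q of being caught, fired,
# and earning the outside wage). All numbers are hypothetical.
effort_cost = 2.0
outside_wage = 8.0
q = 0.5  # probability a shirker is detected and fired

def shirking_deterred(wage):
    work_payoff = wage - effort_cost
    shirk_payoff = (1 - q) * wage + q * outside_wage
    return work_payoff >= shirk_payoff

print(shirking_deterred(10.0))  # False: at this wage, shirking pays
print(shirking_deterred(13.0))  # True: a wage premium makes effort worthwhile
```

The point of the exercise: the harder shirkers are to catch (lower q), the larger the wage premium needed to deter shirking, which is one rationale for above-market pay in jobs where monitoring is difficult.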

For more, see this brief 1984 survey by Janet Yellen, who did some of her early academic work in this area.

Tuesday, July 1, 2014

Classroom Technology

Despite evidence that having computers in class is not good for students, Slate's Rebecca Schuman argues that professors should permit them anyway:
[P]olicing the (otherwise nondisruptive) behavior of students further infantilizes these 18-to-22-year-olds. Already these students are hand-held through so many steps in the academic process: I check homework; I give quizzes about the syllabus to make sure they’ve actually read it; I walk them, baby-steps style, through every miniscule stage of their essays. Some of these practices do indeed improve what contemporary pedagogy parlance calls “learning outcomes” (barf) because they show students how invested I am in their progress. But these practices also serve as giant, scholastic water wings for people who should really be swimming by now.

My colleagues and I joke sometimes that we teach “13th-graders,” but really, if I confiscate laptops at the door, am I not creating a 13th-grade classroom? Despite their bottle-rocket butt pranks and their 10-foot beer bongs, college students are old enough to vote and go to war. They should be old enough to decide for themselves whether they want to pay attention in class—and to face the consequences if they do not.
I'm sympathetic to the argument - I've never had an "attendance policy" for essentially the same reason - but what Schuman misses is that the use of laptops has a negative spillover effect (what economists call an "externality").  A student who is using a computer will distract not only herself but also the students around her - it is the harm to others, and to the classroom environment more generally, that justifies prohibiting computers in class.

Schuman goes on to argue that the real problem is lecture-format classes.  I don't think it's appropriate to generalize - the optimal format probably varies across subjects (and across students, too, which may be a more difficult problem).  I'm planning some pretty big changes to the way I teach my classes for the coming year that will significantly reduce the amount of lecturing I do.  I wouldn't be doing this if I didn't expect the benefits to outweigh the costs, but I suspect the virtues of the traditional lecture style may be under-appreciated these days.  In particular, the act of note-taking by hand is a valuable part of the learning process.  A recent NY Times story about the decline of handwriting instruction in schools discussed some evidence on that point:
Two psychologists, Pam A. Mueller of Princeton and Daniel M. Oppenheimer of the University of California, Los Angeles, have reported that in both laboratory settings and real-world classrooms, students learn better when they take notes by hand than when they type on a keyboard. Contrary to earlier studies attributing the difference to the distracting effects of computers, the new research suggests that writing by hand allows the student to process a lecture’s contents and reframe it — a process of reflection and manipulation that can lead to better understanding and memory encoding.
Although we should always be looking for ways to improve, and to take advantage of new technology where it can be helpful, sometimes "innovation" carries hidden costs, and we will make better choices if we try to understand what those might be and take them into account.

Thursday, June 26, 2014

Not Repeating All of Our Mistakes

With all the frustrations and mistakes of recent years, it's easy to miss the good economic policy news, but there is some --

At Wonkblog, Lydia DePillis reports on the lack of a turn towards protectionism on the part of high-income countries during the global slump of the last few years.  The evidence she cites suggests that developing countries have raised trade barriers, but in a fairly muted fashion.

That's a huge improvement over the 1930s, which saw widespread increases in trade barriers (including the US' infamous Smoot-Hawley tariff).  Though the increases in tariffs and other trade barriers did not cause the depression, most of us economists regard them as a counter-productive response.

The architecture of the GATT and WTO was developed in part to prevent making the same mistake again.  The rules do allow for temporary increases in tariffs through "antidumping," "safeguard" and "countervailing duty" measures, but there hasn't been a large increase in the use of these measures. DePillis writes:
So, why did the United States appear to be less aggressive about protecting itself in the face of the latest economic meltdown? It's learned from experience.

"We designed the current system in response to what happened in the 1930s," says Chad Bown, a World Bank economist who maintains the database of temporary trade barriers. For one thing, the United States is able to target products more specifically rather than entire sectors. "That helps blow off some political steam and not have overall increases in protection," Bown says.
Another important factor may be that now, unlike the 1930s, the world is largely operating under a (non) system of floating currencies. In Trade Policy Disaster, Doug Irwin argues (persuasively, I think) that the motivation for the increasing trade barriers was more "mercantilist" than "protectionist" - that governments were concerned with preventing trade deficits, which would have led to deflationary gold outflows under the gold standard.  

Today's economies aren't bound the same way.  The one exception is Europe, where the "peripheral" economies are among the hardest-hit in the world - they can't adjust through depreciation, and EU rules prevent Spain and Greece from raising tariffs.

Update: At VoxEU, Chad Bown discusses some findings from the Temporary Trade Barriers Database.

GDP in the Rear-View Mirror

appears smaller than it did before --  the BEA's "third estimate" put first-quarter real GDP growth at a -2.9% annual rate.  That's really bad, and a big revision from the "advance estimate" of 0.1% growth in April and the "second estimate" of -1% in May.
One of the things I emphasize to my students is the limitations of GDP statistics.  One of the difficulties in using them is that they are subject to substantial revisions that arrive with considerable lags.  Policymakers - and anyone else trying to judge the state of the economy - are looking at noisy, backward-looking data.

Here we are, just past the summer solstice, learning that GDP last winter (Jan. - Mar.) was dropping at its fastest rate since 2009 (the first quarter of 2011 is the only other quarter since the recession with declining GDP).  The rate of decline in the first quarter was worse than in either of the two negative-growth quarters of the 2001 recession.

Although some revision is normal, this one was unusually large - the change from the initial to the third estimate was the largest since the BEA began releasing estimates this way in the mid-1980s.

The prevailing theory on why the first quarter was so bad appears to be that it was mainly due to unusually severe weather; although the data are "seasonally adjusted" to account for the fact that some types of economic activity are normally lower in January and February, this winter may have been worse than most.

As Neil Irwin and CEA Chair Jason Furman both note, other indicators - like employment - looked ok during the same period.  Payroll growth averaged 190,000 during the first three months of the year.  That, as Justin Wolfers explains, means a large deviation from the historic relationship between unemployment and output growth known as "Okun's Law".  It also implies a big drop in productivity as we measure it.
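The deviation Wolfers points to can be illustrated with a back-of-the-envelope version of Okun's Law.  The trend growth rate and coefficient below are stylized illustrative values, not estimates:

```python
def okun_unemployment_change(gdp_growth, potential_growth=2.25, beta=0.5):
    """Stylized 'difference' version of Okun's Law: the change in the
    unemployment rate (percentage points, annual rate) is roughly -beta
    times the gap between actual and trend GDP growth (annual rates, in
    percent).  The 2.25% trend and 0.5 coefficient are illustrative."""
    return -beta * (gdp_growth - potential_growth)

# At the first quarter's -2.9% growth rate, this rule of thumb would
# predict unemployment rising at roughly a 2.6-point annual rate...
predicted = okun_unemployment_change(-2.9)
# ...yet payrolls grew by about 190,000 per month over the same period,
# consistent with falling unemployment - a large deviation from the
# usual relationship, and an implied drop in measured productivity.
```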

Monday, June 16, 2014

Hawks, Doves and (Wesleyan) Cardinals

The Federal Reserve Board welcomed a Wesleyan alum today: Lael Brainard '83 was sworn in (she's second from right, along with Jerome Powell, Janet Yellen and Stanley Fischer).
Brainard previously served as Undersecretary of the Treasury for International Affairs; the NY Times' Annie Lowrey wrote a brief profile of Brainard when she stepped down from that post last year.

Thursday, June 12, 2014

Lucas on Keynes

Robert Lucas appears in the history of macroeconomics as the leader of a methodological "revolution" which supplanted the approach, predominant in the postwar era, of aggregate macro models based on Keynes' ideas, with general equilibrium models grounded in optimizing behavior.  In his keynote address to the 2003 HOPE (History of Political Economy) conference, Lucas reminisced about his training in Keynesian economics as a grad student at Chicago (yes, at Chicago) in the 1960's.  He closes with an appreciative note on what he thinks Keynes was trying to accomplish:
I think that in writing the General Theory, Keynes was viewing himself as a spokesman for a discredited profession. That’s why he doesn’t cite anyone but crazies like Hobson. He knows about Wicksell and all the “classics,” but he is at pains to disassociate his views from theirs, to overemphasize the differences. He’s writing in a situation where people are ready to throw in the towel on capitalism and liberal democracy and go with fascism or corporatism, protectionism, socialist planning. Keynes’s first objective is to say, “Look, there’s got to be a way to respond to depressions that’s consistent with capitalist democracy.” What he hits on is that the government should take some new responsibilities, but the responsibilities are for stabilizing overall spending flows. You don’t have to plan the economy in detail in order to meet this objective. And in that sense, I think for everybody in the postwar period—I’m talking about Keynesians and monetarists both—that’s the agreed-upon view: We should stabilize spending flows, and the question is really one of the details about how best to do it. Friedman’s approach involved slightly less government involvement than a Keynesian approach, but I say slightly.

So I think this was a great political achievement. It gave us a lasting image of what we need economists for. I’ve been talking about the internal mainstream of economics, that’s what we researchers live on, but as a group we have to earn our living by helping people diagnose situations that arise and helping them understand what is going on and what we can do about it. That was Keynes’s whole life. He was a political activist from beginning to end. What he was concerned about when he wrote the General Theory was convincing people that there was a way to deal with the Depression that was forceful and effective but didn’t involve scrapping the capitalist system. Maybe we could have done it without him, but I’m glad we didn’t have to try.
This is consistent with the "Neoclassical Synthesis" view that Keynes himself presaged in chapter 24 of the General Theory, which Brad DeLong discussed on his WCEG blog yesterday and Krugman commented on today (the quote from Lucas makes me wonder if perhaps Lucas and Krugman aren't quite as far apart as they think after all?).

Keynes' Over-worked Grandchildren?

In a New Yorker book review essay, Elizabeth Kolbert revisits one of my favorites, "Economic Possibilities for Our Grandchildren" by John Maynard Keynes:
Keynes delivered an early version of “Economic Possibilities” as a lecture at a boys’ school in Hampshire. He was still at work revising and refining the essay when, in the fall of 1929, the stock market crashed. Some might have taken this as a bad sign; Keynes was undeterred. Though he quickly recognized the gravity of the situation—the crash, he wrote in early 1930, had produced a “slump which will take its place in history amongst the most acute ever experienced”—over the long run this would prove to be just a minor interruption in a much larger, more munificent trend. In the final version of “Economic Possibilities,” published in 1931, Keynes urged readers to look beyond this “temporary phase of maladjustment” and into the rosy beyond.

According to Keynes, the nineteenth century had unleashed such a torrent of technological innovation—“electricity, petrol, steel, rubber, cotton, the chemical industries, automatic machinery and the methods of mass production”—that further growth was inevitable. The size of the global economy, he forecast, would increase sevenfold in the following century, and this, in concert with ever greater “technical improvements,” would usher in the fifteen-hour week.

To Keynes, the coming age of abundance, while welcome, would pose a new and in some ways even bigger challenge. With so little need for labor, people would have to figure out what to do with themselves: “For the first time since his creation man will be faced with his real, his permanent problem—how to use his freedom from pressing economic cares, how to occupy the leisure, which science and compound interest will have won.”
As Kolbert notes, Keynes' predictions about growth were pretty well on-target. He wrote: "I would predict that the standard of life in progressive countries one hundred years hence will be between four and eight times as high as it is to-day."  That implies an annual growth rate between 1.39% and 2.08%.  According to Maddison project data, UK real GDP per capita rose 4.37 fold between 1930 (when Keynes wrote) and 2010, which gives an annual growth rate of 1.84%.
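The growth rates implied by those cumulative factors are easy to check.  A small sketch, using continuous compounding (which matches the figures above):

```python
import math

def annual_rate(cumulative_factor, years):
    """Continuously compounded annual growth rate (in percent) implied
    by a cumulative growth factor over a given number of years."""
    return 100 * math.log(cumulative_factor) / years

low    = annual_rate(4, 100)    # Keynes' lower bound: ~1.39% per year
high   = annual_rate(8, 100)    # Keynes' upper bound: ~2.08% per year
actual = annual_rate(4.37, 80)  # UK, 1930-2010 (Maddison): ~1.84% per year
```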

While Keynes was correct about growth, his prediction about leisure has not come true.  At least not fully - hours worked have fallen, though much more in Europe than in the US -
but even Western Europe is far short of "three hour shifts or a fifteen-hour week." 

That we're still working so much calls into question how we think about work, leisure and preferences.

Among the explanations are:"conspicuous busyness" - i.e., that appearing overworked is a signal; Paul Krugman discussed this on his blog a while back:
First of all, [James Surowiecki is] right that for what he calls knowledge workers — I’d just say elite workers in general — the whole time ethos has changed. When I was growing up on Long Island, there was a clear class hierarchy on commute times. Early trains were filled with menial workers; the later the train the more and fancier suits, with executives starting their day at 9:30 or 10. These days it is if anything reversed: lots of hard-driving suits on the early trains, much more mixed later on.
So what is this about? Surowiecki emphasizes the incentives of employers, and their difficulty in taking the negative effects on productivity into account. My sense, however, is that the most important factor — which he alludes to but doesn’t put at the center — is signaling. Working insane hours is a sign of commitment, of willingness to sacrifice for the job; the personal destructiveness of the practice isn’t a bug, it’s a feature.
This may be true in parenting, as well - as Kolbert writes (referring to "Overwhelmed" by Brigid Schulte): 
One theory she entertains early on is that busyness has acquired social status. The busier you are the more important you seem; thus, people compete to be—or, at least, to appear to be—harried. A researcher she consults at the University of North Dakota, Ann Burnett, has collected five decades’ worth of holiday letters and found that they’ve come to dwell less and less on the blessings of the season and more and more on how jam-packed the previous year has been. Based on this archive, Burnett has concluded that keeping up with the Joneses now means trying to outschedule them. (In one recent letter, a mother boasts of schlepping her kids to so many activities that she drives “a hundred miles a day.”) “There’s a real ‘busier than thou’ attitude,” Burnett says. 
Another hypothesis is that people derive satisfaction and a sense of identity from work.  Kolbert quotes from "Revisiting Keynes" - 
A third group of economists challenges the Keynesian presumption that leisure is preferable to labor. Work may not set us free, but it lends meaning to our days, and without it we’d be lost. In the view of Edward Phelps, of Columbia University, a career provides “most, if not all, of the attainable self-realization in modern societies.” Richard Freeman, of Harvard, is, if possible, more emphatic. “Hard work is the only way forward,” he writes. “There is so much to learn and produce and improve that we should not spend more than a dribble of time living as if we were in Eden. Grandchildren, keep trucking.”  
Phelps and Freeman are correct that our standard treatment of work (bad) versus leisure (good) often misses something important.  This was present in the early "romantic" Marx, who said, "man is a tool-using animal".  The relevance varies a great deal, I suspect - some of us have the good fortune not to feel "alienated" from our labor, though, for many, work is the drudgery that standard economic theory assumes it to be. 

A third explanation relates to the fact that "quality" is a relative concept and the desire for ever-higher quality goods keeps the consumption motive from slackening - this was explained well by Robert Frank in an NYT column.

Friday, May 23, 2014

Accounting and the Liberal Arts

I've had mixed feelings about offering accounting classes in a liberal arts college economics department.  Liberal arts colleges don't typically offer accounting majors or have accounting departments, but an accounting course or two might be on the books in the economics department.

The reason for my reservations is that I worry it reinforces a misperception that studying economics is more "vocational" or "practical" than other parts of the liberal arts curriculum - i.e., that an economics major is somehow a proxy for a business degree.

However, there is a good case for liberal arts students learning about accounting - financial statements are an important source of information in our society, so being able to interpret them is valuable for anyone who might want to (critically) examine the activities of businesses or the government.

This NY Times opinionator piece by Jacob Soll nicely argues for the value of accounting for an informed citizenry:
The German economic thinker Max Weber believed that for capitalism to work, average people needed to know how to do double-entry bookkeeping. This is not simply because this type of accounting makes it possible to calculate profit and capital by balancing debits and credits in parallel columns; it is also because good books are “balanced” in a moral sense. They are the very source of accountability, a word that in fact derives its origin from the word “accounting.”

A Definition of Business Cycles

Our traditional term for macroeconomic fluctuations - "business cycles" - doesn't really represent well how we think about them now.  Robert Solow provides a good definition:
“the business cycle” has become shorthand for the series of irregular, short-run, aggregative fluctuations of varying duration, magnitude, and probably causation that we call prosperity and recession.
That's from Solow's delightful 2007 review of Thomas McCraw's biography of Joseph Schumpeter, which I had the good fortune to stumble upon.

The NBER's somewhat mushy "official" definition is here.

Friday, May 2, 2014

Cassidy on Keynes and Reagan

Econ 302 midterm question 2(a):
The fiscal policies enacted by the Reagan administration included significant cuts in taxes and increases in (military) spending. Illustrate the effects of this fiscal policy using an IS-LM diagram. 
While my students were asked to work out the results in (Keynesian) theory, the data are consistent with its prediction:
The red line (right-hand scale) is GDP growth, which is negative in 1982, but strongly positive in 1983 and 84 ('Morning in America'), and the blue line is the federal deficit as a percentage of potential GDP, which shows the effect of Reagan's fiscal policy.
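The mechanics can be worked out in a simple linear IS-LM model.  The parameter values below are made up for illustration (not calibrated to the US economy); the qualitative result - a fiscal expansion raising both output and interest rates - is what the theory predicts:

```python
def is_lm(G, T, c0=200, c1=0.75, I0=200, b=25, M=400, k=0.5, h=50):
    """Solve a linear IS-LM model for output Y and the interest rate r.
    IS:  Y = c0 + c1*(Y - T) + I0 - b*r + G
    LM:  M = k*Y - h*r   (real money supply M, price level normalized)
    All parameter values are illustrative."""
    # Substitute r = (k*Y - M)/h from LM into IS and solve for Y:
    Y = (c0 - c1 * T + I0 + G + b * M / h) / (1 - c1 + b * k / h)
    r = (k * Y - M) / h
    return Y, r

Y0, r0 = is_lm(G=200, T=200)   # baseline
Y1, r1 = is_lm(G=250, T=150)   # Reagan-style: spending up, taxes cut
# The fiscal expansion shifts the IS curve right along an unchanged LM
# curve: output rises, and so does the interest rate.
assert Y1 > Y0 and r1 > r0
```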

Apropos of this, John Cassidy has a nice post arguing that Reagan was a closet Keynesian:
In strict terms, Reagan’s neglect of the deficit wasn’t Keynesian. Keynes himself believed in letting the deficit rise in a recession and paying down debts in the good times. In America, though, Keynesianism has always been associated with stimulus programs, big government, and deprioritizing the deficit. In all of these ways, Reagan was a Keynesian. But a word to the wise: don’t waste your time trying to tell that to anybody in the Republican Party.

GDP and Employment: Mixed Messages

This week's Employment and GDP releases sent some very mixed - even more than usual - signals about the economy.

The good: Employment rose by 288,000 in April which is above the average of 194,000 since January 2012, and the February and March employment growth estimates were revised upward slightly.
The bad: The unemployment rate fell from 6.7% to 6.3%, but only because the labor force shrank by 806,000, which puts the labor force participation rate at 62.8%, down 0.4 percentage points from March.  (Unemployment is measured as a fraction of the "labor force," which includes people who are working or say they are looking for work, so a smaller labor force means fewer people were looking.)  The unemployment rate is calculated from a survey of households, which has a smaller sample than the establishment survey that generates the payroll employment figure; in the household survey, the number of people employed fell by 73,000.  The decline in the labor force was attributed to fewer entrants rather than people exiting, as reported by Annalyn Kurtz of CNN:
"The drop in participation is not due to discouraged workers leaving the labor force," a Department of Labor spokesperson noted, "it's due to re-entrants and new entrants who we expected to see flowing into the labor force, and who didn't this month." 
The ugly: Wednesday's advance estimate from the BEA put real GDP growth at a 0.1% annual rate for the first quarter.  Consumption grew at a 3 percent rate; the biggest drags were investment, which fell at a 6.1% rate (mainly due to declines in equipment and housing investment, as well as a decrease in inventories, which counts as negative investment), and exports, which declined at a 7.6% pace.
Furthermore, yesterday, Ylan Mui of the Washington Post reported that there are some reasons to expect a downward revision:
[T]he Census Bureau released new data on construction spending that were weaker than not only the consensus forecast  but also the government’s estimates in its calculations of the nation’s gross domestic product. Ben Herzon of Macroeconomic Advisers said core construction -- which doesn’t include residential improvements and federal spending -- was soft in March, while the numbers for the first two months of the year were revised lower.

According to Macroeconomic Advisers’ analysis, that means instead of the 0.2 percent boost in private nonresidential construction spending assumed in the GDP calculation, there was likely a 5.7 percent decline. Ouch.

In addition, new data show retail sales were also slightly softer than expected, translating into a 2.9 percent increase in consumer spending instead of a 3 percent rise, Herzon said.
Overall, the mixed signals highlight one of the reasons why "fine tuning" macroeconomic policy is difficult (at best): the "recognition lag" in identifying changes in the state of the economy.  Economic statistics are backward-looking, based on surveys (which means there's some statistical "noise" - there's a great illustration of this at The Upshot), and subject to substantial revision.  Right now, the GDP figures look like a recession warning sign, but the employment numbers are consistent with the (slow) recovery continuing on.  Of the data we have so far, the payroll employment figures are probably the most reliable.  Average growth was 190,000 jobs over the first three months of the year, so despite the construction and retail sales data noted above, I'd expect the GDP figure ultimately to be revised upwards. We'll get the "second estimate" on May 29.

Update (5/4): According to Danielle Kurtzleben of Vox, Labor Secretary Perez has a theory about why labor force entry was low:
"What we tend to see, and this is my operating hypothesis of what's going on, this time of year we traditionally expect to see certain types of people flowing into the workforce, and those people are seasonal workers," he says.

The people seeking out that seasonal work start to ramp up their searches later in the month of April, he says. However, the survey week in which the government asks US households whether people are working fell as early as it possibly could have last month.

That's because the household survey week is the calendar week in which the 12th of the month falls. But the 12th landed on a Saturday, meaning households were surveyed from April 6-12.

"The people we would traditionally expect to see flowing into the workforce at this time have not yet entered the workforce," he says.

Wednesday, April 23, 2014

Sterling Crisis!

The British Pathe newsreel archive is now on YouTube.  This is how they chronicled Britain's 1949 devaluation:

While floating exchange rates have day-to-day volatility, they don't provide the same drama as fixed exchange rates, which are prone to crises that are both political and economic.

The postwar Bretton Woods system fixed exchange rates against the dollar (and pegged the dollar to gold), but countries were allowed to make discrete changes in their parities.  Britain availed itself of this in 1949 and 1967.
The Bretton Woods system ended in 1971 when the US announced that it would no longer redeem dollars for gold.

Sterling faced speculative pressure in the mid-1960's - "in a world of international finance, there is no sentiment; nations, whose currency is ailing must expect from the foreign dealers a kick in the moneybags"-

Which ultimately led to the 1967 devaluation:

Sterling crises were not unique to the Bretton Woods era.  Britain had left the gold standard in 1931 (after a very painful experience re-joining it in 1925).  In 1976, a plummeting pound led Britain to turn to the IMF for a loan, and, on "Black Wednesday" in 1992,  it was forced out of the European Monetary System.

Saturday, April 19, 2014

Been Down So Long (HP Filter Edition)

it looks above-trend to me...

I'm teaching my advanced students about "Real Business Cycles" this week, and, as part of the setup I'm introducing them to the Hodrick-Prescott (HP) filter, a widely used method of separating the "cyclical" from the "trend" component of time series data, like US GDP.  I recently updated my example - the red line is US real GDP, which trends upward with occasional interruptions, and the blue line is the trend, according to the HP filter:
When we study "business cycles" we're studying the deviations of real GDP (red) from its trend (blue) path (the trend is the subject of economic growth theory).

The striking thing is at the end, where the filter shows GDP above its trend path.  This is clearer if we pull out the deviations:
These are the business cycles captured by this method, and you can see at the end, the distance from trend is positive.

So, we're "above trend"?  Economy's not so bad after all?!

Well, not really... the way the HP filter works is that it chooses a trend that minimizes the distance between the trend and the underlying data, subject to a constraint limiting changes in the growth rate of the trend, which is what forces it to be smooth.  The US economy's slump was deep and long enough to pull the trend down to the point where we're now a little above it.

Here is the growth rate of the trend:
You can see how much, according to the filter, the last several years pulled down the trend path.

I'm not sure whether this says more about the economy or the de-trending method (there's a whole literature on technical issues in de-trending...).  But it does remind us that we need to be careful in how we use our tools and interpret their results (i.e., we should actually look at the graphs).
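For anyone who wants to experiment with the tool, the filter described above is straightforward to implement directly.  This is a plain-numpy sketch of the standard penalized-least-squares formulation (statistical packages also have built-in versions):

```python
import numpy as np

def hp_filter(y, lamb=1600.0):
    """Hodrick-Prescott filter: decompose a series into trend + cycle.
    The trend tau minimizes sum((y - tau)^2) + lamb * sum((2nd diff of tau)^2);
    the first-order conditions give tau = (I + lamb * F'F)^{-1} y, where F is
    the (T-2) x T second-difference operator.  lamb=1600 is the conventional
    smoothing parameter for quarterly data."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    F = np.zeros((T - 2, T))
    for t in range(T - 2):
        F[t, t:t + 3] = [1.0, -2.0, 1.0]   # tau[t] - 2*tau[t+1] + tau[t+2]
    trend = np.linalg.solve(np.eye(T) + lamb * (F.T @ F), y)
    cycle = y - trend
    return trend, cycle
```

The smoothness penalty lamb is what makes the trend bend only slowly, so a deep, persistent slump drags the estimated trend itself down - which is exactly how the filtered data can end up "above trend."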

A somewhat different picture, which is more consistent with what most of us think is the state of "the economy" is given by comparing real GDP to the CBO's estimate of "potential output":
That is, we still have a long way to go to close our "output gap".
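The output gap in that comparison is just the percentage shortfall from potential.  A minimal sketch, with hypothetical round numbers (not the actual BEA/CBO figures):

```python
def output_gap(actual_gdp, potential_gdp):
    """Output gap as a percent of potential GDP: negative means the
    economy is producing below its estimated capacity."""
    return 100 * (actual_gdp - potential_gdp) / potential_gdp

# Illustrative figures, in trillions of dollars: actual GDP of 15.9
# against estimated potential of 16.6 gives a gap of about -4.2%.
gap = output_gap(15.9, 16.6)
```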

Friday, April 4, 2014

March Employment

The BLS released the employment figures for March today - a fairly good report overall, consistent with the trend that has predominated over the past several years of an economy that is recovering, but far too slowly.

Nonfarm payrolls - the jobs number from the survey of firms - rose by 192,000.  The unemployment rate remained at 6.7%, but in the survey of households which is used to calculate it, employment rose by 476,000.  The reason the unemployment rate didn't fall is that the labor force - i.e., the number of people working or looking for work - increased by 503,000.  The decline in labor force participation has been one of the most troubling trends of the past several years, so it is good to see it rising again - in March, it rose 0.2 percentage points to 63.2%.
 The employment-population ratio for people 25-54 also rose by 0.2 percentage points, to 76.7%.
Looking at this statistic provides a rough way to control for the effects of the changing age distribution of the population.  While it has shown improvement recently, it's still considerably below its all-time high of 81.9%, reached in April 2000.

Overall, while the report was somewhat encouraging, 10.5 million people remain unemployed.  The broader measure of un- and under-employment, 'U-6', which counts discouraged workers and part-time workers who would prefer to be full time, is at 12.7%.

On a non-seasonally adjusted basis, the unemployment rate was 7.2%, down from 7.5% in February, and payrolls rose by 941,000 (i.e., March normally sees a big employment gain, which the seasonal adjustment removes).

Monday, March 10, 2014

TFP: Three Episodes or Four?

Zachary Goldfarb reports that this was CEA Chair Jason Furman's favorite chart from the latest Economic Report of the President:
That shows total factor productivity growth (TFP, though some call it "multifactor") which is a measure of technological progress calculated as a residual: the part of output growth that cannot be explained by increases in factors of production (capital and labor).  Another way to think of it is that it is the growth that would occur even if the amount of machinery and equipment as well as labor stayed the same.  The CEA's report explains it quite nicely starting on p. 181.
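The residual calculation can be sketched in a few lines of growth accounting.  The 0.3 capital share below is a conventional illustrative value, not the BLS's estimated cost shares:

```python
def solow_residual(output_growth, capital_growth, labor_growth, alpha=0.3):
    """TFP growth as a residual: output growth minus the share-weighted
    growth of capital and labor inputs (Cobb-Douglas production function
    with capital share alpha).  Growth rates in percent; alpha=0.3 is a
    conventional illustrative value."""
    return output_growth - alpha * capital_growth - (1 - alpha) * labor_growth

# E.g., 3% output growth with 3% capital growth and 1% labor growth
# leaves a residual of 1.4% attributed to technological progress:
tfp = solow_residual(3.0, 3.0, 1.0)
```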

Technological progress is the key determinant of long-run living standards, so if the trend of technological progress has risen after its slump in the 1970's and 80's, it's a big deal (and a bigger deal than many of the short-run cyclical issues we tend to obsess over).

The CEA's chart shows 15-year averages, which smooth out the year-to-year volatility in TFP growth.  This is sensible because otherwise it is hard to discern the long-run trends that the concept is meant to help us understand.  The average TFP growth rate can be split into three eras:
  • 1949-1973 -- 1.9%
  • 1974-1995 -- 0.4%
  • 1996-2012 -- 1.1%
That is, productivity growth has risen about halfway back to its "postwar golden age" level since the mid-1990's.  While identifying breaks in a series like this can be tricky, there does seem to be consensus that there was a slowdown in the mid-1970's and a resurgence in the mid-1990's.

However, the results are somewhat different than what I presented to my macroeconomics students a few weeks ago.  This is a chart made from the BLS' Historical Multifactor Productivity Measures for the private non-farm business sector (which I believe is the same data the CEA used).
The gray line is the actual annual growth (you can see why it helps to average out some of the volatility) and the red line is the centered 15-year average (i.e., the same as the CEA's graph).  However, the CEA's method of averaging means that their graph stops in 2005. The years since then have not been good ones for TFP.  Since their data point for 2005 is an average over the years 1998-2012, the CEA is not ignoring the bad news, but they are lumping it in together with some good years (the late 1990's and early 2000's).
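The truncation is mechanical: a centered 15-year average of annual data through 2012 simply cannot be computed past 2005.  A quick sketch (the random series is a stand-in for the TFP data, not the BLS figures):

```python
import numpy as np

def centered_average(x, window=15):
    """Centered moving average: the value at year t averages the window
    of years surrounding t, so the smoothed series loses (window - 1)/2
    observations at each end.  With annual data for 1949-2012 and a
    15-year window, the last computable point is centered on 2005."""
    weights = np.ones(window) / window
    return np.convolve(np.asarray(x, dtype=float), weights, mode="valid")

growth = np.random.default_rng(0).normal(1.0, 1.5, size=64)  # stand-in, 1949-2012
smoothed = centered_average(growth)
assert len(smoothed) == len(growth) - 14   # centers run 1956-2005, not to 2012
```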

The dashed lines in the chart above illustrate an alternative, less optimistic way of interpreting the same data by breaking it down into four eras instead of three:
  • 1949-1973 -- 1.9%
  • 1974-1995 -- 0.4%
  • 1996-2005 -- 1.6%
  • 2006-2012 -- 0.5% 
This would be consistent with a brief 'boom', perhaps attributable to information technology and the internet, in 1996-2005, but one that is already exhausted.  That is the view that Robert Gordon took in this NBER working paper (and recent update).

Identification of trends in short periods of volatile data is inherently uncertain, and it may be sensible to think the reduction in TFP growth over the past several years is largely an artifact of cyclical factors.  That seems to be, implicitly, the CEA's view (and Ben Bernanke also has argued for a more optimistic interpretation of long-run prospects).  Whether they or Gordon are right will make a huge difference for standards of living a generation or two hence, but, just now, it is too soon to tell.

Friday, March 7, 2014

February Employment

A mildly encouraging employment report from the BLS today: in February, payrolls increased by 175,000.  The unemployment rate ticked up to 6.7% (from 6.6%), but that was partly due to a slight increase in labor force participation. 

The payroll number comes from a survey of firms and the unemployment rate from a survey of households (which has a smaller sample size); in the household survey, the number of employed people only grew by 40,000, while the labor force grew by 264,000.  Since the surveys are separate, they do not match up every month - last month the number of people reporting they were employed in the household survey was much stronger than the increase in payrolls.
The labor force includes everyone who is working, or looking for work.  In general, the decline in labor force participation (the percent of adults in the labor force) has been one of the more worrying trends of the last several years - in February, it stood at 63.0%, down from 66.3% seven years ago.  Since the unemployment rate is measured as a percentage of the labor force, to the extent that people are giving up on finding a job and leaving the labor force, the fall in the unemployment rate might make the labor market look better than it really is.  However, some decline might be expected due to demographic trends (i.e., retirement of the "baby boom" generation).  To try to set aside those effects, the employment-population ratio for 25-54 year olds can be useful:
This shows some recovery, but still a pretty deep hole relative to where we were before the recession.
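The arithmetic behind that caveat is simple.  With made-up numbers (in thousands, not the actual BLS figures), the measured rate can fall just because job-seekers stop being counted:

```python
def unemployment_rate(employed, unemployed):
    """U-3 unemployment rate: the unemployed as a percent of the labor
    force (employed plus those actively looking).  People who stop
    looking drop out of both the numerator and the denominator."""
    return 100 * unemployed / (employed + unemployed)

# Hypothetical figures, in thousands:
before = unemployment_rate(employed=145_000, unemployed=10_500)  # ~6.8%
# Suppose 800,000 unemployed workers give up the search and leave the
# labor force; employment is unchanged, yet the measured rate falls:
after = unemployment_rate(employed=145_000, unemployed=9_700)    # ~6.3%
assert after < before
```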

'U-6', the broader measure of the unemployment rate which includes discouraged workers (i.e., people who are not looking for work but say they would like to have a job) and people working part-time but who would prefer to be full-time is at 12.6%, down from 14.3% a year ago.

10.5 million people are unemployed, and 3.8 million of them have been unemployed for longer than 27 weeks.

The data are all seasonally adjusted; on a non-seasonally adjusted basis the unemployment rate was 7.0% and payrolls increased by 750,000 (i.e., unemployment in February is normally high, so it is adjusted down, and payrolls normally grow in February, so payroll growth is also adjusted down to remove the "seasonal" effect).

Monday, February 24, 2014

FOMC 2008

On Friday, the Fed released the transcripts of the Federal Open Market Committee meetings from 2008, the year when the financial crisis intensified and the economy collapsed.

The striking thing is how little sense the committee members had of how bad things were getting.  Even late in the year, the committee was seriously concerned about inflation.  The Times' Binyamin Appelbaum writes:
The Fed’s understanding of the crisis, however, was clouded by its reliance on indicators that tend to miss sharp changes in conditions. The government initially estimated, for example, that the economy expanded in the first half of 2008 because it basically assumed that some economic trends, like the pace of business creation, had continued apace. The Fed also relied on economic models that assumed the existence of smoothly functioning financial markets, limiting its ability to project the consequences of a breakdown. And the outlook of Fed officials also reflected a deeply ingrained bias to worry more about the risk of inflation than the reality of rising unemployment.

As Fed officials gathered on Sept. 16 at their marble headquarters in Washington for a previously scheduled meeting, stock markets were in free fall. Housing prices had been collapsing for two years, and unemployment was climbing.

Yet most officials did not see clear evidence of a broad crisis. They expected the economy to grow slowly in 2008 and then more quickly in 2009.
The Times also put together a fantastic interactive graphic linking quotes from the meetings to the events of the year.

A couple of things stood out to me in looking over the transcript from September 16 (the day after the Lehman bankruptcy), when the committee voted to hold the fed funds rate target at 2 percent:

The committee member with the best perception of how bad things were getting was Eric Rosengren, President of the Boston Fed, who argued for a rate cut: 
This is already a historic week, and the week has just begun. The labor market is weak and getting weaker. The unemployment rate has risen 1.1 percentage points since April and is likely to rise further. I am not convinced that the unemployment rate will level off where the Greenbook is assuming currently.

The failure of a major investment bank, the forced merger of another, the largest thrift and insurer teetering, and the failure of Freddie and Fannie are likely to have a significant impact on the real economy. Individuals and firms will become risk averse, with reluctance to consume or to invest. Even if firms were inclined to invest, credit spreads are rising, and the cost and availability of financing is becoming more difficult. Many securitization vehicles are frozen. The degree of financial distress has risen markedly. Deleveraging is likely to occur with a vengeance as firms seek to survive this period of significant upheaval. Given that many borrowers will face higher interest rates as a result of financial problems, we can help offset this additional drag by reducing the federal funds rate.
I think those of us who reside in District One can be proud of our Fed president. District Eleven (Dallas), on the other hand, well.... Richard Fisher:
That said, in my anecdotal interchanges, I am still hearing about the likelihood, as I think President Pianalto mentioned, that people are seeking to preserve their margins. They’ve been stung for many years, and I’ll just give you one case because I think it tells us something. If you talk to the CEO of Wal-Mart USA, what they are pricing to be on their shelf six to eight months from now has an average price increase of 10 percent. Now, of course, you might have this reversed as we go through time. My biggest disappointment, incidentally, was that the one bakery that I’ve gone to for thirty years, Stein’s Bakery in Dallas, Texas, the best maker of not only bagels but also anything that has Crisco in it, [laughter] has just announced a price increase due to cost pressures.
Well, there's a pretty good case that the trend of academic economists supplanting bankers and businesspeople on the FOMC has been a good thing.  To be fair to Fisher, though, part of the reason for the use of anecdotal evidence is that the data do not give the Fed a clear, real-time picture of the state (and direction) of the economy.  The chart below, from ALFRED, compares the GDP data released shortly after that meeting (blue line) with the most recent vintage of data (i.e., what we know now, in red):
It is easy, with the benefit of hindsight, to criticize the committee members whose worries over inflation and optimism about the impact of the financial crisis look so foolish today (and this applies to some of the academic members, not just Fisher).  But looking at the data they had at the time underscores the fact that their task isn't so easy.

Sunday, February 23, 2014

Fighting the last Methodenstreit

That's a German word meaning "dispute over methods," and it came to mind reading Simon Wren-Lewis' post last week, "Are New Keynesian DSGE Models a Faustian Bargain?"

The reason they might seem so is that the methodological underpinnings of DSGE (Dynamic Stochastic General Equilibrium) models - "micro-founded" macroeconomic models derived from the optimizing behavior of individuals (or, often, a "representative agent") - were brought into macroeconomics by Robert Lucas, Ed Prescott and others who were seeking to overturn "Keynesian" macroeconomics (see, e.g., Lucas and Sargent, 1979, "After Keynesian Macroeconomics").

The first generation of models of this type - "Real Business Cycle" (RBC) models, where "real" means non-monetary - implied that economic fluctuations could be optimal, and that monetary and fiscal policy were either useless or harmful (this JEP article by Charles Plosser is a good primer).
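To give a flavor of what "solving" such a model involves (this toy example is mine, not from the post, and strips out the "stochastic" part entirely): consider a planner maximizing discounted log utility of consumption subject to k' = A*k^alpha - c, with full depreciation. That special case has a known closed-form decision rule, k' = alpha*beta*A*k^alpha, which makes it a handy check on the standard value-function-iteration approach:

```python
import numpy as np

# Toy deterministic growth model (illustrative only):
#   max sum_t beta^t * log(c_t)   s.t.   k_{t+1} = A * k_t**alpha - c_t
# With log utility and full depreciation the optimal policy is known in
# closed form, k' = alpha*beta*A*k**alpha, so we can verify the numerical
# solution against it.
alpha, beta, A = 0.36, 0.96, 1.0
kss = (alpha * beta * A) ** (1 / (1 - alpha))   # long-run capital stock
kgrid = np.linspace(0.1 * kss, 2.0 * kss, 400)

# Utility of every (k today, k' tomorrow) pair; -inf where infeasible
c = A * kgrid[:, None] ** alpha - kgrid[None, :]
util = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)

V = np.zeros(len(kgrid))
for _ in range(2000):                            # Bellman iteration
    V_new = np.max(util + beta * V[None, :], axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = kgrid[np.argmax(util + beta * V[None, :], axis=1)]
closed_form = alpha * beta * A * kgrid ** alpha
print("max policy error:", np.max(np.abs(policy - closed_form)))
```

The numerical policy matches the closed form up to the grid spacing. Full-blown RBC (and DSGE) models add productivity shocks, labor supply, partial depreciation, and more, but the solve-an-optimization-problem structure is the same.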

While these models failed to convince as explanations of economic fluctuations overall (as Larry Summers explained, though they can still be a useful part of the macro toolkit, as Chris House argues), the methods introduced by the RBC theorists have become nearly universal in macroeconomic modelling under the broader moniker "DSGE".  The last couple of decades have shown us that a number of "Keynesian" features, such as "sticky" prices, can be incorporated into such models, which then go by the name "New Keynesian."

So the "New Keynesians" are using methods that were introduced by a cohort of macroeconomists who were explicitly anti-Keynesian.  That is, Lucas et al. won the methodological war about how to build macroeconomic models, but their anti-Keynesian view of the economy itself did not prevail.

Wren-Lewis' answer to the question posed in the title of his post is "no." Paul Krugman summarizes and responds:
Wren-Lewis’s answer is no, because New Keynesians were only doing what they would have wanted to do even if there hadn’t been a de facto blockade of the journals against anything without rational-actor microfoundations. He has a point: long before anyone imagined doing anything like real business cycle theory, there had been a steady trend in macro toward grounding ideas in more or less rational behavior. The life-cycle model of consumption, for example, was clearly a step away from the Keynesian ad hoc consumption function toward modeling consumption choices as the result of rational, forward-looking behavior. 

But I think we need to be careful about defining what, exactly, the bargain was. I would agree that being willing to use models with hyperrational, forward-looking agents was a natural step even for Keynesians. The Faustian bargain, however, was the willingness to accept the proposition that only models that were microfounded in that particular sense would be considered acceptable. It’s one thing to accept that models with an Euler condition at their core can sometimes be useful; it’s quite different to restrict your discourse to models with that characteristic, while ruling out everything else.
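For readers who don't know the jargon: the "Euler condition" Krugman mentions is the first-order condition of an optimizing consumer, which links marginal utility today to discounted, expected marginal utility tomorrow. In its standard form:

```latex
u'(c_t) = \beta (1+r) \, E_t\!\left[ u'(c_{t+1}) \right]
```

where $\beta$ is the consumer's discount factor and $r$ is the real interest rate; it is the core intertemporal building block of most DSGE models.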
A couple of things to note here:

Politics: both the academic sort - who gets hired and what gets published; as Krugman alludes to, some of it was pretty vicious (at least that's my sense - this was all well before my time) and some are still holding grudges - and the political implications of the theory itself.  In its purest form, RBC theory has some pretty right-wing policy implications (though RBC macroeconomists are not necessarily Republicans), so some view RBC (and, by extension, DSGE) models as cover for a conservative political agenda.

How academia works: Publishing papers requires at least some incremental degree of novelty (i.e., a journal article must make a "contribution to the literature").  While the events of the last six years have underscored the usefulness of the standard textbook Keynesian approach that I (and many others) teach to intermediate-level macroeconomics students, as far as publishing it goes, well, it was done 77 years ago.  While it is useful for policymakers and the economists working in policy institutions, academic economists are going to focus on developing new theory - which hopefully leads to better policy-making, in the long run at least.  That is, the divide between "scientists" and "engineers" described by Greg Mankiw applies.

For more interesting thoughts on this see: Brad DeLong, Roger Farmer, Steve Williamson's response to the Krugman post quoted above, another post by Krugman.