Saturday, December 20, 2014

The Birth of Inflation Targeting

Inflation targeting, where monetary policy is directed to aim for a specific level or range for the inflation rate, has become a widespread practice.  The Times' Neil Irwin looks back at its first implementation by New Zealand; he writes:
Sometimes, decisions that shape the world’s economic future are made with great pomp and gain widespread attention. Other times, they are made through a quick, unanimous vote by members of the New Zealand Parliament who were eager to get home for Christmas.

That is what happened 25 years ago this Sunday, when New Zealand became the first country to set a formal target for how much prices should rise each year — zero to 2 percent in its initial action. The practice was so successful in making the high inflation of the 1970s and ’80s a thing of the past that all of the world’s most advanced nations have emulated it in one form or another. A 2 percent inflation target is now the norm across much of the world, having become virtually an economic religion.
Irwin goes on to provide a nice description of how and why New Zealand adopted this policy. Although it initially seemed quite successful in achieving the goal of avoiding a repeat of the high inflation of the 1970s (which continued well into the '80s in some countries) while maintaining reasonable growth rates, it has been tested by the experience of the past seven years.  One question is whether the level of 2 percent is the right one.  Irwin describes how Janet Yellen successfully argued against those at the Fed who wanted to go for zero inflation in the mid-1990s, but even 2 percent may be too low:
Starting in the late 1990s, Japan found itself stuck in a pattern of falling prices, or deflation, even after it cut interest rates all the way to zero. The United States suffered a mild recession in 2001, and the Fed cut interest rates to 1 percent to help spur a recovery. Then came the global financial crisis of 2007 to 2009, spurring a steep downturn across the planet and causing central banks to slash interest rates.

All of this has quite a few smart economists wondering whether the central bankers got the target number wrong. If they had set it a bit higher, perhaps at 3 or 4 percent, they might have been better able to combat the Great Recession because they could cut inflation-adjusted interest rates by more.
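The logic of that argument is just the Fisher relation between nominal and real interest rates - a one-line sketch (my illustration, not Irwin's):

```latex
r = i - \pi^{e}, \qquad i \ge 0 \;\Longrightarrow\; r \ge -\pi^{e}
```

With the nominal rate i stuck at its zero lower bound, the real rate r can fall no further than minus expected inflation; if expectations are anchored at a 2 percent target the floor is about -2 percent, while a 4 percent target would allow roughly -4 percent - twice the room to cut.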
The apparent initial success, as well as the reasons for recent doubts, can be seen in the experience of the UK, which adopted inflation targeting in 1998:

(The chart data is from the OECD, via FRED.  The UK's target was initially 2.5%, but expressed in terms of a different price index measure; when it switched to using the CPI, it moved the target to 2% based on differences between the measures.)

While the UK generally has had low and (relatively) stable inflation since the early 1990s, it did overshoot its target considerably in 2007-2012, and it may now be in danger of undershooting it (as the Fed is) - inflation in November was 1% (this is not evident in the chart because it plots annual data: the percentage change in the average price index from the year before).
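To see why a soft monthly reading need not show up in an annual chart, here is a minimal sketch of both calculations using a made-up price index (the numbers are placeholders, not actual OECD data):

```python
import numpy as np
import pandas as pd

# Made-up monthly price index (a stand-in for the OECD's UK CPI series).
dates = pd.date_range("2012-01-01", periods=36, freq="MS")
rng = np.random.default_rng(1)
idx = pd.Series(100 * np.cumprod(1 + rng.normal(0.0015, 0.001, 36)),
                index=dates)

# Monthly year-over-year inflation: percent change from 12 months earlier.
yoy_monthly = idx.pct_change(12) * 100

# Annual inflation as plotted in the chart: percent change in the
# calendar-year average of the index from the prior year's average.
annual_avg = idx.groupby(idx.index.year).mean()
yoy_annual = annual_avg.pct_change() * 100

# A single soft month (like the UK's 1% November reading) barely moves
# the annual-average figure.
print(yoy_monthly.dropna().round(2).tail(3))
print(yoy_annual.dropna().round(2))
```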

Although I think the Bank of England deserves credit for not tightening in the face of inflation that ultimately proved transitory, the overshoot does call the inflation targeting framework into question.  Arguably, the framework may have helped keep inflation expectations "anchored" even as inflation deviated from target.  However, at some point, one would expect such deviations to undermine the credibility of the regime, and it was the idea of establishing credibility that made inflation targeting attractive to academic economists in the first place (the underlying intuition was nicely described in this speech by Philadelphia Fed President Charles Plosser).

The other question - whether a higher target, or a different type of target, such as the price level (inflation is the rate of change of the price level) or nominal GDP, would be better - is an interesting and important one.  The difficulty now is that, having established one monetary policy rule, switching to another could itself diminish the credibility of the new rule.
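The difference between targeting inflation and targeting a price-level path is worth making concrete.  A minimal numerical sketch (made-up numbers): under inflation targeting a miss is forgiven, while under price-level targeting it must be made up.

```python
# Inflation targeting (IT) vs. price-level targeting (PLT) after a
# one-year undershoot.  Illustrative numbers only.

target_pi = 0.02          # 2% target
p0 = 100.0                # initial price level
actual_pi_year1 = 0.00    # suppose inflation comes in at zero in year 1

p1 = p0 * (1 + actual_pi_year1)

# Under IT, bygones are bygones: aim for 2% from wherever prices ended up.
it_goal_year2 = target_pi

# Under PLT, the goal is the *path* p0 * 1.02**t, so the year-1 miss
# has to be made up in year 2.
plt_goal_year2 = p0 * (1 + target_pi) ** 2 / p1 - 1

print(f"IT:  aim for {it_goal_year2:.1%} inflation in year 2")   # 2.0%
print(f"PLT: aim for {plt_goal_year2:.1%} inflation in year 2")  # ~4.0%
```

The make-up property means a price-level (or nominal GDP) target promises more future stimulus after an undershoot, which is part of its appeal at the zero lower bound.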
 

Economics Navel-Gazing, Curricular Edition

A group of economics students in the UK has launched a movement to reform the economics curriculum.  I'm a little surprised that I haven't run into similar sentiments at Wesleyan - I can't decide whether I'm disappointed or relieved by this.

The criticisms seem to me to be based on a somewhat unfair caricature of economics and economists: that we're head-in-the-sand apologists for "neoliberalism" who use mathematics as a form of obscurantism and have little of use to say about the "real world," particularly in the wake of the financial crisis.

Some of this may be rooted in the fact that the "economics" articulated by politicians, government officials and the press - what Simon Wren-Lewis has called "mediamacro" - does not reflect the views of most mainstream academic economists.  In particular, the obsession with government budget deficits is not based on textbook economics (I discussed an example of this misconception in a European context a couple of years ago).

Markets are at the heart of economics - this may be where the view that economists are "free market fundamentalists" comes from.  In introducing markets, though, there are really two main points to make:
  1. The gains from exchange and specialization possible from voluntary trade (i.e., Adam Smith's "Invisible Hand"), and the ability of markets to make adjustments based on dispersed information about what Hayek called "the particular circumstances of time and place," which would be unknowable to any central planner.
  2. While economists need to make our students aware of the hidden and under-appreciated role markets play in organizing society and in lifting humanity out of subsistence-level poverty, we also devote a considerable amount of attention to how they fail.  In particular, problems of monopoly power, externalities, public goods and asymmetric information are standard subjects for introductory economics courses.  (2a., There are also reasons to be skeptical in practice of the ability of our political system to effectively correct market failures).
We do suffer from an excess of libertarians who mistakenly believe that economic theory validates their views - I usually think of these people as students who stopped listening after they learned about point #1 in the first several weeks of their principles courses (or put too much weight on #2a).  But these folks are a minority in economics, though perhaps a vocal enough one that students and other outsiders might believe they are more representative than they really are.

We typically introduce markets with the model of "supply and demand," and the exercise of thinking in terms of models provides much of the lasting value of studying economics.  Working with economic models can sharpen students' logical and critical thinking skills immensely.  As John Cochrane nicely put it recently, "economic models are quantitative parables, not explicit and complete descriptions of reality." The criticism that models are "simplifications" is a cheap one - writing down a set of assumptions in mathematical form and working out the implications (and then testing them against data) is where the insight comes from.  The discipline of doing this cultivates an ability to think intelligently about tradeoffs and hidden costs, and to trace conclusions back to underlying assumptions and consider how changing assumptions lead to different conclusions.  Since models are, by necessity, very stylized descriptions of the world, students of economics must not only learn how to work with them, but also how to judge which simplifications are appropriate for a given circumstance or question.  As Keynes said, "Economics is the science of thinking in terms of models joined to the art of choosing models which are relevant to the contemporary world."

So I think the core of what we try to do in our introductory economics courses - introducing markets (both their successes and failures), and teaching students how to think in terms of models - is extremely worthwhile.  Of course, this does not cover everything we might like to do in a course (or in a small set of courses).  Much of economics is concerned with the allocation of scarce resources, and the time that our students can spend on a course in a semester (both in and out of the classroom) is very limited, forcing some difficult choices on instructors.  Some of the criticisms made by the UK students seem to be about what we're leaving out, though I think what we're doing in our introductory courses is pretty important, and laying some groundwork in the economic way of thinking will help students tackle issues like understanding the financial crisis, either in later classes or independently.  While many of the debates in the news are about macroeconomic policy (and as a macroeconomist, I'm happy to see the revival of interest in the topic, even if it arises from unfortunate sources), the core microeconomic concepts are very important and not to be skipped.  While it can be exciting to teach a subject that is relevant to contemporary events, we should not be seduced into bringing "news" into the classroom in a way that interferes with developing an understanding of the fundamentals.

There is sometimes a bit of a muddle in these navel-gazing discussions, too, between what should be in our undergraduate curriculum and the separate, but not wholly unrelated, issue of our research agenda and graduate curriculum.  I'm not entirely unsympathetic to the calls for "methodological pluralism," though I wouldn't go as far as the UK students would like.  I have argued for graduate study of the history of economic thought, and I have emphasized it in my undergraduate teaching, using it as an organizing principle for my intermediate macroeconomics class (and also making my intro students read some Smith, Hayek, Friedman and Keynes).  As a field, I do think macroeconomics is at a point where we should be open to reconsidering some of the standard tools (though I don't think that is ever not the case), and I worry that the "publish or perish" incentives we all face mean that we do too little of that.

Karl Whelan of University College Dublin has written a nice essay, "Teaching Economics 'After the Crash'," with a more detailed response to the UK students' criticisms; it is well worth reading.

Friday, November 28, 2014

Gray Matter and the US Current Account

After increasing steadily from the mid-1990s through the mid-2000s, the rate at which the US is borrowing from the rest of the world - our current account deficit - has come down considerably.
The current account deficit peaked at around 6% of GDP in 2006 and has hovered around 2.5-3% over the past several years.

During the period when it looked like the US current account deficit was growing inexorably, there was quite a bit of discussion of "global imbalances" and whether or not the US' borrowing was sustainable.

Despite years of borrowing, the US tends to earn a positive balance on income - i.e., the US receives more in payments on assets it owns abroad than it makes to foreign owners of US assets.  Ricardo Hausmann and Federico Sturzenegger argued that national accounts understate the true value of US foreign assets.  They called the gap between the accounting value and the true value "dark matter", which they largely attributed to the know-how exported (but not properly measured) with US FDI.  (I revisited this idea in a previous post).

Writing at Project Syndicate, Jeffrey Frankel offers another idea on why our balance of payments accounts may make the US deficit look worse than it really is:
Every year, US residents take some of what they earn in overseas investment income – interest on bonds, dividends on equities, and repatriated profits on direct investment – and reinvest it then and there. For example, corporations plow overseas profits back into their operations, often to avoid paying the high US corporate income tax implied by repatriating those earnings. Technically, this should be recorded as a bigger surplus on the investment-income account, matched by greater acquisition of assets overseas. Often it is counted correctly. But there is reason to think that this is not always the case...

...For example, US multinational corporations sometimes over-invoice import bills or under-report export earnings to reduce their tax obligations. Again, this would work to overstate the recorded current-account deficit.
While the efforts that US multinationals make to avoid their tax obligations are probably technically legal in most cases, they go against the spirit of US tax law, which taxes US corporations based on their global earnings (whether it should be this way is another matter).  Since it arises from a gray area in our tax code, perhaps we should call the resulting gap between our measured and true foreign assets "gray matter."
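For concreteness, the Hausmann-Sturzenegger calculation capitalizes net investment income to back out an implied asset position.  A sketch with round, hypothetical numbers (not actual BEA figures):

```python
# "Dark matter" arithmetic: treat net investment income as the return
# on the *true* net foreign asset position and capitalize it.
# All numbers below are illustrative.

net_investment_income = 0.2e12   # hypothetical: $200bn net income surplus
measured_niip = -4.0e12          # hypothetical: measured net position, -$4tn
r = 0.05                         # capitalization rate used by H-S

implied_nfa = net_investment_income / r      # implied *positive* $4tn position
dark_matter = implied_nfa - measured_niip    # $8tn of unmeasured assets

print(f"Implied NFA: ${implied_nfa / 1e12:.1f}tn")
print(f"Dark matter: ${dark_matter / 1e12:.1f}tn")
```

The "gray matter" from mis-recorded reinvested earnings and transfer pricing would be one component of such a gap.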

Friday, November 7, 2014

October Employment

A good report today from the BLS on employment in October:  the unemployment rate fell to 5.8% (from 5.9%) and employers' payrolls rose by 218,000.
The payroll figure comes from a survey of firms, while the unemployment rate is based on a survey of households (which has a smaller sample than the employer survey).  The household survey figures look even better: the number of people employed rose by 683,000, and the number unemployed fell by 267,000.  The labor force (i.e., people who are working or looking for work) rose by 416,000, which put the labor force participation rate at 62.8%, an increase from last month's historic low of 62.7%. 
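The household survey numbers have to satisfy a simple accounting identity - the labor force is the employed plus the unemployed - and this month's changes do:

```python
# Household-survey accounting: labor force = employed + unemployed,
# so the monthly changes must add up.
d_employed, d_unemployed = 683_000, -267_000
d_labor_force = d_employed + d_unemployed
print(f"{d_labor_force:,}")  # 416,000, matching the reported increase
```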

The decline in labor force participation (which was at 66% in late 2007) has been one of the worrying trends of the past several years.  It partly reflects demographics, though, as the population is becoming older and a larger portion of the population is of retirement age.  Looking at the employment-population ratio for 25-54 year olds gives a picture of the labor market that takes out some of the guesswork in interpreting participation:
This ratio increased from 76.7 to 76.9 in October.  Overall, it shows some recovery over the past three years, but also gives an indication of why many Americans remain unhappy with the state of the economy - it is still less than halfway back from its low point to its pre-recession level.

Moreover, while employment is improving, wages are still growing slowly - the BLS reports that average hourly wages have increased 2% over the past year.  This suggests that there is still plenty of "slack" in the labor market.

The BLS' broader measure of un- and under-employment, 'U-6', which includes the "marginally attached" and people working part-time who want to be full-time, is at 11.5%, down from 11.8% last month (it peaked at 17.2% in April 2010).
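For reference, U-6 augments both the numerator and the denominator of the standard unemployment rate:

```latex
U6 = 100 \times \frac{\text{unemployed} + \text{marginally attached} + \text{part time for economic reasons}}{\text{labor force} + \text{marginally attached}}
```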

Wednesday, September 17, 2014

Information Overload

We've arrived at the point in Econ 302 this semester where we're reading Paul David's "The Dynamo and the Computer: An Historical Perspective on the Modern Productivity Paradox," which means that I find myself again marveling at the prescience of this:

[quoted excerpt from David's paper]

That's a pretty amazing thing to have written in 1989! (The paper was published in the May '90 American Economic Review, which contains papers presented at the meetings in early January 1990.)

For a nice (and un-gated) summary of David's argument, see this Tim Harford piece in Slate.

Friday, September 5, 2014

August Employment

According to the BLS, employment rose by 142,000 in August and the unemployment rate ticked down to 6.1%.
That's consistent with the picture of a continuing, but painfully slow, recovery that has predominated over the past several years, though this particular report was a little on the disappointing side.

The employment figure comes from a survey of firms, while the unemployment rate is based on a survey of households, which has a smaller sample.  According to the household survey, 80,000 fewer people were unemployed, but only 16,000 more were working - the difference is accounted for by 64,000 departures from the labor force (i.e., adults who are working or looking for work).  Such decreases in labor force participation are not an encouraging sign.

However, labor force participation is a little bit difficult to interpret because demographic change (more people reaching retirement age, etc.) plays a role as well.  My preferred measure of the state of the labor market is the share of 25-54 year-olds who are working - this takes out the guesswork about demographics and participation.  This measure rose in August, to 76.8% (from 76.4%):

That's up from a low of 74.8% in November 2010, but still well below pre-recession levels.  Employment is continuing to crawl out of the hole we dug in 2008-09, but we're less than halfway there.  Any talk of returning to "normal" monetary policy seems premature to me - things may be getting slightly better, but the situation is still quite bad.
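The "less than halfway" arithmetic, taking the pre-recession level of the 25-54 ratio to be roughly 80% (an assumption for illustration; the low and current readings are from the post):

```python
# Fraction of the drop in the 25-54 employment-population ratio that
# has been recovered.  The pre-recession level is an assumed round number.
low, current = 74.8, 76.8      # from the post
pre_recession = 80.0           # assumption: roughly the 2007 level

recovered = (current - low) / (pre_recession - low)
print(f"{recovered:.0%} of the way back")  # ~38%, less than halfway
```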

Friday, August 22, 2014

Europe in Depression

In a post back in 2012, when things were looking pretty hairy for the euro, I said it would be "a real human disaster if the euro cracked up in a crisis."  Two years later, fears of a calamitous exit by the "peripheral" Eurozone countries have eased (as evidenced by reduced bond yields).  The euro appears to have been saved - and it has been a real human disaster nonetheless.

At wonkblog, Matt O'Brien writes:
As I've said before, the euro is the gold standard with moral authority. And that last part is the problem. Europeans don't think the euro represents civilization, but rather the defense of it. It's a paper monument to peace and prosperity that's made the latter impossible. So the eurocrats who have spent their lives building it are never going to tear it down, despite the fact that, as it's currently constructed, the euro is standing between them and recovery.

Just like the 1930s, Europe is stuck with a fixed exchange system that doesn't let them print, spend, or devalue their way out of a crisis. But, unlike then, Europe might never give it up. It's a fidelity to failure that even the gold bloc couldn't have imagined.
Unemployment rates are above 20% in Spain and Greece, and above 10% in Portugal, Italy, Ireland, France, Cyprus, Slovakia and Slovenia.
Ambrose Evans-Pritchard spoke to several economics Nobel laureates:
An array of Nobel economists have launched a blistering attack on the eurozone's economic strategy, warning that contractionary policies risk years of depression and a fresh eruption of the debt crisis.
"Historians are going to tar and feather Europe's central bankers," said Professor Peter Diamond, the world's leading expert on unemployment. "Young people in Spain and Italy who hit the job market in this recession are going to be affected for decades. It is a terrible outcome, and it is surprising how little uproar there has been over policies that are so stunningly destructive," he told The Telegraph at a gathering of Nobel laureates at Lake Constance...
Professor Joseph Stiglitz said austerity policies had been a "disastrous failure" and are directly responsible for the failed recovery over the first half of this year, with Italy falling into a triple-dip recession, France registering zero growth and even Germany contracting in the second quarter.
"There is a risk of a depression lasting years, leaving even Japan's Lost Decade in the shade. The eurozone economy is 20pc below its trend growth rate," he said...
Professor Christopher Sims, a US expert on monetary policy, said EMU policy makers had not sorted out the basic design flaws in monetary union, and are driving Club Med nations into deeper trouble by imposing pro-cyclical austerity.
"If I were advising Greece, Portugal or even Spain, I would tell them to prepare contingency plans to leave the euro. There is no point being in EMU if all that happens when you are hit with a shock is that the shock gets worse," he said.
"It would be very costly to leave the euro, a form of default, but staying in the euro is also very costly for these countries. The Europeans have created a system that is worse than the Gold Standard. Countries are in the same position as Latin American states that borrowed in dollars," he said.
It may be a slightly hopeful sign that François Hollande is coming to a recognition of the problem, as the Times' Liz Alderman reports:
After months of insisting that a recovery from Europe’s long debt crisis was at hand, President François Hollande on Wednesday delivered a far bleaker message. He indicated that the austerity policies France had been compelled to adopt to meet the eurozone’s budget deficit targets were making growth impossible.
Paris officials say that France — the eurozone’s second-largest economy after Germany — will no longer try to meet this year’s deficit-reduction targets, to avoid making economic matters worse. Even in abandoning those targets, they indicated that France was unlikely to recover soon from its long period of stagnation or quickly reduce its unemployment rate, which exceeds 10 percent.
“The diagnosis is clear,” Mr. Hollande said in an interview published Wednesday in the French daily Le Monde. “Due to the austerity policies of the last several years, there is a problem of demand throughout Europe, and a growth rate that is not reducing employment.”
To really make a difference, though, a more inflationary monetary policy is needed, and there is no sign of that on the horizon.

A euro breakup in 2010, 2011, or 2012 would have been disastrous for sure, but I'm beginning to wonder whether it would have been worse than what we've actually seen.

Wednesday, July 23, 2014

DSGE Failing the Market Test?

The prevailing methodology of macroeconomic theory these days is "Dynamic Stochastic General Equilibrium" (DSGE) modelling.  Although many contemporary DSGE models, including the ones I'm working on, include "Keynesian" elements such as sticky prices, unemployment and financial frictions, they represent a methodological break with an older style of "Keynesian" models based on relationships among aggregate variables.  The shift in method followed from the work of, most prominently, Lucas and Sargent, which John Cochrane summarized on his blog:
As I see it, the main characteristic of "equilibrium" models Lucas and Sargent inaugurated is that they put people, time, and economics into macro.

Keynesian models model aggregates. Consumption depends on income. Investment depends on interest rates. Labor supply and demand depend on wages. Money demand depends on income and interest rates. "Consumption" and "investment" and so forth are the fundamental objects to be modeled.

"Equilibrium" models (using Lucas and Sargent's word) model people and technology. People make simultaneous decisions across multiple goods, constrained by budget constraints -- if you consume more and save more, you must work more, or hold less money.  Firms  make decisions across multiple goods constrained by technology.

Putting people and their simultaneous decisions back to the center of the model generates Lucas and Sargent's main econometric conclusion -- Sims' "incredible" identifying restrictions. When people simultaneously decide consumption, saving, labor supply, then the variables describing each must spill over in to the other. There is no reason for leaving (say) wages out of the consumption equation. But the only thing distinguishing one equation from another is which variables get left out.

People make decisions thinking about the future. I think "static" vs. "intertemporal" are good words to use.  That observation goes back to Friedman: consumption depends on permanent income, including expected future income, not today's income. Decisions today are inevitably tied to expectations --rational or not -- about the future.
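The last point in the excerpt - that decisions today are tied to expectations about the future - is easiest to see in a stripped-down two-period consumption problem (a textbook sketch, not anything specific to Cochrane's post):

```latex
\max_{c_1,\,c_2}\; u(c_1) + \beta\, u(c_2)
\qquad \text{s.t.} \qquad
c_1 + \frac{c_2}{1+r} = y_1 + \frac{y_2}{1+r}
```

Consumption today depends on lifetime resources - including expected future income y2, not just current income y1 - which is Friedman's permanent income point, and the first-order condition u'(c1) = β(1+r)u'(c2), the consumption Euler equation, is the basic intertemporal building block of DSGE models.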
A Bloomberg View column by Noah Smith nicely summarizes the methodological shift, which gained momentum from the apparent breakdown of the Phillips curve relationship between inflation and unemployment in the 1970s.  Smith writes:
Lucas showed that trying to boost gross domestic product by raising inflation might be like the tail trying to wag the dog. To avoid that kind of mistake, he and his compatriots declared, macroeconomists needed to base their models on things that wouldn’t change when government policy changed -- things like technology, or consumer preferences. And so DSGE was born. (DSGE also gave macroeconomists a chance to use a lot of cool new math tricks, which probably increased its appeal.)

OK, history lesson over. So why is this important now?

Well, for one thing, the finance industry has ignored DSGE models. That could be a big mistake! Suppose you’re a macro investor. If all you want to do is make unconditional forecasts -- say, GDP next quarter – then you can go ahead and use an old-style SEM model, because you only care about correlation, not causation. But suppose you want to make a forecast of the effect of a government policy change -- for example, suppose you want to know how the Fed’s taper will affect growth. In that case, you need to understand causation -- you need to know whether quantitative easing is actually changing people’s behavior in a predictable way, and how.

This is what DSGE models are supposed to do. This is why academic macroeconomists use these models. So why doesn’t anyone in the finance industry use them? Maybe industry is just slow to catch on. But with so many billions upon billions of dollars on the line, and so many DSGE models to choose from, you would think someone at some big bank or macro hedge fund somewhere would be running a DSGE model. And yet after asking around pretty extensively, I can’t find anybody who is.
That's an interesting question -- when thinking about issues like this, I often come back to the divide between "science" and "engineering" put forward by Greg Mankiw.  While academic macroeconomics has gone down the path marked out by Lucas and Sargent, the policymaking "engineers" in Washington often still find the older-style models more useful.  It sounds like Wall Street's economists do too.

The question is whether academic macroeconomics is on track to produce models that are more useful for the policymakers and moneymakers. The DSGE method is still fairly new, and, until recently, we've been constrained by the limitations of our computers as well as our minds (a point Narayana Kocherlakota made here), so maybe we're just not quite there yet.  But we should be open to the possibility that we're on the wrong track entirely.
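Smith's "history lesson" can be made concrete with a toy simulation of the Lucas critique (my illustration, with made-up parameters): a reduced-form Phillips correlation fits nicely within a policy regime, then gives badly wrong forecasts when policy tries to exploit it.

```python
import numpy as np

rng = np.random.default_rng(0)
u_star, a, n = 6.0, 0.5, 200   # natural rate, Phillips slope, sample size

def simulate(pi_mean):
    # Expectations equal the regime's mean inflation, so only inflation
    # *surprises* move unemployment (plus noise).
    pi = pi_mean + rng.normal(0, 1, n)
    u = u_star - a * (pi - pi_mean) + rng.normal(0, 0.3, n)
    return pi, u

# Regime A: 2% average inflation.  Fit the reduced form u = alpha + beta*pi.
pi_a, u_a = simulate(2.0)
beta, alpha = np.polyfit(pi_a, u_a, 1)
print(f"estimated slope: {beta:.2f}")                          # about -0.5

# The reduced form says moving to 6% inflation buys low unemployment...
print(f"reduced-form forecast at 6%: {alpha + beta * 6:.2f}")  # ~4.0

# ...but under the new regime expectations adjust, and average
# unemployment is back at the natural rate.
pi_b, u_b = simulate(6.0)
print(f"actual mean unemployment at 6%: {u_b.mean():.2f}")     # ~6.0
```

The "structural" parameters here (the natural rate and the response to surprises) are stable across regimes; the estimated reduced-form intercept is not - which is exactly the Lucas point Smith describes.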

Saturday, July 5, 2014

Efficiency Wages

The New York Times has a story about several restaurants that have decided to pay above-market wages.  One of them is Shake Shack, which is starting employees at $9.50/hr:
“The No. 1 reason we pay our team well above the minimum wage is because we believe that if we take care of the team, they will take care of our customers,” said Randy Garutti, the chief executive of Shake Shack.
That, like other anecdotes in the article, is consistent with the "efficiency wage" theory, in which firms can induce more effort by paying a higher real wage.  This might arise if firms have a less-than-perfect ability to monitor individual employees' productivity - paying an above-market wage creates a stronger incentive not to "shirk".
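A stripped-down version of the shirking logic (one period, with made-up notation: detection probability p, effort cost c, and an outside wage available after being fired - a simplification, not the full Shapiro-Stiglitz model): a worker prefers working to shirking only if

```latex
\underbrace{w - c}_{\text{work}} \;\ge\; \underbrace{(1 - p)\,w + p\,\bar{w}}_{\text{shirk, caught with probability } p}
\quad\Longrightarrow\quad
w \;\ge\; \bar{w} + \frac{c}{p}
```

The weaker the monitoring (the smaller p is), the larger the premium over the outside wage needed to deter shirking - which is why paying above market can raise effort.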

For more, see this brief 1984 survey by Janet Yellen, who did some of her early academic work in this area.

Tuesday, July 1, 2014

Classroom Technology

Despite evidence that having computers in class is not good for students, Slate's Rebecca Schuman argues that professors should permit them anyway:
[P]olicing the (otherwise nondisruptive) behavior of students further infantilizes these 18-to-22-year-olds. Already these students are hand-held through so many steps in the academic process: I check homework; I give quizzes about the syllabus to make sure they’ve actually read it; I walk them, baby-steps style, through every miniscule stage of their essays. Some of these practices do indeed improve what contemporary pedagogy parlance calls “learning outcomes” (barf) because they show students how invested I am in their progress. But these practices also serve as giant, scholastic water wings for people who should really be swimming by now.

My colleagues and I joke sometimes that we teach “13th-graders,” but really, if I confiscate laptops at the door, am I not creating a 13th-grade classroom? Despite their bottle-rocket butt pranks and their 10-foot beer bongs, college students are old enough to vote and go to war. They should be old enough to decide for themselves whether they want to pay attention in class—and to face the consequences if they do not.
I'm sympathetic to the argument - I've never had an "attendance policy" for essentially the same reason - but what Schuman misses is that laptop use has a negative spillover effect (what economists call an "externality").  A student who is using a computer will distract not only herself but also the students around her - it is the harm to others, and to the classroom environment more generally, that justifies prohibiting computers in class.
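The externality argument fits in one line (a toy formalization with made-up notation: b is the user's benefit, d the distraction cost imposed on each of n nearby students):

```latex
\text{private calculus: open the laptop if } b > 0;
\qquad
\text{social calculus: only if } b > n\,d
```

Whenever 0 < b < nd, the student opens the laptop and the class as a whole loses - the standard case for restricting an activity whose costs fall on others.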

Schuman goes on to argue that the real problem is the lecture format.  I don't think it's appropriate to generalize - the optimal format probably varies across subjects (and across students, too, which may be a more difficult problem).  I'm planning some pretty big changes to the way I teach my classes in the coming year that will significantly reduce the amount of lecturing I do.  I wouldn't be doing this if I didn't expect the benefits to outweigh the costs, but I suspect the virtues of the traditional lecture style may be under-appreciated these days.  In particular, the act of note-taking by hand is a valuable part of the learning process.  A recent NY Times story about the decline of handwriting instruction in schools discussed some evidence on that point:
Two psychologists, Pam A. Mueller of Princeton and Daniel M. Oppenheimer of the University of California, Los Angeles, have reported that in both laboratory settings and real-world classrooms, students learn better when they take notes by hand than when they type on a keyboard. Contrary to earlier studies attributing the difference to the distracting effects of computers, the new research suggests that writing by hand allows the student to process a lecture’s contents and reframe it — a process of reflection and manipulation that can lead to better understanding and memory encoding.
Although we should always be looking for ways to improve, and to take advantage of new technology where it can be helpful, sometimes "innovation" carries hidden costs, and we will make better choices if we try to understand what those might be and take them into account.