Monday, March 28, 2011

The Spaghetti Bowl is Not Deep

At Vox, Theresa Carpenter and Andreas Lendle ask, "How Preferential is World Trade?"  Their answer:
We find that around half of world imports are from countries that are granted some preference – yet that does not mean that all of that trade is actually preferential. MFN rates may be zero or a product can be excluded from preferences. Overall – but not including intra-EU trade – just 16.3% of global trade has a positive preferential margin. Preferential margins are distributed as follows:
  • 10.5% of total trade has a margin of 5 percentage points or less, 3.9% has a margin of 5-10 percentage points, 1.3% has a margin of 10-20 percentage points and only 0.5% has a margin of 20 percentage points and above.
  • Around 30% of global trade is not eligible for preferences and subject to non-zero MFN rates, although almost 2/3 of that trade faces tariffs of 5% or less.
  • More than half (52%) of world trade is at MFN zero, so no preferences can be granted. 
By "preferential margin," they mean the gap between the "most favored nation" tariff, which all WTO members are expected to apply to imports from other members, and the lower tariffs which might apply under regional and bilateral trade agreements.  For example, if the US has a 10% MFN tariff on cogs, but under NAFTA, cogs from Mexico enter the US tariff-free, that would mean a 10% preferential margin.
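The margin arithmetic is simple but worth pinning down; here is a minimal sketch of the cog example (the function name is mine, not the authors'):

```python
def preferential_margin(mfn_rate, preferential_rate):
    """Gap between the MFN tariff and the lower preferential tariff,
    in percentage points (illustrative helper, not the authors' code)."""
    return mfn_rate - preferential_rate

# The cog example: a 10% US MFN tariff, but duty-free entry under NAFTA.
margin = preferential_margin(10.0, 0.0)
print(margin)  # 10.0 percentage points
```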

This bears on one of the main complaints about trade agreements: that they create preferences.  Preferences can lead to "trade diversion," whereby a country imports a good from someplace that isn't the most efficient producer - in the above example, if Brazilian cogs were 5% less costly than Mexican cogs, then without NAFTA, the US would import them from Brazil; NAFTA causes the US to shift to Mexican cogs (and lose the tariff revenue).  As a political economy matter, this might also mean that Mexico is less likely to support multilateral liberalization - e.g., a reduction in cog duties in the WTO Doha round talks - because it enjoys its privileged access to the US market.  That is, preferential agreements lead to worries about "preference erosion" which could hinder broader tariff reductions.
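The trade-diversion arithmetic in the cog example can be made concrete with hypothetical landed costs (all numbers assumed for illustration):

```python
# Hypothetical landed costs illustrating trade diversion (numbers assumed).
mfn_tariff = 0.10          # 10% US MFN tariff on cogs
mexico_cost = 100.0        # Mexican producer price
brazil_cost = 95.0         # Brazilian price, 5% lower

# Without NAFTA, both countries face the MFN tariff: Brazil wins.
without_nafta = {"Mexico": mexico_cost * (1 + mfn_tariff),
                 "Brazil": brazil_cost * (1 + mfn_tariff)}

# With NAFTA, Mexican cogs enter duty-free: Mexico wins, even though
# Brazil is the cheaper producer before tariffs - that shift is the
# diversion, and the US also forgoes the tariff revenue.
with_nafta = {"Mexico": mexico_cost,
              "Brazil": brazil_cost * (1 + mfn_tariff)}

print(min(without_nafta, key=without_nafta.get))  # Brazil
print(min(with_nafta, key=with_nafta.get))        # Mexico
```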

These are two aspects of the general criticism that regional trade agreements create what is known as the "spaghetti bowl phenomenon" described thus in an academic article by Jagdish Bhagwati, David Greenaway and Arvind Panagariya:
The result [of the proliferation of regional trade agreements] is…the ‘spaghetti bowl’ phenomenon of numerous and crisscrossing PTAs and innumerable applicable tariff rates depending on arbitrarily-determined and often a multiplicity of sources of origin. In short, the systemic effect is to generate a world of preferences, with all its well-known consequences, which increases transaction costs and facilitates protectionism. In the guise of freeing trade, PTAs have managed to recreate the preferences-ridden world of the 1930s as surely as protectionism did at the time. Irony, indeed!
One of the interesting questions in trade is whether "free trade" agreements really deserve the name.  Carpenter and Lendle's evidence suggests that perhaps some of the criticisms - while valid in theory - may be overdone in practice.

Thursday, March 24, 2011


Ben Bernanke will be meeting the press, the Fed announced:
Chairman Ben S. Bernanke will hold press briefings four times per year to present the Federal Open Market Committee's current economic projections and to provide additional context for the FOMC's policy decisions.

In 2011, the Chairman's press briefings will be held at 2:15 p.m. following FOMC decisions scheduled on April 27, June 22 and November 2. The briefings will be broadcast live on the Federal Reserve's website. For these meetings, the FOMC statement is expected to be released at around 12:30 p.m., one hour and forty-five minutes earlier than for other FOMC meetings.

The introduction of regular press briefings is intended to further enhance the clarity and timeliness of the Federal Reserve's monetary policy communication. The Federal Reserve will continue to review its communications practices in the interest of ensuring accountability and increasing public understanding. 
Wow.  It wasn't too long ago that I remember watching one of Bank of England Governor Mervyn King's press conferences and thinking "that will never happen here."

This is another step that follows from a significant evolution in what is seen as good practice by central banks.  The Fed and other central banks once believed in the importance of using secrecy to cultivate a mystique.  It was not for nothing that William Greider titled his 1987 classic about the Fed "Secrets of the Temple."  The Fed only began announcing changes in its policy stance in 1994; announcements of explicit Fed Funds rate targets followed in 1995.

This trend towards openness partly reflects the influence of academic views, as modern macroeconomic models assign important roles to expectations, information and credibility (which can be cultivated by making policy announcements and then sticking to them).  It's not surprising to see Bernanke pushing in this direction, given his background as a prominent academic economist (in contrast to his predecessor).  The previous procedural step in this direction came in November 2007, when Bernanke instituted the practice of having the FOMC members release their forecasts.  Bernanke also appeared on 60 Minutes last December, which would have been unthinkable to the previous generation of central bankers.

Update: Real Time Economics has reaction from some Fed watchers and a chronology of the Fed's increasing openness.

Monday, March 21, 2011

DeLong on Friedman

Although it was once considered the big divide in macroeconomics, contemporary "Keynesianism" and "Monetarism" may not be so far apart (some would argue that's because modern "New Keynesians" aren't really Keynesians at all, but I digress...).  These days, they are mostly on the same side in arguing that policy can do something (other than damage) to smooth economic fluctuations and, by extension, persistent high unemployment is a policy failure.

This reveals a contradiction between how Milton Friedman is perceived and what his ideas really meant, Brad DeLong explains:
In the 1950s and 1960s and 1970s Milton Friedman faced a rhetorical problem. He was a laissez-faire libertarian. But he also believed that macroeconomic stabilization required that the central bank be always in the market, buying and selling government bonds in order to match the supply of liquid cash money to the demand, and so make Say's Law true in practice even though it was false in theory.

Such a policy of constant government intervention to continually rebalance aggregate demand is hardly a laissez-faire hands-off libertarian policy, is it?

Friedman, however, set about trying to maximize the rhetorical distance between his position--which was merely the "neutral," passive policy of maintaining the money stock growth rate at a constant--and the position of other macroeconomists, which was an "activist," interventionist policy of having the government disturb the natural workings of the free market. Something went wrong, Friedman claimed, only when a government stepped away from the "neutral" monetary policy of the constant growth rate rule and did something else.

It was, I think, that description of optimal monetary policy--not "the central bank has to be constantly intervening in order to offset shocks to cash demand by households and businesses, shocks to desired reserves on the part of banks, and shocks to the financial depth of the banking system" but "the central bank needs to keep its nose out of the economy, sit on its hands, and do nothing but maintain a constant growth rate for the money stock"--that set the stage for what was to follow in Chicago.

First, Friedman's rhetorical doctrine eliminated the cognitive dissonance between normal laissez-faire policies and optimal macro policy: both were "neutral" in the sense of the government "not interfering" with the natural equilibrium of the market. Second, Friedman's rhetorical doctrine eliminated all interesting macroeconomic questions: if the government followed the proper "neutral" policy, then there could be no macroeconomic problems. Third, generations of Chicago that had been weaned on this diet turned out to know nothing about macro and monetary issues when they became important again.

It is in this sense, I think, that I blame Milton Friedman: he sold the Chicago School an interventionist, technocratic, managerial optimal monetary policy under the pretense that it was something--laissez-faire--that it was not.

Friday, March 4, 2011

February Employment: Slightly Better

According to today's report from the BLS, the unemployment rate ticked down to 8.9% (from 9.0%) in February, and payrolls increased by 192,000.
Those are both solid improvements, though I had been hoping for better in light of other very positive recent data like the ISM index.

The labor force participation rate was unchanged at 64.2%, so at least the decline in the unemployment rate was not driven by people leaving the labor force (to be counted as unemployed, a person must report that they are looking for work).  That is in contrast to the big decline in the unemployment rate in December and January, where declining participation was a major factor.
If the recovery gets stronger, the participation rate should be expected to start rising.

The unemployment rate and labor force participation rate are calculated from a survey of households, which reported an increase of 250,000 in the number of people employed (the payroll number cited above comes from the survey of businesses, which has a larger sample).  February is one of the months where the seasonal adjustment to the unemployment rate is downwards; on a non-seasonally adjusted basis, the unemployment rate was 9.5% (down from 9.8% in January).
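The household-survey rates are simple ratios; a sketch with illustrative counts (made-up numbers chosen to land near February's seasonally adjusted figures, not the actual BLS microdata):

```python
def labor_rates(employed, unemployed, not_in_labor_force):
    """Unemployment and participation rates from household-survey-style
    counts. Formulas are the standard definitions; inputs here are
    illustrative, not actual BLS data."""
    labor_force = employed + unemployed
    population = labor_force + not_in_labor_force
    unemployment_rate = 100 * unemployed / labor_force
    participation_rate = 100 * labor_force / population
    return unemployment_rate, participation_rate

# Only people who report looking for work count as unemployed; someone who
# stops searching drops out of the labor force, which can lower the
# unemployment rate without any new jobs being created.
u, p = labor_rates(employed=911.0, unemployed=89.0, not_in_labor_force=557.6)
print(round(u, 1), round(p, 1))  # 8.9 64.2
```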

The BLS also revised upward the payroll employment increase estimates for December, to 152,000 (from 121,000) and January, to 63,000 (from 36,000).  It may be that some of the February increase really occurred in January, but wasn't properly counted due to the weather in January - a rough way to correct for this is to look at the average of the two months, which is an unimpressive 127,500 (that's roughly the pace needed to keep the unemployment rate stable as the population grows).
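The rough weather correction described above is just a two-month average of the payroll gains cited in the post:

```python
# Smoothing a possible weather distortion by averaging the January and
# February payroll gains (figures from the post).
january, february = 63_000, 192_000
average = (january + february) / 2
print(average)  # 127500.0 - roughly the pace needed to hold the
                # unemployment rate steady as the population grows
```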

Worth noting amid the Sturm und Drang over state budget cuts:
Employment in both state and local government edged down over the month. Local government has lost 377,000 jobs since its peak in September 2008.
U-6, the BLS' broadest measure of unemployment and underemployment - which includes people who are "marginally attached" to the labor force and those who are working part-time but want to work full-time - stands at 15.9% (seasonally adjusted), down from 16.1% in January.

Update: More reactions/analysis from Mark Thoma, Paul Krugman, David Leonhardt, Floyd Norris, Calculated Risk, Free Exchange, Gavyn Davies and RTE's roundup of Wall Street commentary.

Scientists vs Engineers?

The House Republicans' plan to cut federal spending by $61 billion for the remainder of the fiscal year (i.e., the period between now and the end of September) will be a drag on the economy and reduce employment, according to both Goldman Sachs and Moody's Mark Zandi, who says:
The House Republicans’ proposal would reduce 2011 real GDP growth by 0.5% and 2012 growth by 0.2 percentage points. This would mean some 400,000 fewer jobs created by the end of 2011 and 700,000 fewer jobs by the end of 2012.
This shouldn't come as a surprise to macroeconomics students, who know that a decrease in government purchases reduces aggregate demand and - outside of the special "classical" case of vertical aggregate supply - output.

John Taylor disagrees, however.  Ezra Klein explains:
Mark Zandi says the GOP's proposed spending cuts will cost about 700,000 jobs. John Taylor says they will "increase economic growth and employment." Both are respected economists who immerse themselves in data, research and theory. So how can they disagree so sharply?

The dispute comes down to how much weight you give to "expectations" about future deficits. Taylor's argument is that Zandi's model -- which you can read more about here -- doesn't account for the upside of deficit reduction -- namely, that when the government spends less, the private sector will spend more. Taylor thinks individuals and businesses are hoarding their money because they're afraid of the high taxes, sharp spending cuts and assorted other nastiness that deficit reduction will eventually require. "The high unemployment we are experiencing now is due to low private investment rather than low government spending," he writes. "By reducing some uncertainty and the threats of exploding debt, the House spending proposal will encourage private investment."
A similar disagreement is playing out over monetary policy.  In a recent NY Times column, Christina Romer wrote:
The debate is between what I would describe as empiricists and theorists.

Empiricists, as the name suggests, put most weight on the evidence. Empirical analysis shows that the main determinants of inflation are past inflation and unemployment. Inflation rises when unemployment is below normal and falls when it is above normal.

Though there is much debate about what level of unemployment is now normal, virtually no one doubts that at 9 percent, unemployment is well above it. With core inflation running at less than 1 percent, empiricists are therefore relatively unconcerned about inflation in the current environment.

Theorists, on the other hand, emphasize economic models that assume people are highly rational in forming expectations of future inflation. In these models, Fed actions that call its commitment to low inflation into question can cause inflation expectations to spike, leading to actual increases in prices and wages. 
She sides with the "empiricists" and argues that the influence of the "theorists" has held the Fed back from taking bolder, more effective action.  Stephen Williamson begs to differ:
Romer says some things about economic history in her piece, but of course she is very selective, and seems to want to ignore the period in US economic history and in macroeconomic thought that runs from about 1968 to 1985. Let's review that. (i) Samuelson/Solow and others think that the Phillips curve is a structural relationship - a stable relationship between unemployment and inflation that represents a policy choice for the Fed. (ii) Friedman (in words) says that this is not so. There is no long-run tradeoff between unemployment and inflation. It is possible to have high inflation and high unemployment. (iii) Macroeconomic events play out in a way consistent with what Friedman stated. We have high inflation and high unemployment. (iv) Lucas writes down a theory that makes rigorous what Friedman said. There are parts of the theory that we don't like so much now, but Lucas's work sets off a methodological revolution that changes how we do macroeconomics. 
The divides between Goldman/Zandi and Taylor over fiscal policy and between Romer and Williamson over monetary policy both reminded me of Greg Mankiw's distinction between "scientific" and "engineering" macroeconomics.  The models used by the "engineers" - the people in Washington and on Wall Street who need to make practical, quantitative assessments of the impact of policy alternatives on the economy - are more elaborate versions of the "textbook" Keynesian IS-LM/aggregate supply and demand framework that most of us (still) teach our macroeconomics students.  As Williamson points out, the models used by academics - Mankiw's "scientists" - in our research are fundamentally different.

The engineering models are built on relationships among aggregate macroeconomic variables like the Phillips curve, which relates inflation and unemployment, and the consumption function, which connects consumption and disposable income.  As Williamson alludes to, Robert Lucas and others won a methodological war in the profession (or at least the academic branch of it) in the 1970s and 1980s.  The result of their victory is that the macroeconomic models published in leading journals today are expected to be grounded in the optimizing, forward-looking behavior of rational individuals.

Such individuals might believe, for example, that a reduction in government spending today implies that their future taxes will be lower (because the government will be servicing a smaller debt burden).  The resulting increase in their lifetime disposable income means that they will immediately increase their consumption.  So any negative impact of a cut in government purchases is offset by an increase in consumption.  Rational, forward-looking optimizers might also recognize that any monetary expansion will erode their real wages and demand an offsetting increase in nominal wages.  This means that employment will remain unchanged (because the real cost to firms of a worker is the same) even as inflation rises.
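The forward-looking fiscal logic can be sketched as a two-period budget exercise; every number below is hypothetical, chosen only to make the offset visible:

```python
# Two-period sketch of the Ricardian-style logic (all numbers hypothetical).
# The government cuts spending by 100 today, so the forward-looking
# household expects its future tax bill to fall by the same amount in
# present value (the debt that no longer needs servicing).
r = 0.05                                        # interest rate
spending_cut_today = 100.0
future_tax_cut = spending_cut_today * (1 + r)   # tax relief next period

income = [1000.0, 1000.0]
taxes_before = [200.0, 200.0]
taxes_after = [200.0, 200.0 - future_tax_cut]

def lifetime_disposable_income(income, taxes, r):
    """Present value of income net of taxes over two periods."""
    return sum((y - t) / (1 + r) ** i
               for i, (y, t) in enumerate(zip(income, taxes)))

# The rise in lifetime disposable income equals the spending cut, so a
# consumption-smoothing household can raise spending immediately,
# offsetting the drop in government purchases.
gain = (lifetime_disposable_income(income, taxes_after, r)
        - lifetime_disposable_income(income, taxes_before, r))
print(round(gain, 6))  # 100.0
```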

At its most extreme, the assumption of dynamic optimization under rational expectations was once believed to imply the Lucas-Sargent "Policy Ineffectiveness" proposition, which Bennett McCallum explained in a 1980 Challenge article:
Macroeconomic policies - sustained patterns of action or reaction - will have no influence because they are perceived and taken into account by private decision-making agents. Thus, the adoption of a policy to maintain "full employment" will not, according to the present argument, result in values of the unemployment rate that are smaller (or less variable) on average than those that would be experienced in the absence of such a policy.
Of course, in a world of rationally optimizing people, where prices adjust to clear markets, it is hard to explain how we could get to such large deviations from the natural rate of unemployment in the first place...

More generally, while macroeconomic science has continued on the methodological path established by Lucas, many of its practitioners have worked to re-incorporate real effects of monetary policy.  This is a large part of the "New Keynesian" project, which is arguably now the reigning paradigm and best hope for reuniting "science" and "engineering" (and arguably is as much "monetarist" as it is "Keynesian").

Fiscal policy has received less attention - the implausibility of managing aggregate demand through the slow, cumbersome and messy budget process means that, in general, the focus has been on the Fed.

That has started to change as the global slump has pushed conventional monetary policy to its limits (and beyond, into unknown worlds of unconventional policy), and governments around the world have made fitful attempts at fiscal policy.  For example, recent papers by Christiano, Eichenbaum and Rebelo and by Gauti Eggertsson and Michael Woodford have shown that it is possible for fiscal policy to have significant multiplier effects in New Keynesian models when monetary policy is at the zero lower bound (as it is today).

So, while, at a superficial level, it appears that the split between "scientists" and "engineers" persists, some of the "scientific" work being done today is finding that the remedies proposed by the "engineers" are not wholly inconsistent with forward-looking rational behavior after all.