Wednesday, November 18, 2015

Fed's New Community Advisory Council to Meet on Friday

The Federal Reserve Board’s newly established Community Advisory Council (CAC) will meet for the first time on Friday, November 20. The solicitation for statements of interest for membership on the CAC, released earlier this year, describes the council as follows:
“The Board created the Community Advisory Council (CAC) as an advisory committee to the Board on issues affecting consumers and communities. The CAC will comprise a diverse group of experts and representatives of consumer and community development organizations and interests, including from such fields as affordable housing, community and economic development, small business, and asset and wealth building. CAC members will meet semiannually with the members of the Board in Washington, DC to provide a range of perspectives on the economic circumstances and financial services needs of consumers and communities, with a particular focus on the concerns of low- and moderate-income consumers and communities. The CAC will complement two of the Board's other advisory councils--the Community Depository Institutions Advisory Council (CDIAC) and the Federal Advisory Council (FAC)--whose members represent depository institutions. The CAC will serve as a mechanism to gather feedback and perspectives on a wide range of policy matters and emerging issues of interest to the Board of Governors and aligns with the Federal Reserve's mission and current responsibilities. These responsibilities include, but are not limited to, banking supervision and regulatory compliance (including the enforcement of consumer protection laws), systemic risk oversight and monetary policy decision-making, and, in conjunction with the Office of the Comptroller of the Currency (OCC) and Federal Deposit Insurance Corporation (FDIC), responsibility for implementation of the Community Reinvestment Act (CRA).”
The fifteen council members will serve staggered three-year terms and meet semiannually. Members include the president of the Greater Kansas City AFL-CIO, the executive director of the Association for Neighborhood and Housing Development, and law professor Catherine Lee Wilson, who teaches courses including bankruptcy and economic justice at the University of Nebraska-Lincoln.

The Board website notes that a summary will be posted following the meeting. I do wonder why only a summary, and not a transcript or video, will be released. While the Board is not obligated to act on the CAC's advice, in the interest of transparency I would like to see full documentation of the concerns and suggestions the CAC brings forth. That way we can at least observe what the Board decides to address and what it does not.

Friday, October 30, 2015

Did the Natural Rate Fall***?

Paul Krugman describes the natural rate of interest as "a standard economic concept dating back a century; it’s the rate of interest at which the economy is neither depressed and deflating nor overheated and inflating. And it’s therefore the rate monetary policy is supposed to achieve."

The reason he brings it up-- aside from obvious interest in what the Fed should do about interest rates-- is that a recent paper by Thomas Laubach of the Federal Reserve Board and San Francisco Fed President John Williams provides updated estimates of the natural rate for the U.S. Laubach and Williams estimate that the natural rate has fallen to around 0% in the past few years.

The authors' estimates come from a methodology they developed in 2001 (published 2003). The earlier paper noted the imprecision of estimates of the natural rate. The solid line in the figure below presents their estimates of the natural real interest rate, while the dashed line is the real federal funds rate. The green shaded region is the 70% confidence interval around the estimates of the natural rate. (Technical aside: Since the estimation procedure uses the Kalman filter, they compute these confidence intervals using Monte Carlo methods from Hamilton (1986) that account for both filter and parameter uncertainty.) The more commonly reported 90% or 95% confidence interval would of course be even wider, and would certainly include both 0% and 6% in 2000.
Source: Laubach and Williams 2001
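To make the distinction between filter and parameter uncertainty concrete, here is a minimal Python sketch--emphatically not Laubach and Williams' actual model--that treats an unobserved "natural rate" as a random walk observed with noise, filters it with a Kalman filter, and then layers Monte Carlo draws over plausible parameter values to form percentile bands. Every series and parameter range below is invented for illustration.

```python
# A minimal sketch (not the Laubach-Williams model): local-level Kalman filter
# plus Monte Carlo bands that combine filter and parameter uncertainty.
import numpy as np

rng = np.random.default_rng(0)

def kalman_filter(y, sig_obs, sig_state):
    """Local-level model: y_t = r_t + eps_t,  r_t = r_{t-1} + eta_t."""
    n = len(y)
    r_hat = np.zeros(n)   # filtered estimates of the unobserved rate
    P = np.zeros(n)       # filtered state variances
    r_prev, P_prev = y[0], 1.0
    for t in range(n):
        # predict step
        r_pred = r_prev
        P_pred = P_prev + sig_state ** 2
        # update step
        K = P_pred / (P_pred + sig_obs ** 2)
        r_hat[t] = r_pred + K * (y[t] - r_pred)
        P[t] = (1 - K) * P_pred
        r_prev, P_prev = r_hat[t], P[t]
    return r_hat, P

# Invented "observed real rate" series, for illustration only
T = 60
true_r = 2.5 + np.cumsum(rng.normal(0, 0.2, T))
y = true_r + rng.normal(0, 1.0, T)

# Monte Carlo over plausible parameter values (parameter uncertainty)...
draws = []
for _ in range(500):
    sig_obs = rng.uniform(0.5, 1.5)    # hypothetical range for observation noise
    sig_state = rng.uniform(0.1, 0.4)  # hypothetical range for state noise
    r_hat, P = kalman_filter(y, sig_obs, sig_state)
    # ...plus filter uncertainty: sample around each filtered estimate
    draws.append(rng.normal(r_hat, np.sqrt(P)))
draws = np.array(draws)

lo70, hi70 = np.percentile(draws, [15, 85], axis=0)  # 70% band
lo90, hi90 = np.percentile(draws, [5, 95], axis=0)   # 90% band is necessarily wider
print("width of 70% band at end of sample:", round(hi70[-1] - lo70[-1], 2))
print("width of 90% band at end of sample:", round(hi90[-1] - lo90[-1], 2))
```

Comparing the 70% and 90% band widths at the end of the toy sample makes the point in the text: the wider the coverage you ask for, the less you can rule out.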
The newer paper does not appear to provide confidence intervals or standard errors for the estimates of the natural rate. As the figure below shows, the decline in the point estimate is pretty steep, and this decline is robust to alternative assumptions made in the computation, but robustness and precision are not equivalent.
Source: Laubach and Williams 2015

Note the difference in y-axes on the two preceding figures. If you were to draw those green confidence bands from the older paper on the updated figure from the newer paper, they would basically cover the whole figure. In a "statistical significance" sense (three stars***!), we might not be able to say that the natural rate has fallen. (I can't be sure without knowing the standard errors of the updated estimates, but that's my guess given the width of the 70% confidence intervals on the earlier estimates, and my hunch that the confidence intervals for the newer estimates are even wider, because lots of confidence intervals got wider around 2008.)

I point this out not to say that these findings are insignificant. Quite the opposite, in fact. The economic significance of a decline in the natural rate is so large, in terms of policy implications and what it says about the underlying growth potential of the economy, that this result merits a lot of attention even if it lacks p<0.05 statistical significance. If anything, the profession more commonly errs in the other direction, overemphasizing statistical significance at the expense of economic significance.

Tuesday, October 13, 2015

Desire to Serve, Ability to Perform, and Courage to Act

Ben Bernanke’s new book, “The Courage to Act: A Memoir of a Crisis and Its Aftermath,” was released on October 5. When the title of the book was revealed in April, it apparently hit a few nerves. MarketWatch reported that “Not everyone has been enamored with either Bernanke or his book-titling skills,” listing representative negative reactions to the title from Twitter.

On October 7, Stephen Colbert began an interview of Bernanke by asking about his choice of title for the book, to which Bernanke responded, “I totally blame my wife, it was entirely her idea.”

I hope to comment more substantively on the book after I get a chance to read it, but for now, I just wanted to point out a fun fact about the title. The phrase “courage to act” is the third of three parts of the U.S. Air Force Fire Protection motto: “the desire to serve, the ability to perform, and the courage to act.”

Bernanke has made an explicit analogy between monetary policymakers in the crisis and fire fighters before. In a speech at Princeton in April 2014, he said, “In the middle of a big fire, you don’t start worrying about the fire laws. You try to get the fire out.” On his blog, Bernanke described a bill proposed by Senators Elizabeth Warren and David Vitter as “roughly equivalent to shutting down the fire department to encourage fire safety.” The appeal of the fire fighter analogy to technocratic policymakers with academic backgrounds must be huge. How many nerds’ dreams can be summed up by the notion of saving people from fire…with your brain!

Do we want our policymakers “playing fire fighter”? Ideally, they would be more like Smokey the Bear, preventing emergencies rather than responding to them. Anat Admati, among others, makes this point in her piece “Where’s the Courage to Act on Banks?” in which she argues that “banks need much more capital, specifically in the form of equity. In this area, the reforms engendered by the crisis have fallen far short.”

Air Force Fire Protection selected its motto by popular vote in 1980; it was nominated by Sergeant William J. Sawyers. A discussion of the new motto in the 1980 Fire Protection Newsletter reveals additional dimensions of the analogy, as well as its limits:
The motto signifies that the first prerequisite of a fire fighter is "the desire to serve." The fire fighter must understand that he is "serving" the public and there is no compensation which is adequate to reward the fire fighter for what they may ultimately give - their life. The second part of the motto is absolutely necessary if the fire fighter is to do the job and do it safely. "The ability to perform" signifies not only a physical and mental ability but also that knowledge is possessed which enables the fire fighter to accomplish the task. The final segment of the motto indicates that fire fighters must have an underlying "courage to act" even when they know what's at stake. To enter a smoke filled building not knowing what's in it or where the fire is, or whether the building is about to collapse requires "courage." To fight an aircraft fire involving munitions, pressure cylinders, volatile fuels, fuel tanks, and just about anything else imaginable requires "courage."
The tripartite Air Force Fire Protection motto emphasizes intrinsic motivation for public service and personal competence as prerequisites to courage. Indeed, in the Roman Catholic tradition, courage, or fortitude, is a cardinal virtue. But as St. Thomas Aquinas explains, fortitude ranks third among the cardinal virtues, behind prudence and justice. He writes that “prudence, since it is a perfection of reason, has the good essentially: while justice effects this good, since it belongs to justice to establish the order of reason in all human affairs: whereas the other virtues safeguard this good, inasmuch as they moderate the passions, lest they lead man away from reason's good. As to the order of the latter, fortitude holds the first place, because fear of dangers of death has the greatest power to make man recede from the good of reason.”

Courage alone, without prudence and justice, is akin to running into a burning building, literally or metaphorically. It may be either commendable or the height of recklessness. As we evaluate Bernanke’s legacy at the Fed, and the role of the Fed more generally, any appraisal of courage should be preceded by consideration of the prudence and justice of Fed actions.

The other nominated mottos are also interesting to consider in light of the Fed-as-fire-fighter analogy. Which of them could Bernanke have considered as book titles? The nominations include:
  • Let us know to let you know we care. 
  • Wherever flames may rage, we are there. 
  • Duty bound. 
  • To serve and preserve. 
  • To intercede in time of need. 
  • When no one else can do. 
  • Duty bound when the chips are down. 
  • For those special times. 
  • Forever vigilant
  • Honor through compassion and bravery.
  • To care to be there. 
  • Prepared for the challenge. 
  • Readiness is our profession.
  • To protect - to serve
  • Without fear and without reproach.
  • Fire prevention - our job is everyone's business
  • Support your fire fighters, we can't do the job alone.

Monday, September 21, 2015

Whose Expectations Augment the Phillips Curve?

My first economics journal publication is now available online in Economics Letters. This link provides free access until November 9, 2015. It is a brief piece (hence the letter format) titled "Whose Expectations Augment the Phillips Curve?" The short answer:
"The inflation expectations of high-income, college-educated, male, and working-age people play a larger role in inflation dynamics than do the expectations of other groups of consumers or of professional forecasters."

Sunday, September 6, 2015

Which Measure of Inflation Should a Central Bank Target?

"Various monetary proposals can be viewed as inflation targeting with a nonstandard price index: The gold standard uses only the price of gold, and a fixed exchange rate uses only the price of a foreign currency."
That's from a 2003 paper by Greg Mankiw and Ricardo Reis called "What Measure of Inflation Should a Central Bank Target?" At the time, the Federal Reserve had not explicitly announced its inflation target, though an emphasis on core inflation, which excludes volatile food and energy prices (the blue line in the figure below), arose under Alan Greenspan's chairmanship. Other central banks, including the Bank of England and the European Central Bank, instead focus on headline inflation. In 2012, the Fed formalized PCE inflation as its price stability target, but closely monitors core inflation as a key indicator of underlying inflation trends. Some at the Fed, including St. Louis Fed President James Bullard, have argued that the Fed should focus more on headline inflation (the red line) and less on core inflation (the blue line).

Source: FRED
Mankiw and Reis frame the issue more generally than just a choice between core and headline inflation. A price index assigns weights to prices in each sector of the economy. A core price index would put zero weight on food and energy prices, for example, but you could also construct a price index that put a weight of 0.75 on hamburgers and 0.25 on milkshakes, if that struck your fancy. Mankiw and Reis ask how a central bank can optimally choose these weights for the price index it will target in order to maximize some objective.
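As a toy illustration of that general point, the snippet below builds exactly such an arbitrary index: any set of nonnegative weights summing to one, applied to sectoral price relatives, defines a price index. The hamburger and milkshake prices are made up.

```python
# A toy "hamburger-milkshake" price index with the made-up weights from the text.
import numpy as np

prices = {
    "hamburgers": np.array([100, 102, 105, 104]),   # hypothetical price levels
    "milkshakes": np.array([100, 101, 99, 103]),
}
weights = {"hamburgers": 0.75, "milkshakes": 0.25}

# weighted index: sum of weights times price relatives, scaled to 100 in period 0
index = sum(w * prices[s] / prices[s][0] for s, w in weights.items()) * 100
inflation = 100 * (index[1:] / index[:-1] - 1)      # period-over-period, in percent
print("index:    ", np.round(index, 2))
print("inflation:", np.round(inflation, 2))
```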

In particular, they suppose the central bank's objective is to minimize the volatility of the output gap. They explain, "We are interested in finding the price index that, if kept on an assigned target, would lead to the greatest stability in economic activity. This concept might be called the stability price index." They model how the weight on each sector in this stability price index depends on certain sectoral properties: the cyclical sensitivity of the sector, the proclivity of the sector to experience idiosyncratic shocks, and the speed with which the prices in the sector can adjust.

The findings are mostly intuitive. If a particular sector's prices are very procyclical, do not experience large idiosyncratic shocks, or are very sticky, that sector should receive relatively large weight in the stability price index. Each of these characteristics makes the sector more useful from a signal-extraction perspective as an indicator of economic activity.

Next, Mankiw and Reis do a "back-of-the-envelope" exercise to calculate the weights for a stability price index for the United States using data from 1957 to 2001. They consider four sectors: food, energy, other goods and services, and nominal wages. They stick to four sectors for simplicity, but it would also be possible to include other prices, like gold and other asset prices; to the extent that these are relatively flexible and volatile, they would probably receive little weight. The inclusion of nominal wages is interesting, because the wage is the price of labor, not of a consumption good, so it gets a weight of zero in the consumer price index. But nominal wages are procyclical, not prone to idiosyncratic shocks, and sticky, so the result is that the stability price index weight on nominal wages is near one, while the other sectors get weights near zero. This finding is in line with other results, even those derived from very different models, about the optimality of including nominal wages in the monetary policy target.
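A quick way to see why wages come out on top is a simple signal-extraction toy, which is my own illustration rather than the Mankiw-Reis model: if each sector's price moves with the output gap plus an idiosyncratic shock, the minimum-variance linear estimate of the gap weights each sector by its cyclical sensitivity divided by its idiosyncratic variance. (Price stickiness, the third ingredient in the paper, is left out of this static toy.) The sensitivities and variances below are invented.

```python
# Signal-extraction toy: p_i = lambda_i * x + e_i, Var(e_i) = sigma_i^2.
# The minimum-variance linear estimate of x weights sector i by lambda_i / sigma_i^2,
# so cyclical, low-noise sectors (like wages, by assumption here) get more weight.
import numpy as np

lam = np.array([0.2, 0.2, 0.6, 1.0])      # cyclical sensitivities (made up)
sigma = np.array([2.0, 3.0, 0.8, 0.5])    # idiosyncratic shock std devs (made up)

raw = lam / sigma ** 2
weights = raw / raw.sum()
for name, w in zip(["food", "energy", "other goods/services", "nominal wages"], weights):
    print(f"{name:22s} weight = {w:.2f}")
```

With these invented numbers, the wage sector absorbs most of the weight, which is the flavor of the Mankiw-Reis result, though their calculation comes from a full model rather than this shortcut.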

More recently, Josh Bivens and others have proposed nominal wage targets for monetary policy, but they frame this as an alternative to unemployment as an indicator of labor market slack for the full employment component of the Fed's mandate. In Mankiw and Reis' paper, even a strict inflation targeting central bank with no full employment goal may want to use nominal wages as a big part of its preferred measure of inflation. (Since productivity growth drives a wedge between nominal wage growth and price inflation, a nominal wage target is not simply a relabeled inflation target.)

If we leave nominal wages out of the picture, the results provide some justification for a focus on core, rather than headline, inflation. Namely, food and energy prices are very volatile and not very sticky. Note, however, that the paper assumes that the central bank has perfect credibility, and can thus achieve whatever inflation target it commits to. In Bullard's argument against a focus on core inflation, he implicitly challenges this assumption:
"One immediate benefit of dropping the emphasis on core inflation would be to reconnect the Fed with households and businesses who know price changes when they see them. With trips to the gas station and the grocery store being some of the most frequent shopping experiences for many Americans, it is hardly helpful for Fed credibility to appear to exclude all those prices from consideration in the formation of monetary policy."
Bullard's concern is that because food and energy prices are so visible and salient for consumers, they might play an outsized role in perceptions and expectations of inflation. If the Fed holds core inflation steady while gas prices and headline inflation rise, inflation expectations might rise substantially, become unanchored, and feed back into core inflation. The evidence on whether this has been a real concern in recent years is mixed. I'm doing some work on this topic myself, and hope to share results soon.
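To see how that feedback could work, here is a toy simulation of my own--not Bullard's argument or any estimated model: household expectations put weight w on last period's salient headline inflation, core inflation partly follows expectations, and a one-time gas-price shock hits headline. With w = 0 (anchored expectations) core never moves; with w > 0 the shock bleeds into core and decays only gradually. All parameters are invented.

```python
# Toy expectations-feedback loop: headline salience feeds into core inflation.
import numpy as np

def simulate_core(w, periods=12, target=2.0, energy_shock=3.0):
    """Expectations put weight w on lagged headline inflation; core follows expectations."""
    core, headline = [target], [target]
    for t in range(1, periods):
        expected = (1 - w) * target + w * headline[-1]   # expectation formation
        new_core = 0.5 * core[-1] + 0.5 * expected       # core partly follows expectations
        energy = energy_shock if t == 1 else 0.0         # one-time gas-price spike
        core.append(new_core)
        headline.append(new_core + energy)
    return np.array(core)

print("core inflation, anchored (w=0.0):  ", np.round(simulate_core(0.0), 2))
print("core inflation, unanchored (w=0.6):", np.round(simulate_core(0.6), 2))
```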

As an aside to my students in Senior Research Seminar, I highly recommend the Mankiw and Reis paper as an example of how to write well, especially if you plan to do a theoretical thesis.

Note: The description of the Federal Reserve's inflation target in the first paragraph of this post was edited for accuracy on September 25.

Sunday, August 30, 2015

False Discoveries and the ROC Curves of Social Science

Diagnostic tests for diseases can suffer from two types of errors. A type I error is a false positive, and a type II error is a false negative. The sensitivity or true positive rate is the probability that a test result will be positive when the disease is actually present. The specificity or true negative rate is the probability that a test result will be negative when the disease is not actually present. Different choices of diagnostic criteria correspond to different combinations of sensitivity and specificity. A more sensitive diagnostic test could reduce false negatives, but might increase the false positive rate. Receiver operating characteristic (ROC) curves are a way to visually present this tradeoff by plotting true positive rates or sensitivity on the y-axis and false positive rates (100%-specificity) on the x-axis.


As the figure shows, ROC curves are upward sloping-- diagnosing more true positives typically means also increasing the rate of false positives. The curve goes through (0,0) and (100,100): it is possible either to diagnose nobody as having the disease and get a 0% true positive rate and 0% false positive rate, or to diagnose everyone as having the disease and get a 100% true positive rate and 100% false positive rate. The further an ROC curve lies above the 45 degree line, the better the diagnostic test, because for any level of false positives you get a higher level of true positives.
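For anyone who wants to see the mechanics, here is a small numerical sketch of how an ROC curve gets traced out: sweep a diagnostic threshold across test scores and record the true positive and false positive rates at each cutoff. The scores and disease labels are simulated for illustration.

```python
# Trace out an ROC curve by sweeping a diagnostic threshold over simulated scores.
import numpy as np

rng = np.random.default_rng(42)
n = 1000
disease = rng.random(n) < 0.3                      # true disease status (30% prevalence)
# diseased patients tend to score higher on an imperfect diagnostic test
score = rng.normal(loc=np.where(disease, 1.0, 0.0), scale=1.0)

thresholds = np.linspace(score.min(), score.max(), 50)
tpr = [(score[disease] >= t).mean() for t in thresholds]   # sensitivity
fpr = [(score[~disease] >= t).mean() for t in thresholds]  # 1 - specificity

# print a few points along the curve: lowering the threshold raises both rates
for t, tp, fp in zip(thresholds[::10], tpr[::10], fpr[::10]):
    print(f"threshold {t:5.2f}: true positive rate {tp:.2f}, false positive rate {fp:.2f}")
```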

Rafa Irizarry at the Simply Statistics blog makes a really interesting analogy between diagnosing disease and making scientific discoveries. Scientific findings can be true or false, and if we imagine that increasing the rate of important true discoveries also increases the rate of false positive discoveries, we can plot ROC curves for scientific disciplines. Irizarry imagines the ROC curves for biomedical science and physics (see the figure below). Different fields of research vary in the position and shape of the ROC curve--what you can think of as the production possibilities frontier for knowledge in that discipline-- and in the position on the curve.

In Irizarry's opinion, physicists make fewer important discoveries per decade and also fewer false positives per decade than biomedical scientists. Given the slopes of the curves he has drawn, biomedical scientists could make fewer false positives, but at a cost of far fewer important discoveries.

Source: Rafa Irizarry
A particular scientific field could move along its ROC curve by changing the field's standards regarding peer review and replication, changing norms regarding significance testing, etc. More critical review standards for publication would be represented by a shift down and to the left along the ROC curve, reducing the number of false findings that would be published, but also potentially reducing the number of true discoveries being published. A field could shift its ROC curve outward (good) or inward (bad) by changing the "discovery production technology" of the field.

The importance of discoveries is subjective, and we don't really know the number of "false positives" in any field of science; some are never detected. But lately, evidence of fraudulent or otherwise irreplicable findings in political science and psychology points to potentially high false positive rates in the social sciences. A few days ago, Science published an article on "Estimating the Reproducibility of Psychological Science." From the abstract:
We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects.
As studies of this type hint that the social sciences may be far to the right along an ROC curve, it is interesting to try to visualize the shape of the curve. The physics ROC curve that Irizarry drew is very steep near the origin, so an attempt to reduce false positives further would, in his view, sharply reduce the number of important discoveries. Contrast that to his curve for biomedical science. He indicates that biomedical scientists are on a relatively flat portion of the curve, so reducing the false positive rate would not reduce the number of important discoveries by very much.

What does the shape of the economics ROC curve look like in comparison to those of other sciences, and where along the curve are we? What about macroeconomics in particular? Hypothetically, if we have one study that discovers that the fiscal multiplier is smaller than one, and another study that discovers that the fiscal multiplier is greater than one, then one study is an "important discovery" and one is a false positive. If these were our only two macroeconomic studies, we would be exactly on the 45 degree line with perfect sensitivity but zero specificity.

Thursday, August 6, 2015

Macroeconomics Research at Liberal Arts Colleges

I spent the last two days at the 11th annual Workshop on Macroeconomics Research at Liberal Arts Colleges at Union College. The workshop reflects the growing emphasis that liberal arts colleges place on faculty research. There were four two-hour sessions of research presentations--international, banking, information and expectations, and theory--in addition to breakout sessions on pedagogy. I presented my research in the information and expectations session.

I definitely recommend this workshop to other liberal arts macro professors. The end of summer timing was great. I got to think about how to prioritize my research goals before the semester starts and to hear advice on teaching and course planning from a lot of really passionate teachers. It was very encouraging to witness how many liberal arts college professors at all stages of their careers have maintained very active research agendas while also continually improving in their roles as teachers and advisors.

After dinner on the first day of the workshop, there was a panel discussion about publishing with undergraduates. I also attended a pedagogy session on advising undergraduate research. Many of the liberal arts colleges represented at the workshop have some form of a senior thesis requirement. A big part of the discussion was how to balance the emphasis on "product vs. process" for undergraduate research. In other words, how active a role should a faculty member take in trying to ensure a high-quality final senior thesis, versus ensuring that various learning goals are met? And what should those learning goals be? Some possibilities include helping students decide whether they want to go to grad school, and teaching independence, writing skills, econometric techniques, and the ability to form an economic argument. Relatedly, how should grades or honors designations reflect the final product and the learning goals that are emphasized?

We also discussed the relative merits of helping students publish their research, either in an undergraduate journal or a professional journal. There was considerable uncertainty about how very low-ranked publications with undergraduate coauthors affect an assistant professor's tenure case, and a general desire for more explicit guidelines about whether such work is considered a valuable contribution.

These discussions of research by or with undergraduates left me really curious to hear about others' experiences doing or supervising undergraduate research. I'd be very happy to feature some examples of research with or by undergraduates as guest posts. Send me an email if you're interested.

At least two other conference participants have blogs, and they are definitely worth checking out. Joseph Joyce of Wellesley blogs about international finance at "Capital Ebbs and Flows." Bill Craighead of Wesleyan blogs at "Twenty-Cent Paradigms." Both have recent thoughtful commentary on Greece.