
Crowding-in and Rapid Growth in the 1990s: Dean Baker Gets One Wrong, I Think


Dean Baker writes:

Can We Cut the Crap on Robert Rubin and Deficit Reduction?: Ezra Klein… feed[s] this myth when he tells us of the great virtue of deficit reduction in the Clinton years.

Back in the 1990s, we knew why we feared deficits. They raised interest rates and “crowded out” private borrowing. This wasn’t an abstract concern. In 1991, the interest rate on 10-year Treasurys was 7.86 percent. That meant the interest rate for private borrowing was, for the most part, much higher, choking off investment and economic growth. Enter Clintonomics. The theory was simple: Bring down deficits, and you’d bring down interest rates. Bring down interest rates, and you’d make it easier for the private sector to invest and grow. Make it easier for the private sector to invest and grow, and the economy would boom. The theory was correct. By the end of Clinton’s term, the interest rate on 10-year Treasurys had fallen to 5.26 percent — lower than it had been in 30 years. And the economy was, indeed, booming. 'The deficit reduction increased confidence, helped bring interest rates down, and that, in turn, helped generate and sustain the economic recovery, which, in turn, reduced the deficit further,' Treasury Secretary Robert Rubin said in 1998.

Okay, fans of intro economics know that it is the real interest rate -- the difference between the nominal interest rate and the inflation rate -- that matters for investment, not the nominal interest rate. The inflation rate in the first half of 1991 was over 5.0 percent. This means that the real interest rate -- the rate that all economists understand is relevant for growth -- was around 2.5 percent… [In] the last half year of the Clinton administration (and not some cherry-picked low point) the interest rate on 10-year Treasury bonds averaged around 5.7 percent. The inflation rate for the second half of 2000 averaged around 3.5 percent. This gives us a real interest rate of 2.2 percent (5.7 percent minus 3.5 percent equals 2.2 percent). So we are supposed to believe that the difference between the 2.5 percent real interest rate in the high-deficit pre-Clinton years and the 2.2 percent real interest rate at the end of the Clinton years is the difference between the road to hell and the path to prosperity?

The actual inflation rate in 1991 was 5%/year, but the expected inflation rate over the next decade was more like 3%/year. So we are not talking about a 0.3 percentage-point decline in real interest rates, but rather about a 2.3 percentage-point decline in real interest rates. Moreover, back in 1992, when we unwound the yield curve and projected interest rates into the future, we saw nominal interest rates as highly likely to rise unless the deficit was substantially reduced. The 2000 we were looking forward to had forecast nominal interest rates not of 7.86%/year but of 10%/year or so--a real interest rate of 7%/year.

The counterfactual for 2000 is thus different not by 0.3 percentage points but by 4.8 percentage points. That is a much bigger deal.
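The arithmetic in the last few paragraphs can be checked in a few lines of Python. This is just a sketch: the `real_rate` helper and the rounding are mine; the figures are the ones quoted above.

```python
# Arithmetic sketch of the real-interest-rate comparison in the post.
# All percentage figures are the ones quoted above; the helper is hypothetical.

def real_rate(nominal_pct, expected_inflation_pct):
    """Real interest rate = nominal rate minus (expected) inflation, in %/year."""
    return round(nominal_pct - expected_inflation_pct, 2)

# Baker's comparison, using realized inflation in 1991:
baker_1991 = real_rate(7.86, 5.3)    # roughly the 2.5 percent he cites
actual_2000 = real_rate(5.7, 3.5)    # 2.2 percent, second half of 2000

# The counterfactual: forecast nominal rates of ~10%/year under unchanged
# deficits, against expected inflation of ~3%/year.
counterfactual_2000 = real_rate(10.0, 3.0)   # 7.0 percent real

gap = round(counterfactual_2000 - actual_2000, 2)
print(gap)  # 4.8 -- the "4.8 percentage-points" difference in the text
```

The point the numbers make: the relevant comparison is not realized 1991 real rates against realized 2000 real rates, but realized 2000 real rates against the no-deficit-reduction counterfactual.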

How big a deal? Enough to boost the growth rate of potential output by between 0.5 and 1.0 percentage points per year, in my estimation…