Below, I elaborate on two claims.

1. Macroeconometrics requires the imposition of prior beliefs.

2. If one has diffuse prior beliefs, the correct macroeconomic policies are not easily formulated.

The proof that macroeconometrics requires strong prior beliefs is quite simple.

1. Macroeconomics in theory is a set of simultaneous equations. There are equations for interest rates, consumer spending, investment spending, and so on. What I call M70, the macroeconomic thinking of around 1970, was embodied in models with over one hundred equations.

2. Simultaneous equation systems face what is known as the "identification problem." For example, in microeconomics, if you look at observations of price and quantity, you do not know whether you are observing movements along a demand curve, along a supply curve, or a combination of both. To identify what you are looking at, you need at least one exogenous variable for each endogenous variable. Each exogenous variable can plausibly be excluded from one equation and included in another. For example, the cost of an input can plausibly be excluded from the demand curve and included in the supply curve. The exogenous variable allows you to identify shifts in the curves and to estimate the system. Thus, if there are r equations, you need at least r exogenous variables.
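The identification argument can be illustrated with a small simulation. This is only a sketch in Python/NumPy with invented parameters, not any actual model: regressing quantity on price alone mixes the two curves, while using the cost shifter as an instrument recovers the demand slope.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical structural model (parameters invented for illustration):
#   demand: q = 10 - 1.0*p + u_d
#   supply: q =  2 + 0.5*p - 0.8*c + u_s   (c = input cost, exogenous)
c = rng.normal(5.0, 1.0, n)    # cost shifter: excluded from the demand curve
u_d = rng.normal(0.0, 1.0, n)  # demand shock
u_s = rng.normal(0.0, 1.0, n)  # supply shock

# Solve the two equations for the equilibrium price and quantity.
p = (10.0 - 2.0 + 0.8 * c + u_d - u_s) / 1.5
q = 10.0 - 1.0 * p + u_d

# OLS of quantity on price confuses the two curves: p is correlated with u_d.
X = np.column_stack([np.ones(n), p])
b_ols = np.linalg.lstsq(X, q, rcond=None)[0]

# IV: use the cost shifter c as an instrument for p in the demand equation.
Z = np.column_stack([np.ones(n), c])
b_iv = np.linalg.solve(Z.T @ X, Z.T @ q)

print("OLS slope:", round(b_ols[1], 2))  # biased well away from -1.0
print("IV slope: ", round(b_iv[1], 2))   # close to the true -1.0
```

Without the excluded cost variable there would be no valid instrument, and the demand slope would be unrecoverable from price and quantity data alone.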

3. On the other hand, if you have k exogenous variables, you need the number of observations, n, to be larger than k. If you have more equations than observations, then you necessarily have more exogenous variables than observations, and the matrix will not invert.
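The dimension problem in point 3 is easy to see numerically. In this NumPy sketch (with arbitrary sizes), having more regressors than observations makes the moment matrix that least squares must invert rank-deficient:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 20, 30   # arbitrary sizes: more exogenous variables than observations

X = rng.normal(size=(n, k))   # n observations of k regressors
XtX = X.T @ X                 # the k-by-k moment matrix OLS must invert

# The rank of XtX cannot exceed n, so with k > n it is singular:
# no unique least-squares solution exists.
print("shape:", XtX.shape, "rank:", np.linalg.matrix_rank(XtX))
```

However many equations a model has, estimation requires the data to outnumber the regressors, which is exactly why hundred-equation systems strain the available macroeconomic time series.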

4. In practice, macroeconometrics only works through the imposition of priors. In fact, M70 models are estimated in single-equation format, which implicitly imposes a prior belief that the equations are not truly simultaneous. In vector autoregressions (VARs), the prior belief is that a small subset of variables will summarize the entire system. VARs exclude most of the endogenous variables and all of the exogenous variables used in M70 models.
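The single-equation point can be made concrete with a toy two-variable VAR. This NumPy sketch uses invented coefficients; each equation is estimated separately by least squares on lags of the included variables, and nothing outside that small set enters:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 500

# Hypothetical two-variable VAR(1), y_t = A @ y_{t-1} + e_t,
# with made-up coefficients (think: output growth and an interest rate).
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.4]])
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(0.0, 0.1, 2)

# Estimation is one equation at a time: each variable is regressed on
# one lag of *all* included variables -- and on nothing else.
Y, X = y[1:], y[:-1]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

print(np.round(A_hat, 2))   # close to A_true
```

The prior belief is baked into the setup: whatever variables are left out of `y` are assumed not to matter for the system's dynamics.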

Overall, macroeconomics is problematic for those with diffuse priors. One can observe a Scott Sumner who is totally convinced that monetary expansion now is warranted. And one can observe a John Taylor who is equally adamant that monetary expansion now is not warranted. And, of course, one can observe Prad Krulong adamantly arguing that we are in a liquidity trap that requires more fiscal stimulus. Not to mention those of us who have doubts about the whole paradigm of aggregate supply and demand.

What should a policy maker do? In my view, a fiscal expansion is high risk, given the long-term debt outlook. A monetary expansion is lower risk. If we get more inflation, well, we were going to get it sooner or later, given the fiscal outlook. The worst case is that we get it sooner rather than later.

But, of course, my priors lean to the view that neither fiscal nor monetary expansion will do much good.

The implication, in the context of your other posts, is that model-building is futile rather than merely "hard." But even if you have diffuse priors, you should be able to identify a subset of variables from which to build a system of equations that provides predictions within an acceptable margin of error.

The acceptable margin might actually be quite high. Given that policy will be written anyway and sometimes dramatic steps will be taken (or neglected), "better than blind guessing" might actually be the margin at which a model has an absolute advantage over silence.

"Good enough for me to proclaim to my friends that people who disagree with us are moral monsters" is probably the relevant margin of error for Paul Krulong, and I suspect, for Murray Rothbard too.

There is nothing unique to macro about this. Micro is all about partial equilibrium analysis that has the same flaws. You may believe the assumptions are more reasonable and what you neglect is unimportant, but these are still beliefs. There is a real world out there though, which if the models are seriously flawed, will show up in the data. It may not be fast, flawless, or foolproof, but time really does tell eventually.

Are the constructs measurable? Are the relationships stable over time? What happens when you make an endogenous variable the focus of discretionary policy? And just who is the wizard of Oz, free of external constraints, who spins the policy dials?

To me the evidence is overwhelming that while the government can crash an economy, the government cannot fine tune the economy to smooth out the business cycle. And this has become more true with the passage of time.

@ Charles R. Williams

I think I agree with what you said about the downturn. I think, however, that lifting restrictions on land use, capital requirements, etc., during the previous downturn can make the next downturn less severe, while increasing them will make the next one worse.

Actually, the worst-case scenario is that the monetary pump prevents the economy from adjusting as it needs to and sets us up for a very quick depression in a couple of years. Money pumping does more than raise prices; it unbalances the ratio of capital goods to consumer goods production. The correction of that ratio to the demands of consumers is what causes depressions.

The population can read the equations and hedge itself favorably. That adds a new equation to the system, with a fallacy of composition to figure in: not everybody can hedge favorably at once.

Looking for invariants is more fruitful: for example, you cannot consume more than people are producing. Then admit only prices consistent with that invariant.

In effect, the supply and demand curves are backwards as to causality.

The exposition of economics ought to be working out the consequences of invariants. Then even the general reader would understand it.