Paul Krugman has been somewhat cavalier lately, which is surprising, given his history of defending good economic sense in the hotly debated issues of international trade and the role of government in ensuring economic stability. More to the point, his recent hit piece for the NYT attributes the putative "failure of economics" to predict and respond to the financial crisis of 2008Q4 to an unhealthy weight given to "mathematical elegance" in economic modeling. A school of physicists peeking their heads into economics, led by Eric Weinstein and his theories of gauge invariance as a foundational description of inflation, cried foul to the skies, arguing that it's mathematical stagnation, not excess, that has been hurting economics as a field.

I'll defer to Weinstein on the actual elegance of the current state of mainstream mathematical economics, centered on the ramifications of the Arrow-Hahn-Debreu (AHD) strain of general equilibrium theory. His proposals for extending the current technical apparatus are beyond my present ability to evaluate fairly. I will, however, nitpick at this article by Lee "inflation" Smolin, the cosmologist, who has recently written up an appraisal of AHD theory and the proposed weinsteinian (gauge-theoretical) extensions:

"At the very least, the neoclassical theory of general equilibrium establishes that economics is one of the mathematical sciences. It sets a standard for clarity, rigor and generality that alternative approaches to economics must aim to live up to. This theory is a model that mathematical scientists in other disciplines should be interested to study and understand, because there is a serious claim it has succeeded in capturing something true in a simple mathematical structure, not about quarks or gravity waves, but about human beings."

Mathematical economics is a force to be reckoned with, not to be taken lightly even from the mighty technical towers of contemporary physics. The real issue lies elsewhere, though: at no recent point has Krugman attacked AHD theory by name, and his conception of mathematical economics is closer to the popular idea of, uh, mathematics applied to economics than to that of foundational theorists of choice and production.

This is the deal: for the longest while, economics has been not one science but two: microeconomics and macroeconomics. Both micro and macro have had their respective pure and applied varieties, and there has been debate in the realm of pure theory with varying degrees of impact on the applied problems.

Foundational debates in microeconomics have ranged from the treatment of uncertainty (some economists of mathematical extraction, following Frank Knight, preferred nonprobabilistic theories of nondeterminism in stark contrast to probabilistic theories of risk) to the very goal of microeconomic modeling. This is why I have emphasized the AHD research programme, which emphasizes economic kinematics (existence of, stability of and convergence to equilibria), as opposed to the paretian programme (which emphasizes social welfare issues, and to which Kenneth Arrow himself made a major contribution with his devastating Impossibility Theorem) or the hicksian programme, which emphasized the problems of capital and production.

Applied problems in microeconomics, on the other hand, have stemmed from the statistical difficulties of dealing with the full general equilibrium monty. On the one hand, individual demand functions are not summable in general without losing their relationship with ordinal choice theory – which means the concept of "market demand" as a well-behaved mapping that can be estimated through the methods of econometrics is rather inconsistent. On the other hand, various "observational equivalence" theorems have been proven, which means that observable economic equilibria (assuming we're observing phenomena at equilibrium to begin with) might correspond to different general equilibrium configurations. On the gripping hand, it's very hard to estimate general equilibrium models from the data with any kind of certainty, which is why most applied microeconometrics imposes "partial equilibrium hypotheses", and applied work is really about estimating incomplete models of sub-economies in a way that's as consistent with ordinal choice theory as possible.

Now, this looks like a stark picture, but applied micro has been at large very successful. Market intelligence and operations research departments use microeconometrics widely (I happen to work in a consulting shop), and even the first-order approximations of sensitivity parameters (the so-called "elasticities") are very useful and consistently predictive as rules of thumb. Microeconometric predictions of mid-term demand are used by large companies to map out investment plans and argue with regulators. This is the economics I work with, and this is the economics that has recently become popular in books such as "Freakonomics". (We have a funny but smart tool called "instrumental variables" that helps sort out causation from correlation in surprising ways, and this has caused a so-called "imperialist expansion" from economics/econometrics into sociology and political science in ways that are still half-resented.)
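To make the elasticity rule of thumb concrete: in a log-log demand specification, the slope coefficient is directly the price elasticity. Here is a minimal sketch using only numpy and simulated data – all numbers are invented for illustration, nothing here is estimated from any real market.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic demand data: log q = a + e * log p + noise,
# with an assumed "true" price elasticity e = -1.5 (illustrative only).
true_elasticity = -1.5
log_p = rng.uniform(0.0, 2.0, size=500)
log_q = 3.0 + true_elasticity * log_p + rng.normal(0.0, 0.1, size=500)

# OLS on the log-log specification: the slope is the elasticity,
# i.e. the percentage change in quantity per percent change in price.
X = np.column_stack([np.ones_like(log_p), log_p])
coef, *_ = np.linalg.lstsq(X, log_q, rcond=None)
intercept, elasticity = coef
```

In real applied work price is rarely exogenous like this – which is exactly where the instrumental-variables machinery mentioned above comes in.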

Then there's macro. The standard textbook will say that macroeconomics started with Johnny Keynes' "General Theory of Money and Everything", but the main themes of modern macro have been at play at least since Wicksell – to the point that the smart crowd has pointed out that under the Greenspan doctrine we reverted to full neo-wicksellianism. Of course, before Keynes sorted out the good ideas in Malthusian economics from its primitive understanding of dynamical systems, the business cycle hadn't really been understood – not on a macro level, at least. His book is actually called "The General Theory of Employment, Interest and Money" because he thought that the foundational problems of macro had been laid to rest by the theory of effective demand, and that, quote-unquote, one day economists would be thought of as humble, competent people on a level with dentists.

The foundational debates raged on, not only because pure keynesianism had overlooked a few important things – among them a proper monetary theory that went beyond general platitudes, an actual step backwards from Wicksell – but because of deep ideological constraints and the short-lived applicability of applied macro built on keynesian foundations. I won't go any deeper into the futile debates between the likes of Franco Modigliani, Milton Friedman, James Tobin et al., so as not to tax the patience of the uninterested reader.

The real problem with "foundational macro" as a separate discipline is that the predictive power of its applied counterpart – which is what is being contested, after all – was short-lived. It depended on threadbare behavioural parameters (believe it or not, there were long-lived debates on the affine "consumption function", ignoring the fact that the idea is entirely inconsistent with foundational and applied micro alike) and on "adaptive expectations" assumptions (learning processes for aggregate economic agents – which don't exist, by virtue of many theorems of foundational micro). Even where these assumptions were usable, their parameters are subject to periodic structural breaks from external shocks, which happened over and over until the models broke down completely under the 1970s stagflation regime triggered by the nonlinear chain of transmission of high oil prices.
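The adaptive-expectations rule is simple enough to state in a few lines: agents revise their forecast by a fixed fraction of their last forecast error. This toy sketch (parameter values invented, not taken from any model) also shows the fragility just described – a structural break leaves the rule persistently wrong for many periods.

```python
def adaptive_expectation(history, lam=0.5, initial=0.0):
    """Forecast sequence under adaptive expectations:
    each period, revise by a fraction lam of the last forecast error."""
    forecast = initial
    forecasts = []
    for realized in history:
        forecasts.append(forecast)
        forecast = forecast + lam * (realized - forecast)  # error correction
    return forecasts

# Under a stable regime the forecast converges to actual inflation...
stable = adaptive_expectation([2.0] * 20)

# ...but after a structural break (say, an oil shock jumping inflation
# from 2% to 12% at period 10) the rule lags badly for several periods.
shocked = adaptive_expectation([2.0] * 10 + [12.0] * 10)
```

The rational-expectations critique of the 1970s was aimed precisely at this kind of mechanical, backward-looking forecasting.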

The utter failure of "macroeconometrics" prompted deep soul-searching from within the profession for the "microfoundations of macroeconomics" – the unifying links between foundational micro and applied macro. It's this process that Krugman bemoans. Proposals have ranged from always-in-equilibrium simulation models to so-called "refinements" of standard micro that raise ad hoc exceptions (for example, efficiency wage theory) in order to yield the results of classical keynesianism while resting on the architecture of foundational micro.

This deep soul-searching has borne some fruit in what economists call "development theory", but has yielded little in what was the main programme of applied macro in the classical keynesian sense – we claim to have tamed the business cycle, but apart from what amount to microplatitudes (ranging from the neo-austrian to the "new keynesian" à la Olivier Blanchard) we have nothing of the sort, and nothing with the scope and ambition of 1960s applied macroeconometrics. Most applied macro nowadays rests on a silver bullet called the vector autoregression (VAR) model, which is something of a way of dodging responsibilities, both those of Big Macroeconometrics and those of the microfoundations programme.
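The "dodging" is visible in the mechanics: a VAR simply regresses today's vector of macro variables on its own lags, with no behavioural or theoretical restrictions imposed. A minimal VAR(1) sketch with numpy, on data simulated from an invented transition matrix (intercepts and extra lags omitted for brevity; real applications include them):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed "true" transition matrix, purely for illustration.
A_true = np.array([[0.5, 0.1],
                   [0.0, 0.8]])

# Simulate y_t = A y_{t-1} + e_t for two macro series.
T = 2000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(0.0, 0.1, size=2)

# Estimation is just equation-by-equation OLS of y_t on y_{t-1}:
# no economic theory enters anywhere.
Y, X = y[1:], y[:-1]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Forecasting then amounts to iterating A_hat forward from the last state.
forecast = A_hat @ y[-1]
```

That atheoretical convenience is exactly what makes the VAR useful for forecasting and useless as a verdict on foundational debates.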

All of this has little to do with mathematical elegance. It's a matter of whether "foundational macroeconomics" is a worthy research programme in and of itself, or whether the better part of credibility and policy-making should be ceded to whatever applied results the soul-searching for "microfoundations" begets. Somehow the microfoundations programme gets mixed up with deep mathematics by macroeconomists who haven't done much work in mathematical economics (i.e. foundational micro), but this is a red herring meant to fool the public, and it shouldn't fool the mathematically educated – who have much to contribute to foundational micro. But it has.
