In Jacobin: A Demystifying Decade for Economics

(The new issue of Jacobin has a piece by me on the state of economics ten years after the crisis. The published version is here. I’ve posted a slightly expanded version below. Even though Jacobin was generous with the word count and Seth Ackerman’s edits were as always superb, they still cut some material that, as king of the infinite space of this blog, I would rather include.)


For Economics, a Demystifying Decade

Has economics changed since the crisis? As usual, the answer is: It depends. If we look at the macroeconomic theory of PhD programs and top journals, the answer is clearly, no. Macroeconomic theory remains the same self-contained, abstract art form that it has been for the past twenty-five years. But despite its hegemony over the peak institutions of academic economics, this mainstream is not the only mainstream. The economics of the mainstream policy world (central bankers, Treasury staffers, Financial Times editorialists), only intermittently attentive to the journals in the best times, has gone its own way; the pieties of a decade ago have much less of a hold today. And within the elite academic world, there’s plenty of empirical work that responds to the developments of the past ten years, even if it doesn’t — yet — add up to any alternative vision.

For a socialist, it’s probably a mistake to see economists primarily as either carriers of valuable technical expertise or systematic expositors of capitalist ideology. They are participants in public debates just like anyone else. The profession as a whole is more often found trailing after political developments than advancing them.


The first thing to understand about macroeconomic theory is that it is weirder than you think. The heart of it is the idea that the economy can be thought of as a single infinite-lived individual trading off leisure and consumption over all future time. For an orthodox macroeconomist – anyone who hoped to be hired at a research university in the past 30 years – this approach isn’t just one tool among others. It is macroeconomics. Every question has to be expressed as finding the utility-maximizing path of consumption and production over all eternity, under a precisely defined set of constraints. Otherwise it doesn’t scan.

This approach is formalized in something called the Euler equation, which is a device for summing up an infinite series of discounted future values. Some version of this equation is the basis of most articles on macroeconomic theory published in a mainstream journal in the past 30 years. It might seem like an odd default, given the obvious fact that real economies contain households, businesses, governments and other distinct entities, none of whom can turn income in the far distant future into spending today. But it has the advantage of fitting macroeconomic problems — which at face value involve uncertainty, conflicting interests, coordination failures and so on — into the scarce-means-and-competing-ends Robinson Crusoe vision that has long been economics’ home ground.
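For readers curious what this object actually looks like, here is the standard textbook consumption version (my notation; particular papers dress it up in various ways):

```latex
% The representative agent chooses a consumption path c_t to maximize
% expected discounted utility over infinite time:
\max_{\{c_t\}} \; \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^t\, u(c_t)
% The first-order condition of this problem is the consumption Euler
% equation: marginal utility today equals the discounted, return-weighted
% expected marginal utility tomorrow.
u'(c_t) = \beta\,(1+r_{t+1})\; \mathbb{E}_t\, u'(c_{t+1})
```

Chaining this condition forward period by period is what pins down the entire infinite path of consumption — which is why spending today ends up depending on income in the far distant future.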

There’s a funny history to this technique. It was invented by Frank Ramsey, a young philosopher and mathematician in Keynes’ Cambridge circle in the 1920s, to answer the question: If you were organizing an economy from the top down and had to choose between producing for present needs versus investing to allow more production later, how would you decide the ideal mix? The Euler equation offers a convenient tool for expressing the tradeoff between production in the future versus production today.
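Ramsey’s planner problem, in a schematic modern notation rather than his own, looks like this:

```latex
% The planner chooses the consumption path c(t) to maximize discounted
% utility, subject to the constraint that output f(k) not consumed
% (net of depreciation delta) accumulates as capital k:
\max_{c(t)} \int_0^{\infty} e^{-\rho t}\, u(c(t))\, dt
\quad \text{s.t.} \quad \dot{k}(t) = f(k(t)) - c(t) - \delta k(t)
```

The Euler equation is precisely the condition describing the optimal tradeoff along this path: how much present consumption to give up for the extra future production that investment makes possible.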

This makes sense as a way of describing what a planner should do. But through one of those transmogrifications intellectual history is full of, the same formalism was picked up and popularized after World War II by Solow and Samuelson as a description of how growth actually happens in capitalist economies. The problem of macroeconomics has continued to be framed as how an ideal planner should direct consumption and production to produce the best outcomes for everyone, often with the “ideal planner” language intact. Pick up any modern economics textbook and you’ll find that substantive questions can’t be asked except in terms of how a far-sighted agent would choose this path of consumption as the best possible one allowed by the model.

There’s nothing wrong with adopting a simplified formal representation of a fuzzier and more complicated reality. As Marx said, abstraction is the social scientist’s substitute for the microscope or telescope. But these models are not simple by any normal human definition. The models may abstract away from features of the world that non-economists might think are rather fundamental to “the economy” — like the existence of businesses, money, and government — but the part of the world they do represent — the optimal tradeoff between consumption today and consumption tomorrow — is described in the greatest possible detail. This combination of extreme specificity on one dimension and extreme abstraction on the others might seem weird and arbitrary. But in today’s profession, if you don’t at least start from there, you’re not doing economics.

At the same time, many producers of this kind of model do have a quite realistic understanding of the behavior of real economies, often informed by first-hand experience in government. The combination of tight genre constraints and real insight leads to a strange style of theorizing, where the goal is to produce a model that satisfies the conventions of the discipline while arriving at a conclusion that you’ve already reached by other means. Michael Woodford, perhaps the leading theorist of “New Keynesian” macroeconomics, more or less admits that the purpose of his models is to justify the countercyclical interest rate policy already pursued by central banks in a language acceptable to academic economists. Of course the central bankers themselves don’t learn anything from such an exercise — and you will scan the minutes of Fed meetings in vain for discussion of first-order ARIMA technology shocks — but they presumably find it reassuring to hear that what they already thought is consistent with the most modern economic theory. It’s the economic equivalent of the college president in Randall Jarrell’s Pictures from an Institution:

About anything, anything at all, Dwight Robbins believed what Reason and Virtue and Tolerance and a Comprehensive Organic Synthesis of Values would have him believe. And about anything, anything at all, he believed what it was expedient for the president of Benton College to believe. You looked at the two beliefs, and lo! the two were one. Do you remember, as a child without much time, turning to the back of the arithmetic book, getting the answer to a problem, and then writing down the summary hypothetical operations by which the answer had been, so to speak, arrived at? It is the only method of problem-solving that always gives correct answers…

The development of theory since the crisis has followed this mold. One prominent example: After the crash of 2008, Paul Krugman immediately began talking about the liquidity trap and the “perverse” Keynesian claims that become true when interest rates are stuck at zero. Fiscal policy was now effective, there was no danger of inflation from increases in the money supply, a trade deficit could cost jobs, and so on. He explicated these ideas with the help of the “IS-LM” models found in undergraduate textbooks — genuinely simple abstractions that haven’t played a role in academic work in decades.

Some years later, he and Gauti Eggertsson unveiled a model in the approved New Keynesian style, which showed that, indeed, if interest rates were fixed at zero then fiscal policy, normally powerless, now became highly effective. This exercise may have been a display of technical skill (I suppose; I’m not a connoisseur) but what do we learn from it? After all, generating that conclusion was the announced goal from the beginning. The formal model was retrofitted to generate the argument that Krugman and others had been making for years, and lo! the two were one.

It’s a perfect example of Joan Robinson’s line that economic theory is the art of taking a rabbit out of a hat, when you’ve just put it into the hat in full view of the audience. I suppose what someone like Krugman might say in his defense is that he wanted to find out if the rabbit would fit in the hat. But if you do the math right, it always does.

(What’s funnier in this case is that the rabbit actually didn’t fit, but they insisted on pulling it out anyway. As the conservative economist John Cochrane gleefully pointed out, the same model also says that raising taxes on wages should also boost employment in a liquidity trap. But no one believed that before writing down the equations, so they didn’t believe it afterward either. As Krugman’s coauthor Eggertsson judiciously put it, “there may be reasons outside the model” to reject the idea that increasing payroll taxes is a good idea in a recession.)

Left critics often imagine economics as an effort to understand reality that’s gotten hopelessly confused, or as a systematic effort to uphold capitalist ideology. But I think both of these claims are, in a way, too kind; they assume that economic theory is “about” the real world in the first place. Better to think of it as a self-constrained art form, whose apparent connections to economic phenomena are results of a confusing overlap in vocabulary. Think about chess and medieval history: The statement that “queens are most effective when supported by strong bishops” might be reasonable in both domains, but its application in the one case will tell you nothing about its application in the other.

Over the past decade, people (such as, famously, Queen Elizabeth) have often asked why economists failed to predict the crisis. As a criticism of economics, this is simultaneously setting the bar too high and too low. Too high, because crises are intrinsically hard to predict. Too low, because modern macroeconomics doesn’t predict anything at all. As Suresh Naidu puts it, the best way to think about what most economic theorists do is as a kind of constrained-maximization poetry. It makes no more sense to ask “is it true” of a model than of a haiku.


While theory buzzes around in its fly-bottle, empirical macroeconomics, more attuned to concrete developments, has made a number of genuinely interesting departures. Several areas have been particularly fertile: the importance of financial conditions and credit constraints; government budgets as a tool to stabilize demand and employment; the links between macroeconomic outcomes and the distribution of income; and the importance of aggregate demand even in the long run.

Not surprisingly, the financial crisis spawned a new body of work trying to assess the importance of credit, and financial conditions more broadly, for macroeconomic outcomes. (Similar bodies of work were produced in the wake of previous financial disruptions; these however don’t get much cited in the current iteration.) A large number of empirical papers tried to assess how important access to credit was for household spending and business investment, and how much of the swing from boom to bust could be explained by the tighter limits on credit. Perhaps the outstanding figures here are Atif Mian and Amir Sufi, who assembled a large body of evidence that the boom in lending in the 2000s reflected mainly an increased willingness to lend on the part of banks, rather than an increased desire to borrow on the part of families; and that the subsequent debt overhang explained a large part of depressed income and employment in the years after 2008.

While Mian and Sufi occupy solidly mainstream positions (at Princeton and Chicago, respectively), their work has been embraced by a number of radical economists who see vindication for long-standing left-Keynesian ideas about the financial roots of economic instability. Markus Brunnermeier (also at Princeton) and his coauthors have also done interesting work trying to untangle the mechanisms of the 2008 financial crisis and to generalize them, with particular attention to the old Keynesian concept of liquidity. That finance is important to the economy is not, in itself, news to anyone other than economists; but this new empirical work is valuable in translating this general awareness into concrete usable form.

A second area of renewed empirical interest is fiscal policy — the use of the government budget to manage aggregate demand. Even more than with finance, economics here has followed rather than led the policy debate. Policymakers were turning to large-scale fiscal stimulus well before academics began producing studies of its effectiveness. Still, it’s striking how many new and sophisticated efforts there have been to estimate the fiscal multiplier — the increase in GDP generated by an additional dollar of government spending.
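The arithmetic behind the multiplier is simple to state. In the most basic textbook Keynesian version (a deliberately stripped-down illustration, not what the new empirical papers actually estimate):

```latex
% The multiplier m relates a change in government spending to the
% resulting change in GDP:
\Delta Y = m \,\Delta G
% In the simplest Keynesian model, where households spend a fraction c
% (the marginal propensity to consume) of each dollar of income, the
% induced rounds of spending sum to a geometric series:
m = 1 + c + c^2 + \cdots = \frac{1}{1-c}
% e.g. c = 0.5 gives m = 2: each dollar of spending adds two to GDP.
```

The empirical literature estimates m from data rather than deriving it this way, but the question being answered is the same.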

In the US, there’s been particular interest in using variation in government spending and unemployment across states to estimate the effect of the former on the latter. The outstanding work here is probably that of Gabriel Chodorow-Reich. Like most entries in this literature, Chodorow-Reich’s suggests fiscal multipliers that are higher than almost any mainstream economist would have accepted a decade ago, with each dollar of government spending adding perhaps two dollars to GDP. Similar work has been published by the IMF, which acknowledged that past studies had “significantly underestimated” the positive effects of fiscal policy. This mea culpa was particularly striking coming from the global enforcer of economic orthodoxy.

The IMF has also revisited its previously ironclad opposition to capital controls — restrictions on financial flows across national borders. More broadly, it has begun to offer, at least intermittently, a platform for work challenging the “Washington Consensus” it helped establish in the 1980s, though this shift predates the crisis of 2008. The changed tone coming out of the IMF’s research department has so far been only occasionally matched by a change in its lending policies.

Income distribution is another area where there has been a flowering of more diverse empirical work in the past decade. Here of course the outstanding figure is Thomas Piketty. With his collaborators (Gabriel Zucman, Emmanuel Saez and others) he has practically defined a new field. Income distribution has always been a concern of economists, of course, but it has typically been assumed to reflect differences in “skill.” The large differences in pay that appeared to be unexplained by education, experience, and so on, were often attributed to “unmeasured skill.” (As John Eatwell used to joke: Hegemony means you get to name the residual.)

Piketty made distribution — between labor and capital, not just across individuals — into something that evolves independently, and that belongs to the macro level of the economy as a whole rather than the micro level of individuals. When his book Capital in the 21st Century was published, a great deal of attention was focused on the formula “r > g,” supposedly reflecting a deep-seated tendency for capital accumulation to outpace economic growth. But in recent years there’s been an interesting evolution in the empirical work Piketty and his coauthors have published, focusing on countries like Russia/USSR and China, which didn’t feature in the original survey. Political and institutional factors like labor rights and the legal forms taken by businesses have moved to center stage, while the formal reasoning of “r > g” has receded — sometimes literally to a footnote. While no longer embedded in the grand narrative of Capital in the 21st Century, this body of empirical work is extremely valuable, especially since Piketty and company are so generous in making their data publicly available. It has also created space for younger scholars to make similar long-run studies of the distribution of income and wealth in countries that the Piketty team hasn’t yet reached, like Rishabh Kumar’s superb work on India. And it has been extended by other empirical economists, like Lukas Karabarbounis and coauthors, who have looked at changes in income distribution through the lens of market power and the distribution of surplus within the corporation — not something a University of Chicago economist would have been likely to study a decade ago.
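The macro-level logic behind “r > g” can be stated in the two relationships Piketty calls his fundamental laws (his notation):

```latex
% First law (an accounting identity): the capital share of income
% alpha equals the rate of return r times the capital/income ratio beta.
\alpha = r \times \beta
% Second law (a long-run tendency): beta converges toward the saving
% rate s divided by the growth rate g.
\beta = \frac{s}{g}
% With s and r roughly stable, slower growth g pushes beta up, and
% with it capital's share of income -- the mechanism behind r > g.
```

It is exactly this mechanical story that the later country studies have quietly demoted in favor of political and institutional explanations.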

A final area where mainstream empirical work has wandered well beyond its pre-2008 limits is the question of whether aggregate demand — and money and finance more broadly — can affect long-run economic outcomes. The conventional view, still dominant in textbooks, draws a hard line between the short run and the long run, more or less meaning a period longer than one business cycle. In the short run, demand and money matter. But in the long run, the path of the economy depends strictly on “real” factors — population growth, technology, and so on.

Here again, the challenge to conventional wisdom has been prompted by real-world developments. On the one hand, weak demand — reflected in historically low interest rates — has seemed to be an ongoing rather than a cyclical problem. Lawrence Summers dubbed this phenomenon “secular stagnation,” reviving a phrase used in the 1940s by the early American Keynesian Alvin Hansen.

On the other hand, it has become increasingly clear that the productive capacity of the economy is not something separate from current demand and production levels, but dependent on them in various ways. Unemployed workers stop looking for work; businesses operating below capacity don’t invest in new plant and equipment or develop new technology. This has manifested itself most clearly in the fall in labor force participation over the past decade, which has been considerably greater than can be explained on the basis of the aging population or other demographic factors. The bottom line is that an economy that spends several years producing less than it is capable of will be capable of producing less in the future. This phenomenon, usually called “hysteresis,” has been explored by economists like Laurence Ball, Summers (again) and Brad DeLong, among others. The existence of hysteresis, among other implications, suggests that the costs of high unemployment may be greater than previously believed, and conversely that public spending in a recession can pay for itself by boosting incomes and taxes in future years.
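A minimal way to write hysteresis down — an illustrative sketch, not the specification of any particular paper — is to let potential output inherit a fraction of last period’s output gap:

```latex
% Potential output Y* grows at trend rate g, but also absorbs a
% fraction eta of the previous period's gap between actual output Y
% and potential:
Y^{*}_{t} = (1+g)\, Y^{*}_{t-1} + \eta \left( Y_{t-1} - Y^{*}_{t-1} \right)
% With eta > 0, a recession (Y below Y*) permanently lowers the whole
% future path of potential; closing the gap with stimulus raises it.
```

The fiscal implication follows directly: if stimulus raises the future path of output, it raises future tax revenue too, which is how spending in a recession can pay for itself.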

These empirical lines are hard to fit into the box of orthodox theory — not that people don’t try. But so far they don’t add up to more than an eclectic set of provocative results. The creativity in mainstream empirical work has not yet been matched by any effort to find an alternative framework for thinking of the economy as a whole. For people coming from non-mainstream paradigms — Marxist or Keynesian — there is now plenty of useful material in mainstream empirical macroeconomics to draw on – much more than in the previous decade. But these new lines of empirical work have been forced on the mainstream by developments in the outside world that were too pressing to ignore. For the moment, at least, they don’t imply any systematic rethinking of economic theory.


Perhaps the central feature of the policy mainstream a decade ago was a smug and, in retrospect, remarkable complacency that the macroeconomic problem had been solved by independent central banks like the Federal Reserve.  For a sense of the pre-crisis consensus, consider this speech by a prominent economist in September 2007, just as the US was heading into its worst recession since the 1930s:

One of the most striking facts about macropolicy is that we have progressed amazingly. … In my opinion, better policy, particularly on the part of the Federal Reserve, is directly responsible for the low inflation and the virtual disappearance of the business cycle in the last 25 years. … The story of stabilization policy of the last quarter century is one of amazing success.

You might expect the speaker to be a right-wing Chicago type like Robert Lucas, whose claim that “the problem of depression prevention has been solved” was widely mocked after the crisis broke out. But in fact it was Christina Romer, soon headed to Washington as the Obama administration’s top economist. In accounts of the internal debates over fiscal policy that dominated the early days of the administration, Romer often comes across as one of the heroes, arguing for a big program of public spending against more conservative figures like Summers. So it’s especially striking that in the 2007 speech she spoke of a “glorious counterrevolution” against Keynesian ideas. Indeed, she saw the persistence of the idea of using deficit spending to fight unemployment as the one dark spot in an otherwise cloudless sky. There’s more than a little irony in the fact that opponents of the massive stimulus Romer ended up favoring drew their intellectual support from exactly the arguments she had been making just a year earlier. But it’s also a vivid illustration of a consistent pattern: ideas have evolved more rapidly in the world of practical policy than among academic economists.

For further evidence, consider a 2016 paper by Jason Furman, Obama’s final chief economist, on “The New View of Fiscal Policy.” As chair of the White House Council of Economic Advisers, Furman embodied the policy-economics consensus ex officio. Though he didn’t mention his predecessor by name, his paper was almost a point-by-point rebuttal of Romer’s “glorious counterrevolution” speech of a decade earlier. It starts with four propositions shared until recently by almost all respectable economists: that central banks can and should stabilize demand all by themselves, with no role for fiscal policy; that public deficits raise interest rates and crowd out private investment; that budget deficits, even if occasionally called for, need to be strictly controlled with an eye on the public debt; and that any use of fiscal policy must be strictly short-term.

None of this is true, suggests Furman. Central banks cannot reliably stabilize modern economies on their own, increased public spending should be a standard response to a downturn, worries about public debt are overblown, and stimulus may have to be maintained indefinitely. While these arguments obviously remain within a conventional framework in which the role of the public sector is simply to maintain the flow of private spending at a level consistent with full employment, they nonetheless envision much more active management of the economy by the state. It’s a remarkable departure from textbook orthodoxy for someone occupying such a central place in the policy world.

Another example of orthodoxy giving ground under the pressure of practical policymaking is Narayana Kocherlakota. When he was appointed as President of the Federal Reserve Bank of Minneapolis, he was on the right of debates within the Fed, confident that if the central bank simply followed its existing rules the economy would quickly return to full employment, and rejecting the idea of active fiscal policy. But after a few years on the Fed’s governing Federal Open Market Committee (FOMC), he had moved to the far left, “dovish” end of opinion, arguing strongly for a more aggressive approach to bringing unemployment down by any means available, including deficit spending and more aggressive unconventional tools at the Fed. This meant rejecting much of his own earlier work, perhaps the clearest example of a high-profile economist repudiating his views after the crisis; in the process, he got rid of many of the conservative “freshwater” economists in the Minneapolis Fed’s research department.

The reassessment of central banks themselves has run on parallel lines but gone even farther.

For twenty or thirty years before 2008, the orthodox view of central banks offered a two-fold defense against the dangerous idea — inherited from the 1930s — that managing the instability of capitalist economies was a political problem. First, any mismatch between the economy’s productive capabilities (aggregate supply) and the desired purchases of households and businesses (aggregate demand) could be fully resolved by the central bank; the technicians at the Fed and its peers around the world could prevent any recurrence of mass unemployment or runaway inflation. Second, they could do this by following a simple, objective rule, without any need to balance competing goals.

During those decades, Alan Greenspan personified the figure of the omniscient central banker. Venerated by presidents of both parties, Greenspan was literally sanctified in the press — a 1990 cover of The International Economy had him in papal regalia, under the headline, “Alan Greenspan and His College of Cardinals.” A decade later, he would appear on the cover of Time as the central figure in “The Committee to Save the World,” flanked by Robert Rubin and the ubiquitous Summers. And a decade after that he showed up as Bob Woodward’s eponymous Maestro.

In the past decade, this vision of central banks and central bankers has eroded from several sides. The manifest failure to prevent huge falls in output and employment after 2008 is the most obvious problem. The deep recessions in the US, Europe and elsewhere make a mockery of the “virtual disappearance of the business cycle” that people like Romer had held out as the strongest argument for leaving macropolicy to central banks. And while Janet Yellen or Mario Draghi may be widely admired, they command nothing like the authority of a Greenspan.

The pre-2008 consensus is even more profoundly undermined by what central banks did do than by what they failed to do. During the crisis itself, the Fed and other central banks decided which financial institutions to rescue and which to allow to fail, which creditors would get paid in full and which would face losses. Both during the crisis and in the period of stagnation that followed, central banks also intervened in a much wider range of markets, on a much larger scale. In the US, perhaps the most dramatic moment came in late summer 2008, when the commercial paper market — the market for short-term loans used by the largest corporations — froze up, and the Fed stepped in with a promise to lend on its own account to anyone who had previously borrowed there. This watershed moment took the Fed from its usual role of regulating and supporting the private financial system, to simply replacing it.

That intervention lasted only a few months, but in other markets the Fed has largely replaced private creditors for a number of years now. Even today, it is the ultimate lender for about 20 percent of new mortgages in the United States. Policies of quantitative easing, in the US and elsewhere, greatly enlarged central banks’ weight in the economy — in the US, the Fed’s assets jumped from 6 percent of GDP to 25 percent, an expansion that is only now beginning to be unwound.  These policies also committed central banks to targeting longer-term interest rates, and in some cases other asset prices as well, rather than merely the overnight interest rate that had been the sole official tool of policy in the decades before 2008.

While critics (mostly on the Right) have objected that these interventions “distort” financial markets, this makes no sense from the perspective of a practical central banker. As central bankers like the Fed’s Ben Bernanke or the Bank of England’s Adam Posen have often said in response to such criticism, there is no such thing as an “undistorted” financial market. Central banks are always trying to change financial conditions to whatever they think favors full employment and stable prices. But as long as the interventions were limited to a single overnight interest rate, it was possible to paper over the contradiction between active monetary policy and the idea of a self-regulating economy, and pretend that policymakers were just trying to follow the “natural” interest rate, whatever that is. The much broader interventions of the past decade have brought the contradiction out into the open.

The broad array of interventions central banks have had to carry out over the past decade have also provoked some second thoughts about the functioning of financial markets even in normal times. If financial markets can get things wrong so catastrophically during crises, shouldn’t that affect our confidence in their ability to allocate credit the rest of the time? And if we are not confident, that opens the door for a much broader range of interventions — not only to stabilize markets and maintain demand, but to affirmatively direct society’s resources in better ways than private finance would do on its own.

In the past decade, this subversive thought has shown up in some surprisingly prominent places. Wearing his policy rather than his theory hat, Paul Krugman sees

… a broader rationale for policy activism than most macroeconomists—even self-proclaimed Keynesians—have generally offered in recent decades. Most of them… have seen the role for policy as pretty much limited to stabilizing aggregate demand. … Once we admit that there can be big asset mispricing, however, the case for intervention becomes much stronger… There is more potential for and power in [government] intervention than was dreamed of in efficient-market models.

From another direction, the notion that macroeconomic policy does not involve conflicting interests has become harder to sustain as inflation, employment, output and asset prices have followed diverging paths. A central plank of the pre-2008 consensus was the aptly named “divine coincidence,” in which the same level of demand would fortuitously and simultaneously lead to full employment, low and stable inflation, and production at the economy’s potential. Operationally, this was embodied in the “NAIRU” — the level of unemployment below which, supposedly, inflation would begin to rise without limit.
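The NAIRU comes out of the standard accelerationist Phillips curve — the textbook form, simplified here for illustration:

```latex
% Inflation pi rises relative to last period's whenever unemployment u
% falls below the NAIRU u*, with slope kappa:
\pi_t = \pi_{t-1} - \kappa \,( u_t - u^{*} )
% Inflation is stable only at u = u*; hold unemployment below u* and
% inflation ratchets up period after period, without limit.
```

The entire policy framework depends on u* (and the slope κ) being stable and measurable — which is exactly what the past decade has called into question.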

Over the past decade, as estimates of the NAIRU have fluctuated almost as much as the unemployment rate itself, it’s become clear that the NAIRU is too unstable and hard to measure to serve as a guide for policy, if it exists at all. It is striking to see someone as prominent as IMF chief economist Olivier Blanchard write (in 2016) that “the US economy is far from satisfying the ‘divine coincidence’,” meaning that stabilizing inflation and minimizing unemployment are two distinct goals. But if there’s no clear link between unemployment and inflation, it’s not clear why central banks should worry about low unemployment at all, or how they should trade off the risks of prices rising undesirably fast against the risk of too-high unemployment. With surprising frankness, high officials at the Fed and other central banks have acknowledged that they simply don’t know what the link between unemployment and inflation looks like today.

To make matters worse, a number of prominent figures — most vocally at the Bank for International Settlements — have argued that we should not be concerned only with conventional price inflation, but also with the behavior of asset prices, such as stocks or real estate. This “financial stability” mandate, if it is accepted, gives central banks yet another mission. The more outcomes central banks are responsible for, and the less confident we are that they all go together, the harder it is to treat central banks as somehow apolitical, as not subject to the same interplay of interests as the rest of the state.

Given the strategic role occupied by central banks in both modern capitalist economies and economic theory, this rethinking has the potential to lead in some radical directions. How far it will actually do so, of course, remains to be seen. Accounts of the Fed’s most recent conclave in Jackson Hole, Wyoming suggest a sense of “mission accomplished” and a desire to get back to the comfortable pieties of the past. Meanwhile, in Europe, the collapse of the intellectual rationale for central banks has been accompanied by the development of the most powerful central bank-ocracy the world has yet seen. So far the European Central Bank has not let its lack of democratic mandate stop it from making coercive intrusions into the domestic policies of its member states, or from serving as the enforcement arm of Europe’s creditors against recalcitrant debtors like Greece.

One thing we can say for sure: Any future crisis will bring the contradictions of central banks’ role as capitalism’s central planners into even sharper relief.


Many critics were disappointed that the crisis of 2008 did not lead to an intellectual revolution on the scale of the 1930s. It’s true that it didn’t. But the image of stasis you’d get from looking at the top journals and textbooks isn’t the whole picture — the most interesting conversations are happening somewhere else. For a generation, leftists in economics have struggled to change the profession, some by launching attacks (often well aimed, but ignored) from the outside, others by trying to make radical ideas parsable in the orthodox language. One lesson of the past decade is that both groups got it backward.

Keynes famously wrote that “Practical men who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist.” It’s a good line. But in recent years the relationship seems to have been more the other way round. If we want to change the economics profession, we need to start changing the world. Economics will follow.

Thanks to Arjun Jayadev, Ethan Kaplan, Mike Konczal and Suresh Naidu for helpful suggestions and comments.

23 thoughts on “In Jacobin: A Demystifying Decade for Economics”

  1. What a great essay, thank you for sharing it.
    Your last two sentences though. “If we want to change the economics profession, we need to start changing the world. Economics will follow.” Just so many, many ways to consider that statement. So many ways to argue about it. I think it is a fantastic ending. Because I want to argue six different things about those two sentences.

    1. Hi Ritwik! You have just intrigued me about your subject (my area of study is economics). Can you give me any reference for the orthodoxy you just described?

  2. I wonder, do you have a strong view of the specific mechanism that caused the crisis? I have always found persuasive Morgan Ricks’ view that ’08 was ultimately another flavor of bank run (in which the runnable assets were interbank funding markets, due to a collapse of confidence in financial institutions’ viability as credit derivatives referenced to ‘toxic’ structured mortgage securities blew huge, unexpected, and misunderstood holes [at least on paper] in the banks’ equity capital).

    “If financial markets can get things wrong so catastrophically during crises, shouldn’t that affect our confidence in their ability to allocate credit the rest of the time?”

    I think the case of the causal mechanism of runnable assets suggests that what happened in ’08 is a fairly well understood phenomenon (banks invent a thing, lever up exposure, and then, at the inflection point where actors realize there’s been a mistake, there’s a run based on a roughly equal and opposite overreaction to the asset class in question). I think this speaks to a general framework – panicked markets are bad at pricing assets during the panic – and through that lens it is perfectly reasonable that some creditor ex machina would take over until markets were able to assess the actual damage to other institutions (also bailouts etc etc).

    I think this suggests the last crisis was just another version of a bank run, the causes of and response to which are relatively well understood in abstraction (stop the bleeding, stimulus, pare back positions, etc).

    I am curious to see if you have a view on an underlying mechanism within the market that drove the crisis, because I think the runnable asset class theory suggests a much narrower indictment of the concept of financial markets generally than the article here takes for granted and runs with.

  3. That was a great summary of the decade.

    How many times do the empirical economists need to show up the orthodox economists to elicit any kind of concession? It will be interesting to see how well Bill Mitchell and Randy Wray’s new textbook on macro is received.

  4. Does the good stuff coming out of the mainstream have any lessons for what methodologies are the most useful? I get that the language of optimization and equilibrium still dominates, but stripping that away, what tools do you really need?

  5. “This combination of extreme specificity on one dimension and extreme abstraction on the others might seem weird and arbitrary. But in today’s profession, if you don’t at least start from there, you’re not doing economics.”


    “He explicated these ideas with the help of the “IS-LM” models found in undergraduate textbooks — genuinely simple abstractions that haven’t played a role in academic work in decades.”

    Simple and obtuse.

    Get thee to an accounting class, Krugman.

    “The creativity in mainstream empirical work has not yet been matched by any effort to find an alternative framework for thinking of the economy as a whole.”


    “But after a few years on the Fed’s governing Federal Open Market Committee (FOMC), he had moved to the far left, “dovish” end of opinion, arguing strongly for a more aggressive approach to bringing unemployment down by any means available, including deficit spending and more aggressive unconventional tools at the Fed.”

    Who would have thought real world exposure could change an academic perspective?

    “Even today, it is the ultimate lender for about 20 percent of new mortgages in the United States.”

    *new* mortgages?

    I haven’t been paying close attention lately, so this puzzled me – but I guess they’re still replacing maturities and prepayments to some degree?

    “there is no such thing as an “undistorted” financial market”

    Interesting way of putting it.

    There’s also no such thing as a market without at least some interest rates being administered (set fixed for some non-zero time period) by institutions (including central banks and commercial banks).

    1. *new* mortgages?

      I haven’t been paying close attention lately, so this puzzled me – but I guess they’re still replacing maturities and prepayments to some degree?

      That’s right. They’ve been rolling over the mortgage-backed securities they bought during the crisis, which means buying new ones (backed by newly made mortgages) as the existing ones mature.

        1. The Fed purchased about $310 billion in MBSs in 2017. (See here.) In 2017 there were $1.7 trillion in new originations ($1.1 trillion purchase mortgages, $600 billion refinancing). So Fed purchases of mortgages were 18 percent of new mortgages issued in that year.

          Go back to 2013-2014, and the figure was 60 percent. I expect 2018 will turn out to be a bit lower.
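(Purely illustrative: a quick Python sketch checking the arithmetic above, using only the 2017 figures quoted in this thread, not independently verified data.)

```python
# Back-of-the-envelope check of the Fed's share of new mortgages,
# using the 2017 figures quoted in the comment above.
fed_mbs_purchases = 310e9      # Fed MBS purchases in 2017 (dollars)
purchase_mortgages = 1.1e12    # new purchase-mortgage originations
refinancings = 0.6e12          # refinancing originations

total_originations = purchase_mortgages + refinancings  # $1.7 trillion
share = fed_mbs_purchases / total_originations

print(f"Fed share of new mortgages: {share:.0%}")  # prints "Fed share of new mortgages: 18%"
```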

  6. Re: impact on asset prices, both academic and policy economists seem able to ignore that in developing and applying macro theory. Wall Street economists and their coworkers and clients have an interest in asset mispricing. Has there been any analysis of the viewpoint coming from that sector?

    1. I and other heterodox economists can attest that it is often easier to talk to finance people than to our mainstream colleagues. I’m not sure it’s exactly what you’re looking for, but Zoltan Pozsar’s “Global Money Notes” (produced for Credit Suisse, but freely available online) are superb.

  7. On the margin, the intellectual action is in our growing understanding of network externalities: their scaling laws, their manifestation as sources of enormous power, and so their potential danger. Eric Lonergan recently suggested our payments system, mediated by banks, is one of the oldest examples of such a data network. Combine this with what some argue is banks’ ability to create money by lending, largely controlling the money supply, and the stage is set for the wild and mysterious ride we are now experiencing.

  8. “UPDATE: I jumped the gun by posting my piece here before the issue of Jacobin that it appears in was officially out. Once the issue is live, I will repost the article here. I expect that will be in a day or two. Sorry!”

    It’s funny how much a slave to capital Jacobin is.

    1. Oh come on. They’re trying to build a platform people actually read. That’s really hard, and they’ve been amazingly successful at it. If part of that involves an organized rollout for a new issue instead of stuff just kind of appearing, I have no problem with that at all.

      1. You’re heavily bought into the system you so mildly criticize, my friend.

        Also, Joseph Carpenter’s comment above is the most interesting. Please read Bernanke’s 2018 paper for a presentation of the view that household balance sheet deterioration alone was not nearly enough to cause the crisis …

  9. I think that some of the trends that you touch on in this article are actually quite related, so, this being the internet, I’ll bang my drum a bit:

    1) “Piketty made distribution — between labor and capital, not just across individuals — into something that evolves independently, and that belongs to the macro level of the economy as a whole rather than the micro level of individuals.”

    This actually is not a new idea, but rather a return to the old: classical political economists did in fact think in this way.
    This approach can be traced back to the physiocrats at least (Marx says so in Theories of Surplus Value), and is based on the idea that, if there is a total production of, say, 100, then if 70 goes to the workers only 30 goes to capital, and vice versa.

    This might sound like a tautology, but classical economists assumed that the share of production that went to the workers was determined exogenously, so that total profits were determined as (total product) – (total wages), both terms being exogenously determined.

    But, when the marginalist theory of economics became dominant, the ratio of wages to total productivity became part of the general equilibrium together with the rest of the price structure, so the distributional question (between labor and capital) became moot.

    To put it a bit differently, there are two “families” of equilibrium theories:

    – The LTV family (classical), where the relative price of commodities reaches a certain price equilibrium, but the wage share is somehow free-floating or at least not determined by the general price equilibrium. This family includes Marx, other classical economists, Sraffa, various “new interpretation” guys, etc. I think that many post-Keynesian models, while they ignore the problem of the general equilibrium of prices, largely imply this kind of approach.

    – The marginalist family (neoclassical), where there is a natural general equilibrium of prices, and this equilibrium also includes the wage share, so that there is a natural and correct wage share. This family is the dominant one currently, and basically what is called “orthodox” economics is this. In this approach, for example, since there is a natural level of wages, full employment is the situation where wages reach this natural level, and the business cycle has to be caused by some market failure that prevents prices from getting to their natural level.

    2) “The heart of it is the idea that the economy can be thought of as a single infinite-lived individual trading off leisure and consumption over all future time. […] It was invented by Frank Ramsey, a young philosopher and mathematician in Keynes’ Cambridge circle in the 1920s, to answer the question: If you were organizing an economy from the top down and had to choose between producing for present needs versus investing to allow more production later, how would you decide the ideal mix?”

    This also is linked to the classical/neoclassical debate. In the “classical” theory, investment in additional capital always comes from the profit share, so, for example, in a centrally planned economy like the USSR there is a limit to wages that is due to the part of production that has to be invested in additional capital (there is some ambiguity between expansion – building new capital goods – and technological progress – building better, more efficient capital goods).
    In neoclassical theory, the profit share is determined by the magic of the equilibrium price structure, and therefore the quantity of investment is also magically determined by the market, which should lead to the correct level of investment. This is where the “divine coincidence” ultimately comes from.
    But in the classical theory, the profit share is determined as a residual after wages (and possibly other stuff like taxes) are deducted from production, so there is no particular reason that investment should stay at its correct level (in fact the whole concept of the correct level of investment is quite dubious from a classical perspective). So from a classical perspective, stuff like overinvestment, saving gluts, underconsumption etc. makes a lot of sense.

    3) “If financial markets can get things wrong so catastrophically during crises, shouldn’t that affect our confidence in their ability to allocate credit the rest of the time?”

    This also, in my opinion, leads back to the problem in point (2).
    Here I want to point out that there are two different ways the capital markets can fail to allocate credit and, more generally, value:
    (A) capital markets could allocate credit to the wrong businesses: for example, banks could lend too much to homebuyers and too little to shops;
    (B) capital markets could allocate too much or too little wealth in an absolute sense.
    The way the failure of capital markets is usually discussed, it’s somewhat implicit that the problem is (A).
    But, if there is no reason to think that the flow of income that goes to investment is in any way related to the actual investment needs of the economy (however these needs are described), as per the classical approach, then it is natural to think that some of this investment will necessarily go into bubbles or a general increase in asset prices, so that the problem is actually (B): an excess of investment during booms and later a disappearance of investment during busts.
    I think this is a very important point; however, it seems to me that many authors are still wedded to the neoclassical approach, and therefore assume that the problem is only (A).

    This increase of the wealth-to-income ratio was also discussed by Piketty, although I think his r>g argument sort of muddied the waters.
    However, there is definitely a long-term increase in the wealth-to-income ratio, and apparently some economists see this, but still think about the kind of assets that represent the wealth (problem A) rather than about the imbalances that cause the growth in wealth (problem B).

    1. Mr Mister: the idea that Marx was any kind of an equilibrium theorist is popular with those who have been dubbed “20th century Marxists”, but has been effectively debunked by the so-called Temporal Single System Interpretation of his work. See, for example, “Simultaneous and Temporal Valuation Contrasted”, by Alan Freeman and Andrew Kliman.

      1. Hi Julian Wells!
        (My handle, by the way, is MisterMR, as MR are my initials; I say this because in another forum I found a commenter who signs as MR Mister who is not me.)

        Last year I finally bought and read Kliman’s “Reclaiming Marx’s Capital”. I found it very underwhelming, so I’ll use this answer to grumble a bit about it (sorry JW Mason, I hope you are not offended by the OT; in 2 comments because it’s very long).

        To put it simply, Kliman not only is wrong, but is doing a big disservice to Marxism in general, because he is putting forth a theory that is (1) dogmatic, (2) does not even try to represent reality (Kliman makes his arguments only on the basis of what Marx “said”, not on the basis of whether his theory makes sense in reality, so even if he were right, all we would get is that Marx said something very stupid) and (3) wrong even in terms of what Marx actually said.

        The main problem is: there are two broad interpretations of Marx’s LTV:
        a) the equilibrium interpretation, where market mechanisms of supply and demand lead to an equilibrium where prices reflect more or less the amount of labor needed to produce the goods; various authors like Sraffa and Shaikh fall in this category (though with different models of equilibrium).
        b) the phlogiston interpretation (I call it so because I don’t like it), where there is something like an abstract “labour value”, and for some magical reason this is supposed to be the true value of things and prices reflect it (sometimes); this interpretation is mostly given by critics of the LTV, but also by some Marxists who are pissed off that modern (marginalist) economic theory said that the LTV as an equilibrium theory is wrong.

        a) means that, as the labor values are brought out by market forces, prices will mostly approximate the notional “labor” used to produce something, so you get an “approximate” labor theory of value;
        b) instead insists that labor values exist independently from prices, but then has to postulate some way through which prices are influenced by these phlogiston “values”. (b) is very stupid IMHO.

        What did Marx actually say? It’s actually quite ambiguous, because in large parts of Capital he speaks as in (b); in other places, though, Marx clearly and explicitly speaks in terms of equilibrium prices:

        “We have just seen how the fluctuation of supply and demand always bring the price of a commodity back to its cost of production. The actual price of a commodity, indeed, stands always above or below the cost of production; but the rise and fall reciprocally balance each other, so that, within a certain period of time, if the ebbs and flows of the industry are reckoned up together, the commodities will be exchanged for one another in accordance with their cost of production. Their price is thus determined by their cost of production.”

        So, Marx clearly and EXPLICITLY says that the LTV is an equilibrium theory, there can be no doubt about this.

        Kliman himself never says explicitly that (b) is true; however, his argument is based on his particular interpretation of Marx’s theory of the falling rate of profit, and to make his interpretation work he has to accept a series of assumptions, such as that the “value” of commodities is always and at all moments equal to their labor values, something that doesn’t fit, for example, with the above Marx quote (however, Kliman finds various other quotes that support his interpretation, mostly because Marx in his writings generally assumes equilibrium and so more or less assumes that “prices” correspond to normal equilibrium prices, which Marx calls “values”).
        This means that he is actually implicitly accepting interpretation (b), since he doesn’t even try to justify those “labor values” as market outcomes, which would necessarily lead to an “approximate” labor theory of value. So in practice he jumps with both feet into phlogiston-value territory, without even realizing it (this incidentally makes his theory a dual-system theory, even if he doesn’t recognize this, because values exist independently from prices).

        Hence my point (3) that Kliman doesn’t get Marx right.

        Kliman starts and ends his book with this idea: Marx wrote some stuff about prices, and Marx also wrote some stuff about the falling rate of profit, so the only coherent interpretation of Marx’s thought is one that can DERIVE the falling rate of profit from the stuff about prices (this is also wrong IMO, but I’ll get to this later).
        In this sense Kliman gives a dogmatic interpretation of Marx: he doesn’t accept the critique that the falling rate of profit cannot be derived from Marx’s LTV, so he creates a completely bogus theory where prices change over time in order to satisfy Marx’s supposed theory (but without a reason why prices should behave this way).

        Hence my points (1) and (2): Kliman is dogmatic in an attempt to be more Marxist than Marx, and doesn’t even try to explain reality but just goes off on a tangent in a weird sort of literary criticism of Marx.


        1. [segue]

          Which leads to my last criticism of Kliman: that he doesn’t really understand what the theory of the falling rate of profit is about.
          The falling rate of profit is not Marx’s invention; it’s something that Ricardo discussed first, although Marx gave a different explanation for it.
          The question is this: both Marx and Ricardo assumed that wages are approximately stuck at subsistence level, and that when a productivity increase happens, almost all the gains go to the employer.
          This means that the profit share of total income is supposed to increase secularly: for example, in year 0 workers produce 100 and earn a subsistence wage of 50 (profit share of 50%), while in year 2 they produce 150 but wages are stuck at 60 (profit share of 60%).
          Thus at a first approximation, in both Marx’s and Ricardo’s theories profits “should” be on a secular increase. But in reality, both Marx and Ricardo realised that the profit rate wasn’t rising.

          Ricardo’s explanation was Malthusian: workers reproduce like rabbits and so need more and more food, but agriculture has marginally declining productivity (the only part of Marx’s and Ricardo’s theories where they have marginalism).
          So even as technology improves, productivity in terms of subsistence goods falls (food becomes more expensive in LTV terms), so the wage share actually increases, because productivity in an important field of production is actually falling.

          Marx recognised that this was wrong, because the wage share was actually falling in his lifetime, yet the profit rate wasn’t rising. He therefore gave this explanation: the rate of profit is not the same thing as the share of profit. In an economy you have a capital-to-output ratio (what Marx called the organic composition of capital). For example, you start with 500h of capital and 100 workers each working 1h; you have a gross output of 150h, but 50h of capital have been consumed, so part of the production is used to replace the consumed capital. In the end you have 500h of capital both at the beginning and at the end of the period, and only 100h of NET output, representing “live labor”, so there is a capital-to-income ratio of 5 to 1.
          Profits obviously can only come from net output, so for example if the profit share is 30%, the profit RATE is 30/(500+70) = 5.3% (and not the whopping 30/70 = 43% that you get by calculating only profits and wages, as Ricardo did, and that in Marx’s economics corresponds to the “exploitation rate”, not the profit rate).
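(Purely illustrative: a quick Python sketch checking the arithmetic in the hour-denominated example above — the 500h capital, 100h net output, and 30% profit share are the comment’s own illustrative numbers.)

```python
# Check the distinction between the exploitation rate and the profit
# rate, using the illustrative figures from the comment above.
capital = 500.0      # capital stock, in labor hours
net_output = 100.0   # net output ("live labor"), in hours
profit_share = 0.30  # share of net output going to profits

profits = profit_share * net_output  # 30h
wages = net_output - profits         # 70h

# Ricardo-style ratio of profits to wages: Marx's "exploitation rate"
exploitation_rate = profits / wages            # 30/70

# Marx's profit rate: profits over capital advanced plus wages
profit_rate = profits / (capital + wages)      # 30/570

print(f"exploitation rate: {exploitation_rate:.1%}")  # prints "exploitation rate: 42.9%"
print(f"profit rate: {profit_rate:.1%}")              # prints "profit rate: 5.3%"
```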

          So Marx explained the falling (or at least not rising) rate of profit (an empirical reality in his time) with a tendential increase in the capital-to-income ratio due to technology (the only hypothesis that is true to the empirical data of a falling wage share and a contemporaneous falling or static profit rate under Marx’s economic theory).

          But this is an EMPIRICAL finding; it’s not derived from Marx’s LTV, apart from the fact that, given the empirical data and Marx’s theory, this is more or less the only explanation.

          Unfortunately, the fact that in Marx’s time the wage share was falling, but in the postwar years it wasn’t, made postwar Marxists think that the “falling rate of profit” was based exclusively on the LTV and not on empirical data, because the empirical data in their time was different; hence the misunderstanding, and the idea that Marx’s theory was contradictory, since you can’t really derive the falling rate of profit from the LTV.

          Kliman, though, makes the same error in reverse: he doesn’t understand Marx’s problem (which was to explain some empirical data), and so he also believes that the falling rate of profit SHOULD be derived from the LTV, so he tortures the concept of the falling rate of profit by turning it into a sort of weird monetarist thing that is totally not what Marx meant.

Comments are closed.