In 2005, University of Chicago finance professor Raghuram Rajan published a paper in the proceedings of the Federal Reserve Bank of Kansas City called “Has Financial Development Made the World Riskier?” Rajan, then the chief economist at the International Monetary Fund, warned bluntly that incentive structures in the banking profession were leading to reckless credit expansion, herding, and other “perverse behaviors.” He was frostily received when he presented his findings at the Federal Reserve’s annual summer retreat in Jackson Hole that year. The Fed-linked experts who snorted at Rajan’s warnings were sure that financial innovations helped “spread risk” in a way that made the world safer. There was a fixed amount of risk in the world, they seemed to believe, and the more widely distributed it was, the better off we were. Rajan, too, thought the new products and practices “spread risk,” but in a different and more dangerous way: They multiplied it.

Rajan is worth reading not just because he was correct when few were but also because his writing is clear as a bell, even to nonspecialists. His new book, Fault Lines: How Hidden Fractures Still Threaten the World Economy, is not a coherent argument so much as a bunch of independent-minded essays on various topics in contemporary global finance. Some are excellent (his essay on the misaccounting of “tail risk” on corporate balance sheets, for instance). Others are not so hot (his suggestions on improving access to education or his plea for giving IMF analysts more power to impose their views on recalcitrant nations).

Most notably, in the course of this book Rajan offers a bold and convincing diagnosis of how a screw-up in the regulation of poor people’s mortgages in one country has brought the world to the brink of economic disaster, where it teeters still. He goes beyond the proximate causes of the problem—the subprimes and derivatives and trade imbalances and the like. The ultimate cause, Rajan convincingly argues, is a widening of economic inequality that American politicians of both parties found politically intolerable, and chose to fix by turning the credit market into an under-the-table welfare state.

The growth in inequality has been large, Rajan shows. The top 1 percent of the population have laid hold of 58 cents out of every dollar in income growth since the Ford administration. Rajan shows no interest in being patted on the back for pointing this out. Moralizing is not his intent, and there are no Gilded Age clichés here about the undeserving rich. In fact, as a professor of finance rather than economics, he is able to show us some unfamiliar statistics about the way rich people are rich, and one of the most interesting is that we do not have a particularly big class of idle remittance men. “Even for the richest 0.01 percent of Americans toward the end of the twentieth century,” he points out, “80 percent of income consisted of wages and income from self-owned businesses, and only 20 percent consisted of income from arm’s-length financial investments.”

Rajan is describing not the moral problems of capitalism but the political problems. The median American is losing ground. And while people at the 90th percentile have never had it so good, Americans in the 10th percentile have endured a punishing economy for about a third of a century now. Their problems become particularly acute during recessions. For reasons that are not fully clear, recessions have changed in nature in the last 20 years. Historically, Western economies returned to full employment within a few months of hitting a recession’s trough. Losing a job was a calamity, but a calamity of short duration. Since 1992, however, all recoveries have been “jobless recoveries”: after the 2001 recession, it took more than 38 months for the economy to return to full employment.

And, as Rajan puts it with some understatement, “the United States is singularly unprepared for jobless recoveries.” This is only partly because the United States has a weaker welfare state than other industrialized countries. It is also because the American safety net—in which government provides fewer health and retirement benefits but incentivizes employers to fill the gap—winds up placing all of a person’s eggs in the basket of his job. Lose your job and you lose not only your income but also your (and your children’s) health insurance and possibly (as in several scandalous recent cases) your pension.

Under such circumstances, any recession with the slightest perceptible effect on the public will end political careers by the score. And recessions are, alas, inevitable. The result, under both Democratic and Republican leadership, has been reckless government extension of credit. As a remedy for downturns, this has two political advantages. First, it does not bother conservatives as much as handouts do. Second, “easy credit has large, positive, immediate, and widely distributed benefits, whereas the costs all lie in the future. It has a payoff structure that is precisely the one desired by politicians, which is why so many countries have succumbed to its lure.” You might say that the financial crisis reflects the emergence of the off-balance-sheet liabilities—the human costs—of deindustrialization.

This is an account of what ails us that is radically at odds with the familiar tale of greedy bankers in $5,000 suits. “Almost every financial crisis has political roots,” Rajan writes. The credit market—at least as regards housing—was distorted by government policy, not by a sudden and mysterious escalation in “greed.” The trends that shook the world economy came out of Fannie Mae and Freddie Mac, out of the Federal Housing Administration, and out of their “regulator,” the U.S. Department of Housing and Urban Development.

By 2000, HUD required that low-income loans make up 50 percent of Fannie and Freddie’s portfolios. Out of “compassionate conservatism,” perhaps, the Bush administration raised that mandate to 56 percent. Rajan cites Fannie Mae’s former chief credit officer, Edward Pinto, who notes that, by 2008, “the FHA and various other government programs were exposed to about $2.7 trillion in subprime and Alt-A loans, approximately 59 percent of total loans to these categories.” Peter Wallison of the American Enterprise Institute found that government-mandated loans accounted for two-thirds of “junk mortgages.” 

Another way of looking at this problem is provided in a study done by Rajan’s Chicago colleagues Atif Mian and Amir Sufi. They found that, if you look at the period between 2002 and 2005, the number of mortgages obtained in a given ZIP code “is negatively correlated with household income growth.” In other words, lenders preferred un-creditworthy borrowers to creditworthy borrowers. In a market governed by “greed” and undistorted by government pressure, such a result would make no sense.

It is important to note that the “bubble” part of our real-estate crisis was not especially severe. U.S. housing prices, although they rose unduly, never got nearly as out of whack as they did in the past decade in Ireland, Spain, or the United Kingdom, or in the late 1980s in Japan. The problem was not so much the amount of collateral people got access to through home-equity loans; it was that borrowers were never well enough vetted for creditworthiness—at any level of borrowing—to begin with. Almost everyone rues that, sometime in the past generation, the old-fashioned method of checking out mortgage applicants—through face-to-face interviews and rigorous investigation of an applicant’s character and community standing—gave way to an anonymous, bureaucratic, arm’s-length process that could easily be abused. In Rajan’s view, the change did real damage. “The judgment calls historically made by loan officers were, in fact, extremely important to the overall credit assessment,” he writes. “It really does matter if the borrower is rude, shifty, and slovenly in the loan interview.”

The change in procedures was not just a matter of bankers’ forgetting or growing sloppy. To inquire too closely into borrowers’ creditworthiness would leave bankers in danger of falling afoul of antidiscrimination laws, particularly after Bill Clinton vowed to crack down on alleged “red-lining” (racial prejudice in mortgage lending) in the 1990s. George W. Bush’s decision to raise the quotas for low-income lending from 50 to 56 percent of loans certainly strikes us today as foolhardy. But can there be any doubt he would have been pilloried as a racist had he sought to lower them? The Indian-born Rajan never enunciates the unavoidable conclusion, but he is constantly walking much closer to it than any American-born academic would dare to: The finance crisis also reflects, in part, the emergence of the off-balance-sheet liabilities of a kind of affirmative action.

The overarching point is that, whereas European countries until about a decade ago addressed sluggish job creation by expanding their welfare states (which made job creation more sluggish still), the United States chose a different path that proved just as counterproductive. It spread a safety net under its less fortunate citizens through wanton credit creation. And the terrible problem of credit is that it resembles alcohol—as the dosage rises, the problems get bigger, but so does the capacity to ignore them.

For his part, Rajan thinks overt welfare protections are preferable to the extension of credit as a surrogate safety net, if only because welfare programs are more transparent.

In the United States for the past decade, any time the economy began to sputter in the slightest, the government ran around like a chicken with its head cut off trying to fix it. America thus took on its present role as the world’s “stimulator of first resort”—the life of the global party. It borrows money from abroad to stoke world demand. If we think of the international economy as a barroom, then the United States is the guy who can be relied upon to buy a round, even if he has a hard time feeding his own family.

And ad hoc remedies are no different from overt welfare in their tendency to breed unintended consequences. “Policy made in the midst of a downturn is often hurried, opportunistic, and poorly thought out.” Extensions to unemployment benefits, because they are so hard to vote against, are routinely loaded with pork-barrel spending. Rajan is scathing about the Obama stimulus, dismissing much of it as not stimulus but “a form of redistribution to fulfill election promises.”

But long before Barack Obama came to power, the United States was pursuing, through tax cuts as well as easy credit, a program of nonstop Keynesian stimulus. In fiscal terms, in credit terms, the “change” that the president is delivering consists of pursuing the same fiscal policy his predecessors did, only more so. The debate over whether the country now needs a “second stimulus package” is in this sense deceptive. We ought to be arguing about whether it is wise to prolong an era of permanent stimulus that is now decades old.

Christopher Caldwell is a senior editor at The Weekly Standard.