Books discussed in this essay:

The Ascent of Money: A Financial History of the World, by Niall Ferguson

The Return of Depression Economics and the Crisis of 2008, by Paul Krugman 

 

Any list of inventions that have transformed the human condition would include the plough, the printing press, the steam engine, and the generation of electricity. But perhaps the greatest of all is the creation of money, because money is the essential foundation of all modern societies. It is money that dispenses with the traditional quest for self-sufficiency and the clumsy reliance on barter, enabling people and organizations to specialize. Prosperous, technologically advanced societies could not exist without the widespread acceptance of money.

This is so obvious that we rarely think about it, but once we do, we instinctively recognize that money's existence is something of a miracle. After all, we routinely take as tokens embodying real value pieces of paper—whether bills or checks—with little intrinsic worth. Even more astonishing, we have moved from paper to electronic money. We regard digital computer entries that we cannot see or touch as repositories of value. That these leaps of faith occur billions of times a day defines money's requisite traits: trust and confidence. 

We are now rediscovering this truth, because since August 2007 the world has experienced what is generally regarded as the worst financial crisis since the Great Depression of the 1930s. Some venerable institutions, including Merrill Lynch, Bear Stearns, and Lehman Brothers, have gone bankrupt or merged into stronger rivals. Large parts of the credit market—involving the "securitizing" of home mortgages, auto loans, and credit card debts into bonds—have shrunk dramatically. The economy has plunged into a deep recession. People wonder what happened. What caused confidence and trust to collapse?

The crisis also reminds us that the story of money is more than a curious historical detour. It's a central artery of civilization, because the spread of money led to the invention of finance, another building block of modern societies. Finance—whether by banks, securities markets, insurance contracts, or government—enables nations to save and invest for tomorrow. Along with specialization, the regard for the future made sustained economic growth possible. But as historian Niall Ferguson shows in his highly readable and informative The Ascent of Money: A Financial History of the World, money and finance have historically been a double-edged sword. They're usually a tremendous boon—and yet can also bring calamity.

"[P]overty is not the result of rapacious financiers exploiting the poor," he writes.

It has much more to do with the lack of financial institutions…. Only when borrowers have access to efficient credit networks can they escape from the clutches of loan sharks, and only when savers can deposit their money in reliable banks can it be channeled from the idle rich to the industrious poor.

But financial breakdowns shred the fabric of ordinary life, undermining political and social cohesion. Lenin allegedly said that the best way to destroy a society is to debauch its currency. Banking panics and market "crashes" can be fearsome. "[F]ew things are harder to predict accurately than the timing and magnitude of financial crises," Ferguson argues, "because the financial system is so genuinely complex and so many of the relationships within it are non-linear, even chaotic." 

What can't be easily understood can't be easily controlled. The origins and causes of the present economic crisis beg for greater clarity. Ferguson's panoramic overview of finance and Paul Krugman's The Return of Depression Economics and the Crisis of 2008 get us part of the way to a better understanding. But there's more to the story than they imagine, or tell.

 

Money's Origins

 

We learn from introductory college economics that money serves three purposes: it is a medium of exchange, a unit of account (that is, a way of setting prices), and a store of value. Ferguson, the Laurence A. Tisch Professor of History at Harvard University and William Ziegler Professor at Harvard Business School, traces the earliest use of money to Mesopotamia about 5,000 years ago, when clay tablets were sometimes used to confirm specific transactions. The Romans had coins made of gold (aureus), silver (denarius), and bronze (sestertius), but for much of history most people had little recourse to money. In Civilization and Capitalism, 15th-18th Century: The Structures of Everyday Life (1992), the historian Fernand Braudel noted that the money economy "was nowhere fully developed, even in a country like France in the sixteenth and seventeenth centuries…."

Until the last few hundred years, money generally meant metals whose natural scarcity was thought to guarantee their value. The Roman coins reflected this logic; gold was worth more because there was less gold. In the medieval world, the quest for wealth often became the pursuit of gold and silver. The Crusades, Ferguson contends, were at least partly intended to plunder the Muslim world of its precious metals. The explorations of the 16th and 17th centuries sought, too, to ease Europe's scarcity of metals. The discovery of vast silver deposits in Peru and Mexico made Spain a dominant power. Convoys of up to 100 ships transported the metals to Seville, where the crown took a fifth for itself. In the late 16th century, this metallic bonanza accounted for nearly half of Spain's royal spending. 

Trade and war were crucibles of financial innovation. In Florence, the Medicis built their 15th-century banking empire in part by pioneering "bills of exchange": merchants who could not be paid immediately by their customers received "bills" (in effect, promissory notes) pledging payment at a fixed future date; the merchants could then raise cash by discounting the bills (that is, selling them at less than face value) with the Medicis. Bonds were created, Ferguson relates, by Italian city-states—initially Venice and Florence—to pay for wars. Wealthy families were required to make loans that, in theory, would be repaid from taxes in peacetime.
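
To make the discounting mechanics concrete, here is a minimal sketch in Python; the figures are invented, since the essay gives none. It shows how the banker's profit comes from buying a bill below its face value:

```python
# Hypothetical illustration of discounting a bill of exchange.
# All numbers are invented for the example; the source gives none.

face_value = 100.0      # florins promised at the bill's maturity
annual_discount = 0.12  # the banker's discount rate (assumed)
months_to_maturity = 6

# The merchant sells the bill today for less than its face value;
# the gap is the banker's compensation for waiting and for risk.
cash_now = face_value * (1 - annual_discount * months_to_maturity / 12)
print(f"Merchant receives {cash_now:.2f} florins today")            # 94.00
print(f"Banker's gain at maturity: {face_value - cash_now:.2f}")    # 6.00
```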

Gradually, gold and silver coin (referred to as "specie") begat credit, new securities, and paper money. The Dutch invented the modern corporation—and common stock—with the creation in 1602 of the United Dutch Chartered East India Company, which received a government monopoly on the country's trade with Asia. In Amsterdam alone, there were 1,143 initial investors in this early "joint stock company." (The English East India Company, founded two years earlier, was only an eighth its size.) A stock market quickly arose to allow investors in the Dutch company to sell their shares. Paper money emerged as a way of minimizing the burdensome transfer of large stashes of coin. The Bank of England, created in 1694 to help pay war debts, received distinct privileges in return for investors "converting a portion of the government's debt into shares in the bank." The most important of these came in 1742: a partial monopoly on the issuance of paper notes in and around London.

 

Mississippi and the South Sea

 

By the 1800s, then, many features of modern financial markets had come into being. There were banks, stock markets, paper money, creditors, debtors, and investors. This system enabled merchants to get loans to ship goods before receiving payment and farmers to buy supplies before harvests. (What we now call consumer credit barely existed.) Stock and bond markets encouraged the aggregation of investment capital for new ventures—canals, railroads, textile mills, and (later) steel mills. Then as now, financial intermediaries—mostly banks and merchant banks (which sold bonds for governments, railroads, and other industrial concerns)—were thought necessary to evaluate the risks of lending and investing. It was their ability to separate good loans and investments from bad that gave them a moral claim to profits and protected other people's money. Risk was spread and calibrated.

That was the theory.

In practice, financial panics and crashes have a long history. Profits were not always ensured; money was not always protected. Among early crises, France's "Mississippi Bubble" and England's "South Sea Bubble," which occurred almost simultaneously in the early 18th century, are well known. Both crises involved attempts to reduce steep government debts, incurred mainly to finance wars, by offering shares in new entities that were granted exclusive privileges: in France, trade in the Louisiana territory west of the Mississippi River; in England, a monopoly on trade with Spain's South American empire (the South Sea). The prospect of these profits was meant to persuade the countries' creditors to exchange their old claims for shares in the new enterprises. Today, we'd call this a "restructuring" of government debt.

Probably the advertised profits would never have materialized. But these schemes collapsed quickly, because the promoters could not resist speculative temptations. Early investors—including the promoters—stood to make a fortune if the price of their shares doubled or tripled. In France, John Law, a renegade Scotsman, designed and managed the plan, including a bank that could issue paper money. Law promised 40% dividends on shares of the Mississippi Company. The dividends were paid with paper money. Investors could buy new shares by borrowing (again, in paper money) against the old. In June 1719, the Mississippi Company issued stock at 550 livres per share. By early September, the price was 5,000; by early December, it was 10,025!

Inflation emerged with a vengeance, as the volume of paper money (which supplemented gold and silver) soared. By the fall of 1720, Paris prices had roughly doubled from two years earlier. Meanwhile, shares of the Mississippi Company dropped to 1,000 in December and continued to plunge. Riots erupted; Law was briefly imprisoned. In England, the South Sea Bubble was smaller and less ruinous because its promoters could not create paper money at will. South Sea stock rose by a factor of 9.5 from its initial price to its peak, Ferguson relates. The comparable increase for the Mississippi Company was 19.6.

 

Fixing the System

 

Money and finance posed other perils. One was inflation. But that seemed a problem only when countries abandoned gold and silver—as the French did in 1720 and later in the Revolution—and printed vast amounts of paper money. Mostly, paper money was limited by a country's supply of gold. The Bank of England's paper notes could by law be redeemed for gold. By the late 1800s, most developed countries had adopted the gold standard. In the United States, paper money had always been redeemable for gold or silver, with the notable exception of more than $500 million of "greenbacks" issued to pay for the Civil War. In the late 19th century, Americans' main complaint was deflation, or falling prices, because new gold supplies didn't keep pace with the demand for money. Debtors, particularly farmers, felt aggrieved because they had to repay loans in more expensive dollars.

Bank panics seemed a greater threat than inflation. Depositors might periodically lose confidence that they could get their money. Bad loans, or rumors of bad loans, could trigger runs. Banks would be imperiled because no bank can meet the simultaneous demands of all depositors—most of the money, after all, has been lent out. A run on one bank might cause runs on others. The entire banking system could collapse, depriving borrowers of loans and depositors of cash. One solution, as argued in 1873 by Walter Bagehot, the legendary editor of The Economist, was to have the Bank of England—or any central bank—act as "lender of last resort." It would lend to solvent banks (whose assets exceeded their liabilities) in times of crisis. Depositors, reassured that they could retrieve their money, would leave it be.
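
The arithmetic behind this fragility is simple enough to sketch. The following toy example, with assumed figures, illustrates why a fractional-reserve bank can meet ordinary withdrawals but never a full run:

```python
# Why no bank survives a run: a fractional-reserve toy example.
# The numbers are invented; the principle is the passage's point.

deposits = 100.0
reserves = 10.0              # cash kept on hand (assumed 10%)
loans = deposits - reserves  # the rest has been lent out

def withdrawals_met(demanded):
    """A bank can pay out today only what it holds in reserve."""
    return demanded <= reserves

print(withdrawals_met(8.0))   # True: a normal day, confidence intact
print(withdrawals_met(40.0))  # False: a panic -- loans cannot be
                              # recalled fast enough to pay depositors
```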

Together, the gold standard and the lender of last resort seemed to ensure adequate financial stability. They buttressed confidence and trust. After the brutal Panic of 1907, Congress established the Federal Reserve—America's central bank—in 1913. But the Depression destroyed the prevailing consensus. Defending the gold standard and serving as lender of last resort were at odds. The first required central banks to be stingy with money and credit; the second, just the opposite. Temporarily, the gold standard prevailed, but the social costs were too great. From 1929 to 1933, two-fifths of U.S. banks failed. Ultimately, all advanced societies abandoned the gold standard. In 1933, Congress created deposit insurance; that would be the first line of defense against panic. Banks also would be strictly regulated and examined; banks engaging in shoddy or fraudulent practices would be shut. That was a second line of defense. And finally, the Fed could still be lender of last resort. That was a final defense. 

In post-World War II America, these defenses seemed to have solved the problem of confidence and trust for good. True, there were occasional bank failures and stock market fluctuations. But these were seen as isolated events that did not impugn the system's overall integrity. Hardly anyone worried about financial panics. They seemed relics of a bygone era. 

 

The 2008 Slump

 

We know now that this optimism was an illusion that helped foster the present crisis. Since the late summer of 2007, we've experienced a worldwide credit implosion that has depressed production, employment, stock prices, and confidence almost everywhere. Taking financial stability for granted, money managers, bankers, traders, government officials, and ordinary investors did things that destroyed financial stability. 

The standard story of how this occurred is well told by Ferguson, whose book was completed in mid-2008 after the housing crisis hit, and by Nobel Prize-winning Princeton economist and New York Times columnist Krugman, whose 1999 book was newly updated and published in December. The debacle began with so-called "subprime" mortgages, extended to borrowers with weak credit histories, low incomes, or both. These mortgages were then bundled into complex bonds—including "collateralized debt obligations" (CDOs)—and sold to investors, who bought various "tranches" (or segments) of the mortgages' cash flows. Investors in the safest tranches had the first claim on mortgage payments, so they got the lowest interest rate but had the highest probability of being paid. Investors in lower tranches got just the opposite—a higher interest rate but more exposure to losses if borrowers defaulted.
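
The seniority logic can be sketched as a simple cash-flow "waterfall." The two-tranche structure and figures below are invented for illustration; real CDOs had many tranches and far more intricate rules:

```python
# Toy cash-flow "waterfall" for a two-tranche mortgage security.
# Figures are invented; real CDOs were vastly more complex.

senior_claim, junior_claim = 80.0, 20.0  # tranche sizes (assumed)

def distribute(collected):
    """Pay the senior tranche first; the junior absorbs any shortfall."""
    senior_paid = min(collected, senior_claim)
    junior_paid = min(collected - senior_paid, junior_claim)
    return senior_paid, junior_paid

# If borrowers repay in full, both tranches are made whole.
print(distribute(100.0))  # (80.0, 20.0)

# A 15% loss in the mortgage pool wipes out most of the junior tranche
# while leaving the senior tranche untouched -- hence its higher rating
# and lower interest rate.
print(distribute(85.0))   # (80.0, 5.0)
```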

The marketing of subprime loans was often sleazy. "Subprime lending hit Detroit like an avalanche of Monopoly money," writes Ferguson. "The city was bombarded with radio, television, direct-mail advertisements and armies of agents and brokers, all offering what sounded like attractive deals." Credit standards and loan documentation deteriorated. Some loans, later called NINJA—meaning borrowers had "No Income, No Job, or Assets"—were fraudulent. Still, these loans moved briskly along the financial assembly line—bankers or mortgage brokers made loans; the loans were sold to investment bankers who "securitized" them into bond-like securities; rating agencies like Moody's and Standard & Poor's graded the different tranches, allowing them to be sold to investors—banks, pensions, hedge funds—who thought they knew what they were buying.

They didn't. Ratings proved optimistic. When subprime borrowers began defaulting, a chain reaction ensued. Banks and other investors suffered large losses. To cover the losses, they had to sell other financial assets, or raise new capital. Selling other stocks and bonds drove down their prices, creating more losses for the entire system and generating a larger need for capital. In addition, many banks, investment banks, and hedge funds had relied heavily on borrowed money—"leverage," in financial jargon. At some investment banks, leverage ratios exceeded 30-to-1: there was $30 of borrowed money for every $1 of capital. As losses mounted, lenders (often other banks and investment banks) grew increasingly skittish about renewing their loans. That intensified the pressure to sell assets. The sturdiness of this jerry-built structure of interconnected loans and credits depended on trust and confidence, but trust and confidence were rapidly eroding.
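
The 30-to-1 ratio the essay cites implies a razor-thin margin for error, which is why lenders grew skittish so fast. A back-of-the-envelope sketch (the loss scenarios are invented):

```python
# How 30-to-1 leverage magnifies small losses into insolvency.
# The 30:1 ratio is from the text; the loss percentages are invented.

capital = 1.0
borrowed = 30.0
assets = capital + borrowed  # $31 of assets per $1 of the firm's own capital

for loss_pct in (0.01, 0.02, 0.03, 0.04):
    loss = assets * loss_pct
    remaining = capital - loss
    print(f"{loss_pct:.0%} asset loss -> capital of {remaining:+.2f}")

# A decline of barely 3.2% in asset values erases all the firm's capital;
# anything beyond that leaves the lenders holding the losses.
```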

Government tried to bolster confidence through injections of credit and capital. These efforts only partially succeeded. The Bush Administration engineered the rescue of the investment bank Bear Stearns in March 2008. It did not rescue Lehman Brothers in mid-September. That decision created more losses, deepening mistrust and worsening the crisis. The subsequent $700 billion Troubled Asset Relief Program (TARP) added money to the system but not enough to restore confidence. All the financial setbacks weakened the real economy of production and jobs. Suffering huge losses on stocks and homes, American consumers curbed spending. Securitized lending for homes and vehicles fell dramatically. Other countries felt the effects through lower exports to the United States and declines in their financial markets. Global investors sold worldwide, not just in American markets. 

 

Scapegoating the Crisis

 

Though complex, this "deleveraging" resembled an old-fashioned bank run. Lenders suddenly hoarded cash in the face of growing losses and threats to their own sources of credit. The usual explanations for the crisis—greed, herd behavior, and stupidity—are accurate, up to a point. All along the financial supply chain, mortgage bankers, investment bankers, and rating agencies collected hefty fees and passed the risk on to someone else. Quick profits substituted for independent judgment. Because everyone was doing it, it seemed okay. But why did all these people—many of them very smart—succumb so easily? It had been almost eight decades since a similar financial collapse had occurred. During these decades, greed, herd behavior, and stupidity hadn't taken a holiday.

The standard view is that, since the 1980s, government's relaxed regulation of financial markets had permitted reckless behavior that otherwise would have been restricted. The spread of securitization and the rise of hedge funds and investment banks in the 1990s created a "shadow banking system"—a parallel network for channeling investment and credit—that was largely unregulated. As Krugman puts it:
 

As the shadow banking system expanded to rival or even surpass conventional banking in importance, politicians and government officials should have realized that we were recreating the kind of financial vulnerability that made the Great Depression possible—and they should have responded by extending regulation and the financial safety net to cover these new institutions.

 

On paper, enlightened regulators might have averted the crisis. Mortgage bankers and brokers could have been prevented from making abusive, unrealistic loans. Investment banks and hedge funds could have been limited in their leverage. Rating agencies could have been better supervised. But all this is hindsight. The unstated—and unrealistic—assumption is that regulators would have spotted financial vulnerabilities that their private counterparts missed. Two bits of evidence suggest that this is wishful thinking. First, regulators didn't prevent subprime blunders at the most heavily regulated financial institutions, commercial banks. Second, regulators at the Securities and Exchange Commission were explicitly warned about the Bernie Madoff swindle and still couldn't find it. 

It's doubtful that government regulators are smarter or better informed than private bankers and investors. True, they have a different mandate and face different incentives: not profit maximization and self-enrichment, but crisis-minimization and bureaucratic power, prestige, and independence. But they are not miracle workers. Chances are that they, too, would not have defused the emerging crisis. Free-market ideology is a convenient explanation and scapegoat for the crisis. But it does not really explain what happened; Ferguson and Krugman don't get to the crux of the matter. 

 

Mistaking Profits for Wisdom

 

People are conditioned by their own experiences. With hindsight, we know that investors, traders, and bankers engaged in reckless risk-taking that created economic and financial havoc. But while this dangerous speculation flourished, its participants mostly thought that the economy and financial markets had become safer. The paradox is that, believing the world was growing less risky, they took actions that made it more risky. 

In some ways, their self-deception was understandable. By many indicators, the economy and financial markets seemed remarkably tranquil. Since the early 1980s, the economy had suffered only two modest recessions, those of 1990-91 and 2001, each lasting only eight months. Though job creation was sometimes sluggish, peak monthly unemployment during these decades reached only 7.8%, well below the monthly highs of the 1970s (9%) or the early 1980s (10.8%). Even after the dot-com speculation of the late 1990s and the September 11 attacks, the economy had not gone into a deep slump. The business cycle seemed tamed, if not conquered. Economists called this improvement the Great Moderation. 

For Wall Street, these years were a bonanza. As interest rates dropped, investors moved into stocks. In 1982, the Dow Jones Industrial Average averaged 884. By 1989, it averaged 2,509; by 1999, 10,465. Bond prices rose, because interest rates and bond prices are mirror images: lower rates mean higher prices. In 1982, AAA corporate bonds carried rates of nearly 14%; by 1999, they were down to 7%. On both stocks and bonds, investors earned fabulous profits, year in and year out. Falling interest rates also boosted housing prices, because buyers could afford to pay more for homes. In 1981, interest rates on 30-year fixed mortgages averaged nearly 15%; by 1999, they were down to 7%. Existing homeowners enjoyed huge windfalls in higher prices.
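
The inverse relation between rates and bond prices follows from present-value arithmetic. A brief sketch, using the essay's 14% and 7% AAA rates as endpoints (the 10-year maturity is an assumption):

```python
# Why bond prices and interest rates move inversely: a fixed coupon
# is worth more when discounted at a lower market rate.

def bond_price(face, coupon_rate, market_rate, years):
    """Present value of a bond's coupon stream plus its principal."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + market_rate) ** t
                     for t in range(1, years + 1))
    pv_principal = face / (1 + market_rate) ** years
    return pv_coupons + pv_principal

# A bond issued at a 14% coupon (the essay's 1982 AAA level), repriced
# as market rates fall toward 7% (the 1999 level):
for rate in (0.14, 0.10, 0.07):
    print(f"market rate {rate:.0%}: price {bond_price(100, 0.14, rate, 10):.2f}")
# Output climbs from 100 toward roughly 149 as rates fall.
```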

Everything seemed less hazardous and more predictable. In many markets, "volatility" declined. As a financial term, volatility measures typical swings in prices—of stocks, bonds, foreign exchange. Higher volatility signifies greater risk; traders don't know what prices should be. Less volatility suggests less risk. By 2004, volatility in financial markets had dropped sharply. Everyone was aware of it. Risk seemed in retreat. Governments took note. A 2006 study published by the Bank for International Settlements, whose members are government central banks, suggested that the improvement might be "permanent." The reasons included the growth of sophisticated money managers ("well informed agents") and new securities ("risk transfer instruments") that permitted more hedging. Ironically, these would later be cited as causes of the financial crisis. 
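
Volatility in this sense is conventionally computed as the standard deviation of period-to-period returns. A minimal sketch, with invented price series:

```python
# "Volatility" in the financial sense: the standard deviation of
# period-to-period returns. The price series are invented.
import statistics

def realized_volatility(prices):
    """Standard deviation of simple one-period returns."""
    returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
    return statistics.stdev(returns)

calm   = [100, 100.5, 101, 100.8, 101.2, 101.5]  # small swings
choppy = [100, 104, 97, 103, 96, 102]            # large swings

print(f"calm market:   {realized_volatility(calm):.4f}")
print(f"choppy market: {realized_volatility(choppy):.4f}")
# The choppy series scores far higher. By 2004, measures like this
# had dropped sharply across markets -- the calm the essay describes.
```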

If risk had retreated, then once-dangerous practices were safer. Investment banks and hedge funds could assume more leverage (which improved profitability) because volatility (threatening big losses) had declined. Lending standards for mortgages could be relaxed, because even if borrowers defaulted, the relentless rise of home prices would protect lenders against losses. Foreclosed homes could be sold for more than the value of the loan. So, all manner of adventurous behavior was rationalized. Carelessness and complacency were made respectable, even as greed and herd behavior were indulged. In Congress, Democrats pushed the giant government-created mortgage lenders Fannie Mae and Freddie Mac to expand credit for poorer borrowers. Investment houses created and marketed new securities. On Wall Street, there developed a culture of ostentatious, often obnoxious self-congratulation. Some of its wealthiest practitioners assumed airs of superior insight, mistaking profits for wisdom.

 

Too Much Success

 

What virtually everyone overlooked was that much of this bonanza was the result of good luck, not greater financial acumen. It was the consequence of falling inflation, one of the momentous (if poorly appreciated) economic events of our time. From 1979 to 1989, consumer price inflation dropped from 13.3% to 4.6%; by 2001, it was 1.6%. Interest rates followed inflation down, because rates reflect an inflationary component: lenders want to be compensated for the erosion of their money. As rates dropped, stock prices, bond prices, and real estate prices rose; investors shifted money out of savings accounts and money market funds and into stocks. Economic expansions lengthened because high inflation had been destabilizing. Consumers borrowed more and spent more, because they counted some of their new stock and housing wealth as saving. Stronger consumer spending bolstered America's economy—and the world's, too, because Americans bought vast amounts of imported goods.
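
The claim that rates reflect an inflationary component is the Fisher relation: a nominal interest rate is roughly a real rate plus expected inflation. A sketch using the essay's inflation figures (the constant 3% real rate is an assumption):

```python
# The Fisher relation behind "interest rates followed inflation down":
# nominal rate ~= real rate + expected inflation.
# The 3% real rate is assumed; the inflation figures are the essay's.

real_rate = 0.03  # assumed constant real return demanded by lenders

for year, inflation in [(1979, 0.133), (1989, 0.046), (2001, 0.016)]:
    nominal = real_rate + inflation
    print(f"{year}: inflation {inflation:.1%} -> nominal rate ~{nominal:.1%}")

# As inflation fell from 13.3% toward 1.6%, nominal rates fell with it,
# lifting the prices of stocks, bonds, and homes, as the essay describes.
```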

It is hard to argue that the defeat of double-digit inflation, engineered by Federal Reserve chairman Paul Volcker and President Ronald Reagan in the early 1980s, was a bad thing. It was the fundamental cause of the long economic expansions of the 1980s and '90s. Falling inflation created "virtuous circles" for both financial markets and the "real economy." But, perversely, it also led to bad consequences, because its great benefits induced economic imbalances and beliefs that were ultimately self-defeating. The great profits made in financial markets gave money managers, investment bankers, and analysts an exaggerated sense of their own skills and understanding. Long expansions and shallow recessions encouraged lenders to make loans to weaker borrowers. In 2005, only 3% of subprime mortgage borrowers were in default (by late 2008, the figure was 13%). 

Given the initial rise in stocks and home prices, households could borrow more. But they could not endlessly increase their ratio of debt to income—which is what happened, in part because lending standards became more lax. Americans could buy more imports, but trade imbalances based on a "strong" dollar that overpriced U.S. exports and underpriced imports could not grow indefinitely without making countries like China and Japan too export-dependent. Because they were so reliant on Americans' ever-rising indebtedness to buy imports, the structure of the world economy became dangerously unstable. And the "strong dollar"—the linchpin of the entire system—would not have existed without the low inflation that buttressed faith in the dollar's purchasing power.

What this suggests is that prolonged prosperity was the underlying cause of the great financial meltdown. Too much success bred failure. Overcoming high inflation was a triumph, but the ensuing prosperity warped private behavior and public policies in ways that undermined prosperity. Money managers, lenders, and many ordinary Americans were lulled into a false sense of security, control, and optimism. So were government officials. After the dot-com bubble and 9/11, the Federal Reserve cut short-term interest rates to 1%. Such a move was possible only because the Fed was a credible inflation fighter. With modest inflationary expectations, low interest rates didn't cause price increases. Cheap credit softened the recession but also exacerbated the housing bubble and financial speculation. Riskier borrowers, at home and abroad—including financial institutions—got loans. Too much trust and confidence destroyed trust and confidence.

 

Balancing Markets and Regulation

 

Modern, advanced democracies are dedicated in part to the delivery of as much prosperity as possible to as many people as possible for as long as possible. The troubling implication of the current crisis is that this promise is itself a source of instability. Behind the promise lies the presumption that economic and financial knowledge have improved sufficiently to allow governments to supervise and manage the financial system and the larger economy. We had supposedly gone beyond the era of inevitable "booms and busts." The advance in knowledge meant that governments could legitimately be held accountable for economic performance. In a general sense, this will surely continue. The promise won't be revoked, and the presumption won't be repudiated. If some policies don't succeed, others will be proposed. But the innate human tendency to overdo things suggests that the very striving for a perpetual, ever-improving prosperity creates its own booms and busts.

The present crisis is evidence of this maddening interplay. What defines today's crisis is that it originated in the behavior of households and financial markets, which came to rely on too much debt, and in the lopsided international trade imbalances that were inherently unstable. Whenever the resulting prosperity seemed threatened, government—mostly through the Federal Reserve—moved aggressively to extend and prolong it. The initial success of these policies fed the illusion that financial instability had been contained and the Great Moderation was an enduring feature of our system. As these assumptions subconsciously spread, ordinary Americans, businesses, and investors acted increasingly in ways that made the assumptions false.

The news is sobering for ideologues of all varieties. For those who place great faith in "markets," the lesson of the present crisis is that they are sometimes given to destructive instability and, though they may ultimately self-correct, the wild swings—either up or down—may involve such huge social costs that no democratically elected government could watch passively and wait for them to play out. For those who believe in the virtues of government regulation and government intervention, the lesson is that too much intervention to produce "sustained growth" achieves at best pyrrhic victories: temporary gains from longer expansions that are followed by deeper, longer, more punishing slumps.

There is, it seems, no self-evident "happy medium," no utopian mix of market power and government power that will achieve perpetual expansion. It might be better to tolerate more frequent, milder recessions and financial setbacks than to strive for some superficially more appealing but unattainable ideal. But just what that mix would be and whether it would be politically acceptable are hard to know. The fact that so much economic activity now involves international flows of goods, services, and money compounds the difficulty. These questions have not inspired much rigorous thinking, in part because people are preoccupied by the present crisis and in part because politically attractive solutions seem hard to imagine. The financial meltdown has led to an intellectual meltdown.