Spain’s Lessons for American Decline
Spain was the first global superpower. Obviously, there had been other great powers—Rome, the China of the Tang and Ming dynasties, the vast Mongol domains—but none had spanned oceans and continents the way the Spanish Empire did at its height. In the first half of the 16th century, Charles V reigned over great swathes of Europe; his son Philip II controlled most of the Western Hemisphere as well as a sizable chunk of Asia (the Philippines were named after him). Imperial Spain’s maximum territorial reach would be surpassed only by the British Empire in the 19th century and, in the 20th, by the informal American imperium, with its 750 overseas bases and network of global alliances.
But then Spain blew it. Already by the middle of the 17th century, under the crisis-ridden rule of Philip IV, the Iberian kingdom “had been left behind by the rest of Europe,” as John A. Crow wrote in his classic study, Spain: The Root and the Flower. England’s emergent sea power had dealt an early, crippling blow to Spanish naval might in the 1588 defeat of the Armada. A little more than three centuries later, the United States would effectively end Spain’s overseas empire, seizing control of its last colonies in Cuba, Puerto Rico, the Philippines, and Guam. Between these two catastrophes there intervened a long period of slow decline.
Those contemplating the possible demise of American global hegemony most often turn for lessons to Rome or the Soviet Union, but the parallels between America and Spain are striking. Both countries, in their formative, pre-imperial periods, were defined by processes of territorial expansion across shifting frontiers: the reconquista of southern Spain from the Moors and the conquest of the American West. In both cases, the closing of the frontier—Spain’s in 1492 with the capture of Granada by Ferdinand and Isabella, America’s famously marked by the historian Frederick Jackson Turner in 1893—coincided with the initial phase of overseas territorial expansion that would lead to superpower status. Columbus arrived in the Caribbean the same year Granada fell; America’s seizure of Cuba, Puerto Rico, and the Philippines occurred in 1898, the same year Congress ratified the annexation of Hawaii.
But the most revealing parallels relate to a different expansionary dynamic—that of money. The key to so much else that happened to both countries was the appearance of what seemed like unlimited wealth but was actually access to unlimited quantities of a universal medium of exchange, craved and accepted everywhere. In the late 16th century, the Spanish elite could buy whatever it wanted wherever it wanted with the gold and silver that were pouring into Spain from the mines of Peru and Mexico. Today, the American ruling class can do the same with US dollars created at will and deposited in the memory banks of the Federal Reserve’s computers. That Midas-like power permitted elites in both countries to confuse money with what it could buy, and led to financialization, politically dangerous levels of inequality, and the wasting of wealth on endless wars aimed at remaking the world in the image, respectively, of Iberian Catholicism and American democracy.
In his 2015 book Killing the Host, economist Michael Hudson elaborates on this similarity:
Despite its vast stream of gold and silver, Spain became the most debt-ridden country in Europe—with its tax burden shifted entirely onto the least affluent, blocking development of a home market. Yet today’s neoliberal lobbyists are urging similar tax favoritism to un-tax finance and real estate, shift the tax burden onto labor and consumers, cut public infrastructure and social spending, and put rentier managers in charge of government. The main difference from Spain and other post-feudal economies is that interest to the financial sector has replaced the rent paid to feudal landlords.
The process by which the fiat currency of the United States became the specie of our time was more convoluted than Spain’s pillaging of the Aztec and Inca empires. But the story is worth telling, since the dollar’s hegemony is more than a simple natural consequence of America’s position as the world’s premier economic and military power. It might not have happened had it not been for a series of accidents—among them, an ill-conceived tax measure and an assassination attempt against a popular president—and measures taken for their own reasons by some non-Americans, including an eminent German-Jewish banker, Arabs seeking to avoid US sanctions, and portfolio managers at Japanese insurance companies.
Dollar hegemony had its origins in America’s emergence from World War I as the planet’s principal creditor and supreme economic power. But it took the Bretton Woods Conference of 1944 to establish the dollar’s central role in global finance. Bretton Woods is the New Hampshire resort where the Allied powers convened to construct a formal monetary order for the postwar world. The architects of that order, Harry Dexter White and John Maynard Keynes, sought to avoid any return to the monetary chaos of the interwar years, when Washington had refused to step up and manage global monetary matters at a time when no other government—particularly Britain’s—had the power to do so anymore.
Bretton Woods, which became shorthand for the monetary regime that prevailed between 1945 and 1971, was indispensable to the three decades of prosperity that followed the end of World War II. But this arrangement proved too rigid to survive the onset in the late 1960s of American trade deficits, together with their flipside: Japanese and West German trade surpluses. (Keynes had attempted to insert a mechanism into the Bretton Woods system that would have required countries to take steps to reduce surpluses; he was overruled by American delegates unable to foresee the day when the United States could or would run trade deficits.) The advent of structural deficits on America’s external accounts didn’t—contrary both to expectations at the time and to conventional theory—dethrone the dollar, despite a rough decade for the American currency after Richard Nixon unilaterally reneged on Washington’s obligation to exchange gold for dollars presented by foreign central banks. Instead, an unforeseen series of events cemented the dollar’s place at the core of global finance.
The story starts back in 1963, when the Kennedy administration sought to tax interest payments from foreign companies and governments that went to New York to borrow money. Siegmund Warburg, a refugee from Hitler’s Germany who went on to establish Britain’s premier merchant bank, demonstrated that the tax could be circumvented by raising dollars in London from non-Americans via transactions managed by British and European, rather than American, banks, governed by UK law, and overseen by UK regulators. Washington reacted with outrage, but the long-term result was the largest offshore financial market the world had ever seen—the market in eurodollars and eurobonds, which played an indispensable role in making the dollar the world’s currency. Another result was that London regained its place as the world’s financial capital, demoting New York to secondary status.
This arrangement meant, for instance, that if you were an Arab oil company in 1974, you instructed your customers to pay for your oil by depositing dollars into Arab accounts with banks in London, where the dollars were beyond the reach of American regulators seeking, say, to sanction Arab countries for going to war with Israel. (The Saudis briefly considered billing their customers in some currency other than the dollar but quickly realized that no other currency circulated in quantities sufficient for their customers to pay their oil-import bills.) Hence, countries like Brazil and Argentina went to London to borrow the dollars they needed to meet the Arab oil exporters’ demands. American trade deficits and military spending overseas may have fed the ever-growing pool of dollars circulating outside the United States, but it was the infrastructure of the eurodollar and eurobond markets that ensured path dependence would begin to take hold. If most companies were billing their foreign customers in dollars and using dollars to pay their cross-border obligations, yours was probably going to do so, too.
The path dependence that would entrench the dollar’s supremacy hadn’t yet become fully established in the mid-1970s, even if Saudi Arabia and other OPEC nations had reluctantly concluded they had no choice for the time being but to bill their customers in dollars. The 1970s were, in fact, a bad time for the American currency: It lost almost two-thirds of its purchasing power during that decade, and the search for an alternative grew desperate enough to send the dollar’s value plunging on foreign-exchange markets in 1978.
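A rough back-of-the-envelope calculation gives a feel for what that erosion implies, on the simplifying assumption (not stated in the text, which does not say how the loss is measured) that purchasing power compounded away at a steady annual inflation rate $i$: a dollar then retains $(1+i)^{-n}$ of its value after $n$ years, so losing two-thirds of it over a decade means

$$(1+i)^{10} \approx 3 \quad\Longrightarrow\quad i \approx 3^{1/10} - 1 \approx 11.6\%\ \text{a year,}$$

a pace of erosion that helps explain why holders of dollars were casting about for something better.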