With inflation soaring in the U.S., economists from the monetary policy analytics and forecasting firm LH Meyer say the U.S. Federal Reserve could stop shrinking its balance sheet earlier than expected. However, critics have said the U.S. central bank hasn’t really shrunk its balance sheet at all, and the Fed has been accused of keeping quantitative easing (QE) practices going by continuing to purchase long-term securities from the market.
Forecasting Firm LH Meyer Predicts Fed Could Stop Shrinking the Balance Sheet Earlier Than Expected, While the Central Bank’s Reductions Remain Contested
U.S. monetary policymakers are up in arms over the economy’s inflationary pressures and the ongoing debate over the technical definition of a recession. Analysts suspect the Federal Reserve will raise the federal funds rate by at least 75, and possibly as much as 100, basis points (bps) at its next meeting.
In addition to the rate hikes, the Fed said it would begin reducing its nearly $9 trillion balance sheet on June 1. The central bank said at the time that it would gradually stop reinvesting the proceeds of maturing Treasuries and mortgage-backed securities (MBS), letting them roll off the balance sheet.
With the war continuing in Ukraine and inflation rising last month at its fastest pace in over 40 years, many economists believe the U.S. central bank has a lot of work to do when it comes to monetary tightening. Larry Summers, a former economic adviser to ex-president Barack Obama, recently said the Fed has a difficult problem to deal with.
When speaking about a recession, Summers insisted that things will depend on “how skillful the [Federal Reserve] turns out to be… They’ve got a very, very difficult problem of balance in setting monetary policy, given the situation in which we find ourselves.”
The latest U.S. Consumer Price Index (CPI) report showed a 9.1% year-over-year increase for June. Even so, growing recession fears have caused a number of people to suspect the Fed could turn dovish on the next two federal funds rate hikes and possibly halt the central bank’s balance sheet reduction.
However, the Fed’s balance sheet reduction that was supposed to start in June has been contested, and many observers think the Fed has continued QE. Meanwhile, economists from the forecasting firm LH Meyer say the Fed’s reduction “may stop early as recession risk rises,” according to a report published by the Wall Street Journal (WSJ).
The WSJ article details that recession risk may make the Fed stop shrinking its balance sheet “sooner than expected,” according to the LH Meyer economists. The researchers at the firm predict a recession is likely to take place in 2024. Furthermore, the report explains that it’s possible the U.S. central bank could halt quantitative tightening (QT) by next year.
When the WSJ shared the article via Twitter, many criticized the report because they don’t believe the Fed has reduced its balance sheet at all. “It never started,” one individual wrote. “Balance sheet keeps growing, there was no reduction,” another person replied.
Critics Claim Fed’s QE Programs Are Fully Operational
At the end of June, the gold bug and economist Peter Schiff denounced the U.S. central bank for continuing the QE process. “The Fed’s balance sheet just expanded for the third week in a row in June,” Schiff said. “The rise of $1.9 billion increased the size of the Fed’s balance sheet to $8.934 trillion. I wonder when the Fed will stop creating inflation by ending QE and actually start fighting it by beginning QT.”
On July 15, the author and market maniac at Welt, Holger Zschaepitz, said the Fed “has already stopped the shrinking of the balance sheet.” Zschaepitz added:
Total assets grew by $4bn the past week to $8.896tn. Fed balance sheet now equal to 36.5% of [the] U.S.’s GDP vs ECB’s 81.9% and BoJ’s 135%.
The Twitter account called Occupy the Fed Movement spoke about the Fed continuing QE the day before Zschaepitz’s tweet. “FED BS Update: FED increases balance sheet by $4BN ($3.3BN “other assets”) the same week that CPI prints 9.1%,” Occupy the Fed wrote. “USTs up $1.1BN and MBS flat despite supposed QT plans. FED is clearly serious about fighting inflation,” the Twitter account sarcastically added.
For years now, the Federal Reserve has been accused of bailing out mega banks and creating unnatural booms and busts in the American and global economies. Since 2020, the Fed’s balance sheet has been significantly larger than at any time in history, and the money supply growth since that year is hard to fathom.
What do you think about the recent WSJ report that says the Fed could halt the shrinking of its balance sheet? What do you think about the accusations that say the U.S. central bank hasn’t shrunk the balance sheet much at all? Let us know what you think about this subject in the comments section below.
The head of the International Monetary Fund (IMF), Kristalina Georgieva, has warned that the global economic outlook “has darkened significantly,” emphasizing that, regarding a global recession, “we cannot rule it out.”
IMF Says Global Economic Outlook ‘Has Darkened Significantly’
Kristalina Georgieva, the managing director of the International Monetary Fund (IMF), discussed the global economic outlook and the prospect of a global recession in an interview with Reuters Wednesday.
Commenting on the global economy, the IMF managing director said:
The outlook since our last update in April has darkened significantly.
She cited several factors, including a more universal spread of inflation, more substantial interest rate hikes, China’s economic growth slowdown, and mounting sanctions related to the Russia-Ukraine war.
In April, the IMF slashed its global growth forecast from an estimated 6.1% in 2021 to 3.6% in 2022 and 2023. This was “0.8 and 0.2 percentage points lower for 2022 and 2023 than projected in January,” the Fund noted at the time.
The IMF will be cutting its global growth forecast further later this month, Georgieva noted, adding that it will be the third downgrade this year.
Global Recession Cannot Be Ruled Out
When asked about the prospect of a global recession, the IMF managing director said:
The risk has gone up so we cannot rule it out.
“We are in very choppy waters,” she continued. Investors are becoming increasingly concerned about recession risks.
Georgieva noted that recent economic data showed that some large economies, including those of China and Russia, had contracted in the second quarter. She cautioned that the risks were even higher in 2023.
The IMF boss said:
It’s going to be a tough ’22, but maybe even a tougher 2023 … Recession risks increased in 2023.
Georgieva believes that slower economic growth may be a “necessary price to pay,” citing the urgent and pressing need to restore price stability.
She opined: “We need to create the same strong level of coordination between central banks and finance ministries so they provide support in a very targeted way … and don’t weaken what monetary policies are aiming to achieve.”
In June, World Bank President David Malpass warned about a possible global recession. “For many countries, a recession will be hard to avoid … This is the sharpest slowdown in 80 years,” he said.
What do you think about the comments by IMF Managing Director Kristalina Georgieva? Let us know in the comments section below.
Fears of a recession and a 1970s-style stagflation economy continue to grip Wall Street and investors this week, as multiple reports show that recession signals have intensified. With oil and commodity prices surging, Reuters reports that investors are “recalibrating their portfolios for an expected period of high inflation and weaker growth.”
While Wall Street Fears Stagflation, Analyst Believes ‘Global Markets Will Collapse’ This Year
This week there’s been a slew of headlines indicating that fears of a 1970s-style stagflation economy have risen and economic fallout is coming soon. Three days ago, Reuters author David Randall noted that U.S. investors are scared of a hawkish central bank, surging oil prices, and the ongoing conflict in Ukraine. Randall spoke with Nuveen’s chief investment officer of global fixed income, Anders Persson, and the analyst noted that stagflation isn’t here just yet, but the economy is getting close to that point.
“Our base case is still not 1970s stagflation, but we’re getting closer to that ZIP code,” Persson said.
On Saturday, Bitcoin.com News reported on the skyrocketing energy stocks, precious metals, and global commodities breaking market records. The same day, the popular Twitter account Pentoshi tweeted about a pending “greater depression.” At the time of writing, the tweet had been retweeted 69 times and had close to a thousand likes. Pentoshi told his 523,500 Twitter followers:
The most exciting thing this year. Will be global markets collapsing. Any market that trades above 0 will be too high. They will call this: ‘The greater depression’ which will be 10x worse than the Great Depression.
US Treasury Yield Curve Highlights ‘Recession Concerns Showing up More Prominently’
The following day, Reuters author Davide Barbuscia detailed that “recession concerns are showing up more prominently in the U.S. Treasury yield curve.” Data from Barbuscia’s report stresses that the “closely watched gap between yields on two- and 10-year notes stood at its narrowest since March 2020.”
Numerous financial publications are highlighting how rising oil and commodity prices are typically associated with a pending recession. Furthermore, recent filings indicate that Warren Buffett’s Berkshire Hathaway obtained a $5 billion stake in Occidental Petroleum. Berkshire Hathaway has also doubled its exposure to Chevron.
What do you think about the reported signals that show a recession or 1970s stagflation is looming over the economy? Let us know what you think about this subject in the comments section below.
A U.S. senator has warned about China’s central bank digital currency. “Analysts have raised the eCNY’s potential to subvert U.S. sanctions, facilitate illicit money flows, enhance China’s surveillance capabilities, and provide Beijing with ‘first mover’ advantages,” the senator informed Treasury Secretary Janet Yellen.
U.S. Senator Warns About the Threat From China’s Central Bank Digital Currency
U.S. Senator Pat Toomey sent a letter to Treasury Secretary Janet Yellen and Secretary of State Antony Blinken last week raising concerns about China’s central bank digital currency, the digital yuan.
“I write to request your engagement on a momentous development in Beijing this week: the rollout of the world’s first major central bank digital currency (CBDC) to a foreign audience,” he told Yellen and Blinken.
“While the United States is still evaluating the concept of a digital dollar, China is using the Beijing Winter Olympics as an international test for the digital yuan (eCNY), which has been piloted domestically since 2019,” the lawmaker from Pennsylvania described, elaborating:
Analysts have raised the eCNY’s potential to subvert U.S. sanctions, facilitate illicit money flows, enhance China’s surveillance capabilities, and provide Beijing with ‘first mover’ advantages, such as setting standards in cross-border digital payments.
The senator noted that “Beijing has also launched the first state-backed global distributed ledger infrastructure, the Blockchain-based Services Network (BSN).”
Furthermore, Senator Toomey commented on China’s cryptocurrency crackdown, stating: “China’s crackdown presents an opportunity for the United States to be the forerunner of crypto innovation, grounded in individual freedom, and other American and democratic principles.”
The senator continued:
Given the prospective threat to U.S. economic and national security interests, I request that the Treasury and State Departments closely examine Beijing’s CBDC rollout during the Olympic Games.
Senator Toomey also requested information on nine areas to be provided to his office by March 7.
They include how the digital yuan was distributed, strategies employed to advance eCNY adoption by Chinese and non-Chinese persons, eCNY adoption rate by foreigners, total issuance of the eCNY after the Olympic Games, lessons for the U.S. government, and possible challenges to U.S. interests.
In January, China’s central bank, the People’s Bank of China (PBOC), revealed that the digital yuan now has more than 261 million users, and transactions worth almost $14 billion have been made using the central bank digital currency. Last week, China designated 15 national pilot zones and 164 entities for blockchain projects.
Do you agree with Senator Toomey? Let us know in the comments section below.
How the U.S. succeeded in turning the dream of monetary imperialism into our current inequitable reality, and how Bitcoin offers a new standard.
In 1972, one year after President Richard Nixon defaulted on the dollar and formally took the United States off of the gold standard for good, the financial historian and analyst Michael Hudson published “Super Imperialism,” a radical critique of the dollar-dominated world economy.
The book is overlooked by today’s economic mainstream but puts forward a variety of provocative arguments that place it outside of the orthodoxy. However, for those seeking to understand how the dollar won the money wars of the past century, the book makes for essential reading.
Hudson’s thesis comes from the left-leaning perspective — the title inspired by the German Marxist phrase “Überimperialismus” — and yet thinkers of all political stripes, from progressives to libertarians, should find value in its approach and lessons.
In “Super Imperialism,” Hudson — who has updated the book twice over the past 50 years, with a third edition published just last month — traces the evolution of the world financial system, where U.S. debt displaced gold as the ultimate world reserve currency and premium collateral for financial markets.
How did the world shift from using asset money in the form of gold to balance international payments to using debt money in the form of American treasuries?
How did, as Hudson puts it, “America’s ideal of implementing laissez-faire economic institutions, political democracy, and a dismantling of formal empires and colonial systems” turn into a system where the U.S. forced other nations to pay for its wars, defaulted on its debt, and exploited developing economies?
For those seeking to answer the question of how the dollar became so dominant — even as it was intentionally devalued over and over again in the decades after World War I — “Super Imperialism” has a fascinating and, at times, deeply troubling answer.
Drawing on extensive historical source material, Hudson argues that the change from the gold standard to what he calls the “Treasury Bill Standard” happened over several decades, straddling the post-World War I era up through the 1970s.
In short, the U.S. was able to convince other nations to save in dollars instead of in gold by guaranteeing that the dollars could be redeemed for gold. But eventually, U.S. officials rug-pulled the world, refusing to redeem billions of dollars that had been spent into the hands of foreign governments under the promise that they were as good as gold through fixed rate redemption.
This deceit allowed the U.S. government to finance an ever-expanding military-industrial complex and inefficient welfare state without having to make the traditional trade-offs a country or empire would make if its deficit grew too large. Instead, since U.S. policymakers figured out a way to bake American debt into the global monetary base, the U.S. never had to pay off its debt. Counterintuitively, Hudson says, America turned its Cold War debtor status into an “unprecedented element of strength rather than weakness.”
As a result, the U.S. has been able to, in Hudson’s words, pursue domestic expansion and foreign diplomacy with no balance-of-payments concerns: “Imposing austerity on debtor countries, America as the world’s largest debtor economy acts uniquely without financial constraint.”
A key narrative in Hudson’s 380-page book is the story of how the U.S. government systematically demonetized gold out of the international economic system. Curiously, he does not mention Executive Order 6102 — issued by President Roosevelt in 1933 to seize gold from the hands of the American public — but weaves a compelling narrative of how the U.S. government pulled the world away from the gold standard, culminating in the Nixon Shock of 1971.
In Hudson’s view, leaving the gold standard was all about America’s desire to finance war abroad, particularly in Southeast Asia. He says the Vietnam War was “single-handedly” responsible for pushing the U.S. balance-of-payments negative and drastically drawing down America’s once staggering gold reserves.
Ultimately, Hudson’s thesis argues that unlike classic European imperialism — driven by private sector profit motives — American super imperialism was driven by nation-state power motives. It was not steered by Wall Street, but by Washington. Bretton Woods institutions like the World Bank and International Monetary Fund (IMF) did not primarily help the developing world, but rather harnessed its minerals and raw goods for America and forced its leaders to buy U.S. agricultural exports, preventing them from developing economic independence.
There are, of course, several criticisms of Hudson’s narrative. It can be argued that dollar hegemony helped defeat the Soviet Union, pressuring its economy and paving the way for a more free world; usher in the age of technology, science, and information; push growth globally with surplus dollars; and isolate rogue regimes. Perhaps most compellingly, history seems to suggest the world “wanted” dollar hegemony, if one considers the rise of the eurodollar system, where even America’s enemies tried to accumulate dollars outside of the control of the Federal Reserve.
Hudson was not without contemporary critics, either. A 1972 review in The Journal of Economic History argued that “it would require an exceptionally naive understanding of politics to accept the underlying assertion that the United States government has been clever, efficient, totally unscrupulous, and consistently successful in exploiting developed and developing nations.”
The reader can be the judge of that. But even with these criticisms in mind, Hudson’s work is important to consider. The undeniable bottom line is that by shifting the world economy from relying on gold to relying on American debt, the U.S. government implemented a system where it could spend in a way no other country could, in a way where it never had to pay back its promises, and where other countries financed its warfare and welfare state.
“Never before,” Hudson writes, “has a bankrupt nation dared insist that its bankruptcy become the foundation of world economic policy.”
In 1972, the physicist and futurist Herman Kahn said that Hudson’s work revealed how “the United States has run rings around Britain and every other empire-building nation in history. We’ve pulled off the greatest rip-off ever achieved.”
Governments always dreamed of transforming their debt into the most valuable asset on earth. This essay explains how the U.S. succeeded in turning this dream into a reality, what the implications for the wider world were, how this era might be coming to a close, and why a Bitcoin standard might be next.
I. The Rise And Fall Of America As A Creditor Nation
European powers, tempted by the ability to print paper money to finance war operations, broke off the gold standard entirely during World War I. The metal’s restraint would have resulted in a much shorter conflict and the warring factions decided instead to prolong the violence by debasing their currencies.
Between 1914 and 1918, German authorities suspended the convertibility of marks to gold and increased the money supply from 17.2 billion marks to 66.3 billion marks, while their British rivals increased their money supply from 1.1 billion pounds to 2.4 billion pounds. That amounted to a nearly four-fold expansion of the German money supply and more than a doubling of the British one.
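For concreteness, the expansion implied by the figures cited above can be checked directly; the sketch below (Python, treating 1914-1918 as a four-year compounding window, which is an approximation rather than a figure from the text) works out both the overall multiple and a rough annual growth rate.

```python
# Quick check of the wartime money supply figures cited above.
# The four-year compounding window (1914-1918) is an approximation.

def fold_and_annual_rate(start, end, years=4):
    fold = end / start                 # overall multiple
    annual = fold ** (1 / years) - 1   # implied average yearly growth
    return fold, annual

for name, start, end in [("Germany (bn marks)", 17.2, 66.3),
                         ("Britain (bn pounds)", 1.1, 2.4)]:
    fold, annual = fold_and_annual_rate(start, end)
    print(f"{name}: {fold:.1f}x overall, roughly {annual:.0%} per year")
```

This yields roughly a 3.9x increase for Germany and a 2.2x increase for Britain over the war years.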
While European powers went deeper and deeper into debt, America enriched itself by selling arms and other goods to the allies, all while avoiding conflict in its homeland. As Europe tore itself to shreds, American farms and industrial operations ran full steam. The world at large began to buy more from the U.S. than it sold back, creating a large American current account surplus.
Post-war, U.S. officials broke with historical precedent and insisted that their European allies repay their war debts. Traditionally, this kind of support was considered a cost of war. At the same time, U.S. officials put up tariff barriers that prevented the allies from earning dollars through more exports to America.
Hudson argues that the U.S. essentially starved Germany through protectionist policy, since Germany was likewise unable to export goods to the U.S. market to pay back its loans. Britain and France had to use whatever German reparations they did receive to pay back America.
The Federal Reserve, Hudson says, held down interest rates so as not to draw investment away from Britain, hoping in this way the English could pay back their war debt. But these low rates in turn helped spark a stock market bubble, discouraging capital outflows to Europe. Hudson argues this dynamic, especially after the Great Crash, created a global economic breakdown that helped trigger nationalism, isolationism, autarky, and depression, paving the way for World War II.
Hudson summarizes America’s post-World War I global legacy as follows: the devastation of Germany, the collapse of the British Empire, and a stockpiling of gold. At home, President Roosevelt ended domestic convertibility of dollars for gold, made holding gold a felony, and devalued the dollar by 40%. At the same time, the U.S. received most of Europe’s “refugee gold” during the 1930s as the threat of renewed war with Germany led to capital flight from wealthy Europeans. Washington was accumulating gold in its own coffers, just as it was stripping the precious metal from the public.
As World War II neared, Germany halted reparations payments, drying up the allied cash flow. Britain was unable to pay its debts, something it wouldn’t be able to fully do for another 80 years. Capital flight to the “safe” U.S. accelerated, combining with Roosevelt’s tariffs and export-boosting dollar devaluation to further enlarge America’s balance-of-payments position and gold stock. America became the world’s largest creditor nation.
This advantage grew even more dramatic when the allies spent the rest of their gold to fight the Nazis. By the end of the 1940s, the U.S. held more than 70% of non-Soviet-central-bank-held gold, around 700 million ounces.
In 1922, European powers had gathered in Genoa to discuss the reconstruction of Central and Eastern Europe. One of the outcomes was an agreement to partially go back to the gold standard through a “gold exchange” system where central banks would hold currencies which could be exchanged for gold, instead of the metal itself, which was to be increasingly centralized in financial hubs like New York and London.
In the later stages of World War II in 1944, the U.S. advanced this concept even further at the Bretton Woods conference in New Hampshire. There, a proposal put forth by British delegate John Maynard Keynes to use an internationally-managed currency called the “bancor” was rejected. Instead, American diplomats — holding leverage over their British counterparts as a result of their gold advantage and the bailouts they had extended through Lend-Lease Act policies — created a new global trade system underpinned by dollars, which were promised to be backed by gold at the rate of $35 per ounce. The World Bank, International Monetary Fund, and General Agreement on Tariffs and Trade were created as U.S.-dominated institutions which would enforce the worldwide dollar system.
Moving forward, U.S. foreign economic policy was very different from what it was after World War I, when Congress gave priority to domestic programs and America adopted a protectionist stance. U.S. policymakers theorized that America would need to remain a “major exporter to maintain full employment during the transition back to peacetime life” after World War II.
“Foreign markets,” Hudson writes, “would have to replace the War Department as a source of demand for the products of American industry and agriculture.”
This realization led the U.S. to determine it could not impose war debt on its allies like it did after World War I. A Cold War perspective began to take over: if the U.S. invested abroad, it could build up the allies and defeat the Soviets. The Treasury and the World Bank lent funds to Europe as part of the Marshall Plan so that it could rebuild and buy American goods.
Hudson distinguishes the new U.S. imperial system from the old European imperial systems. He quotes Treasury Secretary Morgenthau, who said Bretton Woods institutions “tried to get away from the concept of control of international finance by private financiers who were not accountable to the people,” pulling power away from Wall Street to Washington. In dramatic contrast to “classic” imperialism, which was driven by corporate interests and straightforward military action, in the new “super imperialism” the U.S. government would “exploit the world via the international monetary system itself.” Hence Hudson’s original title for the book: “Monetary Imperialism.”
The other defining feature of super imperialism versus classic imperialism is that the former rests on a debtor position, while the latter rested on a creditor position. The American approach was to force foreign central banks to finance U.S. growth, whereas the British or French approach was to extract raw materials from colonies, sell them back finished goods, and exploit low wage or even slave labor.
Classic imperialists, if they ran into enough debt, would have to impose domestic austerity or sell off their assets. Military adventurism had restraints. But Hudson argues that with super imperialism, America figured out not just how to avoid these limits but how to derive positive benefits from a massive balance-of-payments deficit. It forced foreign central banks to absorb the cost of U.S. military spending and domestic social programs which defended Americans and boosted their standards of living.
Hudson points to the Korean War as the major event that shifted America’s considerable post-World War II balance-of-payments surplus into a deficit. He writes that the fight on the Korean peninsula was “financed essentially by the Federal Reserve’s monetizing the federal deficit, an effort that transferred the war’s cost onto some future generation, or more accurately from future taxpayers to future bondholders.”
II. The Failure Of Bretton Woods
In the classic gold standard system of international trade, Hudson describes how things worked:
“If trade and payments among countries were fairly evenly balanced, no gold actually changed hands: the currency claims going in one direction offset those going in the opposite direction. But when trade and payments were not exactly in balance, countries that bought or paid more than they sold or received found themselves with a balance-of-payments deficit, while nations that sold more than they bought enjoyed a surplus which they settled in gold… If a country lost gold its monetary base would be contracted, interest rates would rise, and foreign short-term funds would be attracted to balance international trade movements. If gold outflows persisted, the higher interest rates would deter new domestic investment and incomes would fall, thereby reducing the demand for imports until balance was restored in the country’s international payments.”
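As a rough illustration of the self-correcting loop Hudson describes, here is a minimal toy sketch (not Hudson’s model; all quantities and ratios are hypothetical): a payments deficit is settled in gold, the gold outflow shrinks the monetary base, and shrinking incomes pull import demand back toward balance.

```python
# Toy sketch of the classical gold-standard adjustment described above.
# Every number and ratio here is hypothetical and purely illustrative.

def gold_standard_adjustment(gold=100.0, exports=50.0, years=8):
    for year in range(1, years + 1):
        money = 2.0 * gold            # monetary base tied to gold cover
        imports = 0.30 * money        # domestic incomes drive import demand
        deficit = imports - exports   # gap settled by shipping gold abroad
        gold -= deficit               # outflow contracts next year's base
        print(f"year {year}: imports={imports:5.1f} "
              f"deficit={deficit:+6.2f} gold={gold:6.2f}")

gold_standard_adjustment()
```

The interest-rate channel in the passage is omitted for brevity; the point is simply that, under the old rules, gold outflows were self-limiting, which is exactly the restraint Hudson says the U.S. later worked to remove.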
Gold helped nations account with each other in a neutral and straightforward way. However, just as European powers discarded the restraining element of gold during World War I, Hudson says America did not like the restraint of gold either, and instead “worked to ‘demonetize’ the metal, driving it out of the world financial system — a geopolitical version of Gresham’s Law,” where bad money drives out the good. By pushing a transformation of a world where the premium reserve was gold to a world where the premium reserve was American debt, the U.S. hacked the system to drive out the good money.
By 1957, U.S. gold reserves still outnumbered dollar reserves of foreign central banks three to one. But in 1958, the system saw its first cracks, as the Fed had to sell off more than $2 billion of gold to keep the Bretton Woods system afloat. The ability of the U.S. to hold the dollar at $35 per ounce of gold was being called into question. In one of his last acts in office, President Eisenhower banned Americans from owning gold anywhere in the world. But following the presidential victory of John F. Kennedy — who was predicted to pursue inflationist monetary policies — gold surged anyway, breaking $40 per ounce. It was not easy to demonetize gold in a world of increasing paper currency.
American and European powers tried to band-aid the system by creating the London Gold Pool. Formed in 1961, the pool’s mission was to fix the gold price. Whenever market demand pushed up the price, central banks coordinated to sell part of their reserves. The pool came under relentless pressure in the 1960s, both from the dollar depreciating against the rising currencies of Japan and Europe and from the enormous expenditures of Great Society programs and the U.S. war in Vietnam.
Some economists saw the failure of the Bretton Woods system as inevitable. Robert Triffin predicted that the dollar could not keep serving as the international reserve currency while the U.S. maintained a current account surplus. In what is known as the “Triffin dilemma,” he theorized that countries worldwide would have a growing need for that “key currency,” and liabilities would necessarily expand beyond what the key country could hold in reserves, creating a larger and larger debt position. Eventually the debt position would grow so large as to cause the currency to collapse, destroying the system.
By 1964, this dynamic began to visibly kick in, as American foreign debt finally exceeded the Treasury’s gold stock. Hudson says that American overseas military spending was “the entire balance-of-payments deficit as the private sector and non-military government transactions remained in balance.”
The London Gold Pool was held in place (buoyed by gold sales from the Soviet Union and South Africa) until 1968, when the arrangement collapsed and a new two-tiered system with a “government” price and a “market” price emerged.
That same year, President Lyndon B. Johnson shocked the American public when he announced he would not run for another term, possibly in part because of the stress of the unraveling monetary system. Richard Nixon won the presidency in 1968, and his administration did its part to convince other nations to stop converting dollars to gold.
By the end of that year, the U.S. had drawn down its gold from 700 million to 300 million ounces. A few months later, Congress removed the 25% gold backing requirement for Federal Reserve notes, cutting one more link between the U.S. money supply and gold. Fifty economists had signed a letter warning against such an action, saying it would “open the way to a practically unlimited expansion of Federal Reserve notes… and a decline and even collapse in the value of our currency.”
In 1969, with the end of Bretton Woods palpably close, the IMF introduced Special Drawing Rights (SDRs) or “paper gold.” These currency units were supposed to be equal to gold, but not redeemable for the metal. The move was celebrated in newspapers worldwide as creating a new currency that would “fill monetary needs but exist only on books.” In Hudson’s view, the IMF violated its founding charter by bailing out the U.S. with billions of SDRs.
He says the SDR strategy was “akin to a tax levied upon payments surplus nations by the United States… it represented a transfer of goods and resources from civilian and government sectors of payments-surplus nations to payments deficit countries, a transfer for which no tangible quid pro quo was to be received by the nations who had refrained from embarking on the extravagance of war.”
By 1971, short-term dollar liabilities to foreigners exceeded $50 billion, but gold holdings dipped below $10 billion. Mirroring the World War I behavior of Germany and Britain, the U.S. inflated its money supply to 18 times its gold reserves while it waged the Vietnam War.
III. The Death Of The Gold Standard And The Rise Of The Treasury Bill Standard
As it became clear that the U.S. government could not possibly redeem extant dollars for gold, foreign countries found themselves in a trap. They could not sell off their U.S. treasuries or refuse to accept dollars, as this would collapse the dollar’s value in currency markets, advantaging U.S. exports and harming their own industries. This is the key mechanism that made the Treasury bill system work.
As foreign central banks received dollars from their exporters and commercial banks, Hudson says they had “little choice but to lend these dollars to the U.S. government.” They also gave seigniorage privilege to the U.S. as foreign nations “earned” a negative interest rate on American paper promises most years between the end of World War II and the fall of the Berlin Wall, in effect paying Washington to hold their money on a real basis.
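The “negative interest rate” point is plain arithmetic: whenever the yield on Treasury paper sits below dollar inflation, the real value of the holding shrinks with every year it is rolled over. A minimal sketch with hypothetical numbers follows (the text cites no specific yields or inflation rates).

```python
# Hypothetical example of a surplus country's central bank holding
# U.S. T-bills at a yield below dollar inflation. Numbers are made up.

nominal_yield = 0.05   # assumed T-bill yield
inflation = 0.08       # assumed dollar inflation

real_return = (1 + nominal_yield) / (1 + inflation) - 1
print(f"real return: {real_return:.2%}")        # about -2.8% per year

value = 100.0
for _ in range(10):                             # a decade of rollovers
    value *= 1 + real_return
print(f"real value of 100 after ten years: {value:.1f}")  # about 75
```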
“Instead of U.S. citizens and companies being taxed or U.S. capital markets being obliged to finance the rising federal deficit,” Hudson writes, “foreign economies were obliged to buy the new Treasury bonds… America’s Cold War spending thus became a tax on foreigners. It was their central banks who financed the costs of the war in Southeast Asia.”
American officials, annoyed that the allies never paid them back for World War I, could now get their pound of flesh in another way.
French diplomat Jacques Rueff gave his take on the mechanism behind the Treasury bill standard in his book, “The Monetary Sin Of The West”:
“Having learned the secret of having a ‘deficit without tears,’ it was only human for the US to use that knowledge, thereby putting its balance of payments in a permanent state of deficit. Inflation would develop in the surplus countries as they increased their own currencies on the basis of the increased dollar reserves held by their central banks. The convertibility of the reserve currency, the dollar, would eventually be abolished owing to the gradual but unlimited accumulation of sight loans redeemable in US gold.”
The French government was acutely aware of this, and persistently redeemed its dollars for gold during the Vietnam era, even sending a warship to Manhattan in August 1971 to collect what it was owed. A few days later, on August 15, 1971, President Nixon went on national television and formally announced the end of the dollar’s international convertibility to gold. The U.S. had defaulted on its debt, leaving tens of billions of dollars held abroad suddenly unbacked. By extension, every currency that was backed by dollars became pure fiat. Rueff was right, and the French were left with paper instead of precious metal.
Nixon could have simply raised the price of gold, instead of defaulting entirely, but governments do not like admitting to their citizenry that they have been debasing the public’s money. It was much easier for his administration to break a promise to people thousands of miles away.
As Hudson writes, “more than $50 billion of short-term liabilities to foreigners owed by the U.S. on public and private account could not be used as claims on America’s gold stock.” They could, of course, “be used to buy U.S. exports, to pay obligations to U.S. public and private creditors, or to invest in government or corporate securities.”
In practice, these were liabilities the U.S. Treasury would never have to redeem: American debt had been baked into the global monetary base.
“IOUs,” Hudson says, became “IOU-nothings.” The final piece of the strategy was to “roll the debt over” on an ongoing basis, ideally with interest rates below the rate of monetary inflation.
Americans could now obtain foreign goods, services, companies, and other assets in exchange for mere pieces of paper: “It became possible for a single nation to export its inflation by settling its payment deficit with paper instead of gold… a rising world price level thus became in effect a derivative function of U.S. monetary policy,” Hudson writes.
If you owe $5,000 to the bank, it’s your problem. If you owe $5 million, it’s theirs. President Nixon’s Treasury Secretary John Connally riffed on that old adage, quipping at the time: “The dollar may be our currency, but now it’s your problem.”
IV. Super Imperialism In Action: How The U.S. Made The World Pay For The Vietnam War
As the U.S. deficit increased and government spending accelerated, other nations — in a phenomenon hidden from the average American — paid “the cost of this spending spree,” as foreign central banks, not taxes, financed the debt.
The game which the Nixon administration was playing, Hudson writes, “was one of the most ambitious in the economic history of mankind … and was beyond the comprehension of the liberal senators of the United States… The simple device of not hindering the outflow of dollar assets had the effect of wiping out America’s foreign debt while seeming to increase it. At the same time, the simple utilization of the printing press — that is, new credit creation — widened the opportunities for penetrating foreign markets by taking over foreign companies.”
“American consumers might choose to spend their incomes on foreign goods rather than to save. American business might choose to buy foreign companies or undertake new direct investment at home rather than buy government bonds, and the American government might finance a growing world military program, but this overseas consumption and spending would nonetheless be translated into savings and channeled back to the United States. Higher consumer expenditures on Volkswagens or on oil thus had the same effect as an increase in excise taxes on these products: they accrued to the U.S. Treasury in a kind of forced saving.”
By repudiating gold convertibility of the dollar, Hudson argues “America transformed a position of seeming weakness into one of unanticipated strength, that of a debtor over its creditors.”
“What was so remarkable about dollar devaluation,” he writes, “is that far from signaling the end of American domination of its allies, it became the deliberate object of U.S. financial strategy, a means to enmesh foreign central banks further in the dollar-debt standard.”
One vivid story about the power of the Treasury bill standard — and how it could force big geopolitical actors to do things against their will — is worth sharing. As Hudson tells it:
“German industry had hired millions of immigrants from Turkey, Greece, Italy, Yugoslavia and other Mediterranean countries. By 1971 some 3 percent of the entire Greek population was living in Germany producing cars and export goods… when Volkswagens and other goods were shipped to the United States… companies could exchange their dollar receipts for deutsche marks with the German central bank… but Germany’s central bank could only hold these dollar claims in the form of U.S. Treasury bills and bonds… It lost the equivalent of one-third the value of its dollar holdings during 1970-74 when the dollar fell by some 52 percent against the deutsche mark, largely because the domestic US inflation eroded 34 percent of the dollar’s domestic purchasing power.”
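One way to reconcile the figures in that passage (a reading of the numbers, not Hudson’s own calculation): if “the dollar fell by some 52 percent against the deutsche mark” is taken to mean the mark rose about 52 percent against the dollar, that is equivalent to the dollar losing roughly a third of its value measured in marks, which lines up with both the one-third loss and the 34 percent purchasing-power erosion cited.

```python
# Checking the quoted figures under the reading described above.
dm_appreciation = 0.52                        # mark up ~52% vs. the dollar
dollar_loss_in_dm = 1 - 1 / (1 + dm_appreciation)
print(f"dollar's loss measured in marks: {dollar_loss_in_dm:.1%}")  # ~34.2%
```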
In this way, Germany was forced to finance America’s wars in Southeast Asia and military support for Israel: two things it strongly opposed.
Put another way by Hudson: “In the past, nations sought to run payments surpluses in order to build up their gold reserves. But now all they were building up was a line of credit to the U.S. Government to finance its programs at home and abroad, programs which these central banks had no voice in formulating, and which were in some cases designed to secure foreign policy ends not desired by their governments.”
Hudson’s thesis was that America had forced other countries to pay for its wars regardless of whether they wanted to or not. Like a tribute system, but enforced without military occupation. “This was,” he writes, “something never before accomplished by any nation in history.”
V. OPEC To The Rescue
Hudson wrote “Super Imperialism” in 1972, the year after the Nixon Shock. The world wondered at the time: What will happen next? Who will continue to buy all of this American debt? In his sequel, “Global Fracture,” published five years later, Hudson got to answer the question.
The Treasury bill standard was a brilliant strategy for the U.S. government, but it came under heavy pressure in the early 1970s.
Just two years after the Nixon Shock, in response to dollar devaluation and rising American grain prices, Organization of the Petroleum Exporting Countries (OPEC) nations led by Saudi Arabia quadrupled the dollar price of oil past $10 per barrel. Before the creation of OPEC, “the problem of the terms of trade shifting in favor of raw-materials exporters had been avoided by foreign control over their economies, both by the international minerals cartel and by colonial domination,” Hudson writes.
But now that the oil states were sovereign, they controlled the massive inflow of savings accrued through the skyrocketing price of petroleum.
This resulted in a “redistribution of global wealth on a scale that hadn’t been seen in living memory,” as economist David Lubin puts it.
In 1974, the oil exporters ran a current account surplus of $70 billion, up from $7 billion the year before, an amount equal to nearly 5% of U.S. GDP. That year, the Saudi current account surplus was 51% of its GDP.
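As a rough check on the “nearly 5% of U.S. GDP” figure, U.S. nominal GDP in 1974 was on the order of $1.5 trillion (an outside estimate, not a number from the text):

```python
# Rough check: OPEC's 1974 surplus as a share of U.S. output.
opec_surplus_bn = 70        # surplus figure cited in the text
us_gdp_1974_bn = 1_500      # approximate 1974 U.S. nominal GDP (outside estimate)
print(f"{opec_surplus_bn / us_gdp_1974_bn:.1%}")   # about 4.7%
```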
The wealth of OPEC nations grew so fast that they could not spend it all on foreign goods and services.
“What are the Arabs going to do with it all?” asked The Economist in early 1974.
In “Global Fracture,” Hudson argues that it became essential for the U.S. “to convince OPEC governments to maintain petrodollars [meaning, a dollar earned by selling oil] in Treasury bills so as to absorb those which Europe and Japan were selling out of their international monetary reserves.”
As detailed in the precursor to this essay — “Uncovering The Hidden Costs Of The Petrodollar” — Nixon’s new Treasury Secretary William Simon traveled to Saudi Arabia as part of an effort to convince the House of Saud to price oil in dollars and “recycle” them into U.S. government securities with their newfound wealth.
On June 8, 1974, the U.S. and Saudi governments signed a military and economic pact. Secretary Simon asked the Saudis to buy up to $10 billion in treasuries. In return, the U.S. would guarantee security for the Gulf regimes and sell them massive amounts of weapons. The OPEC bond bonanza began.
“As long as OPEC could be persuaded to hold its petrodollars in Treasury bills rather than investing them in capital goods to modernize its economies or in ownership of foreign industry,” Hudson says, “the level of world oil prices would not adversely affect the United States.”
At the time, there was a public and much-discussed fear in America of Arab governments “taking over” U.S. companies. As part of the new U.S.-Saudi special relationship, American officials convinced the Saudis to reduce investments in the U.S. private sector and simply buy more debt.
The Federal Reserve continued to inflate the money supply in 1974, contributing to the fastest domestic inflation since the Civil War. But the growing deficit was eaten up by the Saudis and other oil-exporters, who would recycle tens of billions of dollars of petrodollar earnings into U.S. treasuries over the following decade.
“Foreign governments,” Hudson says, “financed the entire increase in publicly-held U.S. federal debt” between the end of WWII and the 1990s, and, with the help of the petrodollar system, they have continued to provide major support for that debt all the way to the present day.
At the same time, the U.S. government used the IMF to help “end the central role of gold that existed in the former world monetary system.” Amid double-digit inflation, the institution sold off gold reserves in late 1974 to suppress any upswing in the gold price that might follow a new law in the United States that finally made it legal again for Americans to own gold.
By 1975, other OPEC nations had followed Saudi Arabia’s lead in supporting the Treasury bill standard. The British pound sterling was finally phased out as a key currency, leaving, as Hudson writes, “no single national currency to compete with the dollar.”
The legacy of the petrodollar system would live on for decades, forcing other countries to procure dollars when they needed oil, causing America to defend its Saudi partners when threatened with aggression from Saddam Hussein or Iran, discouraging U.S. officials from investigating Saudi Arabia’s role in the 9/11 attacks, supporting the devastating Saudi war in Yemen, selling billions of dollars of weapons to the Saudis, and making Aramco the second-most valuable company in the world today.
VI. Exploitation Of The Developing World
The Treasury bill standard carried massive costs. It was not free. But these costs were not paid by Washington; they were often borne by citizens in Middle Eastern countries and in poorer nations across the developing world.
Even pre-Bretton Woods, gold reserves from regions like Latin America were sucked up by the U.S. As Hudson describes, European nations would first export goods to Latin America. Europe would take the gold it received as those balances of payments settled and use it to buy goods from the U.S. In this way, gold was “stripped” from the developing world, helping the U.S. gold stock reach its 1949 peak of roughly $24.8 billion (around 700 million ounces).
Originally designed to help rebuild Europe and Japan, the World Bank and International Monetary Fund became in the 1960s an “international welfare agency” for the world’s poorest nations, per The Heritage Foundation. But, according to Hudson, that was a cover for their true purpose: serving as tools through which the U.S. government would enforce economic dependency on non-Communist nations worldwide.
The U.S. joined the World Bank and IMF only “on the condition that it was granted unique veto power… this meant that no economic rules could be imposed that U.S. diplomats judged did not serve American interests.”
America began with 33% of the votes at the IMF and World Bank which — in a system that required an 80% majority vote for rulings — indeed gave it veto power. Britain initially had 25% of the votes, but given its subordinate role to the U.S. after the war, and its dependent position as a result of Lend-Lease policies, it would not object to Washington’s desires.
A major goal of the U.S. post-WWII was to achieve full employment, and international economic policy was harnessed to help achieve that goal. The idea was to create foreign markets for American exports: raw materials would be imported cheaply from the developing world, and farm goods and manufactured goods would be exported back to those same nations, bringing the dollars back.
Hudson says that U.S. congressional hearings regarding Bretton Woods agreements revealed “a fear of Latin American and other countries underselling U.S. farmers or displacing U.S. agricultural exports, instead of the hope that these countries might indeed evolve towards agricultural self-sufficiency.”
The Bretton Woods institutions were designed with these fears in mind: “The United States proved unwilling to lower its tariffs on commodities that foreigners could produce less expensively than American farmers and manufacturers,” writes Hudson. “The International Trade Organization, which in principle was supposed to subject the U.S. economy to the same free trade principles that it demanded from foreign governments, was scuttled.”
In a meta-version of how the French exploit Communauté Financière Africaine (CFA) nations in Africa today, the U.S. employed many double standards, did not comply with the most-favored-nation rule, and set up a system that forced developing countries to “sell their raw materials to U.S.-owned firms at prices substantially below those received by American producers for similar commodities.”
Hudson spends a significant percentage of “Super Imperialism” making the case that this policy helped destroy economic potential and capital stock of many developing countries. The U.S., as he tells it, forced developing nations to export fruit, minerals, oil, sugar, and other raw goods instead of investing in domestic infrastructure and education — and forced them to buy American foodstuffs instead of grow their own.
Post-1971, why did the Bretton Woods institutions continue to exist? They were created to enforce a system that had expired. The answer, from Hudson’s perspective, is that they were folded into this broader strategy, to get the (often dictatorial) leaders of developing economies to spend their earnings on food and weapons imports. This prevented internal development and internal revolution.
In this way, “super imperial” financial and agricultural policy could, in effect, accomplish what classic imperial military policy used to accomplish. Hudson even claims that “Super Imperialism” the book was used as a “training manual” in Washington in the 1970s by diplomats seeking to learn how to “exploit other countries via their central banks.”
In Hudson’s telling, U.S.-directed aid was not used for altruism, but for self-interest. From 1948 to 1969, American receipts from foreign aid amounted to approximately 2.1 times its investments.
“Not exactly an instrument of altruistic American generosity,” he writes. From 1966 to 1970, the World Bank “took in more funds from 20 of its less developed countries than it disbursed.”
In 1971, Hudson says, the U.S. government stopped publishing data showing that foreign aid was generating a transfer of dollars from foreign countries to the U.S. He says he got a response from the government at the time, saying “we used to publish that data, but some joker published a report showing that the U.S. actually made money off the countries we were aiding.”
Former grain-exporting regions of Latin America and Southeast Asia deteriorated to food-deficit status under “guidance” from the World Bank and IMF. Hudson argues that these countries, instead of developing, were retrogressing.
Normally, developing countries would want to keep their mineral resources, which act as savings accounts. But these countries couldn’t build up the capacity to use them, because they were focused on servicing debt to the U.S. and other advanced economies. The World Bank, Hudson argues, pushed them to “draw down” their natural resource savings to feed themselves, mirroring subsistence farming and leaving them in poverty. The final “logic” that World Bank leaders had in mind was that, in order to conform with the Treasury bill standard, “populations in these countries must decline in symmetry with the approaching exhaustion of their mineral deposits.”
Hudson describes the full arc as such: Under super imperialism, world commerce has been directed not by the free market but by an “unprecedented intrusion of government planning, coordinated by the World Bank, IMF, and what has come to be called the Washington Consensus. Its objective is to supply the U.S. with enough oil, copper, and other raw materials to produce a chronic over-supply sufficient to hold down their world price. The exception of this rule is for grain and other agricultural products exported by the United States, in which case relatively high world prices are desired. If foreign countries still are able to run payments surpluses under these conditions, as have the oil-exporting countries, their governments are to use the process to buy U.S. arms or invest in long-term illiquid, preferably non-marketable U.S. treasury obligations.”
This, as Allen Farrington would say, is not capitalism. Rather, it’s a story of global central planning and central bank imperialism.
Most shockingly, the World Bank in the 1970s under Robert McNamara argued that population growth slowed down development, and advocated for growth to be “curtailed to match the modest rate of gain in food output which existing institutional and political constraints would permit.”
Nations would need to “follow Malthusian policies” to get more aid. McNamara argued that “the population be fitted to existing food resources, not that food resources be expanded to the needs of existing or growing populations.”
To stay in line with World Bank loans, the Indian government forcibly sterilized millions of people.
As Hudson concludes, the World Bank focused the developing world “on service requirements rather than on the domestic needs and aspirations of their peoples. The result was a series of warped patterns of growth in country after country. Economic expansion was encouraged only in areas that generated the means of foreign debt service, so as to be in a position to borrow enough to finance more growth in areas that might generate yet further means of foreign debt service, and so on ad infinitum.”
On an international scale, Joe Hill’s “We go to work to get the cash to buy the food to get the strength to go to work to get the cash to buy the food to get the strength to go to work to get the cash to buy the food…” became reality. The World Bank was pauperizing the countries that it had been designed in theory to assist.
VII. Financial Implications Of The Treasury Bill Standard
By the 1980s, the U.S. had achieved, as Hudson writes, “what no earlier imperial system had put in place: a flexible form of global exploitation that controlled debtor countries by imposing the Washington Consensus via the IMF and World Bank, while the Treasury Bill standard obliged the payments-surplus nations of Europe and East Asia to extend forced loans to the U.S. government.”
But threats still remained, including Japan. Hudson explains how, with the 1985 Plaza Accord, the U.S. government and IMF convinced the Japanese to increase their purchases of American debt and revalue the yen upwards so that their cars and electronics became more expensive. This is how, he says, they disarmed the Japanese economic threat. The country “essentially went broke.”
On the geopolitical level, super imperialism not only helped the U.S. defeat its Soviet rival — which could only exploit the economically-weak COMECON countries — but also kept any potential allies from getting too strong. On the financial level, the shift from the restraint of gold to the continuous expansion of American debt as the global monetary base had a staggering impact on the world.
Despite the fact that today the U.S. has a much larger labor force and much higher productivity than it did in the 1970s, prices have not fallen and real wages have not increased. The “FIRE” sector (finance, insurance, and real estate) has, Hudson says, “appropriated almost all of the economic gains.” Industrial capitalism, he says, has evolved into finance capitalism.
For decades, Japan, Germany, the U.K., and others were “powerless to use their economic strength for anything more than to become the major buyers of Treasury bonds to finance the U.S. federal budget deficit… [these] foreign central banks enabled America to cut its own tax rates (at least for the wealthy), freeing savings to be invested in the stock market and property boom,” according to Hudson.
The past 50 years witnessed an explosion of financialization. Floating currency markets sparked a proliferation of derivatives used to hedge risk. Corporations suddenly had to invest resources in foreign exchange futures. In the oil and gold markets, there are hundreds or thousands of paper claims for each unit of raw material. It is not clear whether this is a direct result of leaving the gold standard, but it is certainly a prominent feature of the post-gold era.
Hudson argues that U.S. policy pushes foreign economies to “supply the consumer goods and investment goods that the domestic U.S. economy no longer is supplying as it post-industrializes and becomes a bubble economy, while buying American farm surpluses and other surplus output. In the financial sphere, the role of foreign economies is to sustain America’s stock market and real estate bubble, producing capital gains and asset-price inflation even as the U.S. industrial economy is being hollowed out.”
Over time, equities and real estate boomed as “American banks and other investors moved out of government bonds and into higher-yielding corporate bonds and mortgage loans.” Even though wages remained stagnant, prices of investments kept going up, and up, and up, at a velocity previously unseen in history.
As financial analyst Lyn Alden has pointed out, the post-1971 fiat-based financial system has contributed to structural trade deficits for the U.S. Instead of drawing down gold reserves to maintain the system, as it did under the Bretton Woods framework, America has drawn down and “sold off” its industrial base: more and more of its goods are made elsewhere, and more and more of its equity markets and real estate markets are owned by foreigners. The U.S., she argues, has extended its global power by sacrificing some of its domestic economic health. This sacrifice has mainly benefited U.S. elites at the cost of blue-collar and middle-income workers. Dollar hegemony, then, might be good for American elites, diplomats, and the wider empire, but not for the everyday citizen.
Data from the work of political economists Shimshon Bichler and Jonathan Nitzan highlights this transformation and shines a light on how wealth is moving from the have-nots to the haves: In the early 1950s, a typical dominant capital firm commanded a profit stream 5,000 times the income of an average worker; in the late 1990s, it was 25,000 times greater. In the early 1950s, the net profit of a Fortune 500 firm was 500 times the average; in the late 1990s, it was 7,000 times greater. Trends have accelerated since then: Over the past 15 years, the eight largest companies in the world grew from an average market capitalization of $263 billion to $1.68 trillion.
Inflation, Bichler and Nitzan argue, became a “permanent feature” of the 20th century. Prices rose 50-fold from 1900 to 2000 in the U.K. and U.S., and much more aggressively in developing countries. They use a staggering chart of consumer prices in the U.K. from 1271 to 2007 to make the point. The visual is depicted in log-scale, and shows steady prices all the way through the middle of the 16th century, when Europeans began exploring the Americas and expanding their gold supply. Prices then remain relatively steady again through the beginning of the 20th century. But at the time of World War I, they shoot up dramatically, cooling off a bit during the Depression, only to go hyperbolic from the 1960s and 1970s onward as the gold standard fell apart and the world shifted onto the Treasury bill standard.
Bichler and Nitzan disagree with those who say inflation has a “neutral” effect on society, arguing that inflation, especially stagflation, redistributes income from workers to capitalists and from small businesses to large businesses. When inflation rises significantly, they argue, capitalists tend to gain and workers tend to lose. This is typified by the staggering increase in the net worth of America’s richest people during the otherwise very difficult last 18 months. The economy continues to expand, but for most people, growth has ended.
Bichler and Nitzan’s meta point is that economic power tends to centralize, and when it cannot anymore through amalgamation (merger and acquisition activity), it turns to currency debasement. As Rueff said in 1972, “Given the option, money managers in a democracy will always choose inflation; only a gold standard deprives them of the option.”
As the Federal Reserve continues to push interest rates down, Hudson notes that prices rise for real estate, bonds, and stocks, which are “worth whatever a bank will lend.” Writing more recently in the wake of the Global Financial Crisis, he said “for the first time in history people were persuaded that the way to get rich was by running into debt, not by staying out of it. New borrowing against one’s home became almost the only way to maintain living standards in the face of this economic squeeze.”
This analysis of individual actors neatly mirrors the global transformation of the world reserve currency over the past century: from a mechanism of saving and capital accumulation to a mechanism of one country taking over the world through its growing deficit.
Hudson pauses to reflect on the grotesque irony of pension funds trying to make money by speculating. “The end game of finance capitalism,” he says, “will not be a pretty sight.”
VIII. Counter-Theories And Criticisms
There is surely a case to be made for how the world benefited from the dollar system. This is, after all, the orthodox reading of history. With the dollar as the world reserve currency, everything as we know it grew from the rubble of World War II.
One of the strongest counter-theories relates to the USSR, where it seems clear that the Treasury bill standard — and the unique ability for the U.S. to print money that could purchase oil — helped America defeat the Soviet Union in the Cold War.
To get an idea of what the implications are for liberal democracy’s victory over totalitarian communism, take a look at a satellite image of the Korean peninsula at night. Compare the vibrant light of industry in the south with the total darkness of the north.
So perhaps the Treasury bill standard deserves credit for this global victory. After the fall of the Berlin Wall, however, the U.S. did not hold another Bretton Woods to decentralize the power of holding the world’s reserve currency. If the argument is that we needed the Treasury bill standard to defeat the Soviets, then the failure to reform after their downfall is puzzling.
A second powerful counter-theory is that the world shifted from gold to U.S. debt simply because gold could not do the job. Analysts like Jeff Snider assert that demand for U.S. debt is not necessarily part of some scheme but rather a result of the world’s thirst for pristine collateral.
In the late 1950s, as the U.S. enjoyed its last years with a current account surplus, something else major happened: the creation of the eurodollar. Originally born of the desire of the Soviets and their proxies to hold dollar accounts that the American government could not confiscate, the idea was that banks in London and elsewhere would open dollar-denominated accounts to store earned U.S. dollars beyond the purview of the Federal Reserve.
Sitting in banks like Moscow Narodny in London or Banque Commerciale pour L’Europe du Nord in Paris, these new “eurodollars” became the basis of a global market for collateralized borrowing, and the best collateral one could have in that system was a U.S. Treasury security.
Eventually, and largely due to the changes in the monetary system post-1971, the eurodollar system exploded in size. It was unburdened by Regulation Q, which capped the interest rates U.S. banks could pay on deposits. Eurodollar banks, free from this restriction, could offer depositors higher rates. The market grew from $160 billion in 1973 to $600 billion in 1980, a period when the inflation-adjusted federal funds rate was negative. Today, there are many more eurodollars than there are actual dollars.
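To put that expansion in perspective, here is a quick back-of-the-envelope calculation, a minimal sketch using only the two market-size figures quoted above (the endpoints and the simple compounding assumption are illustrative, not drawn from Hudson or Snider):

```python
# Rough growth rate implied by the figures cited above:
# the eurodollar market grew from about $160 billion (1973) to $600 billion (1980).
start_size = 160e9   # dollars, 1973
end_size = 600e9     # dollars, 1980
years = 1980 - 1973

# Compound annual growth rate: (end / start)^(1/years) - 1
cagr = (end_size / start_size) ** (1 / years) - 1
print(f"Implied compound annual growth: ~{cagr:.1%} per year")  # roughly 21% per year
```

Sustained growth on the order of 20% a year for most of a decade is what “exploded in size” means in concrete terms.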
To revisit the Triffin dilemma, the worldwide demand for “reserve” dollars would inevitably drain U.S. domestic reserves and, eventually, cause confidence in the system to break down.
How can a stockpile of gold back an ever-growing global reserve currency? Snider argues that the Bretton Woods system could never fulfill the role of a global reserve currency. But a dollar unbacked by gold could. And, the argument goes, we see the market’s desire for this most strongly in the growth of the eurodollar.
If even America’s enemies wanted dollars, then how can we say that the system only came into dominance through U.S. design? Perhaps the design was simply so brilliant that it co-opted even America’s most hated rivals. And finally, in a world where gold had not been demonetized, would it have remained the pristine collateral for this system? We’ll never know.
A final major challenge to Hudson’s work is found in the discourse arguing that the World Bank has helped increase living standards in the developing world. It is hard to deny that most people are better off in 2021 than in 1945. And cases like South Korea are offered to show how World Bank funding in the 1970s and 1980s was crucial for the country’s success.
But how much of this relates to technology deflation and a general rise in productivity, as opposed to American aid and support? And how does this rise compare with the rise in the West over the same period? Data suggests that, under World Bank guidance between 1970 and 2000, poorer countries grew more slowly than rich ones.
One thing is clear: Bretton Woods institutions have not helped everyone equally. A 1996 report covering the World Bank’s first 50 years of operations found that “of the 66 less developed countries receiving money from the World Bank for more than 25 years, 37 are no better off today than they were before they received such loans.” And of these 37, most “are poorer today than they were before receiving aid from the Bank.”
In the end, one can argue that the Treasury bill standard helped defeat Communism, that it is what the global market wanted, and that it helped the developing world. What cannot be disputed is that the world left the era of asset money for debt money, and that as the ruler of this new system, the U.S. government gained special advantages over every other country, including the ability to dominate the world by forcing other countries to finance its operations.
IX. The End Of An Era?
In his landmark 1795 essay “Toward Perpetual Peace,” Enlightenment philosopher Immanuel Kant argues for six primary principles, one of which is that “no national debt shall be contracted in connection with the external affairs of the state”:
“A credit system, if used by the powers as an instrument of aggression against one another, shows the power of money in its most dangerous form. For while the debts thereby incurred are always secure against present demands (because not all the creditors will demand payment at the same time), these debts go on growing indefinitely. This ingenious system, invented by a commercial people in the present century, provides a military fund which may exceed the resources of all the other states put together. It can only be exhausted by an eventual tax-deficit, which may be postponed for a considerable time by the commercial stimulus which industry and trade receive through the credit system. This ease in making war, coupled with the warlike inclination of those in power (which seems to be an integral feature of human nature), is thus a great obstacle in the way of perpetual peace.”
Kant seemingly predicted dollar hegemony. With his thesis in mind, would a true gold standard have deterred the war in Vietnam? If anything, it seems certain that such a standard would at least have made the war much shorter. The same, obviously, can be said for World War I, the Napoleonic Wars, and other conflicts in which the belligerents left the gold standard to fight.
“The unique ability of the U.S. government,” Hudson says, “to borrow from foreign central banks rather than from its own citizens is one of the economic miracles of modern times.”
But “miracle” is in the eye of the beholder. Was it a miracle for the Vietnamese, the Iraqis, or the Afghans?
Nearly 50 years ago, Hudson writes that “the only way for America to remain a democracy is to forgo its foreign policy. Either its world strategy must become inward-looking or its political structure must become more centralized. Indeed since the start of the Vietnam War, the growth of foreign policy considerations has visibly worked to disenfranchise the American electorate by reducing the role of congress in national decision making.”
This trend has only become more pronounced in recent history. In the past few years America has been at war in arguably as many as seven countries (Afghanistan, Iraq, Syria, Yemen, Somalia, Libya and Niger), yet the average American knows little to nothing about these wars. In 2021, the U.S. spends more on its military than do the next 10 countries combined. Citizens have more or less been removed from the decision-making process, and one of the key reasons — perhaps the key reason — these wars can be financed at all is the Treasury bill standard.
How much longer can this system last?
In 1977, Hudson revisits the question on everyone’s mind in the early 1970s: “Will OPEC supplant Europe and Japan as America’s major creditors, using oil earnings to buy U.S. Treasury securities and thereby fund U.S. federal budget deficits? Or will Eastern Hemisphere countries subject the U.S. to a gold-based system of international finance in which renewed U.S. payment deficits will connote a loss of its international financial leverage?”
We of course know the answer: OPEC did indeed fund the U.S. budget for the next decade. Eastern Hemisphere countries failed to subject the U.S. to a gold-based system in which payments deficits would have marked a loss of leverage. In fact, the Japanese and Chinese in turn kept buying American debt once the oil countries ran out of money in the 1980s.
The system, however, is once again showing cracks.
Since 2013, foreign central banks have been dishoarding their U.S. Treasuries. Today, the Federal Reserve is the majority purchaser of American debt. The world is witnessing a slow decline of the dollar as the dominant reserve currency, both in terms of its share of foreign exchange reserves and its share of trade. These shares still significantly outpace America’s actual contribution to global GDP — a legacy of the Treasury bill standard, for sure — but they are declining over time.
De-dollarization toward a multi-polar world is gradually occurring. As Hudson says, “Today we are winding down the whole free lunch system of issuing dollars that will not be repaid.”
X. Bitcoin Vs. Super Imperialism
Writing in the late 1970s, Hudson predicts that “without a Eurocurrency, there is no alternative to the dollar, and without gold (or some other form of asset money yet to be accepted), there is no alternative to national currencies and debt-money serving international functions for which they have shown themselves to be ill-suited.”
Thirty years later, in 2002, he writes that “today it would be necessary for Europe and Asia to design an artificial, politically created alternative to the dollar as an international store of value. This promises to be the crux of international political tensions for the next generation.”
It’s a prescient comment, though it wasn’t Europe or Asia that designed an alternative to the dollar, but Satoshi Nakamoto. A new kind of asset money, bitcoin has a chance to unseat the super-imperial dollar structure and become the next world reserve currency.
As Hudson writes, “One way to discourage governments from running payments deficits is to oblige them to finance these deficits with some kind of asset they would prefer to keep, yet can afford to part with when necessary. To date, no one has come up with a better solution than that which history has institutionalized over a period of about two thousand years: gold.”
In January 2009, Satoshi Nakamoto came up with a better solution. There are many differences between gold and bitcoin, but the most important one, for the purposes of this discussion, is that bitcoin is easily self-custodied and thus confiscation-resistant.
Gold was looted by colonial powers worldwide for hundreds of years, and, as discussed in this essay, was centralized mainly into the coffers of the U.S. government after World War I. Then, through shifting global monetary policy of the ’30s, ’40s, ’50s, ’60s, and ’70s, gold was demonetized, first domestically in the U.S. and then internationally. By the 1980s, the U.S. government had “killed” gold as a money through centralization and through control of the derivatives markets. It was able to prevent self-custody, and manipulate the price down.
Bitcoin, however, is notably easy to self-custody. Any of the billions of people on earth with a smartphone can, in minutes, download a free and open-source Bitcoin wallet, receive any amount of bitcoin, and back up the passphrase offline. This makes it much more likely that users will actually control their bitcoin, as opposed to gold investors, who often entered through a paper market or a claim rather than actual bars of gold. Verifying an inbound gold payment is practically impossible without melting down the delivered bar and assaying it. Rather than go through the trouble, people deferred to third parties. In Bitcoin, verifying payments is trivial.
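As a concrete illustration of how trivial that verification can be, here is a minimal sketch that asks a self-hosted Bitcoin Core full node how many blocks have confirmed an incoming wallet transaction. It assumes a local node with wallet RPC enabled; the URL, credentials, and transaction ID are placeholders, not details from the essay.

```python
# Minimal sketch: verify an incoming payment against your own Bitcoin Core node.
# Assumes a local node with RPC enabled; credentials and txid below are placeholders.
import requests

RPC_URL = "http://127.0.0.1:8332"       # default mainnet RPC port
RPC_AUTH = ("rpcuser", "rpcpassword")   # hypothetical credentials from bitcoin.conf


def confirmations(txid: str) -> int:
    """Return how many blocks have confirmed a transaction known to the node's wallet."""
    payload = {
        "jsonrpc": "1.0",
        "id": "verify-payment",
        "method": "gettransaction",     # standard Bitcoin Core wallet RPC
        "params": [txid],
    }
    resp = requests.post(RPC_URL, json=payload, auth=RPC_AUTH, timeout=10)
    resp.raise_for_status()
    return resp.json()["result"]["confirmations"]


if __name__ == "__main__":
    txid = "<your-transaction-id>"      # placeholder
    n = confirmations(txid)
    print(f"{n} confirmations:", "settled" if n >= 6 else "wait for more blocks")
```

No third party, assay, or melting required: the node a user runs at home is the final arbiter of whether the payment is real.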
In addition, gold historically failed as a daily medium of exchange. Over time, markets preferred paper promises to pay gold — it was just easier, and so gold fell out of circulation, where it was more easily centralized and confiscated. Bitcoin is built differently, and could very well be a daily medium of exchange.
In fact, as more and more people demand to be paid in bitcoin, we get a glimpse of a future in which Thiers’ law (observed in dollarizing countries, where good money drives out the bad) is in full effect and merchants prefer bitcoin to fiat money. In that world, confiscation of bitcoin would be impossible. It may also prove hard to manipulate the spot price of bitcoin through derivatives. As BitMEX founder Arthur Hayes writes:
“Bitcoin is not owned or stored by central, commercial, or bullion banks. It exists purely as electronic data, and, as such, naked shorts in the spot market will do nothing but ensure a messy destruction of the shorts’ capital as the price rises. The vast majority of people who own commodity forms of money are central banks who it is believed would rather not have a public scorecard of their profligacy. They can distort these markets because they control the supply. Because bitcoin grew from the grassroots, those who believe in Lord Satoshi are the largest holders outside of centralised exchanges. The path of bitcoin distribution is completely different to how all other monetary assets grew. Derivatives, like ETFs and futures, do not alter the ownership structure of the market to such a degree that it suppresses the price. You cannot create more bitcoin by digging deeper in the ground, by the stroke of a central banker’s keyboard, or by disingenuous accounting tricks. Therefore, even if the only ETF issued was a short bitcoin futures ETF, it would not be able to assert any real downward pressure for a long period of time because the institutions guaranteeing the soundness of the ETF would not be able to procure or obscure the supply at any price thanks to the diamond hands of the faithful.”
If governments cannot kill bitcoin, and it continues its rise, then it stands a good chance of eventually becoming the next reserve currency. Will we have a world with bitcoin-backed fiat currencies, similar to the gold standard? Or will people actually use native Bitcoin itself — through the Lightning Network and smart contracts — to do all commerce and finance? Neither future is clear.
But the possibility inspires. A world where governments are constrained from undemocratic forever wars because restraint has once again been imposed on them through a neutral global balance-of-payments system is a world worth looking forward to. Kant’s writings inspired democratic peace theory, and they may also inspire a future Bitcoin peace theory.
Under a Bitcoin standard, citizens of democratic countries would be more likely to choose investing in domestic infrastructure over military adventurism. Foreigners would no longer be as easily forced to pay for any empire’s wars. There would be consequences even for the most powerful nation if it defaulted on its debt.
Developing countries could harness their natural resources and borrow money from markets to finance Bitcoin mining operations and become energy sovereign, instead of borrowing money from the World Bank to fall deeper into servitude and the geopolitical equivalent of subsistence farming.
Finally, the massive inequalities of the past 50 years might also be slowed, as the ability of dominant capital to enrich itself in downturns through rent-seeking and easy monetary policy could be checked.
In the end, if such a course for humanity is set, and Bitcoin does eventually win, it may not be clear what happened:
Did Bitcoin defeat super imperialism?
Or did super imperialism defeat itself?
This is a guest post by Alex Gladstein. Opinions expressed are entirely their own and do not necessarily reflect those of BTC Inc or Bitcoin Magazine.
Global investment bank Goldman Sachs is seeing huge institutional demand for bitcoin with no signs of abating. A survey of Goldman’s institutional clients shows that 61% expect to increase their cryptocurrency holdings. Meanwhile, 76% say the price of bitcoin could reach $100,000 this year.
Goldman Sachs Sees No Signs of Institutional Demand for Bitcoin Abating
In a podcast published Friday, Mathew McDermott, head of Digital Assets for Goldman Sachs’ Global Markets Division, discusses the cryptocurrency trading environment for institutional investors.
He explained that his team conducted a cryptocurrency survey across the firm’s institutional client base, from “hedge funds, to asset managers, to macro funds, to banks, to corporate treasurers, insurance, and pension funds.” He clarified that “all of our institutional client discussion is really focused around bitcoin.”
His team received responses from 280 institutional clients and published the results of the survey this week. “What’s been particularly interesting,” according to McDermott, was that “40% of the clients currently have exposure to cryptocurrencies,” which he explained could take various forms, from “physical through derivatives, through securities products, or other offerings in the market.” The executive revealed:
In terms of institutional demand, we have seen no signs of that abating … We see a huge amount of demand institutionally, [and] we’re also seeing that reflected in the private wealth management space as well.
He further described that “corporate treasurers, for example, they’re interested in two different aspects.” The first is whether they should be “investing in bitcoin on their balance sheet,” McDermott detailed, citing that “the key drivers from their perspective are negative rates … [and] just the general fears around asset devaluation.”
In addition, he said that they are also thinking “should we consider it as a payment mechanism? … particularly in the context of Tesla’s announcement.” Elon Musk’s electric car company, Tesla, said that it invested $1.5 billion in bitcoin in January and will soon accept the cryptocurrency as a means of payment for its products.
Out of the institutional clients that have crypto exposure, the survey shows that 41% own physical or spot crypto. McDermott emphasized:
61% of the clients expect their digital asset holdings to increase over the next year.
As for what’s stopping institutions from investing in cryptocurrencies, 34% of respondents believe that “regulation, internal investment, mandate permissions” are the greatest hurdles to start allocating to crypto assets. 24% believe that a lack of well-regulated, investable crypto assets is the greatest hurdle.
Most of Goldman’s Institutional Clients Expect the Bitcoin Price Could Reach $100K This Year
As for the future outlook of cryptocurrencies, 54% of respondents predict the price of BTC will be between $40,000 and $100,000 in 12 months while 22% predict it will be more than $100,000. This price level is not far-fetched as several fund managers are predicting the same, including Skybridge Capital and Mike Novogratz.
“In terms of the price action, I think it’s very difficult to predict bitcoin. It’s not an easy pastime,” McDermott opined, elaborating:
The survey was quite insightful in the sense that 76% agreed that the price by the end of the year would be between $40,000 and $100,000 … But, 22% were predicting over $100,000.
“I was on a similar survey with a private roundtable recently and the results there echoed something quite similar where 33% were predicting over $80,000 by the end of the year,” the Goldman executive further shared.
The global investment bank recently restarted its bitcoin trading desk. McDermott confirmed that the desk will begin handling bitcoin futures and non-deliverable forwards for clients. Goldman’s global head of commodities research, Jeff Currie, recently said that the bitcoin market “is beginning to become more mature,” calling the cryptocurrency “a retail inflation hedge.”
Citigroup says bitcoin is at a tipping point and the cryptocurrency could become “the currency of choice for international trade.” The firm wrote in a report that “we could be at the start of massive transformation of cryptocurrency into the mainstream.”
Bitcoin Is at the Tipping Point, Citi Says
Citigroup’s Global Perspectives & Solutions (GPS) team released a 108-page report Monday entitled “Bitcoin At the Tipping Point.”
The Citi GPS report explains that “the biggest change with bitcoin is the shift from it being primarily a retail-focused endeavor to something that looks attractive for institutional investors.” The firm attributes the change to “Specific enhancements to exchanges, trading, data, and custody services” that are “increasing and being revamped to accommodate the requirements of institutional investors.”
Highlighting “the advantage of bitcoin in global payments, including its decentralized design, lack of foreign exchange exposure, fast (and potentially cheaper) money movements, secure payment channels, and traceability,” the report details:
These attributes combined with bitcoin’s global reach and neutrality could spur it to become the currency of choice for international trade.
The report also explains that bitcoin has seen three different stages of focus so far: technological oddity, censorship-resistant money, and digital gold. It further predicts that we will soon see a fourth stage of focus as bitcoin transitions to becoming an international trade currency. “This would take advantage of bitcoin’s decentralized and borderless design, its lack of foreign exchange exposure, its speed and cost advantage in moving money, the security of its payments, and its traceability,” the Citi report describes.
While pointing out a number of remarkable developments in bitcoin over the past seven years, the report outlines a few obstacles in the cryptocurrency’s way to becoming a globally used “trade currency.” Among them are marketplace security (including tether’s relationship to bitcoin), the environmental impact of mining, and institutional concerns such as capital lock-up, insurance, and custody limitations. The report adds:
There are a host of risks and obstacles that stand in the way of bitcoin progress. But weighing these potential hurdles against the opportunities leads to the conclusion that bitcoin is at a tipping point and we could be at the start of massive transformation of cryptocurrency into the mainstream.
“Bitcoin’s future is thus still uncertain,” the report additionally asserts, reiterating that “developments in the near term are likely to prove decisive as the currency balances at the tipping point of mainstream acceptance or a speculative implosion.”
Meanwhile, the report notes that “Large institutional investors and organizations are choosing to participate in and support bitcoin” while “Regulators are beginning to lay the groundwork for the asset to potentially enter the mainstream.”
It further emphasizes that this progression occurring “in just over a decade makes bitcoin remarkable regardless of its future,” concluding:
Bitcoin is at the tipping point of its existence and the path forward from here may have broad and widening repercussions.
Financial incumbents may soon face a harsh reality, as a commercial real estate crisis threatens the profits of America’s largest banks. A number of reports show that banks with a large amount of commercial real estate in their portfolios may see a significant fallout in the next few months.
Just recently, PwC’s real estate practice published a report called “The 2021 Emerging Trends,” which shows that city rankings have changed a great deal since the coronavirus outbreak. For instance, for a number of years the city of Seattle was a top-ten city for real estate investment, but after Covid-19 it dropped to No. 34 in the rankings of American cities.
One of the biggest issues facing major cities like Seattle, Boston, New York, L.A., and Atlanta is the mounting commercial real estate (CRE) losses looming on the horizon.
For instance, on November 11, 2020, columnist David J. Lynch published an article about how the current CRE market should frighten financial institutions like banks. The editorial explains that the Manhattan-based Signature Bank’s third-quarter earnings showed “60 percent of its portfolio tied up in commercial real estate.”
Lynch further explains that lending to businesses like hotels, landlords, and local shops used to be something banks could count on, but in cities like New York, these places are now a “ghost town.”
Signature Bank is suffering badly from the fallout, as Lynch further states:
The bank’s bad-loan write-offs, though still modest, are creeping higher. Despite years of steady profits, investors have punished the stock, which even after a recent rebound has lost 27 percent of its value this year.
Commercial real estate, or CRE, is property used exclusively for business purposes. An extremely large portion of the world’s CRE is leased to tenants who generate income from it, but due to Covid-19 and the government’s response to the virus, some of those tenants can no longer generate income.
The CRE crisis looming in the United States is unfolding in nearly every state. On November 16, 2020, Jdsupra published a report covering Delaware and the severe effects the response to Covid-19 has had on commercial real estate tenants and landlords.
“The real estate industry in Delaware experienced dramatic changes over the past eight months resulting from the Covid-19 pandemic,” Jdsupra contributor John Newcomer Jr. writes. “Without a regular income stream, many commercial tenants cannot meet their monthly rent obligations. Facing diminished monthly rental income, some landlords are left with a cash shortfall that affects their ability to make mortgage payments to their banks.”
Meanwhile, the federal eviction ban enacted by the CDC will be lifted at the end of the year, and skeptics think it could trigger a wave of delinquencies. Local authorities in hard-hit CRE markets like New York and California are trying to curb the fallout by adding further regulations.
For instance, California will continue to limit annual property tax increases for CRE markets. Moreover, analysts say that no matter who is in office come January, no U.S. president will be able to affect returns on CRE. According to a recently published report from Cushman & Wakefield, real estate downturns are driven by deep recessions, no matter which political party is in charge of the United States.
“Rather than elections,” stressed the Cushman & Wakefield report, “the real estate cycle, the economy, interest rates, COVID-19, geopolitical events, and long-term growth drivers (like demographics and technological change) are the areas to focus on in determining leasing fundamentals and property values.”
Meanwhile, besides CRE and residential real estate, the investment assets gold and bitcoin have seen different price changes in recent days. For instance, after the Moderna vaccine announcement on Monday, spot gold prices dropped 0.40%, with an ounce of fine gold trading for $1,888. Gold also stumbled when Pfizer announced its Covid-19 vaccine, but crypto-asset markets have done the exact opposite.
For instance, after the Moderna vaccine announcement, bitcoin (BTC) touched a high of $16,850 on the exchange Bitstamp, rising 5.6%. Ethereum prices jumped 3.39% on Monday, touching a high of $464 during the afternoon trading sessions. The entire crypto market economy, at $464 billion, is nearing half a trillion dollars, up 2.6% on Monday.
American economist and former Morgan Stanley Asia chairman Stephen Roach said on Sunday that he believes the U.S. dollar will “crash faster and harder.” Roach made similar statements during an interview back in June, and his latest commentary stresses that people should “expect the dollar to plunge by as much as 35 percent next year.”
Stephen Roach is a well-known American economist who served as chairman of Morgan Stanley Asia and as the firm’s chief economist. Roach currently serves as a senior fellow at Yale University, and he has been discussing the American economy regularly over the last few months. Last June, news.Bitcoin.com reported on Roach’s interview with CNBC, in which he explained a number of reasons why he predicts a “dollar crash.”
On Sunday, Roach published an editorial that bolsters his opinion concerning a dollar crash, emphasizing that the USD has “entered the early stages of what looks to be a sharp descent.”
The economist noted that the U.S. dollar index has slumped by 4.3% after benefiting by 7% during the flight to cash in February. Despite what Roach calls a “modest correction,” the former Morgan Stanley Asia chairman said, “the dollar remains the most overvalued major currency in the world.”
Roach expects the USD index to slide by as much as 35% in 2021 for a number of reasons.
“I continue to expect this broad dollar index to plunge by as much as 35 percent,” Roach says in a newly written editorial. “This reflects three considerations: the rapid deterioration in macroeconomic imbalances in the United States, the ascendancy of the euro and renminbi as alternatives, and the end of the aura of American exceptionalism that has given the dollar Teflon-like resilience for most of the post-World War II era,” he added.
Roach noted this past June, in a prior opinion editorial, that bitcoin and other digital currencies, as well as gold, could possibly benefit from the massive dollar downturn. However, the two free-market assets may not see a significant boost from the major fiat adjustments, Roach highlighted at the time.
“Although cryptocurrencies and gold should benefit from dollar weakness, these markets are too small to absorb major adjustments in world foreign-exchange markets where daily turnover runs around $6.6 trillion,” Roach said.
The famed economist wrote on Sunday that it’s “no secret” what caused the unprecedented savings collapse in 2020. Moreover, the coronavirus outbreak “has been more than outweighed by a record expansion in the federal budget deficit.”
In Roach’s opinion, this is just the beginning of the USD’s deterioration, and “the savings plunge is only a hint of what lies ahead.”
“The vice is tightening on a still-overvalued dollar,” Roach concludes. “Domestic savings are plunging as never before, and the current-account balance is following suit. Don’t expect the Fed, focused more on supporting equity and bond markets than on leaning against inflation, to save the day. The dollar’s decline has only just begun.”
The US dollar is increasingly being viewed in a negative light by investors. Bank of America analysts are seeing a “death cross,” a bearish technical formation suggesting that a period of dollar weakness is coming. Meanwhile, confidence in gold has been rising.
Investors Losing Confidence in US Dollar
Investors are increasingly viewing the U.S. dollar in a negative light amid the resurging covid-19 pandemic and the prospect of improving growth abroad, such as in Europe.
Analysts at Bank of America Global Research explained on Wednesday that a decline in the dollar earlier this week has set off a bearish technical formation known as a “death cross” in the USD Index DXY. This occurs when the 50-day moving average crosses below the 200-day moving average. Reuters conveyed that according to the bank:
Past occurrences of the death cross have been followed by a period of dollar weakness eight out of nine times since 1980 when the 200-day moving average has been declining, as it is now.
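For readers unfamiliar with the indicator, the following is a minimal sketch (not Bank of America’s methodology) of how a death cross can be detected programmatically: compute the 50-day and 200-day moving averages of an index’s closing prices and flag the days on which the shorter average crosses below the longer one. The price series below is synthetic, purely for illustration.

```python
# Illustrative only: detect "death cross" dates (50-day MA crossing below 200-day MA).
# The synthetic price series stands in for real DXY closing prices.
import pandas as pd


def death_cross_dates(closes: pd.Series) -> pd.DatetimeIndex:
    """Return dates on which the 50-day moving average crosses below the 200-day."""
    ma50 = closes.rolling(50).mean()
    ma200 = closes.rolling(200).mean()
    below = ma50 < ma200
    # A cross happens when 'below' flips from False to True.
    prev_below = below.shift(1, fill_value=False).astype(bool)
    crosses = below & ~prev_below
    return closes[crosses].index


# Synthetic example: a series that rises for 250 days, then rolls over and declines.
idx = pd.date_range("2019-01-01", periods=400, freq="D")
prices = pd.Series(
    [100 + i * 0.05 for i in range(250)] + [112.5 - i * 0.08 for i in range(150)],
    index=idx,
)
print(death_cross_dates(prices))
```

The mirror-image signal, with the 50-day average crossing back above the 200-day average, is commonly called a “golden cross.”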
Moreover, the dollar index fell 6% from its recent highs, the news outlet stated, adding that net bets against the dollar in futures markets are near their highest level since 2018. While the dollar has been seen as a safe haven for investors, its drop to two-week lows on Wednesday shows reduced safe-haven appeal, CNBC noted.
Some investors are also factoring into their dollar outlook criticism over the U.S. government’s response to the coronavirus crisis, protests over racial inequality, and President Donald Trump losing support months before the Nov. 3 presidential election. Analysts at TD Securities told Reuters that the dollar could also suffer if U.S. lawmakers fail to extend some stimulus programs for businesses and families that will soon expire. Recently, the firm cut its outlook for the dollar’s performance against a broad range of major currencies.
In contrast, some investors believe Europe could have greater success in containing the covid-19 pandemic, which could result in accelerated growth in the region. Shaun Osborne, chief FX strategist at Scotiabank, told the publication: “Clearly at this point, European Union countries have made more progress than … the U.S. — where economic trends are lagging noticeably.”
Meanwhile, confidence in global stock markets and gold is strengthening. Edward Moya, a New York-based senior market analyst at OANDA, said: “Investors are growing more confident that this stock market rally is not going to end any time soon … And that’s pretty much based on expectations that you’re going to continue to see a strong global stimulus response over the coming weeks and months.”
Analysts are also bullish on gold. On Wednesday, gold prices soared above the technical level of $1,800 per ounce for the first time since 2011, CNBC detailed, adding that “analysts say the metal’s rally is just getting started.” Besides gold, some investors have also expressed rising confidence in cryptocurrency, particularly bitcoin, such as Galaxy Digital CEO Mike Novogratz. However, gold prices retreated Thursday while the dollar rallied, coinciding with the Supreme Court ruling that a New York prosecutor can obtain Trump’s financial records.