Bitcoin is often derided for its volatility, but ever since the U.S. abandoned the gold standard, currency stability has been a far more plastic concept than is commonly understood.
Over fifty years ago, U.S. President Richard Nixon met secretly with his economic advisors over a weekend at Camp David to plot a new course for the U.S. dollar, one that would change monetary history forever.
Upon his arrival back at the White House from Camp David, on August 15, 1971, President Nixon announced that the U.S. dollar would be taken off the gold standard, upending the cornerstone of the international monetary system.
Since the end of the Second World War, no policy decision has done more to affect the prices of goods and services, shape global exchange, and influence today’s economic and geopolitical rivalries than America abandoning the gold standard.
But it was never intended to end that way.
Back in 1944, as much of Europe and Asia lay in smoldering ruins in the twilight of the Second World War, the U.S. and its allies gathered at Bretton Woods to chart a new course for the world — one that would begin with monetary stability.
By making the dollar convertible into gold at the rate of US$35 per ounce and pegging every other currency to the dollar at a fixed rate, Bretton Woods was intended to inject stability into international commerce.
Because the interwar years had fostered competitive currency depreciations, rampant tariff increases and worsened the Great Depression of the 1930s, the soon-to-be victors of the Second World War were keen to put in place a global monetary system that would help prevent a repeat of that ruinous competition.
And the dollar-gold peg was credible because the U.S. owned most of the world’s gold at the time, and because America’s leaders would publicly reiterate their commitment to Bretton Woods for years thereafter.
In 1963, then-U.S. President John F. Kennedy told a joint session of Congress,
“I want to make it equally clear that this nation will maintain the dollar as good as gold, freely interchangeable with gold at $35 an ounce, the foundation-stone of the free world’s trade and payments system.”
U.S. President Lyndon Johnson, who took over after Kennedy’s assassination, echoed his predecessor’s promise to maintain the gold standard, as did successive chairmen of the U.S. Federal Reserve and numerous Secretaries of the Treasury thereafter.
The dollar grew to define the global monetary system because greenback holders knew that in an economic crisis, they had access to an alternative asset that would maintain its value — gold.
Money, contrary to popular belief and propaganda, was not something invented by governments, but emerged by evolution from the market process.
For centuries, economic forces contributed to the evolution of global monetary systems, with the security (or lack thereof) of nations judged by how they managed their money.
In the postwar period, with the gold-dollar link becoming the basis of world trade and investment, that currency stability helped fuel the phenomenal recovery of Japan and Western Europe from their wartime devastation, and played a pivotal role in the economic boom of the U.S. in the 1950s and 1960s.
So why would Nixon abandon Bretton Woods?
While it took a full 15 years to get the Bretton Woods system fully operating on the gold standard, the same problems that plague fixed pegs — adjustment, confidence and liquidity — emerged almost as soon as the system settled.
Because the dollar was pegged to gold, the peg prevented the normal adjustment of wages and prices.
Consequently, payment deficits would be associated with rising unemployment and recessions, a problem faced by the United Kingdom in the postwar years, as it alternated between expansionary policies and austerity in the face of a currency crisis, often referred to as “stop-go.”
The other problem was that although the rest of the world would have to adjust their currencies in the pegged exchange rate system, the U.S. did not.
Because the dollar served as the proxy for gold, it naturally assumed the role of a central reserve currency, affording the U.S. the privilege of not having to adjust its own policies to maintain the peg, which became a major point of resentment for a recovering Europe.
But that’s not all.
Issues of confidence also arose almost immediately after Bretton Woods because official dollar liabilities held abroad skyrocketed against a backdrop of successive U.S. deficits, increasing the risk that these dollars would one day be converted into gold, which could trigger a run.
Everyone’s happy to trade dollars knowing that they’re backed by gold, but the minute enough counterparties start to question that convertibility, everyone will want their dollar’s worth of gold.
Indeed, by 1959, there were more U.S. dollars circulating overseas than there was gold to back them up.
Finally, there was the issue of liquidity.
Because Bretton Woods had undervalued the price of gold, gold production would not be sufficient to finance the growth of global trade, leading to capital outflows from the U.S., something the Nixon administration could ill afford at a time of rising inflation.
As outstanding dollar liabilities mounted, the likelihood of a classic run on the dollar increased, which would eventually force U.S. policymakers to tighten monetary policy, resulting in global deflationary pressure.
So how did America solve the problem and why is the dollar still the global reserve currency?
By 1971, America had little choice but to go off the gold standard, as France and the United Kingdom made clear their intention to convert their dollars into gold.
Because Bretton Woods was built around an official peg, it was inappropriate for the key currency in the system, the dollar, to be subject to inflationary policies, which is precisely what happened in the 1970s.
As soon as Washington became profligate in its spending, it was just a matter of time before the dollar’s fixed peg became unglued.
The reason why other countries didn’t immediately rush to abandon the dollar is because there were few other substitutes for it at the time.
As the United States became a superpower in the wake of the Second World War, the dollar was “re-marketed” not directly, but by way of U.S. sovereign debt.
Because idle funds are symptomatic of a “lazy” national balance sheet, governments outside the U.S. had to put their money somewhere.
As soon as Washington abandoned the gold standard, the U.S. Treasury deliberately set about orchestrating regularly scheduled Treasury auctions, building an accounting system that made U.S. debt securities not only easier to trade but more attractive, thereby lowering its own borrowing costs.
Now governments, corporations and investors globally had a place to put their idle funds that would generate a nominal yield, but at the same time be highly safe and liquid.
But that also led Washington to adopt some interesting views on what it could do with its currency.
Take for instance Modern Monetary Theory, which suggests that there is no limit to the amount of dollars the U.S. government can borrow, so Washington shouldn’t hesitate to spend as much as it wants and should borrow as much as it deems necessary.
That form of monetary thinking is a relatively new invention, yet the experience of the past three decades seems to confirm those assumptions.
Despite increasing bond sales almost every year between 1980 and 1990, the yield on benchmark 30-year U.S. Treasuries continued to drop during that decade.
And even though the U.S. Treasury continues to sell more bonds every year, the interest it pays on them has continued to fall.
Global investors have assumed that U.S. Treasuries are the deepest, most liquid and safest securities in the world, but that didn’t happen organically.
Before the U.S. entered the Second World War, Congress designed every new debt issue in detail — terms, durations and amounts.
Imagine if your neighbor asked to borrow US$10,000 from you; surely you’d ask what they needed the money for, right?
But not for the U.S. government.
Whereas in the past, all U.S. sovereign debt had a specific purpose, none of that happens anymore.
And that’s how the U.S. very deliberately and very cleverly replaced the gold standard backing its dollar with something far more insidious, an IOU.
But that Vegas-style magic trick by the U.S. Treasury also completely divorced the act of appropriation from the act of borrowing — what was America using the money for?
And that’s when things get really slippery.
Imagine you could borrow as much money as you wanted, as cheaply as you needed, with no limit on how much you could borrow or what you could spend it on. What would you do?
If the 2008 Financial Crisis was anything to go by, it would be McMansions, yachts and cars.
And in the case of Washington, it was whatever the government of the day decided.
There were no bond drives for the wars on terror in Afghanistan and Iraq, because there was no need for them.
In the 2008 Financial Crisis, then-U.S. Treasury Secretary Henry Paulson simply totaled up how much he needed and went to Congress; there was no need for American citizens to sell their furniture to fund the bailout.
Again in the wake of the coronavirus pandemic, emergency monetary and fiscal measures were paid for as a matter of course — by simply selling more American debt.
And all the while, investors didn’t care.
As the market for U.S. Treasuries became deeper and more liquid, it was easy to lose any sense of how to value a Treasury as a security, because it’s hard to have rational price discovery on an asset with a guaranteed buyer — the U.S. Federal Reserve.
Which is why investors are looking at Bitcoin more closely than ever before.
To be sure, Washington can keep on borrowing so long as nobody questions what it’s borrowing with because Treasuries are a rational bubble and investors can’t price them the way they would any other debt security.
But for how long?
The past year has shown that investors have grown wary of soaking up U.S. sovereign debt ad infinitum, with several recent 5-year and 7-year U.S. Treasury auctions proving extremely illiquid and triggering sharp spikes in yield volatility.
If not for the role of the U.S. Federal Reserve, it’s hard to know whether U.S. Treasury yields would remain as depressed as they have been, given that America is headed into a bout of serious spending with the Democrats in charge of all the levers of power.
Whereas in the past governments couldn’t spend more than they had, because their currencies were inextricably tied to gold, those constraints no longer apply. That is why, as recently as 2014, former U.S. Federal Reserve Chairman Alan Greenspan, speaking at the New Orleans Investment Conference, noted,
“Gold is a good place to put money, given its value as a currency outside of the policies conducted by governments.”
And therein lies the value for Bitcoin and why not just high net worth individuals but national governments are looking more closely at the cryptocurrency.
As an unconstrained asset, Bitcoin’s value as a currency lies in its ability to sit outside the ambit of the policies conducted by any government.
Since 1971, the world has suffered no fewer than three disastrous recessions, and unemployment in the U.S. spiked as high as 11% during those periods of economic calamity, far higher than when the world was still on the gold standard.
Because governments have the power to manipulate the quantity and value of money, they can control the economy, but not free markets, and it’s ordinary citizens who pay the economic cost each day for the promise of free money.
While it’s impossible to return to the gold standard now, maintaining the ideals of free markets and liberty requires a return to the classical school of economics, and that’s potentially where Bitcoin fits in.
It’s no coincidence that almost half of all family offices recently surveyed by Goldman Sachs revealed that they want to add cryptocurrencies to their stable of investments, or that governments have eyed the nascent asset class with both admiration and disdain.
At its core, a reapplication of classical economics to monetary systems would allow the “invisible hand” of the market to determine the value of something as fundamental as currency, by letting the forces of demand and supply set its price.
While it has been over five decades since the dollar was backed by gold, its value has never been determined by free market forces.
Instead, successive U.S. administrations have leveraged and abused deliberately constructed global financial structures, leading to the long-term debasement of the dollar on the pretext of economic growth.
Make no mistake about it, market forces no longer determine the value of the dollar nor the price of U.S. sovereign debt, both of which are inextricably linked and both of which are heavily manipulated.
Far from providing the promise of stability, the dollar has fomented countless boom-bust economic cycles with shocking regularity and at a frequency far beyond anything seen when fiat currencies were tied to the gold standard.
The main bone of contention is that economists have for far too long fed us the opiate that a lack of volatility is indicative of stability.
Yet when it comes to assessing the dollar, the only reason it can be perceived as stable is that the volatility the greenback would otherwise have experienced was suppressed by marketing Treasuries as the safe haven security of choice.
Meanwhile, the volatility inherent in Bitcoin is often seen as a limitation, when investors should instead see it as a feature of its free market promise.
Free from the constraints of government, there are plenty of reasons to at least be open to the view that Bitcoin in particular, and cryptocurrencies in general, play a legitimate and growing role in the global financial system.
Patrick is an innovative entrepreneur and a lawyer passionate about cryptocurrencies and the business world. He is the CEO of Novum Global Technologies, a cryptocurrency quantitative trading firm. He understands the business concerns of founders and business people, helping them utilise the legal framework to structure their companies to take advantage of emerging technologies such as the blockchain in order to reach greater heights. His passion for travel, marketing and brand building has led him across careers and continents. He read law at the National University of Singapore, graduated with Honors in the Upper Division, and joined one of Singapore’s top law firms, Allen & Gledhill, where he was called to the Singapore Bar as an Advocate & Solicitor in 2005. He created Purer Skin, a skincare and inner beauty company which melds the traditional wisdom of ancient Asian ingredients such as Bird’s Nest with modern technology. In 2010, he and his partner successfully raised $589,000 from the National Research Foundation of Singapore under the Prime Minister’s Office. He played a key role in the growth of Purer Skin from 11 retail points in Singapore to over 755 retail points in Singapore and 2 overseas in less than a year. He taught himself graphic design, coding, website design and video editing to create the Purer Skin brand, and finished his training at a leading Digital Media Company.