Originally Published in the August 2012 Issue of Empirical
In the first part of this two-part series, published in the July 2012 issue of Empirical, we revisited the beginnings of colonial American and US history and examined how founding fathers like the libertarian-leaning Thomas Jefferson critically regarded aspects of the banking sector of his day. We also looked at the remarkable construction of the so-called “New Economy” during the Clinton years, a development that led to increased lending to new markets based on the belief that risk could be accurately assessed and hedged against through new mathematical models, assisted by emerging technology. We also examined the forces at play in the East Asian crash of the late nineties and, in particular, the destabilizing influence of the “Washington consensus” economics pushed by the US government and powerful global institutions like the International Monetary Fund (IMF).
We return to the story at the turn of the century.
The Great Crash of the New Century
The East Asian crash of the late nineties devastated the lives of the poor and the burgeoning middle classes in nations that, prior to the crisis, had seen remarkable growth by adopting economic policies at variance with many of the prescriptions of the IMF and Washington. Roughly a decade later, the global crash of 2008, which had its epicenter in the United States, had a parallel impact on the same groups of people.
House foreclosures, evictions, job losses, prolonged unemployment, house-value slumps and negative equity hit ordinary Americans hard.
In the case of the East Asian crash, as addressed in Part I, the damage done to economies assaulted by predatory speculators, who made serious money out of currency manipulation, was meant to be cushioned by IMF loans. According to leading economists like Nobel laureate Joseph Stiglitz, however, these same loans ended up effectively bailing out Western investors, who promptly removed their money from the ailing Eastern economies, leaving Asian taxpayers to foot the bill.
In the United States, as Stiglitz also observed, something not entirely dissimilar occurred. “As we pour money in” to the banks, “they can pour money right out,” he stated at the time, referring to taxpayer bailouts of major banks and the imperiled mortgage sector. The public could not easily trace where the money dispensed to these beneficiaries was going, even though it was taxpayers who salvaged the banks–and, arguably, the American economy as a whole. It is impossible to imagine a private source providing $700 billion or more in a comparable act of generosity while demanding so little account of where the money went.
Journalists such as Matt Apuzzo of the Associated Press tried and failed to get a meaningful response to queries about the big banks’ use of public money. That reluctance persists to this day.
Returning briefly to the Clinton years: parts of laws dating from the Great Depression, designed to protect ordinary people from the predation of Wall Street, were repealed. Chief among them were the protections of the Glass-Steagall Act of 1933 (signed into law by Franklin D. Roosevelt), which separated investment (stock market) banking from depository banking and which was shoved into history by the Gramm-Leach-Bliley Act of 1999. The legislation that replaced it allowed commercial banks, investment banks, securities firms, and insurance companies to consolidate, giving them access to large amounts of formerly protected funds and, to some degree, carte blanche to speculate with them.
The Bush camp that followed Clinton looked even less favorably on regulatory legislation than its predecessor, doing little to protect ordinary Americans from the coming crisis–which it instead proceeded to deepen expeditiously. The Treasury was occupied by followers of the laissez-faire school of neoliberal capitalist economics, who placed the fate of the American economy in the hands of powerful banks and corporations, adopting the reflexive belief that “the market knows best” and that its workings inevitably lead to efficient outcomes.
Not long after the East Asian crisis, and a few months into the Bush presidency, the first pinprick of reality pierced the “New Economy.” The “dot-com bubble,” as it is known, burst in 2000, when the previously buoyant information technology markets suddenly fell in value. The bursting of the bubble, in conjunction with a drop in business outlays and investment and the events of 9/11, contributed to a minor recession.
The Enron scandal followed two weeks after 9/11, drawing back the curtain on widespread corporate mendacity. A number of leading American firms had committed large-scale fraud in order to maintain an appearance of success and keep share values high during the boom years. Following Enron, it emerged that many leading corporations had faked evidence of profits and hidden their debts, allegedly in collusion with respected accounting firms.
The US economy, hit by a triple shock in such a short period, looked set for a crisis unprecedented in a decade. The new challenges called for action. Federal Reserve chairman Alan Greenspan’s bold and controversial response was to drastically lower interest rates in order to encourage greater borrowing, spending, and consumption. This created a gargantuan consumer boom without the dreaded side effect of inflation–a result that gave the impression that America’s economy was once again in the best of health.
Meanwhile, the Chinese deliberately held their exchange rate at a low level, making their exports cheap and therefore highly attractive to American corporations. This led to a huge inflow of US dollars into China and a re-stimulated American economy. The Chinese then promptly purchased American bonds, which helped keep the US economy in health–and ensured greater Chinese influence over the US, allegedly an intentional strategy of the Chinese politburo.
Yet again it seemed that Greenspan’s inscrutable wizardry had injected health into the economy, leading to a temporary boom. As the appearance of well-being continued, lending activity spread on a scale akin to that of the first Greenspan boom. Again, loans were made available to members of American society who would not have qualified under ordinary circumstances. Out of this, a massive housing bubble was constructed around a new frontier in the housing loan market–the “subprime” mortgage.
The Housing Bubble
According to the US Department of Housing and Urban Development, “sub-prime” lending occurs in a market intended “for persons with blemished or limited credit histories. The loans carry a higher rate of interest than prime loans to compensate for increased credit risk.” Roughly translated, this means that people who have a reasonable chance of not being able to repay are given loans anyway, with big interest charges to compensate for the risk to the lender–the obvious effect being that the risk of default is great, and the risk of further indebtedness for a consumer of this financial product is increased. The ethics of this financial product were as questionable as loans to “sub-prime” borrowers were imprudent.
As recent history reveals, subprime mortgages turned out to be a very bad deal for both lenders and their customers. The now-infamous Lehman Brothers invested heavily in the subprime market by “bankrolling lenders across the country that were making convoluted loans to questionable borrowers,” as well as producing its own subprime loan offers, Time magazine recalls. Lehman “took all those loans, whipped them into bonds and passed on to investors billions of dollars of what is now toxic debt,” the Time piece continues. When the debt bubble broke, the American economy took a hit. Operating in this market helped Lehman CEO Dick Fuld earn around half a billion dollars for himself in the process, while effectively steering the company he ran to ruin.
The fall of Lehman followed a decline in housing prices from their historical peak in 2006 to ever more worrying levels in 2007. As a result of the permissive lending environment of the Clinton-Bush years, the ratio of American household debt to disposable personal income reached a high of 127% in 2007, largely owing to the opening up of the mortgage market. A slump in housing values meant that many Americans who held subprime mortgages with adjustable rates saw their repayment costs increase just as times were getting harder for everyone. Mortgage delinquencies became common, and financial instruments such as mortgage-backed securities, which constituted a big market, steadily lost their value.
The housing bubble itself was inflated in part by Greenspan’s decision to lower interest rates after the dot-com crash. He would later admit that the bubble was “fundamentally engendered by the decline in real long-term interest rates,” which he had intentionally kept low in order to stimulate the economy, knowing that they would eventually have to be raised again–with unpredictable results. In the meantime, credit became more available to borrowers who would simply default on their payments as times got tough. All of a sudden, a great deal of money that was owed, and that backed financial instruments owned by Wall Street, could not be recovered, with a sweeping domino effect throughout the economy.
The consequences for the average American, as already established, were horrendous. Moreover, the cost of the economic crisis was borne primarily by the taxpayer–just as much of the benefit of the boom years had flowed to private companies, the cost of rescuing many of the biggest offenders was paid by ordinary people.
The manifest injustice of the situation can be demonstrated by the figures for 2007 to 2009: the top 1%, who owned 34.6% of the nation's wealth in 2007, increased their share to 37.1% by 2009, while nearly two-thirds of Americans saw their wealth decline.
The dream of the self-regulating market was assaulted by reality in the form of the crash of 2008. The notion of trickle-down wealth was no less bruised.
There are those, however, who contend that many of the ideological truisms of modern economic thinking are myths, particularly in the “Gordon Gekko” age of no-holds-barred wealth-seeking. Respected economists, such as the Nobel laureate and New York Times contributor Paul Krugman, have drawn attention to the fact that the “greed is good” culture that has dominated Wall Street and influenced politics so profoundly since the 1980s was, despite common assumptions, no more miraculous or socially salutary than Greenspan’s 90s boom.
Considering “how trends changed after 1980 or so, when the underlying rules of American business (and politics) shifted” in the New York Times this May 23rd, Krugman noted that “productivity growth has actually been slower” since that period. He also observed that “income distribution became radically more unequal,” and that the notion that the US “began selling competitively on world markets instead of running big trade deficits” is demonstrably false.
On the Threat of Environmental Catastrophe
The influence of private power over human fate is as strong as it has ever been, and it looks set to have an impact on much of life on earth if the reckless and single-minded pursuit of profit so often associated with modern capitalism is not reined in. The gravity of the problem is almost certainly unrivaled by any threat to the species since the Second World War and the Cuban missile crisis.
Yet the danger is not posed by the familiar boogeyman of corporate greed per se. The threat is represented by the effects of significant global climate change, presently on course to occur barring some miracle.
An authoritative government report released last year indicated that within only the next decade New York would be under threat of temporary or partial submergence from rising sea levels and increased storm activity similar to Hurricane Irene, causing enormous damage with a massive economic price tag. Yet this scenario, entirely plausible and very worrying, is only a taste of what looks set to be part of our future.
In November last year the International Energy Agency released a report described as the “most thorough analysis yet of world energy infrastructure,” which indicated that if global fossil-fuel infrastructure (coal-fired power stations and the like) is not widely replaced or significantly altered in the next five years, it will “become impossible to hold global warming to safe levels, and the last chance of combating runaway climate change will be lost for ever.”
Additionally, around the same time the IEA report was published, the US Department of Energy reported that the “biggest jump” ever measured in emissions of carbon dioxide (a major cause of climate change) occurred in 2010, indicating that the trajectory of risk from global environmental cataclysm is rising steeply.
World-leading academics like John Reilly, a senior climate change researcher at the Massachusetts Institute of Technology (MIT), have warned that some of the most widely accepted estimates of the effects of global warming have been far too conservative. Reilly’s team at MIT forecasts carbon emissions scenarios, their likelihood, and their most likely outcomes should they occur. What they discovered recently does not bode well. According to an Associated Press report, the worst-case scenario of a report by the UN-organized Intergovernmental Panel on Climate Change (IPCC) “was only about in the middle of what MIT calculated are likely scenarios.” It is worth noting that the same IPCC report was widely derided by climate skeptics as “too alarmist.”
The IPCC estimates foresaw a rise in global temperature of somewhere between 4 and 11 degrees Fahrenheit (2.4–6.4 Celsius), with the most likely outcome being a rise of 7.5 Fahrenheit (about 4 degrees Celsius). To put this in perspective, the generally agreed baseline for “safety” would see an increase in global temperatures of only 2 degrees Celsius–in itself a climate shift that would still have profound consequences.
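For readers who want to double-check the unit conversions above, note that a temperature *change* converts between scales without the familiar 32-degree offset. A minimal sketch, using only the figures quoted in this article:

```python
def c_to_f(delta_c):
    """Convert a temperature *change* (not an absolute reading)
    from Celsius to Fahrenheit: delta_F = delta_C * 9/5."""
    return delta_c * 9 / 5

# The IPCC range quoted above: 2.4 to 6.4 degrees Celsius
print(f"{c_to_f(2.4):.1f} to {c_to_f(6.4):.1f} degrees Fahrenheit")
# roughly the "4 to 11 Fahrenheit" range cited, after rounding

# The most likely outcome: about 4 degrees Celsius
print(f"{c_to_f(4.0):.1f} degrees Fahrenheit")
# close to the quoted 7.5 Fahrenheit
```

The small discrepancies (4.3 vs. 4, 7.2 vs. 7.5) reflect rounding in the figures as originally reported, not an error in the conversion.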
Above the safety line, however, things begin to look really scary. At 3 degrees alone, the consequences for humanity are close to nightmarish.
According to Alok Jha, science correspondent of the British newspaper The Guardian, who compiled the predictions of researcher Mark Lynas, the British government’s “Stern Review,” and Britain’s Met Office, at 3 degrees: “Billions of people are forced to move from their traditional agricultural lands, in search of scarcer food and water. Around 30-50% less water is available in Africa and around the Mediterranean.” At 4 degrees, “Italy, Spain, Greece and Turkey become deserts and mid-Europe reaches desert temperatures of almost 50 degrees Celsius in summer. Southern England's summer climate could resemble that of modern southern Morocco.”
At 5 degrees and above, the picture becomes apocalyptic. Global average temperatures would be “hotter than for fifty [million] years.” Additionally, Jha writes, “most of the tropics, sub-tropics and even lower mid-latitudes are too hot to be inhabitable. The sea level rise is now sufficiently rapid that coastal cities across the world are largely abandoned.” At 6 degrees and over, “there would be a danger of ‘runaway warming,’ perhaps spurred by release of oceanic methane hydrates,” risking that the “human population would be drastically reduced.”
That is bad news indeed. At present, however, a 5–6 degree rise is neither guaranteed nor yet confidently forecast. There is nonetheless a great deal of work to be done to prevent or mitigate the worst effects of probable temperature rises of 2, 3, or even 4 degrees Celsius. God forbid anything higher.
Yet despite the urgent need for action on this issue, there are those who would try to convince the average citizen that climate change, a problem of planetary significance that Western industry has had an unrivaled role in creating, is merely the product of “liberal propaganda”–a kind of modern-day myth.
Oil companies like ExxonMobil remain among the biggest in the world, and such groups have been proven to have funded climate change skeptics.
Even as rising emissions ready the “carbon bubble” for bursting, groups monitoring the news have measured a drop in media coverage of the effects of climate change, helping to efface the issue from the public mind in an election year in which the aftermath of the economic crisis still ranks high among most people’s concerns.
Yet regardless of the economic woes that still persist for many people, through little fault of their own, something has to shift in the world if it is to be rescued from the threat of climate change.
A Stark Choice
If this is to be done, a stark choice may be required of us: submit to the economy’s endless imperative of profit, or protect the future of the planet. As environmentalist Bill McKibben articulated recently: “If we spew 565 gigatons more carbon into the atmosphere, we’ll quite possibly go right past that reddest of red lines. But the oil companies, private and state-owned, have current reserves on the books equivalent to 2,795 gigatons–five times more than we can ever safely burn. It has to stay in the ground. Put another way, in ecological terms it would be extremely prudent to write off $20 trillion worth of those reserves. In economic terms, of course, it would be a disaster, first and foremost for shareholders and executives of companies like ExxonMobil … If you run an oil company, this sort of write-off is the disastrous future staring you in the face as soon as climate change is taken as seriously as it should be, and that’s far scarier than drought and flood. It’s why you’ll do anything–including fund endless campaigns of lies–to avoid coming to terms with its reality.”
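McKibben’s arithmetic is easy to verify with the two figures he cites. A back-of-envelope check:

```python
# Figures quoted by McKibben above
safe_budget_gt = 565       # gigatons of carbon that can still be emitted
proven_reserves_gt = 2795  # gigatons implied by current fossil-fuel reserves

# How many times over do booked reserves exceed the remaining safe budget?
ratio = proven_reserves_gt / safe_budget_gt
print(f"Reserves exceed the safe budget by a factor of {ratio:.1f}")
# about 4.9, i.e. roughly the "five times" McKibben cites

# Carbon that would have to stay in the ground to stay within the budget
unburnable_gt = proven_reserves_gt - safe_budget_gt
print(f"{unburnable_gt} gigatons would have to stay in the ground")
```

The ratio comes out just under five, which is the round figure McKibben uses; the remainder, some 2,230 gigatons, is the carbon he argues “has to stay in the ground.”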
“Growth for its own sake,” so the saying goes, “is the ideology of the cancer cell.” For all the abuse this slogan has suffered as a cliché, its message is apt to our present crisis: the interminable desire for gain required by our present way of life may yet so damage the organism from which it draws sustenance–our planet–that it sabotages its own existence. This negative-sum game is given license to continue apace because it is inexpedient for those with real power to challenge it.
Endless clamoring for growth has meant that massive pollution has shadowed the steps of Western prosperity–yet its effects on the climate, now widely accepted as fact, are an “externality” not incorporated into market calculations. Climate change thus remains a total irrelevance to the closed system of global capitalism, regardless of its long-term impact on the indispensable base that supports the market itself: human beings and their labor, the environment and its resources.
For big business, even devastating economic crashes always benefit somebody. Goldman Sachs famously reaped massive rewards by betting on the housing crash it had itself helped create, consolidating its leading position in the banking world. However shocking this may seem, and however much such acts stink of grotesque immorality, they are merely consistent with the demands of the system in which they operate and with the rigid logic of the market.
It remains for politicians to act on this issue. But they are not doing enough.
The losses inflicted by runaway climate change may, however, be so broadly and profoundly felt that future generations can hardly be expected to accept with equanimity what history teaches them about how the miserable state of the world they inherited came to be. Explaining to our grandchildren that the Earth was left to go to hell because it was deemed too much for our politicians to rein in corporate and industrial irresponsibility will not be easy–but that won’t stop it from being true, if we do nothing.
It is time to forget what is convenient or ideologically appealing, and address what is real–for our children’s sake.
There is still time, although barely, to act to influence our politicians to deal with this most serious of global issues–that is, if we care about something so petty and meaningless as the future of life on Earth.