Wednesday, August 31, 2011

The raison d’être

The politics of fear blows into New York
 The world’s greatest city was brought to a standstill not by Hurricane Irene, but by politicians’ worst-case thinking.


By Tim Black

There was a lot that was unprecedented about Hurricane Irene. It prompted the first weather-inspired, mandatory evacuation of New York. It caused the first-ever shutdown of the city’s subway system. And it provoked an incredible round of almost titillated forewarnings of what would be left of New York after Irene had wended its destruction-strewn way across Manhattan and beyond. What was not unprecedented, however, was Irene itself.

In fact, despite the Biblical predictions of flooding and wind-induced havoc, by the time Irene hit New York early on Sunday morning, it wasn’t actually a hurricane anymore. It had been downgraded to the status of a ‘tropical storm’. Or, as we call it in England, ‘summer’. It was wet. It was windy. But it was not The Day After Tomorrow.

Flippancy is the wrong approach, though. Despite its damp-squibbish showing in New York, Irene had caused significant damage elsewhere on America’s east coast. Twenty-nine people had been killed, power cuts were widespread, and billions of dollars’ worth of damage had been inflicted on property and infrastructure.

But for all that it was a damaging and, for the bereaved, tragic event, there is little getting away from the fact that the likelihood of Hurricane Irene wreaking death and destruction across New York was always minimal. This is not hindsight talking. By Saturday – that is, the day before it was due to hit New York – Irene had already been downgraded from a category 2 hurricane to a category 1 hurricane, and many predicted that it would continue to decrease in strength the closer it got to the city. Which is exactly what did happen.

Yet despite the possibility of hurricane havoc shrinking with each passing hour, the US authorities actually went the other way. They ramped up the threat, turning a highly unlikely scenario into the expected result. What else can explain the decision on Friday to issue a mandatory evacuation for the 350,000 New Yorkers living in low-lying areas? ‘By five o’clock tomorrow you have to be out’, announced the calm-averse New York mayor, Michael Bloomberg: ‘Waiting for the last minute is not a smart thing to do. This is life-threatening.’ If anyone expected his fearful fervour to have been dampened somewhat by Irene’s dissipation during the course of Saturday, they would have been disappointed. ‘Time is running out’, Bloomberg intoned ominously: ‘It’s going to get dark in a little while… If you haven’t left you should leave now. Not later this evening, not this afternoon, immediately.’

And so Bloomberg and Co managed to do something that countless other events, natural and social, have singularly failed to do: they brought New York to a standstill. Remember, this is New York we’re talking about. This is a city that withstood the 1929 Wall Street crash and the Great Depression, a city that, for the most part, kept calm and carried on despite the terrorist attack on the Twin Towers, a city that, for its 400 often-tough years, persisted and eventually flourished. Yet today, all it takes for everyday life to be suspended is the minutest possibility that something bad might happen.

That there was an overreaction to what was a bit of rain and bluster has been widely acknowledged. However, the main recipient of blame so far has not been the politicians who made the decisions; it has been the media, which supposedly inspired the politicians through its over-the-top reporting. Writing for the BBC News website, one American expat was not impressed by what he believed to be a media-created reality: ‘American society has finally become “media-tised”. By that, I mean many people (by no means all) find it hard to consider something real unless they encounter it via media.’

Elsewhere, Daily Beast columnist Howard Kurtz shouted, ‘Someone has to say it: cable news was utterly swept away by the notion that Irene would turn out to be Armageddon.’ In the Washington Post it was the Weather Channel’s tendency towards hyperbole that was criticised, which was perhaps understandable given this statement on its website: ‘Irene is a hurricane that poses an extraordinary threat and is one that no one has yet experienced in North Carolina to the mid-Atlantic to the Northeast and New England.’ As a Washington Post columnist concluded, ‘Be scared’ seemed to be the message. In the words of one New York resident speaking to the Guardian website in the undramatic aftermath of Irene: ‘Now I realise that the whole thing is just media hype, to get us all upset and anxious, to feed into the American way of getting us paranoid and fearful.’

This scepticism towards the overhyping tendency of the 24-hour news cycle, not to mention the fearful proclamations of politicians, is certainly admirable in one respect. It testifies to our actual resilience and our preparedness to take on what hardships come our way, even if they’re travelling in 75-mile-per-hour gusts. For instance, one woman interviewed at a New York evacuation centre said that many of her neighbours had remained in a south Brooklyn waterside block: ‘We warned a lot of them but I guess they took it as a joke.’ Elsewhere, a New York couple were phlegmatic when asked how they had found the storm. It was ‘underwhelming’, he said. ‘We were asleep’, she interjected.

This shouldn’t be a surprise. Confronted by testing events, people do actually tend to display far more resilience than the authorities ever give us credit for. Our experience and the support of those around us tend to count for far more than a thousand overhyped weather forecasts. For instance, late last year, as Cyclone Tasha began to inundate the Australian state of Queensland, residents long accustomed to putting up with flooding were far more reasoned and composed than many in the media and the authorities seemed to anticipate. Deon Barden, a resident of Rockhampton in Queensland, even turned his flood-enforced exile on the city’s fringes into a joke: ‘My missus and everything is all stuck in Rocky – I’m out here by myself so I’ll have a bit of peace and quiet if anything.’

But while the contrast between doom-laden media reports and the actual response of people on the ground is often pronounced during such events, to blame the media for hyping Irene up to apocalyptic proportions is to ignore the extent to which the authorities, from politicians to bureaucrats, provide the real impulse for the politics of fear.

The power of self-delusion


Libya and the shameless rewriting of history
The repackaging of NATO’s reckless intervention as a clever war for liberty would make Orwell’s Ministry of Truth beam with pride.
By Brendan O’Neill

Not since Winston Smith found himself in the Ministry of Truth in George Orwell’s 1984, rewriting old newspaper articles on behalf of Big Brother, has there been such an overnight perversion of history as there has been in relation to NATO’s intervention in Libya. Now that the rebels have taken Tripoli, NATO’s bombing campaign is being presented to us as an adroit intervention, which was designed to achieve precisely the glorious scenes we’re watching on our TV screens. In truth, it was an incoherent act of clueless militarism, which is only now being repackaged, in true Minitrue fashion, as an initiative that ‘played an indispensable role in the liberation of Tripoli’.

Normally it takes a few years for history to be rewritten; with Libya it happened in days. No sooner had rebel soldiers arrived at Gaddafi’s compound than the NATO campaign launched in March was being rewritten as a cogent assault. Commentators desperate to resuscitate the idea of ‘humanitarian intervention’, and NATO leaders determined to crib some benefits from their Libya venture, took to their lecterns to tell us that their aims had been achieved and they had ‘salvaged the principle of liberal interventionism from the geopolitical dustbin’. In order to sustain these bizarre claims, they’ve had to put the real truth about NATO’s campaign into a memory hole and invent a whole new ‘truth’.

Over the past few days every aspect of NATO’s bombing campaign has been, as Winston Smith might put it, ‘falsified’. Since everybody now seems to have forgotten the events of just five months ago, it is worth reminding ourselves of the true character of NATO’s intervention in Libya. It was incoherent from the get-go, overseen by a continually fraying and deeply divided Western ‘alliance’, with no serious war aim beyond being seen to bomb an evil dictator. It was cowardly, with every alliance member wanting to appear to be Doing Something while actually doing as little as possible. This was especially true of the US, which stayed firmly in the backseat of the anti-Gaddafi alliance. And it was reckless, revealing that military action detached from strategy, unanchored by end goals, can easily spin out of control.

Yet now, courtesy of the Ministry of Truthers, these deep moral flaws and political failings are being reinterpreted as brilliant stratagems. So the determination of Cameron, Sarkozy and Obama to present their bombing of Libya, not as a Western initiative but rather as a UN-approved act of uber-multilateralism, is now depicted as a brilliant, oh-so-sly decision that massively aided the rebellion by giving the impression that it was more an organic uprising than a power play aided by ‘evil’ Western outsiders. Commentators write about the West’s adoption of ‘humility’ as a ‘strategic device’. They claim the downplaying of America’s role in the setting up of the anti-Gaddafi alliance in March was designed to enhance the likelihood of success. As one observer now claims, ‘It suited everyone for America to appear to take a backseat. It suited the uprising.’

Here, the profound crisis of identity of the West, its increasing inability to project any kind of mission into the international sphere, is refashioned as the knowing adoption of ‘humility’, designed to boost Western influence in tyranny-ruled lands. In truth, the West-in-denial nature of the anti-Gaddafi alliance, where NATO presented its campaign as a non-American, non-gung-ho initiative, spoke to the corrosion of American authority in international affairs and to the post-Iraq moral paralysis of that entity once known as ‘the West’. So in March, it was reported that Washington was being distanced from the alliance and that Cameron was desperately seeking Arab League backing, in order to make sure ‘this did not look like a Western initiative’. It was shamefacedness about what the West is seen to represent today, and a recognition that American authority is now way more divisive than it was during the Cold War, that gave rise to this orgy of Western sheepishness.

Yet now, the moral hollowness and political incoherence of Western institutions revealed during the formation of the anti-Gaddafi alliance are being presented as clever disguises, designed to boost the fortunes of the rebels. Indeed, since the rebels took Tripoli, some observers have even started claiming that we’re witnessing the emergence of a ‘new era in US foreign policy’, a new ‘model for intervention’. According to Fareed Zakaria of CNN, it might have looked as if Obama’s approach was ‘too multilateral and lacked cohesiveness’, what with his decision to withdraw his fighter planes just 48 hours after the intervention started in March, but actually that was all part of a brilliant new strategy called ‘leading from behind’. Others sing the praises of ‘Obama’s light-footprint approach’, claiming that his strategy of ‘limited engagement’ has now produced a ‘nuanced victory’ in Libya. Here, disarray is repackaged as deftness, and a ‘model’ is retrospectively projected on to the mayhem that reigned during the creation and launch of NATO’s mission.

The world must live within its means

The End Of The Long Con
 
By Tim Price

Around 2002, a developing country defaulted on its debt. Following protracted negotiations, agreement was reached that the bad debt would be replaced with a smaller amount of new good debt, with all investors losing around half their original investment. The country’s finance minister, accompanied by a vast retinue of assistants and bankers, embarked on a road show to sell the deal.  

In Tokyo, the meeting attracted a vast throng of aged Japanese retirees, who had invested their savings in the defaulted securities, on the recommendations of financial advisers to earn interest rates higher than those available in Japan. At the end of the minister’s presentation, a frail, ancient Japanese woman stood up and spoke. In a quiet steady voice, she explained the hardships that the loss had caused. She wanted to know “whether there was any chance she would see any of her money before her life ended.”

— From Extreme Money by Satyajit Das.

In July 2008, a bank in Zimbabwe cashed a cheque for $1,072,418,003,000,000 (one quadrillion, seventy-two trillion, four hundred and eighteen billion, three million Zimbabwe dollars). It had taken 28 years from independence for the former colony of Rhodesia to become an economic basket case.  

Inflation in Zimbabwe was 516 quintillion percent (516 followed by 18 zeroes). Prices doubled every 1.3 days. The record for hyperinflation is Hungary, where in 1946 monthly inflation reached 12,950,000,000,000,000 percent – prices doubled every 15.6 hours. In 1923, Weimar Germany experienced inflation of 29,525 percent a month, with prices doubling every 3.7 days. People burned Marks for heat in the cold northern German winter. It was cheaper than firewood. The butter standard was a more reliable form of value than the Mark. The German government took over newspaper presses to print money, such was the demand for bank notes. The abiding image of the Weimar Republic remains that of ordinary Germans in search of food pushing wheelbarrows filled with wads of worthless money.
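As a quick sanity check on those doubling times: at a steady monthly inflation rate r, compound growth means prices double after ln(2)/ln(1 + r) months. A minimal Python sketch (my own illustration, not from the book; the 30.44-day average month is an assumption) reproduces the Weimar and Hungarian figures:

```python
import math

def doubling_time_days(monthly_inflation_pct, days_per_month=30.44):
    """Days for prices to double at a steady monthly inflation rate.

    Compound growth: prices scale by (1 + r) each month, so they
    double after ln(2) / ln(1 + r) months.
    """
    r = monthly_inflation_pct / 100.0
    return days_per_month * math.log(2) / math.log(1 + r)

# Weimar Germany, 1923: 29,525 percent a month
print(round(doubling_time_days(29_525), 1))                       # ~3.7 days
# Hungary, 1946: 12,950,000,000,000,000 percent a month
print(round(doubling_time_days(12_950_000_000_000_000) * 24, 1))  # ~15.6 hours
```

(The Zimbabwe rate is quoted without a stated period, so it cannot be checked with the same formula.)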

The quotation is from Satyajit Das’ just-published Extreme Money – Masters of the Universe and the Cult of Risk (FT Press). This investor’s heart initially sank when offered the chance to review a copy – four years, and counting, of financial crisis have spawned all sorts of crisis porn – but Extreme Money rewards the effort. The tone isn’t exactly gleeful (just look at the mess we’re in!) but the book fairly scampers across the financial landscape, scattering juicy quotes a-plenty in its wake. Das cites Tom Wolfe, for example, quoting Austrian economist Joseph Schumpeter:

‘Stocks and bonds are what he called evaporated property. People completely lose touch of the underlying assets. It’s all paper – these esoteric devices. So it has become evaporated property squared. I call it evaporated property cubed.’ Extreme money is eviscerated reality – the monetary shadow of real things.

Our own external investment panellist, Guy Fraser-Sampson, has described the situation nicely. If humans vanished tomorrow, the likes of economics, financial markets, and money (of any meaning) would vanish with them – but the world would still turn. Finance is a demon of our own design, and we now inhabit an acutely over-financialised world. So what is money?

Something “universally accepted as payment, a claim on other things … a medium of exchange, a measure of the market value of real goods and services, a standard unit of value, and a store of wealth that can be saved and retrieved in the safe knowledge that it will be exchangeable into real things when retrieved”.

But there is also commodity money: anything that is simultaneously money but also a desired tradeable commodity in its own right, money that is good enough to eat. Over history, mankind has experimented with dried fish, almonds, corn, coconuts, tea and rice. As Das points out, the ancient Aztecs used cocoa.

The large green-yellow pods of the cacao tree produce a white pulp that, when dried, roasted and ground, becomes chocolate. Some European pirates seized a ship full of cocoa beans – a true El Dorado worth more than galleons filled with gold doubloons. Unaware of the value of the cargo and mistaking it for rabbit dung, the pirates dumped the cocoa into the ocean.

And as Das ominously observes, in economic chaos, war or collapse, commodity money reappears.

Tuesday, August 30, 2011

Imperial nightmares


The Kurdish Problem


by Morton Abramowitz
Whatever his impressive domestic achievements, Turkish prime minister Erdogan has done a lot of fancy footwork this year trying to repair a vigorous and much-advertised Middle East involvement. Once the avowed comrade of Qaddafi, Bashir, Assad and Ahmedinejad, he has now emerged as a rousing democrat, defender of the Arab revolts. He seems to have been successful in burying the past—at least in Turkey, where public criticism is increasingly muted and he reigns supreme. In Syria, he has joined the West by distancing Turkey from Assad but not yet disowning him, incurring the wrath of both Syria and its staunchest ally, Iran, which has sent warnings to Ankara. In Libya, which once bestowed upon him the Qaddafi human-rights award, he is trying desperately to restore the huge Turkish economic stake by fervently and helpfully embracing the rebels. But for all his foreign-policy activism, he can no longer escape his biggest problem, an internal one: the growing difficulties with his own twelve million or so Kurds.
In the period between 2005 and 2009, Erdogan became the first Turkish leader to do much for the Kurds, bringing in significant investment and notably accepting the ‘Kurdish reality’. He implemented some modest reforms on expressions of Kurdish identity—whether he believed in them or did so to guarantee the vote in southeast Turkey and a route to a new presidency is not clear. But the basic issue has advanced little, and today intensified military activity on the part of the Kurdistan Workers’ Party (PKK) has once again shattered a deceptive Turkish calm. Some forty Turkish soldiers have been killed and many wounded in the southeast over the past two months. In response, Erdogan has shifted gear and publicly declared his intent to finally destroy the PKK and, along the way, to undermine the major domestic Kurdish political party.
The next page-turner will be the promised new Turkish constitution sometime this autumn and what reforms he will secure in that document for the Kurds. Top AKP leadership rhetoric on the new constitution has been democratic and conciliatory, but with popular nationalist feeling running high and Kurds deeply skeptical, not much can be expected. Many fear violence will extend to Turkey’s major cities and to urbanized Kurdish youth. That has always been a concern that has not yet materialized, although small-scale clashes like car burnings, attacks on coffee shops and flash mobs are on the rise. With the schism with Iran, the possibility of urban violence may have increased.

These riots were not a product of permissiveness
Blaming the looting on the ‘liberal experiment’ of the 1960s is not only wrong – it could also make the real problems in urban communities worse.
By Jennie Bristow 

It is hard to formulate a genuinely liberal response to the recent spate of riots and looting in Britain.

You get caught between two dystopian, and equally depressing, visions of society: one where the consequences of the cultural, moral and legislative changes associated with the permissive Sixties are leading us to hell in a handcart faster than you can say ‘Daily Mail’; the other where kids stealing computers and beating up their neighbours are just another (indeed, more sympathetic) version of what bankers, business tycoons and immoral governments do. Those people’s everyday practices have been described by the anti-globalisation campaigner Naomi Klein as ‘Looting with the lights on’.

Both these parables of decline share a common fatalism, borne out of an implicit contempt for individual autonomy. The Daily Mail brigade argues that people need stricter social and moral codes – backed up by the police – to control the excesses of their individual desires, while Guardianistas would prefer to believe that the looting kids are blindly driven by the bleakness of their material circumstances in a hyper-consumerist culture. By way of a solution, one side wants to reduce people’s autonomy by clamping down on their civil liberties; the other wants to do it through recognising poor youth as victims of their circumstances who need more, not less, financial and therapeutic support from the state.

One of the influential ideas to gain traction in the post-riots autopsy is ‘compassionate Conservatism’. This seeks to promote a kind of ‘third way’ between crass lefty-ism and traditional moralising, and has been most systematically developed by the Lib-Con work and pensions secretary Iain Duncan Smith and the think-tank the Centre for Social Justice (CSJ). The CSJ’s agenda is to fix ‘broken Britain’ through policies that self-consciously reduce individuals’ dependence on the welfare state by strengthening the role of the family and communities. In this social vision, individual autonomy is recast as a sinful indulgence practised by the selfish; the goal is to nudge people’s values more in the direction of conformity and self-sacrifice.

None of these diagnoses have any space for a genuinely liberal perspective, which upholds individuals’ ability to make and exercise choices about their personal lives without in any way endorsing the trashing of other people’s livelihoods. A liberal vision of society is one in which individuals are assumed to be able to make moral choices and live with the consequences of their actions. In the post-riots dialogue, all sides assume, for different reasons, that individuals cannot or should not make moral choices, and that any response must find more effective ways of controlling people’s behaviour, whether through sheer force or therapeutic manipulation.

Yet it is not true that our current malaise is a consequence of permissiveness, and it is neither possible nor necessary to turn the clock back to a time where people had fewer lifestyle choices or more stringent community obligations. The problem is rather that the spirit of permissiveness has been emptied of its content: the principle of individual autonomy. And there is a danger that many of the solutions being proposed in the wake of the riots will exacerbate the very problems they set out to address.

The ‘permissive’ moment
An article by Tim Montgomerie, editor of the Conservative Home blog, in the Daily Telegraph articulates the social conservatives’ diagnosis of ‘broken Britain’. ‘Over the past week we have witnessed the culmination of the liberal experiment’, he wrote, arguing that: ‘The experiment attested that two parents don’t matter; that welfare, rather than work, cures poverty; you tolerate “minor crime”; you turn a blind eye to celebrity drug use; you allow children to leave school without worthwhile skills; you say there’s no difference between right and wrong. Well now we’ve seen the results.’

Attacking the Labour Party for its reliance on the welfare state to solve every problem, Montgomerie complains: ‘The left is always ready to attack hyper-capitalism for the ways in which it can erode community bonds, but it looks the other way when it comes to thinking about the ways in which the hyper-state can devour social capital. Labour has become the most materialist and consumerist of Britain’s two largest parties… It reveres “lifestyle choices” as though the kind of home in which a child is raised is somehow equivalent to whether you get your weekly groceries from Morrisons or Asda.’

Future plunder and welfare payments



By D. Blount
Our society is made up of the makers who built it, and the takers who are looting it into oblivion. The ideology of the takers is liberalism. Liberals control the courts. Consequently, the legal system produces rulings like this:

An El Paso County [Colorado] jury on Friday awarded nearly $300,000 to the daughter of a burglar who was fatally shot in 2009 while breaking into an auto lot.…
Phillip and Sue Fox, who filed suit for wrongful death in 2010 on behalf of [burglar Robert Johnson] Fox’s 3-year-old daughter, called the jury’s award a victory in their fight to seek accountability for the death of their son, who they say never posed a threat to the heavily armed men.
Never mind that he had knives in his pockets and one strapped to his ankle, or that he was high on methamphetamine according to his accomplice. Poor Fox was just a cuddly “have-not” who needed money to buy more drugs.
The exact amount of the award was $269,500, for factors such as loss of companionship and loss of future earnings.

Future earnings — for a meth-head burglar? They must mean future plunder and welfare payments.

How does the world work?


Krueger's Keynesian Leftovers
IBD Editorial
Just a week before he unveils a new, improved jobs plan, President Obama has named a new person to be his top economic adviser, Princeton University's Alan Krueger. This doesn't bode well for job creation.
Krueger, a labor economist, is no obscure academic. Though youngish at 50, he's been around for decades, most recently spending two years in Obama's Treasury Department. And in the 1990s, he served a stint as Bill Clinton's chief labor economist.
By naming him to the chairmanship of the president's Council of Economic Advisers, replacing the departed Austan Goolsbee, Obama is sending a strong signal to the business world, Wall Street and the rest of America: expect little in the way of major economic policy shifts.
Or in other words: if you don't like the White House status quo, tough.
Krueger's a known quantity. While serving as Treasury's chief economist in 2009 and 2010, he analyzed several programs, including giving employers tax incentives to hire, "Cash for Clunkers," the Small Business Lending Fund and "Build America" muni bonds.
The economy is still a shambles. None of these programs has worked very well. Was Krueger at Treasury telling the White House these were bad ideas? Nothing we know of suggests that's the case.
Going further back, Krueger was co-author of a major 1992 study that posited that rises in the minimum wage could lead to more hiring. Try telling that to black youth, who suffer a 40% unemployment rate largely because the minimum wage has priced them out of the job market.
Common sense should tell you that when you tax something, you get less of it — not more. Krueger's study was roundly criticized and debunked.
So, with joblessness remaining stubbornly above 9%, we're not optimistic about Krueger's input into Obama's coming jobs program.
Still more recently, Krueger popped up as an advocate for a value-added tax (VAT) or, as some call it, a consumption tax. Nothing wrong with that, per se, unless you're pushing it not as a replacement for our current dysfunctional income-tax code, but as an addition to it.
But that's exactly what Krueger did, although to his credit he did write in a January 2009 New York Times piece that "the main downside of this proposal is that taxes reduce economic activity."
Darn right. Not only that, but unless you get rid of the income tax entirely when you impose a consumption tax, you end up with an overtaxed, stagnant mess. Don't think so? Look at Europe, where citizens are hit with both income tax and a VAT, and the two just keep marching higher.
"European nations imposed VATs about 40 years ago, which simply encouraged more spending and more debt — and now several nations are on the verge of bankruptcy," noted economist Daniel Mitchell of the Cato Institute.
Not everyone feels as we do about Mr. Krueger. Some right-of-center economists, such as former George W. Bush adviser Greg Mankiw and George Mason University's Tyler Cowen, lauded his selection.
And, to his credit, Krueger is the author of several influential studies that have held up over time — including one that suggests extending jobless benefits isn't really stimulus — significant, since this is expected to be part of Obama's jobs program.
Even so, we're disappointed in Krueger's appointment. Nothing personal, but we had hoped Obama would select someone who stands outside of the reigning Keynesian consensus that accepts the primary role of government as a driver of the economy.
That's not how the world works. A massive amount of new and innovative economic research shows that. That's why we can't join others in rejoicing, especially given this administration's repeated economic errors.
Intellectually, Krueger represents nothing new.
Just more Keynesian leftovers.

"Handicapping" in action


The Struggle Over Egalitarianism

Rothbard's 1991 introduction to "Freedom, Inequality, Primitivism, and the Division of Labor," which was written in 1970.

Introduction

In the two decades since this essay was written, the major social trends I analyzed have accelerated, seemingly at an exponential rate. The flight away from socialism and central planning begun in Yugoslavia has stunningly succeeded over the entire "socialist bloc" of Eastern Europe, and there is now at least rhetorical allegiance to the idea of privatization and a free-market economy. More and more, Marxism has become confined to the academics of the United States and Western Europe, comfortably ensconced as parasites upon their capitalist economies. But even among academics, there is almost nothing left of the triumphalist Marxism of the 1930s and 40s, with its boasts of the economic efficiency and superiority of socialist central planning. Instead, even the most dedicated Marxists now pay lip service to the necessity of some sort of "market," however restricted by government.

I. New Areas of Inequality and "Oppression"

But this does not mean that the struggle over egalitarianism is over. Far from it. On the contrary, after the New Left of the late 1960s and early '70s had been discredited by its bizarre turn to violence, it took the advice of its liberal elders and "joined the system." New Leftists launched a successful Gramscian "long march through the institutions," and by becoming lawyers and academics – particularly in the humanities, philosophy, and the "soft" social sciences – they have managed to acquire hegemony over our culture. Seeing themselves defeated and routed on the strictly economic front (in contrast to the Old Left of the 1930s, Marxian economics and the labor theory of value were never the New Left's strong suit), the Left turned to the allegedly moral high ground of egalitarianism.

And, as they did so, they turned increasingly to what was suggested in the last paragraph of my essay: de-emphasizing old-fashioned economic egalitarianism in favor of stamping out broader aspects of human variety. Older egalitarianism stressed making income or wealth equal; but, as Helmut Schoeck brilliantly realized, the logic of their argument was to stamp out, in the name of "fairness," all instances of human diversity and therefore implicit or explicit superiority of some persons over others. In short, envy of the superiority of others is to be institutionalized, and all possible sources of such envy eradicated.

In his book on Envy, Helmut Schoeck analyzed a chilling dystopian novel by the British writer L.P. Hartley. In this work, Facial Justice, published in 1960, Hartley, extrapolating from the attitudes he saw in British life after World War II, opens by noting that after the Third World War, "Justice had made great strides." Economic Justice, Social Justice and other forms of justice had been achieved, but there were still areas of life to conquer. In particular, Facial Justice had not yet been attained, since pretty girls had an unfair advantage over ugly ones. Hence, under the direction of the Ministry of Face Equality, all Alpha (pretty) girls and all Gamma (ugly) girls were forced to undergo operations at the "Equalization (Faces) Centre" so as all to attain Beta (pleasantly average) faces.[i]

Coincidentally, in 1961, Kurt Vonnegut published a pithy and even more bitterly satirical short story depicting a comprehensively egalitarian society, even more thoroughgoing than Hartley's. Vonnegut's "Harrison Bergeron" begins:

The year was 2081, and everybody was finally equal. They weren't only equal before God and the law. They were equal every which way. Nobody was smarter than anybody else. Nobody was better looking than anybody else. Nobody was stronger or quicker than anybody else. All this equality was due to the 211th, 212th, and 213th Amendments to the Constitution, and to the unceasing vigilance of agents of the United States Handicapper General.

The "handicapping" worked partly as follows:

Hazel had a perfectly average intelligence, which meant she couldn't think about anything except in short bursts. And George, while his intelligence was way above normal, had a little mental handicap radio in his ear. He was required by law to wear it at all times. It was tuned to a government transmitter. Every twenty minutes or so, the transmitter would send out some sharp noise to keep people like George from taking unfair advantage of their brains.[ii]

This sort of egalitarian emphasis on noneconomic inequalities has proliferated and intensified in the decades since these men penned their seemingly exaggerated Orwellian dystopias. In academic and literary circles "political correctness" is now enforced with an increasingly iron hand; and the key to being politically correct is never, ever, in any area, to make judgments of difference or superiority.

Thus, we find that a Smith College handout from the Office of Student Affairs lists ten different kinds of "oppression" allegedly inflicted by making judgments about people. They include: "heterosexism," defined as "oppression" of those with nonheterosexual orientations, which include "not acknowledging their existence"; and "ableism," defined as oppression of the "differently abled" [known in less enlightened days as "disabled" or "handicapped"], by the "temporarily able." Particularly relevant to our two dystopian writers is "ageism," oppression of the young and the old by youngish and middle-aged adults, and "lookism" (or "looksism"), defined as the "construction of a standard of beauty/attractiveness."

"Oppression" is also supposed to consist, not only of discriminating in some way against the unattractive, but even in noticing the difference. Perhaps the most chilling recently created category is "logism" or "logo-centric," the tyranny of the knowledgeable and articulate. A set of "feminist scholarship guidelines" sponsored by the state of New Jersey for its college campuses attacks knowledge and scientific inquiry per se as a male "rape of nature." It charges:
mind was male. Nature was female, and knowledge was created as an act of aggression – a passive nature had to be interrogated, unclothed, penetrated, and compelled by man to reveal her secrets.[iii]
"Oppression" is of course broadly defined so as to indict the very existence of possible superiority – and therefore an occasion for envy – in any realm. The dominant literary theory of deconstructionism fiercely argues that there can be no standards to judge one literary "text" superior to another. At a recent conference, when one political science professor referred correctly to Czeslaw Milosz's book The Captive Mind as a "classic," another female professor declared that the very word classic "makes me feel oppressed."[iv] The clear implication is that any reference to someone else's superior product may engender resentment and envy in the rank and file, and that catering to these "feelings of oppression" must be the central focus of scholarship and criticism.

The whole point of academia and other research institutions has always been an untrammelled search for truth. This ideal has now been challenged and superseded by catering to the "sensitive" feelings of the politically correct. This emphasis on subjective feelings rather than truth is evident in the current furor over the teaching of the distinguished Berkeley anthropologist, Vincent Sarich. Sarich's examination of genetic influences on racial differences in achievement was denounced by a fellow faculty member as "attempting to destroy the self-esteem of black students in the class."[v]

II. Group Quotas

Indeed, one radical change since the writing of this essay has been the rapid and accelerating transformation of old-fashioned egalitarianism, which wanted to make every individual equal, into group-egalitarianism on behalf of groups that are officially designated as "oppressed." In employment, positions, and status generally, oppressed groups are supposed to be guaranteed their quotal share of the well-paid or prestigious positions. (No one seems to be agitating for quotal representation in the ranks of ditch diggers.) I first noticed this trend in a paper written one year after the present essay at a symposium on The Nature and Consequences of Egalitarian Ideology.

Monday, August 29, 2011

American Financial Colony or Mercantilist Predator?


China and the Gold Standard 

By Lewis Lehrman

China is an important trading partner of America. But it may also be a mortal threat, and not for the conventional reasons usually cited in the press. Ironically, it is a threat because China is in fact a financial colony of the United States, a colony subsidized and sustained by the pegged, undervalued yuan-dollar exchange rate. Neither the United States nor its economic colony seems to understand the long-term destructive consequences of the dollarization not only of the Chinese economy but also of the world monetary system. While the Chinese financial system has been corrupted primarily by tyranny, deceit, and reckless expansionism, it is also destabilized by the workings of the world dollar standard. Neither the United States nor China has come to grips with the perverse effects of the world dollar standard.

The social and economic pathology of 19th century colonialism is well studied, but the monetary pathology of its successor, the neo-colonial reserve currency system of the dollar, is less transparent. In order to remedy this pathological defect, the United States must rid itself of its enormous Chinese financial colony, whose exports are subsidized by the undervalued yuan in return for Chinese financing of the U.S. twin deficits. Both China and the United States must also free themselves from the increasing malignancy of the dollar reserve currency system, the primary cause of inflation in both China and the United States.

In the end, only monetary reform, including an end to the reserve currency system, can permanently separate the dollar host from its yuan colony. Without monetary reform, the perverse effects of the dollar reserve currency system will surely metastasize into one financial and political crisis after another—even on the scale of the 2007–2009 crisis.

It is, of course, a counterintuitive fact that China has been financially colonized by the United States. But why is this a fact? Simply because China has chained itself to the world dollar standard at a pegged, undervalued exchange rate, choosing therefore to hold the exchange value of its trade surplus—that is, its official national savings—in U.S. dollar securities. It is true that the dollar-yuan strategy of America’s Chinese colony has helped to finance a generation of extraordinary Chinese growth. But China now holds more than 3 trillion dollars of official reserves and more than a trillion dollars in U.S. government securities. These Chinese dollar reserves directly finance the deficits of the American colonial center. This arrangement clearly resembles the imperial system of the late 19th century. The value of a British colony’s reserves was often held in the currency of the imperial center, then invested in the London money market. Thus, the colony’s reserves were entirely dependent on the stability of the currency of the colonial center. While China is America’s largest financial colony, most other developing countries are also bound to neo-colonial status within the reserve currency hegemony of the dollarized world trading system.

China’s dollarized monetary system reminds us of nothing so much as the historic colonial financial arrangements imposed by the later British Empire on India before World War I—India actually remaining a financial colony of England long after its independence in 1947. How did the sterling financial empire work? The imperial colony of India, beginning in the late 19th century, held its official Indian currency reserves (savings) in British pounds deposited in the English money market; independent developed nations at that time, like France and Germany, held their reserves in gold. That is, France, Germany, and the United States settled their international payment imbalances in gold—a non-national, common, monetary standard—holding their official reserves, too, in gold. But the London-based reserves of colonial India were held not primarily in gold, but in British currency, helping to finance not only the imperial economic system, but also the imperial banking system, imperial debts, imperial wars, and British welfare programs. Eventually, as we know, both the debt-burdened British Empire and its official reserve currency system collapsed.

For more than a generation now, a similar process has been at work in China. China is America’s chief colonial appendage. The Chinese work hard and produce goods. Subsidized by an undervalued yuan, they export much of their surplus production to America. But, like the Indians who were paid in sterling, the exports of Chinese colonials are substantially paid in dollars, not yuan—because bilateral and world trade, and the world commodities market, have been dollarized. And thus it may be said that the world financial system is today an unstable neocolonial appendage of the unstable dollar.

China, like its predecessor the British colony of India, has chosen to hold a significant fraction of what it is paid in the form of official dollar reserves (or savings). These dollars are promptly redeposited in the U.S. dollar market, where they are used to finance U.S. deficits. Every Thursday night, the Federal Reserve publishes its balance sheet, and there we now read that more than $2.5 trillion of U.S. government securities are held in custody for foreign monetary authorities, 40 percent of which is held for the account of America’s chief financial colony, Communist China. It is clear that without financial colonies to finance and sustain the immense U.S. balance of payments and budget deficits, the U.S. paper dollar standard and the growth of U.S. government spending would be unsustainable.

It is often overlooked that these enormous official dollar reserves held by China are a massive mortgage on the work and income of present and future American private citizens. This Chinese mortgage on the American economy has grown rapidly since the suspension of dollar convertibility to gold in 1971. China—poor and undeveloped in 1971—was at that time very jealous of its sovereign independence, sufficiently so to reject its alliance with the Soviet Union—and, even earlier, to attack U.S. armies on the Chinese border during the Korean War. In an ironic twist of fate, China surrendered its former independence and, as a U.S. financial colony, joined the dollar-dominated world financial system. China’s monetary policy is anything but independent. It is determined primarily by the Federal Reserve Board in America, the pegged yuan-dollar exchange rate serving as the transmission mechanism of Fed-created excess dollars pouring into the Chinese economic system. Perennial U.S. balance of payments deficits send the dollar flood not only into China but also into all emerging countries. The Chinese central bank buys up these excess dollars by issuing new yuan, thereby holding up the overvalued dollar and holding down the undervalued yuan. Many of these Chinese official dollar purchases are then invested in U.S. government debt securities. So even though America exports excess dollars to China, China sends them back to finance the U.S. budget deficit—much like marionettes walking off one side of the stage, merely to reappear unchanged on the other side.
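The recycling loop described here can be made concrete with a toy ledger. The sketch below is purely illustrative (the peg level, surplus figures and names are assumptions of mine, not data from the article): each round of pegged trade surplus shows up twice, once as Treasuries financing the US deficit and once as newly issued yuan expanding China's domestic money supply, which is the inflation transmission channel just described.

```python
# Toy model of the dollar-recycling loop. All numbers are
# illustrative assumptions, not data from the article.

PEG = 6.8  # assumed yuan per dollar under the peg

treasuries_bn = 0.0  # dollars recycled into US government debt, $bn
new_yuan_bn = 0.0    # yuan issued to absorb exporters' dollars, yuan bn

def settle_surplus(surplus_usd_bn):
    """One round: exporters earn dollars, the central bank buys them
    at the peg with newly issued yuan, then parks them in Treasuries."""
    global treasuries_bn, new_yuan_bn
    new_yuan_bn += surplus_usd_bn * PEG   # domestic money supply expands
    treasuries_bn += surplus_usd_bn       # dollars lent back to the US

for _ in range(10):  # a hypothetical decade of $250bn annual surpluses
    settle_surplus(250)

print(f"Treasuries accumulated: ${treasuries_bn:,.0f}bn")  # $2,500bn
print(f"New yuan issued: ¥{new_yuan_bn:,.0f}bn")           # ¥17,000bn
```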