Sunday, November 6, 2011

The spilled milk of the American empire


Wall Street protesters share the Tea Party’s illusions

By John Payne
As I walked toward One Liberty Plaza on the first Friday in October, I thought about the Arcade Fire song “Rococo,” in which lead singer Win Butler lambastes “the modern kids” who use “great big words that they don’t understand” and build up an institution “just to burn it back down.” At the time it seemed an apt description of the Occupy Wall Street protesters who had seized the plaza—mostly young kids who enjoyed the fruits of American capitalism but now talked of tearing down the system that weaned them.

After walking around the protest for a half hour or so, I felt secure in that prejudice. A dozen or so drummers pounded away in a corner, their ceaseless rhythm rumbling through the canyons of lower Manhattan for at least a quarter mile in any direction. In the middle of the protesters’ impromptu shantytown, four people sat at a marble table busily rolling cigarettes behind a donation bucket and sign reading “Free Cigarettes: Nick@Nite.” The atmosphere led one of my friends to remark that it was “all the worst aspects of a hippie festival with none of the drugs.”

With the exception of a smattering of Ron Paul signs, the protesters’ placards and literature showed an almost universal hostility to a market economy. A young man in a suit handed out a semi-official pamphlet welcoming me to the occupation. It advised me to visit the food bar where I could dine on granola and “Occu-pie” and suggested that after eating I should “feel free to refresh [myself] in the restrooms of neighboring businesses like Burger King and McDonalds without feeling obligated to buy anything.”

The pamphlet listed only one political demand: “Stop Corporate Personhood.” This was a relatively common theme on the signs, but for all that talk, many protesters also advocated an increase in corporate taxes, evidently failing to realize that the one policy precludes the other. Near the People’s Library, a piece of paper hung on the wall urging adoption of a mandatory four-day work week because it “keeps all the efficiencies of capitalism” but forces employers to hire the currently jobless to make up for the 20 percent reduction in labor hours.

As my group began walking uptown on Broadway, we ran into another throng of protesters marching down to the main demonstration. I stepped back and started snapping pictures just in time to capture a handsome young man with spiky blond hair dressed smartly in a black collared shirt and matching jeans. He carried a sign urging us to “Occupy Everything” because “we already know that we own everything.” With his square jaw and the steely resolve in his eyes he looked like a Bizarro John Galt, ready to throw himself on the gears of modern capitalism and grind them to a halt.

I left that night convinced that the protest was little more than the latest banshee cry of the radical left. Over the weekend, however, hundreds of protesters were arrested on the Brooklyn Bridge and similar occupations sprang up across the country. These developments prompted me to return to One Liberty Plaza the next Tuesday.

I arrived just in time for the General Assembly, where the plaza’s occupants attempt to build consensus about how their community should be run. Protesters are prohibited from using amplification, so to ensure that everyone can hear the proceedings they use what they call the “human mic,” whereby all participants repeat what the person who has the floor just said. Periclean Athens this wasn’t. Still, it struck me as a genuine attempt by the protesters to build something real.

As I interviewed some of the protesters that night, I discovered that many of them were not driven by a blind rage against capitalism but were simply trying to assert some modicum of control over institutions they believe are running roughshod over them. Carey Tan, an event planner with a nonprofit, told me she wanted the Glass-Steagall Act put back into place “to make sure my money isn’t being used to buy … sub-prime mortgages and lots of risky investments.” She also wanted to see the revolving door between business and government closed but was unsure how that could be accomplished.

Joe Therrien, a teacher in Brooklyn, echoed Tan’s argument about Glass-Steagall and also called for higher taxes on the rich and corporations. But he was not opposed to corporations as such, saying that he “want[s] there to be rich companies in America” and thinks they should pay more in taxes because they benefit from government services. Therrien was refreshingly humble about the limits of his own knowledge and put his faith in the ability of Americans to solve our problems through civil discourse. He said that although “there are individuals who claim to have the answers … as a group, we’re trying to figure it out together.”

You can disagree with Tan’s policy proposals or call Therrien’s trust in participatory democracy naïve, but these were not bomb-throwing radicals. They were relatively ordinary Americans who looked around one day, saw obviously dysfunctional political and economic systems, and decided to do something about them. And although the media portrays them as the Wall Street protesters’ polar opposites, the same can be said of most Tea Partiers. The Tea Party is older and more conservative, while the Occupiers are younger and more left-wing, but both are attempting to come to terms with American decline. Both are sincere and well-meaning in their own ways, but our problems are much more severe than either group dares admit.

It’s easy to scapegoat earmark spending (never mind that middle-class entitlements such as Social Security and Medicare are the real driving forces of government spending) or the 1 percent on Wall Street who are supposedly exploiting the rest of us poor bastards (never mind that most of us own stocks and bonds), but these are lies. Comforting lies, but lies nonetheless.

On Tuesday night, I stumbled across Jimmy McMillan—the Rent Is Too Damn High Party guy—standing on the edge of the plaza with a small crowd around him. “Go home,” he roared to the mostly uninterested protesters. “Make love to your girl.” When I asked him for an explanation of the comment, he told me that all Americans are responsible for our current predicament because they have perpetuated a corrupted political system. Now “the diehard Democrat is dying real hard.”

McMillan’s comments give away his political self-interest, of course, but they at least confronted the fact that we largely brought our woes upon ourselves by living beyond our means. The 1 percent in government and business may have made our bed, but we slept in it, happily dreaming the impossible, and now we refuse to shake off our delusional slumber. The Federal Reserve and the lending institutions sold us houses at 3 percent interest and no money down, but we bought them.

American exceptionalism and privilege are crashing down around us, but these protest groups—like the vast majority of Americans—refuse to reconcile themselves to this new, hostile reality. The world will move on without us; we are no longer the indispensable nation. Once we accept that fact, we can get down to the difficult business of becoming a normal country. Until then, Occupiers and Tea Partiers will remain little more than petulant children crying over the spilled milk of the American empire.

Arrivederci, Roma


Will popular democracy bring down the New World Order?
 
By Patrick J. Buchanan

A fair question. For Western peoples are growing increasingly reluctant to accept the sacrifices that the elites are imposing upon them to preserve that New World Order.

Political support for TARP, to rescue the financial system after the Lehman Brothers collapse, is being held against any Republican candidate who backed it. Germans and Northern Europeans are balking at any more bailouts of Club Med deadbeats.

Eighty-one members of David Cameron’s party voted against him to demand a referendum on whether Britain should leave the European Union altogether, the worst Tory revolt ever against the EU.

Greek Prime Minister George Papandreou imperiled the grand bargain to save the eurozone by announcing a popular vote on whether to accept the austerity imposed on Greece, or default, and let the bank dominoes begin to fall. The threat faded only when Papandreou cancelled the referendum.

But the real peril is Italy, No. 3 economy in the eurozone, with a national debt at 120 percent of gross domestic product.

After the plan to save the eurozone was announced, interest rates on new Italian debt surged above 6 percent, with 6.5 percent regarded as unsustainable.

When Papandreou announced his referendum, the cost of Italian debt surged again. Should buyers of Italy’s debt go on strike, fearing a Rome default or write-down, that is the end of the eurozone and potentially the end of the EU.

But an even larger question hangs over Rome.

Will Italy survive as one nation and one people?

For the austerity demanded of Italy to deal with its debt crisis is adding kindling to secessionist fires in the north, where the Lega Nord of Umberto Bossi, third largest party in Italy, seeks to lead Lombardy, Piedmont and Veneto, with the cities of Turin, Milan and Venice, out of Italy into a new nation — Padania.

The north has long resented Rome, Naples and Sicily, seeing them as lazier and less industrious. Bossi, who calls himself “Braveheart,” after the Scottish hero of the Mel Gibson movie, sees northern people as Celts who are ethnically different and separate from the rest of Italy.

The Northern League’s belief that the people of Southern Italy caused the debt crisis, bringing on austerity, mirrors the belief of much of Northern Europe that Italy and Greece do not deserve to be bailed out.

As the north is also home to 60 percent of the immigrants who have poured into Italy — Gypsies from Romania, Arabs from the Maghreb and Middle East — Bossi’s party is aggressively anti-immigrant, as are the other surging populist parties of Europe.

Americans who deplore the tough laws against illegal immigration in Arizona and Alabama might look to Italy, where the Northern League managed to have illegal entry into the country declared a felony.

The League was also behind a new law calling for sending back tens of thousands of Arab Spring migrants who arrived on the tiny Italian island of Lampedusa, which is closer to Africa than to Italy.

But while resentment against the south for alleged freeloading and causing the debt crisis is bringing the secession issue to a boil, demography may be the greater threat to the national future.

Italy, says Cardinal Angelo Bagnasco, president of the Italian Bishops Conference, is heading for “demographical suicide,” and the reason is a low birth rate caused by its “cultural and moral distress.”

According to Italy’s National Office of Statistics, in 2009 the fertility rate of Italian women was 1.41 children per woman. This is only two-thirds of what is needed simply to replace Italy’s existing population.

Italy’s fertility rate has been below replacement levels for 35 years. By mid-century, Italy will be a nation with a birth rate that will have been below, at times far below, zero population growth for 75 years.

Italy’s birth rate in 1950 was almost twice its death rate. But the death rate equaled the birth rate in 1985, exceeds it today and will be approaching twice the birth rate by 2050.

Italy is not only aging, with the median age of its population going from 43 today to 50 at midcentury; Italy is dying. If this does not change, what the world knows as Italy will not exist at the end of this century.
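The arithmetic behind these figures is simple enough to check. Below is a minimal back-of-the-envelope sketch in Python (not a demographic model) that assumes a constant fertility rate of 1.41 against a replacement level of roughly 2.1 children per woman, ignoring migration; it shows why each successive generation comes in about one-third smaller than the last.

```python
# A rough generational sketch, not a demographic projection. Assumptions:
# a constant total fertility rate of 1.41 (Italy's 2009 figure, cited above),
# a replacement level of ~2.1 children per woman, and no migration.

TFR = 1.41         # children per woman
REPLACEMENT = 2.1  # level needed to hold each generation steady

ratio = TFR / REPLACEMENT
print(f"Each generation is {ratio:.2f} times the size of the previous one "
      f"(about {1 - ratio:.0%} smaller).")

# Compound the shrinkage across successive ~30-year generations.
cohort = 1.0
for gen in range(1, 4):
    cohort *= ratio
    print(f"Generation {gen}: {cohort:.0%} of the starting cohort")
```

Run it and the shrinkage compounds quickly: roughly 67 percent of the original cohort after one generation, 45 percent after two, 30 percent after three.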

Like other European nations, Italy faces an existential crisis.

Her national debt is twice what the EU says is tolerable. She must undergo years of painful austerity to pay back what she has borrowed and spent. Yet a shrinking population of working-age young and an expanding pool of seniors to care for will make that increasingly difficult, and default on her debts increasingly attractive, as it is today to the Greeks.

The Northern League, seeing the south as the source of its troubles, will grow in appeal, as those troubles grow.

If your debts are larger than your economy, your death rate exceeds your birth rate and every new generation will be one-third smaller than the previous one, what kind of future does your country have?

The kind of future Italy faces.

Tough principles


Who Killed Horatio Alger?
The decline of the meritocratic ideal
By Luigi Zingales
The title character of Horatio Alger’s 1867 novel Ragged Dick is an illiterate New York bootblack who, bolstered by his optimism, honesty, industriousness, and desire to “grow up ’spectable,” raises himself into the middle class. Alger’s novels are frequently misunderstood as mere rags-to-riches tales. In fact, they recount their protagonists’ journeys from rags to respectability, celebrating American capitalism and suggesting that the American dream is within everyone’s reach. The novels were idealized, of course; even in America, virtue alone never guaranteed success, and American capitalism during Alger’s time was far from perfect. Nevertheless, the stories were close enough to the truth that they became bestsellers, while America became known as a land of opportunity—a place whose capitalist system benefited the hardworking and the virtuous. In a word, it was a meritocracy.

To this day, Americans are unusually supportive of meritocracy, and their support goes a long way toward explaining their embrace of American-style capitalism. According to one recent study, just 40 percent of Americans attribute higher incomes primarily to luck rather than hard work—compared with 54 percent of Germans, 66 percent of Danes, and 75 percent of Brazilians. But perception cannot survive for long when it is distant from reality, and recent trends seem to indicate that America is drifting away from its meritocratic ideals. If the drifting continues, the result could be a breakdown of popular support for free markets and the demise of America’s unique version of capitalism.

The fundamental role of an economic system, even an extremely primitive one, is to assign responsibility and reward. In animal packs, the responsibility of leadership and the reward of mating opportunities are generally assigned to the strongest. In human societies, responsibility tends to take the form of employment, and the rewards are money and prestige. Because physical strength has long since lost its importance, economic systems determine in various ways who receives the responsibilities and the rewards. The dominant criterion in traditional society was birth: the king’s firstborn son was the next king; the landowner’s firstborn son, the new landowner; and the son of the company’s owner, the next chief executive. Most modern societies, by contrast, try to select and reward according to merit. Indeed, surveys show that in the abstract, most people in developed countries agree with the idea that merit should be rewarded.

It isn’t easy to decide what constitutes merit, of course. Consider an environment with which I’m familiar: American academia. Let’s say you want to determine who the best professors are. How do you rank publications? Do you value the number of papers that someone has written, or their impact? How do you measure that impact—is it merely the number of times that a paper has been cited, or should citations be weighted by the importance of the journal that makes the citation? Do you value good citations (“This is a seminal paper”) and bad citations (“This paper is fundamentally flawed”) equally? What about teaching? How do you evaluate that? Should it be measured by students’ satisfaction, or should other criteria come into the picture? If so, which? And what about other dimensions, such as collegiality and “service” to the school? Any system of determining merit must assign various weights to each of these dimensions, a process that is inevitably somewhat arbitrary, and the arbitrariness can create the presumption of unfairness and favoritism.
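To make the arbitrariness concrete, here is a small hypothetical sketch; the professors, scores, and weights below are invented for illustration, not drawn from any real evaluation. The same three candidates produce different rankings under two equally defensible weighting schemes.

```python
# Hypothetical illustration of how arbitrary weights drive merit rankings.
# All names and numbers are invented.

professors = {
    "A": {"papers": 30, "citations": 900,  "teaching": 3.2},
    "B": {"papers": 12, "citations": 2400, "teaching": 3.8},
    "C": {"papers": 20, "citations": 600,  "teaching": 4.9},
}

def rank(weights):
    def score(p):
        # Normalize each dimension by its maximum so the weights are comparable.
        return sum(weights[k] * p[k] / max(q[k] for q in professors.values())
                   for k in weights)
    return sorted(professors, key=lambda name: score(professors[name]), reverse=True)

print(rank({"papers": 0.5, "citations": 0.4, "teaching": 0.1}))  # research-heavy: A, B, C
print(rank({"papers": 0.1, "citations": 0.3, "teaching": 0.6}))  # teaching-heavy: B, C, A
```

Neither weighting is obviously wrong, yet each crowns a different “best professor,” which is exactly the opening for perceived unfairness and favoritism.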

As that example suggests, a system of measuring merit should be efficient and difficult to manipulate, and above all, it should be deemed fair—or at least not too unfair—by most of the people subject to it. We can now begin to understand why support for meritocracy translates so neatly into support for the market system. Markets are far harder to manipulate than, say, a list of tenure requirements that an academic committee has created, or—to take a broader example—the decisions of statist regimes determining which lucky citizens get which consumer products. The market system has the reputation, too, of producing efficient results. And it doesn’t violate the prevailing notion of fairness too much.

Naturally, not everyone embraces the market system. Probably the reason that intellectuals tend to reject it is that it doesn’t reward what they think is meritorious: Lady Gaga makes a lot more money than Nobel laureates do. But in America, people largely accept the system—not merely because they think that it will deliver a reasonably efficient outcome, but also because they consider it mostly fair. As in Horatio Alger’s stories, they believe, such virtues as honesty, frugality, and hard work will be rewarded.

But this rosy picture obscures a hard fact: meritocracy is a difficult principle to sustain in a democracy. Any system that allocates rewards on the basis of merit inevitably gives higher compensation to the few, leaving the majority potentially envious. In a democracy, the majority generally rules. Why should that majority agree to grant a minority disproportionate power and rewards?

To understand the difficulty, consider the University of Chicago, an institution that still attracts market-oriented people, thanks to its association with the great free-market economist Milton Friedman. Who could be more merit-minded than MBA students who attend such a place, investing tens of thousands of dollars and two years of their lives to reap the rewards of a meritocratic system? Nevertheless, in a move that contradicted the meritocratic spirit, Chicago MBA students voted in 2000 not to reveal their grades to recruiters. The reason was clear: allowing recruiters to distinguish among them based on merit would benefit a minority of them at the expense of a majority. Even the most meritocratic people, then, can vote against meritocracy when it damages their own prospects. No wonder meritocracy is so politically fragile.

However, two factors help sustain a meritocratic system in the face of this challenge: a culture that considers it legitimate to reward effort with higher compensation; and benefits large enough, and spread widely enough through the system, to counter popular discontent with inequality. The cultural factor is easy to spot in America, which encouraged meritocracy from its inception. In the eighteenth century, the social order throughout the world was based on birthrights: nobles ruled Europe and Japan, the caste system prevailed in India, and even in England, where merchants were gaining economic and political strength, the aristocracy wielded most of the political power. The American Revolution was a revolt against aristocracy and the immobility of European society, but unlike the French Revolution, which emphasized the principle of equality, it championed the freedom to pursue happiness. In other words, America was founded on equality of opportunities, not of outcomes. The subsequent economic success of the new country cemented in the collective perception the benefits of assigning rewards and responsibilities according to merit.

This historical heritage is reflected in American attitudes today. Income inequality in the United States is among the largest in the developed world. Yet in a recent survey of 27 developed countries by the Pew Charitable Trusts, only one-third of Americans agreed that it was the government’s responsibility to reduce income inequality; the country with the next smallest fraction to agree was Canada, with 44 percent, and the responses rose as high as Portugal’s 89 percent. Americans do not want to redistribute income, but they do want the government to provide a level playing field: over 70 percent of Americans said that the role of government was “to ensure everyone has a fair chance of improving their economic standing.”

This belief in equality of opportunity is supported by another belief: that the system is actually fair. Sixty-nine percent of Americans in the same survey agreed with the statement “People are rewarded for intelligence and skill,” a far larger percentage than in any other country. At the same time, only 19 percent of Americans thought that coming from a wealthy family was important for getting ahead, versus 39 percent in Chile, 53 percent in Spain, and a median response across all nations of 28 percent.

In America, the legitimacy of rewarding hard work is so pervasive that even undergraduates in the country’s leftmost precinct, Berkeley, apparently endorse it. Economists have created an experiment called the “dictator game,” in which a subject is given a sum of money and asked to divide it however he likes between himself and an anonymous player. The experiment has been run thousands of times, and on average, people give 20 percent of their money to the anonymous players, presumably out of altruism or compassion. Recently, however, economist Pamela Jakiela changed the conditions of the experiment. In one treatment, the subjects—Berkeley undergrads—were told that the anonymous players had worked hard; in another treatment, they were told that the players had done nothing. The students, it turned out, were much more willing to reward the hard workers than the slackers. In still another experiment, in which the subjects allocating the money had to work hard to gain it (by sorting beans, as it happened), they were much less willing to give it away.

Don’t suppose that the culture of meritocracy is universal. When the same experiment was conducted in Kenya, it yielded opposite results, with the subjects more willing to reward luck than hard work. But in America, from Berkeley to Boston, people believe in greater reward for greater effort, and that belief helps protect meritocratic capitalism from the forces that threaten to undermine it.

A meritocracy can’t survive on a supportive culture alone, however; it must also confer benefits large enough for people to recognize them. It’s no accident that meritocratic systems emerge when their potential benefits are the most sharply felt. At the national level, that tends to occur in wartime, especially when the survival of a country is at stake. In 1793, when the French Revolution was threatened by the invading armies of other European powers, the Jacobin government started to promote talented soldiers, rather than well-bred ones. This simple innovation allowed the Revolution to beat back Europe’s better-armed and better-trained forces. A similar effect was demonstrated by a friend of mine who felt sick one day. His wife offered to call their doctor friend. “No, I’m really sick,” he said. “I need a real doctor.” When life is in danger, there’s an enormous benefit to choosing according to merit, not loyalty.

So a meritocratic system, to engender a broad consensus, must confer relatively sizable benefits, even if they aren’t necessarily so enormous as saving a country from defeat. That isn’t an easy task. In politics, for example—a field in which value is mostly redistributed rather than created—the benefits conferred by meritocracy are relatively small compared with the benefits conferred by cronyism. If I appoint my friends to office, even when they aren’t terribly competent, I lose relatively little efficiency and gain quite a lot of power. Hence meritocracy is difficult to sustain in government.

To die well, we must know first what we have lived for


Here and After
Political warrior David Horowitz reflects on life and death in his new book “A Point in Time: The Search for Redemption in This Life and the Next”
By Theodore Dalrymple
Death is every life’s inevitable denouement, but La Rochefoucauld told us that we can no more stare it in the face than we can stare at the sun. For the most part, we continue our daily round in a state of presumed immortality, and because we are so unfamiliar nowadays with death—it having been carefully put out of our sight by a host of professionals—we treat it as an unwarranted intrusion into our affairs rather than as an existential limit to our brief earthly sojourn. For many, death has become anomalous rather than inevitable, something to protest against rather than accept. For them, the concept of a good death is entirely alien or antipathetic.

David Horowitz tries to stare his own death in the face. Now 71, he has had cancer of the prostate, and he has diabetes and angina; his diplomatic immunity from death, which we all grant ourselves, has been unmistakably withdrawn. His short new book, which it is both necessary and a pleasure to read in one sitting, is a meditation on the meaning of life, sub specie aeternitatis.

Horowitz begins by reflecting on the nature and character of his dogs, whom he takes for regular walks. Perhaps those who don’t love dogs will think this an odd way to begin a book on the meaning of life, but it seems entirely natural and fitting. Indeed, I was struck by how Horowitz’s meditations paralleled mine, occasioned by my relationship, and walks, with my own dog—a relationship intense and happy, at least on my side and, if I don’t delude myself, on his also. The dog, of course, has no intimation of his own mortality, while the owner’s pleasure in the animal’s company is increasingly tinged with a melancholy awareness of his swiftly approaching dissolution. Yet the dog maintains his passionate interest in the little world around him, his small-scale curiosity in his immediate environment. In the face of the physical immensity of the universe and the temporal vastness that both preceded and will follow his oblivion, is a man in any fundamentally different situation?

As far as we know, we are the only creatures to demand of their existence a transcendent meaning. This can be supplied by various means, most commonly religious belief. Horowitz is unable to accept belief in a personal God, but wishes he could and, unlike many in his position, does not scorn those who do. He is decidedly not the village atheist.

More than most, however, he has reason to know that politics can also give, or at any rate appear to give, transcendent meaning to life. The secular religion of Marxism was particularly adept at supplying this meaning, though nationalist struggles could do the same. To believe that one was a soldier in history’s army, marching toward the predestined final victory when mankind would become terminally happy, and that one’s participation would help bring forward that consummation, was to know that one did not live in vain. Even personal suffering can be lessened by adherence to a political cause: either such suffering is experienced as a consequence of the struggle, or it is at least ameliorated by an acceptance of its pettiness by comparison with the greater goal.

Horowitz offers brief but moving glimpses of his father, a true believer in the ability of Marxism (in what he considered its indubitably correct form) not only to interpret the world but to change it. The preposterous intellectual grandiosity of this belief contrasted comically, and sadly, with Horowitz senior’s position in the world. His son’s depiction has an elegiac quality, portraying the tragicomedy of a man who thought he had penetrated to the heart of existence’s mystery but was really quite weak. Though he embraced a doctrine that had done untold evil in the world, he himself was a gentle soul. His son writes in sorrow, not anger.

The author has reason to know better than most the religious nature of the revolutionary creed. In 1971, when still under the influence of leftism, he edited a book of essays dedicated to the life and work of the Marxist historian Isaac Deutscher. Like Horowitz’s father, Deutscher kept his faith in the immaculate conception of the October Revolution, a revolution that was, alas, subsequently to be corrupted—just as Rousseau thought naturally innocent mankind was corrupted by society. One of the essays in this book, by the Economist’s former Paris correspondent, Daniel Singer, contains the following passage:

Could one trust the statement of a Komintern ready to distort in such a fashion? Isaac was driven to question all authorized versions, to go back to the October revolution, to study the conflicts that followed Lenin’s death. The German heresy thus led him logically to an understanding and rejection of the Stalinist system.
The religious nature of Deutscher’s belief in revolutionary Marxism could hardly have been clearer. Authorized versions give rise to, or at least are the precondition of, heresies. Deutscher went back to the October Revolution, and to Lenin’s words, as Muslim fundamentalists go back to the Koran, for a source of undoubted and indisputable truth. Inside every heretic, it seems, a dogmatist is trying to get out.

Horowitz has put the pseudo-transcendence of a purpose immanent in history so completely behind him that he can now write about it calmly and without rancor. His masters are now Marcus Aurelius, the Stoic Roman emperor, whom he likes to quote, and Dostoyevsky, who was among the first to grasp the significance of the perverted religious longings of the revolutionary intelligentsia, and the hell on earth to which they would inevitably lead. But the temptations of ideology are always present: Dostoyevsky, so aware of the dangers of the revolutionary intelligentsia, himself subscribed in entries in A Writer’s Diary to an ideology at least as absurd: that of Slavophile millenarianism. It is wrong to oppose one ideology with another, but it is by no means easy to escape the trap of doing so.

If neither formal religious belief nor secular religions like Marxism gives meaning to Horowitz’s life, what does? In large measure, it is his work: a lifetime spent in the crucible of political thought and struggle, first on the left, and then, over the last quarter century or so, as a devout conservative. It is vain to suppose, of course, that any human achievement, even the highest, could possibly be of a duration that would entitle it to the word “eternal.” No literary fame, for example, has so far lasted longer than 3,000 years—not even the blinking of the universe’s eyelid. But we humans must live on a human scale and measure things accordingly. The journalist, while he writes his latest article, thinks it of the greatest significance, though he knows perfectly well that it will be forgotten the day after tomorrow, if indeed it is read or noticed at all. Often I have thought to myself, as I write articles, “If only I can be spared until I have finished it,” though I am aware that even I will have forgotten its content by the week after next.

Significance and importance, however, are not natural qualities found inhering in objects or events. Only the appraising mind can impart such meaning. That is why, in my view, the neurosciences are doomed to failure, at least in their more ambitious claims. A mysterious metaphysical realm exists beyond the reach of even the most sophisticated of scanners, even if we cannot specify exactly where that realm is or how it came to be. The physiologist Moleschott, in the nineteenth century, declared that the brain secreted thought like the liver secreted bile; those neuroscientists who tell us that we are about to empty life of its mystery will come to seem as ridiculous, as absurdly presumptuous, as Moleschott seems to us now.

Horowitz tackles these problems in an indirect and gentle fashion. When he talks of the meaning that his work gives to his life, he is not saying to all his readers “Go and do likewise,” because it is clearly not given to everyone to do so (and thank goodness—a world composed of only one kind of person would be unbearable). The satisfaction of work is not, or at least should not be, proportional to the amount of notice it receives in the world. Perhaps the worst effect of celebrity culture is that it makes fame the measure of all things, and thus devalues or renders impossible not only satisfaction from useful but unglamorous labour, but precisely the kinds of pleasures and deep consolations that are to be had from walking a dog.

David Horowitz’s book is a small but important contribution to the revival of the art of dying well, an art from which most of us, both the living and the dying, would benefit. And to die well, we must know first what we have lived for.

Arab Spring, Egyptian edition


CHRISTIAN STUDENT BEATEN TO DEATH FOR WEARING CROSS
A 17-year-old Christian in a high school in Mallawi was ordered by his teacher to cover up a tattoo of a cross on his wrist. True to his faith, he refused to do so and instead exposed a crucifix that he wore around his neck. He was then beaten to death by his teacher and two Muslim students:
According to Ayman’s father, eyewitnesses told him that his son was not beaten up in the school yard as per the official story, but in the classroom. “They beat my son so much in the classroom that he fled to the lavatory on the ground floor, but they followed him and continued their assault. When one of the supervisors took him to his room, Ayman was still breathing. The ambulance transported him from there dead, one hour later.”
Such atrocities have become shockingly common, but have been met with almost complete apathy by Christians outside the Middle East.

An epitaph for the republic


Corporate Collaborators
 Standing with “the 99%” means supporting the destruction of civilized society.
by Mark Steyn
Way back in 1968, after the riots at the Democratic Convention in Chicago, Mayor Daley declared that his forces were there to “preserve disorder.” I believe that was one of Hizzoner’s famous malapropisms. Forty-three years later Jean Quan, mayor of Oakland, and the Oakland city council have made “preserving disorder” the official municipal policy. On Wednesday, the “Occupy Oakland” occupiers rampaged through the city, shutting down the nation’s fifth-busiest port, forcing stores to close, terrorizing those residents foolish enough to commit the reactionary crime of “shopping,” destroying ATMs, spraying the Christ the Light Cathedral with the insightful observation “F**k,” etc. And how did the Oakland city council react? The following day they considered a resolution to express their support for “Occupy Oakland” and to call on the city administration to “collaborate with protesters.”

That’s “collaborate” in the Nazi-occupied-France sense: The city’s feckless political class are collaborating with anarchists against the taxpayers who maintain them in their sinecures. They’re not the only ones. When the rumor spread that the Whole Foods store, of all unlikely corporate villains, had threatened to fire employees who participated in the protest, the regional president, David Lannon, took to Facebook: “We totally support our Team Members participating in the General Strike today — rumors are false!” But, despite his “total support,” they trashed his store anyway, breaking windows and spray-painting walls. As the Oakland Tribune reported:
A man who witnessed the Whole Foods attack, but asked not to be identified, said he was in the store buying an organic orange when the crowd arrived.
There’s an epitaph for the republic if ever I heard one.

The experience was surreal, the man said. “They were wearing masks. There was this whole mess of people, and no police here. That was weird.”

No, it wasn’t. It was municipal policy. In fairness to the miserable David Lannon, Whole Foods was in damage-control mode. Men’s Wearhouse in Oakland had no such excuse. In solidarity with the masses, they printed up a huge poster declaring “We stand with the 99%” and announcing they’d be closed that day. In return, they got their windows smashed.

I’m a proud member of the 1 percent, and I’d have been tempted to smash ’em myself. A few weeks back, finding myself suddenly without luggage, I shopped at a Men’s Wearhouse, faute de mieux, in Burlington, Vt. Never again. I’m not interested in patronizing craven corporations so decadent and self-indulgent that as a matter of corporate policy they support the destruction of civilized society. Did George Zimmer, founder of Men’s Wearhouse and backer of Howard Dean, marijuana decriminalization, and many other fashionable causes, ever glance at the photos of the OWS occupiers and ponder how many of “the 99%” were ever likely to be in need of his two-for-one deal on suits and neckties? And did he think even these dummies were dumb enough to fall for such a feebly corporatist attempt at appeasing the mob?

I don’t “stand with the 99%,” and certainly not downwind of them. But I’m all for their “occupation” continuing on its merry way. It usefully clarifies the stakes. At first glance, an alliance of anarchists and government might appear to be somewhat paradoxical. But the formal convergence in Oakland makes explicit the movement’s aims: They’re anarchists for statism, wild free-spirited youth demanding more and more total government control of every aspect of life — just so long as it respects the fundamental human right to sloth. What’s happening in Oakland is a logical exercise in class solidarity: The government class enthusiastically backing the breakdown of civil order is making common cause with the leisured varsity class, the thuggish union class, and the criminal class in order to stick it to what’s left of the beleaguered productive class. It’s a grand alliance of all those societal interests that wish to enjoy in perpetuity a lifestyle they are not willing to earn. Only the criminal class is reasonably upfront about this. The rest — the lifetime legislators, the unions defending lavish and unsustainable benefits, the “scholars” whiling away a somnolent half decade at Complacency U — are obliged to dress it up a little with some hooey about “social justice” and whatnot.

But that’s all it takes to get the media and modish if insecure corporate entities to string along. Whole Foods can probably pull it off. So can Ben & Jerry’s, the wholly owned subsidiary of the Anglo-Dutch corporation Unilever that nevertheless successfully passes itself off as some sort of tie-dyed Vermont hippie commune. But a chain of stores that sells shirts, ties, the garb of the corporate lackey has a tougher sell. The class that gets up in the morning, pulls on its lousy Men’s Wearhouse get-up, and trudges off to work has to pay for all the other classes, and the strain is beginning to tell.

Let it be said that the “occupiers” are right on the banks: They shouldn’t have been bailed out. America has one of the most dysfunctional banking systems in the civilized world, and most of its allegedly indispensable institutions should have been allowed to fail. But the Occupy Oakland types have no serious response, other than the overthrow of capitalism and its replacement by government-funded inertia.

America is seizing up before our eyes: The decrepit airports, the underwater property market, the education racket, the hyper-regulated business environment. Yet curiously the best example of this sclerosis is the alleged “revolutionary” movement itself. It’s the voice of youth, yet everything about it is cobwebbed. It’s more like an open-mike karaoke night of a revolution than the real thing. I don’t mean just the placards with the same old portable quotes by Lenin et al., but also, say, the photograph in Forbes of Rachel, a 20-year-old “unemployed cosmetologist” with remarkably uncosmetological complexion, dressed in pink hair and nose ring as if it’s London, 1977, and she’s killing time at Camden Lock before the Pistols gig. Except that that’s three and a half decades ago, so it would be like the Sex Pistols dressing like the Andrews Sisters. Are America’s revolting youth so totally pathetically moribund they can’t even invent their own hideous fashion statements? Last weekend, the nonagenarian Commie Pete Seeger was wheeled out at Zuccotti Park to serenade the oppressed masses with “If I Had a Hammer.” As it happens, I do have a hammer. Pace Mr. Seeger, they’re not that difficult to acquire, even in a recession. But, if I took it to Zuccotti Park, I doubt very much anyone would know how to use it, or be able to muster the energy to do so.

At heart, Oakland’s occupiers and worthless political class want more of the same fix that has made America the Brokest Nation in History: They expect to live as beneficiaries of a prosperous Western society without making any contribution to the productivity necessary to sustain it. This is the “idealism” that the media are happy to sentimentalize, and that enough poseurs among the corporate executives are happy to indulge — at least until the window-smashing starts. To “occupy” Oakland or anywhere else, you have to have something to put in there. Yet the most striking feature of OWS is its hollowness. And in a strange way the emptiness of its threats may be a more telling indictment of a fin de civilisation West than a more coherent protest movement could ever have mounted.

Saturday, November 5, 2011

In search of heretics


By Matt Ridley
My topic today is scientific heresy. When are scientific heretics right and when are they mad? How do you tell the difference between science and pseudoscience?
Let us run through some issues, starting with the easy ones.
Astronomy is a science; astrology is a pseudoscience.
Evolution is science; creationism is pseudoscience.
Molecular biology is science; homeopathy is pseudoscience.
Vaccination is science; the MMR scare is pseudoscience.
Oxygen is science; phlogiston was pseudoscience.
Chemistry is science; alchemy was pseudoscience.
Are you with me so far?
A few more examples. That the earl of Oxford wrote Shakespeare is pseudoscience. So are the beliefs that Elvis is still alive, Diana was killed by MI5, JFK was killed by the CIA, 9/11 was an inside job. So are ghosts, UFOs, telepathy, the Loch Ness monster and pretty well everything to do with the paranormal. Sorry to say that on Halloween, but that’s my opinion.
Three more controversial ones. In my view, most of what Freud said was pseudoscience.
So is quite a lot, though not all, of the argument for organic farming.
So, in a sense by definition, is religious faith. It explicitly claims that there are truths that can be found by other means than observation and experiment.
Now comes one that gave me an epiphany. Crop circles.
It was blindingly obvious to me that crop circles were likely to be man-made when I first started investigating this phenomenon. I made some myself to prove it was easy to do.
This was long before Doug Bower and Dave Chorley fessed up to having started the whole craze after a night at the pub.
Every other explanation – ley lines, alien spacecraft, plasma vortices, ball lightning – was balderdash. The entire field of “cereology” was pseudoscience, as the slightest brush with its bizarre practitioners easily demonstrated.
Imagine my surprise then when I found I was the heretic and that serious journalists working not for tabloids but for Science Magazine, and for a Channel 4 documentary team, swallowed the argument of the cereologists that it was highly implausible that crop circles were all man-made.
So I learnt lesson number 1: the stunning gullibility of the media. Put an “ology” after your pseudoscience and you can get journalists to be your propagandists.
A Channel 4 team did the obvious thing – they got a group of students to make some crop circles and then asked the cereologist if they were “genuine” or “hoaxed” – i.e., man-made. He assured them they could not have been made by people. So they told him they had been made the night before. The man was poleaxed. It made great television. Yet the producer, who later became a government minister under Tony Blair, ended the segment of the programme by taking the cereologist’s side: “of course, not all crop circles are hoaxes”. What? The same happened when Doug and Dave owned up; everybody just went on believing. They still do.
Lesson number 2: debunking is like water off a duck’s back to pseudoscience.
In medicine, I began to realize, the distinction between science and pseudoscience is not always easy. This is beautifully illustrated in an extraordinary novel by Rebecca Abrams, called Touching Distance, based on the real story of an eighteenth-century medical heretic, Alec Gordon of Aberdeen.
Gordon was a true pioneer of the idea that childbed fever was spread by medical folk like himself and that hygiene was the solution to it. He hit upon this discovery long before Semmelweis and Lister. But he was ignored. Yet Abrams’s novel does not paint him purely as a rational hero, but as a flawed human being, a neglectful husband and a crank with some odd ideas – such as a dangerous obsession with bleeding his sick patients. He was a pseudoscientist one minute and scientist the next.
Lesson number 3. We can all be both. Newton was an alchemist.
Like antisepsis, many scientific truths began as heresies and fought long battles for acceptance against entrenched establishment wisdom that now appears irrational: continental drift, for example. Barry Marshall was not just ignored but vilified when he first argued that stomach ulcers are caused by a particular bacterium. Antacid drugs were very profitable for the drug industry. Eventually he won the Nobel prize.
Just this month Daniel Shechtman won the Nobel prize for quasicrystals, having spent much of his career being vilified and exiled as a crank. “I was thrown out of my research group. They said I brought shame on them with what I was saying.”
That’s lesson number 4: the heretic is sometimes right.
What sustains pseudoscience is confirmation bias. We look for and welcome the evidence that fits our pet theory; we ignore or question the evidence that contradicts it. We all do this all the time. It’s not, as we often assume, something that only our opponents indulge in. I do it, you do it, it takes a superhuman effort not to do it. That is what keeps myths alive, sustains conspiracy theories and keeps whole populations in thrall to strange superstitions.
Bertrand Russell pointed this out many years ago: “If a man is offered a fact which goes against his instincts, he will scrutinize it closely, and unless the evidence is overwhelming, he will refuse to believe it. If, on the other hand, he is offered something which affords a reason for acting in accordance to his instincts, he will accept it even on the slightest evidence.”
Lesson number 5: keep a sharp eye out for confirmation bias in yourself and others.
There have been some very good books on this recently. Michael Shermer’s “The Believing Brain”, Dan Gardner’s “Future Babble” and Tim Harford’s “Adapt” are explorations of the power of confirmation bias. And what I find most unsettling of all is Gardner’s conclusion that knowledge is no defence against it; indeed, the more you know, the more you fall for confirmation bias. Expertise gives you the tools to seek out the confirmations you need to buttress your beliefs.
Experts are worse at forecasting the future than non-experts.
Philip Tetlock did the definitive experiment. He gathered a sample of 284 experts – political scientists, economists and journalists – and harvested 27,450 different specific judgments from them about the future, then waited to see if they came true. The results were terrible. The experts were no better than “a dart-throwing chimpanzee”.
Here’s what the Club of Rome said on the rear cover of the massive best-seller Limits to Growth in 1972:
“Will this be the world that your grandchildren will thank you for? A world where industrial production has sunk to zero. Where population has suffered a catastrophic decline. Where the air, sea and land are polluted beyond redemption. Where civilization is a distant memory. This is the world that the computer forecasts.”
"Science is the belief in the ignorance of the experts", said Richard Feynman.
Lesson number 6: never rely on the consensus of experts about the future. Experts are worth listening to about the past, but not the future. Futurology is pseudoscience.

Income mobility vs. income inequality


"Time-Lapse Analysis" Instead of Snapshot Shows That 57% of Top 1% in 1996 Weren't There in 2005
By Mark Perry
From the 2007 study "Income Mobility in the U.S. from 1996 to 2005" from the Department of the Treasury (emphasis mine):
"The mobility of the top 1 percent of the income distribution is also important. More than half (57.4 percent) of the top 1 percent of households in 1996 had dropped to a lower income group by 2005 [MP: dropped into the bottom 99%]. This statistic illustrates that the top income groups as measured by a single year of income (i.e., cross-sectional analysis) often include a large share of individuals or households whose income is only temporarily high. Put differently, more than half of the households in the top 1 percent in 2005 were not there nine years earlier. Thus, while the share of income of the top 1 percent is higher than in prior years, it is not a fixed group of households receiving this larger share of income."
MP: The same study shows that almost half (45.6%) of the top 5% in 1996 had moved to a lower income group nine years later in 2005, and roughly 39% of the top 10% in 1996 dropped into a lower income group by 2005. Whether it's the top 1%, top 5%, or top 10%, those income groups are not static, closed groups, but snapshots in just one year of the national income distribution, which is constantly changing over time. A large majority of today's 1% won't be there in the future, and weren't there in the past; they are just making a temporary stop in that group.

As mentioned before, income mobility is far more important than income inequality. Empirical evidence provided in this Treasury Department report and supported by other studies shows that there is significant income mobility in the U.S. for all income groups. And yet all we hear about are the snapshot comparisons of income differentials for income groups in different years, which contain completely different people and households from snapshot to snapshot. When you do a "time-lapse" analysis of the same people or households over time, what you find is significant income mobility, and that finding deserves more attention.
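To see the snapshot-versus-time-lapse distinction in miniature, here is an illustrative simulation; every number in it is invented, and it does not attempt to reproduce the Treasury figures. It simply shows the mechanism the report describes: when income has a transitory component, any single year's top 1% includes many households that are only passing through.

```python
# Simulated panel of households observed in two years. Incomes combine a
# persistent component with year-specific noise, so a single-year "snapshot"
# top 1% contains many households whose income is only temporarily high.
# All parameters are invented for illustration.

import random

random.seed(0)
N = 100_000

persistent = [random.lognormvariate(0, 1.0) for _ in range(N)]
year1 = [p * random.lognormvariate(0, 0.8) for p in persistent]
year2 = [p * random.lognormvariate(0, 0.8) for p in persistent]

def top_1_percent(incomes):
    # Households at or above the 99th-percentile income in that year.
    cutoff = sorted(incomes)[int(0.99 * len(incomes))]
    return {i for i, inc in enumerate(incomes) if inc >= cutoff}

top_y1, top_y2 = top_1_percent(year1), top_1_percent(year2)
stayers = len(top_y1 & top_y2)

print(f"Top 1% households in year 1: {len(top_y1)}")
print(f"Still in the top 1% in year 2: {stayers} "
      f"({stayers / len(top_y1):.1%}); the rest dropped to lower groups.")
```

The snapshot view reports an unchanging "top 1%" in each year; the time-lapse view, following the same household indices across both years, shows a large share of that group turning over, which is the report's point.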

Government gone Insane


Bentley gets $4.8 million UK government grant to increase R&D
The UK Department for Business’s Regional Growth Fund has approved Bentley Motors for a grant of approximately $4.8 million. According to Bentley, the grant will “support the development of a new powertrain application which will enable Bentley to exploit new export markets.” In addition, “It will safeguard over 200 jobs and also create a small number of additional positions within the Company’s 900-strong engineering department.”
“This is a real boost for Bentley which has one of the most highly skilled automotive workforces in the country and, uniquely, has now been awarded two RGF grants,” said CEO Wolfgang Durheimer. “It shows that the Government recognises the importance of Bentley and the contribution we make to high value manufacturing and UK exports.”
This grant follows one previously issued for $2.7 million that went towards an expansion in training and jobs at the brand’s business, manufacturing and development location.
“Our customers expect the very best in terms of exclusivity, technology, quality, and engineering excellence which requires significant ongoing investment in R&D,” Durheimer said. “This grant will help safeguard those operations as we look to develop new powertrains which appeal to new markets and new customers.”

Death By A Thousand Cuts


Capitalism RIP
By Keith Weiner
Capitalism died when they decided to subsidize railroads for the sake of national prestige in the mid 19th century.
Capitalism died when, to compensate for the consequences of subsidized railroads, they passed anti-trust laws in 1890, under which it is illegal to have lower prices, the same prices, and higher prices than one’s competitors.
Capitalism died in 1913 when they started taxing income, and created a central bank.
Capitalism died after 1929 under the flailing interventionism of Hoover.
Capitalism died in 1933 when FDR confiscated the gold of US citizens, outlawed gold ownership, and defaulted on the domestic gold obligations of the US government.
Capitalism died when FDR stacked the Supreme Court, and created a veritable alphabet soup of regulatory agencies that could write law, adjudicate law, and execute law.
Capitalism died when FDR created the welfare state, replete with a Ponzi “retirement system”.
Capitalism died in 1944 when the rest of the world agreed to use the US dollar as if it were gold, at Bretton Woods.
Capitalism died under Johnson’s Great Expansion of FDR’s welfare state (Medicare).
Capitalism died when Kennedy removed silver from coins.
Capitalism died in 1971 when Nixon defaulted on the remaining gold obligations of the US government to foreign central banks.
Capitalism died when rampant expansion of counterfeit credit led to a near-death experience for the US dollar in the 1970s.
Capitalism died when they ended the era where investors paid a firm to rate the debt they were going to buy. Congress enacted a law giving a government-protected franchise to Moody’s, Fitch, and S&P.
Capitalism died when they decided to tax dividends at a higher rate than capital gains, thus distorting capital markets.
Capitalism died when they created Fannie, Freddie, Ginnie, and Sallie.
Capitalism died when in 1981 Reagan and Volcker conspired to begin a long boom by a process of falling interest rates that continues to this very day, destroying inconceivable amounts of capital with every tick either up or (mostly) down.
Capitalism died when Greenspan discovered that market corrections could be overruled by another shot of crack cocaine, i.e. dirt cheap credit effluent, i.e. lowering the rate of interest.
Capitalism died with the growth of laws and court decisions granting legally privileged status to some kinds of employees but not others (and trampling all over the rights of employers). For example, the Americans with Disabilities Act.
Capitalism died with the passage of Medicare Part D.
Capitalism died with the bailouts, stimulus and other lies, deceit, fraud, and theft post 2008.
Capitalism died when Obama set aside hundreds of years of bankruptcy law and precedent to give unions priority in the bankruptcy of GM.
Capitalism died when Obama socialized medicine.
Capitalism died with every new regulatory package for financial markets: Regulation FD in 2000, Sarbanes-Oxley, and now Dodd-Frank. With each one of these, the process is the same. Congress floats an idea publicly to “go after” the banks and dealers and brokers. Then the banks must go to Washington, spend money like water, and 6 months of back-room deals later, a multi-thousand-page document emerges as law. Then the regulatory agency must write regulations, so the banks spend more money, and a year of backroom dealings later, a hundred-thousand-page regulation emerges. Then this is to be enforced by armies of regulators. …
Capitalism died with Zero Interest Rate Forevah(TM).
Capitalism is long since dead. Whatever the name for today’s failed system is, “capitalism” is not that name.