Thursday, December 29, 2011

Sympathy Deformed


Misguided compassion hurts the poor
A scramble for rotting fish: decades of foreign aid have not helped Tanzanians.
By Theodore Dalrymple
To sympathize with those who are less fortunate is honorable and decent. A man able to commiserate only with himself would surely be neither admirable nor attractive. But every virtue can become deformed by excess, insincerity, or loose thinking into an opposing vice. Sympathy, when excessive, moves toward sentimental condescension and eventually disdain; when insincere, it becomes unctuously hypocritical; and when associated with loose thinking, it is a bad guide to policy and frequently has disastrous results. It is possible, of course, to combine all three errors.
No subject provokes the deformations of sympathy more than poverty. I recalled this recently when asked to speak on a panel about child poverty in Britain in the wake of the economic and financial crisis. I said that the crisis had not affected the problem of child poverty in any fundamental way. Britain remained what it had long been--one of the worst countries in the Western world in which to grow up. This was not the consequence of poverty in any raw economic sense; it resulted from the various kinds of squalor--moral, familial, psychological, social, educational, and cultural--that were particularly prevalent in the country.
My remarks were poorly received by the audience, which consisted of professional alleviators of the effects of social pathology, such as social workers and child psychologists. One fellow panelist was the chief of a charity devoted to the abolition of child poverty (whose largest source of funds, like that of most important charities in Britain's increasingly corporatist society, was the government). She dismissed my comments as nonsense. For her, poverty was simply the "maldistribution of resources"; we could thus distribute it away. And in her own terms, she was right, for her charity stipulated that one was poor if one had an income of less than 60 percent of the median national income.
This definition, of course, has odd logical consequences: for example, that in a society of billionaires, multimillionaires would be poor. A society in which every single person grew richer could also be one in which poverty became more widespread than before; and one in which everybody grew poorer might be one in which there was less poverty than before. More important, however, is that the redistributionist way of thinking denies agency to the poor. By destroying people's self-reliance, it encourages dependency and corruption--not only in Britain, but everywhere in the world where it is held.
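To make the arithmetic behind these paradoxes concrete, here is a minimal sketch in Python, using invented income figures rather than any real survey data, of how a purely relative threshold--the charity's "60 percent of the median income"--behaves:

```python
# A minimal sketch (hypothetical figures, not real data) of the relative
# poverty measure described above: "poor" means an income below
# 60 percent of the median income.

def relative_poverty_rate(incomes, threshold=0.6):
    """Share of people whose income falls below threshold * median income."""
    ordered = sorted(incomes)
    n = len(ordered)
    median = (ordered[n // 2] if n % 2
              else (ordered[n // 2 - 1] + ordered[n // 2]) / 2)
    cutoff = threshold * median
    return sum(1 for income in incomes if income < cutoff) / n

# Everyone grows richer, yet measured poverty rises, because incomes spread
# out: the median jumps from 15 to 35, so the cutoff jumps from 9 to 21.
before = [10, 12, 14, 16, 18, 20]
after = [11, 13, 30, 40, 50, 60]
print(relative_poverty_rate(before))  # 0.0 -- nobody is below 9
print(relative_poverty_rate(after))   # 0.333... -- two people are below 21

# Everyone is poorer than in `after`, yet measured poverty falls to zero,
# because incomes bunch together around the (lower) median.
leveled_down = [9, 9, 9, 9, 9, 9]
print(relative_poverty_rate(leveled_down))  # 0.0
```

In the first comparison every individual income rises yet the measured poverty rate goes up; in the second every income falls yet the measured rate drops to zero--which is the sense in which such a definition tracks dispersion rather than deprivation.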
I first started thinking about poverty when I worked as a doctor during the early eighties in the Gilbert Islands, a group of low coral atolls in an immensity of the Central Pacific. Much of the population still lived outside the money economy, and the per-capita GDP was therefore extremely low. It did not seem to me, however, that the people were very poor. Their traditional way of life afforded them what anthropologists call a generous subsistence; their coconuts, fish, and taros gave them an adequate--and, in some respects, elegant--living. They lived in an almost invariant climate, with the temperature rarely departing more than a few degrees from 85. Their problems were illness and boredom, which left them avid for new possibilities when they came into contact with the outside world.
Life in the islands taught me a lively disrespect for per-capita GDP as an accurate measure of poverty. I read recently in a prominent liberal newspaper that "the majority of Nigerians live on less than $1 a day." This statement is clearly designed less to convey an economic truth than to provoke sympathy, evoke guilt, and drum up support for foreign aid in the West, where an income of less than $1 a day would not keep body and soul together for long. Yet Nigeria's population is said to be growing rapidly--which suggests that such figures measure something other than what people there actually have to live on.
As it happens, an island next door (in Pacific terms) to the Gilbert Islands was home to an experiment in the sudden, unearned attainment of wealth. Nauru, a speck in the ocean just ten miles around, for a time became the richest place on earth. The source of its sudden riches was phosphate rock. Australia had long administered the island, and the British Phosphate Commission had mined the phosphate on behalf of Australia, Britain, and New Zealand; but when Nauru became independent in 1968, the 4,000 or so Nauruans gained control of the phosphate, which made them wealthy. The money came as a gift. Most Nauruans made no contribution to the extraction of the rock, beyond selling their land. The expertise, the management, the labor, and the transportation arrived from outside. Within just a few years, the Nauruans went from active subsistence to being rentiers.
The outcome was instructive. The Nauruans became bored and listless. One of their chief joys became eating to excess. On average, they consumed 7,000 calories per day, mainly rice and canned beef, and they drank Fanta and Chateau d'Yquem by the caseload. They became the fattest people on earth, and, genetically predisposed already to the illness, 50 percent of them became diabetic. It was my experience of Nauru that first suggested to me the possibility that abruptly distributing wealth has psychological effects as well as economic ones.
I next spent a few years (1983 to 1986) in Tanzania, a country that presented another experiment in treating poverty as a matter of maldistribution. Julius Nyerere, the first--and, until then, the only--president, had been in charge for more than 20 years. His honorific, Mwalimu--Teacher--symbolized his relation to his country and his people. He had become a Fabian socialist at the University of Edinburgh, and a more red-blooded one (according to his former ally and foreign minister, Oscar Kambona, who fell out with him over the imposition of a one-party socialist state) after receiving a delirious, orchestrated reception in Mao's China.
One can say a number of things in Nyerere's favor, at least by the standards of post-independence African leaders. He was not a tribalist who awarded all the plum jobs to his own kind. He was not a particularly sanguinary dictator, though he did not hesitate to imprison his opponents. Nor was he spectacularly corrupt in the manner of, say, Bongo of Gabon or Moi of Kenya. He was outwardly charming and modest and must have been one of the few people to have had good personal relations with both Queen Elizabeth II and Kim Il-sung.
Nyerere wished the poor well; he was full of sympathy and good intentions. He thought that, being so uneducated, ignorant, and lacking in resources, the poor could not spare the time and energy--and were, in any case, unqualified--to make decisions for themselves. They were also lazy: Nyerere at one point complained about the millions of his fellow countrymen who spent half their time drinking, gossiping, and dancing (which suggested to me that their lives were not altogether intolerable).
But Nyerere knew what to do for them. In 1967, he issued his famous Arusha Declaration, named for the town where he made it, committing Tanzania to socialism and vowing to end the exploitation of man by man that made some people rich and others poor. On this view of things, the greater accumulation of wealth, either by some individuals or by some nations, could be explained only by exploitation, a morally illicit process. The explanation for poverty was simple: some people or nations appropriated the natural wealth of mankind for themselves. It was therefore a necessary condition of improvement, as well as a form of restitution, that they no longer be allowed to do so and that their wealth be redistributed. So Tanzania nationalized the banks, appropriated commercial farms, took over all major industry, controlled prices, and put all export trade under the control of paragovernmental organizations.
There followed the forced collectivization of the rural population--which is to say, the majority of the population--into Ujamaa villages. Ujamaa is Swahili for "extended family"; as Nyerere insisted, all men were brothers. By herding the people into collectivized villages, Nyerere thought, the government could provide services, such as schools and clinics. After all, rich countries had educated and healthy populations; was it not evident that if the Tanzanian people were educated and healthy, wealth would result? Besides, collectively the villagers could buy fertilizer, perhaps even tractors, which they never could have done as individuals (assuming, as Nyerere did, that without government action there would be no economic growth). Unfortunately, the people did not want to herd fraternally into villages; they wanted to stay put on their scattered ancestral lands. Several thousand were arrested and imprisoned.
The predictable result of these efforts at preventing the exploitation of man by man was the collapse of production, pauperizing an already poor country. Tanzania went from being a significant exporter of agricultural produce to being utterly dependent on food imports, even for subsistence, in just a few years. Peasants who had once grown coffee and sold it to Indian merchants for soap, salt, and other goods uprooted their bushes and started growing meager amounts of corn for their own consumption. No reason existed for doing anything else because growers now had to sell their produce to paragovernmental procurement agencies, which paid them later, if at all, at derisory prices in a worthless currency that peasants called "pictures of Nyerere."
Nyerere blamed shortages of such commonplaces as soap and salt on speculators and exploiters, rather than on his own economic policies. He made the shortages the pretext for so-called crackdowns, often directed at Indian traders, which eventually drove them from the country. Nyerere's policies were no more soundly based than those of Idi Amin, who drove out the Indians more brutally. Anti-Semitism, it has often been said, is the socialism of fools. I would put things another way: socialism is the anti-Semitism of intellectuals.
With foreign exchange exhausted, only the funds that the honey-tongued Nyerere continued to obtain from the World Bank and foreign donors enabled the country to avoid mass starvation. By the time I reached Tanzania, the country had become completely dependent on handouts. Aid represented two-thirds of Tanzania's foreign-exchange earnings; one might say that its largest export was requests for such aid. In the rural area where I lived, the people dressed in hand-me-downs sent by European charities. A single egg was a luxury. One of the goals that had induced Nyerere to move to socialism, ironically, was national "self-reliance."
The foreign aid that allowed Nyerere's policies to continue well after the economic disaster was evident had precisely the baleful effects that Peter Bauer, the development economist who contradicted the professional orthodoxies of his time, predicted. The aid immensely increased the power of the sole political party by giving its officials control over scarce goods. When I was in Tanzania, you needed political connections to buy even a bottle of beer--the famous local monopoly brand, Safari, which, the saying went, caused you to pass directly from sobriety to hangover without passing through drunkenness. The regime provided ample opportunities for corruption. Most Tanzanians were slender; you could recognize a party man by his girth.
Thanks to foreign aid, a large bureaucracy grew up in Tanzania whose power, influence, and relative prosperity depended on its keeping the economy a genuine zero-sum game. A vicious circle had been created: the more impoverished the country, the greater the need for foreign aid; the greater the foreign aid, the more privileged the elite; the more privileged the elite, the greater the adherence to policies that resulted in poverty. Nyerere himself made the connection between privilege and ruinous policies perfectly clear after the International Monetary Fund suggested that Tanzania float its currency, the Tanzanian shilling, rather than maintain it at a ridiculously overvalued rate. "There would be rioting in the streets, and I would lose everything I have," Nyerere said.
Long years of living under this perverse regime encouraged economically destructive attitudes among the general population. While I was impressed by the sacrifices that Tanzanian parents were willing to make to educate their children (for a child to attain a certain stage of education, for example, a party official had to certify the parents' political reliability), it alarmed me to discover that the only goal of education was a government job, from which a child could then extort a living from people like his parents--though not actually from his parents, for he would share his good fortune with them. In Tanzania, producing anything, despite the prevailing scarcity of almost everything, became foolish, for it brought no reward.
When I returned to practice among the poor in England, I found my Tanzanian experiences illuminating. The situation was not so extreme in England, of course, where the poor enjoyed luxuries that in Tanzania were available only to the elite. But the arguments for the expansive British welfare state had much in common with those that Nyerere had used to bring about his economic disaster. The poor, helpless victims of economic and social forces, were, like Ophelia in the river, "incapable of their own distress." Therefore, they needed outside assistance in the form of subsidies and state-directed organizations, paid for with the income of the rich. One could not expect them to make serious decisions for themselves.
This attitude has worked destruction in Britain as surely as it has in Tanzania. The British state is today as much a monopoly provider of education to the population as it is of health care. The monopoly is maintained because the government and the bureaucratic caste believe, first, that parents would otherwise be too feckless or impoverished to educate their children from their own means; and second, that public education equalizes the chances of children in an otherwise unequal society and is thus a means of engineering social justice.
The state started to take over education in 1870, largely because the government saw a national competitor, Prussia, employing state power to educate its children. But practically all British children went to school already: according to the calculations of economist and historian E. G. West, 93 percent of the population was by then literate. It is true that the British state had started providing support to schools long before, but in 1870, 67 percent of school income still came from the fees that parents paid.
Not all British children received a good education before the state intervened: that was as vanishingly unlikely then as it is today. But it is clear that poor people--incomparably poorer than anyone in Britain today--were nonetheless capable of making sacrifices to carry out their highly responsible decisions. They did not need the state to tell them that their children should learn to read, write, and reckon. There is no reason to suppose that, left alone, the astonishing progress in the education of the population during the first three-quarters of the nineteenth century would not have continued. The "problem" that the state was solving in its destruction of the voluntary system was its own lack of power over the population.
As in Tanzania, the state-dominated system became self-reinforcing. Because of the high taxation necessary to run it, it reduced the capacity and inclination of people to pay for their own choices--and eventually the habit of making such choices. The British state now decides the important things for British citizens when it comes to education and much else. It is no coincidence that British advocates of the cradle-to-grave welfare state were great admirers of Julius Nyerere--who, incidentally, has been proposed for Roman Catholic canonization, thus bringing close to reality Bauer's ironic reference to him as Saint Julius.
The only time I ever saw Nyerere in person was in Dodoma, the dusty town designated to become Tanzania's new capital. He was expected to drive by, and by the side of the road sat a praise singer--a woman employed to sing the praises of important people. She was singing songs in praise of Nyerere, of which there were many, with words such as: "Father Nyerere, build and spread socialism throughout the country and eliminate all parasites."
The great man drove past in a yellow Mercedes. The praise singer was covered in dust and started to cough.

Wednesday, December 28, 2011

Do No Harm

The Rise of Government and the Decline of Morality


by James A. Dorn 
The recent financial crisis has expanded the power of government. Tea parties have revealed the disillusion of millions of Americans with the rise of government and the decline of morality. The crisis has damaged, unfairly, the vision of market liberalism. It is essential, therefore, to reexamine and articulate the principles of a free society and to understand the danger to liberty that the new progressivism poses.
Since this essay was first presented at the historic Chautauqua Institution in 1995, the federal government has grown in size and scope. Today Congress spends nearly $4 trillion, the federal share of GDP has risen to 25 percent, and the U.S. debt exceeds $12 trillion. Washington has bailed out financial, insurance, and automobile firms while also taking control of the mortgage market. We are now more dependent on government for our health care, pensions, and future than ever before.
Politicians thrive on using other people’s money and promising free lunches. The growth of government has politicized life and weakened the nation’s moral fabric. Government intervention—in the economy, the community, and society—has increased the payoff from political action and reduced the scope of private action. People have become more dependent on the State and have sacrificed freedom for a false sense of security.
One cannot blame government for all of society’s ills, but there is no doubt that economic and social legislation, especially since the mid-1960s, has had a negative impact on individual responsibility. Individuals lose their moral bearing when they become dependent on government. Subsidies, bailouts, and other aspects of the “nanny state” socialize risk and reduce individual accountability. The internal moral compass that normally guides individual behavior will no longer function when the State undermines incentives for moral conduct and blurs the distinction between right and wrong.
More government spending is not the answer to our social, economic, or cultural problems. The task is not to reinvent government or to give politics meaning; the task is to limit government and revitalize civil society. Government meddling will only make matters worse.
If we want to help the disadvantaged, we do not do so by making poverty pay, restricting markets, prohibiting educational freedom, discouraging thrift, and sending the message that the principal function of government is to take care of us. We do so by eliminating social engineering and all kinds of welfare, cultivating free markets, and returning to our moral heritage.
At the beginning of the twentieth century there was no welfare state as we know it. Fraternal and religious organizations flourished. Total government spending was less than 10 percent of GDP, and the federal government’s powers were limited.
Immigrants were faced with material poverty, true, but they were not wretched. There was a certain moral order in everyday life, which began in the home and spread to the outside community. Baltimore’s Polish immigrants provide a good example. Like other immigrants, they arrived with virtually nothing except the desire to work hard and to live in a free country. Their ethos of liberty and responsibility is evident in a 1907 housing report describing the Polish community in Fells Point:
A remembered Saturday evening inspection of five apartments in a house [on] Thames Street, with their whitened floors and shining cook stoves, with the dishes gleaming on the neatly ordered shelves, the piles of clean clothing laid out for Sunday, and the general atmosphere of preparation for the Sabbath, suggested standards that would not have disgraced a Puritan housekeeper.
Yet, according to the report, a typical Polish home consisted “of a crowded one- or two-room apartment, occupied by six or eight people, and located two floors above the common water supply.”
Even though wages were low, Polish Americans sacrificed to save and pooled their resources to help each other by founding building and loan associations, as Linda Shopes noted in The Baltimore Book. By 1929, 60 percent of Polish families were homeowners—without any government assistance.
Dependent, Not Self-Reliant
Today, after spending billions of dollars on anti-poverty programs since the mid-1960s, Baltimore and other American cities are struggling for survival. Self-reliance has given way to dependence and a loss of respect for persons and property.
The inner-city landscape is cluttered with crime-infested public housing and public schools that are mostly dreadful, dangerous, and amoral—where one learns more about survival than virtue. And the way to survive is not to take responsibility for one’s own life and family—which government intervention makes more difficult through occupational licensing, the minimum wage, and other impediments to self-help—but to vote for politicians who have the power to keep the welfare checks rolling.
Dysfunctional behavior now seems almost normal as people are shot daily and births out of wedlock are common. (The replacement of Aid to Families with Dependent Children with Temporary Assistance for Needy Families, as a result of the welfare reform during the Clinton administration, was a bipartisan recognition of the perverse incentives under AFDC.) In addition to the moral decay, high tax rates and regulatory overkill have driven businesses and taxpayers out of the city and slowed economic development. It’s not a pretty picture.
In sum, the growth of government and the rise of the “transfer society” have undermined the work ethic and substituted an ethos of dependence for an ethos of liberty and responsibility. Virtue and civil society have suffered in the process, as has economic progress.
The Founding Fathers recognized that the nature of government is force, and they sought to limit its use to the protection of life, liberty, and property. Markets, both formal and informal, could then be relied on to bring about economic prosperity and social harmony.
In a free society the relationship between the individual and the State is simple. Thomas Jefferson said it well: “Man is not made for the State but the State for man, and it derives its just powers from the consent of the governed.” The fact that the Founders never fully realized their principles should not divert attention from the importance of those principles for a free society and for safeguarding the dignity of all people.
From a classical-liberal perspective, the primary functions of government are to secure “the blessings of liberty” and “establish justice”—not by mandating outcomes, but by setting minimum standards of just conduct and leaving individuals free to pursue their own values within the law. The “sum of good government,” wrote Jefferson, is to “restrain men from injuring one another,” to “leave them . . . free to regulate their own pursuits of industry and improvement,” and to “not take from the mouth of labor the bread it has earned.”
The Jeffersonian philosophy of good government was widely shared in nineteenth-century America. Indeed, Jeffersonian democracy became embodied in what John O’Sullivan, editor of the United States Magazine and Democratic Review, called the “voluntary principle” or the “principle of freedom.” In 1837 he wrote, “The best government is that which governs least . . . . [Government] should be confined to the administration of justice, for the protection of the natural equal rights of the citizen, and the preservation of the social order. In all other respects, the voluntary principle, the principle of freedom . . . affords the true golden rule.”
During the nineteenth century most Americans took it for granted that the federal government has no constitutional authority to engage in public charity (to legislate forced transfers to help some individuals at the expense of others). It was generally understood that the powers of the federal government are delegated, enumerated, and therefore limited, and that there is no explicit authority for the welfare state. From a classical-liberal, or market-liberal, perspective, then, the role of government is not to “do good at the taxpayers’ expense,” but “to prevent harm.”
The general-welfare clause of the Constitution cannot be used to justify the welfare state. That clause simply states that the federal government, in exercising its enumerated powers, should exercise them to “promote the general welfare,” not to promote particular interests. The clause was never meant to be an open invitation to expand government far beyond its primary role of night watchman.
Yet “Progressives” who sought to use government to do good (with other people’s money) overtook the vision of limited government. “Public charity” gradually became the norm. Unlike private charity, however, government transfers always involve coercion or the threat of force. Doing good with other people’s money without their consent is not a virtue but a vice—or, rather, a crime.
The transformation of the framers’ constitutional vision began with the Progressive Era, accelerated with the New Deal, and mushroomed with the Great Society’s war on poverty, which created new entitlements and enshrined welfare rights. Today, more than half the federal budget is spent on entitlements—the largest being Social Security, Medicare, and Medicaid. The newly passed health insurance legislation will add fuel to the fire of the welfare state. The $100 trillion in unfunded liabilities in Social Security and Medicare will place a heavy burden on future generations.
Freedom from Responsibility
During the transition from limited government to the welfare state, freedom has come to mean freedom from responsibility. Such freedom, however, is not true freedom but a form of tyranny, which creates moral and social chaos.
The modern liberal’s vision of government is based on a twisted understanding of rights and justice—an understanding that clashes with the principle of freedom inherent in the higher law of the Constitution. Welfare rights, or entitlements, are “imperfect rights,” or pseudo-rights; they can be exercised only by violating what legal scholars call the “perfect right” to private property. Rights to welfare—whether to food stamps, public housing, health care, or business subsidies—create a legal obligation to help others. In contrast, the right to property, understood in the Lockean sense, merely obligates individuals to refrain from taking what is not theirs.

For the modern liberal, justice refers to “social (or distributive) justice”—an amorphous term, subject to all sorts of abuse if made the goal of public policy, as F. A. Hayek has aptly noted in The Constitution of Liberty and other writings. As a norm for action, the concept of “social justice” leads to uncertainty and competition for government favors. The result is bigger government and corruption. The cost of the pursuit of social justice is the loss of freedom. Instead of creating certainty by limiting the range of government actions under a just rule of law, the modern “liberal” State has produced discord. Indeed, when the role of government is to do good with other people’s money, there is no end to the mischief government can cause.
Many Americans seem to have lost sight of the idea that the role of government is not to instill values but to protect those rights that are consistent with a society of free and responsible individuals. Everyone has a right to pursue happiness, but no one has the right to do so by depriving others of their liberty and their property.
When democracy overreaches, there is no end to the demands on the public purse, and the power of government grows. The Founding Fathers sought to create a republic with limited government, not an unlimited democracy in which the “winners” are allowed to impose their will and vision of the good society on everyone else. In such a system politics becomes a fight of all against all, like the Hobbesian jungle, and nearly everyone is a net loser as taxes rise, deficits soar, and economic growth slows.
Bankrupt in Every Way
Most voters recognize that the welfare state is inefficient and has a built-in incentive to perpetuate poverty. It should be common sense that when government promises something for nothing, demand will grow and so will the welfare state. That has clearly been the case with health care spending under Medicaid and Medicare—and it will be the case with Obamacare. For all the money spent on fighting poverty since 1965, the official poverty rate has remained roughly the same, about 14 percent. Government waste is only part of the problem; the welfare state is also intellectually, morally, and constitutionally bankrupt.
Intellectually bankrupt. It is intellectually bankrupt because increasing the scope of market exchange, not welfare, is the viable way to alleviate poverty. The best way to help the poor is not by redistributing income but by generating economic growth and removing impediments to self-help and mutual aid. Poverty rates fell more before the war on poverty, when economic growth was higher.
The failure of communism shows that any attenuation of private property rights weakens markets and reduces choice. Individual welfare is lowered as a result. The welfare state has attenuated private property rights and weakened the social fabric. When people look to government to provide retirement income, health care, mortgage guarantees, and various business subsidies, private initiative gives way to collectivist thinking. Economic decisions become politicized, and people lean more and more on government.
Morally bankrupt. In addition to being inefficient and intellectually bankrupt, the welfare state is morally bankrupt. In a free society people are entitled to what they own, not to what others own. Yet under the pretense of morality politicians and advocacy groups have created rights out of thin air. The rights to education, health care, housing, a minimum wage, and other “necessities” are now deemed sacrosanct. Politicians have become the high priests of the new State religion of welfare rights and self-proclaimed “benefactors” of humanity. If there is a problem—any problem—Congress is there to solve it, regardless of whether the Constitution gives it the power to do so.
The truth is, “the emperor has no clothes.” Politicians pretend to do good, but they do so through coercion not consent. Politicians put on their moral garb, but there is really nothing there. Government benevolence, in reality, is a naked taking. Public charity is forced charity, or what the great French liberal Frédéric Bastiat called “legal plunder.”
Constitutionally bankrupt. The welfare state is also constitutionally bankrupt; it has no basis in the framers’ constitution of liberty. By changing the role of government from a limited one of protecting persons and property to an unlimited one of achieving “social justice,” Congress, the courts, and presidents have broken their oaths to uphold the Constitution.
In contrast Congressman Davy Crockett, who was elected in 1827, told his colleagues, “We have the right, as individuals, to give away as much of our own money as we please in charity; but as members of Congress we have no right to appropriate a dollar of the public money.”
Polls show that most Americans distrust government and that more young people believe in UFOs than in the future of Social Security. Those sentiments express a growing skepticism about the modern welfare state. President Obama’s election does not mean most Americans have abandoned the principles of the Constitution and are in a rush to move toward a socialist state. What can be done to meet the challenge of safeguarding freedom?
What Can Be Done
First and foremost, we need to expose the intellectual, constitutional, and moral bankruptcy of the welfare state. We need to change the way we think about government and restore an ethos of liberty and responsibility. The political process will then be ready to begin rolling back the welfare state.
Although Americans have grown accustomed to the welfare state, its disappearance would strengthen the nation’s moral fabric and reinvigorate civil society. We should end the parasitic State—not because we want to harm the poor, but because we want to help them help themselves.
The federal government has become bloated and unable to perform even its rudimentary functions. It is awash with debt and is endangering America’s future. The collapse of communism and the failure of socialism should have been warning enough that it is time to change direction.
It is time to limit the size and scope of government and to get the State out of the business of charity. Private virtue, responsibility, and benevolence can then grow naturally along with civil society—just as they did more than 150 years ago when Alexis de Tocqueville wrote in his classic Democracy in America:
"When an American asks for the cooperation of his fellow citizens it is seldom refused, and I have often seen it afforded spontaneously and with great good will. . . . If some great and sudden calamity befalls a family, the purses of a thousand strangers are at once willingly opened, and small but numerous donations pour in to relieve their distress."
The role of government in a free society is not to legislate morality—an impossible and dangerous goal—or even to “empower people”; the role of government is to allow people the freedom to grow into responsible citizens and to exercise their inalienable rights.
The modern liberal’s idea of “good government” has divorced freedom from responsibility and created a false sense of morality. Good intentions have led to bad policy. The moral state of the union can be improved by following two simple rules: “Do no harm” and “Do good at your own expense.” Those rules are perfectly consistent in the private moral universe. It is only when the second rule is replaced by “Do good at the expense of others” that social harmony turns into discord as interest groups compete for scarce resources at the public trough.

An unwanted savior


The World's Worst Human Rights Observer

As Arab League monitors work to expose President Bashar al-Assad's crackdown, the head of the mission is a Sudanese general accused of creating the fearsome "janjaweed," which was responsible for the worst atrocities during the Darfur genocide.
BY DAVID KENNER
For the first time in Syria's nine-month-old uprising, there are witnesses to President Bashar al-Assad's crackdown, which according to the United Nations has claimed more than 5,000 lives. Arab League observers arrived in the country on Dec. 26, and traveled to the city of Homs -- the epicenter of the revolt, where the daily death toll regularly runs into the dozens, according to activist groups -- on Dec. 27. Thousands of people took to the streets to protest against Assad upon the observers' arrival, while activists said Syrian tanks withdrew from the streets only hours before the Arab League team entered the city.
"I am going to Homs," insisted Sudanese Gen. Mohammad Ahmed Mustafa al-Dabi, the head of the Arab League observer mission, telling reporters that so far the Assad regime had been "very cooperative."
But Dabi may be the unlikeliest leader of a humanitarian mission the world has ever seen. He is a staunch loyalist of Sudan's President Omar al-Bashir, who is wanted by the International Criminal Court for genocide and crimes against humanity for his government's policies in Darfur. And Dabi's own record in the restive Sudanese region, where he stands accused of presiding over the creation of the feared Arab militias known as the "janjaweed," is enough to make any human rights activist blanch.
Dabi's involvement in Darfur began in 1999, four years before the region would explode in the violence that Secretary of State Colin Powell labeled as "genocide." Darfur was descending into war between the Arab and Masalit communities -- the same fault line that would widen into a bloodier interethnic war in a few years' time. As the situation escalated out of control, Bashir sent Dabi to Darfur to restore order.
According to Julie Flint and Alex De Waal's Darfur: A New History of a Long War, Dabi arrived in Geneina, the capital of West Darfur, on Feb. 9, 1999, with two helicopter gunships and 120 soldiers. He would stay until the end of June. During this time, he would make an enemy of the Masalit governor of West Darfur. Flint and De Waal write:
Governor Ibrahim Yahya describes the period as ‘the beginning of the organization of the Janjawiid', with [Arab] militia leaders like Hamid Dawai and Shineibat receiving money from the government for the first time. ‘The army would search and disarm villages, and two days later the Janjawiid would go in. They would attack and loot from 6 a.m. to 2 p.m., only ten minutes away from the army. By this process all of Dar Masalit was burned.'
Yahya's account was supported five years later by a commander of the Sudan Liberation Army, a rebel movement in the region. "[T]hings changed in 1999," he told Flint and De Waal. "The PDF [Popular Defense Forces, a government militia] ended and the Janjawiid came; the Janjawiid occupied all PDF places."
Dabi provided a different perspective on his time in Darfur, but it's not clear that he disagrees on the particulars of how he quelled the violence. He told Flint and De Waal that he provided resources to resolve the tribes' grievances, and employed a firm hand to force the leaders to reconcile -- "threatening them with live ammunition when they dragged their feet," in the authors' words. "I was very proud of the time I spent in Geneina," Dabi said.
De Waal said that Yahya, who would become a senior commander for the rebel Justice and Equality Movement (JEM), had "an axe to grind" against the Sudanese military -- but his charge that Dabi spurred the creation of the janjaweed wasn't far off base.
"[T]he army command finds the militia useful and fearsome in equal measure," De Waal said.  "So al-Dabi's regularization of the Arab militia served both to rein them in, but also to legitimize their activities and retain them as a future strike force."
Dabi's role in Darfur is only one episode in a decades-long career that has been spent protecting the interests of Bashir's regime. He has regularly been trusted with authority over the regime's most sensitive portfolios: The day Bashir took power in a coup in 1989, he was promoted to head of military intelligence. In August 1995, after protesters at Khartoum University rattled the regime, Dabi became head of Sudan's foreign intelligence agency -- pushing aside a loyalist of Hassan al-Turabi, the hard-line Islamist cleric who helped Bashir rise to power but would himself be pushed aside several years later. And as civil war ravaged south Sudan, Dabi served from 1996 to 1999 as chief of Sudan's military operations.

The Rise of Networks

The (B)end of History
BY JOHN ARQUILLA 
Where have all the leaders gone? So much has happened in 2011, but there is precious little evidence of world events being guided by a few great men and women. From the social revolution in Egypt's Tahrir Square to the impact of the Tea Party on American politics, and on to the Occupy movement, loose-knit, largely leaderless networks are exercising great influence on social and political affairs.
Networks draw their strength in two ways: from the information technologies that connect everybody to everybody else, and from the power of the narratives that draw supporters in and keep them in, sometimes even in the face of brutal repression such as that practiced by Bashar al-Assad's regime in Syria. Aside from civil society uprisings, this is true of terrorist networks as well. The very best example is al Qaeda, which has survived the death of Osama bin Laden and is right now surging fighters into Iraq -- where they are already making mischief and will declare victory in the wake of the departure of U.S. forces.
The kind of "people power" now being exercised, which is the big story of the past year, is opening a whole new chapter in human history -- an epic that was supposed to have reached its end with the ultimate triumph of democracy and free market capitalism, according to leading scholar and sometime policymaker Francis Fukuyama. When he first advanced his notion about the "end of history" in 1989, world events seemed to be confirming his insight. The Soviet Union was unraveling, soon to dissolve. Freedom was advancing nearly everywhere. Fukuyama knew there would still be occasional unrest but saw no competing ideas emerging. We would live in an age of mop-up operations, such as the 2003 invasion of Iraq -- for which he had initially plumped -- and this year's war to overthrow Libya's Muammar al-Qaddafi. As Fukuyama noted in his famous essay, "the victory of liberalism has occurred primarily in the realm of ideas or consciousness and is as yet incomplete in the real or material world."
Fukuyama is only the latest in a long line of wise people who thought things were "over." From humankind's historical beginnings, a very lively interest in endings has always been apparent. The unknown author of the epic of Gilgamesh, the tale of a ruler of ancient Uruk (in modern Iraq), was the first to focus on the mortality of the individual. He explored questions that were picked up on later by Aristotle, Lucretius, and Aurelius -- about the meaning of existence and what happens after death -- and that have continued to puzzle the thoughtful up into our time. Others have looked at "the end" from a wider, world-encompassing perspective -- most dramatically depicted in the "revelations" envisioned by Christian Apocalyptic literature. The Mayans, too, thought very much about endings. Their "long-count" calendar is famously set to terminate on Dec. 21, 2012.
The larger sweep of world events has often been incorporated into these "endist" views as well. Genghis Khan's Mongol hordes, the "Tatars," were so named by Christians who believed that these all-conquering riders had come from the nether world, Tartarus, to announce the looming end of times. Tolstoy's character from War and Peace, Pierre Bezukhov, spent a lot of time and effort attaching numerical values to Napoleon's name -- to see whether the Corsican had the "number of the Beast" (666). Hitler also had his turn in the dock as a candidate anti-Christ. All of them proved false, however, and the end never quite came.
Many have expressed doubts about the latest "end of history" thesis, and Fukuyama himself has mused that, even if some kind of inflection point has been reached, history could well continue on in some new vein. In this he might be right. For it is possible -- indeed, more appropriate -- to look at world events from a point of view that considers "endings" as not so final.
Instead there are historical turnings after which what was recedes and what is and will persist flourishes -- a world less driven by the apocalyptic, one more attuned to the epochal. It could be argued that the Bible takes this view: The Flood in Genesis ushers in not the end but a new beginning; the Second Coming in Revelation features travail, but also a 1,000-year era of peace. Even J.R.R. Tolkien's saga of Middle-earth sees "the end" as a new beginning -- as does the Mayan long-count calendar.
So it may be now. But just what is ending? And what is beginning? In terms of world affairs, I see that a great turning has occurred: A process that began in the 16th century reached its climax at the end of the millennium. There was a protracted struggle during this period between empires and the nation-states that rose up, fought against, and eventually defeated them.
Before the start of the long wars between empires and nations -- i.e., for all of recorded history from Sargon of Akkad to Philip II of Spain -- all great events were driven by empires that fed on the territory, resources, and labor of others. Persian, Greek, Roman, Moorish, Ottoman, Mongol, Mughal -- with few exceptions, these and other empires were the arbiters of events. But in the 1500s, a sense of nationalism began to emerge in some places, most notably

Might is Right

Abusing History?
China’s mix of historical and legal claims in the South China Sea is inconsistent, says Frank Ching. Beijing can’t have its cake and eat it.
By Frank Ching
US scholar Lucian Pye once famously said that China was not a country but ‘a civilization pretending to be a state.’ That may have been apt at one time, but today’s China has been transformed into a modern state that plays an active role in international forums.
However, China also tries to capitalize on its long history when pressing its case in international disputes. Nowhere is this more clear than in the current South China Sea territorial dispute, which pits China against several of its neighbours. Also embroiled in the various rows are the United States, India and, increasingly, Japan. It’s a potent mix.
In 1996, Beijing ratified the UN Convention on the Law of the Sea (UNCLOS) and publicly embraced the treaty’s provision that ‘China shall enjoy sovereign rights and jurisdiction over an exclusive economic zone of 200 nautical miles and the continental shelf’ – a hitherto unknown concept.
At the same time, however, it reaffirmed its claim over the islets, rocks and reefs in the South China Sea on historical grounds—grounds that aren’t recognized by the convention. That is to say, China claims all the rights granted under international law today and, in addition, claims rights that aren’t generally recognized because its civilization can be traced back several thousand years.
Historically, China was the dominant power in East Asia and considered lesser powers as its tributaries. By insisting now on territorial claims that reflect a historical relationship that vanished hundreds of years ago with the rise of the West, Beijing is, in a sense, attempting to revive and legitimize a situation where it was the unchallenged hegemon.
The ambiguity about what parts of international law China recognizes and which bits it doesn’t gives rise to the current dispute, which directly involves Vietnam, the Philippines, Malaysia and Brunei, and indirectly involves the interests of many other nations.
The claims made by Southeast Asian countries rest primarily on the provisions of the Law of the Sea. China, however, is taking the position that its sovereignty over the territories concerned precedes the enactment of the Law of the Sea, and so the law doesn’t apply. History trumps law.
In 2009, China submitted a map to the UN Commission on the Limits of the Continental Shelf in support of its claims to ‘indisputable sovereignty over the islands of the South China Sea and the adjacent waters’ as well as ‘the seabed and subsoil thereof.’
The map featured a U-shaped dotted line that encompassed virtually the entire South China Sea and hugged the coasts of neighbouring countries including Vietnam, Malaysia and the Philippines. This was the first time China had submitted a map to the United Nations in support of its territorial claims, but there was no explanation given as to whether it claimed all the waters as well as the islands enclosed by the dotted line.
This was a radical departure from the position China took when it ratified the treaty. Back then, China said that it would hold consultations ‘with the states with coasts opposite or adjacent to China respectively on the basis of international law and in accordance with the principle of equitability.’
Significantly, especially for the United States, China’s position on UNCLOS has also shifted in another respect. In 1996, it took the position that foreign warships required its approval in order to pass through China’s territorial waters. Now, China says that foreign warships must obtain its approval before they can pass through its exclusive economic zone – a much wider area that isn’t part of its sovereign waters.
The United States disputes that position, maintaining that waters in a country’s EEZ are part of the high seas and that naval vessels are free to enter them and even conduct operations without any need for approval.
This difference in opinion between China and the United States (as well as most developed countries) has led to confrontations between the two countries, with US naval surveillance vessels carrying out information-gathering missions in China’s EEZ and being challenged by the Chinese.
China’s resort to history is a relatively new development in international law, although it isn’t completely unprecedented. For example, coastal states have been allowed to claim extended jurisdiction over waters, especially bays or islands, when those claims have been open and long-standing, exclusive, and widely accepted by other states.
In China’s case, however, its claims are evidently neither exclusive nor widely accepted by other states since they are being openly contested. Still, Chinese officials and scholars have attempted to buttress their arguments by appealing to historical records.
For example, Li Guoqiang, a research scholar with the Research Center for Chinese Borderland History and Geography of the Chinese Academy of Social Sciences, wrote in July in the China Daily: ‘Historical evidence shows that Chinese people discovered the islands in the South China Sea during the Qin (221-206 BC) and Han (206 BC-AD 220) dynasties.’ China’s maritime boundary, he asserts, was established by the Qing dynasty (1644-1911).
‘In contrast,’ he wrote, ‘Vietnam, Malaysia and the Philippines hardly knew anything about the islands in the South China Sea before China’s Qing Dynasty.’
Vietnam, in pressing its case, has cited maps and geography attesting to its ‘historical sovereignty’ over the Paracel and Spratly islands going back to the 17th century. This doesn’t match the antiquity of China’s claims, but, at the very least, it shows that Chinese claims have been contested for centuries, and that China didn’t enjoy exclusive and continuous jurisdiction over these islands.
And, if history is to be the criterion, which period of history should be decisive? After all, if the Qin or Han dynasty is to be taken as the benchmark, then China’s territory today would be much smaller, since at the time it had not yet acquired Tibet, Xinjiang or Manchuria, now known as the northeast.
One compromise that China has offered to its neighbours is to shelve the territorial disputes and engage in joint development of natural resources. This was proposed by President Hu Jintao as recently as August 31, when he met the Philippine President Benigno Aquino.
However, there are serious problems. Just what does China mean by this policy?
The Chinese Foreign Ministry website explains: ‘The concept of “setting aside dispute and pursuing joint development” has the following four elements:
‘1. The sovereignty of the territories concerned belongs to China.
‘2. When conditions are not ripe to bring about a thorough solution to territorial dispute, discussion on the issue of sovereignty may be postponed so that the dispute is set aside. To set aside dispute does not mean giving up sovereignty. It is just to leave the dispute aside for the time being.
‘3. The territories under dispute may be developed in a joint way.
‘4. The purpose of joint development is to enhance mutual understanding through cooperation and create conditions for the eventual resolution of territorial ownership.’
These four points make it clear that instead of shelving the territorial disputes, the idea of joint development is China’s way of imposing its claims of sovereignty over the other party. Chinese sovereignty is the stated desired outcome of any joint development. No wonder that no country has taken China up on its proposal.
Perhaps because of the conflict between historical claims and UNCLOS, other Chinese scholars are now calling for a review of the Law of the Sea.
Li Jinming, a professor at the Center for Southeast Asia Studies at Xiamen University, says that there are ‘shortcomings’ in UNCLOS and, as a result, ‘China should consider its own situation before enforcing UNCLOS.’ That is to say, even though China has ratified the treaty, which has been in effect for 17 years, Beijing shouldn’t abide by its provisions unless the convention is somehow revised to support China’s territorial claims.
Beijing, it appears, wants to be made an exception in international law. It wants to have its cake and eat it. But law is law. What is the point of having international law when it is no longer international, and when it is no longer law?

Tools of the Trade

The Art and Science of Pseudology
by Thomas Szasz 
The common belief that the scientist’s job is to reveal the secrets of nature is erroneous. Nature has no secrets; only persons do. Secrecy implies agency, which is absent in nature. This is the main reason the so-called “behavioral sciences” are not merely unlike the physical sciences but are in many ways their opposites.
“Nature,” observed Thomas Carlyle (1795-1881), “admits no lie.” While nature neither lies nor tells the truth, persons habitually do both. As the famous French mathematician and philosopher Antoine Augustin Cournot (1801-1877) observed, “It is inconceivable that [in the science of politics] telling the truth can ever become more profitable than telling lies.” Indeed, deception and prevarication are indispensable tools for the politician and the psychiatrist—experts expected to explain, predict, and prevent unwanted human behaviors.
The integrity of the natural scientific enterprise depends on truth-seeking and truth-speaking by individuals engaged in activities we call “scientific,” and on the scientific community’s commitment to expose and reject erroneous explanations and false “facts.” In contrast, the stability of political organizations and of the ersatz religions we call “behavioral sciences” depends on the loyalty of their practitioners to established doctrines and institutions and on the rejection of truth-telling as injurious to the welfare of the group that rests on its commitment to fundamental falsehoods. Not by accident, we call revelations of the “secrets” of nature “discoveries,” and revelations of the secrets of powerful individuals and institutions “exposés.”
Because nature is not an agent, many of its workings can be understood by observation, reasoning, experiment, measurement, and calculation. Deception and divination are powerless to advance our understanding of how the world works; indeed, they preempt, prevent, and substitute for such understanding.
Psychiatry is one of the most important institutions of modern American society. Understanding modern psychiatry—the historical forces and the complex economic, legal, political, and social principles and practices that support it—requires understanding the epistemology of imitation and the sociology of distinguishing “originals” from “counterfeits.” With respect to disease, the process consists of two parts: One part is separating persons who suffer from demonstrable bodily diseases from those who do not, but pretend or claim to; another part is separating physicians who believe it is desirable to distinguish between illness and health, sick persons and healthy, from physicians who reject this desideratum and insist that everyone who acts or claims to be sick has an illness and deserves to be treated. In an effort to clarify the difference between medicine and psychiatry—between real medicine and fake medicine—I proposed a satirical definition of psychiatry, slightly revised as follows:
The subject matter of psychiatry is neither minds nor mental diseases, but lies, beginning with the names of the participants in the transaction—the designation of one party as “patient,” even though he is not ill, and the other party as “therapist” even though he is not treating any illness. The lies continue with the deceptions that comprise the subject matter proper of the discipline—the psychiatric “diagnoses,” “prognoses,” and “treatments”—and end with the lies that, like shadows, follow ex-mental patients through the rest of their lives—the records of denigrations called “depression,” “schizophrenia,” or whatnot, and of imprisonments called “hospitalization.” If we wished to give psychiatry an honest name, we ought to call it “pseudology,” or the art and science of lies and lying.
The imitation of illness is memorably portrayed by Molière (1622–1673) in his famous comedy, The Imaginary Invalid (Le malade imaginaire). The main character is a healthy individual who wants to be treated as if he were sick by others, especially doctors. Since those days, we in the West have undergone an astonishing cultural-perceptual transformation of which we seem largely, perhaps wholly, unaware. Today medical healing is regarded as a form of applied science. At the same time, the medical profession defines imaginary illnesses as real illnesses, in effect abolishing the notion of pretended illness: Officially, malingering is now a disease “just as real” as melanoma.
The view that pretending to be mentally ill is itself a form of mental illness became psychiatric dogma during World War II. Kurt R. Eissler (1908-1999), then the quasi-official pope of the Freudian faith in America, declared: “It can be rightly claimed that malingering is always the sign of a disease often more severe than a neurotic disorder. . . . The diagnosis should never be made but by the psychiatrist.” Now, more than 50 years later, this medicalized concept of malingering is an integral part of the mindset of every well-trained, right-thinking Western psychiatrist. For example, Phillip J. Resnick, a leading American forensic psychiatrist, declares: “Detecting malingered mental illness is considered an advanced psychiatric skill, partly because you must understand thoroughly how genuine psychotic symptoms manifest.”
In World War I, soldiers afraid of being killed in battle malingered; psychiatrists who wanted to protect them from being returned to the trenches diagnosed them as having a mental illness, then called "hysteria." Today, almost a hundred years later, soldiers returning home and afraid of being without "health care coverage" diagnose themselves as having a mental illness, called "post-traumatic stress disorder" (PTSD). According to one report, almost 50 percent of the troops returning from Iraq suffer from PTSD and depression "because they want to make sure that they continue to get health care coverage once their deployments have ended" (Syracuse Post-Standard, Nov. 25, 2007, E1).
Psychiatrists and the science writers they deceive—and who eagerly deceive themselves—love to dwell on how far psychiatrists have "progressed" from their past practices. They have indeed, if we consider creating ever more mental illnesses/psychiatric diagnoses "progress." Today psychiatrists assert that the person who regards himself as a mental patient suffers from a bona fide illness and laud him for his insight into his "having a disease" and "need for treatment." At the same time, they lament the person who "denies" his mental illness, his "lack of insight" into being ill, and his "negative attitudes toward treatment seeking." For example, from the International Journal of Eating Disorders we learn: "Considering that males have negative attitudes toward treatment-seeking and are less likely than females to seek treatment, efforts should be made to increase awareness of eating disorder symptomatology in male adolescents."
Counterfeit art is forgery. Counterfeit testimony is perjury. But counterfeit illness is still illness—mental illness, officially decreed “an illness like any other.” The consequences of this policy—economic, legal, medical, moral, personal, philosophical, political, and social—are momentous: counterfeit disability, counterfeit disease, counterfeit doctoring, counterfeit rehabilitation, and the bureaucracies, courts, industries, and professions studying, teaching, practicing, administering, adjudicating, and managing them make up a substantial part of the national economies of modern Western societies and of the professional lives of the individuals in them.