Tuesday, May 17, 2011

Nearly Half Of Detroit’s Adults Are Functionally Illiterate, Report Finds

100,000 Detroit High School Graduates Are Illiterate

by BigFurHat 
Detroit Literacy
Detroit’s population fell by 25 percent in the last decade. And nearly half of those who stuck around are functionally illiterate, a new report finds.
According to estimates by The National Institute for Literacy, roughly 47 percent of adults in Detroit, Michigan — 200,000 total — are “functionally illiterate,” meaning they have trouble with reading, speaking, writing and computational skills. Even more surprisingly, the Detroit Regional Workforce Fund finds that half of that illiterate population has obtained a high school diploma.
The DRWF report places particular focus on the lack of resources available to those hoping to better educate themselves, with fewer than 10 percent of those in need of help actually receiving it. Only 18 percent of the programs surveyed serve English-language learners, despite 10 percent of the adult population of Detroit speaking English “less than very well.”
Additionally, the report finds, one in three workers in the state of Michigan lacks the skills or credentials to pursue additional education beyond high school.
In March, the Detroit unemployment rate hit 11.8 percent, one of the highest in the nation, the U.S. Bureau of Labor Statistics reported last month. There is a glimmer of hope, however: Detroit’s unemployment rate dropped by 3.3 percentage points in the last year alone.
(Comment: Maybe it just dropped because the illiterate couldn’t fill out the unemployment forms.)
Detroit Mayor Dave Bing and Michigan Governor Rick Snyder have been aggressively attempting to reinvent the once-great Motor City. Last year, the Wall Street Journal reported that then newly-elected Mayor Bing planned to tear down 10,000 of the city’s 90,000 abandoned properties.

Crony capitalists and corrupt politicians unite

By Lori Aratani

Fans of cheap rotisserie chicken and bulk toilet paper can rejoice. It looks as if a new Costco will be coming to Wheaton in 2012.
The Montgomery County Council defeated a proposal Monday that would have blocked the county from giving millions in funding to shopping mall giant Westfield to help secure the deal.
The vote is a significant victory for County Executive Isiah Leggett (D), who had lobbied for a plan that will give Westfield $4 million over two years. The subsidy raised eyebrows in some quarters because it comes as the county faces a $300 million budget shortfall for its next fiscal year and is cutting a variety of programs.
Leggett’s plan received reluctant approval from council members last fall, but momentum seemed to shift after two new members — Hans Riemer (D-At Large) and Craig Rice (D-Upcounty) — were elected in November.
The deal appeared to be in jeopardy last month, but the Leggett administration moved to make its case. Officials argued that pulling out of the agreement could damage the county’s reputation and undermine efforts to attract businesses.
That seemed to sway Riemer and Rice as well as Council President Valerie Ervin (D-Silver Spring).
“I don’t think [Westfield needs] the money to bring Costco to Wheaton,” Riemer said Monday. “But the integrity of the county is at stake, and I don’t think it’s my right to jeopardize the integrity of the county.”
Six members voted against a measure that would have blocked the award of $2 million, half of the proposed incentive: Ervin, Rice, Riemer, Roger Berliner (D-Potomac-Bethesda), Nancy Floreen (D-At Large) and George L. Leventhal (D-At Large). Three voted for it: Marc Elrich (D-At Large), Nancy Navarro (D-Eastern County) and Phil Andrews (D-Gaithersburg-Rockville).
“This has been a long and tortured experience with Westfield and Costco,” said Ervin, who had previously opposed offering Westfield the subsidy. In the end, she said she thought it would set a bad precedent to change course after promising the money.
Council members opposed to the subsidy said the county might have put its pride ahead of its people.
“The idea of giving $2 million of our scarce dollars to Westfield is a mistake,” said Elrich, who led the opposition to the subsidy. “We can take this $2 million and put it to better use in our community.”
As a condition of receiving the money, Westfield officials must work closely with the neighborhood to address concerns about noise, traffic and other possible impacts.
Monday’s vote did not include a decision on whether the project will include a gas station, which many residents who live near the shopping center oppose. A decision on that part of the project will move through a separate process later this year.
Under the plan, Westfield will receive $2 million in 2011 and $2 million in 2012 to pay for construction costs related to Costco’s move to Westfield Wheaton mall. Costco will take the second-floor space that was occupied by a Hecht’s department store. Money from the county will be used to pay for renovations to the first floor of that space so the mall’s owners can try to attract a second tenant.
Under an agreement reached by Costco and Westfield in July, a store could open early next year.
Leggett has touted the project as a way to bring jobs and revenue to the eastern part of the county. County officials say rebuilding the former Hecht’s space, which has been vacant since 2006, to accommodate Costco and other retailers would bring up to 300 construction jobs and 475 retail jobs to the area at a time when such jobs are scarce.

The Sanctification of Awful Men

by Dr Zero
Saturday brought the bizarre saga of Sweden announcing a rape charge against WikiLeaks founder Julian Assange, then withdrawing the warrant within a matter of hours, downgrading the international media hurricane to a tropical storm of “molestation” charges.  Molestation isn’t “severe” enough to get you arrested in Sweden, so it was all much ado about nothing.
Some have speculated this was more than just a bureaucratic snafu.  Was the Swedish government co-operating with the military and intelligence services of the United States, hoping to discredit Assange with false rape charges?  I hope nobody working for the CIA is incompetent enough to believe that would work.  Even hard evidence of rape would not “discredit” a hero of the international Left.
The murder of police officer Daniel Faulkner wasn’t enough to “discredit” Mumia Abu-Jamal.  His release from prison remains a romantic obsession of the hard-core Left, which sees no reason for a soul brimming with the people’s poetry to rot in stir over one little dead cop.
The Left seethes in frustration that small, judgmental minds continue to hold Roman Polanski’s assault of an underage girl against him.  His fashionable politics and artistic talent should have long ago erased the memory of that messy business at Jack Nicholson’s place.  In a 1979 interview, Polanski wailed, “If I had killed somebody, it wouldn’t have had so much appeal to the press, you see? But… f—ing, you see, and the young girls. Judges want to f— young girls. Juries want to f— young girls. Everyone wants to f— young girls!”  He was wrong about the “killing somebody” part, as Mumia Abu-Jamal could explain.

Those foolish enough to take Michael Moore seriously are happy to swallow hypocrisy and deceit that would be obvious to a small child.  They are not repulsed by the spectacle of a greedy man making millions from selling them propaganda designed to keep them bitter and poor.  Profiting from lies told in the service of a “larger truth” does not “discredit” him.
The Left is happy to watch people like Al Gore rake in billions from the global-warming scam.  No amount of hard data, or evidence of fraud, will discredit the clergy of the Church of Global Warming.  Their sacred ideal is the construction of an absolute international authority, empowered to defend the Earth from grubby little people who keep asserting privileges that should be reserved for the elite, such as driving cars.  No action taken in the service of this ideal can “discredit” the priesthood.

Following the government’s nutritional advice can make you fat and sick

The Washington Diet
By Steven Malanga
Last October, embarrassing e-mails leaked from New York City’s Department of Health and Mental Hygiene disclosed that officials had stretched the limits of credible science in approving a 2009 anti-obesity ad, which depicted a stream of soda pop transforming into human fat as it left the bottle. “The idea of a sugary drink becoming fat is absurd,” a scientific advisor warned the department in one of the e-mails, a view echoed by other experts whom the city consulted. Nevertheless, Gotham’s health commissioner, Thomas Farley, saw the ad as an effective way to scare people into losing weight, whatever its scientific inaccuracies, and overruled the experts. The dust-up, observed the New York Times, “underlined complaints that Dr. Farley’s more lifestyle-oriented crusades are based on common-sense bromides that may not withstand strict scientific scrutiny.”
Under Farley and Mayor Michael Bloomberg, New York’s health department has been notoriously aggressive in pursuing such “lifestyle-oriented” campaigns (see the sidebar below). But America’s public-health officials have long been eager to issue nutrition advice ungrounded in science, and nowhere has this practice been more troubling than in the federal government’s dietary guidelines, first issued by a congressional committee in 1977 and updated every five years since 1980 by the United States Department of Agriculture. Controversial from the outset for sweeping aside conflicting research, the guidelines have come under increasing attack for being ineffective or even harmful, possibly contributing to a national obesity problem. Unabashed, public-health advocates have pushed ahead with contested new recommendations, leading some of our foremost medical experts to ask whether government should get out of the business of telling Americans what to eat—or, at the very least, adhere to higher standards of evidence.
Until the second half of the twentieth century, public medicine, which concerns itself with community-wide health prescriptions, largely focused on the germs that cause infectious diseases. Advances in microbiology led to the development of vaccines and antibiotics that controlled—and, in some cases, eliminated—a host of killers, including smallpox, diphtheria, and polio. These advances dramatically increased life expectancy in industrialized countries. In the United States, average life expectancy improved from 49 years at the beginning of the twentieth century to nearly 77 by the century’s end.
As the threat of communicable diseases receded, public medicine began to turn its attention to treating and preventing health problems that weren’t germ-caused, such as chronic heart disease and strokes, the death rates for which seemed to be soaring after World War II. Some observers cautioned that the apparent increase might be the result of diagnostic advances, which had improved doctors’ ability to detect heart ailments. This possibility, however, failed to deter the press and advocacy groups like the American Heart Association from declaring the arrival of a frightening epidemic.

Monday, May 16, 2011

How EU officials simply forgot about Christmas

The European oligarchy’s failure to include Christmas in a diary for schoolkids sums up their separation from the demos.
by Frank Furedi 
A year ago the European Commission (EC) printed more than three million school diaries for distribution to students. They are lovely diaries which, true to the EU’s multicultural ethos, helpfully note all the Sikh, Hindu, Muslim and Chinese festivals. The diary also highlights Europe Day, which falls on 9 May. But the diary is not without some very big gaps. For example, it makes no reference to Christmas - or Easter or indeed to any Christian holidays.
However, the importance of 25 December is not entirely ignored. At the bottom of the page for that day, schoolchildren are enlightened with the platitude: ‘A true friend is someone who shares your concern and doubles your joy.’
Not surprisingly, many Europeans are not exactly delighted by the conspicuous absence of Christian festivals from a diary produced for children. In January, an Irish priest complained to the ombudsman of the EC and demanded an apology for the omission of Christian holidays and the recall of the diaries. A month later, the commission apologised for its ‘regrettable’ blunder. However, the ombudsman dismissed the demand to recall the diaries, arguing that a one-page correction sent to schools had rectified the error.

Negative Productivity

by Anthony de Jasay

As of January 1, 2011, the French "legal" minimum wage or smic was raised by 1.6 percent to 9 euros ($12) an hour or 1,365 euros per month. About 10 percent of wage-earners, condescendingly called smicards, are paid the minimum wage. Their take-home pay is amputated by what is called "their" contribution to "social" insurance. Of course the remaining and greater part of "social" insurance premiums is just as much "their" contribution, but for cosmetic reasons is called the employer's contribution. The division into employer and employee contributions is stark economic nonsense. Both parts come out of the pay the worker would get if there were no compulsory insurance or if he paid the premium directly rather than through the employer paying it on his behalf. However, many or most workers fall for the cosmetic and live with the illusion that the benevolent, "socially" just government orders the employers to give them something on top of the wage.
With retail prices lifted by a value-added tax of 19.6 percent, the purchasing power of the smic is hardly above the bare subsistence level in an urban environment. People who have a heart must find it shamefully low. Yet people with a head regretfully find it too high; for within the great mass of unemployed, the proportion of the unskilled who would be candidates for the smic is much higher than the average, i.e. the skilled and the unskilled taken together. Average unemployment is at 9.5 percent, but among the unskilled it can locally be 25 percent or more. Is then the minimum wage too low or too high?
The answer is that it is both, due largely to the caring, "socially"-just hand of the government. Paying 1,365 euros a month to a smicard costs his or her employer anything between 2,200 and 2,500 euros due to the highly complex social insurance schemes whose premiums the employer pays on his employees' behalf. The result is that it is cheaper to go capital-intensive, install automatic checkout counters in supermarkets, automatic ticket controllers at subway stations and giant street-cleaning machines to sweep the streets, rather than employ smicards to do these lowly tasks.
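The arithmetic above can be checked in a few lines of Python. The figures come from the article itself; note that the 2,200–2,500 euro employer-cost range is the author's estimate, not an official statistic. The calculation shows the implied "tax wedge," i.e. the share of total labor cost that never reaches the worker:

```python
# Back-of-the-envelope sketch of the tax wedge described above.
# NET_MONTHLY is the smicard's take-home pay; the employer-cost
# range (2,200-2,500 euros) is the article author's estimate.

NET_MONTHLY = 1365  # euros per month

for employer_cost in (2200, 2500):
    # Share of the employer's total outlay that goes to social
    # insurance premiums rather than the worker's payslip
    wedge = (employer_cost - NET_MONTHLY) / employer_cost
    print(f"Employer pays {employer_cost} euros -> wedge of {wedge:.0%}")
```

By this reckoning, somewhere between roughly 38 and 45 percent of what the employer pays out never reaches the smicard, which is exactly what makes the automatic checkout counters and street-cleaning machines comparatively cheap.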
Suppose for a near-delirious moment that freedom of contract is suddenly and miraculously recognised as a firm rule. Among many other things, "social" insurance against illness, old age and unemployment ceases to be compulsory. The wage, with or without deductions for insurance, would become freely negotiable. Would a smicard give up all his entitlements under the various "social" schemes in exchange for a rise in his take-home pay? Some would not at almost any price, but some, probably many, would rather take a raise from 1,365 to 2,000 euros in cash than persist with the old system. The effect on employment of the unskilled might be very substantial indeed.

Minor illusions

“With or without the depression Wallace Carothers would have invented nylon.”
—Alexander J. Field, A Great Leap Forward



One of the themes of Alexander J. Field's impressive new book, A Great Leap Forward: 1930s Depression and U.S. Economic Growth, is that technological progress did not come to a halt during the Great Depression. On the contrary, he claims that the 1930s were the most technologically progressive decade in our history.
Field poses and attempts to answer interesting questions using straightforward number-crunching and reasoning, rather than resorting to obscure mathematics or advanced statistics. The result is a book that represents the best of what economics can be. I will attempt to sketch some of his key ideas in this essay, but I recommend the entire book to anyone with an interest in U.S. economic history, macroeconomics, or economic growth.
Field describes the last 90 years or so of economic history in terms of six eras: the Twenties, the Depression, World War II, the Golden Age, the Slowdown, and the Tech Boom. These are summarized in the following table.
Field devotes at least one chapter to each era. He chooses the endpoints for the eras as cyclical peaks. This reflects a presumption that there are two broad factors affecting productivity: a cyclical factor, which reflects the state of aggregate demand; and secular factors, which vary by era, that affect the supply side of the economy.
Nearly the entire book is concerned with interpreting the secular or supply-side factors. Only one chapter looks at cyclical patterns, where Field finds a positive relationship between productivity and the level of labor utilization. His hypothesis is that most firms are optimized for a high level of output, so that productivity falls when the economy slumps, due to lower utilization rates for fixed assets, such as warehouses and hotels.
One interesting question, particularly given the current slump, is whether downturns have permanent economic effects, for good or ill. Field concludes that the inventions and innovations that drive changes in the standard of living seem to be independent of cyclical forces. However, this answer is already so embedded in his basic assumptions that the issue can hardly be called decided.

What if you had to buy American?

It might be supremely patriotic to stop purchasing imports, but the consequences for US consumers and the economy would be devastating.

Image: Made in USA garment label © David Engelhardt, Getty Images

Legions of patriotic Americans look for "made in USA" stickers before buying products, out of a desire to support the country's economy.
But what if we all were restricted to purchasing only those goods that were made in America?
Our homes would be stripped virtually bare of telephones, televisions, toasters and other electronics, and many of our favorite foods and toys would be gone, too. Say goodbye to your coffee or tea, and forget about slicing bananas into your breakfast cereal -- all three would become prohibitively expensive if we relied on only Hawaii to grow tropical crops.
We'd have to trash our beloved Apple products because the iPod, iPad and MacBook aren't made in the U.S. Gasoline would double or triple in price, given that we now import more than 60% of our oil. And you couldn't propose to your true love with a diamond ring: There are no working diamond mines in the U.S.
Moreover, a complete end to imports would actually hurt the U.S. economy, because consumers and domestic companies would lose access to cheap goods. Trade protections, whether through tariffs or quotas, cost the economy roughly $2 for every $1 in additional profit for domestic producers, said Mark Perry, an economics professor at the University of Michigan-Flint and a visiting scholar at the American Enterprise Institute, a conservative think tank.

What If Justice Demands Open Borders?

Why do we cling to the myth that anyone can get in line and come to America? 
Mostly because our values demand it.
History shows… that the West has no model of economic development to offer the still-poor countries of the world. There is no simple economic medicine that will guarantee growth, and even complicated economic surgery offers no clear prospect of relief for societies afflicted with poverty... The only policy the West could pursue that will ensure gains for at least some of the poor of the Third World is to liberalize immigration from these countries…. each extra migrant admitted to the emerald cities of the advanced world is one more person guaranteed a better material lifestyle.
— Gregory Clark, A Farewell to Alms


I don’t believe the United States of America should be in the business of separating families. That’s not right. That’s not who we are. We can do better than that.
— President Barack Obama
It was a speech the country needed to hear, yet it was full of the same old evasions.
Americans want to see themselves as a country open to immigration, a country, as in President Obama’s remarks this week, where “anyone can write the next chapter in our story,” where “what matters… is that you believe that all of us are created equal, endowed by our Creator with certain inalienable rights,” and where “in embracing America, you can become American.”
But the law states otherwise. As Obama said, “as long as current laws are on the books, it’s not just hardened felons who are subject to removal, but sometimes families who are just trying to earn a living, or bright, eager students, or decent people with the best of intentions.”
What he did not mention is that most people who apply for visas do not get them, and, anticipating this, most people who would like to come do not bother to apply. Gallup polls have found that one-quarter of the world’s population wishes to migrate, and 165 million wish to come to the United States. Only 35 million immigrants live in America. Why don’t the rest come?
Because they can’t.
“In general,” according to the State Department, “to be eligible to apply for an immigrant visa, a foreign citizen must be sponsored by a U.S. citizen relative(s), U.S. lawful permanent resident, or by a prospective employer.” Even with sponsorship, applicants are likely to be rejected, particularly those seeking employment visas, of which far fewer are available than are demanded. Those without sponsorship can apply only for the diversity visa lottery, with odds of admission at just over 1 percent from Europe and Africa and under 0.5 percent from Asia.
So it’s not correct to say “anyone can write the next chapter of our story.” Only for a favored few is legal immigration an option.

To “liberals”, apostasy is the greatest sin ...

Andrew Ferguson has a lively profile of the playwright David Mamet:

His fame was enough to fill the stalls of Memorial Hall at Stanford University when he came to give a talk one evening a couple of years ago. About half the audience were students. The rest were aging faculty out on a cheap date with their wives or husbands. You could identify the male profs by the wispy beards and sandals-’n’-socks footwear. The wives were in wraparound skirts and had hair shorter than their husbands’… The unease that began to ripple through the audience had less to do with the speaker’s delivery than with his speech’s content. Mamet was delivering a frontal assault on American higher education, the provider of the livelihood of nearly everyone in his audience.
Higher ed, he said, was an elaborate scheme to deprive young people of their freedom of thought. He compared four years of college to a lab experiment in which a rat is trained to pull a lever for a pellet of food. A student recites some bit of received and unexamined wisdom—“Thomas Jefferson: slave owner, adulterer, pull the lever”—and is rewarded with his pellet: a grade, a degree, and ultimately a lifelong membership in a tribe of people educated to see the world in the same way.

What happens when you decide, as Mamet did, that your membership in the tribe is no longer lifelong?


After reading The Secret Knowledge in galleys, the Fox News host and writer Greg Gutfeld invented the David Mamet Attack Countdown Clock, which “monitors the days until a once-glorified liberal artist is dismissed as an untalented buffoon.” Tick tock.

I wrote about what Mr Mamet could expect a couple of years back:


In The Village Voice the other week, the playwright David Mamet outed himself as a liberal apostate and revealed that he’s begun reading conservative types like Milton Friedman and Paul Johnson. If he’s wondering what he’s in for a year or two down the line, here’s how Newsweek’s Jonathan Tepperman began his review this week of another literary leftie who wandered off the reservation:
“Toward the end of The Second Plane, Martin Amis’s new book on the roots and impact of 9/11, the British novelist describes a fellow writer as ‘an oddity: his thoughts and themes are… serious — but he writes like a maniac. A talented maniac, but a maniac.’ Amis is describing Mark Steyn, a controversial anti-Islam polemicist, but he could just as well be describing another angry, Muslim-bashing firebrand: himself. Talented, yes. Serious, yes. But also, judging from the new book, a maniac.”
Poor chap. What did Martin Amis ever do to deserve being compared to me? As Mr. Tepperman concludes, the new Amis is “painful for the legion of Amis fans who still love him for novels like The Rachel Papers and his masterpiece, London Fields.”

Likewise, the new Mamet will be “painful for the legion of [Mamet] fans who still love him for [plays] like [American Buffalo] and his masterpiece, [Glengarry Glen Ross]”. And pretty soon at all those colleges the received wisdom will be either that only the early Mamet is worth reading – or that even those first works were hopelessly overrated and don’t stand the test of time. To “liberals”, apostasy is the greatest sin, and they’re serious enforcers.

How fast can a country go down the drain? Real fast

Once Upon a Time in Egypt

In 1959, Elie Moreno, then a 19-year-old sophomore engineering student at Purdue University in Indiana, visited the Egyptian port city of Alexandria on his summer vacation, and brought his camera. Moreno, an Egyptian of Sephardic Jewish descent, had been born in Alexandria and raised in Cairo. But the Egypt in which he had grown up, the milieu of the country's multi-ethnic urban elite, was fast disappearing; the summer of 1959 was the last Moreno would see of it.
The late 1950s marked the end of an era in Alexandria that had begun in the late 19th century, when the port -- then the largest on the eastern Mediterranean -- emerged as one of the world's great cosmopolitan cities. Europeans -- Greeks, Italians, Armenians, and Germans -- had gravitated to Alexandria in the mid-19th century during the boom years of the Suez Canal's construction, staying through the British invasion of the port in 1882 and the permissive rule of King Farouk in the 1930s and 1940s. Foreign visitors and Egyptians alike flocked to the city's beaches in the summers, where revealing bathing suits were as ordinary as they would be extraordinary today.
But by midcentury, King Farouk -- a lackadaisical ruler in the best of times -- had grown deeply unpopular among Egyptians and was deposed in a CIA-backed coup in 1952. Cosmopolitan Alexandria's polyglot identity -- half a dozen languages were spoken on the city's streets -- and indelible links to Egypt's colonial past were an uncomfortable fit with the pan-Arab nationalism that took root under President Gamal Abdel Nasser in the late 1950s and 1960s. "[W]hat is this city of ours?" British novelist Lawrence Durrell, who served as a press attaché in the British Embassy in Alexandria during World War II, wrote despairingly in 1957 in the first volume of The Alexandria Quartet, his tetralogy set in the city during its heyday as an expatriate haven. "In a flash my mind's eye shows me a thousand dust-tormented streets. Flies and beggars own it today -- and those who enjoy an intermediate existence between either." By the time of Hosni Mubarak's rule (and largely in response to his secularism), Egypt's second-largest city had become synonymous with devout, and deeply conservative, Islam.
The pictures from Moreno's collection, taken on the 1959 visit and several beach trips in previous years, capture the last days of an Alexandria that would be all but unrecognizable today, in which affluent young Egyptians of Arab, Sephardic, and European descent frolic in a landscape of white sand beaches, sailboats, and seaside cabanas. Two years later, in 1961, the structural steel company Moreno's father ran was nationalized by Nasser, and his family left for the United States shortly thereafter. Moreno, who went on to found a semiconductor company in Los Angeles, wouldn't visit his birthplace until he was well into middle age.
But the memories aren't all bittersweet. The woman on the far left in the above photograph, taken on Alexandria's Mediterranean coast in 1955, is Odette Tawil, whom Moreno first met in Alexandria in the summer of 1959. Reunited in the United States years later, they visited Egypt together in 1998, to get married.
Courtesy of Elie Moreno
Beachgoers on Alexandria's Stanley Beach, 1931.

How fast can a country go down the drain? Real fast.


Once Upon a Time in Afghanistan…

Record stores, Mad Men furniture, and pencil skirts -- when Kabul had rock 'n' roll, not rockets.

BY MOHAMMAD QAYOUMI 

On a recent trip to Afghanistan, British Defense Secretary Liam Fox drew fire for calling it "a broken 13th-century country." The most common objection was not that he was wrong, but that he was overly blunt. He's hardly the first Westerner to label Afghanistan as medieval. Former Blackwater CEO Erik Prince recently described the country as inhabited by "barbarians" with "a 1200 A.D. mentality." Many assume that's all Afghanistan has ever been -- an ungovernable land where chaos is carved into the hills. Given the images people see on TV and the headlines written about Afghanistan over the past three decades of war, many conclude the country never made it out of the Middle Ages.
A half-century ago, Afghan women pursued careers in medicine; men and women mingled casually at movie theaters and university campuses in Kabul; factories in the suburbs churned out textiles and other goods. There was a tradition of law and order, and a government capable of undertaking large national infrastructure projects, like building hydropower stations and roads, albeit with outside help. Ordinary people had a sense of hope, a belief that education could open opportunities for all, a conviction that a bright future lay ahead. All that has been destroyed by three decades of war, but it was real.

But that is not the Afghanistan I remember. I grew up in Kabul in the 1950s and '60s. When I was in middle school, I remember that on one visit to a city market, I bought a photobook about the country published by Afghanistan's planning ministry. Most of the images dated from the 1950s. I had largely forgotten about that book until recently; I left Afghanistan in 1968 on a U.S.-funded scholarship to study at the American University of Beirut, and subsequently worked in the Middle East and now the United States. But recently, I decided to seek out another copy. Stirred by the fact that news portrayals of the country's history didn't mesh with my own memories, I wanted to discover the truth. Through a colleague, I received a copy of the book and recognized it as a time capsule of the Afghanistan I had once known -- perhaps a little airbrushed by government officials, but a far more realistic picture of my homeland than one often sees today.
I have since had the images in that book digitized. Remembering Afghanistan's hopeful past only makes its present misery seem more tragic. Some captions in the book are difficult to read today: "Afghanistan's racial diversity has little meaning except to an ethnologist. Ask any Afghan to identify a neighbor and he calls him only a brother." "Skilled workers like these press operators are building new standards for themselves and their country." "Hundreds of Afghan youngsters take active part in Scout programs." But it is important to know that disorder, terrorism, and violence against schools that educate girls are not inevitable. I want to show Afghanistan's youth of today how their parents and grandparents really lived.
Original caption: "Kabul University students changing classes. Enrollment has doubled in last four years."

Government 'workers' of the Americas unite

The Dreamliner nightmare
This summer, the huge Boeing assembly plant here will begin producing 787 Dreamliners — up to three a month, priced at $185 million apiece. It will, unless the National Labor Relations Board, controlled by Democrats and encouraged by Barack Obama’s reverberating silence, gets its way.
Last month — 17 months after Boeing announced plans to build here and with the $2 billion plant nearing completion — the NLRB, collaborating with the International Association of Machinists and Aerospace Workers (IAM), charged that Boeing’s decision violated the rights of its unionized workers in Washington state, where some Dreamliners are assembled and still will be even after the plant here is operational. The NLRB has read a 76-year-old statute (the 1935 Wagner Act) perversely, disregarded almost half a century of NLRB and Supreme Court rulings, and patently misrepresented statements by Boeing officials.
South Carolina is one of 22 — so far — right-to-work states, where workers cannot be compelled to join a union. When, in September 2009, Boeing’s South Carolina workers — fuselage sections of 787s already are built here — voted to end their representation by IAM, the union did not accuse Boeing of pre-vote misbehavior. Now, however, the NLRB seeks to establish the principle that moving businesses to such states from non-right-to-work states constitutes prima facie evidence of “unfair labor practices,” including intimidation and coercion of labor. This principle would be a powerful incentive for new companies to locate only in right-to-work states.
The NLRB complaint fictitiously says Boeing has decided to “remove” or “transfer” work from Washington. Actually, Boeing has so far added more than 2,000 workers in Washington, where planned production — seven 787s a month, full capacity for that facility — will not be reduced. Besides, how can locating a new plant here violate the rights of IAM members whose collective bargaining agreement with Boeing gives the company the right to locate new production facilities where it deems best?
The NLRB says that Boeing has come here “because” IAM strikes have disrupted production and “to discourage” future strikes.
Since 1995, IAM has stopped Boeing’s production in three of five labor negotiations, including a 58-day walkout in 2008 that cost the company $1.8 billion and a diminished reputation with customers.

Sunday, May 15, 2011

White man's burden part II

Political Repression and Kokonut Democracies

by George B.N. Ayittey
The political situation in many African countries remains distressing. The euphoria that gripped Africans as the "winds of change" swept across the continent following the collapse of communism in Eastern Europe has largely dissipated, replaced by a sense of disillusionment. While a few African despots were toppled, the large majority successfully beat back the democratic challenge. True, "elections" have been held in many African countries since 1990 to lend a veneer of "democracy" to authoritarian regimes. But as The Economist (23 Nov 1996) observed:
 
 'Boundaries, the media, the economy and the voters’ roll are all manipulated. Opponents are squashed. Soldiers have also learnt how to play the game. Half the countries of west and central Africa are ruled by “elected” ex-soldiers, among them the bosses of The Gambia and Niger, both voted into office after recently overthrowing democratically-elected governments (21).'

In some countries opposition leaders were partly to blame. Their own divisiveness, fragmentation, and lack of imagination, as well as their propensity to choose ineffective tactics, played right into the hands of the dictators. Of the 54 African countries, only 14 are democratic: Benin, Botswana, Cape Verde Islands, Ghana, Madagascar, Malawi, Mali, Mauritius, Namibia, Sao Tome & Principe, Senegal, Seychelles Islands, South Africa, and Zambia. Even then, if a rigorous definition of democracy is applied, fewer than five would meet the requirements. Apart from periodic elections, those requirements include a freely negotiated constitution, neutral and professional armed forces, an independent judiciary, an independent media, and an independent central bank.

In the postcolonial period, three scenarios have emerged in the ouster of Africa's dictators. In the Doe scenario, those leaders who foolishly refused to accede to popular demands for democracy risked their own safety and the destruction of their countries: Doe of Liberia, Barre of Somalia, Mengistu of Ethiopia, and Mobutu Sese Seko of Zaire (now Congo). (Doe was killed in September 1990; Barre fled Mogadishu in a tank in January 1991; Mengistu fled to Zimbabwe in February 1991; and Mobutu fled in May 1997.) African countries where this scenario is most likely to be repeated are Algeria, Cameroon, Chad, Djibouti, Equatorial Guinea, Libya, Niger, Sierra Leone, and Tunisia.