Saturday, October 8, 2011

Magic Johnson vs LeBron James

The Top 1%
By  RUSS ROBERTS 
Robert Lieberman, a political scientist at Columbia University, writes in Foreign Affairs:
The U.S. economy appears to be coming apart at the seams. Unemployment remains at nearly ten percent, the highest level in almost 30 years; foreclosures have forced millions of Americans out of their homes; and real incomes have fallen faster and further than at any time since the Great Depression. Many of those laid off fear that the jobs they have lost — the secure, often unionized, industrial jobs that provided wealth, security, and opportunity — will never return. They are probably right. And yet a curious thing has happened in the midst of all this misery. The wealthiest Americans, among them presumably the very titans of global finance whose misadventures brought about the financial meltdown, got richer. And not just a little bit richer; a lot richer. In 2009, the average income of the top five percent of earners went up, while on average everyone else’s income went down.
I’m not sure where he gets that statistic from. In the Census data (here, Table F-3; scroll down for the numbers in 2010 dollars, corrected for inflation), this is the mean income by quintile and for the top 5%:

2010 Dollars (mean family income)

Year    Lowest fifth   Second fifth   Third fifth   Fourth fifth   Highest fifth   Top 5%
2010    14,991         37,066         60,363        91,991         187,395         313,298
2009    15,541         37,657         60,896        92,464         192,614         330,388
2008    16,107         38,607         62,361        93,326         192,809         331,064
2007    16,896         40,279         64,612        96,618         196,146         332,943
2006    16,804         39,762         63,245        95,589         202,641         358,700
2005    16,492         39,243         62,797        93,921         196,891         344,699

The first five income columns are the various quintiles; the last column is the mean income of the top 5%. This is family income. Maybe Lieberman has the figure for individuals. But for families, the richest 5% have seen their income fall on average in each of the last four years. Much or maybe all of that is the people at the very top taking a hit, pulling down the mean. We don’t know, but I’d like to see Lieberman justify the figure. Or maybe he means the share going to the top 5%.
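As a quick check, here is a minimal sketch (in Python; the figures are simply the last column of the table above, and the variable names are mine) that computes the year-over-year change in the mean income of the top 5%:

```python
# Mean family income of the top 5%, in 2010 dollars (Census Table F-3, as tabulated above)
top5_mean = {
    2005: 344_699,
    2006: 358_700,
    2007: 332_943,
    2008: 331_064,
    2009: 330_388,
    2010: 313_298,
}

# Year-over-year change, in dollars and in percent
for year in range(2006, 2011):
    prev, curr = top5_mean[year - 1], top5_mean[year]
    change = curr - prev
    print(f"{year}: {change:+,} ({change / prev:+.1%})")

# 2006: +14,001 (+4.1%)
# 2007: -25,757 (-7.2%)
# 2008: -1,879 (-0.6%)
# 2009: -676 (-0.2%)
# 2010: -17,090 (-5.2%)
```

In these inflation-adjusted figures, the top 5% mean falls in every year from 2007 through 2010, including 2009, the year Lieberman says it went up. Lieberman continues: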
This was not an anomaly but rather a continuation of a 40-year trend of ballooning incomes at the very top and stagnant incomes in the middle and at the bottom. The share of total income going to the top one percent has increased from roughly eight percent in the 1960s to more than 20 percent today. This is what the political scientists Jacob Hacker and Paul Pierson call the “winner-take-all economy.” It is not a picture of a healthy society. Such a level of economic inequality, not seen in the United States since the eve of the Great Depression, bespeaks a political economy in which the financial rewards are increasingly concentrated among a tiny elite and whose risks are borne by an increasingly exposed and unprotected middle class. Income inequality in the United States is higher than in any other advanced industrial democracy and by conventional measures comparable to that in countries such as Ghana, Nicaragua, and Turkmenistan. It breeds political polarization, mistrust, and resentment between the haves and the have-nots and tends to distort the workings of a democratic political system in which money increasingly confers political voice and power.
The death of Steve Jobs is a useful reminder of the fact that much wealth is not winner-take-all but winner makes everybody better off. Steve Jobs’s estate is estimated to be something between $6 billion and $7 billion. About 2/3 of that is Disney stock he received when Disney acquired Pixar. The rest is Apple stock. This is clearly a fraction, maybe a small fraction, of the wealth Jobs created for the rest of us. Yes, he made a lot of money. But he made it by making the rest of us better off. He didn’t take it from us. He shared it with us.

One reason that the top 1% earned only 8% of the income in the 1960s vs. 20% now is that our economy has changed in ways that are good for all of us. I pause here to mention the obvious: the bottom 99% can be better off with a smaller share of the pie if the pie is getting sufficiently bigger, which is what has happened over the last 50 years. (If the pie quadruples, 80% of the new pie is far more than 92% of the old one.) The top 1% gets a bigger share not because they are hoarding more of the pie, but because the opportunity to create a lot of wealth for everyone has changed.

Think of it this way. The IBM Selectric was a wonderful improvement in the typewriter market. The people who created it and ran IBM made a lot of money from that improvement. And that’s nice. But improving the personal computer makes you a lot richer now than improving the typewriter did then. It creates more wealth. So the most creative people in technology today (Brin, Jobs, Page, Gates, Zuckerberg) make a lot more money than their counterparts did in 1960. That’s good.

Here is another way to see it. I often point out that the top 1% is not a club with a fixed number of people. There is considerable movement in and out of the different parts of the income distribution. True, once you are in the top 1%, if you fall out, you often don’t fall far. But there is a more important aspect of the fact that it is not the same people. Think of it this way. A great NBA player today earns a lot more than a great NBA player of 30 years ago. Magic Johnson, at the peak of his career, made a little over $3 million annually, plus some endorsement money. LeBron James makes over $15 million a year, plus a lot more from endorsements. Why? Because basketball, via technology and expanded wealth around the world, is a more popular sport than it was in the 1980s. That’s good. That’s why LeBron James captures a bigger share. He makes more people happy, and those people have more money to spend on basketball than people did in Magic Johnson’s day.

The top 1% are different people and the share that goes to the most talented people at the top has grown.

But not everyone in the top 1% earns their money as Steve Jobs did and LeBron James does, by making other people’s lives better. As I have said many times, and will continue to say, the financial sector has made lots of money for executives in that sector because of government policies bailing out creditors, which allows leverage to grow artificially large. That, in turn, makes it easier for investment banks to profit and justifies large salaries for executives. That, in turn, ratchets up the earnings of people in related fields–hedge fund managers and even professors of economics, who must be paid more now to keep them in academia and away from Wall Street.
Some of those gains to the financial sector are literally zero sum–bonuses paid for with my money and yours.

If we stop bailing out creditors–socializing the losses of the financial sector–the top 1% numbers will become “healthier.”

If we fail to distinguish between ill-gotten gains and the gains that enrich all of us, we are headed down a very dangerous path.


Down With Evil Corporations

Friday, October 7, 2011

Gravy Trains


Pigford v. Glickman: 86,000 claims from 39,697 total farmers?
Posted By Zombie 
I’m confused.
If there are only 39,697 African-American farmers grand total in the entire country, then how can over 86,000 of them claim discrimination at the hands of the USDA? Where did the other 46,303 come from?
Now, if you’re confused over what the heck I’m even talking about, let’s go back to the beginning of the story:
Pigford v. Glickman
In 1997, 400 African-American farmers sued the United States Department of Agriculture, alleging that they had been unfairly denied USDA loans due to racial discrimination during the period 1983 to 1997. The farmers won the case, known as Pigford v. Glickman, and in 1999 the government agreed to pay $50,000 each to any farmer who had been wrongly denied an agricultural loan. By then it had grown into a class action case, and any black farmer who had filed a complaint between 1983 and 1997 would be given at least $50,000 — not limited to the original 400 plaintiffs. It was estimated at that time that there might be as many as 2,000 beneficiaries granted $50,000 each.
According to the summary of the case linked above,
Originally, claimants were to have filed within 180 days of the consent decree. Late claims were accepted for an additional year afterwards, if they could show extraordinary circumstances that prevented them from filing on time.
Far beyond the anticipated 2,000 affected farmers, 22,505 “Track A” applications were heard and decided upon, of which 13,348 (59%) were approved. US$995 million had been disbursed or credited to the “Track A” applicants as of January 2009, including US$760 million disbursed as US$50,000 cash awards…. Beyond those applications that were heard and decided upon, about 70,000 petitions were filed late and were not allowed to proceed. Some have argued that the notice program was defective, and others blamed the farmers’ attorneys for “the inadequate notice and overall mismanagement of the settlement agreement”. A provision in the 2008 farm bill essentially allowed a re-hearing in civil court for any claimant whose claim had been denied without a decision that had been based on its merits.
Then on February 23 of this year, the USDA finally consented to pay $1.25 billion to those farmers whose claims had earlier been denied:
In the 1999 case Pigford v. Glickman, the USDA agreed to pay 16,000 black farmers $1 billion after a judge held the federal government responsible for the decline in black farmers. Critics argued that more than 70,000 farmers were shut out of the lawsuit. In 2008, then-Sen. Barack Obama and Republican Sen. Chuck Grassley got a law passed to reopen the case, and the settlement talks moved forward.
The $1.25 billion settlement, announced Thursday, comes on top of the money paid out a decade ago. The new agreement would provide cash payments and debt relief to farmers who applied too late to participate in the earlier settlement, The Washington Post reported. Authorities say they are not certain how many farmers might apply this time, but analysts say the number could be higher than 70,000.
Seventy-thousand+ applicants in addition to the 16,000 already compensated now means that over 86,000 people are slated to be paid.
The U.S. Senate and Shirley Sherrod
Which brings us up to today, when two current events suddenly thrust this otherwise little-known case into the spotlight. First, the Senate stripped funding for the settlement out of an unrelated war appropriations bill, as they had done several times in the past. Second, it was revealed today that “A farm collective founded by Shirley Sherrod and her husband that was forced out of business by the discriminatory practices received a $13 million settlement as part of Pigford last year, just before she was hired by the USDA.”
Suddenly, everyone in America is talking about a class-action suit that until a few hours ago very few had ever heard of.
But I want to go back to the beginning.
Forget about Shirley Sherrod’s connection. Forget about the Senate not funding the settlement. There will be plenty of pundits commenting on those aspects over the upcoming days.
What I want to know is: How can there be 86,000 legitimate claimants?
The Census pinpoints the precise number of African-American farmers
I ask this question because it didn’t take me very long to find the latest census statistics released by the Department of Agriculture, which can be found linked to from this official USDA page. There, you will find this direct link to a text version of the Census report, and this recommended pdf version.
In the pdf version of the government’s official 2007 Agricultural Census, Table 53 on page 646 shows that there are exactly 39,697 African-American farmers grand total in the entire nation:
(A scan through earlier census reports shows that this number has remained fairly constant over time, which is to be expected, as farming tends to be a long-term lifestyle rather than a “job” that one gets and then quickly abandons.)
Granted: the original case was valid. But has it spun out of control?
Let’s accept as a point of fact that some African-American farmers were unfairly denied loans by racists in the USDA during the Reagan, Bush, and Clinton administrations. I’m not casting any aspersions on the validity of the original lawsuit, nor on the courts’ rulings in the case.
But ponder the numbers.
• There are approximately 40,000 African-American farmers in the country.
• Of that 40,000, not all of them have gotten into financial trouble. Some have successful farms.
• Of those who had financial trouble, not all of them sought out loans. Some tried to stay afloat on their own.
• Of those who sought out loans, not all of them sought out loans from the USDA. Some got loans from banks or friends.
• Of those who sought out loans from the USDA, not all of them were denied loans. Some got the loans as requested.
• Of those who were denied loans, not all of them were denied due to discriminatory racial practices.
In the end, a total much, much smaller than 40,000 could legitimately claim to be victims of discrimination.
As shown above, it was originally estimated to be no more than 2,000 possible total plaintiffs.
Somehow, that number quickly swelled to 16,000 wronged claimants.
And now, as of February, the government has announced its plans to hand out at least $50,000 each to over 70,000 more claimants, over and above the original 16,000.
This means that the U.S. may be recompensing at least 86,000 African-American farmers for past racial discrimination. But how could that possibly be true if there are only 39,697 African-American farmers in existence nationwide? And if only some subset of them ever applied for loans in the first place and were then unfairly denied loans?
If someone can explain this to me, I’ll add it in an update to this post. Could it be that there is a constant turnover of African-Americans trying out farming for a few years, and then quickly giving it up, so that although there may be only 40,000 farmers at any one time, over the years, the total number of different people involved in farming is much larger? If so, is there any evidence for this? Or could there be another explanation?
I have a feeling that the Senate repeatedly fails to fund this settlement because there is a strong suspicion among the senators that something is amiss with the case — that a substantial percentage of the 70,000 claims that were originally rejected must necessarily have been fraudulent claims. And so there is reluctance to fork over the money. But there also seems to be a reluctance on the part of the Senate to admit why they won’t fund the settlement, because the issue is just too racially charged.
It is a tragedy that victims of institutional discrimination like the legitimately wronged African-American farmers could be denied their payout due to scammers trying to undeservedly grab a piece of the pie. Instead of getting angry at the Senate for hesitating with the funds, we should be angry at the swindlers (and their lawyers) who contaminated an otherwise valid case.
Unless, of course, there is a clear explanation of where those 86,000 farmers came from. Any ideas?
UPDATE I:
Another Department of Agriculture census report details the total number of African-American farmers in 1992, during the exact time covered by the Pigford v. Glickman lawsuit — and it reveals that there were far fewer back then than there were in 2007. According to the chart on page 20 of the USDA’s pdf 1998 “Status Report, Minority & Women Farmers In the U.S.”, there were only 18,816 black farmers in 1992:
If this is true (and it seems to be), then the disparity between the total number of black farmers in the U.S. and the number of claimants in Pigford v. Glickman is even greater than I originally calculated.
UPDATE II:
This official USDA census report, in Appendix 3 on page 24 of the pdf, gives even more statistics for 1987, 1992, and 1997 — every single agricultural census year during the period covered by the lawsuit. Turns out that the number of black farmers in every year was consistently very small:
Total number of African-American farmers in the United States, by year:
1987: 22,954
1992: 18,816
1997: 18,451

Even if there were 100% turnover, and every single farmer went out of business every five years and was replaced by a new farmer (extremely unlikely), there still wouldn’t be enough black farmers throughout the entire period combined to account for the number of claimants.
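To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch (in Python; the census counts are the ones quoted above, and the 100% turnover scenario is the deliberately generous assumption described in the previous paragraph):

```python
# USDA census counts of African-American farmers, quoted above
black_farmers = {1987: 22_954, 1992: 18_816, 1997: 18_451}

# Generous assumption: 100% turnover, i.e. no farmer appears in more than one census,
# so the counts can simply be added to bound the number of distinct farmers.
max_distinct_farmers = sum(black_farmers.values())    # 60,221

# Claimants: 16,000 already compensated plus the 70,000+ late claims now eligible
claimants = 16_000 + 70_000                           # 86,000

print(f"Upper bound on distinct black farmers, 1987-1997: {max_distinct_farmers:,}")
print(f"Total Pigford claimants: {claimants:,}")
print(f"Excess claims even under 100% turnover: {claimants - max_distinct_farmers:,}")  # 25,779
```

Even under that implausible assumption, the claims exceed the combined census counts by more than 25,000.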
It seems, no matter how you look at it, that a substantial number of the 86,000 claims must necessarily be fraudulent.

A train to genocide and the gulags

Community versus collectivism
Community and collectivism are opposites. Community is valuable and powerful; it is individuals freely choosing to cooperate and identify with each other to achieve more than they can individually, as we do in the open-source community.
Collectivism is a fraud. It pretends to be about community, but it is actually about the use of force. Collectivists want us not only to bow to their desire for power over others, but to thank them for coercing us and praise them as our moral superiors.
Compassion is a duty of every individual. Groups of people organizing voluntarily to achieve compassionate ends deserve admiration and support. Collectivists pervert compassion, speaking the language of caring but committing the actions of criminals.
It is a crime to rob your neighbor. It is a crime to use your neighbor for your own ends without allowing him or her a choice in the matter. It is a crime to deprive your neighbor of his liberty when he or she has committed no aggression against you.
These crimes are no less crimes when a sociopath (or a politician – but, I repeat myself) justifies them by chanting “for the poor” or “for the children” or “for the environment”. They do not cease to be crimes just because a majority has been conned into voting for them. The violence is just as violent, the victims just as injured, the harm done just as grave.
Valid ethical propositions do not contain proper names. What is criminal for an individual to do is criminal for a community to do. Collectivists are not the builders of community, as they pretend, but its deadliest enemies – its corrupters and betrayers. When we fail to understand these simple truths, we board a train to genocide and the gulags.

Brutal blueprints

On the architecture of Brasilia
by Anthony Daniels
When I was about ten years old, I used to design cities. It was very easy, and I was surprised that everyone before me had made such a hash of it. I could conclude only that the world had hitherto been populated by fools. At the very center of the city was the parliament building, which was like St. Peter’s but on a bigger and grander scale. Round it ran an eight-lane circular road, from which radiated, symmetrically, six large avenues. How the deputies to the parliament were supposed to reach it—dodge between the traffic, I suppose—was not a question with which I concerned myself. I was designing cities and buildings, not human convenience. Along the avenues were situated the institutions that I then considered essential for cities: the natural history museum, the art gallery, the royal palace. Everything was on a grand scale, and no mess of the kind created by commercial or other inessential establishments was permitted or planned for.

Brasilia was being built while I designed my cities, though in a different architectural vocabulary: one of reinforced concrete rather than marbled neoclassical façades. From the point of view of urban design and planning, however, it was not much of an advance over mine, but, unlike my designs, it was put into practice.

The first thing to say about Brasilia is that it is an astonishing achievement, and this is so whether you think it good or bad or somewhere in between the two. Where nothing but a remote, hot, and scrubby plain existed just over half a century ago, there now stands a functioning city of over three million people. This is enough to excite wonderment.

What perhaps is even more astonishing is that Brasilia was up and running within less than four years of the first foundation being laid. The dream of moving the capital from the coast to the interior was almost as old as Brazil itself, and, indeed, such a move had long been a constitutional requirement, if only a dead-letter one. The idea was both economic and strategic: the move would simultaneously develop the interior and protect the country from foreign occupation.

It was President Juscelino Kubitschek de Oliveira, a Parisian-trained former urologist, who finally ordered Brasilia’s construction. According to the story, a man asked Kubitschek at a pre-election meeting whether, if elected, he would comply with the constitutional requirement that the capital be moved, and he said that he would. Whether for reasons of probity not universal among politicians, or for more pragmatic reasons, Kubitschek kept to his undertaking, but made it a condition of doing so that the new capital be completed within his presidential mandate. As with many, perhaps most, or even all grand schemes, the economic cost was not taken into account: Kubitschek was, in effect, Brazil’s Peter the Great, but without the cruelty or indifference to human life. Unfortunately, he was also without the taste.


Thursday, October 6, 2011

A string quartet at bayonet point


The Fallacy of the 'Public Sector'
 
"capitalism stands its trial before judges who have the sentence of death in their pockets. They are going to pass it, whatever the defense they may hear; the only success a victorious defense can possibly produce is a change in the indictment."
by Murray N. Rothbard
We have heard a great deal in recent years of the "public sector," and solemn discussions abound through the land on whether or not the public sector should be increased vis-à-vis the "private sector." The very terminology is redolent of pure science, and indeed it emerges from the supposedly scientific, if rather grubby, world of "national-income statistics." But the concept is hardly wertfrei; in fact, it is fraught with grave, and questionable, implications.
In the first place, we may ask, "public sector" of what? Of something called the "national product." But note the hidden assumptions: that the national product is something like a pie, consisting of several "sectors," and that these sectors, public and private alike, are added to make the product of the economy as a whole. In this way, the assumption is smuggled into the analysis that the public and private sectors are equally productive, equally important, and on an equal footing altogether, and that "our" deciding on the proportions of public to private sector is about as innocuous as any individual's decision on whether to eat cake or ice cream. The State is considered to be an amiable service agency, somewhat akin to the corner grocer, or rather to the neighborhood lodge, in which "we" get together to decide how much "our government" should do for (or to) us. Even those neoclassical economists who tend to favor the free market and free society often regard the State as a generally inefficient, but still amiable, organ of social service, mechanically registering "our" values and decisions.
One would not think it difficult for scholars and laymen alike to grasp the fact that government is not like the Rotarians or the Elks; that it differs profoundly from all other organs and institutions in society; namely, that it lives and acquires its revenues by coercion and not by voluntary payment. The late Joseph Schumpeter was never more astute than when he wrote, "The theory which construes taxes on the analogy of club dues or of the purchase of the services of, say, a doctor only proves how far removed this part of the social sciences is from scientific habits of mind."[1]
Apart from the public sector, what constitutes the productivity of the "private sector" of the economy? The productivity of the private sector does not stem from the fact that people are rushing around doing "something," anything, with their resources; it consists in the fact that they are using these resources to satisfy the needs and desires of the consumers. Businessmen and other producers direct their energies, on the free market, to producing those products that will be most rewarded by the consumers, and the sale of these products may therefore roughly "measure" the importance that the consumers place upon them. If millions of people bend their energies to producing horses-and-buggies, they will, in this day and age, not be able to sell them, and hence the productivity of their output will be virtually zero. On the other hand, if a few million dollars are spent in a given year on Product X, then statisticians may well judge that these millions constitute the productive output of the X-part of the "private sector" of the economy.
One of the most important features of our economic resources is their scarcity: land, labor, and capital-goods factors are all scarce, and may all be put to various possible uses. The free market uses them "productively" because the producers are guided, on the market, to produce what the consumers most need: automobiles, for example, rather than buggies. Therefore, while the statistics of the total output of the private sector seem to be a mere adding of numbers, or counting units of output, the measures of output actually involve the important qualitative decision of considering as "product" what the consumers are willing to buy. A million automobiles, sold on the market, are productive because the consumers so considered them; a million buggies, remaining unsold, would not have been "product" because the consumers would have passed them by.
Suppose now that into this idyll of free exchange enters the long arm of government. The government, for some reasons of its own, decides to ban automobiles altogether (perhaps because the many tailfins offend the aesthetic sensibilities of the rulers) and to compel the auto companies to produce the equivalent in buggies instead. Under such a strict regimen, the consumers would be, in a sense, compelled to purchase buggies because no cars would be permitted. However, in this case, the statistician would surely be purblind if he blithely and simply recorded the buggies as being just as "productive" as the previous automobiles. To call them equally productive would be a mockery; in fact, given plausible conditions, the "national product" totals might not even show a statistical decline, when they had actually fallen drastically.
And yet the highly touted "public sector" is in even worse straits than the buggies of our hypothetical example. For most of the resources consumed by the maw of government have not even been seen, much less used, by the consumers, who were at least allowed to ride in their buggies. In the private sector, a firm's productivity is gauged by how much the consumers voluntarily spend on its product. But in the public sector, the government's "productivity" is measured – mirabile dictu – by how much it spends! Early in their construction of national-product statistics, the statisticians were confronted with the fact that the government, unique among individuals and firms, could not have its activities gauged by the voluntary payments of the public – because there were little or none of such payments. Assuming, without any proof, that government must be as productive as anything else, they then settled upon its expenditures as a gauge of its productivity. In this way, not only are government expenditures just as useful as private ones, but all the government needs to do in order to increase its "productivity" is to add a large chunk to its bureaucracy. Hire more bureaucrats, and see the productivity of the public sector rise! Here, indeed, is an easy and happy form of social magic for our bemused citizens.
The truth is exactly the reverse of the common assumptions. Far from adding cozily to the private sector, the public sector can only feed off the private sector; it necessarily lives parasitically upon the private economy. But this means that the productive resources of society – far from satisfying the wants of consumers – are now directed, by compulsion, away from these wants and needs. The consumers are deliberately thwarted, and the resources of the economy diverted from them to those activities desired by the parasitic bureaucracy and politicians. In many cases, the private consumers obtain nothing at all, except perhaps propaganda beamed to them at their own expense. In other cases, the consumers receive something far down on their list of priorities – like the buggies of our example. In either case, it becomes evident that the "public sector" is actually antiproductive: that it subtracts from, rather than adds to, the private sector of the economy. For the public sector lives by continuous attack on the very criterion that is used to gauge productivity: the voluntary purchases of consumers.

Total State or total freedom?


The Fascist Threat
 
by Llewellyn H. Rockwell, Jr.
Everyone knows that the term fascist is a pejorative, often used to describe any political position a speaker doesn’t like. There isn’t anyone around who is willing to stand up and say: "I’m a fascist; I think fascism is a great social and economic system."

But I submit that if they were honest, the vast majority of politicians, intellectuals, and political activists would have to say just that.

Fascism is the system of government that cartelizes the private sector, centrally plans the economy to subsidize producers, exalts the police State as the source of order, denies fundamental rights and liberties to individuals, and makes the executive State the unlimited master of society.

This describes mainstream politics in America today. And not just in America. It’s true in Europe, too. It is so much part of the mainstream that it is hardly noticed any more.

It is true that fascism has no overarching theoretical apparatus. There is no grand theorist like Marx. That makes it no less real and distinct as a social, economic, and political system. Fascism also thrives as a distinct style of social and economic management. And it is as much of a threat to civilization as full-blown socialism, or more.

This is because its traits are so much a part of life – and have been for so long – that they are nearly invisible to us.

If fascism is invisible to us, it is truly the silent killer. It fastens a huge, violent, lumbering State on the free market that drains its capital and productivity like a deadly parasite on a host. This is why the fascist State has been called The Vampire Economy. It sucks the economic life out of a nation and brings about a slow death of a once thriving economy.

Let me just provide a recent example.

The Decline

The papers last week were filled with the first sets of data from the 2010 US Census. The headline story concerned the huge increase in the poverty rate. It is the largest increase in 20 years, and the rate now stands at 15%.

But most people hear this and dismiss it, probably for good reason. The poor in this country are not poor by any historical standard. They have cell phones, cable TV, cars, lots of food, and plenty of disposable income. What’s more, there is no such thing as a fixed class called the poor. People come and go, depending on age and life circumstances. Plus, in American politics, when you hear kvetching about the poor, everyone knows what you’re supposed to do: hand the government your wallet.


Steve Jobs RIP

How to Live Before You Die


"Nobody ever asked why Steve Jobs kept working after he was rich. Everyone understood."

The Fall of Kodak

A Tale of Disruptive Technology and Bad Business
by D. DiSalvo
I grew up in a Kodak family. My grandfather worked in the photography darkrooms of a Kodak production facility in Rochester, New York, for better than 30 years. My father spent 25 years with the company, as a supervisor at Kodak headquarters in downtown Rochester and later as a liaison between Kodak and Disney in Orlando. Other members of my family worked for the company in various roles, some until retirement.

When I was a kid, the Eastman Kodak brand was the undisputed king in a city known for its industry giants, including Bausch and Lomb, Xerox, Gannett, and Western Union. If you lived in Rochester and worked for Kodak, the expectation was that you would stay there until retirement, and receive a handsome pension thereafter. Every Kodak employee looked forward to a generous bonus–an annual event that juiced the local economy unlike any other.

By the mid-1980s—just about 100 years after George Eastman invented paper-based film—my father was already voicing concerns about Kodak’s future. The digital revolution was sparking, and he wasn’t seeing signs that Kodak knew exactly what to do about it. Instead of focusing its strategic attention on the emerging digital technologies, Kodak was making odd maneuvers, like acquiring pharmaceutical giant Sterling Drugs for $5.1 billion and trying to establish a brand in the battery business.

The connection with Sterling—really the only linkage that made sense for Kodak—was Sterling’s diagnostic imaging business, which Kodak rightly forecast would become gigantic in the years ahead. But acquiring the entirety of Sterling proved a disastrous decision, resulting in massive losses and the eventual selling off of all of Sterling’s divisions within six years. Likewise, Kodak took a costly black eye in the battery business from industry leaders Duracell and Eveready, and divested its battery spin-off, Ultra Technologies, at another painful loss.


Much, much more than you think.

How Much Does the Federal Government Really Spend?
By Andrew G. Biggs
The size of government is of particular interest these days, with many Americans believing that rising government spending is crowding out the ability of individuals and businesses to control their own well-being and improve the economy. Indeed, Gallup reported last week that, on average, Americans think the federal government wastes 50 cents of each dollar it spends.
But how much does the federal government really spend?
On paper, the Congressional Budget Office reports that in 2010, the federal government spent $3.456 trillion, an amount that is equal to 23.8 percent of gross domestic product. That’s one-quarter higher than the historical norm of around 19 percent of GDP.
But direct spending isn’t the only spending Washington does. As Lori Montgomery reports in the Washington Post, last year the federal government spent an additional $1.08 trillion on tax expenditures, which are tax breaks that for all intents and purposes are spending.
From the mortgage interest deduction to employer-provided health coverage to credits for purchasing corporate jets, these tax expenditures reduce individual and corporate taxes only if the individuals and corporations do what the government would like them to do. Rent a house? No tax break. Buy your own health insurance rather than have your employer provide it? Same story. Buy a corporate boat rather than jet? Nope, no tax cut for you.
That $1.08 trillion in tax expenditures is 24 percent of all federal spending, and it is all off the books, allowing a much bigger government than official statistics show—and much bigger than people might be willing to tolerate if they knew.
Tax expenditures, as Senator Daniel Patrick Moynihan said, are “boob bait for the bubbas,” with the bubbas being conservative Republicans who will support anything that calls itself a tax cut even if it walks, talks, and quacks like government spending.
Put it all together and the federal government spends an amount equal to 31.2 percent of GDP—that is, almost one-third of everything produced in the economy.
Moreover, according to the Congressional Research Service, tax expenditures have grown by about 24 percent relative to the size of the economy since 1974, from 5.8 percent of GDP to 7.2 percent. So not only do tax expenditures add to overall government spending, they’re adding more today than they did 37 years ago.
On top of this you have the cost of government regulations, which also amount to de facto federal spending because they impose costs on businesses in order to further policy goals. A recent report from the Small Business Administration’s Office of Advocacy finds that the total cost of federal regulations was $1.175 trillion, equal to over $8,000 per employee, or 8.1 percent of gross domestic product.
Add regulations to the mix and the total sway of the federal government over the private sector equals roughly 39 percent of the economy, a larger portion than in prior decades. And, as the baby boom generation retires and entitlement costs skyrocket, the federal share of GDP will rise even further.
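As a rough check on the arithmetic, here is a minimal sketch (in Python) that reproduces these shares from the figures cited above; GDP is backed out from the CBO spending numbers, and the variable names are mine:

```python
# Figures cited above, for 2010
direct_spending  = 3.456e12   # CBO: federal outlays
direct_share     = 0.238      # outlays as a share of GDP (23.8%)
tax_expenditures = 1.08e12    # tax breaks that function like spending
regulation_share = 0.081      # SBA Office of Advocacy: regulatory costs, 8.1% of GDP

gdp = direct_spending / direct_share                      # ~$14.5 trillion implied

with_tax_expenditures = direct_share + tax_expenditures / gdp
with_regulation       = with_tax_expenditures + regulation_share

print(f"Implied GDP:                   ${gdp / 1e12:.1f} trillion")
print(f"Outlays plus tax expenditures: {with_tax_expenditures:.1%}")   # ~31.2%
print(f"Adding regulatory costs:       {with_regulation:.1%}")         # ~39.3%
```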
We can argue about the costs and benefits of larger government, but we should not argue about whether larger government is in fact what we have today. It is.

In Praise of Jacques Rueff

Jacques Rueff, the Age of Inflation, and the True Gold Standard
by LEWIS E. LEHRMAN at the PARLIAMENT OF FRANCE (Assemblée Nationale), November 7, 1996

Distinguished Leaders of France:

In what I now say to you, I draw from the speeches, the writings, and the letters of the greatest economist of the twentieth century.  Your courtesy may require you to hear politely the words I now speak.  But I beg you to believe me that all the arguments I shall make in your presence are distilled from the wisdom of the master himself.  The ideas I set before you originate in the proven genius of an extraordinary teacher, a selfless servant of the French people, and a peerless citizen of the world -- in the words of General de Gaulle -- “un poète de finance.”

I speak of Jacques Rueff.

As a soldier of France, no one knew better than Jacques Rueff that World War I had brought to an end the preeminence of the classical European states system; that it had decimated the flower of European youth; that it had destroyed the European continent’s industrial productivity.  No less ominously, on the eve of the Great War, the gold standard – the gyroscope of the Industrial Revolution, the proven guarantor of one hundred years of price stability, the common currency of the world trading system – this precious institution of commercial civilization was suspended by the belligerents.

The Age of Inflation was upon us.

The overthrow of the historic money of commercial civilization, the gold standard, led, during the next decade, to the great inflations in France, Germany, and Russia.  The ensuing convulsions of the social order, the rise of the speculator class, the obliteration of the savings of the laboring and middle classes on fixed incomes, led directly to the rise of Bolshevism, Fascism, and Nazism – linked, as they were, to floating European currencies, perennial budgetary and balance of payments deficits, central bank money printing, currency wars and the neo-mercantilism they engendered.

Today, three quarters of a century later (1996), one observes -- at home and abroad -- the fluctuations of the floating dollar, the unpredictable effects of its variations, the abject failure to rehabilitate the dollar’s declining reputation.  Strange it is that an unhinged token, the paper dollar, is now the monetary standard of the most scientifically advanced global economy the world has ever known.

In America, the insidious destruction of its historic currency, the gold dollar, got underway in 1922 during the inter-war experiment with the gold-exchange standard and the dollar’s new official reserve currency role.  It must be remembered that World War I had caused the price level almost to double.  Britain and America tried to maintain the pre-war dollar-gold, sterling-gold parities.  The official reserve currency roles of the convertible pound and dollar, born of the gold-exchange standard, collapsed in the Great Depression and so did the official foreign exchange reserves of the developed world – which helped to cause and to intensify the depression.  Franklin Roosevelt in 1934 reduced the value of the dollar by raising the price of gold from $20 to $35 per ounce, believing the change to be a necessary adjustment to the post World War I price level rise.

But it must be emphasized that it was twelve years earlier, in 1922, at the little known but pivotal Monetary Conference of Genoa, that the unstable gold-exchange standard had been officially embraced by the European financial authorities.  It was here that the dollar and the pound were first confirmed as official reserve currencies to supplement what was said to be a scarcity of gold.  For those of you who remember his writings, Jacques Rueff warned in the 1920s of the dangers of the Genoa gold-exchange system and, again, predicted in 1960-61 that the Bretton Woods system, a post World War II gold-exchange standard, flawed as it was by the same official reserve currency contagion of the 1920s, would soon groan under the flood weight of excess American dollars going abroad.  Rueff in the 1950s and 1960s forecast permanent U.S. balance of payments deficits and the tendency to constant budget deficits, and ultimate suspension of dollar convertibility to gold.  After World War II, he saw that because the United States was the undisputed hegemonic military and economic power of the free world, foreign governments and central banks, in exchange for these military services and other subsidies rendered, would for a while continue to purchase (sometimes to protect their export industries) excess dollars on the foreign exchanges against the creation of their own monies.  This was the inevitable result of the dollar’s official reserve currency status.  But these dollars, originating in the U.S. balance of payments and budget deficits, were then redeposited by foreign governments in the New York dollar market, which led to inflation in the U.S., and inflation in its European and Asian protectorates which were absorbing the excess dollars.  Incredibly, during this same period, the International Monetary Fund authorities had the audacity to advocate the creation of Special Drawing Rights, SDRs, so-called “paper gold,” invented, as International Monetary Fund officials said, to avoid a “potential liquidity shortage.”  At that very moment, the world was awash in dollars, in the midst of perennial dollar and exchange rate crises.  Jacques Rueff casually remarked to Le Monde that the fabrication of these SDRs by the International Monetary Fund would be as gratuitous as “irrigation plans implemented during the flood.”

The dénouement of post-war financial history came at the Ides of March, in 1968, when President Johnson suspended the London Gold Pool and, mercifully, abdicated his candidacy for reelection.  And so after a few more disabling years, Bretton Woods expired on August 15, 1971.  The truth is that Monetarists and Keynesians sought not to reform Bretton Woods, as President de Gaulle and Jacques Rueff proposed, but rather to demolish it.  The true gold standard, indeed any metallic currency basis, was passé among the cognoscenti.  I shall give you just one example of the obtuseness of the political class, which happened at the height of a major dollar crisis.  A friend of Jacques Rueff, the renowned American congressman and policy intellectual Henry Reuss, Chairman of the Banking and Currency Committee of the United States House of Representatives, went so far as to predict in The New York Times, with great confidence and even greater fanfare, that when gold was demonetized, it would fall from $35 to $6 per ounce.  (I am not sure whether Congressman Reuss ever covered his short at $800 per ounce in 1980.)

President Nixon, a self-described conservative, succeeded President Johnson and was gradually converted to Keynesian economics by so-called conservative academic advisers, led by Prof. Herbert Stein.  Mr. Nixon had also absorbed some of the teachings of the Monetarist School from his friend Milton Friedman -- who embraced the expediency of floating exchange rates and central bank manipulation and the targeting of the money stock.  Thus it was no accident that the exchange rate crises continued, and on August 15, 1971, after one more major dollar crisis, Nixon defaulted at the gold window of the Western world, declaring that “we are all Keynesians now.”  At the same time, Nixon, a Republican and a so-called free-market President, imposed the first peacetime wage and price controls in American history – encouraged by some of the famous “conservative” advisers of the era.

In President Nixon’s decision of August 1971, the last vestige of monetary convertibility to gold, the final trace of an international common currency, binding together the civilized nations of the West, had been unilaterally abrogated by the military leader of the free world.


Fiat money "conservatism"

The Nixon Shock Heard 'Round the World
By severing the dollar's convertibility to gold in 1971, the president ushered in a decade of inflation and economic stagnation.
By LEWIS E. LEHRMAN

On the afternoon of Friday, Aug. 13, 1971, high-ranking White House and Treasury Department officials gathered secretly in President Richard Nixon's lodge at Camp David. Treasury Secretary John Connally, on the job for just seven months, was seated to Nixon's right. During that momentous afternoon, however, newcomer Connally was front and center, put there by a solicitous president. Nixon, gossiped his staff, was smitten by the big, self-confident Texan whom the president had charged with bringing order into his administration's bumbling economic policies.

In the past, Nixon had expressed economic views that tended toward "conservative" platitudes about free enterprise and free markets. But the president loved histrionic gestures that grabbed the public's attention. He and Connally were determined to present a comprehensive package of dramatic measures to deal with the nation's huge balance of payments deficit, its anemic economic growth, and inflation.

Dramatic indeed: They decided to break up the postwar Bretton Woods monetary system, to devalue the dollar, to raise tariffs, and to impose the first peacetime wage and price controls in American history. And they were going to do it on the weekend—heralding this astonishing news with a Nixon speech before the markets opened on Monday.

The cast of characters gathered at Camp David was impressive. It included future Treasury Secretary George Shultz, then director of the Office of Management and Budget, and future Federal Reserve Chairman Paul Volcker, then undersecretary for monetary affairs at Treasury. At the meeting that afternoon Nixon reminded everyone of the importance of secrecy. They were forbidden even to tell their wives where they were. Then Nixon let Connally take over the meeting.

The most dramatic Connally initiative was to "close the gold window," whereby foreign nations had been able to exchange U.S. dollars for U.S. gold—an exchange guaranteed under the monetary system set up under American leadership at Bretton Woods, N.H., in July 1944. Recently the markets had panicked. Great Britain had tried to redeem $3 billion for American gold. So large were the official dollar debts in the hands of foreign authorities that America's gold stock would be insufficient to meet the swelling official demand for American gold at the convertibility price of $35 per ounce.

On Thursday, Connally had rushed to Washington from a Texas vacation. He and Nixon hurriedly decided to act unilaterally, not only to suspend convertibility of the dollar to gold, but also to impose wage and price controls. Nixon's speechwriter William Safire attended the conference in order to prepare the president's speech to the nation. In his book "Before the Fall," Safire recalled being told on the way to Camp David that closing the gold window was a possibility. Despite the many international ramifications of what the administration would do, no officials from the State Department or the National Security Council were invited to Camp David.


The president had little patience with, or understanding of, the disputes among his economic team members. He found wearisome the mumbo-jumbo from Federal Reserve Chairman Arthur Burns. But the president had determined he would have a unified economic team and a unified economic policy, no matter what the consequences. So the White House dutifully leaked stories designed to undermine and humiliate Burns, as Connally waited in the wings with his "New Economic Policy."

At Camp David, Connally argued: "It's clear that we have to move in the international field, to close the gold window, not change the price of gold, and encourage the dollar to float." Burns timidly objected but was easily flattered by the president. By the evening of Aug. 15, Burns was on board with terminating the last vestige of dollar convertibility to gold, depreciating the dollar on the foreign exchanges, imposing higher tariffs, and ultimately ordering price and wage controls.

Nixon and Safire put together a speech to be televised Sunday night. It had taken only a few hours during that August 1971 weekend for Nixon to decide to sever the nation's last tenuous link to the historic American gold standard, a monetary standard that had been the constitutional bedrock (Article I, Sections 8 and 10) of the American dollar and of America's economic prosperity for much of the previous two centuries.

At least one Camp David participant, Paul Volcker, regretted what transpired that weekend. The "Nixon Shock" was followed by a decade of one of the worst inflations of American history and the most stagnant economy since the Great Depression. The price of gold rose to $800 from $35.

The purchasing power of a dollar saved in 1971 under Nixon has today fallen to 18 pennies. Nixon's new economic policy sowed chaos for a decade. The nation and the world reaped the whirlwind.

Rhetoric, rather than realities

‘Stop Whining’?

T. Sowell

Some of the policies most devastating to blacks have come from liberal Democrats.

If there was ever any doubt that the Democrats take the black vote for granted, that doubt should have been put to rest when Barack Obama told the Congressional Black Caucus, “Stop whining!”

Have you ever before heard either a Democratic or a Republican leader tell his party’s strongest supporters, “Stop whining”?

Blacks have a lot to complain about, not just about this Democratic administration but about many other Democratic administrations, national and local, over the years.

Unfortunately, black voters, like many other voters, often judge by rhetoric, rather than realities. When it comes to racial rhetoric, the Democrats outdo the Republicans by miles.

Even Ronald Reagan, the great communicator, had problems communicating with black voters, as I pointed out years ago in my book A Personal Odyssey (pages 274–278).

All this came back to me during a recent cleanup of my office, which turned up an old yellowed copy of the New York Times with the following front-page headline: “White-Black Disparity in Income Narrowed in 80’s, Census Shows” (July 24, 1992).

How many people in the media have pointed out that the black-white income gap narrowed during the Reagan administration, just as it has widened during the Obama administration? For that matter, how many Republicans have pointed it out?

The Reagan administration did not have any special program to narrow the racial gap in incomes. The point is that the kinds of policies followed in the 1980s had that effect, just as the kinds of policies followed by the Obama administration have had the opposite effect. But just listening to rhetoric won’t tell you that.

Over the years, some of the most devastating policies, in terms of their actual effects on black people, have come from liberal Democrats, from the local to the national level.

As far back as the Roosevelt administration during the Great Depression of the 1930s, liberal Democrats imposed policies that had counterproductive effects on blacks. None cost blacks more jobs than minimum-wage laws.

In countries around the world, minimum-wage laws have a track record of increasing unemployment, especially among the young, the less skilled, and minorities. They have done the same in America.

One of the first acts of the Roosevelt administration was to pass the National Industrial Recovery Act of 1933, which included establishing minimum wages nationwide. It has been estimated that blacks lost 500,000 jobs as a result.

After that act was declared unconstitutional, the Fair Labor Standards Act of 1938 set minimum wages. In the tobacco industry alone, 2,000 black workers were replaced by machines, just as blacks had been replaced by machines in the textile industry after the previous minimum-wage law.

Fortunately, the high inflation of the 1940s raised the wages of even unskilled labor above the level prescribed by the minimum-wage law. The net result was that this law became virtually meaningless, until the minimum-wage rate was raised in 1950.

During the late 1940s, when the minimum-wage law had essentially been repealed by inflation, 16- and 17-year-old blacks in 1948 had an unemployment rate of 9.4 percent, slightly lower than that of whites the same ages and a fraction of what it would be in even the boom years after the minimum-wage rate kept getting increased by liberal Democrats.

Urban renewal was another big Democratic liberal idea. It destroyed mostly low-income minority neighborhoods and replaced them with upscale housing that the former residents could not afford. People by the hundreds of thousands were scattered to the winds, destroying community ties between families, neighbors, and local institutions from churches to family doctors to businesses.

Even when liberal Democrats try specifically to help blacks, the results often backfire. The political crusade for “affordable housing” and minority home ownership drew many blacks into homes they could not afford. The net result was an especially high rate of foreclosure and, in the end, black home-ownership rates lower than they were before the “affordable housing” crusade began.

Listening to political rhetoric often leads to opposite conclusions from those resulting from checking out hard facts — and not just for blacks.