Monday, July 4, 2011

The politics of Envy

The Folly of Soaking the Rich
By Mario Loyola 
The chart Andrew Stiles referred to Friday (from an earlier post by Veronique de Rugy) shows only the start of how counterproductive it is to increase taxes on the wealthy. As a result of lower tax rates on the top income earners, not only do they pay a much larger share of all taxes, but they pay far more in taxes overall, and revenue to the government has increased. This is because lowering taxes on the rich creates more rich people and richer rich people. The federal government collects much more revenue from a 40 percent tax on a large number of very wealthy millionaires than from a 70 percent tax on a small number of less wealthy millionaires.
Every tax has a “revenue-maximizing” rate well short of 100 percent; set the rate above that point and overall tax revenue to the government falls. This is the basic theory behind the Laffer Curve: when the tax rate is zero percent, revenue to the government is (obviously) zero, but when the rate is 100 percent, revenue is also zero, because taxing away all the income of a particular group of people kills all economic activity in that group, leaving nothing to tax. Between those two extremes, revenue to the government rises as the rate climbs from zero but begins to fall as it approaches 100 percent — that is the Laffer Curve.
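The mechanics described above can be sketched with a toy model. Everything below is my own illustrative assumption (a linear taxable base and made-up dollar figures), not anything taken from Laffer's work:

```python
# Toy illustration of a Laffer-style curve (hypothetical numbers, not a real
# tax model): assume the taxable base shrinks as the rate rises, so
# revenue = rate * base(rate) is zero at both 0% and 100% taxation.

def taxable_base(rate):
    """Hypothetical taxable income (in $billions) at a given tax rate.
    At 100% taxation, all taxable activity disappears."""
    return 1000.0 * (1.0 - rate)

def revenue(rate):
    """Government revenue: the rate times the base that rate leaves standing."""
    return rate * taxable_base(rate)

if __name__ == "__main__":
    for pct in (0, 25, 50, 75, 100):
        r = pct / 100.0
        print(f"rate {pct:3d}% -> revenue ${revenue(r):6.1f}B")
    # Prints 0.0 at 0%, 187.5 at 25%, 250.0 at 50%, 187.5 at 75%, 0.0 at 100%:
    # with this linear base, revenue peaks at a 50% rate and falls on either
    # side -- the interior "revenue-maximizing" point.
```

With this assumed base the peak happens to land at exactly 50 percent; any base that shrinks as rates rise produces some interior peak, which is the point of the curve.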
Arthur Laffer and Ford Scudder explore this phenomenon at length in their brilliant series The Onslaught from the Left. In keeping with what Veronique pointed out, they write in Part II of the series:
In the year Ronald Reagan took office (1981), the top 1% of income earners, as reflected by the Adjusted Gross Income of all tax filers, paid 17.58% of all federal income taxes. Twenty-five years later, in 2005, the top 1% paid 39.8% of all income taxes, representing a greater than doubling of the share of tax payments made by this group.
But even more to the point, from 1981 to 2005 the income taxes paid by the top 1% rose from 1.59% of GDP to 2.96% of GDP. In addition to the huge rise in the percent of GDP paid in income taxes by the top 1% of income earners, and the more than doubling of the share of taxes paid by this group, was the huge absolute increase in real taxes (2005 dollars using the GDP price deflator [in other words, adjusting for inflation - ML]) from 1981 through 2005. In 1981 total tax payments from the richest 1% were $98.84 billion, while in 2005 the top 1% paid $368.13 billion in taxes; that’s a 288% increase in 25 years. In rough numbers, that means that each of the richest 1% of filers in 1981 paid a little over $100,000 in 2005 dollars, while in 2005 each filer on average paid over $288,000. And remember that’s inflation-adjusted dollars.
This astonishing statistic is explained by a simple fact. As a result of reducing taxes on the rich, the rich got much richer — so much so that they wound up paying nearly four times as much total tax (and nearly three times as much tax per rich person) as when taxes were higher. 
This also reveals the truth behind the increased income inequality that liberals love to cite as their chief evidence against supply-side economics. In fact, as Laffer explains in Part I of the Onslaught from the Left series, the poor have gotten richer — just not as quickly as the rich have. “The increasingly unequal distribution of income during the era of supply-side economics has resulted from the poor increasing their income at a rate that has not kept pace with the phenomenal gains in income the rich have experienced — not from the poor getting poorer.” He goes on to show that, in fact, lower tax rates have led both to higher income among the bottom 50 percent of income earners and to lower total taxes paid by that group.
Most important of all, of course, is the fact that when the rich get richer, they invest more money in the economy, thereby stimulating economic growth. Democrats generally can’t stomach the rich getting richer, even when it means everyone is better off. But you’d think they would at least propose tax policy that increases government revenue. Alas, they so want to punish the rich that they are even willing to lower government revenue in the process.  

Get the State out of Marriage

The Death of Moral Community
by Patrick J. Buchanan
"The opponents (of same-sex marriage) have no case other than ignorance and misconception and prejudice."
So writes Richard Cohen in his celebratory column about Gov. Andrew Cuomo's role in legalizing gay marriage in New York state.
Now, given that no nation in 20 centuries of Christendom legalized homosexual marriage, and, in this century, majorities in all 31 states where it has been on the ballot have rejected it, Cohen is pretty much saying that, since the time of Christ, Western history has been an endless Dark Age dominated by moral ignoramuses and bigots.
For the belief that homosexuality is unnatural and immoral and same-sex marriage an Orwellian absurdity has always been part of the moral code of Christianity. Gen. George Washington ordered active homosexuals drummed out of his army. Thomas Jefferson equated homosexuality with rape. Not until 2003 did the Supreme Court declare homosexual acts a protected right.
 What is the moral basis of the argument that homosexuality is normal, natural and healthy? In recent years, it has been associated with high levels of AIDS and enteric diseases, and from obits in gay newspapers, early death. Where is the successful society where homosexual marriage was normal?
 Not until the Stonewall riots at a gay bar in Greenwich Village in 1969 was the case broadly made by anyone but the Mattachines of Frank Kameny that homosexuality deserved to be treated as a natural and normal expression of love.
 Still, Cohen is not without a point when he uses the term "prejudice."
  As Albert Einstein observed, "Common sense is the collection of prejudices acquired by age 18." By 14, most boys have learned on the playground there is something disordered about boys sexually attracted to other boys.
 Hence the need for politically correct universities to purge such ideas from young minds and indoctrinate them in the new truths of modernity.
 But are we really wiser than our ancestors? As Edmund Burke wrote of the thinkers of his time:
  "Many of our men of speculation, instead of exploding general prejudices, employ their sagacity to discover the latent wisdom which prevails in them. If they find what they seek and they seldom fail, they think it more wise to continue the prejudice, with the reason involved, than to cast away the coat of prejudice, and to leave nothing but the naked reason."
  Great minds once found merit in the "prejudices," or inherited wisdom, of a people, as a spur to virtuous behavior. Again, Burke:
 "Prejudice is of ready application in an emergency. It previously engages the mind in a steady course of wisdom and virtue, and does not leave the man hesitating in the moment of decision, skeptical and unresolved."
  In our new society from which traditionalists are seceding, many ruling ideas are rooted in an ideology that is at war with Burke's "general prejudices."
  High among them is that homosexuality is natural and normal. That abortion is a woman's right. That all voluntary sexual relations are morally equal. That women and men are equal, and if the former are not equally represented at the apex of academic, military and political life, this can only be the result of invidious discrimination that the law must correct. That all races, religions and ethnic groups are equal and all must have equal rewards.
  Once a nation synonymous with freedom, the new America worships at the altar of equality.
  Writing on the same Washington Post page as Cohen, a day earlier, Greg Sargent exulted in Cuomo's law as "a huge victory ... for equality ... a major defeat for those self-described 'conservatives' who hate government except when it is enforcing a form of legalized discrimination that comports with their prejudices."
  Sargent also has a point. But behind the "prejudices" of conservatives about the moral superiority of traditional marriage are 2,000 years of history and law. What is the intellectual and moral basis of Sargent's notion?
 He claims "majorities of Americans are not prepared to assign sub-par status to the intimate relationships of gays and lesbians."
  Certainly, that is true of the Albany legislature.
But why then does Barack Obama seem so hesitant to embrace gay marriage?
  In 2012, we shall find out who is right politically, when the issue goes on the ballot in battleground states. But is moral truth to be discovered at a ballot box? Do we have no superior moral compass than majority rule?
  "A new kind of America is emerging in the early 21st century," said Archbishop Charles Chaput of Denver last week, "and it's likely to be much less friendly to religious faith than anything in the nation's past."
  He added, pointedly, "If Catholic social services should be forced to alter their Catholic beliefs on marriage, the family, social justice, sexuality (and) abortion," they should terminate those services.
  Prediction: We are entering an era where communities will secede from one another and civil disobedience on moral grounds will become as common as it was in the days of segregation.

Ministry of Peace

Attacking Libya -- and the Dictionary
If Americans Don’t Get Hurt, War Is No Longer War
By Jonathan Schell
The Obama administration has come up with a remarkable justification for going to war against Libya without the congressional approval required by the Constitution and the War Powers Resolution of 1973.
American planes are taking off, they are entering Libyan air space, they are locating targets, they are dropping bombs, and the bombs are killing and injuring people and destroying things. It is war. Some say it is a good war and some say it is a bad war, but surely it is a war.
Nonetheless, the Obama administration insists it is not a war. Why?  Because, according to “United States Activities in Libya,” a 32-page report that the administration released last week, “U.S. operations do not involve sustained fighting or active exchanges of fire with hostile forces, nor do they involve the presence of U.S. ground troops, U.S. casualties or a serious threat thereof, or any significant chance of escalation into a conflict characterized by those factors.”
In other words, the balance of forces is so lopsided in favor of the United States that no Americans are dying or are threatened with dying. War is only war, it seems, when Americans are dying, when we die.  When only they, the Libyans, die, it is something else for which there is as yet apparently no name. When they attack, it is war. When we attack, it is not.
 This cannot be classified as anything but strange thinking and it depends, in turn, on a strange fact: that, in our day, it is indeed possible for some countries (or maybe only our own), for the first time in history, to wage war without receiving a scratch in return. This was nearly accomplished in the bombing of Serbia in 1999, in which only one American plane was shot down (and the pilot rescued).
The epitome of this new warfare is the Predator drone, which has become an emblem of the Obama administration. Its human operators can sit at Creech Air Force Base in Nevada or in Langley, Virginia, while the drone floats above Afghanistan or Pakistan or Yemen or Libya, pouring destruction down from the skies. War waged in this way is without casualties for the one waging it, because none of its soldiers are near the scene of battle -- if that is even the right word for what is going on.
Some strange conclusions follow from this strange thinking and these strange facts. In the old scheme of things, an attack on a country was an act of war, no matter who launched it or what happened next.  Now, the Obama administration claims that if the adversary cannot fight back, there is no war.
It follows that adversaries of the United States have a new motive for, if not equaling us, then at least doing us some damage.  Only then will they be accorded the legal protections (such as they are) of authorized war.  Without that, they are at the mercy of the whim of the president.
The War Powers Resolution permits the president to initiate military operations only when the nation is directly attacked, when there is “a national emergency created by attack upon the United States, its territories or possessions, or its armed forces.”  The Obama administration, however, justifies its actions in the Libyan intervention precisely on the grounds that there is no threat to the invading forces, much less the territories of the United States.
There is a parallel here with the administration of George W. Bush on the issue of torture (though not, needless to say, a parallel between the Libyan war itself, which I oppose but whose merits can be reasonably debated, and torture, which was wholly reprehensible). President Bush wanted the torture he was ordering not to be considered torture, so he arranged to get lawyers in the Justice Department to write legal-sounding opinions excluding certain forms of torture, such as waterboarding, from the definition of the word. Those practices were thenceforward called “enhanced interrogation techniques.”
Now, Obama wants his Libyan war not to be a war and so has arranged to define a certain kind of war -- the American-casualty-free kind -- as not war (though without even the full support of his own lawyers). Along with Libya, a good English word -- war -- is under attack.
In these semantic operations of power upon language, a word is separated from its commonly accepted meaning. The meanings of words are one of the few common grounds that communities naturally share. When agreed meanings are challenged, no one can use the words in question without stirring up spurious “debates,” as happened with the word torture. For instance, mainstream news organizations, submissive to George Bush’s decisions on the meanings of words, stopped calling waterboarding torture and started calling it other things, including “enhanced interrogation techniques,” but also “harsh treatment,” “abusive practices,” and so on.
Will the news media now stop calling the war against Libya a war?  No euphemism for war has yet caught on, though soon after launching its Libyan attacks, an administration official proposed the phrase “kinetic military action” and more recently, in that 32-page report, the term of choice was “limited military operations.” No doubt someone will come up with something catchier soon.
How did the administration twist itself into this pretzel? An interview that Charlie Savage and Mark Landler of the New York Times held with State Department legal advisor Harold Koh sheds at least some light on the matter.  Many administrations and legislators have taken issue with the War Powers Resolution, claiming it challenges powers inherent in the presidency. Others, such as Bush administration Deputy Assistant Attorney General John Yoo, have argued that the Constitution’s plain declaration that Congress “shall declare war” does not mean what most readers think it means, and so leaves the president free to initiate all kinds of wars.
Koh has long opposed these interpretations -- and in a way, even now, he remains consistent. Speaking for the administration, he still upholds Congress’s power to declare war and the constitutionality of the War Powers Resolution. “We are not saying the president can take the country into war on his own,” he told the Times. “We are not saying the War Powers Resolution is unconstitutional or should be scrapped or that we can refuse to consult Congress. We are saying the limited nature of this particular mission is not the kind of ‘hostilities’ envisioned by the War Powers Resolution.”
In a curious way, then, a desire to avoid challenge to existing law has forced assault on the dictionary. For the Obama administration to go ahead with a war lacking any form of Congressional authorization, it had to challenge either law or the common meaning of words. Either the law or language had to give.
It chose language.

Masquerading Petty Politics as Principles

America may be coming to its senses.
By Andrew J. Bacevich
At periodic intervals, the American body politic has shown a marked susceptibility to messianic fevers. Whenever an especially acute attack occurs, a sort of delirium ensues, manifesting itself in delusions of grandeur and demented behavior.
By the time the condition passes and a semblance of health is restored, recollection of what occurred during the illness tends to be hazy. What happened? How’d we get here? Most Americans prefer not to know. No sense dwelling on what’s behind us. Feeling much better now! Thanks!
Gripped by such a fever in 1898, Americans evinced an irrepressible impulse to liberate oppressed Cubans. By the time they’d returned to their senses, having acquired various parcels of real estate between Puerto Rico and the Philippines, no one could quite explain what had happened or why. (The Cubans meanwhile had merely exchanged one set of overseers for another.)
In 1917, the fever suddenly returned. Amid wild ravings about waging a war to end war, Americans lurched off to France. This time the affliction passed quickly, although the course of treatment proved painful: confinement to the charnel house of the Western Front, followed by bitter medicine administered at Versailles.
The 1960s brought another bout (and so yet more disappointment). An overwhelming urge to pay any price, bear any burden landed Americans in Vietnam. The fall of Saigon in 1975 seemed, for a brief interval, to inoculate the body politic against any further recurrence. Yet the salutary effects of this “Vietnam syndrome” proved fleeting. By the time the Cold War ended, Americans were running another temperature, their self-regard reaching impressive new heights. Out of Washington came all sorts of embarrassing gibberish about permanent global supremacy and history’s purpose finding fulfillment in the American way of life.
Give Me Fever
Then came 9/11 and the fever simply soared off the charts. The messiah-nation was really pissed and was going to fix things once and for all.
Nearly 10 years have passed since Washington set out to redeem the Greater Middle East.  The crusades have not gone especially well. In fact, in the pursuit of its saving mission, the American messiah has pretty much worn itself out.
Today, the post-9/11 fever finally shows signs of abating. The evidence is partial and preliminary. The sickness has by no means passed. Oddly, it lingers most strongly in the Obama White House, of all places, where a keenness to express American ideals by dropping bombs seems strangely undiminished.
Yet despite the urges of some in the Obama administration, after nearly a decade of self-destructive flailing about, American recovery has become a distinct possibility. Here’s some of the evidence:
In Washington, it’s no longer considered a sin to question American omnipotence. Take the case of Robert Gates. The outgoing secretary of defense may well be the one senior U.S. official of the past decade to leave office with his reputation not only intact, but actually enhanced. (Note to President Obama: think about naming an aircraft carrier after the guy). Yet along with restoring a modicum of competence and accountability to the Pentagon, the Gates legacy is likely to be found in his willingness — however belated — to acknowledge the limits of American power.
That the United States should avoid wars except when absolutely necessary no longer connotes incipient isolationism. It is once again a sign of common sense, with Gates a leading promoter. Modesty is becoming respectable.
The Gates Doctrine
No one can charge Gates with being an isolationist or a national security wimp. Neither is he a “declinist.” So when he says anyone proposing another major land war in the Greater Middle East should “have his head examined” — citing the authority of Douglas MacArthur, no less — people take notice. Or more recently there was this: “I’ve got a military that’s exhausted,” Gates remarked, in one of those statements of the obvious too seldom heard from on high. “Let’s just finish the wars we’re in and keep focused on that instead of signing up for other wars of choice.” Someone should etch that into the outer walls of the Pentagon’s E-ring.
A half-dozen years ago, “wars of choice” were all the rage in Washington. No more. Thank you, Mr. Secretary.
Or consider the officer corps. There is no “military mind,” but there are plenty of minds in the military, and some number of them are changing.
Evidence suggests that the officer corps itself is rethinking the role of military power. Consider, for example, “Mr. Y,” author of A National Strategic Narrative, published this spring to considerable acclaim by the Woodrow Wilson Center for Scholars. The actual authors of this report are two military professionals, one a navy captain, the other a Marine colonel.
What you won’t find in this document are jingoism, braggadocio, chest-thumping, and calls for a bigger military budget. If there’s an overarching theme, it’s pragmatism. Rather than the United States imposing its will on the world, the authors want more attention paid to the investment needed to rebuild at home.
The world is too big and complicated for any one nation to call the shots, they insist. The effort to do so is self-defeating. “As Americans,” Mr. Y writes, “we needn’t seek the world’s friendship or proselytize the virtues of our society. Neither do we seek to bully, intimidate, cajole, or persuade others to accept our unique values or to share our national objectives. Rather, we will let others draw their own conclusions based upon our actions… We will pursue our national interests and let others pursue theirs…”
You might dismiss this as the idiosyncratic musing of two officers who have spent too much time having their brains baked in the Iraqi or Afghan sun. I don’t. What convinces me otherwise is the positive email traffic that my own musings about the misuse and abuse of American power elicit weekly from serving officers. It’s no scientific sample, but the captains, majors, and lieutenant colonels I hear from broadly agree with Mr. Y.  They’ve had a bellyful of twenty-first-century American war and are open to a real debate over how to overhaul the nation’s basic approach to national security.
Intelligence Where You Least Expect It
And finally, by gum, there is the United States Congress. Just when that body appeared to have entered a permanent vegetative state, a flickering of intelligent life has made its reappearance. Perhaps more remarkably still, the signs are evident on both sides of the aisle as Democrats and Republicans alike — albeit for different reasons — are raising serious questions about the nation’s propensity for multiple, open-ended wars.
Some members cite concerns for the Constitution and the abuse of executive power.  Others worry about the price tag. With Osama bin Laden out of the picture, still others insist that it’s time to rethink strategic priorities. No doubt partisan calculation or personal ambition figures alongside matters of principle.  They are, after all, politicians.
Given what polls indicate is a growing public unhappiness over the Afghan War, speaking out against that war these days doesn’t exactly require political courage. Still, the possibility of our legislators reasserting a role in deciding whether or not a war actually serves the national interest — rather than simply rubberstamping appropriations and slinking away — now presents itself. God bless the United States Congress.
Granted, the case presented here falls well short of being conclusive. To judge by his announcement of a barely-more-than-symbolic troop withdrawal from Afghanistan, President Obama himself seems uncertain of where he stands. And clogging the corridors of power or the think tanks and lobbying arenas that surround them are plenty of folks still hankering to have a go at Syria or Iran.
At the first signs of self-restraint, you can always count on the likes of Senator John McCain or the editorial board of the Wall Street Journal to decry (in McCain’s words) an “isolationist-withdrawal-lack-of-knowledge-of-history attitude” hell-bent on pulling up the drawbridge and having Americans turn their backs on the world. In such quarters, fever is a permanent condition and it’s always 104 and rising. Yet it is a measure of just how quickly things are changing that McCain himself, once deemed a source of straight talk, now comes across as a mere crank.
In this way, nearly a decade after our most recent descent into madness, does the possibility of recovery finally beckon.

Legislating Prosperity

The Great Gov't.-Induced Homeownership Bubble
by M. Perry
In his latest column, George Will reviews the scalding new book “Reckless Endangerment” by New York Times columnist Gretchen Morgenson and housing-finance expert Joshua Rosner. Here’s an excerpt from George Will:
"The book is another cautionary tale about government’s terrifying self-confidence. It is, the authors say, “a story of what happens when Washington decides, in its infinite wisdom, that every living, breathing citizen should own a home.”
The 1977 Community Reinvestment Act pressured banks to relax lending standards to dispense mortgages more broadly across communities. In 1992, the Federal Reserve Bank of Boston purported to identify racial discrimination in the application of traditional lending standards to those, Morgenson and Rosner write, “whose incomes, assets, or abilities to pay fell far below the traditional homeowner spectrum.”

In 1994, Bill Clinton proposed increasing homeownership through a “partnership” between government and the private sector, principally orchestrated by Fannie Mae, a “government-sponsored enterprise” (GSE). It became a perfect specimen of what such “partnerships” (e.g., General Motors) usually involve: Profits are private, losses are socialized.

There was a torrent of compassion-speak: “Special care should be taken to ensure that standards are appropriate to the economic culture of urban, lower-income, and nontraditional consumers.” “Lack of credit history should not be seen as a negative factor.” Government having decided to dictate behavior that markets discouraged, the traditional relationship between borrowers and lenders was revised. Lenders promoted reckless borrowing, knowing they could offload risk to purchasers of bundled loans, and especially to Fannie Mae. In 1994, subprime lending was $40 billion. In 1995, almost one in five mortgages was subprime. Four years later such lending totaled $160 billion.

By 2003, the government was involved in financing almost half — $3.4 trillion — of the home-loan market. Not coincidentally, by the summer of 2005, almost 40 percent of new subprime loans were for amounts larger than the value of the properties."

MP: The chart above helps tell the story, showing graphically the unprecedented, government-induced rise in homeownership from less than 64% in 1994 to more than 69% in 2004, a 5.4 percentage point increase in only one decade. In many ways, what has been called the "housing bubble" was at the same time an unsustainable "homeownership bubble," fueled by the political obsession with homeownership, and the bursting of the home-price bubble was likewise the bursting of the "homeownership bubble," as the graph clearly demonstrates.

Self-interest cancels out objectivity.

The Smartest Guys in the Room

Our natural tendency is to listen to successful people on any financial subject. They must know something that we don’t or they wouldn’t be rich. And, we feel, Rich = Smart.
Should we listen to them?
I’ve given this question a lot of thought in the past year. I’ve always been a bit of an ideologue – well, not just a “bit.” I believe ideas mean something, in that you’ve got to have some basis or theory through which the world makes sense to you.
To determine what’s right or wrong, most of us fall back on our own judgment in areas we understand. Or think we understand. When we don’t really understand something, we listen to those whom we think do understand. In the financial world we listen to Buffett, and Paulson, and Cramer, and the like, and, generally, anyone we know who’s made a ton of money.
But if they are so smart, how come they have been so wrong? And I mean really wrong. So wrong that we are in the most serious financial crisis since the Great Depression.
I don’t mean to demean the above-named gentlemen, because they have been very, very successful and deserve accolades for that. But even Berkshire is down 31%.
I think we should pause and think about how we got into this mess. More specifically: how we let ourselves be convinced that our financial leaders, public and private, were right when we now clearly see that they were very wrong.
Maybe they were just lucky
I’ve been working on a list of all the hedge funds and investment banks that have gone broke, are about to go broke, or have lost a ton of money for their clients, and of the investment stars who have been fired for some of the foregoing acts. It’s a very long list. (See The Problems, below.) And I just started looking. What you see are some of the headlines I’ve pulled out of Dow Jones Financial News Online’s daily e-mail from just the last six weeks or so.
Let us take for granted that these are all very bright men, that they have vast knowledge of the financial markets, that they have operated honestly and in good faith, and that at some point they made a lot of money for their investors and clients.
Now we find that they’ve jumped off the bridge attached to a rope that’s tied around our necks.
Why do we listen to them? Are they right about economics when they are successful and wrong when they are not?
Could it be that they were just lucky – in an up market?
I’ve decided that there are only good ideas, bad ideas, and luck. And really, it’s mostly luck.
Nassim Taleb, in his books The Black Swan and Fooled by Randomness, has addressed this issue brilliantly. He tackles the question of “how do we know what we know” through social-science findings that bear on it, a branch of science and philosophy called epistemology. He shows that there are behavioral and statistical reasons why we all make the same mistakes. That is, the nature of the human mind and personality makes us prone to make the same mistakes over and over.
His basic premise is that financial experts underestimate risk. As such, they are caught by surprise when some significant unforeseen event occurs. Such events can be good or bad, but it is the bad ones that can kill you. The surprise is that these kinds of events, which he calls Black Swans, occur unpredictably but regularly.
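That core claim, that risk estimated from a limited historical sample understates fat-tailed risk, can be illustrated with a small simulation. The distribution and every number below are my own hypothetical choices, not anything from Taleb's books:

```python
# Sketch of the risk-underestimation idea (hypothetical numbers): daily
# "returns" drawn from a fat-tailed distribution look deceptively calm in a
# short historical sample, so risk estimated from that sample can entirely
# miss the rare extreme loss.
import random

random.seed(42)

def fat_tailed_draw():
    # Most days: a small Gaussian move. Roughly 1 day in 1,000: a huge loss.
    return -50.0 if random.random() < 0.001 else random.gauss(0.0, 1.0)

short_sample = [fat_tailed_draw() for _ in range(250)]    # ~one trading year
long_sample = [fat_tailed_draw() for _ in range(100_000)]  # the true regime

print("worst day in 1-year sample:   ", round(min(short_sample), 2))
print("worst day in 100k-day sample: ", round(min(long_sample), 2))
# The short sample will usually show a worst day of only a few units, while
# the long sample almost surely contains the -50 event: the risk a model
# calibrated on the short history "never saw."
```

The lesson is not the specific numbers but the asymmetry: a year of quiet data says nothing reliable about an event that strikes once every few years.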

Incentives Matter

Phases of a credit crunch
Posted by The Dinocrat  on Sunday, December 9th, 2007 at 11:27 am
Phase One
Even within the last year, you may recall driving down the highway listening to one of the many mortgage ads on the radio and hearing the phrase: “No income verification.” Perhaps you thought: “That seems a little odd. I wonder how they do that. They must have very sophisticated computer models.”
Well, the phrase is still around, but for the most part the loans aren’t. And therein lies a tale. In the olden times, say, 20 years ago, you went to the bank to get a mortgage when you bought a house. The loan officer went over your tax returns, credit report, bank statements, proof of down payment and all the rest, and the loan committee decided if you were creditworthy. That all ended when mortgage securitization was invented and then became widespread, expanding massively in this decade, as interest rates reached historic lows.
It is much easier to use computer models than credit judgment. And computer models showed fascinating facts about mortgage holders and defaults. They showed consistent correlations between different credit ratings and mortgage default rates, for example. So, using various inputs of this sort, computer models of large numbers of mortgages showed that only a small portion of mortgage holders defaulted and that most mortgages were as good as gold. This allowed smart fellows to divide up, say, $1 billion in mortgages into various groups, or tranches (see Allan Sloan’s article).
The tranches then got ratings from the credit agencies, well known entities with long histories like S&P and Moody’s. Based on various metrics gleaned from statistical history, large numbers of these mortgages were rated AAA, and lesser mortgage pools got lower ratings, etc. So then the mortgage tranches could be sold to different groups who wanted AAA paper at a low yield, or riskier paper at higher yield, etc.
But the computer models were wrong. They made, in essence, the same mistake that Mike Milken made in his analysis many years ago of the low default rates of junk bonds. It was true that junk bonds had historically low default rates, but of course that was not an Iron Law. In fact issuing vast numbers of junk bonds changed the very nature of the game, attracting a group of wise guys who sought to exploit the fact that investors had substituted a computer model for common sense. So it became with subprime mortgages, when a flood of new subprime borrowers changed the game and created the certainty that future default rates would vastly exceed those from loans made in a previous time of prudent lending.

“The louder he talked of his honor, the faster we counted our spoons.”

Burning down the house
By G. Will
The louder they talked about the disadvantaged, the more money they made. And the more the financial system tottered.
Who were they? Most explanations of the financial calamity have been indecipherable to people not fluent in the language of “credit default swaps” and “collateralized debt obligations.” The calamity has lacked human faces. No more.
Put on asbestos mittens and pick up “Reckless Endangerment,” the scalding new book by Gretchen Morgenson, a New York Times columnist, and Joshua Rosner, a housing finance expert. They will introduce you to James A. Johnson, an emblem of the administrative state that liberals admire.
The book’s subtitle could be: “Cry ‘Compassion’ and Let Slip the Dogs of Cupidity.” Or: “How James Johnson and Others (Mostly Democrats) Made the Great Recession.” The book is another cautionary tale about government’s terrifying self-confidence. It is, the authors say, “a story of what happens when Washington decides, in its infinite wisdom, that every living, breathing citizen should own a home.”
The 1977 Community Reinvestment Act pressured banks to relax lending standards to dispense mortgages more broadly across communities. In 1992, the Federal Reserve Bank of Boston purported to identify racial discrimination in the application of traditional lending standards to those, Morgenson and Rosner write, “whose incomes, assets, or abilities to pay fell far below the traditional homeowner spectrum.”
In 1994, Bill Clinton proposed increasing homeownership through a “partnership” between government and the private sector, principally orchestrated by Fannie Mae, a “government-sponsored enterprise” (GSE). It became a perfect specimen of what such “partnerships” (e.g., General Motors) usually involve: Profits are private, losses are socialized.
There was a torrent of compassion-speak: “Special care should be taken to ensure that standards are appropriate to the economic culture of urban, lower-income, and nontraditional consumers.” “Lack of credit history should not be seen as a negative factor.” Government having decided to dictate behavior that markets discouraged, the traditional relationship between borrowers and lenders was revised. Lenders promoted reckless borrowing, knowing they could offload risk to purchasers of bundled loans, and especially to Fannie Mae. In 1994, subprime lending was $40 billion. In 1995, almost one in five mortgages was subprime. Four years later such lending totaled $160 billion.
As housing prices soared, many giddy owners stopped thinking of homes as retirement wealth and started using them as sources of equity loans — up to $800 billion a year. This fueled incontinent consumption.
Under Johnson, an important Democratic operative, Fannie Mae became, Morgenson and Rosner say, “the largest and most powerful financial institution in the world.” Its power derived from the unstated certainty that the government would be ultimately liable for Fannie’s obligations. This assumption and other perquisites were subsidies to Fannie Mae and Freddie Mac worth an estimated $7 billion a year. They retained about a third of this.
Morgenson and Rosner report that in 1998, when Fannie Mae’s lending hit $1 trillion, its top officials began manipulating the company’s results to generate bonuses for themselves. That year Johnson’s $1.9 million bonus brought his compensation to $21 million. In nine years, Johnson received $100 million.
Fannie Mae’s political machine dispensed campaign contributions, gave jobs to friends and relatives of legislators, hired armies of lobbyists (even paying lobbyists not to lobby against it), paid academics who wrote papers validating the homeownership mania, and spread “charitable” contributions to housing advocates across the congressional map.
By 2003, the government was involved in financing almost half — $3.4 trillion — of the home-loan market. Not coincidentally, by the summer of 2005, almost 40 percent of new subprime loans were for amounts larger than the value of the properties.
Morgenson and Rosner find few heroes, but two are Marvin Phaup and June O’Neill. These “digit-heads” and “pencil brains” (a Fannie Mae spokesman’s idea of argument) with the Congressional Budget Office resisted Fannie Mae pressure to kill a report critical of the institution.
“Reckless Endangerment” is a study of contemporary Washington, where showing “compassion” with other people’s money pays off in the currency of political power, and currency. Although Johnson left Fannie Mae years before his handiwork helped produce the 2008 bonfire of wealth, he may be more responsible for the debacle and its still-mounting devastations — of families, endowments, etc. — than any other individual. If so, he may be more culpable for the peacetime destruction of more wealth than any individual in history.
Morgenson and Rosner report. You decide.

Banned in San Francisco

A Guide to What the City Has Rejected
By ADAM MARTIN
San Francisco City Hall has been making headlines this week for considering a law prohibiting the sale of any animals as pets in the city. The proposed ban, expanded from a bill meant to shut down puppy and kitten farms, would include fish, reptiles, birds, insects, and rodents. Quipped the Los Angeles Times, "If it flies, crawls, runs, swims or slithers, you would not be able to buy it in the city named for the patron saint of animals." This is, of course, not the first relatively normal thing that San Francisco has tried to ban. In its unending quest to be a place of populist values that represent the under-served, give voice to the voiceless, improve the quality of life for all, and promote peace and the environment, the city sometimes legislates itself into ridiculousness. Here's a look at some of the more notable things in recent memory that San Francisco has tried to ban, sometimes with success.
Handguns: A measure passed by city voters in 2005 banned the possession of handguns for private use. It also prohibited the manufacture, sale, or distribution of any type of firearms within the city limits of San Francisco. The National Rifle Association immediately sued to have the ban overturned, and in 2008, after a series of decisions and appeals, the California Supreme Court decided the ban was unconstitutional.
Circumcision: Activists have gotten enough signatures on a proposed ballot measure banning the practice of circumcision to add it to this fall's election. The proposed ban quickly became the subject of controversy by critics who said it would be an institutionalized form of antisemitism. Last week, a coalition of Jews and Muslims sued the city's elections commission to have the measure removed from the ballot before the election. "There are anti-Semitic overtones to the initiative ... that adds to the perception of threat by the Jewish community," attorney Michael Jacobs said.
Plastic shopping bags: In 2007, the city's board of supervisors successfully passed the nation's first ban on plastic shopping bags at supermarkets and large chain pharmacies. Unlike some of the other bans that have received snickers from the rest of the country, this one worked well, reducing the number of bags consumed in the city by about five million a month in its first year. Now, Portland, Oregon, Boston, and other cities have also instituted their own bans.
Smoking: While plenty of cities and states have prohibited smoking indoors, San Francisco was at the vanguard of that movement. Since California banned smoking in most indoor public places in 1994, and in bars and nightclubs in 1998, San Francisco has gone much further. The most recent extension of its citywide smoking ban, passed in March 2010, prohibits lighting up at ATMs, pretty much any line in which you're waiting, outdoor seating at cafes, and some semi-enclosed bar patios. The ban passed the board of supervisors unanimously, and anti-smoking activists are eager to expand it even further.
Bike lanes: San Francisco prides itself on its environmentally conscious attitude and legislation. That means encouraging bicycling and other alternative transit, but it also means strict adherence to California's requirement that an environmental impact review be conducted before any major construction project. So when the city wrote and began to implement a comprehensive plan to paint more bike lanes, add bike parking, and establish new bicycle routes, all it took was one man, Rob Anderson, to sue for it to be halted. Anderson argued that the plan, like any effort to change the look and feel of the streetscape, required an environmental impact report, which the city had not done. A judge enjoined the plan's implementation in 2006, and it wasn't allowed to go forward until August 2010.
Happy Meals: In November 2010, the board of supervisors voted by a veto-proof margin to prohibit restaurants from giving out free toys with meals whose calories, fat, and sugar exceeded set levels. If a restaurant does want to give out a free toy with a meal, it must also come with fruit and vegetables.
Battleships: In 2006, the U.S. Navy decommissioned the U.S.S. Iowa, one of the biggest battleships it had ever sailed. It wanted to give the ship to San Francisco to use as a tourist attraction, but the board of supervisors voted against accepting it. USA Today reported at the time: "Supervisors who oppose the offer say they don't want a ship from a military in which openly gay men and women cannot serve. They also say they don't want it because they oppose the Iraq war, which city voters condemned in a 2004 ballot question."
Sitting or lying on the sidewalk: Last November, voters approved a measure prohibiting people from sitting or lying on the sidewalk. San Francisco has a large homeless population, and dealing with that frequently puts its principles at odds. The city wants to care for its under-served, but it wants to promote good quality of life for all its citizens, and that does not mean encouraging aggressive panhandlers. The upshot: A majority voted to ban sidewalk loafing once and for all.
Phone Books: In May, San Francisco Mayor Ed Lee signed into law a bill that bans the unsolicited delivery of the Yellow Pages phone book. Nobody seems to be all that upset about it, except possibly the Yellow Pages folks.

Voting with their feet

What Explains The Middle-Class Black Exodus From The Northeast?
By K. Smith
Middle-class blacks whose forebears fled the South in mid-century are now leaving Northern cities like New York and heading for Dixie. Why ever could that be?
Fear not: The sleuths from The New York Times are on the case. Exploring why some black residents of middle-class Queens might be inclined to uproot their families, turn away from their friends, sell property and start over with new careers or firms 1,000 miles away, the Times discovered such banal and limp reasons as New York City having “lost its cachet with black people,” Atlanta and other Southern cities being less of “a struggle to survive,” the South offering the nourishment of “emotional and spiritual roots,” and a “more relaxed and comfortable life.” Black New Yorkers have even supposedly soured on race relations in the North. (This last reason seems to be the viewpoint of a single interviewee who had an unpleasant run-in with the NYPD and says she is heading to Charlotte, N.C., in search of a more genial constabulary.)
Of 44,000 blacks leaving New York State in 2009, says a survey conducted on behalf of the Times, more than half have gone to the South. If there is a reverse Great Migration beginning to get underway, though, it is taking place for the same reason blacks flooded to the north in the first place — economics.
The Times story contains only one economic figure, near the bottom. But it’s a whopper: $150,000. This is the price former Queens resident Danitta Ross paid for a seven-room house, including a three-car garage and a piece of land, in Atlanta four years ago. And how nicely does this passage typify economics-based reporting for the Times? “The migration of middle-class African-Americans is helping to depress already falling housing prices. It is also depriving the black community of investment and leadership from some of its most educated professionals, black leaders say.” That’s one way of looking at it, but housing prices are presumably rising, and the black community being reinvigorated with new talent, in the cities that are best able to attract new residents. Why should not cities that are most welcoming to the black middle class be so rewarded?
Mired in its anecdotage, the New York Times doesn’t, or would prefer not to, recognize what is obvious. As northern cities gradually choke themselves with red tape and taxation, southern hospitality toward business becomes increasingly attractive. A 2010 KPMG study reported that among 22 large cities, Atlanta ranks as the second-cheapest place to do business after Tampa (the bottom five are Boston, San Diego, Los Angeles, New York City and, in last place, San Francisco).

Sunday, July 3, 2011

Without a credible currency, an urbanized nation can quickly descend into chaos—and even starvation.

Our Big, Fat Greek Bankruptcy
By L. Woodhill
What do you do when your economy is in a power dive and the ground is rushing up to meet you?  If you’re Greece, you turn on your austerity afterburners so that you can blast out a bigger impact crater when you crash.
As this is written, the Greek parliament is doing another austerity rain dance, seeking to appease the bailout gods and obtain a few billion more euros to shovel into the unionized money incinerator that is the Greek public sector.  No matter.  The Greek economy is contracting so fast that EU/IMF bailouts have a shorter half-life than iodine-125.
In modern economies, the effects of government policies show up first and fastest in employment.  Greece reports its monthly employment numbers two months slower than the U.S., but the pattern is clear.  The Greek economic situation is deteriorating so fast that reporters are writing silly things like the following, which was published on June 8:
March jobless rate hits 16.2%, new record.  The European Union expects it to average out at 14.6% this year and hit 14.8% in 2012.
Does no one think it odd that the E.U. expects Greek unemployment to average 14.6% for all of 2011 when it registered 15.1% in January, 15.9% in February, and 16.2% in March?
As of March 2011, total employment in Greece was down by 9.3% from its October 2008 peak, and was still falling.  In contrast, in the case of the U.S. recession, total employment fell by 5.9% from its November 2007 peak to its December 2009 trough, and then rebounded 1.5% by March 2011.
Given that not one worker in Greece’s bloated public sector has yet lost his job to “austerity,” the employment numbers imply that the Greek private sector is melting faster than the Wicked Witch of the West in a hot tub.  Because the Greek private sector has to both support the huge Greek public sector and to service the Greek government’s debt, this is probably not a good thing.
Here is a quote from another news story published on June 8:
The May 2010 agreement between the IMF/EC/ECB and the Greek government projected a GDP drop of 4% in 2010, followed by a contraction of 2.6% in 2011.  In reality, GDP dropped by 4.5% (in 2010), leading to a revised forecast for 2011 at -3%.  So far, in Q1 2011, GDP has dropped by 4.8% year-on-year, which makes the revised 3% contraction for 2011 seem optimistic.
Seem optimistic?
Right now, Greeks are rioting in the streets as Prime Minister Papandreou struggles to push through yet another “austerity plan”, this one calling for an additional 3.8 billion euros in spending cuts.  It appears that he will succeed.  However, at the rate that the Greek economy appears to be contracting, this would offset falling revenues for less than a year.  Then what?
No one seems to have noticed that the tax increases included in previous austerity programs have pitched the Greek economy into a violent contraction.  The plan being debated now includes even more tax hikes.  Despite all of this, the EU’s financial projections assume that Greek GDP will shrink by only 3% in 2011, and then will grow by 1.1% in 2012.  If, instead, the Greek economy were to continue to contract at a 4.8% rate, in 2012 real GDP would be 7.6% smaller than the EU is expecting, and 11.5% less than it was in 2009.
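The compounding behind those figures can be checked with a quick back-of-envelope sketch (a hypothetical illustration in Python; the growth rates are the ones quoted in the paragraph above):

```python
# EU baseline for Greek real GDP: -3% in 2011, then +1.1% growth in 2012.
eu_2012 = (1 - 0.03) * (1 + 0.011)

# Alternative path: the economy keeps contracting at 4.8% per year.
contract_2012 = (1 - 0.048) ** 2

# How far below the EU's 2012 expectation the continued-contraction
# path would land.
gap = 1 - contract_2012 / eu_2012
print(f"{gap:.1%}")  # -> 7.6%
```

Two more years of 4.8% contraction would indeed leave 2012 GDP about 7.6% smaller than the EU's projection, matching the article's figure.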
Social order in Greece will break down before GDP shrinks to 88.5% of its 2009 level.  Ordinary people won’t accept self-inflicted economic wounds of this scale.  If the Papandreou government continues on its austerity kamikaze mission, it will eventually fall.
With debt equal to more than 150% of GDP and a rapidly contracting economy, Greece must choose between declining the EU/IMF bailout and defaulting now, or imposing more austerity, getting more loans, and defaulting a few months from now.  So, why would the Greek government choose to go through all of this agony just to buy a few months?  And why would it want to pile on more austerity when what is needed is a program for economic growth?
It could be because Greek elites have not yet moved all of their capital out of the country.
Greek banks are frantically borrowing euros from the ECB, using Greek government bonds (valued at par, not at market) as collateral.  These ECB loans make it possible for Greeks to withdraw euros from Greek banks and transfer them abroad.  The moment that Greece defaults, its bonds will no longer be eligible for use as ECB collateral, the Greek banking system will collapse, and this process will screech to a halt.  Greeks with money may not want this to happen—at least not right now.
However, let’s imagine that the Greeks wanted to change course and try to save their country.  What should their new economic plan be?  Before the Greeks find themselves going hungry, they might try “going Hungary.”
In mid-2010, both Greece and Hungary were in financial trouble and were being pressured to adopt “austerity” measures in return for bailout loans.  While Greece chose to drink the IMF/EU tax-hike hemlock, Hungary declined the pact proffered by the IMF devil.
Greece raised taxes in the name of “austerity”, while Hungary embarked on a radical tax reform program that included a 16% flat income tax and a 10% corporate income tax for small and medium-sized companies.  Let’s see which approach produced better results.
In 2010, Hungary’s GDP rose by 1.2%, while Greece’s GDP fell by 4.5%.  While Greece’s economy is expected (by the EU) to contract by 3.0% in 2011, Hungary’s is forecasted (by the IMF) to grow by 2.8%.  From January 2011 to March 2011, Greece’s unemployment rate increased from 15.1% to 16.2%, while joblessness in Hungary fell from 12.1% to 11.6%.
The whole point of austerity is to improve a country’s ability to pay its debts.  However, all that a year of austerity did for Greece was to raise the market interest rate on its 10-year bonds from 10.5% to 16.8%.  In contrast, the interest rate on Hungary’s 10-year bonds fell from 7.7% to 7.4% over the same time period.
Some economists say that the key to getting the Greek economy growing again would be to replace the euro with a “new drachma”, which could then be devalued in order to improve Greek “competitiveness”.  In this light, it is interesting to note that over the past year, Hungary’s currency, the forint, has actually risen by almost 6% against the euro.  Accordingly, Hungary’s economic progress was not produced by devaluing its currency.
Of course, in mid-2010, Hungary was in an economic/financial position where it could refuse bailout loans without defaulting on its debt.  In mid-2011, Greece is not.  However, as things stand, Greece is going to default.  At some point, we are going to have a big, fat, Greek bankruptcy.  The only question is whether the coming default is used to make things better for Greece, or is allowed to make things worse.
For Greece to truly recover, it must do now whatever it takes to get its economy growing now.  However, whatever else it does, Greece must stick with the euro.  Without a credible currency, an urbanized nation can quickly descend into chaos—and even starvation.
Broadly speaking, Greece needs to do the same things that the U.S. needs to do.  It must enforce the rule of law, expand economic freedom, maintain a stable currency, reduce and simplify taxes, cut government spending, open up trade, and reform burdensome regulations.  This path would not be (politically) easy for the U.S., and it may or may not even be possible for Greece. 
We shall see.

Never Enough

America’s Ever Expanding Welfare Empire
By P. Ferrara
A fundamental misconception about America’s welfare state misleads millions of voters to reflexively support ever bigger and more generous government. William Voegeli fingers the attitude in his book, Never Enough: America’s Limitless Welfare State: “no matter how large the welfare state, liberal politicians and writers have accused it of being shamefully small” and “contemptibly austere.”
Barbara Ehrenreich expresses the attitude in her book, Nickel and Dimed: “guilt doesn’t go anywhere near far enough; the appropriate emotion is shame” regarding the stingy miserliness of America’s welfare state. In light of the current budget debate, with House Budget Committee Chairman Paul Ryan putting fundamental entitlement reform on the table, this misconception especially needs to be corrected.
America’s welfare state is not a principality. It is a vast empire bigger than the entire budgets of almost every other country in the world. Just one program, Medicaid, cost the federal government $275 billion in 2010, which is slated to rise to $451 billion by 2018. Counting state Medicaid expenditures, this one program cost taxpayers $425 billion in 2010, soaring to $800 billion by 2018. Under Obamacare, 85 million Americans will soon be on Medicaid, growing to nearly 100 million by 2021, according to the CBO.
But there are 184 additional federal, means-tested welfare programs, most jointly financed and administered with the states. In addition to Medicaid is the Children’s Health Insurance Program (CHIP). Also included is Food Stamps, now officially called the Supplemental Nutrition Assistance Program (SNAP). Nearly 42 million Americans were receiving food stamps in 2010, up by a third since November 2008. That is why President Obama’s budget projects spending $75 billion on Food Stamps in 2011, double the $36 billion spent in 2008.
But that is not the only federal nutrition program for the needy. There is the Special Supplemental Nutrition Program for Women, Infants and Children (WIC), which targets assistance to pregnant women and mothers with small children. There is the means tested School Breakfast Program and School Lunch Program. There is the Summer Food Service Program for Children. There are the lower income components of the Child and Adult Care Food Program, the Emergency Food Assistance Program, and the Commodity Supplemental Food Program (CSFP). Then there is the Nutrition Program for the Elderly. All in all, literally cradle to grave service. By 2010, Federal spending for Food and Nutrition Assistance overall had climbed to roughly $100 billion a year.
Then there is federal housing assistance, totaling $77 billion in 2010. This includes expenditures for over 1 million public housing units owned by the government. It includes Section 8 rental assistance for nearly another 4 million private housing units. Then there is Rural Rental Assistance, Rural Housing Loans, and Rural Rental Housing Loans. Also included is Home Investment Partnerships (HOME), Community Development Block Grants (CDBG), Housing for Special Populations (Elderly and Disabled), Housing Opportunities for Persons with AIDS (HOPWA), Emergency Shelter Grants, the Supportive Housing program, the Single Room Occupancy program, the Shelter Plus Care program, and the Home Ownership and Opportunity for People Everywhere (HOPE) program, among others.
Besides medical care, food, and housing, the federal government also provides cash. The old New Deal era Aid to Families with Dependent Children (AFDC) is now Temporary Assistance for Needy Families (TANF), which pays cash mostly to single mothers with children. There is the Earned Income Tax Credit (EITC), which sends low income workers checks even though they usually owe no taxes to be credited against. The Child Tax Credit similarly provides cash to families with children. Supplemental Security Income (SSI) provides cash for the low income aged, blind and disabled. In 2010 such income security programs accounted for nearly another $200 billion in federal spending.
The federal government also provides means-tested assistance through multiple programs for child care, education, and job training, as well as the Low Income Home Energy Assistance Program (LIHEAP), the Social Services Block Grant, the Community Services Block Grant, and the Legal Services Corporation, among other programs.
The best estimate of the cost of the 185 federal means tested welfare programs for 2010 for the federal government alone is nearly $700 billion, up a third since 2008, according to the Heritage Foundation. Counting state spending, total welfare spending for 2010 reached nearly $900 billion, up nearly one-fourth since 2008 (24.3%).
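The growth figures quoted above imply a 2008 baseline, which a quick check recovers (a hypothetical sketch; the dollar amounts are the article's rounded estimates):

```python
# Total (federal + state) means-tested welfare spending, $ billions,
# per the article's 2010 estimate.
total_2010 = 900
growth_since_2008 = 0.243  # "up nearly one-fourth since 2008 (24.3%)"

# Back out the implied 2008 baseline.
total_2008 = total_2010 / (1 + growth_since_2008)
print(round(total_2008))  # -> 724, i.e. roughly $724 billion in 2008
```

That implied baseline of roughly $724 billion is consistent with the article's separate claim that the federal share alone grew by about a third over the same period.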

Individual Liberty and War

Why the Left Fears Libertarianism
by Anthony Gregory
Leftist criticisms of libertarianism have surged lately, a phenomenon warranting explanation. We libertarians could justifiably find it all quite confusing. For decades we have thought our battle a largely losing one, at least in the short term. We are a tiny, relatively powerless minority. The state has raged on, expanding in virtually every direction, for my entire lifetime and that of my parents. Yet nearly every week our beloved philosophy of non-aggression is subject to some progressive’s relatively widely read hatchet job. On the surface, it appears at least as misdirected as the rightwing hysteria about Marxists during the Cold War. But at least Marxism was the supposed tenet of the Soviet Union, a regime with thousands of nukes ready to launch. Why all this concern about little ol’ us?
We could go through all these critiques line by line and expose the many factual errors and gross misinterpretations, whether disingenuous or unintentional. But it might be more worthwhile to ask, Why all this focus on the supposed demonic threat of libertarianism in the first place?
It was not too long ago that Slate's Jacob Weisberg declared the end of libertarianism. Time of death? The financial collapse, which proved our "ideology makes no sense." Not three years later, the same web publication is exposing "the liberty scam": "With libertarianism everywhere, it's hard to remember that as recently as the 1970s, it was nowhere to be found."
Funny, I thought libertarianism was dead. Now it is an insidious scam worthy of multiple articles exposing the danger that lurks beneath the façade. In 28 months our defunct ideology has been resurrected as a ubiquitous threat.
If only. Despite the leftists’ hysteria that libertarianism is permeating the Tea Parties, defining Republican politics, and central to the message espoused by Glenn Beck, this is so far from the truth, so paranoid a delusion, that it makes Beck’s most incoherent sketches upon his notorious chalkboard appear like plausible, sensible political analysis by comparison.
The government grows bigger every day and every year, no matter how you measure it. There are more laws, more police, and more prisoners than ever. The empire and presidential power have been on the rise for decades. Spending has increased at all levels. New bureaucracies, edicts, social programs, and prohibitions crop up continually. Almost no regulations are ever repealed – yes, back in the late 1990s, Clinton signed a partial deregulation of certain bank practices (opposed by Ron Paul, as it was phony to begin with), which had nothing to do with the financial meltdown and yet is blamed for every economic problem that unfolded in the last decade. Yes, back in the early 1980s, Reagan cut marginal tax rates while increasing other taxes and positioning himself to double the federal government, and, according to the left-liberals, we’ve been in a laissez-faire tailspin ever since. But anyone who really thinks libertarianism has been dominant in this country clearly has very little understanding of what libertarianism is – or is utterly detached from reality.
Weisberg was wrong in 2008 when he predicted the demise of our philosophy after an era of major influence, and his fellow-traveling writer at Slate is wrong now when he thinks he sees it everywhere. It is telling, however, that when they choose to go after the Tea Party conservatives, the beltway think tanks, and the GOP rightwing, they do not generally attack these people for their many unlibertarian views (views that the left claims to oppose as well): Their love of the police state, their support for the drug war, their disregard for the Fourth Amendment, their comfort with torture, their demonization of immigrants and foreigners, and, above all, their unwavering penchant for warmongering. No, you see, these positions, while unfashionable in some liberal circles, are at least within the respectable parameters of debate. But if some conservative ever mentioned the Tenth Amendment favorably, questioned the legitimacy of the welfare state, or said perhaps the budget deficit should be cut by at least a third this year – horror upon horrors! This is far beyond the bounds of reasonable discussion. And, as it so happens, these are positions that libertarians would find somewhat agreeable, and so we see the real problem with Glenn Beck isn’t his flirtations with fascism and militarism; it’s the quirky way he wonders aloud if government has gotten a bit too big and might pose a threat to freedom. The populist conservatives are not exposed for being protectionists – that much is tolerable – but rather for clinging to their guns and localism. The neolibertarian policy wonks are attacked not for being soft on war but for being too hard on the state.
The fact is, most left-liberals do hate and fear libertarianism more than they oppose modern conservatism. It makes sense. For one thing, the conservatives and liberals seemingly agree on 90% of the issues, certainly when compared to the views of principled libertarians. They all favor having a strong military. We tend to want to abolish standing armies. They all think the police need more power – to crack down on guns, if you’re a liberal, and to crack down on drugs, if you’re a conservative. We libertarians think police have way too much power and flirt with the idea of doing away with them altogether. The conservatives and liberals all want to keep Medicare, Social Security, and public schools intact, if tweaked around the edges. We see these programs for what they are: the parasitic class’s authoritarian and regressive programs to control the youth and foment intergenerational conflict.