Thursday, December 8, 2011

From Bankruptcy to Tyranny


The Golden Age of Government Is Just Beginning
By Mark R. Crovelli
When I talk to self-identified “conservatives” today, I am surprised by how many of them have, finally, awakened to the fact that governments all over the Western world are bankrupt. It has taken a long time for them to do the math, but it is finally dawning on them that when a government’s debts and liabilities massively outweigh its current and future assets and “income” (a more accurate word would be “loot”), that country is headed for disaster. While they cannot be praised for their quickness in recognizing something so blatantly obvious, at least these “conservatives” have bested their “liberal” friends in spotting the problem, since most of the latter are, sadly, unable to add and subtract numbers with 12 zeros.
While I am pleasantly surprised that many so-called “conservatives” can now spot an obvious bankruptcy when they see it, I am less than impressed with their understanding of what bankruptcy entails for a government. Almost invariably, they naively assume that government bankruptcy is analogous to the bankruptcy of a private company. Just as a bankrupted company like Enron shrivels up and disappears from the economic stage, they assume, bankrupted governments will shrivel up and, if not disappear from the world stage, at least take on severely limited roles.
The bankruptcy of governments is, thus, assumed to be a positive development for individual liberty, according to many so-called “conservatives,” because governments will be forced to live within their means and abandon most of their unsustainable and meddling schemes. A golden age of liberty and respect for the Constitution is assumed to be right around the corner.
This idea that government bankruptcy is a positive development for individual liberty is just plain wrong, however. More than that, it is just plain delusional. Governments are not, in any way, analogous to private companies, and it cannot be sanely assumed that they will shrivel up or disappear like private companies, just because they are bankrupt.
Governments obtain their wealth by “taxing” people, and bankruptcy in no way impedes their ability to seize wealth (unless they, like the Romans, stupidly neglect to pay police and military salaries). On the contrary, their desperate need for money during bankruptcy should be expected to induce them to try to suck even more money out of their subjects than they did before.
And why shouldn’t they? A politician’s job always entails spending other people’s money. Some of this money is seized in the form of taxes from the hapless taxpayers of the country, some is printed out of thin air and some is borrowed from people or politicians in other countries that are too stupid or economically ignorant to know better. When a government goes bankrupt, as Greece and Italy are currently in the process of doing, and the flow of funds from the suckers abroad dries up, the government loses only one of these three sources of other people’s money. It can still tax the daylights out of its own subjects, and it can still print money. What’s to stop it?
The example of interwar Germany is instructive in this regard. As a result of the disgusting Treaty of Versailles following World War I, the German government was made insolvent in exactly the same way that today’s Western governments are insolvent. The gigantic war “debt” foisted on the German government’s books was, literally, impossible to pay off, just as most Western governments today have debts and future liabilities on their books that cannot possibly be honored.
What was the result of this de facto bankruptcy of the German government in the 1920s? Did it automatically usher in a golden age of individual liberty and limited government in Germany in the 1930s? Did the German government stop taxing its subjects or printing money? Did the German government learn its lesson about wasting its people’s money on pointless and extravagantly wasteful wars? (N.B.: If you don’t know the answer to these questions, you are about as bright a “conservative” as Newt Gingrich or Mitt Romney.)
The problem with assuming that governments will shrivel up just because they are bankrupt is that governments, unlike private companies, can still strong-arm people into giving them money, even when they are bankrupt. When Enron went bankrupt, it was not in a position to send armed thugs to the homes of its investors to hustle up more money. Nor was it able to simply print a pile of money in order to pay off its mounting debts. In other words, it went down, as it should have gone down, because it couldn’t force people to keep funding its idiotic and wasteful operation. Government, by contrast, does have a literal army of enthusiastic and sadistic men on the payroll, who will follow orders to kick in doors, bust heads and gas people in order to hustle up money to keep the wasteful operation rolling along. (N.B.: If you think people pay their taxes out of the kindness of their hearts, instead of out of fear that cops will haul them away to the American gulag, you, too, are about as bright a “conservative” as Newt Gingrich and Mitt Romney.)
Moreover, governments are very careful to continuously waste a very large chunk of money on the military and the police. After all, governments claim that their primary purpose is to “protect” their subjects from foreign threats, so they are mindful to spill a nice chunk of their budget on these strongmen when times are good. (Whether government does, in fact, “protect” its subjects from foreign threats can be gauged by the fact that governments often bankrupt themselves trying to fund their militaries. With a “protector” as financially irresponsible as Enron, how much protection are we really getting?)
So when a government goes bankrupt, there exists a giant horde of armed men in the military and police who expect to get paid, and who will not take kindly to budget cuts. Ever mindful that a horde of armed men is a constant threat to the civilian government when they are unpaid and unhappy, the political class should be expected to do whatever it takes to keep paying the salaries of the horde. And where, do you suppose, will this money be hustled up when the government has bankrupted itself and can no longer borrow money from foreign suckers? (N.B.: If you don’t know the answer to this question, you are definitely as bright a “conservative” as Newt Gingrich and Mitt Romney.)
Hustling up enough money to pay the salaries of the military and police (and other privileged and militant bureaucrats, as in Greece) is not always easy, however, because subjects don’t often appreciate having more and more of their money confiscated by wildly irresponsible politicians.
Fortuitously for governments, shaking people down ain’t what it used to be. They no longer need to send their armed thugs to kick down doors, crack skulls and gas their subjects in order to confiscate money. They can simply print money out of thin air and — voila! — now they can make payroll! If their subjects are stupid enough to trust paper money, then why not skin them a little in order to “solve” the government’s problems? Do you really think that an organization with a budget problem that has the ability to print money will not choose to do so for its own benefit? Do you really think Enron would have refrained from printing money to prop itself up if it had had the ability to do so?
The reason so many so-called “conservatives” cannot grasp these obvious and foreseeable consequences of a government bankruptcy is that they do not have a true understanding of what government is. Government is not a private company or a charitable organization. It does not abide by the same laws as the rest of society. It can continue to exist — nay, thrive — even when its debts vastly outweigh its assets and income. It can print its own money and continue to tax its subjects even when it has bankrupted itself. Hence, government cannot be likened to an Enron or a Lehman Bros. as a relatively benign entity when it goes bankrupt. It is an economic vampire that will not shrivel or die easily. It can continue to suck its citizen victims in order to nourish itself even when the absurdity of its balance sheet is evident to everyone.
Hence, if you are a self-identified “conservative” and you are sick and tired or scared to death of the government we have today, you should not look to our government’s impending bankruptcy as some sort of cathartic and purifying event that will usher in a new age of liberty. It will not. More than likely, if history is any guide, the slew of government defaults that are in the pipeline in the Western world will usher in a golden age of government.
Government bankruptcy is not a substitute for the hard work of liberty-minded people to advance the cause of freedom. In and of themselves, a thousand government defaults would not advance the cause of liberty one iota. What is needed in the time leading up to the government’s default is a cadre of devoted, almost fanatical, freedom-fighters who are willing and able to teach the masses about the nature of government and the nature of money. Only with the persistent help of this devoted cadre will there be any chance of fighting the growth of government and the devaluation of money that government default will inevitably drag in its wake.

Wednesday, December 7, 2011

Scary for just about everybody

This is what a real market crash looks like
FORTUNE -- Investing sage Jeremy Grantham sounded a little guilty in his latest report to clients, titling it "The Shortest Quarterly Letter Ever." He should hold the apologies. Grantham, one of the pithiest market writers around, includes a chilling graphic in the four-page note that is one of the most mesmerizing market visuals of 2011.
Grantham is a value investor who oversees nearly $100 billion at his Boston-based firm, GMO. Using historical averages of prosaic data like profit margins and price-to-earnings ratios, he's made a series of prescient market calls. This spring, as U.S. stocks quickly rose, he told investors to flee the market because of escalating global fears. (He was right.) And back in 2009, he famously published a bullish note titled "Reinvesting When Terrified" at the market's nadir.
Today he's sounding the alarm again on stocks, and he seems as wary as ever. "Since the spring," Grantham explains, "the equity markets have been absolutely bombarded by bad news." Between the eurozone crisis and fears of a slowdown in China, there's as much bad news as ever, he says. Yet the S&P 500 keeps recovering whenever crises ease for just a few days, thanks to sky-high profit margins and historically low inflation. Those two factors are driving U.S. stocks past Grantham's estimate of the market's fair value of 975-1,000 for the S&P 500 (SPX).
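Grantham's fair-value range makes the implied downside straightforward to compute. A minimal sketch, assuming a hypothetical current index level of 1,250 (the S&P 500 traded in the mid-1200s around this time, but the exact figure is not given in the piece):

```python
# Hypothetical current S&P 500 level (an assumption, not from the article).
current_level = 1250.0

# Grantham's fair-value estimate for the S&P 500, per the article.
fair_low, fair_high = 975.0, 1000.0

# Implied decline if the index reverted to fair value.
decline_to_top = (current_level - fair_high) / current_level     # to the top of the range
decline_to_bottom = (current_level - fair_low) / current_level   # to the bottom of the range

print(f"implied decline: {decline_to_top:.0%} to {decline_to_bottom:.0%}")
# → implied decline: 20% to 22%
```

In other words, under that assumed level, merely reverting to Grantham's fair-value range would imply roughly a one-fifth drop before any overshoot.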
This is where his analysis starts to get scary. Profit margins will fall back to historical levels eventually, he says, and stocks will come down with them. Then there's an inflection point. If any unresolved crises remain on the table when this happens -- the eurozone crisis; a slowdown in China; budget impasses in the U.S. -- then U.S. stocks could start to look a lot like those in Japan.
For two decades the Federal Reserve has bailed out stock markets, he argues. Former Fed Chairman Alan Greenspan cut interest rates to near zero percent at the slightest indication of economic decline. And today, Chairman Ben Bernanke has followed the same course, stimulating the market so drastically in 2009 that, after the crash, stocks took only three months to recover to their long-term upward trend.
"This pattern is unique," Grantham writes. And now that the Fed's balance sheet is stuffed full with debt, he adds, it may not come to aid during another stock downturn.
"GMO has looked at the 10 biggest bubbles of the pre-2000 era and has calculated that it typically takes 14 years to recover to the old trend," Grantham says. The important point of all this, he writes, is that almost none of today's professional investors have experienced anything like this because the Fed has come to the rescue.
"When one of these old-fashioned but typical declines occurs," he writes, "professional investors, conditioned by our more recent ephemeral bear markets, will have a permanent built-in expectation of an imminent recovery that will not come."
That sets up an environment that Grantham dubs "No Market for Young Men." His chart shows how long it may take U.S. stocks to recover if they crashed today.
It's important to note that Grantham isn't calling for stocks to languish like this past 2020. But he shows how they could, and that's scary for just about anybody.

Anti-humans


Time for an injection of common sense
Groups opposed to modern agriculture are using scare stories to try to have antibiotics banned on farms.
by Jason Smith 
‘A world without effective antibiotics is a terrifying but real prospect. Now, the situation is so acute that the director-general of the World Health Organisation, Dr Margaret Chan, has warned of “a post-antibiotic era, in which many common infections will no longer have a cure and once again, kill unabated”... [O]ver-use of antibiotics in factory farming, especially at low doses over several days, is contributing to the huge threat of a world without effective cures for bacterial infections.’
So said Compassion in World Farming, launching a report last month with two other campaign groups, the Soil Association and Sustain. The report, Case Study of a Health Crisis, is part of an ‘Alliance to Save Our Antibiotics’. But does factory farming really threaten human health?
The emergence of antibiotic resistance as a serious problem in human medicine has prompted concerns about the public health implications of antibiotic use in agriculture. Opponents of intensive agriculture argue that bacteria become resistant to antibiotics in the guts of animals that are exposed to routine antibiotic use. Then, humans ingest these bacteria through the consumption of animal products and by drinking water contaminated by ‘run-off’ from factory farms. But is there a basis for these fears?
Antibiotics have been used for over 40 years on farms for three main purposes: to treat identified illnesses; to prevent illness; and to increase growth rates. The use of antibiotics as growth promoters added to animal feed was banned in the European Union, against the advice of the EU’s own Scientific Committee for Animal Nutrition, in January 2006. In a press release from the European Parliament in October, it was argued that the EU should also phase out the pre-emptive ‘prophylactic’ use of antibiotics. MEPs agreed that active ingredients used in veterinary and human medicines should be kept as separate as possible to reduce risks of resistance transferring between animals and humans.
Antibiotics are sometimes used to prevent diseases that might occur in a herd or group of animals. In situations where the proportion of animals suffering a disease during a defined period reaches a threshold, all animals in the herd are treated, as the probability of most or all of the animals getting infected is high. However, in animals as in humans, a significant proportion of those treated for infectious disease would recover without antibiotics, so it could be deemed that such use is unnecessary. But does this application of antibiotics create resistance?
According to scientists from the US Centers for Disease Control and Prevention and the National Institutes of Health, there is ‘no scientific study linking antibiotic use in food-animal production with antibiotic resistance’. The most thorough study on this topic, published in the journal Risk Analysis in 2008, concluded that the risk of a human experiencing an infection from antibiotic-resistant bacteria because cattle were fed antibiotics is one in 608 million, which means it is over 2,000 times less likely than being struck by lightning.
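As a quick sanity check on that comparison (a sketch that simply takes the article's one-in-608-million figure at face value; the lightning odds below are derived from the article's own "2,000 times" claim, not quoted from any source):

```python
# Risk of an antibiotic-resistant infection attributable to
# antibiotic-fed cattle, as cited in the article.
infection_risk = 1 / 608_000_000

# The article says this is "over 2,000 times less likely" than being
# struck by lightning, which implies lightning odds of roughly:
implied_lightning_risk = infection_risk * 2000

print(f"implied lightning odds: 1 in {1 / implied_lightning_risk:,.0f}")
# → implied lightning odds: 1 in 304,000
```

An implied lightning risk on the order of 1 in 300,000 is at least in the same ballpark as commonly cited annual strike odds, so the article's "2,000 times" framing is internally consistent with its headline figure.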
There is, however, ample evidence to suggest that bacteria - including resistant strains - enter a farm from many different sources and that transmission of resistant bacteria may occur even when livestock are not being given antibiotics. According to the US National Academy of Sciences, humans may acquire resistant infections, via livestock, even if antibiotics are not given to those animals. Epidemiological studies have identified other risk factors for infections in humans, including contact with their own pet dogs and cats. These animals may be treated with antibiotics but are rarely tested as potential sources of human infection.
There is also evidence that the removal of antibiotics from veterinary medicine would cause welfare problems. Recent analysis of antibiotic use on farms in Denmark, where a voluntary ban on the use of antibiotic growth promotants (AGPs) was instituted in 1998, reports that antibiotics are now being used sparingly. Farmers and veterinarians must now wait until animals are exhibiting clear signs of illness before treatment is applied. However, this has led to higher doses of antibiotics being used overall. The Denmark ban led to an increase in diarrhoea in pigs and an increase in deaths by more than 20 per cent according to the World Health Organisation.
It is important to understand that the antibiotics used to prevent disease in animals are not used to treat humans. However, the antibiotics used to treat disease amongst animals are also used to treat humans. The ban actually increases the use of antibiotics that are also used in human medicine. Since the Danish ban, antimicrobial use has increased by nearly 110 per cent due to higher dosages being required to treat, rather than prevent, disease.
Since the antibiotic ban was introduced, pig farmers in Denmark have begun utilising zinc to help control diarrhoea in hogs. Ironically, it is highly likely that this may be encouraging the incidence of the so-called ‘hospital superbug’, Methicillin-resistant Staphylococcus aureus (MRSA). Most importantly, the WHO stated in 2002 that there had been no evidence of improved public health since the ban. In fact, resistant salmonella in humans has increased and Denmark had its largest outbreak of MRSA in 2008.
The Danish ban may have also contributed to a decrease in the number of farms in Denmark from nearly 25,000 in 1995 to fewer than 10,000 in 2005. Farmers, who were already finding it difficult to make a living, faced the increased cost of animals lost to illnesses that, in the past, could have been treated with antibiotics. Antibiotics reduce suffering and distress and speed recovery, and since an animal cannot be allowed to suffer the only alternative is to kill it.
Given that there have been few studies into the link between antibiotic resistance and agricultural use, and that these studies have found no evidence of a link, we might ask what all the fuss is about. But when it comes to modern, highly productive and safe farming methods, evidence is not important to groups - like the disingenuously named Alliance to Save Our Antibiotics - who would apparently rather we used Victorian-era methods for food production. The same evidence phobia seems to have afflicted EU bureaucrats and faceless Euro MPs trying to find some connection with the public by implementing ‘popular’ but counterproductive policies.

What They Want To Hear


Getting the rioters to do their dirty work
The Guardian’s study of the August riots is pure advocacy research, designed to harness the power of riotous menace to chattering-class causes.
by Brendan O’Neill 
Well, that’s convenient, isn’t it? A four-month Guardian/London School of Economics study into the riots that rocked English cities in August has found that the rioters were pretty much Guardian editorials made flesh. Concerned about government cuts, annoyed by unfair policing, shocked by social inequality and outraged by the MPs’ expenses scandal, it seems the young men and women who looted shops and burnt down bus stops weren’t Thatcher’s children after all – they were Rusbridger’s children, the moral offspring of those moral guardians of chattering-class liberalism.
This is a blatant case of advocacy research, of researchers finding what they wanted to find, or at least desperately hoped to find. For months now, the Guardian has been publishing articles arguing that the rioters were politically motivated, under headlines such as ‘These riots were political’ and with claims such as ‘the looting was highly political’ and the riots were a protest against ‘brutal cuts and enforced austerity measures’. And now, lo and behold, a Guardian study, Reading the Riots, has discovered that the rioters were indeed ‘rebels with a cause’, with 86 per cent of the 270 rioters interviewed claiming the violence was caused by poverty, 85 per cent arguing that policing was the big issue, and 80 per cent saying they were riled by government policies. Reading this study, we are left to marvel either at the extraordinary perspicacity of Guardian writers, or at their ability to carry out research in such a way that it confirms their own political preconceptions.
This study looks less like a cool-headed, neutral piece of sociology, and more like a semi-conscious piece of political ventriloquism, where rioters have been coaxed to mouth the political beliefs of the middle-class commentariat. This is not to say the Guardian and LSE researchers have been purposely deceitful, inventing evidence to suit a political thesis. Advocacy research is more subtle and less conscious than that. It involves a kind of inexorable pursuit of facts that fit and evidence that helps bolster a pre-existing conviction. So mental-health charities keen to garner greater press coverage always find high levels of mental illness, children’s charities that want to raise awareness about child abuse always find rising levels of child neglect, and now Guardian researchers who want to show that they’re right to fret about Lib-Con policies and outdated policing have found that these are burning issues amongst volatile English youth, too.
In terms of both the way the research was carried out and the comments that were made by the rioters who were interviewed, we can see advocacy research in action. As one commentator has pointed out, the selection process for the study means that it is largely the ‘upper crust’ of the rioters who ended up being interviewed. Many of the 270 interviewees were recruited through their connections with community organisations, meaning they may have already been infused with, or at least influenced by, the mores and outlook of community activism, of the kind you’ll frequently find in the Guardian ‘Society’ supplement. As a Telegraph writer says, ‘The sort of rioter who agrees to be interviewed as part of a social science research project for the Guardian is unlikely to be representative’. Indeed, the Guardian admits that ‘a large majority of the 270 people interviewed for the project had not been arrested’ – that is, they’re the ones who got away with it – and they were ‘surprisingly articulate’. These are the sections of inner-city youth more likely to be au fait with the liberal classes’ explanations for the rioting.
Also, we shouldn’t underestimate the keenness of the interviewees to say things that might make their rather pointless anti-social behaviour in August appear grand and meaningful. Where some of the interviewees are fairly honest about their opportunism – one says the rioting was ‘a festival with no food, no dancing, no music, but a free shopping trip for everyone’ – many of them adopt the kind of political language that had already appeared in the serious press in an attempt to make their behaviour seem purposeful. ‘It felt like I was part of a revolution’, said one; another described his fellow looters as ‘a battalion, a squadron, a troop of men’, as if he were involved in a political war rather than an exercise in kicking in JD Sports’ windows. With the researchers talking only to ‘the right kind’ of rioters and hoping to hear a political message, and the rioters keen to parrot some of the political excuses that had already been made for their behaviour, it was inevitable that this report would end up as something like a 1.3 million-word Guardian editorial.
The Guardian writers now promoting this report as evidence that they were right all along – with one of them claiming the rioters were ‘far more politically conscious’ than many people thought – imagine that they are doing the opposite of what the Lib-Con government did in response to the riots. Where David Cameron and his cronies condemned the rioters as feral or amoral, this report and its cheerleaders claim to reveal that the riots were in fact ‘political in nature’, if also ‘destructive and incoherent’. Yet this is just the flipside of what the Lib-Cons did. Government officials claimed to see in the rioters evidence of a widespread and dangerous ‘gang culture’ (a claim that was challenged by spiked long before anybody else), while their Guardian critics claim to see confused but definitely socially-aware protesters. Both sides see simply what they want to see in the weird tumult of August, imagining that the rioters confirm either their prejudices about feckless youth or their fantasies about reruns of 1960s-style, anti-conservative uprisings.
If anything, the riot-related advocacy campaigning of the Guardian is worse than what Cameron and Co. indulged in. Where Cameron’s shallow and predictable claims that this violence all sprung from bad parenting and ‘Broken Britain’ were opportunistically designed to make him and his government look strong in retrospect, through taking on has-been rioters, the advocacy aim of this latest piece of research is somewhat more sinister. What we have here is a pretty naked attempt to add a touch of physical force and menace to Guardian-style arguments about cuts and inequality and the monarchy and MPs, an attempt to harness the violence of the rioting to the various causes of the liberal commentariat. Feeling, perhaps, that their measured, middle-class demands for nicer policing, fewer cuts to the public sector and more banker wrist-slapping lack urgency and oomph, the Guardian and others are now effectively arguing that the failure to address such issues causes actual violence; that the alienated youth of Britain not only share this general outlook, but are willing to use violence to pursue it. It is moral blackmail in place of proper conviction and proof.
What gets lost in this dual attempt to politicise the rioters, with the Conservatives slamming them as badly mothered urchins and the Guardian kind-of praising them as ‘political in nature’, is any serious attempt to get to grips with what was new and different and unusual about what occurred in August. The riots did indeed reveal a great deal about modern Britain, particularly about the dearth of social solidarity amongst younger generations of poorer communities and the collapse of police and state authority in inner cities and elsewhere in England; yet neither of these things can seriously be discussed so long as all political factions remain more interested in plonking the rioters on their knees and getting them to mouth What We Want To Hear.

Charity with other people's money

Free To Die?
By W. Williams
More Liberty Means Less Government: Our Founders Knew This Well (Hoover Institution Press)
Nobel Prize-winning economist Paul Krugman, in his New York Times column titled "Free to Die" (9/15/2011), pointed out that back in 1980, his late fellow Nobel laureate Milton Friedman lent his voice to the nation's shift to the political right in his famous 10-part TV series, "Free To Choose." Nowadays, Krugman says, "'free to choose' has become 'free to die.'" He was referring to a GOP presidential debate in which Rep. Ron Paul was asked what should be done if a 30-year-old man who chose not to purchase health insurance found himself in need of six months of intensive care. Paul correctly, but politically incorrectly, replied, "That's what freedom is all about — taking your own risks." CNN moderator Wolf Blitzer pressed his question further, asking whether "society should just let him die." The crowd erupted with cheers and shouts of "Yeah!", which led Krugman to conclude that "American politics is fundamentally about different moral visions." Professor Krugman is absolutely right; our nation is faced with a conflict of moral visions. Let's look at it.
If a person without health insurance finds himself in need of costly medical care, let's investigate just how that care might be provided. There are not too many of us who'd suggest that we get the money from the tooth fairy or Santa Claus. That being the case, if a medically indigent person receives medical treatment, it must be provided by people. There are several possible methods to deliver the services. One way is for people to make voluntary contributions or for medical practitioners to simply treat medically indigent patients at no charge. I find both methods praiseworthy, laudable and, above all, moral.
Another way to provide those services is for Congress to use its power to forcibly use one person to serve the purposes of another. That is, under the pain of punishment, Congress could mandate that medical practitioners treat medically indigent patients at no charge.
I'd personally find such a method of providing medical services offensive and immoral, simply because I find the forcible use of one person to serve the purposes of another, what amounts to slavery, in violation of all that is decent.
I am proud to say that I think most of my fellow Americans would be repulsed at the suggestion of forcibly using medical practitioners to serve the purposes of people in need of hospital care. But I'm afraid that most Americans are not against the principle of the forcible use of one person to serve the purposes of another under the pain of punishment. They just don't have much stomach to witness it. You say, "Williams, explain yourself."
Say that citizen John pays his share of the constitutionally mandated functions of the federal government. He recognizes that nothing in our Constitution gives Congress the authority to forcibly use one person to serve the purposes of another or take the earnings of one American and give them to another American, whether it be for medical services, business bailouts, handouts to farmers or handouts in the form of foreign aid. Suppose John refuses to allow what he earns to be taken and given to another. My guess is that Krugman and, sadly, most other Americans would sanction government punishment, imprisonment or initiation of violence against John. They share Professor Krugman's moral vision that one person has a right to live at the expense of another, but they just don't have the gall to call it that.
I share James Madison's vision, articulated when Congress appropriated $15,000 to assist some French refugees in 1794. Madison stood on the floor of the House to object, saying, "I cannot undertake to lay my finger on that article of the Constitution which granted a right to Congress of expending, on objects of benevolence, the money of their constituents," adding later that "charity is no part of the legislative duty of the government." This vision of morality, I'm afraid, is repulsive to most Americans.

Humans, the ultimate resource

The "Shale Gale" Goes Global with Discoveries in Argentina and China, "Peak Oil" Losing Relevance
1. Peak Oil Debate Losing Relevance Due to New Upstream Technology -- "The debate over whether the world's reserves of hydrocarbons have now peaked and are in decline has lost relevance over recent years as new technology allows oil companies to find and exploit new hydrocarbon sources, the CEO of Repsol, Antonio Brufau, said today.
Brufau said progress made in exploring and developing ultra-deepwater areas, unconventional oil and gas sources and the move into remote areas such as the Arctic, have been key to growing global reserves of oil and gas. "The speed at which technology changes and its consequences have taken us largely by surprise. The peak oil debate has lost a great deal of its relevance in the past three years," Brufau told the World Petroleum Congress in Doha. 
Repsol continued to more than replace its proven oil and gas reserves outside Argentina this year and will accelerate output from 2015 onwards as it converts contingent resources into proven reserves. Brufau pointed to developments in the U.S. shale gas industry and highlighted Repsol's own plans to develop a huge shale oil and gas area in Argentina. The Vaca Muerta shale oil and gas discovery in Argentina covers nearly 1 billion equivalent barrels of recoverable shale oil."
2.  Shell Strikes Shale Gas in China -- "Royal Dutch Shell has found shale gas in China, a development that could cap imports in a market natural gas producers are hoping will drive demand.  An official with Shell's partner, PetroChina, a unit of the country's top energy group, state-owned CNPC, said drilling results from two wells Shell drilled had been positive.
"Shell has two vertical wells and they got very good primary production," Professor Yuzhang Liu, Vice president of Petrochina's Research Institute of Petroleum Exploration and Development (RIPED), said in an interview at the sidelines of the World Petroleum Congress in Doha. "It's good news for shale gas," said Liu. China currently has no commercial shale gas production."

The eternal crisis of U.S. foreign policy


Egypt and the Idealist-Realist Debate in U.S. Foreign Policy
By George Friedman
The first round of Egyptian parliamentary elections has taken place, and the winners were two Islamist parties. The Islamists themselves are split between more extreme and more moderate factions, but it is clear that the secularists who dominated the demonstrations and who were the focus of the Arab Spring narrative made a poor showing. Of the three broad power blocs in Egypt — the military, the Islamists and the secular democrats — the last proved the weakest.
It is far from clear what will happen in Egypt now. The military remains unified and powerful, and it is unclear how much actual power it is prepared to cede or whether it will be forced to cede it. What is clear is that the faction championed by Western governments and the media will now have to accept the Islamist agenda, back the military or fade into irrelevance.
One of the points I made during the height of the Arab Spring was that the West should be careful of what it wishes for — it might get it. Democracy does not always bring secular democrats to power. To be more precise, democracy might yield a popular government, but the assumption that that government will support a liberal democratic constitution that conceives of human rights in the European or American sense is by no means certain. Unrest does not always lead to a revolution, a revolution does not always lead to a democracy, and a democracy does not always lead to a European- or American-style constitution.
In Egypt today, just as it is unclear whether the Egyptian military will cede power in any practical sense, it is also unclear whether the Islamists can form a coherent government or how extreme such a government might be. And as we analyze the possibilities, it is important to note that this analysis really isn’t about Egypt. Rather, Egypt serves as a specimen to examine — a case study of an inherent contradiction in Western ideology and, ultimately, of an attempt to create a coherent foreign policy.
Core Beliefs
Western countries, following the principles of the French Revolution, have two core beliefs. The first is the concept of national self-determination, the idea that all nations (and what the term “nation” means is complex in itself) have the right to determine for themselves the type of government they wish. The second is the idea of human rights, which are defined in several documents but are all built around the basic values of individual rights, particularly the right not only to participate in politics but also to be free in your private life from government intrusion.
The first principle leads to the idea of the democratic foundations of the state. The second leads to the idea that the state must be limited in its power in certain ways and the individual must be free to pursue his own life in his own way within a framework of law limited by the principles of liberal democracy. The core assumption within this is that a democratic polity will yield a liberal constitution. This assumes that the majority of the citizens, left to their own devices, will favor the Enlightenment’s definition of human rights. This assumption is simple, but its application is tremendously complex. In the end, the premise of the Western project is that national self-determination, expressed through free elections, will create and sustain constitutional democracies.
It is interesting to note that human rights activists and neoconservatives, who on the surface are ideologically opposed, actually share this core belief. Both believe that democracy and human rights flow from the same source and that creating democratic regimes will create human rights. The neoconservatives believe outside military intervention might be an efficient agent for this. Human rights groups oppose this, preferring to organize and underwrite democratic movements and use measures such as sanctions and courts to compel oppressive regimes to cede power. But they share common ground on this point as well. Both groups believe that outside intervention is needed to facilitate the emergence of an oppressed public naturally inclined toward democracy and human rights.
This, then, yields a theory of foreign policy in which the underlying strategic principle must not only support existing constitutional democracies but also bring power to bear to weaken oppressive regimes and free the people to choose to build the kind of regimes that reflect the values of the European Enlightenment.
Complex Questions and Choices
The case of Egypt raises an interesting and obvious question regardless of how it all turns out. What if there are democratic elections and the people choose a regime that violates the principles of Western human rights? What happens if, after tremendous Western effort to force democratic elections, the electorate chooses to reject Western values and pursue a very different direction — for example, one that regards Western values as morally reprehensible and aims to make war against them? One obvious example of this is Adolf Hitler, whose ascent to power was fully in keeping with the processes of the Weimar Republic — a democratic regime — and whose clearly stated intention was to supersede that regime with one that was popular (there is little doubt that the Nazi regime had vast public support), opposed to constitutionalism in the democratic sense and hostile to constitutional democracy in other countries.

Tuesday, December 6, 2011

The big sleep

Defending the Austrian Explanation of the Great Depression 
by Robert P. Murphy
Scott Sumner is a Chicago-trained economist who has gained notoriety in recent months for his vigorous advocacy of "NGDP targeting" by the Federal Reserve and other central banks. I have criticized Sumner's views before, and he and I have agreed to a formal online debate to be held early next year.
In the present article, I want to respond to a recent post — titled "The myth at the heart of internet Austrianism" — in which Sumner criticized the Austrian explanation of the Great Depression. I will pick apart Sumner's post almost line by line, so I encourage readers to first follow the link and read it in its entirety before turning to my reply.
Sumner opens up his article, "This post is not about Austrian economics, a field I know relatively little about." Thus far, he and I are in perfect agreement.
Sumner then writes,
[This article] is a response to the claim that the 1929 crash was caused by a preceding inflationary bubble. I will show that the 1920s were not inflationary, and hence that there was no bubble that could have caused an economic slump which began in late 1929.
In order to prove that there was no inflationary bubble in the 1920s, Sumner goes through a list of possible definitions of "inflation" and (in his mind) shows that there was no such expansion under any of the definitions.
1. Inflation as price change: Let's start with the obvious, the 1920s was a decade of deflation; prices fell. Indeed the 1927–29 expansion was the only deflationary expansion of the entire 20th century. That's right, believe it or not the price level actually declined during the boom at the end of the 1920s.
This is correct, if by "price" we mean the consumer price index (CPI). A basket of typical household goods did indeed become cheaper from 1927 through 1929. In fact, I made this very argument in my own book about the Depression, to show why the modern hysteria over "deflation" is nonsense.
The typical economist or financial pundit today will warn that if prices ever began actually falling, then it would set in motion a vicious downward spiral as consumers postponed spending, waiting for further price falls. Well, this deflationary black hole obviously wasn't occurring in the heyday of the Roaring Twenties, showing that falling prices per se don't wreck an economy.
Ironically, Mises and Hayek themselves pointed to the relatively stable (i.e., noninflationary) consumer prices of the late 1920s to show why their theory (i.e., the Austrian explanation) was better than Irving Fisher's approach. Fisher famously thought the US Fed had been doing a smashing job during the late 1920s, because after all it had kept the purchasing power of the dollar relatively stable.
From the Austrian perspective, this apparent stability was an illusion, and was masking the actual distortions building in the economy. (Had the Fed not inflated the money supply, increases in productivity would have yielded much sharper drops in consumer prices during the second half of the decade.)
Having disposed of the first case — where "inflation" refers to rising consumer prices — Sumner then turns to a different definition of the term, namely a rising stock of money:
2. Inflation as money creation: At this point commenters start claiming that inflation doesn't mean rising prices, it means a rising money supply. I think that is absurd, as that would mean we lack a term for rising prices. But let's assume it's true. The next question is; which money? If inflation means more money, then don't you have to say "base inflation," or "M2 inflation?" After all, these quantities often go in dramatically different directions. Since the internet Austrians seem to blame the Fed, let's assume they are talking about the sort of money created by the Fed, the monetary base. In January 1920 the base was $6.909 billion, and in December 1929 it was $6.978 billion. Thus it was basically flat, and this was during a period where the US population and GDP rose dramatically.
Now this is extremely misleading. In fairness, Sumner is tackling the claim of whether there was an inflationary boom in "the 1920s," and so he understandably looked at the start and end dates for the decade. Yet look at the actual chart of the monetary base during the period:
Figure 1
By picking January 1920 as his start date, Sumner began in the midst of the huge inflationary boom of World War I (when the Fed was partially monetizing the massive debt issued by the federal government). To curb the rampant consumer price inflation (exceeding 20 percent on a year-over-year basis), the Fed jacked up rates and crashed the monetary base, ushering in the depression of 1920–1921.