"I would not look to the U.S. Constitution if I were drafting a constitution in the year 2012.”
The speaker was Associate
Justice Ruth Bader Ginsburg of the United States Supreme Court. These,
therefore, were astonishing words.
The authority over American law enjoyed by Justice
Ginsburg and her colleagues on the Court derives solely from the existence of the
U.S. Constitution, complemented by the high court’s proclamation that it has
the last word on how that Constitution is to be construed. That latter power
grab traces its roots back to Chief Justice John Marshall’s legendary 1803
opinion in Marbury v. Madison. Marshall “emphatically”
declared it “the duty of the Judicial Department to say what the law is.”
Despite naysayers from Jefferson to Lincoln, who thought that judicial
supremacy would eviscerate popular sovereignty, Marshall’s assertion paved the
way for the modern Court to claim even more boldly, in Cooper v. Aaron (1958)
for instance, that judicial control over the Constitution’s meaning is a
“permanent and indispensable feature of our constitutional system.”
In short, were there no Constitution, there would be precious little interest in Justice Ginsburg’s views. Yet, when she looks at this venerable source of her power—ratified in 1788 and, thus, as she explained, “the oldest written constitution still in force in the world”—she sees obsolescence. In its place, the Court’s senior progressive ideologue advised the assemblage of university students whom she was addressing to “look at the constitution of South Africa. That was a deliberate attempt to have a fundamental instrument of government that embraced basic human rights, [and] had an independent judiciary.”
But wait. Let’s put aside the fact that no jurists in
the world are more autonomous than the federal judges of the United States.
Does not America’s fundamental law, with its robust Bill of Rights addendum,
embrace basic human rights? Well, no. It embraces basic human freedoms.
That makes all the difference.
Freedom is of minimal interest to progressives,
certainly not freedom as is commonly understood: namely, the
bedrock conceit that we are our own governors, autonomous over our own lives.
To be clear, we are talking about freedom in a democracy, not an anarchy. In a
rational social compact, freedom requires that we surrender a quantum of our
independence to secure the nation and to honor the rudimentary norms of respect
for life and property. If a free society is to flourish, nothing less than
ordered liberty will do.
Alas, the “liberty” part of ordered liberty is not the
concern of Justice Ginsburg and her fellow travelers. For them, the
individual’s freedom is a relic of a bygone time, when life was simpler and
dominated by sexist, slave-holding white men of a colonialist bent. The modern
Left’s métier is rights, in the contemporary connotation: i.e.,
what you must give to me, with government handling both the confiscation and
redistribution ends of the arrangement. In contrast to the traditional rights
Justice Ginsburg finds so unrefined—to wit, the right to be free from
government demands and the right to have government restricted to its expressly
enumerated powers—the new rights cover everything from the mortgage arrears of
spendthrifts for houses they could not afford to contraceptives for the sexual
recreation of young women at nominally Catholic law schools.
On those sorts of “rights,” the U.S. Constitution
never was much good. Better to go with South Africa or, as Ginsburg further
recommended, the Canadian Charter of Rights and Freedoms, the European
Convention on Human Rights, and “all the constitution writing that has gone on
since the end of World War II.”
What has actually gone on since the end of World War
II is the rise of “totalitarian democracy,” to borrow the lapidary descriptor
of the historian Jacob Leib Talmon. This is a form of “political
Messianism” (another Talmon coinage) that must be distinguished from quaint old
liberal democracy. Indeed, while the burden of this essay is to consider the
place of the rule of law in an age of upheaval, it would be as
apt to speak of the role of law. For what we will experience as
“law” will be very different depending on which variety of democracy remains
when the upheaval’s dust has settled.
The totalitarian democratic school, Talmon instructed,
“is based upon the assumption of a sole and exclusive truth in politics.”
Liberal democracy, by contrast, “assumes politics to be a matter of trial and
error.” It takes human beings as basically good but incorrigibly fallible, and
sees their political systems as just another pragmatic contrivance in lives that
for the most part are lived “altogether outside the sphere of politics.” To the
contrary, the avatars of totalitarian democracy maintain that they have arrived
at a sole and exclusive truth. Consequently, the personal becomes the
political. The car you drive, the clothes you wear, and the movies you watch—it
all becomes, as President Barack Obama is fond of saying, a “teachable moment.”
Politics is not on the sideline; it is the juggernaut that perfects mankind in
accordance with the totalitarian truth. Law is the principal instrument by
which this overwhelming force is wielded.
As such, law manifests the central contradiction of
political Messianism. Totalitarian democrats, also known as “progressives,”
have feigned homage to the centrality of freedom since the French Revolution.
But whereas the conservative (i.e., the classic, Burkean “liberal”) finds the
essence of freedom in what Talmon described as “spontaneity and the absence of
coercion,” progressives like Justice Ginsburg and President Barack Obama
“believe it to be realized only in the pursuit and attainment of an absolute
collective purpose.”
Thus, the circle that cannot be squared: Even if we
assume for the sake of argument that totalitarian democrats are
well-intentioned—that their quest for social justice, their “absolute
collective purpose,” is not merely a thin veneer for the pursuit of raw
power—human freedom is not compatible with an exclusive pattern of social
existence. Thinking that it is leads to cognitive dissonance of the
Jean-Jacques Rousseau variety. Rousseau was the seminal totalitarian democrat
who thought that man must “be forced to be free” because liberty “tacitly
includes the undertaking, which alone can give force to [the social compact],
that whoever refuses to obey the general will shall be compelled to do so by
the whole body” of society.
How fitting, then, that Justice Ginsburg chose Cairo,
ground zero of the “Arab Spring,” as the setting for her speech. Even as she
uttered words to consign America’s fundamental law to the ash heap, Egypt’s
triumphant Islamic supremacists were in the process of winning 80 percent of
the seats in the new parliament. Their first major task will be the drafting of
a new constitution—which is why Justice Ginsburg was asked to ruminate about
America’s in the first place. Only one thing is certain about the constitution
the Muslim Brotherhood and its Islamist coalition partners will establish:
Article Two of the current constitution, which makes Islam’s repressive sharia
supreme, will remain sacrosanct and be given real teeth. Ginsburg thinks our
two-hundred-and-thirty-year-old Constitution is outdated, but at
fourteen-hundred years old and counting, Islam’s totalitarian legal code is
practically primeval—and yet, here it is born anew. After all, it serves
precisely the function that law serves in totalitarian democracy: It suppresses
free expression and free will. Conformity eventually becomes “free
choice” because it is the only available alternative. As Talmon put it,
addressing the tension between freedom and the progressive vision:
This difficulty could only be resolved by thinking not
in terms of men as they are, but as they were meant to be, and would be, given
the proper conditions. In so far as they are at variance with the absolute
ideal they can be ignored, coerced or intimidated into conforming without any
real violation of the democratic principle being involved. In the proper
conditions, it is held, the conflict between spontaneity and duty would
disappear, and with it the need for coercion. The practical question is, of
course, whether constraint will disappear because all have learned to act in
harmony, or because all opponents have been eliminated.
Islam, we are tirelessly reminded by its apologists
citing Sura 2:256, prohibits compulsion in matters of religion. We need,
however, to read the sharia fine print. True, Islam will not force you to
become a believer—at least not officially. It has no compunction, however,
about imposing what Talmon would call “the proper conditions”—the sharia
system, which, in fact, assumes the presence in the caliphate of non-believers,
whose subjugation has a sobering in terrorem effect (and whose
obligatory poll tax promotes the sharia state’s fiscal health). The concept is
that with enough coercion, there will eventually be no need for coercion:
everyone, of his own accord, will come to the good sense of becoming a
Muslim—all other alternatives having been dhimmified into desuetude.
Post–World War II, “all the constitution writing” so
admired by Justice Ginsburg for its promotion of human rights has become
totalitarian democracy’s cognate version of social engineering. It seeks to
create the proper conditions that might mold us into what progressives think we
are meant to be. The wellspring of this rights revolution is “international
humanitarian law,” a now bulging corpus of bien pensant pieties.
The global human rights movement represents over half
a century’s erosion of first principles: that nations are sovereign; that
international standards may not be applied to them absent their consent; and
that treaties are political agreements between national governments, not
banquets of individual and largely redistributive “rights” that citizens may
enforce judicially against national governments. This regression, from the
venerable “Law of Nations” enshrined in our Constitution to today’s amorphous
international humanitarian law, mirrors the ongoing contortion of domestic
“rights” from freedom-preserving safeguards enjoyed by all citizens against
government into freedom-killing intrusions into private life by government for
the benefit of some citizens over others.
The “Law of Nations,” derived from antiquity’s jus
gentium principle, was invoked by ancient Rome when the jus civile,
the law applied to citizens, was inapposite. A bow to natural law, the
principle recognized that basic strictures honored by diverse peoples across
the empire must be grounded in human reason. This was the classic sense the
framers had in mind when they empowered Congress to proscribe offenses against the Law of
Nations. As construed by the English jurist William Blackstone, the grant was
exceedingly narrow, relating only to piracy and the protection of diplomats.
Two centuries later, however, seizing on the
previously moribund Alien Tort Statute, enacted in 1789 by the first Congress, the
federal courts began expanding the doctrine to meddle in the affairs of other
countries—for example, entertaining a lawsuit by Paraguayan victims tortured by
Paraguayan officials in Paraguay. Today’s judges rationalize that conduct
should be actionable if, in their estimation, it transgresses “definable,
universal, and obligatory norms.” As night follows day, the busybody jurists of
other nations reciprocate, claiming “universal jurisdiction” to hound former
American government officials with “war crimes” investigations over their
execution of U.S. policy.
The Law of Nations is not to be confused (which is to
say, it is forever confused) with “international law.” As George Mason
University’s Jeremy Rabkin explains, the latter is a broader concept, based on
the mutuality of obligations between consenting sovereigns: “a
law that was entirely between nations, rather than reaching
into their internal affairs.” The simplest iteration is the written treaty.
The Constitution makes treaties the supreme law of the
land, provided they follow a fairly arduous ratification procedure:
presidential agreement approved by two-thirds of the Senate. But nothing about
the “rule of law” is ever simple for long. Once ratified, a treaty has the same
legal force as a statute—or does it? In theory, a statute cannot override the
Constitution, but in an obscure 1920 case, Missouri v. Holland, the Supreme
Court upheld a treaty with Great Britain that regulated the hunting of
migratory birds, holding that the federal government had thereby validly
preempted the laws of Missouri—something that federalism and the
separation-of-powers principle barred Congress from doing by ordinary
statute, bird hunting having always been a matter of
sovereign state control. So, could Leviathan, by making a treaty with, say,
Mexico, degrade or eviscerate other powers our allegedly outdated Constitution
reserves to the states or the people? What does one suppose Justice Ginsburg
would say?
The question is not idle. Treaty writing has exploded
since World War II. Reeling from the unprecedented carnage, victorious nations
undertook to prohibit war, or at least regulate it out of existence, convincing
themselves that human nature could be altered by parchment. This was an irony
coming from the generation that had endured a hellacious war precisely because
it realized there were worse evils, like Fascism—a generation that watched Nazi
atrocities give way to Soviet tyranny. Beyond the Geneva Conventions, the
post-war era gave birth to the nascent United Nations’ 1948 Universal
Declaration of Human Rights.
To this day, the Declaration animates the rights
revolution Justice Ginsburg now heralds. Midwifed by Eleanor Roosevelt, it
proclaimed us all as a single “human family,” collectively responsible to
guarantee each other not just life, liberty, and property but personal
security, freedom from torture, slavery, and arbitrary arrest, equality before
the law, travel and asylum, employment with “just and favourable remuneration,”
education, healthcare, food, clothing, housing, leisure time, and on and on.
Today, the utopian world envisioned by the Declaration lives on in the
economically sclerotic and unsustainable nations of continental Europe.
For all its dreamy rhetoric, the Declaration is merely
hortatory, a tocsin, not a treaty. Moreover, the cold, hard fact that we are so not one
big happy human family inevitably arrived in the form of an aggrieved Muslim
response: the 1990 Cairo Declaration on Human Rights in Islam, which advocates the
global hegemony of sharia. Nevertheless, numerous treaties were woven from the
original Declaration’s nostrums. The two most prominent, by way of the United
Nations, are the International Covenant on Civil and Political Rights and the
International Covenant on Economic, Social and Cultural Rights. The former
hopes to forbid degrading treatment, hate speech, advocacy of war, unequal
burdens in child-rearing, and capital punishment (a later add-on). The latter
is a progressive wish-list that envisions government-controlled economic
sectors, comparable-worth compensation, a mandate to end hunger, and universal
healthcare (perhaps galactic healthcare was too modest). The torrent of human
rights treaties eventually added conventions prohibiting discrimination against
women, racial discrimination, torture, and cruel, inhuman, and degrading
treatment, as well as conventions promoting the rights of migrant workers, the
“rights of the child,” and so on.
For the most part, the authoritarian and
redistributive components of these monstrosities would stand little chance of
being enacted into domestic law by Congress. Knowing that,
presidents—especially those of a Leftist orientation—often play a cynical game.
They sign on to the document as a show of internationalist solidarity but do
not press for its ratification to avoid political damage at home. Who, after
all, wants to be labeled as an opponent of such treacle as “the rights of the
child”? And yet, who in America wants to be accountable for seeking to
nationalize education, grant children rights of action against their parents,
and, according to the drafting committee, require the criminalization of all
corporal punishment? Justice Ginsburg’s patron, President Bill Clinton, not
only signed but helped draft this convention. With even Somalia on board, it
has become a source of no small embarrassment to President Obama that the
United States is the only country on earth (besides the newly minted South Sudan)
to refuse its ratification.
Alternatively, some human rights treaties do finally
get ratified, but only after the Senate lards them with sweeping reservations
and caveats. The UN Convention Against Torture is an example: while a
government with weighty security responsibilities does not want to be seen as
pro-torture, neither can it afford to endorse such inscrutably vague terms as a
prohibition on “degrading treatment” (many Muslim men, for instance, find it
degrading to be interrogated by a female investigator). So the UNCAT was
ratified with the proviso that, other than torture—which is controlled by a
federal statute—the “cruel, inhuman and degrading” terms do not proscribe
anything not already forbidden by the Constitution.
The palpable objective of these precautions is to
enable practical politicians to nullify lunatic treaties while ostensibly
endorsing both the ideal of “human rights” and the illusion of an
“international community”—each immensely popular among progressive academics
and their media echo-chamber. The Constitution was once our insurance against
the consequences of these shenanigans: absent clear, unconditional
ratification, we could not be saddled with bad treaty terms even if presidents
made a show of embracing them. This is no longer the case, though. The
transnational Left, spearheaded by judges, law professors, bar associations, NGOs,
and international bureaucrats, has devised several stratagems for defeating the
Constitution’s preference for American self-determination over foreign
consensus.
The simplest of these devices is the “executive
agreement,” a favorite of presidents across the ideological divide. To avoid
ratification disputes, presidents make deals with their counterparts in other
nations, maintaining that their plenary authority over foreign affairs permits
this autocratic streamlining. It also, they insist, dictates compliance at
home, even if that means riding roughshod over both the constitutional check of
Congress and the constitutional prerogatives of states and citizens. There is
today no shortage of politicians, even self-avowed “conservatives,” who urge
that national security, or at least national honor, requires rallying behind
the commander-in-chief, regardless of the fallout.
Still, executive agreements can at least be seen,
grasped, and theoretically undone by straightforward legislation. Far more
troublesome is “customary” international law, which is not easily grasped
because it is forever evolving and forever breaking with its own bedrock tenet
of sovereign consent. This species of law is referred to as “custom” because
there is no express agreement; abstract principles are simply deemed to have
transmogrified into binding law once they achieve some mystical degree of
claimed consensus. Deemed by whom, and based on what consensus? These are the
questions, but the answers are elusive. What is “customary” turns out to be
very much in the eye of the beholder—invariably, a progressive party,
organization, or court desirous of radiating its statist predilections with the
majesty of “law.”
This is not to pooh-pooh the whole idea of custom.
That some deeply rooted customs achieve nigh-global consensus is beyond a
doubt. Were it not patently imperative that a nation’s emissaries be given safe
conduct in foreign lands and were pirates on the high seas not regarded as hostis
humani generis (the enemy of humanity), there would be no foreign
commerce to speak of—hence, the Law of Nations recognized by the framers. To
take another example, the “laws and customs of war” were developed over
centuries of international practice—limiting legitimate combat to sovereign
states; proscribing intentional or disproportionate attacks against civilians;
and requiring soldiers to identify themselves as such (wearing uniforms,
carrying their arms openly, etc.).
Back when international understandings were not
typically memorialized on paper, these truly were time-honored norms, basic and
civilizing. It is unsurprising, then, that even after the modern spate of
multilateral treaties commenced in the nineteenth century, it became, well,
customary to insert the so-called “Martens Clause” (from the Hague Convention
of 1899), stipulating that, notwithstanding the reduction of extensive
agreements to writing, “populations and belligerents remain under the
protection and empire of the principles of international law, as they result
from the usages established between civilized nations, from the laws of
humanity, and the requirements of the public conscience.”
But that was before the ascendancy of the Lawyer Left.
Once the rights revolution took hold, terms like the “laws of humanity” and the
“public conscience” gave cavernous maneuvering room to progressives who see law
as their social-justice cudgel. They play the “customary law” game as follows:
Some law is written and some is unspoken but widely accepted. As we’ve seen,
countries may decline to adopt the written law by refusing to join treaties,
signing but declining to ratify them, or freighting ratification with caveats,
as the United States is wont to do when treaty terms against, say, free
expression, gun ownership, or the death penalty run afoul of the Constitution.
Yet, once a written treaty is adopted by a critical mass of countries over a
period of years, progressives contend that it has been subsumed into customary
law. This, they assert, means that even countries that have never consented are
now obliged to submit to the treaty terms, as originally written by the
international community’s left-leaning lawyers, not as qualified by ratification
caveats.
Thus, for example, does the Supreme Court circumvent
the Constitution’s treaty-ratification procedure by citing unratified treaties.
In 2010, in Graham v. Florida, Justice Anthony Kennedy cited the Convention on the
Rights of the Child in invalidating a Florida law that permitted a minor to be
sentenced to life imprisonment without parole for a non-homicide offense. Justice Kennedy is
the centrist “swing justice” who, like Justice Ginsburg and the Court’s three
other reliable Leftists, trends transnational-progressive in human rights
cases. He rationalized that the Eighth Amendment bar against cruel and unusual
punishment had expanded to include the provisions of the unratified treaty and
other indicia of global decency norms—“global” and “norms” evidently excluding
the law and practices actually followed in land masses east, west, and south of
the Eurozone sliver.
Worse, in elevating their sensibilities over
democratic self-determination, customary law enthusiasts do not limit
themselves to treaty terms that have never been ratified. Rather, as Adam
Roberts and Richard Guelff detail in their essential treatise, Documents
on the Laws of War, customary law is also informed by, among other things:
(a) diplomatic, political, and military behavior by states (including their official
statements, court decisions, legislation, and administrative decrees); (b) the
judicial decisions of international tribunals such as the United Nations
International Court of Justice (the pretentiously self-styled “World Court”
which, for example, ruled in 2004 that Israel’s security fence, a passive
security measure that reduced Palestinian suicide bombing attacks by over 90
percent, somehow violated international law); (c) the International Criminal
Court (which claims jurisdiction over the United States despite our refusal to
join); (d) such ad hoc tribunals as those established at Nuremberg
and Tokyo after World War II, and after the genocides in Rwanda and the former Yugoslavia
in the mid–1990s; (e) treatises and other writings by international law experts
interpreting treaties and customs; (f) military manuals; (g) the proceedings of
the United Nations and its various components (which tend to be rabidly
anti-Israeli and anti-American); and (h) interpretive publications and
conventions produced by such influential (and reliably progressive)
non-governmental organizations as the International Committee of the Red Cross,
Amnesty International, and Human Rights Watch.
On the international stage, human rights
revolutionaries do not content themselves with inventing law out of thin air
and the minutes of progressive gab-fests. They have perverted the very idea of
the treaty, and thus of the point of having international law at all.
Since 1648, when the Peace of Westphalia for the most
part ended a century of war in Europe, the sovereign nation-state has been the
foundation of the international order. Law is essentially a domestic affair.
Treaties are political agreements between sovereigns, and violations of them
are the stuff of diplomacy, which can run the gamut from compromise to
appeasement to economic sanctions to war. Judicial processes may not be imposed
on sovereigns without their consent, either in the courts of another nation or
in international tribunals.
The human rights revolution seeks to cataclysmically
change these assumptions. Transnational progressives are post-sovereign. They
see nation-states like they see the U.S. Constitution: as obsolete. Theirs is
not a world—like the real one—in which order is kept by the assurance that
sovereigns will pursue their interests and use force as necessary to maintain
security. It is a mirage: a global legal order overseen by supranational
agencies and managed by courts, whose writs will be enforced by—er, well, never
mind. Enforcement will be unnecessary in a world devoted to the rule of law,
meaning all that old-fashioned military spending can be diverted to healthcare,
retirement security, education, housing, employment, and recreation.
In such a world, treaties are not
sovereign-to-sovereign agreements. They are repositories of rights that run to
the individual, who may enforce them in court. That the treaties would never
have been entertained in the first place if that were the case—that there would
be no Geneva Conventions had Harry Truman, Dwight Eisenhower, and the post-war
Senate understood that al Qaeda could one day use them to put the United States
on trial in our own courts over the manner in which we wage a war that the
terrorists started—is beside the point for progressives. Professor Rabkin
observes that this is a radical departure: until recently, “talk of an
‘international law of human rights’ would have seemed . . . oxymoronic.”
International law used to mean that sovereigns would protest the mistreatment
of their own citizens in foreign territory. But the central conceit of “human
rights law”—namely, that a growing body of rights and privileges “apply to
human beings, as such” and may be judicially enforced by them, even against
their own governments, even if their fellow citizens have not consented—would
have been thought absurd.
So what is the end game? When acolytes of totalitarian
democracy gush, as Justice Ginsburg did, over the foreign embrace of “basic
human rights,” they are celebrating a radical transformation not just of
international law but also of what a “right” is.
In 2001, Illinois State Senator Barack Obama gave an
interview to Chicago Public Radio in which he lamented the timorousness of the
Warren Court. Now, most Americans remember the Supremes of the Sixties and
Seventies as a rather revolutionary bunch—blazing the trail on abortion, the
rights of the criminal, and today’s imperial judiciary. To Obama, though, they
had flinched. They had failed to confront “the issues of redistribution of
wealth, and of more basic issues such as political and economic justice in
society.”
It was an early glimpse of the change agent who, as a
presidential candidate seven years later, would admonish an ordinary Ohioan
named Joe Wurzelbacher, now known to America as “Joe the Plumber,” that social
progress could come only when government “spread the wealth around.” The Warren
Court, Obama explained back in 2001, failed to “break free from the essential
constraints that were placed by the founding fathers in the Constitution.”
Instead, Obama complained, the justices clung to the hoary construction of the
Constitution as “a charter of negative liberties,” one that says only what
government “can’t do to you.” Obama explained that real economic justice
demands the positive case: what government “must do on your behalf.”
This philosophy is a reprise of what Jonah Goldberg
elegantly calls the “apotheosis of liberal aspirations.” It first surfaced in
President Franklin D. Roosevelt’s 1944 proposal of a “Second Bill of Rights,” a
mandate that government construct “a new basis of security and prosperity.” The
new guarantees—which, not coincidentally, also found their way into Mrs.
Roosevelt’s Universal Declaration of Human Rights—would include “a useful and
remunerative job,” “a decent home,” “adequate medical care and the opportunity
to achieve and enjoy good health,” “adequate protection from the economic fears
of old age, sickness, accident, and unemployment,” and a “good education.”
This is the dream of totalitarian democracy, and Obama
hopes to be its political Messiah. Law is to be the principal tool for
achieving it. Politically, it cannot be done: the cost would be prohibitive
even if a rising tide of citizens were not already growing restive over the
debt crisis that Washington blithely ignores. Thus, the Left’s reliance on law:
Americans like to see themselves as law-abiding—which is why politicians lace
their rhetoric with allusions to the “rule of law” though they exhibit scant
allegiance to the law in their own machinations. Americans are apt to abide even
that which they deeply resent if they come to believe the law requires it.
The sad irony is that the inversion of rights from
safeguards to entitlements is a profound betrayal of our fundamental law. The
political commentator Mark Levin has explained it well:
This is tyranny’s disguise. These are not rights. They
are the Statist’s false promises of utopianism, which the Statist uses to
justify all trespasses on the individual’s private property. Liberty and
private property go hand in hand. By dominating one, the Statist dominates
both, for if the individual cannot keep or dispose of the value he creates by
his own intellectual and/or physical labor, he exists to serve the state. The
“Second Bill of Rights” and its legal and policy progeny require the individual
to surrender control of his fate to the government.
For the framers, government was a necessary evil. It
was required for a free people’s collective security but, if insufficiently
checked, it was guaranteed to devour liberty. The purpose of the Constitution
was not to make the positive case for government. The
case for government is the case for submission—submission to, as Talmon put it,
the “sole and exclusive truth,” the progressives’ “absolute collective
purpose,” their “proper conditions” for making men not what they are but what
“they were meant to be.”
In stark contrast, the Constitution is the positive
case for freedom—real freedom, not freedom in the sense (actually,
the nonsense) of Rousseau, the Islamists, and totalitarian democracy, in which
the individual complies because a coercive environment leaves him with no other
options. Freedom cannot exist without order, and thus implies some measure of
government. It is, however, a limited government, vested with only the powers
expressly enumerated in the law of the land, our Constitution. As the framers
knew, a government that strays beyond those powers is necessarily treading on
freedom’s territory. It is certain to erode the very “Blessings of Liberty” the
Constitution was designed to secure. Freedom is our protection from that kind
of government.
There is a positive argument to be made for
government, and the Constitution does not ignore it. It is eloquently stated in
the document’s opening lines, which enjoin government to establish justice and
protect national security. These injunctions are vital: there is no liberty
without them. But they do not involve social engineering or the picking of
winners and losers. These guarantees, instead, are for everyone,
uniformly: Government must “provide for the common defense”
and “promote the general welfare.” The Blessings of Liberty
are to be secured “to ourselves and our posterity”—not to yourself at the
expense of my posterity.
We are in an age of upheaval, and what becomes of our
law will go a long way toward determining how it ends. In a free society such
as ours, grounded in a culture of ordered liberty, law should not be a didactic
force. It undergirds economic and social life as it is already lived,
reflecting the society’s values rather than instructing the society on what to
think and how to live. But today’s progressive legal elites would have it
another way. To them, the “rule of law” is code for a “social justice” crusade
in which the courts, government bureaucracies, and international tribunals
replace democratic self-determination with their sole and exclusive truth. If
the progressives get their way, upheaval will not yield utopia. It will yield
totalitarianism.