The choice is ours to make
“Decline” we Americans and
Westerners mope about daily; “fall” most of us still hope to postpone.
Decadence, it would seem, is the mean between the two.
The much-overused decline and
fall trope, fixed permanently into our abstract vocabulary ever since Edward
Gibbon’s Decline and Fall of the Roman Empire took a
then-experimentally post-Christian Western Europe by storm, was meant to
demonstrate the mortality of all human constructions. Oddly enough, however,
Gibbon did so in spite of the Enlightenment’s discovery of progress by
retreating to the oldest trope of all—the cyclical, organic metaphor of birth,
growth, decay, death. Much of the 19th century was spent trying to
reconcile progress with the cyclical via the uses and abuses of Darwin. In the
20th century, Oswald Spengler, Arnold Toynbee and Paul Kennedy rejoined that
intellectual dispute, traceable to remote antiquity: Either the human condition
is cyclical, like the seasons and the life cycle, or it is linear, starting
someplace, going someplace, with a positive goal ahead.
German, Briton and American all
knew that America was perhaps the key to the answer. The greatest event in
history, the discovery of the New World, had apparently put America on the
linear track, destined to escape a cyclical fate. The presumption had a religious
basis made clear by St. Augustine’s diatribe against cyclical thinking in Book
XII of The City of God, and it even waxed imperial in Virgil’s
time-transcending Aeneid, a very American epic, as illustrated by those
three Virgilian quotations on the dollar bill. Like Gibbon, Spengler and
Toynbee ultimately sited their declinism in the cyclical rhythms of life.
Kennedy, American and steeped in all things Christian and imperial, instead
found the fatal flaw in linearity. It was linearity of the Faustian kind: The
rise to wealth and power generates delusions of inevitably more successful
adventures ahead until “overstretch”, a form of national self-indulgence,
brings down the entire enterprise.
Kennedy’s approach seems to
have been inspired more by mechanics or physics than by that most influential,
and also ancient, variation on the “rise and fall” theme, that of moral decay,
or decadence. Livy’s Roman Republic maintained its manly virtues because “they
turned away from a thousand daily temptations”, but, Tacitus said, the Empire
was doomed as Romans “indulged every desire as soon as it came to mind.” George
Kennan extended the Roman experience with decadence to our own:
Poor old West: succumbing
feebly, day by day, to its own decadence, sliding with debility on the slime of
its own self-indulgent permissiveness; its drugs, its crime, its pornography,
its pampering of the youth, its addiction to its bodily comforts, its rampant
materialism and consumerism—and then trembling before the menace of the wicked
Russians . . . .1
Far more seriously and
exhaustively than the supercilious Kennan, the French-born American historian
Jacques Barzun took up the matter in his monumental From Dawn to
Decadence: 1500 to the Present.
Published in 2000, when Barzun was 92, From Dawn to Decadence covered a half-millennium of “Western
Cultural Life”, describing four phases in nearly encyclopedic detail: from
Luther’s Reformation—really a revolution that tore the West apart—to the
Scientific Revolution, which provided the basics for universal material
progress; from the Royal Courts of Europe to “the Tennis Court” of the French
Revolution; from Goethe’s Faust as a driver of the modern era to
modernism’s fragmentation of arts and letters; and from the mass illusion of a
socialist utopia to the horrors of the Great War and finally on to the late 20th-century
protest mob’s gleeful chant of “Hey, hey, ho, ho: Western Civ Has Got to Go!”
Across the centuries in Barzun’s chronicle, history moves in both a linear and a
cyclical manner. An explosion of dynamic individualism propels civilization
forward toward a better future; but that same dynamic proves incapable of
virtuous control, causing greed, violence and deepening self-indulgence to
spiral society downward toward chaos. Barzun liberates us from the tyranny of
either-or, but fails to offer much hope of escaping decadence in the process.
But pace Barzun, if
America is exceptional, might it not be an exception to the inevitability of
decadence? It is, at the least, a matter to which Americans have been attentive
over much of their history.
Early on, Americans sensed that
they were somehow exempt from Old World cycles of rise and fall, but that sense
was nonetheless powerfully counteracted by a continuing, pervasive fear of
decadence. The Puritans were consciousness personified, assiduous diary-keepers
who were ever watchful for the slightest signs of grace or degeneracy. Yale was
founded because Abraham Pierson and other divines concluded that Harvard was
becoming doctrinally depraved. Jonathan Edwards’s “Sinners in the Hands of an
Angry God” revolved around the biblical warning of Deuteronomy that “their foot shall slide in due time.” Thomas Jefferson sought to refute the theory of
the French naturalist Buffon that the plants, animals and even geographical
features of the New World were degenerate, declining and weakening as a result
of the fetid swamps and clogged forests that bespread the Western Hemisphere.
Jefferson, outraged, sent troops to New England to gather evidence on the size
and strength of the bull moose, and later instructed Lewis and Clark to be on
the lookout for mastodons.
The issue troubled George
Washington, too. His Farewell Address, commonly interpreted as a warning to
avoid foreign entanglements, was more concerned with maintaining the character
of the nation amid the temptations of freedom. As the world’s first-ever free people, Americans could uphold the individual virtue vital to successful popular government, Washington believed, only by respect for religion. But the Enlightenment,
by ruling “foundations” like religion out of permissible intellectual bounds,
called the matter into question ab initio. Thus the ancient insight, as
old as Plato, remained in effect: The soul and the polis reflect each other.
Or, to put it in contemporary terms, consciousness creates the self, but that
self is then the subject of society and government. If so, then the people must
be constantly on guard to prevent government from attempting to remake the self
and consciousness to the point of enlisting them in the government’s causes.
Liberty requires conscious vigilance.
It is a good thing, then, that
consciousness has been an American preoccupation since Tocqueville analyzed the
nation’s “point of departure” in early New England, reinforced by Thoreau’s
call upon his fellow citizens, in Walden, to become fully “awake.” But the
Enlightenment’s assertion of the sovereignty of the individual subject as the
center of human knowledge, capable of essentializing thought in itself, was
soon challenged by what the intelligentsia made of Marx, Freud and the new
social scientists. The challengers insisted that innate biological systems
transcend the power of individuals, such that Freud’s “unconscious” or Marx’s
“capital” are elemental to the human condition, leaving the individual only a
slight possibility of using them along with at least as likely a prospect of being
used by them. This was a condition that, as Edward Said put it, “flatly
contradicts the core of humanistic thought” by relegating the idea of
individualism to the status of illusory autonomy or fiction.2 This dispute
goes to the heart of the question that every consequential modern political
thinker has felt compelled to take up at the start of his argument: “What is
human nature?” The answer, we recently have been instructed, is that human
nature is not as “human” as we supposed.
As recently as a quarter
century ago, most psychologists believed that human behavior was primarily
guided by conscious thoughts and feelings. It was still possible to affirm
Herodotus, who in Book Two of his History showed us that the
inexplicable absence of such a predetermined nature was why human beings have
to hold political meetings, as crocodiles do not. In the early years of this
new century, however, consciousness has atrophied at an accelerating pace.
Social science is the new scholasticism, an intellectual paradigm in which
participants are published, prized, tenured and made prominent for their
contribution to one great required idea: to prove “scientifically” that human
beings have nothing resembling what formerly was called “free will.” An
avalanche of “studies” now unsurprisingly asserts that we hold prejudices
seated in a level of our minds so deep as to be inaccessible to our conscious
awareness.3
The advent of “screen
culture”—cellphones, iPads, as well as old-fashioned TV and film—now ubiquitous
among the young in their formative years of education, has shrunk consciousness
down in a different way. Students increasingly seem conditioned by the fact
that much of their waking life is populated by mechanically mediated images in
which they can see other beings on screens but those others cannot see them. As
a result the viewer can become oblivious to others, having no need to interact
or maintain a minimum of civil conduct with them. To think back on Herodotus
again, this is the Gyges question: What do you do when no one is looking? The
“screenie” has invisibility even without privacy. As consciousness has
atrophied, obliviousness—and no little rudeness—replaces it. This phenomenon
adds a new dimension to the age-old definition of decadence.
We know that consciousness
changes across time, expanding or contracting when affected by major events in
culture or technology. Hegel declared that any study of consciousness must be a history in
which the expanding consciousness of the individual recapitulates the
historical development of “the world spirit”, which is that of an expanding
realm of freedom.4 Victor Hugo in The Hunchback of Notre Dame described
the cathedral as a university, a universe of knowledge in stone, with its
sculpted scenes understandable even to an illiterate people. Then came the
printed book, which created a new culture and consciousness of learning that
brought an end to the era of the cathedral as university. T.S. Eliot, too,
claimed that a shift of similar consequence occurred in the early 1600s, which
he termed the “dissociation of sensibility.” Whereas thought and feeling
formerly had been experienced together, the cultural transformation of the time
separated them into unconnected “rational” and “emotional” states, a dichotomy
of consciousness that has continued ever since. Flaubert depicted this shift in Madame
Bovary by revealing Emma’s consciousness as severely one-dimensional as a
result of her infatuation with the genre of popular romance novels.5
So in the early 21st century
it is the electronic domination of society by screen culture, as well as the
orthodoxies of social science, that is shriveling consciousness. As Sven
Birkerts prophesied at the fading cusp of the century past, this new millennium
shift may spell the end of the centuries-long book culture heralded in Hugo’s Notre
Dame.6 The advent of “modernism” in the arts of the early 20th century
depicted an age of fragments—to be “shored against” our ruin, Eliot wrote—that
has been carried to greater fissuring by hand-held “remote” devices. These
devices produce an ever shorter “attention span” that tolerates only fragments
of information. As Stanley Cavell of the Harvard philosophy department has
noted, “chronic interruption means the perpetual incompleteness of human
expression.”7
The habits of the incomplete
have adversely affected the book as a unit of knowledge, for the book’s unique
characteristic is to present an “extended argument.” By now, several
generations of students have been conditioned to read books by way of
fragmentation, which subverts any real book’s purpose. The consequences include
the demise of bookstores, a form of textocide brought about not only by online
price-cutting but also by the denigration of extended argument itself. This
does grave damage to intellectual serendipity, for the richest value of a
bookstore—as well as a large, open-shelf library—is to reveal via softly
structured browsing what you were not looking for, or had no idea
even existed. Now we are corralled by Google’s “big-data” efficiency into
finding only that which we already know is there to be found. To Edmund Burke’s
disgust at a time of “sophisters, economists, and calculators” we might now add
“researchers.”
In one of John Updike’s last
novels, one of the witches of Eastwick listens to a lady minister preach about
“selves”:
We live in a very
self-conscious age. There is a magazine called Self. There is a book
called Our Bodies, Our Selves. We want to find our selves, and to be true
to our selves. My Webster’s New Collegiate Dictionary holds two full
columns of compound words beginning with ‘self-abandonment’ and ‘self-abuse’
through ‘self-interest’ and ‘self-reliance’ and ‘self-satisfaction’ down to
‘self-willed’ and ‘self-winding.’ So—what is this self, this precious entity
each individual uniquely possesses?
It is a good enough question to
sire an update of another. Here is a 21st-century recasting of the 18th-century
question posed by Hector St. John de Crèvecoeur in his 1782 Letters from an American Farmer: “What then is the American, this new man?” What scope is
there for the self when consciousness has shriveled and free will
dwindled?
The answer is “not much”, and
this means the loss of a core dimension of human purpose. This, from Emerson to
Nietzsche to David Foster Wallace, is the imperative to harness the power of our own attention so that we can “construct ourselves by assembling our
experiences, desires and actions in the way a novelist gives coherence to the
incidental plot points of a novel.”8 But a diminished self, with its loss
of interiority, is disinclined toward or simply unable to undertake such a task
of self-fashioning. As a population, such diminished selves increasingly become
passive, receptive, pleasure-seeking, self-indulgent and, yes, decadent.
With this observation we enter
the Age of Entertainment, Plato’s nightmare, made worse by a ubiquitous screen
culture that makes entertainment always available and that turns
everything—news, sports, health, war—into one or another form of entertainment.
Here is how the late Michael Crichton put it, linking entertainment with
accelerating fragmentation, in his 1999 novel Timeline:
Today, everybody expects to be
entertained, and they expect to be entertained all the time. . . . [E]veryone
must be amused, or they will switch: switch brands, switch channels, switch
parties, switch loyalties. This is the intellectual reality of Western society
at the end of the century. In other centuries, human beings wanted to be saved,
or improved, or freed, or educated. But in our century, they want to be
entertained. The great fear is not of disease or death, but of boredom. A sense of time on our hands, a sense of nothing to do. A sense that we are not
amused.
Entertainment’s contribution to
decadence was exhibited in the mid-1990s through the medium of advertising, which by then had become commercial, political, a product of the cultural elite and designed to entertain, all at the same time. We thus now have perhaps the most
sanctimonious form of advertising in human history. To take just one example,
in 1995 designer Calvin Klein was forced to withdraw a sexually charged ad
campaign for children’s jeans that critics had called pornographic. In its
defense, the company released the following statement:
The message of the Calvin Klein
jeans current advertising campaign is that young people today, the most media
savvy generation yet, have a real strength of character and independence. They
have very strongly defined likes of what they will and will not do—and have a
great ability to know who they are and who they want to be. . . . We continue
to believe in the positive message of these ads.
If Alfred North Whitehead was
correct to say that the world’s intellectual history “consists of a series of
footnotes to Plato”, then the philosopher-novelist Iris Murdoch was justified
in saying, “What Plato feared should now be clear.”
Indeed, there is a logical
chain across intellectual history that links Plato’s dismissal of the “poets”,
by which he meant all the major art forms, and today’s Age of Entertainment,
which aims to turn all that is not entertainment into entertainment. For Plato,
the arts were mimetic; they merely represented reality and therefore distorted
it, keeping one from confronting reality directly. When the Enlightenment ruled
out metaphysical foundations, “art” and the “sublime” became a substitute for
religion. With modernism and its invited return of the mythic, the arts became
a demonic force, as in Thomas Mann’s Doktor Faustus. And in postmodernism
the classic arts, such as the symphony orchestra, painting, poetry and the
novel, the “legitimate” theater and architecture, all become merely the hobbies
of elevated cultural elites.
In this way entertainment displaces
all the major arts and humanities, blocking substantive communication and
democratic discourse and replacing them with dictated, corrosively sardonic,
simulated moments. Art is interactive, emotionally direct and dialectical;
entertainment is hierarchical, satirically gamed, dictatorial. When
entertainment was a lower-case word, it was a commonplace that, as the critic
Lester Bangs put it, “the ultimate sin of any performer is contempt for the
audience.” Now sinning of that sort is pervasive, for the power of
entertainment corrupts entertainers, and the perquisites of celebrity are too
tempting to resist and too delightful not to exploit. This accounts for many an
audience’s vaguely uneasy sense that while being “entertained” they actually
are being insulted.
This helps to explain Toynbee’s
striking observation about virtue shifting—that the creative minority behind
great civilizations will, as the civilization begins to decline, transform
itself into a dominant minority that takes on the vulgar and promiscuous behaviors of society’s low-life.9 So in the post-Cold War period America’s “cultural
elite”, a label made prominent by Vice President Dan Quayle in the early
1990s, adopted distinctly coarse attitudes and practices. A Vice Presidential
case in point: When the wife of Senator and later Vice President Gore attacked
the violence and misogyny of some rock and most rap lyrics, she was scolded by
most of her social and political peers. Why were four-letter words, formerly
seen by the upper-middle class as déclassé, now appearing in glossy upscale
magazines? How had “the hooker look” become a fashion trend among nice girls
from the American suburbs? How had multiple body piercings and tattoos, which a
few decades ago marked only sailors and motorcycle gang thugs, become trendy?
Toynbee would have shrugged and
said simply that we are witnessing the self-proletarianization of the American
dominant minority. Happens all the time. Yet there is reason to suspect
that the primary cause of this vulgarization may rather be the adoption of a
broad cultural style that enhances the elite’s power. In a polity that has been shedding the barriers its Founders designed against Athenian-style direct democracy, the power elite ever more requires proletarian camouflage for its class superiority. Flaunting coarse conduct and a combat-boots dress code adds
heft to the elite’s domination. So in place of the classically tripartite
elements of the soul—reason, desire and spirit, according to the parable of Leontius in Plato’s Republic, or in earlier America, self-reliance,
Christian-Roman virtues and patriotism—a new triad emerges: claims on
government, vulgar behavior and a yearning for relief from world leadership.
This vast societal
transformation might be called “The Great Virtue Shift.” Almost every act
regarded in the mid-20th century as a vice was, by the opening of the 21st century,
considered a virtue. As gambling, obscenity, pornography, drugs, divorce, homosexuality,
abortion and sneering disaffection became The New Virtue, government at all
levels began to move in on the action, starting with casinos and currently
involving, in several states and the District of Columbia, an officially
approved and bureaucratically managed narcotics trade.
The Great Virtue Shift has
produced among its practitioners the appearance of profound moral concern,
caring and legislated activism on behalf of the neediest cases and most
immiserated populations at home and around the world. To this may be added the
panoply of social agenda issues designed to ignite resentment and righteous
indignation among the new “proletarian” elite. All this works to satisfy the
cultural elite’s desire to feel morally superior regarding collective moral issues of large magnitude even as its members, as individuals, engage in
outsized self-indulgent personal behavior. This is Reinhold Niebuhr’s “moral
man and immoral society” turned on its head, where hedonism takes cover beneath
a superficial global moralism.
The virtue shift has been
paralleled by a governmental shift. As gifted politicians have sensed the
changing psychology and national character of the country, they have learned to
constantly scan the political horizon to identify each special interest group,
make the necessary promises and then move to satisfy each group’s claim on
government largesse, or its demand for deeper government intervention to
enforce adherence to each group’s behavioral choices. Throughout most of
American history people were preoccupied with how to prevent government from
becoming corrupt. In our time, governments have discovered how to corrupt the
people. It then follows that the more corrupted the people become, the more
numerous the laws must be, thus further aggrandizing government’s
indispensability.
Social science plays a role
here, too. Science provides a basis for philosophical, cultural and political
ideas and institutions in any era, as when Newton’s new physics ratified the idea of a clockwork universe and was later extended into the concept of an international balance of power as somehow “natural.” In our time, as noted,
ideas extrapolated from science, and its camp follower social science, are
leading to a constriction of consciousness and, consequently, of the idea of
the self. Quite beyond the sirens of entertainment and self-proletarianization,
this constriction will inevitably affect national character and especially the
psychology of leadership.
The social science studies endorsing such constriction tend to render leaders less able to comprehend, and less inclined to take on, international matters of large magnitude. Reinhold Niebuhr
recognized the problem in the mid-20th century. As he wrote in The
Irony of American History:
The realm of freedom which
allows the individual to make his decisions within, above and beyond the
pressure of causal sequences, is beyond the realm of scientific analysis.
Furthermore, the acknowledgement of its reality introduces an unpredictable and
incalculable element in the causal sequence. It is therefore embarrassing to
any scientific scheme. Hence scientific cultures are bound to incline to
determinism. The various sociological determinations are reinforced by the
general report which the psychologists make of the human psyche. For they bear
witness to the fact that their scientific instruments are unable to discover
that integral, self-transcendent center of personality, which is in and yet
above the stream of nature and time and which religion and poetry take for
granted.
A belief in determinism
conduces to passivity in leadership. In social science, it conduces to an
oscillation between passivity and unbridled hubris. On one hand, social science
is often disinclined to go near, or even to recognize, the most difficult
challenges of statecraft. It cannot address them, because it confines thought to narrow, manageable and countable pieces where “variables” can be eliminated and outcomes can appear to be replicable. Yet at the same time, proponents
of science—or, more accurately, “scientism”—eagerly advance large and highly
speculative hypotheses to explain matters that are too complex and contingent
to be successfully addressed by social scientific methods. As John Gray put it,
“Fact and value are systematically confused; and the attractively simple
theories that result are invested with the power of overcoming moral and
political difficulties that have so far proved intractable.”10 Social
authoritarianism cloaked in the guise of science, and its accompanying
determinism, has all but abolished what we used to understand as the human
condition.
So from consciousness to self to
society to state to statecraft, “decadence”, though defined as a downward
spiraling process, actually appears as a rising phenomenon. As far as Western
Civ goes, it looks like a bottom-up surge so powerful that it may wash over the
modern age itself—defined and commonly understood from its conveniently
accepted inauguration around 1500 to the present. Four major revolutions in
ideas and institutions have made the era we call “modern”—Renaissance,
Reformation, Enlightenment and, for want of a better term, Westphalianism—and
each is now in a declining or deteriorated condition.
The Renaissance recovered and
advanced liberal learning, the study of classic texts and arts that are
“possessions for all time”, as Thucydides put it. These are the works that
speak to the highest challenges of the human predicament, problems requiring
more than quantificational solutions, problems where nothing can precisely be
replicated, where a leader must decide before all factors can be known, let
alone assessed, where uncertainty and ambiguity cannot be eliminated. But the
humanities in our time are devalued. In this vocational age, learning is
crowded out by training. The “sixth sense” of statecraft—from Themistocles as a
reader of oracles to Spengler’s recognition that the statesman’s art is akin to
being “a judge of horseflesh”—is deeply impaired by the decline of the liberal
arts.
The Reformation, a revolution
to overturn established theological doctrine, was vastly consequential in
defining modernity as the age of individualism. If what mattered was one’s faith alone, above Church, priesthood and sacrament, it ultimately came to mean that the individual was not predestined, and therefore not bound to remain unchanged or unfree. From there, it was only a short step to crediting the individual with the right to make consequential personal and political decisions on secular issues in the huge ideational space opened up by the Reformation’s liberating concepts and its pluralization of social authority across Europe. Before long, class,
gender, ethnicity, wealth and other distinctions mattered less than the individual’s
new power as a free actor. In The Prince, written just four years before Luther nailed his theses to a church door in Wittenberg, Machiavelli recognized
that the vertical power of medieval Christendom was giving way to a horizontal
form of power that was up for grabs in the rise of Europe’s new commercial
city-states. While Machiavelli would later be reviled for his “evil” challenge
to papal-imperial authority, he unknowingly gave spiritual legitimacy to
secular, individualistic drives to fill this power vacuum.
But we are now defined once
again as fated—by ethnic and cultural heritage, by “studies” that “show” our
behavior is ruled by the unconscious mind or immovable
socio-psychological-biological factors. Emerson’s self-reliant individual is
now condemned as selfishly indifferent to the needs of the community. When all
available options are ruled out or pronounced illusory, what is left for the
individual or for individuality itself?
The Enlightenment, symbolized
by Diderot’s encyclopedia and Linnaeus’s classification system, awakened the
world not only to the universe of knowledge but to a set of responsible methods
for applying knowledge. One must categorize but not compartmentalize. Thus
Archilochus, who has come to us by way of Tolstoy and Isaiah Berlin, used
the characteristics of fox and hedgehog to express the tension between
individual actions and the inexorable forces of change. Hegel saw history as
the arena in which the greatest issues of the human condition must be played
out, and this has been the task of statecraft ever since. These became the
defining attributes of the modern age: that you must decide as best you can,
that decisions are consequential, and that making them well requires
considering both minute particulars and the entirety of the situation. These
are the essentials of statecraft, which is an art, not a science. Today, all of
these are in a dilapidated condition or worse.
The fourth pillar of the modern
age has been the international state system given incipient form at Westphalia
in the settlement of the Thirty Years’ War. The genius of the concept lay in
its procedural nature, making it potentially universal. Any nation, of whatever
character, could become a legitimate international citizen simply by committing
to a short list of procedural requirements: be a state, respect international
law, follow minimal norms (no slavery), and field professional military and
diplomatic services. With difficult military and diplomatic efforts this
international system has been defended—all major modern wars and revolutions
originated in a determination to wreck it—and in the 20th century it
became the structure within which the world’s nations agreed to conduct their
common affairs. Kant added a dimension to Westphalia by pointing out that a
republic, or better a democratic republic, was the most practically effective
governmental form for establishing justice within a state and maintaining peace
and security among states thanks to its benign focus on procedural norms. The
Kantian project, yearning for center stage since 1795, seemed about to take
firm institutional hold at the end of the Cold War, when “democratization” won
UN acceptance as a procedural element in the quest for a solidified world
order.
But this system, too, is now in
a deteriorated, even decadent condition, and democracy worldwide is in decline.11 If
present trends are not reversed, prospects are grim for the international state
system, democratization within nations and the modern era itself, based on
humanism, individualism, universalism and the procedural nature of relations
among nations. That which supplants it will be adversarial to each of its
principles. Whether future observers will decree this result a matter of the
cyclical or the linear, or both or neither, one cannot say. One can barely
bring oneself to care.
Around the turn of the 19th to
the 20th century, “decadence” arose as a romantically thrilling elitist
fashion, providing a “sweet spot” in which a privileged, self-selected class
could revel in dissolute practices while applauding their own cultural
superiority. At the turn of the 20th to the 21st century something
akin has emerged—call it a democratized form of decadence—among a far wider
swath of the population, with the support of government and approbation of the
cultural elite. Many observers have gazed upon such phenomena, then and now,
and have seen mainly the sources of shifts in the art world. We move from the
1913 New York Armory Show to the mainstreaming of “street art” a century
later rather effortlessly. But if what is at stake is world order, with
national character and identity as its foundation stone, and democracy as the
procedurally and practically most efficacious political form, then the fate of
the art world may be the least of our concerns.
It comes down, finally, to the
individual and to George Washington’s recognition that a free society must be
made up of virtuous, self-disciplined citizens. Americans have grappled with
this idea since the days of the early Republic. Americans possess liberty as do
no others and so have sought to understand its uses and responsibilities as
well as the myriad of ways, direct or insidious, through which it can be taken
away. Freedom is for a people; liberty is for the individual. So if liberty
must be limited in order to be possessed, the limits must be self-imposed, in the recognition that they are essential to making one’s actions
effective, intellectually coherent and even possessed of a certain beauty.
To the main point of
Washington’s Farewell Address, that virtue is the necessary restraint upon
liberty and religion the best source of virtue, Tocqueville added that in
America, uniquely, religion and liberty are compatible: Freedom sees religion
as the cradle of its infancy and the divine source of its rights, while
religion is the guardian and guarantee of the laws that preserve liberty. But
at the same time, from the Puritans onward, American liberty has been
endangered by the American “passion for regulation.” This, Tocqueville
predicted, eventually would enable government to extend its arms over society
as a whole, to cover its surface “with a network of small, complicated,
painstaking, uniform rules through which the most original minds and the most
vigorous souls cannot clear a way.”
There is a logic chain at work
here, too: a lack of self-limitation on individual liberty will produce excess
and coarseness; virtue will retreat and, as it does, hypocritical moralizing
about society’s deficiencies will increase. Widening irresponsibility coupled
with public pressure for behavior modification will mount and be acted upon by
government. The consequential loss of liberty scarcely will be noticed by the
mass of people now indulging themselves, as Tocqueville predicted, in the
“small and vulgar pleasures with which they fill their souls.” We will not as a
result be ruled by tyrants but by schoolmasters in suits with law degrees, and
be consoled in the knowledge that we ourselves elected them.
To retain liberty, or by now to
repossess it, Americans must re-educate themselves in what has been made of
Burke’s precept: “Liberty must be limited in order to be possessed.” Walt
Whitman re-formulated this as, “The shallow consider liberty a release from all
law, from every constraint. The wise man sees in it, on the contrary, the
potent Law of Laws.” Learning what liberty is and what it requires of us is the
only bulwark, ultimately, against American decadence. Pay no heed to the
determinists: The choice is ours to make.
1Kennan quoted in “Western
Decadence and Soviet Moderation”, in Decline of the West? George Kennan and His
Critics (Ethics and Public Policy Center, 1978), p. 8.
2Said, Humanism and Democratic
Criticism (Columbia University Press, 2004), p. 10.
3Note, for example, Mahzarin R.
Banaji and Anthony G. Greenwald, Blindspot: Hidden Biases of Good People
(Delacorte, 2013).
4See Theodore Ziolkowski, Clio
the Romantic Muse (Cornell University Press, 2004), p. 43.
5See Erich Auerbach, Mimesis:
The Representation of Reality in Western Literature, first published in German
in 1946.
6Birkerts, The Gutenberg
Elegies: The Fate of Reading in an Electronic Age (Faber & Faber, 1995).
7Cavell, Little Did I Know:
Excerpts from Memory (Stanford University Press, 2010), p. 30.
8Thomas Meaney, “David Foster
Wallace on Planet Trillaphon”, Times Literary Supplement, March 13, 2013.
9Noted in Charles Murray,
Coming Apart (Crown Forum, 2013), pp. 286–7. For the original analysis see
Toynbee, A Study of History, revised and abridged by the author and Jane Caplan
(Portland House, 1988), pp. 241–7.
10Gray, “The Knowns and
Unknowns”, The New Republic, May 10, 2012.
11See Joshua Kurlantzick,
Democracy in Retreat (Yale University Press, 2013).