With the Blue Guitar

Friday, April 29, 2005

A Rational Don Quixote

http://www.prospectmagazine.co.uk/article_details.php?id=6892&issue=505&category=&author=927&AuthKey=de05cc17a927e64de7fbcc9174f859db

Julian Evans

In all the battles for the Enlightenment, one combatant's name is rarely mentioned. Don Quixote de la Mancha, icon of everything in humanity that is calamitously idealistic, is renowned for qualities other than rationalist courage: for kindness and foolishness; for unintended comedy and a refusal to be disenchanted; for clairvoyant lunacy and obstinate romanticism in a rotten, factual world. He rides out with Sancho Panza from his village in la Mancha to discover that the world is not as he has read about it in books of chivalry and, impervious to ridicule or failure, for 124 chapters seeks to live up to the pastoral ideal of the knight errant, that fiction of the good man. Only in the 126th and final chapter does he acknowledge the "absurdities and deceptions" of the books that inspired him and then, in an ending of unbearable sadness, finally renounces his world of fantasy, returns to his senses, and dies.

For 400 years—the first edition of the Quixote was distributed in Madrid in 1605—his story has supplied the archetype of the bookish dreamer and the outermost comic landmark of our idealism. Yet Don Quixote's achievement is surely greater than that. Without him, and without Cervantes's own constant shifting between tradition and modernity, we might have remained for longer in a world of superstition and dogma. "Enlightenment is man's leaving his self-caused immaturity," Kant wrote in 1784, 180 years after the first publication of the Quixote. "The motto of Enlightenment is therefore: Sapere aude! Have courage to use your own intelligence." On the knight's 400th anniversary we can see that this was the courage that Don Quixote has bequeathed us. His own misguided intelligence, bound to an immaturity that leads to folly, takes him on an epic of discovery in which he finally leads the reader out of his or her own immaturity. Frequently evoked as picaresque, the Quixote is more accurately seen as a Bildungsroman.
It takes its Bildung in two directions, the one in which Don Quixote is shown his own folly, and the other in which the reader is invited to understand the difference between appearance and reality.

How much did Cervantes intend such a reading of his book? There is no reason to disbelieve his claim that his main object was to ridicule the romances of chivalry which, in their late 16th-century incarnation, had become increasingly absurd. Cervantes wrote, like most writers, for money, and his intention at the outset was to write a prose tale in which these absurdities could be satirised. As he continued, his story expanded into a brilliant panoramic fresco of Spanish society declining into economic chaos and class resentment under the decadent rule of Philip II and III. But Cervantes could not have understood that he was also composing something else, a determining text, the first story to be aware at every moment of its own fictitiousness, the book which would send a continent of writers off in search of a new identity—the original modern novel.

When the first part, the 1605 publication, became successful, Cervantes saw the logic of producing a sequel, but we have a rival "second volume" by Fernández de Avellaneda, published in 1614 in an attempt to cash in on Quixote's popularity, to thank for Cervantes finally finishing the second part (Cervantes was always a leisurely writer). We may also owe the deep well of pathos that is Quixote's death scene to Avellaneda's attempted hijacking. The death of his hero may have been Cervantes's way of ensuring that no one could ever again interfere with his character. But the novel's contemporary popularity was due to its wealth of incident and its strokes of comic timing. Full appreciation of its political insight, its grasp of its own times and its humanity came much later.
It is Jorge Luis Borges who, 300-odd years after publication of the Quixote, writes in his story "Pierre Menard, author of the Quixote" about how we are able, in the light of what has happened to the world since Cervantes's novel appeared, to find it much richer in allusion and significance now than it was then. We cannot attribute to Cervantes a sense of his own future greatness or influence. He was experienced in matters of state: he had seen Spain fall from greatness through misgovernment, bankruptcy and military arrogance, a fall so sudden that Spaniards wondered if their country's original grandeur had been "no more than un engaño [illusion]?" This land of depopulation and unrest was Don Quixote's country; foolish and unsuccessful wanderer he may have been, but Cervantes intentionally set him up in stark opposition to Philip II, who rarely rose from his desk in the Escorial.

The author was, however, aware of his novel's early success. We are told that by June 1605, only a few months after publication, "the citizens of Valladolid [where he was living] already regarded Don Quixote and Sancho Panza as proverbial types." He would also have known that his novel was first translated into English in 1612 and into French in 1614. (There may be a strand of the English character that uniquely identifies with a strand of the Quixote's romanticism, not so much its idealism and emotion as its eccentricity. The English feel a special sympathy for folly committed in the name of loyalty to an utterly outmoded code of conduct.) But Cervantes could not know that in 2002, in a poll organised by the Norwegian Nobel Institute, 100 writers worldwide would vote Don Quixote the "best and most central work" in literature, eclipsing the plays of Shakespeare, Dostoevsky's novels and Homer's epics. So the Quixote is, like all masterpieces, accidental. But how, as readers, are we to discern its greatness from the perspective of our own post-Enlightenment times?
Do its concerns still speak to us? We need to start by reading it, but do we actually read it? Well, not individually, as the head of a modern publishing conglomerate recently said to an author he met by chance. We do not read the Quixote because of its length and, even in most recent English versions, difficult prose. But we also do not read it because we do not need to. In critical terms, one problem, perhaps unique to Cervantes's work, is that we have no perspective on the novel, because the Quixote itself created our perspective. Harold Bloom writes, in his introduction to Edith Grossman's excellent 2003 translation, that "it so contains us that, as with Shakespeare, we cannot get out of it." The day Quixote and Sancho rode out from their unnamed village, a fictional blueprint came to life. Don Quixote is our prototypical text, the first story to emerge out of a self-awareness of its own fictional form, to take as its theme the gap between appearance and reality; to be, in our terms, modern. It is to the modern novel what Sigmund Freud is to psychoanalysis. Freud, in fact, was an admirer of Cervantes: in the summer of 1883 he confessed to his fiancée Martha Bernays that he was more interested in Don Quixote than in brain anatomy. He found Quixote's dialogues with Sancho Panza significant for the lesson they offered of the need both to discriminate between reality and fantasy and to understand their interplay. 
He expressed an oddly romantic sympathy for Quixote's idealism: "Once we were all noble knights passing through the world caught in a dream, misinterpreting the simplest things, magnifying commonplaces into something noble and rare, and thereby cutting a sad figure… we men always read with respect about what we once were and in part still remain."

In the 21st century, with our potent self-consciousness, we not only know too much but know that we do, and to read Don Quixote is to be heartened that in the embrace of their illusions people are capable of decent, funny, unpredictable acts. The Enlightenment was essential for our freedoms, but more than rationalism is needed in the world. Carlos Fuentes has written that at the end of the novel, Don Quixote suffers from "the nostalgia of realism"—not the realism Cervantes has invented but the realism of old, of impossible adventures with knights errant, magicians and frightful giants. "Before, everything that was written was true," writes Fuentes, "even if it were a phantasy. There were no cracks between what was said and what was done in the epic. 'For Aristotle and the middle ages,' explains Ortega y Gasset, 'all things were possible that do not contain an inner contradiction. For Aristotle, the centaur is a possibility; not for us, since biology will not tolerate it.'" Fuentes illuminates well Don Quixote's suffering—that he must choose between the drama of make-believe and the mean necessities of reality—but the novel additionally lights the way of readers yet unborn through the knight's dual lesson of the choice he must make and the choice the reader must make about his fictional necessity (or not).

Believe in me! My feats are true, the windmills are giants, the herds of sheep are armies, the inns are castles and there is in the world no lady more beautiful than the empress of la Mancha, the unrivalled Dulcinea del Toboso! Believe in me.

Reality, as Fuentes writes, "may laugh or weep on hearing such words."
But reality also feels itself outmanoeuvred, outgunned by their appeal. After hearing them, we as readers can forever understand that there is more than one objective reality.

Miguel de Cervantes Saavedra personally felt the disenchantment of reality. His novel is driven by the rebuffs and misfortunes he was dealt: to choose refuge in noble dreams would have been an obvious choice for a man whose aspirations repeatedly failed to bear fruit. Little is known of Cervantes's first 24 years. In 1571, enlisted with his brother Rodrigo as a soldier, he sailed on the Marquesa from Messina as a member of Don Miguel de Moncada's regiment to repel the Ottoman advance in the fleet of Don John of Austria. At the battle of Lepanto, in the gulf of Corinth, the Marquesa was in the thick of the eventually victorious fight. Cervantes received three gunshot wounds, one of which maimed his left hand, "for the greater glory of the right" as he said afterwards. After Lepanto his military service was spent in Naples and Palermo. In 1575, returning to Spain, he and Rodrigo were captured by Barbary corsairs. Cervantes was enslaved to a Greek renegade at Algiers. Repeated escape attempts failed: twice betrayed, he then saw his brother liberated when funds sent by his parents were inadequate to ransom them both. Resold to the viceroy of Algiers and betrayed again by a Dominican monk, he was finally released after five years of slavery when two Trinitarian friars successfully ransomed him. On his return he wrote plays and the pastoral novel Galatea, a serious bid for fame that failed; it was inconclusive and derivative, and pushed him back into paid employment. At Seville in 1587 he found employment provisioning the Armada and was excommunicated for excessive zeal in collecting wheat. He then applied for a complete getaway, to a post in the Indies. "Let him look for something nearer home," his petition was drily annotated.
He found work as a tax collector and was imprisoned at least four, possibly six times for everything from irregularities in his accounts to allegedly making a pass at the sister/niece/mistress of a (probably tax-evading) landowner, Don Rodrigo de Pacheco. It may be that Don Quixote began as a desire to get his own back by satirising Don Rodrigo as a mad knight who "slept so little and read so much that his brain dried up and he lost his reason." A continuing run of professional bad luck during the 1590s produced increasing disillusionment, chiefly with Spain's imperial outlook and incompetent absolutist monarchy. His poetry and prose began to show signs of intensifying parody and of the mock-heroic attitude that would become his strongest comic device.

Cervantes's whereabouts in the early 1600s are unclear, but if he was in the prison cell in Argamasilla de Alba where Don Rodrigo had slung him, he was using his time well, writing the novel that would reverse his fortunes and determine the form of fiction for the next four centuries. He suspected neither of these things, and curiously, though he read some of the manuscript, neither did Lope de Vega, Spain's greatest playwright, who had written to a friend that "no poet is as bad as Cervantes, nor so foolish as to praise Don Quixote."

And what of everything else that Cervantes could not suspect? Bloom compares Cervantes with Shakespeare but makes a key distinction between their methods: "Cervantes remains the best of all novelists, just as Shakespeare remains the best of all dramatists. There are parts of yourself that you will never know fully until you know Don Quixote and Sancho Panza. But there's a fundamental difference between Cervantes and Shakespeare: Sancho and the Don develop newer and richer egos by talking to each other. Falstaff and Hamlet perform the same process through lonely soliloquies."
On the one hand, we first realise through Don Quixote that the novel exists as a new kind of meaning, a sign, as Carlos Fuentes writes, "of a modern divorce between words and things." On the other hand, in Quixote's search for a new union between reality and the words to articulate it, we also realise that it is dangerous to attempt this enterprise alone. The novel has become a social form for a very good reason: the identity that emerges from each of us is composed not only of our egos but our links with other egos. How can the novel tell us who we are, or ask us if we recognise anything human in it, without reflecting on those links? A modern or postmodern Quixote might consider it his duty to liberate, as well as the widows, maidens and orphans, the millions of urban dwellers who live alone.

For such reasons has Cervantes's novel held its ground since 1605. In the 17th century, at a time when more ruthlessly than today the market decided, the wide pirating of the Quixote was an infallible mark of its popularity. In the 18th, as the English novel established itself, British novelists paid Cervantes constant tribute: witness Defoe's inspirations in Moll Flanders and Robinson Crusoe; Fielding's 1742 preface to Joseph Andrews, "written in Imitation of the Manner of Cervantes, Author of Don Quixote" (he also wrote Don Quixote in England, for the theatre); Sterne's blithe obviousness, among countless borrowings, in using Quixote and Sancho Panza as models for uncle Toby and corporal Trim in Tristram Shandy; and Smollett's translation of 1755 that ran to 13 editions. The English love of folly and, in Smollett's words, Cervantes's "strength, humour and propriety," not to mention the obvious commercial success of his model, ensured the Quixote's endurance.

A century later, Dickens's and Thackeray's conversion of Quixote's horizontal and eschatological wanderings into fiction that journeyed vertically, socially and materially, was mirrored by Balzac and Stendhal.
In Germany, Don Quixote may have been the last book Kleist ever read—found with his barest possessions after his suicide—while in France it was the first that the six-year-old Flaubert read, in an abridged version with 34 large illustrations. (The tragi-comic theme of the romantic hero at odds with reality explains almost all of Flaubert's work, from Madame Bovary and Sentimental Education to Bouvard and Pécuchet.) The Spanish novelist Antonio Muñoz Molina once said to me that although modern Spanish novelists do not make much of Cervantes—the anxiety of influence is too great—they cannot avoid his fictional blueprint. In the 20th century, neither Kafka nor Nabokov, Borges, Bellow or Kundera could have clothed their worlds in fiction without the pattern furnished by their Spanish ancestor. One might go even further: Don Quixote's influence has been super-literary—without it the French revolution, with its notion that individuals can be right, society wrong, might never have happened, and Martin Luther King Jr might never have delivered a speech that contained the words "I have a dream."

Anglophone readers have never had a better chance to confront that greatness directly. Edith Grossman's new translation (apart from brief confusions of "thee" and "thou") is so good that it ought to compel us to start reading the Quixote again. Her text restores Cervantes's readability, the vitality of his dialogue and characterisation and the darkening quality of his vision. The thought patterns of his madness, which earlier translations obscured by rendering the original too literally or too loosely, are here rendered as logical, and thus funnier and sadder. One of my favourite episodes is the "enchanted boat" adventure in the second part, in which the two travellers steal a small rowing boat, Quixote believing that it has been sent by an enchanter as a kind of celestial cab to transport them to some knight or maiden requiring assistance.
Knight and squire dispute their way downstream, one hurling curses at his servant's cowardice, the other cursing his master's madness, and as they are swept into the dangerous millrace their exchange climaxes in a superbly indifferent discourse by Quixote on how enchantment works. "'Be quiet, Sancho,' said Don Quixote, 'for although they seem to be watermills, they are not; I have already told you that enchantments change and alter all things from their natural state. I do not mean to say that they are really altered from one state to another, but that they seem to be, as experience has shown in the transformation of Dulcinea, sole refuge of my hopes.'" The comic timing of which Cervantes was capable, and his understanding of Quixote's pathos, have rarely been so in evidence for English readers. In the last third of the novel, Cervantes's increasingly unkind mischiefs towards his hero bespeak not just impatience but also a yearning to eliminate him, yet the admiration and compassion he has already instilled in us are proof against everything the author can do to undermine them.

There is a view in literary-critical circles that Don Quixote's signal accomplishment was the victorious elevation of the novel over the romance. The deluded knight's attack on Master Pedro's puppet theatre, for example, is, according to Bloom, "a parable of the triumph of Cervantes over the picaresque and of the triumph of the novel over the romance."

Yet this seems a limited reading of the novel. It is as unfair to say that the Quixote is merely a "critical parody" of the romance as it is to say that its eponymous hero is merely mad. The forms of Cervantes's moral thought are pointed to in his humour: the author is simultaneously satirising Quixote's belief in chivalry and commemorating it through the comic forms of his forgiveness. There is no better example of the comedy of compassion.
And Cervantes did not abandon the form of the romance; it is present in his Exemplary Novels, which he was writing between the two parts of Don Quixote, and in his posthumously published epic, Persiles and Sigismunda. Romance is in all his work. In the Quixote it is the engine both of Quixote's folly and of our deepening sympathy—a reader's way of recognising a hero's predicament as latently his or her own. Through the innumerable possible readings of the Quixote, we can perhaps identify a core of distinct principles: that there is no reality without folly, and no underlying perception of reality without romance, of one kind or another, to draw out human curiosity. To read Cervantes's Quixote as the first and greatest modern novel, and then, self-satisfied, to read back into it that we have nailed the folly of romance, is to miss half of Cervantes's intention. Having delineated Enlightenment rationality by the comic delineation of its opposite, Cervantes overflows the dimensions of both. If we admire Don Quixote today, it is surely because we continue to agree with him that his madness, not his reason, enables him to transcend the world of things and believe in a world of value. Enlightenment virtues we may all share. Our madness is our faith, and belongs to us alone.

Thursday, April 07, 2005

10,000 years of nostalgia

the antiquity of ‘the progress paradox’

Roger Sandall

Life gets better, but people feel worse. In seven short words that's what Gregg Easterbrook's book is about. The Progress Paradox (2003) is a revealing survey of modern discontents ranging widely in the social sciences and medicine, and it's certainly interesting that ten times as many people may now suffer from depression as did half a century ago. But Easterbrook is broad rather than deep, and seems largely unaware that people have been complaining about progress, and looking nostalgically back at the past, for as long as there's been a past to look back at. How depressed they felt when they did so is hard to say—as often as not they seem to have got into a towering rage—but the progress paradox has been with us for thousands of years.

primitivist fantasy: as old as civilization itself

One of its most striking sentimental manifestations is a widespread admiration for the tribal world. Anyone who thinks this began with Jean-Jacques Rousseau in the 18th century is deeply mistaken. We mentioned Lucretius in this connection last month, citing A. O. Lovejoy and George Boas’s Primitivism and Related Ideas in Antiquity, but this book also makes plain that there were numerous other thinkers from 2000 years ago who admired the simple life. And none of them liked stuff.

In fact, they all believed that less stuff was better than more. Socrates said that man's basic requirements were few and easily satisfied, and Epicurus agreed. Diogenes once talked a prosperous Athenian into turning his agricultural land into sheep pastures—pastoralism has always had a special appeal with its visions of rustic tranquillity—and into throwing his money into the sea. Plato's Republic dwelt fondly on the idyllic picture of an earlier communal society, while any number of Greek thinkers were convinced that the savage Scythian tribes, somewhere beyond Thrace along the shores of the Black Sea, exemplified primitive virtue in contrast to degenerate Athens.

Reaching back a bit further we find that as early as 700 BC the Greek poet Hesiod felt humanity’s heroic days were past and that he lived in an era of lamentable decline. In the Golden Age (which Hesiod says was long before his own time) men were naturally peaceable, and for that reason there was no war. Nor was there any foreign trade or travel to confuse us with luxuries: everyone stayed home happily knitting their own sweaters, and no-one fussed about Paris or Pierre Cardin. Among other attractive features of the Golden Age, the people were vegetarians, made everything out of wood, and because they were naturally good their communal society was free of conflict and required no lawyers.

Notice that from the Golden Age all the way down through a series of inferior ages (Silver, Bronze, and Iron) this is a story of degeneration. Not a story of progress, but of regress. It is virtually certain that Hesiod did not live like a savage: he used a spoon and slept in a bed. But paradoxically—as Easterbrook might say—he hated progress. And notice also what is admired above and beyond all these particularities: the social and economic virtues described are only to be found in an imagined community where xenophobia and group hostilities had been vanquished and universal love prevailed. In all these idealistic visions communal order was an implied prerequisite: some tight-knit form of collectivity was thought to be inseparable from the social virtues portrayed.

One last thing should be mentioned in this connection. In Scientific American Discovering Archaeology for Jan/Feb 2000 evidence was presented from 8,500 years ago of a cult in Cyprus that, somewhat incredibly, wanted to turn the clock back. According to the author, there were people at that time who found the decadence of Anatolian life intolerable, so they sailed across the sea to Cyprus, and gave up pottery, individualism, and sex. It must be added, however, regarding this intriguing article, that extrapolating from stones and bones to what people may have thought or believed eight and a half thousand years ago is a less than exact science.

But from these many examples one is forced to conclude that romantic primitivism has been with us for a very long time. In round figures, it looks as if people have been gazing nostalgically backward for nearly ten thousand years. And that is highly significant. Because the last ten thousand years is exactly the epoch in which civilization itself emerged; and what it suggests is that idealizing earlier and more primitive ways of life is a fixed mental tendency, a psychological constant if you will, inseparable from the rise of civilization itself.

from xenophobia to xenophilia

For anyone who doesn’t have them, it is obvious that the most important features of civilization are soap and toilet paper. These are the items that distinguish civilized from precivilized life, and distinguish barbarism from the dark abyss of unwashed and unwiped prehistory.

Yet today, surprisingly, many nice, clean, sweet-smelling middle class folk have somehow persuaded themselves that the tribal world, where there is no soap, no toilet paper, no shampoo, no deodorant and certainly no tampons, represents a better way of life than their own. No doubt if you pressed them about this after a good dinner they might concede that the pre-civilized world lacks amenity; but that it is morally superior and altogether more virtuous they feel in their bones to be true. And the deep reason for feeling this way about early human society is always the same: it is more communal, more collectivist, more committed to the solidarity of the group.

The reason for this persistent attraction to the tribal lies, I believe, somewhere in the complicated moral evolution of humanity—in the historical passage and shift in moral judgement from xenophobia to xenophilia. To grasp this it is necessary to be clear about these contrasting attitudes and psychological types. A xenophobe is one who holds that the humanly foreign, the Other, the culturally unseen and unknown—perhaps some vaguely reported and misunderstood tribe across the sea—is really a bit sus, and definitely not what we want at home in our living room. A xenophile on the other hand holds that the foreign, the remote, the exotically Other, precisely because it is only vaguely apprehended, and just because it radically differs from ourselves, is something wholesome and admirable that should be warmly embraced.

Of course in evolutionary terms xenophobia is probably as old as the hills—it is certainly as old as the apes. Go back a million years or so and one finds that xenophobia was the primordial attitude regulating the association of bands of violent prehumans, or low-browed protohumans, virtually everywhere you looked. Xenophobia taught that while the inhabitants of your own country were generally okay, the inhabitants of the adjacent territory were a disgraceful and unmanageable bunch who were always trying to invade your land, seize your wife and children, burn down your house, laugh at your gods, and defile all you held sacred.

The primaeval xenophobic attitude was once illustrated in a cartoon showing two English rustics, about 1890, leaning on a farm gate when a toff from London walks by.

First Rustic: “Who’s him?”
Second Rustic: “Dunno”
First Rustic: “Chuck a brick at him.”

Given a spontaneous tendency on the part of rustics to toss bricks at passing strangers, xenophobia is clearly a social problem, and quite possibly a moral problem, and it is clear that civil society cannot allow it to flourish unrestrained. At the same time it is hard to see it as an intellectual problem. There is nothing at all puzzling about it, nothing mysterious to be explained, nothing that some anxious academic commission should be asked to look into. It is obviously an expression of the same unaccommodating instincts we share with countless other animals, including chimpanzees, and goes far back into the primate past.

But the modern phenomenon of anti-civilizational xenophilia is an intellectual problem. The adoration of cultures other than our own, the worship of gods other than those we were brought up with, a devotion to all religions other than the one our parents believed—what A. O. Lovejoy and George Boas call in their book the “revolt of the civilized against civilization” with its admiration for pre-civilized social forms and a love of the exotic, the strange, and the outré—this is indeed a genuine puzzle. It is by no means obvious how it arose. What is clear at the outset, however, is that it involves an inversion of much that is natural, normal, and universal in social life.

moral rules: from Freud to Mary Midgley

Freud’s psychology may be of help here. First and most simply, he tells how the calm exterior of every man and woman conceals a tumult of instinctual desires and drives. Second, in order for the human animal to live peaceably with his fellows these instincts must be tamed, diverted, redirected—sublimated is the usual term—and made compatible with peaceful collective life. Human cultures invent moral rules to do this, the male impulse to aggression being subject to a variety of restraints on bad behaviour.

Then with the rise of civilization it becomes subject to such difficult rules as ‘Love thy Neighbour’, and the even more counterintuitive ‘Love thy Enemy’. Third, although conscience as an internal system of control is erected on the basis of these rules, those bad old violent and aggressive drives just won’t go away. They cannot and will not be eradicated. They are overlaid by the artificial rules of the super-ego, but though overlaid they won’t lie down. In the darkest subterranean levels of the human psyche they persist, producing anxiety, guilt, and neurotic symptoms up on the surface.

Now my argument is that romantic primitivism, which we have seen is a recurrent feature of western civilization for about 3,000 years, and possibly much longer, is part of a guilt complex involving moral rules idealizing the communal way of life. This idealization is deeply inscribed in conscience; and guilt arises because of the claims of this communal social conscience on the one hand, and the opposing need in civilized societies to assert oneself individualistically against the communal, against the collective, against the claustrophobia of the tribe, against the tyranny of the human herd.

Freud’s relevant statements appear in a number of places. In Totem and Taboo, for example, he states as an axiom that “where there is a prohibition, there must be an underlying desire.” This of course makes sense. Why prohibit something we have no wish to do anyway? The only reason for having a speed limit is that we want to go faster. The only reason for having a rule like “Love thy enemy” is that, left to our xenophobic instincts, we would like to beat the enemy to a pulp.

This is the instinct which the moral norms of civilization arise to counteract: the wish to aggress, to fight, to kill. We began by saying that the normal relation of tribe to tribe is territorially xenophobic—fearful, hostile, and ready to strike and destroy: “Chuck a brick at him.” By way of reinforcing this proposition consider what Mary Midgley has to say in her book Beast and Man:

War and vengeance are primitive human institutions, not late perversions; most cosmogonies postulate strife in Heaven, and bloodshed is taken for granted as much in the Book of Judges as in the Iliad or the Sagas. There may be nonaggressive societies, as anthropologists assure us, but they are white blackbirds and perhaps not so white as they are painted.

It seems possible that man shows more savagery to his own kind than most other mammal species… Abimelech, the son of Gideon, murdered, on one stone, all his brothers, to the number of three score and ten (Judges 9:5). An animal that did anything remotely similar would (surely rightly) be labelled ‘dangerous’. (pp28–29)

war before civilization

It is frequently claimed that war did not exist before civilization; or that it was relatively trivial, with little loss of life; or that it was ritualised and involved no serious levels of death or injury; or that when conflict has been recorded between tribal societies it grew solely out of their clash with civilization itself. There is, alas, nothing to support these views. We are here presenting speculative moral psychology, so we don’t have much space for empirical evidence. But on the subject of precivilized tribal warfare you should know there is a lot of evidence around, especially in two recent books which show the folly of trying retrospectively to pacify the human past.

One of them is Lawrence H. Keeley’s War Before Civilization (Oxford, 1996). The other is Steven LeBlanc’s Constant Battles: The Myth of the Peaceful, Noble Savage (2003). Keeley tells us that whether the comparison concerns the frequency of war in primitive and civilized society, the scale of massacres, the proportion of the general population actively involved, or the toll of dead and injured, in each case the surprising thing—and it certainly surprises me—is that the tribal world looks both more bloody and more deadly. As to frequency: Keeley notes that the early Roman Republic initiated a war or was attacked only about once every twenty years, while the average modern nation-state between 1800 and 1945 went to war about once in a generation.

Compare this with pre-state, preliterate, precivilized tribal societies: 65% were at war continuously, while 55% were at war every year. As to massacres: at the site of Crow Creek in South Dakota, in what seems to be the year 1325 according to archaeological dating, more than 500 men, women, and children were slaughtered, scalped, and mutilated, and all this well before anything remotely resembling civilization was available locally—and long before Columbus. Regarding the toll of dead and injured, Keeley writes that “the proportion of war casualties in primitive societies almost always exceeds that suffered by even the most bellicose or war-torn modern states.”

Traditional Australian Aboriginal life is presented as blandly pacific, the standard image used over and over again by the Australian Broadcasting Commission showing a family group wading thoughtfully into a lily pond, with flowers between their teeth. But it wasn’t quite like that in the old days. The convict William Buckley, who escaped in 1803 and lived with Aborigines on the southern coast for thirty-two years, provides a rare glimpse, from the inside, of the level of conflict among his people. One day, he writes, “we were unexpectedly intruded upon by a very numerous tribe, about three hundred. Their appearance coming across the plain, occasioned great alarm… On the hostile tribe coming near, I saw they were all men… In a very short time the fight began. Men were fighting furiously, and indiscriminately, covered with blood, two of them later were killed in this affair”.

He goes on to say that the battle ended with three killed, and then describes the counterattack that his people staged the following night: “finding most of them asleep, laying about in groups, our party rushed upon them, killing three on the spot, and wounding several others. The enemy fled, leaving their war implements in the hands of their assailants and their wounded to be beaten to death by boomerangs.”

In pre-civilized war the beating, stabbing, or spearing to death of the wounded was routine. It may be appropriate here to mention that in accounts of the battle of Culloden, near Inverness in 1746, writers often wax indignant about the “atrocities” which followed the fighting. It is said that about 1,200 Highlanders of the Macdonald and Cameron clans lay dead or dying, and (in one author’s words) “what happened next was completely foreign to the rules of war…” Apparently the Duke of Cumberland “ordered his soldiers to spare no one, not even the wounded lying in the fields and woods. Hundreds of the fallen were shot or stabbed where they lay. Some were even buried alive.” And so on.

But this is how it has always been in tribal fighting. A 12-year-old girl who was taken captive by the Yanomamo in South America in the 1930s recalled later of one fight she witnessed, “then the men began to kill the children; little ones, bigger ones, they killed many of them. They tried to run away, but they caught them, and threw them on the ground, and stuck them with bows which went through their bodies and rooted them to the ground. Taking the smallest by the feet, they beat them against the trees and rocks. The children’s eyes trembled. They killed so many.”

the social contract

Returning again to our wider moral speculations: if this is how bad things were for the last million years or so, then there is a strong case for strong authority to stop it. And what Freud himself says is close to contract theory in more ways than one. For example, Freud writes that “Man’s natural aggressive instinct, the hostility of each against all and of all against each, opposes the programme of civilization.” In Civilization and Its Discontents he tells us, “I adopt the standpoint that the inclination to aggression is an original, self-subsisting instinctual disposition in man . . . (and) constitutes the greatest impediment to civilization.”

The fact that instinctual aggression is such a huge impediment must have been recognised quite early on, many thousands of years back. Once this happened, and reasonableness and the values of rationality attained critical mass, then a deal was done. “Human life in common”, he writes, “is only made possible when a majority comes together which is stronger than any separate individual and which remains united against all separate individuals. The power of this community is then set up as ‘right’ in opposition to the power of the individual, which is condemned as ‘brute force’ (and) this control represents the decisive step of civilization”.

It’s a pretty picture. Reasonable chaps meet other reasonable chaps and agree to behave better. But the raw material of many men and women is not reasonable. It is deeply instinctual, driven by the sort of animal desires which regularly end up in the more sensational newspapers. Sublimation is an unending social process. The work of taming instinctual impulses must be undertaken again and again with every new recruit to the social order. In brief, each individual conscience must be newly built, newly constituted, and newly installed in each individual, year after year, generation after generation, because (in Freud’s words) without this “civilization is perpetually threatened with disintegration”.

Freud employs a vivid metaphor to describe the setting up of conscience as a mechanism of moral control. “Civilization”, he writes, “obtains mastery over the individual’s dangerous desire for aggression by weakening and disarming it and by setting up an agency within to watch over it, like a garrison in a conquered city.”

What has been conquered is the instinctual city of dreadful night, the city of sinful homicidal wishes. What is set up like a garrison in the city is conscience, for without it collective life of a kind embracing millions of people living together could not take place. And that of course is what civilization is: not a family, not a hunting band, not a clan, and certainly not a tribe. It is a wholly new and unprecedented form of collective social life in which hundreds of millions somehow rub along together, largely anonymously.

beyond Freud

We will soon have to go beyond Freud. But let us first agree with Freud in stressing just how extraordinarily important the civilizational blocking of aggressive drives has been. He himself believed that in the evolution of civilization nothing was more psychologically important than the suppression of powerful animal instincts, socially destructive instincts representing a constant “hostility against which all civilizations have to struggle.” And nothing showed the importance of the spread of wider and wider forms of peaceful association better than that amazing injunction—so totally counterintuitive, so downright bizarre—“love thy enemy”.

But at the same time something else went completely unnoticed by Freud, who was not, of course, an economist: modern civilization as a whole is not, and cannot be, organised along old-style communal lines. Yes, indeed: we can agree that the suppression of individualistic aggressive drives is necessary for the wider form of human association we call civilization. There Freud got it right. But after this we have to say no. Wider forms of association, the form of human association Hayek called ‘the extended order’ involving hundreds of millions of people, cannot be based on the communal arrangements of older, more primitive social units, simply sustained by the moral rule that it is desirable to “love thy neighbor”.

There Freud got it wrong. This is of course the classical collectivist delusion. In fact, the lines on which peaceful, modern, spontaneously cooperative organization is built are broadly those of the free market—as indeed, from the 1930s on, people like Mises, Hayek, and Michael Polanyi tried patiently to explain—and these spontaneous forms of large-scale social order consist of vast self-regulating systems utterly different in their dynamics from tight little fraternally bonded communes.

So what have we here? A contradiction which splits many minds and many societies quite profoundly. It also produces loads of guilt among those who have deeply internalised the communal injunction “love thy neighbor as thyself”. From that guilt in turn comes a determination to uphold, to idealise, to promulgate as desirable and preach and promote the ancient communal ideal, come what may. But where can we find a living example of this ideal? The answer for many people is that we can now only find it in concrete form, incarnated so to speak, in those small-scale tribal societies that modern civilization has marginalised or actually swept away.

And the very fact that modernity has destroyed them deepens the feeling of guilt, and adds to the determination to overcompensate by honouring their memory, to atone for the sins of modernity by presenting them as worthy and admirable, to seek expiation by rehabilitating them as uniquely sympathetic cultural forms. Hence, in everything we say about them, the primitive world is morally transfigured so that all traces of violence and war are tastefully air-brushed away. This, I suspect, is what underlies much romantic primitivism today.

(This essay originated as a talk for Blackheath Philosophy.)

April 2005

Saturday, April 02, 2005

Goodbye to All That

The spirit of '68 still lives on in some quarters of the left.

Too bad -- there are much more effective ways to be an opposition party than by reliving the past.

By Kevin Mattson Web Exclusive: 03.28.05


With conservatism dominant in every branch of government, it is clear that liberals are an opposition party. We have to think, act, and strategize like an opposition party. That means figuring out ways to articulate what we stand for while not alienating those who may disagree with us but can be persuaded to see things our way. That’s a difficult balancing act. Of course, the postwar left has been in opposition before, and that’s a historical fact that can be turned to advantage -- there’s a track record to examine and think through, and a set of political styles and strategies for change to reflect upon. Examining this history can mean recycling good ideas and tactics. But what if it means recycling bad ones?
No doubt, some progressives will be drawn to the protest movements of the 1960s to inspire opposition today. There are good reasons for this. The world that existed before the ’60s is one that no one wants to go back to. The decade witnessed enormous victories for African Americans, women, and the poor. The civil-rights movement -- with its pioneering use of nonviolent and grass-roots “direct action” -- prompted these advances. It also gave birth to a new form of politics that championed the energy of ordinary citizens and that carried on within the peace movement’s struggle against the Vietnam War. College students, through the teach-in movement, learned how to connect their learning to political engagement. The decade seemed a golden age of political idealism.
Remembering the ’60s as a time of heroic activism -- when ordinary citizens changed the terms of politics -- suggests we might be able to recycle those protest styles today. Younger activists are doing that as they march on Washington, against the Iraq War or in favor of abortion rights. The left is often identified, in the press and in popular imagination, as a series of marches. Protest has become an easy way to express dissent. It’s often highly visible and focused in terms of time and resources. When people mass in the streets -- as they were known to during the 1960s -- it appears something is wrong in the country that demands attention. And because protest activists are the most vocal element of the left, they attract the energy of young idealists yearning for a way to express their political disaffection. Take it from someone who’s marched a lot in his life: There’s an emotional appeal to massing with others you share solidarity with.
But there’s also a limit to protest. With its emphasis on criticizing rather than building, it nurtures a narrow conception of opposition. Of course we need to criticize, especially with this administration in power. But for the long term, it’s far more important at this historical moment that we build. The left needs to think about long-term and broader ideas of change. Protest doesn’t help here; it’s too fleeting and spasmodic.
To romanticize protest and the decade of the 1960s cuts us off from rethinking -- with a cold, analytical eye -- the decade’s lessons. The spirit of the ’60s has something to teach us, for sure, but it’s a mixed message, one that lives on in the activist wing of today’s left in troubling ways. We need to search out styles, dispositions, and ideas that can inform our present sense of being an opposition party -- and we need to widen what we choose from. We also need to recognize how the past’s influence precludes more productive strategies for the present, how what might have worked in a previous context no longer works today. To get a sense of this, we need to travel back to 1968, to a time when the decade’s meaning crystallized, a time that seems far gone at first but whose images and memories live on in disturbing ways today. Remembering the past critically allows us to be a more effective opposition in the present.
Protest and Confrontation as Politics

Both internationally and in the United States, 1968 remains one of the most evocative years in the history of the left. The spirit lives through images of protesters massing in the streets and Molotov cocktails zinging through the air. Protest and anger aren’t the only tendencies from the time, but they are certainly the most evocative. Mark Kurlansky, in his book 1968: The Year that Rocked the World, explains the allure: “People under twenty-five do not have much influence in the world. But it is amazing what they can do if they are ready to march.” Breaking from the limitations of the sidewalk into the streets now conjures a feeling of exhilaration and radical accomplishment.
No occasion in American history symbolizes this more than Chicago’s Democratic convention during the summer of 1968. Memories of Chicago come easy due to its highly charged political theater. Abbie Hoffman’s organization, the Youth International Party (Yippies), planned to protest the Democratic convention with a “Festival of Life” that would nominate a pig picked up from a local farm for president. Protesters were refused permits but insisted on marching, while Richard Daley, the mayor of Chicago, did all he could to spark a fight. Chicago became a pressure cooker, a leading Yippie calling it “a revolutionary wet dream come true.” When the riots occurred and the police clubs started swinging, protesters chanted, infamously, “The whole world is watching.” Unfortunately for the protesters, America watched, all right -- and cheered for the working-class cops of Chicago, for the “man” sticking it to the longhairs in the streets. Protest, confrontation, and outrage didn’t elicit the intended sympathetic response. Anger killed strategy.
It may be easy to overstate the resonance of such tactics today, but a romanticism about them does exist among those who still believe in street protests. When Rick Perlstein interviewed organizers of the 2004 protests at the Republican convention, he found them championing direct action and confrontation as a tactic. Check out the A31 (August 31) Action Coalition, an organization based in Brooklyn that was angry at New York City’s permitting system that confined protesters to certain areas. A31’s leaders hoped to “transform the streets of NYC into stages of resistance….” They called for people to “sit down and refuse to move,” and to ignore the limitations of “protest pens” set up by police. To make the connection to 1968 crystal clear, they posted a recent op-ed by Tom Hayden on their Web site -- no surprise, as Hayden had argued in 1968 that Chicago symbolized a move toward “direct action and organization outside the parliamentary process,” language remarkably similar to that used by A31.
This was not the only organization that recycled protest styles of 1968. There was Dontjustvote.com and the old peace movement organization, The War Resisters’ League (WRL), both celebrating action in the streets, no matter the consequence. A leader of the WRL told Perlstein, “We need to do what we think is right to do, and not so much worry about, ah, ‘Well, what if this? What if that?’ I think we need to do what our conscience tells us is important to do….” When Perlstein asked if this might alienate the wrong people, the organizers shrugged. These activists seemed in the clutches of 1968, transported back to Chicago and prepared for the worst. Fortunately, this time, the “whole world” wasn’t watching.
It’s remarkable how much these protesters live in another era. Over and over, they use Martin Luther King Jr.’s words to justify their actions. They especially like the following quote (seen on numerous Web sites) from “Letter from a Birmingham City Jail” (1963): “Nonviolent direct action seeks to create … a crisis and establish such creative tension so that a community that has constantly refused to negotiate is forced to confront the issue.” Plucked out of context, the quote suggests thoughtful political strategy. After all, these activists are appropriating America’s best political thinker on nonviolence and democratic change.
But in plucking the quote, these activists ignore its context. Go to the rest of the document and you find much more. King was explaining how a minority, African Americans, could struggle to make a moral appeal to a majority. He believed black Americans had to highlight “the best in the American dream” in order to be heard. And civil-rights protesters had to rule out other options before embracing the challenging ethic of nonviolent direct action. You had to have moral merit on your side -- what Reinhold Niebuhr called a “spiritual discipline against resentment” -- before rushing into the streets.
Today’s protesters ignore King’s reflections on his own historical context. Consider that John F. Kennedy was president when King wrote his letter, and that King was one of Kennedy’s most astute critics. King believed in 1960 that candidate Kennedy “had the intelligence and the skill and the moral fervor to give the leadership” the civil-rights movement had “been waiting for.” Soon, though, King realized Kennedy had “the political skill” but not “the moral passion.” Nonviolent direct action, with its intention of creating conflict to expose tension, was precisely the tool to jump-start that moral passion. King saw an opening that the movement could prod, and this got him the legislation he desired: the Civil Rights Act of 1964.
The year 1963 was its own time, distinct from 1968 and certainly 2004. George W. Bush is no John F. Kennedy, and today’s Republican leadership in Congress is a far cry from the Congress of 1963–64. The chance that Bush and congressional Republicans would be prodded into some kind of action by such protests is zero (unless, indeed, protest moves them to act more forcefully in the other direction). The protesters at the Republican convention of 2004 might have imagined themselves as working in the tradition of King. But the context had shifted so drastically that their actions fell on -- quite literally -- deaf ears. It wasn’t even clear what they hoped to accomplish. And when the goals aren’t clear, protest means little more than expressing rage. That’s why it often takes the form of political theater, which too often encapsulates those who make it in their own hermetic world; it replaces explanation of political ideas and policies with in-jokes and references that confirm pre-existing opinions. If you know a pig stands for a white guy with power, you get it; if not, you don’t.
There’s a recent, evocative documentary, The Yes Men, that focuses on two activists inspired by the French Situationists (intellectual forerunners of France’s 1968 uprising) and the Diggers (politically minded hippies before Hoffman). They pose as representatives of the World Trade Organization and attend business gatherings exhibiting a television monitor that polices workers and pops up like a phallus in a blow-up suit. They get applause in rooms of 30 people, although it’s not clear why. The movie winds up showing these “activists” as all-knowing lefties snickering at their opposition. The climactic scene involves their presentation to a college classroom, where students protest their idea of turning human feces into McDonald’s hamburgers sold to citizens of the Third World.
Unlike political humor that entertains, political theater has a pretense of changing public life. The Yes Men think of themselves as activists, but the tendency to laugh at their opposition rather than engage it betrays their project’s limitation. Asked about the “mind-set of the corporate man” who might resist their jokes, these activists call them “ready to goosestep.” Generally, people are “easy prey for the ideas of the corporate decision-makers.” The Yes Men characterize their opposition as “dumb asses” who wouldn’t “listen anyhow.” “Criticizing those in power with a smile and a middle finger” is what they intend. Expression trumps strategy.
Expressive Anti-Politics

Indeed, guerilla theater and protest as outrage suggest another legacy of 1968: expressive anti-politics. This element of political style draws from pop existentialism and participatory democracy. Once again, it crystallized in Chicago, and specifically in Tom Hayden. By 1968, Hayden was disenchanted with electoral politics and supported urban riots and Third World guerilla fighters. Chicago ratified his break from electoral politics, especially when Eugene McCarthy’s supporters spilled out of the convention and into the streets. The left had literally split -- those inside the hotel symbolizing electoral politics (the fogies), and those outside practicing direct democracy in the streets (the youth). Here can be found the essence of expressive anti-politics and its long legacy of liberal powerlessness.
The impulsive nature of direct action -- its immediacy -- is precisely its major appeal for today’s activist left. L.A. Kauffman, an organizer involved with United for Peace and Justice (a leading anti-war organization that formed in the last few years), explains, “Direct actionists devote little if any energy to lobbying or passing legislation; if they interact with the government, it’s almost always by raising a ruckus.” Here’s a curious embrace of protest over power -- the bizarre idea that a presence in the streets can substitute for a presence in the halls of government, or that reacting to government action is morally superior to initiating it. The sentiment is echoed in the ideas of Dontjustvote.com, an organization that was created for protests at the Republican convention of 2004 and a clear inheritor of the spirit of ’68. As its Web site explains, the organization embraces “the power of direct action” and “direct democracy as a viable alternative to representation.” This is the political theory of street action or, put more positively, “participatory democracy.”
The idea’s salience arises from its respectable lineage in American political thought, which stretches back to Thomas Jefferson and John Dewey. Dewey believed democracy required a home in the local neighborhood where discussion and association took place. When members of Students for a Democratic Society (SDS) gathered in Michigan in 1962 to write the famous “Port Huron Statement,” they outlined the demands of participatory democracy and invoked Dewey’s ideals. But they also invoked a jargon of authenticity taken from existentialist philosophy. While embracing “a democracy of individual participation,” they hoped to find “a meaning in life that is personally authentic.”
But there’s a problem with proclaiming both of those as goals: Authenticity of the self and actually living in a democratic community with other citizens who hold varying opinions are two very different -- if not, in fact, irreconcilable -- demands. In Chicago, the two ideals clashed, and authenticity won out. Protesters pitted themselves against the inauthentic masses -- the police, those who believed in the Vietnam War, the “pigs.” When this occurred, participatory democracy no longer supplemented representative democracy but replaced it; authenticity displaced the challenge of deliberating with other citizens who might disagree. To be authentic meant to give direct expression to desire rather than to work through a longer process of changing representative institutions. It focused on what George Cotkin, the historian of American existentialism, called “catharsis.”
Critics noticed the dangers at the time. As Christopher Lasch wrote soon after the Chicago convention, “The search for personal integrity could lead only to a politics in which ‘authenticity’ was equated with the degree of one’s alienation, the degree of one’s willingness to undertake existential acts of defiance.” Bayard Rustin agreed, arguing that the participatory ethic of protest threatened the importance of doing actual politics, which required coalition-building and compromise, and wound up pitting leftists against liberals in a dangerous internecine warfare and mutual alienation. But clear as this might have been to some back then, the idea’s appeal lives on in the activist left’s disposition to political action combined with a lack of realism -- a disposition apparent today when expression trumps effectiveness. Go back and read the statements of Naderites in 2000, or the shriller ones from 2004. You can hear moral fervor trumping political responsibility -- the idea that voting is about expressing conscience rather than influencing policy. When The Progressive interviewed the few remaining Naderites working in the swing state of Wisconsin in 2004, the publication confronted purist sentiment. Supporters explained that they were “principled” while those supporting the Democrats were “muted.” One went so far as to say, “It’s not important who’s sitting in the White House, it’s who’s sitting in.”
This is the ugly legacy of 1968: the authenticity of conscience pitted against the requirements of a pluralistic and conflicted society, the ethic of expression winning out against all other aims, including practicality. “Direct nonviolent action” no longer means what King believed it meant; it now means remaining pure by turning “Your Back on Bush,” as recent protesters did at the inauguration, even if the result wasn’t anything more than making them feel better. Expressive anti-politics is the last refuge of the powerless. Impulsive, it bursts like a flame and then burns out, to be felt only in the heart of the participant while the ruling class, unperturbed, goes on its merry way.
The Right(’s) Lessons from the ’60s

Burnout is a constant theme of 1968. We’ve heard the refrain about “tired radicals,” and the one about Yippies turning into yuppies. Even while appreciating the social movements from this time, Paul Berman (who was a part of it all) admits, “The uprisings proved amazingly unproductive in regard to conventional political or economic change.” The historian Alan Brinkley comments, “The new radicals” of 1968 “never developed the organizational or institutional skills necessary for building an enduring movement.”
Meanwhile, of course, an enduring movement was being built during the ’60s -- but it was on the right. Historians of the decade used to focus on left-wing organizations, writing books about SDS, the Student Non-Violent Coordinating Committee, or the Southern Christian Leadership Conference, typically culminating in the tumult of 1968 and thus telling a story of factionalism and decline. Today, however, historians are growing more interested in documenting the right and telling a tale not of decline but of ascendance. James Miller, who wrote a marvelous book about SDS, explained to the magazine Lingua Franca a few years back that “in terms of the political history of this country, the New Left just isn’t an important story.” Focusing on the left, he explained with a certain irony about his own historical work, evades “the extraordinary success of the forces that first supported [Barry] Goldwater, then [Ronald] Reagan as governor of California, and then [George] Wallace. I can’t help but see that absence in the historiography as integral to the mythologization of the Sixties.” Miller echoes the argument of M. Stanton Evans, a leading conservative intellectual and popular writer, who wrote, “Historians may well record the decade of the 1960s as the era in which conservatism, as a viable political force, finally came into its own.”
When Evans wrote that line he was discussing an organization that still grabs the attention of young historians today: Young Americans for Freedom (YAF). YAF’s membership was always more stable and often larger than SDS’s, but more importantly, the group created a longer-lasting infrastructure. It engaged young people philosophically, through a ringing endorsement of liberty and individualism; but it also engaged them with well-organized chapters on campuses that cultivated long-lasting skills for activists (Richard Viguerie, for instance, pioneered his direct-mail tactics through YAF). YAF worked with the Intercollegiate Society of Individualists to coordinate lectures of right-wing thinkers and circulate conservative books to students. It linked up with Goldwater and Reagan, supplying an army of young volunteers for their campaigns. Did it engage in protest? Certainly not. During its “heyday in the early ’60s,” Maurice Isserman and Michael Kazin point out, YAF members went to “the lectern and the party caucus more than into the streets.”
The networks of YAF were replicated for adults in places like Orange County, California. Here, there were chapters of the John Birch Society that supported local school-board candidates and institutions like the Orange County School of Anti-Communism, where conservatives could fraternize, learn about boycotts of corporations selling products to communist countries, and hear Reagan speak before he even considered a run for governor. There were also barbecues, coffee klatches, and discussion groups that congealed a conservative animosity toward the federal government and liberalism. Churches and right-wing bookstores helped provide “movement centers,” and the infrastructure was especially impressive considering the decentralized, suburban setting.
These networks explain the passion and long-lasting influence behind Goldwater’s run for the presidency in 1964. Traditionally, the campaign was seen as a right-wing disaster. Goldwater’s convention speech in favor of “extremism” still sounds scary. But now, more remarkable is the infrastructure that stood behind Goldwater. A strong network of activists worked hard to push the Republican Party toward the right, away from centrists like Nelson Rockefeller. It wasn’t enough to win the presidency in 1964, but that same infrastructure -- YAF, John Birch Society chapters, and general right-wing networks -- helped Reagan become governor of California in 1966. As Isserman and Kazin explain, conservatives “sustained morale and kept expanding their numbers for years after the young radicals had splintered in various directions.”
We can link this scholarship about conservative grass-roots activism to something already well-known: that throughout the 1960s, the right was developing ideas that would come to fruition much later. Leading this initiative was the well-known (now at least) American Enterprise Institute (AEI). Though founded in 1943, it changed form during the 1960s. Its leader, William Baroody, believed it should not just reflect the right’s primary “special interest” -- corporations -- but develop bigger ideas. Baroody “understood,” as Sidney Blumenthal explained in The Rise of the Counter-Establishment, “that without conservative theory there could be no conservative movement.” Baroody forged alliances with the Goldwater campaign quietly, behind the scenes. He focused on long-term goals so that, when the excesses of the ’60s erupted, there was a place neoconservative intellectuals could go to develop their ideas during the ’70s. The AEI articulated both particular public policies and a broader philosophy of the free market -- something that undergirds conservative political action today. And, of course, it provided a model for other conservative think tanks during the ’70s.
The power of YAF, grass-roots networks, and think tanks like the AEI shows that the right focused its energy on infrastructure and ideas during a time when the left focused on protest. The right’s tactics weren’t loud or theatrical. Its activists operated under the radar to lay the groundwork. They worked almost entirely within the system, changing the Republican Party from moderate to conservative precinct by precinct. And their story challenges the left-wing narrative of idealism during the decade. That’s precisely why it should inform the way liberals think about the future. To win real power, liberals need to think about infrastructure, institutions, and ideas. And they’re not going to get these if they look to the late ’60s for inspiration.
The Spirit of 1948: New Ideas in the Old

This is especially true for ideas. Who now reads left-wing books from 1968? Just try Abbie Hoffman’s Revolution for the Hell of It or Woodstock Nation. Or try Theodore Roszak’s The Making of a Counter Culture, a puff piece about the “non-intellective” exploration of “visionary splendor” and “human communion.” Or read the prognostications of a “revolution” in “consciousness” in Charles Reich’s The Greening of America. Read even the otherwise smart Susan Sontag, who praises the worst elements of Third World revolutions in Styles of Radical Will (she later backed away from many of those positions). All of these books reflect a utopian hallucination not dissimilar from the style of protests on the streets of Chicago in 1968.
Younger thinkers today are going further back than the ’60s to rediscover good ideas. It’s the Cold War liberalism of the ’40s and ’50s that has garnered the most interest. Books like Arthur M. Schlesinger Jr.’s The Vital Center or Reinhold Niebuhr’s The Irony of American History or John Kenneth Galbraith’s American Capitalism seem much more interesting than The Making of a Counter Culture. There’s good reason for this, because though we might feel closer to the ’60s chronologically, our own age is much more parallel to the ’40s. Then, as now, liberals faced an international enemy -- Niebuhr’s “children of darkness” -- willing to murder for salvation. Then, as now, liberals confronted conservatives who entertained dangerous ideas of launching preemptive wars abroad while slashing social programs at home. And, if we take the ’48ers up to 1952 and the election of JFK in 1960, then, as now, liberals were often an opposition party.
The ’48ers knew they had to articulate a public philosophy, the way conservatives would later. They sketched out broad principles that transcended liberal interest groups. Those principles grew out of their faith in the American nation as a community of citizens sharing mutual obligations to one another -- the sort that they saw during World War II and that they hoped could live on afterward. The ideas of national greatness and patriotism grounded their political thought. They upheld a public purpose that highlighted the weaknesses of the libertarian right and led them to criticize the “social imbalance” of a society enamored of consumerism and markets, and not America’s civic fabric. Politically, they supported the idea of a “pluralist” government with many voices participating, not just those of business and privilege. They wanted influence on the inside, not protest from the outside. In The Vital Center, Schlesinger wrote, “Our democratic tradition has been at its best an activist tradition. It has found its fulfillment, not in complaint or in escapism, but in responsibility and decision.”
The ’48ers, so far as I know, never marched against American actions abroad. What they did do was construct a framework for a liberal foreign policy, a robust alternative to conservative emphasis on military action and “rolling back” the enemy. The idea of containment was not simply a doctrine of realism but a moral disposition toward the demands of national power. America certainly had a strong role to play abroad, the ’48ers argued, but it had to do so with a sense of “humility.” So, for instance, Niebuhr, drawing upon Christian ethics (not yet the sole property of the right), argued against “preventive war.” Those who articulated such an idea “assume a prescience about the future which no man or nation possesses.” He went on to explain, “We would, I think, have a better chance of success in our struggle against a fanatical foe if we were less sure of our purity and virtue.” Learning this lesson required America to work with others to “reconstruct” poorer economies as much as engage with military power. This was to be a war of ideas as well as guns.
These thinkers didn’t just think; they put ideas into action. They attended international conferences of the Congress for Cultural Freedom, where they argued that America stood for more than a prosperous consumer economy. (Richard Nixon had made this assertion to Nikita Khrushchev in 1959, displaying a gleaming American kitchen to the Soviet leader at an exhibition fair; Galbraith chided Nixon’s equation of democracy with consumer triumph as a “simple-minded and mechanical view of man and his liberties.”) The ’48ers also befriended politicians. Unlike our own age, when politicians hire overpaid consultants with few ideas, during the ’50s, politicians turned to intellectuals. In 1953, Galbraith formed the Finletter Group, which collected papers on topics by scholars and writers, crafted speeches, and found ways to have ideas inform public debate. Most famously, Americans for Democratic Action became an organizational forum where intellectuals and politicians could formulate foreign and domestic policy together. In this and other ways, they found outlets for ideas that could become a source of opposition as well as inspiration.
These strengths shouldn’t allow us to ignore their limitations. These thinkers took things for granted, including their privileged status as white, highly educated men. They sometimes had a hard time accepting the activism of the ’60s, and they were slow to see how their own anti-communism, legitimate though it was, could descend ineluctably into the disaster of Vietnam. Their experience of the staid 1950s, when bureaucratic corporations accustomed themselves to the welfare state, made them take Keynesian policies for granted. In going back to these thinkers, we need not romanticize them. Indeed, one of their central weaknesses, taking the welfare state for granted, should inspire our thinking today.
The Past’s Lessons for the Future

This quick tour through postwar history gets us closer to what it means to be an opposition party today. First, we need to question the legacy of protest politics and political theater, which makes activists feel good but alienates and confuses others. We need to build a grass-roots infrastructure, like that developed by the right. We should also start reconstructing liberalism by going deeper into the past, while recognizing the limits any set of ideas from the past naturally has. These are some good first steps to take, but obviously they are just the beginning, and mostly about looking backward, not forward.
If we take these lessons seriously, our biggest challenge moving ahead is how to articulate our opposition to the right’s well-developed agenda while simultaneously developing a public philosophy like that of the ’48ers. The need for this became abundantly clear in the last presidential election. John Kerry lost because Americans didn’t understand what he stood for. They understood him as an opposition candidate but not as someone who had “values” that could be articulated and explained. This wasn’t just Kerry’s problem; it is the problem of liberalism generally. The public perceives liberalism negatively, due to the long war the right waged against it from the 1960s onward. Unlike the ’48ers, we cannot assume that our ideas resonate; we need to make them resonate.
To rearticulate liberal ideals while acting in opposition is not as hard as it first appears. Take Social Security. Clearly, Bush is surprised by the backlash against privatization, as he scrambles around the country garnering support. This appears a dream come true for progressives, but it’s much more. It’s a challenge to articulate not just opposition but a public philosophy that can explain what liberals stand for. We shouldn’t defend a program inherited from the New Deal in a rearguard fashion but should reiterate the idea of a shared national purpose based on collective sacrifice.
Nor should we turn this into a demographic issue and bank on the elderly supporting Democrats; that’s interest-group politics, not a long-range public philosophy. We need to explain what Social Security teaches the nation about deeper principles. Why do Americans react against the term “privatize”? Because there is still a sense of shared obligation to one another, and it’s up to liberals to articulate that public philosophy while they oppose the president. We can show how the president’s proposal reflects the “social imbalance” the ’48ers perceived, the elevation of the self’s interest above the common good. None of this requires protest. It requires public argument. The time for protest may come, but it will undoubtedly rely on a change of leadership first and serious thinking about strategy later.
The same needs to be done on foreign policy. It’s not good enough to protest the Iraq War. Occasionally, Kerry articulated an alternative, albeit muted, to Bush’s foreign policy that embraced the ’48er idea of national humility and a critique of hubris. Today, we need to articulate this liberal foreign policy more forcefully. Its central message should be that American responsibility abroad shouldn’t rely on guns alone or a sense of superior moral virtue. Liberals should argue for nurturing civil society and democratic institutions throughout the world, envisioning an equivalent of the Marshall Plan for the Middle East and elsewhere. Liberals need to emphasize that the war against terrorism is a war of ideas as much as a war of military power and intelligence. Like the ’48ers, liberal intellectuals should define America abroad as more than just its well-known Hollywood films. We need not allow Bush to expropriate the rhetoric of democracy and freedom; we need to reshape these ideas in a more responsible and meaningful manner.
Liberals must also talk about shared sacrifice during wartime. This shouldn’t be about getting the military vote, even if that wouldn’t hurt. The tradition of national greatness expects shared sacrifice from all members of our society. As JFK quipped, “Ask what you can do for your country.” Only liberals will make it clear that the wealthiest elements of society should provide for the common good, so that we have enough to pay veterans’ benefits and provide other services. None of this will come from protest marches against the war, which to date have accomplished little more -- as unfair as this might seem -- than to permit the partisans of the right to raise questions about the left’s patriotism.
The problem with what I outline here is the lack of places to build and articulate ideas and have them inform the thinking of Democratic politicians. Now is certainly the time for progressives to invest in building an infrastructure -- the only alternative to spasmodic protests in the streets. The term “progressive infrastructure” seems to spark interest among some funders today, especially considering how the quickie infrastructure built in 2004 -- notably America Coming Together -- didn’t quite do the trick. It’s time for institutions that can approximate what Americans for Democratic Action did during the Cold War -- provide a space where thinkers and politicians meet -- and build local networks. Of course, this requires that Democratic politicians stop relying so heavily on overpaid consultants, and that wealthier progressives pony up money for institutions without immediate impact.
This leaves open the question of how to relate to the “actually existing” protest left today. The ’48er spirit was recently invoked to call for a purge of the left’s protest wing. Writing in The New Republic, Peter Beinart suggested that MoveOn should be pushed out of a more responsible left. While I think MoveOn deserves criticism for its pacifism and for teaming up with hard-left dinosaurs like ANSWER, it doesn’t merit a purge (purge from what, exactly?). What MoveOn needs is an articulation of the principle of “responsibility” that Schlesinger set out against the spirit of alienated protest. There’s reason for hope on this front. After all, Mother Jones described MoveOn’s young leader, Eli Pariser, as a “scruffy indie-rock fan who not long ago was chanting anti-globalization slogans and confronting riot police at World Bank meetings.” At one anti–International Monetary Fund protest, though, he talked with police and, in his own words, “realized that the scripted confrontation of attacking and antagonizing them wasn’t going to get us anywhere. It changed the way I was thinking, tactically.” This idea of laying groundwork for an infrastructure also came out in MoveOn’s work during the last election; it didn’t succeed, but with a little help from a stronger intellectual infrastructure in the future, it might.
My tempered hope about this comes from a sense of urgency about the Bush administration. Such a sense threatens to degenerate into protest theatrics and expressive anti-politics. Instead of embracing those styles from the past, liberals should take their lessons from the right during the 1960s. Liberals will never be as powerful as the right. That’s not just because the right is richer but because the liberal faith is, by definition, weaker. Unlike evangelical Christianity, liberalism can never provide absolute zeal or commitment. We can draw some inspiration from the “fighting faith” of the ’48ers’ liberalism, but we also face challenges that they never faced, especially the infrastructure the right has built over the last few decades. With this said, liberals don’t need to be as weak as they are now. We need not recycle protest and alienation from the past. Liberals have been in the opposition before, and they’ve managed to win back political power. But it took care and precision and some serious thinking about strategy. That’s our charge today.
Kevin Mattson teaches American history at Ohio University and is the author, most recently, of When America Was Great: The Fighting Faith of Postwar Liberalism.
Copyright © 2005 by The American Prospect, Inc. Preferred Citation: Kevin Mattson, "Goodbye to All That", The American Prospect Online, Mar 28, 2005. This article may not be resold, reprinted, or redistributed for compensation of any kind without prior written permission from the author. Direct questions about permissions to permissions@prospect.org.