A new book on failure and humility

Last week, The Washington Post published my review of philosopher Costica Bradatan’s new book on failure and humility. I thought it might be of interest to many of you.

At the start of his latest book, “In Praise of Failure: Four Lessons in Humility,” the philosopher Costica Bradatan notes without chagrin that when we consider our origins and our ultimate fate, humans are not very impressive. We are designed to fail, he emphasizes, and death is the framework for all our attempts to make something of ourselves. In a previous book, “Dying for Ideas,” he considered how philosophers across the ages wrestled with mortality. In “In Praise of Failure,” he looks at how various thinkers — Seneca, Mohandas Gandhi, Simone Weil, Emil Cioran, Yukio Mishima — detached themselves from an obsessive drive for worldly success by reckoning with failure and death. Bradatan wants us to grasp how striving to succeed prevents us from dealing with our mortality and hence from living a more meaningful life.

One hears plenty of voices these days singing the praises of failure, but Bradatan is not to be confused with those Silicon Valley types who drone on about an “iterative process,” aiming to “fail better.” Those folks like to quote a snippet of Samuel Beckett in this regard, but, observes Bradatan, the stern Beckett actually proposed something much more pessimistic: “Fail worse again. Still worse again. Till sick for good. Throw up for good.” Bradatan quotes this passage approvingly because he despairs of those who would co-opt the idea of failure into some happy tale of ultimate progress. Reading this most interesting philosopher, I was reminded of a Bob Dylan lyric: “There’s no success like failure, and … failure’s no success at all.”

The French thinker Simone Weil brooked no happy tales. She was attracted to suffering and has attracted readers somehow satisfied by her failure to find anything satisfying. Weil was moved to help those in distress but was quite inept at doing so; and since she was writing during the Nazi occupation, there was plenty of distress to go around. Still, her identification with suffering has struck many of her readers as noble, and Bradatan thinks she had mystical insight into the ways that things fall apart. All things. Always. Weil developed the notion of “decreation,” which is “to make something created pass into the uncreated,” thus getting closer to God. The things of our world are products of the Fall, and by giving up on the material world we “give back to God what is properly his.” Bradatan sees a radical humility in Weil’s luxuriating in suffering; another might see mostly mystical arrogance in her insistence that by abjecting herself she approached the divine.

The second tale of failure concerns politics, and here Bradatan is especially good at showing the hypocrisy of leaders who proudly display their humility. Front and center is Gandhi, who worked very hard at showing that he had given up striving for anything like material success. He lived a very public life of renunciation to inspire those around him to find meaning in their poverty. Bradatan quotes one of the great Indian leader’s aides who bemoaned how expensive it was to keep the Mahatma in poverty. Political leaders who become inordinately powerful, Bradatan emphasizes, are those who tell stories that satisfy their community’s desire for meaning. Going beyond Gandhi, he shows that the most dangerous stories are those that ground that meaning in a violent attack on an enemy, a scapegoat. The leader is the opposite of that enemy, embodying the patriotic virtues to which the community aspires. The moral is that the search for political purity is always dangerous.

The third tale of “In Praise of Failure” explores how we frame “the losers,” the people who just can’t measure up to the standards of the world around them. The doctrine of predestination is particularly tough in this regard since it deems losers those whom God has not selected for salvation. There is nothing the losers can do, though many wind up striving for worldly success because they imagine this is the way to prove they are among God’s chosen. The central figure of this chapter is the idiosyncratic Romanian writer Emil Cioran, for whom Bradatan has enormous sympathy. Like Weil, Cioran had terrible judgment and seemed incompetent at everything but writing (especially about his incompetence). But whereas Weil found something divine in her failures, Cioran was content to trace the human dimensions of not being able to do anything right. “Only one thing matters,” he wrote: “learning to be the loser.” I don’t understand why Bradatan finds something redemptive in this embrace of failure, but the Cioran he presents is an entertaining, aphoristic writer whose pessimism becomes comedic, and the ability to laugh in the face of inevitable failure is for Bradatan a very good thing.

The fourth and final tale in the book concerns, you guessed it, death. A concern with mortality hovers over all of Bradatan’s writing, and in this section of “In Praise of Failure” he underscores that “nothing in the world compares to what we experience when we face the ultimate failure: our own death.” This is the kind of thing that a great many thinkers have said for a very long time, and here Bradatan selects two: the Roman Stoic Seneca and the Japanese novelist Mishima. Although separated by millennia, the two men are joined by the strong desire to make a good death. Bradatan tells us about the complexities and hypocrisies of each and how both, despite years of planning, botched their suicides in gory, if not obscene, ways. But he respects their willingness to consider (even choreograph) their deaths in detail, even if this didn’t seem to help all that much when the final moments came.

Bradatan wears his erudition lightly. He is a pleasure to read, and his prose conveys a happy resilience in the face of life’s inevitable contradictions. His lessons in humility remind us that the pursuit of success is often motivated by the dread of failure — and that our attempts to create things are often driven by an avoidance of our mortality. The Stoics considered fear of death to be debilitating, and Bradatan emphasizes that fear of failure can sap the meaning from our lives. It doesn’t have to be this way, he assures us.

Acknowledging that we are designed to fail might lead us to live more joyfully and meaningfully, whatever our origins and ultimate fate.

Michael S. Roth is president of Wesleyan University. His latest book is “Safe Enough Spaces: A Pragmatist’s Approach to Inclusion, Free Speech and Political Correctness on College Campuses.”

In Praise of Failure

Four Lessons in Humility

By Costica Bradatan

Harvard. 273 pp. $29.95


Reviewing Lewis Hyde on Forgetting

In Sunday’s Washington Post I published a review of a new book on forgetting by Lewis Hyde. For many years, I was very engaged with memory studies, and my own area of interest was memory abnormalities. I was focused on diseases of memory, especially as they were understood in 19th-century Europe: amnesia, nostalgia, hysteria. I published related essays on these topics in The Ironist’s Cage: Memory, Trauma and the Construction of History (1995) and Memory, Trauma and History: Essays on Living with the Past (2011).

Lewis Hyde will be visiting Wesleyan’s Center for the Arts in November. He’s a fascinating thinker — whether you’re interested in neuroscience, politics, the arts, or just in how to make sense of the past. 

How much memory do you want? asks the salesperson at the technology store. Watching the evening news, I see commercials for a pill that will enhance my memory almost as often as I see ads for another pill that will regulate my mood. I wonder if one medication counteracts the other. We live in a culture that seems to prize memory, even as it gives us technological tools to eliminate our need for it. I can ask my phone for most bits of information that I have trouble bringing to mind, and if I forget where I left my phone, my iPad can remind me.

Still, memory remains a subject of reflection and anxiety — not least because as people live longer, more of them are surviving without connections to their past. The destruction of memory caused by Alzheimer’s disease is often experienced as a destruction of the self. It can be terrifying to those who suffer from it and seem like the ultimate cruelty to loved ones who are no longer recognized. In “A Primer for Forgetting,” Lewis Hyde doesn’t ignore the pain of involuntary amnesia, but he is much more interested in the liberating aspects of “getting past the past,” as his subtitle puts it.

His book is organized loosely — it’s made up of four notebooks of aphorisms and reflections on a wide variety of sources that discuss what it means to lose the past. When you turn your attention to forgetting, does that mean you are in fact remembering? This question runs through Hyde’s beautiful prose like a bright red thread, or perhaps a string tied around your finger. He wants readers to acknowledge how sweet it can be to get free of one’s memories — free of the baggage attached to the self. The weight of the past can lock us into repetition; it can also instigate a desire to set the past right, to correct or avenge misdeeds. “The tree of memory set its roots in blood,” Hyde emphasizes, and he asks, “Could there be an art of forgetting that puts an end to bloodshed?”

Hyde’s four notebooks explore Myth, Self, Nation and Creation. He surveys Western traditions and delves into Buddhist teachings that urge us to let go of ego-building in favor of nourishing “serene self-forgetfulness.” Hyde is especially attracted to artists who manage to forget their habits of mind to unleash the freedom of creative thought.

But what of the wounds of the past? Doesn’t the quest for justice insist on remembrance? It is to Hyde’s great credit that he dwells on cases that demand recollection to shake off the chains of past horrors. He remembers Charles Moore and Henry Dee, the young African American men who in 1964 were brutally tortured by Klansmen before being drowned in the Mississippi River. It wasn’t that hard to find those responsible, but it took more than 40 years for anyone to be brought to justice. Hyde is fascinated by Thomas Moore, brother of the murdered Charles, who for years plotted violent revenge but wound up going back to the scene of the crime and forgiving one of those responsible. Hyde discovers a sense of awe and mystery in the way Thomas “freed himself from servitude to the Unforgettable, and became the agent of his own recollections.” This Thomas Moore didn’t need a utopia. He achieved the freedom that comes with some forgetting and was nourished by the peace that forgiveness brings.

Thomas, who had long suffered from the memories of violent injustice, achieved agency through the “work of forgetting.” For Hyde, there is a lesson here for those who wind up paying for the ways nations construct their bloody myths. Victors manipulate memory to perpetuate injustice, as the United States did when it urged citizens to forget the strife of the Civil War so as to preserve white supremacy. “Violence denied and repressed doesn’t disappear,” Hyde writes, “it repeats.” He wants nothing to do with the “organized forgetting” that perpetuates America’s “foundational violence.” But Thomas’s path to forgiveness gives Hyde hope that the “people saddled with history can work on the past rather than have the past work on [them].”

But tough questions remain. How to tell what deserves remembrance and what will just poison the present? How does one “adaptively mourn” in a way that acknowledges the past without being subsumed by it? These are not questions Hyde can answer definitively, but he raises and examines them from a variety of perspectives. He praises getting free of the past, but he knows that forgetting can be its own horror; he has seen his Alzheimer’s-stricken mother awash in anxiety at not recognizing the man claiming to be her son. Hyde confesses, “Sometimes I think it is hopeless, this quest for beneficent forgetting.”

But this is his quest, and he turns to various traditions of leaving the self behind in pursuit of answers. He admires the composer John Cage’s efforts to hear sounds as if he had never encountered them before, to leave behind his habits of attention. And Friedrich Nietzsche, the philosopher who made forgetting heroic, is always close at hand. For Nietzsche, no great deed is possible without some amnesia. But Hyde is no Nietzschean; he’s closer to Henry David Thoreau, who relished the sense of losing something instead of pounding his chest to insist that there was never anything to be lost. Thoreau, like Hyde, remembers forgetting, but he is consumed by neither memory nor loss. The last words of “A Primer for Forgetting” are “teach me to disappear.” But there they are: words visible on the page — the trace of a lesson.

A PRIMER FOR FORGETTING
Getting Past the Past
By Lewis Hyde
Farrar, Straus and Giroux. 372 pp. $28

Liberal Education: Now More than Ever

The following is reposted from the Washington Post.

I recently participated in a celebration of the 15th anniversary of the opening of Peking University’s branch campus in the young, dynamic city of Shenzhen. PKU is a venerable institution considered to be at the pinnacle of higher learning in China, and in recent years it has been making great efforts to be recognized as one of the top research universities in the world. I was invited to speak because PKU-Shenzhen has decided to start an undergraduate liberal arts college and I’ve been making the case over the last several years for a pragmatic liberal education. In the conclusion to my 2014 book “Beyond the University: Why Liberal Education Matters,” I expressed my excitement at China’s new interest in liberal learning, and the experience I just had in Shenzhen leads me to think that this interest is surging.

This is a fragile time for liberal education, making commitment to it all the more urgent. American universities are facing enormous pressures to demonstrate the cash value of their “product,” while at the same time the recreational side of college life is attracting more attention than ever – from football games to Greek life, from fancy dorms and fancier gyms to student celebrations that range from the Dionysian to the politically correct. To meet enrollment goals or to climb in the rankings, many colleges offer the full spa experience, while being sure to trumpet the values of what young people learn while not in the classroom. But these efforts at brand promotion only make the educational mission of universities more fragile. “Campus follies” have become a staple of critiques of higher education’s elitism and entitlement.

To be sure, college culture has been mocked throughout American history, but today collegiate life inspires either a toxic mixture of anger and resentment or just baffled misunderstanding. Given the coverage of campus life, it’s understandable that the American public seems to have forgotten how important our universities have been as engines of economic and cultural innovation, of social mobility.

As I was preparing my remarks, I turned to the writings of John Dewey, the great pragmatist philosopher. Dewey went to China in 1919, also a time of change, to talk about education. The May 4th movement was creating a dynamic of protest against the excessive weight of tradition in favor of a notion of Enlightenment and modernization that would work within a changing Chinese context. It was a propitious moment for Dewey to advocate for a broad, liberal education to prepare the Chinese to be informed, productive members of society. He initially planned to give several lectures in China but wound up staying two years. Known as Du Wei, Dewey the Great (as John Pomfret recently noted), he exerted a powerful influence there. Mao himself transcribed Dewey’s lectures in Changsha, though Communists would later become intensely critical of the gradualism embedded in Dewey’s legacy.

In Shenzhen, with Dewey in mind, I focused on two dangers and two possibilities.


Danger of Narrowing Specialization

Academics don’t get stuck in silos by accident; seeking professional status, they are incentivized to burrow deep. They become so accustomed to their own subdisciplinary netherworlds that they have trouble in anyone else’s atmosphere. Department members often see no reason to interact with colleagues from other fields, and so undergraduates have almost no hope of getting guidance about their education as a whole. Despite the commonplace rhetoric of interdisciplinarity, academics seem all too content creating languages and cultures that are insular. We have gotten really good at education as a form of narrowing, while what we really need is to provide students with intellectual cross-training, and for that we need faculty who can communicate across a variety of fields.

Liberal education should enhance abilities to translate across ideas and assumptions, but instead the public is treated to the spectacle of pointy-headed specialists great at one thing but not to be trusted beyond their small subfield. Of course, advanced work in any area requires rigor and real technical competence. But we must not confuse being a competent technician with being a scientist who can make discoveries or a teacher who can inspire students by translating complex technical issues into terms clearly relevant to pressing human concerns.

In Shenzhen I urged colleagues not to replicate the “two cultures” division that infects many American campuses. We need more academics who can facilitate conversations between the sciences and the humanistic disciplines. The sciences, social sciences and humanities are all focused on research, and sustained artistic practice depends on a commitment to inquiry. It is especially important for undergraduate education to foster exchange among researchers, be they in medicine, philosophy, design, literature or economics.


Danger of Populist Parochialism 

Just as on campuses we have gotten all too good at isolation through specialization, in the public sphere we know how to stimulate parochialism. New provincialisms and nationalisms are gaining force around the world thanks to fear-based politics; but orchestrated parochialism is antithetical to liberal learning. A liberal education includes deepening one’s ability to learn from people with whom one doesn’t agree, but the politics of resentment sweeping across many countries substitutes demonization for curiosity. Writing off people with whom one disagrees will always be easier than listening carefully to their arguments. Without tolerance and open-mindedness, inquiry is just a path to self-congratulation at best, violent scapegoating at worst.

It is especially urgent to advocate effectively for a broadly based pragmatic liberal education when confronted by ignorant authoritarians who reject inquiry in favor of fear mongering and prejudice. A broad education with a sense of history and cultural possibilities arms citizens against manipulation and allows them to see beyond allegiance to their own.

Undergraduate education – be it in China or the United States – should promote intellectual diversity in such ways that students are inspired to grapple with ideas that they never would have considered on their own. At Wesleyan University, creating more access for low-income students and military veterans has been an important part of this process. Groups like these have been historically under-represented on our campus, but just having diverse groups is not enough. We must also devise programs that make these groups more likely to engage with one another, bursting the protective bubbles of ideas that leave some campus radicals and free-speech absolutists with little in common beyond a commitment to smug self-righteousness.


Possibilities of Open and Reliable Communication

There can be no research progress without the effective sharing of information. In astrophysics and genomic science today, scientists depend on data sets that can be shared. Likewise, humanists depend on reliable, publicly available documents and critical editions. Unlike commercial enterprises that quickly make discoveries proprietary, academic research at its best depends on sharing methods and results. And significant research progress is made when scholars discover evidence and points of view that challenge their own assumptions.

As I admired the PKU Shenzhen campus, I remembered that search engines (like Google) and news sources (like the New York Times) are unavailable there because of government censorship. Still, the scholars I met on campus seemed to have little trouble gaining access to a variety of points of view. Under a regime that officially restricts information, they work hard at expanding the inputs they receive. In the West, we are fortunate to have at our fingertips a dizzying array of information and points of view. But in recent years Americans have increasingly tended to block out views they don’t want to hear. Curating our information inputs, we choose our choir and know what kind of preaching we are going to hear. Algorithms that filter information to each user are not the same as censorship, but they, too, are anathema to inquiry.

Almost a century ago, Dewey reminded his Chinese audiences: “Where material things are concerned, the more people who share them, the less each will have, but the opposite is true of knowledge. The store of knowledge is increased by the number of people who come to share in it. Knowledge can be shared and increased at the same time— in fact, it is increased by being shared.” A university today must be a vehicle for sharing knowledge – and its leaders must advocate for consistently communicating the values of learning, including from surprising sources.


Possibilities of Cosmopolitanism and Community

While lecturing in China, Dewey wrote of the power of education to “cultivate individuality in such ways as will enhance the individual’s social sympathy.” It’s a two-way street. If we are to prepare the soil for the more effective cultivation of pragmatic liberal education, we will need the nutrients of creative individuality, cosmopolitanism and community. Empowering individuals to take productive risks and encouraging them to develop what Dewey called “practical idealism” has long been the hallmark of pragmatic liberal learning. Cosmopolitanism helps us grow a culture of openness and curiosity, recognizing that people are, in Anthony Appiah’s words, “entitled to the options they need to shape their lives in partnership with others.”

Developing a campus community means seeding relations of trust that encourage experimentation and intellectual risk taking. At healthy universities, professors and staff learn to care for the welfare of their students, and students learn to look out for one another. In dynamic educational environments, people are more willing to venture beyond their comfort zones because they have background assumptions of trust. And as they become more adept at intellectual and cultural translation, they deepen this trust while making these zones more porous.

Although there are commendable aspects of the current American focus on skill acquisition in higher education, we must avoid confusing the accumulation of competence badges with what in China is still called “the education of the whole person.” We need an undergraduate education that is human centered – setting a framework for inquiry and exchange that will be a resource for graduates for the rest of their lives.

Almost one hundred years ago Dewey spoke about the dual tasks of the university: to preserve culture and to stimulate inquiry for the sake of social progress. In China, scholars are daring to imagine this progress, despite political tendencies that foster nationalist insularity and limit access to people and information.

Such progress is becoming harder to imagine in America given a looming administration bent on ignoring facts and a leader quick to dismiss inquiries that don’t feed his apparently bottomless need for self-aggrandizement. This is the context in which we must find, as Dewey wrote, “faith in the power of intelligence to imagine a future which is the projection of the desirable in the present, and to invent the instrumentalities of its realization.” These remain the tasks of thinking, inquiry and communication.

Now, at this fragile time and on both sides of the Pacific, pragmatic liberal education matters more than ever.

Free Speech, Political Correctness and Higher Education

In the past week the University of Chicago made big news by defending academic freedom in a letter to incoming students. “Finally,” a distinguished alumnus wrote in the subject line of an email to me, “some sanity on campus.” Really? How is it possible that a distinguished university polishes its own apple by stating the obvious, that freedom of thought and expression are essential to its mission?

Well, last year was a tumultuous one for campus politics. Events from Claremont, California, to the University of Missouri to Yale gave plenty of fuel to older pundits already asking, “What’s the matter with kids today?” A chorus of critics of political correctness found common ground in mocking students’ desire for “safe spaces,” their concern over micro-aggressions, their need for trigger warnings. Kids today are coddled, we were told, and when they get to college, they fail to respect the rough-and-tumble contest of ideas that middle-aged alumni remember as being part of their own college experience. No matter that when most of us oldsters were in college, the campuses were far less diverse places than they are today. There were many voices back then that none of us got to hear.

Why did Chicago’s Dean of Students feel the need to remind the happy few chosen to be part of the class of 2020 that the university does not support trigger warnings, intellectual “safe spaces” or the canceling of visiting speakers? What if a faculty member wanted to give students a heads-up that they would be reading a racist text or a book about rape so as to help them understand the reasons why it was part of the work of the class? Would giving this “trigger warning” not be part of the professor’s academic freedom? And what if students, as Northwestern’s president Morton Schapiro explained in an op-ed last year, sometimes wanted to hang out in the university’s Hillel so as to feel comfortable (safe) in discussions about Israel? What if students decided to protest a visiting war criminal who had been invited to lecture? Would these run afoul of Chicago’s posture of intellectual toughness?

When confronted with issues of power and inequitable distribution of resources, it’s far too easy to fall back on talk about abstract commitments to freedom and procedures. At a time when violent racism has been exposed as a systematic part of law enforcement, at a time when the legitimation of hatred in public discourse has become an accepted part of national presidential politics, it seems more than a little naive to tell incoming frosh that “civility and mutual respect are vital to all of us.” These students are coming to Chicago, after all ― one of the most violent cities in America. But perhaps the Dean’s letter was aimed at a different audience ― those concerned with the bogeyman of political correctness and those who worry that free speech isn’t the absolute value it used to be. That would explain the concerted efforts of the University of Chicago’s administrators to push for their unfettered marketplace of ideas version of free speech.

That said, I agree that freedom of expression is essential for education and for democracy. But speech is never absolutely free; it always takes place for specific purposes and against a background of some expression that is limited or prohibited. Hate speech and harassment fall into these legal or procedural categories. And there are some things, after all, that a university should refuse to legitimate or dignify by treating them as fit subjects for academic discussion. When we make a subject part of a debate, we legitimate it in ways that may harm individuals and the educational enterprise. We must beware of the rubric of protecting speech being used as a fig leaf for intimidating those with less power.

Last year at Wesleyan University, we had an intense debate about freedom of the press. Some students initially wanted to defund the student newspaper because they found it offensive, but others rushed to its defense. At that time, I wrote:

Debates can raise intense emotions, but that doesn’t mean that we should demand ideological conformity because people are made uncomfortable. As members of a university community, we always have the right to respond with our own opinions, but there is no right not to be offended. We certainly have no right to harass people because we don’t like their views. Censorship diminishes true diversity of thinking; vigorous debate enlivens and instructs.

It’s still the case that the great majority of those studying on American college campuses would agree.

Over time, our students realized that censorship in various forms is antithetical to our educational mission, and they also recognized that the school newspaper could do a better job soliciting diverse points of view. Rather than merely affirming abstract principle, they worked through an on-the-ground commitment to freedom of expression along with the cultivation of diverse points of view and a sense of belonging. This is not “free speech absolutism” or even a pure standard for campus decision makers to apply. But it is a winning combination for those entering a university, in Chicago or anywhere else.

Cross-posted with the Washington Post.

Review of Elizabeth Kolbert’s The Sixth Extinction

This review of Elizabeth Kolbert’s The Sixth Extinction: An Unnatural History appeared in the Washington Post this morning. I know there are many people at Wesleyan searching for ways to make a difference in the face of the environmental disasters of climate change. Kolbert is a thoughtful, engaged and determined guide.


Elizabeth Kolbert’s “Field Notes From a Catastrophe” (2006) presented a powerful account of how climate change was disrupting lives around the planet. Whether the New Yorker columnist was visiting a utility company in Burlington, Vt., ice sheets in Greenland or floating cities in the Netherlands, she deftly blended science and personal experience to warn of the enormous harm created by human-generated climate change. The last chapter of that book, “Man in the Anthropocene,” underscored that we had entered an era in which human beings had begun to change everything about the planet’s interlocking ecosystems, and that we had put many of those systems, and our own species, at enormous risk. “It may seem impossible,” Kolbert concluded, “to imagine that a technologically advanced society could choose, in essence, to destroy itself, but that is what we are now in the process of doing.”


In her new book, “The Sixth Extinction,” she provides a tour d’horizon of the Anthropocene Age’s destructive maw, and it is a fascinating and frightening excursion. We humans have been bad news for most of the world’s living things, causing massive extinctions of species with which we share the planet. Unless we change our ways, she argues convincingly, we will certainly cause our own demise.

Until the 18th century, scientists didn’t have a clear idea that species could become extinct. Kolbert credits the French naturalist Georges Cuvier, writing in the wake of the great Revolution, with realizing that whole branches of the tree of life could permanently be cut off. Still, most of those who studied natural history were sure that extinctions happened only gradually over very long periods of time. This uniformitarian view would fit well with Darwin’s perspective on the slow and steady pace of evolutionary change through natural selection. Species did become extinct, but only very slowly as other competitors adapted more successfully to the environment around them.

This view of extinctions was definitively shattered by the work of Luis and Walter Alvarez, a father-son team who demonstrated that the Cretaceous period ended when an asteroid struck the Earth and radically changed the planet’s climate. In what has come to be called the K-T extinction, “every animal larger than a cat seems to have died out,” and things were no better in the water. The dinosaurs were just the most celebrated victims: “Following the K-T extinction,” Kolbert emphasizes, “it took millions of years for life to recover its former level of diversity.”

The scientific consensus was that things evolved very slowly, except in the face of radical events — like an asteroid crashing into the Earth. Today there is another asteroid hitting the planet, and it’s us. Slow “adaptation” in the Darwinian sense is meaningless if a creature very suddenly has to face conditions that “it has never before encountered in its entire evolutionary history.” In our age, the Anthropocene, these are the conditions human beings have been creating (very quickly) for other forms of life.

As in “Field Notes From a Catastrophe,” Kolbert presents powerful cases to bring her point home. Oceans are highly stressed by climate change, for example, and acidification of the seas is driving the extraordinary ecosystems of coral reefs into extinction. Some plants and animals are desperate to migrate to more hospitable climes, while others can’t survive the arrival of the newcomers. According to entomologist E.O. Wilson, whom she cites, we are now reducing biological diversity to its lowest level since the Cretaceous period.

Some of these changes have been created by our species breaking down barriers among other species as life forms tag along on our boats and planes from one part of the globe to another. Snakes in Guam, snails in Hawaii and thousands of other species brought by human beings into new environments, intentionally or not, have “succeeded extravagantly at the expense of other species.” As we make the world more interconnected than ever (“The New Pangaea”), the fatal vulnerabilities in thousands of species are exposed. The recent annihilation of bat populations in the Northeast, for example, has been caused by a foreign fungus that the animals had never encountered and so had no defense against. When a new fungus appears, Kolbert writes, “it’s like bringing a gun to a knife fight.”

The alterations initiated by human beings build on one another, accelerating change in ways that make it all but impossible for most species to adapt quickly enough. As the great environmentalist Rachel Carson put it, “Time is the essential ingredient, but in the modern world there is no time.” But Kolbert is not nostalgic: “Though it might be nice to imagine there once was a time when man lived in harmony with nature, it’s not clear that he ever really did.”

Kolbert devotes a chapter, “The Madness Gene,” to considering the attribute of human beings that requires change in order to flourish. Unlike other species, modern humans, endowed with language, seem driven to embark on perpetual improvement projects in the course of which they alter everything around them. “With the capacity to represent the world in signs and symbols comes the capacity to change it, which, as it happens, is also the capacity to destroy it,” she writes. “A tiny set of genetic variations divides us from the Neanderthals, but that has made all the difference.”

Carson, a worthy model for Kolbert, wrote of “the problem of sharing our earth with other creatures.” We are deciding, Kolbert concludes, “which evolutionary pathways will remain open and which will forever be closed.” Our history determines the course of life on the planet. Our species changes the world, and now the most urgent question is whether we can take responsibility for what we do. “The Sixth Extinction” is a bold and at times desperate attempt to awaken us to this responsibility.

Wesleyan Alumnus and Greenpeace Activist Jailed in Russia

This morning I read a moving op-ed in the Washington Post about Dima Litvinov ’86, a Greenpeace activist recently arrested in Russia. Having organized protests against Russia’s exploitation of the Arctic, Dima was originally arrested with several others on charges of piracy. After protests against this dramatic overreach, the charges were reduced to hooliganism. The charges were prompted by Greenpeace activists trying to hang a banner on a Russian oil rig.

The op-ed piece is by Dima’s father, and this is how it concludes:

Dima and the others are threatened with long prison terms because they love and defend nature. That includes the Russian Arctic, which is threatened by senseless and dangerous drilling.

I know only too well what a prison term in Russia means. I was arrested for participating in 1968 in a demonstration against the Soviet invasion of Czechoslovakia. Lev Kopelev, Dima’s grandfather on his mother’s side, a Soviet writer, spent eight years in Soviet prison camps because he protested the looting and raping of the German population by Soviet officers and soldiers during World War II, when he fought the Nazi army.

Dima’s grandfather was arrested under Joseph Stalin, and I, Dima’s father, was arrested under Leonid Brezhnev. The Soviet Union doesn’t exist anymore, but Dima has been arrested under Russian President Vladimir Putin — a former member of the Soviet secret police, the KGB. Is it not the time to break the cycle?

The Wesleyan community has been asked to support Dima and the other Greenpeace activists. They were peacefully protesting, but they are no hooligans.

(Igor Podgorny/Associated Press) – In this photo released by Greenpeace International, activist Dima Litvinov in a defendants’ cage at the district court, in Murmansk, Russia, on Oct. 23.

Review of a Famous Amnesia Patient

From time to time my work takes me back to psychology, or at least to psychoanalysis. Last week I spoke at Clark University on Sigmund Freud. This was particularly meaningful to me because I often teach Freud’s “Clark Lectures,” and Clark was the only university at which Freud spoke. I even got my picture taken with the Freud statue!



I published a review of more mainstream psychological research in today’s Washington Post. The book deals with memory and amnesia, a topic on which I’ve worked over the years. I’ve attached it below.

PERMANENT PRESENT TENSE
The Unforgettable Life of the Amnesic Patient, H.M.
By Suzanne Corkin
Basic. 364 pp. $28.99

Henry Molaison (1926-2008) lived a long life, but as it turned out, he experienced most of it in very short time segments. His seizures started early, and by the time he was in high school they had become frequent. Medications to control epilepsy had a variety of side effects, and still they didn’t eliminate the seizures. The terrible blackouts were always a possibility.

Wilder Penfield had been experimenting with brain surgery on human patients since the 1920s, sometimes “operating on epilepsy patients while they were awake and conscious so that he could pinpoint the abnormal tissue responsible for their seizures.” Penfield’s work received widespread recognition, and when William Scoville examined Henry in the early 1950s, a surgical option seemed likely to provide some relief to the patient. We can imagine why Henry, then 27, and his parents decided to take a chance. Everyone involved knew that the operation was experimental, even risky. The lobotomy was scheduled.

Scoville extracted the front half of Henry’s hippocampus and most of the amygdala, among other parts of his brain. He hoped that this would significantly reduce the frequency of seizures. It didn’t work. Nobody seemed to consider the possibility that the patient would lose the ability to create any new, lasting memories. But that’s what happened. For the next 54 years Henry lived in “a world bounded by thirty seconds.”

Henry’s world was fascinating — for neuroscientists and psychologists — because it was a window onto how memory functions. He became one of the most studied human beings on the planet. Suzanne Corkin met him when she was a graduate student because her teacher co-authored a paper on Henry with surgeon Scoville. “Henry’s case,” as she puts it in one of her many inelegant phrases, “fell into my lap.”

The case defined Corkin’s career and became absolutely crucial for the development of the sciences of memory. Neuroscientists came to understand the “extent to which memory depended on a few centimeters of tissue in the medial temporal lobe” because a surgeon had destroyed those pieces of Henry’s brain.

Henry could not care for himself without significant help because he would forget just about everything that was going on around him. But psychologists and neurologists, Corkin first among them, made sure that Henry had a decent life. He would be brought to MIT on a regular basis for batteries of tests. Some odd things here and there seemed to stick in his mind; there was a sense of familiarity about a few people and a few tasks. Most of the time, he was happy enough to be a test subject — not remembering that he had been tested before, time and again, for decades.

Corkin expertly uses Henry’s case to illuminate major trends in memory research. Perhaps the fundamental lesson that scientists have drawn from his case — besides eventually stopping experimental lobotomies — is that memory is a complex interweaving of cognitive systems. Short-term memory (recalling something for up to 15 seconds) could be intact without any bearing on long-term memory. Henry could hold on to number patterns, for example, for about 15 seconds, but if more storage was needed in a task, he was deficient. He was able to call on “working memory” (the ability to store small amounts of experience while focused) in specific tasks, but he was not able to register those experiences beyond the span of his concentration. Henry could respond correctly when he could focus, but he would lose the experience forever once the task was complete. Overall, Corkin tells us in her matter-of-fact way, “in spite of his tragedy, Henry got along.”

The explicit retrieval of the past is declarative memory: We are purposively calling up something that we experienced. Through remembering, the brain changes itself as new associations are formed with what is retrieved. Each time long-term memories are retrieved, they are edited — showing “that memory is an ongoing dynamic process driven by life’s events.” Henry’s case revealed how necessary the hippocampus is for this process.

In reading about Henry as a test subject and “guide” for neuroscience, I was eager to learn more about other aspects of his life over these decades. Corkin seems to have grown genuinely attached to her mild-mannered scientific treasure trove, but her descriptions of his existence are flat, at best. We get only the faintest glimpse of how it felt to live this rudely segmented life. The destruction of his amygdala probably flattened his emotions, his desires. Corkin tells us that after the death of his father, Henry did not “consciously grasp that his father was gone unless someone reminded him.” Yet over four years the loss seemed to sink in. It was Henry’s own words that I found especially moving: “I am having a debate with myself — about my dad. . . . I’m not easy in my mind. On the one side, I think he has been called — he’s gone — but on the other, I think he’s alive. I can’t figure it out.”

For years Henry lived with his mother, and there are indications that their relationship was complex and difficult. But this was not the subject of the neuroscientist’s research. This book informs us that at times Henry was prone to terrible rages and that he even threatened to kill himself, but there is little attempt to see the world from his point of view. Perhaps that would have been impossible; it would certainly have been inconvenient.

Corkin acknowledges how important Henry was for her work. She recognizes that “our research with Henry was certainly a boon to my lab’s reputation” and affirms his “limitless worth as a research participant.” But what does it mean to participate without memory? Corkin became Henry’s guardian in his later years and saw to it that he was comfortable and well cared for. She also saw to it that upon his death, his brain was quickly removed from his skull so that it could be studied with all the technology now at our disposal.

“Henry was dead,” Corkin writes, “but he remained a precious research participant.” How you feel about such a sentence will probably be a good indicator of how you will feel about poor Henry and his doctors in “Permanent Present Tense.”

Higher Education — Two Reviews

Over the summer I finished a short book called Beyond the University: Why Liberal Education Matters. It will be published in the spring. I also reviewed two interesting books on American higher education, one focused on teaching and the other providing a broad overview of the sector. You can find my review of Why Teach? here, and of Higher Education in America, from yesterday’s Washington Post, below.

Today classes begin, and I am delighted to head back to the classroom. Last night we heard some of the amazingly talented students who sing a cappella at Wes, and now we are finishing our syllabi and checking our reserve readings. I am teaching The Past on Film and am looking forward to meeting the students.


American higher education is the envy of the world. Students flock to this country from all over, and the most highly ranked schools tend to be here. We should be proud!

American higher education is a mess. With high costs, low graduation rates, unhappy faculty members and coddled students, our universities are about to be radically disrupted by massive, technologically driven change. A good thing, too!

How to reconcile these opposing views? At a time when ambitious business-school professors and salivating entrepreneurs predict the end of the university as we know it, and at a time when we have never been more in need of an educated workforce and citizenry, the task of understanding the evolving mission and performance of American higher education has never been more urgent. Thank goodness Derek Bok, a two-time president of Harvard and a judicious, learned analyst of education, has taken it on. His book is too long to be called a report card, but it is a detailed progress report on the challenges and opportunities facing our nation’s colleges and universities.

One of the first things to note about higher ed in the United States is its heterogeneity. The problems of Harvard are not the same as the problems of the University of Texas or those of Scripps College in California or of LaGuardia Community College in New York. Bok tries to address schools in all their multiplicity, and his book suffers somewhat from the clunkiness that also characterizes higher ed. The book’s five sections discuss instruction from undergraduate to graduate and professional schools, as well as the market forces at work at each level. After the introduction, there are five forewords and four afterwords — not including the short final chapter called “The Last Word.” Yet one forgives redundancies because of the thoroughness of the research and the measured judgment consistently applied.

After noting the variety in higher ed, Bok acknowledges the extraordinary inequalities in the sector. Public discussion of education often focuses on the schools most difficult to get into, but “no more than two hundred colleges regularly reject more students than they admit.” At most highly selective schools (such as the one at which I am president), students receive some subsidy from the institution — even those paying full tuition. Students enrolled at less-selective schools get a small fraction of that support. Public institutions have seen dramatic reductions in state support, and many flagship campuses are scrambling for donations and out-of-state, full-tuition-paying students. Community colleges enroll dramatically more people than other parts of the sector, but most of these students will never earn a degree.

Bok shows that the current quip that universities haven’t changed their teaching styles since the Middle Ages is just an empty canard. Universities have adapted surprisingly well to massive changes in technology, in demography and in developing streams of support. But Bok is no Pollyanna, emphasizing that “universities have been especially slow to act . . . in improving the quality of undergraduate education.” Professors often confuse their desire to teach what interests them the most with what undergrads need to learn, and students in recent years are spending far less time on their studies than in past generations. Bok shows how schools cater to students in order to attract more of them, often with little attention to how campus amenities provide distractions from studying.

Bok knows the governance structures of universities as well as anyone, and he realizes that true curricular reform has to be led by the faculty. The challenge, from his perspective, is to make the faculty (at least its leadership) more aware of the empirical work on student learning that has been done over the past decade. Professors may be focused on their research and distracted by committee work, but the evidence shows that they care deeply about teaching effectiveness.

“The key to educational reform,” Bok writes, “lies in gathering evidence that will convince faculty that current teaching methods are not accomplishing the results that the professors assume are taking place.” Once the teachers understand the need for change, they will rise to the occasion and create classes that are more effective at developing the capacities that most agree are essential in college graduates. They have done so in the past, and they will do so again.

Bok’s confidence in the faculty is characteristic of his approach in this book. He believes that our varied system of higher education is very much capable of self-correction. Do we need to bend the cost curve? Sure, and that is why experiments such as massive open online courses (MOOCs) are so interesting (and mostly led by university veterans). Is there a liberal bias on our campuses? Sure, and it has been there at least since the 1940s, but faculty members realize they need more political diversity. Do university leaders spend too much time raising money? Sure, but American schools — especially the selective ones — get much more support than schools in other countries. We may have the worst system, he jokes, but like democracy, it’s better than all the alternatives.

Bok underscores two areas in urgent need of improvement: increasing the percentage of students who graduate from college and improving the quality of undergraduate education. We must do a better job attracting low-income students to our best colleges and universities, no longer wasting financial aid on wealthy students with high SAT scores to improve an institution’s place in bogus rankings. We must also do a better job of stimulating curricular reform and assessment so as to be sure students are working hard to learn what they need to know — whether at a community college or a research university. Of course, reaching agreement on what students need to know is a great challenge, but that’s the core of the faculty’s responsibility.

Competition among schools produces benefits and causes problems. Most of the important ones are addressed in Bok’s helpful volume. I hope he is right that we already have the ingredients in place to make the necessary reforms. I know we need university leaders like him to help activate those ingredients so that American higher education can continue to contribute in vital ways to our culture, our economy and our polity.

Review of “The Undivided Past”

Last week the Washington Post published my review of David Cannadine’s The Undivided Past: Humanity Beyond Our Differences. I enjoyed the book, though while reading it I was reminded of something one of my Wesleyan professors told me long ago. Hayden White joked that historians know they can always say, “things are more complicated than that, aren’t they?” It always sounds like a reasonable question (which means it’s really an empty question). Over the years, I have seen Hayden’s point made time and time again when we academics (not just historians) make similar rhetorical gestures. I play with that empty question in the review below.

Cannadine’s book is very thoughtful and wide-ranging. I’m now making the final revisions on a short book on liberal education — and I know someone will be able to say, “Roth, things are more complicated than that!”


THE UNDIVIDED PAST
Humanity Beyond Our Differences
By David Cannadine
Knopf. 340 pp. $26.95

In the 19th century, historians liked to tell triumphal tales of how people came together in powerful, sometimes glorious ways. Sweeping accounts described how religious groups or nations managed to achieve great things, often after battling other groups. In recent decades, historians have very much taken the opposite tack, showing how groups that we once thought were unified were actually quite divided. Difference, not unity, has been the preferred category for thinking about the past. Triumphal tales of groups coming together have been replaced by studies of how divided we have always been from one another.

David Cannadine, a British historian teaching at Princeton, offers a critique of the major categories historians have used to describe how some human beings are fundamentally different from others. Ideas of religion, nation, class, gender, race and civilization have generated intense feelings of belonging but also feelings of antagonism toward rival groups. Hatred of “the other” is the flip side of the fellow feeling that unites people into these mutually “belligerent collectivities.” Historians have focused on the conflicts that have emerged from this process, on the particular ways in which solidarity has given rise to antagonism toward groups of which one is not a member.

In his wide-ranging and readable new book, “The Undivided Past,” Cannadine shows that a very different story can be told. Rather than underscoring perennial conflicts between mutually exclusive human groups, Cannadine emphasizes how people find ways to get along, to cross borders and to maintain harmonious societies.

He begins with religion, which in many ways seems an unlikely vehicle for showing how people can get along. Given the missionary monotheism of Christianity, a religion that is both exclusive and proselytizing, the stage would seem set for stories of endless strife. But Cannadine finds many examples of pagans and Christians collaborating, and he shows how the “intermingling” of Catholics and Muslims in the late medieval period “transformed Europe’s intellectual landscape and made possible its twelfth-century Renaissance.” He knows well the bloody divisions that swept across Europe in the 16th and 17th centuries, but he insists that for most ordinary people, religious affiliation did not lead to violent conflict.

Rulers did try to generate national ties that would motivate their subjects to love (or fear) their monarchs and to fight against rival kings and queens. But intense feelings of nationalism, Cannadine shows, were a short-lived late-19th-century phenomenon, not some natural feeling that people are bound to experience. Class solidarity, too, was never the defining feature of identity for most people, even during the heyday of industrialization when Karl Marx and Friedrich Engels developed their theory that class conflict drove history. Sure, there are times when workers hate their bosses, but in any historical period there are “more important forms of human solidarity” than class. History is “more varied and complex than Marx or Engels would ever allow.”

Like any seasoned historian, Cannadine knows that one can always say of another’s account that “things are more complicated than that” as a prelude to offering one’s own. So it goes in “The Undivided Past.” He positions himself as an enemy of generalization, and so he can criticize Marxists for over-emphasizing class and feminists for over-emphasizing gender. Things are more complicated than that — not every worker has experienced oppression at the hands of the bourgeoisie, and not every woman has been stifled by patriarchy. Nations, though important for a brief period of history, were never monolithic entities inspiring universal devotion. Many French and English citizens, for example, had intense local allegiances that trumped national identity. When you look at history afresh while emphasizing different facts, customary generalizations can appear rather arbitrary. Things were more complicated than that.

Cannadine’s most pointed rhetoric comes toward the end, when he examines the idea of mutually antagonistic civilizations — an old notion recently popularized by the political scientist Samuel Huntington. Cannadine shows how “civilization” is intertwined with the idea of barbarism, and he quotes Montaigne’s quip: “Each man calls barbarism whatever is not his own practice.” Huntington saw the world as irrevocably divided into groups whose practices — religions, cultures and ways of life — prevented any meaningful integration. The claim is that Muslim, Eastern and Western civilizations, for example, are fated to be in conflict with one another, but Cannadine can find neither facts to back this up nor a coherent logic to the argument. He is biting in his criticism of neoconservative warmongers who draped themselves in Huntington’s pseudo-academic “findings.” “Of all collective forms of human identity,” Cannadine writes, “civilization is the most nebulous, and it is this very vagueness that makes it at once so appealing and so dangerous.”

Cannadine knows that writers and political leaders can all too easily generate solidarity on the basis of one element of a group’s identity in order to foment antagonism toward those who don’t share it. His book is a reminder that generalizations based on the supreme importance of any one concept (be it race, class, gender, religion, nation or civilization) are likely to fall apart when closely examined. The “exaggerated insistence on the importance of confrontation and difference … misrepresents the nature of the human condition,” he concludes.

In closing, he writes briefly of the “just inheritance of what we have always shared” and urges us “to embrace and celebrate [our] common humanity.” But he doesn’t even try to provide evidence or arguments for what this common humanity might be. After all, that would be to make woolly generalizations; he knows things are more complicated than that.


Review of Nirenberg’s ANTI-JUDAISM

From Sunday’s Washington Post

Review of Anti-Judaism: The Western Tradition. By David Nirenberg. Norton. 610 pp. $35


Oh, the Protestants hate the Catholics,
And the Catholics hate the Protestants,
And the Hindus hate the Muslims,
And everybody hates the Jews.

So sang Tom Lehrer in his satirical song “National Brotherhood Week.” It’s no news that even those who preach “love thy neighbor” have often combined their striving for community with the hatred of a scapegoat, the Jews. David Nirenberg’s “Anti-Judaism” is a thorough, scholarly account of why, in the history of the West, Jews have been so easy to hate. And this story goes back a very long way.

Nirenberg returns to ancient Egypt to examine traditions that portray Jews as “enemies of Egyptian piety, sovereignty, and prosperity.” This was already old in the 7th century BCE! Ancient Greeks and Romans would have their Jews, too; they found use for an “anomalous” people who stuck together and followed their own rules, who were “neither disenfranchised nor citizen, neither conquered nor conquering, neither powerless nor free.” Over the centuries, when there was trouble in the kingdom, be it corruption or military threat, famine or political chaos, pagan ideologues developed a handy solution: Attack the Jews.

Jews were useful for those who were contending for power in the ancient world, and the Egyptian model of scapegoating was often repeated. But it was the Christians who refined anti-Judaism into a core theological and political ideology. Christianity had a particular problem: to show that it had overcome Judaism — overcome its adherence to the laws of the “old” testament, overcome its tribal particularity with evangelical universalism. The idea of Judaism — together with the fact that there were still people in the world who chose to remain Jews — was an affront to that universalism. “To the extent that Jews refused to surrender their ancestors, their lineage, and their scripture, they could become emblematic of the particular, of stubborn adherence to the conditions of the flesh, enemies of the spirit, and of God.”

Throughout the centuries, theologians returned to this theme when they wanted either to stimulate religious enthusiasm or to quash some perceived heretical movement. Not that you needed any real Jews around to do this. You simply had to label your enemies as “Jews” or “Judaizing” to advance the purity of your cause. In the first through fourth centuries, Christians fighting Christians often labeled each other Jews as they struggled for supremacy. And proclaiming your hatred of the Jews became a tried-and-true way of showing how truly Christian you were. Centuries later, even Luther and Erasmus agreed that “if hatred of Jews makes the Christian, then we are all plenty Christian.”

Islam followed this same pattern of solidifying orthodoxy by stoking anti-Jewish fervor. Muhammad set Islam, like Christianity, firmly within an Abrahamic tradition, but that made it crucial to sever the new religion from any Judaizing possibilities. Rival Islamic groups, like rival forms of Christianity, often painted their adversaries as hypocritical Jews scheming to take the world away from spiritual truths essential for its true salvation.

Nirenberg shows how consistently the struggle for religious and political supremacy has been described as a struggle against the “Jews.” The quotation marks are especially important as his account moves beyond the medieval period, because between 1400 and 1600 Western Europe was more or less “a world free of Jews.” Banished from most countries, and existing only in the tiniest numbers through special exemptions, actual Jews were hardly ever seen. But it was in this period that “Christian Europe awoke haunted by the conviction that it was becoming Jewish.” In this era of cultural change and doctrinal and political disputes, patterns as old as the age of the pharaohs were reactivated: My adversaries must be extinguished for the polity to be purified; my adversaries must be Jews. And in early modern European eyes, the adversaries were especially dangerous if they were secret Jews who appeared to be Christian. Were Jews hiding everywhere?

Martin Luther brought this rhetoric to a fever pitch. In 1523 he accused the Roman Church of becoming “more ‘Jewish’ than the Jews,” and as he grew older he tried to convince his contemporaries that “so thoroughly hopeless, mean, poisonous, and bedeviled a thing are the Jews that for 1400 years they have been, and continue to be, our plague, pestilence, and all that is our misfortune.” Don’t believe in conversions, the aged Luther urged; the only way to baptize Jews was by tying millstones around their necks.

Nirenberg’s command of disparate sources and historical contexts is impressive. His account of the development of Christianity and Islam is scholarly yet readable. And his portrayal of the role that Judaism has played as a foil for the consolidation of religious and political groups is, for this Jewish reader, chilling. Nirenberg is not interested, as he repeatedly insists, in arguing that Christianity and Islam are “anti-Semitic.” Instead, he is concerned with tracing the work that the idea of Judaism does within Western culture. He shows that many of the important conceptual and aesthetic developments in that culture — from Saint John to Saint Augustine to Muhammad, from Shakespeare to Luther to Hegel — depend on denigrating Jews. That’s what’s so chilling: great cultural achievements built on patterns of scapegoating and hatred.

In the modern period, revolutionaries and counter-revolutionaries continued to employ “the Jewish problem” as something to be overcome. “How could that tiny minority convincingly come to represent for so many the evolving evils of the capitalist world order?” Nirenberg asks. He shows that for thousands of years the patterns of anti-Judaism have evolved to provide great thinkers and ordinary citizens with habits of thought to “make sense of their world.” He doesn’t say that these patterns caused the mechanized, genocidal Nazi war against the Jews in the 20th century, but he argues convincingly “that the Holocaust was inconceivable and is unexplainable without that deep history of thought.”

Presaging Tom Lehrer, Sigmund Freud in 1929 wrote ironically that Jews, by being objects of aggression, “have rendered most useful services to the civilizations of the countries that have been their hosts; but unfortunately all the massacres of the Jews in the Middle Ages did not suffice to make that period more peaceful and secure for their Christian fellows.” Even when “everybody hates the Jews,” patterns of intolerance and violence remain intact. Nirenberg offers his painful and important history so that we might recognize these patterns in hopes of not falling into them yet again.