Review of Elizabeth Kolbert’s The Sixth Extinction

This review of Elizabeth Kolbert’s The Sixth Extinction: An Unnatural History appeared in the Washington Post this morning. I know there are many people at Wesleyan searching for ways to make a difference in the face of the environmental disasters of climate change. Kolbert is a thoughtful, engaged and determined guide.


Elizabeth Kolbert’s “Field Notes From a Catastrophe” (2006) presented a powerful account of how climate change was disrupting lives around the planet. Whether the New Yorker columnist was visiting a utility company in Burlington, Vt., ice sheets in Greenland or floating cities in the Netherlands, she deftly blended science and personal experience to warn of the enormous harm created by human-generated climate change. The last chapter of that book, “Man in the Anthropocene,” underscored that we had entered an era in which human beings had begun to change everything about the planet’s interlocking ecosystems, and that we had put many of those systems and our own species at enormous risk. “It may seem impossible,” Kolbert concluded, “to imagine that a technologically advanced society could choose, in essence, to destroy itself, but that is what we are now in the process of doing.”


In her new book, “The Sixth Extinction,” she provides a tour d’horizon of the Anthropocene Age’s destructive maw, and it is a fascinating and frightening excursion. We humans have been bad news for most of the world’s living things, causing massive extinctions of species with which we share the planet. Unless we change our ways, she argues convincingly, we will certainly cause our own demise.

Until the late 18th century, scientists didn’t have a clear idea that species could become extinct. Kolbert credits the French naturalist Georges Cuvier, writing in the wake of the great Revolution, with realizing that whole branches of the tree of life could permanently be cut off. Still, most of those who studied natural history were sure that extinctions happened only gradually over very long periods of time. This uniformitarian view would fit well with Darwin’s perspective on the slow and steady pace of evolutionary change through natural selection. Species did become extinct, but only very slowly as other competitors adapted more successfully to the environment around them.

This view of extinctions was definitively shattered by the work of Luis and Walter Alvarez, a father-son team who demonstrated that the Cretaceous period ended when an asteroid struck the Earth and radically changed the planet’s climate. In what has come to be called the K-T extinction, “every animal larger than a cat seems to have died out,” and things were no better in the water. The dinosaurs were just the most celebrated victims: “Following the K-T extinction,” Kolbert emphasizes, “it took millions of years for life to recover its former level of diversity.”

The scientific consensus was that things evolved very slowly, except in the face of radical events — like an asteroid crashing into the Earth. Today there is another asteroid hitting the planet, and it’s us. Slow “adaptation” in the Darwinian sense is meaningless if a creature very suddenly has to face conditions that “it has never before encountered in its entire evolutionary history.” In our age, the Anthropocene, these are the conditions human beings have been creating (very quickly) for other forms of life.

As in “Field Notes From a Catastrophe,” Kolbert presents powerful cases to bring her point home. Oceans are highly stressed by climate change, for example, and acidification of the seas is driving the extraordinary ecosystems of coral reefs into extinction. Some plants and animals are desperate to migrate to more hospitable climes, while others can’t survive the arrival of the newcomers. According to entomologist E.O. Wilson, whom she cites, we are now reducing biological diversity to its lowest level since the Cretaceous period.

Some of these changes have been created by our species breaking down barriers among other species as life forms tag along on our boats and planes from one part of the globe to another. Snakes in Guam, snails in Hawaii and thousands of other species brought by human beings into new environments, intentionally or not, have “succeeded extravagantly at the expense of other species.” As we make the world more interconnected than ever (“The New Pangaea”), the fatal vulnerabilities in thousands of species are exposed. The recent annihilation of bat populations in the Northeast, for example, has been caused by a foreign fungus that the animals had never encountered and so had no defense against. When a new fungus appears, Kolbert writes, “it’s like bringing a gun to a knife fight.”

The alterations initiated by human beings build on one another, accelerating change in ways that make it all but impossible for most species to adapt quickly enough. As the great environmentalist Rachel Carson put it, “Time is the essential ingredient, but in the modern world there is no time.” But Kolbert is not nostalgic: “Though it might be nice to imagine there once was a time when man lived in harmony with nature, it’s not clear that he ever really did.”

Kolbert devotes a chapter, “The Madness Gene,” to considering the attribute of human beings that requires change in order to flourish. Unlike other species, modern humans, endowed with language, seem driven to embark on perpetual improvement projects in the course of which they alter everything around them. “With the capacity to represent the world in signs and symbols comes the capacity to change it, which, as it happens, is also the capacity to destroy it,” she writes. “A tiny set of genetic variations divides us from the Neanderthals, but that has made all the difference.”

Carson, a worthy model for Kolbert, wrote of “the problem of sharing our earth with other creatures.” We are deciding, Kolbert concludes, “which evolutionary pathways will remain open and which will forever be closed.” Our history determines the course of life on the planet. Our species changes the world, and now the most urgent question is whether we can take responsibility for what we do. “The Sixth Extinction” is a bold and at times desperate attempt to awaken us to this responsibility.

Wesleyan Alumnus-Greenpeace Activist Jailed in Russia

This morning I read a moving op-ed in the Washington Post about Dima Litvinov ’86, a Greenpeace activist recently arrested in Russia. Having organized protests against Russia’s exploitation of the Arctic, Dima was originally arrested with several others on charges of piracy. After protests against this dramatic over-reaching, the charges were reduced to hooliganism. The charges in this case were prompted by Greenpeace activists trying to put a banner on a Russian oil rig.

The op-ed piece is by Dima’s father, and this is how it concludes:

Dima and the others are threatened with long prison terms because they love and defend nature. That includes the Russian Arctic, which is threatened by senseless and dangerous drilling.

I know only too well what a prison term in Russia means. I was arrested for participating in 1968 in a demonstration against the Soviet invasion of Czechoslovakia. Lev Kopelev, Dima’s grandfather on his mother’s side, a Soviet writer, spent eight years in Soviet prison camps because he protested the looting and raping of the German population by Soviet officers and soldiers during World War II, when he fought the Nazi army.

Dima’s grandfather was arrested under Joseph Stalin, and I, Dima’s father, was arrested under Leonid Brezhnev. The Soviet Union doesn’t exist anymore, but Dima has been arrested under Russian President Vladimir Putin — a former member of the Soviet secret police, the KGB. Is it not the time to break the cycle?

The Wesleyan community has been asked to support Dima and the other Greenpeace activists. They were peacefully protesting, but they are no hooligans.


(Igor Podgorny/Associated Press) – In this photo released by Greenpeace International, activist Dima Litvinov in a defendants’ cage at the district court, in Murmansk, Russia, on Oct. 23.

Review of a Famous Amnesia Patient

From time to time my work takes me back to psychology, or at least to psychoanalysis. Last week I spoke at Clark University on Sigmund Freud. This was particularly meaningful to me because I often teach Freud’s “Clark Lectures,” and Clark was the only university at which Freud spoke. I even got my picture taken with the Freud statue!



I published a review of a book on more mainstream psychological research in today’s Washington Post. The book deals with memory and amnesia, a topic on which I’ve worked over the years. I’ve attached the review below.

PERMANENT PRESENT TENSE The Unforgettable Life of the Amnesic Patient, H. M. By Suzanne Corkin Basic. 364 pp. $28.99

Henry Molaison (1926-2008) lived a long life, but as it turned out, he experienced most of it in very short time segments. His seizures started early, and by the time he was in high school they had become frequent. Medications to control epilepsy had a variety of side effects, and still they didn’t eliminate the seizures. The terrible blackouts were always a possibility.

Wilder Penfield had been experimenting with brain surgery on human patients since the 1920s, sometimes “operating on epilepsy patients while they were awake and conscious so that he could pinpoint the abnormal tissue responsible for their seizures.” Penfield’s work received widespread recognition, and when William Scoville examined Henry in the early 1950s, a surgical option seemed likely to provide some relief to the patient. We can imagine why Henry, then 27, and his parents decided to take a chance. Everyone involved knew that the operation was experimental, even risky. The lobotomy was scheduled.

Scoville extracted the front half of Henry’s hippocampus and most of the amygdala, among other parts of his brain. He hoped that this would significantly reduce the frequency of seizures. It didn’t work. Nobody seemed to consider the possibility that the patient would lose the ability to create any new, lasting memories. But that’s what happened. For the next 54 years Henry lived in “a world bounded by thirty seconds.”

Henry’s world was fascinating — for neuroscientists and psychologists — because it was a window onto how memory functions. He became one of the most studied human beings on the planet. Suzanne Corkin met him when she was a graduate student because her teacher co-authored a paper on Henry with surgeon Scoville. “Henry’s case,” as she puts it in one of her many inelegant phrases, “fell into my lap.”

The case defined Corkin’s career and became absolutely crucial for the development of the sciences of memory. Neuroscientists came to understand the “extent to which memory depended on a few centimeters of tissue in the medial temporal lobe” because a surgeon had destroyed those pieces of Henry’s brain.

Henry could not care for himself without significant help because he would forget just about everything that was going on around him. But psychologists and neurologists, Corkin first among them, made sure that Henry had a decent life. He would be brought to MIT on a regular basis for batteries of tests. Some odd things here and there seemed to stick in his mind; there was a sense of familiarity about a few people and a few tasks. Most of the time, he was happy enough to be a test subject — not remembering that he had been tested before, time and again, for decades.

Corkin expertly uses Henry’s case to illuminate major trends in memory research. Perhaps the fundamental lesson that scientists have drawn from his case — besides eventually stopping experimental lobotomies — is that memory is a complex interweaving of cognitive systems. Short-term memory (recalling something for up to 15 seconds) could be intact without any bearing on long-term memory. Henry could hold on to number patterns, for example, for about 15 seconds, but if more storage was needed in a task, he was deficient. He was able to call on “working memory” (the ability to store small amounts of experience while focused) in specific tasks, but he was not able to register for the long term the experiences he concentrated on during those tasks. Henry could respond correctly when he could focus, but he would lose the experience forever once the task was complete. Overall, Corkin tells us in her matter-of-fact way, “in spite of his tragedy, Henry got along.”

The explicit retrieval of the past is declarative memory: We are purposively calling up something that we experienced. Through remembering, the brain changes itself as new associations are formed with what is retrieved. Each time long-term memories are retrieved, they are edited — showing “that memory is an ongoing dynamic process driven by life’s events.” Henry’s case revealed how necessary the hippocampus is for this process.

In reading about Henry as a test subject and “guide” for neuroscience, I was eager to learn more about other aspects of his life over these decades. Corkin seems to have grown genuinely attached to her mild-mannered scientific treasure trove, but her descriptions of his existence are flat, at best. We get only the faintest glimpse of how it felt to live this rudely segmented life. The destruction of his amygdala probably flattened his emotions, his desires. Corkin tells us that after the death of his father, Henry did not “consciously grasp that his father was gone unless someone reminded him.” Yet over four years the loss seemed to sink in. It was Henry’s own words that I found especially moving: “I am having a debate with myself — about my dad. . . . I’m not easy in my mind. On the one side, I think he has been called — he’s gone — but on the other, I think he’s alive. I can’t figure it out.”

For years Henry lived with his mother, and there are indications that their relationship was complex and difficult. But this was not the subject of the neuroscientist’s research. This book informs us that at times Henry was prone to terrible rages and that he even threatened to kill himself, but there is little attempt to see the world from his point of view. Perhaps that would have been impossible; it would certainly have been inconvenient.

Corkin acknowledges how important Henry was for her work. She recognizes that “our research with Henry was certainly a boon to my lab’s reputation” and affirms his “limitless worth as a research participant.” But what does it mean to participate without memory? Corkin became Henry’s guardian in his later years and saw to it that he was comfortable and well cared for. She also saw to it that upon his death, his brain was quickly removed from his skull so that it could be studied with all the technology now at our disposal.

“Henry was dead,” Corkin writes, “but he remained a precious research participant.” How you feel about such a sentence will probably be a good indicator of how you will feel about poor Henry and his doctors in “Permanent Present Tense.”

Higher Education — Two Reviews

Over the summer I finished a short book called Beyond the University: Why Liberal Education Matters. It will be published in the spring. I also reviewed two interesting books on American higher education, one focusing on teaching and the other providing a broad overview of the sector. You can find my review of Why Teach? here, and of Higher Education in America, from yesterday’s Washington Post, below.

Today classes begin, and I am delighted to head back to the classroom. Last night we heard some of the amazingly talented students who sing a cappella at Wes, and now we are finishing our syllabi and checking our reserve readings. I am teaching The Past on Film and am looking forward to meeting the students.


American higher education is the envy of the world. Students flock to this country from all over, and the most highly ranked schools tend to be here. We should be proud!

American higher education is a mess. With high costs, low graduation rates, unhappy faculty members and coddled students, our universities are about to be radically disrupted by massive, technologically driven change. A good thing, too!

How to reconcile these opposing views? At a time when ambitious business-school professors and salivating entrepreneurs predict the end of the university as we know it, and at a time when we have never been more in need of an educated workforce and citizenry, the task of understanding the evolving mission and performance of American higher education has never been more urgent. Thank goodness Derek Bok, a two-time president of Harvard and a judicious, learned analyst of education, has taken on this task. His book is too long to be called a report card, but it is a detailed progress report on the challenges and opportunities facing our nation’s colleges and universities.

One of the first things to note about higher ed in the United States is its heterogeneity. The problems of Harvard are not the same as the problems of the University of Texas or those of Scripps College in California or of LaGuardia Community College in New York. Bok tries to address schools in all their multiplicity, and his book suffers somewhat from the clunkiness that also characterizes higher ed. The book’s five sections discuss instruction from undergraduate to graduate and professional schools, as well as the market forces at work at each level. After the introduction, there are five forewords and four afterwords — not including the short final chapter called “The Last Word.” Yet one forgives redundancies because of the thoroughness of the research and the measured judgment consistently applied.

After noting the variety in higher ed, Bok acknowledges the extraordinary inequalities in the sector. Public discussion of education often focuses on the schools most difficult to get into, but “no more than two hundred colleges regularly reject more students than they admit.” At most highly selective schools (such as the one at which I am president), students receive some subsidy from the institution — even those paying full tuition. Students enrolled at less-selective schools get a small fraction of that support. Public universities have seen dramatic reductions in state support, and many flagship campuses are scrambling for donations and out-of-state, full-tuition-paying students. Community colleges enroll dramatically more people than other parts of the sector, but most of these students will never earn a degree.

Bok shows that the current quip that universities haven’t changed their teaching styles since the Middle Ages is just an empty canard. Universities have adapted surprisingly well to massive changes in technology, in demography and in developing streams of support. But Bok is no Pollyanna, emphasizing that “universities have been especially slow to act . . . in improving the quality of undergraduate education.” Professors often confuse their desire to teach what interests them the most with what undergrads need to learn, and students in recent years are spending far less time on their studies than in past generations. Bok shows how schools cater to students in order to attract more of them, often with little attention to how campus amenities provide distractions from studying.

Bok knows the governance structures of universities as well as anyone, and he realizes that true curricular reform has to be led by the faculty. The challenge, from his perspective, is to make the faculty (at least its leadership) more aware of the empirical work on student learning that has been done over the past decade. Professors may be focused on their research and distracted by committee work, but the evidence shows that they care deeply about teaching effectiveness.

“The key to educational reform,” Bok writes, “lies in gathering evidence that will convince faculty that current teaching methods are not accomplishing the results that the professors assume are taking place.” Once the teachers understand the need for change, they will rise to the occasion and create classes that are more effective at developing the capacities that most agree are essential in college graduates. They have done so in the past, and they will do so again.

Bok’s confidence in the faculty is characteristic of his approach in this book. He believes that our varied system of higher education is very much capable of self-correction. Do we need to bend the cost curve? Sure, and that is why experiments such as massive open online courses (MOOCs) are so interesting (and mostly led by university veterans). Is there a liberal bias on our campuses? Sure, and it has been there at least since the 1940s, but faculty members realize they need more political diversity. Do university leaders spend too much time raising money? Sure, but American schools — especially the selective ones — get much more support than schools in other countries. We may have the worst system, he jokes, but like democracy, it’s better than all the alternatives.

Bok underscores two areas in urgent need of improvement: increasing the percentage of students who graduate from college and improving the quality of undergraduate education. We must do a better job attracting low-income students to our best colleges and universities, no longer wasting financial aid on wealthy students with high SAT scores to improve an institution’s place in bogus rankings. We must also do a better job of stimulating curricular reform and assessment so as to be sure students are working hard to learn what they need to know — whether at a community college or a research university. Of course, reaching agreement on what students need to know is a great challenge, but that’s the core of the faculty’s responsibility.

Competition among schools produces benefits and causes problems. Most of the important ones are addressed in Bok’s helpful volume. I hope he is right that we already have the ingredients in place to make the necessary reforms. I know we need university leaders like him to help activate those ingredients so that American higher education can continue to contribute in vital ways to our culture, our economy and our polity.

Review of “The Undivided Past”

Last week the Washington Post published my review of David Cannadine’s The Undivided Past: Humanity Beyond Our Differences. I enjoyed the book, though while reading it I was reminded of something one of my Wesleyan professors told me long ago. Hayden White joked that historians know they can always say, “things are more complicated than that, aren’t they?” It always sounds like a reasonable question (which means it’s really an empty question). Over the years, I have seen Hayden’s point made time and time again when we academics (not just historians) make similar rhetorical gestures. I play with that empty question in the review below.

Cannadine’s book is very thoughtful and wide-ranging. I’m now making the final revisions on a short book on liberal education — and I know someone will be able to say, “Roth, things are more complicated than that!”


THE UNDIVIDED PAST Humanity Beyond Our Differences By David Cannadine Knopf. 340 pp. $26.95

In the 19th century, historians liked to tell triumphal tales of how people came together in powerful, sometimes glorious ways. Sweeping accounts described how religious groups or nations managed to achieve great things, often after battling other groups. In recent decades, historians have very much taken the opposite tack, showing how groups that we once thought were unified were actually quite divided. Difference, not unity, has been the preferred category for thinking about the past. Triumphal tales of groups coming together have been replaced by studies of how divided we have always been from one another.

David Cannadine, a British historian teaching at Princeton, offers a critique of the major categories historians have used to describe how some human beings are fundamentally different from others. Ideas of religion, nation, class, gender, race and civilization have generated intense feelings of belonging but also feelings of antagonism toward rival groups. Hatred of “the other” is the flip side of the fellow feeling that unites people into these mutually “belligerent collectivities.” Historians have focused on the conflicts that have emerged from this process, on the particular ways in which solidarity has given rise to antagonism toward groups of which one is not a member.

In his wide-ranging and readable new book, “The Undivided Past,” Cannadine shows that a very different story can be told. Rather than underscoring perennial conflicts between mutually exclusive human groups, Cannadine emphasizes how people find ways to get along, to cross borders and to maintain harmonious societies.

He begins with religion, which in many ways seems an unlikely vehicle for showing how people can get along. Given the missionary monotheism of Christianity, a religion that is both exclusive and proselytizing, the stage would seem set for stories of endless strife. But Cannadine finds many examples of pagans and Christians collaborating, and he shows how the “intermingling” of Catholics and Muslims in the late medieval period “transformed Europe’s intellectual landscape and made possible its twelfth-century Renaissance.” He knows well the bloody divisions that swept across Europe in the 16th and 17th centuries, but he insists that for most ordinary people, religious affiliation did not lead to violent conflict.

Rulers did try to generate national ties that would motivate their subjects to love (or fear) their monarchs and to fight against rival kings and queens. But intense feelings of nationalism, Cannadine shows, were a short-lived late-19th-century phenomenon, not some natural feeling that people are bound to experience. Class solidarity, too, was never the defining feature of identity for most people, even during the heyday of industrialization when Karl Marx and Friedrich Engels developed their theory that class conflict drove history. Sure, there are times when workers hate their bosses, but in any historical period there are “more important forms of human solidarity” than class. History is “more varied and complex than Marx or Engels would ever allow.”

Like any seasoned historian, Cannadine knows that one can always say of another’s account that “things are more complicated than that” as a prelude to offering one’s own. So it goes in “The Undivided Past.” He positions himself as an enemy of generalization, and so he can criticize Marxists for over-emphasizing class and feminists for over-emphasizing gender. Things are more complicated than that — not every worker has experienced oppression at the hands of the bourgeoisie, and not every woman has been stifled by patriarchy. Nations, though important for a brief period of history, were never monolithic entities inspiring universal devotion. Many French and English citizens, for example, had intense local allegiances that trumped national identity. When you look at history afresh while emphasizing different facts, customary generalizations can appear rather arbitrary. Things were more complicated than that.

Cannadine’s most pointed rhetoric comes toward the end, when he examines the idea of mutually antagonistic civilizations — an old notion recently popularized by the political scientist Samuel Huntington. Cannadine shows how “civilization” is intertwined with the idea of barbarism, and he quotes Montaigne’s quip: “Each man calls barbarism whatever is not his own practice.” Huntington saw the world as irrevocably divided into groups whose practices — religions, cultures and ways of life — prevented any meaningful integration. The claim is that Muslim, Eastern and Western civilizations, for example, are fated to be in conflict with one another, but Cannadine can find neither facts to back this up nor a coherent logic to the argument. He is biting in his criticism of neoconservative warmongers who draped themselves in Huntington’s pseudo-academic “findings.” “Of all collective forms of human identity,” Cannadine writes, “civilization is the most nebulous, and it is this very vagueness that makes it at once so appealing and so dangerous.”

Cannadine knows that writers and political leaders can all too easily generate solidarity on the basis of one element of a group’s identity in order to generate antagonism toward those who don’t share that element. His book is a reminder that generalizations based on the supreme importance of any one concept (be it race, class, gender, religion, nation or civilization) are likely to fall apart when closely examined. The “exaggerated insistence on the importance of confrontation and difference . . . misrepresents the nature of the human condition,” he concludes.

In closing, he writes briefly of the “just inheritance of what we have always shared” and urges us “to embrace and celebrate [our] common humanity.” But he doesn’t even try to provide evidence or arguments for what this common humanity might be. After all, that would be to make woolly generalizations; he knows things are more complicated than that.


Review of Nirenberg’s ANTI-JUDAISM

From Sunday’s Washington Post

Review of Anti-Judaism: The Western Tradition. By David Nirenberg. Norton. 610 pp. $35


Oh, the Protestants hate the Catholics,

And the Catholics hate the Protestants,

And the Hindus hate the Muslims,

And everybody hates the Jews.

So sang Tom Lehrer in his satirical song “National Brotherhood Week.” It’s no news that even those who preach “love thy neighbor” have often combined their striving for community with the hatred of a scapegoat, the Jews. David Nirenberg’s “Anti-Judaism” is a thorough, scholarly account of why, in the history of the West, Jews have been so easy to hate. And this story goes back a very long way.

Nirenberg returns to ancient Egypt to examine traditions that portray Jews as “enemies of Egyptian piety, sovereignty, and prosperity.” This was already old in the 7th century BCE! Ancient Greeks and Romans would have their Jews, too; they found use for an “anomalous” people who stuck together and followed their own rules, who were “neither disenfranchised nor citizen, neither conquered nor conquering, neither powerless nor free.” Over the centuries, when there was trouble in the kingdom, be it corruption or military threat, famine or political chaos, pagan ideologues developed a handy solution: Attack the Jews.

Jews were useful for those who were contending for power in the ancient world, and the Egyptian model of scapegoating was often repeated. But it was the Christians who refined anti-Judaism into a core theological and political ideology. Christianity had a particular problem: to show that it had overcome Judaism — overcome its adherence to the laws of the “old” testament, overcome its tribal particularity with evangelical universalism. The idea of Judaism — together with the fact that there were still people in the world who chose to remain Jews — was an affront to that universalism. “To the extent that Jews refused to surrender their ancestors, their lineage, and their scripture, they could become emblematic of the particular, of stubborn adherence to the conditions of the flesh, enemies of the spirit, and of God.”

Throughout the centuries theologians returned to this theme when they wanted either to stimulate religious enthusiasm or to quash some perceived heretical movement. Not that you needed any real Jews around to do this. You simply had to label your enemies as “Jews” or “Judaizing” to advance the purity of your cause. In the first through fourth centuries, Christians fighting Christians often labeled each other Jews as they struggled for supremacy. And proclaiming your hatred of the Jews became a tried and true way of showing how truly Christian you were. Centuries later, even Luther and Erasmus agreed that “if hatred of Jews makes the Christian, then we are all plenty Christian.”

Islam followed this same pattern of solidifying orthodoxy by stoking anti-Jewish fervor. Muhammad set Islam, like Christianity, firmly within an Abrahamic tradition, but that made it crucial to sever the new religion from any Judaizing possibilities. Rival Islamic groups, like rival forms of Christianity, often painted their adversaries as hypocritical Jews scheming to take the world away from spiritual truths essential for its true salvation.

Nirenberg shows how consistently the struggle for religious and political supremacy has been described as a struggle against the “Jews.” The quotation marks are especially important as his account moves beyond the medieval period, because between 1400 and 1600 Western Europe was more or less “a world free of Jews.” Banished from most countries, and existing only in the tiniest numbers through special exemptions, actual Jews were hardly ever seen. But it was in this period that “Christian Europe awoke haunted by the conviction that it was becoming Jewish.” In this period of cultural change and doctrinal and political disputes, patterns as old as the age of the pharaohs were reactivated: My adversaries must be extinguished for the polity to be purified; my adversaries must be Jews. And in early modern European eyes, the adversaries were especially dangerous if they were secret Jews who appeared to be Christian. Were Jews hiding everywhere?

Martin Luther brought this rhetoric to a fever pitch. In 1523 he accused the Roman Church of becoming “more ‘Jewish’ than the Jews,” and as he grew older he tried to convince his contemporaries that “so thoroughly hopeless, mean, poisonous, and bedeviled a thing are the Jews that for 1400 years they have been, and continue to be, our plague, pestilence, and all that is our misfortune.” Don’t believe in conversions, the aged Luther urged; the only way to baptize Jews was by tying millstones around their necks.

Nirenberg’s command of disparate sources and historical contexts is impressive. His account of the development of Christianity and Islam is scholarly yet readable. And his portrayal of the role that Judaism has played as a foil for the consolidation of religious and political groups is, for this Jewish reader, chilling. Nirenberg is not interested, as he repeatedly insists, in arguing that Christianity and Islam are “anti-Semitic.” Instead, he is concerned with tracing the work that the idea of Judaism does within Western culture. He shows that many of the important conceptual and aesthetic developments in that culture — from Saint John to Saint Augustine to Muhammad, from Shakespeare to Luther to Hegel — depend on denigrating Jews. That’s what’s so chilling: great cultural achievements built on patterns of scapegoating and hatred.

In the modern period, revolutionaries and counter-revolutionaries continued to employ “the Jewish problem” as something to be overcome. “How could that tiny minority convincingly come to represent for so many the evolving evils of the capitalist world order?” Nirenberg asks. He shows that for thousands of years the patterns of anti-Judaism have evolved to provide great thinkers and ordinary citizens with habits of thought to “make sense of their world.” He doesn’t say that these patterns caused the mechanized, genocidal Nazi war against the Jews in the 20th century, but he argues convincingly “that the Holocaust was inconceivable and is unexplainable without that deep history of thought.”

Presaging Tom Lehrer, Sigmund Freud in 1929 wrote ironically that Jews, by being objects of aggression, “have rendered most useful services to the civilizations of the countries that have been their hosts; but unfortunately all the massacres of the Jews in the Middle Ages did not suffice to make that period more peaceful and secure for their Christian fellows.” Even when “everybody hates the Jews,” patterns of intolerance and violence remain intact. Nirenberg offers his painful and important history so that we might recognize these patterns in hopes of not falling into them yet again.

Hallucinations and Art: Two Book Reviews

From Sunday’s Washington Post:

HALLUCINATIONS By Oliver Sacks. Knopf. 326 pp. $26.95

As a young professor, I traveled to Vienna to visit a friend. Knowing that I’d written my first book on psychoanalysis and history, he sent me off to Freud’s old apartment and office, which had been converted to a museum. One rang a doorbell to be admitted, and I was shocked when the museum attendant greeted me by name. Surely, I thought, my old friend had called ahead to play a little joke on me. Again, the attendant spoke to me by name in German, calling me “Professor Doktor Roth” — or so I thought. My wife was right beside me, and she later told me that nothing of the kind had happened. The museum employee had merely told me the price of admission.

I was befuddled by this, and later as I searched in the museum’s library to see if it had a copy of my book, I realized that what I’d heard so clearly was probably an auditory hallucination. I so very much wanted to be recognized in the house of Freud that I’d perceived something that wasn’t there at all.

Most of the examples of hallucinations in Oliver Sacks’s graceful and informative new book do not have the transparent motivations of my episode in the Freud museum. Indeed, most of his examples don’t seem “motivated” at all; they have causes rather than meanings. That is, most of the occurrences seem to be products of neurological misfirings that can be traced to disease, drugs or various changes in neurochemistry. With some important exceptions, hallucinations don’t seem to reveal desires or intentions — the kinds of things that create meaning; they do reflect workings of the brain that cause us to see or hear things that are not really there. Parkinsonian disorders, epilepsy, Charles Bonnet syndrome, migraines and narcolepsy — drawing upon descriptions of these and other conditions by patients and doctors, Sacks explores the surprising ways in which our brains call up simulated realities that are almost indistinguishable from normal perceptions.

As is usually the case with the good doctor Sacks, we are prescribed no overarching theory or even a central argument to unite his various observations. Instead, we are the beneficiaries of his keen observational sense, deep clinical practice and wide-ranging reading in the history of neurology. This doctor cares deeply about his patients’ experiences — about their lives, not just about their diseases. Through his accounts we can imagine what it is like to find that our perceptions don’t hook on to reality — that our brains are constructing a world that nobody else can see, hear or touch.

Sacks has been fascinated by neurology since his student days (he is now almost 80), and he recounts his personal experiences with neurochemistry. He started experimenting with LSD in the 1950s, and when he was a medical resident living in Southern California’s Topanga Canyon in the 1960s, his drug use combined recreation with investigation. Opiates later upped the ante, and Sacks describes his interest and pleasure in altered states of consciousness. He recalls, with neither pride nor shame, hallucinations that drew heavily on Froissart and Shakespeare. His perceptions weren’t based in reality, but could he still learn from them?

Sacks has long been an avid reader of the history of medicine, and he beautifully describes his intense, amphetamine-inflected readings of such 19th-century medical texts as the English physician Edward Liveing’s work on migraines. Drugs made reading seem more powerful, but as he came down from his high, Sacks realized that while under the influence of drugs he would never be able to write with the kind of sustained attention and care evident in the texts he admired. His epiphany was that he should follow his creative muse not through more powerful hallucinations but through the work of medicine and writing. “The joy I got from doing this was real — infinitely more substantial than the vapid mania of amphetamines.”

Over the past decades we have learned much more about how we see and hear with our brains — not just with our eyes and ears. Sacks describes how neurosurgeon Wilder Penfield was able to induce “experiential seizures” by tracking electrodes over the surface of an exposed temporal cortex during surgery. His patients seemed to experience vivid flashbacks, as if the electrical charge had catalyzed a memory into a perception. Vivid though they were, these recollections seemed to lack personal significance. More recent work has explored how the brain creates networks of recollection that allow us to access memories, even as we reshape the past while bringing it into consciousness.

Some hallucinations, Sacks writes, do seem connected to highly significant, emotionally charged memories. When deep in grief, for example, we are more likely to perceive our loved one, even though we know that person has died. Bereavement “causes a sudden hole in one’s life,” and a hallucination evinces a “painful longing for reality to be otherwise.”

At the end of “Hallucinations,” Sacks returns to phantom limbs, a subject he wrote about at length in “A Leg to Stand On.” Amputees report pain in limbs they no longer physically possess, the brain seeming to retain an image of the body that trumps physical reality. Physicians today help patients learn to use their phantom limbs, fitting them into prostheses so that they can use their hallucination of a body part to maneuver what no longer seems like an artificial limb.

Turning a phantom limb from something strange and painful into something one integrates with one’s sense of self is a medical and human triumph. Sacks has turned hallucinations from something bizarre and frightening into something that seems part of what it means to be a person. His book, too, is a medical and human triumph.


From Sunday’s Los Angeles Times

Glittering Images: A Journey Through Art From Egypt to Star Wars. By Camille Paglia. Pantheon: 202 pp., $30

In the 1990s Camille Paglia established herself as a cultural critic to be reckoned with. Her daring “Sexual Personae” enraged feminists, even as it presented a view of culture, sexuality and control that offered little comfort to conservatives hoping to convert even more Americans to the cult of conventionality. Chaos, Paglia emphasized, might be contained for a while, but it would always find its way back into our lives. And that wasn’t something to be lamented.

Paglia was a radical libertarian eager to puncture sanctimony wherever she found it — either in the progressive pieties of political correctness or in the hypocrisy of fundamentalist hucksters hacking away at other people’s pleasures.

She enjoyed a fight, or at least she recognized that fights made good copy and pumped up sales. She liked to throw around the word “Stalinist” and was herself compared both to a Nazi and to Phyllis Schlafly by prominent feminist authors. Paglia particularly enjoyed polemics against pretentious academics, reserving some of her nastiest and most amusing tirades for the followers of highfalutin French theory. This too was a guaranteed audience pleaser.

In the last decade we have seen a kinder and gentler Camille Paglia as she has moved from critical polemic to cultural appreciation. In “Break, Blow, Burn” she turned her attention to what she considered great poetry in English — from Shakespeare to Joni Mitchell. Taking a page (and perhaps a business plan) from her mentor Harold Bloom, Paglia wrote in that book that in “this time of foreboding about the future of Western culture, it is crucial to identify and preserve our finest artifacts.” She collected 43 mostly canonical poems and wrote a little about each in the hope that the inspiration she found in them would be contagious.

“Glittering Images” continues this project — this time with brief discussions of 29 works of visual art. Whereas “Break, Blow, Burn” sought to help us hear again the strongest poetic voices, this volume wants to help readers “find focus” amid the “torrential stream of flickering images.”

Paglia’s goal is straightforward: By offering images of great artworks and helping us to give them sustained attention, she hopes that readers will “relearn how to see” with sustained pleasure and insight. Protesting against the intense animosity toward the arts she sees in American popular culture, Paglia wants her readers to recognize the deep feeling, craft and originality that went into the works she has chosen.

The range of art discussed is enormous, though there are few surprises in the Paglia canon. She begins with Nefertari’s tomb and offers a few pages on religion and politics in ancient Egypt and on Egyptology since Napoleon. The anonymous artisans who built the tomb “were faithful messengers of the cultural code,” linking profound cultural truths to elegant visual representation. Paglia’s sympathy for the intersection of religion and art serves her well in the early chapters of the book, as she discusses objects that were venerated for more than their aesthetic power.

Given her penchant for polemic, it was odd to discover that “Glittering Images” has no argument. Her brief discussions of the objects have the flavor of the textbook or Wikipedia, with occasional anachronistic comments linking them to present concerns. It’s probably a good thing that Paglia makes no attempt to sustain a narrative about art over the ages; instead she offers reflections on why she finds, say, Donatello’s Mary Magdalene so powerfully enigmatic, or why Bronzino’s mannerism has “a polished theatricality but an unsettling stasis.”

It would be silly to complain about the particular works that Paglia has chosen. They all repay vision and reflection, and that, after all, is her point. The critic sometimes seems to believe, with George Grosz, that “great art must be discernible to everyone,” and I suppose that’s why she concludes her survey with the limited imagination but visual virtuosity of George Lucas.

In her final chapter she writes as if popularity is a key sign of artistic greatness, though she knows that many of the artists she most admires were not at all part of the popular culture of their times. They often struggled to be seen, but that doesn’t mean that fame was their ultimate artistic goal.

I’m not sure why Paglia worries so much that the fine arts today have lost touch with the masses, that they “are shrinking and receding everywhere in the world.” Sure, her favorite AM talk radio shows often make fun of artists. But people have been making fun of artists for a very long time. Meanwhile, contemporary photographers, painters, sculptors and videographers pursue their practice with intensity and patience, with craft and concept.

Toward the end of “Glittering Images,” Paglia writes with appropriate and infectious admiration about Eleanor Antin’s mail art project 100 Boots. Paglia notes that the “boots, like their creator, are outsiders, eternal migrants questing for knowledge and experience.”

Artists, questing outsiders, are still with us, still finding their way, making their way. Perhaps some of them will be inspired by the glittering images Camille Paglia offers here.


Liberal Learning and Making Stuff: Review of Anderson’s MAKERS

This weekend a podcast was released of an interview I did with a reporter on the liberal arts as a pragmatic form of education for our time. Today my review of Chris Anderson’s Makers: The New Industrial Revolution appeared in the Washington Post. Anderson argues that the digital economy offers enormous opportunities for inventors and entrepreneurs.

Having spent seven years as president of an art and design school known for education through the arts, I am particularly interested in the ways in which “making stuff that matters” is relevant to liberal learning. Over the next few years we will be launching an initiative to enable more Wesleyan students to increase their digital and computational literacy, and we will be expanding access to spaces in which students can make stuff with digital tools. Liberal learning should go hand in hand with creating things that make a positive difference in the world. Here’s the review:


These days, when our slow recovery from recession seems like a full-employment program for pessimistic pundits, it’s great to have a new book from Chris Anderson, an indefatigable cheerleader for the unlimited potential of the digital economy. Anderson, the departing editor in chief of Wired magazine, has already written two important books exploring the impact of the Web on commerce. In “The Long Tail,” he argued that companies like Amazon, able to overcome the distribution challenges of stocking enormous numbers of different products, would thrive by “selling less of more.” Corporations didn’t have to chase blockbusters if they had a mass of small sales. In “Free: The Future of a Radical Price,” he argued that giving stuff away to attract a multitude of users might be the best way eventually to make money from loyal customers. Anderson has also helped found a Web site, Geekdad, and an aerial robotics company. From his vantage point, in the future more and more people can get involved in making things they really enjoy and can connect with others who share their passions and their products. These connections, he claims, are creating a new Industrial Revolution.

In a 2010 Wired article entitled “In the Next Industrial Revolution, Atoms Are the New Bits,” Anderson described how the massive changes in our relations with information have altered how we relate to things. Now that the power of information-sharing has been unleashed through technology and social networks, makers are able to collaborate on design and production in ways that facilitate the connection of producers to markets. By sharing information “bits” in a creative commons, entrepreneurs are making new things (reshaping “atoms”) more cheaply and quickly. The new manufacturing is a powerful economic force not because any one business becomes gigantic, but because technology makes it possible for tens of thousands of businesses to find their customers, to form their communities.

Anderson begins his new book, “Makers,” with the story of his grandfather Fred Hauser, who invented a sprinkler system. He licensed his invention to a company that turned ideas into things that could be built and sold. Although Hauser loved translating ideas into things, he needed a company with resources to make enough of his sprinklers to turn a profit. Inventing and making were separate. With the advent of the personal computer and of sophisticated but user-friendly design tools, that separation has become increasingly irrelevant. As a child, Anderson loved making things with his grandfather, and he still loves creating new stuff and getting it into the marketplace. “Makers” describes how today technology has liberated the inventor from a dependence on the big manufacturer. “The beauty of the Web is that it democratized the tools both of invention and production,” Anderson writes. “We are all designers now. It’s time to get good at it.”

Here’s where social networks come in. By sharing design ideas, we improve performance and find efficiencies. Communities of makers — whether they care about sprinklers, 3-D printers or flying robots — exchange ideas, correct one another’s plans, and together make something worth having (and that many are already invested in). Anderson sees a revolution in the contemporary preference for amateur content, and he approvingly cites Web entrepreneur Rufus Griscom’s talk of a “Renaissance of Dilettantism.” This is a “remix culture” in which everything can be customized. Web culture reveals the “long tail of talent,” and with barriers to entry rapidly disappearing, Anderson sees a new, more open playing field in which inventor-entrepreneurs (makers) will fuel economic development while creating fulfilling, less hierarchical communities.

This is heady stuff, and Anderson is an excellent guide to companies that make niche products for an international market. There are, apparently, enough folks interested in products like hammocks, weapons for Lego sets, and cool flying machines to support producers whose design and manufacturing costs are kept very low. Most of Anderson’s product examples are the kinds of things boys like to play with, and there is something of the “I found other kids like me” joy in his descriptions of community-building through the social networking of makers. The new industrial revolution, apparently, will have less to do with confronting poverty, disease and climate change, and more to do with inventing better, cooler toys. It will also be, like the last one, very male.

A firm believer in the wisdom of crowds, Anderson doesn’t take time to explore the dangers — or the limits — of wired dilettantism. He counts on networks to uncover error rather than to reinforce prejudice, and he has faith that real talent will be recognized more easily by those invested in solving a problem than by those seeking somebody who is merely properly credentialed.

Anderson is a good storyteller, and these anecdotes effectively highlight changing economic dynamics. Take Jordi Munoz Bardales, who went from hacker-hobbyist to CEO just a couple of years after graduating from his Tijuana high school. Bardales’s posting online of his design innovations to a toy helicopter was proof enough in Anderson’s eyes that he had the right stuff to be the leader of a robotics firm. It just didn’t matter where he went to school. It mattered that he had the skills and a capacity to share them.

In Anderson’s view, the Web creates an arena in which inventive people can connect with one another and figure out ways to turn their designs into things that will succeed in the marketplace. This will “unlock an economic engine” as thousands of small enterprises find new ways to be sustainable. Anderson convinced me that these enterprises will indeed succeed in making cool things that are fun to play with or that offer heightened convenience.

And I even have some hope that these new powers of making might address some of the major problems that still plague us from the last Industrial Revolution. Making hope in the future may be the most important product of the dynamic Anderson describes in his inventive new book.


From Affordability To Transformation

Since I first posted a blog at the Washington Post about affordability plans at Wesleyan, there has been strong interest in our three-year option. I’m delighted and a little surprised. As I’ve said, it’s not for everybody, but the three-year possibility might make sense for many people. Here are two audio clips in which I’ve discussed what we are doing in this regard:

NPR Marketplace

WOR Radio New York

When I arrived in the office this morning, Heather Brooke asked me how I liked the Wall Street Journal piece. I didn’t know anything about it, and then was surprised to read the opening lines of an op-ed by Fay Vincent:

As the costs of attending college continue to mount, often well beyond the rate of inflation, the search is on for ways to economize. One seemingly obvious way is to reduce the number of years required to graduate. Last month, Wesleyan University, the private liberal-arts college in Middletown, Conn., did just that.

President Michael Roth announced that his institution would encourage students who wanted to complete the requirements for the Bachelor of Arts degree in three years rather than the customary four. These students would take some course work during the summer along with their normal load during the school year.

“I think it’s important to show that liberal arts colleges, even ones as selective as Wesleyan, are trying to do something about affordability,” he told the Associated Press. Tuition, room and board there is nearing $50,000 per year.

There are a smattering of other colleges across the nation that have three-year programs, but none with as high an academic profile. And while the Wesleyan decision has not attracted much attention or discussion, I suspect there will be more such cost-saving efforts in coming years.

Mr. Vincent was underestimating the price of institutions like Wesleyan, but I was pleased to see him encouraging experiments that will enhance affordability. At Wesleyan, we also want our experiments to intensify the educational experience so that it is more compelling than ever. A liberal arts education has long been about transformation. With Wes faculty, students, staff and alumni making contributions, we can also transform liberal learning so that it’s more relevant than ever!

Creativity Works at Wes

What follows is a book review I published this weekend in The Washington Post of “Imagine: How Creativity Works,” by Jonah Lehrer. For years people have said that Wesleyan is a place for creative students, and recently we have tried to define more specifically how the work on our campus helps students develop their capacities for innovation.

A few days ago, applicants to Wes found out whether they have been invited to join the class of 2016. The competition for spots was very intense this year. With more than 10,000 applicants, most of whom are highly qualified, the process of putting together a class is increasingly difficult. We are looking for students who will thrive in the engaged, collaborative and imaginative campus culture here. Over the next four weeks many of the prospective pre-frosh will be visiting Wes, trying to determine if this will be their home and their launch pad for the next four years. The students who choose Wesleyan will likely be those who find that the dynamic student and faculty culture stimulates their own imaginative capacities. Creativity works at Wesleyan.


Check out these recent articles on the student music scene at Wes:


Here’s the review, crossposted from The Washington Post:

Not many writers can make plausible links among musicians Bob Dylan, Yo-Yo Ma and David Byrne, animators at Pixar, neuroscientists at MIT, an amateur bartender in New York, entrepreneurs in Silicon Valley and Israeli army reservists. Not many reporters do research about an expert surfer who has Asperger’s, information theorists, industrial psychologists and artists. But Jonah Lehrer is such a writer-reporter, who weaves compelling and surprising connections based on detailed investigation and deep understanding. He says that working memory is an essential tool of the imagination, and his book is an excellent example of how a dynamic storehouse of captivating information feeds creative thinking and writing.

Lehrer begins with the story of a pop-culture breakthrough, the artistic reinvigoration that Dylan experienced when he wrote “Like a Rolling Stone.” Dylan was finishing a grueling tour schedule that had left him increasingly dissatisfied with making music. He decided to leave behind the madness of celebrity culture and the repetitive demands of pop performance. But once he was ensconced in Woodstock, N.Y., once he decided to stop trying to write songs, the great song came: “It’s like a ghost is writing a song,” he said. “It gives you the song and it goes away. You don’t know what it means.” Lehrer adds, “Once the ghost arrived, all Dylan wanted to do was get out of the way.”

Many of the stories that Lehrer recounts in the first few chapters stress the benefits of paying attention to internal mental processes that seem to come from out of the blue. We can learn to pay attention to our daydreams, to the thoughts or fantasies that seem nonsensical. Sometimes this attention must be very light, so that the stream of ideas and emotions flows, as when Ma feels his way into a new piece of music. Sometimes the attention must be very great, as when W.H. Auden (assisted by Benzedrine) focused on getting the words in a poem exactly right.

Lehrer explains some of the neuroscience behind these different modes of attentiveness. Making use of the power of the right hemisphere figures in, as does activating more energy from the prefrontal cortex to “direct the spotlight of attention.” He discusses experiments that explore which parts of the brain seem most active in different kinds of pursuits. For example, as the brain develops in childhood, the power to inhibit our flights of fancy grows. But as inhibition and focus increase, the capacity to improvise seems to diminish.

Lehrer notes that modern science has given new names to ideas that philosophers have been exploring for a very long time. Despite the fancy terminology, I found the anecdotes about scientific experiments less interesting than the anecdotes about poets, artists, surfers and inventors. That’s partly because the science stories seem to overreach, pretending to offer explanations for creativity by finding precise locations for the multitudinous connections that the brain generates. In an organ with the networking plasticity of the brain, location might not explain so much.

The last three chapters move from individuals to contexts. Lehrer offers fascinating accounts of why cities generate intense creative work and why certain urban-planning principles that emphasize heterogeneity (think Jane Jacobs) are so powerful. He shows us why teams that “are a mixture of the familiar and the unexpected,” such as those at Pixar, are the most innovative. Too much strangeness, and things fall apart. Too much closeness, and the generative spark is never struck.

Lehrer shows why brainstorming usually fails to result in real innovation because nobody is pushing back on bad ideas. “The only way to maximize creativity . . . is to encourage a candid discussion of mistakes. . . . We can only get it right when we talk about what we got wrong.” Or, as Lee Unkrich, a Pixar director, put it: “We just want to screw up as quickly as possible. We want to fail fast. And then we want to fix it. Together.”

Lehrer concludes with a discussion of why certain epochs seem to be more creative than others. Culture, he says, determines creative output, and it is through sharing information and making connections that we maximize that output. He quotes Harvard economist Edward Glaeser, who emphasizes that “even in this age of technology, we still get smart being around other smart people.”

Glaeser and Lehrer are showing why cities remain so important, but as the president of a university, I can also see how this applies to our campuses. Students and faculty seek the inspiration that is available all over campus, and that’s why so much learning happens outside the classroom. Sitting by yourself with your computer, even if you have access to thousands of Facebook “friends,” just isn’t the same as being in a creative, cosmopolitan culture in which new connections are continually (and surprisingly) formed.

“Imagine” doesn’t offer a prescription for how we are to become more imaginative, but it does emphasize some key ingredients of a creative culture: taking education seriously, increasing possibilities for human mixing and cultivating a willingness to take risks. Lehrer practices what he preaches, showing an appetite for learning, a determined effort to cross fields and disciplines, and a delight in exploring new possibilities. Reading his book exercises the imagination; the rest is up to us.