Returning to Freud and Remaking The Past

My review of Adam Phillips’ excellent new biography of Sigmund Freud, Becoming Freud: The Making of a Psychoanalyst, was published in the Washington Post today. I’ve been writing about Freud ever since my frosh classes at Wesleyan, and every so often I still return to psychoanalysis. In reading Phillips’ account of Freud’s early years, I was reminded of Hayden White’s remarks at this year’s Commencement.

“You can change your personal past. You do not have to continue to live with the past provided to you by all of the agencies and institutions claiming authority to decide who and what you are and what you must try to be in your future. You can change your past and thereby give your future a direction quite different from what has been marked out for you by others.” — Hayden White

White told our graduating class that “the future you deserve depends on the past you make for yourself.” Not bad advice…and very much related to the view of the self and of history that Freud developed in his early years.

BECOMING FREUD The Making of a Psychoanalyst By Adam Phillips

The introduction to Adam Phillips’s new book is titled “Freud’s Impossible Life,” and the author makes clear more than once his view that the biographer’s task is an unmanageable one. Freud himself didn’t make things easy, destroying a lot of evidence of his early years so as to lead (as he said) his future biographers astray. And he did this long before he had anything like the kind of résumé that would have interested biographers.

As Phillips notes, Freud had a strong distrust of the biographer’s task, although he himself wrote speculative biographical studies. “Biographical truth is not to be had,” Freud wrote, “and if it were to be had we could not use it.” So much of a person’s life is underground, unconscious, and how we reconstruct it may reveal more about ourselves than about our subject. Phillips draws on two big Freud biographies (by Ernest Jones and Peter Gay), fully aware of their limitations.

In writing about Freud’s first 50 years, the author (who is also a practicing psychoanalyst) doesn’t have much evidence to go on. “Nothing, it is worth repeating, is properly known about Freud’s mother,” Phillips emphasizes, and he is also reduced to general speculations (or silence) about Freud’s father, wife, siblings and children.

But there is also freedom in the lack of evidence; one of the reasons Freud was so interested in the ancient world, Phillips tells us, was the paucity of verifiable facts. According to psychoanalysis, “only the censored past can be lived with,” and we “make histories so as not to perish of the truth.” For Phillips, psychoanalysis is part of the history of storytelling, and “a biography, like a symptom, fixes a person in a story about themselves.” What kind of story, then, does Phillips have to tell about Freud and about psychoanalysis?

His story about the life and the work is ultimately more invested in the latter. Young Freud, a secular Jew, tries to assimilate into Viennese society while also theorizing that we humans have desires that can never be assimilated with our public, social roles. He is attracted to mentors who introduce him to painstaking scientific research (Ernst Brücke), to charismatic investigations into the irrational (Jean-Martin Charcot), to clinical work that reduces hysterical misery to common unhappiness (Josef Breuer) and finally to unstable speculation on the secrets of human nature (Wilhelm Fliess). “In this formative period of his life,” Phillips writes, “Freud moves from wondering who to believe in, to wondering about the origins and the function of the individual’s predisposition to believe.”

As a young doctor in training at Vienna’s General Hospital, Freud asked his fiancee to embroider two maxims to hang in his lodgings: “Work without reasoning” and “When in doubt abstain.” This is so telling for Phillips because work and abstinence would eventually be at the center of psychoanalysis — both paradoxically reframed as dimensions of our circuitous pursuits of pleasure. The Freudian question par excellence: “What are you getting out of your abstinence?”

Phillips does tell us that, as a young child in the 1860s, Sigmund regularly found himself displaced by the birth of new siblings — six in seven years. As newlyweds in the 1880s, Freud and his wife practically repeated this history, welcoming new children — six in eight years. Surrounded by all these little ones, the young father spent more and more time trying to understand their demands on the world around them — how they communicated those demands and how adults responded.

One of the most important ways we deal with the demands we make and those made on us is to try to forget them. Unmet demands — unrequited desires — can hurt, and so in order to get back to work (and love), we may push them away. Frustration and the repression of frustration became central to Freud’s thinking in the 1890s, when he was in his late 30s and his 40s. At first he tried to understand the phenomena of pleasure, frustration and forgetting at the neurological level. Then he started paying attention to how we express in disguised form the complications of our appetites — in symptoms, in slips of the tongue, in jokes and especially in dreams: the beginning of psychoanalysis.

Now Sigmund could become a Freudian — an interpreter who showed how our actions and words indirectly express conflicts of desire. The conflicts among our desires never disappear; they become the fuel of our histories. Making sense of these conflicts, understanding our desires, he thought, gives us an opportunity to make our histories our own.

Freud came to this realization, indeed, came to psychoanalysis, when he acknowledged that “our (shared) biological fate was always being culturally fashioned through redescription and recollection.” Our fate, then, resulted from how we remembered and retold our histories, and psychoanalysis became a vehicle for telling those histories in ways that acknowledged our conflicting desires. Psychoanalysis wasn’t a methodology to discover one’s true history; it was a collaboration that allowed one to refashion a past with which one could live.

In prose that often crackles with insights, Phillips refashions the heroic period in Freud’s life, when he believed “that making things conscious extended the individual’s realm of choice; where there was compulsion there might be decision, or newfound forms of freedom.” At the turn of the century, before there were many followers and before there was an organization to control, “Freud emerges as a visionary pragmatist.”

This visionary pragmatist understood that we could construct meaning and direction from our memories in order to suffer less and live more fully in the present. Phillips tells a story of how Freud came to that realization, making psychoanalysis as he made his life his own.

 

Review of Elizabeth Kolbert’s The Sixth Extinction

This review of Elizabeth Kolbert’s The Sixth Extinction: An Unnatural History appeared in the Washington Post this morning. I know there are many people at Wesleyan searching for ways to make a difference in the face of the environmental disasters of climate change. Kolbert is a thoughtful, engaged and determined guide.

 

Elizabeth Kolbert’s “Field Notes From a Catastrophe” (2006) presented a powerful account of how climate change was disrupting lives around the planet. Whether the New Yorker columnist was visiting a utility company in Burlington, Vt., ice sheets in Greenland or floating cities in the Netherlands, she deftly blended science and personal experience to warn of the enormous harm created by human-generated climate change. The last chapter of that book, “Man in the Anthropocene,” underscored that we had entered an era in which human beings had begun to change everything about the planet’s interlocking ecosystems, and that we had put many of those systems and our own species at enormous risk. “It may seem impossible,” Kolbert concluded, “to imagine that a technologically advanced society could choose, in essence, to destroy itself, but that is what we are now in the process of doing.”


In her new book, “The Sixth Extinction,” she provides a tour d’horizon of the Anthropocene Age’s destructive maw, and it is a fascinating and frightening excursion. We humans have been bad news for most of the world’s living things, causing massive extinctions of species with which we share the planet. Unless we change our ways, she argues convincingly, we will certainly cause our own demise.

Until the 18th century, scientists didn’t have a clear idea that species could become extinct. Kolbert credits the French naturalist Georges Cuvier, writing in the wake of the great Revolution, with realizing that whole branches of the tree of life could permanently be cut off. Still, most of those who studied natural history were sure that extinctions happened only gradually over very long periods of time. This uniformitarian view would fit well with Darwin’s perspective on the slow and steady pace of evolutionary change through natural selection. Species did become extinct, but only very slowly as other competitors adapted more successfully to the environment around them.

This view of extinctions was definitively shattered by the work of Luis and Walter Alvarez, a father-son team who demonstrated that the Cretaceous period ended when an asteroid struck the Earth and radically changed the planet’s climate. In what has come to be called the K-T extinction, “every animal larger than a cat seems to have died out,” and things were no better in the water. The dinosaurs were just the most celebrated victims: “Following the K-T extinction,” Kolbert emphasizes, “it took millions of years for life to recover its former level of diversity.”

The scientific consensus was that things evolved very slowly, except in the face of radical events — like an asteroid crashing into the Earth. Today there is another asteroid hitting the planet, and it’s us. Slow “adaptation” in the Darwinian sense is meaningless if a creature very suddenly has to face conditions that “it has never before encountered in its entire evolutionary history.” In our age, the Anthropocene, these are the conditions human beings have been creating (very quickly) for other forms of life.

As in “Field Notes From a Catastrophe,” Kolbert presents powerful cases to bring her point home. Oceans are highly stressed by climate change, for example, and acidification of the seas is driving the extraordinary ecosystems of coral reefs into extinction. Some plants and animals desperately migrate to more hospitable climes, while others can’t survive the arrival of the newcomers. According to entomologist E.O. Wilson, whom she cites, we are now reducing biological diversity to its lowest level since the Cretaceous period.

Some of these changes have been created by our species breaking down barriers among other species as life forms tag along on our boats and planes from one part of the globe to another. Snakes in Guam, snails in Hawaii and thousands of other species brought by human beings into new environments, intentionally or not, have “succeeded extravagantly at the expense of other species.” As we make the world more interconnected than ever (“The New Pangaea”), the fatal vulnerabilities in thousands of species are exposed. The recent annihilation of bat populations in the Northeast, for example, has been caused by a foreign fungus that the animals had never encountered and so had no defense against. When a new fungus appears, Kolbert writes, “it’s like bringing a gun to a knife fight.”

The alterations initiated by human beings build on one another, accelerating change in ways that make it all but impossible for most species to adapt quickly enough. As the great environmentalist Rachel Carson put it, “Time is the essential ingredient, but in the modern world there is no time.” But Kolbert is not nostalgic: “Though it might be nice to imagine there once was a time when man lived in harmony with nature, it’s not clear that he ever really did.”

Kolbert devotes a chapter, “The Madness Gene,” to considering the attribute of human beings that requires change in order to flourish. Unlike other species, modern humans, endowed with language, seem driven to embark on perpetual improvement projects in the course of which they alter everything around them. “With the capacity to represent the world in signs and symbols comes the capacity to change it, which, as it happens, is also the capacity to destroy it,” she writes. “A tiny set of genetic variations divides us from the Neanderthals, but that has made all the difference.”

Carson, a worthy model for Kolbert, wrote of “the problem of sharing our earth with other creatures.” We are deciding, Kolbert concludes, “which evolutionary pathways will remain open and which will forever be closed.” Our history determines the course of life on the planet. Our species changes the world, and now the most urgent question is whether we can take responsibility for what we do. “The Sixth Extinction” is a bold and at times desperate attempt to awaken us to this responsibility.

Review of “The Undivided Past”

Last week the Washington Post published my review of David Cannadine’s The Undivided Past: Humanity Beyond Our Differences. I enjoyed the book, though while reading it I was reminded of something one of my Wesleyan professors told me long ago. Hayden White joked that historians know they can always say, “things are more complicated than that, aren’t they?” It always sounds like a reasonable question (which means it’s really an empty question). Over the years, I have seen Hayden’s point made time and time again when we academics (not just historians) make similar rhetorical gestures. I play with that empty question in the review below.

Cannadine’s book is very thoughtful and wide-ranging. I’m now making the final revisions on a short book on liberal education — and I know someone will be able to say, “Roth, things are more complicated than that!”

 

THE UNDIVIDED PAST Humanity Beyond Our Differences By David Cannadine Knopf. 340 pp. $26.95

In the 19th century, historians liked to tell triumphal tales of how people came together in powerful, sometimes glorious ways. Sweeping accounts described how religious groups or nations managed to achieve great things, often after battling other groups. In recent decades, historians have very much taken the opposite tack, showing how groups that we once thought were unified were actually quite divided. Difference, not unity, has been the preferred category for thinking about the past. Triumphal tales of groups coming together have been replaced by studies of how divided we have always been from one another.

David Cannadine, a British historian teaching at Princeton, offers a critique of the major categories historians have used to describe how some human beings are fundamentally different from others. Ideas of religion, nation, class, gender, race and civilization have generated intense feelings of belonging but also feelings of antagonism toward rival groups. Hatred of “the other” is the flip side of the fellow feeling that unites people into these mutually “belligerent collectivities.” Historians have focused on the conflicts that have emerged from this process, on the particular ways in which solidarity has given rise to antagonism toward groups of which one is not a member.

In his wide-ranging and readable new book, “The Undivided Past,” Cannadine shows that a very different story can be told. Rather than underscoring perennial conflicts between mutually exclusive human groups, Cannadine emphasizes how people find ways to get along, to cross borders and to maintain harmonious societies.

He begins with religion, which in many ways seems an unlikely vehicle for showing how people can get along. Given the missionary monotheism of Christianity, a religion that is both exclusive and proselytizing, the stage would seem set for stories of endless strife. But Cannadine finds many examples of pagans and Christians collaborating, and he shows how the “intermingling” of Catholics and Muslims in the late medieval period “transformed Europe’s intellectual landscape and made possible its twelfth-century Renaissance.” He knows well the bloody divisions that swept across Europe in the 16th and 17th centuries, but he insists that for most ordinary people, religious affiliation did not lead to violent conflict.

Rulers did try to generate national ties that would motivate their subjects to love (or fear) their monarchs and to fight against rival kings and queens. But intense feelings of nationalism, Cannadine shows, were a short-lived late-19th-century phenomenon, not some natural feeling that people are bound to experience. Class solidarity, too, was never the defining feature of identity for most people, even during the heyday of industrialization when Karl Marx and Friedrich Engels developed their theory that class conflict drove history. Sure, there are times when workers hate their bosses, but in any historical period there are “more important forms of human solidarity” than class. History is “more varied and complex than Marx or Engels would ever allow.”

Like any seasoned historian, Cannadine knows that one can always say of another’s account that “things are more complicated than that” as a prelude to offering one’s own. So it goes in “The Undivided Past.” He positions himself as an enemy of generalization, and so he can criticize Marxists for over-emphasizing class and feminists for over-emphasizing gender. Things are more complicated than that — not every worker has experienced oppression at the hands of the bourgeoisie, and not every woman has been stifled by patriarchy. Nations, though important for a brief period of history, were never monolithic entities inspiring universal devotion. Many French and English citizens, for example, had intense local allegiances that trumped national identity. When you look at history afresh while emphasizing different facts, customary generalizations can appear rather arbitrary. Things were more complicated than that.

Cannadine’s most pointed rhetoric comes toward the end, when he examines the idea of mutually antagonistic civilizations — an old notion recently popularized by the political scientist Samuel Huntington. Cannadine shows how “civilization” is intertwined with the idea of barbarism, and he quotes Montaigne’s quip: “Each man calls barbarism whatever is not his own practice.” Huntington saw the world as irrevocably divided into groups whose practices — religions, cultures and ways of life — prevented any meaningful integration. The claim is that Muslim, Eastern and Western civilizations, for example, are fated to be in conflict with one another, but Cannadine can find neither facts to back this up nor a coherent logic to the argument. He is biting in his criticism of neoconservative warmongers who draped themselves in Huntington’s pseudo-academic “findings.” “Of all collective forms of human identity,” Cannadine writes, “civilization is the most nebulous, and it is this very vagueness that makes it at once so appealing and so dangerous.”

Cannadine knows that writers and political leaders can all too easily generate solidarity on the basis of one element of a group’s identity in order to generate antagonism toward those who don’t share that element. His book is a reminder that generalizations based on the supreme importance of any one concept (be it race, class, gender, religion, nation or civilization) are likely to fall apart when closely examined. The “exaggerated insistence on the importance of confrontation and difference . . . misrepresents the nature of the human condition,” he concludes.

In closing, he writes briefly of the “just inheritance of what we have always shared” and urges us “to embrace and celebrate [our] common humanity.” But he doesn’t even try to provide evidence or arguments for what this common humanity might be. After all, that would be to make woolly generalizations; he knows things are more complicated than that.

 

Review of Nirenberg’s ANTI-JUDAISM

From Sunday’s Washington Post

Review of Anti-Judaism: The Western Tradition. By David Nirenberg. Norton. 610 pp. $35

 

Oh, the Protestants hate the Catholics,

And the Catholics hate the Protestants,

And the Hindus hate the Muslims,

And everybody hates the Jews.

So sang Tom Lehrer in his satirical song “National Brotherhood Week.” It’s no news that even those who preach “love thy neighbor” have often combined their striving for community with the hatred of a scapegoat, the Jews. David Nirenberg’s “Anti-Judaism” is a thorough, scholarly account of why, in the history of the West, Jews have been so easy to hate. And this story goes back a very long way.

Nirenberg returns to ancient Egypt to examine traditions that portray Jews as “enemies of Egyptian piety, sovereignty, and prosperity.” This was already old in the 7th century BCE! Ancient Greeks and Romans would have their Jews, too; they found use for an “anomalous” people who stuck together and followed their own rules, who were “neither disenfranchised nor citizen, neither conquered nor conquering, neither powerless nor free.” Over the centuries, when there was trouble in the kingdom, be it corruption or military threat, famine or political chaos, pagan ideologues developed a handy solution: Attack the Jews.

Jews were useful for those who were contending for power in the ancient world, and the Egyptian model of scapegoating was often repeated. But it was the Christians who refined anti-Judaism into a core theological and political ideology. Christianity had a particular problem: to show that it had overcome Judaism — overcome its adherence to the laws of the “old” testament, overcome its tribal particularity with evangelical universalism. The idea of Judaism — together with the fact that there were still people in the world who chose to remain Jews — was an affront to that universalism. “To the extent that Jews refused to surrender their ancestors, their lineage, and their scripture, they could become emblematic of the particular, of stubborn adherence to the conditions of the flesh, enemies of the spirit, and of God.”

Throughout the centuries theologians returned to this theme when they wanted either to stimulate religious enthusiasm or to quash some perceived heretical movement. Not that you needed any real Jews around to do this. You simply had to label your enemies as “Jews” or “Judaizing” to advance the purity of your cause. In the first through fourth centuries, Christians fighting Christians often labeled each other Jews as they struggled for supremacy. And proclaiming your hatred of the Jews became a tried and true way of showing how truly Christian you were. Centuries later, even Luther and Erasmus agreed that “if hatred of Jews makes the Christian, then we are all plenty Christian.”

Islam followed this same pattern of solidifying orthodoxy by stoking anti-Jewish fervor. Muhammad set Islam, like Christianity, firmly within an Abrahamic tradition, but that made it crucial to sever the new religion from any Judaizing possibilities. Rival Islamic groups, like rival forms of Christianity, often painted their adversaries as hypocritical Jews scheming to take the world away from spiritual truths essential for its true salvation.

Nirenberg shows how consistently the struggle for religious and political supremacy has been described as a struggle against the “Jews.” The quotation marks are especially important as his account moves beyond the medieval period, because between 1400 and 1600 Western Europe was more or less “a world free of Jews.” Banished from most countries, and existing only in the tiniest numbers through special exemptions, actual Jews were hardly ever seen. But it was in this period that “Christian Europe awoke haunted by the conviction that it was becoming Jewish.” In this period of cultural change and doctrinal and political disputes, patterns as old as the age of the pharaohs were reactivated: My adversaries must be extinguished for the polity to be purified; my adversaries must be Jews. And in early modern European eyes, the adversaries were especially dangerous if they were secret Jews who appeared to be Christian. Were Jews hiding everywhere?

Martin Luther brought this rhetoric to a fever pitch. In 1523 he accused the Roman Church of becoming “more ‘Jewish’ than the Jews,” and as he grew older he tried to convince his contemporaries that “so thoroughly hopeless, mean, poisonous, and bedeviled a thing are the Jews that for 1400 years they have been, and continue to be, our plague, pestilence, and all that is our misfortune.” Don’t believe in conversions, the aged Luther urged; the only way to baptize Jews was by tying millstones around their necks.

Nirenberg’s command of disparate sources and historical contexts is impressive. His account of the development of Christianity and Islam is scholarly yet readable. And his portrayal of the role that Judaism has played as a foil for the consolidation of religious and political groups is, for this Jewish reader, chilling. Nirenberg is not interested, as he repeatedly insists, in arguing that Christianity and Islam are “anti-Semitic.” Instead, he is concerned with tracing the work that the idea of Judaism does within Western culture. He shows that many of the important conceptual and aesthetic developments in that culture — from Saint John to Saint Augustine to Muhammad, from Shakespeare to Luther to Hegel — depend on denigrating Jews. That’s what’s so chilling: great cultural achievements built on patterns of scapegoating and hatred.

In the modern period, revolutionaries and counter-revolutionaries continued to employ “the Jewish problem” as something to be overcome. “How could that tiny minority convincingly come to represent for so many the evolving evils of the capitalist world order?” Nirenberg asks. He shows that for thousands of years the patterns of anti-Judaism have evolved to provide great thinkers and ordinary citizens with habits of thought to “make sense of their world.” He doesn’t say that these patterns caused the mechanized, genocidal Nazi war against the Jews in the 20th century, but he argues convincingly “that the Holocaust was inconceivable and is unexplainable without that deep history of thought.”

Presaging Tom Lehrer, Sigmund Freud in 1929 wrote ironically that Jews, by being objects of aggression, “have rendered most useful services to the civilizations of the countries that have been their hosts; but unfortunately all the massacres of the Jews in the Middle Ages did not suffice to make that period more peaceful and secure for their Christian fellows.” Even when “everybody hates the Jews,” patterns of intolerance and violence remain intact. Nirenberg offers his painful and important history so that we might recognize these patterns in hopes of not falling into them yet again.

THE STORY UNTIL NOW: Kit Reed’s radiant imagination still accelerating

The press sings the praises of Wesleyan writers on a regular basis. But this weekend the notices for Kit Reed’s recent collection of stories stopped me in my tracks.

From the first paragraph of the Wall Street Journal review:

The title of Kit Reed’s selection of her own short stories, The Story Until Now (Wesleyan University Press, 442 pages, $35), reminds us that although she has been writing award-winning fiction for some 50 years, she’s still accelerating. The scope of these 35 stories is immense, their variety unmatched.

And this from a recent Vanity Fair: “The Story Until Now unleashes new and classic stories fired by a radiant imagination.”


 

Kit has been inspiring students and readers for decades. A resident writer at Wesleyan, she stopped teaching full-time years ago, but she still works with students and publishes stories and novels that receive critical acclaim and a devoted readership. Speaking of devotion, the labyrinth at the south-east end of the CFA (behind the anthropology house) was installed by alumni to honor Kit and Joe Reed (Professor of English and American Studies, Emeritus). Stroll the labyrinth, and if you see Kit walking by on campus with her terrier, Killer, give her a high five. Better yet, pick up a copy of The Story Until Now. I bet she’ll be willing to sign it.

UPDATE: RAVE REVIEW IN NYT BOOK REVIEW:

 

 

Tuesday Update: Classes Resume and Why Does the World Exist?

Classes resume this morning (Tuesday) thanks to the extraordinary efforts of our Physical Plant and Stonehedge crews. I am so grateful to all those who kept us safe and well-fed (thanks Bon Appetit!) during the aftermath of Blizzard Nemo. It’s still messy outside, so please be careful.
 
Several hundred students from the Coursera version of The Modern and the Postmodern have checked out this blog recently. Welcome!
 
Recently the Washington Post asked me to review Jim Holt’s “Why Does the World Exist?” This review is cross-posted with Sunday’s newspaper.
WHY DOES THE WORLD EXIST? An Existential Detective Story. By Jim Holt. Liveright. 309 pp. $27.95

Jim Holt likes to pursue questions — big questions. And he does so with a sincerity and light-heartedness that draw his readers along for the ride. He’s written for the New Yorker on tough subjects such as string theory and infinity, but his last book was on the seemingly more accessible topic of jokes. In “Why Does the World Exist?” — a finalist for this year’s National Book Critics Circle Award in nonfiction — he takes on one of the biggest questions in conversations with philosophers and scientists: What is the origin of everything?

By helping readers understand what some very smart people think an answer to this question might look like, he introduces us to advanced mathematics, theology, physics, ontology and epistemology — just to name some subjects he visits. Holt is usually very good about not losing us along the way, even when the math or the logic gets pretty esoteric.

“The transition from Nothing to Something seems mysterious,” he writes, “because you never know what you’re going to get.” That might be true if one were asking as a disinterested party, but Holt is anything but that. The “Something” he has in mind is us — how did we and our world come to be? He wants to know how nothingness, a state in which absolutely no things exist, gave rise to a universe that includes all the things around us. “Conceptually,” he writes, “the question Why does the world exist? rhymes with the question Why do I exist?”

There are two major kinds of answers to these twinned questions. The first kind emphasizes the “how” — how a specific cause leads to a particular effect. Why am I here? Because my parents had sex. The second kind of answer moves from cause to meaning. Did my parents want a child? Do I have a purpose in life? What am I doing here? Some of the intellectuals with whom Holt talks sound as though they believe that if they thoroughly answer the “how” version of the question (the one that details causes), they will have answered the “why” version of the question (the one that provides meaning). Or perhaps they think that an airtight explanation of the emergence of causality will make the meaning question irrelevant.

There are some philosophers, it should be said, who think Holt is just asking the wrong question. Most interesting is philosopher of science Adolf Grunbaum, who cheerfully tries to show our author that his anxious astonishment with the existence of the universe is misplaced. Unexamined religious longing for mystery and a confused sense that we need to figure out why nothingness does not prevail generate a confused question with no rational response: “Go relax and enjoy yourself! Don’t worry about why there’s a world — it’s an ill-conceived question.” But Holt is only briefly deterred, declaring, “There is nothing I dislike more than premature intellectual closure.”

Holt travels in England, France and the United States to talk with some very thoughtful men about some very thorny issues. It’s always thoughtful men. Somehow he didn’t find any women to interview about creation, though at the end of the book he movingly describes his mother’s death. She, a believer, did not think she was passing into nothingness. Respectful, Holt has no closure on this, either.

How can the “first cause” not have a cause? How can one talk about anything prior to the Big Bang, if this event created time itself? What is the role of consciousness in the universe, and how is that related to simplicity, goodness, beauty? What if our universe is just one of many, many universes and big bangs are relatively frequent occurrences? These are the kinds of questions that drive Holt back and forth between mathematics and ethics. String theory “builds matter out of pure geometry,” while “Plato thought that the ethical requirement that a good universe exist was itself enough to create the universe.”

So why is there something rather than nothing? “There isn’t,” replies the brilliant and witty philosopher Robert Nozick. “There’s both.” Physicist Ed Tryon, on the other hand, wondered whether the universe was the product of a “quantum fluctuation,” offering “the modest proposal that our universe is simply one of those things which happen from time to time.”

Periodically our despairing guide describes himself as retreating to a cafe for a strong espresso or, even better, a restaurant where he can treat body and spirit with some good food and wine. Lucky readers may find themselves taking breaks to do the same. But it’s worth getting back in the hunt for answers (or just questions) with Holt.

There are many intellectually stirring moments in the book, and I learned more than I would have thought I could about contemporary controversies in quantum mechanics and cosmology. Holt is an excellent translator of complex ideas and issues. But the highlight of his book is his description of rushing home to help his dog Renzo, who was suffering from advanced cancer. Help in this case meant holding the long-haired dachshund for 10 days, and then stroking him while a vet administered a lethal injection. Holt tells us about a mind game he plays with prime numbers to steady himself “in moments of unbearable emotion.” He used the game at the veterinarian’s office. The next day he called a physicist to talk about why the world exists.

When Holt asks why the world exists, he is also asking whether there is any point to our being here. He is struck by the extraordinary contingency of our lives and of our world, and he seeks to address that contingency with theories about the emergence of time, of causality, of something. But contingency is not erased by causal accounts; it is just described in minute detail. Holt recognizes this when the somethings he cares about disappear. His real concern isn’t creation but extinction — why somethings turn into nothings. He knows the causal explanation, but that is not answering his question. Focusing on causes can be a mind game to help us deal with “moments of unbearable emotion.”

Why do we lose those we love? Why do important parts of our world vanish? These are not questions for a detective story, existential or not. But they are the questions to which, in the end, Holt’s wonderfully ambitious book leads us.

Hallucinations and Art: Two Book Reviews

From Sunday’s Washington Post:

HALLUCINATIONS By Oliver Sacks. Knopf. 326 pp. $26.95

As a young professor, I traveled to Vienna to visit a friend. Knowing that I’d written my first book on psychoanalysis and history, he sent me off to Freud’s old apartment and office, which had been converted to a museum. One rang a doorbell to be admitted, and I was shocked when the museum attendant greeted me by name. Surely, I thought, my old friend had called ahead to play a little joke on me. Again, the attendant spoke to me by name in German, calling me “Professor Doktor Roth” — or so I thought. My wife was right beside me, and she later told me that nothing of the kind had happened. The museum employee had merely told me the price of admission.

I was befuddled by this, and later as I searched in the museum’s library to see if it had a copy of my book, I realized that what I’d heard so clearly was probably an auditory hallucination. I so very much wanted to be recognized in the house of Freud that I’d perceived something that wasn’t there at all.

Most of the examples of hallucinations in Oliver Sacks’s graceful and informative new book do not have the transparent motivations of my episode in the Freud museum. Indeed, most of his examples don’t seem “motivated” at all; they have causes rather than meanings. That is, most of the occurrences seem to be products of neurological misfirings that can be traced to disease, drugs or various changes in neurochemistry. With some important exceptions, hallucinations don’t seem to reveal desires or intentions — the kinds of things that create meaning; they do reflect workings of the brain that cause us to see or hear things that are not really there. Parkinsonian disorders, epilepsy, Charles Bonnet syndrome, migraines and narcolepsy — drawing upon descriptions of these and other conditions by patients and doctors, Sacks explores the surprising ways in which our brains call up simulated realities that are almost indistinguishable from normal perceptions.

As is usually the case with the good doctor Sacks, we are prescribed no overarching theory or even a central argument to unite his various observations. Instead, we are the beneficiaries of his keen observational sense, deep clinical practice and wide-ranging reading in the history of neurology. This doctor cares deeply about his patients’ experiences — about their lives, not just about their diseases. Through his accounts we can imagine what it is like to find that our perceptions don’t hook on to reality — that our brains are constructing a world that nobody else can see, hear or touch.

Sacks has been fascinated by neurology since his student days (he is now almost 80), and he recounts his personal experiences with neurochemistry. He started experimenting with LSD in the 1950s, and when he was a medical resident living in Southern California’s Topanga Canyon in the 1960s, his drug use combined recreation with investigation. Opiates later upped the ante, and Sacks describes his interest and pleasure in altered states of consciousness. He recalls his hallucinations that drew heavily on Froissart and Shakespeare with neither pride nor shame. His perceptions weren’t based in reality, but could he still learn from them?

Sacks has long been an avid reader of the history of medicine, and he beautifully describes his intense, amphetamine-inflected readings of such 19th-century medical texts as the English physician Edward Liveing’s work on migraines. Drugs made reading seem more powerful, but as he came down from his high, Sacks realized that while under the influence of drugs he would never be able to write with the kind of sustained attention and care evident in the texts he admired. His epiphany was that he should follow his creative muse not through more powerful hallucinations but through the work of medicine and writing. “The joy I got from doing this was real — infinitely more substantial than the vapid mania of amphetamines.”

Over the past decades we have learned much more about how we see and hear with our brains — not just with our eyes and ears. Sacks describes how neurosurgeon Wilder Penfield was able to induce “experiential seizures” by tracking electrodes over the surface of an exposed temporal cortex during surgery. His patients seemed to experience vivid flashbacks, as if the electrical charge had catalyzed a memory into a perception. Vivid though they were, these recollections seemed to lack personal significance. More recent work has explored how the brain creates networks of recollection that allow us to access memories, even as we reshape the past while bringing it into consciousness.

Some hallucinations, Sacks writes, do seem connected to highly significant, emotionally charged memories. When deep in grief, for example, we are more likely to perceive our loved one, even though we know that person has died. Bereavement “causes a sudden hole in one’s life,” and a hallucination evinces a “painful longing for reality to be otherwise.”

At the end of “Hallucinations,” Sacks returns to phantom limbs, a subject he wrote about at length in “A Leg to Stand On.” Amputees report pain in limbs they no longer physically possess, the brain seeming to retain an image of the body that trumps physical reality. Physicians today help patients learn to use their phantom limbs, fitting them into prostheses so that they can use their hallucination of a body part to maneuver what no longer seems like an artificial limb.

Turning a phantom limb from something strange and painful into something one integrates with one’s sense of self is a medical and human triumph. Sacks has turned hallucinations from something bizarre and frightening into something that seems part of what it means to be a person. His book, too, is a medical and human triumph.

 

From Sunday’s Los Angeles Times

Glittering Images: A Journey Through Art From Egypt to Star Wars. By Camille Paglia. Pantheon: 202 pp., $30


In the 1990s Camille Paglia established herself as a cultural critic to be reckoned with. Her daring “Sexual Personae” enraged feminists, even as it presented a view of culture, sexuality and control that offered little comfort to conservatives hoping to convert even more Americans to the cult of conventionality. Chaos, Paglia emphasized, might be contained for a while, but it would always find its way back into our lives. And that wasn’t something to be lamented.

Paglia was a radical libertarian eager to puncture sanctimony wherever she found it — either in the progressive pieties of political correctness or in the hypocrisy of fundamentalist hucksters hacking away at other people’s pleasures.

She enjoyed a fight, or at least she recognized that fights made good copy and pumped up sales. She liked to throw around the word “Stalinist” and was herself compared both to a Nazi and to Phyllis Schlafly by prominent feminist authors. Paglia particularly enjoyed polemics against pretentious academics, reserving some of her nastiest and most amusing tirades for the followers of highfalutin French theory. This too was a guaranteed audience pleaser.

In the last decade we have seen a kinder and gentler Camille Paglia as she has moved from critical polemic to cultural appreciation. In “Break, Blow, Burn” she turned her attention to what she considered great poetry in English — from Shakespeare to Joni Mitchell. Taking a page (and perhaps a business plan) from her mentor Harold Bloom, Paglia wrote in that book that in “this time of foreboding about the future of Western culture, it is crucial to identify and preserve our finest artifacts.” She collected 43 mostly canonical poems and wrote a little about each in the hope the inspiration she found in them would be contagious.

“Glittering Images” continues this project — this time with brief discussions of 29 works of visual art. Whereas “Break, Blow, Burn” sought to help us hear again the strongest poetic voices, this volume wants to help readers “find focus” amid the “torrential stream of flickering images.”

Paglia’s goal is straightforward: By offering images of great artworks and helping us to give them sustained attention, she hopes that readers will “relearn how to see” with sustained pleasure and insight. Protesting against the intense animosity toward the arts she sees in American popular culture, Paglia wants her readers to recognize the deep feeling, craft and originality that went into the works she has chosen.

The range of art discussed is enormous, though there are few surprises in the Paglia canon. She begins with Nefertari’s tomb and offers a few pages on religion and politics in ancient Egypt and on Egyptology since Napoleon. The anonymous artisans who built the tomb “were faithful messengers of the cultural code,” linking profound cultural truths to elegant visual representation. Paglia’s sympathy for the intersection of religion and art serves her well in the early chapters of the book, as she discusses objects that were venerated for more than their aesthetic power.

Given her penchant for polemic, it was odd to discover that “Glittering Images” has no argument. Her brief discussions of the objects have the flavor of the textbook or Wikipedia, with occasional anachronistic comments linking them to present concerns. It’s probably a good thing that Paglia makes no attempt to sustain a narrative about art over the ages; instead she offers reflections on why she finds, say, Donatello’s Mary Magdalene so powerfully enigmatic, or why Bronzino’s mannerism has “a polished theatricality but an unsettling stasis.”

It would be silly to complain about the particular works that Paglia has chosen. They all repay vision and reflection, and that, after all, is her point. The critic sometimes seems to believe, with George Grosz, that “great art must be discernible to everyone,” and I suppose that’s why she concludes her survey with the limited imagination but visual virtuosity of George Lucas.

In her final chapter she writes as if popularity is a key sign of artistic greatness, though she knows that many of the artists she most admires were not at all part of the popular culture of their times. They often struggled to be seen, but that doesn’t mean that fame was their ultimate artistic goal.

I’m not sure why Paglia worries so that the fine arts today have lost touch with the masses, that they “are shrinking and receding everywhere in the world.” Sure, her favorite AM talk radio shows often make fun of artists. But people have been making fun of artists for a very long time. Meanwhile, contemporary photographers, painters, sculptors and videographers pursue their practice with intensity and patience, with craft and concept.

Toward the end of “Glittering Images,” Paglia writes with appropriate and infectious admiration about Eleanor Antin’s mail art project 100 Boots. Paglia notes that the “boots, like their creator, are outsiders, eternal migrants questing for knowledge and experience.”

Artists, questing outsiders, are still with us, still finding their way, making their way. Perhaps some of them will be inspired by the glittering images Camille Paglia offers here.

 

Thinking Food, Thinking Animals

This weekend Michael Strumpf from Bon Appetit and I signed the RealFood Commitment. Thanks to Manon Lefevre (who also signed the document) and her comrades in WESFRESH, I came to see that we can do more to bring more locally grown, healthy, humane and sustainably produced food to our campus. We all know that this isn’t a panacea: signing this commitment doesn’t solve all the problems with our food supply. But it is a step in the right direction – a step we were proud to take.

I’ve arrived pretty late at any consciousness at all about these issues, and my receptivity to the students in WESFRESH was due almost entirely to my wife Kari, whose work in animal studies has intersected with environmental issues in general and food production in particular. This has been an exciting week for us because the first copies were delivered of Kari’s new book Thinking Animals: Why Animal Studies Now?

The book examines “real and imagined confrontations between human and non-human animals,” and “unseats the comfortable assumptions of humanist thought and its species specific distinctions.” Kari started off as a professor of French and comparative literature, but over the last several years has been increasingly involved with the burgeoning field of animal studies. The College of Letters has been a great interdisciplinary home for this wide variety of interests. Philosopher Lori Gruen and Kari co-direct a research institute at Wesleyan sponsored by the Humane Society that begins just after graduation. Lori’s Ethics and Animals is a key text in applied ethics and animal studies.

At the signing of the RealFood commitment, we were serenaded by a wonderfully inventive band, Ratched and the Lunatics, led by singer-songwriter Raechel Rosen.

The other members of the band are fabulous Wes students Shourjya Sen, Dylan Awalt-Conley, Robert Don, Jacob Masters, Rachel Pradilla, and Annie Maxwell. Their wonderful music was powered by a group of energetic cyclists, pedaling energy into batteries and generators (thanks to the College of the Environment).

Wesleyan renews energy every day. Go Wes!

Creativity Works at Wes

What follows is a book review of “Imagine: How Creativity Works,” by Jonah Lehrer, that I published this weekend in The Washington Post. For years people have said that Wesleyan is a place for creative students, and recently we have tried to define more specifically how the work on our campus helps students develop their capacities for innovation.

A few days ago, applicants to Wes found out whether they have been invited to join the class of 2016. The competition for spots was very intense this year. With more than 10,000 applicants, most of whom are highly qualified, the process of putting together a class is increasingly difficult. We are looking for students who will thrive in the engaged, collaborative and imaginative campus culture here. Over the next four weeks many of the prospective pre-frosh will be visiting Wes, trying to determine if this will be their home and their launch pad for the next four years. The students who choose Wesleyan will likely be those who find that the dynamic student and faculty culture stimulates their own imaginative capacities. Creativity works at Wesleyan.

UPDATE:

Check out these recent articles on the student music scene at Wes:

http://www.usatodayeducate.com/staging/index.php/ccp/student-and-alumni-musicians-bring-wesleyan-wave-to-the-national-scene

http://www.billboard.biz/bbbiz/industry/backbeat/backbeat-fort-lean-rocks-santos-party-house-1006320752.story

 

Here’s the review, crossposted from washingtonpost.com:

Not many writers can make plausible links among musicians Bob Dylan, Yo-Yo Ma and David Byrne, animators at Pixar, neuroscientists at MIT, an amateur bartender in New York, entrepreneurs in Silicon Valley and Israeli army reservists. Not many reporters do research about an expert surfer who has Asperger’s, information theorists, industrial psychologists and artists. But Jonah Lehrer is such a writer-reporter, who weaves compelling and surprising connections based on detailed investigation and deep understanding. He says that working memory is an essential tool of the imagination, and his book is an excellent example of how a dynamic storehouse of captivating information feeds creative thinking and writing.

Lehrer begins with the story of a pop-culture breakthrough, the artistic reinvigoration that Dylan experienced when he wrote “Like a Rolling Stone.” Dylan was finishing a grueling tour schedule that had left him increasingly dissatisfied with making music. He decided to leave behind the madness of celebrity culture and the repetitive demands of pop performance. But once he was ensconced in Woodstock, N.Y., once he decided to stop trying to write songs, the great song came: “It’s like a ghost is writing a song,” he said. “It gives you the song and it goes away. You don’t know what it means.” Lehrer adds, “Once the ghost arrived, all Dylan wanted to do was get out of the way.”

Many of the stories that Lehrer recounts in the first few chapters stress the benefits of paying attention to internal mental processes that seem to come from out of the blue. We can learn to pay attention to our daydreams, to the thoughts or fantasies that seem nonsensical. Sometimes this attention must be very light, so that the stream of ideas and emotions flows, as when Ma feels his way into a new piece of music. Sometimes the attention must be very great, as when W.H. Auden (assisted by Benzedrine) focused on getting the words in a poem exactly right.

Lehrer explains some of the neuroscience behind these different modes of attentiveness. Making use of the power of the right hemisphere figures in, as does activating more energy from the prefrontal cortex to “direct the spotlight of attention.” He discusses experiments that explore which parts of the brain seem most active in different kinds of pursuits. For example, as the brain develops in childhood, the power to inhibit our flights of fancy grows. But as inhibition and focus increase, the capacity to improvise seems to diminish.

Lehrer notes that modern science has given new names to ideas that philosophers have been exploring for a very long time. Despite the fancy terminology, I found the anecdotes about scientific experiments less interesting than the anecdotes about poets, artists, surfers and inventors. That’s partly because the science stories seem to overreach, pretending to offer explanations for creativity by finding precise locations for the multitudinous connections that the brain generates. In an organ with the networking plasticity of the brain, location might not explain so much.

The last three chapters move from individuals to contexts. Lehrer offers fascinating accounts of why cities generate intense creative work and why certain urban-planning principles that emphasize heterogeneity (think Jane Jacobs) are so powerful. He shows us why teams that “are a mixture of the familiar and the unexpected,” such as those at Pixar, are the most innovative. Too much strangeness, and things fall apart. Too much closeness, and the generative spark is never struck.

Lehrer shows why brainstorming usually fails to result in real innovation because nobody is pushing back on bad ideas. “The only way to maximize creativity . . . is to encourage a candid discussion of mistakes. . . . We can only get it right when we talk about what we got wrong.” Or, as Lee Unkrich, a Pixar director, put it: “We just want to screw up as quickly as possible. We want to fail fast. And then we want to fix it. Together.”

Lehrer concludes with a discussion of why certain epochs seem to be more creative than others. Culture, he says, determines creative output, and it is through sharing information and making connections that we maximize that output. He quotes Harvard economist Edward Glaeser, who emphasizes that “even in this age of technology, we still get smart being around other smart people.”

Glaeser and Lehrer are showing why cities remain so important, but as the president of a university, I can also see how this applies to our campuses. Students and faculty seek the inspiration that is available all over campus, and that’s why so much learning happens outside the classroom. Sitting by yourself with your computer, even if you have access to thousands of Facebook “friends,” just isn’t the same as being in a creative, cosmopolitan culture in which new connections are continually (and surprisingly) formed.

“Imagine” doesn’t offer a prescription for how we are to become more imaginative, but it does emphasize some key ingredients of a creative culture: taking education seriously, increasing possibilities for human mixing and cultivating a willingness to take risks. Lehrer practices what he preaches, showing an appetite for learning, a determined effort to cross fields and disciplines, and a delight in exploring new possibilities. Reading his book exercises the imagination; the rest is up to us.


Summer Reading: Review of Saramago’s SMALL MEMORIES

This weekend the WASHINGTON POST ran my review of José Saramago’s posthumously published memoir. For me, summer is a time to catch up on reading that I can’t quite get to during the school year, although I also have to get a lot of writing done myself over the next couple of months. I enjoy reviewing books outside my scholarly field. I have to think about them more intensively than I would as a casual reader, and yet I do not have a scholarly investment in the reception of the work. I did not know Saramago’s work before I reviewed SMALL MEMORIES, but now I can understand why his achievements as a writer have seemed so remarkable to so many — especially in Europe. Discovering writers that matter to you is an intensely personal process, a process that began for me as an undergraduate at Wesleyan. Reviewing is one way for me to share that process.

What are the chances? That a child surrounded by illiteracy, shuffling between his family’s new life in Lisbon and their roots in the countryside, will have such an intense appetite for words that he relishes pages from discarded newspapers, seizes on fragments of Molière in a guidebook, and will one day create parallel worlds in which an entire nation goes blind, in which Jesus apologizes for God’s sins, in which death suddenly stops occurring. These worlds, fantastic as they are, turn out to be uncomfortably like our own.

What are the chances? That a writer whose early efforts were greeted with harsh criticism (or mere silence) leaves the literary world behind to concentrate on journalism, returns in his 50s to pen novels that capture the imagination of European writers and critics, is celebrated for political bravery and artistic originality and crowned with the Nobel Prize for literature.

José Saramago (1922-2010) was this child, this writer, and in “Small Memories” he has provided us with a collection of memories of his childhood and adolescence. The recollections don’t follow a linear path but instead touch lightly on lives framed by poverty and frequent brutality. But in Saramago’s retrospective imagination, these are also lives infused with dignity, affection and deep connection. The author knows the tricks that memory can play, and on some matters he has taken great pains to test his recollections against recorded facts. Saramago is fascinated by the vagaries of remembrance, at one point wondering if certain memories he had were really his.

Although his parents moved to Lisbon when he was just 18 months old (his father was to be a policeman), José continued to shuffle between Portugal’s capital and Azinhaga, his native village. The village was the “cradle in which my gestation was completed, the pouch into which the small marsupial withdrew to make what he alone could make, for good or possibly ill, of his silent, secret, solitary self.” The reader is introduced to various family members: a father consumed by jealous rage; grandparents who are hardened, stoic workers but who keep the weakest of their piglets warm by bringing them into their bed for a few nights. The author’s mother is long-suffering, but she is also the young woman who on passing through a doorway forgets she is carrying a jug of water on her head because she has just received a proposal from her future husband. “You might say that my life began there too,” Saramago writes, “with a broken water jug.”

After relating this incident of the broken jug, Saramago tells the reader that his older brother, Francisco, died at age 4 in the spring of 1924, some months after his mother brought them to Lisbon. The author wonders about his memory of his brother, the “happy, sturdy, perfect little boy, who, it would seem, cannot wait for his body to grow and for his arms to be long enough to reach something.” “It’s the summer or perhaps the autumn of the year Francisco is going to die,” Saramago writes, adding it’s “my earliest memory. And it may well be false.”

I was unprepared for the piercing sadness of this hazy recollection, steeped in sorrow but told in the same calm, matter-of-fact style as Saramago’s other childhood recollections. From the loss of his older brother we are led to a memory with a “fierce and violent truth”: Saramago’s brutal encounter with a pack of older boys who, holding him down, thrust a metal wire into his urethra. The horror and sadness of the wounded little boy, blood streaming from his penis, is startling in the context of the quiet charms of the volume as a whole. Francisco is dead; little José has no one to protect him. The physical wounds will heal, but the longing for the missing brother — and a concern for those who are vulnerable to all sorts of brutality — will always remain.

Shortly after relating this incident, Saramago recalls his older friend the “prodigious shoemaker,” also named Francisco, who asked the young author-to-be if he believed there were other worlds, where other possibilities were realized. When Saramago first decided to write a memoir, he tells us that he knew he would want to write of his brother. Bringing the forgotten back through words is the writer’s alchemy, his power to create when faced with the harshness of the world.

Saramago, a poet, journalist and diarist in addition to being an acclaimed novelist, knew that words mattered a great deal — that they can even point to one’s destiny. The writer’s paternal family name, for example, was de Sousa, and the author tells us it was a town clerk’s joke to register his surname as Saramago — the name for a wild radish eaten by the poor in harsh times. The boy grew into his name, taming his wildness but always remaining faithful to his roots in poverty. “Small Memories” is an expression of that fidelity, a small but nourishing last gift from a great writer.

Cross-posted from washingtonpost.com