The Russian composer Igor Stravinsky (1882–1971) was one of the most influential composers of the twentieth century. His career spanned from the early twentieth century, when he composed ballets inspired by Russian myth and the era's revived interest in distinctly Russian culture, to the experimentation in compositional styles that followed the Second World War. Though born in the nineteenth century, he lived and worked long enough to see his works inspire progressive rock music, just as he himself had been inspired by earlier masters like Bach and Tchaikovsky. His importance in the history of music is unquestionable.
John Heiss is an active composer, conductor, flutist, and teacher. He is the Director of the Contemporary Ensemble at New England Conservatory, where he teaches in the flute, chamber music, composition, music history, and music theory departments. He joins us on Culture Insight to share his perspective on the life and work of Igor Stravinsky.


David M. Bergeron, professor of English at the University of Kansas, and Geraldo U. de Sousa, assistant professor of English at Xavier University in Cincinnati, have succeeded in the monumental task of revising the 1976 edition of Shakespeare: A Study and Research Guide. Their detailed and updated second edition adds an in-depth analysis of the books published in the eleven years since the first edition, along with new sections on feminism and gender studies as well as poststructuralism and historicism.

This guide is effectively organized for the Shakespeare student and scholar alike. The first chapter explains the many critical approaches to Shakespeare studies, while the second covers and evaluates the books and journals written on the Bard. The third chapter then offers an approach, ample tips, and a guide to writing a critical paper.

Chapter One, “The Subject in Context,” starts with an explanation for the lack of criticism in the years following William Shakespeare’s death. The 16th century did not value drama as a literary form and “no well-established literary criticism existed” then (2). The 17th century saw John Dryden’s writing as a critic and Ben Jonson elevating drama as a literary form. The authors also trace the development of Shakespeare criticism through the 18th century with critics like Samuel Johnson, the 19th century with A. C. Bradley, and the 20th century with E. M. W. Tillyard, among others. The 20th century saw the publication of numerous journals like the Shakespeare Quarterly, Shakespeare Survey, Shakespeare Newsletter, and Shakespeare Studies. This history will benefit students needing an overview of the development of Shakespeare criticism through the centuries. Next, the authors give thorough definitions of the critical approaches: historical, theatrical, and genre criticism, along with criticism that focuses on the analysis of language and imagery and the study of character, which most beginning students find doable. Psychological criticism, thematic and mythic criticism, feminist and gender criticism, and textual criticism are then detailed by the authors in a way that would clarify these approaches for any level of Shakespeare student. This chapter ends with major critics of Shakespeare, from Samuel Johnson and Samuel Taylor Coleridge to Fredson Bowers and Charlton Hinman. A flaw of this edition is that it was published in 1987, so it does not include contemporary critics or critical approaches.

Chapter Two, “A Guide to the Resources,” gives “a selection of some of the most valuable and useful books,” presented under the categories of bibliographies and reference guides (an excellent resource for students writing scholarly papers); editions; studies in the genres of tragedy, comedy, histories, and sonnets; studies of groups and movements; interdisciplinary studies; periodicals; and biographical studies (25). In this chapter, the authors compile an impressive list of books, along with their approaches, that will benefit any level of Shakespeare scholar. The organization of the chapter is also effective, detailing thoroughly researched editions of the complete works of William Shakespeare as well as works representing the many varieties of critical approaches. A flaw is that this edition, though a fine revision of the original, again does not include contemporary editions, critical works, and periodicals.

Chapter Three, “The Research Paper,” begins by acknowledging the overwhelming task a student can face in writing about Shakespeare, though, as Bergeron and de Sousa explain, they have given more than ample resources in the previous chapters. However, the abundance of critical works can prove equally overwhelming. They bring up the common student complaint “that there is nothing new to be discovered or said; probably every generation of students has reached this conclusion, believing quite wrongly, that all Shakespearean materials have been discovered and that the final critical word has been uttered” (165). The authors reassure students that original ideas are not required, only a formal investigative essay with a rationally argued thesis. It is an exercise in logic, critical thinking, and sound reasoning, but it “can be fun” (166). Bergeron and de Sousa rightly argue that their study and research guide is a valuable resource for students looking for topics or sources for a paper. They further suggest that students consult composition and rhetoric guides for advice on the research paper, as well as the MLA Handbook for Writers of Research Papers, which has had several new editions since this study and research guide was published. The chapter then breaks down the entire process of writing a research paper, from selecting a topic to preparing the works cited page. The authors suggest that “to oversimplify, three broad areas exist for research topics in Shakespeare: his life and times, the plays or poems themselves, and textual problems” (167). This explanation seems to disregard the previous discussions of the many critical approaches. They then correctly argue that “the clearer the topic, the more efficient the research can be” (168).
Because this book was published before the advent of library internet databases, it suggests that the student use The Reader’s Encyclopedia, which can be a resource for information and sources but is not as comprehensive as a good library database. The authors also provide tips on compiling a working bibliography, taking notes from research, outlining the paper, writing the paper, preparing documentation and the works cited page, and avoiding plagiarism. Finally, they provide a model research paper written by a college undergraduate.

Shakespeare: A Study and Research Guide by David M. Bergeron and Geraldo U. de Sousa is a thorough and detailed guide for the student of William Shakespeare. It painstakingly catalogs critical editions and journals that students can now easily find and access through their university library databases. The scholar, meanwhile, can benefit from the discussion of critical approaches and the history of criticism through the centuries. Though dated, this study and research guide is well written and researched and will benefit any student interested in learning more about Shakespeare, the history of criticism on him, and the many critical approaches. Many of the critical editions remain valuable resources, too, because great literary criticism does not become dated and can stand the test of time, even centuries.


The author of well-received and best-selling books about Franklin D. Roosevelt and Barack Obama, and a seasoned political analyst on MSNBC and earlier at Newsweek, Jonathan Alter brings a well-informed and fresh perspective to the thirty-ninth President of the United States, taking advantage of new archival sources, including declassified documents available through the CIA Records Search Tool at the Jimmy Carter Presidential Library and Museum. Alter notes that he received the “extensive cooperation of Jimmy Carter and eighteen members of his family,” including Rosalynn Carter, who shared her husband’s love letters. She is a major character in this biography, which is the story of a loving marriage (not without its tensions) and a political partnership. Rosalynn usually was, as the cliché now has it, “in the room,” serving her husband as a chief advisor. Her work on improving access to mental health therapy anticipated the later efforts of First Ladies, especially Michelle Obama. It is surprising to note that no full-fledged biography of Carter preceded Alter’s, although another Carter biography is on the way from Kai Bird. Something is afoot in the world of politics and biography: Jimmy Carter, dismissed by political elites, the populace, and even former presidents as a failed leader, has slowly been redeemed, rising in the ranks of historians from the bottom third to the top third of American presidents. The conventional explanation for this turnabout is Carter’s spectacular post-presidency, with achievements in eradicating disease, fostering democracy around the world, and working at home with Habitat for Humanity. All good works, Alter acknowledges, but not nearly as consequential as what Carter accomplished as President. Alter makes a compelling case for Carter as the most important political figure in the environmental movement since Theodore Roosevelt.
In the final days of his presidency, for example, he managed to preserve something like a third of Alaska as an environmental preserve that oil companies and other exploiters of the land could not despoil. When Republican senator Ted Stevens tried to weaken Carter’s Alaska initiative, bringing to the White House maps and arguments that he thought would release more land to private enterprise, he came away disappointed, exclaiming that Carter knew more about Stevens’s state than he did. That was typical Carter: an engineer, voracious reader, and stickler for detail, all characteristics that made him an impressive, but also vulnerable, president. It is not that Carter knew too much; rather, in Alter’s telling, Carter too often turned off both Democrats and Republicans with a know-it-all stance that was all the more infuriating because he did seem to know it all. But Carter was hardly a one-issue president. Putting solar panels on the White House roof, for example, was not a stunt but part of Carter’s realization that to save the planet as well as the American economy, the country would need to develop alternative forms of energy. That President Reagan removed those panels and ignored environmental concerns set the nation back precious decades in combating climate change, a setback the government has yet to reverse. American presidents from Clinton to Obama tried to catch up to Carter’s vision of renewable energy before President Trump halted their progress toward a cleaner world. Carter transformed the American judiciary by appointing many more women and people of color, and he worked closely with African American leaders like Martin Luther King, Sr. for racial and social justice. In this he was rectifying his own reticence during the 1950s and 1960s, when he did not speak out against segregation and even welcomed the support of populist racists like George Wallace.
In his campaign for governor of Georgia, Carter remained largely silent on the issue of race, although in office he began the long drive to reconcile racial divides. In fact, he acted so boldly that, by his own admission, he could not have been re-elected governor of Georgia. Alter shows that Carter had superb insights into how to get elected, pioneering the Washington-outsider ploy that so many candidates have used since his successful election to the presidency in 1976. So what went wrong? Alter astutely shows that Carter, as an outsider, never bothered to cultivate a constituency within the Democratic Party. He expected to win because he was on the right side of the issues. He despised the give-and-take of congressional deal-making and almost always took the moral high ground, thus alienating even allies, who found him sanctimonious. Of course, Carter also came to grief because of the takeover of the U.S. embassy in Iran. Alter shows that neither Carter nor his staff understood the changing political climate of Iran; faulty intelligence contributed to their ignorance and misjudgment of Iran’s leaders. The ensuing drama of the American hostages’ more than year-long captivity, which made Ted Koppel a star on the ABC evening program Nightline, transformed the crisis into a daily indignity that frustrated and angered the American electorate. Even worse, the disastrous effort to rescue the hostages, ending in American fatalities in the desert before the rescue team could even reach the embassy, branded Carter as an unlucky and ineffectual president.
It did not help that the country suffered high inflation and went into a recession from which it did not recover in time to withstand Ronald Reagan’s challenging words: “Are you better off now than you were four years ago?” Not even the historic peace agreement between Egypt and Israel could save Carter from defeat, even though all sides publicly proclaimed his indispensable role. The courageous Carter never hesitated to put his reputation at risk by personally negotiating with Egypt’s Anwar Sadat and Israel’s Menachem Begin. Foreign policy achievements, including Carter’s bold moves to open up China that went well beyond President Nixon’s policies, did not earn him enough credit for a victory that opened new markets for the United States and brought China more fully into the world order. And of course, his opponent, the taller, telegenic, and well-spoken Ronald Reagan, made Carter look even smaller in their one debate, dispelling worries about the hard-line Californian, who came off as so genial he made Americans feel good about themselves. By contrast, Carter’s speeches calling for sacrifice were turned against him, as though he were lecturing the American people about their profligacy. To say only this much about Alter’s biography is misleading, since this book is about more than politics. It is also about a man forever learning: to paint, to write poetry, to author many books, personal and political, and to travel tirelessly on behalf of peace, sometimes to the consternation of presidents who saw him as interfering with State Department policy and burnishing his reputation to the detriment of the very peace process Carter supported. And yet the very idea of free and fair elections, with Carter on the ground supervising in person across the globe, reinforced the moral standing of the United States and continued the human rights foreign policy Carter had promulgated as president.
Coming to the close of his narrative, Alter has a paragraph that sums up both the force of Carter’s life and of this biography:
For nearly a century, he had already lived again and again and again—constantly reimagining himself and what was possible for a barefoot boy from southwest Georgia with a moral imagination and a driving ambition to live his faith. That passionate commitment—sustained long after others of his generation had left the field—would take him to the farthest corners of the earth, but he always came full circle to Plains, where his inner and outer selves could find repose.
Now you see the power of Alter’s title. Carter has always given America and the world “his very best,” showing what it can mean to be an American. But Carter’s story also seems biblical: He is our Methuselah.


Most books about the famed French philosopher René Descartes center on the man’s reputation as the most influential and controversial figure of his time. Indeed, some 350 years after Descartes’ death, he is still the subject of numerous books and scholarly articles. But in Russell Shorto’s book, Descartes’ Bones, the reader is treated to more than a dry and lifeless compendium of facts. Shorto, whose other works include The Island at the Center of the World, Gospel Truth, and Saints and Madmen, has written an intensely readable and enjoyable book that breathes new life into the long-departed philosopher. How does Shorto accomplish this feat? Much like a riveting detective story, à la Sherlock Holmes, Hercule Poirot, or, popular today, any of the CSI spinoffs on television, Shorto creates a mystery surrounding the bones and skull of Descartes. He begins, as all good detective stories do, with a death: in this case, that of Descartes. What follows is aptly described in the subtitle: A Skeletal History of the Conflict between Faith and Reason. So it is that after a beguiling preface outlining Shorto’s reasons for undertaking his years-long search for the truth, Chapter One, entitled “The Man Who Died,” gives a fascinating account of Descartes’ final days and ultimate death and burial in a cold and lonely cemetery in southern Stockholm, Sweden. Chapter Two, “Banquet of Bones,” relates how the remains of Descartes, a Frenchman, were finally “translated” to Paris by the French ambassador, Hugues de Terlon, in 1666, some sixteen years after Descartes died. But this was just the beginning of the story. In Chapter Three, “Unholy Relics,” Shorto describes how the philosopher’s bones moldered in France for the next century until a man named Alexandre Lenoir, amid the furor of the French Revolution, dug them up and buried them in his “garden filled with historic tombs” (p. 117).
In 1819, the third burial of the philosopher’s bones took place at the church of St.-Germain-des-Prés. Then there is the mystery of “The Misplaced Head,” the title of Chapter Four. Shorto expounds on how fitting a metaphor for modernity it is that during all the burials and reburials, “the head somehow got separated from the body” (p. 129), and would remain a mystery for years to come. A skull, reported to be that of Descartes, touched off a furious round of claims, counterclaims, treatises, and debate. Learned minds argued for and against the authenticity of the purported missing head of Descartes and, in the end, “they gave the head a nod” (p. 165). In Chapter Five, “Cranial Capacity,” Shorto examines the somewhat dubious inquiry, scientific and otherwise, into whether brain size equals greatness of mind. That is to say, he explains, rather wryly, how this tempest in a teapot came to be and what came from it. Descartes, by all accounts and portraits, was a small man whose brain must also have been small. How could such a great thinker have such a small brain? Steadily, over the next 100 years, the raging debate over Descartes’ skull fed into the methodologies that sought to link intelligence and race. As Shorto puts it, “In a small way, Descartes’ skull helped debunk bad science” (p. 203). Chapter Six, “Habeas Corpus,” concerns itself with these pitiful remains—were they even the actual bones of Descartes? Technical and artistic analysis of busts of Descartes versus paintings was at the forefront of yet more heated and spirited debate. After all the controversy, Shorto declares, “As for the body, the trail ends abruptly, veering sharply into oblivion. And that is perhaps as it should be. Dust to dust. In secula seculorum” (p. 231).
In Chapter Seven, “A Modern Face,” Shorto describes how, in 1995, Hiroshi Harashima, a Japanese telecommunications engineer, and Hisao Baba, an anthropologist and anatomist, working on an exhibit on facial recognition, sought to apply their skills “to the skull of Descartes to give the facial exhibition a face” (p. 235). In the Epilogue, Shorto recounts a Mass performed by Father Jean-Robert Armogathe at the church of St.-Germain-des-Prés in Paris for the “eternal rest of the philosopher’s soul” (p. 247). Let us hope that the famous philosopher is indeed at rest after all these years.


Paul Addison’s Churchill: The Unexpected Hero builds on his earlier Oxford Dictionary of National Biography entry in presenting a more detailed analysis of Winston Churchill’s character and career. Currently an Honorary Fellow at Edinburgh University, Addison is an established authority on British politics and society during and after the Second World War. The publication date of 2005 is significant, for three years earlier a BBC poll voted Churchill the “Greatest Briton” of all time. Indeed, since his death in 1965, patriotic fervor and enthusiasm for Churchill have seldom abated. The dominant image of Churchill as “saviour of his country” propagated by Isaiah Berlin and A.J.P. Taylor has often led historians to absolve, overlook, minimize, or underplay his personal and political shortcomings and failures.

Even modern biographers such as Martin Gilbert and Paul Johnson are often guilty of being overly charitable towards Churchill. More often than not, these shortcomings do not result from a willful misreading of history but derive from a pervasive sense that, as a ‘Great Man,’ Churchill was not accountable for his actions in quite the same way as other politicians. Conversely, revisionist reaction against the view of Churchill as a visionary and great statesman is often excessive and misguided. In blaming Churchill for the post-war collapse of the British Empire and the attendant erosion of Britain’s ‘Great Power’ status, historians such as John Charmley and David Irving are guilty of reading history backwards. As the great historian Frederic William Maitland stated, “we should always be aware that what now lies in the past once lay in the future.” Churchill himself was well aware of the ideological vicissitudes of historical reputations, noting:

“Historians are apt to judge war ministers less by the victories achieved under their direction than by the political results which flowed from them. Judged by that standard, I am not sure that I shall be held to have done very well.” (pp. 243-4)

Churchill’s long political career, from the imperial grandeur of the late-nineteenth century to the gritty but ultimately affluent society of the post-1945 era, encompassed social, economic, and political changes of great magnitude. As he was intimately involved with most of the important political issues over the course of his life, any short biography must cover a vast range of topics, and many issues are inevitably treated rather sparingly. Nevertheless, Addison makes a convincing case for his central thesis that Churchill’s many political failures before 1939 made him an “unexpected hero.” Addison provides a solid body of evidence demonstrating that Churchill was a very divisive figure, whose recklessness and impulsiveness led even many of his admirers to lace their praise with criticism. For former British prime minister H.H. Asquith, Churchill was a “wonderful creature with a curious dash of schoolboy simplicity” and “a zigzag streak of lightning in the brain” (p. 74). More dryly, Neville Chamberlain noted Churchill’s unorthodox approach to policy-making:

“In the consideration of affairs his decisions are never founded on exact knowledge, nor even on careful or prolonged consideration of the pros and cons. He seeks instinctively for the large and preferably the novel idea such as is capable of representation by the broadest brush.” (p. 123)         

These character traits, redolent of an aristocratic, aloof, and somewhat superficial attitude to politics, were identified and criticized by many intellectuals, politicians, and historians during Churchill’s lifetime. Among them, John Maynard Keynes, H.G. Wells, and Evelyn Waugh launched vitriolic attacks with coruscating wrath and wit. Churchill’s conversion from the Conservative to the Liberal Party, and then back again, was for some testimony to his being “above party” and to his independence of mind and originality of thinking. For others, it was indicative of his deception, opportunism, and lack of principle. Churchill’s judgment and trustworthiness were often called into question, and his arrogant, dictatorial approach alienated many within the military and political elite and led to many costly misjudgments and mistakes.

The heroic and visionary aspects of Churchill’s career, his ambition, exuberance, energy, wit, and rhetorical brilliance, are writ large throughout the book, and the literary appeal of Churchill’s experiences and adventures as a war correspondent in far-flung imperial outposts is hard to resist. Other biographers, such as Andrew Roberts, Henry Pelling, and Roy Jenkins have traveled a similar path. However, in recounting Churchill’s faults and failures as well as his qualities and successes, Addison effectively cuts through the morass of propaganda, bias, and historical inaccuracy to provide a more nuanced view of Churchill. Fully aware of Churchill’s prolific use of oratory, journalism, and historical writing in serving his political purposes, Addison gives particular attention to the dichotomy between Churchill as the bold and tenacious Minister, Servant of Crown and people, and the epitome of “British” values, and Churchill the self-serving, amateur, and often ill-informed politician and strategist.

Ultimately, like other biographers, Addison assesses Churchill on his record as a wartime leader. It is a record that is far from unblemished. Addison recounts Churchill’s lack of strategic knowledge and negligible understanding of the relative importance of different theatres of war. Churchill also vastly overestimated his influence on Roosevelt and Stalin, a diplomatic error of judgment that owed much to overstating Britain’s global position. Most seriously, Addison recounts the post-war historical dismantling of the version of events Churchill had presented, via carefully selected documents, in his six-volume The Second World War. In those volumes, there was no mention of Cabinet discussions for a compromise peace with Nazi Germany, nor was the strategic bombing of German cities fully revealed and explained. Political differences between the United States and Britain were underplayed, and foreign policy differences between Churchill and his predecessors exaggerated.

Historical research has altered many misconceptions and misinterpretations, but the most powerful early blow against Churchill was the publication of the Alanbrooke diaries between 1957 and 1959 which destroyed the “façade of statesmanship” Churchill had erected (p. 242). Criticism of Churchill (and corresponding praise for Chamberlain) has become more voluble over time, with the hagiography of the immediate post-war period superseded by a far more rigorous research agenda. The decisions made by Churchill, how he conducted the war, and whether there were any viable alternatives to war, are now all legitimate historical questions, and subject to scrutiny and debate. Churchill’s legacy has never been more highly contested. Yet, as Addison convincingly argues, identification and acknowledgment of Churchill’s failures does not necessarily negate or detract from his successes but by contributing towards a fuller and more accurate portrayal of the man, leads us closer to historical truth.

Given his importance to later revisionist accounts of Churchill, it is somewhat ironic that in 1943 Viscount Alanbrooke doubted whether any future historian would be able “to paint Winston in his true colours” (p. 183). Addison’s knowledge of the man leaves him well placed to perform this task, for his thorough reading of Churchillian historiography is fully apparent, and excellent scholarship combines with lively, precise, and engaging prose to produce a very fine short biography. With a competent blend of narrative and analysis, the book is succinct, analytical, and informative, and most useful as a primer for those relatively unversed in Churchill’s life. Addison’s even-handed approach does not conceal or understate Churchill’s character flaws and political mistakes, and he wisely suggests that “overpowering egotism was the source of his greatest achievements as well as his biggest failures” (p. 81). Judicious to the last, Addison commendably concludes by viewing Churchill as “a hero with feet of clay” (p. 254), which is surely a fitting status for a politically cynical age.


Despite the recent biography of Martha Freud by Katja Behling, and the numerous books about her husband, Sigmund Freud, little has been written about how Mrs. Freud (1861–1951) may have felt about her life with the founder of psychoanalysis. Nicolle Rosen’s Mrs. Freud: A Novel offers a fictional account, imagining Martha as an eighty-five-year-old widow given the opportunity to revisit and analyze her life. Rosen, a Paris-based psychiatrist and novelist, draws on archival documents, photographs, and existing biographies, as well as her own imagination, revealing Martha through a series of journal entries and letters to a fictional character, Mary Huntington-Smith. A writer with a background in psychoanalysis, Huntington-Smith carefully guides Martha out of the shadows, luring her with questions about her husband, her children, and her childhood. Rosen’s choice to write this novel in epistolary form was a wise one. Mrs. Freud begins her correspondence formally and reluctantly, but over the course of the novel her personality takes shape as she relates memories of her courtship and marriage to Freud, the complexities of her relationships with her sister, Minna, and her daughter, Anna, and her upbringing in a prominent German Jewish family. Martha often rambles and digresses, but her revelations are deeply personal. Through these privileged letters and journal entries, she confesses why she was content to be a housewife, even as she yearned for intellectual stimulation. Martha suggests that she was stymied not only by the limited educational and professional possibilities for a woman in her day, but also by the expectation to follow convention and become a wife and mother. “In rich milieus, it was actually a stigma for a woman to work. That a woman’s fate was preordained in those days didn’t seem to trouble me. I sincerely aspired to follow my mother’s example.” Although acquiescent, Martha confides that her marriage to Sigmund was frustrating.
At the start, and throughout their lengthy courtship, he was an intensely romantic man. Soon after the wedding, however, and to her disappointment, he traded passion for affection, focusing primarily on work. For Sigmund, Martha was not an intellectual companion but rather a wife, a mother, and a calming presence. “Well before our marriage,” she writes, “he had made it clear that the management of our home was my basic, if not sole preoccupation.” It is only after the death of her husband that Martha comfortably breaks from her role of caretaker and openly indulges her intellectual curiosity, avidly reading literature as well as her husband’s works. Over the course of the letters and entries, Martha often revisits the same memories, and in doing so sometimes contradicts herself. These contradictions, however, do not hinder the narration of her story; it may easily be argued that Martha’s recollections—like our own—change over time, upon reflection, and depending on the context. While the overly formal diction mirrors Martha’s traditional behavior, the unnatural language works least well in Rosen’s detailed passages about life in nineteenth-century Vienna and Sigmund’s battle with cancer. Without an evaluation of the French edition, however, it is unclear whether this is simply the product of a less-than-perfect translation into English. Overall, Rosen has done an admirable job, and Mrs. Freud is a compelling, intimate tale with wide appeal, particularly recommended for those who are fascinated by Freud and psychoanalysis and who enjoy the epistolary format.


Many of us know the story of how Charles Darwin (1809–1882) took a five-year voyage aboard the Beagle to the Galápagos, where he found a variety of finches equipped with different beaks—creatures that had adapted to living on different islands with beaks that resembled tools. He thought they were different species of birds, but an expert back in England revealed that they all belonged to the same family—the result of what Darwin would later call natural selection, one of the key mechanisms of his evolutionary theory. The story is just as famous as the one about an apple hitting Isaac Newton on the head or Ben Franklin attaching a key to a kite to collect electricity. For some reason, we tend to see the most prominent scientists as acting alone—Newton, for instance, was an eccentric who thought up calculus while secluded at his family home during the Great Plague. Darwin was a loner too, studying quietly in his quarters aboard the Beagle when he wasn’t sketching birds and tortoises he encountered on the islands. They always seem to be working in near isolation while tearing down the conventional perception of the world. Of course, as Darwin scholar Sandra Herbert explores in her book Charles Darwin, Geologist, this is all a bit far from the truth. While biographies today have to reveal a new and surprising facet of a person’s character if they want to appear on bookstore shelves, you might think of Herbert’s book as a biography of Darwin’s career in geology, rather than one of the man himself. It also includes a few good insights on the history of geology along the way. Few people know that Darwin was not only trained in identifying minerals but also reluctantly accepted a position as secretary of the Geological Society of London in 1838, which he held for three years (his excuse was that he was too busy writing the third volume of his travels in South America). 
It may also surprise you to know that the Beagle boasted a rather impressive library of books on geology (then a young science), which Darwin borrowed from and read voraciously in several languages. Thus, he was far from alone as he cultivated his ideas. Geology, the study of the Earth’s crust, had only recently branched off from mineralogy (the study of minerals), which in turn had grown out of chemistry. In the years following his voyage, Darwin drew intricate maps of coral reefs on the ocean floor. He also proposed theories about the origin of coal from prehistoric seaweed, which he discussed in continuous correspondence with fellow geologists who often disagreed with him. While biographies tend to record the experiences of their subjects and how those events shaped their work and personality, Herbert is much more interested in how Darwin’s career led him to publish Origin of Species. This book was, after all, the culmination of his life’s work, which introduced the theory of evolution to the world and influenced fields as diverse as medicine and computer programming, aside from being the unifying theory of biology and zoology. Along the way, Darwin was inspired by many people, both contemporary and long gone. The French naturalist Georges Cuvier, for example, first introduced the idea that species become extinct, but it was Darwin who found the reason extinction is both inevitable and critical for the continuation of life. The study of climate change, which many assume to be a new science, was actually current in Darwin’s day and gaining traction as a theory for explaining mass extinctions. While we often think of Darwin as a dreaming idealist (a description perhaps more accurate of his grandfather, Erasmus), Herbert shows a man who was primarily concerned with evidence while laboring away on his theory. 
He held geological evidence to be of greater importance than the work of biologists at the time, as geological strata and fossilized specimens revealed a chronology of developing life on Earth. Darwin was also an enthusiast of fossils and often consulted English biologist Richard Owen (who famously coined the word “dinosaur”) for his expertise in the days before paleontology became its own branch of science, independent of geology. While thinking outside the norm and breaking with convention are still admirable qualities, as is remaining skeptical, Herbert’s work shows that people are almost never alone when working within the scientific community. It’s a process that requires frequent collaboration (and often disagreement) between experts from a variety of fields—a continuous pursuit of knowledge in which new ideas are constantly tested and new fields grow to complement new discoveries. While Origin does not estimate the age of the Earth (something that wouldn’t be determined with accuracy until the mid-20th century), it was certainly a problem that Darwin and his contemporaries were interested in—how long it would take for life to evolve into its present form. Darwin gave an estimate of 300 million years, but he quickly realized that the beginning of life couldn’t account for the beginning of the planet (if it were once as small as a meteor), and perhaps this is a telling detail for Herbert’s biography. Although Darwin was right about evolution, he was not only wrong on occasion but also realized that there were limits to his own knowledge—that determining the age of the Earth was an area left to other branches of science, and that there are limits to how much knowledge we can acquire over a lifetime. Such discoveries and contributions, however, will always outlast us, leaving the door open for more discoveries to come.


Everything you ever wanted to know about Jane Austen and probably didn't even know enough to ask. An extensive, perhaps definitive, collection of essays that reflects contemporary Austen studies. Suitable for serious Austen scholars and fans willing to handle its heft. So much to learn here; so much to ponder. The 42 essays cover topics from Austen's life and earliest writings, to discussions of whether she wrote in a parody of gothic style or actually promoted it, to a survey of "Austenian Subcultures," which illuminates the activities of "Janeites," characterized in one of the essays as similar to Star Trek fans ("Trekkies") with their conventions and get-togethers in the real world. Alongside the essays are an excellent introduction by the editors that provides an overview of current Austen studies, notes on the contributors, who likely represent the best in Austen scholarship today, and an extensive and useful bibliography. In their introductory comments, the editors emphasize the relatively new debate among scholars about which Austen editions should be considered "authoritative." Does such a thing as a definitive edition exist? And if so, who defines it as such? In the end, Johnson and Tuite make a traditional choice. They rely on the Chapman editions for practical reasons, they say: these editions are the most widely used in literary criticism and have been the standard sources since 1923. The introduction marks out a path through the history of modern Austen criticism, which dates from about 1917. Johnson and Tuite attribute the most recent spate of Austen mania, in part, to the release of the modern film versions of her work, but they also note, as Henry James did, that Austen lends herself to marketing efforts and that her fans tend to identify quite personally with "our dear Jane." According to Johnson and Tuite, "Austen study today is a diverse, expansive, excitable, critical life-form with feelers that reach out across disciplines." 
Their choice of essays is a clear illustration of this statement. The book has five sections. Part I, "The Life and Texts," offers an introduction and details of Jane Austen's life and relationships. The first essay examines Austen's letters, "the key to everything" in the opinion of its author, Kathryn Sutherland. Part II, "Reading the Texts," is a study of Austen criticism from her earliest juvenilia to her last published novel. Part III, "Literary Genres and Genealogies," covers studies of Austen's style, form, and linguistic invention. Part IV, "Political, Social, and Cultural Worlds," places Austen in the wider cultural environment of her time, including wars, politics, and feminism. Part V, "Reception and Reinvention," is designed, say the editors, to examine the "generic uses to which Austen's later readers put her writing." This is a fascinating section that examines the links between scholarly and popular interests, with attention given to movies, radio broadcasts, fiction inspired by Austen's works, fan cultures, and modern critical practice. There is so much in this book that the best way to give a flavor of its scope is to browse the essay titles: "The Austen Family Writing: Gossip, Parody, and Corporate Personality"; "Emma: Word Games and Secret Histories"; "Time and Her Aunt"; "The Army, the Navy, and the Napoleonic Wars"; "Jane Austen and the Silver Fork Novel." Obviously, readers at many levels will find something to attract them here. For this reviewer, those essays describing Austen's modern fans and the impact Austen has on today's popular culture are standouts. 
For example, in the final essay in the book, “Austenian Subcultures,” author Mary Ann O'Farrell explores the feeling expressed by writer Katherine Mansfield in her remark about the reader's response to Austen: “For the truth is that every true admirer of the novels cherishes the happy thought that he alone—reading between the lines—has become a secret friend of their author.” O'Farrell suggests that the subcultures of fans who gather in chat rooms, create websites, and purchase Austen merchandise "engage variously the rigors and the comforts of an Austen-inspired sociability." In other words, Austen fan communities reflect the acceptance, warmth, irritability, alienation, and challenges of characters in Austen's fiction. "The ability to tolerate others' private Austens—perhaps the eagerness of an Austen of one's own to be in company or to hold her ground among or against others' Austens—is what facilitates the formation of Austen societies, Austen tour groups, and even Austen classes." According to O'Farrell, Austen societies, clubs, and classes, both formal and informal, celebrate face-to-face contact. This collection of essays is well worth mining for its nuggets of insight and inspiration. It shows that even in an age of impersonal contacts over electronic media, which tend to isolate rather than connect human beings, the social dynamics at work in the drawing rooms and sitting rooms of Jane Austen still resonate with modern readers.


It is fair to say that the nineteenth-century German philosopher Friedrich Nietzsche’s most challenging work is his four-part philosophical novel Thus Spoke Zarathustra, in which the fictional character of Zarathustra, serving as Nietzsche’s mouthpiece, wanders around, prophet-like, giving impassioned sermons on a variety of topics. Written in several bursts of fervent activity over the years 1883–1885, this richly ambiguous and metaphorically overabundant book—considered by its author to be his gift to mankind (Ecce Homo, Preface, 4)—has consistently garnered a broad range of reactions from its readership: from dismay and perplexity to avid enthusiasm, from utter dismissal to fascinated preoccupation. And despite the bewitchment it can exert on some unsuspecting and impressionable readers, a serious and deep understanding of its content is far from easily attainable.

In her book Nietzsche’s Zarathustra, originally published in 1987 and reissued in 2010, Kathleen Marie Higgins, a professor of philosophy at the University of Texas at Austin and a Nietzsche scholar, does not offer a detailed interpretation of Zarathustra, but wishes instead to “indicate the general sweep of that text” (p. 80), adopting an “interdisciplinary” (p. xx) approach that combines philosophy and literary studies, in the belief that Zarathustra is an attempt at a “bold invention in writing” (p. xix). Accordingly, the book does not offer a close analysis of some of the most important sections of the work—let alone a section-by-section analysis of the kind offered by such writers as Laurence Lampert in his Nietzsche’s Teaching: An Interpretation of “Thus Spoke Zarathustra” (Yale University Press, 1989). Consequently, this book is not for the uninitiated, who seek a steady hand to guide them through the lush thicket of the text. Neither does the book explore some of Zarathustra’s most famous and important philosophical ideas, such as the will to power, the overman, or the creation of values—with one important exception, to which Higgins devotes an entire chapter: namely, the idea of the Eternal Recurrence of the Same, a central idea in Nietzsche’s philosophy, which he himself presented as the “fundamental conception” of Zarathustra (Ecce Homo, Books, Zarathustra, 1). The book is thus for the advanced reader, already familiar to some degree with Nietzsche’s philosophy in general and with Zarathustra in particular, who is interested in unlocking some of the stylistic and thematic riddles that characterize the work. 

Higgins first sets the stage by arguing that Zarathustra should be read in light of Nietzsche’s attempt, already articulated in his earlier The Birth of Tragedy, to address the problem of the meaning of suffering in life: how are we to affirm life in light of great suffering that cannot be remedied through “thinking things over” (p. 14)? The Christian solution is famously (and heavily) criticized by Nietzsche and is thus unavailable. But why is the answer that Nietzsche provides in Tragedy, according to Higgins—that tragic play can reawaken us to our love of life, warts and all—not good enough? Why did Nietzsche have to write Zarathustra? The book does not offer an explicit answer to this question.

A focus of Higgins’ analysis is the peculiar fact that throughout the text—starting especially in the second part—Zarathustra is presented as experiencing setbacks, hesitations, and states of perplexity and uncertainty, all of which suggest, according to Higgins, the “ambivalence” (p. 75 and throughout) of Zarathustra’s doctrine. One central example of this is allegedly to be found in the baffling section “The Stillest Hour,” which concludes the second part of Zarathustra. Here Zarathustra is engaged in self-reflection and confronts a “voiceless voice” which exhorts him to speak what he knows despite his protestations that he knows “it” but does not want to say “it.” What it is that Zarathustra knows, and why he is so reluctant to say it—all of this is left highly vague, but according to Higgins it indicates Zarathustra’s “confusion” (e.g. p. 91) about his own teachings. What is the nature of this “confusion,” and what is its cause? Here I found myself confused by Higgins’ account: on the one hand, she claims Zarathustra “has lost intellectual clarity” (ibid.), but on the other hand, she claims his crisis “centers on his … ability to verbalize what he knows” (ibid.), which is a different thing. A third explanation, given immediately afterward, is that Zarathustra “seems to lack the faith … He has evidently lost touch with the spirit that moved him when he first vowed to share his insights with others” (ibid.). But this is yet another, distinct problem: one can lose confidence in oneself without being confused about what one thinks one knows. In any event, Higgins argues that what really beleaguers Zarathustra is his recognition of the “disparity between life, with its chaotic wildness, and the ‘wisdom’ through which human beings attempt to describe life” (p. 92). 
Though it rests on shaky textual evidence, I found this to be one of Higgins’ chief and most interesting ideas in the book: namely, that Zarathustra, on several occasions, conveys metaphorically the idea that every attempt to conceptually grasp “life” in a set of rigid propositions is bound to fail, given life’s ever-changing nature. Consequently, Zarathustra’s own philosophizing is bound to fail if taken as a set of final pronouncements on the meaning of life, the universe, and everything. 

Higgins’ treatment of the idea of the Eternal Recurrence of the Same—the idea “that the cycle of events in time repeats itself an infinite number of times” (p. 103)—was in my view rather unclear. According to her reading, it is an “expression for a fundamental orientation toward one’s life” (p. 105), one which focuses on the present and regards it, rather than the past or the future towards which one might be working in one’s goal-directed activities, as of the highest importance and intensity. This is supposedly Zarathustra’s (rather disappointing) panacea for the tragedy that is life. Why Nietzsche had to resort to such bombastic formulations about time and eternity to make such a rather straightforward claim remains a mystery to me. What is interesting in Higgins’ approach, however, is the way she applies her previous conclusion to the Eternal Recurrence: the doctrine is itself ambivalent and possibly “horrifying” (p. 123), Higgins argues, and so its function must be limited to the positive role it is supposed to play in one’s life—namely, to invigorate one’s attunement to the present. Once one generalizes the doctrine and regards it as some kind of theory, however, things are bound to go wrong, given life’s “chaotic wildness” and essential recalcitrance to all grand theorizing. 

In the last chapter of the book, lauded in the blurbs as providing its most important contribution, Higgins attempts to explain Zarathustra’s fourth and last part, in which a radical change of tone and style occurs and the somber is transformed into the burlesque and comical. Higgins’ novel claim is that this part’s composition was heavily influenced by Apuleius’ ancient satirical Roman novel The Golden Ass. One central clue on which the connection hangs is that in Zarathustra too there is an ass—more specifically, an ass “festival,” where the animal is worshipped by all sorts of “higher men” whom Zarathustra meets. What is more, Zarathustra himself is referred to directly and indirectly as an ass. Higgins spends some time arguing for the influence of the novel on Nietzsche, but the payoff is embarrassingly and frustratingly slim. One question Higgins poses is “Who is the Ass?” But she herself claims that “perhaps we do not need to compare Zarathustra to the work of Apuleius in order to see that Zarathustra is being called an ass in a certain sense” (p. 148). What, however, is the significance of Zarathustra being called an ass? In The Golden Ass, the hero learns from his errors and undergoes “spiritual development” (p. 150), thus redeeming his folly. Similarly, Zarathustra confronts his own folly—the error of having at times held his doctrine as a rigid teaching (p. 150)—and overcomes it by laughing at it, thus acquiring new self-understanding and insight. This is a valuable point, but again, I cannot but lament the scanty philosophical fruit that Higgins plucks from her tree following a twenty-page-long buildup. But perhaps I am just being too philosophical.


 
The author of such literary classics as Ulysses and Finnegans Wake, James Joyce (1882–1941) was one of Ireland's most celebrated novelists, known for his avant-garde and often experimental style of writing.
Philip Kitcher has taught at several American universities and is currently John Dewey Professor of Philosophy at Columbia. He is the author of over a dozen books, including The Advancement of Science; Science, Truth, and Democracy; The Ethical Project; and Joyce's Kaleidoscope. A Fellow of the American Academy of Arts and Sciences, he was also the first recipient of the Prometheus Prize, awarded by the American Philosophical Association for work in expanding the frontiers of science and philosophy. He joins us on Culture Insight to share his insight into the life and work of James Joyce.


Untangling the cause and effect relationships embedded in historical events can be a complicated endeavor. This is especially true when complex questions of ethics, science, and race are part of the discussion. This is what Jerry Bergman attempts in his densely packed work, Hitler and the Nazi Darwinian World View. He confronts the reader with the fact that seemingly educated men conspired to commit monstrous acts of evil. Bergman provides overwhelming evidence that Adolf Hitler and much of the Nazi inner circle were influenced by and championed social Darwinism and eugenics. They read popular books and articles on those subjects and, as in the case of Nazi propaganda minister Joseph Goebbels and physician Dr. Josef Mengele, imbibed theories of racial hygiene at German universities, spreading Hitler’s belief that “the evolutionary superiority of Aryans, the race superior to all others, gave them not only the right, but the duty to subjugate all other peoples” (p. 79). The Nazi leader concocted his own particular brand of evil based on notions of race and mythology. He espoused many of these ideas in his rambling book Mein Kampf a decade before he came to power. Bergman returns again and again to the profound influence that Ernst Haeckel, a respected German scientist, had on the Nazi worldview. He was one of the primary transmitters of Darwinian ideas in Germany, and his work was used to give Nazi ideas intellectual weight. After the Nazis took power, they knew they had to brainwash the next generation with their twisted ideology. Bergman uses Hitler’s chilling words to make this point: “Let me control the textbooks and I will control the State” (p. 265). In this way, Nazi racial ideology became a fact of life for German children. Control of education and propaganda was key for the Nazis, but just as important were their efforts to undermine, and ultimately destroy, Christianity in Germany. 
The book chronicles how several high-ranking Nazis, including Goebbels and Mengele, were raised with religion but rejected it as young adults in favor of secular ideas grounded in the “science” of eugenics and racial supremacy. The Nazis knew there was no room for a religious and ethical system that taught people to be compassionate toward the weak and vulnerable. The Nazi worldview embraced survival of the fittest and preached contempt for those who did not meet arbitrary racial standards. As Bergman puts it, “the Nazis offered the German people a new religion based around blood, soil, Germanic folklore and the Thousand Year Reich” (p. 170). He points out that after the failures of WWI, Versailles, and the Weimar Republic, many Germans were ready to embrace Hitler’s exhortations of confidence and strength. He is quick to remind us, however, that none of this excuses embracing or simply tolerating the Nazis. Darwinian ideas in Germany too often came to mean theories of Aryan racial superiority, with respected scientists and politicians advocating the need for “civilized man” to wage war against “primitive man.” They argued this was the only way to purify the Aryan race and prevent its demise through contamination by impure blood. According to Bergman, it was an obsession with protecting Aryan blood that drove Hitler’s thirst for conquest. But conquest and expansion were only the first steps of his master plan. The ultimate aim was the extermination of Jews and other people he believed to be inferior. Of course, this came to horrific fruition with the regimented slaughter that was the Holocaust. Not only was Hitler obsessed with murdering millions of innocent people, but Bergman also points out that the egomaniac believed mankind would eventually praise him for his actions. Hitler’s Deputy Fuhrer, Rudolf Hess, encapsulated the Nazi dogma with the words, “National Socialism is nothing but applied biology” (p. 312).  
His words also reflect Bergman’s argument that Hitler’s plans were fueled by much more than the usual power politics and lust for land. He and his Nazi thugs convinced themselves they were charged with a historical mission that would allow the Third Reich to reign for a thousand years with the master race at the helm. Their fantasies of a glorious racial struggle, made real by the monstrous Final Solution, doomed Germany to defeat at the hands of the Allied powers. Bergman’s book blends political history with the state of science, ethics, and morality in post–WWI Germany. Racial theories and eugenics gained ground with Hitler’s rise to power in the 1930s, and Hitler used these same theories as the justification to implement policies of segregation, imprisonment, and finally genocide. Many histories of this period focus extensively on the what, but Bergman delves into the why behind the noxious mix of Nazism and aspects of Darwinism. Too many Germans were willing to accept this fanaticism as the price for making Germany strong again. The scope of Hitler and the Nazi Darwinian World View means it should appeal to students of history, politics, and science. Bergman reminds us that ideas don’t just exist in books and classrooms; they have real weight in the real world and can lead to tragic consequences. By illuminating the horrendous crimes committed through the twisting of science to serve evil, books like this one will help ensure that this critical historical period is remembered and understood.


Writer and commentator Dennis Prager has often argued that people who see themselves as victims are responsible for much of the evil committed in the world. What better examples of victims wreaking havoc on the world than Adolf Hitler and Napoleon Bonaparte? Victims are one thing, but Desmond Seward clearly demonstrates that both men also saw themselves as destroyers on a mission to transform the world to match their own twisted utopian dreams. Napoleon and Hitler: A Comparative Biography, like other studies of these two dictators, can’t help but leave the reader with the impression that Napoleon, compared to Hitler, was a benevolent dictator. This distinction is deserved on two grounds: there was nothing like the Holocaust during Napoleon’s reign, and he left a lasting legacy of somewhat modernizing French bureaucratic administration. On a more practical level, it is acceptable to admire the French Emperor on some level, but to say anything positive about the Fuhrer would get one dismissed from polite society. Comparative biographies can be fraught with problems because authors sometimes try too hard to sandwich two historical figures together. This is not the case with Seward’s work. On the contrary, sometimes Napoleon and Hitler seem like two sides of the same coin. One side is perhaps shinier, but both ultimately are damaged beyond all restoration. Seward describes both men as megalomaniacs, but this might be too kind. Hitler and Napoleon can both be described as sociopaths, simmering with hate for their fellow man. Both dictators put their own thirst for power before their countrymen. Their countrymen were looking for a savior, the French from the Directory, the Germans from the Weimar Republic. Napoleon promised to leave the excesses of the Revolution behind while maintaining its higher ideals. But when Napoleon became First Consul in 1799, Seward points out, “there was no mention in the new Constitution of liberty, equality and fraternity” (p. 73). 
Hitler’s rise was more conventional, if perhaps even more ruthless. He used a combination of party politics, street thug tactics, and demagoguery to get himself appointed Chancellor in 1933. In the 1930 elections, Hitler’s National Socialist Party had become the second-largest party after the Social Democrats. Politics was Hitler’s arena, but the field of battle was Napoleon’s undisputed terrain. With a European coalition arrayed against him, the emperor gambled everything in December 1805 in a brilliant series of attacks that culminated at Austerlitz in the modern-day Czech Republic. But like Hitler, Napoleon exhausted his resources on his victories. The book argues that neither man could recover from his invasion of Russia, but Austerlitz showed Napoleon could still muster his genius. Whatever genius Hitler had he used on his bold decision to invade France, which in 1940 was widely regarded as fielding one of the most powerful armies in the world. The rapid defeat of France was his high-water mark, raising his standing particularly among the generals who had warned against invading. Their dreams of conquest had a practical side, as “both men were convinced that they had to win fresh victories to survive” (p. 232). Hitler had gotten much of what he wanted without firing a shot because Europe was desperate to avoid a second world war. Seward explains that the coalition against Napoleon would have let him keep France’s natural frontiers, but the emperor could not bear to lose. Napoleon was forced into exile, escaped, and had his Waterloo before being vanquished once and for all. The 19th century was a slightly more forgiving and romantic era, and the savagery Hitler had unleashed on Europe ensured he would get no such quarter from the Americans or the Russians who closed in on his bunker under the center of Berlin in April 1945. 
Seward points out that in place of romance there is the mystery of a burned body that was never positively identified as Adolf Hitler. Both Napoleon and Hitler saw themselves as victims, but victims who had risen above obstacles to become saviors to their people. Neither one saved his people; both harbored nothing but contempt for them. When their empires crumbled around them, they sacrificed the people for their own vanity. The people became victims of the whims of dictators who became destroyers. As Seward writes, “determined to escape from a war on two fronts, both had been destroyed by such a war” (p. 293). The sheer weight of the coalitions arrayed against them guaranteed eventual defeat. Seward succeeds in writing an engaging and informative analysis of two of history’s most fascinating figures. He makes it clear that Napoleon and Hitler sought naked power through any means necessary. A few chapters sketch their early lives, but Seward stops short of trying to delve into their childhood psyches for the keys to their megalomania. It is quite enough to show the reader what they chose to do in adulthood. Seward provides enough details and examples to flesh out both men’s careers and to make a convincing case that they were very similar in important ways—in ways that had horrific consequences for millions of people. Readers of military history will find Seward’s comparative biography of these two dictators fascinating and informative. Whether interested only in Napoleon or mostly in World War II, readers will find insight here regardless of their primary focus. This book would also interest students of political science, as it chronicles how each man took separate but irresistible paths to ultimate power. Napoleon and Hitler: A Comparative Biography vividly demonstrates the often disregarded wisdom that absolute power corrupts absolutely.


Healthcare costs are out of control. More and more Americans are being diagnosed with depression, eating disorders, and other related illnesses. The dizzying expansion of genomics, neuroscience, and pharmacology is opening up new debates in medical ethics. The moment surely calls for a careful critique of the relationship between contemporary citizens and the institutions charged with their treatment. Thomas Szasz’s Pharmacracy: Medicine and Politics in America performs a service by provoking dialogue about the fundamental meaning of disease and the social implications of diagnosis. Broadly, it argues that American life is becoming altogether too medicalized, with the diagnostic/therapeutic paradigm reaching its tendrils further and further. This is not merely paternalistic and expensive, Szasz argues; it threatens to undercut our civil liberties and the very concept of adult responsibility. Szasz analyzes the perversities of a medico-governmental system in which treatment and power are conflated. (The philosopher Michel Foucault discussed something similar in his notion of biopolitics.) Diagnoses, Szasz points out, have changed from descriptions used to clarify a patient’s condition into “strategic” tokens used in a game between various institutional players: lawyers, health insurance companies, profit-seeking clinics, employers, and bureaucrats in the welfare state. This is because diagnoses can determine everything from what services a hospital can provide and whether an employee qualifies for worker’s compensation or disability payments to whether a criminal can be tried in court. The conflict of interest lies in the fact that doctors today are simultaneously entrepreneurs attempting to maximize their bottom line and bureaucratic functionaries endowed with special authority. They determine what treatments patients are entitled to and even what social services they can claim.
This can lead to manipulating diagnoses to increase reimbursements, shilling for unscrupulous pharmaceutical companies, and acquiescing to patient demands to treat diseases dreamed up by these same companies. The overall trend, Szasz argues, is towards diagnosis inflation: a spiral of more and more things being termed diseases, from bullying, fatigue, and social anxiety to kleptomania, drug abuse, and overeating. This absolves individuals and communities of moral responsibility, say for regulating eating or teaching proper schoolyard behavior, and leads to costly and invasive interventions by social workers, psychiatrists, and bureaucrats. Many of these problems are legitimate and troubling. American healthcare is among the most inefficient in the world, and the incentives for providers are undoubtedly out of whack. And it can hardly be a positive development that so many Americans rely on health care professionals to tell them everything from how they should be feeling in relationships to what they should serve for dinner to how their kids should be behaving. Yet it is hard to accept Szasz’s ultimate verdict, and also his proposed solution. He insists our emergency is so grave that something like an intellectual scorched-earth strategy is necessary: the definition of disease must be retrenched to include only “somatic lesions” that are “objectively” measurable by pathologists. In the process, the entire modern field of mental health and psychiatry—from Kraepelin and Freud onwards—must be dismissed as imaginary. For one thing, Szasz’s warnings of slippery-slope fascism seem overblown. Just because a similarity can be found between a contemporary practice and some aspect of what some Nazis did doesn’t make the analogy worthwhile. For instance, comparing a physician who falsely bills an in vitro fertilization procedure that otherwise wouldn’t be covered by insurance to the corruption of the medical establishment in Nazi Germany seems extreme.
Moreover—death panels aside—there simply isn’t evidence of systematic coercion or violence in today’s healthcare. Another problem is that it is unclear whether Szasz’s diagnostic standard is really sufficient for the practice of modern healthcare. Citing primarily the 19th-century medical writer Rudolf Virchow, with his concept of disease as “cellular pathology,” Szasz makes a bid to return disease to a solid, “materialist” foundation, purged of any social component whatsoever. Ironically, Virchow would hardly have supported the framework—he was an enthusiastic defender of public health, calling for “school hygiene,” and advocated for legal punishment to be “replaced with psychiatric education”—something antithetical to Szasz’s belief system. The logic Szasz uses to establish his core claim about disease is not completely convincing. He compares the concept of disease to that of carbon, which “has specific physical properties that distinguish it from every other element.” “To the trained eye and informed mind,” Szasz declares, “disease is the same kind of entity as carbon…coal and diamond are two kinds of carbon, much as diabetes and diphtheria are two kinds of disease.” Yet clearly, an intestinal parasite, AIDS, and Down syndrome are not materially the same thing! They are linked by their ability to disrupt desired social and biological functioning. Szasz earlier lauds the metaphor of the human body as a machine (and disease as its breakdown) as the fundamental advance enabling scientific medicine. But he doesn’t explore all the implications. A machine breaking down is not simply a physicochemical event with objective correlates. A machine is broken relative to what it should do. Moreover, unlike a blender or bicycle, a human being is not made to do anything in particular (unless you believe in creationism). What we think a human should be able to do is in part socially conceived and negotiated. It is a target that always moves. Diagnoses naturally reflect this.
For example, the valuable labels “asthma” and “osteoporosis” are based on expectations for respiration and aging, and they exist on continua, not as absolutes. This is an inherent feature of medicine, not a bug that could be eradicated. Another frequent strategy of Szasz’s, pointing to the specter of malingering and responsibility avoidance, sometimes rings hollow. For instance, he goes as far as to call PTSD (a widely observed symptomatology) a euphemism for soldiers who don’t want to fight. However, it stretches plausibility that countless veterans, not to mention civilian survivors of violence, would falsify a consistent regime of debilitating flashbacks, nightmares, and anxiety simply in order to avoid life responsibilities. Similarly, his dismissal of modern depression as trumped-up “common unhappiness” does not address the millennia of Western literature that has distinguished a pathological state of torpor and fear from everyday misery (whether described as melancholia, tristomania, or even the early-Christian “acedia”). Now late in his career, Szasz has clearly thought a great deal about these issues; his ideas are even summarized in introductory college textbooks. One might wish his discussion of mental illness were slightly more nuanced and his theoretical foundations a bit tighter. However, if fiery oppositional polemics are required to foreground pressing and fundamental issues about the American healthcare system, then it is a good thing that Szasz is here to write them.


We are told not to judge a book by its cover. In this case, however, Stacy Schiff chose an appropriate image for her work. Almost in shadow, an elegant woman looks away from the viewer. Indeed, the face of the last Ptolemaic ruler of Egypt (69 BC–30 BC) remains a mystery. Still, writers from Cleopatra’s day up to the present make much of the queen’s supposed looks. That is also the case with Cleopatra: A Life. Pulitzer Prize winner Schiff spends time on the issue, but seems quite disdainful of Cleopatra’s contemporary critics who tried to make her into a seductress who used whatever beauty she had for immoral purposes. Schiff presents no new evidence either way. As we’ve known for some time, the only contemporary images of Cleopatra that remain are from coins. Schiff’s stance seems to be that Cleopatra was attractive enough, but relied more on her intelligence and charm to get what she wanted. Fair enough, but we should not forget she was worshipped as a demigod and a powerful queen; her actual appearance was, in some respects, beside the point. Perhaps the most interesting aspect of this issue is the double standard applied to her. As a rule, historians are not interested in the looks of their male subjects and how they may have used their appearance to get what they wanted. Ancient sources like Pliny and Cassius Dio, both Romans, perpetuated the stereotype that charming women could seduce even the strongest of men and lead them to ruin. In Cleopatra’s case, this meant none other than Julius Caesar and Marc Antony. The Romans who wrote about these events had more in common with Herodotus than with modern historians, who at least try to make an objective analysis of their subject. These ancients were less concerned with facts than with popularizing the point of view or conjecture that best suited Rome.
Unfortunately for Cleopatra, there is enough drama to fill a Greek or Shakespearean tragedy that could be used to show Caesar and Antony being led to ruin by her charms. However, Schiff makes it clear this is only one interpretation of these events. The reason for these biased accounts was that Roman writers had every reason to portray Cleopatra in the worst possible light, as the opposite of her virtuous Roman conquerors. Allowing for this is one thing, but it seems that Schiff goes a little too far out of her way to put Cleopatra in the best possible light. For instance, she overestimates Cleopatra’s historical importance a bit when she posits the idea that the queen’s suicide ushered in the beginning of the modern world. Cleopatra was certainly a significant leader on the world stage, able to manage challenging and dangerous situations. She was forced to maneuver among fickle Roman generals to do what was best for her country and her own survival. As Schiff points out, Cleopatra’s Egypt was rich and helped finance Rome’s wars, both external and internal. But her help did not necessarily endear her to Romans who didn’t want to be reminded of their weakness by a young, foreign queen. Cleopatra’s interactions with Rome and with particular Romans form the framework of her reign and of Schiff’s study. Cleopatra was famous, but not well known. Egyptian sources are scant; not even a contemporary bust or statue survives. As Schiff points out repeatedly, most of the commentators and biographers were Romans, divided from their subject by culture, loyalty, and language. “She elicited scorn and envy in equal and equally distorting measure…” (p. 322). Unfortunately, we do not get enough insight into what her own people thought of her beyond the usual official fawning any dictator receives. So it seems Schiff and her readers are left struggling with the gap between what people said and wrote about Cleopatra and who Cleopatra actually was.
Schiff does what she can to bring color and depth to that portrait, but we get the sense that by the end, the Egyptian monarch is still facing away from us, still somewhat mysterious. This mystery allows a certain amount of legend and romanticism to persist about Cleopatra. However, Schiff does provide enough detail to make a convincing case that her Roman biographers did her and everyone who wants to know the truth a great disservice. Her book helps to rectify this, to remind us that a person is much more than the sum of their detractors, and that ultimately Cleopatra existed independently of the people who, for their own purposes, sought to define her.


In her introduction to Making Madame Curie, Eva Hemmungs Wirtén, professor of mediated culture at Linköping University in Sweden, explains why she chose to write a new book about one of the best-known women in history. The author was not in search of the real Madame Curie behind the veil of lore that has built up around the life of the Polish-French scientist. As she explains at the beginning of the book, “Authenticity searches are hopelessly quixotic to begin with, but even more to the point, I have never considered the representational bounty surrounding Curie a curtain hiding something really interesting.” Instead, what intrigued Wirtén was our ongoing cultural construction of Curie’s persona.

One of many fascinating facts that emerge in Wirtén’s study is that at the time of the Curies' famous decision not to patent the discovery of radium, but instead to “publish without reserve,” Curie, as a married woman, was not considered a “legal” person under French law. She was instead in the category of incapable, a designation married women shared with children and the insane. Although garnering distinctions and honors far and wide, she was, like every other married woman of the Third Republic (which lasted from 1870 to 1940), unable to own, control or benefit from either tangible or intangible property. This also pertained to intellectual property, including the discovery of radium, which she made jointly with her husband, Pierre. But when he tragically died in 1906 after being struck down by a horse-drawn carriage, Marie became a liberated woman, of sorts. In Wirtén’s words, “If the death of Pierre meant the loss of Marie’s soul, it also gave his widow a new body. And this body, the widowed body, was entitled to hold property” (page 40).

One of Wirtén’s aims was to discuss Curie in terms of a burgeoning celebrity culture. As the first woman to win a Nobel Prize, and the first person to win two, Marie was destined for widespread fame. Though temperamentally unsuited to the spotlight, she learned how to manipulate her celebrity status. But, as Wirtén describes it, Curie had a noble goal: to ensure the future of her field of research, which needed funding.

One example of Curie’s celebrity was what Wirtén calls her annus horribilis—the “horrible year” of 1911 that began with her defeat as a candidate for a prestigious position among the immortels at the Académie des sciences. Though her accomplishments would seem to have made her a natural choice for the vacant position, there was one small problem: the Académie would first have to vote on whether women were eligible. The vote on this question did not go in her favor. Curie lost to her male competitor by only two votes. But she never chose to campaign again, though it was common practice to do so. It would not be until 1979 that a woman—French scientist Yvonne Choquet-Bruhat—was elected to the Académie des sciences.

The latter part of 1911 brought another blow. Curie was discovered to have had an affair with fellow scientist and longtime friend Paul Langevin, a married man. This discovery was precipitated by the theft of some of their letters, and Curie threatened legal action. Because of her fame, the affair couldn’t be quietly swept under the carpet. The public outrage, Wirtén writes, was fuelled by the way Curie had “applied herself scientifically to luring Langevin away from his family, and her reasoning and planning overstepped the very narrow gender role prescribed to her at that time” (page 67). Even more damning, though, was the fact that the scientist, born Maria Salomea Skłodowska in Warsaw, was not French. "What is not French," complained Maurice Pujo, co-founder of the nationalist organization known as the Comité de l’Action Française, was that "there are no sincere cries, always excusable, but cold reasoning” (page 67).

In the end, there were five duels fought over Curie’s honor. As French libel laws were weak, the duels were seen as an effective way to resolve questions of honor. Though we may think of dueling as antiquated, Wirtén points out that a two-minute film of the first of these duels, fought between journalists of rival newspapers in November 1911, is available on YouTube.

Wirtén also recounts the tale of Curie’s first trip to America in 1921, during which she solicited funds for her work. During this trip, she was presented with the gift of one gram of radium, which cost $100,000—an astronomical sum at that time. Thanks to her influential friend and promoter, the American socialite and journalist Missy Brown Meloney, the money was raised by subscription from a large group of women. Like popular Kickstarter projects of today, the campaign was actually overfunded, and the surplus became a matter of some contention, due to the legal restrictions on how it could be used.

Wirtén uses the final chapter to discuss areas of interest that she shares with Curie, namely the problems of bibliography and of intellectual property, which Curie pondered in her role as a member of the League of Nations International Committee on Intellectual Cooperation. She saw the importance of a flexible, concise system of abstracts that could be shared globally, reducing inefficiency and duplication. But she was also interested in securing some sort of income for the originators of important scientific ideas. Although this may seem to be at odds with the Curies' original idea of “publishing without reserve,” one of Wirtén’s main concerns is to show how Curie consistently walked the fine line between the public interest and her own.
