
Adam Smith: An Enlightened Life by Nicholas Phillipson is a deeply probing historical biography. It plunges into the depths of the “hows” and “whys” that prompted Adam Smith to write at least two astounding books in his lifetime, while also detailing many of the friends, colleagues, and contemporaries who influenced him. For those who enjoy the introspection and minutiae of philosophical discourse, often parsing the differences between such concepts as sympathy and pity, truth and honesty, or morality and justice, this is a book of continuous delight. Phillipson, a history professor at the University of Edinburgh’s School of History, Classics, and Archaeology, not only paints a glorious panorama of the social milieu and events of particular moments in Smith’s life, but also adds all the luscious details of Smith’s contemporaries and influential friends, showing the smallest transitions in the thought and substance of this great 18th-century thinker.

It was as an academic, first as a lecturer in Edinburgh and then as Professor of Moral Philosophy at Glasgow University, that Smith earned his living in the mid-1700s, but it was as a speaker and contributor to critical thinking through clubs, organizations, and friendships that he pushed forward his ideas about human sociability. Such icons as David Hume and François Quesnay exchanged frequent letters and criticisms with Smith throughout the development of his lectures and writings, which helped to shape and refine his theories. Although Smith wrote prolifically and lectured often, his reputation rests on two books in particular: The Theory of Moral Sentiments (1759) and An Inquiry into the Nature and Causes of the Wealth of Nations (1776), usually called simply The Wealth of Nations. The two were actually to have been Part 1 and Part 2 of a much larger Science of Man project that Smith and Hume had envisioned as a definitive portrait of the behavior and motivations of humans in social and political organizations, but Parts 3 and 4 were never completed. Incredibly detailed notes and whole chapters were found among Smith’s papers after he died, but in his later years Smith admitted that the project was too much for him to finish.

The cover of this book, adorned by the ubiquitous ivory cameo of Smith’s bust in profile that so often represents our view of him, is very much like the book itself—an embellished, handsomely scrubbed, and idealized version of what Smith was really like. The book even steps boldly into conjecture about some of Smith’s views, drawing on the notes of students and others who attended his lectures. While many political and philosophical thinkers foundered in obscurity until their writings eventually rose to prominence, Smith’s views were always highly regarded. At the age of 28, he was appointed Professor of Logic and Metaphysics at Glasgow University, and shortly afterward he assumed the chair of Moral Philosophy, a highly esteemed academic position. In Glasgow, Smith saw the particulars of the tobacco trade that drove the city’s economy. He drew much of his opposition to monopolies from the dominance they imposed on commerce, while also extolling the “unintended beneficence” of the “invisible hand” of “unfettered competition.” These were monumental conclusions from a forceful thinker who also described the essential qualities of banking, wages, interest rates, and international trade in what came to be known, in the spelling of the day, as “political oeconomy.”
In The Theory of Moral Sentiments, his effort was to identify the ways in which we trade our moral sentiments—that is, our feelings about moral values—with one another. He meticulously discussed our behavior, motivated, as he concluded, by the pursuit of joy and the avoidance of shame. In The Wealth of Nations, his goal was to identify how we trade our goods and services as we pursue the physical betterment of our lives, promoted by the division of labor and the mechanics of the judicious “invisible hand” of unimpeded commerce.

With all the details of teaching, travel, and lectures that this book contains, it is surprising that Smith’s notoriously absent-minded behavior is not mentioned. It was often said that he would forget his glasses, his briefcase, or even the names of very familiar colleagues because of his preoccupations, many times wandering past his desired destination before realizing that he had forgotten where he was going. While this omission certainly does not detract from the description of Smith’s life, it nonetheless makes one wonder whether the author chose to reveal only a more pristine view of his character. However, using many references from other books and archived writings found in Scottish libraries—all clearly identified and credited in the Notes and Sources section at the conclusion of the book—Phillipson is able to illuminate many of the finer points of Smith’s life. In one very insightful passage, he quotes an earlier biographer, Dugald Stewart, who discussed Smith’s teaching style:
“There was no situation in which the abilities of Mr. Smith appeared to greater advantage than as a Professor. In delivering his lectures, he trusted almost entirely to extemporary elocution. His manner, though not graceful, was plain and unaffected; and, as he seemed to be always interested in the subject, he never failed to interest his hearers. Each discourse consisted commonly of several distinct propositions, which he successively endeavored to prove and illustrate. These propositions, when announced in general terms, had, from their extent, not unfrequently [sic] something of the air of a paradox. In his attempts to explain them, he often appeared at first, not to be sufficiently possessed of the subject, and spoke with some hesitation. As he advanced, however, the matter seemed to crowd upon him, his manner became warm and animated, and his expression easy and fluent. In points susceptible of controversy, you could easily discern, that he secretly conceived an opposition to his opinions, and that he was led upon this account to support them with greater energy and vehemence. By the fullness and variety of his illustrations, the subject gradually swelled in his hand, and acquired a dimension which, without a tedious repetition of the same views, was calculated to seize the attention of his audience, and to afford them pleasure as well as instruction, in following the same object through all the diversity of shades and aspects in which it was presented, and afterwards in tracing it backwards to that original proposition or general truth from which this beautiful train of speculation had proceeded” (p. 134).
The strength of this book is certainly its depth and detail, which spotlight Smith at a specific time and place in history while elucidating his gradual but determined progression toward his conclusions. Its weakness is a preoccupation with unimportant, even trivial, facts and remembrances that stuff the story with incidentals, often wandering without purpose into excess.


Einstein 1905: The Standard of Greatness is a detailed look at the five papers Einstein wrote in 1905—known as his Annus Mirabilis, the extraordinary year—that truly separated him from other physicists of his time and beyond. The book looks into the interconnection between those papers and how Einstein built on each one to fundamentally change physics, the scientific community, and the public’s understanding of the world around them. Though there is little doubt about Einstein’s greatness or the importance of his paper on relativity, the depiction of how he wrote five groundbreaking papers in only six months is well worth reading.

An Honorary Professor of Physics at Washington University in St. Louis, John Rigden delves into the interrelation of the five papers, how Einstein’s theories built from one to the next, and their effect on the scientific community of his time. The book starts with Einstein’s March paper, which supplied building blocks of quantum mechanics; goes on to the April and May papers, which influenced the development of statistical physics; continues with his well-known June paper, in which he presented his special theory of relativity; and ends with his September paper, which brought forth the most famous equation of all time: E = mc². Rigden offers little new information in this book, but he eloquently provides the reader with insight into the papers without the overabundance of mathematical equations usually relied upon when explaining Einstein’s theories.

The book is by no means intended for the novice reader, and a rudimentary knowledge of Einstein’s work is extremely helpful in understanding how Rigden views the physicist’s accomplishments. A reader looking for an account of the personal relationships or outside influences Einstein was subject to while producing these papers will not find it here. Rigden occasionally flirts with the idea of outside influences, but the area is never fully explored. The author does not explain how Einstein was able to write five seminal papers in only six months—an astounding feat by any estimation—focusing instead on what each paper contains and, to a lesser extent, on the response of the scientific community. The limited impact Einstein originally had within the physics community is brought to light, along with his lack of a university affiliation while writing the papers in 1905. Since he did not have ready access to references and other scientific minds, Einstein developed a reliance on his own knowledge of physics and the world around him.

Rigden makes a convincing argument that though Einstein’s April paper—his dissertation—is one of the most cited in the scientific community, it is not the most influential or groundbreaking of the five; it is merely the most cited because its results lend themselves most naturally to use in other scientific papers. Rigden suggests that, taken as a whole, the five papers make the most astounding contribution to science, as Einstein built on each work with his unique mind. Overall, Einstein 1905: The Standard of Greatness provides a clear perspective on six months that altered the scientific community and mankind’s understanding of physics.


Susan A. Clancy holds a Ph.D. in Experimental Psychology from Harvard University. An expert in the field of cognitive psychology, she’s no amateur when it comes to controversial topics. In her first book, Abducted: How People Come to Believe They Were Kidnapped (2005), Clancy took on the hot-button issue of outer-space alien abductees and their memories. This time around, in The Trauma Myth, she has chosen to tackle childhood sexual abuse. While on the face of it there’s nothing risky about the topic, it’s her approach that pushes the envelope. Based on interviews conducted at Harvard over a 10-year period with more than 200 men and women of varying ages and occupations, all of whom were sexually abused as children, the book is part personal narrative from victims and part Clancy’s own theories on sexual abuse trauma.

In some ways, the title The Trauma Myth is a misnomer. Clancy is not arguing that childhood sexual abuse is not traumatic. From the outset, she makes it quite clear that her goal is to argue for a reconsideration of the definition of trauma. This book asks readers to question what they know—or, more aptly, what they thought they knew—about sexual abuse and the resulting trauma experienced by children. As she writes in the book’s introduction, “sexual abuse is very rarely described without the word ‘trauma’ …. As a consequence many researchers studying the impact of sexual abuse do not even bother to ask victims detailed questions as to whether the experience was traumatic when it happened; they just assume it was.”

In chapter one, Clancy uses personal accounts from multiple victims to explore what the abuse was like when it happened. Because each victim reported that the abuse was not traumatic, terrifying, or life-threatening, she questions what she thought she knew about trauma. In chapter two, the author provides a thorough reading of the psychology literature, explaining several key developments in the field of sexual abuse, most prominently Karl Abraham’s 1907 hypothesis that sexual abuse victims fell into two separate categories: “accidental victims” (violent abuse by a stranger) and “participant victims” (non-violent abuse at the hands of a known person). In chapter three, she proceeds to deconstruct how the trauma model for counseling sexual abuse victims emerged. As Clancy explains, Sigmund Freud, in his 1896 publication “The Aetiology of Hysteria,” proposed that sexual abuse caused neurosis, but a year later he changed his mind, concluding that his patients had made up their abuse claims. Freud’s revised theory subsequently became widely accepted in the field, prejudicing many professionals, who began to believe that victims lied about their sexual abuse. As Clancy asserts, “between 1930 and 1970, the literature was rife with case studies of ‘seductive children’ or ‘pathologically needy’ children.” The author then explains how professional assumptions about child sexual abuse changed in the 1960s and 1970s, in large part due to the work of child-protection advocates and feminists such as Judith Herman and Susan Brownmiller, who pushed for an end to victim-blaming. By the 1980s and 1990s, sexual abuse was framed as a traumatic event and conceptualized as a form of post-traumatic stress disorder, in large part to protect victims and reclaim the validity of their abuse.

In chapters four and five, Clancy sets out to answer the two questions underpinning the book: why does the trauma myth damage victims? And how does the trauma myth silence victims?
Clancy has an exceptional ability to simplify clinical language, allowing the reader to understand how psychology professionals came to their conclusions and to follow the numerous studies that have been conducted in the area of childhood sexual abuse. That said, some of the accounts from victims are shocking, at times difficult to read, and very tragic. These stories support Clancy’s argument that trauma, as it has been conceptualized for the past 30 years, is not always directly related to the actual abusive events; rather, trauma is more often a complex web of emotions and feelings that takes years to come to the surface. While the first few chapters of the book are heavy with multiple-page accounts from victims, the latter chapters contain short quotes that are not as effective. Further, there are many instances where Clancy becomes a bit repetitive in her arguments about trauma. The Trauma Myth is ultimately a fascinating exploration of the topic of childhood abuse. It certainly raises more questions than it answers, but in the end the author’s intent is not to provide answers but to challenge how we, and professionals in the field, think about childhood sexual abuse. To that end, she is successful.


In the popular imagination, the name “Genghis Khan” continues to conjure images of savage Mongol hordes laying waste to everything in their path. But Frank McLynn’s sweeping study, Genghis Khan: His Conquests, His Empire, His Legacy, reveals a more complicated story. This is a portrait of a single man’s rise to greatness and of an obscure tribe’s transformation from an isolated nomadic existence to a world power. McLynn relies heavily on the 13th-century semi-legendary chronicle The Secret History of the Mongols. Its author is unknown but was probably a Mongol court insider. Like many other accounts of the ancient and medieval periods, it contains exaggerations and fabricated stories. However, historians agree that the core narrative of this primary source is invaluable for establishing a historical record of the Mongols. Another source, 9th-century writings from China’s Tang Dynasty, contains the earliest recorded mention of the Mongols.

The future Khan was born Temujin in 1162. He had three brothers and a sister, but he distinguished himself early, building a reputation as a warrior at 14. McLynn explains the young man’s rapid rise from accomplished warrior to influential leader this way: “The attraction of Temujin was that he had created a haven for all who had broken away from the rigidities of the old kinship-based clan structure” (50). One of McLynn’s themes is that Genghis was an innovator because he “recalibrated” Mongol society while also being mindful of his people’s traditions. First among these was the battue, or hunt, a massive training exercise that covered hundreds of miles and demanded many of the same skills used in battle. These hunts ended with thousands of Mongol warriors encircling herds of animals for slaughter.

After years of ferocious campaigning, Temujin united the disparate nomad tribes of the steppe. In 1206, he became Genghis, or “tough ruler” (94). But the new Khan did not rest; he had grand designs for his new Mongol empire, to the west and especially to the south, into northern China. The Jin Dynasty was in the Mongols’ sights, but it would take years to subdue the Chinese kingdoms. The Khan’s conquests were made possible partly by his aptitude for administration and organization; McLynn compares his skills to Napoleon’s legendary prowess.

Whether Genghis used lightning tactics to engulf an enemy or more patient strategies to wear down a strong opponent, McLynn shows how much of the Mongol empire’s success can be traced to the ruler’s willingness to embrace a “do what works” mentality instead of sticking to the old ways for the sake of pride or ancient tradition. Depending on what the circumstances dictated, Genghis would use a combination of feigned retreats, encirclements, frontal assaults, and ambushes. He insisted that his men be disciplined and well trained in these tactics.


As highly mobile horse raiders, the Mongols had no expertise with the tactics and strategies of siege warfare. Genghis recognized this would have to change if they were to defeat China’s heavily fortified cities. His foresight in acquiring this knowledge served his heirs well as they rode west to conquer cities in the Middle East and Europe. In the course of his conquests, he captured or hired many of the finest experts in siege technology and tactics.

As with McLynn’s previous book on Marcus Aurelius, his treatment of Genghis Khan bolsters his belief in the efficacy of the “Great Man” concept for illuminating historical periods. But he is not content to discuss Genghis and the Mongols only in terms of tribal politics and conquests. He also describes weapons and tactics in the same vivid detail. For example, he compares the Mongol composite recurve bow to the English longbow; many readers would be surprised to learn that the Mongol bow’s pull of 166 lbs. exceeded that of the legendary English version.

McLynn argues that even though the image of the Mongol ruler had been tainted by propaganda and legend, it is possible to get a sense of Genghis Khan, the man. He displayed many personality traits: he was paranoid, cruel, intelligent, and crafty. Of course, it is almost universally agreed that Genghis Khan was a political and strategic genius. He was also known to be interested in other religions and cultures, and eager to borrow from other ideologies if he thought they would help him in his endeavors. 

This is the portrait of a complex man driven to greatness by the strength of his abilities and an unshakable will. But we must not forget that it is also the story of a brutal leader responsible for hundreds of thousands of deaths and untold upheaval and suffering. McLynn effectively encapsulates the ruler’s essence with the last words Genghis spoke to his sons on his deathbed: “Life is short. I could not conquer the world. You will have to do it” (376). His sons honored his legacy, expanding his empire west into Europe. But they could not match their father’s unique capabilities and were unable to hold Genghis Khan’s empire together.


Happy Birthday, F. Scott Fitzgerald! Today marks the anniversary of the birth of one of the most celebrated writers of the 20th century. Francis Scott Key Fitzgerald, or F. Scott Fitzgerald, as he is more commonly known, was born on September 24, 1896, in St. Paul, Minnesota. Fitzgerald is best known for his novel "The Great Gatsby," which has become a literary classic and is often studied in high school and college literature classes. However, Fitzgerald's body of work includes much more than just "The Great Gatsby." He was a prolific writer, producing numerous short stories and novels throughout his career.

Fitzgerald's writing is characterized by its vivid portrayal of the excesses and extravagances of the Roaring Twenties. He was a master at capturing the lavish lifestyle of the wealthy and the spirit of the Jazz Age. But Fitzgerald was more than just a chronicler of the times. His writing also explored deeper themes, such as the corrupting influence of money and the fleeting nature of youth and success.

Fitzgerald was born into an upper-middle-class family and received a good education. He attended Princeton University, where he began writing for the school's literary magazine. After dropping out of Princeton, Fitzgerald joined the army and was stationed in Alabama. There, he met and fell in love with Zelda Sayre, a beautiful and vivacious southern belle. Zelda would become Fitzgerald's muse and the inspiration for many of his female characters, including the iconic Daisy Buchanan of "The Great Gatsby." The couple married in 1920 and had a tumultuous relationship marked by infidelity, alcoholism, and mental illness.

Despite the challenges in his personal life, Fitzgerald continued to write and publish throughout the 1920s and 1930s. In addition to "The Great Gatsby," Fitzgerald wrote the novels "This Side of Paradise," "The Beautiful and Damned," and "Tender Is the Night." He also produced a number of short stories, many of which were published in popular magazines like The Saturday Evening Post.

Fitzgerald's writing career was not without its ups and downs. While he enjoyed success and critical acclaim during the 1920s, his fortunes declined during the 1930s. The Great Depression hit Fitzgerald hard, and he struggled financially. He died at the young age of 44, leaving behind a body of work that would go on to influence countless writers and artists.

F. Scott Fitzgerald's writing is often associated with the glamour and excess of the Roaring Twenties, but it is so much more than that. His work is a timeless exploration of the human condition and the pitfalls of the American Dream. On the occasion of his birthday, we celebrate the life and work of this incredible writer and thank him for the enduring legacy he has left behind.


The Great Gatsby is hands down one of the greatest novels of all time. Don't believe us? Well, let us lay out the evidence for you.

First and foremost, The Great Gatsby is a masterclass in character development. Fitzgerald has a way of bringing his characters to life in a way that feels real and authentic. From the enigmatic and mysterious Jay Gatsby to the wistful and tragic Daisy Buchanan, each character is fully fleshed out and feels like a living, breathing human being.

But it's not just the characters that make The Great Gatsby great. Fitzgerald's writing is simply breathtaking. His prose is rich, evocative, and emotive, and he has a way of capturing the essence of the Roaring Twenties in a way that feels both glamorous and devastating.

Nor is it just the words on the page that make The Great Gatsby a masterpiece. The novel is also a commentary on the decadence and excess of the time, as well as on the corrupting influence of wealth and power. Fitzgerald's portrayal of the American Dream is both hopeful and cautionary, and his themes are as relevant today as they were when the novel was first published.

And let's not forget about the iconic setting of The Great Gatsby. Fitzgerald's depiction of the lavish mansions and parties of Long Island's West Egg is both alluring and haunting, and it serves as the perfect backdrop for the events of the novel.

So, why is The Great Gatsby one of the greatest novels of all time? It's simple: the characters are complex and multi-dimensional, the writing is beautiful and evocative, the themes are timeless, and the setting is iconic. It's the whole package, folks. Don't just take our word for it—pick up a copy and see for yourself. You won't be disappointed.


In the years following Alfred Hitchcock’s death in 1980, an image of him as a dark, vindictive, and lecherous man clung to his memory. More than 20 years later, film historian Patrick McGilligan re-evaluated the director’s life in his book, Alfred Hitchcock: A Life in Darkness and Light. McGilligan is no stranger to the Hollywood biography, having published biographies of such noted directors as Robert Altman, George Cukor, and Fritz Lang. He focuses primarily on his subject’s career—an easier strategy with Hitchcock, who left little documentary evidence of a personal nature and was obsessed with the technical side of his work. Aside from studio sources, McGilligan draws on source material in the usual manner for a biographer: archives, personal interviews, books, and articles. He presents enough of Hitchcock’s private life, including its salacious aspects, to give the reader a general idea of what motivated the “master of suspense.” While the author generally shies away from film analysis, the book is filled with descriptions of Hitchcock’s working methods. McGilligan quotes numerous sources, but three are particularly interwoven: Hitch, the official biography published by John Russell Taylor in 1978, which served, perhaps more than any other, as the foundation for this work; Hitchcock, François Truffaut’s 1967 interview book; and Donald Spoto’s The Dark Side of Genius.

Alfred Joseph Hitchcock was born on August 13, 1899, in the living quarters above the Leytonstone greengrocer shop his parents ran. Hitchcock’s family were devout Catholics, and he maintained his faith to varying degrees throughout his life and manifested it—again in varying degrees—in his work. While McGilligan does not really push the idea of the Catholic Hitchcock isolated in a nominally Anglican country, he does reinforce the notion that the director "himself said that it might have contributed to his 'eccentricity'" (p. 17). Hitchcock’s parents sent him to a series of Catholic schools culminating with St. Ignatius College, a Jesuit day school, which the young Alfred attended from 1910 to 1913. Afterward, he entered the London County Day School of Engineering and Navigation. A year later, he began working for Henley’s Telegraph Works, where he realized engineering did not interest him. Hitchcock was too young for military service at the outbreak of World War I, and McGilligan can only postulate how his subject avoided induction when he came of age in 1917. He does point out, however, that in 1917 Hitchcock joined a cadet regiment of the Royal Engineers.

Once he had established himself at Henley’s and had moved into a London apartment, Hitchcock enrolled in art classes at Goldsmith’s College, an extension of London University. At Henley’s, Hitchcock was transferred from sales to advertising—his earliest apprenticeship in the field of public relations and publicity, at which his skill eventually rivaled that of his filmmaking technique. He was a founding editor, business manager, and regular contributor to the company’s house organ, The Henley Telegraph, a magazine that went far beyond the usual company news. McGilligan reproduces seven of Hitchcock’s contributions to the magazine, which show his subject’s early style of blending humor, the macabre, and the occasional twist ending. The film industry soon beckoned the young man, and in 1921 Hitchcock went to work full-time in the art department of British Famous Players-Lasky, the UK extension of the production arm of Paramount Pictures.
He had already worked part-time for the film company on at least three films. Hitchcock’s first job in the motion picture industry was that of “captioneer,” which involved lettering (and sometimes writing) the explanatory title cards used in silent movies, and occasionally illustrating the cards or drawing borders. Also working at the studio was Alma Reville, the woman who became not only Hitchcock's wife (on December 2, 1926) but also the most important person in his career.

During the next few years, Hitchcock’s career moved upward on a straight trajectory. In 1920 and 1921, Hitchcock worked as title designer on seven films. In 1922, he received screen credit as title designer and art director on five films and as director and producer of a sixth, titled Number Thirteen, which remained unfinished for lack of money. By this time, British Famous Players-Lasky had ended production. In 1923, Hitchcock signed on with a new film company, Balcon-Saville-Freedman (the first two names referring to Michael Balcon and Victor Saville), and so did Alma. As McGilligan relates it, Reville and Balcon facilitated Hitchcock’s final period of apprenticeship. From the latter half of 1923 through 1925, he and Reville worked on five films: Hitchcock as co-writer, art director, and assistant director; Reville as editor and second assistant director. Balcon produced all five, although the final three were for a new company, Gainsborough Productions. This trio may indeed have constituted the first of what McGilligan describes as the “three Hitchcocks”—triumvirates made up of Hitchcock, Reville (whom McGilligan annoyingly refers to as “Mrs. Hitchcock” rather than by her professional name), and a third person, often a screenwriter, for pre-production brainstorming sessions.

In 1925, Balcon gave Hitchcock another shot at directing films, and McGilligan quotes Hitchcock from an interview with director Peter Bogdanovich: “Balcon is really the man responsible for Hitchcock. I had been quite content at the time writing scripts and designing” (p. 67). McGilligan admits the generosity of this quote, considering that Hitchcock and Balcon later had a falling out when Hitchcock left for Hollywood. Hitchcock’s first completed directorial effort was The Pleasure Garden (1926). The script was written by Eliot Stannard while Alma served as assistant director. The interiors were filmed in Germany because the picture was actually a co-production of Gainsborough and Münchener Lichtspielkunst (phonetically called “Emelka” for its initials—MLK), a Munich production company. Here McGilligan hints at the origins of the classic Hitchcock blonde when he acknowledges that Alma began to help “shape his aesthetic of female beauty” at the time (p. 69). McGilligan also touches upon, but does not dwell on, Hitchcock’s interest in German expressionism and its influence on his work. Not much later, as the author describes, the young director would also come under the spell of the Soviet filmmakers and theorists of the 1920s, exponents of montage as the sublime standard of filmmaking. If Balcon was not the first to complete the triangulation of the so-called three Hitchcocks, then that honor surely fell to scriptwriter Eliot Stannard, who wrote eight of the ten silent films Hitchcock directed and, according to McGilligan, had a hand in the other two. Probably the best-known Hitchcock film of this period was The Lodger (1926), which Stannard adapted from the novel by Marie Adelaide Belloc Lowndes.
The Lodger, Hitchcock’s third film, was a “Jack the Ripper” tale starring Ivor Novello. This was an era when stodgy distributors and exhibitors ran the production companies, and Hitchcock was forced to make changes to the film to render it more commercial than he had intended—and Hitchcock was never shy about making a commercially viable picture. Nevertheless, it provided an early tutorial in negotiating the demands of studio executives and censors. The Lodger also marked the first of Hitchcock’s numerous cameo appearances in his films.

As happened with many directors of the late silent era, Hitchcock’s last silent film, Blackmail (1929), was also his first “talkie.” By this time he had been working for British International Pictures (BIP) for nearly two years, with John Maxwell replacing Balcon as his producer. During the sound portion of Hitchcock’s BIP period, Alma was either the writer or co-writer of all of his features, including an adaptation of Sean O’Casey’s Juno and the Paycock (1929). Hitchcock returned to Balcon’s fold in 1934. Balcon was now the production chief of a new studio, Gaumont British, “which had acquired a holding interest in Gainsborough” (p. 152). (McGilligan’s explanation of the machinations of the British film industry in the silent and early talking eras is lucid and just long enough to hold the reader’s attention without slowing the pace of his story.) At Gaumont British, Hitchcock made the finest films of his pre-Hollywood period. These included the original version of The Man Who Knew Too Much (1934), The 39 Steps (1935), Secret Agent (1936), and Sabotage (1936). All were produced by Balcon, and the last three were written by Charles Bennett, who had replaced Stannard as the “third Hitchcock.” Alma was now in charge of continuity on her husband’s films. On the basis of these and other films, by 1938 Hitchcock was, according to most critics, Britain’s premier director, a sentiment McGilligan seconds. And Hollywood, in the person of producer David O. Selznick, thought so too.

McGilligan’s account of Hitchcock’s seduction by Selznick and his older brother Myron could itself be a stand-alone book. In the 1930s and 40s, Myron Selznick was one of the top agents in Hollywood, and he soon added Hitchcock to his client list. Meanwhile, David O. Selznick was forging a path as an independent producer. While other producers and studios had an interest in signing Hitchcock (and Balcon wanted to retain him), it was an open industry secret that Selznick had the inside track, courtesy of his brother. Hitchcock appeared to be blind to the obvious conflict of interest, at least according to this account. In the end, the double-teaming worked perfectly—Hitchcock signed a contract with Selznick International Pictures that looked very good in the midst of the Great Depression but was egregious by Hollywood standards. Not only was the money subpar (here McGilligan summarizes a survey of directors’ salaries of the period done by Leo Rosten for his book, Hollywood: The Movie Colony, the Movie Makers), but Hitchcock was tied to Selznick by a series of one-year options. The director was often loaned out to other studios, for which Selznick was paid. Hitchcock’s first movie for Selznick International was Rebecca (1940). The director was allowed more freedom than most would have expected from such an interfering producer because Selznick was preoccupied with the massive publicity campaign for Gone with the Wind (1939).
Certainly, nothing Hitchcock directed for Selznick approached that film, though he directed only three pictures for Selznick himself, the other two being Spellbound (1945) and The Paradine Case (1947), with the script for the latter credited to Selznick. Yet Hitchcock was prolific during the 1940s, releasing nine other films, including such gems as Foreign Correspondent (1940), Suspicion (1941), Shadow of a Doubt (1943), Lifeboat (1944), and Rope (1948). By the time of Rope, Hitchcock was no longer under Selznick’s thumb.

McGilligan also describes Hitchcock’s behind-the-scenes war work, which included directing three short films (or, in the last instance, compiling footage). The first two were propaganda pieces produced by the British Ministry of Information for Phoenix Films, designed to boost morale in the Free French territories and stiffen the fighting spirit of the Resistance. These were Bon Voyage and Aventure Malgache, both released in 1944. The third, made in 1945 under the auspices of the Supreme Headquarters of the Allied Expeditionary Force, came to be titled Memory of the Camps. It was never finished and was locked away for 40 years. It was broadcast on public television in the United States in the 1980s (after Hitchcock had died), and its grisly footage of the Nazi death camps can now be found on the Internet.

Though McGilligan regales his readers with anecdotes concerning Hitchcock’s working methods for each film, he seems to enjoy himself most when he shows the master tricking the studio censors or being downright subversive (for the era). The subversiveness even extended to casting. In Rope, for example, a film packed with homoerotic symbolism, the two young killers were portrayed by gay actors Farley Granger and John Dall, which added to the film’s frisson. But whether he was sly or subversive, coy or cute, casting was always important to Hitchcock because, as McGilligan recounts more than once, for Hitchcock the real creative work was in the preparation. Filming, he liked to say, was merely a matter of recording what had been scripted, storyboarded, and blocked. Of course, that attitude more or less coincides with the director’s famous “actors are cattle” quote, which McGilligan mentions and then skates around.

By the time he had extricated himself from the Selznick contract, Hitchcock was deep into his golden period—the 1940s and 50s. The latter decade was the pinnacle of his career, when he made a number of what are now considered his classic films for Warner Brothers, Paramount, or MGM. McGilligan treats the 1950s films with awe and reverence—he even subtitles the section on Hitchcock’s Paramount films “The Glory Years”—including lesser works like I Confess (1953), The Trouble with Harry (1955), and The Wrong Man (1956). McGilligan shows that Hitchcock’s technique in The Wrong Man was again influenced by European cinema. But instead of German expressionism or the Soviet avant-garde, Hitchcock now turned to Italian neorealism as the guiding aesthetic to tell the true story of a man falsely arrested in New York City.

McGilligan rightly exults in this period, for many films of that era are great works of art. During these years, Hitchcock scored casting coups with the male leads for his films, primarily Cary Grant, James Stewart, and Henry Fonda. His female stars in the 1950s included Marlene Dietrich, Grace Kelly, and Eva Marie Saint, with Kelly epitomizing the cool blonde.

It was also in the 1950s that Hitchcock gained acceptance within his own industry. It may sound ironic, but he had not been as accepted in Hollywood as his reputation deserved; McGilligan records numerous instances of actors and writers dismissing Hitchcock’s work and genre as second-rate. It took young critics from the French film magazine Cahiers du cinéma—some of whom, such as Jean-Luc Godard, François Truffaut, Claude Chabrol, and Eric Rohmer, would become great film directors in their own right—to cement Hitchcock’s artistic reputation. They managed to convince the world of his artistry. This nouvelle vague acclaim culminated in the 1960s with the publication of Truffaut’s interview book, Hitchcock. Also at this time, Hitchcock became a recognizable figure to the public because of his television show Alfred Hitchcock Presents, a half-hour suspense anthology series. (Toward the end of its seven-year run the program expanded to an hour.) The producer was Joan Harrison, a “third Hitchcock” dating back to his Gaumont British days.

For Hitchcock, the 1960s were the beginning of the end, despite the early-decade triumphs of Psycho (1960) and The Birds (1963). McGilligan goes about his task of recording and explaining Hitchcock’s decline, as a biographer should, but he takes no pleasure in it, and neither does the reader. A number of factors played into this decline: Hitchcock’s age and health, Alma’s own declining health, and the loss of friends and colleagues, especially some of the production people he had relied on for years at Paramount. Changing trends in the industry contributed to his own sense that his style had become dated. His last two films of the decade, Torn Curtain (1966) and Topaz (1969), were, according to McGilligan, attempts to reclaim the spy genre from the Bond films and their imitators, all of which Hitchcock thought cartoonish. Unfortunately, both were commercial and critical failures. Hitchcock managed a brief comeback with Frenzy (1972), a serial-killer tale set in London, as though he were bidding a final goodbye to the city that nurtured his career. But McGilligan’s account of the making of this film is tinged with pathos. From here on, the story of Hitchcock’s life is saturated with the Hollywood clichés of the aging filmmaker that were certainly absent from his films. To his credit, McGilligan conveys the psychological pain and loneliness of the great man in winter but does not overdo it. And in case the reader has forgotten one of the reasons for the book’s existence, McGilligan adds a coda that summarizes the greatness of Hitchcock’s career, though the preceding 700-plus pages hardly needed a recapitulation of their major theme.


Although some of his theories are still hotly debated, Sigmund Freud (May 6, 1856–September 23, 1939) is widely regarded as a trailblazer in the realm of psychiatry and psychology. The Austrian psychiatrist and neurologist, arguably the first to offer a comprehensive explanation of how human behavior is determined by conscious and unconscious forces, is regarded as the founder of psychoanalysis.
Along with the “talk therapy” that remains the staple of psychiatric treatment to this day, Freud popularized, among other notions, such concepts as the psychosexual stages of development; the Oedipus complex; transference; dream symbolism; the Ego, Id, and Super-Ego; and the one that has entered colloquial English more than any other psychiatric term—the Freudian slip.

Edward Erwin is Professor of Philosophy at the University of Miami at Coral Gables. He is the author of several books as well as numerous articles in the philosophy of science, epistemology, philosophy of language, and philosophy of psychology. He is also editor-in-chief of The Freud Encyclopedia: Theory, Therapy, and Culture—the first in-depth encyclopedia on the life, work, and theories of Sigmund Freud. He joins us on Culture Insight to share his perspective on the life and work of Sigmund Freud.

https://youtu.be/c36aocpGNjk


1. Winston Churchill’s teachers described him as unambitious, rebellious, and violent, and said that he could not be trusted to behave himself in any situation.
2. Even though he was known for his remarkable ability to make stirring speeches, he actually suffered from a speech impediment, which he made a great effort to hide.
3. At his house in Kent, England, he stocked between 3,000 and 4,000 Cuban cigars at a time.
4. Churchill acquired his taste for cigars during his visit to Cuba in 1895. He traveled there because the Cuban uprising against the Spanish empire was, to him, the only interesting war going on at the time.
5. While fighting in the Anglo-Boer War in 1899, he was captured by the Boers and thrown into a POW camp. One night, he made a dramatic escape by vaulting the prison walls. Once out, he stole food and hitched rides until he finally reached safety in Mozambique.
6. In 1904, during his first term in the British Parliament, Churchill helped draft a piece of legislation that mandated the sterilization of those who were referred to at the time as “feeble minded.” He was acting on what he had said years earlier to his cousin: “The improvement of the British breed is my aim in life.”
7. He suffered from intense bouts of depression, which he called his “Black Dog” periods. It is now believed that he had bipolar disorder.
8. When justifying his support of the Soviet Union, Churchill famously said, “If Hitler invaded hell, I would make at least a favourable reference to the devil in the House of Commons.” He was expressing his belief that the enemy of your enemy is your friend.
9. In 1953, Churchill was awarded the Nobel Prize in Literature and was knighted by Queen Elizabeth II, becoming Sir Winston Leonard Spencer-Churchill. That same year, he suffered a stroke; funeral plans, code-named Operation Hope Not, were drawn up.
10. Upon Churchill’s death in 1965, his funeral saw the largest gathering of members of the public and statesmen since 1852, when the Duke of Wellington was buried.


By re-examining Adam Smith’s theories as they were originally articulated, Gavin Kennedy, Emeritus Professor at Heriot-Watt University, Edinburgh, aims at nothing less than rescuing the authentic Smith from the distorted interpretations, assumptions, and attributions of modern economists. Having elsewhere criticized scholars for partial and misleading citations from Smith’s work used as a means of validating their own theories, Kennedy is eminently suited to the task.

With short, snappy chapters that are thematically and sequentially coherent, Kennedy seeks to demonstrate that Smith’s original message was very clear but has been misused by ideologues of the Right and, perhaps more surprisingly, of the Left. Judicious citations from Smith’s The Theory of Moral Sentiments (1759) and The Wealth of Nations (1776) provide a compelling dynamic to Kennedy’s narrative. The former was an essential conceptual and philosophical link to the latter, and with this moral philosophy in mind, Kennedy reads Wealth of Nations as a philosophical treatise rather than an economics textbook, for Smith “directed his intellectual output at emphasising the mutuality of human conduct through chains of exchange relationships arising from the dependence of each person in society on the services of many independent others” (p.18).

Consequently, Smith’s earlier work developed ideas of empathy, interdependence, and the harmony of interests, and how these values played out in society was highly significant in underpinning Smith’s political economy. The author considers the passage relating “the propriety of generosity and the deformity of injustice” vitally important in countering the misinterpretations of those who argue that Smith preached the supremacy of self-love and self-interest, and that his fundamental doctrine was “greed is good” (p.137). Kennedy devotes considerable time to developing and demonstrating two fundamental points: first, that Smith was neither a purveyor of pure laissez-faire nor the ideological forerunner of the neo-classical Chicago School, and second, that he advocated a fairer, more equitable society, based not on redistributive mechanisms but on sharing the future affluence generated by economic growth and higher employment.

The idea of Smith as an ideologue of non-interventionist economics has a long history, from the Classical Economists and the Manchester School to Hayek and Friedman. Yet historians of economic thought, notably Jacob Viner and Mark Blaug, have acknowledged that Smith was not a doctrinaire advocate of laissez-faire policies (p.183). Smith accepted the legitimacy of protective duties in particular cases and, more broadly, did not oppose government intervention on principle but opposed policies that undermined or obstructed free competition. He added that many obstructions were enacted at the behest of merchants and manufacturers lobbying for protection and monopolies; Kennedy attributes this stance to Smith’s moral philosophy, which led him to support perfect liberty “pure and simple” without favoring any special interest, rather than to any Leftist bias.

With moral philosophy always in mind, Kennedy presents a lucid account of Smith’s political economy, demonstrating, contrary to Schumpeter’s criticism that Smith’s esoteric methodology and expositions led to his failure to cross “the scientific Rubicon” into modern economics, that Smith’s economic concepts are finely honed and highly nuanced. Smith based much of his economic theory on a sweeping analysis of the historical development of the Economic Ages of Man, from antiquity through feudalism to the commercial age, in the process demonstrating how commerce was the harbinger and catalyst of political reform and far-reaching economic development. Kennedy shows how Smith appreciated the way historical context shaped the division of labor and the labor theory of value. In primitive societies, where labor was the only factor of production, it was valid to claim that human capital was the only source of value, but in a multi-factor economy this was not applicable. Equally, the notion of surplus output driving the division of labor, leading to further specialization and a constantly evolving supply chain (“the essential difference between rude and commercial societies”), was applicable only to a particular era (p.56). For Smith, neither concept was immutable; both changed according to prevailing economic circumstances.

If economic growth and development were positive aspects of commercial society, mercantilism in all its regulatory, institutional, and legislative forms was the enveloping negative feature. Domestically, mercantilist regulations fuelled high prices, under-stocked markets, and poor workmanship. Not only were these customary practices and legislative enactments economically inefficient, but they also inhibited long-term economic growth and breached natural liberty by preventing the free mobility of labor and capital (p.188). Mercantilist restrictions made international commerce “the most fertile source of discord and animosity” between nations (p.136). However, Smith did not, as the English manufacturer and Liberal statesman Richard Cobden later did, envisage free trade as a panacea that would dissolve international rivalries. Chartered companies, monopoly rights, and colonial possessions were criticized by Smith primarily on account of the misallocation of resources, which disrupted capital flows and retarded capital accumulation.

The contemporary relevance of Smith’s work has led to much misunderstanding and misuse, most notably of the Invisible Hand metaphor, used once by Smith in Wealth of Nations to describe how intentional acts could lead to unintended consequences, some of which were beneficial, some of which were not. The metaphor was not intended as a definitive explanation of free-market operations, yet successive economists have propagated it as a “mystical principle” of how market forces promote beneficial outcomes. Even Milton Friedman described how it had been “more potent for progress than the visible hand for retrogression.” Clearly, these ideologically charged representations are a distortion. Similarly, claims that Smith disapproved of government intervention are easily dismissed. Instancing Smith’s proposals on education, justice, and capital projects of economic infrastructure, complemented by an official apparatus of “instruments of intervention” and a sanctioned list of governmental duties, Kennedy convincingly argues that utility, not principle, governed Smith’s stance (p.176).

For Smith, wealth creation and capital formation were instrumental in alleviating poverty. British working-class history in the nineteenth and twentieth centuries validates his perspective, for contrary to Marx’s predictions of working-class immiseration, rising real wages and the general spread of opulence raised working-class living standards. Smith has also been attacked for failing to grasp how economies of scale contradicted the harmony of self-interest with the common good, but it has to be remembered that Smith never asserted that self-interested acts were universally socially benign. This type of criticism indicates that the idea of the Invisible Hand universally guiding market forces toward positive outcomes is still widely and mistakenly advanced.

Adam Smith’s fame and prestige never rested on originality. Many others anticipated elements of his thought, and a cross-current of Scottish and European Enlightenment thinkers influenced his writing; more might have been made of these intellectual influences. Yet, while the author’s admiration for his subject is clear, he does not shrink from criticism, especially of Smith’s often casual periodization and imprecise dating. Valid as such criticism is, it hardly detracts from the magnitude of Smith’s achievement. It is no longer credible to claim, as the American economist George Stigler did in 1975, that The Wealth of Nations was merely a “stupendous palace erected upon the granite of self-interest” (p.109). As Kennedy argues, it is this type of careless, incomplete, and selective reading and quotation that has led to the “largely invented image” of Smith as an advocate of laissez-faire (p.172). To his credit, Kennedy aims to eliminate these misconceptions, and his concise and highly analytical account makes a substantial contribution toward re-appraising the practical content and ideological application of Smith’s output. The nature, vision, and scope of Smith’s work, buttressed by extensive original quotations and careful reading, are well delineated, and the linkages between Smith’s historical analysis of economic development and his examination of human behavior and motivation are finely drawn. By restoring Smith to his rightful place as a philosopher who described the nature of the society he examined and who sought ways to improve the lives of its people, Kennedy has gone some way toward relocating Smith in his proper historical context and has perhaps in the process made him a less partisan and divisive figure.


The author of well-received and best-selling books about Franklin D. Roosevelt and Barack Obama, and a seasoned political analyst at MSNBC and, earlier, Newsweek, Jonathan Alter brings a well-informed and fresh perspective to the thirty-ninth President of the United States, taking advantage of new archival sources, including declassified documents in the CIA Records Search Tool available at the Jimmy Carter Presidential Library and Museum. Alter notes that he received the “extensive cooperation of Jimmy Carter and eighteen members of his family,” including Rosalynn Carter, who shared her husband’s love letters. She is a major character in this biography, which is the story of a loving marriage (not without its tensions) and a political partnership. Rosalynn usually was, as the cliché now has it, “in the room,” serving her husband as a chief advisor. Her work on improving access to mental health therapy pioneered the later efforts of First Ladies, especially Michelle Obama.

It is surprising to note that no full-fledged biography of Carter preceded Alter’s, although another Carter biography is on the way from Kai Bird. Something is afoot in the world of politics and biography: Jimmy Carter, dismissed by political elites, the populace, and even former presidents as a failed leader, has slowly been redeemed, rising in the ranks of historians from the bottom third to the top third of American presidents. The conventional explanation for this turnabout is Carter’s spectacular post-presidency, with achievements in halting the spread of disease, fostering democracy around the world, and working at home with Habitat for Humanity. All good works, Alter acknowledges, but not nearly as consequential as what Carter accomplished as President.

Alter makes a compelling case for Carter as the most important political figure in the environmental movement since Theodore Roosevelt. In the final days of his presidency, for example, he managed to preserve something like a third of Alaska as an environmental preserve that oil companies and other exploiters of the land could not despoil. When Republican senator Ted Stevens tried to weaken Carter’s Alaska initiative, bringing to the White House maps and arguments that he thought would release more land to private enterprise, he came away disappointed, exclaiming that Carter knew more about Stevens’s state than he did. That was typical Carter: an engineer, voracious reader, and stickler for detail—all characteristics that made him an impressive but also vulnerable president. It is not that Carter knew too much, but in Alter’s telling Carter too often turned off both Democrats and Republicans with a know-it-all stance that was all the more infuriating because he did seem to know it all.

But Carter was hardly a one-issue president. Putting solar panels on the White House roof, for example, was not a stunt but part of Carter’s realization that, to save the planet as well as the American economy, the country would need to develop alternative forms of energy. President Reagan’s removal of those panels and disregard of environmental concerns set the nation back precious decades in combating climate change, a setback the government has yet to reverse. American presidents from Clinton to Obama tried to catch up to Carter’s vision of renewable energy before President Trump halted their progress toward a cleaner world.
Carter transformed the American judiciary, appointing many more women and people of color, and worked closely with African American leaders like Martin Luther King, Sr. for racial and social justice. In this he was rectifying his own reticence during the 1950s and 1960s, when he did not speak out against segregation and even welcomed the support of populist racists like George Wallace. In his campaign for governor of Georgia, Carter remained largely silent on the issue of race, although in office he began the long drive to reconcile racial divides. In fact, he acted so boldly that he admitted he could not have been re-elected governor of Georgia.
Alter shows that Carter had superb insights into how to get elected, pioneering the Washington-outsider ploy that so many candidates have used since his successful run for the presidency in 1976. So what went wrong? Alter astutely shows that Carter, as an outsider, never bothered to cultivate a constituency within the Democratic Party. He expected to win because he was on the right side of the issues. He despised the give-and-take of congressional deal-making and almost always took the moral high ground, thus alienating even allies, who found him sanctimonious. Of course, Carter also came to grief because of the takeover of the U.S. embassy in Iran. Alter shows that neither Carter nor his staff understood the changing political climate of Iran; faulty intelligence contributed to their ignorance and misjudgment of Iran’s leaders. The ensuing drama of the American hostages’ more than year-long captivity, which made Ted Koppel a star on the ABC evening program Nightline, turned the crisis into a daily indignity that frustrated and angered the American electorate. Even worse, the disastrous effort to rescue the hostages, ending in American fatalities in the desert before the rescue team could even reach the embassy, branded Carter as an unlucky and ineffectual president. It did not help that the country suffered high inflation and slipped into a recession from which it did not recover in time to blunt Ronald Reagan’s challenge: “Are you better off than you were four years ago?” Not even the historic peace agreement between Egypt and Israel could save Carter from defeat, even though all sides publicly proclaimed his indispensable role. The courageous Carter never hesitated to put his reputation at risk by personally negotiating with Egypt’s Anwar Sadat and Israel’s Menachem Begin. Foreign policy achievements, including Carter’s bold moves to open up China, which went well beyond President Nixon’s policies, did not earn him enough credit for a victory that opened new markets for the United States and brought China more fully into the world order. And of course his opponent, the taller, telegenic, and well-spoken Ronald Reagan, made Carter look even smaller in their one debate, which dispelled worries about the hard-line Californian, who came off as so genial he made Americans feel good about themselves. By contrast, Carter’s speeches calling for sacrifice were turned against him, as though he were lecturing the American people about their profligacy. To say only this much about Alter’s biography is misleading, since this book is about more than politics.
It is also about a man forever learning: to paint, to write poetry, to author many books, personal and political, and to travel tirelessly on behalf of peace—sometimes to the consternation of presidents who saw him as interfering with State Department policy and burnishing his own reputation to the detriment of the very peace process he supported. And yet the very idea of free and fair elections, with Carter on the ground across the globe supervising them in person, reinforced the moral standing of the United States and continued the human rights foreign policy Carter had promulgated as president. Coming to the close of his narrative, Alter offers a paragraph that sums up the force both of Carter’s life and of this biography:
For nearly a century, he had already lived again and again and again—constantly reimagining himself and what was possible for a barefoot boy from southwest Georgia with a moral imagination and a driving ambition to live his faith. That passionate commitment—sustained long after others of his generation had left the field—would take him to the farthest corners of the earth, but he always came full circle to Plains, where his inner and outer selves could find repose.
Now you see the power of Alter’s title. Carter has always given America and the world “his very best,” showing what it can mean to be an American. But Carter’s story also seems biblical: He is our Methuselah.

Wade Rowland, the author of Galileo’s Mistake, characterizes the trial of Galileo Galilei by the Inquisition in 1633 as a defining moment in modern Western culture. As generally understood, this trial pitted arbitrary and dangerous religious authorities against the progression of scientific discovery. As Rowland defines what he calls “the myth of Galileo,” the Catholic Church condemned Galileo because he had discovered the truth. The author, one of Canada’s leading literary journalists, begins his opposition to this interpretation by educating the reader about how Galileo’s contemporaries in the 17th century would have perceived the issues involved in the trial. Rowland writes that it is a mistake to impose modern sensibilities on the 17th century, and he succeeds in showing how difficult it is for people today to relate to or comprehend the mores of the ancient or medieval worlds. In his opinion, this is because the Scientific Revolution, in which the Italian astronomer, physicist, mathematician, and philosopher played an important role, has overwhelmingly influenced the modern worldview. This is a theme the author returns to again and again. For example, he writes, “I came to share a conviction that the roots of what is most disturbing about the modern world find their nourishment … in what is often called the Scientific Revolution.” Rowland believes the Scientific Revolution “expanded the creative horizons of humanity while reducing the mass of individual humans to the status of commodities and consumers” and “improved health and longevity while promoting unprecedented spiritual and existential dis-ease.” Rowland offers a kind of survey of the history of scientific thought, experimentation, and discovery, which is the best part of the book. He provides an excellent and insightful summary of 17th-century science, beginning with the traditional and accepted view of a universe in which the Sun traveled around the Earth and the Bible was the recognized guidebook to the heavens. He covers Ptolemy, Kepler, and Copernicus, providing readable depictions of their discoveries and how those discoveries were accepted—or not—by the general public and the Church. His presentation of Galileo’s life and discoveries is especially interesting; he notes that in his own time Galileo was known for his work in mechanics rather than astronomy.
The author repeatedly disavows the contention that he is acting as an apologist for religion, but he seems less than enthusiastic about the scientific side of the argument. He writes, “There is a legitimate place for religious insight in the pursuit of science” and wants to emphasize “that science is not the only legitimate fount of knowledge … that it can and should be challenged on some of its most fundamental preconceptions.” In a rather alarming mixed metaphor, he writes, “I want to lift its skirts and expose the rot in its foundations.” Rowland’s major point is that the Church was not closed-minded about Galileo and his support of the so-called Copernican heresy, and overall, he makes a good case for this position, citing examples of individuals at the highest levels of the hierarchy who thought Copernicus was probably right. The Church’s real objection to Galileo, according to the author, involved how the astronomer expressed his belief. Galileo thought the universe “was essentially a mathematical reality, in some literal way composed of numbers.” The Church could not accept this position because it excluded the possibility of an ultimate purpose to existence. Therefore, the argument between Galileo and the Church was really about “the nature of reality and what we can truly know”—an argument that Rowland believes continues to “bedevil” modern civilization to this day. Galileo’s mistake was in believing that “nature is its own interpreter.” The author argues it is wrong “to assert, as Galileo did, that there is a simple unique explanation to natural phenomena, which may be understood through observation and reason, and which makes all other explanations wrong … Scientists do not discover laws of nature, they invent them … Society’s ‘facts’ about nature are not preexisting truths, they are human constructs … the truth that science ‘discovers’ is not objective and immutable, it is subjective and socially contingent.” Interspersed throughout the book are present-day dialogues between the author, his friend and former student Berkowitz, and a nun, Sister Celeste, in which they expand on the science-versus-faith debate. These conversations take place in various places in Italy that have a connection with Galileo’s life, so the reader gets a flavor of the physical settings associated with the trial. At times, however, the dialogues distract from the book’s overall flow, and Rowland’s tone sometimes slips into condescension toward his conversational partners, characters who are not always convincingly portrayed as people existing beyond their stereotypical roles. For example, Berkowitz always takes the side of materialistic, scientific inquiry, while Sister Celeste, whom Rowland repeatedly describes as “the little nun,” favors mystery and faith. The dialogues also present additional biographical and historical information about important figures like Kepler and Copernicus, which readers are sure to appreciate. The author is not unbiased in his presentation—nor should he be, since he is taking part in one of the iconic debates of Western intellectual history. For example, he describes Aristotle’s ideal man as “a country-clubbish, self-righteous creature of decidedly materialistic leanings. A bit of a stuffed shirt. Very modern, in fact.” The author seems to dislike Galileo as a person as well.
He describes him as “insufferable” and cites the “polemical” and “bellicose” style of Galileo’s writing, especially in The Starry Messenger, as the real reason for the Church’s criticism of his ideas. Overall, this is an interesting take on one of the central events in Western intellectual history, and it is likely to be well received by readers who relish debate about the nature of truth and knowledge.

This group portrait of the foundational theorists of evolution opens with the funeral of Charles Darwin, who, despite his wishes, was buried with much pomp in Westminster Abbey in 1882. Three of the men he had both influenced and admired most—botanist Joseph Hooker, biologist Thomas Huxley, and zoogeographer Alfred Russel Wallace—were among his pallbearers. This opening is a clever way of bringing together four men whose biggest adventures had been undertaken independently, at an earlier point in their lives. And yet, it’s no mere artifice: the struggle to gain public acceptance for the idea of natural selection wove the separate strands of their lives together in the end.

Iain McCalman, a historian and research professor at the University of Sydney, has many books and honors to his name. But perhaps equally important to this project is his familiarity with southern latitudes—he was born in Africa and pursued an academic career in Australia. Darwin’s Armada (Norton, 2009) recounts four successive voyages in the South Seas, each with a different ambitious young man aboard serving, at least in part, as a naturalist. Darwin’s five-year expedition on the Beagle was only the first and most famous. It was followed a few years later by Hooker, who joined Captain James Clark Ross’s expedition to Antarctica aboard the Erebus, after which Huxley sailed with the Rattlesnake along the coasts of Australia and New Guinea. Wallace was not associated with one particular ship, but the sea played a decisive role in his travels to the Amazon and later to Indonesia as well.

McCalman lays a good deal of groundwork in exploring the early lives of these young men and the forces that led them to leave home and travel for long periods in conditions that could hardly be called comfortable. All had to get used to life on a ship, among men who were largely indifferent to the explorers’ aims, though often another officer or surgeon aboard would share a similar passion for collecting. And though by necessity this was a masculine way of life, all of them were sustained by their correspondence with women, whether family or, as was particularly true in Huxley’s case, those with whom they had formed a romantic attachment. The slowness of the mails meant that momentous news of deaths, births, and marriages might arrive a year late. And Hooker, whose future largely depended on his success in gathering the right assortment of specimens, lived for months keenly feeling his father’s apparent disappointment, not knowing that the next collection he sent had been met with excited approval.

Though the stakes were higher or lower depending on each man’s circumstances, all went to sea to validate themselves. Darwin, for example, had failed to become the doctor his father wanted him to be, and his family feared that he would turn into an idle dilettante. Hooker, without an independent fortune like Darwin’s, needed to make a mark that would procure him one of the rare paid university positions. As it happened, each successive voyager started from a lowlier and more perilous beginning and had to work even harder than the one before him, not just for prestige but also to achieve some recognition within British society. Even their scientific accomplishments, however, did not always bring them security.

The naturalists’ shared interest in travel, adventure, and collecting specimens had a literary foundation. A British publication called Boy’s Own Paper, which both Darwin and Hooker had read in childhood, sparked dreams of exploration and discovery. Darwin’s ambitions were kindled by Alexander von Humboldt’s Personal Narrative, which told of his adventures in Tenerife and South America. And all four read, or at least knew of, the geological studies of Charles Lyell, which impressed them with new ideas of how old the earth really was and how slowly its features had been shaped over this “deep time.” Once Darwin’s The Voyage of the Beagle came out, the younger naturalists read it too. In fact, shipboard life, with its long stretches of idleness, proved a good basis for a scientific education, especially when a library was available on board.

But McCalman’s book is not all about scholarly research. Life at sea also held many adventures, some of them harrowing. As he describes it, this mixture of scientific discovery and hardship, combined at times with great beauty, shaped the travelers’ disparate personalities toward a common understanding of the natural world. Though they came from very different social backgrounds, they were in the end united in their advocacy of Darwin’s central tenets.

The final section shows how a very deliberate campaign on the part of the group and their supporters prepared the public to give credence to the theory of evolution, which was revolutionary in its implications. Darwin felt more comfortable about publishing his theory because Wallace had come to the same conclusion on his own. Some discussion of how Darwin dealt with his rival, and whether he was fair to him, does come up, but in general McCalman is more interested in the bonds Wallace formed with the other men. And after all, Wallace remained a faithful friend to Darwin until the latter’s death.

Although the structure of Darwin’s Armada might be seen as repetitive, the effect is cumulative and at times quite moving. We are now in a different age, but one wonders what dreams of discovery and adventure might yet be planted by McCalman’s retelling of these voyages.

Readers of John Steinbeck’s The Log from the Sea of Cortez will certainly recognize the Western Flyer as the boat that Steinbeck and his friend, the marine biologist Ed Ricketts, hired in 1940 to take them to the Sea of Cortez (the body of water that divides mainland Mexico from the Baja peninsula) on an exploratory and collecting expedition. In this book, Kevin M. Bailey traces the history of that notable boat from its construction in a Tacoma, Washington, shipyard to its current status in dry dock as it awaits a complete overhaul. As the subtitle suggests, Steinbeck and Ricketts’s six-week adventure dominates the early part of the book (and is certainly its selling point), but the author, a Pacific fisheries scientist and a founding director of the Man & Sea Institute, also discusses the plight of the Pacific fisheries. He segues easily from one to the other while recounting the Western Flyer’s story as a fishing boat and purse seiner, which, under various owners, captains, and crews, took part in different fisheries up and down the West Coast and as far north as Alaska.

The narrative has turns that are anecdotal, gossipy, speculative, and analytical. After striking an elegiac tone in the prologue describing the Western Flyer’s fate in retirement in Alaska, Bailey talks about the ship’s construction, including a brief profile of Martin Petrich Sr., the boatyard owner. A description of the ship’s dimensions and an accompanying diagram follow, as do profiles of the captain and crew at the time Steinbeck and Ricketts hired the Flyer. The Western Flyer was then part of the sardine fishery, sailing out of Monterey, California. Bailey enlivens this account with anecdotes and background information about Captain Tony Berry and crewmen Tex Travis, Tiny Coletto, and Sparky Enea. Steinbeck, of course, needs no introduction, but Bailey recognizes the book would be paltry without some background information on the Nobel Prize winner, and more than once he mentions that the author was unpopular in and around his native Salinas after the publication of The Grapes of Wrath. He also mentions that at the time of the six-week cruise Steinbeck’s marriage was all but over, yet his wife, Carol, chose to come aboard the Flyer for the entire journey. Steinbeck aside, Ricketts was the most interesting figure of the bunch. A marine biologist, a philosopher, and an early deep ecologist, he ran Pacific Biological Laboratories and was the inspiration for the character of “Doc” in Steinbeck’s novel Cannery Row. (In this section Bailey also inserts another noteworthy figure into his narrative, the mythologist Joseph Campbell, who knew both Steinbeck and Ricketts.)

When Ricketts and Steinbeck had collected enough specimens in the shallows of the Sea of Cortez, the Western Flyer turned back toward Monterey, where the ship and its crew returned to the sardine fishery. Unfortunately, by the late 1940s, the sardine boom had collapsed. Here Bailey presents a short, insightful analysis of the fishery’s demise, offering Ricketts’s explanation as well as other scientific views, some of which coincide with Ricketts’s and some of which don’t. As for Tony Berry and his crew, they decided to switch to tuna fishing. The Western Flyer, however, was too small and too slow for that work, so Berry sold it back to Martin Petrich, who, in turn, sold it to Armstrong Fisheries of Ketchikan, Alaska.

For a year or so, the Flyer was part of the herring fishery in Alaska, but in 1952, the boat was sold to Dan Luketa, a fisherman from Seattle. Bailey draws an interesting profile of Luketa, including his sad final years. The enterprising Luketa eventually worked his way up to a small fleet of fishing boats, which, along with the Western Flyer—now converted into a trawler—worked the fishery for Pacific Ocean perch (also known as red rockfish). Eventually, the perch fishery in the waters off the northwest coast suffered the same fate as the sardine fishery farther south, only this time the reasons were more complicated, both ecologically and politically, as industrial fishing—especially by Soviet and Japanese fleets—had taken its toll. Bailey’s explanation of this is detailed, including a graph showing the decline of the fishery.

When Pacific Ocean perch could no longer be fished, Luketa took the Flyer farther north to Alaska to get in on the boom in red king crab. But as with the sardines and perch, the crabs too became depleted, and Luketa sold the boat, renamed Gemini, in 1970. By the mid-1980s, the Gemini was merely a salmon tender, no longer fishing itself but serving as the vessel to which the fishing boats transferred their catches for delivery to the cannery. As Bailey describes it, “A fish tender is little more than a self-powered barge” (p. 93), and this would seem an ignominious end for one of the best-known boats in American literature.

In fact, the Western Flyer/Gemini was to suffer a worse fate. Disuse and lack of upkeep caused the boat to sink while it was tied up at an Alaskan dock. It was subsequently raised and is expected to be refurbished at enormous cost. Its likely ultimate destination, as of the time of Bailey’s writing, is Salinas (rather than Monterey), where it will serve as a beached tourist attraction. Bailey bemoans this plan—his preference is that old fishing boats, even one as well known as the Western Flyer, ought to meet their natural fates—but he seems at least partially reconciled to it: “In my own dreams the Western Flyer is a skeleton perched in the hills overlooking the [Salinas] valley, her whale ribs bleaching in the sun” (p. 113). And, really, there are worse fates the indelible Western Flyer could have suffered.

1. Salvador Domingo Felipe Jacinto Dalí y Domenech was born on May 11, 1904, in Figueres, Spain. He was named Salvador after his brother, who had died the year before. Said Dalí of his brother, "We resembled each other like two drops of water, but we had different reflections. He was probably a first version of myself, but conceived too much in the absolute."

2. By the time Dalí entered the School of Fine Arts in Madrid in 1922, he was already an eccentric. To draw extra attention to himself, he would often walk the streets ringing a bell. At school, he became close friends with the famed Spanish poet and playwright Federico García Lorca. Expelled from art school in 1926 for insisting that none of his professors was competent enough to examine him, he went to bohemian Paris, where he met fellow artists Pablo Picasso and Joan Miró and joined the Surrealist group.

3. In 1934, Dalí was expelled again—this time from the Surrealist movement, for supporting fascism. The ousting, however, didn’t stop Dalí from participating in Surrealist exhibitions.

4. When Dalí was asked to deliver a lecture at the London International Surrealist Exhibition in 1936, no one could have anticipated that he would address his audience wearing a deep-sea diving suit. His justification: "I just wanted to show that I was plunging deeply into the human mind."

5. Despite his obvious flair for the unusual, Dalí was unabashedly a commercial artist who longed for fame. During his lifetime, he even created a museum to display his own work, the Dalí Theatre-Museum in his native Figueres. This is where Dalí’s famous “lips” sofa originated.

6. In painting “The Sacrament of the Last Supper” in 1955, he used his wife Gala’s face as the model for Christ.

7. In 1969, Dalí designed a bright yellow and orange logo for Chupa Chups lollipops, one of the most iconic logos of all time.

8. Even as the years went on, he never ceased to amaze. In a “60 Minutes” interview with Mike Wallace, Dalí referred to himself in the third person. Topping that, he carried a leather rhinoceros with him when he appeared on the “Tonight Show,” and insisted on sitting on it throughout his interview.

9. It’s said that when Gala, his wife of nearly 50 years, died in 1982, Dalí lost his will to live. He completed his last painting, “The Swallow’s Tail,” the following year, in 1983. By then he suffered from a palsy that made his hands shake terribly, turning painting into a difficult task.

10. A fire forced him from his home in 1984, and he lived his final years in the Theatre-Museum until he died of heart failure five years later, at the age of 84.
