
Alan Turing (1912–1954) was an English mathematician, logician, pioneer of computer science, and wartime code-breaker. He is credited with designing the Automatic Computing Engine (ACE), an early electronic stored-program computer, as well as the Bombe, the decryption device the British government used during WWII to crack the German “Enigma” machine, which encrypted secret messages.
Jack Copeland is Professor of Philosophy at the University of Canterbury, New Zealand, where he is Director of the Turing Archive for the History of Computing. His recent biography Turing: Pioneer of the Information Age draws on many years of conversations with Turing’s closest friends and colleagues, exploring the complex character of this shy genius as well as the breadth and importance of Turing’s legacy. Copeland’s other books include The Essential Turing; Artificial Intelligence; Colossus—The Secrets of Bletchley Park’s Codebreaking Computers; and Alan Turing’s Electronic Brain. He shares his insight into the life and work of Alan Turing.


John McCabe’s insightful biography of Charlie Chaplin is a scholarly work that will easily engage both young and old. McCabe (1920–2005) is best known for his biographies. In addition to thorough research for his book on Chaplin, McCabe relied on the extensive notes he had compiled from Stan Laurel’s comments about Chaplin (the two actors were lifelong friends), as well as Laurel’s annotated copy of Chaplin’s autobiography, Charlie Chaplin’s Own Story (1916), which Laurel gave to the author. Using the notes and annotations, McCabe has endeavored to sift out the truth behind the endless legends, misconceptions, and errors that surround Charlie Chaplin’s life.

McCabe explains how Chaplin’s bitter early years were fashioned by Dickensian London. Chaplin learned to sing from his parents, who were entertainers, but after their separation the family’s economic situation plummeted. His mother’s strained larynx ended her singing career, plunging her into deeper financial and psychological depression. During his early years, Chaplin and his half-brother Sydney lived on and off with their mother, who was occasionally admitted to an asylum for the mentally ill, and with Charlie’s alcoholic father. Between these periods, the boys were left at an institution for orphans and destitute children at Hanwell, a few miles outside London. The young Chaplin brothers forged a close bond to survive and gravitated toward the music hall whenever possible. As time went on, Charlie and Sydney both proved to have considerable natural stage talent. These formative years of desperate poverty and deprivation provided many of the themes and characters for Chaplin’s later films. Chaplin began his first American tour in 1910. It was on this tour that he met and shared a boarding-house room with Stan Laurel, a meeting that sparked Laurel’s lifelong respect for and interest in Chaplin.
Director Mack Sennett saw Chaplin’s act and hired him for his studio. Chaplin’s earliest films were made for Sennett’s Keystone Studios, where he developed his Little Tramp character. His initial attempts as a film artist were filled with uncertainty. However, he quickly learned the craft: selecting baggy pants, big shoes, a cane, and a derby hat. He wanted everything to be a contradiction: the baggy pants, the tight coat, the small hat, the large shoes. The mustache added age without hiding his expression. The persona of this character had a life of its own. Chaplin eventually left Keystone and went on to work at Essanay and Mutual Studios until 1918, when he assumed control of his own productions. In 1919, he co-founded the United Artists film distribution company with Mary Pickford, Douglas Fairbanks, and D. W. Griffith, a move that assured his independence as a filmmaker. McCabe provides detailed accounts of Chaplin’s time at each studio, along with insight into his comic routines and movies. Chaplin resisted making “talkies” in the 1930s, as illustrated in an article for Literary Magazine on February 28, 1931, titled “Charlie Chaplin Defies the Talkies”; he considered cinema an essentially pantomimic art. His first dialogue picture, The Great Dictator, was an act of defiance against Hitler. McCabe explains that many viewed Chaplin’s political opinions as having roots in Communism. Pressure on Chaplin grew after 1942, when he was sympathetic to a request from the American Committee for Russian War Relief. He spoke openly at rallies about his support for a second front, pointing out the need for Russian relief. During the McCarthy era, Chaplin was accused of un-American activities. In 1952, he left the United States and did not return for twenty years. McCabe provides memorable insight into Chaplin’s life during these difficult years.
He depicts Chaplin as having a rare temperament, being both deeply shy and heartily outgoing and ultimately, according to McCabe, frequently misunderstood. Chaplin’s life was filled with turmoil; his relationships with women and his politics perpetuated his insecurities. Yet Charlie Chaplin, the painfully shy, undersized boy, dreamed he would overcome his handicaps and hoped he would conquer the world. McCabe explains how Chaplin’s dreams came true. The book includes an extensive bibliography, a complete index, a Chaplin filmography, and a section of photographs depicting film stars and family members. In addition to a wonderful story, these are excellent features for the Chaplin researcher or ardent fan.


Pulitzer and Nobel Prize-winning writer Ernest Hemingway (1899–1961) was one of the most influential writers of the 20th century, whose simple, clear, and distinctive style revolutionized literature.

Gay Talese is the bestselling American author of eleven books. He was a reporter for the New York Times from 1956 to 1965, and since then he has written for the Times, Esquire, The New Yorker, Harper’s Magazine, and other national publications.

He shares his insight into Ernest Hemingway's life and works as well as the craft of writing.


Few minds have loomed larger than Karl Marx’s in the development of modern philosophy, economics, and politics. Despite his eminence, however, few writers have succeeded in making him interesting and accessible. One who did succeed was Isaiah Berlin, an Oxford lecturer and one of the most lucid thinkers of the twentieth century. Berlin’s exposition of Marx was his first book, originally published in 1939, and it offers an insightful and readable account of both the man and his mind. At a time when writers tended to treat Marx as either a demon or a saint, Berlin’s portrayal of him as a man with a nuanced mind took courage. He revised the work periodically over the decades, publishing the fourth and final edition just before his death in 1997.

In just 228 pages, Berlin presents a staggering amount of philosophical insight and rigorous analysis of Marx’s intellectual contributions. He begins by describing the social and intellectual milieu into which the young Marx was born. Marx came of age in Prussia in the decades immediately after Napoleon’s defeat and Europe’s return to reaction, when his home country had become a hotbed of intellectual introspection and social discontent. According to Berlin, the leading philosophical strands in Europe at the time were British empiricism and French materialism, the former denying the existence of an “intellectual intuition into the real nature of things” and the latter being a “semi-empirical rationalism [consisting] in boundless faith in the power of reason to explain and improve the world.”

As intellectual outsiders, the German philosophers constructed their own system, one that would reject the listlessness of the empiricists and the utopianism of the materialists. The German response would be called Hegelianism, named after its father, Georg Hegel, who was born in Stuttgart almost half a century before Marx. This belief system focused on historicity: while rejecting neither reason nor empiricism, Hegel posited that these alone were insufficient to explain the dynamism that propelled human societies forward. Where did Hegel find this driving force? In the “Idea,” a historical force that, acting through a continuous process of conflict and synthesis called dialectics, turned the wheels of history. Hegelianism would be the philosophical fodder for all of Marx’s later intellectual contributions.

The young Marx was first introduced to Hegel’s work as a student in Berlin, then the capital of Prussia. He would quickly become the most erudite member of a group the author calls the “Young Hegelians,” radical young thinkers and writers who applied a fierce belief in rationality to their progenitor’s emphasis on history. Marx’s experience as a Hegelian also undergirded the later emergence of his unshakable faith in social revolution. “Only the rational was real,” paraphrases Berlin. “Degrees of reality … may necessitate a radical transformation of given institutions in accordance with the dictates of reason.” The Young Hegelians believed not only that ever-advancing rationalism was the goal of historical evolution, but also that the philosopher’s responsibility was to use his critical eye and talent for communication to help bring it about. By discrediting archaic institutions and bringing enlightenment to the blind masses, the radicals of which Marx was a part planned to accelerate history’s advance into a more rational stage.


Marx would later land in Paris following his banishment from Prussia. According to Berlin, the three years spent there would crystallize his ideological beliefs. It was in Paris that he articulated his most lasting theoretical contribution, the theory of historical materialism. This belief system built upon Hegel’s foundation but infused it with a revolutionary edge that eclipsed anything previously produced by the Young Hegelians. Marx believed himself to have developed a wholly empirical, perfectly rational, scientific method for explaining historical development, replacing Hegel’s ill-defined concept of the “Idea” with an explanatory factor that didn’t depend on culture at all. His historical force was socio-economic: evolution necessarily followed from conflict, and “the conflict is always a clash between economically determined classes.”

Berlin provides a lucid interpretation of Marx’s beliefs. The philosopher’s class struggle was joined by a critique of capitalism as inherently exploitative and socially alienating. By separating the masses from the fruits of their labor, the system denied them the creative expression that is the goal of all individual human effort. Thus, the society of Marx’s time rested on fundamentally unstable foundations. The conflict between bourgeoisie and proletariat was too deep and intractable to remain below the surface. Revolution was inevitable. Capitalist society was doomed to fade into history and be replaced by a truly rational, harmonious, and, in Marx’s vision, classless society.

Capitalism had created the material conditions for widespread prosperity and social harmony: modern industrial capacity could feed and clothe everyone many times over; it was simply a matter of organizing production along rationalized, egalitarian lines. Berlin quotes a letter Marx wrote to Engels in 1862: “Only a conscious organization of social production, in which production and distribution are planned, can lift human society above the rest of the animal kingdom.” It was then up to Marx and other revolutionaries to prepare the masses for the new society that historical evolution was destined to bring about. These core beliefs were articulated during Marx’s time in Paris, and he would spend the remaining 30 years of his life (mostly in London) disseminating them. As the self-styled vanguard of the revolution, he devoted his life to cultivating class consciousness among the European proletariat.

Berlin’s book provides an analysis of Marx’s intellectual work that is detailed without being long-winded, and rigorous without being obscure. It also expertly weaves this analysis into a biography of Marx the man, chronicling his life from cradle to grave and peppering the narrative with relevant anecdotes from his personal life. One such anecdote comes in the chapter titled “Childhood and Adolescence,” where the author relates a story about Marx’s father Herschel being harassed by the police after publicly advocating moderate social and political reforms. He retracted his comments, something Berlin believes had a lasting impact on his son: “His father’s craven and submissive attitude, made a definite impression on his eldest son Karl … and left behind it a smouldering sense of resentment, which later events fanned into a flame.”

The most enduring part of Berlin’s book is his approach to Marx and Marxism. Berlin’s reputation as a liberal democrat (and one distrustful of utopian ideologies) would lead many to expect a denunciation of Marx’s intellectual work. That never comes. Berlin concedes the intellectual brilliance and larger-than-life stature of his subject. It’s also clear that he finds in Marxism a useful framework for analyzing society, if not one for guiding socio-political change. Whether one finds Marx inspiring or monstrous, this is a valuable contribution to the literature that should be eagerly read by students of philosophy, history, and politics for years to come.


Over the past sixteen years or so, Ross King has published three engaging books describing the creation of three of the best-known works of the Italian Renaissance: Florence’s Duomo (Brunelleschi’s Dome, 2000); the fresco ceiling of the Sistine Chapel (Michelangelo and the Pope’s Ceiling, 2002); and The Last Supper, the subject of this review. While Leonardo and The Last Supper offers less insight into the working method and problems confronted by the artist than the previous two books, King still reveals the fascinating background to the creation of the masterpiece, insights into the painting itself, and, as a backdrop, the political machinations of Leonardo’s patron, Lodovico Sforza, the Duke of Milan. He also spices the text with a few swipes at the twentieth-century art historian Bernard Berenson. The text is accompanied by 48 black-and-white and nine color illustrations, along with a map of Italy during the period, a Sforza genealogy, and an illustrated who’s who around the table of The Last Supper.

Leonardo first came to the duke’s attention in 1482, when he traveled to Milan from Florence and soon ingratiated himself at the duke’s court. Lodovico commissioned Leonardo to create what is arguably the latter’s best-known unfinished work: the bronze horse (originally it was to have included a rider) commemorating Lodovico’s father, Francesco Sforza. The statue was to have surpassed any previous equestrian statue cast in Italy—twenty-three feet high with rider, and rearing on its hind legs no less. For this, seventy-five tons of bronze had been set aside. Yet, as King relates, “By the end of 1493, Leonardo had spent as many as eight or ten years on the giant equestrian monument.” At the beginning of 1494, “[he] was putting the finishing touches to his clay model and deliberating the practicalities of casting in bronze” (p. 11). Then politics intervened.

In a ploy that presaged Niccolò Machiavelli, Duke Lodovico, in an attempt to keep some of his rivals at each other’s throats, invited the King of France, Charles VIII, into Italy to claim the Kingdom of Naples, then under the reign of the newly crowned Alfonso. Alfonso’s daughter was married to Giangaleazzo Sforza, the rightful Duke of Milan whose title Lodovico had usurped. The ease with which Charles swept down the Italian peninsula alarmed Lodovico and encouraged Charles’s cousin Louis, the Duke of Orléans, who also had a tenuous right to the duchy of Milan. After making various alliances (notably with Venice), and switching sides more than once, the perfidious Lodovico belatedly realized the French were a far greater threat to him. Eventually, the bronze set aside for the memorial statue was requisitioned to be cast into cannons, thus ending that artistic venture.

King provides a mini-profile of Leonardo’s life leading to these events, including his apprenticeship with the Florentine painter and goldsmith Andrea del Verrocchio from the mid-1460s to the early 1470s. At this point, King allows himself a bit of speculation: “Verrocchio must have been the one who first awakened Leonardo’s interest in things such as geometry, knots, and musical proportions—and their application to artistic design” (p. 29). He supports this assertion by linking a motif in Verrocchio’s tomb slab for Cosimo de’ Medici with Leonardo’s Vitruvian Man.

King judges that Leonardo received the commission to paint The Last Supper onto the wall of the refectory of Santa Maria delle Grazie either at the end of 1494 or the beginning of 1495. At the same time, the Milanese painter Giovanni Donato da Montorfano was commissioned to paint the crucifixion scene on the opposite wall. King points out that the pairing of the two scenes was not unusual. Perhaps Montorfano was chosen because he was an experienced frescoist, whereas Leonardo had no experience in fresco painting.

Here King describes the usual technique for painting a fresco—the drawing of the cartoon (a stencil of sorts) and the painting directly onto wet plaster—a process he details in Michelangelo and the Pope’s Ceiling. Leonardo eschewed this technique and painted directly onto a dry wall—thus The Last Supper is not technically a fresco. The wall onto which Leonardo painted was a surface of his own creation. “Once his first coat of plaster dried,” King writes, “he covered it with a thinner, slightly granular layer of calcium carbonate mixed with magnesium and a binding agent probably made from animal glue. Once this preparation layer had dried, he added an undercoat of lead white: a primer, in effect, to seal the plaster and enhance the mural’s luminosity” (p. 107). Leonardo did this because, instead of working in the traditional egg tempera, he used oils, which were then gaining acceptance. Perhaps this extra preparatory work caused the delay that troubled Duke Lodovico and the friars at Santa Maria delle Grazie; Leonardo finished his masterpiece in 1498, whereas Montorfano had completed The Crucifixion three years earlier.

The book excels in showing Leonardo’s mature powers as an artist. King describes a technique in which seeming faults in the painting come together to create, arguably, the greatest piece of iconography in Western civilization. King also places Leonardo’s Last Supper in the context of other Last Supper paintings of the period. Like many of them, Leonardo’s mural takes as its starting point the moment after Christ has revealed that he will be betrayed by one of the disciples later that evening. Thus the “action” of the picture is born of surprise, denial, and anger. Perhaps even embarrassment. King analyzes this action, including Christ’s near physical connection to Judas. Leonardo differed from many of his predecessors in that his St. John is not resting his head upon Christ’s breast. Instead, he is inclined away from Christ, the better to hear what St. Peter is saying. King also points out that the figure of Christ is out of proportion vis-à-vis the disciples. None of this information is new, of course, but King’s style brings a fresh approach to it all. King also posits that the dining table and linen mirror those the friars used in the refectory. To that end, he discusses the effect of the mural on the dining friars and at what point in the room the mural appears to be an extension of the refectory itself. He also discusses the food on the table, the tapestries on the walls, and, what many people miss, the Sforza coat of arms on the mural’s back wall.

An epilogue describes the downfall of Lodovico Sforza (captured and imprisoned by the French) and his death in 1508, Leonardo’s own last years, and the fate of The Last Supper. According to at least one Renaissance commentator, the mural began to deteriorate about twenty years after Leonardo completed it, before the end of his life. But neither hubris, time, climate, inept restorations, nor the cutting of a door through the wall succeeded in destroying the mural. Neither could the RAF, which bombed Santa Maria delle Grazie during the Second World War. King not only discusses the causes of The Last Supper’s deterioration (primarily working in oils and painting onto a dry wall rather than wet plaster) but also catalogs the errors made in restoration over the centuries, including the work done in the mid-twentieth century. “Some critics have argued that The Last Supper is now 80 percent by the restorers and 20 percent by Leonardo. The mural’s restoration has become a puzzle of spatiotemporal continuity…,” King sums up toward the end (p. 274). Yet for King, and for the rest of us except the most persnickety, this is enough. “The Last Supper is arguably the most famous painting in the world, its only serious rival Leonardo’s other masterpiece, the Mona Lisa,” King pronounces (p. 275). And it is hard, very hard, to disagree.


1. Salvador Domingo Felipe Jacinto Dalí y Domenech was born on May 11, 1904, in Figueres, Spain. He was named Salvador after his brother, who had died the year before. Said Dalí of his brother, “We resembled each other like two drops of water, but we had different reflections. He was probably a first version of myself, but conceived too much in the absolute.”

2. By the time Dalí entered the School of Fine Arts in Madrid in 1922, he was already an eccentric. To draw extra attention to himself, he would often walk the streets ringing a bell. At school, he became close friends with the famed Spanish poet and playwright Federico García Lorca. Expelled from art school in 1926 for insisting that none of his professors was competent enough to examine him, he went to bohemian Paris, where he met fellow artists Pablo Picasso and Joan Miró and joined the Surrealist group.

3. In 1934, Dalí was again expelled—this time from the Surrealist movement, for supporting fascism. The ousting, however, didn’t stop Dalí from participating in Surrealist exhibitions.

4. When Dalí was asked to deliver a lecture at the 1936 London International Surrealist Exhibition, no one could have anticipated that he would address his audience wearing a deep-sea diving suit. His justification: “I just wanted to show that I was plunging deeply into the human mind.”

5. Despite his obvious flair for the unusual, Dalí was nothing if not a commercial artist who longed for fame. During his lifetime, he even created a museum to display his own work, the Dalí Theatre-Museum in his native Figueres. This is where Dalí’s famous “lips” sofa originated.

6. For his 1955 painting “Sacrament of the Last Supper,” Dalí used his wife Gala’s face as the model for Christ.

7. In 1969, Dalí designed the bright yellow and orange logo for Chupa Chups lollipops, one of the most iconic logos of all time.

8. Even as the years went on, he never ceased to amaze. In a “60 Minutes” interview with Mike Wallace, Dalí referred to himself in the third person. Topping that, he carried a leather rhinoceros with him when he appeared on the “Tonight Show,” and insisted on sitting on it throughout his interview.

9. It’s said that when Gala, his wife of nearly 50 years, died in 1982, Dalí lost his will to live. He completed his last painting, “The Swallow’s Tail,” the following year, in 1983. By then he suffered from a palsy that made his hands shake terribly, making painting difficult.

10. A fire forced him from his home in 1984, and he lived his final years in the Theatre-Museum until he died of heart failure five years later, at the age of 84.



Charles-Édouard Jeanneret-Gris—better known as Le Corbusier, the name he adopted in 1920—was among the most significant architects and urban planners of the 20th century, and his career was marked by astonishing productivity and self-promoted celebrity. In his lifetime, he designed 75 buildings in a dozen countries and committed himself to nearly 50 urban planning projects. In addition to his architectural prowess, Le Corbusier wrote 34 books and hundreds of articles while painting more than 400 canvases and producing dozens of sculptures. In this brief and visually appealing book, Jean-Louis Cohen, Sheldon H. Solow Professor in the History of Architecture at New York University’s Institute of Fine Arts, surveys 25 of Le Corbusier’s most significant architectural achievements, ranging from his massive residential and administrative complexes to the posh homes he designed for wealthy clients across the globe. Above all, as the subtitle suggests, Le Corbusier wrestled with the effects of modernity on the urban environment.

Having grown up in La Chaux-de-Fonds, a small Swiss watch-making town, Le Corbusier appreciated the “interaction between industry and the visual arts” and the “educative virtues of geometrized form” (7). As a young man, he roamed extensively throughout Europe, observing and carefully studying the variety of built environments he encountered on his travels. He studied under Auguste Perret, the French master of reinforced concrete whose influence was visible in Le Corbusier’s famous “Domino” houses: open-floor plans composed of concrete slabs held aloft by thin columns of reinforced concrete. This model served as the skeleton for much of his work from World War I through the late 1920s. During these years, Le Corbusier also developed his radical philosophy of urban planning, which he believed required a thorough break with the past.
He was in many respects a passionate reformer who believed that industrial cities had developed in ways that produced rampant crowding, moral degeneration, and filth. Departing from what he viewed as an insufficient, piecemeal, “medical” approach to the Parisian housing crisis, Le Corbusier proposed a comprehensive, “surgical” method that would almost literally cut away the diseased tissue and replace it with towering prostheses (10). As Cohen amply describes, Le Corbusier’s career was marked by “a series of divergent if not contradictory stances” that he took in order to resolve these deficiencies and re-harmonize people to the built landscape (15). Most significantly, Le Corbusier’s forays into urban planning led him to conceive large-scale, low-cost housing for ordinary working people—projects that, when actually implemented and imitated by others, generated immense and soulless concrete monstrosities that drew the ire of critics and isolated the poor who inhabited them. For readers who might wish for a more contextualized history of Le Corbusier’s work, Cohen’s approach will not be terribly satisfying. He describes the architect as a controversial figure—a “Nietzschean rebel” to some and a “nihilistic destroyer” to others—but there is little substantive discussion of his critics, nor is there any effort to place Le Corbusier within a larger political tradition that includes 19th-century French utopians like Charles Fourier or Saint-Simon (14). The book is arranged chronologically, although it devotes little attention to the architect’s actual biography—a choice that sometimes leaves the formal analysis and discussion seemingly incomplete. Instead, readers are treated to elegantly composed and richly illustrated overviews of some of Le Corbusier’s most representative work. As is the case in other books from the Taschen series, the writing is frequently opaque to non-experts.
For example, writing about the Villa Stein-de Monzie, built from 1926 to 1929 in Vaucresson, France, Cohen mingles formal details with casual references to contemporaneous architects: “The façade of this house, set within a parallelepiped, is flat and governed by a regulating plan based on a golden section, which determines the proportions and positioning of the windows. Through its transparency, the garden façade, on the other hand, reveals the complex interplay of the indoor volumes and the walkway linking the terraces to one another and to the garden . . . [Colin Rowe and Robert Slutzky] thought [they] recognized in the villa’s cylindrical stairwells and curved partitions . . . objets-types featuring in Le Corbusier’s Purist paintings . . .” (39). Even so, the writing in this slim volume should not tax the patience of anyone looking to discover the variety of Le Corbusier’s work.


1. Young Franz Kafka didn’t have many friends and assuaged his loneliness by reading the works of J. W. von Goethe, Blaise Pascal, Gustave Flaubert, and Søren Kierkegaard.
2. Before he became known as one of the major figures of 20th-century literature, Kafka lived in obscurity, working as an insurance clerk in his native Prague.
3. The insurance job didn’t generate enough money to allow Kafka to write full-time, so he and his friend Max Brod decided to supplement their income by writing a guidebook for tourists in Europe.
4. The adjective derived from his last name, Kafkaesque, implies a nightmarish world. And though he created frightening images in his books, he was reportedly terrified by mice.
5. The apartment of Kafka’s most famous fictional character, Gregor Samsa in The Metamorphosis, had the same layout as the author’s real flat in Prague.
6. Kafka’s works were banned in his native Prague under communism for their “degenerative individualism.” The ban was lifted in 1989.
7. The author had an asteroid named after him in 1983. It passes the Earth every 523 days.
8. On July 3, 2013, Google celebrated what would have been the writer’s 130th birthday with a Metamorphosis-inspired doodle.
9. According to Max Brod, Kafka never lied; the author believed in “absolute truthfulness.”
10. Kafka asked Brod to burn all his unpublished manuscripts after his death. Brod didn’t comply, publishing The Trial and other Kafka classics posthumously.


Without question, Albert Camus ranks as one of the world’s great literary figures. He was also a philosopher and existentialist, even though he decried such labels. In Camus, by David Sherman, associate professor of philosophy at the University of Montana at Missoula, we get a well-rounded portrait of the 20th-century French writer-cum-philosopher and an insight into a man rarely described in such vivid and complete terms. Camus, according to Sherman, was a handsome and charming man, with a penchant for women that derailed his marriage. He loved swimming and soccer and was fond of his birthplace, Algeria, but he also had a darker side: the author depicts Camus as taciturn, self-righteous, and aloof—both personally and politically. Camus, who suffered from tuberculosis most of his life, was awarded the Nobel Prize for Literature in 1957 and died in an automobile accident in 1960. Sherman covers the philosopher-writer’s most important works, notably The Myth of Sisyphus, The Stranger, The Fall, The Plague, and The Rebel. Instead of a chronological structure, however, Sherman groups his analysis of Camus’ works along conceptual lines. The book comprises an introduction, nine chapters, and an index. After a brief look at “Camus’ life” (chapter one), Sherman delves into “The absurd” in chapter two, where he concentrates on The Myth of Sisyphus. “Life” (chapter three) encompasses Sherman’s analysis of The Stranger and Caligula, which the author characterizes as Camus’ “‘cycle’ on the Absurd” (p. 8). Chapter four, “Scorn,” looks at The Fall. “Solidarity” (chapter five) looks at The Plague, where Camus’ thoughts “move from solitary revolt to solidarity” (p. 8).
Sherman considers the issues of Camus’ ethical and political philosophies in “Rebellion” (chapter six), while “Realpolitik” (chapter seven) covers Camus’ public break with Sartre and his positions on both the Algerian War and the Cold War. “Exile and rebirth” (chapter eight) examines Camus’ collection of short stories, Exile and the Kingdom, which “explores the quandaries of the modern consciousness” (p. 9), as well as the author’s final, autobiographical book, The First Man, published posthumously in 1995. The ninth and final chapter serves as an epilogue. Sherman’s work shows the unity of all of Camus’ writing, and the duality of man’s life in which happiness and sadness, dark and light, life and death are constantly at work. It also touches on Camus’ belief that despite such duality, man should appreciate life’s happy moments all the more, recognizing that they are fleeting. Sherman’s book further illustrates the philosopher’s profound influence—how his work set the stage for many postmodern political, literary, and philosophical positions. In summary, the author examines Camus’ life, conflicted by his Algerian past, questions about justice, the impact of oppression, and the cyclical nature of wars and terrorism—much of which still resonates in today’s turbulent world. Camus, notes Sherman, refused to accept the loss of any innocent life in the pursuit of political goals, however noble. Although Camus’ positions were rejected by most of his contemporaries on the Left, including his friend and fellow existentialist philosopher Sartre, his critical perspective is more relevant today than ever. We can still ask ourselves, as Camus undoubtedly did: at what cost in human life is it acceptable to pursue noble goals of political independence?


1. William James (1842-1910), the philosopher and father of American psychology, was born into a highly regarded family. His father was Henry James Sr., a noted theologian in his day. Ralph Waldo Emerson, the leader of the 19th century’s Transcendentalist movement, was James’ godfather.
2. His brother, Henry James, is regarded as one of the US’s greatest novelists. He penned key works such as The Portrait of a Lady (1881) and The Turn of the Screw (1898).
3. James loved to paint in his youth. After he received an apprenticeship under William Morris Hunt in Rhode Island, his family moved back to the United States from Paris so that James could hone his blooming artistic ability. His brother Henry once remarked that it was likely the only time someone ever left Paris for the US to study painting.
4. Though James loved art, his father wanted him to become a physician. Following his father’s wishes, James studied physiology in college and graduated from Harvard Medical School in 1869; however, he never practiced medicine.
5. James suffered from many physical ailments during his life. Doctors diagnosed him with neurasthenia, a nervous-system condition commonly identified in the 19th and early 20th centuries. Alongside this, James often suffered from depression and suicidal thoughts.
6. His transition from medicine to philosophy and psychology resulted from “a sort of fatality,” as he called it. He once remarked, “the first lecture on psychology I ever heard [was] the first I ever gave.”
7. James was one of the founders of Pragmatism. This broad philosophical movement considers the world to be inseparable from agency within it and holds that a given belief about the world must be tested according to its practical application. One consequence of this philosophy is that the truth of an idea can never be finally validated.
8. As a professor at Harvard University, James became the first instructor ever to offer a psychology course on American soil.
9. James instructed many notable intellectuals throughout his teaching career, including W. E. B. Du Bois and Theodore Roosevelt.
10. On August 26, 1910, James passed away at the age of 68 in Chocorua, New Hampshire.


1. Contrary to popular belief, the “father of modern science,” Galileo Galilei (1564-1642), did not invent the telescope. The telescope’s creator was actually Hans Lippershey, a Dutch eyeglass maker. Lippershey invented the telescope in 1608, and Galileo built his own version the following year.
2. Galileo practiced astrology and interpreted horoscopes. In his time, astrology had not yet completely separated from astronomy; as a matter of fact, it was taught as a subject in many Italian universities. Galileo spent much of his time teaching students and family members the practice, even earning payment for his instruction and predictions.
3. He was sentenced to life in prison. In 1632, Galileo published Dialogue Concerning the Two Chief World Systems, a book that compared the Ptolemaic system and the Copernican system. Though he did not openly declare that Copernicus’ view of the solar system was correct, the Roman Catholic Church deemed him a heretic for teaching that the sun was at the center of our solar system. He was ordered to recant his position and was promptly imprisoned.
4. Galileo never married his longtime partner, Marina Gamba. The pair had three children together, yet they never even shared a home. Most scholars of his era remained single, and Galileo followed suit.
5. He was a skilled artist. Galileo was known for painting the gripping discoveries he made in the night sky, using a style that can be seen as a forerunner to the impressionism of the 19th century.
6. Though he was sentenced to life in prison, Galileo spent his final years at home under house arrest. Since he was allowed to have visitors, he hosted such figures as the philosopher Thomas Hobbes and the poet John Milton.
7. Galileo and the great playwright William Shakespeare were both born in 1564. Shakespeare’s play Cymbeline makes reference to Galileo’s discovery of Jupiter’s four moons.
8. In 1638, Galileo lost his eyesight. He managed to publish one final book before his sight was completely gone, Discourses and Mathematical Demonstrations Concerning Two New Sciences.
9. Since the 19th century, his middle finger has been displayed at various Italian museums. After a dishonorable burial in a side chapel in 1642, Galileo’s remains were transferred to the honorable Santa Croce basilica in Florence, Italy, nearly 100 years after his death. During the transfer, some bones were removed from his corpse, which is how his finger came to be put on display for the public to admire.
10. After eight years of house arrest, Galileo died in his home on January 8, 1642. He was 77 years old.


Kristine Larsen begins her biography of Stephen Hawking with a central question: how has a theoretical physicist known for “esoteric mathematics” and “the secret language of general relativity” become a cultural icon and the most recognizable scientist in the world? While the author is still grappling with that question by the end of this brief volume, Stephen Hawking: A Biography provides a capably concise view of the enigmatic genius, packing a remarkable amount of material into a few chapters. However, the book’s very brevity denies the reader a true appreciation of Hawking’s accomplishments, genius, and inner life. Hawking’s popular science book A Brief History of Time was criticized for its pedantry and impenetrability; quite the opposite charge could be leveled at Larsen’s biography of the man. This is light reading that doesn’t do quite enough justice to its heady subject. Larsen proceeds in conventional chronological fashion, detailing Hawking’s birth to Oxford-educated parents and his “eccentric” upbringing: his family raised bees in the basement and drove a secondhand London taxicab, among other peculiarities. Larsen next details Hawking’s time at Oxford University—his poor study habits but brilliant output, his career indecisiveness, his preference for rowing in the Boat Club over homework assignments—through to his graduate studies at Cambridge. These and other events are handled with matter-of-fact conciseness. Next comes Hawking’s unfortunate slide into physical disability, starting with warning signs like stomach aches and clumsiness and culminating in a fall down a flight of stairs. Increasingly frequent accidents prompted Hawking to seek medical attention, resulting in the devastating diagnosis of amyotrophic lateral sclerosis, or ALS, a disease the scientist has battled for decades despite its debilitating (and degenerative) physical effects.
However, readers looking for an emotional or inspirational survivor’s story will be disappointed; Larsen’s narrative steams ahead at a fixed, breezy pace, leaving little time to lament Hawking’s condition or laud his courage in dealing with it. This is not to say, however, that the biography does not cover the range necessary for a complex picture. Larsen lays out the cosmology controversy of the early 1960s over big bang versus steady-state theories of the universe, a debate Hawking helped to quell. Hawking’s home life and the extraordinary sacrifices of his wife, Jane, are also covered, as is Hawking’s place both among his peers and in popular culture, especially after the runaway success of A Brief History of Time. There are also the stories behind Hawking’s most astonishing theories, such as black hole radiation, the “no-hair” theorem describing black holes, and the no-boundary proposal describing the universe. A professor of physics and astronomy at Central Connecticut State University, Larsen is especially qualified to make the opacity of theoretical physics transparent to the layperson, and she excels at the task. What is missing from the book is any sense of Hawking’s genius or greatness. Larsen’s untangling of difficult scientific concepts serves entirely descriptive purposes. She conveys no awe or wonder at Hawking’s accomplishments, and little regard for the audacity of his intellect. Instead, she overwhelms the reader with a laundry list of this honorary doctorate or that television appearance and painstakingly charts the scientist’s career trajectory, papers, and conferences, but she misses the essence of Hawking’s charisma and the astounding implications of his work. Kristine Larsen’s Stephen Hawking: A Biography is ultimately a kind of CliffsNotes approach to the famous physicist: useful, but perfunctory.


This new edition of Papa Hemingway from Da Capo Press brings A. E. Hotchner’s famous memoir back into print for the first time in nearly a decade. First published in 1966, the book offers an endlessly lively and intimate portrait of a friendship that spanned thirteen years and no fewer than five countries. Hotchner’s meticulous narrative—much of it derived from Ernest Hemingway’s own letters—depicts one of America’s greatest writers at his warmest and most joyous, but it also charts the long, agonizing physical and emotional decline that ultimately led to Hemingway’s suicide in 1961. Hotchner’s relationship with Hemingway began in 1948, when the young journalist was working as a correspondent for Cosmopolitan magazine. Assigned the job of persuading Hemingway to write a piece on “The Future of Literature,” Hotchner is dispatched to Cuba, where the two men instantly bond over pitchers of daiquiris, afternoons of deep-sea fishing, and conversations about Sartre and Marlene Dietrich. Over the next few years their ties grow tighter, as Hotchner visits the novelist in Paris, Venice, Madrid, Havana, and elsewhere, and their friendship deepens. Hotchner comes to rely on Hemingway for the confidence and professional advice he needs to become a successful author, playwright, and screenwriter—indeed, he adapts much of Hemingway’s work for stage and screen. Hemingway, meanwhile, takes solace in Hotchner’s company, sharing with him an immense archive of personal stories that reveal the complexity of his character. When Hotchner focuses on Hemingway the writer, we are offered insight into the circumstances in which some of Hemingway’s most important novels (and a few of his short stories) were conceived. Papa Hemingway is not, however, an intellectual biography; instead, the book examines the business side of writing.
We learn about Hemingway’s relationships with editors, his friendships and rivalries with other writers, and his business dealings with film producers like Darryl Zanuck, who learns the hard way that Hemingway does not appreciate creative interference. Most strikingly, Papa Hemingway shows Hemingway struggling to hold onto his sense of youth and virility. He brags incessantly about his sexual exploits, and the narrative is loaded with precisely the sorts of tales one would expect from an account of Hemingway’s life, including long and entertaining reminiscences about hunting, fishing, bullfighting, and war. At the same time, though, readers watch as Hemingway’s health and confidence decline. He gains a tremendous amount of weight from excessive drinking and poor diet, and he suffers from chronic physical pain after 1954, when he survived two separate plane crashes while traveling in Africa. As the memoir moves toward its inevitable conclusion, a heavy depression—capped by the loss of his home in Havana after the Cuban revolution—drains Hemingway’s massive personality and furious creative energy from his ailing body and mind. He stops writing, stops hunting—offering implausible excuses to stay inside—and eventually consents to electroshock therapy in the last months of his life. Papa Hemingway is not a conventional biography by any stretch of the imagination. It may best be described as a literary memoir, since Hotchner made no pretense of neutrality in depicting his subject, nor did he invest much effort in sorting out the “truth” of his friend’s life from the exaggerated and even implausible stories that flowed from Hemingway’s pen and tongue to the pages of this book. This fact has always made Papa Hemingway something of a controversial text. Some readers have appreciated the tales themselves—authentic or not—as genuine expressions of Hemingway’s remarkable personality.
Others have objected that Hotchner allowed Hemingway’s self-created mythology to overwhelm the truth of his life. Few readers, however, will find this anything less than a fascinating, tragic, and loving account of one of the last century’s literary lions.


This is the latest offering from Richard Schickel, a renowned film critic who probably knows more—and writes with greater incision—about the history of film than any contemporary reviewer. Best known as a movie reviewer who spent years writing for Time magazine, Schickel is also an accomplished historian, having written well-regarded biographies of Elia Kazan and D. W. Griffith among others. A collection of short essays, Film on Paper collates the work published under the column of the same name, which Schickel began writing for the Los Angeles Times Book Review in 2001. The book is divided into six clusters of essays about the genres, stars, directors, writers, businessmen, and critics who make up the complex (and, he reminds us, peerlessly expensive) world of film.

Schickel’s work for the L. A. Times is a genre all to itself; rather than a conventional column about film, “Film on Paper” reviews books about movies. It is, in other words, an ongoing reflection about how we think about the art and business of film, and it ruthlessly catalogues what he describes as the “slovenly” commentary that publishers are apparently willing to release. As he explains in the book’s introduction, the books he reviews—with few exceptions—“tend to be badly written, largely by hack journalists or dull-witted academics” (4).

As Schickel sees it, bad books about film tend to misunderstand one or more fundamental aspects of the industry: whether they oversimplify its business history (e.g., Thomas Schatz, The Genius of the System), exaggerate its politics (Paul Buhle and Dave Wagner, Radical Hollywood), or turn out “plodding” biographies of its stars (Lee Server, Ava Gardner), he is unsparing in his dismissals and persuasive in suggesting where exactly the authors may have lost their bearings. His skewering of Edward Jay Epstein’s The Big Picture is a masterpiece, and he has great fun at the expense of Joe Eszterhas’ memoir, Hollywood Animal, but Schickel is no mere bomb-thrower. He reviews plenty of books that earn his praise, and curious readers will find many reading suggestions as well as advice on books and authors that might best be avoided.

In most of these essays, however, Schickel does not dwell long on the books themselves. Frequently, the reviews offer him a chance to reflect on some aspect of the industry—or some particular character within it—about which he has something interesting to say. For instance, in two of the pieces—one on the African-American performer Stepin Fetchit, the other on Jews and cultural assimilation—we read about the ways that film reflects the complexities of race, ethnicity, and identity in 20th-century America. Similarly, other chapters examine (in succession) the phenomenon of “exploitation” films and the hideous production codes that bound the film industry for three decades (and beyond) to a prudish morality that thwarted art in the name of “decency.” Regardless of Schickel’s conclusions about the books themselves, essays like these stand on their own. They are miniature lessons in the history of film, and they remind us that movies are more than what they seem; they also give the book a greater sense of coherence than one might expect from a collection of reviews.

In addition to essays that take a broad look at the world of film, Film on Paper includes closer peeks at familiar actors like Katharine Hepburn, Warren Beatty, and Sidney Poitier; directors like Stanley Kubrick and Orson Welles; and executives like Walt Disney and Jack Valenti. Readers who are not film buffs, however, might find themselves reading about some unfamiliar figures—but they will continue reading, which is, of course, the point. Schickel is an endlessly entertaining writer. Film on Paper is an outstanding read and well worth the time for anyone who enjoys film, good writing, or both.


When the acclaimed American author Tom Wolfe died in 2018, he left a rich body of work whose stylistic blend of literary and journalistic techniques pioneered a movement in the 1960s and 70s known as the New Journalism. Along with writers like Truman Capote, Norman Mailer, and Gay Talese, Wolfe pushed the boundaries of traditional journalism and nonfiction writing by introducing novelistic elements (controversial at the time) into reportorial accounts. Armed with this novel approach, Wolfe produced such contemporary classics as The Kandy-Kolored Tangerine-Flake Streamline Baby, The Electric Kool-Aid Acid Test, From Bauhaus to Our House, and The Right Stuff, which cemented his place in American literature. These and other books sought to capture the central cultural narrative of postwar America; Wolfe addressed literary matters themselves only occasionally, often as grist for his stylistic mill. In his last effort, The Kingdom of Speech, Wolfe takes on two scientific pillars in the realms of biology and linguistics: Charles Darwin and Noam Chomsky. Darwin, of course, is considered the father of evolution—the view that all life forms are descended from a single common ancestor through unguided natural processes, and that life and its taxa are products of the same natural laws that governed the formation of the universe. Chomsky is regarded as the father of modern linguistics for his formalization of the principles of syntactic structure in the brain. His model of language in the brain, called universal grammar, aims to explain how language is unique to humans. Wolfe kicks off his narrative with the life and work of the British naturalist, explorer, and travel writer Alfred Russel Wallace, who contributed to the theory of natural selection alongside Darwin, although never at the same time or location.
In this book, Wolfe largely uses the writing and discoveries of Wallace to support the argument he is trying to make—namely, that Darwin and Wallace had similar ideas about evolution and natural selection before Darwin published On the Origin of Species, and that Wallace's discoveries equaled those of Darwin. This matters because Wallace was awarded joint credit for his contributions only at Darwin's insistence, after the scientific community had snubbed him for many years. Perhaps the popular belief that Darwin was the real scientific genius behind the theory of evolution and the idea of natural selection, with Wallace the one left out, is in fact false. Contemporary science, rightly, has given Wallace the credit he deserves, and Wolfe wants to see him further recognized. Yet the central narrative of The Kingdom of Speech is Wolfe's proposed cognitive revolution, a return to thinking about the origin of human nature and its place in the natural world, specifically the relationship between human language and the forms of communication found elsewhere in nature. In the book, Wolfe takes on Darwin's assertion that man is an ape and life is a meaningless process without intent. Wolfe asserts that, far from being just another ape, humans stand apart from other animals with an incomparable ability to reason and communicate. The book is a broadside against Darwin's theory of natural selection and Chomsky's view that human language is simply one more adaptation of our brain. These pillars have amounted to a blind ideology for many researchers, Wolfe claims, a blindness that he seeks to overturn. However, the true guiding principle of the book is Wolfe's argument against Chomsky's theory about the evolutionary basis of human language.
Wolfe contends that Chomsky's reliance on the notion of a "perfect grammar" that humans have in their heads, separate from any empirical evidence, closely resembles the notion of "a perfect machine, already invented." In later chapters, Wolfe accuses Chomsky of being Darwin's intellectual heir, arguing that Chomsky's theory that much of language is innate parallels Darwin's vision of the perfect machine. While Wolfe is right about Chomsky's reliance on an ideal grammar, he is simply wrong about the specifics of Chomsky's views. There is no Chomskyan parallel to Darwin's supposed "ideal machine." Chomsky holds that humans have certain innate grammatical rules, which he uses to explain how humans acquire language. His view is much more sophisticated than Wolfe's caricature, which misrepresents Chomsky by ignoring many of his nuanced theories on the subject, in particular his view that individual languages are historical, social, and cultural phenomena, while language itself is a biological faculty of the mind. Furthermore, Chomsky in his writings has proposed a specific formal model to describe the mind's functioning, not just language's. This model of the human language faculty, which Chomsky calls Merge, is an operation that combines words and phrases into larger meaningful structures. It concerns not only language but the human mind's ability to reason and understand abstract concepts. Chomsky's famous views are not the fiction Wolfe makes them out to be, but empirical hypotheses about mind and brain. Chomsky has been working toward a neurobiological account of how the mind accomplishes these functions, which he has described in greater detail in recent writings. These are the essential features of Chomsky's theory, and its outlines are not fiction; they are supported by empirical evidence. Chomsky is not the father of the idea that language is an innate skill.
A number of linguists before him viewed language as a cognitive ability of human beings. More specifically, Chomsky is the father of the idea that language is an innate capacity that is not reliant on any specific culture or environment. At the same time, Chomsky has emphasized over the years that, in his view, language is a biological faculty that is developed and socialized by humans in whatever culture they are born and raised in, and is therefore just as much cultural as biological. Wolfe's claim that Chomsky posits that every human fetus is born with a universal grammar encoded in its brain, which cues the right age of language acquisition, is patently false. Chomsky holds that children possess an innate, biological capacity (universal grammar) which allows them to learn any human language. He does not believe that infants come out of the womb with "a universal language in perfect working shape." Chomsky has often used the example that infants are never born with perfect pitch, yet they are able to acquire music. Chomsky has championed the idea that language is primarily a human concern rather than a narrowly scientific one, though it is both. He has written extensively about the shortcomings of the scientific methods used in linguistics, critiquing the non-empirical techniques long used in efforts to explain language's evolution and development. Chomsky admits that because language is cultural and humans are not perfectly rational creatures, they cannot know what is truly in their heads, and therefore cannot know what language really is. In fact, as Chomsky puts it, language can be considered a form of thought. But it is not only that; it is also a form of action. The old non-scientific model of linguistics, the structuralist approach which proposed that language represented three things—sound, meaning, and logic—is exactly the kind of analytical approach that Chomsky has argued against.
For instance, in his article "A Review of B. F. Skinner's Verbal Behavior," he accuses Skinner of being unaware of the true complexity of language. Chomsky's analysis of language led him to reject the sterile versions of linguistics that he sees linguists using in their studies. Chomsky says that language doesn't just reside passively in the brain but serves many purposes in the real world; language is action-oriented. It can be used to tell a story, to ask for help, to write a poem, to say goodbye to an old friend, to call attention to an object, or to say good morning. If language didn't do anything, then none of its meaning would be known to humans. Chomsky focused on syntax, the common rules in our heads that are used to create anything we say. He also argues that the same rules are not used in music, in painting, or in mathematics; human language is special because its rules are universal. The Kingdom of Speech is the most poetic work of Wolfe's long career. It is also his most visionary, most speculative, most excitable, and most intellectually dense. Wolfe has managed to create two literary worlds, old and new, that exist simultaneously rather than in the usual alternating fashion. He attempts to create a literary structure of simultaneous alternative futures that exist in the present. It is the prose of an old man trying to win an argument with a younger opponent, an argument he himself is not always convinced by. One only hopes that this work will provoke a counter-argument from another, younger adversary, for who else will engage so passionately, and with such animus, this great boreal owl, this founding father of new American creative literature?
