
1. John Maynard Keynes was born into a successful family. His father was John Neville Keynes, a formidable economist in his own right, and his mother was Florence Ada Keynes, who was among the first female graduates of Newnham College, Cambridge.
2. In 1905, he earned an undergraduate degree in mathematics from King's College. While he took naturally to math and the classics, his only formal training in economics as an undergraduate was an eight-week course in the subject. Stranger still, he scored higher in English history than in economics on his civil service examinations.

3. He was a member of "the Society," a private club of intellectuals and writers whose membership overlapped with the Bloomsbury Group. It was here that he was introduced to one of his best friends, the modernist writer Virginia Woolf.

4. Keynes was bisexual. Though he happily married his wife, Lydia Lopokova, at the age of 42, he spent many of his early years having sexual encounters with other men.

5. Many of Keynes' fellow members of "the Society" discouraged him from marrying Lydia. The writer Lytton Strachey called her a "half-witted canary" and urged Keynes to keep her as his mistress instead. Keynes evidently rejected his friend's advice.

6. He was an ardent supporter of the arts. He served as the founding chairman of the Arts Council in 1945, reopened the Royal Opera House, and spent many of the years between the World Wars advocating for more citizens to take up art as a pastime.

7. Keynes promulgated the idea that automation would become so advanced that future generations would need to work a mere three hours a day.

8. He was a pioneer for women's rights. He frequently lectured on the use of contraception and served as vice president of the Marie Stopes Society for Constructive Birth Control.

9. In 1942, Keynes was awarded a peerage for his service to Britain. He took his seat with the Liberal Party in the House of Lords, and his name consequently became Baron Keynes of Tilton.

10. Keynes suffered a heart attack in 1937. After a brief period of recovery, he valiantly pressed on in his career: he wrote three more influential articles on economics, resumed his teaching at Cambridge, and served in Britain's Treasury until his death in 1946.


 
Thanks to his groundbreaking work in logic, the philosophy of mind, mathematics, and language, as well as his two major works, the Tractatus Logico-Philosophicus and the Philosophical Investigations, Ludwig Wittgenstein (1889-1951) played a leading role in 20th-century analytic philosophy.
Jaakko Hintikka was Professor of Philosophy at Boston University. Author of over 30 books, he was the main architect of game-theoretical semantics and of the interrogative approach to inquiry, and also one of the architects of distributive normal forms, possible-worlds semantics, tree methods, infinitely deep logics, and the present-day theory of inductive generalization. He joins us on Culture Insight to share his insight into the life and work of Ludwig Wittgenstein.


The enduring power of religious belief in the 21st century would come as something of a surprise to Sigmund Freud, Ludwig Feuerbach, Karl Marx, and the other long-departed social theorists whose influence lurks in the background of Tamas Pataki's provocative essay Against Religion. These and other observers of modernity expected that, over the long haul, science and reason would emerge victorious over ancient superstition and conservative religious dogma. In the words of Max Weber, one of the most profoundly influential thinkers of the 20th century, modernity had a "disenchanting" effect on the world, relegating faith to the margins. Clearly, this has not happened. How, then, are we to understand the resurgence of faith—especially fundamentalist varieties—in every major religion over the past half-century?

In this self-described polemic, Pataki, Honorary Senior Fellow in the Department of Philosophy at the University of Melbourne, Australia, and Honorary Fellow at Deakin University, also in Melbourne, details the psychologically confused motives that bring ordinary people to believe in God. In doing so, he joins a slew of recent authors like Richard Dawkins (The God Delusion), Sam Harris (The End of Faith), and Christopher Hitchens (God is Not Great), all of whom have been described, for better or worse, as the "new atheists." In seven brief, sharply worded chapters, Pataki dismisses religion as philosophically impoverished—a "phantasy masquerading as knowledge" that derives from a deep-seated human preference for error over truth (page 113). Though he acknowledges that religion provides "hope, consolation, and the sense of being loved and of being worthy to love" (page 7), he argues in the end that religion is fundamentally irrational and that little good can ultimately come from a system of belief founded on a "refusal to think" (page 110).

Pataki's attack does not wander into the thick woods of theology. He does not try to disprove the existence of God or pick fights over specific doctrinal issues that might be relevant to one religious community or another. Rather, he takes it for granted that all religious arguments are "incoherent" and that proving specific beliefs false is beside the point. Even if rational grounds could be discovered in support of religious belief, he explains, the argument would be pointless because most people adhere to religion for reasons that are not anchored in rationality. Instead, Pataki argues that for the "religiose"—those who view their relationship with God as the basis of their identity—powerful socio-psychological forces prop up their faith.

Like many other critics of religion, Pataki asserts that religious faith derives not from the best of human impulses, but from the worst. Religion, he argues, encourages violence and rationalizes cruelty; under a guise of love and generosity, it masks fear and weakness while producing humiliation and shame for its believers, who project their own anxieties onto those viewed as dangerous and demonic outsiders. At bottom, he charges, religious devotion springs from narcissism, which reveals itself in
"A central preoccupation with defending individual and group identity, with self-esteem, 'specialness' and superiority, with achieving certainty or a kind of omniscience, and with sexual morality, which, in the strange logic of narcissism, is closely linked with these other endeavors" (page 33).
Drawing heavily on psychoanalytic theory, Pataki insists that religion is motivated by fear, promotes violence, distorts reality, and encourages servility and dependence. Religion, in other words, does not merely fail to illuminate or enlighten. Rather, it actively subverts humane values, supplying "attractive ideological frameworks for organizing and satisfying the infantile and pathological narcissistic (and other) needs" (page 86). In our own world, all this leads to new extremes of religious fervor, driving believers toward organized aggression and sustaining them as they assert worrisome degrees of influence over government and the law.

Although the book's critique is a philosophical one, it is broadly informed by a sense of dismay with the religious conservatives who play such influential roles in the politics of the United States and Australia, among other nations. In drawing out his argument, Pataki adopts a position that will irk more liberally inclined believers who do not recognize the theology of the Christian right as their own. Rather than consign fundamentalist revivalism to the margins—to argue that fundamentalists do not represent the "true spirit" of any particular religious tradition—he insists that fundamentalism expresses that spirit quite adequately and that fundamentalisms are mere "branches of a noxious tree" (page 13).

Though Pataki hedges his claims by noting that religious people and their communities do sometimes perform "splendid humanitarian work" and make commendable efforts on behalf of the poor, the book's overall tone is sharp in a way that some readers will find uncharitable. These readers will take some comfort, perhaps, in the fact that Pataki does not wish them to abandon their faith. As the author concedes in the book's first pages, his critique is not likely to persuade anyone that their "wishful and delusional" beliefs are not worth holding (page 7).


 
A Pulitzer- and Nobel-winning writer, Ernest Hemingway (1899–1961) was one of the most influential writers of the 20th century, whose simple, clear, and distinctive style revolutionized literature.
Gay Talese is the bestselling American author of eleven books. He was a reporter for the New York Times from 1956 to 1965, and since then he has written for the Times, Esquire, The New Yorker, Harper's Magazine, and other national publications. He joins us on Culture Insight to share his insight into the life and work of Ernest Hemingway.


 
The author of such literary classics as Ulysses and Finnegans Wake, James Joyce (1882–1941) was one of Ireland's most celebrated novelists, known for his avant-garde and often experimental style of writing.
Michael Patrick Gillespie is Professor of English at Florida International University and the Director of the Center for the Humanities in an Urban Environment. He has written eleven books and numerous articles on the works of James Joyce, Oscar Wilde, and William Kennedy, as well as on chaos theory and Irish film. His anthology of early Joyce criticism was published in the spring of 2011 as part of the University Press of Florida Joyce Series. He is currently at work on an oral history of early Joyce studies and on a book on Joyce and the experience of exile. He joins us on Culture Insight to share his insight into the life and work of James Joyce.


A British philosopher, logician, and mathematician, Bertrand Russell (1872-1970) made significant contributions to the fields of mathematical logic, analytic philosophy, metaphysics, ethics, and epistemology. He also wrote extensively on a wide variety of subjects in science and the humanities, and in 1950 he was awarded the Nobel Prize in Literature.
Stephen Neale, Distinguished Professor of Philosophy at the CUNY Graduate Center, discusses Bertrand Russell's seminal paper "On Denoting," published in the journal MIND in 1905, and the ensuing philosophical debate centered on it. He shares his insight into the life and work of Bertrand Russell. https://youtu.be/0IdZ3aXeQM4


Wade Rowland, the author of Galileo's Mistake, characterizes the trial of Galileo Galilei by the Inquisition in 1633 as a defining moment in modern Western culture. As generally understood, this trial pitted arbitrary and dangerous religious authorities against the progress of scientific discovery. In what Rowland calls "the myth of Galileo," the Catholic Church condemned Galileo because he had discovered the truth. The author, one of Canada's leading literary journalists, begins his opposition to this interpretation by educating the reader about how Galileo's contemporaries in the 17th century would have perceived the issues involved in the trial. Rowland writes that it is a mistake to impose modern sensibilities on the 17th century, and he succeeds in conveying how difficult it is for people today to relate to or comprehend the mores of the ancient or medieval worlds. In his opinion, this is because the Scientific Revolution, in which the Italian astronomer, physicist, mathematician, and philosopher played an important role, has overwhelmingly influenced the modern worldview. This is a theme the author returns to again and again. For example, he writes, "I came to share a conviction that the roots of what is most disturbing about the modern world find their nourishment … in what is often called the Scientific Revolution." Rowland believes the Scientific Revolution "expanded the creative horizons of humanity while reducing the mass of individual humans to the status of commodities and consumers" and "improved health and longevity while promoting unprecedented spiritual and existential dis-ease."

Rowland offers a kind of survey of the history of scientific thought, experimentation, and discovery, which is the best part of the book. He provides an excellent and insightful summary of 17th-century science, beginning with the traditional and accepted view of a universe in which the Sun traveled around the Earth and the Bible was the recognized guidebook to the heavens. He covers Ptolemy, Kepler, and Copernicus, providing readable depictions of their discoveries and how those discoveries were accepted—or not—by the general public and the Church. His presentation of Galileo's life and discoveries is especially interesting, noting that in his own time, Galileo was known for his work in mechanics rather than astronomy.
The author repeatedly disavows the contention that he is acting as an apologist for religion, but he seems to be less than an enthusiastic fan of the scientific side of the argument. He writes, "There is a legitimate place for religious insight in the pursuit of science," and wants to emphasize "that science is not the only legitimate fount of knowledge … that it can and should be challenged on some of its most fundamental preconceptions." In a rather alarming mixed metaphor, he writes, "I want to lift its skirts and expose the rot in its foundations."

Rowland's major point is that the Church was not closed-minded about Galileo and his support of the so-called Copernican heresy, and overall, he makes a good case for this position, citing examples of individuals at the highest levels of the Church hierarchy who thought Copernicus was probably right. The Church's real objection to Galileo, according to the author, involved how the astronomer expressed his belief. Galileo thought the universe "was essentially a mathematical reality, in some literal way composed of numbers." The Church could not accept this position because it excluded the possibility of an ultimate purpose to existence. Therefore, the argument between Galileo and the Church was really about "the nature of reality and what we can truly know"—an argument that Rowland believes continues to "bedevil" modern civilization to this day. Galileo's mistake was in believing that "nature is its own interpreter." The author argues it is wrong "to assert, as Galileo did, that there is a simple unique explanation to natural phenomena, which may be understood through observation and reason, and which makes all other explanations wrong… Scientists do not discover laws of nature, they invent them … 'Society's facts' about nature are not preexisting truths, they are human constructs… the truth that science 'discovers' is not objective and immutable, it is subjective and socially contingent."

Interspersed throughout the book are present-day dialogues, or conversations, between the author, his friend and former student Berkowitz, and a nun, Sister Celeste, in which they expand the science-versus-faith debate. These conversations take place in various places in Italy that have a connection with Galileo's life, so the reader gets a flavor of the physical settings associated with the trial. However, at times the dialogues distract from the book's overall flow, and Rowland's tone sometimes slips into condescension toward his conversational partners, characters who are not always successfully portrayed as people who exist beyond their stereotypical roles. For example, Berkowitz always takes the side of materialistic, scientific inquiry, while Sister Celeste, whom Rowland repeatedly describes as "the little nun," favors mystery and faith. The dialogues present additional biographical and historical information about important figures like Kepler and Copernicus, which is sure to be appreciated by readers.

The author is not unbiased in his presentation—nor should he be, since he is taking part in one of the iconic debates of Western intellectual history. For example, he describes Aristotle's ideal man as "a country-clubbish, self-righteous creature of decidedly materialistic leanings. A bit of a stuffed shirt. Very modern, in fact." The author seems to dislike Galileo as a person as well.
He describes him as "insufferable" and cites the "polemical" and "bellicose" style of Galileo's writing, especially in The Starry Messenger, as the real reason for the Church's criticism of his ideas. Overall, this is an interesting take on one of the central events in Western intellectual history, and it is likely to be well received by readers who relish debate about the nature of truth and knowledge.


Willard Van Orman Quine (1908–2000) was one of the greatest analytic philosophers and logicians of the 20th century. His body of work fits into a long history of philosophical treatments of metaphysics, epistemology, and language, especially within the empiricist tradition, and it remains profoundly influential to this day. While his writing was famously clear and crisp, his ideas are complex and require close attention. It is a wonder that Alex Orenstein, Professor of Philosophy at Queens College and the Graduate Center, City University of New York, has been able to summarize these ideas so succinctly in under 200 pages.

Orenstein's book mentions few of the details of Quine's life. There is a reason for this: Quine, who spent many years at Harvard University, first as a student and then as professor of philosophy and mathematics, was a private man. In his own autobiography, The Time of My Life, Quine mostly documents his various travels but mentions few important life events. Orenstein too focuses on Quine's ideas rather than his personality.

Understanding Quine must begin with an understanding of his work on logic. He followed many other 20th-century philosophers in attempting to reduce logic to set theory and mathematics. This follows his general methodological approach of trying to reduce his ontology—the set of things that are said to exist—to the smallest possible number. Quine always attempted to reject unnecessary ontological and abstract entities. The reduction of logic to set theory was one of his most basic efforts to follow this minimalist methodology.

Quine is introduced in Chapter 1. In Chapter 2, Orenstein reviews his work on logic, showing how the philosopher's unique method of determining what exists is built upon his account of logic. Quine developed a particular method of expressing existence claims that led, in turn, to a method for determining a proper ontology. As outlined in Chapter 3, Quine avoided accepting as many abstract objects as he could into his ontology, which caused him to embrace a radical empiricist view of the world and led to several famous attacks on the prospect of a priori knowledge—knowledge of the world gained by reason alone, apart from experience. This is the subject of Chapter 4. Chapters 5 and 6 continue to expand on Quine's conception of analytic or a priori truths and the implications of doing away with analyticity. In Chapter 6, Orenstein discusses Quine's struggle to dispense with the concept of meaning and his argument that many claims in language are indeterminate and refer only inscrutably to their referents.

Quine also famously rejected the coherence of modal logic, the systematic reasoning about what is possible or necessary. In Chapter 7, Orenstein discusses Quine's attempts to critique and do away with modal logic and to produce another analysis of modal claims. In this chapter, he also discusses the philosopher's attempt to do away with propositional attitudes—mental states with representational content. For instance, when I believe that snow is white, I have an attitude—a belief—associated with a proposition (snow is white). Quine wanted to do away with these attitudes as incompatible with his empiricist commitment to physicalism, the view that all that exists in the world are physical particulars (with the exception, in Quine's case, of sets).
This position on the nature of the mind prefigures a modern view known as eliminative materialism, which states that a mature neuroscience will radically alter our conception of the mental, perhaps eliminating most of our ordinary mental concepts altogether.

In the final chapter of the book, Orenstein discusses Quine's approach to epistemology, the theory of knowledge. Quine founded a school of epistemology known as "naturalized epistemology." Traditional epistemology attempted to analyze epistemic concepts without reference to science, but Quine's empiricist bent led him to find this approach implausible. He came to believe that we could study epistemology by learning how human brains interact with their world and store information.

Orenstein ends the book on this note, but other elements merit mention. One of the most interesting features of the book is the author's criticism of Quine's views. Orenstein does not reject Quine's positions but instead informs the reader of the significant challenges to these views that have arisen in past decades, critiques that have gained traction among the many Anglo-American philosophers influenced by Quine. One disadvantage of the book, however, is that the sheer counterintuitiveness of many of Quine's positions receives little attention. Quine rejected the existence of beliefs and desires, denied that some propositions are true by definition, and dismissed the very concept of meaning. Such radical stances would seem to merit some discussion. Quine's positions, which he claimed derived from common sense, stand stridently against a common-sense understanding of the world. This doesn't mean he was wrong, but in the eyes of many philosophers, it is a significant strike against him.


1. Leo Tolstoy (1828–1910), who is most famous for his magnum opus, War and Peace (1867), journaled about his life with monastic discipline. From the time he was a teenager, he recorded his daily routine, which included waking by 10 am, as well as his moral failures, such as his routine brothel visits and gambling problems.
2. Tolstoy never finished his degree. As a student at the University of Kazan, he was described by his professors as someone "unable and unwilling to learn." After earning low marks in his coursework throughout his two-year tenure, he dropped out of the university.

3. He served as an artillery officer in the Crimean War. During his time in the military, he wrote the novel Boyhood, the second in his autobiographical trilogy: Childhood, Boyhood, Youth.

4. He married the much younger Sophia Bers within a few weeks of meeting her. Tolstoy was 34 years old and Bers only 18 on the day of their wedding.

5. Tolstoy and his wife had 13 children together, but only nine survived infancy.

6. He became a religious leader. Through religious syncretism, he developed a cult whose members sought to live ascetic, pacifist lives. Many followers of Tolstoy, or Tolstoyans, as they were called, sojourned with him on his estate. Some even founded communes of their own throughout Russia and parts of Europe.

7. As a result of his newfound religious zeal, he began wearing peasant clothing and giving away much of his money. His wife was concerned that he would lead the family into bankruptcy, so he gave her the rights to his royalties and copyrights.

8. Tolstoy was excommunicated from the Russian Orthodox Church on February 22, 1901. To this day, his excommunication stands. The church has ignored pleas from Tolstoy's descendants to reinstate the writer, claiming that its elders cannot rescind the excommunication of a former church member who has already died.

9. Tolstoy's marriage to Bers was one of the rockiest relationships in the world of literature. Despite Bers' assistance in publishing War and Peace, managing their estate, and tending to their many children, Tolstoy frequently argued with his wife over financial matters and his ascetic practices.

10. One night, when Tolstoy was 82 years old, he realized he was unhappy with the constant bickering between him and his wife. He took his youngest daughter, Aleksandra, and fled to a small plot of land owned by his sister. During his escape, Tolstoy's health declined fatally; he died of pneumonia on November 20, 1910.


When the remains of that long-necked lizard, the Brontosaurus, were first discovered in 1879, they were believed to represent a new genus and species. Thirty years later, this classification was called into question: the animal was too close to the already known Apatosaurus to be anything other than merely a species in that genus. So Brontosaurus was discarded—until over a century later, in our present day, when it was again agreed that the two were distinct genera. So much controversy over fossils. What is interesting about this process of controversy and counterclaim (and counter-counterclaim) is how it seems to be live-streamed in the recent republication of Brian Wilks' 1978 biography of Jane Austen. Not because it is a controversial book. Far from it. Wilks writes with the pure joy of a studious hobbyist and not as a trench-digging professor (he used to teach English Literature and Drama at Leeds University). But its excavation from the archives gives us a chance to reflect on both Jane Austen and the way our perception of her has changed over time. From the start, Wilks provokes this way of thinking with a new foreword, inviting us to consider how Austen "would be blogging with the best of us." Reading on, one feels not unlike Wilks does in looking over Austen's letters: "We have the fossilized remains of the live event."

It's unlikely that anyone alive today is not aware of Jane Austen. The early 19th-century novelist is associated not only with the beginnings of the novel but also with having sparked the first modern cult of the celebrity novelist. And let's not forget that every one of her novels (including the incomplete Sanditon) has one or more movie adaptations. Yet the way we read her novels has changed over time, despite the constant critical and popular acclaim that has followed Austen posthumously. Wilks' 1978 perspective highlights this, noting that "the story of Jane Austen is the story of a family ... against a background of social change and the growing impetus of the industrial revolution ... in one of the most formative periods of English history." Who else would think of Austen this way, framed against what we now call the Industrial Revolution? For a man as deeply embedded in 19th-century literature as this author of several books on the Brontë sisters, who also served as Vice President of The Brontë Society, it's a fitting view of Austen. At one point, Wilks directly compares walking around the Church of St. Nicholas, for conveying Austen's "sense of place," to walking around the Haworth moors for conveying the Brontës'.

This is his biography's greatest strength. Wilks' view is not that of a paleontologist but of a spelunker. His exploration of Austen's life has an eye toward establishing her "sense of place," far removed from contemporary anxieties, while keeping his critical lens affixed to the bridge of his nose. While never ignoring the "high contrasts and gross inequality" or the fact that "England was at war for all but seventeen of Jane Austen's forty-two years," Jane Austen is first and foremost a guided tour of the novelist's life. Along these lines, Wilks provides a remarkably effective response to the complaints (still around forty years later) of her plots' seeming frivolity: "in a picturesque and unspoiled countryside the threats of the future, like those of French invasion, could be considered but largely ignored." The simplicity of this explanation becomes doubly powerful once our understanding of Austen develops alongside her personality, from precocious child to complicated adult. It's a headlamp against the cavernous curiosities of her inner workings, and it has the novelist's touch. Of course, this can make the book feel like a hagiography to some modern readers. Our cynical times give admiration a naive quality Wilks' biography wouldn't have had in 1978.

The issue is partly that only one year after the publication of Wilks' Jane Austen, one of the definitive texts of feminist literary criticism—The Madwoman in the Attic by Sandra Gilbert and Susan Gubar—was published. That work would, in essence, view Wilks' perspective as precisely the problem with 19th-century literary heroines: they were either "contemplative purity" or "significant action." Meaningfully, Gilbert and Gubar's book takes its name from a character in Charlotte Brontë's Jane Eyre. This fundamental shift in criticism and culture makes it difficult to look directly at Wilks' reminders that "we all from time to time submit to the claims of fashion" or that Austen "took pride in being topical." Wilks' interest in Austen's life as she lived it seems limiting to someone in the 21st century. One almost wishes that Wilks, or someone taking up his new foreword's call, would write about Austen in relation to our age of social media. In fact, such a book is already partly present in the biography. "Writing was clearly part of a social event," Wilks says of the way the Austen family encouraged Jane to share her work at a young age, "an exchange between persons who had much in common."

For those uninclined to read Austen, or for those who are passersby on their way to another author, Jane Austen can catch the eye. Alongside Wilks' contextualization of her country, her family, and her personal crises, it intrigues and leads like any good story. It may not compel its reader any more than one of Austen's novels, but that seems like something with which the author himself would agree. If "Northanger Abbey and Persuasion ... tell us of Bath and its former glory more effectively than any guide book," what hope does Wilks have of being a better guidebook to Austen himself? And while certain qualities of Wilks' biography haven't aged as well, we should keep in mind his own commentary on reading Austen: "because we have abandoned a habit owing to the implications of our present way of life, we should not overlook commonplace occurrences of two hundred years ago and find them remarkable in their time." One can't think of a more fitting sentiment for Austen, in whose work it is this affection, this letting another speak for themselves, that breeds love—an author whose characters she "'allowed' ... their own life." It is a feeling one might find in the paleontologist who, hours into looking over Apatosaurus bones, sees the Brontosaurus in them at last.


1. Isaac Newton (1643–1727), the English physicist, mathematician, and astronomer, was born on January 4, 1643. Although one's birthday would seem to be an undisputed fact, the rest of the world understood Newton to have been born on Christmas Day of 1642. At the time, England still used the Julian calendar while much of Europe had adopted the Gregorian calendar, which ran ten days ahead; December 25, 1642 in England was thus January 4, 1643 on the continent. Today, most scholars give Newton's birthday as January 4.

2. His father, whose name was also Isaac, died three months before Newton was born. Adding to his mother's burden, Newton was born prematurely and was not expected to survive.

3. His childhood was rather traumatic and left an indelible mark on his personality. When Newton was three, his mother left him with his grandmother and went off to marry Barnabas Smith, a successful minister. Though she returned to her son after Smith died, the damage had already been done. As a teenager, Newton once confessed the sin of "Threatening my father and mother Smith to burn them and the house over them."

4. When he reached school age, Newton studied at the King's School in Grantham. Around the age of 15, his mother pulled him out of school to help tend the family farm; however, Newton did not take to farm labor. His mother, realizing that her son would not be of much help on the farm, sent him back to school to finish his education.

5. Newton's intellectual abilities were largely overlooked during his studies. When he completed his undergraduate program at Cambridge University, he graduated without honors or distinctions. Although he earned a full scholarship for his graduate studies, he never held a full fellowship, only a minor position. Still, he persevered, and upon the publication of his first major work, De Analysi, the mathematical community finally recognized him.

6. At 26, he was appointed to the prestigious position of Lucasian professor of mathematics at Cambridge. Despite his rising influence, he displayed very little interest in teaching, and thus his lectures were sparsely attended.

7. Newton served two terms in Parliament, the first from 1689 to 1690 and the second from 1701 to 1702. As a representative of Cambridge University, Newton contributed very little to the governmental affairs of Britain. He spoke only once, and his utterance was not a magnificent speech on the floor but a simple request that an usher close a nearby window against the cold.

8. Newton defended his claim to the invention of calculus by unjust means. Gottfried Wilhelm von Leibniz, a German mathematician, published his version of calculus years before Newton published his own; Newton had developed calculus in the 1660s but did not publish his findings. What resulted was a bitter battle between the two scholars. Leibniz presented his case to the Royal Society in 1711, but Newton was then the Society's President and had a strong influence over its affairs. Newton assembled a supposedly independent committee to investigate who deserved the title of calculus's discoverer; however, he packed the committee with his supporters. Unsurprisingly, the committee named Newton as the discoverer.

9. It has been said that Newton was inspired to study gravity after an apple fell off a tree and hit him on the head. However, this is not precisely the case. Newton was near an apple tree and observed an apple as it fell. The apple fell straight down instead of at an angle, and this observation inspired him to develop his law of gravity.

10. Newton died at the advanced age of 84 on March 20, 1727. His body lies in Westminster Abbey, the resting place of Britain's monarchs and of honored citizens such as Charles Darwin and Stephen Hawking.


Samuel Clemens, also and perhaps more widely known as Mark Twain (1835–1910), continues to be a person of great curiosity. Such interest will endure, no doubt, as will his literary accomplishments. There are many biographies written about the man and equally as many historical novels breathing life back into the author. Twain's End by Lynn Cullen (author of Mrs. Poe) stands out, not only for her masterful writing but also for the particular time of Twain's life she captures in this fictionalized account—its end.

The beginning of Cullen's tale focuses on a turning point in Twain's life, when age and health nudge him toward a more sedate existence, away from the glittering Fifth Avenue of New York City to an Italian-style villa in quieter Redding, Connecticut. With him are his daughter Clara and a private secretary, one Miss Isabel Lyon. From the first words, Cullen, a lyrical writer, sets the tone, one reminiscent of the iconic Jane Austen.

At this juncture, Isabel Lyon has been his secretary for six and a half years. Twain has given her a little house on the villa's grounds, where Isabel's mother lives and where the world assumes Isabel lives as well. Isabel's mother holds little gratitude for her home; it is dashed by her distaste for her daughter's position with Mr. Twain. Mrs. Lyon offers the first glimpses of Twain's truths—his selfishness, his temper, as well as his talent—and how much better her daughter deserves. After all, Isabel had been raised reading the Iliad, attending Broadway plays, and rubbing elbows with the likes of Horace Greeley, all by her scholarly father's side, a father now deceased.

Cullen introduces Twain with aplomb: "At seventy-four, he held himself with the amused confidence that a younger man could only pretend to, a confidence that invited you to let down your guard even though you knew he would not be doing likewise." Her depiction of Twain's daughter is equally economical, though far more cutting: "Clara barged in like she owned the place, which she would, as soon as she could shove her father off this mortal coil." The rendering of Clemens/Twain throughout is done not through the eyes of adoration but through a crystal-clear glass, a magnifying psychological intensity, revealing not the legend but the man, with all his foibles, eccentricities, and even cruelty.

The truth of Isabel's relationship with Twain is dropped in with such nonchalance that it lands with greater effect: "(The butler) glanced up, the silver creamer cradled in his callused hands, then looked back down quickly. A blush flooded the hollows of his cheeks. He had been unable to meet her eyes since discovering that her bedroom in the big house adjoined The King's." Throughout the scenes in which Isabel carries the tale, she refers to Twain as "The King," her King, always capitalized, even in her speech. Her mother loathes it. But Isabel loves, falls in love with, the man. "She wished people would not call him Mark, not even Helen Keller. Mark Twain was not a real person. The person they were addressing was Samuel Clemens. But The King never corrected anyone on this.
Instead, something inside him seemed to shift when he heard it, as if the mortal Sam Clemens were stepping aside for his slow-moving doppelganger, Twain."

Once the story is fully anchored in the present—done adroitly with a cast that includes Helen Keller, her teacher, Mary, and Mary's husband, and the peculiar triangle they form—the story steps backward, to Isabel's intellectual, eclectic childhood and on to the onset of her relationship with Twain. Here we find Twain's wife: sickly, sequestered, and not entirely sane. She is, nonetheless, not blind when it comes to the ways of her husband. Her presence in his life, and in the book, is haunting—ever there but rarely present, a shadowy wraith that hovers at their edges, unseen hands pushing this way or that. The once impoverished man that was Samuel Clemens rarely forgot his wife. The rake that was Mark Twain often forgot he was married.

Isabel's riches-to-rags life led her to the Twain household as a governess. Depicted concurrently are the events that led to Twain's career in writing, a journey told at the length of a bible in his four-volume autobiography. Here it is swiftly rendered, and with poignancy: "Desperate now for cash, (after failing prospecting in Nevada) I saw that I might write a little piece for the local paper. I liked writing it, and people liked me writing it. I traded in my pan for the pen and never looked back, something I never would have done if I had found the mother lode." His career trajectory crossed Isabel's life; strange coincidences continued, pulling them apart, bringing them back together. Cullen sweeps along, defining their lives and the power of destiny and fate, a twisted power that damages as much as it enriches, and always surprises. Such surprises dot this tale like ornaments on a tree, though they do not always glitter.

Cullen's mastery of Twain's dialogue is perhaps one of the book's greatest assets; what he says and how he says it pull back the veil upon the rough-hewn literary genius. Perhaps the book's weak spot is its shifting of time, not always clearly delineated. But that is a feeble criticism at best. There is a good deal of lead-up to the core of the tale; although filled with deft development of these extraordinary characters, its rise to the conflict is a shallow incline, requiring a bit of patience. Regardless, the jewels of Cullen's prose glitter, dropped and perched all along the story like gems on a crown: "And his eyes—oh, when he looked at her like that, with those quicksilver eyes. There was never a more beautiful beast."

Twain's End is the story of Twain and Isabel, their struggles with their growing feelings, and the odd culmination of those feelings. But it is also the story of Clemens's journey into old age and the steps of his life before its final chapter. It is a Twain little known but highly worth knowing.


 
One of the most influential American inventors of all time, Thomas Edison (1847–1931) is responsible for the creation of several devices that shaped the face of modern technology. Most famous for his invention of the first practical light bulb, Edison was also a shrewd businessman who bridged the gap between invention and large-scale manufacturing. Possibly the single most important figure of the Second Industrial Revolution, he built a vast network of corporate contacts that ensured his name was forever cemented in history as that of the archetypal American scientist.
Paul Israel is a historian of technology who serves as the Director and General Editor of the multi-volume documentary edition of the Thomas Edison Papers at Rutgers University. He joins us on Culture Insight to share his insight into the life and work of Thomas Alva Edison.


Sylvia Plath and Anne Sexton in Robert Lowell's poetry class in Boston are the stuff of lore, fostered by Sexton's rollicking report, including how she would park in the loading zone of the Ritz and say it was okay because they were going to get loaded. The tales in Plath biographies about the raucous roundelays of these two poets are supplemented by Kathleen Spivack's first-hand, sensitive account of their classroom behavior in With Robert Lowell and His Circle: Sylvia Plath, Anne Sexton, Elizabeth Bishop, Stanley Kunitz & Others (2012). Gail Crowther, author of The Haunted Reader and Sylvia Plath (2017) and co-author of These Ghostly Archives: The Unearthing of Sylvia Plath (2017), has not only mastered the literature in which both Plath and Sexton appear but has also done her own original archival research and consulted with one of Sexton's daughters and with Plath's friend Elizabeth Compton Sigmund. The result is an engrossing book that challenges the very limits of what we think we know about Plath and Sexton.

Part of Crowther's purpose, although she does not state it, is to continue the mission evident in Heather Clark's Plath biography: situating these poets amid the pressures of their cultural contexts, showing how alone they remained as they challenged notions of what women could accomplish as writers, wives, and mothers. They were quite different personalities and poets. There is a kind of looseness in Sexton that is both her greatest strength and her limitation: she writes as a poet of the viscera, bold and inevitably called "confessional." Sometimes she does not know when to quit and accept the perfection of verse that Plath achieved with such vigor in her final poems, overcoming her earlier tendency to stifle spontaneity in a quest for formal precision. For all their differences, however, these rivals inspired and pilfered from one another in mutually beneficial ways.


Both ended as suicides, in desperation and despair, but also in an excruciating understanding of how their lives had reached their limits. Both readily talked about suicide in their Ritz revelry. They were dead serious about the right to annihilate themselves. They "felt death made them more real," Crowther notes. Were they wrong to think so? Crowther does not pose such a question, but it is worth answering, insofar as both keenly wanted the attention of posterity, and therefore how they died, and when they died, was of signal importance to them.

For very different reasons, as Crowther shows, both needed support systems that men took for granted and that allowed men to weather their depressions. For a time, in their marriages—however fraught—these women had a kind of infrastructure of home and family, although Plath was much better at organizing herself, her husband, and their two children. In fact, she was so good at organizing that her husband Ted Hughes, as Crowther shows, was quite bereft without her, even though he left because he did not want to be organized!


Crowther deplores the pathographies that see in every sadness and depression warning signs of demise. To be sure, the signs are there, but so is the evidence of Sexton’s and Plath’s ebullience and power. They both knew how to eat men like air. But men were not their only problem. As Crowther notes, “both women had overbearing and emotionally demanding mothers,” and try as they might, these poets also found it difficult not to be “overbearing and emotionally demanding.” Did they discuss as much at the Ritz? Crowther can only speculate: “It seems unlikely Plath and Sexton would not talk about their mother-daughter relationships.” Both women also liked to talk about the men in their lives. Crowther asks: “Did they gossip about their husbands and dramatic marriages over their third martini at the Ritz while these very husbands were at home waiting for them to return?”

Crowther gets at what was going on inside these poets, what they were like, by piling up the material evidence. So she compares their address books: Plath's "neat and self-contained," Sexton's "wilder and more flamboyant." Yet both books "have in common . . . the merging of the domestic with professional." The lives of these women were full of business of all kinds—certainly more than their husbands, who did not have to balance, in Crowther's words, "rebellion and conformity," could have managed. Women were taught to be perfect at everything, a lesson that Plath especially took to heart. It was a reaction to "learned behavior," not "neurosis," Crowther points out, that provoked these women's breakdowns.

In a horrifying chapter about mental illness, Crowther delineates the malpractice of medical professionals who prescribed untested drugs and delivered unproven therapies like insulin shock that made for no discernible improvement. To be institutionalized deprived these women of the support they received from women friends. Crowther reports: "Connie Taylor Blackwell, a Boston friend of Plath's, described how they would often meet and drink sherry and talk about 'the void.' It was a big topic, claimed Blackwell, because at the time, in the 1950s, women were being pushed in so many directions that the attraction of nothingness for them was very real. Women, quite simply, got exhausted by it all. . . . There is a reason why more women than men are diagnosed with depression. There is a reason why most people diagnosed with depression who are given drugs are women." Sexton and Maxine Kumin, a fellow mother with young children at home, spent hours on the phone critiquing each other's poetry. Kumin grounded Sexton and yet kept their collaborations "relatively secret because they feared people would lump them together and see them as alike. It was a struggle for separate identities in a literary world that was hostile enough to women." The Kumin effect is evident in Crowther's account of a joint interview during which Sexton "waxes lyrical about female friendship," ending with "I think I'm dominating this interview." Kumin replied dryly, "You are, Anne."

Crowther is a sociologist by training, and that background adds a good deal to her understanding of biography: 

Of course, once they died, Plath and Sexton immediately became statistics, slotting into the cultural pattern and story of who was dying where, when, and how. The week Plath died in February 1963 there would have been at least ninety-nine other suicides, and an extra twenty-five to fifty who did not make the official lists. In 1974 when Sexton died, she fell into the category most prone to kill themselves, forty-five to sixty-four-year-olds. This impersonal information shows that sadly, there was nothing unique or doomed-genius-woman-writer about their deaths. They were, along with many other people, too desperate and wrought to continue living.

Crowther does not ignore psychology or the material conditions of her poets’ last days, showing how difficult it was for them to retain equilibrium without marriage, although she does not argue that being single is the cause of their deaths. Instead, she returns to an evocation of how much it took for them to live, as evidenced in 

the endless manuscripts they left behind, worked and reworked; thousands of letters, teaching notes, business contracts, household bills, bookkeeping. But also, as Plath said, in the lares and penates of a woman's life: lipstick smears on envelopes, cigarette burns on letters, coffee cup rings on poetry drafts, their personal libraries, recipe books, and in their perpetual struggle against the male-dominated discipline they were so determined to succeed in.


In 2005, Federal Reserve Board Chairman Alan Greenspan delivered the Adam Smith Memorial Lecture in Kirkcaldy, Scotland. Here, at the birthplace of the great philosopher, Greenspan lauded Smith as "a towering contributor to the development of the modern world" whose 1776 work An Inquiry into the Nature and Causes of the Wealth of Nations "reached far beyond the insights of his predecessors to frame a global view of how market economies, just then emerging, worked." More recently, the Nobel Prize-winning economist Joseph E. Stiglitz repeated the widely held view of Smith as 'the father of modern economics.'

These sentiments also underpin Eli Ginzberg’s Adam Smith and the Founding of Market Economics. Ginzberg (1909-2002) was Professor Emeritus of Economics at Columbia University, and he published extensively on the policy impact of human resources, attributing his lifelong interest in this field to his earlier interest in Smith’s writings. Ginzberg’s doctoral thesis in economics was published amidst the Great Depression in 1934 as The House of Adam Smith, and the volume under review is an overdue re-publication, from 2002, with a new introduction.

The book is divided into two parts, with the first deconstructing and interpreting The Wealth of Nations, and the second more speculative and polemical, hailing Smith as the great founder of classical liberal thought while criticizing those (including the Classical Economists and President Herbert Hoover) who hijacked his legacy for their own ends. Ginzberg’s work is a provocative if somewhat hagiographical account of Smith’s prescience, imagination, logic, and humanity. Contrary to orthodox modern academic practice, Ginzberg adopts a broad-sweep (and occasionally disjointed) approach. His method, across several chapters, is to assess the concepts relating to moral philosophy, natural law, economics, and history, to demonstrate Smith’s genius. With impressive if often turgid prose, Ginzberg argues that Smith’s great philosophical observations and discoveries were later distorted and subverted to provide a theoretical basis to justify economic and political inequality characterized by disparities of wealth and political power. In contrast, Ginzberg asserts that Smith extolled the virtues of the laboring classes and sought to enhance their social and economic status (pp.139-40).

With an emphasis on the civic and communitarian as opposed to the individualistic elements of enlightened self-interest, Ginzberg casts Smith in the guise of moral and social reformer and anti-monopolist rather than the founding theorist of laissez-faire capitalism. Ginzberg states:

“The master manufacturers and merchant princes would never give up part of their wealth in order to improve the condition of the petty artisan and the tenant farmer. Smith knew, however, that if the majority of the citizens could be convinced that his proposals would enhance their welfare his reforms might more easily be achieved” (p.234).

Citing Smith’s interest in national as opposed to individual welfare, Ginzberg argues that Smith believed increasing the ‘individual’s scope of action’ was the most effective way of advancing the prosperity of the community:   

“Unseen hands facilitated this transformation; the procedure was not, however, very mysterious; it depended only upon the establishment of perfect justice, perfect liberty, and perfect equality” (p.129).

The notorious 'Invisible Hand' guiding the market forces of resource distribution and allocation to promote the public good may have owed something to Providential-type terminology, but recent writers reject the idea that Smith subscribed to any link between theology and economic behavior. It was in the material world that Smith laid the theoretical foundations of Classical Liberal thought and action; notably, by demonstrating the mutual benefits derived from open international trading relationships, he discredited mercantilist theory.


For Smith, the division of labor (demonstrated with fine clarity via the famous example of the pin factory) and enlightened self-interest—the mainsprings of domestic economic growth—were impeded by the customary regulations, trade restrictions, and misallocation and inefficient use of resources characteristic of mercantilism. As an economic philosophy, mercantilism's zero-sum battle for resources was undoubtedly primitive in nature and limited in scope, and its doctrines did not go unchallenged. As the Universal Merchant (1753) argued:

“That the Wealth of a Nation is the common Benefit of its Neighbors, and that where Commerce flourishes, the People neither merit Envy, nor are to be feared.”   

Other writers had, long before Smith, attacked mercantilism, and some time passed before the importance of Smith's work was acknowledged. The popularity of Smith's work from the early nineteenth century coincided with the rise of the commercial and industrial bourgeoisie and the growth of political economy literature. The Classical Economists, most notably David Ricardo and James Mill, broadly subscribed to Smith's principles, but for Ginzberg there was a conceptual gap between them. Whereas Smith described the great philosophical principles governing human society and relations within the context of a dynamic and expanding capitalist economy, the Classical Economists devoted their time to addressing concerns relating to the "business economy of their own day" (p.167). Ginzberg argues that these later economists incorrectly believed Smith's work to be primarily concerned with defending private enterprise and free-market capitalism, when in fact Smith "was not an economist of modern industrialism" (p.148). By painting a grim picture of the industrial system and the deviations and distortions of laissez-faire perpetrated by Smith's acolytes, Ginzberg draws a clear distinction between the nuanced moralistic and philosophical aspects of Smith's work and the more hard-nosed ideology of the nineteenth-century practitioners of the 'dismal science,' who despite believing themselves to be preaching Smith's "gospel" were "false prophets" (p.168).

“The use to which Adam Smith was put by the avaricious capitalists resulted in the reformers’ antagonism towards him. His true character was lost in the battle over property rights and class privileges” (pp. 200-1).

While attributing sophistry and deception to the Classical Economists, Ginzberg acknowledges that the theoretical development of political economy was in its infancy and that economic conditions were substantially different from those of Smith's day. Moreover, it is surely the case that such a jaundiced view underestimates the theoretical contribution made by Smith's successors, most notably Ricardo's theory of comparative advantage.

One of the main problems with Ginzberg's approach is that, while he has undertaken a close reading of Smith's work, he rarely allows for or identifies intellectual influences. The Physiocrats are dismissed in a few sentences, and other economic writers are not mentioned at all. Yet Smith was clearly familiar with arguments made by earlier thinkers for breaching the monopolies enjoyed by Chartered Companies, for he integrated these critiques into his work. Ginzberg also underplays the influential arguments of Josiah Tucker and David Hume regarding foreign trade expansion and domestic economic development. Smith's ideas did not emerge in a vacuum but represented a convergence of different strands of earlier ideas and proposals. The conceptual framework for his theories came from a respectable if eclectic intellectual lineage.

The propensity of succeeding generations to reinterpret issues in the light of contemporary events, and, more plainly, the more rigorous research methods of modern scholarship, mean that it is ultimately the fate of all historical works to be superseded. Given the time that has elapsed between the first appearance of the book and its republication, Ginzberg's work is no different. Recent critical analysis and the publication of letters and journals have enriched our knowledge and understanding of Smith and have led to more refined interpretations of his philosophical principles and policy prescriptions. While these works do not make Ginzberg's book obsolete, it must be considered within the context of recent research.

Over time, the use of Smith’s theories to justify particular schools of thought has intensified, and his work continues to inform contemporary debates on globalization. His legacy is a contested one, with free-market ideologues vying with radical libertarians and Third Way thinkers in claiming Smith’s ideas for their own political views. Most consistently, from the Classical Economists to the Austrian School, Hayek, and Friedman, Smith’s work has been a foundational text in the advocacy of classical economic liberalism and political conservatism. In that sense, Ginzberg’s plea to re-assess Smith’s writings free from the ideological trammels of later thinkers and contemporary political controversy is a reasonable one. A modern re-publication is, therefore, useful as a timely reminder of the constant need to question unspoken assumptions and to reconsider the power of historical legacies and the ideological battles that continue to inform, influence, and shape them.
