HERITAGE OF KNOWLEDGE:
SCIENCE, PHILOSOPHY, RELIGION & MYTH (1)
INTRODUCTION
The question of what it means to know something “has been on philosophers’ minds for 2,000 years or so,” says Professor Paul Harris, one of three faculty members who gamely entered the fray in a conversation hosted by Usable Knowledge for its video roundtable series.
There are several ways to think about knowledge, said Associate Professor Tina Grotzer as the conversation opened. There is conceptual knowledge — “the framing of ideas and mental models, how we construct information in our head” — and there is procedural knowledge: “how we do things — algorithms, recipes, know-how.”
Much of Grotzer’s work explores a subset of conceptual knowledge, something she calls structural knowledge — “how concepts are structured in the deepest sense … what we think about numeracy, how we reason about cause and effect, those very basic assumptions about the nature of how the world works.”
About Harvard Library
We are the libraries and archives of Harvard University. At Harvard Library, we are champions of curiosity. We aim to be global leaders in expanding world knowledge and intellectual exploration. We engage with our communities in the creation and sharing of new knowledge, connecting them with the vast collections that we curate and steward through collaborations around the world. At its core, our mission for over four centuries has been to advance the learning, research, and pursuit of truth that are at the heart of Harvard.
Our efforts are motivated and powered by working collaboratively, embracing diverse perspectives, championing access, aiming for the extraordinary, and always leading with curiosity.
Purpose
We champion curiosity for the betterment of the world.
Vision
We aspire to be global leaders in expanding world knowledge and intellectual exploration.
Mission
We are expert partners on the pathways to knowledge. We engage with our communities in the creation and sharing of new knowledge, connecting them with vast collections that we curate and steward with collaborators around the world.
At its core, our mission is to advance the learning, research and pursuit of truth that are at the heart of Harvard.
Values
Lead with Curiosity. We expand intellectual frontiers and remain in awe of what we do not yet know.
Seek Collaboration. We bring people and ideas together from within and beyond because we believe partnership creates more interesting results.
Embrace Diverse Perspectives. We cultivate and celebrate diversity in our collections and our community to construct a more inclusive and just world.
Champion Access. We enhance access to information and advance inclusive models of scholarly communication.
Aim for the Extraordinary. We drive progress and deliver the unexpected, building on our past and forging the future.
One of Harvard Library’s defining strengths is our collections. They are vast – we are the world’s largest academic research library – but, more importantly, they reflect the global reach and depth of the university’s remarkable academic programs and history. They are the result of centuries of international collecting, they include valuable ephemeral materials, and they span a very broad range of specializations. We are fortunate to have a network of expertise across the system focused on stewarding vulnerable and critical resources, from a diverse set of perspectives. Many areas of the collections have played a role in the creation and development of academic disciplines and continue to reflect the history of those fields. In other words, our collections differentiate us from other academic research libraries and contribute significantly to world knowledge.

When we think about the past, we think about history. When we think about the future, we think about science. Science builds upon its past, but also, simultaneously, denies it. As the Romantic essayist Thomas de Quincey claimed, for working scientists Isaac Newton’s Principia of 1687 has no more value than an outdated cookbook. What then does a history of science look like?
Science has been made over thousands of years by people from a diversity of cultural traditions. Activities such as experiment originated in Renaissance pharmacies, kitchens and artisan workshops; evolutionary theory drew on sources ranging from sheep breeding to the economics of human populations; astronomy emerged from attempts to read the heavenly language of the stars.
The history of science itself developed as a discipline in Europe and North America during the late 19th century, as a way of charting the rise of a distinctively modern world under European domination. Science seemed uniquely a product of white men in the West. In recent years, however, historians of science have turned this view on its head, so that science is understood as the outcome of global interaction, conflict and exchange. The rise of the universities as a key site for learning in medieval Europe, the reorganisation of scientific disciplines in the decades around 1800 and the rise of genomics and computing in the late 20th century: these and other pivotal episodes are part of changes in cross-cultural commerce and trade.
Science, which has often aimed to establish universal standards, has close connections with the history of empires, from Assyria, Egypt and the Americas to China and India. It has been at the service of princely courts, the military and other centres of power. At the same time, however, the ‘scientist’ (a modern word, dating from the 19th century) is often recognised as having a special kind of moral authority, associated with ideals of detached expertise and neutral objectivity.
Nothing in our culture seems more objective than ‘nature’: but how did that come to be the case? The history of science helps us to understand how things we now take for granted, from the circulation of the blood to the existence of black holes, have been accepted. This requires looking at knowledge in the making, with false starts and wrong directions taken just as seriously as what now appear as brilliant insights. It also means looking at how consensus is established, the diverse range of activities that go into the making of science, and how science becomes part of everyday life.
Historians of science do not simply chronicle progress towards the present, nor do they search for the origins of a one-size-fits-all scientific method. Instead, they ask how discovery became identified as a key feature of science, and how different methods have arisen in different subjects. They look to the material traces of the past, preserved as instruments, maps, clay tablets, palm leaf manuscripts, archaeological remains, and books. The range of skills and methods needed to interrogate these materials means that historians of science are found in a remarkable variety of places, from dedicated academic departments and science studies units to libraries and museums.
Perhaps more than just about any other subject, the history of science challenges profound divisions in our disciplinary map of knowledge (…)
https://www.thebritishacademy.ac.uk/blog/what-is-the-history-of-science/
Science Made the Modern World, and it’s science that shapes modern culture. That’s a sentiment that gained currency in the latter part of the nineteenth century and the early twentieth century—a sentiment that seemed almost too obvious to articulate then and whose obviousness has, if anything, become even more pronounced over time. Science continues to Make the Modern World.
Whatever names we want to give to the leading edges of change—globalization, the networked society, the knowledge economy—it’s science that’s understood to be their motive force. It’s science that drives the economy and, more pervasively, it’s science that shapes our culture. We think in scientific terms. To think any other way is to think inadequately, illegitimately, nonsensically.
In 1959, C. P. Snow’s The Two Cultures and the Scientific Revolution complained about the low standing of science in official culture, but Snow was presiding not at a funeral but at a christening. In just that very broad sense, the “science wars” have long been over and science is the winner. In the 1870s, Andrew Dickson White, then president of Cornell, wrote about the great warfare between science and what he called “dogmatic theology” that was being inexorably won by science. In 1918, Max Weber announced the “disenchantment of the world,” conceding only that “certain big children” still harbored reservations about the triumph of amoral science (Weber, [1919]1991: 142).
Some years earlier, writing from the University of Chicago, Thorstein Veblen described the essential mark of modern civilization as its “matter of fact” character, its “hard headed apprehension of facts.” “This characteristic of western civilization comes to a head in modern science,” and it’s the possession of science that guarantees the triumph of the West over “barbarism.” The scientist rules: “On any large question which is to be disposed of for good and all the final appeal is by common consent taken to the scientist.
The solution offered by the scientist is decisive,” unless it is superseded by new science. “Modern common sense holds that the scientist’s answer is the only ultimately true one.” It is matter-of-fact science that “gives tone” to modern culture (Veblen, 1906: 585–88). This is not an injunction about how modern people ought to think and speak but Veblen’s description of how we do think and speak. In 1925, Alfred North Whitehead’s Science and the Modern World introduced the historical episode that “made modernity,” which had not yet been baptized as “the Scientific Revolution”: it was “the most intimate change in outlook which the human race had yet encountered . . .
Since a babe was born in a manger, it may be doubted whether so great a thing has happened with so little stir.” What started as the possession of an embattled few had reconstituted our collective view of the world and the way to know it; the “growth of science has practically recoloured our mentality so that modes of thought which in former times were exceptional, are now broadly spread through the educated world.”
Science “has altered the metaphysical presuppositions and the imaginative contents of our minds . . .” Born in Europe in the sixteenth and seventeenth centuries, its home is now “the whole world.” Science, that is to say, travels with unique efficiency: it is “transferable from country to country, and from race to race, wherever there is a rational society” (Whitehead, [1925]1946: 2). The founder of the academic discipline called the history of science—Harvard’s George Sarton—announced in 1936 that science was humankind’s only “truly cumulative and progressive” activity, so if you wanted to understand progress towards modernity, the history of science was the only place to look (Sarton, 1936: 5).
The great thing about scientific progress was—as was later said and often repeated—that “the average college freshman knows more physics than Galileo knew . . . and more too than Newton” (Gillispie, 1960: 9). Science, Sarton (1948: 55) wrote, “is the most precious patrimony of mankind. It is immortal. It is inalienable.” When, toward the middle of the just-past century, the Scientific Revolution was given its proper name, it was, at the same time, pointed to as the moment modernity came to be. Listen to Herbert Butterfield in 1949, an English political historian, making his one foray into the history of science: [the scientific revolution] outshines everything [in history] since the rise of Christianity and reduces the Renaissance and Reformation to the rank of mere episodes, mere internal displacements, within the system of medieval Christendom.
Since it changes the character of men’s habitual mental operations even in the conduct of the non-material sciences, while transforming the whole diagram of the physical universe and the very texture of human life itself, it looms . . . large as the real origin of the modern world and of the modern mentality… (Butterfield, 1949: viii) Butterfield’s formulation was soon echoed and endorsed, as in this example from the Oxford historian of science A. C. Crombie: The effects of the new science on life and thought have . . . been so great and special that the Scientific Revolution has been compared in the history of civilisation to the rise of ancient Greek philosophy in the 6th and 5th centuries B.C. and to the spread of Christianity throughout the Roman Empire . . . (Crombie, [1952]1959: vol. 1, p. 7)
And by 1960 it had become a commonplace—Princeton historian Charles Gillispie (1960: 8) concurring that modern science, originating in the seventeenth century, was “the most . . . influential creation of the western mind.” As late as 1986, Richard Westfall—then the dean of America’s historians of science—put science right at the heart of the modern order: “For good and for ill, science stands at the center of every dimension of modern life. It has shaped most of the categories in terms of which we think . . .” (Westfall, 1986).
Evidence of that contemporary influence and authority is all around us and is undeniable. In the academy, and most especially in the modern research university, it is the natural sciences that have pride of place and the humanities and social sciences that look on in envy and, sometimes, resentment. In academic culture generally, the authority of the natural sciences is made manifest in the long-established desire of many forms of inquiry to take their place among the “sciences”: social science, management science, domestic science, nutrition science, sexual science.
Just because the designation “science” is such a prize, more practices now represent themselves as scientific than ever before. The homage is paid from the weak to the strong: students in sociology, anthropology, and psychology commonly experience total immersion in “methods” courses, and while chemists learn how to use mass spectrometers and Bunsen burners, they are rarely exposed to courses in “scientific method.” The strongest present-day redoubts of belief in the existence, coherence, and power of the scientific method are found in the departments of human, not of natural, science.
https://dash.harvard.edu/bitstream/handle/1/3425896/shapin_science_modern.pdf?sequence=1
PHILOSOPHY: “THE ANCIENT GREEKS”
RELIGION
Although the history of Christianity in each of the regions to which it has spread manifests certain special characteristics that set it apart, the development of Christianity within the history of western Europe has in many decisive ways shaped its development in all other regions. The English man of letters Hilaire Belloc (1870–1953) formulated the significance of that development—as well as a highly idiosyncratic and debatable philosophy of history—in his epigram of 1912: “Europe will return to the [Christian] faith, or she will perish. The faith is Europe. And Europe is the faith.” Belloc’s pronouncement is partly historical and partly hortatory, and even those who would vigorously reject the first and hortatory half of his formulation would probably acknowledge the historical force of the second half. Through most of its history, what most people, insiders or outsiders, have identified as the Christian faith has been the particular form that the Christian faith has acquired in its European experience. Asia, Africa, and the Americas have imported most of their Christianity from western Europe or Britain, and while Christianity did indeed begin in Asia Minor, most Christians in Asia Minor now practice and believe versions of Christianity that have come there only after having first been filtered through Europe. The history of Christianity in western continental Europe and the British Isles is, therefore, indispensable to the understanding of Christianity wherever it exists today. It is no less indispensable to the understanding of the history of western Europe itself. And in that sense at least, Belloc was right.
Its historic role
Judaism has played a significant role in the development of Western culture because of its unique relationship with Christianity, the dominant religious force in the West. Although the Christian church drew from other sources as well, its retention of the sacred Scriptures of the synagogue (the Old Testament) as an integral part of its Bible—a decision sharply debated in the 2nd century CE—was crucial. Not only was the development of its ideas and doctrines deeply influenced, but it also received an ethical dynamism that constantly overcame an inclination to withdraw into world-denying isolation.
It was, however, not only Judaism’s heritage but its persistence that touched Western civilization. The continuing existence of the Jews, even as a pariah people, was both a challenge and a warning. Their liberation from the shackles of discrimination, segregation, and rejection at the beginning of the modern era was understood by many to be the touchstone of all human liberty. Until the final ghettoization of the Jew—it is well to remember that the term ghetto belongs in the first instance to Jewish history—at the end of the Middle Ages and the beginning of the Renaissance, intellectual contact between Judaism and Christianity, and thus between Judaism and Western culture, continued. St. Jerome translated the Hebrew Bible into Latin with the aid of Jewish scholars; the exegetical work of the scholars of the monastery of St. Victor in the 12th century borrowed heavily from Jewish scholars; and the biblical commentary of Rashi (Solomon ben Isaac of Troyes) was an important source for Martin Luther (1483–1546). Jewish thinkers helped to bring the remarkable intellectual achievements of the Islamic world to Christian Europe and added their own contributions as well. Even heresies within the church, on occasion, were said to have been inspired by or modeled after Judaism.
https://www.britannica.com/topic/Judaism/The-role-of-Judaism-in-Western-culture-and-civilization
Chapter 1: Interpretation and Definition of Classical Mythology
THE PROBLEM OF DEFINING MYTH
The establishment of a single, comprehensive definition of myth has proved impossible to attain. No one definition can satisfactorily embrace all the various kinds of stories that can legitimately be classed as myths on the basis of one criterion or another. The attempt to define myth in itself, however intractable a proposition, serves to highlight the very qualities of the stories that make them so different from one another.
THE MEANING OF THE WORD MYTH
“Myth” is derived from the Greek word mythos, which can mean tale, or story, and that is essentially what a myth is: a story. For many, such a general definition proves to be of no real service, and some would add the qualification that a myth must be a “traditional” tale or story, one that has proved of so lasting a value that it is continually retold, through whatever medium the artist/storyteller chooses to employ. For further clarification, distinctions are often made between “myth,” i.e., “true myth” or “myth proper,” and “saga” or “legend,” and “folktale.”
MYTH, SAGA OR LEGEND, AND FOLKTALE
Myth: not a comprehensive term for all stories but only for those primarily concerned with the gods and their relations with mortals.
Saga or legend: a story containing a kernel of historical truth, despite later fictional accretions.
Folktale: a story, usually of oral origin, that contains elements of the fantastic, often in the pattern of the adventure of a hero or a heroine. Its main function is entertainment, but it can also educate with all sorts of insights. Under this rubric may be classed fairytales, which are full of supernatural beings and magic and provide a more pointed moral content.
Rarely, if ever, do we find in Greek and Roman mythology a pristine, uncontaminated example of any one of these types of story.
MYTH AND TRUTH
The most common association of the words “myth” and “mythical” is with what is incredible and fantastic. How often do we hear the expression, “It’s a myth,” uttered in derogatory contrast with such laudable concepts as reality and the facts? As opposed to the discoveries of science, whose truths continually change, myth, like art, is eternal. Myth in a sense is the highest reality, and the thoughtless dismissal of myth as fiction or a lie is the most barren and misleading definition of all. Myth serves to interpret the whole of human experience, and that interpretation can be true or fictitious, valuable or insubstantial, quite apart from its historical veracity.
MYTH AND RELIGION
The study of myth must not and cannot be separated from the study of religion, religious beliefs, or religious rituals. No mythologist has been more eloquent than Mircea Eliade in his appreciation of the sacredness of myth and the holy and timeless world that it embodies.
MYTH AND ETIOLOGY
An etiological interpretation of myth demands that a true myth must give the aitia, or cause or reason, for a fact, a ritual practice, or an institution. Thus narrowly defined, etiology imposes too limiting and rigid a criterion for definition. On the other hand, if one broadens the concept of the aitia of a myth to encompass any story that explains or reveals something or anything, an etiological approach offers one of the most fertile ways of interpreting myth, although it cannot really define it. What story can avoid offering some kind of explanation or revelation? Is the best general definition of myth, after all, a traditional story?
RATIONALISM, METAPHOR, AND ALLEGORY
Euhemerism: an attempt to rationalize classical mythology, attributed to Euhemerus (ca. 300 B.C.). He claimed that the gods were great men of old who had become deified.
Allegory: a sustained metaphor. The allegorical approach to mythology is favored by the anti-rationalists, who interpret the details of myth as symbols of universal truth.
Allegorical nature myths: for Max Müller in the nineteenth century, myths are to be defined as explanations of meteorological and cosmological phenomena. Müller’s theory is too limited. Some Greek and Roman myths, but by no means all, are concerned with nature.
MYTH AND PSYCHOLOGY
The theories of Freud and Jung are fundamental and far-reaching in their influence, and although continually challenged, provide the most searching tools for a profound, introspective interpretation of mythology. (…)
COMPARATIVE STUDY AND CLASSICAL MYTHOLOGY
Oral and Literary Myth. Many insist that a true myth must be oral and anonymous. The tales told in primitive societies are the only true myths, pristine, timeless, and profound. The written word brings contamination and specific authorship. We disagree with such a narrow definition of mythology. Myth need not be just a story told orally. It can be danced, painted, and enacted, and this is, in fact, what primitive people do.
Myth is no less a literary than an oral form. Despite the successive layers that have been grafted onto Greek and Roman stories and their crystallization in literary works of the highest sophistication, comparative mythologists have been able to isolate the fundamental characteristics that classical myths share with other mythologies, both oral and literate.
Joseph Campbell. A comparative mythologist, perhaps best known for his series of PBS interviews with Bill Moyers, Campbell did much to popularize the comparative approach to mythology. Though his attention was largely devoted to myths from other traditions, many of his observations, as he himself was well aware, can be profitably applied to classical mythology. (…)
SOME CONCLUSIONS AND A DEFINITION OF CLASSICAL MYTH
We have provided a representative (and by no means exhaustive) sampling of influential definitions and interpretations that can be brought to bear on classical mythology. It should be remembered that no one theory suffices for a deep appreciation of the power and impact of all myths. Certainly the panorama of classical mythology requires an arsenal of critical approaches.
Let us end with a definition of classical mythology that emphasizes its eternal qualities, which have assured a miraculous afterlife. It may be that a sensitive study of the subsequent art, literature, drama, music, dance, and film, inspired by Greek and Roman themes and created by genius, offers the most worthwhile interpretative insights of all.
A classical myth is a story that, through its classical form, has attained a kind of immortality because its inherent archetypal beauty, profundity, and power have inspired rewarding renewal and transformation by successive generations.
https://global.oup.com/us/companion.websites/9780195397703/student/materials/chapter1/summary/
Self-Awareness
For more than 300 years, knowledge of the self has been considered to be at the very core of human behavior. The ancient dictum “Know thyself” has been variously attributed to Plato, Pythagoras, Thales, and Socrates. Plutarch noted that this inscription was carved on the Delphic Oracle, that mystical sanctuary where kings and generals sought advice on matters of greatest importance to them.
As early as 42 B.C., Publilius Syrus proposed: “It matters not what you are thought to be, but what you are.” Alfred Lord Tennyson said: “Self-reverence, self-knowledge, self-control, these three alone lead to sovereign power.” Probably the most oft-quoted passage on the self is Polonius’ advice in Hamlet: “To thine own self be true, and it must follow as the night the day, thou canst not then be false to any man.”
Messinger reminded us: “He that would govern others must first master himself.” Self-awareness lies at the heart of the ability to master oneself, but it is not sufficient. While self-management depends first and foremost on self-awareness, as illustrated in Figure 1.1, other self-management skills are closely linked to and build upon self-awareness.
Developing self-control, for example, and clarifying priorities and goals, help individuals create direction in their own lives. Effectively managing time and stress makes it possible for individuals to adapt to and organize their surroundings. This chapter centers on the core aspects of self-management and serves as the foundation for the following chapter on stress and time management. Moreover, as Figure 1.1 illustrates, when problems arise in personal management, the easily recognized symptoms are often time pressures or experienced stress.
However, those symptoms are often linked to more fundamental problems with self-awareness and out-of-balance priorities, so we begin with a focus on enhancing knowledge of oneself. Despite the research cited above, students of human behavior have long known that knowledge of oneself—self-awareness, self-insight, self-understanding—is essential to one’s productive personal and interpersonal functioning, and to understanding and empathizing with other people.
A host of techniques and methods for achieving self-knowledge have long been available—including group methods, meditation techniques, altered consciousness procedures, aromatherapy, assorted massages, physical exercise regimens, and biofeedback. It is estimated that Americans alone spend between $30 billion and $50 billion on such therapies. In this chapter we do not summarize those various approaches to enhanced self-awareness, nor do we espouse any one procedure in particular. Instead, our objective is to help you understand the importance of self-awareness if you are to be a successful manager—or a successful individual—and to provide you with some powerful self-assessment instruments that are related to managerial success.
Our emphasis is on scientifically validated information linking self-awareness to the behavior of managers, and we try to avoid generalizations that have not been tested in research.
https://faculty.ksu.edu.sa/sites/default/files/developing_management_skills-8th_edition.pdf

When we think about “resilience,” we typically imagine bouncing back from major hardship. Management theorists have increasingly put forward a more nuanced definition, however: resilience as the ability to adapt to complex change. But in today’s world, that means the demand for resilience is almost constant. With the ongoing onslaught of problems they face, and with change the only constant in organizational life, leaders must cultivate resilience as an ongoing skill, not just for the “big moments” of painful setbacks or major change. A significant study of 167 leaders revealed that the most resilient leaders have deeper self-knowledge. They take honest stock of their skills, curb misplaced irritability by confronting its real sources, push back on unrealistic expectations instead of passing them on, and recognize when they’ve fallen into ambivalence and go back to first principles. (…)
Leaders mindful of their own flagging tenacity dig deeper and redouble their efforts to push ahead, inspiring those around them to do the same.
Adversity in organizational life, sometimes the result of major change, sometimes the provocateur of it, is a way of life today. Leaders need higher levels of resilience in constant reserve to weather this new normal. Those leaders with strong self-knowledge — who have a clear understanding of their skills and shortcomings, their frustrations, and their core principles — are more likely to sustain those needed reserves of resilience to thrive through adversity and change.
https://hbr.org/2017/09/the-better-you-know-yourself-the-more-resilient-youll-be
Question Certainty
According to legend, around 550 BC, Croesus, the king of Lydia, held one of the world’s earliest prediction tournaments. He sent emissaries to seven oracles to ask them to foretell what he would be doing that day. Pythia, the Oracle of Delphi, answered correctly: He would be cooking lamb-and-tortoise stew.
Croesus didn’t perform this exercise out of mere curiosity. He had a decision to make. Confident that he’d discovered a reliable oracle, the king then asked Pythia whether he should attack Persia. She said that if he did, he would destroy a mighty empire. Croesus attacked but was defeated. The problem was interpretation: Pythia never said which mighty empire would be destroyed.
Whether the story is fact or fiction, Croesus’ defeat illuminates a couple of truths: Forecasting is difficult, and pundits often claim their predictions have come true when they haven’t. Still, accurate predictions are essential to good decision making—in every realm of life. As Philip Tetlock, a professor of both management and psychology at the University of Pennsylvania, and his coauthor, Dan Gardner, write in Superforecasting, “We are all forecasters. When we think about changing jobs, getting married, buying a home, making an investment, launching a product, or retiring, we decide based on how we expect the future to unfold.”
So what is the secret to making better forecasts? From 1984 to 2004 Tetlock tracked political pundits’ ability to predict world events, culminating in his 2006 book Expert Political Judgment. He found that overall, his study subjects weren’t very good forecasters, but a subset did perform better than random chance. Those people stood out not for their credentials or ideology but for their style of thinking. They rejected the idea that any single force determines an outcome. They used multiple information sources and analytical tools and combined competing explanations for a given phenomenon. Above all, they were allergic to certainty.
Superforecasting describes Tetlock’s work since. In 2011 he and his colleagues entered a prediction tournament sponsored by the U.S. government’s Intelligence Advanced Research Projects Activity. They recruited internet users to forecast geopolitical events under various experimental conditions, and—harnessing the wisdom of this crowd—they won. In the process, they found another group of “superforecasters” to study. Most weren’t professional analysts, but they scored high on tests of intelligence and open-mindedness. Like Tetlock’s other experts, they gave weight to multiple perspectives and weren’t afraid to change their opinions. They were curious, humble, self-critical, and less likely than most other people to believe in fate. And although they seldom used math to make their predictions, all were highly numerate. “I have yet to find a superforecaster who isn’t comfortable with numbers,” Tetlock writes.
Not surprisingly, Jordan Ellenberg, a mathematician at the University of Wisconsin–Madison, also believes his discipline can help people make better judgments. “Knowing mathematics is like wearing a pair of X-ray specs that reveal hidden structures underneath the messy and chaotic surface of the world,” he explains in his new book, How Not to Be Wrong. Common sense and the application of “simple and profound” concepts, such as correlation, are rigorously extended to everyday reasoning. Notice that he says knowing math, not doing math. As Tetlock’s research makes clear, the superforecasters aren’t building elaborate statistical models from massive data sets. But the concepts in How Not to Be Wrong would be familiar to them, and they use their mathematical brains to find structure amid complexity and to estimate, understand, and accept probabilities.
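The excerpt above stresses that superforecasters “estimate, understand, and accept probabilities” and that Tetlock’s team won by pooling many forecasts. Neither book is quoted here on how such forecasts are scored, so the short Python sketch below is an illustration only: it uses the Brier score, a standard accuracy measure for probability forecasts, on invented numbers, and shows how a simple crowd average relates to the individual forecasters it is built from.

```python
# Illustration only (not code from Tetlock or Ellenberg): scoring probability
# forecasts with the Brier score and comparing individuals with a simple
# crowd average. All forecasts and outcomes below are invented.

def brier_score(forecasts, outcomes):
    """Mean squared difference between forecasts (0..1) and outcomes (0 or 1).
    Lower is better; always answering 50% earns 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(outcomes)

# Hypothetical probabilities three forecasters assigned to five events ...
forecaster_a = [0.9, 0.7, 0.2, 0.6, 0.8]
forecaster_b = [0.6, 0.8, 0.4, 0.5, 0.9]
forecaster_c = [0.8, 0.5, 0.1, 0.7, 0.6]
# ... and what actually happened (1 = the event occurred, 0 = it did not).
outcomes = [1, 1, 0, 0, 1]

# "Wisdom of the crowd": average the three estimates event by event.
crowd = [sum(ps) / 3 for ps in zip(forecaster_a, forecaster_b, forecaster_c)]

for name, f in [("A", forecaster_a), ("B", forecaster_b),
                ("C", forecaster_c), ("crowd average", crowd)]:
    print(f"{name}: Brier score = {brier_score(f, outcomes):.3f}")
# Because squared error is convex, the crowd's score can never be worse than
# the mean of the individual scores, though one sharp individual may still
# beat the crowd.
```

Real tournament scoring is more elaborate (questions close at different times and forecasts are updated), but the basic idea of rewarding well-calibrated probabilities rather than bold verdicts is the same.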
Superforecasting shows how this is done by contrasting a scene from Zero Dark Thirty, the 2012 film about the hunt for Osama bin Laden, with real-life events. In the movie, the CIA director, played by James Gandolfini, demands to know if bin Laden really is in the compound in Abbottabad, Pakistan. “Is he there or is he not f–––––– there?” he asks. Analysts offer probabilities between 60% and 80%, until the protagonist, Maya (Jessica Chastain), chimes in: “A hundred percent he’s there,” she says. “OK, fine, 95%, because I know certainty freaks you guys out. But it’s a hundred!”
According to Tetlock, the actual conversation never would have gone that way, because Leon Panetta, the real CIA director at the time, was comfortable using probability and diverse estimates to make his decisions. In fact, as Micah Zenko recounts in another new book, Red Team, the CIA conducted three separate “red team” exercises before the raid, all designed to check and challenge analysts’ assumptions. Although the real “Maya” did give that 95% estimate, she and her team were made to completely review their work. The CIA also appointed four outside analysts to study the case, and the National Counterterrorism Center, a separate agency, conducted its own analysis, generating three probabilities that bin Laden was in the compound: 75%, 60%, and 40%. President Obama concluded that this amounted to “a flip of the coin,” but he did, of course, authorize the raid. Tetlock dislikes Obama’s analogy (his superforecasters would have been more precise) but not the overall process. Maya’s estimate was “more extreme than the evidence could support” and therefore “unreasonable,” he explains. In his world, such confidence is cause for skepticism.
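The three agency estimates quoted above (75%, 60%, and 40%) make the “flip of the coin” remark easy to check with a line of arithmetic. The snippet below is purely illustrative; the source does not say that anyone combined the numbers in this particular way.

```python
# Purely illustrative: one naive way to combine the three reported estimates
# that bin Laden was in the compound. The source does not say how, or whether,
# the numbers were formally aggregated.
estimates = [0.75, 0.60, 0.40]
average = sum(estimates) / len(estimates)
print(f"Unweighted average: {average:.0%}")  # about 58%
```

An unweighted average comes out near 58%, close enough to even odds to explain the coin-flip framing; yet, as Tetlock’s complaint suggests, a 58% chance and a 50% chance are different bets once many such decisions accumulate.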
The Zenko book is a good complement to Superforecasting, because it shows how organizations, not just individuals, can overcome their biases toward false certainty and make good predictions, in geopolitics and business, in public and private sectors. With simulations, vulnerability probes, and alternative analyses that offer fresh eyes on a complex situation or intentionally oppose a certain position, red teams can greatly improve the accuracy of forecasts in the same way that Tetlock’s experts do.
Zenko adds that management must buy in, committing significant resources to red teams and empowering them to be brutally honest in their analyses. Tetlock agrees. Although great leaders should be confident and decisive, they must also possess “intellectual humility” and recognize that the world is complex and full of uncertainty, he explains. They should learn from and lean on superforecasters and red teams, using not just one but many. If Croesus had asked all seven oracles about his planned attack on Persia, for example, he might not have lost his empire.
A half-century-old divining method, the Delphi technique, is getting new attention from leading corporations, thanks to recent refinements and today’s heightened connectivity. When properly framed and communicated, its broad predictions can be translated into highly useful strategic guidelines.
Named for the ancient oracle, the technique is a way of tapping the wisdom of experts. It involves recruiting 20 or so knowledgeable panelists and asking them to evaluate possible outcomes. Here’s how the Delphi technique works: A company research team recruits a panel of experts in appropriate fields (the wider the range of relevant expertise, the better); asks them—by phone or e-mail or in person—about the future of, for example, an emerging innovation or a volatile market; and then requests that each panelist rate the likelihood of several predictions that have emerged from the panel’s discussions or from company hypotheses. To ensure a free flow of ideas, the research team doesn’t ask the experts to justify their predictions. The results are then tabulated for the panel, and the experts rate the predictions’ likelihood again. Typically, at this point the team discusses the results with the experts. This process continues until consensus grows or ebbs. Some companies create and run the process themselves, but most use outside firms and remain anonymous, so that the panelists aren’t biased by knowledge of a sponsor’s identity. Companies that have tried the technique use it to guide decisions about major investments in, say, new technologies and about forays into immature or undefined markets.
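Because the paragraph above describes an iterative procedure, a toy simulation may help make its moving parts concrete. The sketch below is not a tool used by the companies described; the revision rule, thresholds, and starting ratings are invented, and in a real panel the revisions come from the experts’ own reconsideration rather than from a formula. It only shows where the tabulation, feedback, and “continue until consensus grows or ebbs” steps sit in the loop.

```python
# Toy sketch of the Delphi loop described above. Everything numeric here is
# invented for illustration; real panels rely on expert judgment, not formulas.
import statistics

def run_delphi(initial_ratings, pull=0.3, stop_spread=0.05, max_rounds=5):
    """Iterate rating rounds until opinions cluster tightly or rounds run out."""
    ratings = list(initial_ratings)
    for round_no in range(1, max_rounds + 1):
        median = statistics.median(ratings)   # tabulated result shown to panel
        spread = statistics.pstdev(ratings)   # rough measure of (dis)agreement
        print(f"Round {round_no}: median={median:.2f}, spread={spread:.2f}")
        if spread <= stop_spread:             # consensus has "grown" enough
            break
        # Stand-in for each panelist's reconsideration after seeing the results:
        # move partway toward the group median.
        ratings = [r + pull * (median - r) for r in ratings]
    return statistics.median(ratings), statistics.pstdev(ratings)

# Hypothetical first-round likelihood ratings from a ten-person panel for one
# prediction (say, "this technology reaches mass adoption within a decade").
first_round = [0.2, 0.35, 0.4, 0.5, 0.5, 0.55, 0.6, 0.7, 0.8, 0.9]
final_median, final_spread = run_delphi(first_round)
print(f"Final: median={final_median:.2f}, spread={final_spread:.2f}")
```

If opinions never converge (the spread stays wide or widens), that is itself useful information: consensus has “ebbed,” and the remaining disagreement can feed into scenario building of the kind described further below.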
Delphi panels were devised by the RAND Corporation to help the U.S. government imagine what might unfold in post–World War II Europe. They found early use in health care, education, and other nonprofit enterprises. Internet tools now enable panels to meet virtually, which makes it easier and less expensive for companies to recruit and schedule the experts, set up multiple panels to include a broad range of expertise, gather and distribute information, and conduct consensus building (though it is often helpful to bring the panelists together in person to mull over the findings). In addition, implementers have discovered that running multiple panels simultaneously can add great insight. In the case of an emerging technology, an additional panel of early adopters can be very helpful.
In the 1990s the Delphi technique helped a major television network predict that the advent of HDTV would be slower than widely expected. The network thus avoided making an unnecessary early investment to convert all of its equipment to digital. A global pharmaceutical company is using the technique to learn what trends or occurrences in the field of heart disease might induce consumers to take preventive actions such as changing their diets and starting medication when problems are first identified. A major financial services provider is beginning to use the technique, augmented by research among “lead users” (businesspeople and consumers deemed likely to be financially successful over time), to help determine which services to develop, which markets to target, and how best to earn customers’ long-term trust and loyalty.
But Delphi results alone don’t necessarily lead to great decisions any more than good market research does. The predictions are most useful if they are shaped into several possible scenarios that allow decision makers to understand the implications more fully. For instance, a panel’s prediction that the incidence of Alzheimer’s disease will increase would have much less value to a health care company than would a well-constructed scenario showing who might be affected—patients, families, health care facilities—and what the long-term consequences would be. Armed with detailed scenarios, a company can closely monitor the environment and act quickly in response to even faint indicators of which one is unfolding.
Study all the parts of the project and deliver a speech on the “European Heritage of Knowledge”.
The purpose of this project is to make clear how rich the European heritage of knowledge is.
The project does not aspire to cover the topic of scientific or cultural evolution comprehensively.
It includes selected information and is intended as a source of inspiration for considering the questions: What is the core of European identity? What could be the consequences of its loss? Or are such questions of any relevance at all, and if so, why?
It should help to show how important it is, in today’s global, complex, and interconnected world, to differentiate between various worldviews and to build a worldview based on (self-)knowledge and (self-)understanding as well as on the principle of compatibility between the self and the culture that has nourished it.
Finally, it aims to foster curiosity and the desire and courage to learn more.