

Other books by J. M. Beach

Introduction to Philosophy: Knowledge, Action, and Enlightenment

Introduction to Literature: Fiction and the Form of Human Experience

Dare to Know: A Philosophical Autobiography

Faith in Dialogue: Sharing Differences, Seeking Common Ground

Gateway to Opportunity?
A History of the Community College in the United States

Children Dying Inside:
A Critical Analysis of Education in South Korea

The Paradox of Progressivism, and Other Essays on History

Studies in Ideology:
Essays on Culture and Subjectivity

Studies in Poetry:
The Visionary

Living into Words (Poetry in the Time of Killing):
Selected Poems: 1997-2004

Where a Painter is a Poet: Poems 2001


What is Enlightenment?

A History & Sociology of Knowledge

J. M. Beach

By knowing how we know,
may we know better;

saving us from ourselves…

© 2013

J. M. Beach

West by Southwest Press

The choice here is not a choice between magic
and no magic, but a choice between magics
that vary in their degree of approximation
to the truth.

- Kenneth Burke, The Philosophy of Literary Form

Table of Contents

Preface: Dare to Know

I. Different Ways of Knowing:
Subjectivity, Science, & Philosophy

1. Fiction & the Form of Human Experience

2. The Creation of Truth: Subjectivity & Narrative

3. The Imperfect Promise of Science

4. Philosophy is a Tool for Everyone

II. What is Enlightenment?
Contested Knowledge & the Limits of Human Nature

5. A History of Religion & Secularism

6. What is Enlightenment?

References

Selected Bibliography

About the Author

Preface
---
Dare to Know

There is much about ourselves of which we are not fully aware
because we cannot see through the complexity of the human condition
and the burdens of our daily lives. We are too busy living, as Thoreau
once said. We do not really know who we are and what our capabilities
are as human beings. Thoreau urged people to become "awake" to the
possibilities of living more deliberate lives. Of course, not much was
really known about human beings in the 19th century, as history,
philosophy, and science were all still in their infancy. Thus, Thoreau and
a great many others of his time were too naive and optimistic about the
possibility that human knowledge could perfect our nature.
Over the past century and a half since Thoreau, history,
philosophy, and science have become professionalized. We now know
more than ever before about the human condition and the limited powers
we have to control our destiny. For a time, humanity seemed a grand demi-
god. But that idealism has since been deconstructed and deflated over the
20th century by realist philosophers such as Raymond Aron, Isaiah
Berlin, and John Gray.
This book is a historical, philosophical, and sociological inquiry
into knowledge. For thousands of years we have thought that we were
divinely enabled to know and control our destiny. Only relatively
recently, over the past half century, have we begun to realize that we are
deeply flawed organisms with the unique capacity for conscious thought
and constrained agency. The philosopher and psychologist Joshua
Greene recently explained the capacity of our conscious mind: it is a
"complex hodgepodge of emotional responses and rational
(re)constructions, shaped by biological and cultural forces that do some
things well and other things extremely poorly."1 This book seeks to take
a historical and functionalist approach to the phenomenon of knowledge.2
I want to explain what it is, how it is created, and what it can and cannot
do. I hope this book can help enlighten a new generation about the
possibilities and constraints of human progress, which depends on our
flawed, but useful tool of knowledge. The fate of our species rests on our
continued daring to know.

I

Different Ways of Knowing:

Subjectivity, Science, & Philosophy

We are drowning in information,
while starving for wisdom.

- E. O. Wilson, Consilience

Chapter 1
---
Fiction & the Form of Human Experience

What is fiction? Most people dismiss fiction as an unimportant
luxury, but fiction is a necessary part of the human condition. Humans
are “narrative beings.”3 In our beginning as a species is mythos (story).
Only later did we learn logos (rationality).4 We naturally understand our
experience in the form of story.5 We use narratives to learn from our
experience by storing meaningful events in memories, which help guide
future behavior.6 This process creates consciousness and subjectivity,
what neurobiologist Antonio Damasio called the “autobiographical self”
or what the philosopher John Gray called the "fictive self."7 But the
human mind is an imperfect recorder and interpreter of experience,
whether it is preserved through memory, the written word, paint, or
artifacts.8 Yet, this does not mean that our subjective9 experience is false.
The equation of fiction with falsity is a modern fallacy. It comes from
the reductionist assumptions of positivist scientism.10 As Walter
Lippmann once explained, "By fictions I do not mean lies, I mean
representation."11
Fiction derives from the Latin verb fictio, which means create,
make, or form. It is related in meaning to the ancient Greek word
poiema, which meant an action, deed, or act, and is the root of the English
word poetry. The concepts of fiction and poetry refer to the fact that
human consciousness is an active power. This led to the old Latin
moniker of homo faber, "man makes himself."12 Consciousness acts on
experience. Consciousness is not a passive receptor. Consciousness
forms experience with emotion, colors it with meaning, and stabilizes it
with identity. Our mind is not a “mirror of nature.”13 Our subjectivity
co-creates experience with the objective world14 to produce what one
philosopher has called "subjective realism."15 Our minds cooperate with
the world in order to produce an interaction between knower and
known.16 As humans we personally participate in the "act of knowing"
and in the construction of our knowledge.17 This interaction creates a
phenomenon, a mental object perceived by the human senses, which is
distinct from the object as it exists independently of our perceptions.18
The active creativity of human subjectivity does not falsify experience so
much as make it meaningful and relevant. Subjective vision is "a
glowing thing, a blurred thing of beauty. Its structure is at once luminous
and translucent; you can see the world through it."19 Subjectivity alters
experience by adding to it, enhancing it, and making it usable.20
Subjectivity is "like magic," as one sociologist put it.21
Historian William H. McNeill explained, "Men are and always have been
myth makers, seizing upon the significant by leaving out the trivial, so as
to make the world intelligible...For human minds imperiously demand
historical experience to have shape and meaning."22 Historian Edmund S.
Morgan wrote an award-winning book on the importance of political
fictions, or what we often call "self-evident truths." Morgan explained,
"the make-believe world may often mold the real one...fictions are
necessary, because we cannot live without them...[they] make our world
conform more closely to what we want it to be...the fiction takes
command and reshapes reality."23 And while the subjective magic of
fiction could be denigrated as mere mythmaking, many if not most human
beings need their myths in order to survive and thrive. As the economist
John Kenneth Galbraith once pointed out, "Don't part with your illusions;
when they are gone you may still exist, but you have ceased to live."24
But we now know that these illusions of meaning also have objective
reality, in at least two ways: our subjective experience affects the
physiology of our brain, which in turn affects the physiology of our body
and our actions. Our subjective meanings literally can make us who we
are.25 Anthropologist Daniel E. Moerman calls this phenomenon the
"meaning response."26
Human perception is not a direct window open to the world.
Our perception is partly constituted by the biological perceptual process
and also, perhaps more importantly, as the quote by Galbraith suggested,
by our subjectivity and culture. We see with more than our eyes. We
also see with our wants, our needs, and our hopes. We see with more
than our intellect. We also see with our emotions. We see with more
than our own self. We also see with preconceived social assumptions,
which have been variously labeled as common sense, conventional
wisdom, interpretive frameworks, habitus, gestalts, ideologies,
backgrounds, paradigms, and epistemes. These are various names for the
“epistemological unconscious”27 that shapes our perceptual process, our
personality, and our ability to communicate. We all inherit ways of being
and thinking that are particular to our unique social and historical context.
We often call this culture. Culture is our "social heritage."28 It includes
"a pool of technological and social innovations" that help us develop and
survive.29 As one philosopher explained it, "Culture is not only the
condition of life, it is the condition of knowing."30 Our epistemological
unconscious shapes our subjectivity, formulates our perception, and helps
us participate in our culture. Our culture is ordered by "collective make-
believe,"31 which are values or beliefs that we all take for granted as self-

16
evidently true and right. We all combine our biological nature with the
tools and beliefs of our culture to produce a "self" through a process the
philosopher Daniel C. Dennett calls "the art of self-making."32 "All the
world's a stage, / And all the men and women merely players," as
Shakespeare intuitively grasped in the 17th century.33 We live our
subjective lives, to quote a more recent philosopher, as a "psycho-poetic
performance."34
We can never escape our subjectivity, nor can we eradicate the
cultural and biological influences that have shaped us since we were born.
The 17th century English philosopher Francis Bacon called these
predicaments the "idol of the cave," after Plato's famous metaphor in the
Republic, and also the "idol of the tribe." He saw human subjectivity as
"a corrupt and ill-ordered predisposition of mind." Bacon believed, as
have many scientists since, that we can destroy and abolish these "idols"
so as to see the world with pristine and unencumbered eyes - as through
"clear glass."35 But this is a lie. Complete objectivity is a "false ideal."36
We can never escape the cave. We cannot "command" our nature or the
objective world. Our minds can never be "thoroughly freed and
cleansed."37 As the philosopher David Hume famously put it, "We have,
therefore, no choice left but betwixt a false reason and none at all."38
Ralph Waldo Emerson later agreed, "We have learned that we do not see
directly, but mediately, and that we have no means of correcting these
colored and distorting lenses which we are, or of computing the amount
of their errors."39
But Bacon admitted, as did Hume, that as part of the natural
world we were uniquely situated and endowed with a capacity to know,
however flawed it may be. Bacon said that we can "interpret" ourselves
and the natural world, but this ability was grounded by the constraints of
the physical world, including the limits of our own biological brain,
which "must be obeyed."40 As reflective and critical beings we can
become more aware of how our biology, subjectivity, and culture
influence our perception and behavior. We can also become more aware
of how our biology, subjectivity, and culture can be influenced and
modified in turn – how they can be changed, not commanded. Our ability
to alter our selves and our environment produces the conditions of true
freedom and moral responsibility.41
But in order to produce social and personal change, we must
analyze and understand our biology, our psychological self, and our
culture - what Robert Nozick has called "the whole of our being":42 how
we subjectively experience life, what we experience, how we know our
experience, and how we can communicate our experience. This is the
domain of knowledge that we call the arts and humanities, and in
particular it was the foundation for the classical notion of philosophy.43
If we can gain this type of knowledge, then we can learn how to actively
participate in the construction of truth. Truth is not a verbal abstraction
of some idealized and unchanging notion of reality. Instead, truth is a
pragmatic conceptual construction anchored to the physical world, which
allows us to order and understand our various sensory perceptions so that
we can live more meaningfully and effectively.44 Wisdom is the ability
to use truth to make meaningful, productive, and sustainable choices that
lead to individual and cultural flourishing.

Chapter 2
---
The Creation of Truth: Subjectivity & Narrative

How do we act upon the world we see? And what does it mean
to create truth? When we see the world through direct experience or
through memory, we do not simply see, as if looking through a window.
Instead we see the external world through the stained glass of our
consciousness, which has been shaped by our parents, our culture, our
language, but also by our own experience, our goals, and our unique
personality. The visionary English poet William Blake once said that we
all see the world, but we do not all see it the same way. Blake believed
that he could see strange things: angels dancing around the sun and the
dead walking around his garden. During his lifetime, most people
thought Blake to be eccentric, if not completely crazy.45 But Blake’s
experience was very real to him, and he lived his life according to his
subjective vision, producing profoundly beautiful poetry, paintings, and
deep insights into the human condition.
Blake understood that the perceptual process was a creative
activity and he actively created the contours of his own life. Many other
visionary artists have done the same. The philosopher Ralph Waldo
Emerson declared (taking a page from Blake), "As I am, so I see."46 The
founder of Apple, Steve Jobs, was known for having a "reality distortion
field" that enabled him to "bend any fact to fit the purpose at hand" in
order to "change reality." As his biographer put it, "If reality did not
comport with his will, he would ignore it...he acted as if he were not
subject to the strictures around him."47 When we see the world through
subjective experience, we make the objective world part of our unique
consciousness and we purposefully act on reality, as if we were in a
dream. One sociologist has called this process "the art of living."48 We
cognitively organize and verbally express our subjective experience in
order to give it form and meaning. These verbal constructs become
personal and social artifacts of experience, like memories, narratives, or
art, and these artifacts objectify our subjective experience of the world so
that others may share in our vision.49
After the living moments of our life pass away, we carry the
most important as memories in a fictional place we call the past,
imperfectly stored in the subconscious netherworld of our biological
brain. When we go to recollect a memory, we reconstruct it through
narrative,50 reconstituting our experience into a recognizable and
meaningful form.51 For much of human history the process of education
was simply memorization of all the important stories that contained the
knowledge and wisdom of a particular culture.52 As Michael Polanyi
explained, "Man lives in the meanings he is able to discern. He extends
himself into that which he finds coherent and is at home there."53 Some
scholars have crafted a professionalized method to analyze and synthesize
recorded memories in order to reconstitute a collective memory of the
past. We call this method history. But the act of memory, and the formal
practice of history, have always been plagued by the paradox of fiction.54
Paul Ricoeur explained how memory always "operates in the wake of the
imagination."55 In an ironic critique of historiography, Hermann Hess
sardonically made this same judgment, “The writing of history – however
dryly it is done and however sincere the desire for objectivity – remains
literature. History’s third dimension is always fiction.”56
For much of human history there was never a clear line between
history as "what really happened" and history as "myth."57 They were
largely one and the same, until the development of writing and critical
modes of thinking.58 Myths blended cultural meanings with social rituals
and grounded them both in "the enduring realities of human life."59 Up
until the 19th century, most historians blended actual events with
subjective flourish and mythic narratives, relying more upon traditional
beliefs than empirical evidence. William H. McNeill called this
"mythistory."60 It wasn't until 18th century that scholars, like
Giambattista Vico in his New Science (1725), really began to distinguish
secular facts (factum - what humans have created) from religious
myths to determine the truth (verum).61 Clarifying a precise distinction
between "fact" (factum) and "fiction" (fictio) was not even a subject of
concern until the 19th and 20th centuries, when scientists began to
formalize empirical and experimental tests for truth, while at the same
time writers began using new realistic prose techniques to tell stories.
But these developments did not abolish myth and subjectivity.62 Rather
these older forms of knowing were combined with new "technologies of
truth."63 But the problem of cultural relativity remains acute, as William
H. McNeill acknowledged, "The same words that constitute truth for
some are, and always will be, myth for others."64
The blurring of fact and fiction is not just the province of
history. This predicament can also be seen in other literary forms, like
the biography, the novel, and journalism. The first "novels" published in
Europe utilized a traditional autobiographical form so as to present a
made-up story as a true memoir. Daniel Defoe, the first English novelist,
structured his "new" literary form on the testimonial confession of St.
Augustine of Hippo and the mystical narrative of the Muslim philosopher
Avicenna. Defoe began his confessional novel Robinson
Crusoe (1719) with a by-line presented through the persona of Crusoe,
"Written by HIMSELF." The English public actually believed the story
was true until Defoe finally admitted that he invented both the story and
the main character. Defoe also wrote A Journal of the Plague Year
(1722), which was a first-person journalistic account of London's great
plague of 1665 published under a pseudonym. But while many
commentators later criticized Defoe's account of the plague as "fiction
masquerading as fact," later scholars exonerated Defoe. Watson
Nicholson found that there was not "a single essential statement in the
Journal not based on historical fact." Anthony Burgess later claimed that
"Defoe was our first great novelist because he was our first great
journalist."65
The blending of objective reality with subjective fiction has been
the modus operandi of various written genres, including personal
memoirs, environmental writing, journalism, novels, and poetry. As
Henry Adams explained in his famous autobiography, "This was the
journey he remembered. The actual journey may have been quite
different, but the actual journey has no interest for education. The
memory was all that mattered."66 The subjectively crafted story allows
the individual to "transform itself" and "self-create," as philosopher
Robert Nozick explained: "I want to say that you are your reality. Our
identity consists of those features, aspects, and activities that don't just
exist but also are (more) real...Our reality consists partly in the values we
pursue and live by, the vividness, intensity, and integration with which
we embody them."67
Acknowledging the importance of subjectivity in the creation of
self and society enabled the development of a new form of journalism in the
1960s. Tom Wolfe, Hunter S. Thompson, and Joan Didion pioneered what
Thompson humorously referred to as Gonzo journalism, or the art of
subjective reporting.68 Norman Mailer went so far as to sub-title his
classic account of the antiwar March on the Pentagon, "history as a novel, the
novel as history."69 These writers were reacting against the false claim of
objectivity, as old-school journalists hid their subjectivity behind third-
person reporting. This new school of journalists harkened back to the
rebelliousness of Henry David Thoreau, who explained, "the I, or first
person, is omitted...We commonly do not remember that it is, after all,
always the first person speaking."70 Mailer claimed that most journalists
"wrenched and garbled and twisted and broke one's words...action was
distorted...words were tortured."71 Instead, taking a page from Thoreau,
this new movement wanted to be sincere in their subjectivity. Thus, this
new type of journalist flaunted "egotism" and made their own persona the
protagonist of the story, anchoring reality to the eyes of a flawed and
subjective personality who not only observed the story, but lived through
it.72 Joan Didion explained, "however dutifully we record what we see
around us, the common denominator of all we see is always,
transparently, shamelessly, the implacable 'I'."73
Harkening back to these journalistic pioneers of subjective
prose, Maureen Tkacik recently argued that journalism cannot ever know
the truth "at the expense of the personal" because "storytelling" does not
"exist in a vacuum." Instead, journalists must "humanize" journalism and
establish "trust" with the reader, which would entail "letting down the old
guard of objectivity and letting go of illusions of unimpeachability."
Tkacik pondered, "Rather than train journalists to dismiss their own
experiences, what if we trained them to use those experiences to help
them explain the news to their audience? Allow their humanity to shape
their journalism?"74 In the early 21st century, journalists are quickly
abandoning the false hope of objective news. They are instead trying to
humanize the news through transparently subjective reporting, which
means openly acknowledging their subjectivity and cultural bias, while
still trying to be accurate, fair and intellectually honest.75
The wartime experiences of Tim O’Brien provide a great
example of this difficulty. In his work O'Brien "humanized" his direct
experience of the objective world through fictionalizing a subjective
narrative. He has written memoirs, novels, and short stories all of which
cover the same lived experience as an American soldier during the
Vietnam War. As a writer, O’Brien was faced with a difficult
predicament. How to explain his experience of Vietnam and the after-
effects of the war on his life? O’Brien admitted that he had to fictionalize
his wartime experience in order to truthfully express its significance and
explain its meaning. He tried to clarify this paradox, saying the “story-
truth is truer sometimes than happening-truth” because the “spell of
memory and imagination” produce a more meaningful explanation of
lived experience than a simple recounting of objective facts. O'Brien
makes the case that fiction can be an important way to produce truth: “By
telling stories, you objectify your own experience. You separate it from
yourself. You pin down certain truths. You make up others. You start
sometimes with an incident that truly happened…and you carry it forward
by inventing incidents that did not in fact occur but that nonetheless help
to clarify and explain.”76 The complex reality of a phenomenon like war,
especially the lived significance for a human being, cannot be simply
conveyed through disembodied data. As the philosopher John Gray has
pointed out, "Some truths cannot be told except as fiction."77
The reconstituting process of subjective narrative does not mean that
fiction falsifies the meaning of experience,78 although our memory does
change experience by adding, subtracting, rearranging, or revising certain
parts.79 Fiction, more to the point, brings the subjective truth of being
into words, creating an imaginative artifact inspired from our
phenomenological experience of a meaningful reality.80 Narrative
expresses identity through the "connectedness of life."81 It expresses the
thoughts, dreams, feelings, wishes, wonders, and fears of the individual
or community – in short, it expresses the subjective experience of being
human. It expresses a "vision of the world," a "cosmos," that brings
coherence and order to the world grounded in our subjectivity.82 Paul
Ricoeur described this subjective cosmos as "the living present of the
phenomenological experience of time...'here,' as the place where I am."83
Alasdair MacIntyre argued that narrative brings "unity" and meaning to
life that is often fragmented and meaningless.84 We cannot live without
assigning order and meaning to the overwhelmingly complex and
chaotic objective world. Narrative is the primary tool for this task.
But our narrative visions of the world are often highly
idiosyncratic and difficult to communicate to others. This notion is
expressed in perfect irony by Hermann Hess in his mock biography of a
fictional character. Hess explained, “Nothing is harder, yet nothing is
more necessary, than to speak of certain things whose existence is neither
demonstrable nor probable.”85 This is the reason why many
autobiographical narratives are labeled “fiction” by publishers. As the
anthropologist Pascal Boyer noted, “The boundary between a fictional
story and an account of personal experience is often difficult to trace.”86
To guard against the literal-minded, writers often veil their experience
behind word magic, cautioning that this is but a "fiction," a sleight of
hand warning the wary of the paradoxical netherworld between human
experience and objective truth. Kurt Vonnegut opened his famous novel-
memoir with the frank admission, “All this happened, more or less. The
war parts, anyway, are pretty much true.” And so it goes.87 C’est la vie.
Our subjective perception, conception, and verbalization of the
world, whether through lived experience or through memory, whether
completely factual or not, has definite phenomenological reality – our
experience seems real to us even if we don’t understand it or reproduce it
in accurate detail. Our perception and knowledge of the world are just
tools, flawed but useful.88 As Paul Ricoeur explains, "It is precisely
because of the elusive character of real life that we need the help of
fiction to organize life retrospectively, after the fact."89 We use our
subjective narratives to understand life and make it meaningful. Our
subjective reality combined with the "conventional wisdom" of our
society is often more important to us than the larger, impersonal,
“objective” world that surrounds us, co-constitutes us, and constrains
us.90 Walter Benjamin pointed out that our storytelling is a form of
"practical wisdom" that enables a measure of control over an objective
world that is often unknowable and uncontrollable.91
Most of us will never fully know or understand the complex
objective reality that we live within. We suffer from many natural biases
that block off objective reality.92 What "fragments of knowledge" we do
understand lead us to "jump to conclusions on the basis of limited
evidence," producing a WYSIATI framework: "What you see is all there

23
is."93 We have a very limited grasp on the objective world, but we don't
realize it. We suffer from the "pernicious illusion" that we understand the
world we live in.94 As Daniel Kahneman (2011) pointed out, "Our
comforting conviction that the world makes sense rests on a secure
foundation: our almost unlimited ability to ignore our ignorance."95
What can we really know of all those other people we see
everyday? What really happens in the corridors of power by local, state,
national, international, and corporate entities? What is really happening
to our local, regional, continental, and global ecosystems? To make
matters worse, readily available information is often manipulated by the
machinations of the rich and powerful, what Marx called political
"ideology," and later writers called propaganda or spin.96 In an article
exploring the aftermath of a catastrophic oil spill in the Gulf of Mexico,
Naomi Klein admitted that she could not trust media or government
reports. She couldn't even "trust my eyes."97 A glimpse of the complex
reality of the Gulf could only be empirically verified with advanced
machinery and complex scientific methods. But even scientists, patiently
studying subjects for years, are boxed in by their narrow disciplinary ways
of knowing, as well as being limited by time and money. Thus, they
often grasp only small pieces of the larger puzzle that we call the
objective world, which can lead to a special kind of distorted reality.98
As the historian William H. McNeill admitted, "If one takes a rigorous
epistemological stance," then finding out "what really happened quickly
becomes impossible."99
Human beings do not need facts or complete knowledge of the
objective world in order to live. We all make decisions based on
incomplete knowledge or no knowledge at all. "Even the most important
decisions in life," wrote Werner Heisenberg, "must always contain this
inevitable element of irrationality."100 Although we don't need perfect
rationality and objective truth, there are some things we do need. We
need meaning to make our lives worth living. We need wisdom to guide
our daily actions. We also need to communicate our lived experience
with others. We need to share the meaningful experiences of our lives or
the lives of others in powerful stories called “myths.” We collect these
meaningful stories over a lifetime and preserve them in personal or
collective canons. These stories define our identity, motivate and guide
our actions, and preserve that which should not be forgotten. These
myths might not be objectively true, but they are the truth by which we
live.
There is real power in subjective truth and cultural myth. Our
subjectivity can act on and alter the objective world through beliefs,
goals, and purposive action, although never as completely as we may
wish. One recent example is the visionary artistry of Steve Jobs and his
transformation of the computer, music, communications, and animation
industries through his company Apple.101 Another powerful example is
the placebo effect in medicine. Patients can literally perceive themselves
as better and thereby make themselves well.102 Medical doctors have
found conclusive evidence that placebos work because we think they
work - we want them to work. The Director of the German Medical
Association recently acknowledged that placebos are "hugely important
in medicine today."103 In fact, one researcher claimed that the history of
medicine was "largely the history of the placebo effect."104 Irving Kirsch
has found that placebos are more effective than antidepressant drugs at
treating most forms of depression.105 As the sociologist William Isaac
Thomas noted in 1928, "If men define situations as real, they are real in
their consequences."106 We have the power to literally will a new reality,
although not completely.
We may live in the objective world of factual truth, but we live
by and for our created subjective truths. Paradoxically, these latter truths
have varying degrees of their own objective reality, which can materially
affect the quality of our lives. Defending the American-led invasion
of Iraq, British Prime Minister Tony Blair famously said, "I only know
what I believe." John Gray argued that the lies of George W. Bush and
Tony Blair were not "true lies" because both Bush and Blair believed they
had "prophetic glimpses of the future," they believed that "deception is
justified if it advances human progress," although reality partially
disabused these naive men of such fantasies.107 These powerful political
leaders unleashed a destabilizing war in the Middle East because they
thought their beliefs were strong enough to reshape the world after their
own vision108 - and a large part of both America and England bought into
the reality of this illusion. There is real power in subjective truth, albeit
an unwieldy and frightful power. As John Gray has pointed out,
"Nothing is more human than the readiness to kill and die in order to
secure a meaning in life."109

Chapter 3
---
The Imperfect Promise of Science

In acknowledging the importance of subjectivity and culture, I
don't want my position to be reduced to a simplified idealism, relativism,
or postmodernist anti-rationalism.110 I am not saying that all truths are
merely subjective fictions, and thus, all claims are relatively valid, which
has been called the "postmodern perspective."111 I am not saying that
language and theoretical concepts completely determine our
understanding of the world, or that they negate the possibility of an
objective world independent of human experience.112 I am not saying
that all truths are empirically or ethically equal, nor that they are all
equally valid representations of the objective world. But this is not to say
that subjective and cultural relativism does not exist. Nor is it to say, as
have many scientists like Alan Musgrave, that subjective idealism and
cultural relativism are "ludicrous" or "anti-scientific."113 Subjectivity and
cultural relativism are real. These phenomena are naturally occurring
biological and cultural capabilities that both enable and constrain the
limits of our knowledge about ourselves and the complex world we live in.
But these phenomena are constantly checked and balanced by the
objective world that conditions and constrains us. The visionary
entrepreneur Steve Jobs found this out the hard way. His "reality
distortion field" enabled wonders the world had never seen, but it was
powerless against cancer.114
Unlike most philosophers who have argued over this issue, I
don't want to push for the artificial categories of realism, anti-realism, or
idealism (or any other arbitrary analytical box).115 We must get beyond,
as Pierre Bourdieu argued, "the ritual either/or choice between
objectivism and subjectivism in which the social sciences have so far
allowed themselves to be trapped."116 I want to argue that various
positions ranging from complete objectivism to complete subjectivism
exist simultaneously across the multiplicity of all human experience. The
most interesting sciences try to bridge this gap, like neuro-
phenomenology, which seeks to link subjective thought to the
physiological neural activity of the brain.117 These new fields of
knowledge allow for "subjective realism," a position that acknowledges
the reality of subjective phenomena, which can become "part of the real,
physical fabric of things."118 And even if realist ontologies and
epistemologies produce the most valid forms of knowledge (and I am
very certain they do), they are an embattled minority position in a social
world ruled by subjective realism.
We should not deny any aspect of our complex human
condition: physical, biological, subjective, or cultural. But neither do we
have to be completely bound and determined by any one of these
constraints. I know that subjectivity and objectivity can coexist rather
peacefully because they do so in my own mind, as I'm sure they do for
many others as well. We need to better understand (to the extent
possible) our ecological, biological, subjective, and cultural worlds in
order to better know and control our personal, social, and environmental
circumstances.
While it is important to be aware of subjectivity and culture as a
"constant monster, lurking over all of us,"119 the notion of subjectivity
and relativism does not upset the existence of scientifically validated
objective truth, nor is the existence of epistemological relativism alien to
the so called “hard” sciences. The theory of relativity forever "altered the
classical concept of objectivity."120 Relativism is an established, but
unsettling, scientific truth with a long history. It has been demonstrated
by Charles Darwin’s theory of natural selection and later studies in
biology, Albert Einstein’s theory of relativity, Werner Heisenberg’s
theory of uncertainty, and a mountain of evidence from cultural
anthropology.121 Darwin discovered the random evolution of biological
forms and later biologists discovered that there are "many 'worlds,' of
which only one is accessible to us."122 Further, "many perceptions of the
'real world' are possible, and our human senses provide only a very
limited sampling of the characteristics of this world."123 The great
physicist Albert Einstein shocked the scientific community in the early
20th century by claiming that the results of science are relative to the
place and time of the observer. Werner Heisenberg added that we can't
always measure, let alone understand, the intricate and complex workings
of the physical universe, especially at its most elementary level. Twentieth-
century cultural anthropologists found that a single species, Homo
sapiens, could fashion myriad forms of social structure and cultural belief
through the intentionality of human action and symbolic expression.124
Scientists use diverse situational standpoints, methodological
techniques, and terminologies which all add something to the objective
world being observed and, thereby, shape and constrain knowledge of the
real world. Scientists also use discourse and rhetoric to organize
knowledge, present it to diverse professional and non-professional
publics, and argue over the meaning and relevance of knowledge. As the
philosopher Kenneth Burke once observed, the objective world "does not
possess an absolute meaning...Any given situation derives its character
from the entire framework of interpretation by which we judge
it...different frameworks of interpretation will lead to different
conclusions as to what reality is."125 Thus, as Burke later theorized, all of
our methods of knowledge creation, including the scientific, select and
shape the objective world after the intentions of the knower: "In any term
[or method] we can posit a world, in the sense that we can treat the world
in terms of it...Men seek for vocabularies that will be faithful reflections
of reality. To this end, they must develop vocabularies that are selections
of reality. And any selection of reality must, in certain circumstances,
function as a deflection of reality...In its selectivity, it is a reduction."126
All epistemological methods and vocabularies, even the
scientific, have a definite effect on not only the reality being observed,
but also how that reality is recorded, understood, and explained to others.
Hilary Putnam observed, "the world does not have a 'ready made' or
'built-in' description; many descriptions may 'fit,' depending on our
interests and purposes."127 Kenneth Burke, and more recently Richard
Rorty, have pointed out that knowledge of the objective world is greatly
limited by the ontological and epistemological "assumptions" that
different cultural or academic communities hold and use to see the world,
to describe the world, and to give it meaning.128
This does not mean that there is no objective world or valid tools
to apprehend reality, more or less clearly, but this does mean that conflict
is inevitable, and it does mean that the parameters of the objective world
"must be repeatedly negotiated" through political processes.129 It also
means, as Kenneth Burke and Charles E. Lindblom have pointed out, "we
are all impaired, crippled by learned incompetences."130 When we are
trained to see the world in one way, we cannot see it differently, unless
re-trained to see the world from another perspective with another set of
assumptions, which of course will bring with it another set of limitations.
Being aware of the profound limitation of perception and language
requires, as Richard Rorty pointed out, an ironic critical perspective.131
Yet we must not get lost in the politicized haze of diverse
cultural perspectives, limited situational standpoints, and the relativism
they create. We currently live in an age of postmodern "Truthiness"
where there seems to be no empirical facts and no objective world - only
endless and conflicting opinions and values.132 And when "facts" are
invoked in debates over public policy, there are plenty of political groups
whose sole purpose is to fight facts with what Oreskes and Conway call the
"Tobacco Strategy": manipulating the ignorant through systematic,
relentless, and well-funded "doubt-mongering."133 Media outlets such as
Fox News have even subverted the definitions of common values, like
"fair and balanced." Partisan media and think tanks reduce reporting to
an "echo chamber" of politicized talking points and systematically
manufacture lies to manipulate the public.134 Success in politics and law
seems to require cynicism, as the indistinct line between subjective and
objective reality is mystified behind obfuscated doublespeak, manipulated
spin, and the "cacophony of conflicting claims."135

28
Most of us instinctively know that our social and political
leaders engage in various acts of official deception, which often takes the
form of outright lying.136 Irving Kristol cynically blessed such official
deception, "There are different kinds of truth for different kinds of
people. There are truths appropriate for children; truths that are
appropriate for students; truths that are appropriate for highly educated
adults, and the notion that there should be one set of truths available to
everyone is a modern democratic fallacy. It doesn't work."137 The great
dramatist David Mamet has recently portrayed the law as nothing but an
“alley fight,” whereby opposing attorneys attempt to craft “two fictions”
in order to manipulate a jury.138 Supreme Court justice Oliver Wendell
Holmes famously said the law was "not a brooding omnipresence in the
sky" but rather the "articulate voice" of the lawyer or judge who uses
"eloquence" to "set fire to reason."139 The old domain of political rhetoric
and the new domains of advertising and public relations all toy with the
subjective nature of truth to manipulate the ignorant and unsuspecting.140
All notions of truth have seemingly dissolved in the acidic discourse of
21st century political debate, but it is important to remember that "not
every 'side' is right or true."141 Humans cannot verbally eliminate the
concrete reality of the objective world, as Don Delillo humorously
illustrated in his novel White Noise.142
The philosopher Isaiah Berlin famously noted, “Because there is
no hard and fast line between ‘subjective’ and ‘objective’, it does not
follow that there is no line at all.”143 The unscrupulous might try to erase
this line out of ignorance, deceit, profit, or desire for power, but the line
remains. Martha C. Nussbaum has argued, "The search for truth is a
human activity, carried on with human faculties in a world in which
human beings struggle, often greedily, for power. But we should not
agree that these facts undermine the very project of pursuing truth and
objectivity."144 The truth exists because the objective world exists;
however, knowing and understanding that objective world has proven
much harder than anyone expected. Even if sincere, honest, academically
trained professionals use reason to try and understand the objective
world, there is no "guarantee of reaching truth."145 As philosopher
Amartya Sen has pointed out, even "the most rigorous of searches, in
ethics or in any other discipline, could still fail."146 Objective truth is out
there, but it is very hard - sometimes impossible - to reach.
Dwelling147 within our subjectivity and culture does not nullify
the objective world outside the doors of our perception. It just makes it
more difficult to fully grasp. Relativism does not justify all possible
vantage points, beliefs, or conclusions, but it does alert us to the reality of
"positional perspectives. There is no "uniquely correct standpoint from
which to arrive at proper judgments."148 We all see the world from within
our subjectivity, our culture, and our physical place in the world. As
Amartya Sen explains, "What we can see is not independent of where
we stand in relation to what we are trying to see."149 Just because some
people subjectively believe something to be true, like the existence of
deities or aliens, does not mean these entities actually exist, although they
may seem real to certain individuals. Subjectivity can block off
individuals from the objective world, although this rarely happens
completely.150
The point I am making is simple. We all see the objective world
every day, but very few of us actually look past our subjectivity or culture
to know the objective world, which is possible; however, it is also
incredibly difficult, and we are constrained by our biology and cultural
processes.151 Most people dwell within their subjectivity and culture and
rarely venture outside of it. They believe what they subjectively and
culturally see, which is often called "common sense." Some of these
people think they know the objective world, but what they really know is
their subjective experience of the objective world. Even scientists, if
they’re not careful, can succumb to this illusion, as historian of science
Steven Shapin has highlighted in relation to the recent "Science Wars."152
But while subjectivity and culture are removed from the objective world,
they are still connected to and a part of the objective world; thus, these
tools serve us fairly well, although they do have limitations and dangers.
Our subjectivity is naturally attuned to an objective world that is
knowable, largely because our mind is part of, and has evolved within,
this objective world. Thus, we can understand the objective world fairly
well, although our understanding is reliant upon and limited by our
biological brain and our culture's explanatory theories.153 We all live
somewhat successfully within our subjectivity because a basic
phenomenological knowledge of the world allows us to function without
too many epistemological problems.154 As the Nobel Prize winning
psychologist Daniel Kahneman explained, "Most of our judgments and
actions are appropriate most of the time."155
But do we really see and know the objective world as it actually
exists? For most of us, the answer is a qualified “yes.” We see the
surface appearance of the immediate objective world, which creates a
sense of "practical realism" or phenomenological realism,156 but as Hilary
Putnam explained, "we have no direct cognitive relation to the objects of
perception."157 This lead Immanuel Kant to famously distinguish
between the phenomenon that we see from the objective world as it
actually exists because he wanted to warn against any simple transparent
correspondence between our perception and the objective world we
believe we know.158
Since Kant we have known that perception is not passive
recognition of objective reality. Before Kant (and for over a century
after) philosophers and scientists believed that "we draw no conclusions.
They are drawn by nature. We simply describe...phenomena are
discovered."159 This naive assumption has turned out to be false. While

30
we can perceive the objective world through our own personal
experience, we have to actively create knowledge of our world, and
therein lies the problem. Every individual and culture creates a practical
form of knowledge, common sense, which is often valid to some extent.
But our common-sense knowledge is always highly limited because we
can empirically validate only what we experience on a daily basis in our
localized physical and social environment; however, even then we usually
only grasp the superficial surface of objective reality.160 Rarely do we
fully understand the depths of the objective world we dwell within. Plus,
we must never forget that our subjectivity can misunderstand reality or
our biological brain can malfunction, both of which can cause us
problems.161 As the physicist Werner Heisenberg explained, we have no
"sharply defined" notion of reality and we "never know precisely."162
Most of us don’t realize how food gets from farms to grocery stores, or
how money circulates from banks to businesses to our pockets, let alone
how chemical reactions create organic life, or how subatomic particles
create the atomic building blocks of the universe. Physicists developing
“string theory” have argued that reality actually consists of eleven
dimensions and that parallel universes exist beside our own. If this is
true, then we all have a very limited grasp on reality.163
Outside of our localized, anthropocentric environment, the vast
majority of human beings are existentially and epistemologically
ignorant. Charles E. Lindblom has pointed out, "Most of the [world] is
too far away for anyone to observe much of it. Each person can see,
listen to, and touch only a small part of it. Its complexities also resist
direct observation. Even [scientists] find direct observation laborious and
difficult, hence each specializes in no more than a few points of
observation and sometimes none at all."164 Most of our knowledge of the
objective world comes second-hand from others, and the validity of this
secondary information varies greatly due to vast differences in the quality
of primary sources. Even to the extent that we use scientifically derived
information, the vast majority of us believe this information based on a
quasi-rational or irrational "trust" or "faith" in the authority of the
scientific professions or because it is accepted social dogma.165 As Henry
Adams once explained about his acceptance of Darwinian evolution, "He
was...a predestined follower of the tide; but he was hardly trained to
follow Darwin's evidences...a young man had to take on trust. Neither he
nor any one else knew enough to verify them...Henry Adams was
Darwinist because it was easier than not, for his ignorance exceeded
belief, and one must know something...Natural Selection seemed a dogma
to be put in the place of the Athanasian creed; it was a form of religious
hope...he did not care whether truth was, or was not, true. He did not
even care that it should be proved true."166
The most reliable information comes from scientists or
professionals trained in scientific methods, but science is still flawed, and
it often becomes mere dogma or magic for the average human being.
Relatively few understand the scientific process, and many do not know
why it produces the most valid forms of truth. Even practicing scientists
can't always agree on what they do or why.167 The practice of science is
an important human endeavor that can get us very close to objective
reality, but it has been unevenly developed in various disciplines over the
past five hundred years. It is important to remember that most scientists
were bumbling amateurs up until the late 19th century. The first
systematic treatise on science was not published until 1843. It was John
Stuart Mill's A System of Logic, Ratiocinative and Inductive, Being a
Connected View of the Principles of Evidence and the Methods of
Scientific Investigation.168 Mill was one of the first philosophers to argue
that truth could only be created through scientific methods. But the new
scientific disciplines created in the late 19th century were not really
professionalized until the mid 20th century.169 Yet few understand that
the true promise of science is not the technology it produces.170 Instead,
the fundamental value of science comes from its methodical and
institutionalized process of creating knowledge.171
What is called science, or the scientific method, has always been
an ill-defined and diverse set of practices.172 Science is the result of a
"complex and heterogeneous historical process" that has been "chaotic"
and "full of mistakes."173 One philosopher called science a "discordant
collection of methods and results" that are "held together rather
artificially."174 And to make matters worse, the endeavor of science is a
"continually evolving" set of ideas and techniques, so it’s never stable
enough to pin it down exactly.175 Some believe that scientific activity is
too diverse to talk about as a distinct phenomenon. In 1861 the Russian
novelist Ivan Turgenev had one of his characters declare: "And what is
science - science in the abstract? There are sciences, as there are trades
and professions, but abstract science just does not exist."176 Many
contemporary scientists still agree with this sentiment. But I disagree. I
would argue that it is possible to generalize a basic process and purpose
of science. These similarities account for the significance of science as a
general knowledge-creating tool, which is common to all of the specific
sciences with their specific theories and methods. As Ernst Mayr pointed
out, "One would not be able to speak of science in the singular if not all
sciences, in spite of their unique features and a certain amount of
autonomy, did not share common features."177
Physical and social scientists carefully create theories, methods,
facts, and technologies that enable greater description, explanation and
sometimes prediction of the objective world,178 which allows for some
measure of control over ourselves and our environment. Science is
the practice of disciplined reflection, theory, experiment, and critical
debate. Science is “a technology of truth,” as Daniel Dennett described
it.179 Scientists filter their subjectivity through disciplined methods, the
use of evidence-based theories to experiment on the objective world, and a
critical community. The end result of this process is the creation of
provisional truths. The scientific process is based on a flawed but valid
fiction that empirical observations and laboratory experiments produce a
knowledge that "corresponds" with reality,180 often with "an underlying
reality which we do not experience directly."181 This belief leads many
scientists to believe that they are able to explain "the fabric of reality
itself,"182 and further, that they can use this knowledge to make
predictions. Some scientists, like Bruce Bueno de Mesquita, have even
gone so far as to create a business charging clients for this supposed
ability to foretell the future.183
While the individual activities of each scientist are important,
more important is the emergent effect of the community of scientists
enabled through scientific institutions, like conferences, peer-reviewed
journals, and university academic departments. This general process is
called "peer review." John Stuart Mill was one of the first philosophers
of science to appreciate the important value of peer review. He argued
that the scientist needed "the steady habit of correcting and completing
his opinion by collating it with those of others."184 Mill believed that the
"clearer perception and livelier impression of truth" would only be
produced "by its collision with error."185 Each scientist or team of
scientists propose theories about reality based on their research, but these
theories are then evaluated, judged, and later refined through the
disciplined debate of a professional community of critics.186 This is the
main "method of science," according to Karl Popper. He called this
process "the critical method."187 Oreskes and Conway argue that peer
review "is what makes science science...no scientific claim can be
considered legitimate until it has undergone critical scrutiny by other
experts."188
This critical community of scientists produces "positional
objectivity" by filtering out subjectivity, cultural bias, and "positional
variations in observations."189 And as if that is not enough, scientific
truths are also evaluated with further tests designed deliberately to falsify
their claims. If the scientific claim cannot be falsified, then it still
remains provisionally “true,” although generally accepted as more true
than before such tests.190 Most scientific claims are falsified in small
ways and modified over the years so as to become more true, but rarely, if
ever, completely true. A "fact" is a theory that has been "repeatedly
confirmed and never refuted," but even theories that don't reach this final
stage are still "useful heuristic devices" that help explain how reality
might work.191 The peer review process is an "error-elimination process,"
but not all errors can be eliminated.192 As David Deutsch explains, "In
science we take it for granted that even our best theories are bound to be
imperfect and problematic in some ways, and we expect them to be
superseded in due course by deeper, more accurate theories."193
Thus, constant vigilance and criticism are warranted. But there
is evidence to suggest that this most crucial aspect of the scientific
method, the critical peer review process, is no longer working as it
should. This is partly due to the ever-increasing number of practicing scientists and to the exponential increase of published and unpublished scientific studies, which is pushing the limits of the peer review system.194 Thus, many flawed, scientifically produced claims have gone unexamined and uncritiqued by the larger scientific community.195 And there has been a dramatic increase in the number of errors and outright fraud that have slipped by reviewers and been published.196 Mistakes and even fraud are to be expected. This is all part of the "sloppy" nature of
science.197 But even when objective truth is examined and verified by a
large scientific community over time, it is important to remember that the
discovered and refined “truth” is never static or absolute. There is no
room for complacency or orthodoxy. Scientifically produced truth is
often only a "probable" truth, as Hans Reichenbach argued, "whose
unattainable upper and lower limits are truth and falsity."198 Science is
built on the foundation of constant critique and revision of old probable
truths so as to continually create better truths or new truths. This leaves
the whole notion of scientific truth in perpetual flux, albeit often within
relatively definitive parameters.199
Over the past two centuries the practice of science has improved
the lives of billions of people through increased knowledge and
technology,200 which have increased the health of individuals and the
wealth of societies. To take but one notable example, the agricultural
scientist Norman Borlaug developed disease-resistant and high-yield food
crops in the 1950s and 1960s, which led to the Green Revolution and his
winning the Nobel Peace Prize in 1970. This scientific improvement was
a major breakthrough for the human species. It kept hundreds of millions
in the developing world from starving and it stabilized and sustained the
economic market for food.201
Nobody can honestly deny the possibility of objective
knowledge and the benefits that science and technology can provide.202
The practice of science is perhaps the greatest human innovation of all
time. However, science and technology are not unqualified goods, as
some would make them seem.203 Science is not perfect and it is "not
sacrosanct."204 It is only the best method we have to create truth and the
technology we increasingly depend upon.205 But as the practice of
science advances and pushes back more and more boundaries, we must
collectively acknowledge, "Science does have the potential to do great
harm, as well as good."206 As Dr. John Ioannidis has stated, "The
scientific enterprise is probably the most fantastic achievement in human
history, but that doesn't mean we have a right to overstate what we're
accomplishing."207 Thus, as the philosopher of science Robert Klee has
argued, while science's "track record of achievements is neither errorless
nor continuous," the strength of the scientific process has been
continuously proven by the ability of scientists to "learn more from
mistakes than from successes."208
In an effort to help strengthen current practices of science and
the scientific process, I think it is instructive to demonstrate the
limitations of science as it is currently practiced. There are many serious
flaws in the practice of science, including limits to what science can and
cannot accomplish. There are also many unforeseen or uncontrollable
consequences due to the practice of science and the development and
manufacturing of technology.209 For starters, some scientifically
produced technology threatens the very existence of life on Earth. We
don't have to look very far to see the ill effects of fossil fuel pollution,
chemical pollution, nuclear bombs, and biological weapons. We must be
eternally vigilant about the uses, misuses, and unforeseen consequences
of the technology we create.
There are also many more problems with the practice of science,
and with scientists themselves, that need to be more fully acknowledged
and seriously debated. For example, scientists are not immune to
subjectivity and culture. Scientists can be just as irrational and wrong as
the rest of us.210 Paul Feyerabend argued that "interests, forces,
propaganda and brainwashing techniques play a much greater role than is
commonly believed in the growth of our knowledge and in the growth of
science."211 Many scientists cling to widely accepted "canonical
hypotheses"212 like "dogma,"213 as "firmly as any religious belief,"214
which led Thomas Kuhn to famously assert that powerful scientific
paradigms resist change, even when new evidence has proven them
wrong.215 One scientist acknowledged, "Even when the evidence shows
that a particular research idea is wrong, if you have thousands of
scientists who have invested their careers in it, they'll continue to publish
papers on it."216
Some scientists lie and cheat, practicing "voodoo science," and
this is an important truth that should never be swept under the rug.217
Some scientists willfully distort data or plagiarize the ideas of other
scientists. Charles Gross recounts several studies of scientific misconduct
where roughly 7 to 27 percent (depending on the study) of scientists
reported firsthand knowledge of "fabricated, falsified or plagiarized
research over the previous ten years."218 And there is evidence that
scientific misconduct has been rising in recent years.219 But even when
there isn't a willful distortion of the data, there is still a lurking subjective tendency to publish only the data that corroborate one's theory. This has been called "publication bias," which means that the
published research often legitimates only certain claims or theories, while
the other unfashionable or controversial claims and theories get
neglected.220
Both voodoo science and bad science perpetuate themselves for many
reasons: the prestige of powerful scientists, the inherent bias of funding
sources, which can often create conflicts of interest, and also the "black
box" of the peer review process itself. The scientific publishing industry
has come under heavy scrutiny over the past decade. It is well known
that funding sources often bias research results. For instance,
pharmaceutical companies publish only those data that support their
drugs and bury unfavorable data.221
But less well known is the inherent bias within the peer review
system itself. Many articles are published not solely on their merit, but
on contingent factors, like the subjective interests of reviewers or one's
professional connections.222 Several scientists have argued that the peer
review system "rewards conformity and excludes criticism," thus, popular
consensus can often lead to mindless replication of the same
conclusion.223 But even when a crowd of scientists are all publishing on
the same topic, rarely do they do the important work of retesting the
results of other scientists to replicate findings.224 Instead, scientists often
design their own study to attract professional attention and research
dollars. Because of all of this, Dr. John Ioannidis's meta-analysis of
medical research has found that 80 percent of non-randomized studies
offer false conclusions, and around 41 percent of the most cited medical
research from the most prestigious journals turns out to be "wrong or
significantly exaggerated."225
Also, valid scientific data can be misinterpreted and/or misused
by politicians, the culture at large, or even scientists themselves.226 Take
for example the controversies surrounding medical research and public health policy on human diet and health. Gary Taubes has
convincingly argued that "nutritionists for a half century oversimplified
the science to the point of generating false ideas and erroneous
deductions." He concluded,

Practical considerations of what is too loosely defined as the
"public health" have consistently been allowed to take
precedence over the dispassionate, critical evaluation of
evidence and the rigorous and meticulous experimentation
that are required to establish reliable knowledge. The urge to
simplify a complex scientific situation so that physicians can
apply it and their patients and the public embrace it has taken
precedence over the scientific obligation of presenting the
evidence with relentless honesty.227

Just because one group of educated individuals has superior knowledge does not mean that this group is any better than the rest of us in predicting the future. Scientists, like all people who can claim expertise, can become overconfident in their knowledge and abilities, and thereby become
more prone to make mistakes. In one study of the flawed judgment of
experts, a researcher found that "people who spend their time, and earn
their living, studying a particular topic produce poorer predictions than
dart-throwing monkeys who would have distributed their choices evenly
over the options."228 There is also evidence that some scientists deliberately involve themselves in sensitive issues and obscure the truth
for political purposes, ranging from hiding the dangers of smoking to
denying the reality of global warming.229
Thus, the social status of "science," the judgments of scientists,
and the products of science should not be elevated as unqualified
goods.230 Science "is not, and cannot be, as authoritative" as most
scientists wish it to be.231 Too often we believe scientists "because of
their visible display of the emblems of recognized expertise and because
their claims are vouched for by other experts we do not know,"232
elevating scientific claims to near divine, dogmatic status. This is a
significant problem because scientists often situate their work or
themselves within political debates. As Tony Judt pointed out, the
"public pronouncements" of scientists are often "polemic," whereby their
"expertise [takes] second place to political or ideological affiliation."233
Another problem is that science is elitist. Many reactionary
conservative politicians use the loaded label of "elitism" as an ad
hominem fallacy, but there is actually some truth to the claim. Becoming
a scientist is a long, arduous, and costly process that takes years
(sometimes a decade) of advanced university training. Because of this,
only a tiny fraction of the human population will ever become practicing
scientists.234 The practice of science is also extremely expensive,
complex, and time consuming. Studies ideally last years, if not decades.
This creates a backlog of untested theories. As one economist put it,
"Theory is cheap, and data are expensive."235 Science is also elitist in the
sense that a positivist model of empirical physical science and analytical
logic have been held up as the only true or valid model for all knowledge
claims, which has led to the denigration or dismissal of diverse science
practices and other forms of knowing.236 As Ludwig Wittgenstein
famously said, "What can be said at all can be said clearly; and what we
cannot talk about we must pass over in silence."237
The practice of science is also narrowly specialized, which
creates a particular problem at the heart of the scientific process. Many
scientists struggle to keep up with the vast proliferation of research in
their own narrow niche and have to rely on faith in the ability of other
scientists working in different fields. As Michael Polanyi explained,
"nobody knows more than a time fragment of science well enough to

37
judge its validity and value at first hand. For the rest he has to rely on
views accepted at second hand on the authority of a community of people
accredited as scientists."238 The professional scientist has only been
trained to produce objective truth in a very narrow domain, and outside of
this domain scientists are often just as subjective and ignorant as the rest
of us. Scientists also must have faith that all the "fragments of evidence"
within their own narrow domain, and across the curriculum, add up to
some significant whole.239 Hilary Putnam called this faith
"panscientism," which he defined as the belief that all important defined
problems "are fated, in the end, to be resolved by the progress of the
natural sciences."240
There is, of course, no evidence to support such a proposition.
Rarely do scientists engage in interdisciplinary work to weave diverse
bodies of evidence into some kind of unified and practical knowledge.
Most of the truths produced by scientists are relevant only to other
specialized scientists, thus, whatever progress happens is far removed
from the daily lives of human beings.241 Scientific knowledge only
occasionally pays "practical human dividends" in terms of solving real-world problems.242
Another problem is the overly simplified outcome of scientific
studies, which often focus on a handful of discrete variables in a causal
chain. Thus, scientific studies either ignore or try to control for various
complex systems surrounding the phenomenon being studied, which
gives a false picture of how variables actually interact within densely
layered and overlapping environmental levels: sub-atomic, atomic,
chemical, biological, social, and ecological. Because the larger
contextual ecology is often ignored, or overly simplified, scientists are often unaware of significant biases in their data-gathering methods. Sometimes, as in the field of psychology, the traditional methods of data gathering were so biased as to call into question the validity of the entire discipline. Only recently did psychologists realize how "WEIRD" (Western, Educated, Industrialized, Rich, and Democratic) the majority of their test subjects
were. For the past half century, most academic psychologists had been
using undergraduate volunteers and making the assumption that these test
subjects represented the whole of humanity. But Joseph Henrich and his
colleagues have pointed out that these test subjects are historically and
culturally contingent, thus, conclusions based on this population will have
limited or no validity for other cultures, or for humanity as a whole.243
Furthermore, scientists try to remove phenomena from
temporal processes to create universal, ahistorical truth that falsifies the
dynamic, constantly changing nature of reality.244 But the complicated
process of creating objective truth is always focused on the world as it
was, not on the world as it will be. Thus, science is often a backward-looking endeavor that records how the world worked yesterday, not necessarily how it will work tomorrow.245 For this reason, scientifically
produced truth is always provisional, probable,246 and it slowly becomes
obsolete, as the objective world changes through time. As the historian
and philosopher of science Karl Popper emphasized, "All theories are
hypotheses; all may be overthrown."247
Science also tends to confirm the status quo. Carl E. Schorske
explained, "The very nature of the scientific method is to affirm the
given, the reality it aims to analyze."248 Thus, scientists rarely question
deeply ingrained assumptions or institutions.249 Furthermore, because
science is a backward-looking process, scientists often cannot make
accurate predictions or fully understand the consequences of their new
knowledge or technology.250 As Stephen Toulmin once said, "The greater
our interventions in the natural world, the less we can forecast or modify
their effects, and the more significant will be their unintended
outcomes."251
Finally, the products of science, like facts, theories, and
technology, have no meaning or value in and of themselves. Nietzsche
stated, "There are no moral phenomena at all, but only a moral
interpretation of phenomena."252 Scientists seek to control and limit the
influence of subjectivity and culture by becoming objective, which means
creating facts without subjective or cultural value. Thus, scientists can
tell us what is, but not what should be. They can tell us what we can do,
but not what we should do. Science gives us facts about the objective
world, but it cannot tell us the value or meaning of facts. Thus, science
does not relieve individuals "of all personal responsibility for our
beliefs."253 Science and technology are amoral and can be used as means
to facilitate moral as well as immoral ends - ends which we need as meaning-making animals, and ends over which we will always disagree and
debate.
Although the pursuit of knowledge and the creation of technology are tens of thousands of years old, science as we know it is a
relatively recent innovation in human history,254 only becoming clearly
defined and professionally practiced over the last century.255 Most human
beings do not fully understand what science is, let alone use its
methodology to make daily decisions. Even trained scientists would find
it very difficult, cumbersome, and costly to use scientific practices in
everyday living. Even if it were possible to use science in our daily life,
this complex and arduous method would lead to paralysis or “total
absurdity.”256 Charles E. Lindblom went so far as to say that "examining
everything is a path to madness" that "goes far beyond human
capacity."257
But even with all these flaws, science is still one of the most
outstanding human inventions. Karl Popper went so far as to claim,
"Next to music and art, science is the greatest, most beautiful and most
enlightening achievement of the human spirit."258 And without fully
understanding the practice of science, most people benefit from its
practice, using the products of science to live better, know better, and
communicate more clearly. We all benefit from the laborious practice of
scientists, even if we do not behave scientifically or completely
understand scientific processes or outcomes. Science is a great invention
and it will continue to be a worthy endeavor, but this technology of truth
does not eliminate subjectivity, nor can it control the myriad social
processes of our culture. Science needs to be understood as an important
but limited tool. We should not blindly trust science.259 As David
Lindley recognized, "Scientific knowledge, like our general, informal
understanding of the everyday world we inhabit, can be both rational and
accidental, purposeful and contingent. Scientific truth is powerful, but
not all-powerful."260
In creating human knowledge there is never a choice between
fact and fiction. And yes, even science is still a form of fiction.261 The
more perceptive physical and social scientists have acknowledged this
point. I want to note just a few examples. Charles Lindblom pointed
out, "No one - child or adult - knows the surrounding world except
through constructed images, and all contain elements of impairment."262
Kenneth Burke explained, "The choice here is not a choice between
magic and no magic, but a choice between magics that vary in their
degree of approximation to the truth."263 William H. McNeill declared,
"epistemological exactitude is unattainable."264 And Karl Popper
concluded, "There is no absolute certainty...the quest for certainty, for a
secure basis of knowledge, has to be abandoned."265
At root, the indeterminacy of science is related to the
foundational flaw of all human knowledge: we have to subjectively
interpret the objective world.266 Thus, in a world without perfect and
direct knowledge of the objective world, there is only a choice about
using the best methods to improve the quality of our fiction.267 All
human epistemology and discourse, even the scientific, is fictionalized,
although if good scientific tools are used, it is a careful and highly qualified
type of fiction. But regardless of what technique is used, all techniques
are flawed and all lead to partial approximations of the real world, thus,
all truth must be built on the provisional and tumultuous foundation of
reasoned debate and human judgment.268
The flawed and contingent nature of truth represented
imperfectly through human discourse was known to scientists even before
Richard Rorty popularized the notion that human knowledge had taken a
"linguistic turn," which had set off a "postmodern" wave of
epistemological skepticism.269 Stephen Toulmin called the 20th century
the "Century of Representation" because the study of language,
subjectivity and culture became central to all knowledge claims.270
Scientists have always admitted the limits of human knowledge and have
used imagination and creative language to conceptualize new theories of
the physical universe, creating new ideas and words that had never
existed before, but which reflected a new understanding of reality. While
scientists seek to describe and understand "the very fabric of reality," they
do so with flawed tools that are "imperfect and problematic."271 Most
also recognize, as David Deutsch pointed out, "that reality is a much
bigger thing than it seems, and most of it is invisible. The objects and
events that we and our instruments can directly observe are the merest tip
of the iceberg."272 Recognition of these limits helps keep scientists from
turning their disciplinary concepts and methodologies into
"unchallengeable dogmas," although many still try.273
Epistemological skepticism was slowly acknowledged by the
scientific community in the early 20th century. Not only was reality
revealed as much more strange and complex than once thought, but
scientists also discovered the flaws of human perception and the fictional
properties of all language used to describe the objective world. This was
especially profound and disquieting with the revolution of quantum
mechanics and probability theory. Werner Heisenberg explained how all
knowledge "depends on our language, on the communication of ideas."
He emphasized that the fallibility of human language, the "intrinsic
uncertainty of the meaning of words," is a "fundamental problem" at the
heart of all ontological and epistemological claims.274 The physicist Niels
Bohr exclaimed, "When it comes to atoms, language can be used only as
in poetry. The poet, too, is not nearly so concerned with describing facts
as with creating images and establishing mental connections."275 Michael
Polanyi described scientific discovery as a process of "creation" that
resembled "a work of art."276 Like the rest of humanity, scientists must
craft knowledge through fallible methods and they must shape their
knowledge through cultural discourse.277 As Kenneth Burke explained,
"We find our way through this ever-changing universe by certain blunt
schemes of generalization, conceptualization, or verbalization - but words
have a limited validity...we should consider them inadequate for the
description of reality as it actually is."278
Scientists have to create generalizations about the objective world from limited data, often describing phenomena abstractly using poetic techniques, like metaphors. They also have to simplify the complex world into fictionalized conceptual models that allow us to understand causal chains. There is no better social scientific field in which to understand this process than economics, an academic discipline the reactionary conservative Thomas Carlyle once pejoratively labeled "the dismal science." The taken-for-granted concept of Gross Domestic Product (GDP) is a perfect example. This relatively simple collection of economic indicators supposedly represents the economic health of a
nation. But this industrial-era fiction has recently come under fire as unreliable and anachronistic in a post-industrial age, leading some economists to call for a better measure of the wealth of nations.279
Another bedrock economic fiction is the credit rating for
consumer, commercial, and sovereign debt. Supposedly AAA or BBB
ratings correspond to creditworthiness;280 however, in the wake of the Great Recession of 2008-09, these ratings have been exposed as almost worthless metrics, not least because of the collusion of credit agencies
with the corporations they evaluate. The United States lost its coveted
AAA rating in the summer of 2011, but in defiance of all economic logic,
the cost of U.S. debt declined due to the widespread belief that U.S.
bonds were still the safest asset in the world. One of the more recent and
imaginative economic fictions is the "Big Mac index," which some
economists use to track purchasing power parity in world currencies.
With the global penetration of McDonald's, one can use the basic material cost of a hamburger to gauge whether specific sovereign currencies are over- or under-valued.281
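To make the logic concrete, consider a purely hypothetical illustration (the numbers are invented for the example, not taken from any actual survey). If a Big Mac costs 4 dollars in the United States and 280 yen in Japan, the implied purchasing-power-parity exchange rate is 280 / 4 = 70 yen per dollar. If the market exchange rate at the time is 100 yen per dollar, then by this crude measure the yen appears undervalued by roughly 30 percent (70 / 100 - 1 = -0.30). The fiction, of course, lies in letting a single standardized sandwich stand in for the entire basket of goods and services a currency can buy.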
But economics is not alone. The fields of political science and
the law are also dominated by fictional abstractions. Modern society is
literally built on the foundation of traditional legal fictions. Take for
example the political system of modern democracies. This form of
government is based on the legal fictions of "popular sovereignty," which
is a belief in discrete, ethno-national populations ("the people") who have
"inherent rights" grounded in a "natural law."282 All of these legal
fictions do not actually exist empirically, which has caused 20th century
post-foundationalist legal scholars and political scientists to perform
backbreaking logical justifications of these essentially religious beliefs.
The limited liability “corporation” is another important fictional
construct. This invented entity is a social structure legally defined as if it
were an individual human being, and given all of the legal rights granted
to a person.283 This fiction was particularly ironic in late 19th century
America. The 14th Amendment to the U.S. Constitution and the 1866
Civil Rights Act were originally designed to deliver political and social
equality for freed black slaves, but these laws were primarily used by the
conservative Supreme Court to uphold the rights of the newly formed
corporation (these same statutes were ignored or limited by state laws
when invoked to defend the infringed rights of African Americans).284
All abstract concepts and theories, scientific or religious, are
fictional constructs. These conceptual tools are used by human beings to
piece together a diverse array of empirical facts in order to create a
plausible narrative that explains how the world works. The historian William H. McNeill explained that facts alone do not, "in and of themselves,
give meaning or intelligibility... facts have to be put together into a
pattern that is understandable and credible."285 The philosopher Daniel Dennett explained in his book on human consciousness,
“Metaphors are the tools of thought. No one can think…without
them.”286 The Nobel Prize-winning economist Elinor Ostrom explained in one of her most important studies that scientific theories and models,
as well as the policy prescriptions built from these models, are always
flawed "metaphors" for the reality they seek to describe and explain.287
Tony Lawson, a Cambridge professor of economics, is one of the rare
social scientists to frankly admit that “fictionalizing is almost always
necessary” in scientific endeavors.288 Early social scientists like Adam
Smith talked openly about scientific generalizations as a poetic process,
as did some 20th century physicists, like Bohr.289 Many contemporary
scientists often turn to "storytelling" to translate complex scientific truths
into language that a general audience can understand.290
Fictionalization is an important part of human nature and
intimately connected to the construction of human knowledge and social
institutions. Problems arise when we forget that even scientific theories
and mathematical models are imperfect fictions, what James Coleman
called "sometimes true theories."291 One example of the tragedy of taking
scientific theories too seriously can be found in the misplaced trust in economic models and flawed financial instruments, which partly caused the global Great Recession of 2008-09.292 The foundational "efficient market hypothesis," descended from Adam Smith's invisible hand and formalized by Paul Samuelson, was
laid bare as a simplistic fiction, which had to be balanced and stabilized
by the financial stimulus and regulatory guidance of the state.293
Pierre Bourdieu emphatically warned scholars that one must
never confuse "the model of reality" with "the reality of the model."294
That is because, as the philosopher Martha C. Nussbaum has pointed out,
even our best scientific models can lead to "distorted human
experience."295 My emphasis on the concept of "fiction" is a rhetorical
device to remind us all that even our best concepts and theories are
fundamentally flawed. Even my concept of "fiction" used in this book is
flawed, as it seems to overstate the subjective quality of knowledge and
understate the objective correlates of all knowledge. We must make sure
never to take our concepts and theories too seriously. As Elinor Ostrom
argued, "Scientific knowledge is as much an understanding of the
diversity of situations for which a theory or its models are relevant as an
understanding of its limits."296 The purpose of this extended critique of
science has not been to unfairly chastise this important endeavor, but
rather to show how even our most important technology of truth is
flawed, and that part of the strength of science is its ability to learn from
past mistakes and partly control for the limits of human cognition.
So instead of worshiping the unattainable pinnacle of pure, objective knowledge, as many scientists seem to do, I think we should teach everyone how to see the world as fiction, thereby enabling
us all to more consciously create the contours of our lives and understand
the limits of the knowledge we make. We cannot all practice the rigorous
methods of science, but we can all practice daily the art of fiction. Thus,
it is imperative for us to understand this practice so that we can construct
more valid and useful representations of reality. The practices of
philosophy (critical thinking) and rhetoric (discourse) can help us do this.
To explain the everyday art of fiction, its importance for the formation of
knowledge, and its facilitation of action, I want to use an example of map
making. This extended metaphor will demonstrate how fictionalization
allows for human knowledge and agency, while also illustrating the
limitations of science.
Map making is a universal human tool. The concept of a map
employs several important processes of knowledge creation and
utilization: symbols to represent the physical world, a conceptual
paradigm for ordering these symbols into a meaningful cosmos, a
motivational purpose directing human action, and language used
rhetorically to encourage deliberate action. A map is a "cartography of
the mind"297 that illuminates, motivates, and directs action. This uniquely
human tool was not really developed as a conceptual instrument for
creating knowledge until the Middle Ages, when the creation of the
printed book helped develop new forms of learning for practice in the
western university.298 The map was instrumental in constituting our
modern notions of nationalism (ethnic groups) and nations (states), using
fictional political entities to neatly divide the world into discrete units
with equally fictional boundaries.299 Yet the impulse for map making in a
more general sense goes back to the very origins of human language and
conceptual thought. What are primitive cave paintings or oral
cosmologies but cultural maps bringing meaning and order into the
world? Maps are an ancient tool that uniquely reveals a primary function of the human mind.300 Thus, the map is the perfect example for exploring the
centrality of fiction and human cognition, which helps us understand the
possibility and limitations of knowledge.
If a stranger asks me how to get to the nearest coffee shop, I
might draw a very crude map emphasizing the major streets, key
landmarks, and basic directions. No idiot would mistake my map for the
existential reality of the city. But even a child could recognize through
my horrid pictographs and sloppy penmanship that this map accurately
represents a real place, and that it could efficiently direct a person to a
particular coffee shop and help facilitate this person's motivation for
going there. The map is a fiction inspired by a clear purpose and
imperfectly expressed through symbols and syntax. It is a product of my
consciousness, whereby I have experienced and created knowledge of a
particular place. As such, it is my attempt to conceptualize and verbalize
the specific intentional act of getting from point A to point B. While it is
not a completely accurate depiction of a complex ecology, it still
truthfully represents reality enough to help a person reach a destination
and achieve a purpose. The map is a tool created from my subjective and
cultural experience, embodying a fictionalized truth that can help others
to know and act.301
Now we all recognize that professional map makers
(cartographers) utilize this same basic subjective act, supplemented by
scientific methods and expensive technology, to create much more
detailed and clear representations of countries, states, cities,
neighborhoods, and streets. Take for example Google Earth, which is a
marvel of information technology.302 Scientifically produced maps are
much more accurate and truthful, but they are still fictions.303 Never for a
moment should we think that Google Earth perfectly represents and
reproduces the complex reality of the objective world. We even have the
wondrous technology of satellites and digital cameras, which are the
culmination of hundreds of years of scientific labor. These satellites can
see any place in the world and produce a detailed picture of objective
reality, but this too is a fiction. All maps represent the subjective fiction
of their makers, even if that maker is a computer.304 Maps are a very
specific way of seeing the world, they can give us a god’s eye view, but
even a digital satellite cannot capture the vibrant and complex ecology of
the objective world. And even if a perfect replica of objective reality
could be created, as Jorge Luis Borges once perceptively explored, it
would be absolutely unusable and incomprehensible because of the sheer
size and mind-numbing complexity of such an object.305
A map presents data, not reality. Data include visible, discrete,
empirically verified objects, such as buildings, streets, names, directions,
and lengths. But data do not actually exist in the objective world. Data
are fictional creations for the purpose of knowledge. Data represent a
very simplified and conceptually arbitrary version of the real world, a
small broken-off piece of a much larger interconnected puzzle. Data
produced through scientific methods are more exact, but they too falsify
both the complexity of the objective world and the subjectivity of human
experience. The very notion of “data,” discrete units of valueless
information, is itself a scientific fiction - useful, but flawed. The concept
of data is an analytical tool created by scientists, and it is meaningful and
useful only to the extent that humans put it to a purpose.
Thus, cognitive maps fictionalize the ecological terrain we live
in to create knowledge, but that knowledge has no intrinsic value outside
of its use. Walter Lippmann once insightfully explained the utilitarian
value of cognitive maps:

For the real environment is altogether too big, too complex,
and too fleeting for direct acquaintance. We are not equipped
to deal with so much subtlety, so much variety, so many
permutations and combinations. And although we have to act
in that environment, we have to reconstruct it on a simpler
model before we can manage with it. To traverse the world
men must have maps of the world.306

Knowledge is simply a tool created by human agents who give data
meaning and use it for a purpose. The purpose for constructing
conceptual maps will also shape the level of analysis, or scope of reality.
This could include having to think about physical reality in "multiple
levels" that are "nested in a consistent manner within one another,"307
such as the common mental juggling of personal, institutional, local,
regional, state, national, international, and global phenomena as we go
about our daily lives.
We can extend the metaphor of map making to include all
attempts by human beings to conceptually order and understand their
physical, social and subjective worlds. Take for example personal and
social narratives, which we often call "history." The historian William H.
McNeill explained, "And just as a map is not a replica of reality, but a
schematic representation of selected features from the relevant
landscapes, so a history, too, is not a replica of what really happened, but
a verbal construct that makes intelligible what is otherwise a confused
jumble of potentially infinite and thoroughly unmanageable
information."308 Everyday we order the "confused jumble" of data we
experience, assembling selected parts into meaningful conceptual
cartographies expressed in different forms, like narratives or maps, so that
we can successfully navigate our lives. This is fiction, the underlying
process of knowledge creation.
Scientific methods produce much more precise maps of the
objective world, thus, they are a very useful and effective way of seeing
and knowing the world. However, they are not perfect representations.
They too are fictions. It is also important to realize that science is not the
only tool that works. For hundreds of thousands of years human beings
relied on "folk taxonomies" and cosmological myths to classify, order,
and understand the world.309
These tools have not been superseded by science. They are still
in use by the majority of human beings, if not by all of us, most of the
time.310 They remain in use because, unlike science, older folk
taxonomies and myths are more useful to human beings, especially in
terms of assigning meaning and value to life. Human beings experience
reality and understand it through common sense and personal narratives.
Humans will always need to fictionalize experience to create subjective
truths and conceptualize a larger picture. As linguist George Lakoff
explained, "Human categorization is essentially a matter of both human
experience and imagination - of perception, motor activity, and culture on
the one hand, and of metaphor, metonymy, and mental imagery on the
other."311 As humans, we must fictionalize our experience to make it
meaningful and useful.

Chapter 4
---
Philosophy is a Tool for Everyone

Science is not our only technology of truth.312 The practices of
philosophy and rhetoric, the core of a system of knowledge called the
humanities, are much more practical tools, which humans have been
using for over two thousand years. These older technologies of truth
promote self-knowledge, critical reflection, deliberation, judgment, and a
measure of freedom.313 Unlike science, these tools can be more easily
learned by the majority of human beings, and they can help democratize
and improve the process of knowledge production. These practices can
address the practical problems of science so that science remains
accountable to the ends of human society. Philosophy and rhetoric can
lay the epistemological foundations for the education of all human
beings, while also preparing more advanced students for training in the
methods of science. Spreading the practice of philosophy and rhetoric
would increase the democratization of society and the accountability of
political leaders by enabling more rigorous public debate on human
knowledge and public policy.
The practices of philosophy and rhetoric actually preceded and
gave birth to science as we know it.314 Philosophy was not only practiced
by most if not all scientists up until the 20th century, it was also
considered an important epistemological part of the two major branches
of early science, both “natural history” (biology, geology, and
anthropology) and “natural philosophy” (physics, chemistry, astronomy,
and mathematics). Scientists were called "natural philosophers" up until
the 19th century, during which the word "scientist," coined around 1834, gradually became the preferred professional designation.315 A noted 19th
century American physicist used to call scientifically developed
technology “philosophical toys.”316
But over the last century western philosophy has largely
forgotten its broader humanistic roots, as it became an academic
discipline in the western research university. As a professional discipline,
it has drastically narrowed its preoccupation to obscure and abstract
technical problems in the areas of logic and linguistics, which at its worst "tends to degenerate into logic-chopping and semantic quibbling."317 Many if not most 20th century philosophers have
burrowed into the isolated academic caverns of arcane professionalism,
renouncing any contact with, or obligation to, the larger human world and
its practical problems.318
However, not all philosophers lost touch with the everyday
concerns of the average human being. In the 1970s, Karl Popper
sardonically criticized his fellows, "it is very necessary these days to
apologize for being concerned with philosophy in any form whatever"
because "most professional philosophers seem to have lost touch with
reality...They get involved in scholasticism, in linguistic puzzles."319
Needless to say, this modern form of philosophy is not the type that I am
describing or advocating in this book.
I want to reexamine the older practices of ancient philosophy
and make a case for these traditional critical practices as more important
to the average human being than science. As Karl Popper acknowledged,
"We all have our philosophies, whether or not we are aware of this fact,
and our philosophies are not worth very much." However, he went on to
say, "This makes it necessary to try to improve our philosophies," not
abandon the practice of philosophy all together.320 A good philosophy
allows humans to critically examine, understand, and classify the
objective world in conjunction with the needs of human subjectivity and
the demands of human society. A good philosophy allows us to grasp the
possibilities of the objective world and the limitations of subjectivity,
while also helping us understand what experience means by assigning
value and spurring motivation.
But we must think in words and clearly articulate our thoughts to
others. Rhetoric allows us to effectively organize and communicate our
thoughts, both to ourselves and to others. Rhetoric also allows us to
effectively persuade other people about what is good, what is true, and
what is right, which is part of the reasoning process of human beings as
social animals.321 Philosophy and rhetoric (as I define them) are the core
skills of what has been labeled "General Education"322 or "Liberal
Education,"323 and these skills are essential for personal development, vital
societies, democratic governance, and sustainable economic growth.324
But we have lost sight of the importance of philosophy and
rhetoric. In the United States over the last century, the practices of
philosophy and rhetoric became professionalized as adjunct disciplines
focused on logic and linguistics. Both fields are now defined very
narrowly and routinely placed beyond the boundaries of ordinary human
beings.325 I am not talking about philosophy as it is currently practiced,
the so-called "analytical" philosophy derived from logical positivism in
the early 20th century. Instead, I am invoking a much older practice that
I believe to be a central part of what makes human beings human.326 The
philosopher Talbot Brewer explained, "The practice of philosophy first
emerged as a living answer to the question of what we humans are to do
with the peculiar power of thought and speech (logos) that distinguishes
us from other animals."327 Philosophy is an ancient and continual
practice that examines "the intrinsic value of living." It enables an
"understanding of oneself, one's world, and how best to live in that
world."328 As Henry David Thoreau explained, "To be a philosopher is
not merely to have subtle thoughts...but to love wisdom as to live
according to its dictates...It is to solve some of the problems of life, not
only theoretically, but practically."329 This ancient practice of philosophy
focuses on one central human question: How should we live?330
As Robert Nozick acknowledged, the older notion of philosophy
envelops the idea of the humanities, a field of knowledge that was
developed in Renaissance Europe. Philosophy "need not exclude" the
broader practices of human knowing and communicating. The ancient
practice of philosophy embodied diverse ways of knowing and living in
the world: "the modes of essayists, poets, novelists, or makers of other
symbolic structures, modes aiming at truth in different ways and at things
in addition to truth."331 Stephen Toulmin explained that philosophy
should be understood as "the systematic and methodical treatment of any
subject."332 Philosophy is a critical method for investigating and creating
"ordinary knowledge,"333 which people use to live.
This practical form of knowledge is the basis for, but not
substantially different from, more advanced forms of knowledge, like
science.334 The practice of rhetoric, and creative writing in particular, is a
philosophically inclined subject because it allows humans to understand
and communicate subjectivity and value, and these are necessary tools
that allow humans to participate in cultural activities, as well as create
personal narratives.335 At its foundation, philosophy is a critical
discussion and debate with others about "how one should live."336 The use of the tools of philosophy and rhetoric, according to William H. McNeill, "is,
indeed, what makes us human."337
Despite many advances in science and technology over the 20th
century, we are all still profoundly constrained by our subjectivity and
culture, and we always will be. While objective knowledge produced
through scientific methods is important, assigning meaning and value to
subjective experience and fully participating in our culture is more
important to most (if not all) of us. Life must have a point, even if it is a
subjective or cultural fiction, because we need our existence to be
"shaped, enriched, and laden with significance."338 However, we must
always be aware of our positional perspectives, the limits of language,
and the diversity of the world. We must learn to better examine our
subjectivity and culture through the fictional eyes of an "impartial
spectator," debate different reasonable versions of the truth, and arrive at
provisional conclusions that can help us make practical judgments.339
Charles E. Lindblom argued that "a person's empirical
understanding of the social world emerges not in the form of articulated
tested facts or theories but in the capacity of that person to appraise
judiciously, to form and express shifting judgments about, the confronted
confusing social world."340 While science is out of reach for most human
beings, philosophy is an ancient intellectual practice that can help us all
"appraise judiciously," as some of the first philosophers in ancient Greece
demonstrated long ago. Socrates, Plato, and Aristotle all taught that
human wisdom is based on human dialogue, reasoned debate, and
practical decision making based on tentative truth.341 Our educational
institutions need to focus more on these core practices. We cannot avoid
"the untidiness of human judgment."342 I believe that this is the first
principle of ancient philosophy.
The English word philosophy is derived from two ancient Greek
words: philo (“love”) and sophia (“wisdom”). In India the ancient
Sanskrit word for philosophy is darsana, which means using rational
thought to "see clearly."343 Philosophia and darsana are not only the
practice of knowing and seeing the world as it is, but they also embody
the virtue of loving wisdom, which is using knowledge and our
experience to live more meaningful, fruitful and sustainable lives. One
philosopher has defined wisdom as "what you need to understand in order
to live well and cope with the central problems and avoid the dangers in
the predicament(s) human beings find themselves in...to be wise, a person
not only must have knowledge and understanding - have wisdom, if you
will - but also use it and live it [author's emphasis]."344 The intellectual
historian Peter Gay explained, "philosophical thinking was at its best a
reliable method of procedure, a compass for the wilderness...Man is adrift
on a sea of ignorance and uncertainty, and philosophy is the only
seaworthy craft afloat."345
In ancient Greece, philosophy was practiced by sophoi (“wise
men”). These men created truth, lived their truth, and used rhetoric to
debate values and arrive at some collective notion of the good life. It is
important to remember that most sophists did not consider truth to be
singular, nor did they agree about what the good life entailed. Even
within the fairly homogenous culture of ancient Greece, there were
always competing truths, values, and ways of life. Sophists constantly
debated over different visions of the good, thus, they had to master
rhetoric as well as critical thinking. Socrates is perhaps the paradigmatic
sophist, given what we know about him through the stories of Plato.346
While Socrates created and lived his own truth, he also debated with
fellow citizens and sophists, never taking his own ideas too seriously and
always examining all ideas of goodness and truth. He took his vocation
as sophist so seriously that he even died for his pursuit of truth and his
way of life when it became unacceptable to society.347
Philosophy is a tool to better understand348 ourselves as human
beings and the world that we live in. Asking philosophical questions
allows us to investigate "the all but permanent ways in which we think,
decide, perceive, judge."349 As the physicist and philosopher David
Deutsch pointed out, "understanding does not depend on knowing a lot of
facts as such, but on having the right concepts, explanations and
theories."350 Karl Popper defined philosophy as a "rational discussion"
based on a "critical" analysis of the world we live in to understand it more
fully.351 Philosophy is a tool to help us see the world more clearly,
understand that world, and construct knowledge about the world to help
live better lives. Of course philosophical investigation is inherently
limited. While it is "guided by a sense of obligation towards the truth,"
what Michael Polanyi called "an effort to submit to reality,"352 it is not a
tool that can deliver truth. Philosophy can only approximate. Thus, rather than promising to discover unattainable truths, philosophy is a tool to
help us form judgments so that we can wisely choose the course of our
lives and create a sustainable future.
Rhetoric is a companion tool that helps us organize and
communicate our thoughts, and to participate in our culture. The "means
of communication" is a "technology of the intellect," as anthropologist
Jack Goody explained: "Our starting point must be that the acquisition of
language, which is an attribute of mankind alone, is basic to all social
institutions, to all normative behavior. Many writers have seen the
development of languages as a pre-requisite of thought itself."353 The
philosopher Robert Nozick explained the connection between the tools of
philosophy and rhetoric and individual human agency, "When we guide
our lives by our own pondered thoughts, it then is our life that we are
living, not someone else's. In this sense, the unexamined life is not lived
as fully...The understanding gained in examining a life itself comes to
permeate that life and direct its course."354 Philosophy is not just a
question of knowing what is, but also what is useful and what is good.355
While philosophy based on personal experience is not without
flaws and limitations, it is the best tool that most humans will ever
have.356 We use philosophy to understand, judge, set priorities, and often
to make trade-offs when our priorities conflict – all in an attempt to create
a better life. Science is a tool that will only be used by a few, but
philosophy is a tool “for everyman,” as Karl Jaspers once proclaimed.357
Aristotle famously stated in his treatise on Metaphysics, "All men by nature desire to know." Antonio Gramsci also argued that there is a
"'spontaneous philosophy' which is proper to everybody," thus, "all men
are 'philosophers.'"358 Albert Einstein explained, "The whole of science is
nothing more than a refinement of everyday thinking," and others like
Karl Popper and Max Planck concurred: "Scientific reasoning does not
differ from ordinary reasoning in kind, but merely in degree of refinement
and accuracy."359 Every human being engages in some form of
intellectual inquiry every day, however muted or mundane it may be.360

Refining our natural propensity to think and inquire, we can use
philosophy to gain better knowledge of the world and thereby, in conjunction with rhetoric, to conceptualize and communicate
epistemological "maps" and "theories" of the world that can guide our
personal lives and our societies toward future goals. Robert Nozick
explained philosophy as "a thoughtful view of what is important, a view
of...major ends and goals and of the means appropriate to reaching
these."361 Isaiah Berlin explained the importance of philosophy in terms
of deciding difficult moral and political issues: “We must decide as we
decide; moral risk cannot, at times, be avoided. All we can ask for is that
none of the relevant factors be ignored, that the purposes we seek to
realize should be seen as elements in a total form of life, which can be
enhanced or damaged by decisions.”362 A philosophy of experience and a
rhetoric of human motives are tools to help us make and communicate the
difficult decisions that we all must face. We learn as we live, we
communicate our lives, and we learn to live better. This should be the
foundation of all institutions of education, at one time appropriately
understood as the humanities.

II

What is Enlightenment?

Contested Knowledge
& the Limits of Human Nature

Chaos of thought and passion, all confused;
Still by himself abused, or disabused;
Created half to rise, and half to fall;
Great lord of all things, yet a prey to all;
Sole judge of truth, in endless error hurled:
The glory, jest, and riddle of the world!

Alexander Pope, An Essay on Man

Chapter 5
---
A History of Religion and Secularism

For most of human history the cosmos of the known universe
was populated by deities and devils in a simplistically ordered whole
guided by metaphysical principles and orally transmitted through myths.
Humans acted in broad ignorance of the actual mechanics of the objective
world, mixing local intelligence and tradition with fanciful belief in the
power of magic, prayer, and fate. Peter Gay once explained the "essence of the mythopoeic mind": "where the category of verification is absent,
there are no lies."363 Early philosophers in ancient Greece and India
began to use logos to challenge the antiquated logic of mythos, but
skeptics had little evidence to actually convince people to reject old
worldviews. Often critique was met with ridicule from the establishment,
or sometimes, as was the case with Socrates, a death sentence to preserve
public morality.364 The traditional epistemology of mythos was not fully
challenged until a new critical human endeavor called "science" was
invented in the 16th century, which allowed a new type of truth to be
fashioned based on empirical observation, innovative technology,
experimentation, and mathematical theory.365
But even as western science developed over the next several
hundred years, a foundational mythos of ancient humans was retained as a
central assumption. This great assumption of a rationally ordered
universe guided by immutable laws was largely unchallenged until the
end of the 19th century.366 Take, for example, William Graham Sumner,
who began his professional life as an Episcopal pastor but left the
ministry to become a social scientist at Yale in 1872. In his last sermon
Sumner noted the “philosophical skepticism” of modernity and the
brewing “conflict” between “traditional dogmas” and science. Sumner
explained his turn toward science in terms of looking for “an historical
revelation of spiritual and universal truths which has authority for
man.”367 Drawing from the cosmology of ancient mythos, scientists have
assumed that there was a singular, “rational” and “intelligible” order to
the universe that could be discovered by the human mind.
Up until the early 20th century, many scientists conceptualized
this singular order in association with traditional Judeo-Christian beliefs
about “God.”368 Thomas Hill, President of Harvard College in 1865,
claimed, “The ultimate ends of common sense, of philosophy, and of
science are the same. They may be summed up in one, - it is the reading
of God’s thought. The order of the universe is rational, intelligible…No
mind capable of scientific labor ever doubts that all phenomena are
subject to law, that is, that all phenomena succeed each other in an order
which can be understood and expressed in the formulae of human
speech.”369
But the unification of mythos (traditional, sacred reasoning) and
logos (logical, critical reasoning) in a singular conception of the cosmos
has not always been widely accepted because of the inherent ontological
and epistemological conflict between these two worldviews. These two
divergent ideologies have often been in conflict over the past 500 years,
especially in the Western world. During this time, the ancient logic of
mythos was rebranded as “religion,” and the analytical empiricism of
logos was labeled “secularism.” But both concepts of “secularism” and
“religion” sprang from the Judeo-Christian tradition as it developed in
Europe.370 Secular came from the Latin word saeculum, which literally
referred to a century or age.
But this concept was used by Western theologians to designate
ordinary, natural time (as opposed to the sacred time of God’s
metaphysical order).371 Religion came from the Latin word religio,
which referred to the rituals or behaviors linked to supernatural powers or
metaphysical reality. But as the Christian church began to consolidate an
orthodoxy, this concept was used by Western theologians, like Saint
Augustine, to designate the body of “true” knowledge or doctrine
contained in the Bible or propagated by church fathers.372 Beginning in
the 18th century, European scientists and humanists used these Judeo-
Christian distinctions to gradually wall off a new ontological space,
conceptualizing the “secular” world of strictly human affairs and natural
processes. As William T. Cavanaugh has argued, “’Religion’ as a
discrete category of human activity separable from ‘culture,’ ‘politics,’
and other areas of life is an invention of the modern West.”373
Secularism became a powerful intellectual force, part of the
ideological worldview of Deism, which was created by scientists and
humanists during the 18th century. Charles Taylor has called Deism “a
half-way house on the road to contemporary atheism.”374 As a quasi-
religion, Deism did not completely reject the existence of “God” or
metaphysics. Deists assumed that there was some larger metaphysical
reality outside the boundaries of the physical universe. However, using
the basic theory of logical parsimony, often called “Occam’s razor,”
Deists eliminated the notion of an all-powerful transcendent being
(“God”) operating and intervening in human history or the natural world
because it was redundant in the face of new scientific facts.
Instead, modern European Deists focused on empirical
observation, experimental science, and the monistic belief in natural
“laws” of the universe, which they used to re-order the ontological and
epistemological space of physical reality. Deists did not try to logically
disprove the existence of “God”, yet any religious notion would have to
accord with these new natural laws and prove itself valid within this new
empirical worldview.375 As the French philosophe Voltaire once
explained, "Almost everything that goes beyond the worship of a supreme
Being and the submission of one's heart to its eternal commands, is
superstition...We are all steeped in weaknesses and errors; let's forgive
each other our follies; that is the first law of nature."376
While Deists were largely agnostic on religious issues,
their belief in science effectively made metaphysical notions redundant.
This led to a general orientation of naturalism, which can be (and was)
seen as a form of atheism.377 However, even Voltaire held on to a deep
religious conviction. After the Lisbon earthquake in 1755 Voltaire wrote,
"All the subtleties of metaphysics will not make me doubt for a moment
the immortality of the soul or a beneficent Providence. I feel it, I believe
it, I want it, I hope for it, and I shall defend it to my last breath."378 As
Peter Gay argued, "Deism was in fact a last compromise with religion.
But it was not a compromise with mythopoeic thinking...The
disenchanted universe of the Enlightenment is a natural universe."379
Scientists and philosophers of the 18th and 19th centuries used an
emerging naturalist mentality to create a “new framework” based on a
“new rationalism,” which slowly secularized the modern Western world
up to the 21st century.380 Bit by bit, the physical universe, the natural
world, and human society were separated from metaphysical doctrines
and spiritual deities, although the underlying belief in a singular,
intelligible cosmos was retained. As historian Peter Gay explained, "The
old questions that Christianity had answered so fully for so many men
and so many centuries, had to be asked anew: What - as Kant put it - what
can I know? What ought I to do?"381 Perhaps the fullest expression of
this new scientific rationality took shape in 1843 when John Stuart Mill
published his six-book treatise, A System of Logic, Ratiocinative and
Inductive, Being a Connected View of the Principles of Evidence and the
Methods of Scientific Investigation. Mill argued that there were no "a
priori" or "self evident" truths, which were simply the reflection of "deep-
seated prejudices." Instead he argued that truth needed to be constructed
through the newly developed scientific method.382
Newly formulated concepts, such as “religion” and “secularism”
were used to signify the modern ontological divide that many Westerners
now take for granted. Thomas Jefferson famously went so far as to
proclaim a “wall” between the two spheres, although his rhetorical wall
has always been more symbolism than reality.383 The existence of a
secular sphere of knowledge and action slowly became accepted due to
political and scientific developments, but was not a reality until the 20th
century.384 Western societies became more and more diverse, ethnically
and religiously, and a secular public sphere was a tool to enable official
tolerance of all faiths. Also, advances in science and technology
validated the naturalist conception of the physical world and made many
older notions seem quaint fictions, like the traditional notion of a
disembodied "soul."
But there is a big problem with the whole notion of a
secular/religious divide that has been legally enshrined over the last
century – it is largely a fiction. While many people consider the 21st
century to be part of a “secular age,”385 the vast majority of human beings
still live in a pre-modern, mythological frame of mind, which has created
general skepticism about the meaning and predominance of secularism in
the modern world.386 The rise of secularism over the last five centuries
has not only coincided with the stability of pre-modern religious beliefs
and practices, but it has also seen the development of huge global
religions, such as Christianity, Islam, and Buddhism, as well as the
continual emergence of new religions.387 Plus, a new form of religion
has emerged, which Raymond Aron and John Gray have called
"secular religions," such as Marxism and Humanism, both of which are
"Christian heresies" because of their faith in progress and apocalyptic
millenarianism.388 John Gray has convincingly argued that the 21st
century is not only in the grip of traditional religions but also "modern
political religions" animated by various utopian projects.389
In the United States religion is arguably as strong and as
culturally significant as it always has been. At the end of the 19th
century 37 out of the 42 states in America legally "recognized the
authority of God in the preambles or texts of their constitutions."390 By
the 21st century around 90 percent of the citizens of the United States of
America still consider themselves “religious,” and even most of the non-
religious still have some spiritual beliefs.391 Perhaps most surprising,
around 39 percent of all scientists in the U.S. still consider themselves to
be religious, only slightly less than the 42 percent of believing U.S.
scientists polled in 1914.392 As Charles Taylor ironically explained, “The
secular age is schizophrenic.”393 Secularism and religion co-exist
awkwardly, and often in conflict, but not always.394
With the growing acceptability of secularism over the last
century, especially in Western Europe and the United States, there has
also been a growing tension between the rival ontological and
epistemological systems of “religion” and “science,” periodically
erupting into conflict. While science has been legitimated in the public
eye, largely due to the staggering advances in technology and the quality
of human life, the scientific worldview is still an embattled ideology in a
religiously dominated world, even in Western democracies. Some
scholars have questioned whether or not we need a “new model” for
describing the world, which is “neither exclusively secular nor
exclusively religious.” Martin E. Marty explained it succinctly: The
world “is religious. And secular. At the same time.”395 Modernity, as
Charles Taylor reminded us, is a “struggle between belief and
unbelief.”396 But this duality predates the modern period because belief
and unbelief have always existed in longstanding conflict.397 Will it
always be this way?
Skirmishes between secularism and religion began at the very
origins of Western science.398 In fact, if one takes a longer, global view,
“religious” beliefs have always been subject to the assaults of doubt, and
doubters have often been disdained by the moral majority.399 But the
battle began in full force with the European philosophers and scientists of
the 18th century, who promoted a spirit of “enlightenment” from the
shackles of antiquated myths, oppressive institutions (like the Roman
Catholic Church), and authoritarian monarchs. These philosophers and
scientists promoted rational thought, free exchange, objectivity, universal
natural laws, political democracy, and the ability “to provide permanent
solutions to all genuine problems of life or thought.”400 But from the
start, a rival tradition of “counter-enlightenment” philosophers, poets, and
theologians saw the new secular world-view of science as a reductionist
“murder” and “distortion” of the multifarious and “ineffable” nature of
reality. These thinkers bristled against the “total claim of the new
scientific method to dominate the entire field of human knowledge.”401
These thinkers also questioned the metaphysical assumption of “the
general progressive improvement of the world,” which of course was
based on making the world conform to the cultural institutions of Western
Europe.402
This battle expanded and became more acute in the 19th century.
There were several major scientific, philosophical, and political
developments that shook the very foundations of religious belief. This in
turn caused an aggressive reaction from traditionalists, which led to
various "culture wars" up into the 21st century.403 The four secular
developments included the new Biblical criticism of Friedrich
Schleiermacher (1768-1834) and David Strauss (1808-1874), the
publication of Darwin’s On the Origin of Species (1859), the proto-
sociologist Karl Marx’s (1818-1883) new political philosophy that saw
religion as the “opiate of the masses,” and the iconoclastic atheism of
Friedrich Nietzsche (1844-1900), who declared that God was dead. By
the end of the 19th century, the conceptual war between science and
religion became a popular cottage industry, the most notable publications
being John W. Draper’s History of the Conflict Between Religion and
Science (1875) and Andrew Dickson White’s A History of the Warfare of
Science with Theology (1896).404 And from then on culture wars have ripped
through Western societies every couple of decades.405
Of course not all scientists in the 19th and 20th centuries were
opposed to religion, nor did all scientists think that “science” and
“religion” were ontologically opposed. Remember, at the turn of the 20th
century around 40 percent of American scientists still considered
themselves religious. Three of the most notable scientists of the 20th
century who defended the notion of religion were Alfred North
Whitehead in Science and the Modern World (1925), Albert Einstein in
Ideas and Opinions (1954), and more recently, Stephen Jay Gould in
Leonardo’s Mountain of Clams and the Diet of Worms (1998) and Rocks
of Ages: Science and Religion in the Fullness of Life (1999). Both
Whitehead and Einstein believed in a metaphysical reality that was one
with the physical workings of the universe. Einstein argued for a
pantheist “God” (“Spinoza’s God”) who designed and worked through
the laws of nature.406 Gould, who referred to himself as a “Jewish
agnostic,” argued that religion and science were “nonoverlapping
magisteria.” In an effort to defuse the war between these worldviews,
Gould claimed that these two ontological categories had their own
“respective domains of professional expertise: science in the empirical
constitution of the universe, and religion in the search for proper ethical
values and the spiritual meaning of our lives.”407
The rise of scientific rationality and the growing legitimacy of
secularism also gave rise in the 19th century to the birth of Comparative
Religion, and later, Religious Studies. These academic disciplines sought
to study the phenomenon of “religion.” While the early proponents of
these disciplines were religious men who sought to bolster the importance
of religion in the modern world, the effects of these new disciplines
would only serve to widen the conflict between science and religion. The
main actor in this early intellectual enterprise was the European
philologist Max Mueller. He was greatly influenced by the universal
philosophy and theology of Georg Wilhelm Friedrich Hegel (1770-1831),
who postulated a universal metaphysical geist (spirit) behind the
workings of human history and at the center of all human religions.
Mueller’s most important work was Introduction to the Science
of Religion (1873). In it he claimed that humans have a “faculty of faith”
which they used to apprehend the divine and develop morality. Mueller
was also influenced by the liberal Protestant (and also Western
imperialist) agenda of the early ecumenical movement, which sought to
spread a largely Western European Protestant version of Christianity
across the world. Mueller saw the study of comparative religion as an
important resource for Christian missionaries spreading the gospel and
converting the non-western savage.408
Most of the scholars in the emergent discipline of Religious
Studies were theologians and philosophers whose explicit purpose was to
reconceptualize and validate traditional theological principles within the
newly christened “secular” world. Such scholars included Dutch
theologian P. D. Chantepie de la Saussaye, who coined the term
“phenomenology of religion.” He wanted to study the essence of religion
through its “empirical manifestations.” It also included the Jewish
Christian theologian Joachim Wach (1898-1955), a professor in the
Divinity School at the University of Chicago, who wanted to create a
“science” of religion.409
Wach later brought the European philosopher Mircea Eliade
(1907-1986) to the University of Chicago in the 1950s. Eliade would
become “one of the most influential theorists in religious studies in
modern times.”410 Eliade’s most famous book The Sacred and the Profane:
The Nature of Religion (1957) legitimated and consecrated the
secular/religious divide, but he did so in order to point out how humans
move from secular, profane existence into the “sacred space” of the
divine. He sought to explain the “religious experience” of “religious
man,” a being he termed “homo religiosus.” His central concept was the
“the sacred,” which he defined as “pre-eminently the real, at once power,
efficacity, the source of life and fecundity…objective reality…a cosmos.”
Eliade claimed that religious man would always be dissatisfied with
“profane experience,” and so he “opens” himself to the “divine” and
“makes himself, by approaching the divine models.”411
But these early intellectual movements were only quasi-
scientific at best. Much of the work in the study of religion during the
first half of the 20th century was either theology or philosophy, and this
work almost always was biased in favor of religious assumptions.412 The
actual scientific study of religion did not begin until the 20th century. The
fruits of this research would prove to be highly corrosive to traditional
religious claims and theological belief. The three most important
figures in the scientific study of religion were the German sociologist Max
Weber (1864-1920), the French sociologist Emile Durkheim (1858-
1917), and the social anthropologist E. E. Evans-Pritchard (1902-1973).
Weber was working on his magnum opus Wirtschaft und
Gesellschaft (Economy and Society) when he died in 1920. In the
Religionssoziologie (Sociology of Religion) part of this manuscript he
explained that the “essence of religion is not even our concern, as we
make it our task to study the conditions and effects of a particular type of
social behavior.” Weber argued that anything claiming to be religious
phenomenon should “not be set apart from the range of everyday
purposive conduct” because this phenomenon is primarily “oriented to
this world,” by which he meant the secular world.413
In The Elementary Forms of the Religious Life (1912) Durkheim
theorized that religion was primarily a social phenomenon that was
geared to the survival of human groups, and that it was also a strategy
used by political elites to maintain social order.414 When Durkheim used
the concept of “sacred,” Loyal Rue argues that what he really meant was
any “vital interest of the group.” Thus, “The gods are to be understood as
mere symbols personifying the transcendent reality of the group. The
central activity of religion is found in its ritual life, which creates social
solidarity and preserves the social order by reinforcing group
consciousness.”415
Evans-Pritchard’s Witchcraft, Oracles and Magic Among the
Azande (1937) was a groundbreaking book in the scientific study of
religion. As Pascal Boyer explains, “His book became a model for all
anthropologists because it did not stop at cataloguing strange beliefs. It
showed you, with the help of innumerable details, how sensible those
beliefs were, once you understood the particular standpoint of the people
who expressed them and the particular questions those beliefs were
supposed to answer.”416
Although the field of Religious Studies remains plagued by the
bias and vacuity of the central concept of “religion”,417 the study of ritual,
belief, institutions, ethics, and human consciousness in sociology,
anthropology, cognitive psychology, and biology have led to new
developments in our understanding of religion as a social and
psychological phenomenon. Some of the more notable include:
philosopher Daniel C. Dennett’s Darwin’s Dangerous Idea: Evolution
and the Meanings of Life (1995) and Breaking the Spell: Religion as a
Natural Phenomenon (2006); philosopher Loyal Rue’s Religion Is Not
About God: How Spiritual Traditions Nurture Our Biological Nature
(2005), and anthropologist Pascal Boyer’s Religion Explained: The
Evolutionary Origins of Religious Thought (2001). There have also been
new discoveries of evidence in history and archaeology, which have led
to more accurate understanding of existing religious traditions and the
meanings of sacred documents. In the study of Christianity, some of the
more notable include: John Dominic Crossan’s The Historical Jesus
(1993) and The Birth of Christianity (1999); Donald Harman Akenson’s
Saint Saul: A Skeleton Key to the Historical Jesus (2002); and Bart D.
Ehrman’s Lost Christianities (2005), The New Testament: A Historical
Introduction to the Early Christian Writings (2007), and Misquoting
Jesus: The Story Behind Who Changed the Bible and Why (2007).
For those who follow these recent developments, there is little
space left for any theist claims, as traditionally defined by the major
world religions. Religious prophets have been demystified and
humanized. Religious traditions have been historicized within specific
cultural, temporal, and geographical domains. Religious texts have been
deconstructed both in terms of historicizing the writing and critically
analyzing the meaning of sacred texts, and also in terms of reconstructing
the heterodox collection of texts that were actively suppressed by more
powerful orthodox traditions. As philosopher Loyal Rue explains, “The
question of God’s existence simply doesn’t come into the business of
understanding religious phenomena…belief is the thing, not the reality of
any objects of belief.”418
And there have been recent scientific findings that have falsified
theist-favoring claims of “nonoverlapping magisteria” and the special
religious province of morality. Anthropologists, cognitive scientists, and
evolutionary psychologists have shown that morality has a biological and
social basis that predates the development of religion. Pascal Boyer
summarizes what we now know of the origins of morality, “Having
concepts of gods and spirits does not really make moral rules more
compelling but it sometimes makes them more intelligible. So we do not
have gods because that makes society function. We have gods in part
because we have the mental equipment that makes society possible but
we cannot always understand how society functions…We can explain
religion by describing how these various [human] capacities get recruited,
how they contribute to the features of religion that we find in so many
different cultures. We do not need to assume that there is a special way
of functioning that occurs only when processing religious thoughts…this
notion of religion as a special domain is not just unfounded but in fact
rather ethnocentric.”419
These recent scientific developments have led many people to
proclaim that belief in God is “obsolete.” Christopher Hitchens sees the
concept of God as passé, an old-fashioned notion based on primitive
superstition: “The original problem with religion is that it is our first, and
worst, attempt at explanation. It is how we came up with answers before
we had any evidence. It belongs to the terrified childhood of our
species…reason and logic reject god.”420 Evolutionary psychologist
Steven Pinker argues, “The deeper we probe these questions, and the
more we learn about the world in which we live, the less reason there is to
believe in God.”421 Based on the available scientific evidence, physicist
Victor J. Stenger argues that “the only creator that seems possible is the
one Einstein abhorred – the God who plays dice with the universe…Yet
there is no evidence that God pokes his finger in anyplace.”422 Thus, this
has led many scientists to proclaim that “God” is a failed and obsolete
“hypothesis.” The evolutionary biologist Richard Dawkins goes so far as
to claim that those who continue to believe in God, despite the evidence
against such a being, are deceived by a “pernicious delusion.”423
The large tide of scientific evidence over the twentieth century
has also emboldened a group of atheists, who have tried to rehabilitate
this old epithet and make it a respectable intellectual and moral
position.424 However, it should be acknowledged that there has been a
bold tradition of atheism and agnosticism in the west for a very long time.
Perhaps two of the most noted atheists in the 18th and 19th centuries
were the philosopher David Hume (1711 - 1776) and the poet Percy
Bysshe Shelley (1792 - 1822). Shelley went so far as to publish a tract
called "The Necessity of Atheism" (1811) in which he declared, "There is
no God...Every reflecting mind must acknowledge that there is no proof
of the existence of a Deity. God is an hypothesis."425 Robert Green
Ingersoll (1833-99) famously argued, "While I am opposed to all
orthodox creeds, I have a creed myself; and my creed is this. Happiness
is the only good."426 The philosopher Bertrand Russell (1872 - 1970) was
an outspoken atheist who called "all the great religions of the world" both
"untrue and harmful." He called religion a "delusion born of terror," and
he argued, "The knowledge exists by which universal happiness can be
secured; the chief obstacle to its utilization for that purpose is the
teaching of religion."427
In the United States the number of people claiming “no religion”
has grown from 2.7 percent of the population in 1957 to about 15 percent
in recent polls. But only about 10 percent say they are “neither spiritual
nor religious,”428 and even less, about 5 percent of the population, claim
to “not believe in God.” Of this 5 percent, only a quarter of this small
minority (about 1.6 percent of the total population) identify themselves as
“atheists,” although some researchers put the number of atheists and
agnostics around a miniscule 0.2 percent.429 In this very religious
country, there still remain deep prejudices against professed atheists,
although these prejudices are not as violently expressed as a century ago.
In 1876 the New York Sun called Robert Green Ingersoll an "outrage,"
and a "moral pestilence" that should be "exterminate[d]" by "hang[ing]
the mortal carrion in chains upon a cross beam."430 In 1941 Bertrand
Russell was prevented from assuming a teaching post at the College of
the City of New York because he was called a "propagandist against both
religion and morality," a "philosophical anarchist and moral nihilist," and
a "professor of immorality and irreligion." George V. Harvey went so far
as to say that "colleges would either be godly colleges, American
colleges, or they would be closed."431
While this violent type of moral indignation and hatred against
atheists is rarely published any more, atheists are still a despised
minority. In a recent poll conducted in 2003, an average of 41 percent of
Americans (and 63 percent of white evangelical Protestants) said that
they would not vote for an atheist running for president, even if that
person has received their political party’s presidential nomination.432 An
American serviceman had to be sent home early from Iraq because of the
threats he received from other soldiers, and even his commanding officer,
after he tried to hold a meeting for “atheists and freethinkers” while
serving in a warzone.433 Atheists remain a very small minority; however,
a number of new atheists have sketched out a platform to promote non-
belief and “atheist pride.”434 These new atheists, raising an ideology of
"militant modern atheism,"435 have delivered some bold attacks against
religion and fanaticism, while defending the intellectual and moral merits
of atheism. Remarkably, many of these new atheists have also
acknowledged the legitimacy of certain conceptions of “God” and spiritual
practice, leaving open the possibility of a new form of religious
humanism.
Christopher Hitchens claims that because of the continuing
“fanaticism and tyranny” of religious people, atheism has “moral
superiority” because it allows people to use reason independently of
“dogma” and form more logical and moral solutions to the world’s
problems.436 The neuroscientist Sam Harris agrees with Hitchens. He
argues that religious people who hold beliefs with “no evidence” are
“mad,” “psychotic,” “delusional” and show signs of “mental illness.”
This makes religious people very “dangerous” because they are
“ignorant” of reality and prone to violence.437 Harris claims that the
major enemy of the 21st century is not any specific religious tradition or
person, the real enemy is “faith itself” – the “evil of religious faith.”438
Thus, without faith, humans can take “a rational approach to ethics” and
logically solve the world’s problems.439 But Harris does not deny that
people have “spiritual experience[s],” nor does he find anything wrong
with “meditation” or “mysticism,” as long as it is a “rational enterprise”
grounded on sound principles and evidence.440
Richard Dawkins explains that an atheist is simply a
“philosophical naturalist” who “believes there is nothing beyond the
natural, physical world, no supernatural creative intelligence lurking
behind the observable universe, no soul that outlasts the body and no
miracles – except in the sense of natural phenomena that we don’t yet
understand.”441 But like Harris, Dawkins is open to some forms of
spirituality. He does not deny the possibility of Spinoza’s “God,”
Einstein’s “God,” or the other naturalistic definitions of “God” made by
“enlightened scientists.”442 The notion of God just becomes a metaphor
for the wonders of the natural universe. Instead of traditional notions of
“God,” Dawkins prescribes “a good dose of science” and the “honest and
systematic endeavor to find out the truth about the real world.”443
But traditional theists refuse to give up much ground to scientific
developments or to the claims of new atheists. Recent responses include
John F. Haught’s God and the New Atheism: A Critical Response to
Dawkins, Harris, and Hitchens (2007) and Karen Armstrong’s A Case for
God (2009). Even some self-professed atheists acknowledge that religion
will always have the upper hand. Robert Sapolsky, a neurologist at
Stanford University, argues that while “science is the best explanatory
system that we have,” that doesn’t mean “that it can explain everything,
or that it can vanquish the unknowable.”444 There are many perennial
questions about the human condition that science cannot completely
answer or remediate, like how to find "meaning" in life, how to find a
satisfying identity, and how to deal with death. Questions such as these,
Christopher R. Beha has pointed out, "won't simply go away," and the
new atheists have not formulated any convincing answers. Beha laments,
"One is mostly left noticing how much scientism takes from us, and how
little it offers in return...For most of us, 'Get over it!' doesn't qualify as a
satisfactory response to the 'persistent questions.'"445 Because of this
philosophical gap, most still recognize that religious belief offers
"something that science does not.” Sapolsky calls it “ecstasy,”446 but one
could also add emotional comfort, meaningful identity, and motivational
purpose.
Thus, in light of the entrenched position of both secularism and
religion, and the growing conflict between these two worldviews, some
people have asked if there can be common ground, rather than more
culture wars. Robert D. Putnam and David E. Campbell believe that
Americans have "a high degree of tolerance" because our the diversity of
religion "produces a jumble of relationships among people," often within
families, and people learn to live "peacefully" with religious conflict.447
Rather than just co-exist, many scholars and intellectuals want to find
ways to bridge this conflict. Georgetown University theologian John F.
Haught has proposed a “conversation” between science and religion, as
has philosopher Daniel C. Dennett.448
There have been some notable debates between science and
theology over the subject of the historical Jesus.449 But there have been
few real conversations or dialogues between people with secular and
religious views, and most of these have been academic exchanges that
have had limited appeal to the broader public. One of the most
remarkable academic debates was between William Lane Craig and
Quentin Smith in Theism, Atheism, and Big Bang Cosmology (1993).
The two philosophers debated back and forth the significance of the Big
Bang from their opposing religious and secular viewpoints. Another
more comprehensive example is the edited collection, Contemporary
Debates in the Philosophy of Religion (2004). In a not so formally
academic exchange, the atheist Christopher Hitchens has publicly
debated several religious believers, including William Lane Craig.450
Sadly one does not find more of this kind of exchange, especially geared
toward engaging the broader public.451
But not everyone agrees that such conversations are possible, let
alone trying to find common ground. James Davison Hunter argued that
culture wars cannot be resolved and conversations cannot take place: "Is
it not impossible to speak to someone who does not share the same moral
language? Gesture, maybe; pantomime, possible. But the kind of
communication that builds on mutual understanding of opposing and
contradictory claims on the world? That would seem impossible."452
Philosopher Richard Rorty argues that dialogue won't work because
religion is often a "conversation-stopper" for most people, especially for
those believers who hold onto absolutes that cannot be compromised.453
If the past 5,000 years of human history tell us anything, it is that
the reality of the human condition points toward a continued need for
religious belief; thus, there will probably never be a secular world.
power of cultural tradition is too great, the legitimacy of existing
institutions is too strong, the quality of public education is too
impoverished, and the impact of forward-looking personalities is too
small. The majority of human beings will never lose their religion, nor
will they adopt a secular, scientific worldview. Secularism will always
be embattled, and religious differences will sometimes still lead to
violence. The world of diverse, conflicting cultures and viewpoints that
we have inherited from our ancestors will be passed along to our children.
As the philosopher Isaiah Berlin once explained, “Surely it is not
necessary to dramatize these simple truths, which are by now, if anything,
too familiar, in order to remember that the purposes, the ultimate ends of
life, pursued by men are many, even within one culture and generation;
that some of these come into conflict, and lead to clashes between
societies, parties, individuals, and not least within individuals themselves;
and furthermore that the ends of one age and country differ widely from
those of other times and other outlooks.” But just because we live in a
diverse and conflicting world, it does not “preclude us from sharing
common assumptions, sufficient for some communication with [others],
for some degree of understanding and being understood. This common
ground is what is correctly called objective…Where there is no choice
there is no anxiety; and a happy release from responsibility. Some human
beings have always preferred the peace of imprisonment, a contented
security, a sense of having at last found one’s proper place in the cosmos,
to the painful conflicts and perplexities of the disordered freedom of the
world beyond the walls.”454
The hope of humanity lies within our ability to practice "old-
fashioned toleration."455 Essentially this means finding ways to
communicate our differences, seeking to understand others and be
understood in turn, and where at all possible, finding common ground on
which to cohabitate in a shared world. Even Pope John Paul II
recognized this hope: "Our respect for the culture of others is...rooted in
our respect for each community's attempt to answer the question of
human life...every culture has something to teach us about...that complex
truth. Thus the 'difference' which some find so threatening can, through
respectful dialogue, become the source of a deeper understanding of the
mystery of human existence."456
From a secular perspective, John Gray has also made the same
basic point, "Oddly enough, we will find that it is by tolerating our
differences that we come to discover how much we have in common."457
Engaging in respectful dialogue is no doubt a difficult and dangerous
endeavor, but it is the responsibility of all those who claim the ethical
courage to build a better world. “We can do no better,” Daniel C.
Dennett argues, “than to sit down and reason together, a political process
of mutual persuasion and education that we can try to conduct in good
faith.”458 But in order to do so, argues Charles Taylor, “Both sides need a
good dose of humility, that is, realism. If the encounter between faith and
humanism is carried through in this spirit, we find that both sides are
fragilized; and the issue is rather reshaped in a new form: not who has the
final decisive argument in its armory…Rather, it appears as a matter of
who can respond most profoundly and convincingly to what are
ultimately commonly felt dilemmas.”459 But even when solutions are
found to our common problems, “We don’t just decide once and for
all…it is only a continuing open exchange.”460
There is hope that such public dialogue is possible. There have
been several moments in human history where diverse viewpoints were
peacefully exchanged in public debate. The ancient Greeks provided an
open forum for believers and skeptics to openly discuss their ideas,
although the death of Socrates marked the limits of permissible speech.461
In the third century BCE, the Indian Emperor Ashoka instituted
“Buddhist councils” that created an open arena for diverse parties to
argue over religious principles and practices.462 In the late 16th century
the Muslim Mughal Emperor Akbar the Great, ruling over Hindus,
Muslims, Christians, Parsees, Jains, Jews, and atheists in the Indian sub-
continent, instituted state religious neutrality and public dialogue between
representatives of the different faiths.463
Mongke Khan, the ruler of the vast Mongolian empire in the 13th
century, not only supported Genghis Khan’s initial policy of religious
neutrality and tolerance of all faiths, he also instituted public debates over
religion. Individuals would try to refute opposing religious doctrines or
practices in front of three judges: a Christian, a Muslim, and a Buddhist.
Contestants could only use rhetoric and logic to persuade the judges.
Common ground was rarely reached and “no side seemed to convince the
other of anything,” but each debate ended, as most Mongol celebrations
did, with an excess of alcohol, merriment, and “everyone simply too
drunk to continue.”464
The philosophes of the European Enlightenment used the ancient
philosophical method of dialogue to investigate religious dogma and
political traditions. Gottfried Wilhelm Freiherr von Leibniz tried to
orchestrate an ecumenical conference to bring together the warring
religious factions of Christendom in the aftermath of the Thirty Years'
War. In the 1690s he exchanged many letters with the Roman Catholic
Bishop Bossuet towards this end until it became clear that the Bishop was only
trying to convert the heretic Leibniz.465 David Hume used the method of
dialogue in his landmark book, Dialogues Concerning Natural Religion
(1779), published after his death and without attribution because of the
radical nature of dialogue during a time of religious and political
absolutism. Gotthold Ephraim Lessing's play Nathan the Wise (1779)
was focused on the dialogue between a Jew, a Muslim, and a Christian.
In it he tried to preach tolerance for different faiths and viewpoints,
ending with the maxim "Little children, love one another."466 Peter Gay
explained the power of this method, the philosophes' favorite
philosophical tool: "The philosophes, for their part, could exploit the
potentialities of dialogue fully, to propound the most outrageous
hypotheses for the sake, not of refutation, but of serious consideration, to
dramatize the constructive role of criticism, to record their own
education, their struggles and uncertainties, and by recording them,
educate their readers."467
Political discourse in the United States of America is another
example of the institutionalization of dialogue, public debate, and the
peaceful exchange of ideas. However, it took several hundred years for
this rhetorical arena to open up for all peoples in this country. For most
of this country’s history not everyone was free to debate, and at several
key junctures heated debates have erupted into violent conflict, and even
war. But more and more minorities have stood up to be recognized and
have articulated their need for equal rights. The idea of American
democracy remains to this day an unsettled and contested ideological
terrain – the contours of which remain divisive and ever changing.468
James A. Banks has argued that a major problem facing modern,
multicultural nations is “how to recognize and legitimize difference and
yet construct an overarching national identity that incorporates the voices,
experiences, and hopes of the diverse groups that compose it.”469 I think
Gerald Graff has offered the only lasting solution. He argued that
educators should show students that “culture itself is a debate” and,
thereby, “teach the conflicts” that define our American culture both past
and present: “Acknowledging that culture is a debate rather than a
monologue does not prevent us from energetically fighting for the truth of
our own convictions. On the contrary, when truth is disputed, we can seek
it only by entering the debate.”470
In many ways the ideal of democracy can be seen as a peaceful
yet heated discussion conducted by diverse human beings with different
viewpoints trying to convince each other with words on the best way to
organize a society. President Barack Obama has described American
democracy “not as a house to be built, but as a conversation to be had.”
Obama argues that Americans need to join the “conversation” of
America, which is “a ‘deliberative democracy’ in which all citizens are
required to engage in a process of testing their ideas against an external
reality, persuading others of their point of view, and building shifting
alliances of consent.”471 The philosopher Amy Gutmann has praised the
“virtue” of deliberation by which important questions facing society are
discussed and argued peacefully amongst the equal participants of a
democratic nation. Often the most important questions cannot be
completely answered, nor can agreement always be found, but Gutmann
stressed, “We can do better to try instead to find the fairest ways for
reconciling our disagreements, and for enriching our collective life by
democratically debating them.”472
The focus of education in a diverse culture should be on
dialogue, the sustained attempt to peacefully discuss ideas while
establishing mutual bonds of goodwill, friendship, and a common
humanity. This is a difficult and potentially dangerous task, but it is
worth the risk.473 There should also be a focus on reasoned argument and
clear thinking. It is important to get beyond what Amartya Sen calls
"disengaged toleration."474 Thus, beliefs and claims are questioned so as
to investigate the truth, yet the old notion of a singular truth must be
discarded since one's idea of truth is most likely embedded in a complex
nexus of values and priorities.475 The purpose of this book is not
necessarily to convince anyone with a “final decisive argument.” Rather
this book is about sharing reasoned arguments in an open, friendly
environment, seeking to be heard, respected and understood, while also
searching for “commonly felt dilemmas,” mutual interest, and shared
values.476 The focus of education should be on the public dialogue
between diverse participants, who have all come together in a mutual
project to not only share divergent ideas, but ultimately, to engage our
differences, tolerate diversity, and peacefully share an ever unfolding and
changing world.477

Chapter 6
---
What is Enlightenment?

Human beings have defined education very broadly for thousands of
years. One of the oldest concepts of education, found in
diverse human societies around the globe, has been the idea of
"enlightenment." From our earliest records, human beings have
explained education in terms of illuminating the darkness of the human
condition, overcoming fear of the unknown through the light of human
understanding, which paves the way for liberation and purposive action
(often against the oppressive power of the gods, kings or social
institutions). In the tradition in which I was raised, the western concept of
enlightenment can be traced to the very origins of philosophy in Ancient
Greece. Later in the 17th and 18th centuries, European philosophers
referred to their time as an "age of enlightenment" because they were
expanding the horizons of human knowledge through both a rediscovery
of ancient knowledge and through new forms of intellectual inquiry that
would be called science. Then in the late 19th and early 20th centuries,
European and American philosophers re-examined the ancient western
conceptions of enlightenment in conjunction with the early developments
of modern science, while at the same time discovering other ancient
traditions of human enlightenment found in eastern and middle-eastern
philosophy from China, India, Persia, Palestine, and Japan.
During the 20th century, the western world had more sustained
and reciprocal cultural contact with eastern cultures and their traditions of
knowledge. This cultural exchange enabled eastern philosophers, poets,
and religious teachers to have a larger impact on western philosophy
through translations of eastern texts and also through direct contact, as
many European scientists visited the Asian sub-continent and some
eastern mystics and philosophers began to visit Europe and America. The
20th century also saw scientists from around the globe asking classically
philosophical questions about the nature of reality and the human
condition, thus, redrawing the boundaries of human knowledge by using a
wide array of empirical data and modern scientific theories. At the dawn
of the 21st century we have access to an unprecedented amount of
knowledge, three thousand years of global philosophy and several
hundred years of empirical scientific study. The challenge for 21st
century education is synthesizing the diverse traditions of global
knowledge about the human condition into a coherent intellectual
framework. One way of doing this is by focusing back on the ancient
idea of enlightenment and critically analyzing it in relation to the past
three thousand years of human history.
At the center of the ancient concept of enlightenment are three
interrelated ideas: (1) clarifying human perception, (2) knowing the "real"
world as accurately as possible, and (3) using this knowledge to freely
act, thereby living more successfully and meaningfully. At root this
concept equates knowledge of reality with human freedom. The idea is
that more complete knowledge of reality enables individuals to
disentangle from restrictive determinants in order to more freely and
completely act. Furthermore, the enlightenment of the individual is often
assumed to advance the goals and harmony of human society, which
would ultimately lead to a "perfect" utopian culture. John Gray calls this
later goal "the Enlightenment project," which historically has been a deep
set religious impulse of Europe and America, whereby people have
placed faith in the unlimited capacity for human perfection and material
progress.478
From the earliest historical records, people have been asking a
recurring set of perennial questions: What is reality? How do we know?
How shall we live together? What does it mean to be human? I want to
attempt to give a short, critical history of the concept of enlightenment,
which I think provides a broad, coherent, multicultural, and meaningful
key to understanding the promise of education for the 21st century. An
educational philosophy based on the principle of enlightenment would
encourage not only the pursuit of knowledge, but also the possibility of
increased freedom. It would also warn people about the various
constraints of being human that limit our freedom and our knowledge, the
understanding of which would help enable a more pragmatic ideal of
human possibility.
For most of human history the cosmos of the known universe
was populated by deities and devils in a simplistically ordered whole,
which we now refer to as myths. As Henry Adams poetically noted,
"Chaos was the law of nature; order was the dream of man."479 Across
the globe, humans acted in broad ignorance of the objective world,
mixing local intelligence and tradition with fanciful belief in the power of
magic, prayer, and fate. Human beings seemed to have some small
measure of power over their lives, but they were often at the mercy of the
larger physical environment and chaotic natural processes, which were
beyond comprehension, let alone human control. Mythological stories of
gods, spirits, and heroes enabled humans to make some sense of a world
they couldn't really know, and these stories gave human beings meaning
and hope, which enabled them to survive and eventually thrive.
Early philosophers in ancient Greece, and later in the Roman
imperium, began to challenge some of these myths through dialogue and
critical reasoning, but these ancient intellectuals had little evidence to
actually convince people.480 Socrates (c 469-399 BCE) was perhaps the
most famous example. He argued that humans should admit when they
are ignorant,481 they should debate the nature of reality, and they should
"test the truth" of all ideas.482 Socrates was eventually condemned to
death because he upset his society by questioning the gods and the
traditional explanations of the cosmos, but even at his trial he was
without remorse: "I say that it is the greatest good for a man to discuss
virtue every day...conversing and testing myself and others, for the
unexamined life is not worth living."483 Socrates' student Plato (428-347
BCE) famously extended this central insight into a parable about a cave.
In this story, the enlightened philosopher realizes individuals are bound to
cultural truths, which turn out to be nothing but manipulations, shadows
on a wall. So the philosopher escapes the cave and becomes enlightened
by the sunny real world, which symbolizes the actual truth of existence.484
Several ancient eastern philosophers also criticized the myths of
society and encouraged people to critically evaluate their perceptions to
understand the deeper truths of reality. In ancient India, Gautama
Buddha (c 563-483 BCE) taught that "this world has become blinded"
because the average person's perception was "polluted." Only a few
could "see insightfully." He taught his followers to "make a lamp for
yourself," to purify their vision, and to "become a wise one" by "knowing
full well, the mind well freed."485 A later follower of the Buddha
described the central teaching of Buddhism as self knowledge, which
incidentally was also the core teaching of Socrates: "When the cloud of
ignorance disappears, the infinity of the heavens is manifested, where we
see for the first time into the nature of our own being."486
A contemporary of the Buddha in ancient China also developed
a tradition of enlightenment. The philosopher K'ung Fu-tzu (c 551-479
BCE), known in the west as Confucius, tried to teach his students
"insight" by "understand[ing] through dark silence." He encouraged his
students to "study as if you'll never know enough, as if you're afraid of
losing it all."487 A later follower, Meng Tzu (c 372-289 BCE) or
Mencius, told a story about the revered master and his dedication to
knowledge:

Adept Kung asked Confucius, "And are you a great sage, Master?"
"I couldn't make such a claim," replied Confucius. "I learn relentlessly
and teach relentlessly, that's all."
At this, Adept Kung said, "To learn relentlessly is wisdom, and to teach
relentlessly is Humanity. To master wisdom and Humanity - isn't that
to be a sage?"488

Confucius always pleaded a profound ignorance about life, as did
Socrates, who famously claimed, "All I know is that I know nothing."
For Socrates, Buddha, Confucius, and Mencius, the root of
human wisdom was to "relentlessly" learn and teach, perfecting the art of
human living through a constant dedication to the practice of education,
all the while professing a profound humility about the knowledge and
wisdom they actually possessed.489 As philosopher Owen Flanagan has
summarized:

In separate places between the fourth and sixth centuries BCE, Plato
and the Buddha describe the human predicament
as involving living in darkness or in dreamland - amid
shadows and illusions. The aim is to gain wisdom, to see
things truthfully, and this is depicted by both as involving
finding the place where things are properly illuminated and
thus seen as they really are.490

These ancient teachers lived their own philosophies and tried to embody
wisdom. They also preserved a canon of traditional texts that taught
about the enlightened actions of gods, heroes, and sages. They wanted to
teach not only about knowledge, but also a better way of life.
While the old epistemology of myth and tradition was
questioned by a long line of western and eastern philosophers,491 it was
not completely challenged until the birth of western science. However,
early modern scientists displayed a continuity with ancient wisdom in the
continued focus on enlightenment, albeit a new form of enlightenment
based on new epistemological methods. Johannes Kepler (1571-1630)
and Galileo Galilei (1564-1642) ushered in what has been called the
“Copernican” revolution with the “new art of experimental science.”
This new science allowed a new type of truth to be fashioned based on
empirical observation, innovative technology (like the telescope),
experimentation, and mathematical theory.492 At the same time
philosophers of science developed new scientific methods to logically
explore and confirm a new type of truth based on logic and empirical
evidence.493 Francis Bacon (1561-1626) popularized inductive logic and
Rene Descartes (1596-1650) developed physical reductionism and
analytical geometry.494 Isaac Newton (1643-1727) unified these
developments, creating modern science by synthesizing experimental
empiricism with more effective logical methods. He not only developed
the calculus, in conjunction with the German philosopher Gottfried
Leibniz (1646-1716), but Newton also further developed the application
of mathematical logic to the study of the physical world. Newton
unequivocally replaced the divinely governed geocentric chain of being
with a new empirical conception of a heliocentric universe governed by
natural laws. Newton wanted "to define the motions" of all physical
objects "by exact laws allowing of convenient calculation.”495

76
As the Newtonian revolution reached across Europe it inspired
many philosophers to conduct wide-ranging empirical studies of the
natural world and human society. This period of European history was
called the "age of enlightenment" by various philosophers of that era.496
By the end of the 18th century, the popularity of the term was critically examined
by Immanuel Kant (1724-1804). In an essay written in 1784, Kant asked
"What is Enlightenment?" He explained that many human beings had
finally reached a state of intellectual maturity, whereby they were able to
understand the world and themselves through direct evidence and without
recourse to myths or traditions. While the majority of humans were still
mired down by "laziness and cowardice," there were a brave few who
were "think[ing] for [them]selves," "cultivating their spirit," developing
"rational" thought, and spreading the "spirit of freedom." Kant argued
that Europeans were not living "in an enlightened age," but they were
living in "an age of enlightenment," by which he meant an age where
humans were gradually becoming more and more enlightened. Kant
believed that if the ethos of this age were allowed to run its course, it
would slowly spread to the public at large, although he was skeptical
about whether the "great unthinking masses" could ever get beyond their
subjective "prejudices" and achieve "a true reform in one's way of
thinking."497
By the 19th century, both Newton’s theoretical methods and his
belief in rationally ordered natural laws spread across Europe and
America, as more and more philosophers and early scientists empirically
studied the physical and social world. Most of these enlightened
Europeans thought they could collect and assemble a vast array of
empirical data, which would thereby transparently reveal the secret
workings of Newton's rational laws. These early scientists believed that a
small number of hidden natural laws ordered and determined not only the
physical universe, but also human society and individual action. The
invention of the Encyclopedia in the 18th century embodied this simple
assumption, as it was based on medieval compilations which purported to
contain all the knowledge a person needed to know, like the Bible and the
Speculum Mundi.498 For over a millennium, the learned men of Europe
thought one book could hold everything a society needed to know. The
new scientific literature merely replaced the Bible as the single source of
intellectual and moral authority, but it did not challenge the simplistic
monism of a transparent natural law.
This assumption of transparent, rational, transcendent, natural
laws would stand unchallenged until the late 19th and early 20th century.
During this time two later scientific revolutions corrected Newton's
simplified view of the world and ushered in a new, more fruitful and
frightful age of enlightenment. The great revolution in science during the
19th century was Charles Darwin’s (1809-1882) theory of natural
selection, which has been called "evolution." Darwin's theory delivered
a crippling blow to both anthropocentric beliefs about the uniqueness of
human beings as a species and to deep set assumptions of the
"progressive" advancement of human civilization. As the philosopher
Friedrich Nietzsche explained, Darwin was able to "translate man back
into nature."499 Darwin helped historicize and naturalize human
knowledge, as most people still believed timeless supernatural forces
to be the source of all change on Earth. Basically Darwin’s theory of
natural selection described all life metaphorically as a dense
interconnected tree. The trunk represented a primordial common ancestor
of all life on Earth, which over billions of years branched out into
millions of new forms through a process of blind, natural selection. New
organisms arise due to random genetic modification over long periods of
time as they struggle to adapt to changing environments. Certain
modifications are “naturally selected” by the external environment
because these random changes allow the organism to better survive, mate,
and reproduce.500
Darwin’s theory enabled not only a “new science,” giving rise to
fields like ecology and evolutionary biology, but it also created a “new
way of thinking,” which focused on the dense interconnection,
interdependence, and historically conditioned form of all life on Earth.501
Ernst Mayr claimed that Darwin "caused a greater upheaval in man's
thinking than any other scientific advance since the rebirth of science in
the Renaissance.”502 I. Bernard Cohen, a pioneering historian of science,
argued, “The Darwinian revolution was probably the most significant
revolution that has ever occurred in the sciences.”

Its effects and influences were significant in many different areas of thought and belief. The consequence of this
revolution was a systematic rethinking of the nature of the
world, of man, and of human institutions. The Darwinian
revolution entailed new views of the world as a dynamic and
evolving, rather than static, system, and of human society as
developing in an evolutionary pattern…The new Darwinian
outlook denied any cosmic teleology and held that evolution
is not a process leading to a “better” or “more perfect” type
but rather a series of stages in which reproductive success
occurs in individuals with characters best suited to the
particular conditions of their environment – and so also for
societies.503

Darwin's "dangerous idea" completely naturalized and secularized the known world, completing the dream of Socrates and the Buddha, as the
long assault on metaphysical belief finally rendered obsolete all notions
of an "enchanted cosmos."504
However, modern advances in science have not completely
unraveled the mystery of the physical universe. In fact, the most recent
scientific revolution has unveiled a strange, unpredictable cosmos filled
with wondrous and frightening possibilities. Relativity theory and
quantum mechanics ushered in the unsettling notions of relativity and
probability. The Newtonian framework of physical science was based on
the idea of a limited number of absolute laws of the universe that
determined all matter and motion in discrete, observable, and rational
patterns. But Albert Einstein’s (1879-1955) theory of relativity and
Werner Heisenberg's (1901-1976) theory of quantum mechanics (and
more recent developments in string theory) would revise the Newtonian
assumption of simple laws and predictable patterns of motion, although
many of these patterns seem to hold for the largest bodies of observable
phenomena, like planets and solar systems.
Instead of a discrete, well-ordered universe of rational laws,
Heisenberg discovered that physical matter at its most basic level is
dynamic, chaotic, random, and wholly uncertain. Physicists are still
debating the very nature, and thereby the names, of the building blocks of
matter. Because the physical universe is in constant chaotic motion and the
exact position or course of any sub-atomic particle is uncertain, the vantage-
point of any subjective observer plays a role in trying to objectively
observe, record, and understand data. The implications of this revolution
disordered basic scientific assumptions, such as objectivity, physical
laws, predictability, and positive forms of knowledge. Even Einstein was
concerned about his own discoveries, later in his life turning toward a
unifying theory of everything because he believed that “God does not
play dice with the universe.”505
But Einstein's later reaction represented the last throes of an old
Western belief in a fundamental order to reality that was both unchanging
and discretely knowable. As David Lindley explained, "Heisenberg
didn't introduce uncertainty into science. What he changed, and
profoundly so, was its very nature and meaning. It had always seemed a
vanquishable foe...The bottom line, at any rate, seems to be that facts are
not the simple, hard things they were supposed to be."506 There is a new,
profound truth that physical and social scientists are finally coming to
terms with. The objective world that we inhabit is not singular, nor is it
governed by simplistic and unchanging laws that determine everything
according to a singular metaphysical clockwork. Further, in the wake of
Gödel's incompleteness theorems and the Church-Turing thesis, scientists have had to admit
that not every facet of reality is amenable to human intelligence.507
Even more so than Darwin's theory of natural selection, the idea
of uncertainty has been perhaps the most profound and disquieting
discovery in human history. The irrational and ingrained assumption of a
singular, simplistic, and immutable order of the universe was not only the
“great myth” of the ancient world, but it was also the “central dogma” at
the foundation of the Western Enlightenment and the birth of modern
science. From the ancient Greek, Chinese, and Indian sages to the
modern world of science, philosophers and scientists have believed in a
"fundamental" principle of rational "order."508 Friedrich Nietzsche was
one of the first philosophers to call this monistic belief in a rational order
"a metaphysical faith," which Nietzsche connected to "a faith millennia
old, the Christian faith, which was also Plato's, that God is truth, that truth
is divine."509
According to the philosopher and historian of ideas Isaiah Berlin
(1909-1997), “One of the deepest assumptions of Western political
thought is the doctrine, scarcely questioned during its long ascendancy,
that there exists some single principle which not only regulates the course
of the sun and the stars, but prescribes their proper behavior to all animate
creatures…This doctrine, in one version or another, has dominated
European thought since Plato…This unifying monistic pattern is at the
very heart of the traditional rationalism, religious and atheistic,
metaphysical and scientific, transcendental and naturalistic, that has been
characteristic of Western civilization.”510
The breakdown of traditional monistic sources of authority
began with the political philosophy of Niccolò Machiavelli (1469-
1527) and was later questioned by counter-Enlightenment romantic
philosophers. But it was not until the late 19th century, and more fully
during the 20th century, that the assumed singular “Truth” of human
rationality was fatally assaulted and finally pronounced dead by a
tradition of radical European philosophers, from Friedrich Nietzsche
(1844-1900) and Martin Heidegger (1889-1976) to Michel Foucault
(1926-1984) and Jacques Derrida (1930-2004). This European strain of
philosophy also impacted and co-existed with an American school of
thought called Pragmatism, which argued against singular truths in favor
of epistemological and ontological plurality, from William James (1842-
1910) and George Herbert Mead (1863-1931) to John Dewey (1859-
1952) and Richard Rorty (1931-2007).
Upon closer examination, the myth of a singular, immutable
physical Law proved to be no less quaint than the geocentric conception
of the universe or the belief in a singular omniscient deity that sat at the
edge of reality manipulating the strings of myriad human marionettes. It
was only in the last century, and really in the past twenty-five years, that
this myth of a singular order of the universe has been fully exposed as
false - although many scientists still cling to this assumption. Discussing
the aim of science in 1957, Karl Popper admitted, "the conditions
obtaining almost everywhere in the universe make the discovery of
structural laws of the kind we are seeking - and thus the attainment of
'scientific knowledge' - almost impossible."511 Many physical scientists
(although not all)512 now admit that reality is not a singular, static system
governed by universal laws. Instead, the objective world is understood as
a complex, interdependent ecology composed of multiple dynamic levels,
with many complex systems continually producing emergent qualities
that are difficult to understand because they are more than the sum of
their parts.513 The physical world is a complex "open system" composed
of "pluralistic" domains, which are interrelated and in constant flux.514
Some have ceased calling the total expanse of reality the universe and
instead refer to it as a "multiverse."515 This notion of a complex web of
interdependent life was acknowledged by Darwin in the 1860s and later
caused a paradigm shift in the physical and social sciences beginning in
the 1890s.516 Henry Adams noted the breakdown of the old order and the
beginning of a new, more chaotic universe at the turn of the 20th
century.517 However, the older monistic paradigm continues to be a
powerful assumption, as it has only been partially eliminated from current
scientific theory and practice.518
The old notion of "universal laws," assumed by classical
scientists, has largely been proven a fiction that does not accurately
describe the physical world.519 For some scientists, the over-turning of
this old dogma is tantamount to a "crisis of faith."520 Over the past
century there has been a widespread acknowledgement of the "delusion
of the universal," which has led to a "re-conceptualization of science."521
Based on a broad reading of scientific discoveries over the full range of
physical, biological, and human sciences, it is clear that the objective
world consists of multiple, interrelated, interdependent, fluctuating, and
evolving levels of reality: from the smallest elements at the sub-atomic
level to the atomic level, the molecular level, the chemical level, leading
up to the organism level, ranging from very simple to very complex
organisms, to various biological social-groups which compose larger
societies, to the many ecosystems across the planet, to the global level, to
our solar system, and to the larger galactic systems all the way to the edge
of the expanding universe.522 The Nobel Prize winning chemist Ilya
Prigogine is one of many 20th century scientists who recognized the
inherent plurality of the objective world: "Nature speaks with a thousand
voices, and we have only begun to listen."523

Various possible languages and points of view about the system may be complementary. They all deal with the same
reality, but it is impossible to reduce them to one single
description. The irreducible plurality of perspectives on the
same reality expresses the impossibility of a divine point of
view from which the whole of reality is visible...emphasizing
the wealth of reality, which overflows any single language,
any single logical structure. Each language can express only
part of reality.524

In order to be studied, the objective world has to be broken down into these many conceptual levels of analysis (and even many dimensions of
space and time, as string theorists have been arguing), each with their
own special properties and emergent functions, co-existing in a dense
ecological web of life. Each level needs its own "regional theory,"525
with accompanying analytical concepts and methods, which are all
analytical fictions that scientists create to practically label and know an
isolated part of a dense, interconnected reality that is not entirely knowable,
let alone predictable or controllable.526
All levels and dimensions of this dense web of reality are
connected together and dependent upon each other, deriving their very
substance out of the complex interplay of invisibly interwoven ecological
relationships. Most of the levels and dimensions of the objective world
cannot be directly perceived by the human mind and we rely on complex
technologies to catch a glimpse. Some of these levels and dimensions
cannot even be fathomed, let alone conceptualized into a coherent theory,
as string theorists have found at the sub-atomic level. As Alan
Lightman pointed out, "We are living in a universe incalculable by
science."527 As a species, we do not have the innate capacities to fully
conceive, let alone know the complex reality of the objective world in its
totality. It is completely beyond us, and perhaps always will be unless we
can develop some kind of super technology that will enhance our
cognitive abilities. This is a profound truth that most scientists have not
yet fully acknowledged, as many practicing scientists seem to believe that
humans have an unlimited capacity for rationality, knowledge, and
technological progress.528
We need to constantly remind ourselves, in the words of Paul
Ricoeur, that humanity is both an "exalted subject" and a "humiliated
subject."529 Human beings are not gods with infinite powers of reason
and will. They are in fact quite limited in their abilities to know and
freely act. And when humans do act, they are often animated by various
motivations and ideals that conflict with each other, forcing hard
decisions over which good should prevail. We are also continually
plagued by the unforeseen consequences of our short-sighted decisions.
This old notion of human limits was the bedrock of many conservative
philosophers and the literary genre of tragedy: from Sophocles (497-406
BCE) and Euripides (480-406 BCE) to Marcus Aurelius (121-180 CE)
and St. Augustine (354-430 CE) to Blaise Pascal (1623-1662) and
Edmund Burke (1729-1797) up to contemporary thinkers Isaiah Berlin
(1909-1997) and John Gray (1948-).
All of these insightful philosophers, in the most expansive sense
of this term, warned against transgressing traditional boundaries out of a
blind pride in human knowledge and will.530 And this notion of human
limits is not confined to conservative thought. There has also been a
tragic strain of liberalism that recognizes the complexity of reality, the
conflicting diversity of human goods, and the constraints bounding
human rationality.531 For most of human history there has been
widespread doubt about the ability of human beings to understand and
control the social and physical environment, which would be the
foundation of what we call "freedom." The ancient Greeks even used a
special word for those who dared to defy the traditional order of the
universe and try to act freely: hubris.
But with the development of critical philosophy, and later, the
methods of logic and the empirical sciences, humans began to believe that
they possessed special powers that could overcome inherent limitations of
the species and the rest of the biological world. And to some extent, with
the new tools of science and technology, they did. Thus, using a revised
concept of enlightenment, 18th and 19th century philosophers believed
that increased knowledge would allow humans to control the natural
world and their own destiny. This was the dream of early modern science
and the European age of enlightenment. John Gray notes that this
philosophy of human progress was a "secular version" of the Christian
faith it sought to replace.532
The great apotheosis of this belief in human rationality and
progress came in the philosophy of Georg W. F. Hegel (1770-1831). He
believed that there was a fundamental "rational process" to the natural
world. This rationality slowly revealed itself through history via the
unique creation of human beings who gradually perfected their ability to
know and freely act: "The History of the world is none other than the
progress of the consciousness of Freedom...we know that all men
absolutely (man as man) are free."533 This led Hegel to famously claim
that "What is rational is actual and what is actual is rational," by which he
meant whatever exists is rational and right and must be accepted as the
inherent unfolding of universal laws that unequivocally govern all life -
laws which just so happen to have expressed themselves through the
progressive perfection of human beings and human society.534
But as already discussed, the universe turned out to be much
more complex and strange than humans had always imagined it, and
human beings turned out to be much more flawed than enlightenment
philosophers assumed. Thus, in the 20th century, modern intellectuals
had to revise the older optimistic vision of enlightenment humanism to
deal with the very real biological, social, and physical constraints of the
human condition, not the least of which was admitting the savage, self-
destructive capabilities of human beings. Far from a progressive
unfolding of rationality and freedom, as Hegel suggested, the 20th
century seemed to ominously hint that humans might utterly destroy
themselves and their planet.
Early in the century the Irish poet W. B. Yeats warned that the
"ceremony of innocence" of enlightenment humanism would be
"drowned" because the "blood-dimmed tide" of human irrationalism had
been "loosed" upon the world. Human beings were revealing their true
nature as "rough beast[s]" whose self-inflicted apocalypse had "come
round at last."535 Later, in the midst of the second World War, the
English poet W. H. Auden noted that "Evil is unspectacular and always
human."536 The 20th century would remind humans of their animal
nature ("human, all too human," as Nietzsche sighed537), which gave rise
to another form of frightful uncertainty. If humanity did not have the
capacity for enlightenment and freedom, then what hope for a better
world?
Michel Foucault (1926-1984) was one of the most profound 20th
century philosophers who dealt with the paradoxical predicament of
eclipsed enlightenment principles in the "post-modern" world. In an
unpublished manuscript Foucault revisited the classic essay of Immanuel
Kant and asked again, "What is enlightenment?" Foucault started by
noting that Kant had described Aufklärung (enlightenment) as an Ausgang,
an "exit" or a "way out," by which Kant had meant a way out of
the limitations of a traditional humanity ruled by irrational myths and the
cruelty of our animal instincts. Kant wanted humans to "escape" from
tradition and human imperfection, to "dare to know," and thereby to
create the condition for enlightenment and human freedom which were
yet still only ideals. The implications of this directive were at once
personal and also social. Human freedom would take not only an
individual act of will, but also a political revolution to free humans from
the "despotism" of traditional sources of authority.538 This led
philosopher Allan Bloom to declare that the European enlightenment was
the "first philosophically inspired 'movement'" that was both "a
theoretical school" and a "political force at the same time."539
Yet Kant was not a revolutionary and he was very distrustful of
the notion of democracy. At the same time that Kant had written his
essay, enlightenment notions of human freedom inspired an
unprecedented wave of democratic revolutions that would sweep the
globe.540 The principles of enlightenment meant "political activism" and
"the transformation of society," the basic tenets of progressivism, or the
political Left.541 Radicals like Thomas Paine believed that we as human
beings "have it in our power to begin the world over again."542 The first
enlightenment inspired revolution happened in the North American
English colonies. The founding fathers of the United States of America
saw themselves as an enlightened vanguard who were "spreading light
and knowledge" to the rest of the world. They wanted to defeat the
tyranny of monarchical despotism and free themselves to fulfill the
promise of enlightenment principles.543 John Adams (1735-1826) saw
America as "the opening of a grand scene and design in Providence for
the illumination of the ignorant, and the emancipation of the slavish part
of mankind all over the earth."544 Thomas Jefferson (1743-1826)
declared that "all men are created equal; that they are endowed by their
Creator with certain inalienable Rights; that among these are life, liberty
& the pursuit of happiness."545
These ideas spread across the globe and inspired oppressed
people yearning to be free. One notable group of young European
intellectuals in the early 19th century were especially inspired by these
enlightenment ideals. They were called the Young Hegelians, after their
famous teacher Georg W.F. Hegel. Believing in Hegel's theory of a
progressive rational order in human history, this group of radical
philosophers wanted to put enlightenment principles into practice in order
to revolutionize the whole world, giving freedom and knowledge to all
people.546 The most famous and influential intellectual among this group
was Karl Marx (1818-1883). He argued, "The philosophers have only
interpreted the world, in various ways; the point is to change it."547 In
order to change the world, Marx introduced what he saw as the
intellectual and political culmination of Hegel's world-spirit (Weltgeist)
of enlightenment humanism. This would be a philosophy Marx called
"communism," which he saw as "the solution of the riddle of history."548
Harkening back to both Hegel and to older notions of eastern
enlightenment, Marx wanted a "reform of consciousness."549 He wanted
human beings to "give up their illusions about their condition" and to
"give up a condition that requires illusions."550 Marx and his later
followers, taking a page from the American and French revolutions,
wanted to use science and technology to free the exploited peoples of the
world, both intellectually and politically, so as to create a global utopia of
free and enlightened human beings.
But the 20th century saw the corruption and destruction of this
radical social and political hope.551 The rapid increase in scientific
knowledge and technology, as John Gray has pointed out, left the human
species "as they have always been - weak, savage and in thrall to every
kind of fantasy and delusion."552 The revolutionary spirit of American
and French democracy (among others) was blocked and slowly aborted
by entrenched conservative elites and the manipulation and coercion of
the uneducated masses (and in the French case, the use of terror), all of
which led enlightenment humanists to latch their progressive idealism
onto authoritarian autocrats or illiberal bureaucratic states.553 Marx's
revolutionary socialism and the revolt of the proletariat smashed naively
and recklessly across the world, devolving into fascism, authoritarian
states,554 and various ethnic holocausts.555
Also, the very notions of western enlightenment and human
freedom were desecrated by the vicious global imperialism of Europe and
America. These global empires plundered the riches of the world and
subjugated the majority of the people on Earth, leaving most humans as
little more than colonial slaves, and then these empires stood back as
former colonies devolved into perpetual war and genocide.556 Finally, by
mid century, the two dominant empires on the earth were locked in a
Cold War with nuclear weapons on hair triggers, threatening to destroy
all life on Earth in mutual assured destruction.557 During the long 20th
century, the ancient dream of enlightenment seemed to have been
shredded at the cruel hands of a barbarous humanity tearing itself to
pieces.558 One Jewish survivor of the Nazi death camp Auschwitz argued
that humans would have to admit that they live in "a monstrous world,
that monsters do exist" and that "along-side of [enlightenment] Cartesian
logic there existed the logic of the [Nazi] SS": "It happened [the
holocaust], therefore it can happen again."559 Lawrence L. Langer
declared, "the idea of human dignity could never be the same again."560
In revisiting the simplicity of Kant's text on enlightenment,
Michel Foucault asked whether or not a great existential "rupture" had
taken place in the 20th century. Foucault questioned Kant's idea of
intellectual maturity and doubted that humans could ever reach this state.
While humans had more knowledge about themselves and their world, it
was a "limited" and "always partial" knowledge. Foucault argued that we
needed to give up the notion of "complete and definitive knowledge," and
instead we needed to focus on how we have been shaped by historical
circumstance and how we have acted upon those circumstances within
certain "limits." Foucault wanted to more fully understand the constraints
and possibilities of the human condition. But with this notion of limited
knowledge and action, Foucault cautioned that we must not lose our
"faith in Enlightenment" because as an ideal it pushes us to know more,
to better ourselves, and to try to better our society.561
Isaiah Berlin (1909-1997) agreed with this revised version of
enlightenment. Berlin argued that our knowledge of the world and of
ourselves in the late 20th century brought about two basic truths: reality is
much more complex than we ever imagined, and we as human beings are
much more flawed than we ever before admitted. Berlin explained that
there was "too much that we do not know" and that "our wills and the
means at our disposal may not be efficacious enough to overcome these
unknown factors."562 As Friedrich Nietzsche once pondered, "When you
look long into an abyss, the abyss also looks into you."563 The
developing practice of science had brought about great gains in
knowledge and technology, but it had also further enabled the brutal
tendencies of human beings to destroy themselves and their environment.
Our great increase in knowledge and technology has not enabled greater
human freedom for most people, nor has it solved the intractable
problems of our paradoxical nature and society.564 The history of human
folly has not ended, as some idealistic philosophers have assumed.565
In the early 20th century, one of the most educated and
technologically advanced human societies caused a global war and
engineered the efficient murder of millions. Writing just before the Nazis
democratically rose to power, Sigmund Freud was very pessimistic
concerning human beings’ ability to use knowledge and technology
toward noble ends in an effort to rise above our biological constraints: "Men
have gained control over the forces of nature to such an extent that with
their help they would have no difficulty in exterminating one another to
the last man."566 About a century later, John Gray came to the same
conclusion: "If anything about the present century is certain, it is that the
power conferred on 'humanity' by new technologies will be used to
commit atrocious crimes against it."567 After World War II and the
horrors of the holocaust, Raymond Aron caustically pointed out that
humans would "like to escape from their history, a 'great' history written
in letters of blood. But others, by the hundreds of millions, are taking it
up for the first time, or coming back to it."568 In his moral history of the
20th century, Jonathan Glover forcefully emphasized that "we need to
look hard and clearly at some monsters inside us."569
At the start of the 21st century humans face not only continuing
warfare, poverty, disease, and outbreaks of genocide, but also a new
looming catastrophe, the environmental destruction of the planet, which
could actually bring about Freud's fateful apocalypse, the extinction of the
entire human species.570 John Gray went so far as to call humans a
"plague animal" and all but warned that our demise as a species was
imminent.571 Amartya Sen, on the other hand, has acknowledged the
great strides humanity has taken since the atrocities of the first and
second World Wars, while noting that we still "live in a world with
remarkable deprivation, destitution and oppression."572 Surveying
previous human societies that have destroyed themselves, the geographer Jared
Diamond said he remains a "cautious optimist" on the capacity of human
beings in the 21st century to learn from the past in order to solve the
looming environmental crisis and the socio-political upheaval it will
cause. He explained, "we have the opportunity to learn from the mistakes
of distant peoples and past peoples. That's an opportunity that no past
society enjoyed to such a degree."573
Given the tragedy of the 20th century, I would agree with
Foucault that the concept of enlightenment needs to be revised. As John
Gray has perceptively pointed out, "We live today amid the dim ruins of
the Enlightenment project, which was the ruling project of the modern
period."574 We now realize that enlightenment is not blind faith in
knowledge combined with the false hope of unlimited progress. Instead,
we must focus on the constrained possibilities and limitations of being
human, which includes our limited capacity to know, act, and shape our
world. Some philosophers call this position of limited freedom a "soft
determinism."575 The philosopher and psychologist Erich Fromm (1900-
1980) explained the new promise of this revised concept of
enlightenment: "We are determined by forces outside of our conscious
selves, and by passions and interests which direct us behind our backs.
Inasmuch as this is the case, we are not free. But we can emerge from
this bondage and enlarge the realm of freedom by becoming fully aware
of reality, and hence of necessity, by giving up illusions, and by
transforming ourselves from somnambulistic, unfree, determined,
dependent, passive persons into awakened, aware, active, independent
ones."576 It is important to note that Fromm said "enlarge the realm of
freedom" instead of perpetuating the naive myth of being completely free
to determine our destiny in any way we please.
Recent scientific discoveries in cognitive psychology, sociology,
evolutionary psychology, and socio-biology lend credence to this new
concept of enlightenment as limited rationality and constrained freedom,
which was developed philosophically in the 20th century by Foucault,
Berlin, and Fromm, among others. The philosopher of science Daniel C.
Dennett argues that modern science has proven a biological, social, and
environmental determinism that shapes and constrains both individual
action and human society. He also argues that life is fundamentally
chaotic and "random," which means that humans have limited
predictive powers to understand the present and plan for the future.
John Gray has taken these facts to an extreme and argued that
humans are just "deluded animals" because they "think they are free,
conscious beings."577 But determinism and chance do not entirely rule
out freedom and volition; they merely circumscribe them. "Free will is real,"
Dennett claims, "but it is not a preexisting feature of our existence, like
the law of gravity. It is also not what tradition declares it to be: a God-
like power to exempt oneself from the causal fabric of the physical world.
It is an evolved creation of human activity and beliefs, and it is just as
real as such other human creations as music and money." Free will exists
because we as humans believe it to exist and we act according to this
belief, just like we believe that certain kinds of colored paper allow us to
purchase goods and services - not because of any inherent property in
ourselves (or in the colored paper), but because we say it exists, believe it
exists, and institutionally structure our society around this belief. We do
have the power to alter our reality through ideas.578
But our capacity for free will is also grounded in our biology
from which we will never entirely escape.579 We know that much of who
we are as humans, including our behavior, is largely determined by our
genes. But as Matt Ridley pointed out, "Genes are often thought of as
constraints on the adaptability of human behavior. The reverse is true.
They do not constrain; they enable."580 As a species we are uniquely
endowed with the ability to learn and enhance our lives with knowledge
and technology collectively stored in our culture. We have evolved from
an animal-like state into what Dennett calls "informavores,"
"epistemically hungry seekers of information."581 We use this
information to create knowledge, which in turn is used to better
understand our world and improve our condition. While we are
biologically, socially and environmentally determined, our human nature
is not fixed. We can change. Not only can we use information to change
our environment, but we can also build tools, like eye glasses, penicillin
or computers, that compensate for biological deficiencies or maladies and
give us greater power over our human nature.582 This is the realm of
human culture, which separates us from all other species of life on
Earth.583
Sigmund Freud famously called humans a "kind of prosthetic
God," due to our technological advances over our environment and our
biological bodies.584 Karl Popper believed that human evolution in the
20th century was no longer being driven by biology but by
technology.585 Some scientists studying our human genes argue that we
will soon be able to manipulate the very building blocks of our biology,
allowing for a new "directed evolution."586 Other scientists in the field of
cybernetics and artificial intelligence are even trying to create a new kind
of human, blending technology into our biology to produce the "cyborg"
and the "human-machine civilization," which they optimistically say
promises not only greater knowledge and freedom, but also
immortality.587
E. O. Wilson has claimed that mastering our genetic code could
bring a new form of "volitional evolution," which would allow humans to
be "godlike" and "take control" of our "ultimate fate."588 Mark Lynas has
gone so far as to call us "the God species."589 While their visions might
vary, all of these Cornucopian idealists believe in a form of
"technofideism," i.e. a "blind faith" that innovative technology will solve
all of the world's problems.590 Francis Fukuyama bluntly declared,
"Technology makes possible the limitless accumulation of wealth, and
thus the satisfaction of an ever-expanding set of human desires."591 The
economist Julian Simon believes "the material conditions of life will
continue to get better for most people, in most countries, most of the time,
indefinitely."592 The marvel of current technological wonders aside, the
verdict is still out on just how far technology can take us as a species.
But one thing is sure, technology is no panacea for all of the human-
created problems on planet Earth, let alone naturally occurring problems,
like earthquakes, hurricanes, volcanoes, and the odd asteroid smashing
into our planet.
But these idealistic visions of human grandeur are not entirely
misconceived. Even though we are determined as a species, like all other
biological organisms on this planet, our unique evolutionary adaptations
and cultural development have enabled "a degree of freedom," which we
can exploit for our own improvement. Language, abstract thought, and
culture in particular are tools unique to human beings.593 How far we can
change is dependent on many factors: our personal knowledge and
experience, our vast store of cultural knowledge, the advancement of our
technology, and our environment, which includes both physical resources
and constraints, and also social resources and constraints, like access to
money and political power.
Rene Dubos pointed out, "Man's ability to transform his life by
social evolution and especially by social revolutions thus stands in sharp
contrast to the conservatism of the social insects, which can change their
ways only through the extremely slow processes of biological
evolution."594 Thus, we can use our language, our critical thinking, and a
vast store of cultural and physical resources to partially create ourselves
and our social reality, albeit within certain fixed biological, physical, and
social limits.595 Perhaps science and technology might push those limits
farther than we can now imagine, but there will always be limits. Naively
believing that science and technology will erase those limits is a "modern
fantasy."596 Knowledge is freedom; that much is true. However, due to
the "imperfect rationality" and constrained abilities of our species, caused
by various determining factors in our biology and environment, ours will
forever be a bounded, limited and imperfect freedom.597
Thus, we need to reconceive the concept of enlightenment and
human nature through a more biological and ecological understanding of
physical life in terms of "nature via nurture."598 The physicist Fritjof
Capra has pointed out, "cognition is the very process of
life...Cognition...is not a representation of an independently existing
world, but rather a continual bringing forth of a world through the
process of living...'To live is to know.'"599 This insight has led some
scientists, like the entomologist Edward O. Wilson, the philosopher
Daniel C. Dennett, and the cognitive psychologist Steven Pinker, to argue
that it is time to revisit the ancient notion of "human nature" in order to
formulate "a realistic, biologically informed humanism," which combined
with our knowledge of human culture would be the newest phase of
human enlightenment.600 At the center of our biological nature is the
process of cognition, how we know: "reasoning, intelligence, imagination,
and creativity are forms of information processing." But we as humans
do not know in isolation, we utilize other people and our culture, which is
"a pool of technological and social innovations that people accumulate to
help them live their lives...culture is a tool for living."601
We need to understand how our brain "evolved fallible yet
intelligent mechanisms" to perceive and understand reality, as discussed
in the first chapter of this book. We also need to understand how the
"world is a heterogeneous place," and how we as humans are "equipped
with different kinds of intuitions and logics" to apprehend and understand
different aspects of reality. There is no one way to know, just like there is
no one thing to know. One of the most important ways of knowing is
through the tool we call an "idea," which is a complex network of
meaningful information put into language that we use to better know our
world and to more wisely act. Human beings are a species "that literally
lives by the power of ideas."602 Philosophers merely extend this
"personal commitment to ideas" a bit further than the average human, but
the practice of philosophy is an inherent trait that all human beings share
(whether they consciously realize this or not).603
We need to also realize that we are not always in complete
control of our ideas or the technologies they enable. Ideas have a
tangible, objective reality that makes them a constitutive part of human
culture and subjectivity.604 Many ideas grow beyond the mind of their
originator, spread to other minds, and evolve through particular cultures
through historical processes to become what social scientists call
"institutions."605 Institutions are the self-evident and often taken for
granted social structures, both ideological and organizational, found
within particular human societies. The social structure of an institution
can be described as the organized ideas and procedures that pattern
particular social practices.606 These institutional procedures can also be
described as the "working rules" governing a particular social practice,607
which can be thought of as a "rule-structured situation."608 While we do
create our own ideas and institutions, at some point they begin to take on
a life of their own and create us. The early historical-sociologists, Karl
Marx, Emile Durkheim, and Max Weber, each studied different social
structures of constituting rule systems in Western society. They wanted
to understand the underlying logics that established and maintained the
modern world: the social, political, economic, and religious rules,
organizations, procedures, rituals, and ideas that ordered societies.
Recent institutional theorists have complicated older notions of
institutions, which were often overly simplistic and monistic, because we
now know that institutions are “rich structures” of diverse, overlapping,
and often conflicting patterns of social practice.609
The concept of social institutions bridges a social-scientific
dualism that has been unresolved for the past century. At the center of
the social sciences has been a central debate over how societies and social
institutions are constituted and how they change. Societies and social
institutions can be seen, on the one hand, as the "product of human
design” and the outcome of “purposive” human action. However, they
can also be seen as the “result of human activity,” but “not necessarily the
product of conscious design.” One of the paradigmatic examples of this
dualism is language. Human beings are born into a particular
language with pre-defined words and a pre-designed grammar; however,
individual human beings are also able to adopt new languages, create new
words, and change the existing definition of words or grammatical
structures.
But is any individual or group of individuals in conscious
control of any particular language? The obvious answer is no, but each
individual has some measure of effect, yet just how much effect is subject
to debate. For the past quarter century or so, scholars have rejected the
idea that societies, institutions, and organizations can be reduced to the
rational decisions of individuals, although purposive individuals do play a
role. The new theory of institutions focuses on larger units of analysis,
like social groups and organizations “that cannot be reduced to
aggregations or direct consequences of individual’s attributes or
motives.” Individuals do constitute and perpetuate social structures and
institutions, but they do so not as completely or as freely as they
believe.610
The new institutional theory has focused mainly on how social
organizations have been the locus of “institutionalization,” which is the
formation and perpetuation of social institutions. While groups of human
beings create and sustain social organizations, these organizations
develop through time into structures that resist individual human control.
Organizations also take on a life of their own that sometimes defies the
intentions of those human beings directing the organization. While
institutions can sometimes begin with the rational planning of individuals,
the preservation and stability of institutions through “path dependent”
processes (what we generally call “history”) is often predicated on
ritualized routines, social conventions, norms, and myths.
Once an institution becomes “institutionalized,” the social
structure perpetuates a “stickiness” that makes the structure “resistant” to
change. Individual human actors, thereby, become enveloped and
controlled by the organization’s self-reinforcing social norms, rules, and
explanatory myths, which are solidified through positive feedback
mechanisms that transcend any particular human individual. These
organizational phenomena, thereby, shape individual human perception,
constrain individual agency, and constitute individual action. As one
institutional theorist has argued, all human “actors and their interests are
institutionally constructed.” To a certain extent humans do create
institutions and organizations, but more immediately over the course of
history, institutions and organizations create us. Many millions of
individuals have consciously shaped the English language, but as a child I
was constituted as an English speaking person without my knowledge or
consent. It is perhaps more accurate to say that English allowed for the
creation of my individuality than it is to say that my individual
subjectivity shaped the institution of English.611
But if all human thought and action is constituted by previously
existing institutions, do human beings really have any freedom to shape
their lives or change society? This is actually a very hard question to
answer and it has been the center of many scientific debates over the past
century. Durkheim and Parsons seemed to solidify a sociology that left
no room for individual volition. Marx stressed human control, but
seemed to put agency in the hands of groups, not individuals. Weber
discussed the possibility of individual agency, especially for charismatic
leaders, but he emphasized how human volition was always “caged” by
institutions and social organizations. Michel Foucault conceptualized
human beings as almost enslaved by the various modern institutions of
prisons, schools, and professions.612 The novelist Henry Miller brilliantly
expressed the predicament of human agency, whereby his knowledge of
himself and society failed to enable any real freedom: "I see that I am no
better, that I am even a little worse, because I saw more clearly than they
ever did and yet remained powerless to alter my life."613 Taking stock of
all of the possible arguments for human freedom, the philosopher Thomas
Nagel explained, "The area of genuine agency...seems to shrink under
this scrutiny to an extensionless point. Everything seems to result from
the combined influence of factors, antecedent and posterior to action, that
are not within the agent's control."614
The enlightenment notion of unencumbered individual freedom
was deconstructed during the 20th century and revealed to be nothing but
a myth. However, some neo-institutional theorists have recently
left open the possibility of individual rationality and freedom, albeit in a
limited and constrained form. Human agency is sometimes defined as the
mediation, manipulation, and occasional modification of existing
institutions. Pierre Bourdieu argued that there was a "dialectical
relationship" between institutional structures and individuals.615 Human
beings can act in concert with institutions or against them, and individuals
can also refuse institutionalized norms and procedures, thereby,
highlighting another type of agency.
Humans can also exploit contradictions between different
institutional structures, and use one institution to modify another.616
Ronald L. Jepperson argues that there can be “degrees of
institutionalization” as well as institutional “contradictions” with
environmental conditions. This means that certain institutions can be
“relative[ly] vulnerab[le] to social intervention” at particular historical
junctures. Jepperson is one of the few institutional analysts who
conceptualize a theory of human action and institutional change, which
allows for “deinstitutionalization” and “reinstitutionalization.” But
Jepperson does not validate rational choice theories of individual agency.
He argues instead that “actors cannot be represented as foundational
elements of social structure” because their identity and “interests are
highly institutional in their origins.”
However, this position does not disavow institutionally mediated
individual choice and action. As Walter W. Powell has argued,
“individual preferences and choices cannot be understood apart from the
larger cultural setting and historical period in which they are embedded,”
but individual actors have some freedom within institutional
environments to “use institutionalized rules and accounts to further their
own ends.” Roger Friedland and Robert R. Alford argue that “the
meaning and relevance of symbols may be contested, even as they are
shared.” “Constraints,” Powell paradoxically argued in one essay, “open
up possibilities at the same time as they restrict or deny others."617
The anthropologist Sherry B. Ortner has developed a
comprehensive theory of human agency that allows individuals more
power to consciously participate in, and thereby, shape and modify
institutions. She describes the individual agent in a “relationship” with
social structures. This relationship can be “transformative” on both
parties: each acts and shapes the other. While the individual is enveloped
by social structures, there is a “politics of agency,” where individual
actors can become “differentially empowered” within the layered “web of
relations” that make up the constraints of culture. Individuals can act
through a process of reflexivity, resistance, and bricolage.
Humans use an awareness of subjectivity and negotiate their
acceptance and refusal of the status quo. Through this process, humans
can re-create existing social structures by reforming traditional practices
and also by introducing novel practices. Ortner conceptualized the
process of agency as the playing of “serious games,” utilizing a metaphor
originally deployed by the analytical philosopher Ludwig Wittgenstein.
She argued forcefully that existing cultural structures and social
reproduction are "never total, always imperfect, and vulnerable," which
constantly leaves open the possibility of “social transformation” to those
who dare to act out against the status quo.618 However, researchers have
pointed out that those who are moderately alienated or marginalized from
existing institutions seem to have a greater chance of imagining new
institutional forms and acting against existing institutional power: "Those
at the peripheries of the system, where institutions are less consolidated,
are more likely to discover opportunities for effective opposition and
innovation...Change is more likely to be generated by the marginally
marginalized, the most advantaged of the disadvantaged."619
Recent scholarship has also emphasized how organizations and
institutions are structured within "particular ecological and cultural
environments."620 Looking at the wider sphere of organizational ecology
allows researchers to understand how individuals and social organizations
are interconnected within a dense social web. Interdependent social
groups interact with each other to mutually shape the physical and social
environment, which in turn impacts the evolution of organizations,
organizational forms, and institutionalized practices and norms.621
Organizations are mutually influenced by a host of social sectors,
including nation-states, geographical regions, local governments, other
organizations, and micro social groups, like the family and peer
networks.622 Within each sector there are diverse “clusters of norms” and
organizational typologies that institutionally define and constrain
individual and organizational actors, and thereby, a host of institutional
norms and forms are continually reified and perpetuated across a
diversely populated social and organizational landscape, which slowly
changes through time.
Because societies are characterized by such diversity of social sectors,
each with their own institutions and norms, different institutions can be
“potentially contradictory,” which can allow for social conflict and social
change through time as institutions develop in relation with the
institutional and physical environment.623 However, it is still unclear how
institutions “change” and what change actually means. Theorizing the
nature and extent of institutional change is an unresolved issue.
Institutions are seen as stable social structures outside the control of
rational agents which seem to slowly adapt to internal and environmental
conditions through an incremental process, although there is some
evidence to suggest that rapid changes can occur in short periods due to
environmental shocks.624 Due to the contingent and evolving complexity
of human institutions, one economist acknowledged, "We may have to be
satisfied with an understanding of the complexity of structures and a
capacity to expect a broad pattern of outcomes from a structure rather
than a precise point prediction."625
Because human institutions are embedded within dense webs of
social and physical ecologies, it is hard to study their complexity utilizing
the simplistic theories of traditional academic disciplines. As the Nobel
Prize-winning economist Elinor Ostrom pointed out, "Every social
science discipline or subdiscipline uses a different language for key terms
and focuses on different levels of explanation as the 'proper' way to
understand behavior and outcomes," thus, "one can understand why
discourse may resemble a Tower of Babel rather than a cumulative body
of knowledge."626 The only way forward is to break down artificial
disciplinary boundaries in order to "integrate the natural sciences with the
social sciences and humanities" into a unified body of knowledge,627
which is something that Ostrom has tried to do with her Institutional
Analysis and Development (IAD) framework.628 This new
interdisciplinary way of knowing would focus on physical, biological,
and social structures that enable and constrain human beings in specific
historical contexts.629 Our current notion of human enlightenment is still
based on the foundational idea that "knowledge is the ultimate
emancipator," as Edward O. Wilson recently put it.630 In fact, Wilson
consciously linked the goals of 21st century science with the older
notions of enlightenment humanism, as he invoked its underlying ethos,
"We must know, we will know."631
But our new age of enlightenment needs to be grounded in an
understanding of both the complexity of knowledge and also the
epistemological limitations of human beings. Wilson acknowledges that
the "immensurable dynamic relationships" between different types of
organisms, and also different levels of reality combined with the
"exponential increase in complexity" between lower levels of reality and
higher levels of reality, all pose the "greatest obstacle" to a unified human
knowledge. It is extremely hard (if not ultimately impossible) to get an
"accurate and complete description of complex systems."632 On top of
this, human knowledge needs to be grounded in the new sciences of the
human mind (cognitive science and epistemology) because "everything
that we know and can ever know about existence is created there."633 At
the root of these new sciences is our "humbling" realization that "reality
was not constructed to be easily grasped by the human mind," thus, we
need to know both what can and cannot be known and also how to live
better without perfect knowledge.634
I completely agree with Wilson's socio-biological explanation of
human nature and cognition, including his arguments on how genes
condition knowledge and the structure of society;635 however, I disagree
with Wilson's assertion that the old enlightenment myth of
"reductionism" and physical "laws" will be central to 21st century science
and a unified field of knowledge. Wilson and a great many other
scientists still believe that "nature is organized by simple universal laws
of physics to which all other laws and principles can eventually be
reduced." It must be noted, however, that Wilson is exceptional in his
humility, admitting that this fiction of universal laws is at least "an
oversimplification" of the complexity of reality and, further, that "it could
be wrong."636 While I am willing to admit that there are some physical laws,
I think it has already been readily established that even physical laws are
constant only in specific spatial-temporal environments, leading to what
Pierre Bourdieu called "regional theories," or what Bernard Williams
called "local perspectives."637 Physicists now understand that different
levels of reality have their own rules,638 and the validity of these rules is
explained by "effective theories," which can only effectively explain a
specific level of reality and nothing more.639
Einstein theoretically proved that time, space, and gravity
can bend, which alters the constant properties of each; therefore, even
these "constants" are not all that constant throughout the universe. Nobel
Laureate Richard Feynman later discovered that some other assumed
theoretical constants seem "to vary from place to place within the
universe."640 At the smallest, most fundamental level of reality, the sub-
atomic level, there do not appear to be any governing laws at all. It
appears to be complete chaos, which is paradoxical given the ordered
complexity of the chemical level up through the planetary level.641 Thus,
the laws of physics may be multitudinous across a vast universe that we
have only begun to understand;642 however, this does not mean that there
is "a total absence of reliable rules" governing reality.643 We just can't
assume that the rules controlling one level of reality will be valid at a
higher or lower level of complexity.644 While I would agree with
biologists that evolution is a "law" that governs the process of change in
all organisms and even human societies here on the planet Earth, I think it
would be foolish to say that sub-atomic particles or solar systems are
bound by this same principle of natural selection, although perhaps the
theory still has some explanatory power at the level of solar systems.645
Over the course of the 20th century there has been a clear move
away from "rigidly deterministic and mono-causal models of
explanation" towards a more complex understanding of the diverse
nature of reality.646 We must get away from universal "laws" and instead
look for the overlapping or layered "regional theories" that approximate
the various levels and processes determining physical reality. Elinor
Ostrom has argued that human culture and institutions have "several
layers of universal components," which means that "multiple levels of
analysis" are needed to fully explain the complex ecology of any social
practice.647 There are also many layers to the physical world, ranging
from the infinitely small sub-atomic particles through the many layers of
the biological world to the large realm of planets and galaxies. Each level
of reality that has been identified by scientists is an area of specialization
with its own name, terminology and methods of analysis.648
As the physicist and philosopher David Deutsch pointed out,
multiple disciplinary frameworks reflect the underlying complexity of the
physical world and they cannot be collapsed into a single reductionist
framework: "None of these areas of knowledge can possibly subsume all
the others. Each of them has logical implications for the others, but not
all the implications can be stated, for they are emergent properties of the
other theories' domains."649 It is especially important to understand that
higher orders of more complex life cannot be reduced to the simpler base
components of lower levels because there is an "emergence" of higher
order properties that have their own unique dynamics. As biologist Ernst
Mayr pointed out, "In each higher system, characteristics emerge that
could not have been predicted from a knowledge of the components."650
We must guard against scientists with valid regional theories in
one domain of reality who encroach on other domains, especially
scientists who seek to reduce higher order systems to the simpler function
of lower order parts. David M. Kreps has explained this phenomenon as
a form of epistemological "imperialism."651 A theory that might be valid
on one level of reality can easily become invalid by distorting another
level of reality it cannot comprehend. For example, while I agree with
biologists and socio-biologists that both humans and human societies
evolve, I completely disagree with many physical scientists over the
mechanism of evolution in human culture. The zoologist and socio-
biologist Richard Dawkins famously reduced human beings to genes and
reduced human culture to "memes" as biological "structures" of cultural
transmission.652 Quite literally, Dawkins theorized that our bodies and
our ideas replicate independently of human intention: "It is not success
that makes good genes. It is good genes that make success, and nothing
an individual does during its lifetime has any effect whatever upon its
genes."653 Likewise, Dawkins argued that human ideas can be reduced to
memes, which are a form of social gene, where "replicators will tend to
take over, and start a new kind of evolution of their own."654 Thus,
human beings are merely passive incubators for genes and memes.
Nothing more, nothing less.
This notion has been widely accepted in both scholarly and
popular circles,655 but it is naively absurd,656 and it merely proves my
point about extending valid scientific theories beyond their corresponding
level of reality. While the mechanism of natural selection on genes does
explain most of human biological evolution, this theory is not valid or
insightful when extended to social phenomena, where intentional human
actors purposively affect their own lives and cultures (to a certain extent).
John Gray has pointed out, "Biology is an appropriate model for the
recurring cycles of animal species, but not for the self-transforming
generations of human beings."657 Likewise, the evolutionary biologist
and geologist Jared Diamond conclusively demonstrated that "our rise to
humanity was not directly proportional to the changes in our genes."658
Further, Diamond warned, "While sociobiology is thus useful for
understanding the evolutionary context of human social behavior, this
approach still shouldn't be pushed too far. The goal of all human activity
can't be reduced to the leaving of descendents."659
To illustrate this obvious point one needs to look no further than
Dawkins' own mother. She once explained to Dawkins when he was a
boy that "our nerve cells are the telephone wires of the body."660 Now if
human communication is nothing but a blind and random biological
transmission of memes, then how does Dawkins purposefully remember
this meaningful story after so many years? And if human communication
can be reduced to electrical impulses and memes, then where did the
meaning of this story come from? Further, how am I able to take this
story out of context to change the meaning in order to criticize Dawkins
for being an absurd reductionist? While the reality and importance of
genes cannot be discounted, many scientists arrogantly believe that
"hard" quantitative sciences, like chemistry or socio-biology, are the best
(and often the only) way to understand human beings and our culture.
This arrogant reductionism is often called "scientism."661 The "question
of meaning" cannot be answered by physical science, which is not to say
that meaning and the human mind are metaphysical entities.662 Both
meaning and the mind are physical and social realities grounded in and
facilitated by the empirical world, but that empirical world is very
complex and one cannot simply reduce higher order psychological and
social phenomena to chemical or genetic processes.
Because they discount the reality of social and psychological
ecologies, such reductionists also never bother to think about individual
and social consequences of reductionist theories, such as memes.663 The
notion that "nothing an individual does during its lifetime has any effect
whatever upon its genes" can be easily translated into a pernicious form
of nihilism, leading to individual paralysis and social anarchy.664 Such
reductionism ignores the reality of the "culture gap" between us and all
other species on Earth.665 While we are physical beings programmed and
constrained by natural laws, like all other organisms on this
planet, Homo sapiens is a unique creature that has also created a
subjective world through consciousness and society, which has its own
emergent properties and governing structure. Culture exists and is every
bit as important as genes in terms of shaping our behavior as a species,
yet socio-biologists like Wilson and Dawkins miss this important level of
reality because of their reductionist scope.
Many physical scientists, especially in the relatively new fields
of socio-biology and evolutionary psychology, have been drunk on their
particular methodology and the scientific success it has enabled. Seeking
to imperially expand their explanatory power, these physical scientists
inevitably ride roughshod with their simplistic reductionism over the
complexities of human life and culture. The analytical concept of genes
is important, but ultimately only one small part of the complex puzzle of
life. Richard Dawkins wants us to believe that "the gene's perspective" is
the most important, valid and insightful uber-perspective of all.666 Why?
Why not reduce human beings and culture to sub-atomic strings, atoms,
chemicals, or organs? If every living thing can be reduced to DNA and
genes then why study any individual organism at all? There are a host of
important physical components that we could use in a reductionist fashion
to "explain" the significance of any species, but all of these factors miss
the forest for the trees. As the physicist Lisa Randall points out,
"Understanding the most basic components is rarely the most efficient
way to understand the interactions at larger scales."667 And yet for the
past century the myopia of physical scientists has distorted the most
important and unique aspects of human beings: individual intentionality
and culture.
In 1968 the microbiologist and philosopher of science Rene
Dubos caustically pointed out, "The most damning statement that can be
made about the sciences of life as presently practiced is that they
deliberately ignore the most important phenomena of human life,"
namely individual intentionality and culture.668 Thirty-five years later,
the notion of studying human beings "as if they were human beings" is
considered a "radical innovation" in the social sciences,669 although some
physical scientists like Jared Diamond are notable exceptions.670 The
archaeologist Timothy Taylor recently made the same point. He argued,
"There has been an extraordinary - and often extraordinarily arrogant -
underestimation of the complexity of the humanities by some hard
scientists who extend themselves across the arts/sciences divide...when
[they] stray into more 'humanistic' domains, [they] make an unwitting ass
of [themselves]." In particular, Taylor cites Richard Dawkins' notion of
memes as a primary example of this misguided arrogance.671
Ironically, Daniel C. Dennett has also admitted this same sad
fact. He wrote, "When scientists decide to 'settle' the hard questions of
ethics and meaning, for instance, they usually manage to make fools of
themselves, for a simple reason: They are smart but ignorant."672 This is
ironic because Dennett is one of those aforementioned "smart but
ignorant" scientists who have tried to reduce the complexities of human
culture to the absurdity of memes.673 While admitting the
anthropologist's established fact that "human beings spin webs of
significance," Dennett proceeded to dismiss all humanistic and most
social scientific literature in a recent book on culture, in order to reduce
the phenomenon of religion to mere biological and psychological
processes, which of course boils down in reductionist fashion to genes
and memes.674 Rene Dubos accurately acknowledged that "cultural
evolution has long been of much greater importance than biological
(genetic) evolution." Thus, to fully understand human beings, one has to
focus primarily on the "sociocultural environment," using theories and
methodologies appropriate to that level of reality.675 When trying to
understand the human, as Karl Popper once pointed out, "obviously what we
want is to understand how such non-physical things as purposes,
deliberations, plans, decisions, theories, intentions, and values, can play a
part in bringing about physical changes in the physical world [author's
emphasis]."676
Humans must move beyond the simplistic and silly notion that a
handful of reductionist laws can explain the complexity of
all life in the universe, especially the ever-changing reality of our socio-
political world.677 In our attempt to unify human knowledge in the 21st
century, we must recognize that each level of reality demands its own
theories and methods that are appropriate to the phenomenon being
studied, which necessarily implies the acceptance of a wide range of
epistemological theories and methods. And we also must accept, in the
wake of Gödel's incompleteness theorems and the Church-Turing thesis, that not every facet of
reality is amenable to human intelligence.678 When it comes to human
culture, and issues like religion, politics, education, law, and economics,
it is absurd to think that the theories and methods of biologists or
physicists can completely explain the dense complexity of human society,
or that economists can explain the root function and value of all cultural
activity. This is not to say that biologists, physicists, and economists
cannot lend some fundamental insights into cultural phenomena and
social processes. What I am saying is that when only one disciplinary
framework is used to explain any complex phenomenon, it will obscure
more than it reveals. As Stephen Toulmin pointed out, any narrow
academic science delivers "at best" an "oversimplification of human life
and experience" and at worst it distorts reality beyond recognition.679
Edward O. Wilson is one example of a physical scientist who
has been able to admit that human evolution is determined not only by
biological evolution but also by the "unique" processes of cultural
evolution. He explained that "the most distinctive qualities of the human
species are extremely high intelligence, language, culture, and reliance on
long-term social contracts."680 But he appears to be utterly ignorant of
the voluminous humanistic and social scientific literature on human
culture, which he never references in any of his books. Instead, he has
imperiously assumed time and again that biology and physics can explain
everything. What physical scientists do not understand, due to their
extreme empirical focus on the objective world, is that human culture is a
fundamentally different type of phenomenon: it is both objective
and subjective at the same time, and, what is more, it is diversely
subjective because multiple individuals and groups constantly contest the
constitution of subjective reality through divisive and sometimes violent
political processes.
Thus, to study human society one must study not only macro-level
objective sociology, but also what the anthropologist Larry Hirschfeld calls
"naive sociology," which is the way that humans make sense of their own
social world subjectively and give it meaning.681 As philosopher Helen
A. Fielding explained, "Reality can only be given to us through the
multiple moving and moved perceptions of embodied being and the
potentiality of our being with others."682 This is the domain of "culture."
The idea of culture became an important social-scientific concept over the
20th century, enabling new knowledge about human society and
individual action.683 Many physical scientists fail to recognize that
culture "is a rather practical thing," not a bunch of abstract concepts and
theories applied to discrete objects. Thus, cultural transmission is not the
movement of some impersonal, objective meme, but rather it is a
"relevance-driven" process that relies on subjective necessity and values,
which emerge from the individuals who selectively transmit information
based on many possible intentions conditioned by many possible
environments.684 Human beings are able to "generate an infinity of
practices adapted to endlessly changing situations," and rarely are they
fully cognizant of what they are doing and why.685
This requires utilizing diverse methods to know diverse
individuals embedded in complex cultures. The philosopher and
sociologist Raymond Aron called such a method "interpretive pluralism,"
as Brian C. Anderson explained, "Given the complexity of historical and
social reality and the intrinsic limits of human cognition, a
methodological approach that takes into account several dimensions of
analysis or several interpretive frameworks will be more objective than a
reductive, monistic approach."686 Clifford Geertz utilized such
interpretive pluralism in his own practice as a cultural anthropologist.
He sought to understand "directly and fully the diversities of human
culture." Geertz was especially fascinated with the "ethos" of individual
cultures, which he defined as the "historically transmitted pattern of
meanings embodie[d] in symbols, a system of inherited conceptions
expressed in symbolic forms by means of which men communicate,
perpetuate, and develop their knowledge about and attitudes toward
life...their world view."687
Physical scientists also fail to understand that studying human
society is not just an objective scientific endeavor. It is at the same time a
moral and political project.688 Even when the social scientist tries to
be objective by accepting and recording the status quo, this is a political
act that will help legitimate a historically contingent status quo that
undergirds a particular balance of power.689 When it comes to human
society, science is not a neutral activity made by a neutral observer.
Scientists are active participants who can influence political discourse
and/or action, or be influenced in turn.
Average human beings do not need, nor can they often use,
scientifically produced objective data, which to most of us is just
meaningless information. As Edward O. Wilson poetically explained,
"We are drowning in information, while starving for wisdom."690 People
need knowledge to help them wisely make difficult practical and moral
judgments, which will affect the quality of their individual and socio-
political life.691 Scientists are extremely focused on understanding "what
is" reality, but this information cannot help ordinary humans make moral
judgments about "what could be" reality or "what should be" reality.692
Humans constantly negotiate the boundaries of "what is" with "what
could be" and "what should be" in their daily lives. Individuals have
some measure of control over their lives and the shape of their society,
"contributing," as C. Wright Mills once explained, "however minutely, to
the shaping of this society and to the course of its history, even as [they
are] made by society and by its historical push and shove." Thus, when
scientists merely analyze existing social reality they miss an important
point: "it is not a stable phenomenon with a definable essence" because
we change the constitution of society every day in innumerable ways.693
The totality of physical life seems to be completely determined
by a mindless law of cause and effect,694 but not human beings or human
society. This very special realm of the natural world is triply determined:
it is affected not only by physical patterns of causation, but also by
institutional environments and by the conscious freedom of human actors,
as they negotiate, accept, attack, reject, revolt, or revise "what is" with
what they think the world could be and should be. As Mills explained in
his classic treatise on sociology, "Whatever else he may be, man is a
social and an historical actor who must be understood, if at all, in close
and intricate interplay with social and historical structures."695

Surely we ought occasionally to remember that in truth we do
not know much about man, and that all the knowledge we do
have does not entirely remove the element of mystery that
surrounds his variety as it is revealed in history and
biography. Sometimes we do want to wallow in that mystery,
to feel that we are, after all, a part of it, and perhaps we
should; but...we will inevitably also study the human variety,
which for us means removing the mystery from our view of it.
In doing so, let us not forget what it is we are studying and
how little we know of man, of history, of biography, and of
the societies of which we are at once creatures and creators.696

In the science-saturated society of mid-20th century America, C. Wright
Mills warned that it might be possible for human beings to be increasing
their "rationality" while also living "without reason." Just because
scientifically produced information and technology make our lives
materially better, it "does not mean that men live reasonably and without
myth, fraud, and superstition." And paradoxically, the exponential
increase in human rationality over the past century has not come with
equal gains in human freedom.697 Mills was not hopeful that science
would square the circle of the human condition. The only hope he could
point to was the liberating effects of education, largely in agreement with
John Dewey, whereby average human beings could transform themselves
into "free and rational individuals" and come together democratically to
build a better future.698 Raymond Aron too believed that education held
the only real hope for humanity, although he pessimistically admitted that
"rational humanists" like himself "bet on the education of humanity, even
if he is not sure he will win his wager."699
The continued and enlarged practice of science will surely have
an important part to play in the future of our species. And no doubt the
unification of knowledge will help the human species better address the
complexities of physical and social reality. In particular, scientists must
find a way to "create some kind of synthesis of human evolution with our
new understanding of cultural diversity."700 But scientific progress is not
a sufficient condition of knowledge or freedom for the average human
being. Individual men and women must be given the intellectual tools
that they need to live at a practical level: conceptual tools that will allow
them to understand themselves, their subjective and cultural ethos, and
the physical world that surrounds them. Knowledge helps us order the
world, "like a string in a maze," so that we don't "lose [our] way."701
Human beings do not need objective knowledge so much as humanistic
education that will allow us to learn and live better. We need knowledge
to be wise in making difficult decisions so as to become more rational
actors, debating the parameters of the good, participating in the co-
construction of our lives, and hopefully working towards a better world.
I have offered in this book an inquiry into knowledge,
education, and the human condition, which I have gleaned from some of
the finest minds in human history. But I offer few answers to the riddle
of the human condition, largely because I firmly believe that there are
few answers to be found. I agree with John Gray who wrote, "Technical
progress leaves only one problem unsolved: the frailty of human nature.
Unfortunately that problem is insoluble."702 Recognizing this perennial
problem, the philosopher Ann V. Murphy argues that we need to
reconceive our humanity in terms of "corporeal vulnerability," allowing
for the "contingency and imperfection" of all that we do, all that we as
humans are and will ever be.703
However, we do not have to be victims of our frailty and
vulnerability. I fundamentally believe that proper education is the key to
enabling our humanity, despite our limitations as a species. We must live
to learn and in learning live a life more meaningful and free. But we
must also recognize that we exist within determining boundaries,
physical, biological, social, and institutional. We must continually
negotiate our humanity within the bounds of these environments and our
own biology. We need to find sustainability as a species, and this means
"living within the physical limits of the ecosphere" as well as our own
biological bodies.704 As Ernst Mayr pointed out, "To 'know thyself,' as
the ancient Greeks commanded us, entails first and foremost knowing our
biological origins."705
Knowing the basis and boundaries of human existence is a never-ending
process of seeking enlightenment and formulating contingent
judgments based on incomplete information, often in stressful
circumstances. Even when we discover great insight about ourselves or
the world we live in, we must never forget the enduring power of human
ignorance and our imperfect ability to translate knowledge into purposive
action. John Gray sardonically noted, "those who imagine that great
errors of policy are not repeated in history have not learnt its chief lesson
- that nothing is ever learnt for long."706 Or, even more pessimistically,
Arthur Wichmann despaired of humanity's flawed nature and caustically
warned, "Nothing learned, and everything forgotten."707 Thus, we need
to not only seek better knowledge and more sophisticated forms of action,
but we also need better ways of institutionalizing the search for
enlightenment and the transmission of knowledge in the face of our
enduring weaknesses as a species, so that we never forget the hard-
earned lessons we've learned through history. As the public intellectual
Christopher Hitchens insightfully and succinctly pointed out, "human
stupidity" is the "enemy" that we will eternally face.708
This was a central insight of many great 20th century
philosophers, like Isaiah Berlin, Raymond Aron, Michel Foucault, and
Ludwig Wittgenstein.709 Summarizing Wittgenstein's opus, Hilary
Putnam argued, "[He] wants not to clarify just our concepts, but to clarify
us; and, paradoxically, to clarify us by teaching us to live, as we must
live, with what is unclear."710 The ancient concept of enlightenment
promises us a way out of the bondage of existential darkness, but it does
not guarantee a clear path or purpose, nor the freedom to be whatever we
might want to be. Berlin, Aron, Foucault, and Wittgenstein all affirmed
what one scholar has called a "self-critical, chastened Enlightenment,
aware of the imperfections of man, the importance of history, the
constancy of the tragic, and the limits of rationality."711 These critics of
enlightenment believed in the power of rational inquiry and human
ingenuity, but not over much; they believed in humanity "as inherently
unfinished and incomplete, as essentially self-transforming and only
partly determinate, of man as at least partly the author of himself and not
subject comprehensively to any natural order."712 It is a humble vision of
skeptical optimism that promises nothing but a chance at better thinking
and living.
In the early 18th century the English poet Alexander Pope
(1688-1744) started an ambitious philosophical treatise in verse called An
Essay on Man, which he was never able to finish. Yet despite its
fragmented state, it spread across Europe and made a contribution to the
"age of enlightenment." Pope was more conservative than most
enlightenment philosophers and the essay reinforced Pope's belief in a
great chain of being and an omnipotent creator God. But his poem can
also be read more metaphorically as a description of a human condition
that we will never escape, caught betwixt the heaven and the earth with
only a fragment of knowledge to guide our way. In Epistle 2 of the Essay
Pope makes a fairly radical claim for his day and age, "Know then
thyself, presume not God to scan; / The proper study of mankind is Man."
Pope described humanity as a "middle state" between "too much
knowledge" and "too much weakness,"

In doubt to deem himself a god, or beast...


Born but to die, and reasoning but to err;
Alike in ignorance, his reason such,
Whether he thinks too little, or too much:
Chaos of thought and passion, all confused;
Still by himself abused, or disabused;
Created half to rise, and half to fall;
Great lord of all things, yet a prey to all;
Sole judge of truth, in endless error hurled:
The glory, jest, and riddle of the world!713

The radical departure of this poem for the 18th century was the eclipse of
divinity with a new focus on the human condition as our "proper study."
Pope praises the uniqueness of humanity, but notes its inherent
contradictions and weaknesses that hobble any clear hope for the future.
This poem at once celebrates the potential of human beings, while
simultaneously warning against too much pride in our deeply flawed
nature. It is a paean to human possibility within limits that are beyond
our control.
Writing several decades earlier, the French philosopher Blaise
Pascal came to a similar conclusion. Having declared that "man is
beyond man," Pascal still argued that humans must try to better
apprehend their nature and their condition with the limited knowledge
and skill that they possessed. Pascal claimed that humans "burn with
desire to find a firm foundation, an unchanging, solid base on which to
build a tower rising to infinity," but humans must realize that this type of
knowledge is impossible, "so let us not look for certainty and stability."
Instead Pascal, like Pope, wanted human beings to turn inward and
outward to better understand the human condition:

Let us, having returned to ourselves, consider what we are,
compared to what is in existence, let us see ourselves as lost
within this forgotten outpost of nature and let us, from within
this little prison cell where we find ourselves, by which I
mean the universe, learn to put a correct value on the earth, its
kingdoms, its cities, and ourselves...For in the end, what is
humanity in nature?...The end of things and their beginning
are insuperably hidden for him in an impenetrable secret.714

And like Pope, Pascal placed human beings within a complex ecology so
as to contextualize human nature within the larger natural and social
world. Human beings have a measure of reason and freedom to
understand their nature, but they are still trapped in the "prison cell" of
both their physical reality and their own mind. Pascal claimed that
humans will never know the origins or endings of life, thus he asked
humans to narrow their gaze to the past, present, and future that can be
known, and to be satisfied with the limited knowledge we have at our
disposal.
While the "impenetrable secret" of absolute reality has been a
constant thorn in the side of humanity's quest for ultimate knowledge, it
was Friedrich Nietzsche who first philosophized a way to know and live
without the "firm foundation" of some supreme Truth. Nietzsche
criticized the irrational myths that continued to keep humanity "sunk deep
in untruth" through both the official lies of the powerful and what the
poet William Blake called the "mind-forg'd manacles" of our own
subjectivity.715 Nietzsche criticized the tradition of western philosophy
and science for holding onto the very notion of absolute truth: "A lack of
historical sense is the congenital defect of all philosophers...everything
has evolved; there are no eternal facts, nor are there any absolute truths.
Thus historical philosophizing is necessary." He argued that the
historically minded philosophers would realize that the human "faculty of
knowledge," as well as the quality of our knowledge, have both "evolved"
as the species has evolved in relation to an ever-changing environment.716
Nietzsche was the first philosopher to extend the implications of
Darwin's scientific revolution to the human condition and the limited
possibilities of the human future. It was Nietzsche who first argued that
humans had to historicize and contextualize their knowledge in relation to
the natural world and to our continued evolution. Humans had to look
hard at the past in order to weigh the meaning and possibility of the
future. Only then could we devise clear knowledge about the human
condition.

Stroll backwards, treading in the footprints in which humanity
made its great and sorrowful passage through the desert of the
past; then you have been instructed most surely about the
places where all later humanity cannot or may not go
again...When your sight has become good enough to see the
bottom in the dark well of your being and knowing, you may
also see in its mirror the distant constellations of future
cultures.717

Nietzsche argued that through the study of the human past, we could gauge
the limited potential of the human future. But unlike other philosophers,
Nietzsche argued against a fixed human nature, instead postulating an
evolving human nature in dynamic relation to its ever-changing
environment.718 He also argued that stepping outside of our own cultures
and taking a look at the diversity of human life-worlds would enlighten us
about our possibilities: "There are great advantages in for once removing
ourselves distinctly from our time and letting ourselves be driven from its
shore back into the ocean of former world views. Looking at the coast
from that perspective, we survey for the first time its entire shape, and
when we near it again, we have the advantage of understanding it better
on the whole than do those who have never left it."719
The American-born English poet T. S. Eliot (1888-1965) had
seen in his lifetime both the rising promise of science and the horrors of
the early 20th century. He felt the same mixed admiration and fear for
humanity, as did Pope and Pascal, but Eliot was also darkly hopeful that
humans could find their way to a better future if only, as Nietzsche
implored, they could understand the past: To know what humans have
been will greatly determine what humans can be. Eliot was able to
articulate in unmatched poetic beauty the depth of these important
perspectives on the paradoxical human condition and our limited ability
to fully know ourselves within the larger ecology of space and time. In
an effort to bring this book to a close, I want to first quote T. S. Eliot at
length to share his beautiful and profound poetry:

...both a new world
And the old made explicit, understood
In the completion of its partial ecstasy,
The resolution of its partial horror...
Only through time time is conquered...
Or say that the end precedes the beginning,
And the end and the beginning were always there
Before the beginning and after the end.
And all is always now. Words strain,
Crack and sometimes break, under the burden,
Under the tension, slip, slide, perish,
Decay with imprecision, will not stay in place,
Will not stay still...
Quick now, here, now, always...
Do not let me hear
Of the wisdom of old men, but rather of their folly...
The only wisdom we can hope to acquire
Is the wisdom of humility...
Knowing myself yet being someone other -
And he a face still forming...
All touched by a common genius,
United in the strife which divided them...
What we call the beginning is often the end
And to make an end is to make a beginning.
The end is where we start from...
Every phrase and every sentence is an end and a beginning,
Every poem an epitaph. And any action
Is a step to the block, to the fire, down the sea's throat
Or to an illegible stone: and that is where we start.
We die with the dying:
See, they depart, and we go with them.
We are born with the dead:
See, they return, and bring us with them...
A people without history
Is not redeemed from time, for history is a pattern
Of timeless moments...
We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.
Through the unknown, remembered gate
When the last of earth left to discover
Is that which was the beginning.720

In this long poem Eliot offered human beings only two humble hopes: To
know one's place in time through careful study of human evolution and to
learn from the mistakes of the past so as to not repeat its fatal errors.
Above all, Eliot encouraged a measured caution and humility; we must
recognize our frail mortality as we ceaselessly explore the human "face
still forming" and act in "timeless moments" to make a "new world."
In our quest to push the boundaries of the human condition ever
further in the 21st century, we must continually resist the false lure of
utopian plans for perfection, the easy bliss of hucksters, the patronizing
promises of politicians, the myths of transcendent absolutes.721 From the
origins of our culture-producing species, philosophers have oversold the
promise of human rationality and innovation as adequate tools to produce
a perfect utopian future.722 For as Thomas More (1478-1535) ironically
warned in his treatise on utopia, "things will never be perfect, until
human beings are perfect," which, as the title and story suggested, was
never (utopia derives from Greek roots meaning "nowhere").723 During the 20th century one of the
most enlightened and technologically advanced cultures in the world
caused two World Wars, while engineering the technological marvel of
efficiently locating, incarcerating, and murdering about eight million
people in under eight years. Amid this unfolding holocaust, the German
playwright Bertolt Brecht (1898-1956) fled the Nazi regime and began
writing a play on Galileo in which he would revisit the notion of
enlightenment and human rationality in the midst of the horrors of
fanaticism, war, and mass slaughter.724 In the play Brecht's Galileo is
asked if "the truth will prevail" in the end, to which he responds, "No, no,
no. Truth prevails only when we make it prevail. The triumph of reason
can only be the triumph of reasoning men."725 Galileo would go on to
warn in a long monologue at the end of the play,

If mankind goes on stumbling in a pearly haze of superstition
and outworn words and remains too ignorant to make full use
of its own strength, it will never be able to use the forces of
nature which science has discovered. What end are you
scientists working for? To my mind, the only purpose of
science is to lighten the toil of human existence. If scientists,
browbeaten by selfish rulers, confine themselves to the
accumulation of knowledge for the sake of knowledge,
science will be crippled and your new machines will only
mean new hardships. Given time, you may well discover
everything there is to discover, but your progress will be a
progression away from humanity. The gulf between you and
humanity may one day be so wide that the response to your
exultation about some new achievement will be a universal
outcry of horror.726

Brecht has his Galileo criticize the utopian faith in human progress, once
projected onto religion, now turned toward the practice of science.
Brecht's Galileo warns, "The aim of science is not to open the door to
everlasting wisdom, but to set a limit to everlasting error."727
At the dawn of the 21st century we as human beings have not
yet escaped from our irrational myths, the cruelty of our animal instincts,
and our insatiable propensity for violence and war. We have not
achieved the ancient enlightenment dream because few humans "dare to
know" and build a better world. We have not created the conditions for
enlightenment and human freedom, which are still only ideals that have
been imperfectly practiced and institutionalized. We have as a species an
unprecedented storehouse of knowledge about the human condition: what
we've done in the past, what we're doing now in the present, and what
we're capable of doing in the future. We have mastered new forms of
technology and engineered unimaginable marvels. But we still suffer
from war, disease, genocide, systemic socio-economic inequality, and
injustice. Our environment is on the brink of catastrophic change due to
pollution and global warming, threatening the sustainability of our
ecosystem, not to mention the thousands of nuclear and biological
weapons armed and waiting that could annihilate all life on earth.
Despite many advances in science over the 20th century, we are
all still profoundly constrained by our subjectivity and culture, and we
always will be. John Gray has called these perennial limitations "the
enduring contradictions of human needs."728 While objective knowledge
created through the practice of science is important, it is not a silver
bullet, and it is out of reach of the majority of human beings. Every
individual human being needs practical tools for personal
"enlightenment" and dealing with daily life, not sophisticated scientific
tools for social "engineering."729 We need to be aware of how our bodies
and minds work and how they can be managed, within biological,
cultural, and environmental constraints.
Thus, as Albert Camus pointed out last century, like Blaise
Pascal, Ralph Waldo Emerson, and Friedrich Nietzsche had done before
him in previous centuries, the foundation of human knowledge must be a
combination of "optimism and doubt" based on a keen understanding of
"human possibilities and limits."730 In his 1968 Pulitzer Prize-winning
book, the microbiologist Rene Dubos also repeated this old theme.
Dubos wrote,

Knowledge of the past is essential for the understanding of
life in the present and in the future, not because history
repeats itself - which it never does exactly - but because the
past is incorporated in all manifestations of the present and
will thereby condition the future...The constitution of a
particular person includes the potentialities that his
experiences have made functional; its limits are determined
by his genetic endowment...[and] environmental stimuli...In
addition to the determinants that survive from man's
evolutionary past and are common to all mankind, there are
those acquired by each person in the course of his own
individual life...differ[ing] from culture to culture and from
person to person...Even though all manifestations of life are
known to be conditioned by heredity, past experiences, and
environmental factors, we also know that free will enables
human beings to transcend the constraints of biological
determinism...[Thus] each one of us can consciously create
his personality and contribute to the future...by providing a
rational basis for option and action. Man makes himself
through enlightened choices that enhance his humanness.731

Toward this end of enhancing our humanness through
"enlightened choices," we must devise better ways to teach human beings
how to become philosophically aware of their own lives - how to
understand their personal and cultural ethos so that they can more fully
and freely act and contribute to a better human future. Ralph Waldo
Emerson once said that the most important question is "a practical
question of the conduct of life. How shall I live?"732
This book has tried to address this vexing question. Philosophy
is a tool to better understand ourselves as human beings and the world
that we live in. It is also a tool to help us form judgments so that we can
wisely choose the course of our lives and create a sustainable future.
Rhetoric is the tool to help us organize and communicate our thoughts,
and to participate in our culture. While philosophy based on personal
experience is not without flaws and limitations, it is the best tool that
most humans will ever have. We use philosophy to understand, judge, set
priorities, and often to make trade-offs when our priorities conflict – all in
an attempt to create a better life. A philosophy of experience and a
rhetoric of human motives are tools to help us make and communicate the
difficult decisions that we all must face. We learn as we live, we
communicate our lives, and we learn to live better. As the French social
scientist and philosopher Raymond Aron pointed out, "the power of man"
comes from "assessing his place in the world and in making choices," to
"take possession of the history that he carries within him and that
becomes his own."733
The old philosophically oriented course of study called the
"Humanities," which was used to instill a "liberal education," has been in
decline over the past century.734 The whole notion of a liberal education
is suspect because the scientifically and vocationally oriented system of
schooling in the west has ostracized this kind of learning as impractical,
elitist, and obsolete.735 Accordingly, there has been a general and
widespread "devaluation of the humanities" in both higher education and
K-12 public schooling.736 This is unfortunate because the core
curriculum of liberal education is needed more than ever: subjective
awareness, creativity, cultural analysis, understanding human values,
critical thinking, and effective communication, all working towards a
greater human freedom and meaningful life.737
The liberal arts are a type of "wisdom literature."738 They teach
us about the historical record of human agency and the consequences of
human action so that we may make more enlightened decisions. As
Louis Menand has explained, a liberal arts education gives a student the
"historical and theoretical knowledge" that helps one "learn how to
learn," about one's individuality, one's culture, and the larger world.739
The end of the liberal arts is "to enable students...to make more
enlightened contributions to the common good."740 And as Martha
Nussbaum writes, "If we cannot teach our students everything they will
need to know...we may at least teach them what they do not know and
how they may inquire."741 Students can then use this enlarged awareness
of the world and an improved capacity for critical thinking to address
"questions that are of prime human importance, for which no answers are
uncontestably certain."742 That is the promise of a liberal education and
the only true hope for humanity.
The priority of all humanistic and scientific disciplines in the
21st century should be to come together in common cause to equip
human beings with the epistemological, ontological, and axiological tools
needed to live in a complex, dangerous, and data-saturated world. People
need to be taught how to develop their own capabilities, which will
enable them to be free and to make better decisions. This is especially
important in democratic countries where citizens have significant
responsibilities, including debating public policy and making informed
voting decisions.743 Amartya Sen argued, "As competent human beings,
we cannot shirk the task of judging how things are and what needs to be
done...It is not so much a matter of having exact rules about how
precisely we ought to behave, as of recognizing the relevance of our
shared humanity in making the choices we face."744
More so than the sciences, the humanities can teach an
existential and conceptual cartography, which humans can use to map
their worlds, know their worlds, and act freely and responsibly. This
should be the heart of every educational enterprise. Humans must learn to
become aware of how biology, subjectivity, and culture influence
perception and behavior. Humans must also learn how subjectivity and
culture can be influenced and modified in turn, thus creating the
conditions of freedom and moral responsibility. We have the capacity to
be "self-transforming beings" and this allows us to "perpetually reinvent"
our ethos within the constraints of the human condition.745 But we must
never forget that individual, social and political "responsibility requires
freedom," thus, even if we will never be completely free, we still must
strive to increase the boundaries of human freedom as much as we can.746
The purpose of education in the 21st century should not be
overly focused on information, skill acquisition, vocational training, or
national economic development. Education is an individual endeavor of
exploration and development that is also simultaneously a public good
because society is at root a congregation of diverse individuals living
together in common need and mutual dependence. Society is enriched
and strengthened when subjectively aware, self-assured, and
knowledgeable individuals pursue excellence in an environment of free
exchange and mutual benefit. At its core, education should be focused on
what Owen Flanagan calls "eudaimonistic scientia," knowledge of the
human condition which enables human flourishing.747 This type of
education would help foster the creation of individual character (ethos),
cultural identity (ethos) and ecological sustainability. This type of
education would nurture the human self through social discovery,
individual creativity, and critical analysis, whereby the individual would
create their own self and a meaningful life within the social and physical
constraints outlined above. This type of education is thousands of years
old, at least, but it is not anachronistic. It was and is and will ever be the
foundation of all human wisdom. As the stoic philosopher Seneca once
said, "while we live, while we are among human beings, let us cultivate
our humanity."748

References

1
Joshua Greene, personal website, http://www.wjh.harvard.edu
/~jgreene/
2
Christine V. Wood, "The Sociologies of Knowledge, Science, and
Intellectuals: Distinctive Traditions and Overlapping Perspectives,"
Sociology Compass 4, no. 10 (2010): 909-923.
3
Loyal Rue, Religion Is Not About God: How Spiritual Traditions
Nurture Our Biological Nature (New Brunswick, 2005), 86, 38-39;
Daniel Kahneman, Thinking, Fast and Slow (New York, 2011), 75.
4
Jack Goody, The Logic of Writing and the Organization of Society
(Cambridge, UK, 1986); Jack Goody, The Domestication of the Savage
Mind (Cambridge, UK, 1977).
5
Mark Turner, The Literary Mind (Oxford, UK, 1996).
6
Rue, Religion Is Not About God, 86, 38-39; Goody, The Logic of
Writing and the Organization of Society; Turner, The Literary Mind.
7
Antonio Damasio, The Feeling of What Happens (New York, 1999);
John Gray, Straw Dogs: Thoughts on Humans and Other Animals (New
York, 2003), 77; Kahneman, Thinking, Fast and Slow, 75.
8
David C. Rubin, Memory in Oral Traditions: The Cognitive
Psychology of Epic, Ballads, and Counting-out Rhymes (Oxford, 1997);
Pascal Boyer and James V. Wertsch, Memory in Mind and Culture
(Cambridge, UK, 2009); Kahneman, Thinking, Fast and Slow, 75.
9
I am using the words "subjective" and "objective" in the basic way
that Karl Popper used these terms. While Popper admitted that these
words were "heavily burdened with a heritage of contradictory usages
and of inconclusive and interminable discussions," I hope that my usage
of these terms are clear. Subjective is the internal experience of the
individual while objective refers to the world of physical objects that
we inhabit. Karl Popper, The Logic of Scientific Discovery (London,
2002), 22-23.
10
Philosopher Loyal Rue, in Religion Is Not About God, defines
scientism as “a set of philosophical beliefs about science, not a set of
tested beliefs about nature. Scientism says that science is the sole
authority on all claims about the natural order; that the limits of science
are the true limits to what can be said about how things really are; that
scientific claims are the only ones that warrant realist attitudes” (316).
Scientism can be attacked from two directions. Theists attack scientism
because they want to make space for a transcendent reality, thus,
supporting the existence of a deity. But scientism can also be attacked
by scientists. The latter is where I stand, and my critique of scientism
represents a critique of the overly narrow definition of what does or
does not count as science. There is a longstanding debate within the
scientific community over diverse scientific methods, and physical
scientists have traditionally defined science in terms of reductionist
physical-science methods that often distort the complex reality of
psychological, social, and cultural phenomena. See: George Steinmetz,
ed., The Politics of Method in the Human Sciences: Positivism and Its
Epistemological Others (Durham, 2005). For an example of scientism
by a scientist see Richard Dawkins, The God Delusion (New York,
2006). For an example of pop-scientism by science propagandists see
the magazine Wired and the work of literary agent John Brockman,
especially "The New Humanists," The New Humanists: Science at the
Edge (New York, 2003) and The Third Culture (New York, 1996).
11
Walter Lippmann, Public Opinion (New York, 1922), 10.
12
Hannah Arendt, The Human Condition (Chicago, 1958); Philip
Selznick, The Moral Commonwealth: Social Theory and the Promise of
Community (Berkeley, 1992), 14-15.
13
Richard Rorty, Philosophy and the Mirror of Nature (Princeton,
1979). Michael Polanyi also acknowledged "the power which we
exercise in the act of perception." Science, Faith and Society: A
Searching Examination of the Meaning and Nature of Scientific Inquiry
(Chicago, 1964), 16.
14
For a history of this idea within the “Romantic” tradition of European
literature and philosophy see: M. H. Abrams, The Mirror and the Lamp:
Romantic Theory and the Critical Tradition (Oxford, 1971).
15
Owen Flanagan, The Bodhisattva's Brain: Buddhism Naturalized
(Cambridge, MA, 2011), 66.
16
This foundational insight was philosophically explored by Immanuel
Kant in Critique of Pure Reason (1781). The notion of “interaction”
was more concretely tied to subjectivity and society by George Herbert
Mead, which can be found in the collection of essays Mind, Self, and
Society (Chicago, 1967). See also Marchand, "Making Knowledge."
17
Michael Polanyi, Personal Knowledge: Towards a Post-Critical
Philosophy (Chicago, 1962), vii, 17; Kant, Critique of Pure Reason.
18
Kant, Critique of Pure Reason. Kant’s insight lead to a distinct
school of philosophy called Phenomenology. See Robert Sokolowski,
Introduction to Phenomenology (Cambridge, 2000).
19
Annie Dillard, The Writing Life (New York, 1989), 56.
20
Steven Pinker, How the Mind Works (New York, 1997) and The
Blank Slate: The Modern Denial of Human Nature (New York, 2003).
On the specific role of the emotions in subjectivity see: N. H. Frijda,
The Emotions (Cambridge, 1986); Richard Lazarus, Emotion and
Adaptation (Oxford, 1991).
21
Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK,
2010), 195.
22
William H. McNeill, Mythistory and Other Essays (Chicago, 1986),
91.
23
Edmund S. Morgan, Inventing the People: The Rise of Popular
Sovereignty in England and America (New York, 1988), 14.
24
John Kenneth Galbraith, The Great Crash, 1929, qtd. in Kim
Phillips-Fein, "Countervailing Powers: Review of The Affluent Society
and Other Writings 1952-1967," The Nation (May 30 2011), 43.
25
Irving Kirsch, The Emperor's New Drugs: Exploding the
Antidepressant Myth (New York, 2010), 102, 116-117, 164.
26
Daniel E. Moerman, "The Meaning Response: Thinking About
Placebos," Pain Practice 6, no 4 (2006), 233-36.
27
These concepts refer to the ideological, discursive, and institutional
social forces or powers that constitute and constrain the intellectual and
experiential vocabulary of an individual. For "interpretive framework"
see Polanyi, Personal Knowledge, 60. For "common sense" see
Clifford Geertz, "Common Sense as a Cultural System," Local
Knowledge (New York, 2000); For "conventional wisdom" see John
Kenneth Galbraith, "The Concept of the Conventional Wisdom," The
Essential Galbraith (New York, 2001), 18-30; For "habitus" see Pierre
Bourdieu, Outline of a Theory of Practice (Cambridge, UK, 2010), 72;
For “background” see John R. Searle, Consciousness and Language
(Cambridge, 2002). For “paradigm” see Stephen Toulmin, Foresight
and Understanding: An Inquiry into the Aims of Science (New York,
1961), 57, 81, 100; Thomas S. Kuhn, The Structure of Scientific
Revolutions (Chicago, 1962). For "episteme" see Michel Foucault,
The Order of Things (New York, 1970). For “epistemological
unconscious” see Steinmetz, “Introduction: Positivism and Its Others in
the Social Sciences,” The Politics of Method in the Human Sciences.
For "ideology" see J. M. Beach, Studies in Ideology: Essays on Culture
and Subjectivity (Lanham, 2005) and Clifford Geertz, "Ideology as a
Cultural System," The Interpretation of Cultures (New York, 2000).
See also: Jerome Bruner, "Myth and Identity," On Knowing
(Cambridge, MA, 1979), 32.
28
Roy D'Andrade, "Cultural Darwinism and Language," American
Anthropologist 104, no. 1 (2002), 223.
29
Steven Pinker, The Blank Slate: The Modern Denial of Human
Nature (New York, 2002), 65-66.
30
Allan Bloom, The Closing of the American Mind (Touchstone, 1988),
307.
31
Morgan, Inventing the People, 202.
32
Daniel C. Dennett, Freedom Evolves (New York, 2003), 277, 281;
Owen Flanagan, Self Expressions: Mind, Morals, and the Meaning of
Life (Oxford, 1996), 67, 158.
33
William Shakespeare, As You Like It, In A. Harbage (Ed.), William
Shakespeare: The Complete Works (New York, 1969), 257.
34
Owen Flanagan, The Really Hard Problem: Meaning in a Material
World (Cambridge, MA, 2007), 14.
35
Stephen Gaukroger, Francis Bacon and the Transformation of Early-
Modern Philosophy (Cambridge, UK, 2001), 122; "Francis Bacon,"
Stanford Encyclopedia of Philosophy, Stanford University (Dec 29
2003) <www.plato.stanford.edu>; Charles E. Lindblom, Inquiry and
Change: The Troubled Attempt to Understand and Shape Society (New
Haven, 1990), 153.
36
Polanyi, Personal Knowledge, 18.
37
Gaukroger, Francis Bacon and the Transformation of Early-Modern
Philosophy; "Francis Bacon," Stanford Encyclopedia of Philosophy.
38
David Hume, Treatise of Human Nature (Oxford, 1888), 268.
39
Ralph Waldo Emerson, "Experience," Selections from Ralph Waldo
Emerson (Boston, 1957), 269.
40
Francis Bacon, The New Organon, qtd. in Peter Gay, The
Enlightenment: The Rise of Modern Paganism (New York, 1995), 312.
41
Dennett, Freedom Evolves, 1, 162.
42
Robert Nozick, The Examined Life: Philosophical Meditations (New
York, 1989), 17.
43
Ibid. This classical conception of philosophy became the foundation
for both the Enlightenment projects of the 17th and 18th centuries and
the modernity projects of the 19th and 20th centuries. However, this
classical assumption has been called into question by both philosophers
and scientists. See John Gray, Enlightenment's Wake (New York,
2009). See also Daniel Kahneman, Thinking, Fast and Slow (New
York, 2011). Kahneman spent most of his life investigating how
humans formulate knowledge and make decisions, and he claimed "I
am generally not optimistic about the potential for personal control of
biases" (p. 131). Thus, there is now more and more evidence fleshing
out the biological limits of our brains, which calls into question
thousands of years of philosophical idealism. While I agree with both
Gray and Kahneman about the idealism of the enlightenment project
and the difficulty of formulating reliable knowledge, I think there is still
some room for optimism - otherwise I wouldn't have written this book!
44
McNeill, Mythistory and Other Essays, 18.
45
Alexander Gilchrist, The Life of William Blake (London, 1907);
David Erdman, Prophet Against Empire, 3rd ed. (Princeton, 1977); J. M.
Beach, “William Blake,” Studies in Poetry: The Visionary (Lanham,
2004).
46
Ralph Waldo Emerson, "Experience," Selections from Ralph Waldo
Emerson (Boston, 1957), 271.
47
Walter Isaacson, Steve Jobs (New York, 2011), 118.
48
Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK,
2010), 88.
49
Jerome Bruner, "Myth and Identity," On Knowing (Cambridge, MA,
1979), 32.
50
The anthropologist Webb Keane has explained the narrative as
“crystallized contextual moments of explicitness, discursive actions that
turn other actions, other contexts, into texts recognizable within
genres.” Webb Keane, “Anthropology: Estrangement, Intimacy, and
the Objects of Anthropology” (Durham, 2005), 82. See also Goody, The
Logic of Writing and the Organization of Society, 78; Turner, The
Literary Mind; David Deutsch, The Fabric of Reality: The Science of
Parallel Universes - and Its Implications (New York, 1997), 264-65.
51
David C. Rubin, Memory in Oral Traditions: The Cognitive
Psychology of Epic, Ballads, and Counting-out Rhymes (Oxford, 1997);
Pascal Boyer and James V. Wertsch, Memory in Mind and Culture
(Cambridge, UK, 2009).
52
Walter J. Ong, Ramus: Method, and the Decay of Dialogue (Chicago,
2004), 194.
53
Michael Polanyi and Harry Prosch, Meaning (Chicago, 1975), 66.
54
Ernst Breisach, Historiography: Ancient, Medieval, and Modern, 3rd
ed. (Chicago, 2007).
55
Paul Ricoeur, Memory, History, Forgetting, Trans. Kathleen Blamey
and David Pellauer (Chicago, 2004), 5.
56
Hermann Hesse, The Glass Bead Game (1943; New York, 1990), 48.
Hesse’s novel is an ironic critique of historiography, objectivity, and the
search for truth.
57
Jack Goody and Ian Watt, "The Consequences of Literacy,"
Comparative Studies in Society and History, 5 (1963): 304-345.
58
Ibid.; Goody, The Logic of Writing and the Organization of Society,
7; Goody, The Domestication of the Savage Mind.
59
John Gray, Black Mass: Apocalyptic Religion and the Death of
Utopia (New York, 2007), 206.
60
McNeill, Mythistory and Other Essays, ch 1 and 2; Peter Novick,
That Noble Dream: The ‘Objectivity’ Question and the American
Historical Profession (Cambridge, UK, 1988).
61
Breisach, Historiography: Ancient, Medieval, and Modern, 203-4.
62
Novick, That Noble Dream: The ‘Objectivity’ Question and the
American Historical Profession.
63
The quote comes from Daniel Dennett, “Why Getting It Right
Matters: How Science Prevails,” Science & Religion: Are They
Compatible (Amherst, 2003), 156. See also Dennett, Freedom Evolves,
6. The notion of combining old and new forms of knowing comes from
Goody, The Domestication of the Savage Mind, 148.
64
McNeill, Mythistory and Other Essays, 19.
65
Nicholson Baker, "The Greatest Liar," Columbia Journalism Review
(July-Aug 2009); "Dissembling Defoe," The Wilson Quarterly (Autumn
2009), 74-75; Ian Watt, The Rise of the Novel (Berkeley, 1957); Samar
Attar, The Vital Roots of European Enlightenment: Ibn Tufayl's
Influence on Modern Western Thought (Lanham, 2007); G. J. Toomer,
Eastern Wisdom and Learning: The Study of Arabic in Seventeenth-
Century England (Oxford, 1996).
66
Henry Adams, The Education of Henry Adams (Boston, 1961), 43.
67
Nozick, The Examined Life, 128, 132.
68
See Tom Wolfe, The Electric Kool-Aid Acid Test (New York, 1999);
Hunter S. Thompson, Fear and Loathing in Las Vegas (New York,
1998); Joan Didion, Slouching Towards Bethlehem (New York, 2008).
69
Norman Mailer, The Armies of the Night: History as a Novel, the
Novel as History (New York, 1994).
70
Henry David Thoreau, Walden, In Walden and Resistance to Civil
Government, 2nd ed. (New York, 1992), 1.
71
Ibid., 65.
72
Ibid., 3, 54.
73
Didion, Slouching Towards Bethlehem, 136.
74
Maureen Tkacik, "Look at Me! A Writer's Search for Journalism in
the Age of Branding," Columbia Journalism Review (May/June 2010),
33-40.
75
"The Foxification of News," Bulletins from the Future: Special
Report on the News Industry, The Economist (July 9 2011), 14-15.
76
Tim O’Brien, The Things They Carried (New York, 1990), 179, 245,
158. O’Brien’s National Book Award-winning novel was Going After
Cacciato and his memoir was If I Die in a Combat Zone.
77
John Gray, Straw Dogs: Thoughts on Humans and Other Animals
(New York, 2003), 131.
78
David Carr, “Narrative Explanation and Its Malcontents,” History
and Theory 47 (Feb 2008): 19-30; Paul Ricoeur, Oneself as Another,
trans. Kathleen Blamey (Chicago, 1992).
79
David C. Rubin, Memory in Oral Traditions: The Cognitive
Psychology of Epic, Ballads, and Counting-out Rhymes (Oxford, 1997);
Pascal Boyer and James V. Wertsch, Memory in Mind and Culture
(Cambridge, UK, 2009); Goody, The Logic of Writing and the
Organization of Society, 7.
80
Ricoeur, Oneself as Another.
81
Wilhelm Dilthey, "The Construction of the Historical World in the
Human Studies," Dilthey: Selected Writings (Cambridge, UK, 1976),
238.
82
Polanyi and Prosch, Meaning, 107, 163; Owen Flanagan, Self
Expressions: Mind, Morals, and the Meaning of Life (Oxford, 1996),
67, 158.
83
Ricoeur, Oneself as Another, 53.
84
Alasdair MacIntyre, After Virtue: A Study in Moral Theory, 3rd ed.
(Notre Dame, 2007), 205-258; Ricoeur, Oneself as Another, 158.
85
Hermann Hesse, The Glass Bead Game, 9.
86
Pascal Boyer, Religion Explained: The Evolutionary Origins of
Religious Thought (New York, 2001), 326.
87
Kurt Vonnegut, Slaughterhouse-Five (New York, 1969), 1.
88
William James, Pragmatism, Writings 1902-1910 (New York, 1988);
Sokolowski, Introduction to Phenomenology.
89
Ricoeur, Oneself as Another, 162.
90
John Kenneth Galbraith, "The Concept of the Conventional
Wisdom," In The Essential Galbraith (New York, 2001).
91
Walter Benjamin, "The Storyteller," qtd. in Ricoeur, Oneself as
Another, 163-64. The quote is actually Ricoeur's summary of
Benjamin.
92
Daniel Kahneman, Thinking, Fast and Slow (New York, 2011).
93
Ibid., 75, 86.
94
Ibid., 201.
95
Ibid.
96
Karl Marx, The German Ideology: Part I, In The Marx-Engels
Reader, 2nd ed. (New York, 1978); Roger S. Gottlieb, Marxism 1844-
1990: Origins, Betrayal, Rebirth (New York, 1992), 15-19, 48-51.
97
Naomi Klein, "After the Spill," The Nation (Jan 31 2011), 11-18.
98
Kahneman, Thinking, Fast and Slow, 75.
99
McNeill, Mythistory and Other Essays, 46.
100
Werner Heisenberg, Physics and Philosophy: The Revolution in
Modern Science (New York, 2007), 179.
101
Walter Isaacson, Steve Jobs (New York, 2011).
102
Irving Kirsch, The Emperor's New Drugs: Exploding the
Antidepressant Myth (New York, 2010), ch 5 & 6; Stewart Justman,
"Imagination's Trickery: The Discovery of the Placebo Effect," The
Journal of the Historical Society x, no. 1 (March 2010): 57-73; H. K.
Beecher, “The Powerful Placebo,” The Journal of the American
Medical Association, 159 (1955): 1602-06; Anton J. M. de Craen, Ted J
Kaptchuk, Jan G. P. Tijssen, J. Kleijnen, “Placebos and Placebo Effects
in Medicine: Historical Overview,” Journal of the Royal Society of
Medicine, 92 (Oct 1999): 511-515; Nicholas Bakalar, "Perceptions:
Positive Spin Adds to a Placebo's Impact," New York Times (Dec 27
2010).
103
Qtd. in David H. Freedman, "The Triumph of New-Age Medicine,"
The Atlantic (July/Aug 2011), 98.
104
Arthur K. Shapiro paraphrased by Kirsch, The Emperor's New
Drugs, 56.
105
Kirsch, The Emperor's New Drugs.
106
William Isaac Thomas and Dorothy Thomas, The Child in America,
2nd ed. (New York, 1929).
107
John Gray, Black Mass: Apocalyptic Religion and the Death of
Utopia (New York, 2007), 93, 102-105, 140-41.
108
One Bush aide expressed the delusions of grandeur of this
administration, "That's not the way the world really works any more.
We're an empire now, and when we act we create our own reality"
(140). Qtd. in Gray, Black Mass: Apocalyptic Religion and the Death of
Utopia.
109
Gray, Black Mass: Apocalyptic Religion and the Death of Utopia,
186.
110
For a classic reactionary polemic against the evils of relativism and
post-modernism see Allan Bloom, The Closing of the American Mind
(Touchstone, 1988).
111
Philip Selznick, The Moral Commonwealth: Social Theory and the
Promise of Community (Berkeley, 1992), 13. Selznick called the
postmodern perspective a "challenge" to the "coherence and stability"
of enlightenment humanist assumptions about "self, community,
culture, law, art, science, and organization" (12-13).
112
This idea can be traced back to Johann Gottfried von Herder. See
Isaiah Berlin, “Herder and the Enlightenment,” The Proper Study of
Mankind (New York, 2000). For a more recent formulation see
Benjamin Lee Whorf, Language, Thought, and Reality (Cambridge,
MA, 1956).
113
Alan Musgrave, "Conceptual Idealism and Stove's Gem," in
Scientific Inquiry: Readings in the Philosophy of Science, ed. Robert
Klee (Oxford, 1999), 351.
114
Walter Isaacson, Steve Jobs (New York, 2011), 455.
115
See Hilary Putnam, "What is Mathematical Truth"; Bas Van
Fraassen, The Scientific Image; Philip Kitcher, The Advancement of
Science: Science without Legend, Objectivity without Illusions. These three
sources are excerpted in Scientific Inquiry: Readings in the Philosophy
of Science, ed. Robert Klee (Oxford, 1999).
116
Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK,
2010), 4.
117
Owen Flanagan, The Bodhisattva's Brain: Buddhism Naturalized
(Cambridge, MA, 2011), 53.
118
Ibid., 66.
119
Bruce Mazlish, "Session I: The Record of the Social Sciences,"
Conference on the State of the Social Sciences, Critical Review 16, nos.
2-3 (2004), 167.
120
Ilya Prigogine and Isabelle Stengers, Order Out of Chaos: Man's
New Dialogue with Nature (New York, 1984), 218-226.
121
On Darwin see Daniel Dennett, Darwin’s Dangerous Idea:
Evolution and the Meaning of Life (New York, 1995). On Einstein see
Walter Isaacson, Einstein: His Life and Universe (New York, 2007).
On Heisenberg see Werner Heisenberg, Physics and Philosophy: The
Revolution in Modern Science (New York, 2007). On cultural
anthropology see Clifford Geertz, The Interpretation of Cultures (New
York, 1973) and Local Knowledge (New York, 1983). See also David
Lindley, Uncertainty: Einstein, Heisenberg, Bohr, and the Struggle for
the Soul of Science (New York, 2008).
122
Ernst Mayr, This Is Biology: The Science of the Living World
(Cambridge, MA, 1997), 72.
123
Ibid.
124
Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK,
2010), 9.
125
Kenneth Burke, Permanence and Change: An Anatomy of Purpose,
3rd ed. (Berkeley, CA, 1984), 35. See also Beach, Studies in Ideology,
79.
126
Kenneth Burke, A Grammar of Motives (Berkeley, CA, 1969), 105,
59-60; Beach, Studies in Ideology, 93. I think Burke here draws from
Nietzsche who wrote, "Every philosophy also conceals a philosophy;
every opinion is also a hideout, every word also a mask." Beyond Good
and Evil, trans. Walter Kaufmann (New York, 1989), 229.
127
Hilary Putnam, "A Half Century of Philosophy, Viewed from
Within," American Academic Culture in Transformation, Thomas
Bender and Carl E. Schorske, eds. (Princeton, 1997), 216. Putnam is
summarizing an idea from Nelson Goodman, Ways of Worldmaking
(Indianapolis, 1978).
128
Richard Rorty, Contingency, Irony, and Solidarity (Cambridge,
1989).
129
Stanley Fish, "Almost Pragmatism: The Jurisprudence of Richard
Posner, Richard Rorty, and Ronald Dworkin," in There's No Such Thing
as Free Speech...and it's a good thing too (Oxford, 1994), 201. Lest
any scientifically minded academic take issue with my use of the phrase
"political processes," I use the word in a general sense, via Stanley Fish,
as "what everyone inevitably does" in terms of arguing over what is
good or what is right (p. 19). Scholarly debate over truth using
academic rules and scientific tools is still a political process, albeit a
highly refined and civil one (what Kenneth Burke once called "the
purification of war" - see A Grammar of Motives). For science as a
political process see Steven Shapin, Never Pure: Historical Studies of
Science as if It Was Produced by People with Bodies, Situated in Time,
Space, Culture, and Society, and Struggling for Credibility and
Authority (Baltimore, 2010), especially page 20: "if we say that
scientific propositions always have to win credibility, then that makes
them like the claims of ordinary life."
130
Charles E. Lindblom, "Political Science in the 1940s and 1950s,"
American Academic Culture in Transformation, Thomas Bender and
Carl E. Schorske, eds. (Princeton, 1997); Charles E. Lindblom, Inquiry
and Change: The Troubled Attempt to Understand and Shape Society,
62. See also Kenneth Burke, Permanence and Change: An Anatomy of
Purpose, 3rd ed. (Berkeley, 1984).
131
Rorty, Contingency, Irony, and Solidarity.
132
"The Death of Facts in an Age of Truthiness," National Public Radio
(April 29 2012).
133
Naomi Oreskes and Erik M. Conway, Merchants of Doubt: How a
Handful of Scientists Obscured the Truth on Issues from Tobacco
Smoke to Global Warming (New York, 2010), 5-6, 16, 18, 34.
134
Ibid., 7, 236.
135
Ibid., 241.
136
John J. Mearsheimer, Why Leaders Lie: The Truth about Lying in
International Politics (Oxford, 2011).
137
Irving Kristol qtd. in Ronald Bailey, "Origins of the Specious: Why
Do Neoconservatives Doubt Darwin?" Reason (July 1997).
138
A character says, “There are no facts in the case, just two fictions.”
David Mamet, Race (New York, 2009). See also “Toying with
Taboos,” The Economist (12 December 2009), 83.
139
Southern Pacific Company v. Jensen 244 U.S. 205, 222 (1917);
Gitlow v. People of New York, 268 U.S. 652 (1925).
140
See for example, "Rise of the Image Men," The Economist (Dec 16
2010).
141
Oreskes and Conway, Merchants of Doubt, 240.
142
Don DeLillo, White Noise (New York, 2009).
143
Isaiah Berlin, “Historical Inevitability,” The Proper Study of
Mankind (New York, 2000), 170. See also McNeill, Mythistory and
Other Essays, 19.
144
Martha C. Nussbaum, Cultivating Humanity: A Classical Defense of
Reform in Liberal Education (Cambridge, MA, 1997), 40.
145
Amartya Sen, The Idea of Justice (Cambridge, MA: Harvard
University Press, 2009), 40.
146
Ibid.
147
Martin Heidegger conceptualized “dwelling” as the embedded
interconnectivity and subjectivity of human beings within specific
ecologies. See Heidegger, “Building Dwelling Thinking” and
“Poetically Man Dwells,” Poetry, Language, Thought (New York,
1971). Michael Polanyi expanded this concept with the term
"indwelling" to also include the physical and intellectual tools,
especially theory: "To apply a theory for understanding nature is to
interiorize it." "Science and Religion," qtd. in Mark T. Mitchell,
Michael Polanyi (Wilmington, 2006), 77. See also Polanyi, Personal
Knowledge, 59-60.
148
Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), 96.
149
Sen, The Idea of Justice, 155-56. See also Nozick, The Examined
Life, ch 14.
150
On the psychology and psychopathology of subjectivity see: Morton
Hunt, The Story of Psychology (New York, 1993); Steven Pinker, How
the Mind Works (New York, 1997).
151
Sen, The Idea of Justice, 161-62.
152
Steven Shapin, "How to Be Antiscientific," in Never Pure:
Historical Studies of Science as if It Was Produced by People with
Bodies, Situated in Time, Space, Culture, and Society, and Struggling
for Credibility and Authority (Baltimore, 2010), 32.
153
This insight was first put forth by Immanuel Kant. Pierre Bourdieu,
Outline of a Theory of Practice (Cambridge, UK, 2010), 91; David
Deutsch, The Fabric of Reality: The Science of Parallel Universes -
and Its Implications (New York, 1997), 196-97, ix, 57-58. Deutsch
writes, "There is no getting away from the fact that we human beings
are small creatures with only a few inaccurate, incomplete channels
through which we receive all information from outside ourselves...we
are literally contemplating nothing more than patterns of weak electric
current trickling through our brains" (p. 57-58).
154
On the evolution of the human brain and consciousness see: Dennett,
Darwin’s Dangerous Idea; Pinker, How the Mind Works; Pinker, The
Blank Slate: The Modern Denial of Human Nature (New York, 2003);
Searle, Consciousness and Language.
155
Daniel Kahneman, Thinking, Fast and Slow (New York, 2011), 4.
156
Werner Heisenberg, Physics and Philosophy: The Revolution in
Modern Science (New York, 2007), 56, 66; Robert Sokolowski,
Introduction to Phenomenology (Cambridge, 2000).
157
Hilary Putnam, "A Half Century of Philosophy, Viewed from
Within," American Academic Culture in Transformation, Thomas
Bender and Carl E. Schorske, eds. (Princeton, 1997), 215.
158
Immanuel Kant, Critique of Pure Reason, trans. J. M. D.
Meiklejohn, 1934 (London, 1993); Roger Scruton, "Kant," German
Philosophers (Oxford, 2001).
159
Denis Diderot, Rêve, qtd. in Garry Wills, Inventing America:
Jefferson's Declaration of Independence (New York, 1978), 144.
160
Trevor H.J. Marchand, "Making Knowledge: Explorations of the
Indissoluble Relation between Minds, Bodies, and Environment,"
Journal of the Royal Anthropological Institute (2010): S1-S21.
161
Burke, Permanence and Change, 6; Beach, Studies in Ideology, 77.
162
Heisenberg, Physics and Philosophy, 66.
163
Brian Greene, The Elegant Universe: Superstrings, Hidden
Dimensions, and the Quest for the Ultimate Theory (New York, 2003).
164
Charles E. Lindblom, Inquiry and Change: The Troubled Attempt to
Understand and Shape Society, 78-79.
165
Polanyi and Prosch, Meaning, 185; McNeill, Mythistory and Other
Essays, 8-9; Lindblom, Inquiry and Change: The Troubled Attempt to
Understand and Shape Society, 227. Polanyi and Prosch write, "The
acceptance of scientific statements by laymen is really based not on
their own observations, but on the authority that laymen acknowledge
scientists to have in their special fields" (p. 185).
166
Henry Adams, The Education of Henry Adams (Boston, 1961), 224-
225, 231-32.
167
"Session V: The Nature of Science," Conference on the State of the
Social Sciences, Critical Review 16, nos. 2-3 (2004), 228-258.
168
Richard Reeves, John Stuart Mill: Victorian Firebrand (London,
2007), 163.
169
Thomas Bender and Carl E. Schorske, eds., American Academic
Culture in Transformation (Princeton, 1998); Andrew Abbott, Chaos of
Disciplines (Chicago, 2001); Bruce Mazlish, A New Science: The
Breakdown of Connections and the Birth of Sociology (Oxford, 1989).
170
I disagree with optimistic technology worshipers like Ray Kurzweil
who think that technology will solve all human and environmental
problems. See The Singularity is Near: When Humans Transcend
Biology (New York, 2005). The scientist and philosopher Rene Dubos
explained the central problem with the panacea of technology: "Most of
the scientific and technological dreams of mankind have come to pass.
In fact the achievements of modern scientific technology greatly
exceeded the most visionary imaginings of Franklin, Condorcet, and the
other philosophers of the Enlightenment. But the consequences of these
achievements for society do not correspond to eighteenth-century
hopes. Even under the most favorable conditions, the present ways of
life do not necessarily result in better health and greater happiness" (p.
206-07). A God Within: A Positive Philosophy for a More Complete
Fulfillment of Human Potentials (New York, 1972).
171
I agree with John Gray that often when science is linked to the
concept of truth it renews the "mystical faith" of Plato and Augustine;
however, I am using the word truth as the equivalent of true, fact, or
knowledge of the objective world - which of course is always
provisional and subject to revision. As I understand it, most practicing
scientists would agree with this more humble concept of truth. See John
Gray, Straw Dogs: Thoughts on Humans and Other Animals (New
York, 2003), 20.
172
Steven Shapin, "Lowering the Tone in the History of Science," in
Never Pure: Historical Studies of Science as if It Was Produced by
People with Bodies, Situated in Time, Space, Culture, and Society, and
Struggling for Credibility and Authority (Baltimore, 2010), 7.
173
Paul Feyerabend, Against Method, 4th ed. (London, 2010), 105, 3.
174
Paul Feyerabend, The Tyranny of Science (Cambridge, UK, 2011), 8.
175
Stephen Toulmin, Foresight and Understanding: An Inquiry into the
Aims of Science (New York, 1961), 109.
176
Ivan Turgenev, Fathers and Sons, Trans. Rosemary Edmonds
(London, 1965), 98.
177
Ernst Mayr, This Is Biology: The Science of the Living World
(Cambridge, MA, 1997), 33.
178
Karl R. Popper, Objective Knowledge: An Evolutionary Approach,
revised edition (Oxford, 1979), 108; Mayr, This Is Biology: The Science
of the Living World, ch 2.
179
Daniel Dennett, “Why Getting It Right Matters: How Science
Prevails,” Science & Religion: Are They Compatible (Amherst, 2003),
156. See also Dennett, Freedom Evolves, 6. Kenneth Burke meant
something very similar when he called science a "criticism of
criticism." See Burke, Permanence and Change,6; Beach, Studies in
Ideology, 77.
180
David Bloor, Knowledge and Social Imagery, 2nd ed. (Chicago,
1991), 40-45.
181
David Deutsch, The Fabric of Reality: The Science of Parallel
Universes - and Its Implications (New York, 1997), 3, 7.
182
Ibid., 3.
183
"Game Theory in Practice," The Economist Technology Quarterly
(Sept 3 2011), 19-20.
184
Mill qtd. in Richard Reeves, John Stuart Mill: Victorian Firebrand
(London, 2007), 271.
185
Mill qtd. in ibid., 269.
186
C. Wright Mills, The Sociological Imagination (Oxford, 2000), 127;
Karl Popper, The Logic of Scientific Discovery (London, 2002).
187
Karl R. Popper, Objective Knowledge: An Evolutionary Approach,
70-71. For a review and critique of Popper's method see Paul
Feyerabend, Against Method, 154-158.
188
Oreskes and Conway, Merchants of Doubt, 154.
189
Sen, The Idea of Justice, 157-59; Michael Polanyi, Personal
Knowledge, 217; Charles E. Lindblom, Inquiry and Change: The
Troubled Attempt to Understand and Shape Society (New Haven,
1990), 168.
190
Karl Popper, The Logic of Scientific Discovery (London, 2002). See
also Ian Hacking, Representing and Intervening: Introductory Topics in
the Philosophy of Natural Science (Cambridge, UK, 1983), 3.
However, this theory of falsification is an ideal version of the more
complex, sordid affairs of actual scientists. See Mayr, This Is Biology:
The Science of the Living World, 47, 51. Feyerabend argued, "A strict
principle of falsification, or a 'naive falsificationism' as Lakatos calls it,
would wipe out science as we know it and would never have permitted
it to start." Feyerabend, Against Method, 157.
191
Mayr, This Is Biology: The Science of the Living World, 61.
192
John Gray, "The Liberalism of Karl Popper," Essays in Political
Philosophy (London, 1989), 12.
193
Deutsch, The Fabric of Reality, 17.
194
Diane Harley and Sophia Krzys Acord, Peer Review in Academic
Promotion and Publishing: Its Meaning, Locus, and Future, Center for
Studies in Higher Education, University of California at Berkeley
(March 2011); Charles Gross, "Disgrace," The Nation (Jan 9/16 2012),
25-32.
195
Ibid. See also "Misconduct in Science: An Array of Errors," The
Economist (Sept 10 2011); David H. Freedman, "Lies, Damned Lies,
and Medical Science," The Atlantic (Nov 2010), 76-86. John Ioannidis,
a medical doctor and researcher, has found that 80 percent of the
conclusions of non-randomized medical studies are false, and that
around 41% of the most rigorous and widely cited articles in the field
are "wrong or significantly exaggerated" (p. 81). As Freedman points
out, "The ultimate protection against research error and bias is supposed
to come from the way scientists constantly retest each other's results -
except they don't. Only the most prominent findings are likely to be put
to the test, because there's likely to be publication payoff in firming up
the proof, or contradicting it...even when a research error is outed, it
typically persists for years or even decades" (p. 84-85).
196
Carl Zimmer, "A Sharp Rise in Retractions Prompts Calls for
Reform," The New York Times (April 16 2012).
197
Paul Feyerabend, Against Method, 160.
198
qtd. in Karl Popper, The Logic of Scientific Discovery (London,
2002), 6; Lindblom, Inquiry and Change: The Troubled Attempt to
Understand and Shape Society, 167.
199
Ernst Mayr, This Is Biology: The Science of the Living World
(Cambridge, MA, 1997), 77. On the scientific process and the
evolution of scientific “truth” see Kuhn, The Structure of Scientific
Revolutions. See also I. Bernard Cohen, Revolution in Science
(Cambridge, 1985); Foucault, The Order of Things; Steinmetz,
“Introduction: Positivism and Its Others in the Social Sciences.”
200
Michael Adas, Machines as the Measure of Men: Science,
Technology, and Ideologies of Western Dominance (Ithaca, NY, 1990);
Merritt Roe Smith and Leo Marx, eds., Does Technology Drive
History? The Dilemma of Technological Determinism (Cambridge,
MA, 1994). See also W. Brian Arthur, The Nature of Technology:
What It Is and How It Evolves (New York, 2009). Arthur offers a
unique evolutionary theory of technology that is tantalizing, but beyond
the scope of my use of this word in this book.
201
“Norman Borlaug,” The Economist (19 September 2009), 91.
202
Matt Ridley, The Rational Optimist: How Prosperity Evolves (New
York, 2010).
203
For example Ray Kurzweil, The Singularity is Near: When Humans
Transcend Biology (New York, 2005).
204
Paul Feyerabend, Against Method, 223.
205
I agree with John Gray's critique of science, but I disagree with his
notion that "the pretensions of science to contain a rationally privileged
world-view should be, and can be humbled" (231). While I think the
pretensions of science should be humbled, as I will attempt to do in this
essay, I believe that science rightly deserves its privileged
world-view when it comes to knowledge. But I will also argue that
most people are incapable of understanding or practicing science, thus,
it should not be the privileged world-view for all people everywhere in
terms of living a meaningful life. See John Gray, Enlightenment's Wake
(New York, 2009), 231.
206
"And Man Made Life," The Economist (May 22 2010), 11.
207
qtd. in Freedman, "Lies, Damned Lies, and Medical Science," 86.
208
Robert Klee, "Introduction," Scientific Inquiry: Readings in the
Philosophy of Science, ed. Robert Klee (Oxford, 1999), 2-3.
209
In what follows I critique science in general, making no distinction
between physical and social science. For a very cogent critique of social
science in particular see Charles E. Lindblom, Inquiry and Change: The
Troubled Attempt to Understand and Shape Society, 192-94.
210
Daniel Kahneman, Thinking, Fast and Slow (New York, 2011), 8.
211
Paul Feyerabend, Against Method, 4th ed. (London, 2010), 10.
212
David M. Kreps, "Economics - The Current Position," American
Academic Culture in Transformation, Thomas Bender and Carl E.
Schorske, eds. (Princeton, 1997), 77, 97.
213
Gray, Enlightenment's Wake, 231; Kahneman, Thinking, Fast and
Slow, 9.
214
Gary Taubes, Good Calories, Bad Calories: Fats, Carbs, and the
Controversial Science of Diet and Health (New York, 2008), 462, 1, 60.
215
Kuhn, The Structure of Scientific Revolutions. See also Michael
Polanyi, Personal Knowledge, 200.
216
John Ioannidis, qtd. in Freedman, "Lies, Damned Lies, and Medical
Science," 84.
217
Irving Kirsch, The Emperor's New Drugs: Exploding the
Antidepressant Myth (New York, 2010), 53.
218
Charles Gross, "Disgrace," The Nation (Jan 9/16 2012), 26. See also
Horace Freeland Judson, The Great Betrayal: Fraud in Science (New
York, 2004); David Goodstein, On Fact and Fraud: Cautionary Tales
from the Front Lines of Science (Princeton, 2010).
219
Carl Zimmer, "A Sharp Rise in Retractions Prompts Calls for
Reform," The New York Times (April 16 2012).
220
Kirsch, The Emperor's New Drugs, 25.
221
Kirsch, The Emperor's New Drugs.
222
Curt Rice, "Does Science Publishing Rely Too Much on Subjective
Decisions and Personal Networks?" University World News: Global
Edition, 217 (April 15 2012).
223
Taubes, Good Calories, Bad Calories, 51-52; Freedman, "Lies,
Damned Lies, and Medical Science," 82-85.
224
Freedman, "Lies, Damned Lies, and Medical Science," 84; Charles
Gross, "Disgrace," The Nation (Jan 9/16 2012), 25-32.
225
Ibid., 80-81.
226
Charles E. Lindblom and David K. Cohen, Usable Knowledge:
Social Science and Social Problem Solving (New Haven, 1979), 62.
One of the prime examples of the misuse of science is social
Darwinism, which was a powerful perversion of Darwin's theory of
natural selection. See Richard Hofstadter, Social Darwinism in
American Thought (Boston, 1992).
227
Taubes, Good Calories, Bad Calories, 152, 451.
228
Daniel Kahneman, Thinking, Fast and Slow (New York, 2011), 219.
229
Naomi Oreskes and Erik M. Conway, Merchants of Doubt: How a
Handful of Scientists Obscured the Truth on Issues from Tobacco
Smoke to Global Warming (New York, 2011).
230
Shapin, Never Pure; Lindley, Uncertainty, 89. The case study of
Nazi Germany is a very interesting and frightening example.
231
Charles E. Lindblom and David K. Cohen, Usable Knowledge:
Social Science and Social Problem Solving (New Haven, 1979), 40.
232
Shapin, "The House of Experiment in Seventeenth-century
England," in Never Pure, 88. Polanyi and Prosch, Meaning, 185.
233
Tony Judt, The Burden of Responsibility: Blum, Camus, Aron and
the French Twentieth Century (Chicago, 1998), 17.
234
Michael Polanyi, Science, Faith and Society: A Searching
Examination of the Meaning and Nature of Scientific Inquiry (Chicago,
1964), 43. Polanyi explained, "A full initiation into the premises of
science can be gained only by the few who possess the gifts for
becoming independent scientists, and they usually achieve it only
through close personal association with the intimate views and practice
of a distinguished master" (p. 43). See also Polanyi and Prosch,
Meaning, 184.
235
Robert M. Solow, "How Did Economics Get That Way and What
Way Did it Get?" American Academic Culture in Transformation,
Thomas Bender and Carl E. Schorske, eds. (Princeton, 1997), 75. See
also Charles E. Lindblom and David K. Cohen, Usable Knowledge:
Social Science and Social Problem Solving (New Haven, 1979), 40.
236
I speak of course of the privileging of positivist physical science
over the social sciences and the humanities in the western university.
Jonathan R. Cole, The Great American University: Its Rise to
Preeminence, Its Indispensable National Role, and Why It Must Be
Protected (New York, 2009), 100, 151, 155. See also Toulmin, Return
to Reason, 22, 99; Steinmetz, The Politics of Method in the Human
Sciences; Hilary Putnam, "A Half Century of Philosophy, Viewed from
Within," American Academic Culture in Transformation, Thomas
Bender and Carl E. Schorske, eds. (Princeton, 1997).
237
Ludwig Wittgenstein, Tractatus Logico-Philosophicus (London,
2001), 3.
238
Michael Polanyi, Personal Knowledge, 163, 216.
239
Taubes, Good Calories, Bad Calories, xxii.
240
Hilary Putnam, "A Half Century of Philosophy, Viewed from
Within," 201.
241
Thomas Bender, "Politics, Intellect, and the American University,
1945-1995," American Academic Culture in Transformation, Thomas
Bender and Carl E. Schorske, eds. (Princeton, 1997), 23, 32.
242
Toulmin, Return to Reason, 79.
243
"Experimental Psychology: The Roar of the Crowd," The Economist
(May 26 2012), 77-79.
244
Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK,
2010), 9.
245
Stephen Toulmin, The Uses of Argument (Cambridge, UK, 1958),
220.
246
Charles E. Lindblom and David K. Cohen, Usable Knowledge:
Social Science and Social Problem Solving (New Haven, 1979), 40.
247
Karl R. Popper, Objective Knowledge: An Evolutionary Approach,
revised edition (Oxford, 1979), 29.
248
Carl E. Schorske, "The New Rigorism in the Human Sciences, 1940-
1960," American Academic Culture in Transformation, Thomas Bender
and Carl E. Schorske, eds. (Princeton, 1997), 317.
249
Daniel Kahneman, Thinking, Fast and Slow (New York, 2011), 8.
250
The predictive software developed by Bruce Bueno de Mesquita
seems to be available to the highest bidder, which creates the possibility
of abuse by the rich and powerful, not to mention the unintended
consequences of powerful agents acting on wrong information
generated by this software. See "Game Theory in Practice," The
Economist Technology Quarterly (Sept 3 2011), 19-20.
251
Toulmin, Return to Reason, 80.
252
Friedrich Nietzsche, Beyond Good and Evil, trans. Walter Kaufmann
(New York, 1989), 85.
253
Michael Polanyi, Personal Knowledge: Towards a Post-Critical
Philosophy (Chicago, 1962), 268. See also Charles E. Lindblom and
David K. Cohen, Usable Knowledge: Social Science and Social
Problem Solving (New Haven, 1979), 47.
254
Claude Levi-Strauss, The Savage Mind (London, UK, 1966), 15.
For a discussion of the same development and a critique of Levi-Strauss
see: Jack Goody, The Domestication of the Savage Mind (Cambridge,
UK, 1977).
255
Thomas L. Haskell, The Emergence of Professional Social Science
(Urbana, IL, 1977); Kuhn, The Structure of Scientific Revolutions;
Novick, That Noble Dream; Shapin, Never Pure; Steinmetz, ed., The
Politics of Method in the Human Sciences.
256
Berlin, “Historical Inevitability,” 168. Berlin perceptively explained
how one cannot use “truly scientific methods” outside their “proper
field” because it would lead to “total absurdity.” See also Heisenberg,
Physics and Philosophy, 179; Polanyi, Personal Knowledge, 3;
Kahneman, Thinking, Fast and Slow, 28.
257
Charles E. Lindblom, Inquiry and Change: The Troubled Attempt to
Understand and Shape Society (New Haven, 1990), 43.
258
Karl Popper, qtd. in Mayr, This Is Biology: The Science of the Living
World, 41.
259
Oreskes and Conway, Merchants of Doubt, 273.
260
David Lindley, Uncertainty: Einstein, Heisenberg, Bohr, and the
Struggle for the Soul of Science (New York, 2008), 216.
261
I want to be clear that I am not agreeing with Feyerabend's position
that scholars just "tell fairytales" and all fairytales are equally valid,
which would be representative of post-modern relativism. My point is
much more subtle and complex. See Paul Feyerabend, The Tyranny of
Science (Cambridge, UK, 2011), 13.
262
Charles E. Lindblom, Inquiry and Change: The Troubled Attempt to
Understand and Shape Society, 64.
263
Kenneth Burke, The Philosophy of Literary Form, 3rd ed. (Berkeley,
CA, 1973), 6.
264
McNeill, Mythistory and Other Essays, 84.
265
Karl R. Popper, Objective Knowledge: An Evolutionary Approach,
revised edition (Oxford, 1979), 37.
266
Stephen Toulmin, Foresight and Understanding: An Inquiry into the
Aims of Science (New York, 1961), 81.
267
For further discussion on the perceptual process and knowing see: J.
M. Beach, Studies in Ideology: Essays on Culture and Subjectivity
(Lanham, 2005); Kenneth Burke, A Grammar of Motives (Berkeley,
1969); Kenneth Burke, A Rhetoric of Motives (Berkeley, 1969); John R.
Searle, Consciousness and Language; Sokolowski, Introduction to
Phenomenology. My basic ontology and epistemology has been shaped
by the traditions of American pragmatism (William James and John
Dewey), symbolic interactionism (George Herbert Mead), and practice
theory (Pierre Bourdieu, Sherry B. Ortner, and Etienne Wenger).
268
Lindblom, Inquiry and Change: The Troubled Attempt to
Understand and Shape Society, 169-70; Philip K. Howard, The Death
of Common Sense: How Law is Suffocating America (New York, 1994),
12, 60-61, 87, 186-187.
269
Richard Rorty, ed., The Linguistic Turn: Essays in Philosophical
Method (Chicago, 1992).
270
Toulmin, Return to Reason, 67.
271
Deutsch, The Fabric of Reality, 8, 17.
272
Ibid., 45.
273
Andrew Collier, "Critical Realism," in The Politics of Method in the
Human Sciences, George Steinmetz, ed. (Durham, NC, 2005), 327.
274
Heisenberg, Physics and Philosophy, 118-119, 141-143. In
particular see Heisenberg's essay, "Language and Reality in Modern
Physics."
275
Lindley, Uncertainty, 86.
276
Polanyi, Science, Faith and Society, 32; Polanyi and Prosch,
Meaning, 101, 107.
277
Steven Shapin, Never Pure: Historical Studies of Science as if It
Was Produced by People with Bodies, Situated in Time, Space, Culture,
and Society, and Struggling for Credibility and Authority (Baltimore,
2010).
278
Burke, Permanence and Change, 92. See also Burke, A Grammar of
Motives, 105, 59-60; Beach, Studies in Ideology, 93. See also Pierre
Bourdieu, Outline of a Theory of Practice (Cambridge, UK, 2010), 19.
279
Binyamin Appelbaum, "On Economy, Raw Data Gets a Grain of
Salt," The New York Times (Aug 16 2011); Martha C. Nussbaum,
Creating Capabilities: The Human Development Approach
(Cambridge, MA, 2011); Joseph Stiglitz, Amartya Sen, Jean-Paul
Fitoussi, and associates, Report by the Commission on the Measurement
of Economic Performance and Social Progress (Sept 2009),
<www.stiglitz-sen-fitoussi.fr/en/documents.htm>
280
Louis Hyman, Borrow: The American Way of Debt (New York,
2012), 240; Michael Lewis, The Big Short: Inside the Doomsday
Machine (New York, 2011), 144.
281
"The Big Mac Index: Fast Food for Thought," The Economist (July
30 2011), 12.
282
Edmund S. Morgan, Inventing the People: The Rise of Popular
Sovereignty in England and America (New York, 1988).
283
Joel Bakan, The Corporation: The Pathological Pursuit of Profit
and Power (New York, 2005); "Peculiar People," The Economist
(March 26 2011): 78.
284
Rogers M. Smith, Civic Ideals: Conflicting Visions of Citizenship in
U. S. History (New Haven, 1997), 402-407; Eric Foner, The Story of
American Freedom (New York, 1998), 131-133.
285
McNeill, Mythistory and Other Essays, 5.
286
Daniel Dennett, Consciousness Explained (New York, 1991), 455.
287
Elinor Ostrom, Governing the Commons: The Evolution of
Institutions for Collective Action (Cambridge, UK, 1990), 8, 21.
288
The italics are the author’s emphasis. Tony Lawson, “Economics
and Critical Realism: A Perspective on Modern Economics,” The
Politics of Method in the Human Sciences (Durham, 2005), 383.
289
Kathryn Sutherland, “Introduction,” in Adam Smith, An Inquiry into
the Nature and Causes of the Wealth of Nations (Oxford, 2008), xlii.
290
Steven D. Levitt and Stephen J. Dubner, Freakonomics (New York,
2009), xxv.
291
James Coleman, Introduction to Mathematical Sociology (New
York, 1964), 516-19.
292
“Number-Crunchers Crunched,” The Gods Strike Back: A Special
Report on Financial Risk, The Economist (13 Feb 2010), 7; Anatole
Kaletsky, "Wrong, INET?: A Challenge to Economic Orthodoxy," The
Economist: The World in 2011 (London, 2010), 154.
293
Hyman, Borrow, 244-47; Lewis, The Big Short, 54-55.
294
Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK,
2010), 29.
295
Nussbaum, Creating Capabilities, ix.
296
Elinor Ostrom, Governing the Commons: The Evolution of
Institutions for Collective Action (Cambridge, UK, 1990), 24.
297
Adrian Johns, Foreword, Walter J. Ong, Ramus: Method, and the
Decay of Dialogue (Chicago, 2004), ix. On the history and concept of
mapmaking see: Ken Jennings, Maphead: Charting the Wide, Weird
World of Geography (New York, 2012).
298
Walter J. Ong, Ramus: Method, and the Decay of Dialogue
(Chicago, 2004), 83.
299
Jordan Branch, "Mapping the Sovereign State: Technology,
Authority, and Systemic Change," International Organization 65, no 1
(Winter 2011): 1-36.
300
Jennings, Maphead.
301
For map as ideological technology see Polanyi, Personal
Knowledge, 4; Feyerabend, Against Method, 233; Elinor Ostrom,
Understanding Institutional Diversity (Princeton, 2005), 8, 11; Michael
Dutton, "The Trick of Words: Asian Studies, Translation, and the
Problems of Knowledge," in The Politics of Method in the Human
Sciences, George Steinmetz, ed. (Durham, NC, 2005), 92-93; Margaret
Levi, "A Model, A Method, and a Map: Rational Choice in
Comparative and Historical Analysis," Comparative Politics:
Rationality, Culture, and Structure, Mark Irving Lichbach and Alan S.
Zuckerman, eds. (Cambridge, 1997), 19-42. For a discussion of the
epistemological concept of maps and the writing of history see McNeill,
Mythistory and Other Essays, 100-1. For a discussion of the fictional
quality of maps in relation to the writing of fictional narratives see Max
Byrd, “This Is Not a Map,” The Wilson Quarterly 33, no. 3 (Summer
2009), 26-32; Cormac McCarthy, Cities of the Plain (New York, 1998),
268, 273, 291.
302
"Google Earth," Wikipedia (July 8, 2011) <
http://en.wikipedia.org/wiki/Google_Earth >
303
Stephen Toulmin, The Philosophy of Science (London, 1953).
304
Jerry Brotton, A History of the World in Twelve Maps
(London, 2012).
305
Jorge Luis Borges, "On Exactitude in Science," The Aleph and Other
Stories, trans. Andrew Hurley (New York, 2000), 181.
306
Walter Lippmann, Public Opinion, 11.
307
Elinor Ostrom, Understanding Institutional Diversity, 8.
308
McNeill, Mythistory and Other Essays, 100-101.
309
Alex Wright, Glut: Mastering Information Through the Ages (Ithaca,
2007), 24. See also George Lakoff, Women, Fire, and Dangerous
Things: What Categories Reveal About the Mind (Chicago, 1987);
Brent Berlin, Ethnobiological Classification: Principles of
Categorization of Plants and Animals in Traditional Societies
(Princeton, 1992).
310
Isaiah Berlin, “The Concept of Scientific History,” The Proper Study
of Mankind (New York, 2000), 39, 43. Berlin writes: We formulate
judgment through “experience, memory, imagination, on the sense of
'reality'...which may need constant control by, but is not at all identical
with, the capacity for logical reasoning and the construction of laws and
scientific models.”
311
Lakoff, Women, Fire, and Dangerous Things, 8.
312
Paul Feyerabend, Against Method, 4th ed. (London, 2010), xviii.
Feyerabend's classic treatise argues that "science should be taught as
one view among many and not as the one and only road to truth and
reality."
313
Richard Wolin, "Reflections on the Crisis in the Humanities," The
Hedgehog Review 13, no. 2 (Summer 2011), 14-15, 18.
314
Walter J. Ong, Ramus: Method, and the Decay of Dialogue
(Chicago, 2004); Toulmin, Return to Reason, 29.
315
Christopher Hitchens, "Isaac Newton," Arguably: Essays by
Christopher Hitchens (New York, 2011), 143.
316
Daniel Walker Howe, What Hath God Wrought: The Transformation
of America, 1815-1848 (Oxford, 2007), 466, 469.
317
Mayr, This Is Biology: The Science of the Living World, 36.
318
James Ladyman, "Philosophy That's Not for the Masses," The
Philosopher's Magazine 53 (2011)
<http://www.philosophypress.co.uk/?p=1906>. Ladyman argues, "I do not see why
all philosophers, or even most, should be interested in communicating
their thoughts about these matters to the world."
319
Karl R. Popper, Objective Knowledge: An Evolutionary Approach,
revised edition (Oxford, 1979), 32.
320
Ibid., 33.
321
Hugo Mercier and Dan Sperber, "Why Do Humans Reason?
Arguments for an Argumentative Theory," Behavioral and Brain
Sciences 34 (2011): 57-111.
322
Louis Menand, The Marketplace of Ideas: Reform and Resistance in
the American University (New York, 2010), ch 1.
323
Martha C. Nussbaum, Cultivating Humanity: A Classical Defense of
Reform in Liberal Education (Cambridge, MA, 1997).
324
Amy Gutmann, Democratic Education (Princeton, 1987).
325
Bruce Kuklick, The Rise of American Philosophy: Cambridge,
Massachusetts, 1860-1930 (New Haven, 1977), 565-566, 575-576.
326
See for example Raymond Williams, "Philosophy," Keywords: A
Vocabulary of Culture and Society, Revised Edition (Oxford, 1983),
235; Mercier and Sperber, "Why Do Humans Reason?"; Wolin,
"Reflections on the Crisis in the Humanities," 19.
327
Talbot Brewer, "Talbot Brewer Responds," The Hedgehog Review
13, no. 2 (Summer 2011), 84.
328
Talbot Brewer, The Retrieval of Ethics (Oxford, 2009), 326.
329
Henry David Thoreau, Walden, In Walden and Resistance to Civil
Government, 2nd ed. (New York, 1992), 9.
330
Sarah Bakewell, How to Live: A Life of Montaigne in One Question
and Twenty Attempts at an Answer (New York, 2010), 109.
331
Nozick, The Examined Life, 18-19.
332
Toulmin, Return to Reason, 14.
333
Charles E. Lindblom and David K. Cohen, Usable Knowledge:
Social Science and Social Problem Solving (New Haven, 1979), 12-13.
334
Ibid., 15, 17, 37.
335
Bakewell, How to Live: A Life of Montaigne in One Question and
Twenty Attempts at an Answer.
336
Bakewell, How to Live; Nussbaum, Cultivating Humanity, 25;
Mercier and Sperber, "Why Do Humans Reason?"
337
McNeill, Mythistory and Other Essays, 47.
338
Ibid., 14.
339
Sen, The Idea of Justice, 124, 324-26. I am using this term
"impartial spectator" in the way that Sen uses the term, by which he
meant an ideal observational posture whereby humans try to get beyond
their own subjectivity and culture to arrive at objective truth. I am NOT
using this term in the way Husserl once did as a naive assumption that
scientists-as-observers were automatically detached, neutral observers,
which creates, as Bourdieu noted, a "theoretical distortion" (p. 1).
Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK,
2010).
340
Lindblom, Inquiry and Change: The Troubled Attempt to
Understand and Shape Society, 170. See also Mercier and Sperber,
"Why Do Humans Reason?"
341
Gregory Vlastos, Socrates: Ironist and Moral Philosopher (Ithaca,
NY, 1991); Gregory Vlastos, Studies in Greek Philosophy Volume II:
Socrates, Plato, and Their Tradition (Princeton, 1995); Walter J. Ong,
Ramus: Method, and the Decay of Dialogue (Chicago, 2004), 61.
342
Philip K. Howard, The Death of Common Sense: How Law is
Suffocating America (New York, 1994), 60.
343
Amartya Sen, qtd. in Nussbaum, Cultivating Humanity, 45.
344
Nozick, The Examined Life, 267, 270; Nussbaum, Cultivating
Humanity, 25.
345
Peter Gay, The Enlightenment: The Rise of Modern Paganism (New
York, 1995), 143-44.
346
I am using the term "sophist" as a general term for philosophers in
ancient Greece. I am well aware of Plato's polemics against the
"sophists" as a professional class of philosophers who sold their
services for money. However, I think Plato's critiques of the sophists
are one-sided and his portrait of Socrates very biased. While Socrates
may have been an exceptional philosopher who apparently did not sell
his services, he was still a part of the sophist class of public
intellectuals. John Gray sardonically noted, "It may be that Socrates
was not the questioning rationalist Plato made him out to be. He may
have been a playful sophist who looked on philosophy as sport, a game
no one took seriously - least of all himself." Straw Dogs: Thoughts on
Humans and Other Animals (New York, 2003), 106.
347
For a good historical overview of classical Greek philosophy and its
myriad branchings see Anthony Gottlieb, The Dream of Reason: A
History of Philosophy from the Greeks to the Renaissance (New York,
2001).
348
18th and 19th century German philosophers used a special word for
the human capacity of understanding, “Verstehen,” which was a tool
used to study “Geisteswissenschaften,” the science of the human spirit,
or what we call in English the “humanities.”
349
Isaiah Berlin, qtd. in John Gray, Isaiah Berlin (Princeton, 1996), 67.
350
David Deutsch, The Fabric of Reality: The Science of Parallel
Universes - and Its Implications (New York, 1997), 2.
351
Karl Popper, The Logic of Scientific Discovery (London, 2002), xix.
352
Polanyi, Personal Knowledge, 63.
353
Jack Goody, The Domestication of the Savage Mind (Cambridge,
UK, 1977), 9-10.
354
Nozick, The Examined Life, 15, 12.
355
I loosely paraphrase and revise Jean-Jacques Rousseau. In Emile he
wrote, "It is not a question of knowing what is, but only what is useful."
356
Michael Polanyi, Personal Knowledge: Towards a Post-Critical
Philosophy (Chicago, 1962), 249, 265, 268.
357
Karl Jaspers, Philosophy Is for Everyman: A Short Course in
Philosophical Thinking (New York, 1967).
358
Antonio Gramsci, Prison Notebooks, quoted in Sen, The Idea of
Justice, 121.
359
Max Planck qtd. in Shapin, Never Pure, 349. See also Karl Popper,
The Logic of Scientific Discovery (London, 2002), xxii.
360
Charles E. Lindblom, Inquiry and Change: The Troubled Attempt to
Understand and Shape Society (New Haven, 1990), 30.
361
Nozick, The Examined Life, 298.
362
Isaiah Berlin, “The Pursuit of the Ideal,” The Proper Study of
Mankind (New York, 2000), 15.
363
Peter Gay, The Enlightenment: The Rise of Modern Paganism (New
York, 1995), 92.
364
Jennifer Michael Hecht, Doubt: A History (New York, 2003);
Amartya Sen, The Argumentative Indian (New York, 2005), 21-30.
365
I. Bernard Cohen, Revolution in Science, 135, 142. See also Thomas
S. Kuhn, The Structure of Scientific Revolutions.
366
Alfred North Whitehead, Science and the Modern World
(Cambridge, UK, 1926), 17; Walter J. Ong, Ramus: Method, and the
Decay of Dialogue (Chicago, 2004), 165.
367
Quoted by Harris E. Starr, William Graham Sumner (New York,
1925), 167-168. See also Richard Hofstadter, Social Darwinism in
American Thought (New York, 1955), 55.
368
Whitehead, Science and the Modern World, 17; Ong, Ramus:
Method, and the Decay of Dialogue, 165.
369
American Social Science Association, Constitution (27 December
1865), cited by Thomas L. Haskell, The Emergence of Professional
Social Science (Urbana, IL, 1977), 111-112.
370
Timothy Fitzgerald, The Ideology of Religious Studies (Oxford,
2000); Jacob Pandian, “The Dangerous Quest for Cooperation between
Science and Religion,” Science and Religion: Are They Compatible?
(Amherst, 2003); Charles Taylor, A Secular Age (Cambridge, MA,
2007).
371
Taylor, A Secular Age, 54-55.
372
Fitzgerald, The Ideology of Religious Studies; Pandian, “The
Dangerous Quest for Cooperation between Science and Religion,” 165.
373
William T. Cavanaugh, “Sins of Omission: What ‘Religion and
Violence’ Arguments Ignore,” Religion and Violence, The Hedgehog
Review 6, no. 1 (Spring 2004): 37. See also, Fitzgerald, The Ideology of
Religious Studies, 3-32. Jack Goody argues that "religion" is not so
much a "western" invention as it is a more general product of literate
societies. See The Logic of Writing and the Organization of Society
(Cambridge, UK: Cambridge University Press, 1986), 4-5.
374
Taylor, A Secular Age, 270.
375
Taylor, A Secular Age, 275. See also Berlin, “The Apotheosis of the
Romantic Will,” 555-559; “Herder and the Enlightenment,” 426; “The
Divorce Between the Sciences and the Humanities,” 326-28; “The
Originality of Machiavelli,” 312-313; “The Counter-Enlightenment,”
245-46; “The Pursuit of the Ideal,” 5.
376
Qtd. in Peter Gay, The Party of Humanity: Essays in the French
Enlightenment (New York, 1971), 26.
377
Peter Gay, The Enlightenment: The Rise of Modern Paganism (New
York, 1995), 374-377.
378
Voltaire, Letter, August 18, 1756, qtd. in Peter Gay, The
Enlightenment, 68.
379
Peter Gay, The Enlightenment, 148-49.
380
Peter Gay, The Enlightenment, 338; Taylor, A Secular Age, 294;
Berlin, “The Sciences and the Humanities,” 330.
381
Gay, The Party of Humanity: Essays in the French Enlightenment,
124.
382
Mill qtd. in Richard Reeves, John Stuart Mill: Victorian Firebrand
(London, 2007), 163-66.
383
Thomas Jefferson, “Letter to Danbury Baptist Association,” Jan 1
1802. The Supreme Court did not legitimate this notion until 1878. For
a history of this letter see James Hutson, “A Wall of Separation: FBI
Helps Restore Jefferson’s Obliterated Draft,” Information Bulletin,
Library of Congress, 57, no. 2 (June 1998) <www.loc.gov/loc/
lcib/9806/danbury.html>
384
Thomas Bender, "Politics, Intellect, and the American University,
1945-1995," American Academic Culture in Transformation, Thomas
Bender and Carl E. Schorske, eds. (Princeton, 1997), 28.
385
Taylor, A Secular Age.
386
Rajeev Bhargava, "Does Religious Pluralism Require Secularism?"
Hedgehog Review 12, no. 3 (Fall 2010); Charles Taylor, "The Meaning
of Secularism," Hedgehog Review 12, no. 3 (Fall 2010); Craig Calhoun,
"Rethinking Secularism," Hedgehog Review 12, no. 3 (Fall 2010).
387
Jose Casanova, “Rethinking Secularization: A Global Comparative
Perspective,” After Secularization, The Hedgehog Review 8, no 1-2
(Spring & Summer 2006); Paul Heelas, “Challenging Secularization
Theory: The Growth of ‘New Age’ Spiritualities,” After Secularization,
The Hedgehog Review 8, no 1-2 (Spring & Summer 2006); Daniele
Hervieu-Leger, “In Search of Certainties: The Paradoxes of Religiosity
in Societies of High Modernity,” After Secularization, The Hedgehog
Review 8, no 1-2 (Spring & Summer 2006).
388
Raymond Aron, "The Future of Secular Religions" and "From
Marxism to Stalinism," in The Dawn of Universal History: Selected
Essays from a Witness of the Twentieth Century, trans. Barbara
Bray (New York, 2002), 177-201, 203; Raymond Aron, The Opium
of the Intellectuals (New York, 1962); John Gray, Straw Dogs:
Thoughts on Humans and Other Animals (New York, 2003), xiii, 4;
John Gray, Black Mass: Apocalyptic Religion and the Death of
Utopia (New York, 2007).
389
John Gray, Black Mass: Apocalyptic Religion and the Death of
Utopia (New York, 2007), 1-3.
390
Taylor, "The Meaning of Secularism," 26.
391
Kendrick Frazier, “Are Science and Religion Conflicting or
Complementary?” Science and Religion: Are They Compatible?
(Amherst, 2003), 26-27; Robert D. Putnam and David E. Campbell,
American Grace: How Religion Divides and Unites Us (New York,
2010).
392
Neil deGrasse Tyson, “Holy Wars: An Astrophysicist Ponders the
God Question,” Science and Religion: Are They Compatible? (Amherst,
2003), 77; Eugenie C. Scott, “The ‘Science and Religion Movement’:
An Opportunity for Improved Public Understanding of Science?”
Science and Religion: Are They Compatible? (Amherst, 2003), 112.
393
Taylor, A Secular Age, 727.
394
Putnam and Campbell, American Grace: How Religion Divides and
Unites Us, 33.
395
Martin E. Marty, “Our Religio-Secular World,” Daedalus 132, no. 3
(Summer 2003): 42, 47.
396
Taylor, A Secular Age, 636.
397
Actually, the conflict between secularism and religion extends even
further back than these concepts themselves. For at least the past three
thousand years, doubts about religious beliefs have co-existed alongside
those beliefs in various cultures. See Hecht, Doubt: A
History.
398
Mario Biagioli, Galileo, Courtier: The Practice of Science in the
Culture of Absolutism (Chicago, 1993).
399
Hecht, Doubt: A History.
400
Berlin, “The Counter-Enlightenment,” 263. On the enlightenment
see Peter Gay, The Enlightenment: The Rise of Modern Paganism (New
York, 1966); The Enlightenment: The Science of Freedom (New York,
1969). See also Berlin “The Apotheosis of the Romantic Will;” “Herder
and the Enlightenment;” “The Divorce Between the Sciences and the
Humanities;” “The Originality of Machiavelli.”
401
Berlin, “The Counter-Enlightenment,” 250-51; “The Sciences and
the Humanities,” 328.
402
Johann Gottfried Herder, Auch eine Philosophie der Geschichte zur
Bildung der Menschheit (Another Philosophy of History on the
Development of Mankind), in Berlin, “Herder and the Enlightenment,”
408. Herder also questioned the Euro-centric bias of enlightenment
philosophers (416).
403
James Davison Hunter, Culture Wars: The Struggle to Define
America (New York, 1991); Peter L. Berger, ed., The Limits of Social
Cohesion: Conflict and Mediation in Pluralist Societies (Boulder, CO,
1998).
404
On this debate in England see Frank M. Turner, Between Science
and Religion: The Reaction to Scientific Naturalism in Late Victorian
England (New Haven, 1974); Peter J. Bowler, Reconciling Science and
Religion: The Debate in Early Twentieth Century Britain (Chicago,
2001); Susan Budd, Varieties of Unbelief: Atheists and Agnostics in
English Society, 1850-1960 (London, 1977).
405
These "culture wars," however, have not always been focused on
religious tenets. In the U.S., for example, during the 1910s-20s there
was a cultural war based on competing versions of "Americanism" in
response to rising tides of immigration, colonial wars, and World War I.
In the 1940s-1950s there was another cultural war based on political
ideologies (communism vs. Americanism) and economic orders (state
socialism vs. free-market capitalism).
406
Ronald W. Clark, Einstein: The Life and Times (New York, 1971),
413. But Einstein did not believe in the “conventional” definition of
religion in terms of a personal God with supernatural powers who acts
in history. Einstein said, “The idea of a personal God is quite alien to
me and seems even naïve.” See Dawkins, The God Delusion, 15.
407
Stephen Jay Gould, “Nonoverlapping Magisteria,” Science and
Religion: Are They Compatible? (Amherst, 2003), 193.
408
Fitzgerald, The Ideology of Religious Studies, 34-36.
409
Fitzgerald, The Ideology of Religious Studies, 36-37.
410
Fitzgerald, The Ideology of Religious Studies, 41.
411
Mircea Eliade, The Sacred and the Profane: The Nature of Religion
(New York, 1987), 16, 28, 100, 166, 203.
412
Fitzgerald, The Ideology of Religious Studies.
413
Max Weber, The Sociology of Religion, trans. Ephraim Fischoff
(Boston, 1963), 1.
414
Emile Durkheim, The Elementary Forms of the Religious Life (New
York, 1965).
415
Loyal Rue, Religion Is Not About God: How Spiritual Traditions
Nurture our Biological Nature (New Brunswick, 2005), 147-148.
416
Pascal Boyer, Religion Explained: The Evolutionary Origins of
Religious Thought (New York, 2001), 12.
417
Fitzgerald, The Ideology of Religious Studies.
418
Rue, Religion Is Not About God, 3.
419
Boyer, Religion Explained, 28, 311. See also Daniel C. Dennett’s
Darwin’s Dangerous Idea: Evolution and the Meaning of Life (New
York, 1995) and Breaking the Spell: Religion as a Natural Phenomenon
(New York, 2006).
420
Christopher Hitchens, “No, But It Should,” Does Science Make
Belief in God Obsolete? John Templeton Foundation (West
Conshohocken, PA, n.d.), 25. See also Hitchens, God is not Great:
How Religion Poisons Everything (New York, 2007).
421
Steven Pinker, “Yes, If By,” Does Science Make Belief in God
Obsolete? John Templeton Foundation (West Conshohocken, PA, n.d.),
4.
422
Victor J. Stenger, “Yes,” Does Science Make Belief in God
Obsolete? John Templeton Foundation (West Conshohocken, PA, n.d.),
32. See also Stenger, God: The Failed Hypothesis – How Science
Shows That God Does Not Exist (Amherst, NY, 2007).
423
Richard Dawkins, The God Delusion (New York, 2006), 31.
424
Victor J. Stenger, The New Atheism: Taking a Stand for Science and
Reason (Amherst, NY, 2009). For a video discussion among some of the
leading “new atheists” see: Richard Dawkins, Daniel C. Dennett, Sam
Harris, and Christopher Hitchens, The Four Horsemen: Episode 1
(2008).
425
Percy Bysshe Shelley, The Necessity of Atheism and Other Essays
(New York, 1993), 31, 35.
426
Qtd. in Susan Jacoby, Freethinkers: A History of American
Secularism (New York, 2004), 169.
427
Bertrand Russell, Why I am Not a Christian and Other Essays on
Religion and Related Subjects (New York, 1957), v, 193, 47.
428
Christopher McKnight Nichols, “The ‘New’ No Religionists: A
Historical Approach To Why Their Numbers Are on the Rise,” Culture,
Institute for Advanced Studies in Culture (Fall 2009), 13-14.
429
“Not All Unbelievers Call Themselves Atheists,” U.S. Religious
Landscape Survey, The Pew Forum on Religion and Public Life (April
2 2009) <http://pewforum.org/Not-All-Nonbelievers-Call-Themselves-
Atheists.aspx>; Robert D. Putnam and David E. Campbell, American
Grace: How Religion Divides and Unites Us (New York, 2010), 104.
430
Qtd. in Susan Jacoby, Freethinkers: A History of American
Secularism (New York, 2004), 167.
431
Paul Edwards, "How Bertrand Russell was Prevented from Teaching
at the College of the City of New York," in Bertrand Russell, Why I am
Not a Christian and Other Essays on Religion and Related Subjects
(New York, 1957), 209-13.
432
“Religion and Politics: Contention and Consensus (Part II),” The
Pew Forum on Religion and Public Life (July 24 2003)
<http://pewforum.org/PublicationPage.aspx?id=621#1>
433
“Soldier Sues Army, Saying His Atheism Led to Threats,” The Pew
Forum on Religion and Public Life (April 26 2008)
<http://pewforum.org/Religion-News/Soldier-Sues-Army-Saying-His-
Atheism-Led-to-Threats.aspx>
434
Dawkins, The God Delusion, 3.
435
Philip Kitcher, "Militant Modern Atheism," Journal of Applied
Philosophy 28, no. 1 (2011): 1-13.
436
Christopher Hitchens, “The Future of an Illusion,” Daedalus 132,
no. 3 (Summer 2003): 83-87.
437
Sam Harris, The End of Faith: Religion, Terror, and the Future of
Reason (New York, 2004), 72-77, 83.
438
Harris, The End of Faith, 130-131.
439
Harris, The End of Faith, 170.
440
Harris, The End of Faith, 217, 221.
441
Dawkins, The God Delusion, 14.
442
Dawkins, The God Delusion, 20.
443
Dawkins, The God Delusion, 361.
444
Robert Sapolsky, “No,” Does Science Make Belief in God Obsolete?
John Templeton Foundation (West Conshohocken, PA, n.d.), 20-22.
445
Christopher R. Beha, "Reason for Living: The Good Life without
God," Harper's Magazine (July 2012), 73, 75.
446
Sapolsky, “No,” Does Science Make Belief in God Obsolete?, 20-22.
447
Robert D. Putnam and David E. Campbell, American Grace: How
Religion Divides and Unites Us (New York, 2010), 4-5.
448
John F. Haught, Science and Religion: From Conflict to
Conversation (Mahwah, NJ, 1995); Dennett, Breaking the Spell, 14.
449
Marcus J. Borg and N. T. Wright, The Meaning of Jesus: Two
Visions (New York, 1999); Dale C. Allison, Marcus J. Borg, John
Dominic Crossan, and Stephen J. Patterson, The Apocalyptic Jesus: A
Debate, ed. Robert J. Miller (Santa Rosa, CA, 2001).
450
Christopher Hitchens and William Lane Craig, Does God Exist? (La
Mirada Films, 2009); Christopher Hitchens and Douglas Wilson,
Collision: Is Christianity Good for the World? (Level 4, 2009);
Christopher Hitchens and Dinesh D’Souza, God On Trial: A Debate on
the Existence of God (Fixed Point Foundation, 2008).
451
See for example: Deepak Chopra and Leonard Mlodinow, War of
the Worldviews: Science vs. Spirituality (New York, 2011); William
Lane Craig and Quentin Smith, Theism, Atheism, and Big Bang
Cosmology (Oxford, UK, 1993); Michael L. Peterson and Raymond J.
Vanarragon, Contemporary Debates in Philosophy of Religion (Oxford,
UK, 2004).
452
James Davison Hunter, Before the Shooting Begins: Searching for
Democracy in America's Culture War (New York, 1994), 8.
453
Richard Rorty, "Religion as Conversation-Stopper," Common
Knowledge 3 (Spring 1994): 1-6.
454
Berlin, “Does Political Theory Still Exist?”, 88; “Historical
Inevitability,” 176-77, 185.
455
John Gray, Enlightenment's Wake (New York, 2009), 27, 44-45.
Gray defines toleration as "an acceptance of the imperfectability of
human beings...Since we cannot be perfect, and since virtue cannot be
forced on people but is rather a habit of life they must themselves strive
to acquire, we [are] enjoined to tolerate the shortcomings of others,
even as we struggle with our own" (27).
456
Pope John Paul II, Address to the United Nations General Assembly
(Oct 5, 1995), qtd. in Martha C. Nussbaum, Cultivating Humanity: A
Classical Defense of Reform in Liberal Education (Cambridge, MA,
1997), 259.
457
Gray, Enlightenment's Wake, 45.
458
Dennett, Breaking the Spell, 14.
459
Taylor, A Secular Age, 675.
460
Taylor, A Secular Age, 428.
461
Anthony Gottlieb, The Dream of Reason: A History of Philosophy
from the Greeks to the Renaissance (New York, 2001).
462
Amartya Sen, The Argumentative Indian (New York, 2005), 15.
463
Sen, The Argumentative Indian, 18.
464
Jack Weatherford, Genghis Khan and the Making of the Modern
World (New York, 2004), 172-73.
465
Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), 71.
466
Lessing, Nathan the Wise, qtd. in Peter Gay, The Enlightenment,
334.
467
Peter Gay, The Enlightenment, 172, 176-77.
468
Specifically, I am referring to the debates over the “culture war” of the
last three decades, which reflect a heated disagreement over the very
notions of American national and cultural identity. A very short list of
this debate might include the following: Allan Bloom, The Closing of
the American Mind; Arthur M. Schlesinger, Jr., The Disuniting of
America: Reflections on a Multicultural Society; James Davison
Hunter, Culture Wars: The Struggle to Define America; Todd Gitlin,
The Twilight of Common Dreams: Why America is Wracked by
Culture Wars; Michael Lind, The Next American Nation: The New
Nationalism and the Fourth American Revolution; Lawrence W.
Levine, The Opening of the American Mind: Canons, Culture, and
History; Michael Kazin and Joseph A. McCartin, eds., Americanism:
New Perspectives on the History of an Ideal.
469
James A. Banks, “Diversity, Group Identity, and Citizenship
Education in a Global Age.”
470
Gerald Graff, Beyond the Culture Wars: How Teaching the Conflicts
Can Revitalize American Education, 8, 12, 15.
471
Barack Obama, The Audacity of Hope: Thoughts on Reclaiming the
American Dream, 92. For two excellent and very short books on
democracy, see: Robert A. Dahl, On Democracy; Robert A. Dahl, On
Political Equality.
472
Amy Gutmann, Democratic Education, 11-12.
473
Linda Liska Belgrave, Adrienne Celaya, Seyda Aylin Gurses,
Angelica Boutwell, Alexandra Fernandez, "Meaning of Political
Controversy in the Classroom: A Dialogue Across The Podium,"
Symbolic Interaction, 35, no. 1 (2012), 68–87.
474
Amartya Sen, The Idea of Justice (Cambridge, MA, 2009), x.
475
Berlin, “Does Political Theory Still Exist?”, 88; “Historical
Inevitability,” 176-77, 185; Amartya Sen, The Idea of Justice, x, xix.
476
Taylor, A Secular Age, 675.
477
Walter Feinberg, "The Idea of a Public Education," Review of
Research in Education 36 (March 2012), 17.
478
This claim is central to many of Gray's works. See Enlightenment's
Wake (New York, 2009), ix; Black Mass: Apocalyptic Religion and the
Death of Utopia (New York, 2008); Straw Dogs: Thoughts on Humans
and Other Animals (New York, 2003).
479
Adams, The Education of Henry Adams, 451.
480
A. A. Long, ed., The Cambridge Companion to Early Greek
Philosophy (Cambridge, UK, 1999); Anthony Gottlieb, The Dream of
Reason: A History of Philosophy from the Greeks to the Renaissance
(New York, 2000); Gregory Vlastos, Socrates: Ironist and Moral
Philosopher (Ithaca, NY, 1991).
481
Plato, "Apology," in Plato: Complete Works, John M. Cooper, ed.
(Indianapolis, IN, 1997), 21.
482
Plato, "Protagoras," in Plato: Complete Works, John M. Cooper, ed.
(Indianapolis, IN, 1997), 778.
483
My emphasis. Plato, "Apology," Ibid., 33.
484
Plato, "Republic," in Plato: Complete Works, John M. Cooper, ed.
(Indianapolis, IN, 1997).
485
Buddha, The Dhammapada, trans. John Ross Carter and Mahinda
Palihawadana (Oxford, 2000), 3, 5, 33, 43, 49. See also Karen
Armstrong, Buddha (New York, 2001); Andrew Skilton, A Concise
History of Buddhism (Birmingham, UK, 1997).
486
D. T. Suzuki, Zen Buddhism: Selected Writings of D. T. Suzuki (New
York, 1996), 3. Suzuki described Zen Buddhism as the purest form of
Buddha's teachings: "Zen takes hold of the enlivening spirit of the
Buddha, stripped of all its historical and doctrinal garments" (41).
487
Confucius, The Analects, trans. David Hinton (Washington, DC, 1998),
67, 85.
488
Mencius, Mencius, trans. David Hinton (Washington DC, 1998), 49-
50.
489
I highlight here a common preoccupation with the notion of
enlightenment, but I do not discuss the different methods used by these
teachers that reflect differing conceptions of knowledge and the good.
Socrates and Buddha were much more critical of tradition and society
than were Confucius and Mencius; however, it must be noted that all
four criticized and revitalized their respective cultures. I think this
commonality is much more important than their differences.
490
Owen Flanagan, The Bodhisattva's Brain: Buddhism Naturalized
(Cambridge, MA, 2011), 119.
491
The practice of critique, though, is more of a Western tradition. See
Gottlieb, The Dream of Reason.
492
I. Bernard Cohen, Revolution in Science, 135, 142. See also Thomas
S. Kuhn, The Structure of Scientific Revolutions.
493
Walter J. Ong, Ramus: Method, and the Decay of Dialogue
(Chicago, 2004), 225. Ong demonstrated that the middle-ages and early
modern period were an "age when there was no word in ordinary usage
which clearly expressed what we mean today by 'method,' a series of
ordered steps gone through to produce with certain efficacy a desired
effect - a routine efficiency. This notion is not entirely missing in early
sixteenth-century consciousness, but it has yet no independent
existence" (p. 225).
494
Cohen, Revolution in Science, 148, 153, 156.
495
Cohen, Revolution in Science, 161-62, 168. Quote from Newton’s
De Motu in Cohen, 168.
496
Peter Gay, The Enlightenment: The Rise of Modern Paganism (New
York, 1995).
497
Immanuel Kant, "An Answer to the Question: What is
Enlightenment?", in Practical Philosophy: The Cambridge Edition of
the Works of Immanuel Kant, trans. Mary J. Gregor (Cambridge, UK,
1996).
498
Lucien Febvre and Henri-Jean Martin, The Coming of the Book: The
Impact of Printing, 1450-1800 (London, 2010), 258; Walter J. Ong,
Ramus: Method, and the Decay of Dialogue (Chicago, 2004), 123.
499
Friedrich Nietzsche, Beyond Good and Evil, trans. Walter Kaufmann
(New York, 1989), 161.
500
Dennett, Darwin’s Dangerous Idea: Evolution and the Meaning of
Life.
501
Cohen, Revolution in Science, ch 19, quote on 289; Dennett,
Darwin’s Dangerous Idea: Evolution and the Meaning of Life.
502
Ernst Mayr, “The Nature of the Darwinian Revolution,” Science,
176 (1972), 987.
503
Cohen, Revolution in Science, 299.
504
Dennett, Darwin’s Dangerous Idea; Charles Taylor, A Secular Age
(Cambridge, MA, 2007), 77.
505
Cohen, Revolution in Science, ch 27; Isaacson, Einstein; Heisenberg,
Physics and Philosophy; Lindley, Uncertainty: Einstein, Heisenberg,
Bohr, and the Struggle for the Soul of Science; Greene, The Elegant
Universe.
506
David Lindley, Uncertainty: Einstein, Heisenberg, Bohr, and the
Struggle for the Soul of Science (New York, 2008), 2, 4; Karl R.
Popper, Objective Knowledge: An Evolutionary Approach, revised
edition (Oxford, 1979), 214.
507
Ray Kurzweil, The Singularity is Near: When Humans Transcend
Biology (New York, 2005), 453; David Deutsch, The Fabric of
Reality: The Science of Parallel Universes -and Its Implications (New
York, 1997), 234-37.
508
Werner Heisenberg, Physics and Philosophy: The Revolution in
Modern Science (New York, 2007), 36-37, 48-49; Alfred North
Whitehead, Science and the Modern World (Cambridge, UK, 1926), 17;
Walter J. Ong, Ramus: Method, and the Decay of Dialogue (Chicago,
2004), 165.
509
Friedrich Nietzsche, The Gay Science, qtd. in Tyler T. Roberts,
Contesting Spirit: Nietzsche, Affirmation, Religion (Princeton, 1998),
39, footnote 11.
510
Isaiah Berlin discusses this point at length in several essays in The
Proper Study of Mankind. See “The Apotheosis of the Romantic Will,”
555-559; “Herder and the Enlightenment,” 426; “The Divorce Between
the Sciences and the Humanities,” 326-28; “The Originality of
Machiavelli,” 312-313; “The Counter-Enlightenment,” 245-46; “The
Pursuit of the Ideal,” 5. The long quote comes from “The Originality of
Machiavelli,” 312-313.
511
Karl R. Popper, Objective Knowledge: An Evolutionary Approach,
revised edition (Oxford, 1979), 204.
512
It must be acknowledged that not all scientists have given up
Newton's assumption of singular Laws. The Harvard socio-biologist
Edward O. Wilson is perhaps the most important holdout. He admits
that science has now uncovered "impenetrably complex systems" (54),
but he argues that "reductionism" will eventually allow scientists "to
fold the laws and principles of each level of organization into those at
more general, hence more fundamental levels," which would lead to
"simple universal laws of physics to which all other laws and principles
can eventually be reduced" (55). However, he is one of the few such
traditionalists to admit that this assumption is based on a
"transcendental world view" that "could be wrong" (55). Consilience:
The Unity of Knowledge (New York, 1998).
513
Fritjof Capra, The Web of Life: A New Scientific Understanding of
Living Systems (New York, 1996); Steven Johnson, Emergence: The
Connected Lives of Ants, Brains, Cities, and Software (New York,
2001).
514
Ilya Prigogine and Isabelle Stengers, Order out of Chaos: Man's
New Dialogue with Nature (New York, 1984), xv, xxvii, 9, 73.
515
David Deutsch, The Fabric of Reality: The Science of Parallel
Universes - and Its Implications (New York, 1997), 46; Alan Lightman,
"The Accidental Universe: Science's Crisis of Faith," Harper's (Dec
2011), 35.
516
Haskell, The Emergence of Professional Social Science, 3-23.
517
Adams, The Education of Henry Adams, see especially chapters 15,
22, 25, 31.
518
On epistemological diversity see George Steinmetz, ed., The Politics
of Method in the Human Sciences: Positivism and Its Epistemological
Others. For a work that acknowledges the diversity of knowledge, yet
keeps the monistic assumption of a singular and unified ontology see
Wilson, Consilience: The Unity of Knowledge.
519
Prigogine and Stengers, Order out of Chaos: Man's New Dialogue
with Nature, 7, xiv.
520
Lightman, "The Accidental Universe," 35.
521
Ibid., 25, xxviii.
522
Ibid., xiv-xv, 9, 73; Wilson, Consilience: The Unity of Knowledge,
83-84; Capra, The Web of Life.
523
Prigogine and Stengers, Order out of Chaos: Man's New Dialogue
with Nature, 77.
524
Ibid., 225.
525
Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK,
2010), 31.
526
Prigogine and Stengers, Order out of Chaos: Man's New Dialogue
with Nature, 225.
527
Lightman, "The Accidental Universe," 36. Lightman goes on to
explain the paradox of 21st-century physics, especially the strange and
unsettling implications of string theory: "Not only must we accept that
basic properties of our universe are accidental and incalculable. In
addition, we must believe in the existence of many other universes. But
we have no conceivable way of observing these other universes and
cannot prove their existence. Thus, to explain what we see in the world
and in our mental deductions, we must believe in what we cannot
prove" (p. 40).
528
See for example Ray Kurzweil, The Singularity is Near: When
Humans Transcend Biology (New York, 2005).
529
Paul Ricoeur, Oneself as Another, trans. Kathleen Blamey (Chicago,
1992), 16.
530
John Gray, Isaiah Berlin (Princeton, 1996), 39-40.
531
Tony Judt, "Introduction," in Raymond Aron, The Dawn of
Universal History: Selected Essays from a Witness of the Twentieth
Century, trans. Barbara Bray (New York, 2002), xxiii. See for example
the philosophy of Raymond Aron and Isaiah Berlin.
532
John Gray, Straw Dogs: Thoughts on Humans and Other Animals
(New York, 2003), 4.
533
Georg W. F. Hegel, The Philosophy of History (Amherst, NY,
1991), 9, 19.
534
Georg W. F. Hegel, qtd. in Shlomo Avineri, Hegel's Theory of the
Modern State (Cambridge, UK, 1989), 123.
535
W. B. Yeats, "The Second Coming," The Collected Poems of W. B.
Yeats (New York, 1996), 187.
536
W. H. Auden, "Herman Melville," Collected Poems (New York,
1991), 251.
537
Friedrich Nietzsche, Human, All too Human: A Book for Free
Spirits, trans. Marian Faber and Stephen Lehmann (Lincoln, NE, 1996).
538
Michel Foucault, "What Is Enlightenment?" In The Foucault Reader
(New York, 1984), 32, 35.
539
Allan Bloom, The Closing of the American Mind (Touchstone,
1988), 262.
540
Eric Hobsbawm, The Age of Revolution, 1789-1848 (New York,
1996); Adam Zamoyski, Holy Madness: Romantics, Patriots, and
Revolutionaries, 1776-1871 (New York, 1999).
541
Bloom, The Closing of the American Mind, 289.
542
Qtd. in John Gray, Black Mass: Apocalyptic Religion and the Death
of Utopia (New York, 2007), 107.
543
Gordon S. Wood, The Radicalism of the American Revolution (New
York, 1991), 191.
544
John Adams, qtd. in Wood, The Radicalism of the American
Revolution, 191.
545
Thomas Jefferson, qtd. in Ellis, American Sphinx, 54.
546
Lawrence S. Stepelevich, The Young Hegelians: An Anthology
(Atlantic Highlands, NJ, 1997); Warren Breckman, Marx, The Young
Hegelians, and the Origins of Radical Social Theory (Cambridge, UK,
1999); Harold Mah, The End of Philosophy, The Origin of "Ideology":
Karl Marx and the Crisis of the Young Hegelians (Berkeley, 1987).
547
Karl Marx, "Theses on Feuerbach," Karl Marx: Early Writings
(London, 1992), 423.
548
Karl Marx, "Economic and Philosophical Manuscripts," Karl Marx:
Early Writings (London, 1992), 348.
549
Karl Marx, "Letters from the Franco-German Yearbooks," Karl
Marx: Early Writings (London, 1992), 209.
550
Karl Marx, "Critique of Hegel's Philosophy of Right," Karl Marx:
Early Writings (London, 1992), 244.
551
John Gray, Black Mass: Apocalyptic Religion and the Death of
Utopia (New York, 2007).
552
John Gray, Enlightenment's Wake (New York, 2009), xv.
553
Alan Dawley, Struggles for Justice: Social Responsibility and the
Liberal State (Cambridge, MA, 1991); Michael McGerr, A Fierce
Discontent: The Rise and Fall of the Progressive Movement in America
(Oxford, 2003); Nell Irvin Painter, Standing at Armageddon: The
United States, 1877-1919 (New York, 1987); Desmond King, In the
Name of Liberalism: Illiberal Social Policy in the United States and
Britain (Oxford, 1999).
554
Gray, Black Mass; F. A. Hayek, The Road to Serfdom (Chicago,
1994); Eugen Weber, Varieties of Fascism: Doctrines of Revolution in
the Twentieth Century (Malabar, FL, 1982); Timothy Snyder,
Bloodlands: Europe between Hitler and Stalin (New York, 2010).
555
John K. Roth and Michael Berenbaum, eds., Holocaust: Religious
and Philosophical Implications (St. Paul, MN, 1989); Nora Levin, The
Holocaust Years: The Nazi Destruction of European Jewry, 1933-1945
(Malabar, FL, 1990); Primo Levi, Survival in Auschwitz (New York,
1966); Primo Levi, The Reawakening (New York, 1995); Primo Levi,
The Drowned and the Saved (New York, 1988); Fergal Keane, Season
of Blood: A Rwandan Journey (London, 1996); Gray, Black Mass:
Apocalyptic Religion and the Death of Utopia.
556
Eric Hobsbawm, The Age of Empire, 1875-1914 (New York, 1989);
Adam Hochschild, Bury the Chains: Prophets and Rebels in the Fight
to Free an Empire's Slaves (New York, 2005); Bill Ashcroft, Gareth
Griffiths, and Helen Tiffin, eds., The Post-Colonial Studies Reader
(London, 1995); V. G. Kiernan, America: The New Imperialism
(London, 2005); Michael Hardt and Antonio Negri, Empire
(Cambridge, MA, 2000).
557
Jonathan Schell, The Fate of the Earth and Abolition (Stanford,
2000).
558
Jonathan Glover, Humanity: A Moral History of the Twentieth
Century (New Haven, 2000); Gray, Black Mass.
559
Primo Levi, The Drowned and the Saved (New York, 1988), 144,
199.
560
Lawrence L. Langer, "The Dilemma of Choice in the Deathcamps,"
In Holocaust: Religious and Philosophical Implications, John K. Roth
and Michael Berenbaum, eds. (St. Paul, MN, 1989), 223.
561
Michel Foucault, "What Is Enlightenment?" In The Foucault Reader
(New York, 1984), 39, 49, 46-47.
562
Isaiah Berlin, "The Sense of Reality," In The Sense of Reality:
Studies in Ideas and Their History (New York, 1998), 38.
563
Friedrich Nietzsche, Beyond Good and Evil, trans. Walter Kaufmann
(New York, 1989), 89.
564
Sigmund Freud was one of the first philosophers to criticize
modernity on this point. See Civilization and its Discontents, trans.
James Strachey (New York, 1961), 45. See also John Gray, Straw
Dogs: Thoughts on Humans and Other Animals (New York, 2003);
John Gray, Black Mass: Apocalyptic Religion and the Death of Utopia.
565
Francis Fukuyama, The End of History and the Last Man (New
York, 1992).
566
Ibid., 112.
567
John Gray, Straw Dogs: Thoughts on Humans and Other Animals,
14.
568
Tony Judt, The Burden of Responsibility: Blum, Camus, Aron, and
the French Twentieth Century (Chicago, 1998), 158.
569
Glover, Humanity: A Moral History of the Twentieth Century, 7.
See also Raymond Aron, The Century of Total War (Lanham, MD,
1985).
570
Jared Diamond, The Third Chimpanzee: The Evolution and Future
of the Human Animal (New York, 1992), 217; Jared Diamond,
Collapse: How Societies Choose to Fail or Succeed (New York, 2005).
571
Gray, Straw Dogs: Thoughts on Humans and Other Animals, 12.
572
Amartya Sen, Development as Freedom (New York, 1999), xi.
573
Diamond, Collapse, 521, 525; Diamond, The Third Chimpanzee, 4.
See also Wilson, Consilience: The Unity of Knowledge, 280-98; "The
Anthropocene: A Man-Made World," The Economist (May 28 2011), 81-
83. Diamond's basic thesis was articulated over thirty years earlier by
Rene Dubos in A God Within: A Positive Philosophy for a More
Complete Fulfillment of Human Potentials (New York, 1972).
574
John Gray, Enlightenment's Wake (New York, 2009), 216.
575
Owen Flanagan, Self Expressions: Mind, Morals, and the Meaning
of Life (Oxford, 1996), 57.
576
Erich Fromm, Beyond the Chains of Illusion (New York, 1966), 118.
577
John Gray, Straw Dogs: Thoughts on Humans and Other Animals
(New York, 2003), 120.
578
Daniel C. Dennett, Freedom Evolves (New York, 2003), 64, 85, 13.
579
Jared Diamond, The Third Chimpanzee: The Evolution and Future
of the Human Animal (New York, 2006); Matt Ridley, Nature Via
Nurture: Genes, Experience, and What Makes Us Human (New York,
2003).
580
Ridley, Nature Via Nurture, 64.
581
Jared Diamond, The Third Chimpanzee, 93, 143.
582
Ray Kurzweil, The Singularity is Near: When Humans Transcend
Biology (New York, 2005). For a more political treatment of
technology and human progress see Francis Fukuyama, The End of
History and the Last Man (New York, 1992).
583
Ridley, Nature Via Nurture, 16-17, 209.
584
Freud, Civilization and its Discontents, 44.
585
Karl R. Popper, Objective Knowledge: An Evolutionary Approach,
revised edition (Oxford, 1979), 238.
586
Michael Bess, "Blurring the Boundary between 'Person' and
'Product': Human Genetic Technologies through the Year 2060," The
Hedgehog Review 13, no. 2 (Summer 2011), 57; Wilson, On Human
Nature, 96-97, 208.
587
Kevin Warwick, I, Cyborg (London, 2002); Kurzweil, The
Singularity is Near, 30, 309; Lev Grossman, "Singularity," Time (Feb
21 2011). For a critique see John Gray, Straw Dogs: Thoughts on
Humans and Other Animals. Gray argues "Technology is not
something humankind can control...Technical progress leaves only one
problem unsolved: the frailty of human nature. Unfortunately that
problem is insoluble" (14-15). For a critique of techno-posthumanism
see Michael E. Zimmerman, "Last Man or Overman? Transhuman
Appropriations of a Nietzschean Theme," The Hedgehog Review 13, no.
2 (Summer 2011): 31-44.
588
E. O. Wilson, qtd. in John Gray, Straw Dogs: Thoughts on Humans
and Other Animals (New York, 2003), 5.
589
Mark Lynas, The God Species: Saving the Planet in the Age of
Humans (New York, 2011).
590
Naomi Oreskes and Erik M. Conway, Merchants of Doubt: How a
Handful of Scientists Obscured the Truth on Issues from Tobacco
Smoke to Global Warming (New York, 2010), 256, 261.
591
Francis Fukuyama, The End of History and the Last Man (New
York, 1992), xiv.
592
Julian Simon, The Ultimate Resource (Princeton, 1996).
593
Diamond, The Third Chimpanzee, 54-56; Ridley, Nature Via
Nurture, 16-17, 209; Roy D'Andrade, "Cultural Darwinism and
Language," American Anthropologist 104, no. 1 (2002), 223; Jane H.
Hill, "On the Evolutionary Foundations of Language," American
Anthropologist 74 (1972), 308-210.
594
Rene Dubos, A God Within: A Positive Philosophy for a More
Complete Fulfillment of Human Potentials (New York, 1972), 239.
595
Ridley, Nature Via Nurture; Pinker, The Blank Slate.
596
Gray, Black Mass: Apocalyptic Religion and the Death of Utopia,
41.
597
Dennett, Freedom Evolves, 93, 143, 162, 166, 250-51, 269.
598
Ridley, Nature Via Nurture, 4.
599
Fritjof Capra, The Web of Life: A New Scientific Understanding of
Living Systems (New York, 1996), 267.
600
Steven Pinker, The Blank Slate: The Modern Denial of Human
Nature (New York, 2002), xi, 34. See also Edward O. Wilson, On
Human Nature (Cambridge, MA, 1978); Dennett, Freedom Evolves; On
Human Nature, Daedalus: Journal of the American Academy of Arts &
Sciences 133, no. 4 (Fall 2004).
601
Pinker, The Blank Slate, 34, 65-66.
602
Pinker, The Blank Slate, 65-66, 201, 219, 238.
603
Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), 193.
604
Tony Judt, "Introduction," in Raymond Aron, The Dawn of
Universal History: Selected Essays from a Witness of the Twentieth
Century, trans. Barbara Bray (New York, 2002), xiii-xiv.
605
Using a historical term, I called the study of ideas "ideology" and I
put forth a theory for how ideas are born and become social institutions
in my book, Studies in Ideology: Essays on Culture and Subjectivity
(Lanham, 2005).
606
Philip Selznick, The Moral Commonwealth: Social Theory and the
Promise of Community (Berkeley, 1992), 232-34.
607
Elinor Ostrom, Governing the Commons: The Evolution of
Institutions for Collective Action (Cambridge, UK, 1990), 51.
608
Elinor Ostrom, Understanding Institutional Diversity (Princeton,
2005), 3.
609
Vincent Ostrom and Elinor Ostrom, “Rethinking Institutional
Analysis: An Interview with Vincent and Elinor Ostrom,” Mercatus
Center, George Mason University (Nov 7, 2003), 1; Anthony Giddens,
Capitalism and Modern Social Theory: An Analysis of the Writing of
Marx, Durkheim, and Max Weber (Cambridge, UK, 1971).
610
DiMaggio and Powell, “The Iron Cage Revisited,” 8; Roger
Friedland and Robert R. Alford, “Bringing Society Back In: Symbols,
Practices, and Institutional Contradictions,” The New Institutionalism in
Organizational Analysis, Ibid., 232-263; James Miller, The Passion of
Michel Foucault (New York, 1993), 150; Searle, The Construction of
Social Reality. On the debate over “free-will” and structural
“determinism” see Isaiah Berlin, “Historical Inevitability,” The Proper
Study of Mankind (New York, 2000) and Pierre Bourdieu, Outline of a
Theory of Practice (Cambridge, UK, 2010), 73.
611
Berger and Luckmann, The Social Construction of Reality, 15;
DiMaggio and Powell, “The Iron Cage Revisited,” 20, 23, 26, 28;
Andre Lecours, “New Institutionalism: Issues and Questions,” New
Institutionalism: Theory and Analysis, Andre Lecours, ed. (Toronto:
University of Toronto Press, 2005), 3-25; John W. Meyer and Brian
Rowan, “Institutionalized Organizations: Formal Structure as Myth and
Ceremony,” The New Institutionalism in Organizational Analysis, Ibid.,
41, 44; Paul Pierson, Politics in Time: History, Institutions, and Social
Analysis (Princeton, 2004), 20-21, 43, 51; Lynne G. Zucker, “The Role
of Institutionalization in Cultural Persistence,” The New Institutionalism
in Organizational Analysis, Ibid., 85; Searle, The Construction of Social
Reality.
612
Isaiah Berlin, “Historical Inevitability;” Giddens, Capitalism and
Modern Social Theory; Miller, The Passion of Michel Foucault, 15,
150, 336; Karl R. Popper, Objective Knowledge: An Evolutionary
Approach, revised edition (Oxford, 1979), 217.
613
Henry Miller, Tropic of Capricorn (New York, 1961), 11.
614
Thomas Nagel, "Moral Luck," In Mortal Questions (Cambridge,
UK, 1979), 35.
615
Bourdieu, Outline of a Theory of Practice, 83.
616
Elisabeth S. Clemens, The People's Lobby: Organizational
Innovation and the Rise of Interest Group Politics in the United States,
1890-1925 (Chicago, 1997), 44-45.
617
James A. Berlin, “Postmodernism in the Academy,” Rhetorics,
Poetics, and Cultures (West Lafayette, 2003): 60-82; Friedland and
Alford, “Bringing Society Back In,” 232, 254; Jepperson, “Institutions,
Institutional Effects, and Institutionalism,” 145, 149, 151-52, 158;
Walter W. Powell, “Expanding the Scope of Institutional Analysis,”
The New Institutionalism in Organizational Analysis, Ibid., 188, 194-
195; Steven Brint and Jerome Karabel, “Institutional Origins and
Transformations: The Case of American Community Colleges,” The
New Institutionalism in Organizational Analysis, Ibid., 337-360; Ronald
Jepperson and John W. Meyer, "Multiple Levels of Analysis and the
Limits of Methodological Individualisms," Sociological Theory 29, no.
1 (March 2011): 54-73.
618
Sherry B. Ortner, Anthropology and Social Theory: Culture, Power,
and the Acting Subject (Durham, 2006), 7, 18, 127, 130, 133, 139, 147,
152. On the relationship between institutions and agency see: Ludwig
Wittgenstein, Philosophical Investigations, Trans. G. E. M. Anscombe
(Oxford, 2001), part I, 23. Paul Pierson argues that the term
institutional “change” is misleading because it is almost impossible to
change institutions fundamentally. Instead, Pierson recommends the
term “institutional development” as a more accurate description of how
institutions change through time. See: Politics in Time, 133, 137.
619
Elisabeth S. Clemens, The People's Lobby: Organizational
Innovation and the Rise of Interest Group Politics in the United States,
1890-1925 (Chicago, 1997), 92.
620
Ostrom, Understanding Institutional Diversity, 3.
621
Joel A. C. Baum and Jitendra V. Singh, “Organizational Hierarchies
and Evolutionary Processes,” Evolutionary Dynamics of Organizations,
Joel A. C. Baum and Jitendra V. Singh, eds., (Oxford, 1994), 5;
DiMaggio and Powell, “The Iron Cage Revisited;” Friedland and
Alford, “Bringing Society Back In;” Scott and Meyer, “The
Organization of Societal Sectors,” 137; Powell, “Expanding the Scope
of Institutional Analysis.”
622
Jepperson, “Institutions, Institutional Effects, and Institutionalism;”
Doug McAdam and W. Richard Scott, “Organizations and
Movements,” Social Movements and Organizational Theory, Gerald F.
Davis, Doug McAdam, W. Richard Scott, and Mayer N. Zald, eds.,
(Cambridge, UK, 2005), 4-40; Scott and Meyer, “The Organization of
Societal Sectors,” 117; Powell, “Expanding the Scope of Institutional
Analysis;” Zucker, “The Role of Institutionalization in Cultural
Persistence.”
623
Friedland and Alford, “Bringing Society Back In,” 232; McAdam
and Scott, “Organizations and Movements;” Pierson, Politics in Time;
Zucker, “The Role of Institutionalization in Cultural Persistence,” 84.
624
Siobhan Harty, “Theorizing Institutional Change,” New
Institutionalism: Theory and Analysis, Ibid., 51-79.
625
Ostrom, Understanding Institutional Diversity, 10.
626
Ostrom, Understanding Institutional Diversity, 11.
627
Wilson, On Human Nature, 6.
628
Ostrom, Understanding Institutional Diversity, 8.
629
Wilson, On Human Nature; Ostrom, Understanding Institutional
Diversity.
630
Ibid., 96.
631
Wilson, Consilience: The Unity of Knowledge, 44.
632
Ibid., 83-85.
633
Ibid., 96.
634
Ibid., 31.
635
See for example: Sociobiology (Cambridge, MA, 2000); The Social
Conquest of Earth (New York, 2012); Howard W. French, "E. O.
Wilson's Theory of Everything," The Atlantic (Nov 2011), 70-82.
636
Ibid., 55. See also 30-38.
637
Pierre Bourdieu, Outline of a Theory of Practice (Cambridge, UK,
2010), 31; Hilary Putnam, "A Half Century of Philosophy, Viewed
from Within," American Academic Culture in Transformation, Thomas
Bender and Carl E. Schorske, eds. (Princeton, 1997), 201.
638
Lisa Randall, Knocking on Heaven's Door: How Physics and
Scientific Thinking Illuminate the Universe and the Modern World
(New York, 2011), 5.
639
Ibid., 16.
640
"The Nature of the Universe: Ye Cannae Change the Laws of
Physics," The Economist (Sept 4 2010), 85-86.
641
Randall, Knocking on Heaven's Door; Brian Greene, The Elegant
Universe: Superstrings, Hidden Dimensions, and the Quest for the
Ultimate Theory (New York, 2003).
642
Ibid.
643
Randall, Knocking on Heaven's Door, 6.
644
Ibid., 8, 15.
645
David Deutsch has also made this point. He explains that when
physicists use the word "evolution" they mean "development" or
"motion," and "not variation and selection" (182). The Fabric of
Reality: The Science of Parallel Universes - and Its Implications (New
York, 1997).
646
Bess, "Blurring the Boundary between 'Person' and 'Product'", 58.
647
Elinor Ostrom, Understanding Institutional Diversity (Princeton,
2005), 6.
648
Ernst Mayr, This Is Biology: The Science of the Living World
(Cambridge, MA, 1997), 107, 112, 122.
649
Deutsch, The Fabric of Reality, 27-28. This diversity of analytical
frameworks presents an epistemological problem, as Ostrom pointed
out: "One can understand why discourse may resemble a Tower of
Babel rather than a cumulative body of knowledge." See Ostrom,
Understanding Institutional Diversity, 11.
650
Ernst Mayr, This Is Biology: The Science of the Living World
(Cambridge, MA, 1997), xii, 19-20.
651
David M. Kreps, "Economics - The Current Position," American
Academic Culture in Transformation, Thomas Bender and Carl E.
Schorske, eds. (Princeton, 1997), 78, 83.
652
Richard Dawkins, The Selfish Gene (Oxford, 1989), 192-93. The
notion of memes was more fully developed by Susan Blackmore in The
Meme Machine (Oxford, 1999).
653
Richard Dawkins, River Out of Eden: A Darwinian View of Life
(New York, 1995), 3.
654
Dawkins, The Selfish Gene, 192-93.
655
See for example Susan Blackmore, The Meme Machine; James
Gleick, The Information: A History, A Theory, A Flood (New York,
2011).
656
I borrow this strong criticism of reductionism from Ernst Mayr, This
Is Biology: The Science of the Living World (Cambridge, MA, 1997),
xii, 8. Even the technology enthusiast Ray Kurzweil argues that human
knowledge and information "will survive only if we want it to,"
meaning that it is not independent of our consciousness and
intentionality (330). The Singularity is Near: When Humans Transcend
Biology (New York, 2005).
657
John Gray, Isaiah Berlin (Princeton, 1996), 81.
658
Diamond, The Third Chimpanzee, 37.
659
Ibid., 98. Diamond admits, as do I, that human "cultural traits" are
based on "genetic foundations," but he details how we as an inventive
species have modified ourselves through other mechanisms, like sexual
selection and social evolution (137).
660
Dawkins, River Out of Eden: A Darwinian View of Life, 15.
661
Owen Flanagan, The Really Hard Problem: Meaning in a Material
World (Cambridge, MA, 2007), 22.
662
Ibid., xii.
663
See for example Daniel C. Dennett, Breaking the Spell: Religion as
a Natural Phenomenon (New York, 2006); Richard Dawkins, The God
Delusion (New York, 2006).
664
Dawkins, River Out of Eden: A Darwinian View of Life, 3.
665
Ridley, Nature Via Nurture, 209.
666
Ibid., 28.
667
Randall, Knocking on Heaven's Door, 15.
668
Rene Dubos, So Human an Animal (New York, 1968), 128; Dubos,
A God Within, 21.
669
"Wisdom about Crowds," The Economist (April 23 2011), 85.
670
Jared Diamond, The Third Chimpanzee: The Evolution and Future
of the Human Animal (New York, 1992), 1.
671
Timothy Taylor, "Letter to Editor," In The New Humanists: Science
at the Edge, John Brockman, ed. (New York, 2003), 372-73.
672
Daniel C. Dennett, "Letter to Editor," In The New Humanists:
Science at the Edge, John Brockman, ed. (New York, 2003), 395.
673
Dennett, Breaking the Spell.
674
Ibid., 261-62.
675
Rene Dubos, So Human an Animal (New York, 1968), 57.
676
Karl R. Popper, Objective Knowledge: An Evolutionary Approach,
revised edition (Oxford, 1979), 229; see also Diamond, The Third
Chimpanzee.
677
Prigogine and Stengers, Order out of Chaos, 225; Ridley, Nature Via
Nurture, 64; Deutsch, The Fabric of Reality, 27-28.
678
Ray Kurzweil, The Singularity is Near: When Humans Transcend
Biology (New York, 2005), 453.
679
Stephen Toulmin, Return to Reason (Cambridge, MA, 2001), 153.
680
Wilson, Consilience: The Unity of Knowledge, 166, 224.
681
Larry Hirschfeld, qtd. in Pascal Boyer, Religion Explained: The
Evolutionary Origins of Religious Thought (New York, 2001), 251-52.
682
Helen A. Fielding, "Multiple Moving Perceptions of the Real:
Arendt, Merleau-Ponty, and Truitt," Hypatia 26, no. 3 (Summer 2011),
532.
683
Thomas Bender, "Politics, Intellect, and the American University,
1945-1995," American Academic Culture in Transformation, Thomas
Bender and Carl E. Schorske, eds. (Princeton, 1997), 41. In 1953 Merle
Curti called culture "one of the most important and emancipating of all
twentieth-century contributions to knowledge in the social field." Curti,
"The Setting and the Problem," American Scholarship in the Twentieth
Century, Merle Curti, ed. (Cambridge, MA, 1953), 5.
684
Boyer, Religion Explained, 138, 164; Pierre Bourdieu, Outline of a
Theory of Practice (Cambridge, UK, 2010).
685
Bourdieu, Outline of a Theory of Practice, 16; Brian C. Anderson,
Raymond Aron: The Recovery of the Political (Lanham, MD, 1997), 39.
686
Anderson, Raymond Aron: The Recovery of the Political, 46.
687
Clifford Geertz, The Interpretation of Cultures (New York, 1973),
41, 89.
688
C. Wright Mills, The Sociological Imagination (Oxford, 2000), 79.
689
Ibid., 78.
690
Wilson, Consilience: The Unity of Knowledge, 269.
691
Philip K. Howard, The Death of Common Sense: How Law is
Suffocating America (New York, 1994), 12, 60-61, 87, 186-187; Hugo
Mercier and Dan Sperber, "Why Do Humans Reason? Arguments for
an Argumentative Theory," Behavioral and Brain Sciences 34 (2011):
57-111.
692
Mills, The Sociological Imagination, 77; Richard Rorty,
"Philosopher-envy," Daedalus 133, no. 4 (Fall 2004): 18-24.
693
Mills, The Sociological Imagination, 6.
694
Wilson, Consilience: The Unity of Knowledge, 266.
695
Mills, The Sociological Imagination, 158.
696
Ibid., 164.
697
Ibid., 167-69.
698
Ibid., 187-88.
699
Raymond Aron, In Defense of Political Reason, qtd. in Brian C.
Anderson, Raymond Aron: The Recovery of the Political (Lanham, MD,
1997), 170.
700
Peter Wood, "Session II: The Actual Preoccupations of the Social
Sciences," Conference on the State of the Social Sciences, Critical
Review 16, nos. 2-3 (2004), 187.
701
Cormac McCarthy, Blood Meridian (New York, 1985), 256.
702
John Gray, Straw Dogs: Thoughts on Humans and Other Animals
(New York, 2003), 15.
703
Ann V. Murphy, "Corporeal Vulnerability and the New Humanism,"
Hypatia 26, no. 3 (Summer 2011), 589.
704
J. Donald Hughes, "Sustainability and Empire," The Hedgehog
Review 14, no. 2 (Summer 2012), 35. See also: J. Donald Hughes, An
Environmental History of the World (New York, 2009).
705
Ernst Mayr, This Is Biology: The Science of the Living World
(Cambridge, MA, 1997), xv.
706
John Gray, False Dawn: The Delusions of Global Capitalism (New
York, 1998), 16.
707
Qtd. in Jared Diamond, The Third Chimpanzee: The Evolution and
Future of the Human Animal (New York, 1992), 366.
708
Christopher Hitchens, "Political Animals," Arguably: Essays by
Christopher Hitchens. (New York, 2011), 111.
709
Brian C. Anderson, Raymond Aron: The Recovery of the Political
(Lanham, MD, 1997); Michel Foucault, The Foucault Reader (New
York, 1984); Isaiah Berlin, The Proper Study of Mankind (New York,
2000); John Gray, Isaiah Berlin (Princeton, 1996); Ludwig
Wittgenstein, Philosophical Investigations (Oxford, 1953).
710
Hilary Putnam, "A Half Century of Philosophy, Viewed from
Within," American Academic Culture in Transformation, Thomas
Bender and Carl E. Schorske, eds. (Princeton, 1997), 213.
711
Anderson, Raymond Aron: The Recovery of the Political, 170.
712
John Gray, Isaiah Berlin (Princeton, 1996), 9.
713
Alexander Pope, An Essay on Man, In The Norton Anthology of
English Literature, 6th ed, vol. 1, M. H. Abrams, ed. (New York, 1993),
2270.
714
Blaise Pascal, Pensees and Other Writings, trans. Honor Levi
(Oxford, 1995), 42, 70, 66-67.
715
Nietzsche, Human, All too Human, 37; William Blake, "London," In
The Complete Poetry and Prose of William Blake, David V. Erdman,
ed. (New York, 1988), 27.
716
Author's emphasis. Nietzsche, Human, All too Human, 78-79, 14-
15.
717
Nietzsche, Human, All too Human, 174-75.
718
Nietzsche, Human, All too Human, 193.
719
Nietzsche, Human, All too Human, 256.
720
T. S. Eliot, Four Quartets, In Collected Poems, 1909-1962 (New
York, 1963), 178, 180-81, 185, 203, 208.
721
John Gray, Black Mass: Apocalyptic Religion and the Death of
Utopia (New York, 2007).
722
Frank E. Manuel, ed., Utopias and Utopian Thought (Boston, 1966);
Imagining the Future, The Hedgehog Review 10, no. 1 (Spring 2008);
Gray, Black Mass.
723
Thomas More, Utopia (New York, 1965), 64.
724
Ironically, Brecht ended up as a refugee in the U.S. and revised his
play for an American audience. He was later interrogated by the House
Un-American Activities Committee because of his radical politics and
was forced to leave the U.S., returning to then Soviet-occupied
East Germany.
725
Bertolt Brecht, "Life of Galileo," In Brecht: Collected Plays, vol. 5
(New York, 1972), 58.
726
Brecht, "Life of Galileo," 94.
727
Brecht, "Life of Galileo," 64.
728
John Gray, Black Mass: Apocalyptic Religion and the Death of
Utopia (New York, 2007), 20.
729
Charles E. Lindblom and David K. Cohen, Usable Knowledge:
Social Science and Social Problem Solving (New Haven, 1979), 73.
730
Tony Judt, The Burden of Responsibility: Blum, Camus, Aron, and
the French Twentieth Century (Chicago, 1998), 123, 135.
731
Rene Dubos, So Human an Animal (New York, 1968), 61, 101, 128,
xii.
732
Ralph Waldo Emerson, "Fate," Selections from Ralph Waldo
Emerson (Boston, 1957), 330.
733
Ibid., 142-43. See also John Gray, Isaiah Berlin (Princeton, 1996),
23-24; Philip K. Howard, The Death of Common Sense, 12, 60-61, 87,
186-187.
734
Michael Berube and Cary Nelson, eds., Higher Education Under
Fire: Politics, Economics, and the Crisis of the Humanities (New York,
1995); Martha Nussbaum, Cultivating Humanity: A Classical Defense
of Reform in Liberal Education (Cambridge, MA, 1997); Sander
Gilman, The Fortunes of the Humanities: Thoughts for After the Year
2000 (Stanford, 2000); Richard Wolin, "Reflections on the Crisis in the
Humanities," The Hedgehog Review 13, no. 2 (Summer 2011): 8-20.
735
On higher education see Jonathan R. Cole, The Great American
University: Its Rise to Preeminence, Its Indispensable National Role,
and Why It Must Be Protected (New York, 2009), 100, 151, 155; Louis
Menand, The Marketplace of Ideas: Reform and Resistance in the
American University (New York, 2010). On K-12 education see Diane
Ravitch, The Death and Life of the Great American School System:
How Testing and Choice are Undermining Education (New York,
2010), 226.
736
Cole, The Great American University, 155; Ravitch, The Death and
Life of the Great American School System.
737
Martha C. Nussbaum, Cultivating Humanity: A Classical Defense of
Reform in Liberal Education (Cambridge, MA, 1997), 8-9; Richard
Wolin, "Reflections on the Crisis in the Humanities," 10.
738
Owen Flanagan, The Bodhisattva's Brain: Buddhism Naturalized
(Cambridge, MA, 2011), 109.
739
Menand, The Marketplace of Ideas, 56, 51.
740
Ibid., 56.
741
Nussbaum, Cultivating Humanity, 295.
742
M. H. Abrams, "The Transformation of English Studies: 1930-
1995," American Academic Culture in Transformation, Thomas Bender
and Carl E. Schorske, eds. (Princeton, 1997), 144. See also Sarah
Bakewell, How to Live: A Life of Montaigne in One Question and
Twenty Attempts at an Answer (New York, 2010).
743
John Dewey, Democracy and Education (New York, 1916); Paulo
Freire, Pedagogy of Freedom: Ethics, Democracy, and Civic Courage
(Lanham, 2001); Amy Gutmann, Democratic Education (Princeton,
1987).
744
Amartya Sen, Development as Freedom (New York, 1999), 283.
745
John Gray, Isaiah Berlin (Princeton, 1996), 23-24.
746
Ibid., 284. See also Bakewell, How to Live: A Life of Montaigne in
One Question and Twenty Attempts at an Answer, 320.
747
Owen Flanagan, The Really Hard Problem: Meaning in a Material
World (Cambridge, MA, 2007), 1.
748
Seneca, On Anger, qtd. in Martha C. Nussbaum, Cultivating
Humanity: A Classical Defense of Reform in Liberal Education
(Cambridge, MA, 1997), xiii.
Selected Bibliography

Abbott, Andrew. Chaos of Disciplines. Chicago, 2001.
Abrams, M. H. The Mirror and the Lamp: Romantic Theory and the Critical
Tradition. Oxford, UK, 1971.
Adams, Henry. The Education of Henry Adams. Boston, 1961.
Adas, Michael. Machines as the Measure of Men: Science, Technology, and
Ideologies of Western Dominance. Ithaca, 1990.
Anderson, Brian C. Raymond Aron: The Recovery of the Political. Lanham, 1997.
Arendt, Hannah. The Human Condition. Chicago, 1958.
Armstrong, Karen. Buddha. New York, 2001.
Aron, Raymond. The Century of Total War. Lanham, 1985.
Aron, Raymond. The Dawn of Universal History: Selected Essays from a Witness
of the Twentieth Century. Trans. Barbara Bray. New York, 2002.
Aron, Raymond. The Opium of the Intellectuals. New York, 1962.
Arthur, W. Brian. The Nature of Technology: What It Is and How It Evolves.
New York, 2009.
Ashcroft, Bill, Gareth Griffiths, and Helen Tiffin. (eds.). The Post-Colonial
Studies Reader. London, 1995.
Attar, Samar. The Vital Roots of European Enlightenment: Ibn Tufayl's Influence
on Modern Western Thought. Lanham, 2007.
Auden, W. H. Collected Poems. New York, 1991.
Avineri, Shlomo. Hegel's Theory of the Modern State. Cambridge, UK, 1989.
Bakan, Joel. The Corporation: The Pathological Pursuit of Profit and Power.
New York, 2005.
Bakewell, Sarah. How to Live: A Life of Montaigne in One Question and Twenty
Attempts at an Answer. New York, 2010.
Beach, J. M. Studies in Poetry: The Visionary. Lanham, 2004.
Beach, J. M. Studies in Ideology: Essays on Culture and Subjectivity. Lanham,
2005.
Bender, Thomas, and Carl E. Schorske. (Eds.). American Academic Culture in
Transformation. Princeton, 1998.
Bender, Thomas. "Politics, Intellect, and the American University: 1945-1995."
American Academic Culture in Transformation. Thomas Bender and
Carl E. Schorske (eds.). Princeton, 1997.
Berger, Peter L. (ed.). The Limits of Social Cohesion: Conflict and Mediation in
Pluralist Societies. Boulder, 1998.
Berger, Peter L., and Thomas Luckmann. The Social Construction of Reality: A
Treatise in the Sociology of Knowledge. New York, 1966.
Berlin, Brent. Ethnobiological Classification: Principles of Categorization of
Plants and Animals in Traditional Societies. Princeton, 1992.
Berlin, Isaiah. The Proper Study of Mankind. New York, 2000.
Berube, Michael, and Cary Nelson. (eds.). Higher Education Under Fire:
Politics, Economics, and the Crisis of the Humanities. New York, 1995.
Biagioli, Mario. Galileo, Courtier: The Practice of Science in the Culture of
Absolutism. Chicago, 1993.
Blackmore, Susan. The Meme Machine. Oxford, 1999.
Blake, William. The Complete Poetry and Prose of William Blake. David V.
Erdman, ed. New York, 1988.
Bloom, Allan. The Closing of the American Mind. New York, 1988.
Bloor, David. Knowledge and Social Imagery. 2nd ed. Chicago, 1991.
Borges, Jorge Luis. "On Exactitude in Science." The Aleph and Other Stories.
Trans. Andrew Hurley. New York, 2000.
Bourdieu, Pierre. Outline of a Theory of Practice. Cambridge, UK, 2010.
Bowler, Peter J. Reconciling Science and Religion: The Debate in Early
Twentieth Century Britain. Chicago, 2001.
Boyer, Pascal, and James V. Wertsch. Memory in Mind and Culture. Cambridge,
UK, 2009.
Boyer, Pascal. Religion Explained: The Evolutionary Origins of Religious
Thought. New York, 2001.
Brecht, Bertolt. Brecht: Collected Plays. Vol. 5. New York, 1972.
Breckman, Warren. Marx, The Young Hegelians, and the Origins of Radical
Social Theory. Cambridge, UK, 1999.
Breisach, Ernst. Historiography: Ancient, Medieval, and Modern. 3rd ed.
Chicago, 2007.
Brewer, Talbot. The Retrieval of Ethics. Oxford, UK, 2009.
Brint, Steven, and Jerome Karabel. “Institutional Origins and Transformations:
The Case of American Community Colleges,” In The New
Institutionalism in Organizational Analysis, edited by Paul J. DiMaggio
and Walter W. Powell. Chicago, 1991.
Bruner, Jerome. On Knowing. Cambridge, MA, 1979.
Budd, Susan. Varieties of Unbelief: Atheists and Agnostics in English Society,
1850-1960. London, 1977.
Buddha. The Dhammapada. Trans. John Ross Carter and Mahinda Palihawadana.
Oxford, 2000.
Burke, Kenneth. A Grammar of Motives. Berkeley, 1969.
Burke, Kenneth. A Rhetoric of Motives. 1950. Berkeley, 1969.
Burke, Kenneth. Attitudes Toward History. 3rd Ed. Berkeley, 1984.
Burke, Kenneth. Language as Symbolic Action: Essays on Life, Literature, and
Method. Berkeley, 1966.
Burke, Kenneth. Permanence and Change: An Anatomy of Purpose. 3rd ed.
Berkeley, 1984.
Burke, Kenneth. The Philosophy of Literary Form. 3rd ed. Berkeley, 1973.
Burke, Kenneth. Counter-Statement. Berkeley, 1968.
Campbell, John L. “Institutional Analysis and the Role of Ideas in Political
Economy.” In The Rise of Neoliberalism and Institutional Analysis,
edited by John L. Campbell and Ove K. Pedersen. Princeton, 2001.
Capra, Fritjof. The Web of Life: A New Scientific Understanding of Living
Systems. New York, 1996.
Carr, David. “Narrative Explanation and Its Malcontents.” History and Theory, 47
(Feb 2008): 9-30.
Clark, Ronald W. Einstein: The Life and Times. New York, 1971.
Clemens, Elisabeth S. The People's Lobby: Organizational Innovation and the
Rise of Interest Group Politics in the United States, 1890-1925.
Chicago, 1997.
Cohen, I. Bernard. Revolution in Science. Cambridge, UK, 1985.
Cole, Jonathan R. The Great American University: Its Rise to Preeminence, Its
Indispensable National Role, and Why It Must Be Protected. New
York, 2009.
Coleman, James. Introduction to Mathematical Sociology. New York, 1964.
Collier, Andrew. "Critical Realism." The Politics of Method in the Human
Sciences. George Steinmetz (ed.). Durham, 2005.
Confucius. The Analects. Trans. David Hinton. Washington, DC, 1998.
Craig, William Lane, and Quentin Smith. Theism, Atheism, and Big Bang
Cosmology. Oxford, UK, 1993.
Damasio, Antonio. The Feeling of What Happens. New York, 1999.
Dawkins, Richard. River Out of Eden: A Darwinian View of Life. New York,
1995.
Dawkins, Richard. The God Delusion. New York, 2006.
Dawkins, Richard. The Selfish Gene. Oxford, UK, 1989.
Dawley, Alan. Struggles for Justice: Social Responsibility and the Liberal State.
Cambridge, MA, 1991.
DeLillo, Don. White Noise. New York, 2009.
Dennett, Daniel C. Breaking the Spell: Religion as a Natural Phenomenon. New
York, 2006.
Dennett, Daniel C. Consciousness Explained. New York, 1991.
Dennett, Daniel C. Darwin’s Dangerous Idea: Evolution and the Meaning of Life.
New York, 1995.
Dennett, Daniel C. Freedom Evolves. New York, 2003.
Derrida, Jacques. Limited Inc. Trans. Samuel Weber and Jeffrey Mehlman.
Evanston, 1977.
Deutsch, David. The Fabric of Reality: The Science of Parallel Universes - and
Its Implications. New York, 1997.
Dewey, John. Democracy and Education. New York, 1916.
Diamond, Jared. Collapse: How Societies Choose to Fail or Succeed. New York,
2005.
Diamond, Jared. The Third Chimpanzee: The Evolution and Future of the Human
Animal. New York, 1992.
Didion, Joan. Slouching Towards Bethlehem. New York, 2008.
Dillard, Annie. The Writing Life. New York, 1989.
Dilthey, Wilhelm. Dilthey: Selected Writings. Cambridge, UK, 1976.
DiMaggio, Paul J., and Walter W. Powell. “The Iron Cage Revisited:
Institutional Isomorphism and Collective Rationality in Organizational
Fields.” In The New Institutionalism in Organizational Analysis,
edited by Paul J. DiMaggio and Walter W. Powell. 1983. Chicago,
1991.
Dubos, Rene. So Human an Animal. New York, 1968.
Dubos, Rene. A God Within: A Positive Philosophy for a More Complete
Fulfillment of Human Potentials. New York, 1972.
Durkheim, Emile. The Elementary Forms of the Religious Life. New York, 1965.
Eagleton, Terry. Ideology: An Introduction. London, 1991.
Eliade, Mircea. The Sacred and the Profane: The Nature of Religion. New York,
1987.
Eliot, T. S. Collected Poems, 1909- 1962. New York, 1963.
Emerson, Ralph Waldo. Selections from Ralph Waldo Emerson. Boston, 1957.
Febvre, Lucien, and Henri-Jean Martin. The Coming of the Book: The Impact of
Printing, 1450- 1800. London, 2010.
Feyerabend, Paul. Against Method. 4th ed. London, 2010.
Feyerabend, Paul. The Tyranny of Science. Cambridge, UK, 2011.
Fish, Stanley. There's No Such Thing as Free Speech...and it's a good thing too.
Oxford, UK, 1994.
Fitzgerald, Timothy. The Ideology of Religious Studies. Oxford, UK, 2000.
Flanagan, Owen. The Bodhisattva's Brain: Buddhism Naturalized. Cambridge,
MA, 2011.
Flanagan, Owen. Self Expressions: Mind, Morals, and the Meaning of Life.
Oxford, UK, 1996.
Flanagan, Owen. The Really Hard Problem: Meaning in a Material World.
Cambridge, MA, 2007.
Foner, Eric. The Story of American Freedom. New York, 1998.
Foucault, Michel. The Foucault Reader. New York, 1984.
Foucault, Michel. The Order of Things. New York, 1970.
Freud, Sigmund. Civilization and Its Discontents. Trans and Ed. James Strachey.
New York, 1961.
Freud, Sigmund. On Dreams. Trans and Ed. James Strachey. New York, 1980.
Friedland, Roger, and Robert R. Alford. “Bringing Society Back In: Symbols,
Practices, and Institutional Contradictions.” In The New
Institutionalism in Organizational Analysis, edited by Paul J. DiMaggio
and Walter W. Powell. Chicago, 1991.
Frijda, N. H. The Emotions. Cambridge, UK, 1987.
Fromm, Erich. Beyond the Chains of Illusion. New York, 1966.
Fukuyama, Francis. The End of History and the Last Man. New York, 1992.
Galbraith, John Kenneth. The Essential Galbraith. New York, 2001.
Gaukroger, Stephen. Francis Bacon and the Transformation of Early-Modern
Philosophy. Cambridge, UK, 2001.
Gay, Peter. The Enlightenment: The Rise of Modern Paganism. New York, 1995.
Gay, Peter. The Enlightenment: The Science of Freedom. New York, 1969.
Gay, Peter. The Party of Humanity: Essays in the French Enlightenment. New
York, 1971.
Geertz, Clifford. Local Knowledge. New York, 1983.
Geertz, Clifford. The Interpretation of Cultures. New York, 2000.
Giddens, Anthony. Capitalism and Modern Social Theory: An Analysis of the
Writing of Marx, Durkheim, and Max Weber. Cambridge, UK, 1971.
Gilman, Sander. The Fortunes of the Humanities: Thoughts for After the Year
2000. Stanford, 2000.
Glover, Jonathan. Humanity: A Moral History of the Twentieth Century. New
Haven, 2000.
Goodman, Nelson. Ways of Worldmaking. Indianapolis, 1978.
Goodstein, David. On Fact and Fraud: Cautionary Tales from the Front Lines of
Science. Princeton, 2010.
Goody, Jack, and Ian Watt. "The Consequences of Literacy." Comparative
Studies in History and Society, 5 (1963): 304-345.
Goody, Jack. The Domestication of the Savage Mind. Cambridge, UK, 1977.
Goody, Jack. The Logic of Writing and the Organization of Society. Cambridge,
UK, 1986.
Gottlieb, Anthony. The Dream of Reason: A History of Philosophy from the
Greeks to the Renaissance. New York, 2001.
Gottlieb, Roger, S. Marxism 1844- 1990: Origins, Betrayal, Rebirth. New York,
1992.
Graff, Gerald. Beyond the Culture Wars: How Teaching the Conflicts Can
Revitalize American Education. New York, 1992.
Gray, John. Enlightenment's Wake. New York, 2009.
Gray, John. Essays in Political Philosophy. London, 1989.
Gray, John. Isaiah Berlin. Princeton, 1996.
Gray, John. Black Mass: Apocalyptic Religion and the Death of Utopia. New
York, 2007.
Gray, John. False Dawn: The Delusions of Global Capitalism. New York, 1998.
Gray, John. Straw Dogs: Thoughts on Humans and Other Animals. New York,
2003.
Greene, Brian. The Elegant Universe: Superstrings, Hidden Dimensions, and the
Quest for the Ultimate Theory. New York, 2003.
Gutmann, Amy. Democratic Education. Princeton, 1987.
Hacking, Ian. Representing and Intervening: Introductory Topics in the
Philosophy of Natural Science. Cambridge, UK, 1983.
Hardt, Michael, and Antonio Negri. Empire. Cambridge, MA, 2000.
Harley, Diane, and Sophia Krzys Acord. Peer Review in Academic Promotion and
Publishing: Its Meaning, Locus, and Future. Center for Studies in
Higher Education. University of California at Berkeley. March 2011.
Harris, Sam. The End of Faith: Religion, Terror, and the Future of Reason. New
York, 2004.
Harty, Siobhan. “Theorizing Institutional Change.” In New Institutionalism:
Theory and Analysis, edited by Andre Lecours. Toronto, 2005.
Haskell, Thomas L. The Emergence of Professional Social Science. Urbana,
1977.
Haught, John F. Science and Religion: From Conflict to Conversation. Mahwah,
1995.
Hayek, F. A. The Road to Serfdom. Chicago, 1994.
Hecht, Jennifer Michael. Doubt: A History. New York, 2003.
Hegel, Georg W. F. The Philosophy of History. Amherst, 1991.
Heidegger, Martin. Poetry, Language, Thought. New York, 1971.
Heisenberg, Werner. Physics and Philosophy: The Revolution in Modern Science.
New York, 2007.
Hitchens, Christopher. Arguably: Essays by Christopher Hitchens. New York,
2011.
Hitchens, Christopher. God is not Great: How Religion Poisons Everything. New
York, 2007.
Hobsbawm, Eric. The Age of Empire, 1875- 1914. New York, 1989.
Hobsbawm, Eric. The Age of Revolution, 1789- 1848. New York, 1996.
Hochschild, Adam. Bury the Chains: Prophets and Rebels in the Fight to Free an
Empire's Slaves. New York, 2005.
Hofstadter, Richard. Social Darwinism in American Thought. Boston, 1992.
Howard, Philip K. The Death of Common Sense: How Law is Suffocating
America. New York, 1994.
Howe, Daniel Walker. What Hath God Wrought: The Transformation of America,
1815- 1848. Oxford, 2007.
Hughes, J. Donald. An Environmental History of the World. New York, 2009.
Hume, David. A Treatise of Human Nature. Oxford, UK, 1888.
Hunt, Morton. The Story of Psychology. New York, 1993.
Hunter, James Davison. Before the Shooting Begins: Searching for Democracy in
America's Culture War. New York, 1994.
Hunter, James Davison. Culture Wars: The Struggle to Define America. New
York, 1991.
Hurston, Zora Neale. Dust Tracks On a Road. 2nd Ed. Urbana, 1984.
Hyman, Louis. Borrow: The American Way of Debt. New York, 2012.
Isaacson, Walter. Einstein: His Life and Universe. New York, 2007.
Jacoby, Susan. Freethinkers: A History of American Secularism. New York,
2004.
James, William. Pragmatism, Writings 1902-1910. New York, 1988.
Jaspers, Karl. Philosophy Is for Everyman: A Short Course in Philosophical
Thinking. New York, 1967.
Jennings, Ken. Maphead: Charting the Wide, Weird World of Geography Wonks. New
York, 2012.
Jepperson, Ronald L. “Institutions, Institutional Effects, and Institutionalism,” In
The New Institutionalism in Organizational Analysis, edited by Paul J.
DiMaggio and Walter W. Powell. Chicago, 1991.
Jepperson, Ronald L., and John W. Meyer. “The Public Order and the
Construction of Formal Organizations.” In The New Institutionalism in
Organizational Analysis, edited by Paul J. DiMaggio and Walter W.
Powell. Chicago, 1991.
Johnson, Steven. Emergence: The Connected Lives of Ants, Brains, Cities, and
Software. New York, 2000.
Judson, Horace Freeland. The Great Betrayal: Fraud in Science. New York,
2004.
Judt, Tony. The Burden of Responsibility: Blum, Camus, Aron and the French
Twentieth Century. Chicago, 1998.
Kahneman, Daniel. Thinking, Fast and Slow. New York, 2011.
Kant, Immanuel. Critique of Pure Reason. Trans. J. M. D. Meiklejohn. London,
1993.
Kant, Immanuel. Practical Philosophy: The Cambridge Edition of the Works of
Immanuel Kant. Trans. Mary J. Gregor. Cambridge, UK, 1996.
Burke, Kenneth. The Rhetoric of Religion: Studies in Logology. Berkeley, 1970.
King, Desmond. In the Name of Liberalism: Illiberal Social Policy in the United
States and Britain. Oxford, UK, 1999.
Kirsch, Irving. The Emperor's New Drugs: Exploding the Antidepressant Myth.
New York, 2010.
Klee, Robert. (Ed.). Scientific Inquiry: Readings in the Philosophy of Science.
Oxford, UK, 1999.
Kreps, David M. "Economics - The Current Position." American Academic
Culture in Transformation. Thomas Bender and Carl E. Schorske,
(eds.). Princeton, 1997.
Kuhn, Thomas S. The Structure of Scientific Revolutions. Chicago, 1962.
Kuklick, Bruce. The Rise of American Philosophy: Cambridge, Massachusetts,
1860- 1930. New Haven, 1977.
Kurzweil, Ray. The Singularity is Near: When Humans Transcend Biology. New
York, 2005.
Lakoff, George. Women, Fire, and Dangerous Things: What Categories Reveal
About the Mind. Chicago, 1987.
Lawson, Tony. “Economics and Critical Realism: A Perspective on Modern
Economics.” The Politics of Method in the Human Sciences. Durham,
2005.
Lazarus, Richard. Emotion and Adaptation. Oxford, UK, 1994.
Levi, Primo. Survival in Auschwitz. New York, 1966.
Levi, Primo. The Drowned and the Saved. New York, 1988.
Levi, Primo. The Reawakening. New York, 1995.
Levin, Nora. The Holocaust Years: The Nazi Destruction of European Jewry,
1933- 1945. Malabar, 1990.
Levi-Strauss, Claude. The Savage Mind. London, 1966.
Levitt, Steven D., and Stephen J. Dubner. Freakonomics. New York, 2009.
Lewis, Michael. The Big Short: Inside the Doomsday Machine. New York, 2011.
Lindblom, Charles E. "Political Science in the 1940s and 1950s." American
Academic Culture in Transformation. Thomas Bender and Carl E.
Schorske. (eds.). Princeton, 1997.
Lindblom, Charles E. Inquiry and Change: The Troubled Attempt to Understand
and Shape Society. New Haven, 1990.
Lindblom, Charles E., and David K. Cohen. Usable Knowledge: Social Science
and Social Problem Solving. New Haven, 1979.
Lindley, David. Uncertainty: Einstein, Heisenberg, Bohr, and the Struggle for the
Soul of Science. New York, 2008.
Lippmann, Walter. Public Opinion. New York, 1922.
Lynas, Mark. The God Species: Saving the Planet in the Age of Humans. New
York, 2011.
MacIntyre, Alasdair. After Virtue: A Study in Moral Theory. 3rd ed. Notre Dame,
2007.
Mah, Harold. The End of Philosophy, The Origin of "Ideology": Karl Marx and
the Crisis of the Young Hegelians. Berkeley, 1987.
Mailer, Norman. The Armies of the Night: History as a Novel, the Novel as
History. New York, 1994.
Mannheim, Karl. Ideology and Utopia: An Introduction to the Sociology of
Knowledge, translated by Louis Wirth & Edward A. Shils. New York,
1936.
Manuel, Frank E. (ed.). Utopias and Utopian Thought. Boston, 1966.
Marchand, Trevor H. J. "Making Knowledge: Explorations of the Indissoluble
Relation between Minds, Bodies, and Environment." Journal of the
Royal Anthropological Institute (2010): S1-S2.
Marx, Karl. The Marx-Engels Reader. 2nd ed. New York, 1978.
Marx, Karl. Karl Marx: Early Writings. London, 1992.
Mayr, Ernst. This Is Biology: The Science of the Living World. Cambridge, MA,
1997.
Mazlish, Bruce. A New Science: The Breakdown of Connections and the Birth of
Sociology. Oxford, UK, 1989.
McGerr, Michael. A Fierce Discontent: The Rise and Fall of the Progressive
Movement in America. Oxford, UK, 2003.
McNeill, William H. Mythistory and Other Essays. Chicago, 1986.
Mearsheimer, John J. Why Leaders Lie: The Truth about Lying in International
Politics. Oxford, UK, 2011.
Menand, Louis. The Marketplace of Ideas: Reform and Resistance in the
American University. New York, 2010.
Mencius. Mencius. Trans. David Hinton. Washington, DC, 1998.
Mercier, Hugo, and Dan Sperber. "Why Do Humans Reason? Arguments for an
Argumentative Theory." Behavioral and Brain Sciences 34, no. 2
(2011): 57- 74.
Meyer, John W., and Brian Rowan. “Institutionalized Organizations: Formal
Structure as Myth and Ceremony.” In The New Institutionalism in
Organizational Analysis, edited by Paul J. DiMaggio and Walter W.
Powell. Chicago, 1991.
Miller, Henry. Tropic of Capricorn. New York, 1961.
Miller, James. The Passion of Michel Foucault. New York, 1993.
Mills, C. Wright. The Sociological Imagination. Oxford, UK, 2000.
Mitchell, Mark T. Michael Polanyi. Wilmington, 2006.
More, Thomas. Utopia. New York, 1965.
Moerman, Daniel E. "The Meaning Response: Thinking About Placebos." Pain
Practice 6, no. 4 (2006): 233-36.
Morgan, Edmund S. Inventing the People: The Rise of Popular Sovereignty in
England and America. New York, 1988.
Nagel, Thomas. Mortal Questions. Cambridge, UK, 1979.
Nietzsche, Friedrich. Beyond Good and Evil. Trans. Walter Kaufmann. New York,
1989.
Nietzsche, Friedrich. Human, All too Human: A Book for Free Spirits. Trans.
Marian Faber and Stephen Lehmann. Lincoln, 1996.
North, Douglass C. Structure and Change in Economic History. New York, 1981.
Novick, Peter. That Noble Dream: The ‘Objectivity’ Question and the American
Historical Profession. Cambridge, UK, 1988.
Nozick, Robert. The Examined Life: Philosophical Meditations. New York, 1989.
Nussbaum, Martha C. Creating Capabilities: The Human Development
Approach. Cambridge, MA, 2011.
Nussbaum, Martha C. Cultivating Humanity: A Classical Defense of Reform in
Liberal Education. Cambridge, MA, 1997.
O’Brien, Tim. The Things They Carried. New York, 1990.
Ong, Walter J. Ramus, Method, and the Decay of Dialogue. Chicago, 2004.
Oreskes, Naomi, and Erik M. Conway. Merchants of Doubt: How a Handful of
Scientists Obscured the Truth on Issues from Tobacco Smoke to Global
Warming. New York, 2010.
Ortner, Sherry B. Anthropology and Social Theory: Culture, Power, and the
Acting Subject. Durham, 2006.
Ostrom, Elinor. Governing the Commons: The Evolution of Institutions for
Collective Action. Cambridge, UK, 1990.
Ostrom, Elinor. Understanding Institutional Diversity. Princeton, 2005.
Painter, Nell Irvin. Standing at Armageddon: The United States, 1877- 1919.
New York, 1987.
Pascal, Blaise. Pensees and Other Writings. Trans. Honor Levi. Oxford, UK,
1995.
Peterson, Michael L., and Raymond J. Vanarragon. Contemporary Debates in
Philosophy of Religion. Oxford, UK, 2004.
Pierson, Paul. Politics in Time: History, Institutions, and Social Analysis.
Princeton, 2004.
Pinker, Steven. How the Mind Works. New York, 1997.
Pinker, Steven. The Blank Slate: The Modern Denial of Human Nature. New
York, 2003.
Plato. Plato: Complete Works. John M. Cooper, ed. Indianapolis, 1997.
Polanyi, Michael. Personal Knowledge: Towards a Post-Critical Philosophy.
Chicago, 1962.
Polanyi, Michael. Science, Faith and Society: A Searching Examination of the
Meaning and Nature of Scientific Inquiry. Chicago, 1964.
Popper, Karl R. The Logic of Scientific Discovery. London, 2002.
Popper, Karl R. Objective Knowledge: An Evolutionary Approach. Revised
edition. Oxford, UK, 1979.
Powell, Walter W. “Expanding the Scope of Institutional Analysis.” In The New
Institutionalism in Organizational Analysis, edited by Paul J. DiMaggio
and Walter W. Powell, eds. Chicago, 1991.
Powell, Walter W., and Paul J. DiMaggio, eds., The New Institutionalism in
Organizational Analysis. Chicago, 1991.
Prigogine, Ilya, and Isabelle Stengers. Order Out of Chaos: Man's New Dialogue
with Nature. New York, 1984.
Putnam, Hilary. "A Half Century of Philosophy, Viewed from Within." American
Academic Culture in Transformation. Thomas Bender and Carl E.
Schorske. (eds.). Princeton, 1997.
Putnam, Robert D., and David E. Campbell. American Grace: How Religion
Divides and Unites Us. New York, 2010.
Randall, Lisa. Knocking on Heaven's Door: How Physics and Scientific Thinking
Illuminate the Universe and the Modern World. New York, 2011.
Ravitch, Diane. The Death and Life of the Great American School System: How
Testing and Choice are Undermining Education. New York, 2010.
Reeves, Richard. John Stuart Mill: Victorian Firebrand. London, 2007.
Ricoeur, Paul. Memory, History, Forgetting. Trans. Kathleen Blamey and David
Pellauer. Chicago, 2004.
Ricoeur, Paul. Oneself as Another. Trans. Kathleen Blamey. Chicago, 1992.
Ridley, Matt. Nature Via Nurture: Genes, Experience, and What Makes Us
Human. New York, 2003.
Ridley, Matt. The Rational Optimist: How Prosperity Evolves. New York, 2010.
Roberts, Tyler T. Contesting Spirit: Nietzsche, Affirmation, Religion. Princeton,
1998.
Rorty, Richard (ed.). The Linguistic Turn: Essays in Philosophical Method.
Chicago, 1992.
Rorty, Richard. Philosophy and the Mirror of Nature. Princeton, 1979.
Rorty, Richard. Contingency, Irony, and Solidarity. Cambridge, UK, 1989.
Roth, John K., and Michael Berenbaum. (eds.). Holocaust: Religious and
Philosophical Implications. St. Paul, 1989.
Rubin, David C. Memory in Oral Traditions: The Cognitive Psychology of Epics,
Ballads, and Counting-out Rhymes. Oxford, UK, 1997.
Rue, Loyal. Religion Is Not About God: How Spiritual Traditions Nurture Our
Biological Nature. New Brunswick, 2005.
Russell, Bertrand. Why I am Not a Christian and Other Essays on Religion and
Related Subjects. New York, 1957.
Schell, Jonathan. The Fate of the Earth and Abolition. Stanford, 2000.
Schorske, Carl E. "The New Rigorism in the Human Sciences, 1940- 1960."
American Academic Culture in Transformation. Thomas Bender and
Carl E. Schorske (eds.). Princeton, 1997.
Scruton, Roger. "Kant." German Philosophers. Oxford, UK, 2001.
Searle, John R. Consciousness and Language. Cambridge, UK, 2002.
Selznick, Philip. The Moral Commonwealth: Social Theory and the Promise of
Community. Berkeley, 1992.
Sen, Amartya. Development as Freedom. New York, 1999.
Sen, Amartya. The Argumentative Indian. New York, 2005.
Sen, Amartya. The Idea of Justice. Cambridge, MA, 2009.
Shapin, Steven. Never Pure: Historical Studies of Science as if It Was Produced
by People with Bodies, Situated in Time, Space, Culture, and Society,
and Struggling for Credibility and Authority. Baltimore, 2010.
Shelley, Percy Bysshe. The Necessity of Atheism and Other Essays. New York,
1993.
Shelley, Percy Bysshe. Shelley's Poetry and Prose. Ed. Donald H. Reiman and Sharon B.
Powers. New York, 1977.
Skilton, Andrew. A Concise History of Buddhism. Birmingham, UK, 1997.
Smith, Adam. An Inquiry into the Nature and Causes of the Wealth of Nations.
Oxford, UK, 2008.
Smith, Merritt Roe, and Leo Marx. (eds.). Does Technology Drive History? The
Dilemma of Technological Determinism. Cambridge, MA, 1994.
Smith, Rogers M. Civic Ideals: Conflicting Visions of Citizenship in U. S.
History. New Haven, 1997.
Snyder, Timothy. Bloodlands: Europe between Hitler and Stalin. New York,
2010.
Sokolowski, Robert. Introduction to Phenomenology. Cambridge, UK, 2000.
Solow, Robert M. "How Did Economics Get That Way and What Way Did it
Get?" American Academic Culture in Transformation. Thomas Bender
and Carl E. Schorske (eds.). Princeton, 1997.
Starr, Harris E. William Graham Sumner. New York, 1925.
Steinmetz, George. Ed. The Politics of Method in the Human Sciences:
Positivism and Its Epistemological Others. Durham, 2005.
Stenger, Victor J. God: The Failed Hypothesis – How Science Shows That God
Does Not Exist. Amherst, 2007.
Stenger, Victor J. The New Atheism: Taking a Stand for Science and Reason.
Amherst, 2009.
Stepelevich, Lawrence S. The Young Hegelians: An Anthology. Atlantic
Highlands, 1997.
Suzuki, D. T. Zen Buddhism: Selected Writings of D. T. Suzuki. New York, 1996.
Taubes, Gary. Good Calories, Bad Calories: Fats, Carbs, and the Controversial
Science of Diet and Health. New York, 2008.
Taylor, Charles. A Secular Age. Cambridge, MA, 2007.
Thompson, Hunter S. Fear and Loathing in Las Vegas. New York, 1998.
Thoreau, Henry David. Walden and Resistance to Civil Government. 2nd ed.
New York, 1992.
Toomer, G. J. Eastern Wisdom and Learning: The Study of Arabic in
Seventeenth-Century England. Oxford, UK, 1996.
Toulmin, Stephen. Return to Reason. Cambridge, MA, 2001.
Toulmin, Stephen. The Philosophy of Science. London, 1953.
Toulmin, Stephen. Foresight and Understanding: An Inquiry into the Aims of
Science. New York, 1961.
Toulmin, Stephen. The Uses of Argument. Cambridge, UK, 1958.
Turgenev, Ivan. Fathers and Sons. Trans. Rosemary Edmonds. London, 1965.
Turner, Frank M. Between Science and Religion: The Reaction to Scientific
Naturalism in Late Victorian England. New Haven, 1974.
Turner, Mark. The Literary Mind. Oxford, UK, 1996.
Vlastos, Gregory. Socrates: Ironist and Moral Philosopher. Ithaca, 1991.
Vlastos, Gregory. Studies in Greek Philosophy Volume II: Socrates, Plato, and
Their Tradition. Princeton, 1995.
Vonnegut, Kurt. Slaughterhouse-Five. New York, 1969.
Watt, Ian. The Rise of the Novel. Berkeley, 1957.
Weatherford, Jack. Genghis Khan and the Making of the Modern World. New
York, 2004.
Weber, Eugen. Varieties of Fascism: Doctrines of Revolution in the Twentieth
Century. Malabar, 1982.
Weber, Max. The Sociology of Religion. Trans. Ephraim Fischoff. Boston, 1963.
Wess, Robert. Kenneth Burke: Rhetoric, Subjectivity, Postmodernism.
Cambridge, UK, 1996.
Whitehead, Alfred North. Science and the Modern World. Cambridge, UK, 1926.
Whorf, Benjamin Lee. Language, Thought, and Reality. Cambridge, MA, 1956.
Williams, Raymond. Keywords: A Vocabulary of Culture and Society. Revised
Edition. Oxford, UK, 1983.
Wills, Garry. Inventing America: Jefferson's Declaration of Independence. New
York, 1978.
Wilson, E. O. Sociobiology. Cambridge, MA, 2000.
Wilson, E. O. The Social Conquest of Earth. New York, 2012.
Wilson, E. O. Consilience: The Unity of Knowledge. New York, 1998.
Wittgenstein, Ludwig. Philosophical Investigations. Oxford, UK, 1953.
Wittgenstein, Ludwig. Tractatus Logico-Philosophicus. London, 2001.
Wolfe, Tom. The Electric Kool-Aid Acid Test. New York, 1999.
Wood, Gordon S. The Radicalism of the American Revolution. New York, 1991.
Wright, Alex. Glut: Mastering Information Through the Ages. Ithaca, 2007.
Yeats, W. B. The Collected Poems of W. B. Yeats. New York, 1996.
Zamoyski, Adam. Holy Madness: Romantics, Patriots, and Revolutionaries,
1776-1871. New York, 1999.
Zucker, Lynne G. “The Role of Institutionalization in Cultural Persistence.” In
The New Institutionalism in Organizational Analysis, edited by Paul J.
DiMaggio and Walter W. Powell. Chicago, 1991.

About the Author

J. M. Beach is a lecturer at the University of Texas, San Antonio.

He has advanced degrees in English, History, Philosophy, and Education.
He has been a teacher and educational administrator for over fifteen
years. Beach has taught many subjects in the Humanities to a broad
range of students, from pre-school all the way to university, in public and
private schools, in the U.S., South Korea, and China. Previously, Beach
was a lecturer at Oregon State University, the University of California,
and at several community colleges in Southern California and Central
Texas.

Beach's scholarly research focuses on several distinct but interrelated
subjects: the philosophy of knowledge, the science of culture and social
institutions, the history and philosophy of education, and literature.
Beach is also a published poet.

Links to his books, articles, and conference papers can be found on his
website, www.jmbeach.com. Follow his blogs at jmbeach.blogspot.com and
21stcenturycanon.blogspot.com.
