
REFLECTIONS ON NATURALISM

Reflections on Naturalism
Edited by
José Ignacio Galparsoro, University of the Basque Country, Spain
and
Alberto Cordero, City University of New York, USA

SENSE PUBLISHERS ROTTERDAM / BOSTON / TAIPEI

A C.I.P. record for this book is available from the Library of Congress.

ISBN 978-94-6209-294-5 (paperback)
ISBN 978-94-6209-295-2 (hardback)
ISBN 978-94-6209-296-9 (e-book)

Published by: Sense Publishers, P.O. Box 21858, 3001 AW Rotterdam, The Netherlands https://www.sensepublishers.com/

Printed on acid-free paper

All rights reserved. © 2013 Sense Publishers. No part of this work may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, microfilming, recording or otherwise, without written permission from the Publisher, with the exception of any material supplied specifically for the purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the work.

CONTENTS

Acknowledgements
List of Contributors
Introduction: Naturalism and Philosophy, by Alberto Cordero and José Ignacio Galparsoro
Metaphilosophy, Folk-philosophy and Naturalized Philosophy: A Naturalistic Approach, by José Ignacio Galparsoro
Naturalism and the Mind: The Final Questions, by Pablo Quintanilla
Measuring Morality, by Jesse Prinz
Naturalism and Scientific Realism, by Alberto Cordero
Handling Humility: Towards a Metaphysically Informed Naturalism, by Steven French
The Scientific Undercurrents of Philosophical Naturalism, by Sergio F. Martínez
Advantages and Risks of Naturalization: Converging Technologies Applied to Human Enhancement (Implications and Considerations for a Naturalist Philosophical Anthropology), by Nicanor Ursua
Naturalism and the Naturalization of Philosophy: Disputed Questions, by Julián Pacho

ACKNOWLEDGEMENTS

We thank the University of the Basque Country (Spain) for its organizational and financial contribution to research and seminars on naturalism at the San Sebastián Campus (Project "Naturalizing Philosophy: A Metaphilosophical Reflection in the Context of Contemporary Culture", EHU2009/03, led by professors Galparsoro, Pacho and Ursua). We would also like to express special gratitude to Professor Víctor Gómez Pin for his unfading encouragement all these years and for his contumacious example that it is still possible to aspire to disinterested knowledge. Our thanks go too to Robert Zuneska, M.A. (New York), for his collaboration with international coordination, and to Christopher Evans for his help with the translation of papers by seminar members in San Sebastián.

LIST OF CONTRIBUTORS

Alberto Cordero (Department of Philosophy, CUNY Graduate Center & Queens College CUNY, The City University of New York)
Steven French (Department of Philosophy, University of Leeds, U.K.)
José Ignacio Galparsoro (Department of Philosophy, University of the Basque Country, Spain)
Sergio F. Martínez (Institute for Philosophical Investigations, National Autonomous University of Mexico)
Julián Pacho (Department of Philosophy, University of the Basque Country, Spain)
Jesse Prinz (CUNY Graduate Center, The City University of New York)
Pablo Quintanilla (The Pontifical Catholic University of Peru, Lima)
Nicanor Ursua (Department of Philosophy, University of the Basque Country, Spain)

ALBERTO CORDERO AND JOSÉ IGNACIO GALPARSORO

INTRODUCTION
Naturalism and Philosophy

The precise character and scope of contemporary naturalism remain disputed issues, yet projects under that label do show discernible commonalities. In particular, naturalists grant exceptional cognitive status to the empirical sciences, although they do this in ways that vary from one author to another. Many, following John Dewey, strive to ground their view of human life in evolutionary biology and, more broadly, to replace traditional metaphysical and epistemological approaches with theories and methods continuous with those of the sciences. Some concentrate on the natural sciences, others take guidance from broader scientific disciplines. A strong version of naturalism, by Hans Reichenbach (1949), runs as follows:

"[Modern scientists] refuse to recognize the authority of the philosopher who claims to know the truth from intuition, from insight into a world of ideas or the nature of reason or the principles of being, or from whatever superempirical source. There is no separate entrance to truth for philosophers. The path of the philosopher is indicated by that of the scientist."[i]

Not all contemporary naturalist positions aim to cover as much as Reichenbach's package, however. Positions differ regarding the theses they hold. Two especially prominent are (to a first approximation):

1. Ontological naturalism, which asserts that all reality, including human life and society, is exhausted by what exists in the causal order of nature. This includes the view that all properties related to the mind depend ontologically on natural entities. Ontological naturalism thus rejects the existence of supernatural entities. Its various options include such positions as supervenient physicalism (e.g. Papineau, 1993) and broader pluralisms (e.g. Bunge, 1977, 1979).

2. Epistemological naturalism, which holds that there is no higher tribunal for knowledge than science. Different views on scientific knowledge make for different renditions of this thesis, but unifying traits include an emphasis on scientific justification and a learned distrust of ideas thought to be immune to empirical findings (rejection of apriorism). From the perspective of naturalism (presented sometimes as Methodological Naturalism), one makes the most sense of things by avoiding non-scientific approaches to knowing: research should pursue the kind and level of warrant the natural sciences achieve for their best hypotheses. Views under this banner range from liberal projects (e.g. Kitcher, 1992, 2001) to radical scientism (e.g. Rosenberg, 2011).

Naturalists who, like Reichenbach, support both theses use natural science and its methodologies as a framework for the discussion of philosophical problems: the study of knowledge, worries regarding the history of inquiry, epistemology, ontology, the rise and nature of mind, ethics, and so forth.

In modern science the earliest credible advances of strong naturalism came from evolutionary biology, especially as part of the discussion of Darwin's work. Building on the naturalization of biology proposed in the Origin, a subsequent book by Darwin, The Descent of Man, introduced a proposal to understand psychology and the rise of mind that ran contrary to traditional explanations in terms of vital forces and spiritualism. Darwin went as far as to propose that freedom and moral values might be rooted in natural selection. His daring way of looking at organic life and the mind has been an inspiration to naturalists ever since. Radical naturalists draw ontological lessons from Darwin, especially against dualism, a doctrine they think has become untenable (Danto, 1972, p. 448). As noted earlier, by affirming the continuity between all levels of reality naturalism opposes supernaturalism and transcendentalism (Ferrater Mora, 1990, p. 2315), with the consequence that, if naturalism is correct, neither human beings nor their cultural products can be considered supernatural: there is simply no room for spiritualist explanation (see Galparsoro's chapter in this volume).
1. THE SIRENS OF PHILOSOPHY

An influential view has naturalism as an anti-philosophical stance. It is an extreme view held mostly by scientistic naturalists:

"[I]n its reliance on our capacity for experimentation, discovery, and cumulative knowledge, [science] solves all the great puzzles that people have tried to deal with through faith, philosophy, and alcohol and drugs." (Rosenberg, 2011, p. viii)

This self-assured image does not represent the full spectrum of contemporary naturalism (including Rosenberg's own brand). Still, it is a view with considerable following. For example, in his Presidential PSA Address (2010), Larry Sklar objects to naturalists being confused by calls to eschew the siren call of philosophy, "with its attempts at restraining, controlling, or supplementing internal science": "I'd love to be a naturalist, if I only knew what naturalism was" (op. cit.). Sklar convincingly makes the case that foundational theories exist in a jungle of contending interpretations, where choice typically follows philosophical guidelines (in his view, from radical empiricism). Given the way scientists routinely include general philosophy in their research programs, projects like Reichenbach's will stand out as disingenuous if they seek to contrast science and philosophy radically.

However, few (if any) naturalists entirely eschew the siren call of philosophy. None of the leading projects today takes an explicitly anti-philosophical stance. Proposals diverge regarding the type of philosophy they let in, also on the sciences they consider exemplary, but the proposals on view are philosophical. This should not surprise. In the last two centuries, major scientists have explicitly drawn insight from philosophy: Charles Darwin, James C. Maxwell, Ernst Mach, Henri Poincaré, Albert Einstein, John B. Watson, Niels Bohr, Werner Heisenberg, Paul Dirac, and John S. Bell, among many others. Typically, as Sklar urges, philosophical considerations enter the fray of scientific thought in the context of conceptual puzzles that drive theoreticians into ontological, epistemological, and/or semantic research. In order to appreciate the relationship between scientific projects and philosophy, it will help to sketch two examples, one primarily focused on metaphysics and the other on epistemology.

(A) Much scientific theorizing and experimentation intertwines with metaphysics (see, in particular, Steven French and Alberto Cordero in this volume). One telling historical example of metaphysical entrapment concerns a posit that was confidently believed to exist until the early 1900s: the mechanical ether of light (Cordero, 2011). A peculiar entity almost from the start,[ii] the ether supposedly pervaded the universe without impairing any celestial motions, even though the transversal properties of light waves required the ether to have a fairly rigid structure. Yet there were good reasons for assuming its existence. First, all other known waves had a mechanical medium; secondly, physical explanations were thought incomplete unless they gave mechanical understanding.[iii] Last but not least, being a wave was identified with being a propagating perturbation, which means that waves required the existence of something capable of perturbation:

"[I find] the evidence quite overwhelming that light consists of undulations ... And, if waves, then a medium is required." (Stokes, 1884)

"If light consists of waves it is clear that they must be waves of something." (Thompson, 1897)

A substantial piece of received metaphysics lies behind this confidence. At its center is an ontological hierarchy that has the lowest level of being (shadows, smiles, modes of being in general) depending on finite substances (Cordero, op. cit.). This ancient view, revitalized for modern physics by Descartes (see e.g. Meditations III), shaped the classical understanding of waves. Placed at the lowest ontological level, waves could not exist without some material substratum whose traveling perturbation they were. Physical theorizing remained firmly embedded in this metaphysical framework until late in the 19th century. Separating the idea of a wave from the requirement of a medium needed a level of conceptual atomization that became viable only with the rise of positivist interpretations of science and null-result experiments (e.g. by Michelson and Morley).

The innovative conceptual separation needed to drop the ether was at the heart of both Einstein's revolutionary move at the dawn of the 20th century and the rage it initially aroused in the scientific establishment. As will be noted in section 3, less professional forms of metaphysics also influence physics, especially at personal, private levels of theorizing.

(B) Equally present in scientific theorizing are inputs from epistemology and methodology. Sklar (2010) singles out radical empiricism, but more moderate positions seemingly play a no less determinant role. Radical empiricist moves do occur, but their effectiveness tends to be short-lived. Typically, after a period of exuberance, radicals talk the talk far more than they walk the walk, at least in the more empirically successful disciplines. A supposedly archetypal case in point is quantum mechanics, a field dominated by a strongly empiricist rhetoric associated with the Copenhagen circle of Niels Bohr and Werner Heisenberg. Heisenberg, in particular, wanted a theory articulated exclusively in terms of quantities measurable by spectroscopy (like energy levels and optical intensities). Since the location of electrons cannot be so determined, micro-particles' location was not just left out but regarded as something about which there is nothing to say, ultimately an illegitimate issue. This radical empiricist attitude became dominant in mainstream physics in the 1930s, and remained in place until around 1980. To this day, many identify quantum mechanics with radical empiricism in action.

In actual physics, however, only a minority of scientists have ever practiced Heisenberg's philosophy. Leading textbooks from the heyday of Copenhagenism make this plain. Consider, for example, Leonard I. Schiff's classic manual of 1955. The first chapters endorse a version of Copenhagenism full of radical empiricist caution, but then the chapters that deal with collisions, atoms, molecules and nuclei drop all empiricist caution and go for solutions of physical problems that feed magnitudes first introduced in empiricist terms into classical magnitudes that Schiff presents in realist-objectivist terms. This move is common in quantum mechanics. For instance, theoreticians routinely enter electron state densities by feeding the classical electric potential into the equally classical Poisson equation of electrostatics (see Emch, 2007, especially section 2.5). Other empirically successful disciplines that cite unobservables bear out a similar story. On the face of it, most scientists have no doubts either about the existence of unobservables down to at least the nuclear scale, or about the truth of many theoretical descriptions concerning unobservables and structures involving them. The point is that, more often than not, scientific research and theorizing are guided not by radical empiricism but by a broader and more variegated form that has forged science as much as science has forged it (see e.g. Shapere, 1980, 1984). Einstein's work on atoms and molecules illustrates the character of what might be termed standard scientific empiricism.
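
The flavor of this hybrid move can be put in two lines (a schematic sketch in standard notation, not a passage from Schiff or Emch): in a self-consistent treatment of a many-electron system, the quantum states supply a charge density that classical electrostatics then processes,

    \rho(\mathbf{r}) = -e \sum_i |\psi_i(\mathbf{r})|^2 , \qquad \nabla^2 \phi(\mathbf{r}) = -\rho(\mathbf{r}) / \varepsilon_0 ,

and the resulting potential \phi is fed back into the wave equation that determines the states \psi_i, the loop being iterated until it converges. States introduced under empiricist scruples thus end up handled as an objective charge distribution.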
2. SCIENTIFIC EMPIRICISM

In 1905, moving against the establishment view, Albert Einstein declared the ether hypothesis dispensable. His motivation was not radical empiricism, for at the time he was also busy defending the existence of theoretical entities posited by the Kinetic Theory of Matter. Einstein's work on molecular diffusion and his paper associated with Brownian motion explicitly grant epistemological worth to consilient explanations, particularly those that lead to novel predictions. Here and subsequently his writings show the same attitude towards theoretical proposals in general. Einstein's argument for the existence of atoms and molecules seems emblematic of the empiricism that guides theorizing in much of science.

The Kinetic Theory identified heat with the kinetic energy of the atoms and molecules assumed to constitute matter. Skepticism about this theory had distinguished champions at the turn of the century, led in chemistry by Wilhelm Ostwald, who maintained that the laws of thermodynamics, not mechanics, were fundamental. One epistemological advantage of thermodynamics, Ostwald urged, was that it did not require reference to fancy theoretical posits, only to energy and its observed transformations in the world accessible to the human senses. Existing conceptual problems with the Kinetic Theory helped Ostwald's case. For example, assuming that interactions between the molecules followed just Newtonian mechanics entails the full reversibility of their interactions, and thus the temporal reversal of every possible collision must be a possible motion as well. But then, why do we not see burned logs turning back to wood, or melted ice cubes reconstituting themselves spontaneously? The abundance of irreversible processes made many thinkers wary of the Kinetic Theory, notably Mach. Like Ostwald, Mach argued that while mechanics required hypotheses about matter and invisible atoms in motion, thermodynamics did not.

Einstein's argument in 1905 challenged this radical empiricist rejection of the Kinetic Theory. It had long been noticed that pollen floating on water exhibits a never-ending, irregular motion (called Brownian motion after Robert Brown, the botanist who first reported it in 1827). Einstein reasoned that this spontaneous, irregular motion might provide evidence of the microscopic nature of matter. If water is actually made of molecules, then in a container these should collide continually with one another and with the walls. He therefore proposed that the irregular motion of small visible particles floating on water might be an effect of relentless kicking by the water molecules. Einstein (1905) articulates this idea mathematically in terms of the Kinetic Theory. His explanation of Brownian motion describes both how fast the water molecules move and how many of them hit a pollen grain per unit of time on average. This account, Einstein shows, can be turned into a detailed story by feeding into the expressions data derived from tracking the pollen grains.

The result was a plausible explanation, but not yet one good enough for the most demanding branches of physics at the time. Einstein's readings in philosophy of science, and the position held towards mere speculation at the institutions then closest to him (the Zurich Polytechnic and the Patent Office), had made him wary of mere hypotheses. Not the way radical empiricists are wary, however. The epistemological supplement Einstein called for was prediction of previously unnoticed facts. Since doubts stood against atoms and molecules, he thought it crucial to complete his explanation by deriving from it surprising predictions.

He focused on specific effects of molecules in the liquid state hitting at random much larger particles (e.g. grains of pollen). The result, Einstein found, would be irregular motion at the level of single pollen grains but also noticeable regularity in the average spread of a collection of grains over time. Specifiable statistical distributions, observable under ordinary microscopes, would be exhibited by Brownian motion. Tests of Einstein's theory took place a few years later, conducted with success by Jean Perrin, who even managed to estimate the dimensions of the proposed molecules. At first Ostwald challenged these experiments, but then he came to regard them as compelling enough to accept the reality of atoms. By 1908, molecules were no longer considered merely hypothetical posits by the vast majority of chemists and physicists. Here Einstein had his theorizing guided by empiricist philosophy of a clearly non-radical sort. The situation seems typical of natural science.
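
The regularity at issue can be stated compactly (given here in its standard modern form, not in the notation of Einstein's 1905 paper): the mean squared displacement of a suspended grain grows linearly with time,

    \langle x^2 \rangle = 2 D t , \qquad D = \frac{RT}{N_A} \, \frac{1}{6 \pi \eta a} ,

where T is the temperature, \eta the viscosity of the liquid, a the radius of the grain, R the gas constant, and N_A Avogadro's number. Every quantity here except N_A can be measured with a microscope, a thermometer and a stopwatch, so tracking the spread of grains turns the formula into an estimate of molecular magnitudes; this is the opening Perrin exploited.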
3. PUBLIC SCIENCE AND PHILOSOPHY

Naturalists try to bring thought and theorizing of characteristically scientific varieties to philosophical problems: what is space-time; what there is in the world and how what there is came to be; what are we and where do we come from; and how do we know any of this. Naturalists are also increasingly keen to extend their project to anthropology, ethics and religion: what is right, what is wrong (e.g. see the papers by José Ignacio Galparsoro, Pablo Quintanilla, Jesse Prinz, Sergio Martínez and Nicanor Ursua in this volume). Now, if, as the previous considerations suggest, science and philosophy work together, then naturalism cannot be thought of as an anti-philosophy (Pacho's chapter presents a strong version of this claim). But, then, what (if any) is the contrast between naturalism and philosophy?

There is a sense in which virtually all forms of philosophy, including underdeveloped forms, influence science, especially at private levels of theorizing. Often scientists blurt them out when they speak their hearts in popularization essays. Here is a conception of theory-construction that links unity and truth via beauty, voiced by an eminent physicist, Anthony Zee (1986):

"My colleagues and I in fundamental physics are the intellectual descendants of Albert Einstein; we like to think that we too search for beauty. Some physics equations are so ugly that we cannot bear to look at them, let alone write them down. Certainly, the Ultimate Designer would use only beautiful equations in designing the universe, we proclaim. [...] Let us worry about beauty first, and truth will take care of itself! Such is the rallying cry of fundamental physicists." (p. 3)

The prevalent faith is that "in Xeroxing the matter content of the universe, [Nature] must have been motivated by a deep aesthetic imperative which we are yet unable to appreciate" (p. 259).

Dreams also reportedly guide scientific projects. The mathematician Srinivasa Ramanujan articulated functions he said he had dreamed, and the story goes that Friedrich Kekulé's idea of the benzene ring benefitted from animations he saw while dozing off one day. The point is that numerous forms of philosophical and cultural insight seemingly influence scientists.

However, at public (as opposed to private) scientific levels, not every kind of philosophizing qualifies for consideration. Particularly in disciplines marked by high degrees of theoretical unification and predictive power, there is a well-developed public level of discourse that has an inbuilt expectation of argument backed by empirical evidence, with everything open to detailed checking by any outsiders willing to look into the claims involved (Ziman, 1968). While at individual levels scientists may draw inspiration and confidence from various sources (including rationalist and mystical convictions), at public levels the introduction and justification of ideas are epistemologically demanding affairs. How demanding? Mary B. Hesse puts it thus: models of science presuppose "that the learning process returns to the empirical world, which provides checks and reinforcements, and is the subject of prediction and control" (1980, p. 125).

Empiricist caution (broadly construed) looms large and wide on scientific claims. "Broadly construed" is the operative phrase here. As critics have long pointed out, radical empiricism is both a logical impossibility and a historical falsehood. Even perceptual observation is problematic with respect to truth and nontrivially dependent on theory. Where applicable, therefore, the observational/non-observational distinction cannot be the foundationalist one of radical empiricists.[iv] Nor are the virtues of theorizing pursued at public levels limited to the constructivist qualities favored by hard empiricists. For example, while descriptive simplicity is a goal shared by virtually all reasonable projects, ontic simplicity is a knotty expectation, as is also formal unity.[v] And so, again, to the extent that fundamental science encompasses philosophical projects, naturalism cannot be construed in opposition to philosophy. Public science has, however, grown frosty towards such positions as rationalism, transcendentalism and mysticism. It is such positions that stand in contrast to naturalism. Some naturalists are particularly harsh towards aprioristic philosophy (e.g. Devitt, 1998, 2005).

Who present themselves as naturalists? A growing number of thinkers do, especially in English-speaking circles. Mainstream figures over the last half century include, for example, Mary B. Hesse (1961, 1974, 1980), Mario A. Bunge (1977, 1979), Adolf Grünbaum (1995), Dudley Shapere (1980, 1984), Philip Kitcher (1992, 1993), Daniel C. Dennett (1992, 1995), David Papineau (1993, 2002), Larry Laudan (1990), Abner Shimony (1993), Michael Devitt (1998, 2005), Ronald Giere (2000, 2006), James Ladyman (2007), Jesse Prinz (2008), and Alex Rosenberg (2011), along with numerous others. These thinkers share just a general outline, and then not across the board. Still, their respective proposals share some key views.

When they focus on public-level science, these and other naturalists share a broadly empiricist approach to justification: only logic, background scientific knowledge, experiment and observation count in favor of or against a theoretical proposal. This take on justification comes from the start as a claim that is fallible, a posteriori, and open to improvement as science develops. Contemporary naturalists regard their perspective as being the most promising in philosophy, but they do so tentatively.

Naturalists who draw strongly from radical empiricist moves in scientific practice distrust speculation and reject scientific realism, whereas naturalists who draw from moderate empiricist moves generally accept abductive inference and favor realism (see Cordero's essay in this volume). Antirealist varieties of naturalism identify optimal scientific reasoning with radical empiricist moves often advocated during episodes of serious conceptual tension by leading scientists: e.g. Ernst Mach, Pierre Duhem, John B. Watson, Werner Heisenberg, John von Neumann, and Stephen W. Hawking, among others. Realist varieties, by contrast, identify optimal reasoning with the abductive empiricism common in scientific practice: e.g. as attested by Galileo Galilei, Isaac Newton, James C. Maxwell, Charles Darwin, Albert Einstein, Watson and Crick, John Bell, and Frank Wilczek, among others who follow moderately empiricist criteria for accepting and rejecting proposals.

Accordingly, mainstream naturalism seems best thought of as a family of philosophical projects of broadly empiricist leanings, a variegated stance guided by the ideas and methods of public natural science (which, as noted, in turn receives guidance from a broad range of empiricist insight). This characterization allows for a plurality of positions about natural science. A still more multicolored collection results from extending the base from natural science and empiricism to include other empirical sciences and/or non-empiricist epistemologies (as variously discussed by Sergio Martínez and Julián Pacho in their respective contributions to this volume). Whether the views allowed in by the more inclusive characterizations of naturalism have anything significant in common beyond the fallibility thesis is a matter of dispute.
4. NATURALISM AND ITS CRITIQUE

Confronted with a philosophical issue, naturalists try to address it by means of current background science fortified by careful analysis. How successful are naturalist projects, however? "By their fruits ye shall know them," so goes the saying. Naturalists seem to have advanced at least some traditional philosophical issues. Consider, for example, the project to naturalize epistemology started by W.V.O. Quine (1969/2004) and then critically developed by succeeding generations of naturalists (including Dudley Shapere, Daniel C. Dennett, Philip Kitcher, Larry Laudan, and Ronald Giere, among numerous others). The resulting proposals present cognitive subjects as organisms whose capacities evolved in a particular physical and social environment (most recently and intensely in the context of scientific practice). In naturalized epistemology the starting point is neither subjective experience nor the individual subject, but innate human capacities supplemented by abilities developed through advances in science and the social organization of science.

A second example of naturalist fruitfulness seems on view in recent defenses of realist positions. Here, one key task is to justify forms of inference to the best explanation, enough to at least avoid a charge of vicious circularity against the moderate empiricism appealed to by realists. In a familiar version of the inference, given a set of hypotheses, the one that explains the available data better than its rivals is the hypothesis one should accept tentatively as true. But putting the claim this way invites embarrassing questions: How does one recognize a good explanation? Why should the actual world be the one singled out by the current best possible explanation? What makes anyone think that, given a pool of prospective explanations, it contains the truly best explanation? Some naturalists (notably Boyd, 1984) respond by sharpening the inference's structure and then advancing characteristically scientific arguments in its support. Other naturalist realists labor to identify theoretical descriptions/explanations that can be relied upon with credibility similar to that granted to ordinary claims about ordinary observables (see Cordero's essay in this volume). These are only some examples. Equally strong today are naturalist projects to address metaphysical and ethical issues. Efforts along those lines are represented in this collection by Steven French, Pablo Quintanilla, and Jesse Prinz.

An old line of criticism, however, dismisses efforts along the above lines as hopelessly naive. Justification for the criteria and moves developed within naturalist perspectives runs along scientific lines. As such, naturalist justification rests on considerations of coherence, agreement with data, and risky predictions. However, even if a proposal were to succeed as a scientific claim, to non-naturalists the notion of success involved here might seem philosophically raw and misguided from the start. What, non-naturalists ask, justifies the scientific methods to which naturalists appeal? I suggest Ernest Nagel's response to this line of complaint remains strong. The objection matters, he noted at the dawn of contemporary naturalism, only to those who refuse to dignify anything as genuine knowledge unless it is demonstrated from "self luminous and self evident premises" (1956, p. 15). But there is no such thing as complete justification for any claim, and so requiring complete warrant for naturalist proposals is an unreasonable request. The proper guideline for naturalist proposals seems thus clear: develop them using the methods of science; if this leads to a fruitful stance, then explicate and reassess. The ensuing offer will exhibit virtuous circularity if its explanatory feedback loop involves critical reassessment as the explanations it encompasses play out. So viewed, naturalism is a philosophical perspective that seeks to unite in a virtuous circle the sciences and non-foundationalist, broadly-based empiricism.

Other common lines of complaint are that naturalization efforts seem fruitful only in some areas, and that several endeavors outside the sciences serve as sources of knowledge into human life and the human condition, especially in areas where science does not reach far as yet. It seems difficult not to grant some truth to many allegories from literature, art and some religions. Naturalism has room for knowledge gathered outside science, provided the imported claims can also be sustained by naturalist tests.

As noted at the beginning (and as the papers that follow make clear), naturalist approaches do not form a monolithic whole. This rough introductory chapter only aims to highlight the contemporary significance of naturalist moves. We, the editors, hope that, collectively, the essays that follow will give readers a fair view of the vitality and tribulations of naturalist projects today.
5. THIS VOLUME

This volume, long in the making, focuses on approaches discussed at seminars on naturalism at the University of the Basque Country (San Sebastián, Spain) by professors José Ignacio Galparsoro, Julián Pacho, and Nicanor Ursua. Many of these activities continued discussions started at various editions of the International Congress of Ontology, a biannual UNESCO-recognized venue that, under the umbrella term "physis", promotes philosophical research that engages recent contributions of science. The papers in this collection are by authors whose work has been followed with interest at the noted seminars over the last decade. They are placed in an order that goes from papers that pursue naturalist projects to critical papers on naturalization efforts in recent philosophy.

In the first contribution, "Metaphilosophy, Folk-philosophy and Naturalized Philosophy: A Naturalistic Approach", José Ignacio Galparsoro invites us to reflect on the advisability of analysing philosophy from a naturalistic perspective; that is, from a perspective that considers philosophy as if it were one more cultural object, which can be studied using the tools that we have available to us today and that are provided by disciplines such as evolutionary psychology or anthropology oriented by a distinctly cognitivist approach. A central concept in the analysis is that of "intuitive ontology" (closely linked to folk-philosophy, or the spontaneous, naive ("natural") way of thinking that is associated with common sense), which is a result of the evolutionary process and a source of metaphysical prejudices such as dualism. A metaphilosophical reflection, such as that proposed by Galparsoro, identifies the natural character of a metaphysics that is still too close to folk-philosophy, and the interest of constituting a naturalized philosophy that is fully conscious of its counterintuitive character.

Pablo Quintanilla ("Naturalism and the Mind: The Final Questions") starts his paper by making explicit the roots of central arguments against naturalism in Kant and Husserl, distinguishing along the way different kinds of naturalism: ontological, methodological, reductive and non-reductive. With this initial work of conceptual clarification in place he then discusses the senses in which there can be a naturalistic account of the mind. He endorses a non-reductive ontological and methodological naturalism, grounded on the notion of supervenience, arguing that there are good reasons to believe that this kind of naturalistic account of the mind is already being offered. However, in his view, there are two last realms to which we should extend naturalism: moral behavior and agency. The paper provides a sketch of how these extensions could take place.

Jesse Prinz ("Measuring Morality") claims that in recent years there has been a naturalistic turn in philosophy, akin to the linguistic turn that characterized the last century of work in the analytic tradition. Naturalism has long been a popular metaphysical stance, but is now increasingly associated with a methodology that draws heavily on empirical research in defense of philosophical conclusions. Ethicists have resisted this naturalistic turn for a number of reasons, including the conceptual nature of ethical questions, the unreliability of folk intuitions, and, most importantly, the alleged divide between projects that are normative and those that are descriptive. In this chapter, Prinz argues that empirical research can contribute to all core areas of moral philosophy, including moral psychology, metaethics, and normative ethics. The author illustrates by describing empirical work that links morality to emotions. Along the way, the chapter distinguishes different kinds of empirical approaches and argues that these must be integrated with more traditional philosophical methods if we want to move from the articulation of theories to theory confirmation.

For Alberto Cordero ("Naturalism and Scientific Realism"), projects of naturalist realism rest their cases on fallible, scientific justification. This chapter explores such proposals in recent philosophy of science, their critical reception and the growing concentration of realist theses on theory-parts rather than whole theories, along with the main problems and prospects of naturalist realism today. The last two sections outline Cordero's own suggestions, drawn from scientific practices that emphasize successful novel prediction, integrated into a criterion for selecting theory-parts of realist significance.

Steven French ("Handling Humility: Towards a Metaphysically Informed Naturalism") claims that much of modern metaphysics is a priori, based on intuitions, and pays only lip service to science, where, at best, this amounts to a dim understanding of high school chemistry. Some naturalists urge the construction of a fully naturalized metaphysics, based on what current science (physics, in particular) tells us about the world. In this essay French examines the prospects for such a metaphysics in the light of quantum theory in particular and suggests that even a non-naturalized metaphysics may prove useful to the philosopher of science. The chapter concludes by reflecting on the complex relationship between metaphysics, science and the philosophy of science.

According to Sergio F. Martínez ("The Scientific Undercurrents of Philosophical Naturalism"), naturalism refers to views that consider philosophical method to be continuous with the methods of science. Most often the discussion centers on the characterization of the sort of continuity that is relevant for characterizing naturalism, and thus it is assumed that naturalization takes place with respect to a given discipline. The author's aim is to argue for a characterization of naturalism distinguished by the capacity of mutually supporting explanations to produce better and more encompassing explanations. Such an account of naturalism relies on attributing epistemic importance to the capacity of different explanations to support each other mutually, not as a consequence of a perfect fit, but through a process of accommodation that takes place in time and involves considerations that are crucial to evaluating its rationality.

According to Martínez, the issue is not supplementation or replacement of philosophical method as a whole. Naturalism, he argues, is not one master stroke of a brush, but a long process of subtle strokes promoting scientific understanding.

Nicanor Ursua ("Advantages and Risks of Naturalization: Converging Technologies Applied to Human Enhancement") considers the stances offered by naturalized philosophy and looks at its prospects and the role that philosophy should play in the challenging context set by converging technologies. The concept of converging technologies used by Ursua draws from investigations in the USA and Europe that link this concept to the idea of human enhancement, i.e. improvement of human performance by corporal and/or intellectual modification. Ursua gives particular attention to the current debate about converging technologies and the concept of "transhumanism" or "technofuturism" which, he stresses, could lead to the transformation of the human species and requires a new philosophical anthropology.

Finally, in "Naturalism and the Naturalization of Philosophy: Disputed Questions", Julián Pacho proposes that naturalism is a metaphysical position about the deep nature of things. In his view, naturalization is a program that seeks to apply the methodology of the natural sciences to the human sciences, especially philosophy. In some circles naturalization efforts are regarded as proposals to end philosophy, the critique being that, were naturalization efforts to succeed, many problems traditionally regarded as "philosophical" would be entirely transferred to the special sciences. Pacho poses the following questions: Is naturalization an unstoppable process? Are there issues or objects not naturalizable per se, thus revealing that philosophy is a safe and distinct form of knowledge? Is it necessary to assume a non-naturalistic metaphysics to challenge the naturalization of philosophy?
NOTES
[i] Reichenbach (1949: 310); quoted by Maddy (2001). See also Reichenbach (1951).
[ii] Whittaker (1953/2007) remains a sterling reference for the history of the ether of light.
[iii] For the classical arguments see e.g. William Thomson (1884).
[iv] Research from cognitive science strongly indicates that percepts have the character of good hypotheses: best guesses that involve added information from prior generalizations, past learning, expectation, and even emotion. See, for example, Pylyshyn (1999), Brewer and Lambert (2001).
[v] Bunge (1963) offers a critique, by a strong scientific realist, of the myth that simplicity is always a fact or a goal of research. On formal unity see Steven French's contribution to this volume.

REFERENCES
Boyd, Richard N. (1984). The current status of scientific realism. In J. Leplin (Ed.), Scientific realism (pp. 41-82). Berkeley, CA: University of California Press.
Brewer, William F., & Lambert, Bruce L. (2001). The theory-ladenness of observation and the theory-ladenness of the rest of the scientific process. Philosophy of Science, 68, S176-S186.
Bunge, Mario A. (1963). The myth of simplicity: Problems of scientific philosophy. Englewood Cliffs, NJ: Prentice Hall.
Bunge, Mario A. (1977). Treatise on basic philosophy, Vol. 3, Ontology I: The furniture of the world. Dordrecht: Reidel.
Bunge, Mario A. (1979). Treatise on basic philosophy, Vol. 4, Ontology II: A world of systems. Boston: Reidel.
Cordero, Alberto. (2011). Scientific realism and the divide et impera strategy: The ether saga revisited. Philosophy of Science, 78, 1120-1130.
Danto, Arthur C. (1972). Naturalism. In Paul Edwards (Ed.), The encyclopedia of philosophy (pp. 448-450). New York: Macmillan.
Dennett, Daniel C. (1992). Consciousness explained. Boston: Little, Brown & Co.
Dennett, Daniel C. (1995). Darwin's dangerous idea. New York: Touchstone.
Devitt, Michael. (1998). Naturalism and the a priori. Philosophical Studies, 92, 45-65.
Devitt, Michael. (2005). There is no a priori. In E. Sosa & M. Steup (Eds.), Contemporary debates in epistemology (pp. 105-115). Malden, MA: Blackwell.
Einstein, Albert. (1905). Investigations on the theory of Brownian motion. Translated by A. D. Cowper. http://users.physik.fu-berlin.de/~kleinert/files/eins_brownian.pdf
Emch, Gérard G. (2007). Quantum statistical physics. In Jeremy Butterfield & John Earman (Eds.), Philosophy of physics, Part B (pp. 1075-1182). Amsterdam: North Holland.
Ferrater Mora, José. (1990). Naturalismo. In Diccionario de filosofía (pp. 2315-2318). Madrid: Alianza.
Giere, Ronald N. (2000). Naturalism. In W. H. Newton-Smith (Ed.), A companion to the philosophy of science (pp. 308-310). Oxford: Blackwell Publishers.
Giere, Ronald N. (2006). Modest evolutionary naturalism. Biological Theory, 1, 52-60.
Grünbaum, Adolf. (1995). The poverty of theistic morality. In K. Gavroglu, J. Stachel, & M. W. Wartofsky (Eds.), Science, mind and art: Essays on science and the humanistic understanding in art, epistemology, religion and ethics, in honor of Robert S. Cohen, Boston Studies in the Philosophy of Science, Vol. 165 (pp. 203-242). Dordrecht: Kluwer Academic Publishers.
Hesse, Mary B. (1961). Forces and fields. London: T. Nelson.
Hesse, Mary B. (1974). The structure of scientific inference. Berkeley/Los Angeles: University of California Press.
Hesse, Mary B. (1980). Revolutions and reconstructions in the philosophy of science. Bloomington, IN: Indiana University Press.
Kitcher, Philip. (1992). The naturalists return. Philosophical Review, 101, 53-114.
Kitcher, Philip. (1993). The advancement of science. New York: Oxford University Press.
Kitcher, Philip. (2001). Science, truth, and democracy. New York: Oxford University Press.
Ladyman, James, & Ross, Don. (2007). Everything must go: Metaphysics naturalized. Oxford: Oxford University Press.
Laudan, Larry. (1990). Normative naturalism. Philosophy of Science, 57, 44-59.
Maddy, Penelope. (2001). Naturalism: Friends and foes. In J. Tomberlin (Ed.), Philosophical perspectives, Vol. 15, Metaphysics (pp. 37-67). Oxford: Blackwell.
Papineau, David. (1993). Philosophical naturalism. Oxford: Blackwell.
Papineau, David. (2002). Thinking about consciousness. Oxford: Oxford University Press.
Prinz, Jesse J. (2008). Empirical philosophy and experimental philosophy. In J. Knobe & S. Nichols (Eds.), Experimental philosophy. New York, NY: Oxford University Press.
Pylyshyn, Zenon. (1999). Is vision continuous with cognition? The case for cognitive impenetrability of visual perception. Behavioral and Brain Sciences, 22, 341-365.
Quine, Willard V. O. (1969/2004). Epistemology naturalized. In E. Sosa & J. Kim (Eds.), Epistemology: An anthology (pp. 292-300). Malden, MA: Blackwell.
Reichenbach, Hans. (1949). The philosophical significance of the theory of relativity. In P. A. Schilpp (Ed.), Albert Einstein: Philosopher-scientist (pp. 287-311). La Salle, IL: Open Court.
Reichenbach, Hans. (1951). The rise of scientific philosophy. Berkeley, CA: University of California Press.
Rosenberg, Alex. (2011). The atheist's guide to reality. New York: W. W. Norton & Co.
Schiff, Leonard I. (1955). Quantum mechanics. Toronto: McGraw-Hill.
Shapere, Dudley. (1980). The character of scientific change. In T. Nickles (Ed.), Scientific discovery, logic, and rationality, Boston Studies in the Philosophy of Science, Vol. 56 (pp. 61-102). Dordrecht: D. Reidel.
Shapere, Dudley. (1984). Reason and the search for knowledge. Dordrecht: Reidel.
Shimony, Abner. (1993). The search for a naturalistic world view (two volumes). Cambridge: Cambridge University Press.
Sklar, Lawrence. (2010). I'd love to be a naturalist, if only I knew what naturalism was. Philosophy of Science, 77, 1121-1137.
Stokes, George G. (1884). On light: First course on the nature of light delivered at Aberdeen in November, 1883 (Burnett Lectures). London: Macmillan & Co.
Thompson, Silvanus P. (1897). Light: Visible and invisible. A series of lectures delivered at the Royal Institution, 1896. London: Macmillan & Co.
Thomson, William. (1884). Notes of lectures on molecular dynamics and the wave theory of light. Baltimore, MD: Papyrograph reproduction.
Whittaker, Edmund T. (1953/2007). A history of the theories of aether and electricity: From the age of Descartes to the close of the nineteenth century (1910). New York: Longmans, Green and Co.
Zee, Anthony. (1986). Fearful symmetry. New York: Macmillan.
Ziman, John M. (1968). What is science? In Public knowledge: An essay concerning the social dimension of science (pp. 1-12). Cambridge, UK: Cambridge University Press.

JOSÉ IGNACIO GALPARSORO

METAPHILOSOPHY, FOLK-PHILOSOPHY AND NATURALIZED PHILOSOPHY


A Naturalistic Approach*

1. INTRODUCTION: NATURALIZING CULTURE. NATURALIZING PHILOSOPHY, TOO?

The claim that philosophy belongs to the realm of culture does not provoke substantial objections. However, if one is invited to consider the advisability of analysing philosophy as if it were just one more cultural object that can be studied with the tools that we have available today (tools provided by disciplines such as evolutionary psychology or anthropology oriented by a distinctly cognitivist approach), then things are quite different. This invitation faces important obstacles, one of which is the excessive zeal with which philosophers tend to protect their discipline. Historically, the field that has been considered the realm of philosophy has been progressively reduced. The fear, then, is that the remaining reduced scope that is considered to be the exclusive domain of philosophy may be wrested from it. When in addition the suspicion is that philosophy's home ground is threatened by disciplines that, more or less patently, are recognized as belonging to science, there is a tendency to appeal to the magical word in which all faith is deposited to scare off the enemy: "reductionism".

The progressive separation of specific disciplines from the core trunk of philosophy has the consequence of leaving philosophy with the sensation of being surrounded, and in need of adopting a defensive position in order to safeguard its remaining possessions at all costs. This attitude means that a large proportion of philosophers consider as enemies (and no longer, as would be desirable, as fellow travellers) those disciplines that could provide elements that would help to clarify problems that philosophers consider to be their own exclusive concerns.

Considering philosophy from the point of view of philosophy, that is to say, engaging in metaphilosophy, is a practice that has been common among philosophers from all eras, and there has been an increase in such activity in recent decades. A common denominator among the immense majority of contemporary philosophical tendencies and authors is a refusal to consider the contributions of scientific disciplines (and very specifically those contributions that come from the field of evolutionary biology) as relevant. There is something that resembles "biophobia" (which is not unique to philosophy, but rather is common among the humanities and social sciences) that triggers a knee-jerk rejection of any suggestion that so much as considers the advisability of adopting an approach that takes these contributions into account.

A symptom of this biophobia is the dominance over many decades of the 20th century of the so-called Standard Social Science Model (SSSM). This model considers that the social sciences are a completely independent field with respect to the natural sciences. The SSSM has been severely criticized, particularly by Tooby and Cosmides (1992). For these authors, one of the central proposals of the SSSM is particularly worthy of criticism: the proposal that "biology is intrinsically disconnected from the human social order" (Tooby & Cosmides, 1992, p. 49). The SSSM would suggest the following division of labour: natural scientists should deal with the non-human world and the physical aspects of human life, while social scientists would be the custodians of human minds and of all the mental, moral, political, social and cultural world. To Tooby and Cosmides, this is no more than the resurrection of a barely disguised and archaic physical/mental, matter/spirit, nature/human dualism (Tooby & Cosmides, 1992, p. 49). Such a stance would have led the SSSM to ignore the evolutionary perspective because it is considered irrelevant to the correct field of study of the social sciences.

Tooby and Cosmides consider that the time has come for a revolution in the field of social or human sciences: it is necessary to apply our knowledge of evolutionary psychology to the field of culture. A profound consequence of such a move would be to change the way we understand culture itself. It will become necessary to recognize that human beings have a mental architecture that is a product of evolution and is fitted out with a series of contents that condition culture. It will therefore be necessary to give up the idea of the SSSM according to which the mind is originally a completely malleable "blank slate" that culture, through the process of learning, fills with content (Tooby & Cosmides, 1992, p. 28; Pinker, 2002).

The critical response to the radical separation that the SSSM establishes between nature and culture is the proposal for a naturalist program in the social sciences (Sperber, 1996), or in other words, a naturalization of culture. In recent years, to the horror of many, there have been attempts to naturalize different areas of culture such as religion (Boyer, 1994, 2001; Dennett, 2006), morality (Ridley, 1996; Dennett, 2003) or science (Atran, 1998; Carruthers et al., 2002). The question that arises is that of the advisability of extending this tendency to the field of philosophy, as an integral part of culture. The project would therefore involve examining the pertinence of applying an anthropological, cognitivist and evolutionary approach to philosophy and considering whether philosophy can be treated as one more cultural product, similar to religion, morality or science.

The lack of specialist studies in the field of philosophy[i] does not mean that research must start from zero. Research should make use of the studies that have already been carried out in other areas of culture, all of which can be characterized by the fact that they attach great importance to the findings of cognitive science within a context that is markedly evolutionary. Be that as it may, such an approach will clearly mark its distance from the SSSM, with its idea of a radical separation between nature and culture.

If culture is not a field that is completely independent of nature, it means that cultural products will, at least to a certain extent, be constrained by naturally determined factors which are ultimately biological, since they are the result of a long evolutionary process. Thus, if philosophy is a cultural product, and if there are no longer good reasons for radically separating nature from culture, the question that inevitably arises is the following: are we at last seeing the possibility of a naturalization of philosophy?

Today, we know that the human mind has been evolutionarily configured to respond to the problems posed by the environment in which human life has developed (Mithen, 1996). Furthermore, it is with this same cognitive apparatus that people today tackle philosophical and scientific questions. One of the theses of evolutionary psychology is the following: our biological structure has a fundamental influence on the way we think (Mithen, 1996). If this thesis is correct, does it mean that our biological structure influences the way we do philosophy? Or, posing the question in a way that some may find provocative: is philosophy also constrained by human mental structures that are the result of the evolutionary process of natural selection?

It is very revealing that philosophy has not dedicated more time and effort to examining itself in the light of these questions. This means that analysing the pertinence of a naturalistic metaphilosophy that takes this thesis into account (via, among other disciplines, evolutionary psychology) is a necessary and urgent task that still has to be undertaken. Furthermore, it should not cause any great consternation if, as a result of this reflection, it became clear that it would be useful to analyse the contents that philosophy has generated over its history precisely in the light of evolutionary psychology or cultural anthropology with a cognitivist bent. It is not the case that anyone is trying to whisk away a precious treasure from philosophy. Rather, there is a desire to take seriously the task of philosophy regarding itself, which leads to asking questions such as: why do we think the way we do? And why is philosophy the way it is? These are questions that have worried the greatest philosophers; those who lived before the advent of theories such as evolution did not have this valuable tool available to them to assist in the task. For those of us who are alive now, it would be unforgivable not to make use of such theories in order to try to shed some light on these central philosophical questions.
2. NATURALIZED REALMS OF CULTURE: MORALITY AND RELIGION

Nietzsche was one of the first philosophers to recognize that the theory of evolution deeply altered the image of man. The new perspective opened up by evolutionary theory makes it possible, among other things, to develop a genealogy of morality, in which it becomes absolutely imperative to scratch beneath the surface of the apparently eternal and immutable moral values to show that these too were the result of a historical process. Despite the fact that Nietzsche's ideas are, in general, completely unknown to researchers in the field of cognitive and evolutionary sciences, those researchers follow the same genealogical strategy, albeit using means that are far more powerful and better honed. Some cultural anthropologists are also interested in a genealogy of morality.[ii]

If we were to perform a comparative study between Nietzsche and the latest research results (for example, those of Hauser, 2006) we could probably see important similarities. Thus Hauser, like Nietzsche, also moves through a landscape of suspicion: behind the explicit declarations that are made to justify a moral choice, other more basic elements remain hidden. Here also, it is necessary to dig beneath the surface and to try to make explicit that which is kept hidden but which nonetheless conditions our moral choices. Hauser's thesis is that there is a certain parallel between Chomsky's universal grammar and what he calls a "moral grammar" or "moral organ"; that is, that there would be some kind of profound moral intuitions upon which the different moral variants are configured. This idea has yet to be demonstrated. However, it certainly is an interesting hypothesis, according to which there would also be a biologically evolved base that is common to all humans in the field of morality. This means that the barriers that are set up between morality and biology start to be torn down and that, as this happens, the doors to a naturalistic treatment of morality start to swing open. It does not seem probable that biology is the key to explaining all the processes in the field of morality, but it does seem plausible that an answer to the origin of our moral capacity is to be found in the cognitive and evolutionary sciences (Ayala, 1987).

Another example of an analysis of morality from a naturalistic perspective is that provided by Bloom (2004), who attempts to show that our intuitive dualism (an element of our mental machinery that is a fruit of the evolutionary process) lies behind the way we regard other people. To this end, he analyses the appearance of moral feelings in babies and children, and reaches the following conclusion: "the roots of morality are innate" (Bloom, 2004, p. 100).

Bloom is also concerned with another important field of culture that it would be useful to examine from a naturalistic perspective: religion (Bloom, 2007). Bloom laments the fact that contemporary evolutionary psychologists have abandoned the analysis of religion. Nevertheless, he emphasizes that in recent years some research has attempted to gain an understanding of certain universal religious ideas in children. Some recent studies suggest that two foundational aspects of religious belief (belief in divine agents, and belief in mind-body dualism) come naturally to young children (Bloom, 2007, p. 147). This research therefore considers two themes that are particularly interesting: the existence of universal religious ideas and the idea that mind-body dualism is firmly rooted in human beings from childhood. Later we will consider the question of this dualism in greater depth. For now, it is enough to point out that dualism is a natural idea (that is, it is spontaneously accepted by our common sense) and that, insofar as this is the case, casting it off requires us to overcome considerable resistance.

Bloom claims that if psychologists do not refer to religion, it is because it is a taboo subject: religion is "a sacrosanct domain" (Bloom, 2007, p. 148). If this subject were treated from a naturalistic point of view, there would be a risk of offending people.


Nevertheless, according to Bloom the situation is changing. Thanks in great part to progress in fields such as evolutionary psychology and cultural anthropology, there is now a small community of cognitive scientists who study religious belief using the same type of theories and methods as have been applied to other fields, such as language, the perception of objects, theories of mind, etc. (Bloom, 2007, p. 148). From Bloom's research we can draw an important conclusion: although religion deals with things that are transcendent (that is, supernatural), religion is natural. Insofar as this is the case, religion is an element of culture that can be naturalized.

Some of the most interesting results stemming from the analysis of religion from a cognitive and evolutionary perspective are those presented by Pascal Boyer. It is often said that religion provides explanations about the world itself or about events in the world. One thing that we can expect of an explanation is that it tells a story that is less surprising than the thing it aims to explain. Religious explanations, however, tend to complicate things, providing more obscurity than clarity; instead of shedding light on things (i.e., explaining them), they encourage mystery and obscurity. This makes religious concepts seem to be located outside the ordinary. Boyer's aim is to refute this impression by showing that it is possible to present religious concepts as just one more result of the normal functioning of our mental mechanisms. Boyer presents the human mind as a collection of inference systems, that is, as "lots of specialised explanation devices [...], each of which is adapted to specific kinds of events, automatically suggests explanations for these events" (Boyer, 2001, pp. 19-20). Our minds execute these chains of inferences automatically, and only the results are visible to us and can be consciously scrutinized. An analysis of mental mechanisms shows that religious concepts (despite their apparently extraordinary character) are as ordinary or natural as anything else (Boyer, 2003, p. 119). At the same time, these concepts are counterintuitive; that is, they are contrary to the expectations of the intuitive ontologies developed by natural selection in the areas of physics, biology or psychology. Furthermore, Boyer points out, this counterintuitive character is precisely what we find striking and, therefore, what makes religious concepts easily memorable. However, counterintuitive elements are just one part of the representations that are activated in the process: religious concepts also activate "a number of additional background assumptions that are not counter-intuitive and in fact are directly provided by intuitive ontological expectations" (Boyer, 1998, p. 881). These inferential elements, then, are not contradicted by religious propositions. It is precisely the combination of counterintuitive elements and the intuitive mechanisms of basic inference that explains, according to Boyer, the cultural success of such representations in many different cultural environments (1998, p. 881). All of this means that these representations can propagate easily. However, the fact that the counterintuitive elements are constrained by the general system of inference of the mental machinery means that the variation in religious concepts is limited.


3. INTUITIVE ONTOLOGY

Boyer's attempt to provide a naturalistic explanation of religion has a series of implications that affect other areas of culture. Religion is just one special case in which we can observe the functioning of human mental structures (Boyer, 2008). Boyer clearly has the more general aim of providing a naturalistic description of all cultural representations, one empirically based on the evidence of neurological functioning and on the evolutionary history of the species (Boyer, 1999, p. 226). The common denominator in Boyer's work is the central role allocated to the concept of intuitive ontology. Boyer often uses this notion, although on occasion he also makes use of equivalent expressions such as "evolved metaphysics" or "natural metaphysics" (Boyer, 2000), which in turn are not substantially different in meaning from expressions such as "natural philosophy", "intuitive philosophy" or "folk-philosophy". In what follows, I will try to explain this concept in some detail, paying particular attention to its philosophical implications.

Boyer presents the following thesis, which is of great relevance to a field such as philosophy: cognitive capacities make certain types of concepts more likely than others to be acquired and transmitted in human groups (Boyer, 1999, p. 206). The question (a worrying one for some) that inevitably emerges is this: are some philosophical concepts easier or more likely to arise (and/or to be transmitted) than others? An affirmative answer means confronting the very widely held notion that it constitutes an attack on the dignity of philosophy to judge the creative freedom of philosophical practice in this manner, by the restrictions imposed by our cognitive capacities. Although this may be an uncomfortable problem, philosophy is obliged to consider it and to attempt to answer it. That is why philosophy should take note of the efforts made by authors such as Boyer in research into cognitive development.

The human mind has no general principles for learning; rather, it has many acquisition mechanisms, each of which is directed at specific aspects of the world. The structure of these mechanisms is based on domain-specific principles, and they are shared by all human beings. This universal nature means that the different intuitive domains (those that correspond, for example, to physics, to psychology or to biology) are not substantially different in people belonging to very distant cultures, far apart in space or in time. All human beings share a common intuitive ontology, which is developed during our infancy and which then changes very little throughout our adult lives. According to Boyer, processes of cultural transmission cannot be understood without this intuitive background (Boyer, 1999, p. 210). The intuitive ontology thus includes universal principles that do not necessarily yield cultural universals, but which constrain the degree of variation in cultural productions. Therefore, the thesis of the variability of cultures, one of the central tenets of the cultural anthropology that dominated the human sciences for decades, is greatly undermined: such variability is severely limited by the presence of an intuitive ontology that is common to all human beings and that is developed ontogenetically during infancy and is, phylogenetically, the result of the gradual process of natural selection.


In this context, other questions that are closely related to those mentioned above, and which deserve an answer, reappear within philosophy: is the variability of philosophy conditioned by our cognitive apparatus?

The existence of the intuitive ontology does not mean that there are no cases in which the expectations that result from it are thwarted. We have already considered the case of religion, many of whose propositions fly in the face of intuitive expectations. However, there are domains other than religion that also surpass the bounds of the intuitive ontology: for example, physics, evolutionary biology, mathematics or philosophy itself. Boyer recognizes that there are differences between these fields and religion; we could say that they violate the principles of the intuitive ontology in different ways. In religion, the violation is easy to memorize (and, therefore, easy to transmit), while in the other fields the same does not hold. For that reason, in these fields there are special difficulties in transmitting concepts that overstretch the intuitive ontology (Boyer, 1999, p. 216).

Boyer offers the following characterization of ontology: an ontology "specifies kinds of stuff in the world" (Boyer, 2000, p. 277). Ontology understood in this way is a classification of the objects in the world. In effect, the Aristotelian theory of categories, which is central to Aristotle's ontological conception, is none other than an attempt to detail a very general series of categories into which the things in the world are classified. From our current perspective, we could say that Aristotle's was the first great attempt to make an intuitive ontology explicit. This was a giant first step: for the first time there was an attempt to make explicit the classification of the things that belong to the realm of common sense. With the tools available to him, Aristotle could not go much further. These days we have much more sophisticated tools, and we should use them when analysing the question of ontology.

Today we have abundant psychological evidence that conceptual knowledge (and we must not forget that the knowledge of science and philosophy is a type of conceptual knowledge, so that what follows affects them fully) includes a series of ontological commitments (Boyer, 2000, p. 277). This means that all conceptualization is underwritten by a more profound base, which is its condition of possibility. Studies of inductive and categorization processes in young children show that children "are certainly not driven by a pure sensitivity to correlations of external properties in objects" (Boyer, 2000, p. 278). That is to say, the mind of the child contributes something to the process of categorization. This confirms the thesis that the mind is not a blank slate. Young children (at an age when the processes described here cannot be affected by conscious reflection or by knowledge that could have been transmitted to them) do not classify the things in the world exclusively on the basis of their perception of objects; rather, this perception is filtered by a series of mental mechanisms that remain hidden from their consciousness.iii Furthermore, for Boyer, that ontological categories are real psychological structures is not really in doubt (Boyer, 2000, p. 280).


So the categories are not ideal entities, but mental structures that have taken root in the human mind over a long evolutionary process. Without these mechanisms (which, following Boyer and other authors, we could call ontological commitments), children would not be able to classify the things in the world or look for regularities beyond the superficial characteristics of objects; in short, they would not be able to think and, ultimately, they would not even potentially be able to contribute to philosophy.

We see, then, that studies performed on children play an important role in the analysis of the intuitive ontology. However, the questions that inevitably arise are these: does the intuitive ontology remain unaltered in the adult? Does it change? If it does change, to what extent? Or, to put it more clearly: can scientific knowledge modify the intuitive ontology? Boyer's answer leaves no room for doubt: no, because the two never meet; science and the intuitive ontology develop along different pathways. Specifically, scientific concepts are invariably acquired in the form of metarepresentational beliefs in a given social and cultural context (Boyer, 2000, p. 286). It is clear that, in its content, science challenges those concepts that could be built up from a simple extension of intuitive expectations. For example, the intuitive notions of force or of essence have no place in contemporary physics or biology. Nevertheless, it must be clear that the acquisition of scientific theories that conflict with the intuitive ontology does not result in the intuitive ontology simply being replaced. The intuitive ontology does not disappear. For example, Darwinian biologists continue to construct their theories using a mental apparatus fitted out with an intuitive ontology in which the notion of essence continues to play a decisive role. Darwinian theory contradicts this notion, which comes from intuitive biology, but that does not mean that the notion of essence vanishes from our deepest mental structures. Knowledge of a theory does not create a set of intuitive expectations consistent with that theory that would replace the prior intuitive expectations. This means that science plays no role in the emergence and development of the intuitive ontology. The intuitive ontology is, rather, the obstacle against which science must constantly struggle.iv
4. SCIENCE AND INTUITIVE ONTOLOGY

An author such as Nietzsche (Unpublished Notebooks 1888, 14[153])v had already warned that the fact that the cognitive structures of reason are useful for the survival of the human race does not warrant anthropocentric epistemological extrapolations: the utility of these structures does not demonstrate their absolute truth (Unpublished Notebooks 1887, 9[38]). Many years after Nietzsche, and perhaps without knowing what Nietzsche had said, Boyer arrives at the same conclusion. After stating that intuitive ontologies are the normal result of cognitive development, Boyer says:


Evolved ontology [...] is neither optimal nor necessarily true. It is certainly not exhaustive: there are domains of experience for which it does not deliver any stable intuitions. Also, intuitive ontology may well be metaphysically [i.e., philosophically] unsound, postulating such things as essences in living things or beliefs in intentional agents without much evidence. Such flaws can only be expected in an ontology that was built by natural selection rather than by trained philosophers. (1998, p. 879)

Our intuitive ontology was not designed by natural selection in order for us to know the truth.vi It is therefore an error to set it up as a criterion of truth. As Boyer says, the human brain's intuitive ontology is "philosophically incorrect" (Boyer & Barrett, 2005, p. 98). One of the tasks of philosophy should be to analyse the problem of categories: firstly, by making the contents of the intuitive ontology explicit and, secondly, by denouncing its epistemic limitations in order to avoid the temptation of raising it to the status of absolute truth. The Aristotelian and Kantian theories of categories can be interpreted precisely as attempts to make the contents of the intuitive ontology explicit. Contributions such as Nietzsche's, in turn, can be read as denouncing such attempts insofar as they treat these categories as hypostases. Following authors such as Nietzsche and Boyer, we should distinguish between the natural level (that of the intuitive ontology) and another level (where science is located, and where post-Darwinian philosophy should be locatedvii) at which we should search for truth, a realm that on many occasions does not coincide with the expectations of the intuitive ontology. It can easily be shown that the intuitive ontology carves up or classifies reality in a different way from how science does, or from how philosophy should.

The propositions of science are counterintuitive, just as those of religion are. However, as I have already said, that does not mean that we should lump science and religion together; the differences between these two fields of culture are notable. While "religion is a likely thing [...] scientific activity is both cognitively and socially very unlikely" (Boyer, 2001, pp. 369-370). That is why it has been developed only by a few people, in a few places, and during just a tiny part of our evolutionary history. Given our cognitive characteristics, scientific activity is totally unnatural (Wolpert, 1992). The intuitive (natural) ontology seems to be completely absent from the realm of science insofar as the propositions of science seem to contradict intuitive expectations. Does that mean that science (and acquired culture in general) is free from the constraints of intuitive expectations? We already know that Boyer's answer is a resounding no:

Conceptual constraints from intuitive ontology are present here, too [...]. Acquired culture can add to intuitions some explicit comments on different (or better) concepts and offer explicit non-intuitive alternatives, not change or replace the intuitions themselves. (1998, pp. 882-883)

De Cruz and De Smedt (2007) also tackle the problem of the relation between the intuitive ontology and scientific understanding from an evolutionary perspective, analysing the specific case of the theory of evolution applied to humans. Those authors present something like a metatheory of evolution constructed out of evolutionary elements, thus placing themselves in a domain that could formally be useful for a naturalistic metaphilosophical reflection.


The distinction between humans and non-humans belongs to the realm of the intuitive ontology. It is possible that some studies of human evolution are influenced by this spontaneous ontological division between humans and non-humans, when, for example, they conclude that human evolution is exceptional (i.e., unique) with respect to that of other species. On the other hand, essentialismviii (also a result of the intuitive ontology) could lead to the opposite conclusion: given the great genetic similarity between non-human higher primates and humans, implicitly both must share the same essence. This can lead to the idea that apes have psychological abilities similar to those of humans (De Cruz & De Smedt, 2007, p. 358). We can therefore see that both those who emphasize the uniqueness of humans and those who insist on demonstrating the family resemblance with the apes are, when it comes down to it (and despite the fact that the defenders of both these opposing positions claim to be scientific), victims of the intuitive ontology. We arrive then at the paradox that scientific ideas have to struggle constantly against something without which they would not exist (i.e., against the intuitive ontology). The fact that our intuitive expectations are so firmly rooted in the human mind explains, from an evolutionary perspective, why we resist accepting explanations, such as evolution, that tend to contradict those expectations (Girotto et al., 2008). Such resistance to scientific ideas is so widespread that some authors do not hesitate to claim that it is a human universal (Bloom & Skolnick Weisberg, 2007).
5. A FOLK-PHILOSOPHY? THE CASE OF DUALISM

The natural character of dualism has been emphasized by some anthropologists (Astuti, 2001; Gell, 1998) who are aware that they are opposing a very widespread conception among their colleagues, according to which dualism is a characteristic exclusive to Western civilization. This unorthodox view has also been defended by psychologists such as Paul Bloom, frequently using the expression "intuitive dualism" or referring to the fact that we are all natural Cartesians, or that babies are "natural-born dualists" (Bloom, 2004, p. xiii). The approach that Bloom advances to explain dualism is decidedly Darwinian:

Darwin proposed that many mental abilities emerged through natural selection: they arose through the reproductive advantages that they gave to our ancestors. But he was also clear that many uniquely human traits are not themselves adaptations. They are by-products of adaptations, biological accidents. (2004, p. xi)

Among the mental capacities that emerged accidentally during the evolutionary process are the capacity to understand the world and the capacity to understand people; that is, the capacity to perform science and philosophy. In the same way as our feet, which were originally shaped by natural selection as instruments of movement, can be used to play football, so our brains are capable of doing things in the modern world that offer no clear reproductive advantages, such as engaging in science or philosophy.


Bloom's thesis is that some of the most interesting aspects of mental life are a consequence of two capacities, our understanding of material bodies and our understanding of people: "we see the world as containing bodies and souls, and this explains much of what makes us human" (Bloom, 2004, p. 34). That is to say, it explains many of those specifically human capacities that we set to work when we do science or philosophy.

Descartes's is one of the most notable attempts in the history of thought to make explicit what Bloom calls our "naive metaphysics" (Bloom, 2004, pp. 5-6), which coincides with what we have been calling the intuitive ontology. The only thing that is intuitively clear for Descartes (who aimed to question everything he knew) is our existence as thinking beings. In effect, Descartes asked himself: what am I? And he answered: although I may doubt my body, there is no doubt that I am a thinking being; that is, there is no doubt that the self (the soul) exists and that the body is not necessary for the existence of the soul. For Descartes it is clear that mind and body have different properties; that I am not a body, but rather a being that feels, that acts and that occupies a body. The Cartesian answer is a very good reflection of our basic intuitions regarding what we are, since this is how we spontaneously see ourselves and how we see others. As it stands, Descartes's answer satisfies the expectations of our intuitive ontology: it is perfectly natural. Furthermore, according to Bloom, this intuitive dualism that is common to all of us, and which Descartes was able to make explicit, is precisely the foundation of our understanding of personal identity (Bloom, 2004, p. 195).

Cartesian intuitive dualism reflects the way in which we see the world. We can come to understand what it is that makes us human by recognizing that we are natural Cartesians. Spontaneously, human beings consider that mental states and entities are ontologically different from physical objects and events. A good explanation of this attitude is given by some recent research in psychology, such as that of Henry Wellman (1990), who maintains that young children are dualists (Bloom, 2004, p. 199). Dualism decisively favours the appearance of the idea that the soul can survive the death of the body. As Bloom claims, belief in an afterlife is a natural consequence of our intuitive Cartesian perspective (Bloom, 2004, p. 207). This also explains why this belief is so widespread and why it is so difficult to accept it as false, despite the fact that cognitive scientists present ever more categorical evidence in favour of the thesis that mental life (or, if you prefer, the spiritual soul) is no different from material forces and, therefore, has no existence independent of the body. We find ourselves facing another example of persistent resistance to science. Here once again the natural, common-sense vision (in accordance with our intuitive ontology) and the scientific vision are in direct conflict. The application of materialism to what we generically call the human soul is a hypothesis that is difficult to swallow: it is profoundly counterintuitive. However, there seems to be no alternative other than to challenge the natural dualism contained in our intuitive ontology, maintaining that the only way to explain humans and our cultural products is via the acceptance of materialism (Sperber, 1999).


This means accepting that every last vestige of dualism must disappear from psychology and from anthropology. Furthermore, we could say that it is also an invitation for it to disappear from philosophy.
6. CONCLUSION: METAPHILOSOPHY, FOLK-PHILOSOPHY AND (NATURALIZED) PHILOSOPHY

It does not seem reasonable for philosophy to continue to ignore the clamour of the approach led by the cognitive and evolutionary disciplines, which affects important areas of human culture. That approach presents arguments that are rationally convincing and that should therefore be of interest to philosophy. If in disciplines such as anthropology more and more voices are being raised in favour of naturalizing the discipline, philosophy should at least seriously analyse the advisability of following the same path. This would involve, in the first place, reflecting on philosophy itself (i.e., engaging in metaphilosophy from a naturalistic perspective). Thereafter, and on the basis of that reflection, philosophy should consider taking the opportunity to purge itself of positions in which it has been firmly entrenched for many centuries. In this way, a naturalized philosophy should do away with those transcendent positions (or metaphysical ones, in the Nietzschean sense) that are strongly linked to dualism.

This must not be seen as philosophy betraying its own past; it is closer to the exact opposite. From the perspective considered here, that history is valued: the attempts to make explicit the mechanisms of what we have been calling the intuitive ontology, carried out by great philosophers such as Aristotle, Descartes and Kant, are the first step without which the current critical analysis of those mechanisms would not be possible. This perspective also recognizes the efforts of authors such as Nietzsche who (anticipating some of the results to which the cognitive and evolutionary approach considered here leads) denounced the fact that, due to a paradoxical mechanism that can be explained in naturalistic terms, those positions that cling to dualism, and that therefore deny the pertinence of naturalism, are very natural; that is, they can count on the support of our common sense, and that is why they are so successful.

Dualism does not belong exclusively to any one field of the intuitive ontology (i.e., to intuitive biology, intuitive physics or intuitive psychology); rather, it is present in all these areas. Because of this, dualism could be considered one of the characteristics of the intuitive ontology in general, and it would therefore be a good candidate to occupy a central place in a hypothetical folk-philosophy (i.e., a naive, natural or intuitive philosophy), whose actual existence should be demonstrated with the help of data to be provided by anthropology. If such a folk-philosophy exists (and all the evidence seems to suggest that it does), it would in principle be concerned (in a way which would have to be determined) with all the folk disciplines (folk-biology, folk-psychology, folk-physics, etc.). That is to say, it would deal with all the areas of the intuitive ontology in an attempt to provide a global vision. Using the metaphor proposed by Mithen (1996), folk-philosophy could be seen as the corridors that connect the chapels in the cathedral of knowledge.


Each chapel would represent an area of knowledge, and folk-philosophy would be concerned with all of them. Given that such chapels and corridors are the reflection of the cognitive machinery that has resulted from natural evolution, there is a very strong chance that folk-philosophy is to be found among the human universals.

The same does not seem to be true of philosophy, which is located at a different level of reflection. Philosophy (for example, as practised by Aristotle, Descartes or Kant) aims to make explicit (although there is probably no conscious awareness of performing this operation) the categories of the intuitive ontology that folk-philosophy contains. Philosophy has attempted throughout its history (paradoxically, unconsciously) to make accessible to consciousness those mental constraints of the intuitive ontology that remain hidden. Metaphilosophy is located at yet another level of reflection: it analyses the assumptions of the philosophy that had previously made explicit the assumptions of the intuitive ontology. Metaphilosophy aims to explore (from a naturalistic perspective) the reasons why, for example, the ontology of Aristotle is what it is and is the way it is. It thus aims to make the status of the Aristotelian ontology explicit, performing something along the lines of a genealogical study of it. This is much the same as performing something along the lines of a genealogy of reason, of the logos, not through merely speculative means but using strictly naturalistic ones. Metaphilosophy should be concerned not only with the domain of what interests philosophy as a discipline (i.e., not only with the intuitive ontology that runs through all the areas of knowledge), but also with specific intuitive ontologies. Metaphilosophy would therefore be concerned with certain elements of the different fields of knowledge that in their day separated themselves from the common trunk of philosophy.

A metaphilosophical reflection from the naturalistic point of view considered here would have the virtue of making philosophy (once naturalized) emerge from its walled enclosure and interact with other fields of knowledge. Once the need to naturalize philosophy has been accepted, metaphilosophical reflection would contribute to making philosophy conscious that, just like all the other fields of culture, it is subject to cognitive constraints that interfere with its task. Philosophy should thus accept that the intuitive ontology is not the domain where truth is located, but precisely the domain where the obstacles to finding truth lie. Philosophy should struggle against the prejudices that the dominant philosophical tradition uncritically considered to be a fundamental part of its own identity. If these philosophical prejudices are so difficult to abandon, it is precisely because they are prejudices of common sense and are therefore deeply rooted in the human mind. Among such prejudices, a central place is occupied by the spontaneous tendency to think about things in dualistic terms. If philosophy is to continue doing justice to its name, it must accept that advances in the discipline are brought about via a powerful clash with these natural conceptions that have been accepted as correct. If philosophy follows the path taken by other fields of knowledge, it will have to accept its counterintuitive (i.e., unnatural) nature and struggle against the intuitive (i.e., natural) conceptions that have dominated its core throughout its history, such as essentialism and dualism.


A naturalistic philosophy (one that is neither natural nor naive) seems to be the only possible way forward for a philosophy that aims to take its place in the field of human knowledge with dignity, without resigning itself to a relegated, secondary role as an interpreter of texts treated as lifeless relics. Metaphilosophical reflection would identify the natural character of metaphysics and the advisability of constructing a philosophy that is unnatural or counterintuitive precisely because it is naturalistic. It would also encourage us to consider philosophy in a different, unusual way, reading and interpreting it in the light of what has been said here: from a cognitive and evolutionary perspective. This does not mean throwing out the history of philosophy, but rather interpreting it as the efforts of human beings to understand the world and to understand themselves, while emphasizing the fact that these efforts have had to use tools that are not well suited to this end. The intuitive ontology offers a permanent resistance that we can only attempt to overcome through patient work in which everybody's cooperation is necessary. Culture is cumulative, and this allows us to advance through the dense forest of our own understanding.

However, philosophy cannot shut itself away; it cannot become a sterile, exclusively self-referential practice. It must be aware that it is one more cultural product, with a series of special characteristics (such as striving to provide global explanations) that make its task extremely difficult in a period when accumulated knowledge is so large that it is impossible to cover it all in a single prodigious synthesis. Nevertheless, philosophy must continue to assert that there is no problem that falls outside its field. It must insist that its specificity is precisely and paradoxically its generalizing aim; that is, its declared vocation for the absence of specificity. That its capacities are limited compared to this grandiose task must serve as a stimulus to enlist help from other fields of knowledge that historically belonged to the realm of philosophy but were progressively separated from the common trunk.

The advisability of embarking on a metaphilosophical reflection from this perspective seems obvious. Such a reflection opens the doors to a naturalization of philosophy within the broader context of the programme of a naturalization of culture. Notwithstanding, it must remain clear that naturalizing philosophy does not necessarily mean passively accepting that philosophy must be built entirely with the methods of the natural sciences. It does, however, require accepting that philosophy must free itself from the arrogant attitude that it has displayed at many moments throughout its history, by virtue of which it has rejected the help that other disciplines could have provided for it to understand itself better.


NOTES
* This work forms part of the Research Project "Naturalizing Philosophy: A Metaphilosophical Reflection in the Context of Contemporary Culture" (EHU2009/03), funded by the University of the Basque Country (UPV/EHU).
i An important exception is the excellent book by George Lakoff and Mark Johnson, Philosophy in the Flesh. The Embodied Mind and Its Challenge to Western Thought (1999). The authors take philosophy as an object of study, applying to it the findings of cognitive science; they thus move in a metaphilosophical field. The key to their explanation lies in the emphasis on the metaphorical character of the assumptions of the different philosophies. Their perspective is, therefore, rather linguistic: it shows how the concepts and theses of the most significant authors in the history of philosophy are constrained by a series of metaphors stemming from the naive (i.e., folk) way of thinking of human beings. From the perspective of the present work, one would take a further step, asking why these metaphors are so successful, that is, why they are so deeply rooted in the human mind. The answer comes from evolutionary psychology: these metaphors were formed over the evolutionary history of man and correspond to the solutions that man had to give to the problems posed by the environment. The evolutionary approach of the present work and the approach of Lakoff and Johnson (based on the results of cognitive science and on special attention to metaphors) are complementary: both share the same preoccupation with the problem of philosophy and approach it with a similar strategy.
ii In the more specifically philosophical field of metaethics, the work of Richard Joyce (Joyce, 2006) is particularly noteworthy.
iii Studies of neuroimaging provide clues as to how intuitive ontologies may be neurologically represented (De Cruz & De Smedt, 2007, p. 354). Such studies suggest that intuitive ontologies structure and guide perception. Thus, the brain does not just passively construct abstract information from sensory cues, but actively constructs conceptual frameworks to interpret the sensory information (ibid.). We can consider this another, more specific way of referring to what Kant called the spontaneity of the understanding, that is, to the fact that the subject contributes something to the process of knowing. What Kant could only lay out in a way that was inevitably confused and even clumsy (since he did not have more details at his disposal) is now made clear by these discoveries. That is why philosophers must know what Kant said, but they cannot remain deaf or blind to the latest discoveries of, in this case, neuroscience.
iv This lack of agreement between science and the intuitive ontology has led to studies of intuitive or naive (i.e., folk) theories in fields such as physics, logic, biology and psychology. An outstanding author in the attempt to tackle the problem of science (and very specifically that of biology) from an anthropological perspective is Scott Atran (1998).
v When Nietzsche's Unpublished Notebooks are cited, the date and the numeration of the Notebook are given according to the numeration established in the Colli-Montinari edition (Nietzsche, 1967-sq).
vi In this too, there is concurrence with Nietzsche. Cf. The Gay Science, 354: "We have not any organ at all for knowing, or for truth: we know (or believe, or fancy) just as much as may be of use in the interest of the human herd, the species."
vii Pre-Darwinian philosophy (such as that of Aristotle or Kant) would be at an intermediate level, between that of the intuitive ontology and that of post-Darwinian science and philosophy.
viii Regarding essentialism in biology, see the work of Susan A. Gelman (Gelman, 2003; Gelman & Hirschfeld, 1999). Essentialism in biology has important consequences, as it is a mental obstacle that makes the appearance of the theory of evolution difficult. It would be necessary to analyse whether essentialist positions, which are present in other fields of knowledge and are very consolidated in philosophy (e.g., the essentialism of Plato: ideas or forms are the invariable, unchanging essences of material things in a constant state of change or becoming), also represent an obstacle when it comes to offering non-essentialist (or non-substantialist) explanations in those fields. This would bring out the difficulty, denounced by authors such as Nietzsche, involved in detaching oneself from notions such as being, the self or causes.

Essentialism is so firmly grounded in our intuitive ontology that explanations that threaten this idea encounter huge difficulties when it comes to being accepted.

REFERENCES
Astuti, Rita (2001): "Are we all natural dualists? A cognitive developmental approach". The Malinowski Memorial Lecture, 2000 [online]. London: LSE Research Online. Available at: http://eprints.lse.ac.uk/archive/00000471 (available online: November 2005).
Atran, Scott (1998): "Folk biology and the anthropology of science: Cognitive universals and cultural particulars", Behavioral and Brain Sciences 21, 547-609.
Ayala, Francisco J. (1987): "The Biological Roots of Morality", Biology & Philosophy 2 (3), 235-252.
Bloom, Paul (2004): Descartes' Baby. How the Science of Child Development Explains What Makes Us Human. New York: Basic Books.
Bloom, Paul (2007): "Religion is natural", Developmental Science 10 (1), 147-151.
Bloom, Paul & Deena Skolnick Weisberg (2007): "Childhood Origins of Adult Resistance to Science", Science, vol. 316, 18 May, 996-997.
Boyer, Pascal (1994): The Naturalness of Religious Ideas. A Cognitive Theory of Religion. Berkeley / Los Angeles: University of California Press.
Boyer, Pascal (1998): "Cognitive Tracks of Cultural Inheritance: How Evolved Intuitive Ontology Governs Cultural Transmission", American Anthropologist 100 (4), 876-889.
Boyer, Pascal (1999): "Human Cognition and Cultural Evolution", in Henrietta L. Moore (ed.): Anthropological Theory Today. Cambridge: Polity Press, pp. 206-233.
Boyer, Pascal (2000): "Natural epistemology or evolved metaphysics? Developmental evidence for early-developed, intuitive, category-specific, incomplete, and stubborn metaphysical presumptions", Philosophical Psychology 13, 277-297.
Boyer, Pascal (2001): Religion Explained. The Evolutionary Origins of Religious Thought. New York: Basic Books.
Boyer, Pascal (2003): "Religious thought and behaviour as by-products of brain function", Trends in Cognitive Sciences, vol. 7, no. 3, 119-124.
Boyer, Pascal (2008): "Religion: Bound to believe?", Nature 455, 23 October 2008, 1038-1039.
Boyer, Pascal & Clark Barrett (2005): "Domain Specificity and Intuitive Ontology", in David M. Buss (ed.): The Handbook of Evolutionary Psychology. Hoboken: John Wiley & Sons, pp. 96-118.
Carruthers, Peter, Stephen Stich & Michael Siegal (eds.) (2002): The Cognitive Basis of Science. Cambridge: Cambridge University Press.
De Cruz, Helen & Johan De Smedt (2007): "The role of intuitive ontologies in scientific understanding: the case of human evolution", Biology and Philosophy 22, 351-368.
Dennett, Daniel C. (2003): Freedom Evolves. New York: Penguin.
Dennett, Daniel C. (2006): Breaking the Spell. Religion as a Natural Phenomenon. New York: Penguin.
Gell, Alfred (1998): Art and Agency. An Anthropological Theory. Oxford: Clarendon Press.
Gelman, Susan A. (2003): The Essential Child: Origins of Essentialism in Everyday Thought. Oxford: Oxford University Press.
Gelman, Susan A. & Lawrence A. Hirschfeld (1999): "How Biological Is Essentialism?", in Douglas L. Medin & Scott Atran (eds.): Folkbiology. Cambridge (Ma.): MIT Press, pp. 403-446.
Girotto, Vittorio, Telmo Pievani & Giorgio Vallortigara (2008): Nati per credere. Perché il nostro cervello sembra predisposto a fraintendere la teoria di Darwin? Torino: Codice Edizioni.
Hauser, Marc D. (2006): Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong. New York: Harper Collins.
Joyce, Richard (2006): The Evolution of Morality. Cambridge (Ma.): MIT Press.
Lakoff, George & Mark Johnson (1999): Philosophy in the Flesh. The Embodied Mind and Its Challenge to Western Thought. New York: Basic Books.


Mithen, Steven (1996): The Prehistory of the Mind. A Search for the Origins of Art, Religion and Science. London: Thames and Hudson.
Nietzsche, Friedrich (1967-sq): Werke. Kritische Gesamtausgabe. Berlin: Walter de Gruyter (Hrsg.: Giorgio Colli & Mazzino Montinari).
Pinker, Steven (2002): The Blank Slate. The Modern Denial of Human Nature. New York: Penguin.
Ridley, Matt (1996): The Origins of Virtue. Human Instincts and the Evolution of Cooperation. New York: Penguin Books.
Sperber, Dan (1996): Explaining Culture. A Naturalistic Approach. Oxford: Blackwell.
Sperber, Dan (1999): "Voir autrement la culture", in Roger-Paul Droit & Dan Sperber: Des idées qui viennent. Paris: Odile Jacob, pp. 91-105.
Tooby, John & Leda Cosmides (1992): "The Psychological Foundations of Culture", in Jerome H. Barkow, Leda Cosmides & John Tooby (eds.): The Adapted Mind: Evolutionary Psychology and the Generation of Culture. New York: Oxford University Press, pp. 19-136.
Wellman, Henry M. (1990): The Child's Theory of Mind. Cambridge (Ma.): MIT Press.
Wolpert, Lewis (1992): The Unnatural Nature of Science. London: Faber and Faber.


PABLO QUINTANILLA

NATURALISM AND THE MIND


The Final Questions

The word "naturalism" has so many different meanings in contemporary philosophy that it might sometimes seem of little use, unless one draws some distinctions. In this paper I shall, in the first place, detail the senses in which we can fruitfully use the concept in order to understand the nature of the mind. Then I will discuss the arguments that have been offered for and against a naturalistic account of the mind. Finally, I will discuss what options we have if we give up naturalism, and to what extent we can explain in a naturalistic manner what are probably the most complex features of the human mind, namely moral behavior and agency.

We can start by distinguishing between reductive and non-reductive ontological naturalism. The former view claims that natural entities are the only real entities, i.e., objects that are part of the physical universe. In other words, something is a real entity if and only if it is part of the ontology of the hypothetically completed empirical sciences, or if it is reducible to entities that are part of such an ontology. If you take this road, you face an alternative. On the one hand, you can accept that, strictly speaking, the mind doesn't exist, that the word "mind" is only a misleading name we use to describe the brain, especially when we don't know how it works. This view is championed by many people, especially by Paul Churchland (1979) in his eliminative materialism. On the other hand, you can say that the word "mind" is reducible to a set of features or functions of the brain, in which case you can still use the word (that is, you are not an eliminativist like Churchland), but you must keep in mind that you are using an imprecise word to talk about physical things that would be better explained using purely physical vocabulary.

Non-reductive ontological naturalism claims that some real entities are part of the ontology of the hypothetically completed empirical sciences, but that some other entities are not, nor are they reducible to those entities. However, non-reductive naturalism holds that all real entities supervene upon such an ontology. Examples of such entities would be adverbs, social classes and minds; but not Zeus, ghosts or phlogiston. Thus minds, mental states, beliefs, desires, goals and actions are real entities which are part of the universe, with the same status as trees and planets; but whereas trees and planets are spatial-temporal entities, minds and mental states are not, although they supervene upon spatial-temporal entities. Donald Davidson (1980) and David Papineau (1993) are important representatives of non-reductive ontological naturalism.

Another important distinction has to be made in the methodological realm, this time between reductive methodological naturalism and non-reductive methodological naturalism.

The former claims that all terms to be used in any method designed to produce knowledge can be replaced by terms which are part of the hypothetically completed methods of the empirical sciences, without any explicative, cognitive or semantic loss. If you take that road, you would have to say that mental terms like "emotion" or "intention" can be reduced to physical terms, in such a way that you can replace them in any scientific psychological theory without losing either semantic or cognitive content. This implies that psychological theories, and any kind of theory that is supposed to explain whatever the mind is, can in principle be reduced to some kind of physical naturalistic theory without relevant loss. Non-reductive methodological naturalism, on the other hand, affirms that although there are terms, for instance psychological terms, which cannot be reduced to terms belonging to the hypothetically completed methods of the empirical sciences without cognitive or semantic loss, these former terms supervene upon the latter.

Since both non-reductive ontological and methodological naturalism rely heavily on supervenience, it will be necessary to explain this concept further. A simple way to put it would be this: let us call A an entity with some ontological properties, or a term with some semantic properties; and let us call B another entity with its own ontological properties, or a term with its own semantic properties. A supervenes upon B if and only if there can be no difference in A without there being at the same time a difference in B. The word "supervenience" has a long history, but Donald Davidson (1980, p. 214) made it central to debates in the philosophy of mind, defining it in the following way:

[M]ental characteristics are in some sense dependent, or supervenient, on physical characteristics. Such supervenience might be taken to mean that there cannot be two events alike in all physical respects but differing in some mental respect, or that an object cannot alter in some mental respect without altering in some physical respect. Dependence or supervenience of this kind does not entail reducibility through law or definition: if it did, we could reduce moral properties to descriptive, and this there is good reason to believe cannot be done; and we might be able to reduce truth in a formal system to syntactical properties, and this we know cannot be done.

The idea in supervenience is, therefore, that if A supervenes upon B, there is some kind of ontological relation of dependence between A and B, such that although B is autonomous from A, A is not autonomous from B. That is, there could be Bs without As, but there could not be As without Bs. The consequence is that if the mind and mental states are supervenient upon the brain, there are obviously brains without minds, but not the other way round. Furthermore, any change in the physical states of the brain will bring about changes in the mental states; and two hypothetical brains in exactly the same physical conditions, if that were possible, would produce exactly the same mental states. However, the mind is not ontologically identical to the brain, or the body; nor are the mental states ontologically identical to the physical states. In the same fashion, a psychological explanation is dependent on physical explanations, but the former cannot be reduced to the latter without cognitive or semantic loss.
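For concreteness, the definition just given can also be stated in the modal idiom that later became standard in the supervenience literature. The following schema is added here purely as an illustration (it follows the style of post-Davidsonian formalizations, with A and B now read as families of properties, and is not Davidson's own wording):

    Necessarily, for all x and y: if ∀F ∈ B (Fx ↔ Fy), then ∀G ∈ A (Gx ↔ Gy).

That is, necessarily, any two objects that are indiscernible with respect to their B-properties (say, the physical ones) are indiscernible with respect to their A-properties (say, the mental ones). This captures the slogan "no difference in A without a difference in B" while leaving open, exactly as Davidson insists, whether A-properties are reducible to B-properties.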


There have been some objections to the different projects that attempt to give a naturalistic explanation of the mind, that is, to the attempts to explain the mind as a natural object or as an entity that supervenes upon natural objects. The most interesting of them come from Kantian quarters. It is well known that Kant (1999) thought that the mind has the capacity to constitute phenomenal reality, that is, the only reality we can know, because the noumenon can never be known and is only a posit of pure reason. In assuming this, Kant claims that the mind, or the understanding (Verstand), cannot be studied like any other natural object, since it has the ability to constitute natural objects and, therefore, cannot be a natural object itself. For Kant, the mind is a transcendental object which cannot be studied by the natural sciences but only by a transcendental science, which would be his own transcendental philosophy.

Kant's path was followed by many critics of naturalism, especially by Edmund Husserl and his transcendental phenomenological philosophy. Husserl's influence was pervasive in Continental philosophy which, it seems to me, caused this kind of philosophy to divorce itself from science and, even worse, to maintain an obsolete and sometimes positivistic conception of the natural sciences. This is a strange paradox: in trying to move away from positivism, many Husserlians, as a consequence of Husserl's legacy, became positivists without knowing it. At the beginning of the twentieth century, around 1911, Husserl (1965) thought that naturalism was the biggest danger to both philosophy and science. In his later work, The Crisis of European Sciences and Transcendental Phenomenology: An Introduction to Phenomenological Philosophy, written in 1936 (1965), he also claims that naturalism, as well as the conception of the world and of knowledge implicit in it, is partially responsible for the decadence of the Western world he was living in. The reason is that science has alienated and separated man from his own environment. He also thought that technology, which is a natural consequence of science, had become the major enemy of man himself. It is understandable, therefore, that he considered a naturalistic approach to the mind unacceptable. Needless to say, the science he knew was very different from the one we know now, and he could not have known the post-positivist developments in the philosophy of science.

Husserl developed four major arguments against naturalism; all four have a Kantian flavor and, at the end of the day, are rooted in Kant's transcendental philosophy. Nevertheless, I will discuss them in Husserl's fashion, for three reasons. First, because he influenced twentieth-century Continental philosophy as a whole: Husserl influenced Heidegger and phenomenological hermeneutics, Gadamer and philosophical hermeneutics, and many other followers of phenomenology. Second, because many contemporary objections to naturalism, coming from English-speaking philosophers, are at bottom versions of those arguments. And third, because they are already formulated in a way that can be discussed properly.


It seems to me that, although these days not many people take Husserl seriously enough in the English-speaking philosophical world, his arguments against naturalism have been very influential, even upon philosophers who have not read him but have been influenced by others who have, or who have arrived independently at conclusions similar to Husserl's. Interesting arguments against the project of the naturalization of the mind, with clear phenomenological influence, can be found, for instance, in Lane Craig and Moreland (2006) and De Caro and McArthur (2004).

Husserl's four arguments run as follows. In the first place, he regards naturalism as claiming that all real entities are natural objects; he didn't have the concept of supervenience, and he identified natural entities with material entities. He also thinks that psychologism holds that all logical entities are psychological entities, so that, in the long run, they are natural too. Husserl believes that both naturalism and psychologism are contradictory, in the sense that they deny the ideal laws that they need for their own justification as sciences. His main claim is that logical laws have a normative status, and that this status cannot be explained by the natural sciences, which not only assume such normative status as a condition of possibility for explanation but are themselves descriptive in nature.

Husserl's second argument holds that the phenomenological aspect of consciousness is not a natural object like any other and, therefore, cannot be studied as such, because it has a different ontological status. As is well known, this idea has been widely discussed in recent philosophy of mind by those philosophers who hold either a dualist or an emergentist view and who claim that the phenomenality of consciousness is ontologically different from spatial-temporal objects, so that it cannot be studied in the same way and with the same methodological tools.

The third argument starts by defining the "natural attitude" as the kind of attitude we have toward the world when we have not reflected about it philosophically. After doing so, we acquire a "transcendental attitude", which for Husserl is epistemically superior. Now, Husserl believes that if we remain stuck in the natural attitude we tend to develop a naturalistic conception of reality that denies us the possibility of overcoming it and moving to the desired transcendental attitude.

The fourth argument states, following Kant, that the mind has a constitutive role in phenomenal reality, and that what constitutes something cannot be ontologically similar to what is constituted, nor can it be studied in the same way. Thus, Husserl concludes, it is a mistake to study the mind in the same way we study neurons or other natural objects, with the tools of natural science. For Husserl, therefore, naturalism is doomed.

The main problem with the first objection is that it doesn't distinguish between natural entities and supervenient entities. It would be senseless to claim that adverbs, desires and social classes don't exist, if you think that either they exist as material entities or they don't exist at all. But it is perfectly sensible to say that they exist as supervenient on natural entities. Thus, there can only be adverbs where there is language, language where there are people, and people where there are individual persons with neurological processes; adverbs therefore supervene upon persons and their physical brain processes. I think the same can be said of any other object in reality, including numbers, the principle of non-contradiction and triangles.

The second argument is more interesting, since it has been widely discussed by recent philosophers, e.g. Thomas Nagel, John Searle and especially Frank Jackson (1986) in his famous thought experiments. Although I will not repeat such discussions here, I would like to point out that the argument starts by assuming something highly debatable, namely, that consciousness is ontologically different from neurological processes because we have a different access to its experience. If that were right, we would in the same fashion have to say that, since animals have pain and their access to their pain is different from their access to water, animal pain and water are ontologically different, from which we would have to conclude that animals have minds that are ontologically different from their bodies. It seems to me that the second argument is just a reformulation of a Cartesian attitude. For Descartes, there had to be a non-physical entity called res cogitans because at his time there was no way to explain scientifically such complex phenomena as the will, consciousness or intentional mental states. But Descartes was not only a dualist but also a mechanist. He believed that emotions and sentiments were nothing but energy fluxes. He thought we could explain emotions mechanically but not consciousness, so he was a material monist regarding emotions but not regarding consciousness. If we could explain consciousness with natural methods, there would be no reason to hold to its ontological difference. It seems to me that this is also the case with Husserl and the philosophers who claim the ontological difference of phenomenal experience. I also think that recent neuroscience is on the edge of explaining consciousness in natural terms, which is not the same thing as reducing consciousness to natural terms. When neuroscientists finally achieve such an explanation, we will have done away with all arguments for the ontological difference of phenomenal experience.

The main problem with Husserl's second argument is that it assumes the following implication: if x cannot be explained in a naturalistic manner, then x is not a natural entity. Obviously this conditional doesn't follow, and it might be just a matter of time before we are able to explain more mental processes in a naturalistic way. Even if, as a matter of fact, we could never explain an object with naturalistic tools (although it would be interesting to see how that could be proved), it doesn't follow that such an object is not natural. Some people prefer a mysterian turn (McGinn, 1999), which claims that the mind might simply be unexplainable for our limited intellectual capacities, although it is difficult to see why we should arrive at such a skeptical conclusion given that we are still so early in neuroscientific research.
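The logical structure of this objection can be made explicit. Schematically (the following rendering is mine, added only for illustration; the predicate letters are not Husserl's or anyone else's official notation), let N(x) stand for "x is a natural entity" and E(x) for "x can be explained with naturalistic tools". The anti-naturalist inference is:

    ¬E(mind), therefore ¬N(mind)

This inference is valid only if one adds the bridge premise N(x) → E(x), that is, that whatever is natural must be naturalistically explainable by us. But that premise slides from an ontological category to our epistemic access to it, and it is precisely what the naturalist has no reason to grant: the class of natural entities need not coincide with the class of entities that we can, now or ever, explain.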


The third argument just begs the question of why we should give up naturalism or the natural attitude. I believe that a natural attitude and a direct realist account are good enough philosophy, and that we don't need any transcendental standpoint.

The fourth argument is particularly interesting and deserves more analysis. Kant's and Husserl's claim is that the mind cooperates in the constitution of the phenomenal world and that, therefore, it cannot be studied as any other object of the phenomenal world.

QUINTANILLA

the phenomenal world. For Kant the mind is a transcendental posit of pure reason; that is, its existence cannot be demonstrated but has to be presupposed by pure reason as a necessary condition for knowledge of the external world. The problem is, in the first place, that the argument already assumes that we have no knowledge of the noumenal world, only of the phenomenal world, which is constituted by the mind. Of course, we don't have to assume all that. If you are a realist and not a noumenist, you don't have to assume we only know the phenomenal world; you can claim that we do know the real world as it is, although, of course, we can also have false beliefs about it. But let's assume for a moment that Kant is right and that we only have knowledge, when we have it, of the phenomenal world, and that the mind collaborates in the constitution of such a world. Why does it follow that we cannot study the mind as a phenomenal reality? In order to see that the previous conditional is a non sequitur, let's use a similar example. Let's assume, for the sake of argument, that the eye and the optic nerve, as well as the brain, collaborate in the constitution of visual images, in such a way that colors and shapes wouldn't exist if there were no eyes, optic nerves and brains. Does it follow that we cannot study the eye as a natural and visual object? Does it follow that we cannot see it? Even if the brain participates in the constitution of natural phenomenal reality, we can still describe the brain as part of that natural phenomenal reality. I think the trick with Kant's argument is that he is already assuming that the mind is not a phenomenal object, so it cannot be studied as such. It was too difficult for Kant to entertain, even as a possibility, that the mind is just the brain and that when we talk about the understanding we are talking about a function of the brain. At the end of the day, objections to naturalism end up either in a form of mysterianism, that is, in claiming that we don't know what the mind is and probably never will, or in a form of dualism. The former is not an interesting philosophical approach, since we have always known the problem is complex and we have been dealing with it for the last two thousand years. Moreover, it would not be the first time that we think we are clueless about something and then, unexpectedly, new research opens new and promising approaches to it. By definition, we cannot predict what we will know in the future. My point, anyway, is that mysterianism is not an interesting option. The other path to take, if you give up naturalism about the mind, is dualism in its different forms. In what follows, I will give some reasons why these are not acceptable. Although there are various forms of dualism, the central claim of this view is that the human being is a composite of two ontologically different substances, body and mind, the former being material and the latter not. This is what is normally called substance dualism. A first problem for this conception is how to understand the idea that the mind is non-material. One might mean that it is not spatial-temporal, but then one has to explain how something that is not spatial-temporal interacts causally with what is spatial-temporal, namely, the body. Further, if the mind is not spatial-temporal, how is it attached to something that is spatial-temporal, namely, one particular body? We all know that psychic
processes, that is, mental processes, have bodily physical effects, and the other way round; but if the mind is not a spatial-temporal entity, such interaction is difficult to explain. Of course, you could argue that I am assuming that two objects can causally interact only if they are ontologically similar, and that we shouldn't have to assume that. Actually, although I believe that claim, I don't have to assume it in order to object to dualism. The only claim I need to make is that if the mind is ontologically different from the body, i.e. if it is not spatial-temporal, then it cannot be attached to something that is spatial-temporal, leaving aside whether they interact causally or not. In other words, we have the problem of individuating minds: what makes one mind different from another, if not its relation to a body? One could argue, as Augustine and Kant did, that the mind is a temporal but not a spatial entity, but that contradicts all we know from contemporary physics, in particular the fact that space-time is a single entity. Another problem with dualism is that it is not clear how we could know the characteristics, and the very existence, of something that, by definition, is beyond all our natural and artificial tools of experience. For dualism, accepting the existence of entities that are not material could imply that they are inferred entities, that is, that we infer them from perceptible events in order to explain phenomena that would otherwise be unexplainable. That is the point of Descartes' mechanistic dualism. But, as I said before, there is no reason to assume we cannot explain most aspects of the mind with our present-day knowledge. In fact, present-day neuroscience is very close to explaining some core aspects of consciousness. The other two important problems still waiting for good naturalistic explanations are, on the one hand, free will and agency and, on the other, moral behavior and normativity. I believe, however, that those problems can also be explained naturalistically. I will not attempt to do so now, but I would like to show how it could be done and what steps are already being taken for that purpose; but first let me summarize what I have been trying to do so far. Until now I have been claiming that most arguments against a naturalistic account of the mind are rooted in Kant's transcendental philosophy. I have also argued that those arguments are not sound and that it is possible, and very realistic, to have a naturalist account of the mind. I have claimed that if you give up some form of naturalism you have to choose between two options, a mysterian view or a dualist account, and that both of them have too many flaws to be acceptable. The reason many people still find a naturalist account of the mind either menacing or unacceptable, I think, is that they believe there are two features of the mind that cannot be explained in naturalist terms: agency and moral behavior. Now I will try to show how these two topics can be properly treated by naturalism. I will not attempt to spell out the details of such an account here, for I have tried to do so elsewhere (Quintanilla, 2009, 2011), but I will show how it could be done. As regards moral behavior, the relevant aspects to be explained are the origins of cooperation, the possibility of altruism, the characteristics of deliberation and the emergence of normativity. I believe these issues can be explained with some help from evolutionary theory and experimental psychology. Recent research in
evolutionary theory has made great progress in explaining how cooperative behavior is a product of natural selection and how altruism can also be explained as a result of group selection, along basically the same lines as Darwin's attempt to solve the so-called paradox of altruism. The paradox runs like this: if altruism is a product of natural selection, and given that altruistic people put themselves in worse competitive conditions than egoists, then after enough generations egoists would survive and altruists would simply go extinct; but there are altruistic people, and they have to be a product of natural selection. Darwin (1871/2010) tried to solve this paradox by using the concept of group selection, claiming that selective pressure selects not only individuals but also well-adapted groups, which implies strong bonds of cooperation, and probably altruism, among their members. Although the notion of group selection was severely criticized shortly after Darwin's time, the more recent works of Hamilton (1964), Trivers (1971) and Axelrod (1981, 1984) have shown it to be more than plausible. But although cooperation and altruism can be explained with natural methods, that is not enough to have moral behavior. In order to have morality we also need some cognitive and affective capacities: theory of mind, sympathy, deliberation and the concept of normativity. I have proposed elsewhere (Quintanilla, 2009) an order of development of different stages of abilities that can be part of what, both in the ontogenetic and in the phylogenetic realms, might produce moral behavior. Each arrow shows the order of emergence, such that the earlier capacity is a necessary condition for the later one; what appears in parentheses is a more explicit explanation of the concept preceding it.
Cooperative behavior, biological altruism, motor mimicry, emotional contagion → sympathy (compassion) → empathy (self-recognition in a mirror, self-awareness) → metarepresentation/simulation (theory of mind, mentalization, the ability to attribute mental states in two or more degrees) → moral altruism → deliberation, recognition of moral intentionality, moral judgment, autonomy → moral behavior.
In order to show how autonomy could be naturalistically explained, I will turn now to agency, which I have discussed in more detail elsewhere (Quintanilla, 2011). Agency can be understood as a property possessed by those agents that can make choices, i.e., that can intentionally modify their behavior in light of a predicted possible future. This means that, retrospectively, one could say that they could have acted differently if they had wanted to. Both for interacting with other people and for making decisions about the future, it is necessary to have the ability to simulate being others in counterfactual conditions, i.e., the ability to attribute to them the kind of mental states that we believe we would have in the circumstances in which we think they are. When we do so, we imagine what happens in their minds and what would happen in their minds had we behaved, or if we behave, in one way or another. When we do that, we can plan the actions we want to perform in order to produce our desired effects, but we can also predict other people's actions and mental states. More importantly, we can plan the actions we would perform if they behave in the way we predict they will.
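The core of this simulation account (generating candidate scenarios, scoring each against a weighted list of priorities, and performing the action whose simulated outcome scores best) can be made concrete with a small computational sketch. The following Python fragment is purely illustrative and is not Quintanilla's own model: the scenarios, the priority weights, and the additive utility function are invented assumptions for the example.

from dataclasses import dataclass

@dataclass
class Scenario:
    action: str
    outcomes: dict  # predicted degree (0..1) to which each priority is met

def choose(scenarios, priorities):
    # Score each simulated scenario against the agent's weighted list of
    # priorities and return the action of the highest-scoring one.
    def utility(s):
        return sum(w * s.outcomes.get(goal, 0.0) for goal, w in priorities.items())
    return max(scenarios, key=utility).action

# Hypothetical deliberation between two imagined futures.
priorities = {"health": 0.5, "career": 0.3, "leisure": 0.2}
scenarios = [
    Scenario("work late", {"career": 0.9, "health": 0.2, "leisure": 0.1}),
    Scenario("go to the gym", {"health": 0.9, "career": 0.1, "leisure": 0.4}),
]
print(choose(scenarios, priorities))  # prints "go to the gym"

A fuller model would re-run such simulations recursively as new motivations arise, which is the recursive, intertemporal negotiation discussed below; the sketch shows only the single comparison step.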


Just as we can simulate other people's mental states and actions, in order to predict them and react to them properly, we can also simulate the kinds of mental states we think we would have under different possible scenarios of, so to speak, our possible selves or future identities. This internal simulation allows us to test in our imagination those possible scenarios before actually having to carry them out. When we act, we choose the scenario that we believe will be most beneficial to us, according to our list of ends or priorities, and we discard the alternative ones. Thus, the experience of agency would be the phenomenal sensation of imagining those scenarios, deliberating, and choosing the one we consider most convenient with regard to our list of priorities within the utility function we assume. Many of these processes are not fully conscious, or not conscious at all, both because we are not fully aware of the comparisons our minds are making between different scenarios and because we are not always fully aware of our own list of priorities. The simulating abilities are necessary conditions for agency, understood as the capacity to modify the world on the basis of the beliefs and desires one has or one could have. We can use the internal simulator to project our own future, by imagining our possible future selves and our possible behavior according to what we imagine would be the case in our environment. The point is that agency would be the process, either conscious or unconscious, in which we compare different scenarios in the attempt to find the most beneficial one, according to our list of priorities, in order to reach our goals. The deliberative process takes place when a multiplicity of beliefs, desires, ends and inclinations compete, in what has been called an intertemporal negotiation (Ainslie, 2001, p. 40). Different causes and motivations attempt to take control of the individual's behavior, as though they were different agents struggling for the control of the individual, according to different possible subsets of beliefs, desires and lists of priorities. The action performed will be the one produced by the strongest subset of causes and motivations. Both George Ainslie (2001) and Daniel Dennett (2003) have shown how the actions performed after these processes cannot be predicted, because they involve too many variables and also because they include recursive mechanisms. This happens because, although action is caused by the subsystems of mental states involved in the process, we simulate recursively the kinds of scenarios that would have obtained and the decisions that we would make if those scenarios were to take place. New simulations generate new subsets of motivations, which generate more simulations according to the possible scenarios and to our list of priorities in case those scenarios actually happened. If the different subsystems of motivations are evenly matched, deliberation will be difficult and eventually impossible. If those subsystems include one that is clearly stronger than the others, the choice is somehow smooth. The point is that what we call the will, the faculty or entity that makes us agents and that lets us behave freely, doesn't have to be understood as a non-natural entity that cannot be explained or described by the natural sciences. I don't claim in this paper that the account just sketched is correct, but I do think that it might be correct and that it might show a way to explain, in natural terms and with
naturalistic methodology, one of the major problems in the philosophy of mind, as well as perhaps the single problem that has driven many people to dualism, namely, the problem of free will. Moral behavior and agency are probably the final questions to be explained in natural terms in the philosophy of mind. If we manage to do it, we will be very close to completing a naturalistic account of the human mind.
REFERENCES
Ainslie, G. (2001). Breakdown of will. Cambridge: Cambridge University Press.
Axelrod, R. (1984). The evolution of cooperation. New York: Basic Books.
Axelrod, R., & Hamilton, W. D. (1981). The evolution of cooperation. Science, 211, 1390-1396.
Churchland, P. (1979). Scientific realism and the plasticity of mind. Cambridge: Cambridge University Press.
Darwin, C. (1871/2010). The descent of man, and selection in relation to sex. New York: New York University Press.
Davidson, D. (1980). Essays on actions and events. Oxford: Oxford University Press.
De Caro, M., & Macarthur, D. (Eds.). (2004). Naturalism in question. Cambridge, MA: Harvard University Press.
Dennett, D. (2003). Freedom evolves. New York: Penguin.
Hamilton, W. D. (1964). The genetical evolution of social behaviour. Journal of Theoretical Biology, 7, 1-52.
Husserl, E. (1965). Phenomenology and the crisis of philosophy: Philosophy as rigorous science, and philosophy and the crisis of European man. New York: Harper and Row.
Jackson, F. (1986). What Mary didn't know. Journal of Philosophy, 83, 291-295.
Kant, I. (1999). Critique of pure reason (P. Guyer & A. W. Wood, Trans.). Cambridge: Cambridge University Press.
Lane Craig, W., & Moreland, J. P. (Eds.). (2006). Naturalism: A critical analysis. New York: Routledge.
Martínez, J., & Ponce de León, A. (Eds.). (2011). Darwin's evolving legacy. México: Siglo XXI (in press).
McGinn, C. (1999). The mysterious flame: Conscious minds in a material world. New York: Basic Books.
Papineau, D. (1993). Philosophical naturalism. Oxford: Blackwell.
Quintanilla, P. (2009). La evolución de la mente y el comportamiento moral. Acta Biológica Colombiana, 14(4S). Special issue on Darwin 200 años.
Quintanilla, P. (2011). The evolution of agency. In J. Martínez & A. Ponce de León (Eds.), Darwin's evolving legacy. México: Siglo XXI.
Singer, P. (Ed.). (1994). Ethics. Oxford: Oxford University Press.
Trivers, R. (1971). The evolution of reciprocal altruism. Quarterly Review of Biology, 46, 35-57.


JESSE PRINZ

MEASURING MORALITY

1. INTRODUCTION

Whether in our minds or out there, morality is part of the world. Like everything that exists, morality can be studied empirically. This follows from the positivist platitude that existing things can be measured, either directly or indirectly, by observational means. The applicability of empirical methods to morality should be no more surprising than the applicability of empirical methods to geology or economics. Pushing positivist strictures further, I would say that any claim not amenable to some observational test should be regarded with the utmost suspicion. There are untestable claims, to be sure. These include claims about events lost to history, or claims about aspects of the universe that are too small or distant to observe. But morality does not resist observation in these familiar ways. Those who say morality cannot be empirically investigated owe us a defense of that extraordinary claim (I consider one defense in the final section). I think it is wholly untenable on the face of it. Here I will argue that morality can be studied empirically, and I will do so mostly by example. I will describe ways in which empirical methods have shed light on traditional debates about the nature of morality. More specifically, I will argue that empirical methods support a sentimentalist theory of the kind developed by the British moralists, most notably David Hume. Like any empirical thesis, this conclusion is open to correction and revision.
2. WHAT IS OBSERVATION?

Some of the queasiness associated with the empirical approach to morality can be deflected by noting that standard philosophical methodology has always been grounded in observation. Philosophers characteristically work by some kind of reflective equilibrium. This involves taking an inventory of intuitions and offering some account that makes sense of them, perhaps sacrificing a few to explain the majority. This method can be regarded as observational because intuiting is a kind of observing. Take the old yarn that killing is worse than letting die. How do we intuit that? Presumably we intuit it by considering possible cases and observing, through introspection, what seems worse. Philosophers sometimes refer to intuitions as a priori, in contrast with the a posteriori, implying that they do not depend on observation. But this distinction is completely misleading. First, what we know by intuition may just be memories of facts observationally learned, or inculcated, on some earlier occasion. Second, introspection itself can be regarded
as an observational method, akin to perception, by which we observe the contents of our own minds. Do I know a priori that chocolate is delicious? To presume so would be a confused artifact of a bad philosophical distinction. I taste chocolate; it causes hedonic pleasure; I experience that pleasure and report it. This is much the same method by which I learn that chocolate is brown, or breakable, or bitter when burnt. Drawing a line here between observational and non-observational knowledge would be unhelpful. Some philosophers draw the distinction by saying that intuitions are conceptual claims; that is, intuitions are based on the meanings of our concepts rather than on facts. Even granting this extravagant commitment to analyticity, there is no reason to deny that conceptual truths are discovered through observation. There is a debate about what concepts are, metaphysically speaking, but all the major alternatives are consistent with this claim. Concepts are usually construed in one of three ways: as social norms, as Platonic truths, or as psychological entities. All of these are things we observe. That a society expects us to draw certain inferences can be observed through practices of enforcement. That Plato's heaven is so arranged can be observed through some faculty that functions like an organ of sense insofar as it registers mind-external truths and operates effectively only within certain idealized conditions. And psychological states are known either through introspection or through observation of behavior (which is how psychologists study concepts). For the record, I strongly favor the psychological view of concepts (Prinz, 2002), and I will assume that it is correct in this discussion. Here is a brief argument for the psychological view. Social norms supervene on psychological states, such as beliefs about what inferences should be drawn. Therefore, the social norm view is actually a version of the psychological view, one which defines concepts in terms of psychological states possessed by groups rather than individuals. The Platonic theory of concepts cannot be reduced to the psychological view, but, if concepts were Platonic, we would still need a psychological account of how concepts are grasped; somehow we come to know the content of our concepts, and that knowledge presumably involves representing the content of the concepts we possess. But once we admit that we have mental representations corresponding to our concepts, positing anything other than these mental representations seems unnecessary. Frege thought we needed non-psychological entities to explain how two people grasp the same concept, but there is no special difficulty in supposing that individuals share tokens of the same mental representations; and, were Frege to deny this, he would still be left with the puzzle of how communication takes place, because people would grasp their shared concepts in different ways. Thus, Platonic entities are neither necessary nor sufficient for explaining concept sharing. If I am right, then conceptual analysis in philosophy amounts to analysis of our mental representations, and that, again, can be gained by observation. The point can be summarized by saying that the data used by traditional philosophers, intuitions, are discovered through observation. Theory construction is a matter of explaining data. Data underdetermine theories, but theories are answerable to data. In this sense, philosophy, like science, is an empirical field.

Given this continuity between philosophy and science, we philosophers should not be categorically opposed to scientific methods, and scientists should embrace the observations of philosophers. Picking a method is a matter of figuring out which form of observation will be most helpful in answering a question of interest. If you want to know whether you like a particular food, sampling it and introspecting under a range of normal eating conditions would be a good method. Suppose you are interested, however, in a conceptual claim. Introspection may be helpful, but it suffers from two limitations. First, conceptual knowledge is widely believed to lie outside of direct introspective access. Semantic memory, like syntax, is believed to be largely unconscious, and we recover it by considering which cases jar our intuitions, which probably amounts to some kind of affective response. Second, when philosophers make conceptual claims, they aspire to make claims that would be agreed on by their interlocutors, and there is no guarantee that the conceptual knowledge stored in any given person's head is the same as any other's. Thus, we need to supplement introspection with some kind of interpersonal bookkeeping. In philosophy, bookkeeping usually involves publication, peer review, and debate. These methods of sharing intuitions resemble what we find in the sciences. Intuitions need to be repeatable and shared if we are to speak in the first-person plural. An idiosyncratic intuition is like a coat rack made of straw: interesting to examine, but you wouldn't want to hang anything on it. This is most obvious when the intuition is uniquely possessed by a single philosopher. But similar concerns arise for intuitions shared by only a few, like those inculcated in a philosophical tradition, or restricted to members of a social class, gender, ethnicity, or culture. Intuitions that are group-relative are interesting, because they reveal something about the group, but they are risky when mistaken for universal claims. Given the homogeneity of the philosophical community, this risk is substantial. For these reasons, a growing number of philosophers have come to think that standard methods of philosophical observation should be supplemented by methods used in the social sciences. To figure out whether an intuition is shared, we can take polls that include different samples of people: professional philosophers, students, speakers of different languages, members of different cultures, and so on. We can also try to figure out which factors influence our intuitions by conducting experiments with independent variables that can alter introspective reports. The variables may include variations in verbally presented vignettes, environmental factors, or even direct brain stimulation. In such a way, we can get at aspects of conceptual structure that might not be obvious by mere reflection. Such methods rely on introspective reports, but supplement them with sampling and statistical tools that allow us to quantify who has an intuition in question, and under what conditions. We can also supplement philosophical methods by devising experiments that tap into unconscious knowledge. Implicit associations, reaction times, and physiological measures (including brain scans) are useful in this regard. With a rich variety of available methods, philosophers can advance their cause by moving out of the proverbial armchair and into the lab. That is precisely what has happened with the naturalistic turn, arguably the most important
methodological development in philosophy since the linguistic turn, and ultimately more promising. The naturalistic turn really began with the cognitive revolution in the early 1960s, when philosophers took a serious interest in computer science and linguistics. Then, in the 1970s, philosophers began to incorporate results from experimental psychology, and in the 1980s neurophilosophy was born. These developments were somewhat marginal in that only a small handful of philosophers were actually reading primary sources in these fields, and their impact was largely limited to the philosophy of mind (with a few practitioners in epistemology and philosophy of science). By the mid-1990s, however, it had become standard practice in philosophy of mind to draw on empirical work, and the trend picked up in moral philosophy by the end of that decade. In the first decades of methodological naturalism, philosophers almost always relied on the experimental work of professional scientists. In the last decade or so, that changed, and now many philosophers are running their own experiments. Given limits in training and resources, these are usually questionnaire studies, and the most typical format is to give participants variations of a thought experiment culled from the philosophical literature and ask their opinions. This has become known as Experimental Philosophy. Elsewhere, I propose that we use that term narrowly for experiments using vignettes that resemble traditional philosophical thought experiments (Prinz, 2008). I use the term Empirical Philosophy for the older practice of supporting philosophical views by appeal to other kinds of experiments, such as reaction time studies, brain scans, or methods that manipulate participants' psychological states rather than merely polling opinions. Such experiments are usually, though not always, carried out by non-philosophers. Experimental Philosophy is particularly useful if we want to study people's intuitions: the data that inform traditional philosophical analysis. A broader range of methods is useful if we want to uncover underlying processes, which may not be consciously accessible or testable using verbal materials. Experimental Philosophy sometimes leads to proposals about underlying processes; for example, if two variants of the same vignette obtain different average responses, it might be proposed that the two engage different cognitive systems. Such a claim can then be tested using Empirical Philosophy. For example, a brain scan could be used to see if different brain structures are used for the two vignette variants. Armed with these valuable tools, philosophers can find new evidence for and against old theories. To reject such methods because they are not philosophical would be silly. We should let the questions we ask dictate the methods. The methods of social science are often useful for answering questions that philosophers have traditionally asked. I will illustrate this with my discussion of morality. Of course, it would be equally undesirable to deprive social science of armchair methods. Every experiment begins with a question, tests a theory, and derives a conclusion. Questions are just things that interest us, and this can be determined from the armchair. Theory construction is usually a matter of deriving possible explanations for a range of observations, and, once observations are in, that is something we can do by reflection.
To test a theory we need to figure out what observations the theory entails (another intellectual exercise), and an experiment
is like an argument, whose conclusion follows from the empirical results. Without science, philosophy is blind; and without philosophy, science is empty.
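By way of illustration, the interpersonal bookkeeping described above can be quantified with entirely standard statistical tools. The following sketch uses invented counts (they correspond to no study discussed in this chapter) and scipy's chi-square test of independence to ask whether the distribution of judgments depends on which variant of a vignette participants received.

from scipy.stats import chi2_contingency

# Rows are vignette variants; columns count "wrong" vs. "not wrong" verdicts.
counts = [
    [62, 18],  # variant A
    [35, 45],  # variant B
]

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
if p < 0.05:
    print("The manipulation shifts the shared intuition.")
else:
    print("No detectable difference between the variants.")

A significant difference of this kind is the sort of result that the methods of Empirical Philosophy (reaction times, brain scans, and the like) can then probe for underlying mechanisms.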
3. NATURALIZING MORALITY

3.1 Moral Psychology

Moral philosophy is traditionally sub-divided into different subject areas. Among these, the most important are moral psychology (the study of how people understand morality and make morally relevant decisions), metaethics (the study of the truth-makers of moral judgments), and normative ethics (the study of what morality demands, prohibits, and allows). Empirical methods can contribute to all of these areas, and, in what follows, I will try to illustrate this by reviewing evidence that has led me to favor a particular sentimentalist view. I will describe a number of influential studies, as well as some unpublished studies of my own. Let me begin with moral psychology, the domain where the case for empirical methods is most obvious. To discover how people understand morality, we need to study psychological states, and empirical methods are clearly germane to that endeavor. I will focus here on one of the most basic questions in moral psychology: what is it to judge that something is morally good or bad? One traditional answer to this question, associated with the British moralists (especially the Scottish Enlightenment figures Francis Hutcheson, David Hume, and Adam Smith), is that moral judgments are emotional states. To judge that something is good is to feel approbation towards it, and to judge that something is bad is to feel disapprobation towards it. In defending this claim, the Scots relied on introspection. As noted, it is risky to rely on the introspective intuitions of philosophers when advancing theories about how people in general think. Perhaps Scots are just hot-blooded. As a first step in assessing sentimentalism, one can see whether ordinary folk intuitions align with Hutcheson, Hume, and Smith. In pursuit of this goal, I conducted a survey study in which participants were presented with one of two vignettes. One vignette describes a college student, Fred, who insists that it is morally permissible to smoke marijuana, but who feels disgusted with people who use the drug and ashamed when, one day, he uses it himself. The other vignette describes Frank, a college student who insists that smoking marijuana is morally wrong, but who feels no shame after smoking it himself, and no disgust towards others who smoke it. Participants had to say whether Fred or Frank believes that smoking marijuana is wrong. Consistent with the sentimentalist prediction, 77.4% of my participants used emotions in making moral ascriptions: Fred, who has negative feelings about marijuana, was credited with the belief that it is wrong, despite his testimony, and conversely for Frank. This is a very large majority, and it suggests that people untrained in philosophy recognize a link between morality and emotion and tend to rely on emotions when ascribing moral values. In other work, researchers have made progress on determining which emotions are associated with which moral judgments. This is an important advance because
it helps to refine the sentimentalist theory and establish that the moral domain divides into rules of different kinds. In a seminal study, Rozin et al. (1999) presented participants with vignettes describing three kinds of rule violations: crimes against persons (such as stealing), crimes against the community (such as disrespecting parents), and crimes against nature (such as deviant sex acts). Given three emotions to choose from, participants from two cultures reliably associated crimes against persons with anger, crimes against community with contempt, and crimes against nature with disgust. Extending this work, I gave participants first-person scenarios in these three categories and asked whether they would feel shame or guilt. Crimes against persons were associated with guilt (about 90%), crimes against nature were associated with shame (about 80%), and community came out in between (no significant difference). Such findings indicate that the folk not only associate emotions with moral judgments but also have views about which emotions arise under different moral circumstances. This is consistent with the predictions of sentimentalism. There are three ways, however, in which the work could be improved. One would be to use the survey methodology to probe modal intuitions: do ordinary people think that moral judgments are impossible without emotions? In one pilot study, I began to explore these issues, giving participants a vignette in which a person with a brain abnormality lacks emotions and engages in antisocial behavior. The majority of my sample refused to credit him with moral understanding. More work along these lines would be helpful. Another limitation of extant survey studies is that they do not directly measure emotions. The folk seem to believe that emotions are integral to moral judgments, but they could be wrong. To test for the actual involvement of emotions, we need to move beyond surveys (Experimental Philosophy) and into other methods (Empirical Philosophy). For example, numerous neuroimaging studies now corroborate that people are in emotional states when they make moral judgments (e.g., Heekeren et al., 2003; Moll et al., 2003; Harenski & Hamann, 2006). I know of only two lines of imaging evidence implying that there are conditions under which emotions are not characteristic of some moral judgments. First, there is a study by Borg et al. (2006), who report reduced emotional activity when people consider moral vs. nonmoral dilemmas, but this study has a design flaw: the nonmoral dilemma involves a forest fire that is encroaching on one's prized personal property, and that, presumably, is an intensely emotional situation. The other line of evidence is Greene et al.'s (2001) imaging study, in which the authors imply that impersonal, as opposed to personal, moral dilemmas do not involve the emotions. But the reported results tell otherwise, showing that, as compared to nonmoral dilemmas, moral dilemmas are reliably more emotional. Moreover, the scenarios in Greene's study pit helping against harming, and what varies is the perceived moral gravity of the harm; thus helping is a common denominator of all vignettes, and emotions associated with the moral injunction to help are therefore subtracted out of the analysis. A similar failure to take helping emotions into consideration plagues research on patients with ventromedial prefrontal brain injuries (Koenigs et al., 2007); these patients tend to advocate
helping, even when they must cause serious harm, but that may suggest that their appetitive emotional drives cannot be modulated by negative emotions. In other words, there is no evidence from brain science showing moral judgments without emotions, and a sizable body of evidence showing the opposite. Opponents of sentimentalism might concede that emotions are normally present when people make moral judgments, while insisting that these emotions are the effects of those judgments, rather than constituents or causes. Brain scans are correlational, so they can't speak to efficacy. Surveys do not solve this problem because they test for associations between emotions and moral judgment, rather than causal relations. That is the third way in which such results are limited. To show that emotions are doing moral work, we need to turn to behavioral studies in which emotions are manipulated. There is now a considerable body of research which speaks to this question. Wheatley and Haidt (2005) inflated assessments of moral wrongness by hypnotically inducing disgust; Schnall et al. (2008) reported a similar inflation of wrongness after inducing disgust through filth, foul smells, autobiographical recall, and disgusting film clips; Eskine et al. (2011) repeated this pattern using bitter beverages. These studies do not directly test the hypothesis put forward by Rozin et al. (1999), which says that disgust tends to arise in response to crimes against nature. The authors of these disgust induction studies use some vignettes that describe acts we would consider unnatural (such as a person eating his pet dog). To test whether the Rozin model is true, Seidel and I (submitted) induced both disgust and anger (using icky and irritating sounds) and tested their impact on moral vignettes involving crimes against nature and crimes against persons. We found a selective effect of disgust on the former and of anger on the latter. Work of this kind suggests that people use emotions as information when making moral judgments, and that different emotions carry different information. The simplest explanation of this fact is that moral judgments simply are emotional states: feelings such as anger or disgust directed toward an action. Call this the constitution theory. It is precisely the kind of theory that the Scottish philosophers traditionally defended. If the constitution theory is right, we report moral judgments by introspecting how we feel. Extraneously introduced emotions can lead us to think that an action is more wrong than we would think under better conditions of assessment. It might sound puzzling to say that a judgment can be an emotional state. We tend to contrast thoughts and feelings. But the puzzlement goes away when we realize that there is actually a large class of judgments that are plausibly analyzed in terms of emotions. These are judgments of taste. When I taste a wine and report that it is delicious, it is plausible to suppose that my report expresses a positive emotional reaction to the wine; that reaction constitutes my assessment, or judgment, that the wine is good. Similarly, I might report that I find an activity fun based on the enjoyment I take in it, or that a painting is attractive based on aesthetic emotions. Judgments of taste seem to be inextricably linked to emotions, and it is perfectly intelligible to suppose that they are constituted by emotional states. The constitution theory of moral judgment draws an analogy between moral judgments and judgments of taste.

The constitution theory makes two important predictions. One is that emotions should be sufficient for moral judgments even in the absence of reasons or other cognitive states. This prediction gains some support from Murphy et al.'s (2000) work on moral dumbfounding, in which participants condemn consensual incest and harmless cannibalism even when they cannot provide arguments in favor of those judgments. In another unpublished study, I presented people with a vignette describing a case of harmless child molestation, and they still regarded it as seriously wrong. Further evidence comes from some emotion induction studies. For example, in their hypnosis study, Wheatley and Haidt found that hypnotically induced disgust makes people form slightly negative moral judgments about individuals who are judged to be morally exemplary in a control condition. Evidence of this kind suggests that there can be moral judgments in the absence of reasons, and that, in turn, supports the conjecture that moral judgments are nothing but emotions directed at actions (or individuals, and so on). This does not entail that reasoning is idle in the moral domain. The defender of the constitution theory can postulate a difference between basic values, which are constituted by emotional states, and applications of those values, which may require reasoning. For example, if I have a disgust response to injustice, I may need to reason extensively to determine whether international trade policies are unjust. Emotions provide a moral foundation, and reason extrapolates from that foundation to novel cases. Another prediction of the constitution theory is that moral judgments will not be possible in the absence of emotions. This is hard to demonstrate experimentally because there are no individuals who lack emotions entirely while retaining the ability to make verbal reports. There is some evidence, however. Psychopaths, who show diminished emotional responses (especially fear and sadness), tend to score low on judgments of moral competence (Blair, 1995). People with Huntington's disease, who have diminished disgust, are known to engage in behavior deemed sexually inappropriate (Schmidt & Bonelli, 2008). Experimental manipulations that reduce emotionality would come in handy here. One example is a study by Schnall et al. (2011) in which participants who were allowed to clean their hands, hence reducing disgust, gave less harsh moral judgments than a comparison group. None of this evidence is demonstrative, but it is certainly suggestive. Again and again, predictions of sentimentalism are borne out. Emotions are associated with moral judgment in folk theories, they occur with moral judgments, people introspect emotions to report their judgments, and emotional deficiencies are comorbid with impairments in moral sensitivity. The simplest explanation is that emotions are constituents of moral judgments. This is precisely what the Scottish moralists, such as Hume, maintained. Importantly, the results summarized here are not what some of Hume's contemporary heirs would have predicted. Contemporary sentimentalists tend to say that moral judgments are judgments about emotions, rather than judgments comprising emotions (e.g., McDowell, 1985; Gibbard, 1990). On this approach, to believe that something is morally bad or good is to believe that disapprobation or approbation is warranted. D'Arms and Jacobson (2000) call this neo-
sentimentalism. Neo-sentimentalists say that one can make moral judgments without the corresponding emotions, because moral judgments merely mention emotions rather than using them. The empirical evidence suggests, on the contrary, that emotions are parts of moral judgments. There is also empirical evidence that people who cannot form beliefs about whether emotions are warranted, such as children and individuals with autism, can nevertheless make moral judgments (Nichols, 2008). Neo-sentimentalism is motivated by the view that we can second-guess our emotions. The bad is not that which happens to cause disapprobation in us, but that which should. Empirically, the account predicts that people have such beliefs whenever they make moral judgments. This is a testable hypothesis, and I'd bet on a negative outcome. This is not to say that we never form beliefs about our moral emotions. For instance, mature, normally developing adults come to recognize that our moral emotions sometimes conflict. When conflicts arise, we often believe that one emotional response should be superseded by another. In these cases, we do form beliefs about our emotions, and such beliefs may play a role in conflict resolution. For example, a person who has been raised in a homophobic society may be disgusted by public displays of affection between members of the same sex, while also judging that homosexuality should be tolerated. The latter judgment is itself underwritten by emotions: such a person might be outraged by laws discriminating against homosexuals. So now there is a conflict between disgust and outrage. A person who is emotionally conflicted in this way may choose to ignore the disgust response, because the norms against discrimination are more strongly felt, more integrated with other values, and more consistent with how this person wants to be perceived by others. Notice that this kind of emotional second-guessing presupposes that first-order moral judgments are constituted by emotions. Contrary to neo-sentimentalism, second-guessing does not show that moral judgments mention emotions rather than using them; instead it shows that mature moralizers can form beliefs about their moral judgments, which are emotionally based. In the present example, I think we should say that the person raised in a homophobic society forms a knee-jerk emotional judgment that homosexuality is wrong, and then suppresses it in light of an emotional judgment that it is wrong to discriminate. Second-order judgments may enter into this process, but they are not moral judgments themselves; rather, they are judgments about which morals are more coherent, central, self-expressive, and so on. In conclusion, I think the constitution theory best explains the extant empirical evidence about the nature of moral judgments. This theory is comparable to the traditional sentimentalism of the Scottish moralists, and contrasts with that of some of their contemporary followers.

3.2 Metaethics

The foregoing conclusion belongs to the domain of moral psychology. Sentimentalism is a theory of how we make moral judgments. So stated, it is neutral about what, if anything, those moral judgments represent. Hutcheson
thought our moral emotions tracked objective moral truths handed down by divine command. Hume thought our emotions project values onto the world. Twentieth-century authors pushed this Humean precept in three directions: error theorists say that moral judgments aim to represent objective facts and fail, rendering them all false; sensibility theorists say that moral judgments represent mind-dependent properties; and expressivists say that moral judgments don't aim to represent at all. Is there a way to empirically adjudicate between these theories? Consider objectivism first. Objectivist sentimentalists are committed to the claim that there are mind-independent moral facts and that our emotions represent those facts. Testing for the existence of moral facts certainly seems like the kind of thing we should be able to do empirically. Consider, for example, the divine command theory. The plausibility of this theory depends ultimately on the existence of an intelligent and benevolent god. Armchair arguments have been brought to bear on that question, but they are notoriously inconclusive. Most contemporary atheists, and many theists, think that the existence of God is an empirical question, perhaps a question that hangs on an inference to the best explanation. The perceived adequacy of scientific explanation spawned the atheist revolution, and modern theists appeal increasingly to arguments from design. I take this debate to be settled in favor of atheism, or at least settled to the same degree as textbook science. We can account for human existence and the events that take place in the universe without postulating an omniscient God, and arguments from intelligent design have been rejected by the vast majority of mainstream biologists. To that extent, the Hutchesonian route to realism is empirically unpromising. There is another problem with Hutchesonian realism that infects its contemporary godless analogues. The claim that our emotions represent moral facts requires a semantic theory that would get those emotions to refer in the right way. These days, the dominant semantic theories are causal: a mental state with semantic content refers to the property that reliably causes that mental state to be tokened (the details of such accounts don't matter here). If this approach is right, and emotions are the bearers of semantic content in the moral domain, then moral emotions refer to whatever reliably causes them to be tokened. Objectivists must show that moral emotions occur in response to real moral facts. The problem here is that people's moral responses vary considerably. It can be empirically established that there is massive variation in the moral domain (Prinz, 2007). People in different cultures have different moral responses, and, even within a culture, the range of things that cause moral emotions is quite varied. There is no obvious common denominator between incest, cannibalism, mutilation, and bestiality, even though they all cause disgust. There is also no obvious common denominator between inequality, murder, theft, and dishonesty, even though they all cause anger. One might think that all moral wrongs violate autonomy, but some people moralize consensual sex acts (sibling incest) and consensual killing (euthanasia and suicide). We might think that all moral wrongs involve harms, but some people moralize victimless crimes (flag burning and sex with dead animals), and many harmful things (high-fat diets and exploitation of workers) are often regarded as permissible.
Philosophers have tried to find a moral common
denominator, but no extant proposal has identified any mind-external property that reliably correlates with moral judgments. This makes it unlikely that moral judgments refer to objective facts. More importantly, we can fully explain the causes of our moral judgments without recourse to objective moral facts. Here, empirical methods can be used to play a debunking role. If we can explain our judgments without appealing to moral facts, then we have no need to posit such facts. Toward this end, I think the most promising tack is Nietzschean. That is to say, I think we can try to account for moral judgments by appeal to cultural history. Variation in sexual morality, attitudes towards slavery, and attitudes towards torture have been investigated by social scientists, and detailed stories are available to explain every moral norm that has been investigated (see examples in Prinz, 2007). This suggests that morals are, as Hume said, human projections. They are culturally inculcated through emotional conditioning, rather than responses to any measurable mind-independent facts. This leaves us with three options: expressivism, the error theory, or the sensibility theory. Adjudicating between these options can once again make use of empirical findings. Let's begin with expressivism, which says that moral judgments express feelings rather than stating facts (e.g., Gibbard, 1990; Blackburn, 1998). It has long been argued against expressivism that moral language is assertoric in nature, contrary to the predictions of that approach. Empirical work could be used to extend this case, showing that the same pattern is true across languages. I take the linguistic evidence to be decisive against expressivism, and the philosophical efforts to explain away moral language (e.g., by positing alternative logics for moral discourse) strike me as just the kind of stubborn immunity to empirical evidence that we should avoid. Next consider the error theory, which says that all of our moral judgments are false. The standard argument for this view goes as follows (cf. Mackie, 1977). It is a conceptual truth that moral judgments refer to objective facts if they refer at all; there are no objective moral facts; therefore moral judgments don't refer and are false. As I implied in discussing moral variation, I think the second premise is empirically plausible. But what about the first? Are people conceptually committed to moral objectivism? A recent survey study found that only about half the people in a North American sample say they believe in objective moral truths (Goodwin & Darley, 2008). For the rest of us, the argument for the error theory cuts no ice. What about the 50% who think morality is objective? Do their moral judgments fail to refer? This would be the case if reference depended on something like belief satisfaction: those people believe morality is objective, and there are no objective moral facts. But contemporary theories of reference reject the premise that reference depends on belief satisfaction. Many of our core beliefs about familiar categories are false. Take the widespread belief that gorillas are ferocious. This turns out to be false, but it doesn't imply that there are no gorillas. Suppose that half of Americans think that Obama is Muslim; it doesn't follow that their uses of the name Obama fail to refer. On contemporary semantic theories, concepts refer by their causal relations, not by accompanying beliefs. The beliefs we have can be
changed without affecting reference. Moral objectivism may be like this: it's just a contingent belief. In fact, Goodwin and Darley found that objectivism correlates with religiosity, suggesting that it is a belief that people hold because of religious training, and they might be willing to give up objectivism if they came to believe that their religious views are mistaken. Objectivism may be negotiable. To establish negotiability empirically, it would be good to show that people are willing to accept a subjectivist moral theory when confronted with evidence. Pursuant to this end, I ran a study in which I asked people whether morals would refer to response-dependent properties if science could establish that there is no objective essence. I compared three morals to classic response-dependent concepts (red and funny) and to natural kind concepts (beetle and tuberculosis). When confronted with the possibility that there is no unifying essence, my participants treated red and funny as response-dependent, but they went error-theoretic with the natural kinds. The moral concepts followed the former pattern. People overwhelmingly say that morals are response-dependent, rather than non-existent, if told there is no objective essence. I think the best conclusion to draw is that the sensibility theory is true. There are no objective moral facts, and moral judgments represent response-dependent properties: the property of being disposed to cause certain emotional responses in the moral judge. Some sensibility theorists argue that these responses are universal (McDowell, 1985), but I think they are socially inculcated. When people realize that there are no moral facts, they are happy to admit that moral judgments refer to subjective properties, as the sensibility theory asserts. Given the great variety in subjective responses, this would also entail that morality is relative. Moral facts are subjective, and subjective values vary. Thus, there is empirical support for two core metaethical theories: subjectivism and relativism. Further empirical work can be used to distinguish between different subjectivist and relativist theories.

3.3 Normative Ethics

Let me turn finally, and briefly, to the most contested domain for the moral naturalist: normative ethics. Those who fiercely resist an empirical approach to morality usually do so because they think science can only reveal descriptive facts, and morality is fundamentally normative. I think this is a mistake. Empirical methods can bear directly on normative questions. Two examples follow directly from the lessons of the last section. If relativism is true, then some of our sociopolitical pursuits are based on a mistake. We sometimes try to impose values on others, thinking that our values are universal. Relativism demands a different approach to policy; we must either embrace evaluative imperialism or find ways to let a plurality of values flourish. Against this, the normative absolutist might protest that the earlier considerations about moral variation establish only descriptive relativism, not normative relativism. It may be conceded that people's moral judgments do not converge on a single set of values, but that doesn't entail that no set of values is better than the others. Perhaps there is a single morality that should be accepted above
all others. Normative absolutists might use this point to stress the limitations of an ethics based on empirical, and hence descriptive, information. But how are we to decide which values are best? That itself is a normative question, and normative absolutists might insist that no descriptive facts can settle it. What, then, is the alternative? Normative absolutists will typically say that we settle normative questions by normative inquiry, which involves evaluative argumentation. That is to say, we can criticize extant value systems by bringing moral objections to bear against them. For example, we might argue that those who endorse the permissibility of slavery are being cruel or unfair, and therefore they should revise their values. This might persuade the supporter of slavery, but only if she agrees that cruelty and unfairness are always wrong, and that their wrongness cannot be outweighed by other factors. But suppose we encounter a supporter of slavery who is unmoved by these arguments: someone who thinks cruelty is fine, because might makes right. Against such a moral monster, our arguments would be useless. They would fail to persuade. At this point the normative absolutist might try to argue that the arguments should persuade because the premises on which they are based are absolutely true. The slavery supporter would simply deny this. To escape the impasse, the normative absolutist would have to show that there is a source of normativity that is binding even on those who have not internalized a given norm. Some of the great traditions in normative ethics seek to do this, and I cannot evaluate their success here, but I want to offer a brief remark on the most famous effort to ground normativity on a foundation other than sentiment: Kantian ethics. Kantians will say that the source of normativity lies in our very nature as autonomous agents. To see ourselves as acting freely, we must be responsive to reason, and reason tells us that we should act in ways that can be treated as general principles or maxims, which would bind us on all occasions and on others. If we act in a way that cannot be universalized, then we are acting unreasonably, and doing so undermines our ability to function as autonomous agents; we become slaves to passion rather than choice. A Kantian will say that the slavery supporter fails on this requirement and therefore lacks autonomy, and should change her values. For instance, the decision to enslave when universalized makes it impossible to enslave (slaves can't enslave), resulting in a practical contradiction. But, on this rendering, the Kantian move can be resisted in at least three ways. First, the slavery supporter can willingly give up her autonomy, at which point it's not clear how the Kantian can argue that she should act autonomously or view herself as autonomous, since autonomy is supposed by Kantians to be a foundation for normativity, not itself in need of further normative support. Notice, too, that autonomy is probably descriptively false about human beings: human decision making is largely driven by unconscious factors that we cannot control, so the opponent of the norm that we should act autonomously can argue that this violates the stricture that ought-implies-can. Second, one can resist the move from autonomy to universalization. Let's suppose that acting freely is a matter of acting from reason; reason does not tell us that all people are equal, but rather that we differ in abilities. So reason might dictate that we should each do as we can, not
that we should each do the same. Reason tells the mighty that they can dominate others. Third, the supporter of slavery can reject the conclusion that slavery fails a universalization test. It may be true that not everyone could be a slave, but everyone could try to enslave others without practical contradiction. This is perfectly consistent with a might-makes-right morality. Indeed, such arguments have been used to justify capitalism with an almost Kantian ring: we all try to advance ourselves, and, in so doing, some will inevitably end up subordinate to others, but that's okay because we can coherently will that the pursuit of capital be a universal law. These replies to the Kantian are not meant as decisive. They merely illustrate that a celebrated effort to find a universal foundation for normativity may not guarantee the outcome desired by its adherents. There is also a sense in which the Kantian project is ill-suited as a remedy for relativism. It does not deliver a universal moral code. Rather, it is a procedural ethics, which tells us how to decide what to do, and those decisions may depend on facts that differ across individuals. For example, Kantians say we should cultivate talents, but talents vary, so this universal norm must be implemented through relative norms. Kantian procedures may tell us not to harm others, but what counts as a harm may depend again on individual preferences, and Kantian arguments against harm may be undermined in cases where the victims consent. A Kantian might try to get around this last problem by distinguishing autonomous consent and coerced assent, but this distinction brings the Kantian back onto empirical thin ice. If human preferences are generally based on unconscious processes, and these are shaped by inculcation and social influence, then there may be a sense in which no action passes the test for autonomy. That would result in the prohibition of every action. Kant's moral philosophy faces an empirical dilemma, then. It either exaggerates the degree to which humans are in fact free, or relies instead on a merely psychological idea of freedom, which invites relativism, in so far as what people see themselves as freely consenting to varies from place to place. We have also seen that the Kantian foundation for morality cannot obviate the need for sentiments. Recall the slavery supporter who says she doesn't care about autonomy. Even if norms of autonomy are rationally binding, they may be motivationally inert. One might express this by saying that such norms may be true of me, but they do not become norms for me until I embrace them. Thus, Kantian normativity ends up looking like an elaborate pun. When we engage in moral deliberation, we restrict ourselves to norms that we embrace. Other norms are not norms in the same sense. They have no practical impact until they are embraced. So the notion of norm identified by Kant is different from the one used in moral thought. Kant has simply changed the topic. Thus, the Kantian program cannot undermine the subjectivism found in sentimentalist theories, because subjectivists are making the descriptive claim that ordinary moral vocabulary expresses subjective states. There may be other normative vocabulary that does not express such states, but it is not the vocabulary we use in daily moralizing. This brings us to a second way in which empirical findings may bear on normative ethics. I have argued for both relativism and subjectivism. We have seen
that relativism may have normative consequences (efforts to impose values on others are often based on a mistake). Subjectivism has normative consequences as well. More precisely, subjectivism provides a recipe for arriving at normative truths. If morality is subjective, then true moral claims are those that reflect the values of the makers of those claims. It follows from this that I am morally obligated to do whatever my values demand of me. To figure out what I ought to do, then, I need to figure out what my values are. I might do that by introspection, but introspection is a fallible guide. Some of my values may not be readily accessible. Empirical findings may prove helpful in discovering these values, and in leading to actions that are right, subjectively speaking. By comparison, consider empirical research on subjective well-being. People are routinely mistaken about what makes them happy. We are bad at affective forecasting, and we inflate the value of things such as wealth in making us happy. Empirical evidence can reveal subjective facts about what makes us happy. Likewise, empirical research could be designed to investigate what really irks us morally. Against this, the opponent of empirical methods might say, it doesn't matter what we actually morally value; what matters is what we should morally value. So stated, this begs the question against the subjectivist, and assumes that there are objective moral values. There may be objective norms, but these are not norms for me until I have embraced them, and embracing them is a subjective act. But the subjectivist can allow that we sometimes adopt norms that we do not currently embrace. One can criticize one's own values from within, by pitting extant values against each other or by thinking about what one would like to get out of morality and adjusting one's values accordingly. Thus, subjectivism does not collapse into the view that we are required to do what we currently want. It might be elaborated to encompass norms that we would want after we underwent evaluative revision from within. The norms that survive internal critique are ours insofar as we derive them from current values, but they include norms that we do not yet embrace. Internal critique can be informed by empirical research. For instance, think about how Nietzsche criticizes Christian morality. He first diagnoses the historical origins of these values and the psychological attitudes that maintain them; he then points out that these historical and psychological facts are repellent to the very people who cherish Christian values. Thus, he confronts readers with a kind of evaluative hypocrisy, and that dissonance spurs a moral change. This is a normative outcome (an injunction to revise morality) driven by empirical claims about the historical and psychological foundations of morality. In pursuing moral reform, one can think about what one wants out of a moral system. This is an internal question, a question about what we value, but not necessarily a moral question. We might want values that make our lives safer and happier, even if we don't regard this as a moral demand. Still, given that preference, one might want to inculcate values in one's self and others that are conducive to this non-moral end. This again is an empirical question. If we want to know what values will increase our safety and happiness, we need to investigate
the causes of happiness and the social arrangements that increase security. The point is not simply that we need empirical methods to implement values; everyone would agree to that. Rather, we can use empirical methods to reshape values, by pitting one value against another. Empirical methods can, in that way, help us figure out what we should value, not just what we do value currently. This, of course, can only be done against the background of values that we already possess. I think we can never derive moral values from a transcendental perspective as some normative ethicists would like. Even Kantian norms become moral only to the extent that we can persuade someone to embrace them, and such persuasion must be done by appeal to values that are already embraced. Morality, like science, must be refined from within a set of prior convictions. If this is right, then empirical methods can help us figure out what we should do by uncovering what we value, and what we should value, by pitting one value against another. Once we see that moral norms are subjective, the divide between the normative and the descriptive breaks down. Of course, norms cannot be logically derived from descriptive premises, as Hume taught us. Norms are emotional dispositions and no description entails an emotion. At best, descriptive facts can entail that we should have certain emotions given the emotions we already have, and this discovery can lead us to alter our emotions. Emotional alteration works by means of emotions; we use one attitude to condition another. The homophobia case offers an illustration. Someone inculcated to feel disgust at the sight of same-sex lovers may come to think that this reaction is bad given other emotionally grounded values, such as the belief that consenting adults should be allowed to do what they like. The latter norm can be brought to bear against the disgust reaction, by focusing on the inconsistency. When the homophobe sees homophobia as bad, the disgust response will come to cause shame, and that shame may work to suppress the disgust. This change would not itself be an empirical discovery, but empirical methods could bring out the initial tension in values. The autonomy of the normative consists in nothing more than Hume's law, and Hume's law is consistent with the claim that what our moral values are, hence what they demand of us, is a fact about our psychology. Such facts can be known only through empirical and introspective observation. I conclude that every area of ethics can be empirically illuminated, and there may be no residue that resists such inquiry. To pursue the study of ethics without the tools of psychology, neuroscience, and social history would retard progress. And, as we have seen here, the science of morals can be guided by centuries of philosophical reflection. Turf wars and debates about methodology distract us from the goal of figuring out what values are and what they demand.
REFERENCES
Blackburn, S. (1998). Ruling passions. Oxford: Oxford University Press.
Blair, R. (1995). A cognitive developmental approach to morality: Investigating the psychopath. Cognition, 57, 1-29.
Borg, J., Hynes, C., van Horn, J., Grafton, S., & Sinnott-Armstrong, W. (2006). Consequences, action, and intention as factors in moral judgments: An fMRI investigation. Journal of Cognitive Neuroscience, 18, 803-817.
D'Arms, J., & Jacobson, D. (2000). Sentiment and value. Ethics, 110, 722-748.
Eskine, J. K., Kacinik, A. N., & Prinz, J. J. (2011). A bad taste in the mouth: Gustatory disgust influences moral judgment. Psychological Science, 22, 295-299.
Gibbard, A. (1990). Wise choices, apt feelings. Cambridge, MA: Harvard University Press.
Goodwin, G. P., & Darley, J. M. (2008). The psychology of meta-ethics: Exploring objectivism. Cognition, 106, 1339-1366.
Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M., & Cohen, J. D. (2001). An fMRI investigation of emotional engagement in moral judgment. Science, 293, 2105-2108.
Harenski, C. N., & Hamann, S. (2006). Neural correlates of regulating negative emotions related to moral violations. Neuroimage, 313-324.
Heekeren, H. R., Wartenburger, I., Schmidt, H., Schwintowski, H. P., & Villringer, A. (2003). An fMRI study of simple ethical decision-making. Neuroreport, 14, 1215-1219.
Koenigs, M., Young, L., Adolphs, R., Tranel, D., Cushman, F., Hauser, M., & Damasio, A. (2007). Damage to the prefrontal cortex increases utilitarian moral judgments. Nature, 446, 908-911.
Mackie, J. L. (1977). Ethics: Inventing right and wrong. London: Penguin.
McDowell, J. (1985). Values and secondary qualities. In T. Honderich (Ed.), Morality and objectivity. London: Routledge & Kegan Paul.
Moll, J., de Oliveira-Souza, R., & Eslinger, P. J. (2003). Morals and the human brain: A working model. Neuroreport, 14, 299-305.
Murphy, S., Haidt, J., & Björklund, F. (2000). Moral dumbfounding: When intuition finds no reason. Unpublished manuscript, Department of Philosophy, University of Virginia.
Nichols, S. (2008). Sentimentalism naturalized. In W. Sinnott-Armstrong (Ed.), Moral psychology: The cognitive science of morality: Intuition and diversity (pp. 255-274). Cambridge, MA: MIT Press.
Prinz, J. J. (2002). Furnishing the mind: Concepts and their perceptual basis. Cambridge, MA: MIT Press.
Prinz, J. J. (2007). The emotional construction of morals. Oxford: Oxford University Press.
Prinz, J. J. (2008). Empirical philosophy and experimental philosophy. In J. Knobe & S. Nichols (Eds.), Experimental philosophy. New York, NY: Oxford University Press.
Prinz, J. J., & Seidel, A. (submitted). Music and morals: Irritating and icky sounds have contrasting effects on moral judgment.
Rozin, P., Lowery, L., Imada, S., & Haidt, J. (1999). The CAD triad hypothesis: A mapping between three moral emotions (contempt, anger, disgust) and three moral codes (community, autonomy, divinity). Journal of Personality & Social Psychology, 76, 574-586.
Schmidt, E. Z., & Bonelli, R. M. (2008). Sexuality in Huntington's disease. Wiener Medizinische Wochenschrift, 158, 84-90.
Schnall, S. (2011). Clean, proper and tidy are more than the absence of dirty, disgusting and wrong. Emotion Review, 3, 264-266.
Schnall, S., Haidt, J., Clore, G., & Jordan, A. (2008). Disgust as embodied moral judgment. Personality and Social Psychology Bulletin, 34, 1096-1109.
Wheatley, T., & Haidt, J. (2005). Hypnotic disgust makes moral judgments more severe. Psychological Science, 16, 780-784.

ALBERTO CORDERO

NATURALISM AND SCIENTIFIC REALISM*

1. INTRODUCTION

Little consensus exists in the literature about what either naturalism or scientific realism amounts to. The naturalist perspective considered in this paper takes philosophy and empirical science as continuous intellectual endeavors, united by mutual integration rather than reduction. To the extent that contemporary natural science takes all claims about the world to be synthetic, naturalism denies properly a priori knowledge. I agree with Sklar (2010) that fundamental branches of science typically comprise interpretive programs that are distinctly philosophical. Hence my contrast camp to naturalism cannot be philosophy as such but radically antiempiricist philosophy (e.g. rationalism, transcendentalism and mysticism). Naturalist positions vary depending on the degree of scientific empiricism they involve. Views that emphasize radical empiricist moves in science reject scientific realism, whereas those that emphasize moderate empiricist interpretations often favor realism. Also in need of clarification is the term realism in connection with scientific hypotheses (scientific realism). Here I take as central van Fraassen's (1980) contrast between claims about observable and about unobservable aspects of the world. To antirealists, acceptance of a theory involves only the belief that it accurately describes phenomena within the reach of unaided human perception. By contrast, contemporary realists claim both that there is a world external to and independent of the mind, and that through experience, reason, imagination, and criticism, it is possible to obtain knowledge of that world, including aspects beyond the reach of unaided human perception.i I will use the much reviled term noumenon to refer to entities and processes not accessed through the unaided senses (e.g. microbes, protein-folding, atoms, nuclear reactions, quarks, tunnel effects). This conception of the noumenon disregards idealist gobbledygook (e.g. how things appear when they do not appear). Realism simply does not grant epistemic primacy to appearance. The term naturalist realism will apply to any realist position inductively argued for from history and current knowledge about the character and cognitive achievements of science. Naturalist realist projects take empirical success as a marker of truth, the latter's assertability limited according to the degree of empiricism embraced. These projects argue the way scientific hypotheses are argued for in the natural sciences, i.e. especially in terms of agreement with data, consistency, and risky predictions. As we shall see in the following sections, naturalist and realist proposals present various difficulties. Vicious circularity
reportedly mars attempts to justify naturalism, while realist proposals face objections concerning the limits of realist interpretation, the notion of approximate truth, and the supposed advantages of realism over more modest interpretations of scientific success. Scientific realism makes for an attractive option for those who find global (metaphysical) skepticism excessive and agree that science exhibits instrumental progress that cries out for explanation. Some naturalists, notably Michael Devitt (1999, p. 96), think realism becomes an irresistible option if one starts with an empirical metaphysics. Not all naturalists agree, however. Naturalist realism is a vibrant but difficult project; to see why, it will be useful to begin with some history concerning the contending options. That is the subject of sections 2 through 5. Sections 6 through 11 discuss ongoing naturalist projects to make realism irresistible. I start the historical part with William Whewell's realism, a position full of an optimism that ended with the breakdown of Newtonian mechanics in the early 1900s.
2. SOME HISTORICAL BACKGROUND

Empirical science, argued Whewell, progresses epistemically because the theories it offers pass stringent tests before scientists take them seriously. Whewell's emphasis was on predictive power, consilience, and coherence. Successful predictions of unknown facts, he stressed, provide greater confirmatory value than explanations of already-known facts (1858, Aphorism XII). No less significant was consilience, which occurs when an induction from the colligation of one class of facts also colligates facts belonging to another class (e.g. Newton's multiple inductions pointing to the existence of an inverse-square attractive force as the cause of different classes of phenomena). Cases of consilience impress us with a conviction that the truth of our hypothesis is certain, Whewell thought (1858/1968, pp. 87-88). What he termed coherence occurs when extending a hypothesis to a new class of phenomena can proceed without introducing ad hoc modifications, as when Newton extended his theory of gravitation to tidal activity. These lines of testing did the required job, Whewell declared: No example can be pointed out, in the whole history of science, so far as I am aware, in which this Consilience of Inductions has given testimony in favour of an hypothesis afterwards discovered to be false. [W]hen the hypothesis, of itself and without adjustment for the purpose, gives us the rule and reason of a class of facts not contemplated in its construction, we have a criterion of reality, which has never yet been produced in favour of falsehood. (1847: 67-68) In the 19th century scientists overwhelmingly agreed, not least Charles Darwin, who labored to pass at least the tests of consilience and coherence.ii As the century entered its last quarter, Newtonian mechanics and gravitational theory, classical electromagnetism (including Maxwell's theory of the ether), and Lavoisier's chemistry, all passed Whewell's criteria of success with flying colors and were considered beyond reasonable doubt. Nothing, it seemed, could cast doubt on the
scientific picture they yielded. But this confidence in the epistemic reach of science, along with Whewell's view of progress, was about to be shattered by the profound conceptual revisions that took place in physics in the early decades of the 20th century. Newtonian mechanics, it turned out, erred dramatically about the nature of mass, gravitation, matter, causality, separability, identity, and more. Confidence in the power to infer noumenal truth from empirical success plummeted, the scientific imagination now viewed as too fertile for realism to remain a plausible project. Reactions involving strong empiricist purges became widespread in theoretical physics. They prompted a group of thinkers to articulate Logical Empiricism, a radical philosophy that privileged information from the unaided senses and distrusted abductive inference and theoretical speculation generally. To these thinkers, empirical success marked instrumental progress but not any advance of theoretical knowledge. To their credit, the logical empiricists exercised a remarkably critical attitude towards their own claims. By the late 1950s, most had come to acknowledge that their image of science was both a logical impossibility and a historical falsehood. It seemed impossible to have completely theory-independent observation reports, on pain of their not being relevant to any theory. The positivist rationale for restricting ampliative inference to just observable levels collapsed. The end of the positivist program is well known.iii Seizing on underdetermination and the problematic character of observation, Thomas S. Kuhn's The Structure of Scientific Revolutions (1962) extended to observation the distrust that radical empiricists had reserved for theory. Kuhn presented observation as something so theory-dependent as to claim that scientists adhering to different general theories live in different worlds (1962, pp. 116-118). In his view, theories and their successors lack a common neutral perspective from which to compare their respective merits (they are incommensurable). Kuhn's critique of traditional objectivity encompasses a holistic view according to which, if Einstein's mechanics is true then Newtonian (speed-independent) mass does not exist and Newtonian mass represents nothing. From this perspective, scientific descriptions refer to mere fictions and, therefore, the real world cannot (and should not) be the concern of science. Kuhn came to distance himself from this extreme relativist version of his views,iv admitting for example that scientific theory choice follows shared, informal criteria centered on accuracy, consistency, scope, simplicity, and fruitfulness.v Kuhn, however, never endorsed scientific realism, which he identified with the notion of theory-convergence on the truth. In the second edition of The Structure, he explicitly dismissed realist hopes as incoherent (1970, p. 206). On the other hand, Kuhn made some naturalist moves. From page one of The Structure, he gives center stage to history and psychology, much as naturalists recommend. He presents scientific thought as a process sustained by relations of analogy and perceived similarity. This he supplements with an antirealist interpretation of scientific progress that Kuhn links to an argument focused on Darwin's realization that a species becoming more successful attests to local improvement, without any implication that the species in question is evolving
towards some ideal form (1962, pp. 170-173). Likewise, Kuhn proposed, theories change and improve as scientists encounter recalcitrant facts and conceptual conundrums, but this progress has nothing to do with how close a current proposal is to an ideal true theory, only with how well scientists solve the pressing problems at hand. Kuhn agreed that theories improve in predictive power, scope and fertility, but in this he saw no sign that they get objectively more truthful, because, he stressed, at each stage the relevant assessments are internal to the theory and thus rendered merely local by incommensurability. A few years later we find Willard V.O. Quine (1969) promoting the naturalist view that there is no higher tribunal for truth than natural science or a better method than the scientific method for assessing the claims of science. Nor, he warned, is there any need for abstract metaphysics or epistemology (first philosophy in his jargon) to justify science. Furthermore, he urged, the success of naturalism in science convinces one that scientific methods should also be used in philosophy. However, his critique of Logical Empiricism emphasized the corrosive import of empirical underdetermination and holistic theory-dependence on scientific observation. The naturalism Quine advocated was thus too mortgaged to strong empiricism to help realism. Objectivist reactions to the above developments were strong. At their center were many of the moves associated with naturalism in contemporary philosophy of science. They are the subject of the next three sections.
3. TWO REALIST REACTIONS

By the mid-1960s, nearly everybody agreed that observation and scientific method depend on theories, perspectives, and historical and cognitive circumstances. However, few accepted Kuhn's radical relativist proposals. To many critics, philosophical analysis, history and science teach that observation is theory-laden, but they equally teach that experience and observation are underdetermined by theory, as attested by how often experience disappoints theories. In the realist camp, Dudley Shapere (1964) traced Kuhn's relativism to a naive logical-deductive view of scientific rationality compounded by hasty sociologism. In this and subsequent works, Shapere grants central importance to the study of history, but his project (unlike Kuhn's) leans towards reason-based naturalist realism. One of Shapere's goals is to reveal how reasons develop in science during revolutions and how scientific objectivity is compatible with deep theory-change. Shapere (1980) connects Kuhn's relativism to certain philosophical prejudices common in the study of science, particularly the project of understanding epistemic gains in metaphysical terms separate from the ongoing epistemology of science. Instead, he recommends, the explication of concepts should do justice to their actual use in scientific practice, a recommendation Shapere has applied in detail to the study of how observation functions in science. For example, Shapere (1982) explicates astrophysicists' claims that they observe the Sun, spelling out how, in this and other cases, observation reports rest on background information and reasoning based on prior learning about the world. To
Shapere, objectivity and rationality are inextricably bound with background beliefs licensed by science itself, but he argues that the quest for either is not thereby automatically vitiated (1984, p. 639). In science, he insists, metaphysics is not separate from epistemology: truth is tentatively attained when a scientific claim shows success by current standards and, for a reasonably long period, it remains free of specific reasons for doubting it (scientific doubts, as opposed to metaphysical doubts based on mere logical possibility). According to Shapere, looking at science in this naturalistic way enables us to understand what scientists count as their body of background truths on which current reasons rest; and how reasoning enables science to progress epistemically. From this perspective, the objectivity and rationality of today's science are not rooted in the pure given imagined by strong empiricists but are inextricably bound with background beliefs, drawn increasingly from modern science, a mode of knowing that, in Shapere's view, has epistemically upgraded forms of learning received from evolution and pre-scientific life. For Shapere's approach to support realism, however, some hurdles must be dealt with first. One is that the most empirically successful contemporary theories, far from being free of specific doubts, display significant conceptual tensions and mutual incompatibilities. Realists who try to keep metaphysics and epistemology separate follow a different path. Typically, they explain empirical success in terms of the approximate (non-epistemologized) truth of successful theories. Boyd (1984) follows this approach and presents realism as an inference to the best explanation, according to which empirical theories grow in success because they become more and more approximately true. This is a naturalist perspective in which epistemological and methodological hypotheses stand as empirical theories and must be evaluated accordingly. In Boyd's view, all the alternatives to realism fare worse. Empiricism and social constructivism, he argues, fail to explain the predictive success achieved by scientific theories. Critics, however, ask how truth can be the best explanation of scientific success, and even if it were, why we should expect this explanation to be true just because it is the best available or foreseeable explanation of science's success. These worries have complementary presentations in projects led by Bas van Fraassen and Larry Laudan, to which we must turn.
4. VAN FRAASSEN'S OBJECTIVISM

Van Fraassen's Constructive Empiricism too rejects Kuhn's radical antiprogressive view, but from an antirealist stance wary of noumenal explanations and committed to epistemological modesty. Theoretical explanations guide inquiry, van Fraassen admits, but he thinks they lack reliability. He recalls Newton's theory, in which the overall Universe has absolute velocity even though, by the theory's own lights, that feature is not an empirically accessible quantity. Constructive Empiricism hinges on arguments from underdetermination, logical modesty, and analogy.

Central to van Fraassen's constructivism are concerns about empirical underdetermination along two lines. One, promoted by Quine (1951), says that evidence in favor of a theory equally well supports indefinitely many other quite different theories. This general thesis, which many think poses a threat to realism and even the rationality of science, has never received cogent argument, except for trivial cases. Naturalists largely disallow it (e.g. Laudan & Leplin, 1991). Importantly, however, van Fraassen's leading exemplifications do not draw from Quine's (1975) general line but from a second set of considerations, focused on contingent underdetermination, defeasible in principle. This second kind of underdetermination often gives way to rankings in terms of compatibility with current knowledge and scientific methodology; also, changes in findings about the relevant domain and technology may resolve a current state of underdetermination. These considerations mitigate the predicament, but they leave untouched some significant issues. Scientific criteria for theory choice and technological advances reduce the number of offers on the table at a given time, but rarely to just one. Consider, for example, the multiple ontological interpretations that classical mechanics came to comprise (Jones, 1991). In one ontology, point-particles act upon each other at a distance; in another, action takes place only by contact; in a third ontology, the motion of bodies is determined by total trajectories between spatial points; in yet another, post-Einsteinian picture, space-time is both put at the center of the ontology and endowed with causal efficacy. What, then, might a realist about classical mechanics be realist about? Jones's paper invites a pessimistic general conclusion beyond the case of classical mechanics: theories, he suggests, fall prey to multiple empirically equivalent fundamental ontologies. Jones's conclusion is a blow to traditional realist intuitions, but arguably not a fatal one. In the case of classical mechanics (and arguably all cases of interest), the multiple ontological frameworks on view converge dramatically at intermediate levels of descriptive depth, leaving no underdetermination regarding vast portions of the theoretical picture. Such seems to be the situation of classical mechanics at some regimes of energy, speed and accuracy, regarding for example noumenal descriptions of ensembles of particles, the phases of matter, the laws and nature of heat, fluids, pressure waves, and so forth; also of celestial orbits under regimes of low energy and gravity and their laws; and numerous sorts of histories too. Seemingly this comment generalizes well to other scientific domains.vi The realist point is, then, that the effective underdetermination found in actual theories leaves a theoretical core suitable for realist interpretation. Scientists may have little prospect of articulating the ultimate ontology of the world, but they already have a detailed, far-reaching, highly textured (and credible) noumenal description of the natural world at various intermediate levels, well beyond those of ordinary, pre-theoretical descriptions, not to mention perceptual phenomena. Still, effective empirical underdetermination does compromise levels of theoretical description traditionally dear to realists; e.g. current physics suggests that we may never manage to decide whether physical reality is deterministic or indeterministic, or whether quantum worlds actually develop into worlds, among other questions.

Van Fraassen's empiricism is radical, however, advocating an agnostic stance towards everything a theory says about unobservables. A sensible aim for science, he thinks, is to produce theories that accurately describe the observable phenomena (empirical adequacy). This restriction seems bizarre and artificial to realists. Why should anyone believe that successful theories stop telling the truth beyond aspects of the world accessible to human perception? Again, however, while the empiricist boundary van Fraassen proposes seems unjustified, drawing a boundary is not, given the state of science. Effective underdetermination, while in place, sets limits to what naturalist realists can convincingly support. Also, more radical underdetermination may arguably constrain realism at some fundamental levels, for example concerning identity and individuality in quantum mechanics.vii Two other pillars of Constructive Empiricism are logical modesty and analogy. I'll comment on van Fraassen's appeal to analogy in Section 7. On modesty, he repeatedly states that it is safer to believe less; but without a reasonable criterion this maxim lacks guidance at best; at worst its equilibrium point is skepticism, an option van Fraassen has considered (Ladyman et al., 1997). Emphasis on empirical adequacy at the expense of realism might make sense if some vast credibility gap separated perception reports and inductions from science's most successful claims about noumenal posits. But no such vast gap seems in sight. Constructive empiricists hope to argue that successful ampliative inferences invoking unobservables are systematically less reliable than ampliative inferences involving only observables. However, both varieties of inference are fallible, and many involving unobservables are among the most credible we seemingly have. Scientists, in particular, do not understand empiricist calls for caution about claims they regard as well established as anything, e.g. that ice is made of angular molecules, with hydrogen atoms at the tips and an oxygen atom at the vertex, forming an angle of 104.45 degrees. On the other hand, the grass-roots realism scientists generally advocate is often overoptimistic.
5. NATURALIST ANTIREALISM

To one influential naturalist, Larry Laudan, the reliability attained by the sciences is no indication that theories unveil what there is beyond the observable level (Laudan 1981, 1984a). The historical record, he notes, is littered with explanations that, having gained empirical warrant, subsequently got discarded in favor of dramatically different stories about the furniture of the world. Even theories with strong predictive power have had this fate, he stresses. Laudan's examples include the theories of caloric, phlogiston, and ether-based light, all of which, he notes, failed to refer the way truthful theories supposedly do. We thus get Laudan's confutation of the presumed realist link between empirical success and truth. His argument seizes on the epistemic instability of deep theoretical descriptions in science, which, according to Laudan, shows that realism is both underdetermined by data and refuted by the history of science. In his view, science delivers growing instrumental and methodological reliability, but truth is not needed for realizing
that feat. We have no reason to trust what contemporary theories say about deep reality, he warns. Not everything in Laudan's work creates trouble for realism, however. One side of significance to naturalist realists is his defense of the objectivity of scientific values, norms and methods. Laudan turns scientific findings into epistemological considerations: Although we appraise methodological rules by asking whether they conduce to cognitive ends (suggesting movement up the justificatory hierarchy), the factors that settle the question are often drawn from a lower level in the hierarchy, specifically from the level of factual inquiry. Factual information comes to play a role in the assessment of methodological claims precisely because we are continuously learning new things about the world and ourselves as observers of the world. (1984b, p. 38) Scientists thus put forward methodological principles because they believe that following them promotes certain ends. According to Laudan, such principles should not be construed as categorical imperatives, but as hypothetical ones, i.e. imperatives whose antecedent clause is a statement about goals, and whose consequent describes some recommended action: If one's goal is Y, then one ought to do X. That is, we make methodology out of empirical claims about the world, specifically claims about something instrumental in accomplishing identified goals. As such, methodological principles are about the empirical world, and appraising them is no different from testing empirical claims and theories. A methodological conditional found wanting invites revision the way a theory lacking empirical support invites revision. As with theories, scientists consider methodologies and choose between rival ones. Most importantly, there is no need for any meta-methodology in Laudan's approach. If a methodological rule amounts to a postulated assertion of covariance, then evidence for such covariance provides warrant for accepting the rule, while negative evidence provides warrant for rejecting it. If so, in methodological appraisals what matters is evidence regarding the ends-means connections embodied in rules. The provider of such evidence is history. To Laudan, the goals of science are just those scientists embrace, which are open to change, variation across disciplines and even among individual scientists.viii Goals that seem downright unavailable cannot be considered reasonable. None of this softens Laudan's antirealism, however. In his works the clear cognitive gain of science is manipulation and control of natural systems. Why the acknowledged improvement in instrumental reliability cannot be related to gains with respect to truth, Laudan does not make clear. His antirealism connects with peculiar ideas about false theories, language, and approximate truth, views that realists regard with considerable suspicion.

6. REALISTS STRIKE BACK

Jarrett Leplin, a sometime collaborator but also a critic of Laudan's work, thinks that only realist views account for the ability of theories to successfully explain and predict phenomena outside the scope of the empirical laws they were designed to cover. History, Leplin (1984) notes, is not opposed to realism any more than our experience of ordinary objects is unambiguously veridical. He agrees that whether there is conceptual continuity through revolutions remains a contentious matter, but draws attention to a valuable historical induction that favors realism: one historical pattern that has remained stable throughout scientific change is the tenacity of preferential judgments about theories. Although a theory that replaces another is in turn replaced, its superiority over its predecessor continues to be recognized. As much as history records sustained judgments of the ultimate unacceptability of theories, it records sustained judgments of their relative merits. […] Such judgments are not restricted to the pragmatic dimensions of predictive success, but include explanatory comparisons. Newton provided a better explanation of free fall than did Galileo although both explanations have been superseded. If we retain such judgments beyond the tenure of the theories themselves, we must regard one theory as having got more of the relevant facts right or as having described those facts more accurately, even if both theories are false. If the explanations proposed by both theories were rejected as utterly devoid of truth, such comparisons would be impossible. (1984, p. 214) On this view, there are as good inductive grounds for concluding that scientific theories increase in truth as for concluding that all theories are false (in the sense of being not completely true). Critics, however, may object that even if this reply succeeds against Laudan, it does not respond to van Fraassen, who can turn Leplin's induction into a more modest (and credible) claim about the tenacity of preferential judgments regarding the empirical adequacy of theories, a topic I consider in the next section. Leplin convincingly argues that Laudan's ideas distort the way scientists understand language and truth. Scientists point to significant bodies of theoretical claims in discarded theories that remain accepted either as true or nearly so, Leplin notes. Maximally accurate description is rarely a goal in actual science, he stresses. Leplin thinks realist articulation of the notion of nearly so would benefit from shifting to a multi-valued logic closer to scientific practice. Most realists do not favor that sort of semantic revision, however. Still, realism needs an articulation that does justice to the scientific practice of presenting as true moderately abstract versions of many intermediate-level assertions from theories rich in novel predictions that then proved wrong as whole theoretical proposals. Approaches in that direction are the subject of Sections 8 through 11. Kuhnians, of course, deny coherence to all such realist moves. To most philosophers of science today, however, Kuhnian meaning holism constitutes a suspect stance, particularly when conjoined with the presumption that theoretical claims are epistemologically and semantically exhausted by their observable
implications. For one thing, most theoretical terms enter science already endowed with meaning and knowledge (Shapere, 1984; Devitt, 1991), i.e. science builds on and improves pre-scientific knowledge. Prior information and understanding are not lost, Kuhnian style. As Alexander Rosenberg (2000) says, antirealists need to account for the apparent agreement about kinds and regularities characteristic of pre-scientific peoples, and the speed with which they shift to the exotic kinds and regularities proposed by modern science. Naturalist realists thus challenge Laudan's presumptions regarding language and truth, charging that linguistic holism makes no sense of human cognitive and linguistic practices, and has no proper evidence on its behalf, its sole basis being metascientific prejudice about language and meaning. Dismissing holistic objections paves the way for other realist resources, particularly concerning the identification of as yet unrefuted parts in discarded theories endowed with rich records of empirical success. Still, to constructive empiricists, moves to save theoretical progress and approximate truth lack relevance, given van Fraassen's persistent claim that theoretical explanations are not credible.
7. THE ALLURE OF MODESTY AND ANALOGY

To strong empiricists, confining the descriptive goal of science to empirical adequacy is both epistemologically safer than realism and sufficient for making sense of science. Critics disagree, however, pointing out that warrant for explanatory inference naturally floods over van Fraassen's theory/observation dike, often reaching deeply into noumenal discourse. One favored constructivist response is that, since scientific explanations have to stop somewhere, they might just be stopped at the observable/unobservable boundary. This hardly convinces, however. That explanations must stop somewhere does not mean that they should stop just anywhere, let alone where strong empiricist prejudice dictates. If an epistemic ditch separates the claims of science, let science suggest where it lies.ix This alternative to raw modesty suggests a realist stance about numerous noumenal entities and structures (molecules and atoms, elementary particles all the way to at least quarks, microbiological structures, remote histories), all beyond the range of unaided human observation, but with strong scientific warrant nonetheless. Van Fraassen, however, points to fruitful moves away from noumenal explanations in modern theorizing. This brings us to an analogy he takes from biology. Like Kuhn before him, van Fraassen (1980) notices that, in Darwinian theory, natural selection operates on just phenotypical characteristics, a level he associates with the observable level for theories: the success of current scientific theories is no miracle. It is not even surprising to the scientific (Darwinian) mind. For any scientific theory is born into a life of fierce competition, a jungle red in tooth and claw. Only the successful theories survive, the ones which in fact latched on to actual regularities in nature. (1980, p. 40)

Contrary to hasty critics, here van Fraassen is not invoking theory but just showing how scientific theorizing can fruitfully move away from noumenal accounts. Functionality in the living invites explaining organisms as products of intelligent design, but there is no need for such explanation, and even less truth in it: random variation and natural selection suffice. Likewise, suggests van Fraassen, the instrumental reliability of science improves by ruthless theory selection in favor of empirical adequacy. However, realists cannot accept this. Realists insightfully complain that van Fraassen is just explaining why we humans hold successful theories, not why those particular theories are successful (Lipton, 1991, pp. 170-172; Devitt, 1991, p. 116). Furthermore, the metaphor at hand is seriously faulty. As James R. Brown (1985) points out, successful new theories yield many more different correct predictions than simple guessing would allow. By contrast, Darwinian change is random and succeeds in only one or few respects at a time (the analogue of a new prediction). Philip Kitcher (1993) goes further and exposes the analogy as disingenuous at best. In his view, explaining the survival of theories by their predictive power is like explaining survival by appeal to fitness without then trying to explain fitness in any detail. According to Kitcher, any properly Darwinian account aims to include, at least, (1) a list of current and past species that have endured; (2) specification of comparative fitness levels on the basis of endurance for each relevant species; (3) an explanation of attributed fitness by identifying fitness-making characteristics; and (4) a story about how organisms achieve these characteristics. Kitcher faults van Fraassen for arbitrarily stopping at the second stage. Explaining the instrumental reliability of scientific theories, Kitcher urges, requires looking for current and past theories, then determining which theories have been most accepted because of their explanatory and predictive success, followed by finding out what characteristics give these theories explanatory and predictive power, and then telling something about specific ways in which theories achieve these characteristics. Reflecting on these reactions, Rosenberg (2000) suggests that constructive empiricists should simply refuse to answer the question of why science has succeeded, on the ground that no answer to it would improve the empirical adequacy of science. This does not seem right. Explaining success as a product of the truth of at least part of the story provided by the theory does seem to help empirical adequacy. While successful theories that proved wrong litter history, looking for noumenal truth in their accounts has helped scientists figure out the actual domain over which discarded theories are empirically adequate. And seeking to explain the scope and limits of the empirical adequacy of a successful theory in terms of deeper characteristics and relationships responsible for the encountered adequacy has ostensibly advanced science: from the study of gases, to chemical composition, to Mendelian genetics, and countless other cases. I suggest there is a further complaint realists can advance against the noted use of natural selection.
In biological evolution, change meets the challenges of the environment by means of genomic incorporation of progressively deeper levels of information about the external world: from the shape and behavior of entities and processes relevant to survival to abstract structures concerning space, time,
dynamical trajectories, light, chemical reactions, even rudiments of Euclidean geometry. Detailed research supports this claim,x whose analogue in theorizing occurs in fields where theory-change involves manifest growth of retained descriptive schemes at increasingly deeper theoretical levels. Considerations such as the above make realists regard Constructive Empiricism as an unpromising project of epistemic modesty. Less radical challenges to traditional realism are not so easy to dismiss, however, particularly the challenges posed by effective underdetermination and by the improbable truth of whole theories, both of which force a request to clarify what realists can be realist about. In response, from the late 1980s naturalist realists have shifted the link between success and truth to theory-parts (as opposed to whole theories), a strategy labeled Divide and Conquer Realism (DAC).xi DAC projects argue that empirically successful theories make epistemic gains in at least three ways: (1) they get right some significant clusters of theoretical descriptions of their intended objects; (2) what theories get right provides a substantial (if limited) account of relevant parts of the domains at hand; (3) the descriptive parts in question seem overwhelmingly likely to survive as putative truths in successor theories. To DAC realists, in short, discarded successful theories leave behind substantial noumenal descriptions that deserve to be called true as much as anything in science and ordinary life. What counts as a theory-part? DAC literature says rather little about this, so here is my own naturalist line. There being no strong scientific case for freeing theories of claims about unobservables, realists can turn strong empiricism on its head and exploit non-radical versions of its purgative strategies on behalf of DAC. In particular, instead of limiting epistemic commitment to Ramsey formulations of empirical theories (e.g. Sklar's (2010) thinning), realists can (should) expand commitment along the lines of a generalized, more liberal version, now with topic-neutrality reaching into noumenal posits and structures sanctioned by the stringent tests of current methodology. Holists, of course, deny coherence to this realist move, but they do so from a highly suspect perspective, as noted in Section 6. The remainder of this paper is devoted to the ups and downs of DAC moves within naturalism and realism. I begin with two influential perspectival articulations of the DAC approach.
8. PERSPECTIVAL REALISMS

At the heart of one group of DAC projects are the partiality and abstract character of scientific representation and description. This includes Ronald Giere's approach, according to which scientific theories, like maps, are perspectival rather than absolutely objective: interests and conventions mediate the representations theories make, and as such they can be neither complete nor wholly accurate (Giere, 1997, 2006a). They capture selected aspects of the world brought to prominence by prior knowledge, social circumstances, and biological evolution. Successful theories get many things wrong and their fundamental terms sometimes fail to refer, Giere admits, but he rejects Laudan's antirealist argument, which, in his view,
rests on the unstated assumption that approximation is always just a matter of degrees. If the ether does not exist, claims involving the ether cannot just be a little bit offthey encompass radical error. The argument collapses however if we abandon talk of approximate truth in favor of similarity between the model and the world which allows approximation to include respects as well as degrees of similarity [] Whether the ether exists or not, there are many respects in which electromagnetic radiation is like a disturbance in an ether. Ether theories are thus, in this sense, approximations. (Giere, 1988, p. 107) To Giere, then, discovering that there is no ether justifies the rejection of ether models, but not the rejection of all realistically understood claims about similarities between ether models and the world. Ancient maps based on flat conceptions of the Earth still got a great deal right. In Gieres view, there is room for a modest yet robust scientific realism that has scientists succeeding at least to some extent in their attempts to represent the world. His position ties in with field studies he has made of the ontological commitments of experimental nuclear physicists and their practices. To him, the only remotely plausible, generally scientific account of what is going on at [cyclotron facilities is that] nuclear physicists are producing protons with desired characteristics [] and then using them, together with other particles, to investigate the properties of various nuclei (1988, p. 125). In Gieres approach, scientific knowledge focuses on just some (out of indefinitely many possible) aspects of any given domain of interest. Like Mary B. Hesse (1974) before him, Giere tries to minimize concessions to constructivism by focusing representation on finite lists of respects of similarity invoked in scientific observation, language and scientific theorizing. He emphasizes partial similarity as the key relation that models bear to real systems. In his work the approximation relation has two dimensions: approximation in respects and approximation in degrees. Fresnels descriptions are utterly wrong about the existence and structure of the ether, butGiere addsthe theory is correct to a high degree about the transversal character of light waves, their reflection, refraction and polarization. Antirealists, however, have old objections to this move. For starters, similarity is not easy to take seriously as a relation between models and the empirical world, because in some respect or other everything is similar to (also different from) everything else. Secondly, one needs a criterion to compare approximation in respects and approximation in degrees. Gieres answer is, again, strongly naturalistic (1988, 2006a, 2006b): Without the benefit of social conventions, he notes, animals discriminate exceedingly well among objects relevant to survival in their environment, a capacity humans share. Thus, for most perceptual judgments, widespread agreement does not require social explanation the explanations of biology suffice. Complementarily, biological evolution has enabled animals to make and preserve internal maps of their surrounding empirical world, ones that bear useful similarities with that world (1988, p. 110; 2006b, chapter 4). In humans, experience and reasoning allow us to generalize and revise received maps, thus far with considerable success in numerous areas. So, like many 73

Antirealists, however, have long-standing objections to this move. For starters, similarity is hard to take seriously as a relation between models and the empirical world, because in some respect or other everything is similar to (and also different from) everything else. Secondly, one needs a criterion for weighing approximation in respects against approximation in degrees. Giere's answer is, again, strongly naturalistic (1988, 2006a, 2006b): without the benefit of social conventions, he notes, animals discriminate exceedingly well among objects relevant to survival in their environment, a capacity humans share. Thus, for most perceptual judgments, widespread agreement does not require social explanation; the explanations of biology suffice. Complementarily, biological evolution has enabled animals to make and preserve internal maps of their surrounding empirical world, maps that bear useful similarities to that world (1988, p. 110; 2006b, chapter 4). In humans, experience and reasoning allow us to generalize and revise received maps, thus far with considerable success in numerous areas.

So, like many naturalists, Giere defuses radical constructivism by stressing the benefits of having a robust head-start on objectivity and truth, courtesy of natural selection. There is a hindrance, however. As Rosenberg (2000, p. 16) objects, at most Darwinian biology makes it probable that the respects of similarity we have been selected to focus on had local survival value. Such gains may not increase overall similarity to reality, let alone absolute similarity. Just as local survival value led the blind mole to lose its sight, in science the winning theory may not be better in some respects of similarity to the external world. I think Giere can grant this objection and simply stress that mole maps reliably represent the world relevant to moles. Applied to science, the realist expectation is not that a theory should reliably represent everything in the world but just a portion (significant for current purposes) of the theory's intended domain. The realist point is that science extends human epistemic reach beyond the biologically relevant, not that it does so fully.

Giere's realism builds on a primeval quest for perspectival models. But how do such models arise, and how are they checked for truthfulness? Cognitive scientists discern several relevant evolutionary stages in organisms. To Daniel Dennett (1995), for example, slugs and other invertebrates learn by testing randomly generated actions in the external environment: favorable actions are reinforced and tend to be repeated, much as B. F. Skinner thought humans do. More in line with Karl Popper, mammals, birds, reptiles and fish (arguably also some invertebrates) have inner (mental) environments that allow them to preview and then select among possible actions, filtering out the silliest options before risking them in their unforgiving world. As hominids developed, their inner environment became informed by tools and other designed portions of the outer environment (including word-like tools), possibly along the lines proposed by information theorist Richard Gregory (1984). These are just examples of the proposals naturalists make about the rise of modeling. Critics demand more, unsurprisingly. They want a convincing supplement about the processes used to select models and assert cognitive progress. On this too the preferred naturalist approach sticks to evolutionary theory.
9. JUSTIFICATION MATTERS

Reflecting on objectivity, truth, and justification, Philip Kitcher, also a perspectival realist and strong supporter of the map analogy, emphasizes the importance of the biological head-start. In his words, perception contributes "a theory-independent basis, inter-subjectively available" (1993, pp. 66-67). From a public body of empirical information arise theories that unify and project that information, their justification resting on the elimination of rival proposals. However, for elimination to work, criteria for theory acceptance and rejection must be commonly shared across the theories involved. This is disputed by Kuhnians, who insist that fundamental theories encompass evaluation criteria that vary from one theory to another. As noted earlier, in the 1970s anti-relativists (notably Shapere) responded by showing that supposed paradigm-shifts in the history of science actually proceeded on the basis of shared reasons and evidence. Kitcher (1993) focuses on commonalities grounded in biology, particularly at the level of categorization and model-selection.

We, he notes, have a primeval propensity, rooted in our genome and early human environment, to theorize under categorizations favored by diverse episodes of natural selection. This has resulted in mammals spontaneously focusing on certain kinds of things and relations, and generalizing from just a few cases. To Kitcher, this primitive apparatus works tolerably well in confronting the problems our hominid ancestors encountered; it is "relatively well designed for enabling primates with certain capacities and limitations to cope with savannah environment and with the complexities of primate society" (p. 241). It is with this apparatus in place, he maintains, that a deeply rooted eliminativist propensity of ours gets activated. Other brain developments open the use of this propensity to modification of practice and revision of primitive categorizations and views of dependence, making it possible to figure out alternative views and subject them to eliminative induction.

However, what justifies Kitcher's conviction that this eliminativist strategy advances theoretical knowledge? Winning over just a finite number of competitors is not enough. The most such elimination can yield is confidence that a certain theory is the right one unless some other proposal makes more sense. But realism requires that the selected theory advance theoretical knowledge in a stronger sense. Reasons must be provided other than the hypothesis's place in the current order of confirmational merit. The ether hypothesis long held the status of best-confirmed hypothesis, yet, antirealists emphasize, it turned out to be quite false (there is no light ether). As Rosenberg (2000) warns, if eliminative induction is all there is, then realism commits science to a wasteful search for more and more alternative hypotheses beyond the needs of empirical adequacy. For only once all the possible hypotheses had been specified and compared with each other would the winner be well-confirmed, an impossible accomplishment, since the number of possible hypotheses is indefinitely large.

Elimination by itself fails to convince, but realists have other resources available to discern truth. A marker less mortgaged to the limitations of the human imagination is novel empirical success, which many DAC-realists take as their indicator of choice (instead of explanatory virtue, e.g. unification). The DAC thesis, to repeat, is that, as theories change, the successful ones leave behind truthful descriptions that add to previous maps of the noumenal world. This harmonizes well with Leplin's already-mentioned historical induction about preferential judgment (when a theory is replaced, its superiority over its predecessor continues to be recognized). However, which of the noumenal descriptions introduced by a successful theory are true? Theories, however successful, can be quite wrong about things. Retention across theory-change alone is not a good marker of truth, judging by the array of long-lingering ideas that turned out badly (caloric, fire/phlogiston, teleological holism in biology, the ether of light, the Euclidean conception of space and time through Lorentz). Specifying which parts a successful theory gets approximately right is trickier than it might seem at first. Realists have several ways to go. In the remaining two sections of this article I discuss some ongoing moves, including my own.

10. THEORY-PARTS SUITABLE FOR REALISM

One prominent DAC approach, variously developed by Kitcher, Leplin, and Psillos,xii focuses on theory-parts that scientists can identify at the time of a theory's success. On this view, components really off the mark are generally not implicated in the historical predictions of a theory, and one can tell whether a part is not implicated simply by checking the theoretical options available at the time (synchronic DAC). Fresnel, defenders of this approach claim, could have derived his famous predictions using available Lagrangian derivations that bypassed the ether of light. This, they argue, shows that the ether posit was idle, dispensable or worse. There is a serious problem with this thesis, however: mature science introduces hardly any idle posits. The ether was not a dispensable posit, and could not have been so taken until at least the early 20th century (see the Introduction chapter). This is due, among other reasons, to metaphysical entrapment (Cordero, 2011a, 2011b): 19th century theoreticians could not give up the ether because, at the time, the whole of physics, including the Lagrangian approach, conceived of waves as propagating perturbations and thus as a mode of being in need of a substratum. Here is an illustration of the confidence the ether commanded:

"You can imagine particles of something, the thing whose motion constitutes light. This thing we call the luminiferous ether. That is the only substance we are confident of in dynamics. One thing we are sure of, and that is the reality and substantiality of the luminiferous ether." (Lord Kelvin, "The Wave Theory of Light", Johns Hopkins Lectures, 1884 [my italics])

How damaging this problem with synchronic approaches is to other versions of DAC realism depends on whether realists can produce a criterion that, while more purgative of metaphysical entrapment, manages to identify with high probability theory-parts that yield approximately correct noumenal descriptions of the intended domain, and does this without resorting to mere retrospective projection of current theory. A prime place to go for naturalist hints is, again, scientific practice. Scientists use strategies that ostensibly enhance the credibility of theory-parts. Promising DAC moves on view include several strategies that take time to bear fruit (diachronic strategies), five in particular (Cordero, 2013a). Two are close to Whewell's recommendation (1858/1968, p. 109) of having theories "turned in all directions, examined on all sides," so that "the strength and the weakness of the maxims which men apply to them are fully tested; the light of the brightest minds is diffused to other minds," only now applied to theory-parts in Giere's sense rather than to full theories.

(S1) Hostile Probing: This comprises moves that try to do without the central tenets of a theory, undertaken particularly by opponents reacting to a theory's initial success. For example, 19th century corpuscularians responded to the wave theory of light by laboring to show Fresnel's central tenets wrong, most famously in the episode that led to the experimental demonstration of the so-called Poisson Spot. Poisson and other corpuscularians thought this prediction was to be the downfall of Fresnel's theory; to their surprise it crowned it.

(S2) Checking Auxiliary Assumptions: Supporters of a theory follow this strategy when difficult cases come their way, as when, in 1801, the discovery of double-slit interference forced corpuscularians into convoluted auxiliary hypotheses to account for the phenomenon. Their efforts failed to satisfy, leading to the effective collapse of the particle camp.

As a theory plays out, the above strategies give salience to two kinds of components. At one extreme are theory-parts systematically implicated in the derivations of either failed predictions or conceptual conundrums. These are deemed dubious, as in the example for S2. At the opposite extreme are parts that seemingly cannot be removed without bringing the theory to stagnation, suggesting that they are indispensable and very probably truth-worthy, as in the example for S1. A third DAC strategy comes into view when successful theories begin to wane:

(S3) Efforts to Identify Adequacy Conditions for Successor Theories: These occur when a theory faces persistent difficulties and scientists start looking for alternatives. From the yields of S1 and S2 they select theory-parts found particularly trustworthy and place them as correspondence rules, limiting cases, and so forth. At the start of quantum optics, for example, Maxwell's laws functioned as adequacy conditions at certain levels of quantum mechanical representation.

A glance at other scientific episodes suggests that the above strategies generalize well. In numerous disciplines they have advanced explanatory coherence and broadened the evidence base of theory-parts over time. On the whole, the strategies have a good track record of picking theory-parts accepted as approximately correct to this day (e.g. geometric properties of light waves, the kinetic theory of matter, classical chemical structures, conservation principles, to name a few). Nevertheless, the specter of metaphysical entrapment remains a worry. In some notorious cases (e.g. the ether case), the three strategies passed chaff as wheat, strengthening received metaphysical entrapment and propagating it into the next generation of theories. Application of the strategies did eventually expose the ether and other such posits as fictions, but stowaways can be hard to catch. Happily, scientific practice has other resources, in particular two strategies that center on elucidation. One focuses on external explanatory support, specifically of theory-parts that gain explanatory elucidation from independent successful theories:

(S4) External Explanatory Elucidation: This occurs when claims merely assumed in a theory T subsequently gain justification from another, initially unrelated theory T*. For example, since the 1950s numerous aspects of cell biology have gained elucidation from molecular biochemistry; e.g., posited neural mechanisms have been explained by noting that neurons consist of proteins and other molecules organized into functional sub-systems such as the nucleus, mitochondria, axons, dendrites, and synapses.

In Paul Thagard's version of this strategy (2000, 2007), the emphasis is on explanation: if a theory not only maximizes explanatory coherence but also broadens its evidence base over time, and its assumptions are elucidated by explanations of why the theory's proposed mechanism works, then we can reasonably conclude that the theory is at least approximately true.

Elucidation has accompanied much of the advance of modern theoretical science. But how good a marker of probable truth is it? In an elucidation instance, the part that gets singled out is the derived structure along with whatever accompanying assumptions both theories clearly share, but nothing else. Thus, because in the 19th century Lagrangian theory and mechanics agreed on the metaphysics of waves, Lagrangian elucidations of Fresnel's predictions could not expose the ether as a dispensable posit. By the same token, elucidations from cognitive psychology about unconscious levels of the mind do not boost Freud's theory holistically, just those structures and claims the two theories clearly share. In an elucidation instance, then, what typically gains epistemic weight is a level of description more abstract than either T or T* provides for. Since external explanatory elucidation springs from an independently supported theory T*, elucidation raises the credibility of the assumptions and narratives it casts light on; hence its relevance for realists. Moreover, elucidation's purgative power against metaphysical entrapment exceeds that of the previous strategies in cases where the initial remoteness of T* lowers the likelihood of shared metaphysical underpinnings.

Although clearly a realist resource, elucidation seems neither necessary nor sufficient for realism. As the previous strategies suggest, ascriptions of probable truth can be made on the basis of success with novel predictions alone. On the other hand, unsavory counterexamples give pause to granting high probability to elucidated theory-parts. Here are two examples (Cordero, 2013a). When Kepler looked for theoretical support for his Second Law, he derived it from the Aristotelian laws of motion and some principles of optimal action. Kepler thereby elucidated the law, but the premises he invoked included some of the most deeply mistaken claims of Aristotelian physics. S4 can be improved by requiring the elucidating theory to be successful in terms of novel predictions, but this too fails to filter out some lamentable cases. In the 1940s and 1950s, Freudians allegedly grounded some of their principles (e.g. the death instinct) in thermodynamics; they did not convince.

The final strategy I consider focuses on post mortem elucidation of the successes of superseded theories:

(S5) Retrospective Elucidation: These are efforts to explain why a discarded theory showed empirical success. The task here is to provide causal and/or structural justification for a theory's accomplishments, as in the account wave theorists provided for the success of corpuscularian optics regarding the phenomena of reflection, refraction and polarization.

Contrary to first appearances, S5 does not involve vicious post hoc maneuvers. Explaining the success of a theory T0 from the vantage point of a successor T1 often contributes epistemic gains along complementary lines. One is a greater awareness of divergences between T0 and T1, which typically leads to further novel predictions from T1, and thus to epistemic gains along S1 and S2. Another bonus is the unveiling of regions within the logical space of T0 where noumenal descriptions licensed by T0 are approximately correct (from the vantage point of T1), which usually helps to grade the scope and accuracy of T0-parts relative to T1. Yet a third contribution has to do with the expectation (now widely shared by realists) that whole theories are generally false. By finding truth-content in a discarded theory, S5's retrospective elucidations enhance the coherence of taking a realist stance.

The five strategies just outlined pick out structures seemingly indispensable for a theory's success, including noumenal descriptions at various levels of abstraction. Their combined filtering does seem to raise the trustworthiness of numerous parts of the current scientific picture to high levels. Note, however, that none of the contributions associated with the strategies is either guaranteed or trivial. Their status is that of learned, empirical findings. Note also that, historically, preservation of noumenal posits and structures from early theories began in earnest only when novel predictive power gained recognition as an epistemic virtue. Transitions from one Ptolemaic theory to another displayed virtually no common theoretical parts (cycles, epicycles and such changing dramatically from one proposal to another), with shared descriptions limited to the observable level. Even Descartes' vortex hypothesis displays similarly poor theoretical convergence with Newton's gravitational theory. By itself each of the strategies fails to satisfy. Together, however, they seem to do the required job, or so I suggest. The final section outlines my proposal of a naturalist criterion that pulls together the strengths of the diachronic strategies reviewed in this section.
11. A PROPOSAL

The following suggestions draw from Cordero (2001, 2011a, 2011b, 2013a, 2013b). They focus on theories designed with one set of data in mind that then, unexpectedly and improbably (relative to prior information), predict some phenomenon unknown to the theory's author. As a theory T is applied to diverse situations and as strategies S1 through S5 act on it, clusters of noumenal descriptions first licensed by T (theory-parts) gain salience as either (1) probably false or (2) probably true. Salience becomes noticeable as derivations of predictions (successful or failed) and paradoxes from T intersect at particular theory-components and auxiliary assumptions (Balashov, 1994).

(R1) Refutational DAC: A given theory-part will reveal itself as doubtful if derivations of failed predictions in various areas intersect at that theory-part, and saving that part is consistently accompanied by degeneration of the whole system (as measured by current epistemological criteria).

At the other extreme, input from S1 through S5 will favor a part when it is either (a) implicated in the theory's novel predictive success to the point that removing or changing it leads to empirical degeneration, or (b) elucidated by some independently well-established theory. The prolonged retention of posits like the ether warns against relying on (a) alone; against (b) alone stand counterexamples like those outlined for S4. Conjoining (a) and (b) thus seems a better option to try.

(R2) Corroborational DAC: A theory-part will reveal itself as very probably approximately true if conditions (a) and (b) above obtain.

Note that neither (a) nor (b) proceeds by retrospective projection. Indeed both conditions can apply to parts while theories are still in full flight. Cases like those of the caloric and the 19th century ether did not meet this criterion. A perspectival DAC stance follows accordingly:

(R3) Naturalist Realist Thesis: Law-like structures and tokens of the noumenal types invoked by a theory-part that meets R2 obtain objectively, independently of any minds, and (approximately) follow the laws that the part in question and other successful science spell out for them.
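Stated schematically, with p a theory-part of theory T, the two evidential criteria can be compressed as follows (the predicate abbreviations are expository shorthand introduced here, not part of the official formulation):

\[ \text{R1:}\quad \mathit{FailIntersect}(p) \,\wedge\, \big(\mathit{Save}(p) \Rightarrow \mathit{Degeneration}(T)\big) \;\Rightarrow\; \mathit{Doubtful}(p) \]

\[ \text{R2:}\quad \underbrace{\mathit{Indisp}\big(p,\ \text{novel success of } T\big)}_{(a)} \,\wedge\, \underbrace{\exists\, T^{*}\ \mathit{Eluc}(T^{*}\!,\, p)}_{(b)} \;\Rightarrow\; \text{$p$ is very probably approximately true} \]

R3 then attaches the realist commitment to whatever parts clear R2.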

By the proposed criterion, in several significant respects, light is as Fresnel said, and atoms are as classical physics portrayed them; material transformations are to a significant extent as pre-quantum chemistry said; the evolution of many species is largely as Darwin's original proposal stated. In terms of R2, the epistemic yields of the last 250 years amount to arrays of thickly textured descriptions and narratives about numerous noumenal aspects of reality. The result is a world picture composed of objects that develop at various empirical levels and ranges, each of the objects displaying invariant property clusters and entering into law-like relations at its respective level.xiii Included in this picture are entities, origins and evolutionary histories, rich portions of Newtonian mechanics and gravitational theory, pre-quantum chemistry, early quantum mechanics, and Darwin's original theory, among other features.

The suggested DAC realist stance is a fallible, empirical conjecture, subject to scientific standards of acceptance and rejection. More work is needed on the matters involved, of course, but I think the above proposal shows promise. For one thing, while everything in the scientific picture remains defeasible, arguably (paraphrasing Whewell) no example can be pointed out, in the whole history of science, in which filtering of the noted kinds together with explanatory elucidation has given testimony in favor of a theory-component afterwards discovered to be false. I end with some clarifications.
(A) This proposed version of diachronic DAC-realism lowers the risk of metaphysical entrapment by emphasizing external support and granting higher epistemic value to corroborated novel prediction than to consistency and unity. It counteracts a widespread tendency to smuggle prior information into theorizing; it does this by requiring that qualifying novel predictions be unknown to the theory's author(s), in the sense recommended by Worrall (1989).

(B) The sense in which truth applies to theory-parts piggy-backs on the ongoing use of truth in experimental science and ordinary life. It builds on explications of truth-ascription to restricted, coarse-grained theoretical claims in scientific practice (Cordero, 2013b), but otherwise leaves deeper theorizing about Truth to others. A similar caveat applies to justification.

(C) Justification for the criteria and thesis advocated above is primarily scientific and rests on considerations of coherence, agreement with data, and risky predictions. As such, the proposal will fail unless the predictions it makes come out true almost without exception for theories that thrive in terms of corroborated novel predictions, in particular the following:

(C1) After a successful empirical theory is replaced, its superiority over its predecessor will continue to be recognized (Leplin's Induction).

(C2) For any theory-part that passes R2, restricted versions of some models original to it (a) will be embeddable in coarse-grained models of successor theories, and (b) the resulting restricted descriptions will be deemed true from the vantage point of those successor theories. An exemplary case is the embeddability of restricted Galilean models and associated descriptions of free fall in coarse-grained Newtonian models and descriptions.

Of course, even if the suggested proposal succeeds as a scientific claim, to some non-realists it will seem philosophically naive and misguided from the start. For what, they will ask, justifies the scientific methods to which it appeals? I think Ernest Nagel's response to this line of complaint remains strong. The objection matters, he noted at the dawn of contemporary naturalism, only to those who refuse to dignify anything as genuine knowledge unless it is "demonstrated from self-luminous and self-evident premises" (1956, p. 15). But there is no such thing as complete justification for any claim, and so requiring complete warrant for naturalist proposals is an unreasonable request. The proper guideline for naturalist realists is thus clear: develop naturalism and realism using the methods of science; if this leads to a fruitful stance, then explicate and reassess. The resulting proposal will exhibit virtuous circularity if its explanatory feedback loop involves critical reassessment as the explanations it encompasses play out.

NOTES

* Research support for this investigation was made possible in part by PSC-CUNY Research Award TRADA-43-748.
i For a good articulation of the claims of realism see e.g. Bunge (2006, Ch. 10.2).
ii See Ruse (1979, chapter 7). Darwinian arguments did not satisfy the requirement on prediction until late in the following century (Cordero, 2011a).
iii See, for example, Hesse (1974, chapter 1) and Suppe (1977, introduction).
iv Kuhn (1977, 1992); see also Bird (2011).
v "Objectivity, value judgment, and theory choice," in Kuhn (1977).
vi Elsewhere I consider the cases pertaining to quantum mechanics and the theories of light (Cordero, 2001, 2011b, respectively).
vii See Steven French and Décio Krause (2006). French's contribution to this volume proposes a radical ontic-structuralist response to this level of underdetermination.
viii Laudan is vague about this particular, but his emphasis on instrumental success is clear. See Laudan (1984b, 1987).
ix Naturalists do not fully agree regarding the boundary's location. Shapere emphasizes success and freedom from specific doubt. Most entity-realists settle for a generalized version of empirical adequacy, extended to encompass whatever entities become empirically accessible through scientifically-aided observation and manipulation, which now includes subatomic structures. Another alternative locates the boundary using as a guide the most demanding conditions of acceptability found in scientific practice (Cordero, 2013a).
x A host of controlled experiments strongly suggests that animals make abstract representations of space. For example, the indigo bunting orients itself during migration using trigonometric relationships between stars (Emlen, 1969). Clark's nutcrackers store food throughout large areas during the fall season (in thousands of sites); experiments show that these birds manage to retrieve their caches by figuring out the mid-point between two landmarks (Kamil & Jones, 1999). A neurological basis has been found for these abilities. In rats, the firing rate of individual cells in the hippocampus changes as the animal moves from one region of its environment to another (O'Keefe & Dostrovsky, 1971; Save et al., 1998; Lever et al., 2002); these "place cells" form a cognitive map. Experiments such as these indicate that place cells in the hippocampus of rodents and other animals, even blind ones (Save et al., 1998), give them an abstract level of geometrical representation of whatever environment they find themselves in. For a discussion of these developments see De Cruz (2007).
xi Label introduced by Stathis Psillos (1999, p. 108).
xii Kitcher (1993, 2001), Leplin (1997), Psillos (1999). Psillos' DAC started partly in response to John Worrall's austere Structural Realism.
xiii See David Bohm (1957, ch. 5). Anjan Chakravartty (2007) offers a properties-oriented structuralist version.

REFERENCES
Balashov, Yuri. (1994). Duhem, Quine, and the multiplicity of scientific tests. Philosophy of Science, 61, 608-628.
Bird, Alexander. (2011). Thomas Kuhn. Stanford Encyclopedia of Philosophy: http://plato.stanford.edu/entries/thomas-kuhn/#6.4.
Bohm, David. (1957). Causality and chance in modern physics. London: Routledge & Kegan Paul.
Boyd, Richard N. (1984). The current status of scientific realism. In J. Leplin (Ed.), Scientific realism (pp. 41-82). Berkeley, CA: University of California Press.
Brown, James R. (1985). Explaining the success of science. Ratio, 27, 49-66.
Bunge, Mario A. (2006). Chasing reality. Toronto: University of Toronto Press.
Chakravartty, Anjan. (2007). A metaphysics for scientific realism. Cambridge: Cambridge University Press.
Cordero, Alberto. (2001). Realism and underdetermination: Some clues from the practices-up. Philosophy of Science, 68S, 301-312.
Cordero, Alberto. (2011a). Darwin's theory and prediction. In F. Minazzi (Ed.), Evolutionism and religion (pp. 79-94). Milano: Mimesis Edizioni.
Cordero, Alberto. (2011b). Scientific realism and the Divide et Impera strategy: The ether saga revisited. Philosophy of Science, 78, 1120-1130.
Cordero, Alberto. (2013a). Theory-parts for realists. European Philosophy of Science Association: Selected Papers of PSA11. Forthcoming, Summer 2013.
Cordero, Alberto. (2013b). Conversations across meaning variance. Science & Education. Forthcoming, Spring 2013.
De Cruz, Helen. (2007). An enhanced argument for innate elementary geometric knowledge and its philosophical implications. In B. Van Kerkhove (Ed.), New perspectives on mathematical practices: Essays in philosophy and history of mathematics (pp. 185-206). New Jersey: World Scientific.
Dennett, Daniel C. (1995). Darwin's dangerous idea. New York: Simon & Schuster.
Devitt, Michael. (1991). Realism and truth, 2nd ed. Princeton: Princeton University Press.
Devitt, Michael. (1999). A naturalistic defense of realism. In S. D. Hales (Ed.), Metaphysics: Contemporary readings (pp. 90-103). Albany, NY: Wadsworth Publishing Company.
Emlen, S. T. (1969). The development of migratory orientation in young indigo buntings. Living Bird, 8, 113-126.
French, Steven, & Krause, Décio. (2006). Identity in physics. Oxford: Oxford University Press.
Giere, Ronald N. (1988). Explaining science: A cognitive approach. Chicago: The University of Chicago Press.
Giere, Ronald N. (1997). Understanding scientific reasoning, 4th ed. New York: Harcourt Brace.
Giere, Ronald N. (2006a). Scientific perspectivism. Chicago: University of Chicago Press.
Giere, Ronald N. (2006b). Modest evolutionary naturalism. Biological Theory, 1, 52-60. http://www.tc.umn.edu/~giere/MEN-RPS.pdf.
Gregory, Richard L. (1984). Mind in science: A history of explanations in psychology and physics. London: Penguin.
Hesse, Mary B. (1974). The structure of scientific inference. Berkeley/Los Angeles: University of California Press.
Kamil, A. C., & Jones, J. E. (1999). How do they, indeed? A reply to Biegler et al. Animal Behaviour, 57, F9-F10.
Kitcher, Philip. (1992). The naturalists return. Philosophical Review, 101, 53-114.
Kitcher, Philip. (1993). The advancement of science. Oxford: Oxford University Press.
Kitcher, Philip. (2001). Science, truth, and democracy. Oxford: Oxford University Press.
Kuhn, Thomas S. (1962/1970). The structure of scientific revolutions. Chicago: University of Chicago Press (2nd edition, with postscript, 1970).
Kuhn, Thomas S. (1977). The essential tension: Selected studies in scientific tradition and change. Chicago: University of Chicago Press.
Kuhn, Thomas S. (1992). The trouble with the historical philosophy of science. Robert and Maurine Rothschild Distinguished Lecture, 19 November 1991. Cambridge, MA: Harvard University Press (special publication of the Department of the History of Science).
Ladyman, James, Douven, Igor, Horsten, Leon, & van Fraassen, Bas C. (1997). A defense of van Fraassen's critique of abductive inference. Philosophical Quarterly, 47, 305-321.
Laudan, Larry. (1981). A confutation of convergent realism. Philosophy of Science, 48, 19-49.
Laudan, Larry. (1984a). Realism without the real. Philosophy of Science, 51, 156-162.
Laudan, Larry. (1984b). Science and values: The aims of science and their role in scientific debate. Berkeley, CA: University of California Press.
Laudan, Larry. (1987). Progress or rationality? The prospects for normative naturalism. American Philosophical Quarterly, 24, 19-31.
Laudan, Larry, & Leplin, Jarrett. (1991). Empirical equivalence and underdetermination. The Journal of Philosophy, 88, 449-472.
Leplin, Jarrett. (1984). Truth and scientific progress. In J. Leplin (Ed.), Scientific realism (pp. 193-217). Berkeley: University of California Press.
Leplin, Jarrett. (1997). A novel defense of scientific realism. Oxford: Oxford University Press.
Lever, C., Wills, T., Cacucci, F., Burgess, N., & O'Keefe, J. (2002). Long-term plasticity in hippocampal place-cell representation of environmental geometry. Nature, 416, 90-94.
Lipton, Peter. (1991). Inference to the best explanation. London: Routledge.
O'Keefe, J., & Dostrovsky, J. (1971). The hippocampus as a spatial map: Preliminary evidence from unit activity in the freely-moving rat. Brain Research, 34, 171-175.
Psillos, Stathis. (1999). Scientific realism. London: Routledge.
Quine, Willard V. O. (1951). Two dogmas of empiricism. Reprinted in From a logical point of view, 2nd ed. (pp. 20-46). Cambridge, MA: Harvard University Press.
Quine, Willard V. O. (1969). Epistemology naturalized. In Ontological relativity and other essays (pp. 69-90). New York: Columbia University Press.
Quine, Willard V. O. (1975). On empirically equivalent systems of the world. Erkenntnis, 9, 313-328.
Rosenberg, Alexander. (2000). Darwinism in philosophy, social science and policy. Cambridge: Cambridge University Press.
Ruse, Michael. (1979). The Darwinian revolution. Chicago: The University of Chicago Press.
Save, E., Cressant, A., Thinus-Blanc, C., & Poucet, B. (1998). Spatial firing of hippocampal place cells in blind rats. The Journal of Neuroscience, 18, 1818-1826.
Shapere, Dudley. (1964). The structure of scientific revolutions. Philosophical Review, 73, 383-394.
Shapere, Dudley. (1980). The character of scientific change. In T. Nickles (Ed.), Scientific discovery, logic, and rationality (pp. 61-102). Boston Studies in the Philosophy of Science, Vol. 56. Dordrecht: D. Reidel.
Shapere, Dudley. (1982). The concept of observation in science and philosophy. Philosophy of Science, 49, 485-525.
Shapere, Dudley. (1984). Objectivity, rationality, and scientific change. In Proceedings of the Biennial Meeting of the Philosophy of Science Association (pp. 637-663).
Suppe, Frederick. (1977). The structure of scientific theories, 2nd ed. Chicago: Illini Books.
Thagard, Paul. (2000). Coherence in thought and action. Cambridge, MA: MIT Press.
Thagard, Paul. (2007). Coherence, truth, and the development of scientific knowledge. Philosophy of Science, 74, 28-47.
Van Fraassen, Bas. (1980). The scientific image. Oxford: Oxford University Press.
Whewell, William. (1847). The philosophy of the inductive sciences, Volume II, Section III, "Tests of hypothesis." London: John W. Parker.
Whewell, William. (1858/1968). Novum organon renovatum, being the second part of the philosophy of the inductive sciences, 3rd edition, London. In Robert E. Butts (Ed.), William Whewell's theory of scientific method (pp. 103-249). Pittsburgh: University of Pittsburgh Press.
Worrall, J. (1989). Structural realism: The best of both worlds? Dialectica, 43, 99-124.

STEVEN FRENCH

HANDLING HUMILITY
Towards a Metaphysically Informed Naturalism

1. INTRODUCTION

Let me begin with the following assertion: "One cannot fully appreciate what it might mean to be a realist until one has a clear picture of what one is being invited to be a realist about" (Chakravartty, 2007, p. 26).

The question then is: how do we obtain this clear picture? One might start with the following realist recipe for obtaining an understanding of how the world is: we choose our best theories; we read off the relevant features of those theories; and then we assert that an appropriate relationship holds between those features and the world. These features might be theoretical terms, say, and the relationship would be that of reference between each term and the associated object; or they might be features such as the laws and symmetries of the theory, and the relationship might be that of representation, however construed. But is the picture obtained by following this recipe clear enough? Consider: if we feed into the recipe our best theories from physics, we obtain a certain picture of the world as including particles and fields, for example, interacting via various kinds of forces. But is this clear enough? How are we to conceive of these particles and fields? As objects and substances, respectively? Well-known difficulties arise in each case (see, for example, French & Krause, 2006), yet in the absence of some such conception, the notions of particle and field remain ciphers. In order to obtain Chakravartty's clear picture, and hence an appropriate realist understanding of the world, we need to clothe the physics in an appropriate metaphysics. Those who reject any such need are either closet empiricists or "ersatz" realists (see Ladyman, 1998, for use of this term). Of course, introducing metaphysics in this way raises obvious concerns, not least as to the epistemic probity of such an introduction. Indeed, it has been claimed that, by virtue of going beyond the physics, such metaphysical elements demand of us an attitude of epistemic humility. Balancing this humility with the need to obtain a clear picture of the world is what I shall call Chakravartty's Challenge. And I shall argue that to meet this challenge, and to appropriately maximise the clarity of the picture obtained whilst appropriately reducing the level of humility adopted, we should adopt a form of structural realism (Ladyman, 1998; French & Ladyman, 2011).

My discussion in this paper clearly bears on the thorny issue of the relationship(s) between science, metaphysics and philosophy of science in general (see also French & McKenzie, forthcoming), and I shall briefly sketch where things stand with regard to those relationships before presenting the case for humility and considering various ways we might reduce it whilst still constructing a clear picture of the world.
2. THE METAPHYSICS OF SCIENCE: FROM STERILITY TO A-LEVEL CHEMISTRY

The history of the relationship between science, metaphysics and philosophy of science is not a happy one, at least not when one considers the past one hundred years or so. Carnap famously wrote that "Most of the controversies in traditional metaphysics appeared to me sterile and useless. When I compared this kind of argumentation with investigations and discussions in empirical science or [logic], I was often struck by the vagueness of the concepts used and by the inconclusive nature of the arguments"ii (Carnap, 1963, pp. 44-45). And little seems to have changed in the intervening decades. More recently, Price has suggested that "What's haunting the halls of all those college towns, capturing the minds of new generations of the best and brightest, is actually the ghost of a long discredited discipline. Metaphysics is actually as dead as Carnap left it, but, blinded in part by [certain] misinterpretations of Quine, contemporary philosophy has lost the ability to see it for what it is, to distinguish it from live and substantial intellectual pursuits" (Price, 2009, p. 323).

In the context of the debate over scientific realism, the attitude towards metaphysics is one of the factors that can be used to distinguish the anti-realist from the realist. Thus, van Fraassen famously waved goodbye to metaphysics at the end of his discussion of quantum mechanics (van Fraassen, 1989). However, given that, for the constructive empiricist, different interpretations amount to different ways the world could be (rather than, as the realist would have it, different ways the world is), and that articulating such interpretations will require some engagement with metaphysical issues, even she will need to welcome back metaphysics, if only on a modal basis. In the further extension of constructive empiricism that presents it as a stance, rather than simply a philosophical position, the attitude towards metaphysics is one of the factors that distinguishes it from the realist stance and renders straightforward compare-and-contrast exercises problematic (for more on stance-relative approaches to metaphysics, see Chakravartty, forthcoming). Thus even the constructive empiricist might feel the need to draw on what metaphysics has to offer. However, some have felt that what it has to offer is precious little, given its apparent lack of contact with modern science.

Ladyman and Ross, for example, insist that "[m]ainstream analytic metaphysics has become almost entirely a priori" (Ladyman & Ross, 2007, p. 24) and argue that even that which pays lip-service to naturalism is really "philosophy of A-level chemistry" (ibid.). Now of course, one could argue that, even divorced from modern science as Ladyman and Ross feel it is, metaphysics might still offer an array of tools, moves and strategies of which the realist could avail herself. Indeed, some metaphysicians might insist that it is only by moving to an appropriate level of generality, with a concomitant loss of contact with scientific concerns, that they can develop such broadly applicable tools and strategies.iii While I am sympathetic to such claims, too many metaphysical positions are grounded in intuition or in reflection on everyday objects and their properties, and attempts to import these into the quantum context typically prove disastrous. What is obviously needed is some form of balance, and here one might adopt Callender's even-handed or, as he puts it, "symmetric" approach to science and metaphysics (Callender, 2011), in which not only is the laying bare of the metaphysical assumptions of our best theories an important part of understanding the world, but metaphysical speculation itself (appropriately anchored in systematic theorizing) can be heuristically useful.iv Like Chakravartty, he takes metaphysics to help provide a crucial element of understanding when it comes to our theories, and writes: "In slogan form, my claim is that metaphysics is best when informed by good science and science is best when informed by good metaphysics" (ibid., p. 48). But now the question is: how should we conceive of that informing?
3. THE INVOLVEMENT OF METAPHYSICS WITH PHYSICS

This is the issue that Hawley sets out to tackle (Hawley, 2006). She identifies two broad stances that one might adopt with regard to the possibility of metaphysics being informed by our best science, and physics in particular: the optimistic, which takes science to be capable of bearing upon metaphysical matters and of helping to drive progress in metaphysics; and the pessimistic, which insists that you only get as much metaphysics out of a scientific theory as you put in in the first place. As an example of the first, she takes Sider's claim that Special Relativity shows presentism (crudely, the claim that the present has a distinctive ontological status) to be false (Sider, 2001; see also Wüthrich, forthcoming). Representing the second, one can take the claim that quantum physics is consistent with particles regarded either as individuals, with well-defined identity conditions, or as non-individuals, which lack such conditions, so that there exists a kind of underdetermination of the metaphysics (of identity and individuality) by the physics (French & Krause, 2006). More specifically, she develops these positions as follows:

(Optimism) There are actual cases in which the involvement of a metaphysical claim in an empirically successful scientific theory provides some reason to think that the claim is true.

The pessimist position can then be separated into two forms:

(Radical Pessimism) The involvement of a metaphysical claim in an empirically successful scientific theory can never provide any reason to think that the claim is true; and

(Moderate Pessimism) There is a kind of involvement in theory which, were a metaphysical claim to achieve this involvement, would provide some reason to think the claim is true; but there are no cases of metaphysical claims being involved in theory in this way.

What is meant by "involvement" here? According to Hawley, it is the same kind of involvement that scientific realists appeal to in giving us reason to believe claims about unobservable entities. Thus, when a metaphysical claim is involved with scientific theories in this way, it shares responsibility for generating the empirical success of the theory. However, according to radical pessimism, such involvement would not give us any reason to believe the claim, whereas the moderate pessimist accepts that it would but insists that metaphysical claims are never really involved with scientific theories in this way. Optimists, on the other hand, believe that such claims can be appropriately involved with theories and that this involvement gives us reason to believe the claims in question. As we'll see in a moment, this comparison with the involvement of unobservable entities is problematic. Before we discuss that, however, it's worth noting how Hawley lines up these options within the realism debate.

In her terms, the realist accepts that there are cases where the involvement of a claim about an unobservable entity in an empirically successful scientific theory provides reason to think that the claim is true. As she says, this is weaker than the standard characterization of realism as inferring the existence of entities, not least because it is compatible with structural realism. It is also more specific in focusing on the role of such claims in explaining the success of theories, something that a number of people have emphasized (Saatsi, 2008). According to Hawley, scientific realism, understood in this form, is incompatible with Radical Pessimism, because otherwise there would have to be some in-principle difference between claims about unobservables and metaphysical claims which could account for the former being confirmed via the relevant theory's success and the latter not, despite their both being integrated into the theory. One option, as she notes, would be to insist that metaphysical claims are simply not truth-apt, but this requires further motivation, and what she sees as the history of shifts from metaphysics to science without change in truth-apt status suggests that any such motivation is going to be hard to produce. Alternatively, one might accept that metaphysical claims could be involved in this way, but hold that in fact it just doesn't happen, or hasn't happened; in which case one would be a Moderate Pessimist, which is compatible with a realist stance.

Now, both cases suppose a particular kind of relationship between metaphysics and science such that we can more or less cleanly distinguish metaphysical claims in theories from those involving unobservables. However, one can argue that metaphysics and science do not stand in such a relationship, and hence one might have doubts whether such a clean distinction can be established. I'll come back to this point shortly.

As Hawley goes on to note, anti-realists might be comfortable with an attitude of Radical Pessimism, because they think that the involvement of a claim about the unobservable in generating predictive success is irrelevant to whether we should believe it; or they might prefer Moderate Pessimism, because they think that claims about the unobservable never do any work in generating novel success. Either way, the anti-realist cannot be an Optimist.

With these taxonomic combinations out of the way, let us turn to the question: are metaphysical claims ever involved in scientific theories in this way? Or, to put it another way: can such claims stand in the kind of relationship to theories that Hawley presupposes, such that the claims can be ruled in or out (putting it very generally) on the basis of the success of these theories?

First of all, it would appear that certain metaphysical claims can certainly be ruled out. Consider for example Leibniz's Principle of Identity of Indiscernibles, which states (again, broadly speaking) that (putative) entities which are indiscernible in some respect are in fact identical. There has been considerable discussion over many years whether the Principle should be understood as necessary or as contingent, with opinion shifting to the latter. Even as such, it has been argued that it has been ruled out by quantum mechanics, on the most plausible understanding of what it is to be indiscernible in this context (French & Redhead, 1988; for further discussion see French & Krause, 2006). Such cases might be taken as providing grounds for a kind of falsificationist Optimism: metaphysical claims can be ruled out by science, and it is this possibility, I think, that motivates many of the negative attitudes towards metaphysics noted above, since it may seem that, in their prolific generation of metaphysical positions without regard to the impact of science, metaphysicians are unaware that many of these positions are metaphysical dead men walking.

However, there are two things to note. First, the relationship here is not best described as one of involvement. It is not that the metaphysical claim is involved in the theory in the way that a claim about unobservables is; rather, the relationship is more akin to that between theory and disconfirming evidence, as quantum mechanics is being used as evidence to rule out this particular item of metaphysics. Secondly, just as apparently falsified theories may regain life as either the evidence or the theory itself is reinterpreted (this possibility being one of the central features of Lakatos' Methodology of Scientific Research Programmes), so the above kind of rejection of metaphysics might be conditional on factors such as the formulation of the theory, its interpretation, the nature or formulation of the metaphysical posit and so on (see Monton, forthcoming). So, for example, one might reformulate quantum mechanics in such a way as to offer a different understanding of what counts as indiscernible, or put forward a different interpretation that also offers a different understanding.v Or one might reformulate the metaphysical posit concerned. Thus Saunders has proposed a form of weak discernibility in terms of which fermions, at least, can be understood as satisfying a form of the Identity of Indiscernibles (Saunders, 2003, pp. 289-307; Ladyman & Bigaj, 2010).
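For reference, the two principles admit standard formal statements; the following are the usual textbook renderings rather than Hawley's or Saunders' exact wording:

\[ \text{PII:}\quad \forall x\,\forall y\,\big[\forall F\,(Fx \leftrightarrow Fy) \rightarrow x = y\big] \]

\[ \text{Weak discernibility:}\quad x \text{ and } y \text{ are weakly discernible iff } \exists R\,\big(R(x,y) \wedge \neg R(x,x)\big) \]

On the second schema, two fermions in the spin-singlet state come out weakly discernible: the relation "has opposite spin component (along a given axis) to" is irreflexive and holds between them, even though the particles share all their monadic properties.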

Of course, one could always insist that such reformulations generate different posits and so the original result stands, strictly speaking, but that's a hard row to hoe, not to mention a churlish one. And just to wrap up this point, Hawley herself has explored in general terms the different ways in which the advocate of the Principle might evade the above kind of falsification, although she concludes that each is unsatisfactory (Hawley, 2009). Thus in the case of quantum particles one might adopt what Hawley calls the "summing defence", according to which one denies the existence of the putative separate parts and instead maintains that the (entangled) system constitutes a whole, such that the Principle of Identity of Indiscernibles does not apply to its (putative) parts. However, something will then have to be said about the apparent emergence of parts from the whole in measurement situations, as well as about the grounds for maintaining that we have n fermions in the relevant state, and so on (again, for discussion see French & Krause, op. cit.). Nevertheless, there are options available to those who wish to retain certain posits, if not at all costs then at least at some cost elsewhere in their metaphysical system.

Moving on, can metaphysical posits be ruled in? In other words, can at least some of these posits be involved in theories in the way Hawley has indicated, such that they can share in the success of the theory? If not, then we will both have grounds for pessimism and face problems responding to Chakravartty's challenge. We would then have to accept certain constraints on a realist understanding of the world.

Here we must overcome an attitude of humility that the introduction of metaphysics seems to require of us. This is based on the claim that where there exists an array of metaphysical facts about which we can have no knowledge, we must adopt an attitude of epistemic humility towards such facts. Consider the example of intrinsic properties and the following argument: we can have knowledge of something only insofar as it affects us, and so our knowledge is dependent on certain relations holding; these relations are not supervenient on or otherwise reducible to the intrinsic properties of things; hence we must remain ignorant of, and adopt an attitude of humility toward, these intrinsic properties.vi

Now there are various ways in which one could resist the force of such an argument (and indeed I shall suggest a number below), but consider the case of metaphysical underdetermination with regard to the notion of individuality in the context of quantum physics given above, which can be construed as another example. Here the metaphysical packages of objects-as-individuals and objects-as-non-individuals are both compatible with quantum mechanics and can be considered equally natural metaphysical doctrines in this context (see Caulton & Butterfield, 2012). Our only access to the relevant putative objects is via the theory, but the theory underdetermines the metaphysics of individuality; hence, as things stand, it seems we must remain ignorant about which of these packages holds and adopt an appropriately humble attitude.

So what we have in this attitude is the conjunction of an existence claim and an ignorance claim, leading to the acceptance of appropriate humility (see Langton, 2009): we must have grounds for claiming that the metaphysical facts exist in some sense, and yet we must be ignorant of them. In such cases the humility limits our realist understanding, and Chakravartty's challenge cannot be fully met. There are then a number of fairly obvious ways in which we can respond to this situation and handle the humility.
4. HANDLING HUMILITY

The first set of responses involves accepting our ignorance and acknowledging that we must be humble, while insisting that this is not in fact a problem. Thus it is certainly not a problem for the constructive empiricist, who adopts a broadly sceptical position towards metaphysics in general. At best, as was indicated above, the understanding that metaphysics provides just fleshes out the different ways the world could be. So, one way the world could be is that quantum particles are individual objects, and another way is that they are non-individual objects, but of course, we have no way of telling which is correct on the basis of our physics. It may also not be a problem for others. Thus one could accept our ignorance of these metaphysical features but still insist that the multiple metaphysical relativities they give rise to lead to greater understanding. Here it seems that we achieve greater understanding at the meta-level, as it were, by surveying these various relativities, or ways the world could be according to the constructive empiricist, rather than by adopting a particular metaphysical package. Thus Howard (2011), for example, writes that an array of possible metaphysical interpretations enriches our understanding of quantum mechanics. However, he insists that this conceptual relativity is not equivalent to the kind of scepticism that lies behind the constructive empiricist stance, but is consistent with a form of realism. The appropriate epistemic attitude in this situation, Howard argues, is neither belief (in the theory plus appropriate metaphysical understanding as true, as the standard realist would have it) nor mere acceptance (of the theory as empirically adequate, as the constructive empiricist insists) but a kind of Peircean pursuit-worthiness.vii So the idea seems to be that metaphysical underdetermination, and the kind of array of metaphysical facts that generates humility more generally, are to be welcomed, since they present a range of options that are worthy of pursuit, and by chasing them down, as it were, we obtain greater understanding. However, Peirce of course agreed with the standard realist that in the long run our beliefs would settle down and the array of possibilities would narrow to just one. Thus the above attitude of regarding these metaphysical options as pursuit-worthy would seem to be a preliminary attitude at best.viii

Alternatively, one may retain the usual elements of the standard realist stance but insist that the above forms of humility are innocuous. Thus, Chakravartty argues that one should no more be worried about the underdetermination regarding quantum individuality than scientific realists are regarding whether a chair is taken to be a substance plus properties, or a bundle of properties, or whether those properties are regarded as instantiated universals or tropes or whatever (Chakravartty, 2003).
to be a substance plus properties or a bundle of properties, or whether those properties are regarded as instantiated universals or tropes or whatever (Chakravartty, 2003). In other words, we do not need to resolve all such cases of underdetermination, or reduce our level of humility entirely, to be a scientific realist. We can be a realist about chairs and other everyday objects without feeling we have to resolve all metaphysical relativities, and likewise we can be a realist about quantum objects without having to resolve the issue of whether they are individuals or not.

However, there is a disanalogy between the case of chairs, or everyday objects more generally, and that of quantum particles, and this seems to present a problem for this easy acceptance of humility (see French & Ladyman, 2003). In the case of everyday objects the issue is not whether they are objects or not but rather, having already established that, how their objecthood should be conceived. Here the matter of access looms large: we have sensorily mediated access to everyday entities, in terms of which we can separate out those that count as distinguishable objects, by means of the relevant properties, or location in space-time, and so on. Once we've established distinguishability, at least in principle, we can then go on to speculate as to the ground of individuality, whether via properties within the scope of an appropriate form of the Principle of Identity of Indiscernibles, or in terms of some form of primitive thisness, or whatever. Here, as elsewhere, I am following Gracia (1988), who suggests that epistemically it is via distinguishability that we become aware of an object as an individual, but that ontologically this individuality is then taken to be grounded in some underlying principle, broadly construed, such as haecceity, or substance, or whatever. And for certain entities, of course, appropriate distinguishability cannot be established: entities such as (famously) euros in a bank account, which are then not regarded as individual objects.

When it comes to quantum particles, we obviously do not have the same form of access as with chairs, and the access that we do have is mediated via the relevant theory, quantum mechanics in this case. The danger of simply reading off the metaphysics from the physics is that the latter, or rather what we take to be the latter, may be infected, as it were, with the metaphysics of the everyday. Indeed, the very foundations of the mathematics we use to frame our theories are already so infected, requiring the genius of Weyl, with his understanding of both those foundations and group theory, to effectively twist that everyday metaphysics to accommodate the new physics (French & Krause, 2006, pp. 261-263). Here we cannot establish distinguishability to begin with, and the choice the realist faces is not the apparently innocuous one of deciding between different metaphysical accounts of the individuality of objects, but that of deciding whether they should even be regarded as individual objects to begin with. In other words, the choice is the much more fundamental one of deciding whether quantum particles are like euros in your pocket or euros in your bank account (see also French & Ladyman, 2003; French, 2006).

Furthermore, there is a tension here with the requirement to supplement one's realism with some form of understanding. In the case of the chair we begin with a much clearer picture than we have of quantum particles, and our relevant
understanding of them is such that we can effectively live with the level of humility associated with not knowing whether the chair is a bundle of properties or has a substantival metaphysical component. In the case of the particles, we do not have that level of understanding to begin with, and the humility appears at a much more fundamental level. Indeed, it appears at the most fundamental level possible as far as the standard (or object-oriented) realist is concerned, namely that of the objects towards which she is adopting her realist stance. But then the question is: how can one adopt such a stance towards something if one does not know whether it is an individual or not?

Let us move on to other ways of handling the humility. One might, for example, accept the existence of the relevant metaphysical facts but reject the claim that we must remain ignorant of which obtain, and insist that we do have appropriate access to the facts. Thus we might try to expand the relevant notion of cognitive access in this regard and appropriately elaborate accounts of knowledge that resolve our apparent ignorance of the metaphysical facts, such as those regarding quiddities, for example (see Schaffer, 2005). Just as a haecceity or primitive thisness is taken to render an object the individual that it is (and thus provides one way of spelling out this notion in the quantum context; see French & Krause, 2006), so a quiddity likewise underpins the identity of properties. So the idea here is that the property of charge is the property that it is because of an underlying quiddity of 'chargeness', such that if this property were instantiated in the absence of any other properties being instantiated, it would still be charge, just as, if a given object existed in the absence of any other objects existing, it would still be an individual object by virtue of its haecceity.ix

Returning to expanding our cognitive access, the thought is that scepticism about quiddities, say, should be regarded as just a form of scepticism about the external world in general, and so whatever answer one offers to scepticism in general will thereby yield an answer to quiddistic scepticism. So one might adopt a broadly contextualist stance and loosen the standards for knowledge sufficiently that there is a sense in which one can say that we know quiddities. One worry here, of course, is that too much loosening will let anything in, as it were, and if we're not careful we'll lose any distinction between what can be known and what cannot. Another is that contextualism hardly seems the right way to go in this situation. It may be that within the metaphysical context, with the loose standards that are appropriate for that context, we can justifiably assert that we know quiddities, but the context we are concerned with is one that covers both science and the philosophy of science, where it is, at the least, unclear that such loose standards are appropriate. And if the standards are those that govern knowledge claims about entities such as electrons, or properties such as charge, then it would seem these are too tightly drawn to cover quiddities.

Taking a different tack, one might try to argue that we have direct perception of such features of properties, in just the way that, it might be said, by putting one's finger in an electrical socket one can directly perceive charge.x But this would be to ride roughshod over all sorts of distinctions in the philosophy of science between
phenomena and theoretical entities, and would so extend the notion of direct perception as to render it meaningless. Consider: if we can directly perceive charge in this way, can we likewise directly perceive spin, or colour (the quark property, not the visual one)? If not, why not? And if there are barriers to perceiving spin, do these also apply to quiddities? But of course, even if one were to agree, madly, that one can directly perceive charge, as a property, it is quite another thing to insist that one can directly perceive metaphysical features of such properties, such as their quiddities. One might want to try the line that one directly perceives the quiddity of the property by virtue of directly perceiving the property itself, so that one perceives the 'charginess' of charge when one perceives charge, but then I start to lose my grip on the distinction between the property and its quiddity. As a way of handling humility, this would collapse all kinds of distinctions and seems a step too far.

Alternatively, and more plausibly perhaps, one might suggest that we can have abductive knowledge of such metaphysical facts. Thus one might argue that quiddities offer the best explanation of the relevant phenomena and hence can be known in just the way that we can know theoretical entities and properties that are also offered in this way. The way we treat metaphysical features would thus be put on a par with the way we treat theoretical ones. (Of course the empiricist would not be happy with such a move in either case, but I suspect we've left her behind some while ago.) Here we might usefully compare this move to similar ones made in the philosophy of mathematics, where it has been argued that mathematical entities offer the best explanation of certain phenomena (such as the periodic life cycles of cicadas or the structure of honeycombs) and hence should be regarded on a realist basis (see Baker, 2009). Care has to be taken in such cases, not least because such arguments leave it unclear whether the mathematics is truly playing an explanatory role and not just a representational or indexical one (see for example Saatsi, 2007, 2010; Bueno & French, 2012). Likewise, we need to be clear on what explanatory role quiddities, for example, are supposed to be playing and what it is that they are supposed to be explaining. This takes us back to the argument that metaphysical terms can be treated like theoretical ones; but the former do not play the same role in theories as the latter. In particular, if we consider what is involved in generating predictions and yielding empirical success when it comes to scientific theories, then metaphysical terms cannot be considered as success-inducing in the same way as theoretical ones (e.g. Saatsi, 2008). If one were to insist that terms like 'quiddities' are not meant to play any role in explaining physical phenomena but do play such a role in the metaphysical context (assuming some appropriate notion of metaphysical phenomena can be made out), then we are back to contextualism and the response that that is not the context we are concerned with here.

Relatedly, one might attempt to reject the claim of ignorance and break the metaphysical underdetermination by insisting that we should accept a metaphysical posit if it is essentially involved in a theory that generates novel predictions. But again, the involvement of metaphysical posits is not akin to that of theoretical ones, and the underdetermination and consequent humility remain.


Here's a different tack that one might adopt: on a (broadly) Quinean approach (see Belot, 2009) one might posit the simplest total theory (involving the given metaphysical posit) that is consistent with the evidence, giving a nice parallel between the scientific and metaphysical methods. And indeed, there is a flourishing field of metametaphysics, certain proponents of which advocate the view that theory choice in metaphysics should be modelled on the methodology of theory choice in science (see Chalmers, Manley, & Wasserman, 2009; Callender, 2011). But of course, pinning down the latter is no easy matter! Thus, it is more or less accepted as a well-known truism that there is no argument demonstrating that simplicity tracks the truth in the scientific case, and that, furthermore, the problem of characterising what counts as a simple theory is notoriously difficult. If that is the case for the mathematised theories of much of modern science, where one can at least take a crack at the problem by focusing on the number of variables, say, or the mathematical form of the theory, then how much more problematic is it going to be to determine what counts as a simple metaphysical theory!

More profoundly, perhaps, in the scientific case the most generally accepted broadly methodological picture of theory choice is the Bayesian one, which offers a probabilistic conception in which the impact of evidence is crucial (in updating our degree of belief in the truth of the given theory, on one fairly straightforward version of this picture). But in the metaphysical case there is no such evidence that can drive theory revision (or at least, not in the same relatively straightforward sense) and, in particular, there is no evidence to wash out disagreements over simplicity. As Belot nicely puts it, 'If ontology follows a version of the scientific method, the relevant version is a degenerate case; and, I think, we should be suspicious of the credentials of its output' (op. cit.).

But perhaps attempting to draw such a straightforward parallel between the methodologies of science and metaphysics is simply too quick. Perhaps a better and more sophisticated approach would be to adopt a naturalistic framework within which the relationship between such metaphysical claims and the relevant scientific theories can be appropriately articulated. Indeed, this is what Ladyman and Ross do as part of their defence of structural realism. In particular, they advocate the following Principle of Naturalistic Closure: '[O]nly take seriously those metaphysical claims that are motivated by the service they would perform in showing how two or more hypotheses jointly explain more than the sum of what is explained by the two hypotheses taken separately' (Ladyman & Ross, 2007, p. 37). This is conjoined with what they call the Primacy of Physics Constraint: 'Special science hypotheses that conflict with fundamental physics, or such consensus as there is in fundamental physics, should be rejected for that reason alone. Fundamental physical hypotheses are not symmetrically hostage to the conclusions of the special sciences' (ibid., p. 44).


Together, these yield positive and negative prescriptions regarding the role of metaphysics and its relationship to science. The positive is that metaphysics is now seen as 'the enterprise of critically elucidating consilience networks across the sciences' (ibid., p. 28). And the negative is that we should reject any metaphysical hypothesis that conflicts with fundamental physics.

Ladyman and Ross tackle the issue head on, at least, and present a useful framework in which claims about the relationship between specific metaphysical posits and the relevant scientific context can be evaluated. However, it has been criticised both for being too liberal, and rejecting too little, and for being too restrictive, and rejecting too much. It rejects too little, it is argued, because many contemporary scientific theories are themselves 'neo-scholastic' insofar as they contain (naturalistically unjustified) metaphysical assumptions (Dicken, 2008, p. 291). Thus, insofar as current science incorporates metaphysical posits that do not satisfy the Principle of Naturalistic Closure, such posits should also be expunged; but doing so would remove many of the interpretive elements from the theories concerned. Underlying this criticism is the concern that there is an ambiguity in what is meant by 'fundamental physical hypotheses' in Ladyman and Ross's scheme: do we mean the hypothesis as formally given, or as interpreted? If the former, then we seem to be edging uncomfortably close to a positivistic understanding of theories; if the latter, then it is hard to see how one could avoid including at least some metaphysics in such an interpretation. Here again we bump up against the considerations presented above in the context of deciding whether quantum physics rules out the Principle of Identity of Indiscernibles.

On the other hand, the Ladyman and Ross scheme rejects too much because it would rule out the important heuristic role of metaphysics (Dicken, 2008, pp. 290-293). This is something that even non-naturalistically inclined philosophers of science might be willing to accept, and it has been emphasised across a range of contexts. Thus Howard writes that 'if fertility in the form of the generation of pursuitworthy physical and metaphysical possibilities is a paramount virtue in inquiry, a very important role for naturalistic metaphysics is indicated' (Howard, 2011). Likewise Hawley, in her review of Ladyman and Ross (2007), suggests that they are too dismissive of metaphysics, arguing that 'the actual work of many contemporary metaphysicians provides conceptual resources and tools which can be of great use to anyone who is attempting, admirably, to draw metaphysical conclusions from the detailed study of current science' (Hawley, 2010, p. 179).

This is precisely the view that I adopt in what has been called the 'Viking Approach' to metaphysics, where a variety of conceptual resources and tools can be laid out as available for the philosopher of physics, or the philosopher of science in general, to appropriate and utilize. Consider the issue of the composition of objects, for example. Here metaphysicians stand accused of relying on intuitions or unrealistic toy models in articulating their answers to the questions: when is it true
of certain objects that they compose something, and, conversely, when is it true of an object that there are objects that compose it? But while I agree that properly naturalised responses to these questions will look to the relevant physics and chemistry (in order to establish the sense in which water is composed of H2O molecules, these of hydrogen and oxygen atoms, those of electrons, protons and neutrons, and the protons and neutrons in turn of quarks), it may be that certain strategies and techniques used by metaphysicians in their consideration of intuitive scenarios and toy models might be appropriated and applied to the scientifically better informed cases (see French, forthcoming). The crucial point is that even where metaphysics has been developed in the absence of any relationship with current physics, or, as is more often the case, on the basis of only everyday examples or at best toy models, it may still prove useful.xi

Returning to the issue of how best to handle this attitude of humility towards metaphysical facts, none of the above approaches seems to me adequate when it comes to the posits that realism is concerned with. Instead I suggest we handle this humility in double-barrelled fashion: by rejecting the existence of the facts concerned, and thus eliminating the possibility of our being ignorant of them. I shall take my lead from Faraday, who asked, 'Why then assume the existence of that of which we are ignorant, which we cannot conceive, and for which there is no philosophical necessity?'xii (Faraday, 1844, p. 140). Now by 'existence' here I mean a narrow construal, in the sense of existing as facts that have some grounding in the relevant scientific facts, or what Cassirer (see below) called the domain of 'empirical existence'. So the idea is to accept only such metaphysical posits as we minimally require to interpret our theories, along the lines suggested by Chakravartty. And I want to use this to push the claim that we do not so minimally require a metaphysics of objects, which generates unacceptable levels of humility via the metaphysical underdetermination regarding individuality mentioned above. This idea is encapsulated in what I shall call Cassirer's Condition: take the conditions of accessibility to be conditions of the objects of experience. By 'conditions of accessibility' I shall understand those conditions, encoded in our best theories, that give us access to the way the world is (on a realist construal). And by 'the conditions of the objects of experience' I shall understand those conditions that lay down how the world is, where, of course, we are taking 'objects' here in a broad sense. If we adopt this condition, then 'there will no longer exist an empirical object that in principle can be designated as utterly inaccessible; and there may be classes of presumed objects which we will have to exclude from the domain of empirical existence because it is shown that with the empirical and theoretical means of
knowledge at our disposal, they are not accessible or determinable' (Cassirer, 1936, p. 179). This is how I view the objects posited by standard forms of scientific realism: as not accessible, via our theories, nor determinable, in the sense of our being able to specify well-defined identity conditions for them on the basis of those theories.

The general attitude that underlies Cassirer's Condition crops up elsewhere in philosophy. Thus Hawthorne, echoing Faraday, asks, '[w]hy posit from the armchair distinctions that are never needed by science?' (Hawthorne, 2001). And returning to the particular issue of positing quiddities, he writes: 'If there were a quiddity that were, so to speak, the role filler, it would not be something that science had any direct cognitive access to, except via the reference fixers ("the quiddity that actually plays the charge role"). Why invoke what you don't need?' (Hawthorne, 2001). I shall take this as a modern expression of what lies behind Cassirer's Condition above. Similar sentiments are expressed by Esfeld, for example, who also notes the gap that appears between metaphysics and epistemology if an attitude of humility is allowed (Esfeld, forthcoming), and who likewise urges the closing of this gap by denying the existence of quiddities as underpinning the identity of properties. The central point is that humility can be handled by eliminating the inaccessible posits whose existence opens the gap in the first place. Doing so in the context of the debate over scientific realism pushes us towards a form of structural realism, as we shall now see.
5. GAINING UNDERSTANDING WHILE REDUCING HUMILITY

Let us recall Chakravartty's Challenge above and the demand to provide understanding of scientific theories by offering an appropriately metaphysically informed interpretation. Here we've looked at some of the obstacles faced by, and the dangers inherent in, such an interpretation. Standard, or object-oriented, realism in particular is hamstrung through being unable to ground the identity conditions of its objects in the relevant physics, and the metaphysical underdetermination regarding identity and individuality introduces an unbridgeable gap between the relevant epistemology and metaphysics, bringing with it a level of humility that, I would insist, is too much for any realist to swallow. I'll come back to this shortly, but clearly what we need to do is to balance the gain in clarity and understanding that metaphysically informed interpretations can yield against an appropriate reduction in the level of humility that we have to accept as a consequence. We can do this if we follow something like the following process: we draw on metaphysics to respond to Chakravartty's Challenge, and thus involve metaphysics in our interpretation of science, but only as much as necessary; we then reduce any associated humility in line with Cassirer's Condition regarding the reliance on what he called the conditions of accessibility, thereby
minimizing the metaphysics as much as possible. Indeed, Chakravartty himself expresses something along these lines when he writes: 'we must turn to the equations with which we attempt to capture phenomenal regularities, and ask: what do these mathematical relations minimally demand? We must consider not what possible metaphysical pictures are consistent with these equations, but rather what kinds of property attributions are essential to their satisfaction; i.e. to consider not what is possible, but what is required' (Chakravartty, 1998, p. 396). Below I shall suggest that ontic structural realism achieves just the right balance of gain in understanding with reduction in level of humility.
6. MANIFESTATIONS OF HUMILITY IN THE REALISM DEBATE

So, let us begin with what I have called object-oriented realism, crudely summarized in the claim that, reading off the relevant physics, we obtain a picture of the world as composed of objects that possess certain properties, enter into certain relations, and so on. The question then is: what sort of objects are these? More specifically, can we understand them in terms of our usual metaphysical frameworks regarding individuality and identity or not? Unfortunately, the metaphysical underdetermination noted above prevents us from giving a definitive answer to this question, at least on the basis of the physics itself. Likewise, how should we understand the relevant properties? In particular, is their identity given by quiddities or not? Again, we can't say, on the basis of the physics. Here we have way too much humility! Indeed, it is surprising that the object-oriented realist has got away with such a high level of humility for so long, but perhaps this is simply because the metaphysics behind her realism is typically not examined very closely, which in turn has to do with the continued failure of realists in general to engage with the implications of quantum mechanics.

What about Chakravartty's own view, semi-realism (Chakravartty, 2007)? Here we have a dispositional framework in which properties are understood in terms of causal powers, extended holistically to include relations in a way that meshes nicely with some of the central features of modern physics. As far as properties are concerned, then, quiddities are excluded from the picture and hence the level of humility is correspondingly reduced. However, semi-realism still retains objects as the seat of these causal powers and thus still falls prey to the metaphysical underdetermination regarding individuality. Again, then, there is still too much humility in this respect, although, as I have also noted, Chakravartty takes the underdetermination to be innocuous. Perhaps I can use this idea of reducing our level of humility to press my earlier point that this underdetermination at the level of quantum particles should not be set on a par with, and dismissed alongside, the kind of underdetermination we find with regard to whether everyday objects should be regarded as bundles of tropes or universals. In the latter case the level of humility, although high, can indeed be regarded as innocuous, because these objects are not taken to be elements of our fundamental ontological base. In a sense it just
doesn't matter that different metaphysical accounts can be given of them, because there is a tacit understanding that they are dependent upon, or indeed eliminable in favour of, a more fundamental set of objects. These make up or compose, in some sense, the way the world is, and here too much humility is an issue, as the gap between epistemology and metaphysics widens and we find ourselves buying into a picture, such as that of the object-oriented realist, where we have to accept elements that are simply not grounded in our best scientific theories.

Moving on, what about Epistemic Structural Realism, with its claim that all we know (i.e. all we have epistemic access to) is structure (Worrall, 1989)? Unlike most versions of object-oriented realism, here at least we're starting from the right epistemic point, with the structures presented to us by theories (in the form of the relevant mathematical equations). But here again humility enters with the hidden 'natures' that the epistemic structural realist (following Poincaré) takes to lie behind the structure; indeed, we get an extra helping of humility by virtue of their hiddenness! At least the object-oriented realist's objects are intended to be out in the epistemic open, as it were, but here we have something utterly inaccessible that is posited solely to prop up the structures to which we do have access (thus assuming that they need such props).

Sliding across the metaphysical spectrum, Saatsi, in his 'eclectic realism' (Saatsi, 2008), questions whether, even in the classical Fresnel-Maxwell case that the epistemic structural realist invokes, it should be the equations that are the focus of attention. Thus, Saatsi argues that the success-yielding features of theories can be identified, in a metaphysically minimal manner, with those theoretical properties, such as spin, charge etc., which are actually involved in the relevant theoretical derivations or, more generally, which lie at the theoretical end point of the relationships between theory and phenomena. He demonstrates that the recovery of Fresnel's equations from Maxwell's theory can be articulated in terms of certain dispositional descriptions that are satisfied by those properties that feature in the solutions of Maxwell's equations. Here again we have a shift away from objects, but Saatsi is keen to steer clear of structuralism, arguing that the balance should tip towards the epistemological rather than the metaphysical aspects of realism.xiii Whether he can do so remains unclear. We recall the well-known criticism regarding the tenability of structural realism's distinction between nature and structure (Psillos, 1999): the object-oriented realist would insist that the relevant properties are those of an unobservable object, whose nature is ultimately playing the explanatory role Saatsi is concerned with. However, the structural realist would claim that if the nature of these objects is cashed out in metaphysical terms, then the conclusion doesn't follow; and if it is not, then 'nature' signifies nothing more than the relevant properties and the conclusion is empty. In that case the structuralist can agree that Saatsi's principles tell us something about the relevant properties, where these are understood as aspects of structure. In the absence of such an understanding it is unclear how we are to regard them; in a sense, eclectic realism avoids humility but offers too little metaphysics, and thus may fail Chakravartty's Challenge.


We come now, like Goldilocks, to Ontic Structural Realism, which eliminates objects and insists that all there is, is structure (Ladyman, 1998; French & Ladyman, 2003, 2011). Here we find a balance between understanding and humility that, I would argue, is just right! As in the case of the epistemic form above, it proceeds from the appropriate epistemic base, but it avoids the humility of hiddenness. And, of course, unlike object-oriented realism it overcomes the obstacle presented by metaphysical underdetermination, by dropping the entities whose 'identity profiles' (to use Brading's phrase) remain detached from that epistemic base. Moreover, as in the case of semi-realism, it understands properties in terms of their nomic role and hence does away with quiddities (see French, 2006, forthcoming). Thus the principal sources of metaphysical humility are eliminated.

Of course, further attitudes of humility may have to be adopted as the structuralist goes on to elaborate an understanding of her ontology in terms of the metaphysics of structure. But I would hold that this humility would also be generated for the object-oriented realist, the semi-realist and the epistemic structural realist, insofar as they are obliged to engage in a similar exercise if they are going to take the structures presented by physics seriously (as they should). In particular, none of these views has, so far, properly engaged with some of the most ubiquitous, powerful and important structures presented by modern science, such as those represented by the symmetries of contemporary physics (a simple instance is sketched below). Here considerable work remains to be done, but insofar as all forms of scientific realism are going to have to do such work, and outline an appropriate metaphysical understanding of these structures in order to meet Chakravartty's Challenge, I shall take any humility that has to be adopted as a result as applying across the board and not simply to structural realism alone.
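By way of illustration, and as a textbook formulation rather than anything specific to the present argument (for discussion see French & Krause, 2006), permutation symmetry is one such structure, and it is also what generates the underdetermination regarding individuality discussed earlier. For a system of identical particles, quantum mechanics requires that no observable distinguish permuted states: for any observable \(A\) and any permutation operator \(P\),

\[ \langle \psi |\, P^{\dagger} A P \,| \psi \rangle \;=\; \langle \psi |\, A \,| \psi \rangle , \]

and the states found in nature are confined to the symmetric and antisymmetric representations of the permutation group,

\[ P|\psi\rangle = +|\psi\rangle \ \ \text{(bosons)}, \qquad P|\psi\rangle = -|\psi\rangle \ \ \text{(fermions)}. \]

On the structuralist reading, it is group-theoretic structure of this kind that carries the physical content, whatever one goes on to say about the 'objects' permuted.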
7. CONCLUSION

Adopting a naturalistic attitude pushes us to take seriously our best scientific theories. But taking these literally leaves us unclear as to how we should understand them, and hence as to how we are to have a clear picture of how the world is. Accepting Chakravartty's Challenge means clothing what physics tells us in some metaphysical suit or other, and by virtue of going beyond the physics this induces an attitude of humility. The trick then is to balance the reduction in humility that the naturalistic attitude demands against the need to meet Chakravartty's Challenge. Standard object-oriented realism and epistemic structural realism compel us to be too humble about how the world is. Ontic structural realism achieves the right balance by eliminating objects, hidden or otherwise, and is the position on which all naturalistically inclined realists should settle.
NOTES
i What follows is based on talks I gave at the LSE, the Boston Center for the Philosophy of Science and the Institute for the History and Philosophy of Science, Toronto. As always I'm grateful to the audiences for their comments and general feedback, no matter how incredulous (yes, I'm looking at you, Alasdair MacIntyre!).
ii As Howard has noted to me (private communication about his work in progress), Carnap went on to say that metaphysics could be seen as an expression of one's attitude to life and compared it to music, insisting, however, that '[m]etaphysicians are musicians without musical ability'. Interestingly, given the form of structural realism I defend, he also wrote: '[p]erhaps music is the purest means of expression of the basic attitude because it is entirely free from any reference to objects' (Carnap, op. cit., p. 80).
iii Thus, as sceptical as we should be about metaphysical attempts to say how the world is, rendering metaphysics completely dependent on science would be as problematic as doing the same for logic, say. In both cases we would lose the opportunity to explore new lines of enquiry unencumbered by already established worldviews, and to generate the array of tools mentioned above.
iv Popper, of course, famously maintained that along with those metaphysical ideas that have impeded the progress of science, there are those that have aided it. Indeed, he maintained that scientific discovery would be impossible without the kinds of speculative ideas that one might call metaphysical.
v The Bohmian and modal interpretations both offer escape routes for the advocate of the Identity of Indiscernibles (for example, see French & Krause, op. cit.).
vi This is a crude condensation of the argument given in Langton (1998), which aims to show that Kant is not the kind of transcendental idealist we all thought he was but was in fact a kind of realist who took our knowledge to be constrained by our limited access to, for example, intrinsic properties and hence things as they are in themselves. In a sense Langton portrays Kant as a kind of epistemic structural realist who adopts this attitude of epistemic humility towards the hidden natures of things. Another argument for humility was given by Lewis (2009) based on the multiple realisability of properties; for the differences between the forms of humility in each case see Langton (2004).
vii For discussion of the possibility of adopting a pragmatist stance towards the philosophy of science, and a Peircean one in particular, see da Costa and French (2003).
viii An alternative would be to adopt a metaphysically informed variant of Cartwright's patchwork realism, but that I think is an avenue of despair in this context.
ix And so arguments for positing quiddities draw on a problematic metaphysical manoeuvre, namely that of imagining a sparse possible world of, in this case, one instantiated property, or, in the case of haecceities, one lonely object. Given the way properties such as charge are understood in current physics, invoking such a world may eliminate the basis for such an understanding and hence cast into question the propriety of referring to charge at all (see French & McKenzie, forthcoming).
x Kids, don't try that at home!
xi More precisely, even if the concepts of metaphysics prove to be of little worth due to their lack of engagement with modern physics, the strategies and manoeuvres that metaphysicians deploy, even as part of the development of these concepts, may still turn out to be useful.
xii Before anyone gets excited by Faraday's use of 'philosophical' here and takes what he says as the basis for a claim that quiddities, say, are philosophically necessary, let me just note that he meant the term in its old-school sense that embraced the scientific.
xiii A notion of Explanatory Approximate Truth is central to his view.

REFERENCES
Baker, A. (2009). Mathematical explanation in science. British Journal for the Philosophy of Science, 60, 611-633.
Belot, G. (2009). Simplicity and ontology. Talk given at the Pacific APA, Vancouver.
Bueno, O., & French, S. (2012). Can mathematics explain physical phenomena? British Journal for the Philosophy of Science, 63(1), 85-113.
Callender, C. (2011). Philosophy of science and metaphysics. In S. French & J. Saatsi (Eds.), The continuum companion to the philosophy of science. Continuum Press.
Carnap, R. (1963). Intellectual autobiography. In P. A. Schilpp (Ed.), The philosophy of Rudolf Carnap (pp. 3-84). La Salle, IL: Open Court.
Cassirer, E. (1936). Determinism and indeterminism in modern physics. Yale University Press (1956).
Caulton, A., & Butterfield, J. (2012). Symmetries and paraparticles as a motivation for structuralism. British Journal for the Philosophy of Science, 63(2), 233-285.
Chakravartty, A. (1998). Semirealism. Studies in History and Philosophy of Modern Physics, 29, 391-408.
Chakravartty, A. (2003). The structuralist conception of objects. Philosophy of Science, 70, 867-878.
Chakravartty, A. (2007). A metaphysics for scientific realism. Cambridge University Press.
Chakravartty, A. (2010). Metaphysics between the sciences and philosophies of science. In P. D. Magnus & J. Busch (Eds.), New waves in philosophy of science (pp. 59-77). Palgrave Macmillan.
Chalmers, D., Manley, D., & Wasserman, R. (Eds.). (2009). Metametaphysics: New essays on the foundations of ontology. Oxford University Press.
Da Costa, N. C. A., & French, S. (2003). Science and partial truth. Oxford University Press.
Dicken, P. (2008). Conditions may apply. Studies in History and Philosophy of Science, 39, 290-293.
Esfeld, M. (forthcoming). Causal realism. In D. Dieks & M. Weber (Eds.), Points of contact between the philosophy of physics and the philosophy of biology. Dordrecht: Springer.
Faraday, M. (1844). A speculation on the nature of matter. The London, Edinburgh and Dublin Philosophical Magazine, XXIV, 140-143.
French, S. (2006). Structure as a weapon of the realist. Proceedings of the Aristotelian Society, 106, 167-185.
French, S. (forthcoming). The structure of the world: Metaphysics and representation. Oxford University Press.
French, S., & Krause, D. (2006). Identity in physics. Oxford University Press.
French, S., & Ladyman, J. (2003). Remodelling structural realism: Quantum physics and the metaphysics of structure; and The dissolution of objects: A reply to Cao. Synthese, 136, 31-56 and 73-77.
French, S., & Ladyman, J. (2011). In defence of ontic structural realism. In A. Bokulich & P. Bokulich (Eds.), Scientific structuralism (pp. 25-42). Boston Studies in the Philosophy of Science. Springer.
French, S., & McKenzie, K. (forthcoming). Thinking outside the toolbox: Towards a more productive engagement between metaphysics and philosophy of physics. European Journal of Analytic Philosophy.
French, S., & Redhead, M. (1988). Quantum physics and the identity of indiscernibles. British Journal for the Philosophy of Science, 39, 233-246.
Gracia, J. (1988). Individuality. State University of New York Press.
Hawley, K. (2006). Science as a guide to metaphysics? Synthese, 149, 451-470.
Hawley, K. (2009). Identity and indiscernibility. Mind, 118, 101-119.
Hawley, K. (2010). Protecting rainforest realism. Metascience, 19, 161-185.
Hawthorne, J. (2001). Causal structuralism. Philosophical Perspectives, 15, 361-378.
Howard, D. (2011). The physics and metaphysics of identity and individuality. Metascience, 20, 225-231.
Ladyman, J. (1998). What is structural realism? Studies in History and Philosophy of Science, 29, 409-424.
Ladyman, J., & Bigaj, T. (2010). The principle of the identity of indiscernibles and quantum mechanics. Philosophy of Science, 77, 117-136.
Ladyman, J., & Ross, D. (2007). Every thing must go: Metaphysics naturalized. Oxford University Press.
Langton, R. (1998). Kantian humility: Our ignorance of things in themselves. Oxford: Oxford University Press.
Langton, R. (2004). Elusive knowledge of things in themselves. Australasian Journal of Philosophy, 82, 129-136.
Langton, R. (2009). Ignorance and intrinsicality. Talk given at the Pacific APA, Vancouver.
Lewis, D. (2009). Ramseyan humility. In D. Braddon-Mitchell & R. Nola (Eds.), Conceptual analysis and philosophical naturalism (pp. 203-222). Cambridge, MA: MIT Press.
Monton, B. (forthcoming). Prolegomena to any future physics-based metaphysics.
Price, H. (2009). Metaphysics after Carnap: The ghost who walks. In D. Chalmers, D. Manley, & R. Wasserman (Eds.), Metametaphysics (pp. 320-346). Oxford: Oxford University Press.
Psillos, S. (1999). Scientific realism: How science tracks truth. Routledge.
Saatsi, J. (2007). Living in harmony: Nominalism and the explanationist argument for realism. International Studies in the Philosophy of Science, 21(1), 19-33.
Saatsi, J. (2008). Eclectic realism: The proof of the pudding. Studies in History and Philosophy of Science, 39, 273-276.
Saatsi, J. (2010). The enhanced indispensability argument. British Journal for the Philosophy of Science, 62, 143-154.
Saunders, S. (2003). Physics and Leibniz's principle. In K. Brading & E. Castellani (Eds.), Symmetries in physics: Philosophical reflections. Cambridge University Press.
Schaffer, J. (2005). Quiddistic knowledge. Philosophical Studies, 123, 1-32.
Sider, T. (2001). Four-dimensionalism. New York: Oxford University Press.
Van Fraassen, B. (1989). Quantum mechanics: An empiricist approach. Oxford University Press.
Worrall, J. (1989). Structural realism: The best of both worlds? Dialectica, 43, 99-124.
Wüthrich, C. (forthcoming). The fate of presentism in modern physics. In R. Ciuni, K. Miller, & G. Torrengo (Eds.), New papers on the present: Focus on presentism. Philosophia Verlag.


SERGIO F. MARTÍNEZ

THE SCIENTIFIC UNDERCURRENTS OF PHILOSOPHICAL NATURALISM

1. INTRODUCTION

Broadly speaking, naturalism refers to views that consider philosophical method to be continuous with the methods of science, implying that at least some scientific methods have an impact on whatever philosophy can say about the norms of inquiry. When naturalism is used as a model for epistemology one talks of naturalized epistemology. Similarly, naturalized philosophy of science indicates a philosophy of science that is continuous with science. How to understand such continuity is a major source of controversy in epistemology and the philosophy of science.

The continuity in question is usually understood as having two different sources. On the one hand, this continuity is seen as a consequence of the realization that knowledge and justification are psychological concepts that cannot be understood through mere logical analysis. This idea can be elaborated in several ways. One possibility is to say that there is no philosophical theory of knowledge over and above natural science which spells out the methods of inquiry and can thus be used to decide the epistemic status of scientific claims. This is a strong view of continuity that simply replaces the traditional theory of knowledge with scientific method. Another possibility is to claim that continuity requires not replacement of the philosophical theory, but its supplementation with scientific methods.

The other traditional source of continuity is associated with the recognition of the failure of logical positivism to provide the basic framework for understanding science, and in particular with the recognition by historically minded philosophers of science that scientific methods are not a priori and that we have to give due importance to the history of science and other empirical studies of science in order to provide a substantive basis for the philosophy of science.

In contemporary philosophy of science, the most interesting proposals blend both sources of continuity into an integrated account. Philip Kitcher, for example, wrote a well-known long paper entitled 'The Naturalists Return' (Kitcher, 1992) and a book, The Advancement of Science (Kitcher, 1993), in which he combines the acknowledgment that epistemology has to be naturalized with the denial of the a priori nature of scientific methodology. Kitcher elegantly integrates the history of science in his account, but, following the logicist tradition (and Hempel in particular), he considers that logical analysis is sufficient in order to identify the structure and typology of the psychologically
instantiated arguments that are important in uncovering the structure of scientific advance. The continuity between science and philosophy for Kitcher, then, is grounded not only in empirical methods but also in a logical analysis of the forms of argumentation that are taken for granted. Such a view is compelling because it is accompanied by the idea that scientific progress can be modeled as the accumulation of significant truths about the world. In 1993, Kitcher thought that significance was an objective property of truths. This lends plausibility to the view that it is not important how we arrive psychologically at those significant truths, or at least that it is not important for the philosophy of science. Kitcher claims that his account is naturalistic because it gives weight to the history of science and the history of methodology in order to identify significant truths and the typology of the relevant (psychologically instantiated) arguments. But as Kitcher himself has ended up recognizing, significance only makes sense within a context, and thus the typology that is taken as a fixed point in his approach itself needs to be anchored.

I will argue in this paper for a different approach to naturalism, one less concerned with the history of the question within the philosophy of science and more interested in developing a philosophical view that I think is implicit in the way science advances. This is an approach which, like Kitcher's, acknowledges the importance of the structure of explanations as a guide for naturalism. But the source of the relevant continuity will be located elsewhere. The continuity that matters for what I will call scaffold naturalism is to be found in the mutually scaffolded structure of explanatory practices; the source of normativity that such naturalism aims to characterize is to be found in the way such practices come to integrate heterogeneous concepts and representations into scientific understanding.

The usual way of thinking about continuity is too dependent on concerns arising from the non-naturalistic (logical empiricist) past of the philosophy of science. Naturalization should not be seen as taking place against the background of a given discipline. Quine thought that epistemology should dissolve into behavioral psychology (Quine, 1969), Goldman thought that epistemology should be naturalized with respect to (a version of) cognitive psychology (Goldman, 1988), and sociologists of science claim that naturalization of the philosophy of science means interpretation within the explanatory framework of one sort of sociology or another. These reductive approaches to naturalization no doubt have something important to contribute to epistemology and the philosophy of science, or at least to sociology, but we are missing something crucial if we do not see that naturalization has a more integrative epistemic dimension, one rooted in the way different explanations look for mutual accommodation and thus serve as mutual scaffolding supporting better explanations. The issue is not supplementation or replacement of philosophical method as a whole. Naturalism is not one masterstroke of a brush but a long process of subtle strokes, better characterized as an evolutionary process of interaction among practices which, by comparing and constraining the scope of models, concepts and explanations, promotes their integration. Such integration, which sometimes involves replacement, sometimes supplementation and at other times a tweak of methods and
norms, serves as a scaffold for further diversification (and specialization) of the tapestry of scientific practices. Since scaffold naturalism avoids reductive assumptions about the ultimate source of epistemic legitimacy, it should be seen not as a straitjacket for epistemology but as a horizon of epistemic normativity stretching across the backdrop of our scientific understanding of the world.

One basic idea behind scaffold naturalism is that the naturalization of philosophy of science is closely related to the sort of integration associated with the search for understanding. Understanding is no doubt a main epistemic aim of science. In the logicist tradition understanding was undervalued as an epistemic aim because it was considered to be merely a psychological phenomenon. But this is not something that should worry us: nowadays it is widely recognized that epistemology cannot be divorced from psychology. Similar motivations once led philosophers of science to dismiss understanding as an aim of science; yet one can argue that understanding, maybe even more than knowledge (in the sense of justified belief), is an epistemic aim in science. Most approaches characterize understanding as a distinctive epistemic virtue going beyond explanation; for example, understanding is taken to consist in the virtue of unifying explanations. In my sense, rather, understanding is an emergent feature of our mastery of different explanations and of the way they support each other (as this feature gets embodied in practices).i

In the philosophy of science the topic of understanding as a distinctive epistemic aim is a recent topic of discussion, but scientists have often talked of understanding as a major aim of science. This is not only the case in the social sciences. Darwin, in On the Origin of Species, aims to understand, not just to add to stores of knowledge. As Einstein famously (is said to have) said, every fool can know; the point is to understand. And it is not hard to find contemporary scientists recognizing the importance of understanding as an epistemic aim. Such recognition often makes use of an assumed relation between understanding and reductionism, but we should read further before prejudging their concept of understanding. For example, Regev and Shapiro believe that the distinctive mark of scientific understanding is the reduction of phenomena into simpler units. Yet here reduction is associated not with ontological monism but with the identification of the right abstractions, namely those which allow for the integration of very different phenomena into more general and more tightly related explanations (see Regev & Shapiro, 2000).

From this perspective, which takes understanding seriously as an epistemic aim, naturalization is first and foremost a philosophical attitude towards the different ways in which the diversity of methods and explanations can be productively integrated into understanding. Such a view goes against a deeply ingrained belief in ontological monism. Herbert Gintis, for example, has been arguing that the social sciences are defective because they study human behavior from different perspectives that are not consistent with each other (Gintis, 2007). He assumes that progress is tied to the establishment of a unified theoretical framework that would integrate the social sciences and thus dissolve the incommensurabilities associated with the use of different ontologies and representations. But this is not
the only way of thinking about scientific progress and naturalism, as we have just seen. And in the social sciences there are several interesting proposals that go beyond such accounts of progress and naturalism. Sperber (2011), for example, argues that a naturalization of the social sciences demands the ongoing naturalization of psychology, and that such naturalization does not require the flattening of ontologies. The ontologies of a naturalized social science would articulate a naturalistic description of mental and environmental events using heterogeneous concepts and representations. Heterogeneity of concepts and representations does not undermine their capacity to integrate different ontologies.
2.

In the preface to her book, Maddy (2007) says that when she set out to write on the subject of naturalism in mathematics, she assumed that everyone knew what it means to be a naturalist, and that her job was to show how to extend this idea into mathematics. What she discovered was that everyone, naturalist or not, had a different idea of what naturalism requires. We should not be surprised by such diversity of opinion on naturalism, since the overall tendency is to develop naturalism as part of a philosophical tradition in epistemology and the philosophy of science which, from the 19th century until very recently, was interested in legitimizing its non-naturalism. An implicit claim imposed upon naturalism is that it should be consistent with a minimal non-naturalism that different authors formulate in different ways but which always includes an assumption as to the homogeneity of science. It is assumed that it makes sense to ask about the continuity between science as a whole and philosophy; the human beings that do science and philosophy are bypassed. However, if, as I am arguing, our point of departure is the scientific discussions that can inform us about human nature (coming especially from biology and the cognitive and social sciences), the philosophical task is more clearly in sight. Philosophical naturalism has to follow the scaffolds of our best science, including our best social science, and come to terms with the relevant continuity, if it is to be not only a doctrine but a guide for doing better philosophy. The underlying issue is more about putting into perspective relations of mutual support between different explanations, with respect to a certain issue, than about the usual eliminativism supported by traditional reductionism.ii

For the purpose of this paper I shall characterize strong reductionism as a reductionism that is committed to ontological monism. This label aims to include all reductionistic approaches in which the existence of different kinds of methods and things is not recognized as a valuable resource for the characterization of scientific methodology and epistemology (see Section 4 below). If we start from the assumption that strong reductionism is the right approach to understanding the way the different realms of knowledge relate to each other, then the problem of naturalism ends in the sort of excluding alternatives that Quine made famous: either norms have an a priori source, or we have to acknowledge that psychology (or some other discipline) is all the epistemology we need. But if we do not assume
such strong reductionism, naturalism is better approached as aiming at the construction of perspectives from which the diversity of modes of organization of practices (and their implicit and explicit norms) can shed light on explanatory depth. This means giving suitable importance to the way in which cognitive resources get deployed socially in practices and traditions of inquiry, and also to how such practices and traditions merge in a productive manner to generate, on the one hand, overarching scientific explanations and, on the other, specialized knowledge often related to the production of technological advances.
3.

Traditionally, the two main difficulties facing a naturalized philosophy of science are considered to be circularity and the problem of normativity (or, alternatively, the problem of philosophical irrelevancy). Circularity elaborates on the point that the use of scientific methods to investigate scientific methods is circular: whatever evidence we take as the point of departure, we are required to use criteria or norms of inquiry that it would be part of the business of such methods to discover. Philosophical irrelevancy refers to the worry that a naturalized study of science could, at most, describe scientific methods, whereas philosophy of science should have a say in how science is carried out.

Lakatos and Laudan are seen as starting a discussion of models of scientific change that can give weight to the salutary criticisms naturalism directed at logical positivism while at the same time avoiding such difficulties. Metamethodology allows us to have a rational decision-making method concerning the relative merits of research traditions, and thus to overcome circularity and irrelevancy. Laudan's approach has changed over time, but the underlying assumption remains, and it is a good example of the usual sort of strategy employed to resolve the difficulties of naturalism: philosophy of science can be studied without entering into the messy territory of particular scientific disciplines and of specific discussions in the sciences as to the nature of our cognitive capacities, their relation to our evolutionary history, and the role they play in the kind of inquiry we call science. The recognition of a metalevel (by Lakatos and Laudan), the typology of arguments assumed by Kitcher, and the typology of models assumed by Giere (in Giere, 1985, for example) function in a similar way in regard to the grounding of the more usual versions of naturalism. They provide a stopping point for what is considered a threatening circularity and also allow their proponents to overcome the second difficulty: normativity has its origin in the explanatory power of arguments or explanations grounded on such a fundamental typology.

But there is a problem with this strategy. Why should we expect the arguments or explanations actually displayed in the sciences to fit these typologies? All of these authors appeal to the history of science to justify their point of departure, but such use is questionable. There are several important rebuttals of the way Lakatos and Laudan want to use the history of science for their own purposes. Kitcher integrates the history of science in a much more sophisticated way in his model of science, and furthermore recognizes the importance of scientific practices in his account. Here I

MARTNEZ

will concentrate on showing problems with the way Kitcher uses the history of science to gift wrap his views on naturalism. Kitcher claims that the theory of evolution by natural selection formulated by Darwin quite rapidly generated a core consensus. For him, On the Origin provides naturalists with good reasons for accepting minimal Darwinism (the belief in natural selection as a plausible mechanism explaining the origin of species). Kitcher suggests that there is a well formed and clearly delimited argument that goes along with this belief and which leads to changes in views in widely different fields; the acceptance of this minimal argument has led practices to be modified taking minimal Darwinism in consideration; and when this has not happened, resistance can be explained by the importance of exogenous constraints on individual rationality associated with, for example, personal, professional and intellectual allegiances. One important problem is that the history of Darwinism does not support such a neat account of what happened. There were many versions of Darwins theory and important discussions as to the scope of these versions. Robert Richards, for example, has argued that Darwin crafted natural selection as an instrument to manufacture biological progress and moral perfection (Richards, 1988), and that in this regard, Darwins theory does not substantially differ from Spencers views. Indeed, it seems that many contemporary naturalists accepted Darwins theory as a variant and more sober version of Spencers view of evolution as a cosmic process (see Martnez, 2000). Miriam Solomon has argued against Kitchers account of the reception of Darwins theory pointing out that contemporaries did not (as Kitcher claims) modify their practices and start producing the sort of arguments that, according to Kitcher, are associated with the acceptance of minimal Darwinism. She indicatescorrectly I believe- that Darwins supporters and opponents were not always fighting the same battle, and that they use all sorts of routes to reach their different positions. There are many overlapping and sometimes conflicting claims being supported by different kinds of empirical work, and by different traditions of inquiry often related to different disciplines, that do not lend themselves to a simple comparison and more importantly, that do not seem to support the view that progress should be identified with the accumulation of significant truths. In any case, progress would be seen to be associated with the diversification and specialization of significant truths. But this suggestion would not gratify Kitcher, because he would like to say that there are no competing significant truths as the above idea suggests. In the case of Darwin, at least, it is far from clear what the accumulated significant truths would be. Only in retrospect, and with specific values in mind, could one argue that research traditions, for example in developmental biology, have or have not contributed to the progress of biology. Whether they are Darwinian traditions or not is a judgement that depends on our views of what constitutes evolutionary theory nowadays. From the perspective of neo-Darwinism these traditions might not have contributed to significant truths, since it is considered a major achievement of Darwin to have separated issues of 110

THE SCIENTIFIC UNDERCURRENTS OF PHILOSOPHICAL NATURALISM

development from issues of evolution. But from the perspective of contemporary evo-devo or systems biology, things do not look this way at all.
4.

From an historical perspective, the sort of naturalism common in the philosophy of science of the 20th century (and in particular in views like Kitcher's) looks quite strange. In the 19th century the sharp opposition between science and philosophy that motivates traditional accounts of naturalism was not present, and the continuity between science and epistemology was often framed in terms of the scope of explanations. Ontological and teleological themes and discussions were common and played an important role in the formulation and scope of explanations. From the perspective of the sort of methodological fundamentalism that is pervasive in 20th century philosophy of science, the sort of fundamentalism promoted by philosophers as diverse as Laudan, Popper and Kitcher (at least in some of their work), epistemically distinctive features of science can be understood in terms of methods or typologies (or differences among them) in such a way that discussions about ontology or explanation can be bypassed or black-boxed. From this perspective it seems clear that there has to be a metalevel, or some common (normative) currency, that allows the comparison of methods with respect to epistemic aims independently of context. However, if we look at substantive discussions in contemporary philosophy of science, it is clear that such methodological fundamentalism is not sustained. Take, for example, discussions about reduction in biology. The traditional view associated with positivism is theory reduction, according to which the most important relation between theories is a deductive relation between theories conceived of as sets of statements generated by axioms and laws. As several philosophers of science have pointed out, not even the canonical example of such a relation, the relation between classical and molecular genetics, fits the model (Hull, 1974; Wimsatt, 1979). If one goes on to argue that, even though the reduction does not in fact take place, what matters for philosophy is that such reduction is possible in principle, then the question arises as to why such in-principle reduction should be thought philosophically relevant. There is nowadays widespread agreement in the philosophy of biology that such reductionism will not do, that the diversity of methods and explanations entering into the variety of scientific practices that make up biology cannot be reduced to a fundamental theory.iii This anti-reductionistic stance supports scaffold naturalism. Discussions about what a gene is or what a species is are more and more often answered by pointing out that there are different concepts of gene and species that have a place in biology. Pluralism is not only allowed but increasingly recognized as an important resource with which to answer questions in science and in the philosophy of science. When we see to what extent this plurality of methods and explanations goes hand in hand with different ontological commitments, methodological fundamentalism loses credibility. But pluralism seems to lead to epistemic relativism: we seem to be left with a huge variety of ontological claims implicit in widely different explanations, a variety that might make us yearn for the simplicity of strong reductionism and methodological fundamentalism. The risk of relativism, however, is only a mirage resulting from the distance at which philosophers tend to look at science.

Brigandt, for example, has provided an elaborate discussion of how evolutionary novelties (a morphological structure or function featured in a group of organisms that did not exist in an ancestral species) can be explained in contemporary biology (Brigandt, 2010). Explanations of novelty involve concepts, data and explanations from different disciplines: classical and molecular genetics, paleontology, developmental biology, biogeography and ecology, among others. Furthermore, there are changes in how different traditions understand novelty. Neo-Darwinists take novelty to be substantial change in an existing structure, whereas evo-devo theorists consider novelty as coming into existence through the evolution of structure. Brigandt uses this kind of discussion as the basis for suggesting that the centrality of a (kind of) explanation as part of another explanation depends on the goal pursued. Depending on our explanatory aim, paleontology or biogeography might be questioned while the other is considered an unshakable point of departure for the explanation. Explanations are used as scaffolds for more general or more complex explanations. Such scaffolds put in perspective the different ontologies used in different disciplinary domains. Explanation perspectivism, and not relativism, would be a more accurate way of describing the consequences of ontological pluralism.

Another example of this sort of naturalism is the discussion about typology in biology. Darwin started the trend of moving typology away from the metaphysics of essentialism, but getting away from essentialism has been harder than originally thought (see Love, 2008). Love shows how, within specific scientific practices, one can transform metaphysical thinking into epistemologically sound explanatory reasoning. As Love (2008) puts it, typology needs to be understood as a form of thinking or reasoning, as conceptual behavior. The role of typology in biology is closely related to the recognition of kinds of representations crafted in specific scientific practices through carefully weighted abstractions and approximations.iv The choice of abstractions and approximations aims to promote integration or alignment with some practices while distancing the practice from others, thereby fitting it within a certain tradition or research program. Love shows that a concept like protein domain (in molecular biology) responds to different characterizations: (a) units that have stable activity or structure through manipulation; (b) structural units that are observed in X-ray crystallography; (c) functional units that exhibit a particular activity; as well as many others. These different characterizations are used in different contexts related to specific goals and disciplinary practices. One can think of those contexts as competing with each other, but what is important for us is the end result, the shaping of the scope of the explanation by situating it in relation to many other explanations. In a few cases the result is some kind of reduction; but this is not the rule. This determination of the scope of the explanation is not a mere identification or discovery, but rather the crafting of a norm imbued with epistemic import. In the next section, we show how this sort of explanatory naturalism relates to a version of the continuity thesis that was an important element of 19th century science. The rejection of the thesis of continuity as formulated and defended by many 19th century naturalists, and Darwin in particular, played an important role in the development of the social sciences, and moreover in the conviction that the autonomy of the social sciences from biology (and psychology) should be considered an important achievement. This conviction is being questioned nowadays in the social sciences, in biology and in the cognitive sciences.
5.

Darwin was convinced that his theory had implications for the social sciences through what it implied about the evolution of our cognitive capacities. This thesis is known as the continuity thesis. The second half of the 19th century saw the publication of many books promoting numerous versions of the thesis; Romanes, for example, published several well-known books developing it from the 1870s to the 1890s. As part of the delineation of the borders between scientific disciplines that took place at the turn of the century, such a view of continuity faded away towards the end of the 19th century. In psychology, continuity gave way to emergentism and later to behaviorism. The claim by Lloyd Morgan in 1898 that we do not have enough evidence to support the thesis of the continuity of the animal and the human mind is a well-known lapidary statement, and a good indicator of the fate of the thesis for several decades to come. The thesis of continuity was dismissed as untenable, and several different views took its place. From being a banner of progress, the thesis came to be seen within a decade as the sign of an old approach that could only hinder the development of a scientific view of the world. In the social sciences, and anthropology in particular, the sort of emergentism embraced by Morgan took the place of the thesis of continuity as a guiding methodological principle. The established consensus towards the beginning of the 20th century was that the thesis of continuity was an untenable metaphysical thesis, unsuitable for the development of sound social science. Boas's rejection of evolutionism and his embrace of historical particularism is a good example of the way this rejection of the thesis of continuity took place. Even where Boas accepted the importance of the mechanism of natural selection and of geographical dispersion as the main forces shaping the evolution of living beings, he did so while rejecting the thesis of continuity. For Boas, as for many of his contemporaries, Darwin's defense of the thesis of continuity (in the Descent of Man) required a view of progress that (contrary to his views in the Origin of Species) pointed to the inevitable transition from instinct to intelligence, and thus supported an unacceptable view of human nature. As Wallace famously put it, since "natural selection could only have endowed savage man with a brain a little superior to that of an ape," whereas a savage "actually possesses one very little inferior to that of a philosopher" (1870, p. 356), we should reject the applicability of the theory of evolution by natural selection to the explanation of human cognition.

Wallace had a point, and whether or not one is convinced by his argument, I think it should be acknowledged that it makes very clear why the thesis of continuity is at the center of a discussion about the scope of explanations of evolution by natural selection, and about the relation between biology and the social sciences. Wallace is often mentioned in the history of biology as someone who did not fully understand the scope of his own discovery (Wallace was, as is well known, co-discoverer with Darwin of the principle of natural selection as an evolutionary force). But things are more complicated. Wallace's view was one of the views supporting the advance of the social sciences during the first half of the 20th century. The scope of the mechanism of natural selection had to be crafted in such a way as to allow principles of the social sciences that had strong ethical and political overtones to be maintained. But this is not the end of the story. During the second half of the 20th century, the discussion about the thesis of continuity came back as part of a crisis in the social sciences and a shaking up of the borders between the biological and the social sciences, associated with new ways of extending the scope of evolutionary models and the rise to prominence of the cognitive sciences. In anthropology, for example, the objective of turning the study of culture into a scientific enterprise has been an important motivation for elaborating an evolutionary model of culture. There are two lines of thought that lead to this sort of project: on the one hand, the idea, originating in the 19th century, that evolution is the most general and fundamental sort of change, which does indeed support a version of the continuity thesis; on the other, the search to legitimize the social sciences by anchoring their explanations on laws of nature of universal scope, laws that would sustain the social sciences' claim to objectivity. The assumption that evolutionary (Darwinian) biology is grounded on such laws as part of its scientific status leads naturally to the view that, for the study of culture to have scientific status, culture has to be modeled as an evolutionary process subject to the same laws. This second line of thought does not support the continuity thesis, though the separation between these two ways of promoting evolutionary models of culture is not as clear-cut as it should be. This train of thought, however, sets us on a path with serious problems (Fracchia & Lewontin, 1999). The longing for generality is certainly related to the search for the intelligibility of human history, but models of cultural evolution, to the extent that they attempt to mimic, for no reason beyond the desire to appear scientific, a theory from another domain, are "too rigid in structure to be even plausible" (Fracchia & Lewontin, 1999, p. 78). This is indeed the case if the explanation of cultural change and stability has to fit the reductionist model in which individual actors have more cultural offspring by virtue of their persuasiveness or power or the appeal of their ideas, or in which memes somehow outcompete others through their superior utility or psychic resonance (Fracchia & Lewontin, 1999, p. 74). I agree. Fracchia and Lewontin level much criticism at attempts to extend the scope of Darwin's theory to the social sciences based on the existence of laws that support evolutionary explanations. But such criticism would not be relevant to explanations that are not based on such laws, as would be the case if we could give credence to some version of the continuity thesis.

The way in which Darwin and his supporters (like Romanes) tried to elaborate the thesis of continuity may be incomplete or faulty, but it is not the only way of developing it.v The cognitive sciences suggest versions of the continuity thesis that bypass the traditional objections. To start with, once we abandon the idea that hard science is based on laws of universal scope, and thus abandon the idea that scientific explanations have to fit big theoretical structures that systematize such laws and ground our generalizations (in the form of explanations or predictions), models of cultural evolution can be seen to model the technologies of cognition that scaffold both the stability of culture and the sources of cultural innovation. In this way, models of cultural evolution contribute as much to our understanding of human cognition as to our understanding of human history (see Martínez, in press, for an elaboration of this sort of model). Several versions of the thesis of continuity are being proposed in the cognitive social sciences. But the thesis of this paper does not depend on the details of the different versions. What is most relevant for us is that once continuity is an issue, the border erected as a metaphysical division between the social and the cognitive sciences through the first half of the 20th century comes into question. This debate has important implications for philosophical issues, not least the question of scientific rationality. In order to better understand what is at stake and the claim I am putting forward, we have to review another important discussion in the philosophy of science of recent decades: the discussion about the nature of scientific rationality and its relation to the historical turn in the philosophy of science.
6.

Philosophy of science has devoted a lot of effort to discussions about the nature of scientific rationality. As Ian Hacking famously put it: "Philosophers long made a mummy of science. When they finally unwrapped the cadaver and saw the remnants of an historical process of becoming and discovering, they created for themselves a crisis of rationality. That happened around 1960" (1983, p. 1). The crisis of rationality in question started when Kuhn undermined the traditional view of rationality (or at least this is the usual story). Many others questioned logical positivism at the beginning of the second half of the 20th century, but Kuhn captured the headlines. I suspect that one major reason for the attention given to Kuhn's ideas, as opposed to alternative proposals published around the same time, like Toulmin's evolutionary model (which is, in more than one sense, a more elaborate critique of formal models of reasoning), has to do with the fact that Kuhn's approach touched on central concerns of the logical positivists: for example, the discussions between Carnap and Popper, among many others, about the relation between the history and the philosophy of science, and metaphysical or epistemological discussions about nominalism and realism.

But Kuhn was catapulted to stardom not by philosophers or historians, but by social scientists and psychologists. Very soon after the publication of The Structure of Scientific Revolutions, social scientists in the different disciplines were talking of the need to overcome a pre-paradigm stage, thus allowing the social sciences to reach the scientific status of physics. Kuhn's ideas resonated with those of the sociologist Robert Merton, who had argued for the need to abandon both a narrow empiricism and speculative sociology. Merton's claim that sociology should develop specialized theories with a carefully constructed range as the basis for successful generalizations (middle-range theories), which in turn could serve as the basis for further generalizations, is not far from Kuhn's notion of paradigm (the term paradigm was actually introduced by Merton). Thus, the importance Kuhn bears for the social sciences, as several writers have by now pointed out, is closely related to the ingrained positivism in the social sciences and philosophy at that time (Bird, 2004). In psychology, his influence was also quite important, and not easy to understand.vi Kuhn's ideas were most often used in a self-serving, superficial way, but not always.vii In developmental and educational psychology in particular, Kuhn was recruited together with Piaget and Vygotsky (among others) to support theories of conceptual change, like the interactionist theory of Strike and Posner from 1982 (reformulated in Strike & Posner, 1992). As already said, Kuhn was not the first author to question the positivistic ideal of science as a set of theories dealing with very different subject matters but united through the backbone of a methodological reductionism. But the mob-psychology dimension of his work resonated in several areas of psychology and education in a rather constructive way. The discussion of paradigmatic science as a kind of science in which questions about foundations are left aside, and progress is perceived to lie in the solution of relatively well-formulated problems, led in educational psychology to applications of the concept of paradigm as exemplar. In the social sciences, by contrast, Kuhn was used mainly as supporting the attractive view (at least for positivist-minded philosophers) that a physics-like status was within reach of the social sciences, to the extent that a new, revolutionary way of looking at things was possible. The different appreciation of Kuhn by educational psychologists and social scientists is telling. The widely recognized tension between the two ways in which science changes according to Kuhn is not a mere detail. When seen from afar, such applications of Kuhn's ideas force us to confront the obvious problem that science seems to have two ways of changing. The way in which Kuhn talks in 1962, and the way he is often interpreted, is that the extraordinary or revolutionary way of doing science is not a rational type of change. What happens when paradigms change is that old problems and ways of thinking about the central questions of the field disappear, and a new way of looking at things takes their place. In this case, it is not continuity but replacement which occurs. The question that has most attracted philosophers' attention is how we can account for this sort of non-continuous, non-cumulative change.
It seems rather odd to say (as Kuhn was often understood to be saying) that the most significant scientific advances, like moving from Newtonian to relativistic physics, are irrational sorts of changes. Lakatos famously said that Kuhn had reduced theory change in science to "mob psychology." One can argue that Kuhn was simply wrong, that there is no non-cumulative sort of change; one can, for example, argue that in the examples of extraordinary change given by Kuhn there is a cumulative sort of change, a change that takes place rather fast, but cumulative at any rate (Laudan, 1984; Shapere, 1984). Alternatively, one can give philosophical reasons pointing to the impossibility of modeling scientific change as Kuhn suggests, unless one is willing to fall into the hole of relativism (Popper, for example, suggests something in this direction). Or one can try to show that there are indeed two notions of rationality that make sense.viii Or one can try to argue that rationality is an achievement implicit in the history of science, and thus impervious to anomalies in Kuhn's sense. This can be done in many different ways, including proposals like that of Feyerabend, for whom incommensurability is an anthropological thesis, a basic organizational principle implicit in our conceptual structure, and more especially in the way objects of experience are classified. It would also include proposals like those of Lakatos (1970) and Laudan (1977).
7.

But Kuhn's suggestion that there are different sorts of change in science that are relevant for understanding science philosophically and historiographically is worth taking seriously. This is ultimately the issue of incommensurability, and it is a difficult question. If one stays within the straitjacket of methodological fundamentalism, it is not difficult to conclude, as Popper and many other philosophers have done, that talk of different modes of change leads directly to relativism. But if we abandon methodological fundamentalism (and the epistemology that accompanies it) and recognize the plurality of methods and explanatory frameworks that comprise science, the existence of different modes of change is no surprise.ix Feyerabend is right in that the thesis of incommensurability is an anthropological thesis, but as we shall see, it is an anthropological thesis in a rather different sense. Godfrey-Smith is correct in pointing to different kinds of rationality, but as we shall also see, this point has to be reformulated: there are not two types of rationality, but many, and the task of characterizing them invites us to adopt a deeply naturalist attitude that takes the empirical study of rationality (in the cognitive and the social sciences) seriously. However, before we come to this, it is important to emphasize two things about the point of departure. The first is that (contrary to what Kuhn and most philosophers of science assume) scientific disciplines are not a stable starting point from which to discuss the naturalization of concepts like rationality or paradigm. I would like to suggest that the interesting notion of paradigm makes sense as a constraint on the sort of change that is open to scientific practices (through processes of learning and through constructive interactions among practices). This is a notion of paradigm closer to what Fleck called a style of thinking, and which I prefer to call (for reasons that will become clear later) a cognitive style.x The second is that, insofar as the task of describing cannot be sharply separated from normative considerations, the interaction of efforts and the mutually supporting role of different scientific practices are already part of the process through which the scope of norms and explanations comes to be taken as a scaffold for further research. To illustrate this point, in this section I will review some recent discussions about rationality that suggest how incommensurability can be understood as an expression of different modes of change, and how such different modes constitute mutual scaffolds for the fruitful diversification and specialization of concepts and practices.

The questioning of the concept of rationality based on the theory of expected utilities has had important implications for the way the social sciences are designed and oriented, and is leading to a blurring of the border between the social and the cognitive sciences. In particular, recent approaches to rationality, as well as to related concepts like cooperation and decision making, provide a good example of how paradigmatic thinking is a cognitive phenomenon and how it embodies different kinds of scientific change. Central discussions about the structure of reasoning, the nature of rational thinking and decision making are nowadays carried out at the intersection of psychology, economics and the neurosciences (see, for example, Gigerenzer & Sturm, 2011; Bardone, 2011; Glimcher et al., 2009; Echeverría & Álvarez, 2010). The revolutionary character of those proposals has to be emphasized. Confronted with anomalies (like the famous Allais paradox), one could argue, as Simon for example suggested several decades ago, that the neoclassical models of economics and the associated concept of rationality worked only under some limited circumstances. Simon's approach already involves cognitive considerations, even though one can argue that such cognitive components play a rather passive role and can be taken as part of the background conditions in a slightly modified traditional account. This sort of suggestion can hardly be sustained nowadays. Starting with the development of constructive views, like the one developed by Kahneman and Tversky in the 1980s, it became increasingly clear that the anomalies could not be seen as isolated examples or rare cases describing extreme circumstances.
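To make concrete why an anomaly of this kind cannot be absorbed as a rare limiting case, here is a minimal sketch of the Allais paradox. The payoff figures are one common textbook rendering of the two choice problems (presentations differ in the exact amounts), and the code is illustrative only, not drawn from any of the works cited:

```python
import math

# A gamble is a list of (probability, payoff) pairs; payoffs in millions.
A = [(1.00, 1)]                         # Problem 1: 1 for sure ...
B = [(0.10, 5), (0.89, 1), (0.01, 0)]   # ... or a small shot at 5
C = [(0.11, 1), (0.89, 0)]              # Problem 2: two long shots
D = [(0.10, 5), (0.90, 0)]

def expected_utility(gamble, u):
    """Expected utility of a gamble under the utility function u."""
    return sum(p * u(x) for p, x in gamble)

# Algebraically, EU(A) > EU(B) and EU(C) > EU(D) both reduce to the very
# same inequality, 0.11*u(1) > 0.10*u(5) + 0.01*u(0), so no choice of u
# can rationalize the commonly observed pattern "A over B but D over C".
for u in (lambda x: x,                      # risk neutral
          lambda x: math.sqrt(x),           # moderately risk averse
          lambda x: 1 - math.exp(-5 * x)):  # strongly risk averse
    prefers_A = expected_utility(A, u) > expected_utility(B, u)
    prefers_C = expected_utility(C, u) > expected_utility(D, u)
    print(prefers_A, prefers_C)             # the two always agree
```

Whatever utility function is plugged in, the two preferences stand or fall together; the robust experimental pattern of choosing A together with D is therefore not a computational slip that a slightly amended neoclassical model could accommodate, but a violation of the theory's independence axiom.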

Now, as Kuhn and Fleck would predict, this period of extraordinary science has led to a diversification of approaches. But what is interesting for us is that such approaches are not transient views destined to disappear, immolated at the door of a new paradigm. What seems to be happening is that the crisis in the standard theory of rationality is giving rise to several new fields of study that are consolidating different lines of research through the integration of work practices in different disciplines. Behavioral economics, for example, was developed as a label for a series of approaches united by the idea that models developed in experimental psychology should have a bearing on models of human behavior, and would improve the models offered by neoclassical economics. The discussion between behavioral and traditional neoclassical economists spars over old philosophical issues, like the duality of body and mind, but also over issues closely related to projects of naturalization; for example, whether scientific explanations, in order to avoid circular argumentation, should rely on normative idealized theories that provide a privileged and uncontested point of departure (see, for example, Glimcher et al., 2009). What is particularly relevant for us is that the discussion initiated by behavioral economics is a discussion about the normative status of certain idealizations being proposed as alternatives to the traditional idealization of homo economicus. The issue, however, is not whether the new alternative idealizations are true, or which one is true: the discussion is about their explanatory scope and their stability under explanatory use.

Behavioral economics has been criticized because its explanations were based on very different models, and there was a perceived need to weed out the variety of empirical models and methods used in inquiry. This is often mentioned as the motivation for the development of neuroeconomics. The beginnings of neuroeconomics are related to early attempts to interpret intermediate variables used in models of mental processes in terms of neural mechanisms. To this extent, the suggestion is that the neurosciences could provide the sort of normative-explanatory framework required for a more consistent advance in the development of an alternative to the standard theory of decision making. No doubt part of the appeal of neuroeconomics is the promise of reducing the modeling of decision making to a hard science intelligible (if not reducible) to biology (and physics). But such veiled reductionism, even if it is a hidden motivation behind the recent enthusiasm for neuroeconomics, is not the whole story. Glimcher, for example, argues for an interdisciplinary approach to choice supported by reductive linkages. The idea is more like the one suggested by Regev and Shapiro (see Section 3) than like the usual accounts of reductionism in the philosophy of science. "As a general rule," he says, "it is the structure of the higher-level abstractions that guides the more reduced-level inquiries" (Glimcher, 2011, p. 126). Thus, neuroeconomics is not meant simply to replace traditional economic theory. Mechanistic constraints relevant to the study of choice and behavior lie outside the neoclassical paradigm; but such constraints are not intended to be independent of the organizational structure imposed by more traditional economic theory, associated with the higher-level abstractions that describe the goals guiding neuroeconomics. Continuity and change go hand in hand.

The questioning of the standard theory of rationality based on the theory of expected utilities has also led to the development of other important approaches promoting very different kinds of explanations, explanations that are not even being considered in behavioral economics. Institutional economics, for example, is another development arising from the recognition of the limitations of neoclassical economics. Some authors consider that institutional economics should integrate the neoclassical framework, and might suggest that institutional economics is the new paradigm for economics. As Coase puts it, "modern institutional economics should study man as he is, acting within the constraints imposed by real institutions. Modern institutional economics is economics as it ought to be" (Coase, 1984, p. 231). But this is hard to believe, at least if the idea is that behavioral economics, neuroeconomics and the other recent fields in economics branching from the same crisis should become extinct or be absorbed by institutional economics. Very probably some of these approaches will disappear and consolidation will take place, but it seems hard to believe (and it is not necessary in order to advance scientific understanding) that the future will bring one new homogeneous model for dealing with economic phenomena. All of these branching lines of research have in common the recognition of the need to abandon the black-box account of cognition implicit in traditional models. All of these new proposals also share the recognition that the empirical sciences, biology and the cognitive sciences in particular, can provide guiding principles and appropriate idealizations for advances in the social sciences. But what seems to be happening is not a reduction of alternatives, but a stabilization of at least some of them, a stabilization that goes hand in hand with the integration of approaches into configurations of explanatory frameworks that scaffold new applications.

Another example of the way in which the breakdown of the normative framework provided by the traditional theory of decision making (based on the theory of expected utility) unleashes a similar process of diversification of models and explanations, counterbalanced by the search for an integrative idealized theoretical framework (one that limits the choice of models to be tested and discussed), is the discussion about the use of evolutionary models in archaeology. Evolutionary models in archaeology have been developed in many directions, but it is widely recognized that some sort of constraint on the possible models has to be put in place in order for sustainable advances to follow. This leads to appeals to the neurosciences, or to a discussion about the possible use of conceptual metaphors in the sort of explanations that should be accepted in archaeology to account for historical patterns in material culture (see for example the discussion between Ortman, Hurt, Rakita and Leonard; Ortman, 2001). One central point of discussion here, as in many other contemporary discussions in the social sciences, is the extent to which we are willing to abandon the view that culture is an exogenous factor that fixes the implicit assumptions required for an idealization of the process to be explained. The problem with such an approach is that it ignores the feedback between normative frameworks and culture. As Roepstorff puts it in a recent article: "The underlying argument appears to be that mapping this chain of transformation in all its cumbersome detail is the key to understanding the type of society in which the object was produced, and, at least since the turn towards a cognitive archaeology (Renfrew & Zubrow, 1994), also the mindset of the people who made it" (Roepstorff, 2008).

An obvious consequence of this sort of discussion is that the manner in which change is modeled cannot be left outside the empirical discussion. Change and rationality are not concepts that we can grasp outside a cultural history. As Gamble puts it in the summary to the first part of his book Origins and Revolutions, in reference to accounts of change relevant for archaeological theory: "I have now examined how archaeologists use the concept of upheavals in their descriptions and accounts of change. I have placed their usage in historical context and found that change is best understood not as a property of archaeology being studied but rather an outcome of contemporary concerns" (Gamble, 2007). This is a conclusion that seems generalizable: change is an outcome seen from a certain perspective.xi But that does not mean that change is in the eye of the beholder. Different notions of change in archaeology, historical sociology and behavioral economics point to some sort of incommensurability, but it is a synchronous, non-transient type of incommensurability. Such incommensurability is not an obstacle to knowledge but a source of understanding. The different perspectives on change, when fruitfully contrasted, provide limitations of scope and bridges for integrating advances in different disciplines into credible explanations. Incommensurability is not a problem to be solved but a resource to be exploited for understanding. More generally, Gamble's summary of the history of theories of human origins supports the sort of account I want to give of naturalization projects in the philosophy of science (and epistemology). Naturalized philosophy of science should not be seen as a search for the right way of doing philosophy informed or constrained by science, but as a way of thinking about science from the perspective of the most promising and empirically grounded contemporary accounts of human nature, in the context of a set of disciplinary goals. Notice that a similar line of thought leads us to the conclusion that we should expect (as indeed empirical studies show) different notions of rationality to play a role in different situations. There is no single perspective that integrates what we know about human nature in such a way that it can be taken as the normative point of departure for explaining the continuity of science and philosophy. If science spoke with a single voice, naturalism could be described as in Wittgenstein's famous account of the correct method of philosophy: to say nothing except what can be said, i.e., the propositions of natural science. But science does not speak with one voice; it speaks through different practices that make up traditions of inquiry stabilized by social-cognitive productive constraints. Such constraints do not work only on systems of beliefs, but shape the metaphors, analogies and heuristics with normative import that make up the scientific styles of doing and representing that I refer to as cognitive styles.
8.

The crisis of rationality to which the discussion about Kuhn's work is famously attached goes hand in hand with the historical turn in the philosophy of science. I have suggested above that Kuhn's notion of paradigm is important for the philosophy of science because it leads us to confront the fact that there are different kinds of change and different ways of doing things that relate to each other in a way that cannot be modeled by traditional models of explanation and rationality. But Kuhn's concept of paradigm is too rigid. For one thing, it is too closely tied to assumptions about the importance of the disciplinary organization of science as the point of departure for an explanation of the factors that play a role in the stability and change of the norms of inquiry. This disciplinary organization is important from a sociological perspective, but from an epistemological and historical perspective it is less so: disciplines have changing borders, and practices coalesce into disciplines in a relatively contingent way, coming together insofar as they can cooperate (not necessarily in view of a common aim). It is also important to take into consideration that scientific advance often involves the migration of methods from one discipline to another. Practices imported from physics were crucial for the beginning of molecular biology, and mathematicians have initiated many lines of research in economics and the social sciences. Once practices are institutionalized, an important stability comes from this institutionalization and from the associated teaching practices. But such stability is rather precarious at the level of research, even though the same textbooks are used for decades. And there is something else. Stability of the relevant beliefs (those relevant for explaining conceptual stability and change) does not follow from sharing textbooks; but even if it did, one would still need to show how such stability of beliefs is relevant to understanding the different sorts of conceptual change important for modeling the dynamics of science. Sharing mathematical methods is quite important for the stability of practices, but sharing mathematical methods does not imply sharing beliefs about important conceptual matters. Scientists might share the view that Hilbert spaces are important in quantum mechanics, or that population models are crucial for formulating the theory of evolution, but they might differ as to how to understand the basic concepts of the theories in question. For teaching basic quantum mechanics such differences of conceptual framework are not important, but for appraising the future of the discipline, and the sort of alliances we might make to foster it, they may well be. I am not claiming that stability of beliefs is unimportant. The stability of beliefs that is a product of shared teaching practices is important. But there is no one set of beliefs to which such practices usually converge. And furthermore, this is not the only sort of stability that matters for understanding conceptual change. Shared laboratory standards, specimens, models and know-how are also an important source of stability, and they constitute important resources that have to be brought into an explanation of scientific change. Answering the crucial question of how these different sources of conceptual stability relate to each other requires giving due importance to the cognitive dimension of conceptual change as it manifests itself in the interaction among different practices through time.xii As I have suggested above, such stability can be explained as a special case of the sort of conceptual stability and change supported by cultural practices. A key ingredient of such an explanation is that the stability of beliefs relevant for explaining conceptual change is the result of a complex interaction and evolution of norms, implicit and explicit in different practices and institutions, which in particular has to take into consideration the role of material culture in promoting such stable normative environments. But the stability of beliefs is only a transient state which, like our geographical reference points, changes through geological history.
What apparently is an unchanging set of beliefs transmitted through generations of scientists belonging to a paradigm is really a changing arrangement of factors (some of which are norms or have a normative dimension), an arrangement that is changing slowly in different directions, from different perspectives. An explanation of the stability of cultural practices calls for a complex account of what human culture is; such an account commits us to taking into consideration the role of material culture and the cognitive scaffoldings that shape our situated cognition and that support our understanding of the world, and of our human condition in particular. Those (cultural and cognitive) scaffolds cannot be analyzed fully in terms of beliefs or systems of beliefs; they have to be thought of in terms of their role as productive constraints on situated action.xiii Hacking and other philosophers and historians have been pressing the importance of recognizing the role of styles of thinking or reasoning in order to understand the stability and the advance of science. For Hacking, a style of reasoning crystallizes in the introduction of new objects and of criteria used to judge what is said about such objects. A cognitive style in my sense is the result of complex interactions between material culture, institutions and conceptual resources that constrain our ways of learning and of doing things. A style for Hacking does not answer to external criteria, and thus the objects in question are quite distinct from the style. My notion of cognitive style, by contrast, does not crystallize in objects but in ways of doing things: in constructing models or designing heuristics and, more generally, artefacts for situated action, the sort of artefacts that are paradigmatically articulated and produced in scientific practices (see Martínez, in press).
9. CONCLUDING REMARKS.

From the perspective of a philosophy of science that gives full weight to the organization of science into practices, the notion of paradigm does not characterize shared beliefs, but shared practices. Shared practices are most often the result of common ancestry. Common ancestry is important because it allows the transmission of whole packages of techniques, expectations, standards and norms that function as a whole in a relatively stable environment, but that can change piecemeal through small changes in the (conceptual and material) environment that is part of the complex array of factors constituting (scientific) culture. Common ancestry explains the well-documented similarities in the formulation of problems, the modes of representing, and the kind of expectations that guide research in different traditions and cognitive styles. The differences between different groups of practitioners tend to be inherited through their lineages, formed around the training of new generations of scientists, which involves informal personal interaction (see Kaiser, 2005, for a detailed presentation of the history of Feynman diagrams along these lines). A similar account can be given of the heuristic patterns of reasoning and the observational skills used in different scientific practices (see Martínez, 2003). Such patterns of reasoning and observational skills lead some scientists to see certain phenomena and not others. Such biased reasoning skills are not arbitrary: the direction of bias is stable, as it answers to a cognitive style. That biases are not arbitrary but stable features of reasoning is one of the most important theses of Kahneman and Tversky, and it has also been used to characterize central features of heuristic reasoning and of explanation by models (as argued by Wimsatt since 1974; see also Wimsatt, 1976). This non-arbitrariness of biases provides further evidence for our thesis: the naturalization of the philosophy of science (and epistemology) has to take root in the projects of naturalization going on in the cognitive and social sciences. These roots are continuity enough.
NOTES
i. In this paper I will not elaborate further on this notion of understanding; this is not required for the argument of the paper. I hope the examples in the following sections make sufficiently clear in what sense I take understanding to be an epistemic value. For a congenial argument for the importance of understanding as an epistemic aim, see Elgin (2007). For a recent collection of approaches to the issue of understanding in the philosophy of science, see De Regt et al. (2009).
ii. In Martínez (2011a) I defend a view of (non-)reductionism as an important aim of science that is closely related to the view of naturalism presented here.
iii. See for example Beckermann et al. (1992), Horst (2007), Regenmortel and Hull (2002), and Mitchell (2003).
iv. For an elaboration of this point, see Nersessian (2008) and Martínez and Huang (2011).
v. There are many alternatives. John Dewey developed the thesis of continuity in several writings. The waning of interest in Dewey's naturalism in the mid-twentieth century seems to be related to the widespread rejection of versions of the thesis of continuity as a way of advancing sound philosophy and good science. I suggest a version of the thesis of continuity that is not far from Dewey's (although I will not elaborate this point here).
vi. O'Donohue (1993): "The extent to which psychologists find Kuhn so attractive is puzzling given the significant ambiguities and inconsistencies in Kuhn's views, his informal and unsystematic use of psychology, and his disparaging comments about psychology." Coleman and Salamon (1988) found that Kuhn was the most frequently cited historian/philosopher of science, with most citations (95%) highly favorable towards Kuhn. In the case of psychology, the reason for Kuhn's fame might be more superficial than in sociology: dry experimental papers might be spiced up by quoting a philosopher of science.
vii. Nickles has argued that Kuhn's account of exemplars requires bringing schema theory into the discussion (see for example Nickles, 2000), but Kuhn did play a constructive role in the development of schema theory for several psychologists, and educational psychologists in particular.
viii. Godfrey-Smith (2003) has a proposal in this direction.
ix. Rouse (2003) elaborates a related point.
x. Fleck provides three features of his notion of style: (1) common features in the problems of interest to a thought collective, (2) the judgment which the thought collective considers evident, and (3) the methods which it applies as a means of cognition (Fleck, 1981, p. 99). Styles for Fleck seem to be characterized historically and sociologically, whereas a cognitive style in my sense, even if it may be addressing similar phenomena, is characterized cognitively. This is not meant to deny the sociological and historical dimension that Fleck identifies through his account of styles of thinking.
xi. In this chapter I am referring to all notions of change relevant for the discussion in the philosophy of science. For a defence of this view as a general philosophical view, see Van Fraassen (2008) and Elgin (2010).
xii. A different explanation of the relevant stability can be given using schema theory (see Nickles, 2000). I see such an approach as compatible with mine. The difference is that my suggestion takes into account the fact that the stability of our beliefs and practices depends on cognitive and cultural factors that go beyond whatever factors can be identified as playing a role in an explanation of the stability and change of our theoretical knowledge. Material culture, as is obvious for cultural practices in general, has to be recognized as playing a role in accounting for the mechanisms of stability and change of concepts, at least to the extent that norms (implicit and explicit in practices) support such stability.
xiii. As Bruner put it more than two decades ago, the cognitive revolution has to go beyond the predominance of AI and return to its original impulse: a cultural psychology preoccupied not with behavior but with situated action (Bruner, 1990, p. 8). For a converging philosophical approach, see Hendriks-Jansen (1996). The shift in the cognitive sciences towards modeling cognition as situated or embodied is nowadays not just a programmatic statement, as it was for Bruner; it is increasingly recognized as a crucial element of an explanation of cognition and its relation to action.
REFERENCES
Bardone, Emanuele. (2011). Seeking chances: From biased rationality to distributed cognition. Berlin: Springer.
Beckermann, Ansgar, Flohr, Hans, & Kim, Jaegwon. (1992). Emergence or reduction? Essays on the prospects of nonreductive physicalism. Berlin: Walter de Gruyter.
Bird, Alexander. (2004). Kuhn, naturalism, and the positivist legacy. Studies in History and Philosophy of Science, 35(2), 337-356.
Boas, Franz. (1938/1911). The mind of primitive man. New York: The Macmillan Company.
Brigandt, Ingo. (2010). Beyond reduction and pluralism: Toward an epistemology of explanatory integration in biology. Erkenntnis, 73(3), 295-311.
Bruner, Jerome. (1990). Acts of meaning. Cambridge, MA: Harvard University Press.
Cartwright, Nancy. (1983). How the laws of physics lie. New York: Oxford University Press.
Cartwright, Nancy. (1999). The dappled world: A study of the boundaries of science. Cambridge: Cambridge University Press.
Coase, Ronald. (1984, March). The new institutional economics. Zeitschrift für die gesamte Staatswissenschaft [Journal of Institutional and Theoretical Economics], 140(1), 229-231.
Coleman, S. R., & Salamon, Rebecca. (1988). Kuhn's Structure of Scientific Revolutions in the psychological journal literature, 1969-1983: A descriptive study. Journal of Mind and Behavior, 9(4), 415-446.
De Regt, Henk W., Leonelli, Sabina, & Eigner, Kai. (2009). Scientific understanding: Philosophical perspectives. Pittsburgh: University of Pittsburgh Press.
Dupré, John. (1995). The disorder of things: Metaphysical foundations of the disunity of science. Cambridge, MA: Harvard University Press.
Duschl, Richard, & Hamilton, Richard. (1992). Philosophy of science, cognitive psychology, and educational theory and practice. New York: SUNY Press.
Echeverría, Javier, & Álvarez, José Francisco. (2010). Bounded rationality in social sciences. In Evandro Agazzi, Javier Echeverría, & Amparo Gómez Rodríguez (Eds.), Epistemology and the social (pp. 173-189). New York: Rodopi.
Elgin, Catherine. (2007). Understanding and the facts. Philosophical Studies, 132, 22-42.
Elgin, Catherine. (2010). Keeping things in perspective. Philosophical Studies, 150(3), 439-447.
Fleck, Ludwik. (1981). Genesis and development of a scientific fact. Chicago: University of Chicago Press.
Fracchia, Joseph, & Lewontin, R. C. (1999, December). Does culture evolve? History and Theory, 38(4), 52-78.
Fuller, Steve. (2003). Kuhn vs. Popper: The struggle for the soul of science. UK: Icon Books.
Gamble, Clive. (2007). Origins and revolutions: Human identity in earliest prehistory. New York: Cambridge University Press.
Giere, Ronald. (1985, September). Philosophy of science naturalized. Philosophy of Science, 52(3), 331-356.
Gigerenzer, Gerd, & Sturm, Thomas. (2011). How (far) can rationality be naturalized? Synthese. DOI 10.1007/s11229-011-0030-6.
Gintis, Herbert. (2007). A framework for the unification of the behavioral sciences. Behavioral and Brain Sciences, 30(1), 1-16.
Glimcher, Paul, Camerer, Colin, Fehr, Ernst, & Poldrack, Russell. (2009). Neuroeconomics: Decision making and the brain. London: Academic Press.
Glimcher, Paul. (2011). Foundations of neuroeconomic analysis. New York: Oxford University Press.
Godfrey-Smith, Peter. (2003). Theory and reality: An introduction to the philosophy of science. Chicago: The University of Chicago Press.
Goldman, Alvin. (1988). Epistemology and cognition. Cambridge, MA: Harvard University Press.
Hacking, Ian. (1983). Representing and intervening: Introductory topics in the philosophy of natural science. New York: Cambridge University Press.
Hacking, Ian. (2009). Scientific reason. Taipei: NTU Press.
Hendriks-Jansen, Horst. (1996). Catching ourselves in the act: Situated activity, interactive emergence, evolution, and human thought. Cambridge, MA: The MIT Press.
Horst, Steven. (2007). Beyond reduction: Philosophy of mind and post-reductionist philosophy of science. New York: Oxford University Press.
Hull, David. (1974). Philosophy of biological science. New Jersey: Prentice Hall.
Hull, David, & Regenmortel, Marc. (2002). Promises and limits of reductionism in the biomedical sciences. England: John Wiley & Sons.
Kahneman, Daniel, Slovic, Paul, & Tversky, Amos. (1982). Judgement under uncertainty: Heuristics and biases. Cambridge: Cambridge University Press.
Kaiser, David. (2005). Drawing theories apart: The dispersion of Feynman diagrams in postwar physics. Chicago: The University of Chicago Press.
Kitcher, Philip. (1992, January). The naturalists return. The Philosophical Review, 101(1), 53-114.
Kitcher, Philip. (1993). The advancement of science: Science without legend, objectivity without illusions. New York: Oxford University Press.
Kuhn, Thomas. (1962). The structure of scientific revolutions. Chicago: The University of Chicago Press.
Lakatos, Imre. (1970). Falsification and the methodology of scientific research programmes. In Imre Lakatos & Alan Musgrave (Eds.), Criticism and the growth of knowledge (pp. 91-196). London: Cambridge University Press.
Laudan, Larry. (1977). Progress and its problems: Towards a theory of scientific growth. California: University of California Press.
Laudan, Larry. (1984). Science and values: The aims of science and their role in scientific debate. California: University of California Press.
Lewis, Herbert. (2001, June). Boas, Darwin, science and anthropology. Current Anthropology, 42(3), 381-406.
Love, Alan. (2008). Typology reconfigured: From the metaphysics of essentialism to the epistemology of representation. Acta Biotheoretica, 57(1-2), 51-75.
Maddy, Penelope. (2007). Second philosophy: A naturalistic method. New York: Oxford University Press.
Martínez, Sergio. (2000). On changing views about physical law, evolution and progress in the second half of the nineteenth century. Ludus Vitalis, 8(13), 53-70.
Martínez, Sergio. (2003). Geografía de las prácticas científicas: Racionalidad, heurística y normatividad. Mexico: UNAM.
Martínez, Sergio. (2011a). Reducionismo em biologia: uma tomografia da relação biologia-sociedade. In Paulo Abrantes (Ed.), Filosofia da biologia (pp. 37-52). Brazil: Artmed.
THE SCIENTIFIC UNDERCURRENTS OF PHILOSOPHICAL NATURALISM Martnez, Sergio. (2011b). Epistemic groundings of abstraction and their cognitive dimension. Philosophy of Science, 78(3), 490-511. Martnez, Sergio. (In press). Technological scaffoldings for the evolution of culture and cognition. Linnda R. Caporael, James Griesemer, & William C. Wimsatt (Eds.), Scaffolds in evolution, culture and cognition. MIT Press. Mitchell, Sandra. (2003). Biological complexity and integrative pluralism. New York: Cambridge University Press. Mitchell, Sandra. (2009). Unsimple truths: Science, complexity and policy. Chicago: The University of Chicago Press. Morgan, Lloyd Conwy. (1896). Habit and instinct. Whitefish, Montana: Kessinger Publishing. Nersessian, Nancy. (2008). Creating scientific concepts. Cambridge, MA: MIT Press. Nickles, Thomas. (2000). Kuhnian puzzle solving and schema theory. Philosophy of Science, 67, Suppl. Proceedings of 1998 Biennial Meetings of the Philosophy of Science Association, Part II. Nickles, Thomas. (2003). Introduction. In Thomas Nickles (Ed.), Thomas Kuhn (pp. 1-19). New York: Cambridge University Press. ODonohue, William. (1993, September). The spell of Kuhn on psychology: An exegetical elixir. Philosophical Psychology, 6(3), 267-288. Ortman, Scott. (2001, October). On a fundamental false dichotomy in evolutionary archaeology: Response to Hurt, Rakita, and Leonard. American Antiquity, 66(4), 744-746. Quine, Willard Van Orman (1969). Ontological relativity and other essays. New York: Columbia University Press. Regev, Aviv, & Shapiro, Ehud. (2002, September). Cells as computation. Nature, 419, 343. Renfrew, Colin, & Zubrow, Ezra. (1994). The ancient mind: Elements of cognitive archaeology. Cambridge: Cambridge University Press. Richards, Robert. (1992). The meaning of evolution. Chicago: Chicago University Press. Richards, Robert. (1988). The moral foundation of the idea of evolutionary progress, Darwin, Spencer and the neoDarwinians. In Nitecki Matthew (Ed.), Evolutionary progress. Chicago: Chicago University Press. Roepstorff, Andreas. (2008). Things to think with: words and objects as material symbols. Philosophical Transactions of the Royal Society B, 363, 2049-2050. Romanes, George John. (1888). Mental Evolution in Man: Origin of Human Faculty. Cambridge, MA: Cambridge University Press. Rouse, Joseph. (2003). Kuhns philosophy of scientific practices. In T. Nickles (Ed.), Thomas Kuhn. New York: Cambridge University Press. Shapere Dudley. (1974). Scientific theories and their domains, In Frederick Suppe (Ed.), The structure of scientific theories (pp. 518-565). Urbana: University of Illinois Press. Solomon, Miriam. (1995). Legend naturalism and scientific progress: An essay on Philip Kitchers The advancement of science. Studies in History and Philosophy of Science, 26(2), 205-218. Sperber, Dan. (2011). A naturalistic ontology for mechanistic explanations in the social sciences. In Pierre Demeulenaere (Ed.), Analytical sociology and social mechanisms (pp. 64-77). New York: Cambridge University Press. Strike, Kenneth, & Posner, George. (1992). A revisionist theory of conceptual change. In Alan & Hamilton, Philosophy of science, cognitive psychology, and educational theory and practice. Albany: State University of New York Press. Van Fraassen, Bas. (2008). Scientific representation: Paradoxes of perspective. New York: Oxford University Press. Wallace, Alfred Russel. (1870). Contributions to the theory of natural selection. London: Macmillan and Company. Wimsatt, W. C. (1976). 
Reductive explanation: A functional account. In PSA: Proceedings Biennial Meeting of the Philosophy of Science Association, Vol. 1974 (pp. 671-710). Springer.

127

NICANOR URSUA

ADVANTAGES AND RISKS OF NATURALIZATION


Converging Technologies Applied to Human Enhancement (Implications and Considerations for a Naturalist Philosophical Anthropology)i

We feel that even if all possible scientific questions have been answered, the problems of life have still not been touched at all. Of course there is then no question left, and just this is the answer. (L. Wittgenstein, Tractatus Logico-Philosophicus, 6.52)
1. NATURALISM

Naturalism can be described as the approach which maintains, starting from the fact that human knowing is a human process, that cognitive problems are psycho-physiological processes. Naturalism starts from the assumption that knowledge is a function of the human animal that is just as natural as any other function. Although naturalism came to prominence in the epistemology of the second half of the 20th century, it is possible to trace it back to Aristotle, insofar as we consider knowledge a psycho-physical process of the human being (Blasco & Grimaltos, 1997, pp. 23-34). Following Hans Albert (1987, pp. 127, 140), we can state that human knowledge, as it occurs in the sciences, constitutes a cultural product that rests on a natural base. Knowledge as a cultural product has to be built on the natural foundations that the cognitive apparatus has available to it.ii In the same way, traditional problems in epistemology and philosophy in general are natural-scientific issues. Intelligence, for example, is not unique to human beings, but is a physiological-mechanical phenomenon. Philosophical naturalism, which, roughly speaking, adopts a naturalist posture as opposed to a more autonomous philosophy, can be characterized grosso modo as a conception or a programme that requires at least four essential elements: 1) an overall picture of the universe or worldview; 2) that some part of the cosmos, but a very modest one, be attributed to the human being; 3) that all the capacities of the human being, including speech, knowledge, morals, aesthetic judgements, etc., be contained within it; and 4) that from these foundations a naturalist anthropology be devised, together with a naturalist theory of knowledge, a naturalist research methodology, naturalist ethics and even naturalist aesthetics.



In accordance with such a philosophical position, our universe is a closed causal system, and therefore all the internal problems and all the epistemological questions can be resolved with the means that the cosmic system provides (Vollmer, 2003, p. 362; Kanitscheider, 1994). Jean Gayon (2003) examines the biological roots of the naturalist project and its extension to cultural phenomena, defending co-evolution, as does Kanitscheider (1991, p. 374), who invokes the cultural-biological binomial (culture-gene), in which the two aspects are interpreted as a single linked system. That is why the nature (biology) and nurture (culture) dualism is inadequate. The biological base is confirmed as the vehicle of the intellect, while the autonomous aspect of culture also participates in the evolutionary process, leading to a convergence of the biological and cultural models.

Evolutionary naturalism aims to account for the biological-evolutionary origin of the cognitive mechanisms of living creatures, including humans (Konrad Lorenz), and to explain scientific change as a product of processes of variation and selection (Karl Popper). This type of naturalized philosophy attempts to follow an ontological, epistemological and methodological natural-scientific monism. This does not, however, simply convert philosophy into a natural science; rather, the history of culture comes to be viewed as a reality shaped by natural history. From the point of view of naturalization, the course of nature and its causal regularities are considered as models of social development.

The increasing formalization of philosophy, and of the social sciences in general, has an effect that is still small but constant and demonstrable. In aiming to explain ever more complex phenomena, the demand for formalization and quantification undoubtedly increases: through studies in empirical cognitive science and neuroscience, through evolutionary biology, the evolutionary theory of knowledge, etc. (Ursua, 1993). This demand spreads beyond the results of natural science to the human sciences. Such a move may, however, encounter resistance due to its ontological reductionism; or it may be seen as a threat if we consider, for example, programmes of evolutionary bio-neuro-cogno research that construe the human being as a mere biological creature subjected to evolutionary, physiological and mechanical causal processes, leaving no room for any other type of determination and losing all the essential human attributes. Some philosophers claim that from such a perspective the human being would lose any freedom or autonomy (free will).

To naturalize a field or area is to claim that overall it forms part of nature and that, in the case of the human being for example, it can be approached and explained with the help of natural science and technology. The modern naturalism debate represents a controversy concerning the interpretation of the results of natural-scientific research and their possible consequences for the social order. Some very significant and well-known examples of this are the human genome project and its results, or certain results of research into the human brain. The specific case that I will analyse is that of human enhancement: the technical improvement of the human being, which opens up the possibility of new choices in relation to the configuration or reconstruction of the human mind and body (Grunwald, 2007c, p. 949).
2. CONVERGING TECHNOLOGIES (CT)

What do we mean when we talk of converging technologies (CT)? The concept of CT is usually associated with the confluence of science and technology; examples of its use are common in IT and domestic electronics. The concept is used by the majority of experts to describe the interaction of different technoscientific disciplines to tackle problems that are common to all of them through transdisciplinary, interdisciplinary and multidisciplinary cooperation. We could say that the concept of convergence is used here to describe the development of different technologies that are focused on a combination of research evidence from different disciplines involving living and artificial systems, in order to design new devices that allow us to enlarge or enhance cognitive and communicative capacities as well as the health and physical capacities of people, and to generate greater social welfare. At the heart of this new concept we find interactive relations, synergies or fusions of broad fields of research and development, such as nanoscience and nanotechnology, biotechnology and the life sciences, biomedicine including genetic engineering, information and communication technologies, robotics and artificial intelligence, cognitive science, neuroscience and neurotechnology. The debate over CT has been characterised as the forum for exploring the future impact of the whole of science and engineering.

The starting point of the CT debate is said to be the 2001 US research and technology policy initiative,iii specifically the NBIC initiative (nano-bio-info-cogno initiative) or NBIC convergence. Although it certainly covered very varied fields in, for example, sociotechnical systems (food, housing, transport, communication, tourism, health, safety, education, leisure, etc.) (Aguiló, 2005; EOI, 2005, pp. 37-43), here I am interested in examining "improving human performance" and the theme of human enhancement: that is, the technological increase or technical enhancement of human capacities and the modification of the human body and intellect. As we can read in Roco and Bainbridge (2003, p. 24), the participants in the Workshop "recommend a national R&D priority area on converging technologies focused on enhancing human performance". The idea of CT, and particularly of NBIC, holds the hope that the different areas and disciplines of research will converge in a new technoscientific paradigm, characterised by convergent cooperation. We are not talking here of just a few convenient didactic examples; rather we are dealing with fascinating questions the answers to which, according to Roco and Bainbridge (2003, p. 13), will lead to enhancements of human potential. The aim of CT and of this new demand is, as far as we are interested here, the technical perfecting of human faculties.iv

The European Commission, within the framework of the programme Citizens and Governance in a Knowledge-based Society (Sixth Framework Programme), has requested that the possible effects of the new CT on the European Knowledge Society be studied. The 2003 European Commission document, published in 2004 by the High Level Expert Group (HLEG),v has as its title Foresighting the New Technology Wave: Converging Technologies Shaping the Future of European Societies (Nano-Bio-Info-Cogno-Socio-Anthro-Philo), and is edited by Alfred Nordmann (CTEKS, http://ec.europa.eu/research/conferences/2004/ntw/pdf/final_report_en.pdf) (10.03.2011).vi

The American initiative focuses on the enhancement of the individual human being and defends a new unity of science, characterized by radical reductionism, taking everything to the nanoscale: it defends the notion that the common base of all the sciences is at the nanoscale. The American document can be characterized by the following telling words: "If the Cognitive Scientists can think it, the Nano people can build it, the Bio people can implement it, and the IT people can monitor and control it" (Roco & Bainbridge, 2003, p. 13). In contrast to this, the European document adopts a view according to which CT responds to the needs and demands of society and is characterised by a framework of interdisciplinarity and multidisciplinarity in which philosophy also has, and indeed must have, a very specific task: that of helping to clarify and explain the process of technoscientific convergence; of analysing the new modes of production of knowledge; of examining the epistemic cultures of the different disciplines involved (Knorr-Cetina, 1999); of evaluating the new technologies and their impact at the social and ethical levels; and of contributing to the clarification of the new self-understanding of the human being.vii

In relation to interdisciplinarity (a crucial point in the two reports cited), Niklas Luhmann (1996, pp. 327-328) already claimed that the term does not designate a single state or concept, and he distinguished three forms of interdisciplinarity. In occasional interdisciplinarity, some disciplines can, to a certain extent, learn from contact with others; such encounters are incidental and involve the reception of certain terms that have unexpected effects on the discipline that adopts them. There is also temporary interdisciplinarity, which involves interdisciplinary projects that are limited in time and within which different disciplines cooperate around certain problems, and where complementary research takes place. The third form he mentions is described as a transdisciplinary enterprise which functions within shared scientific paradigms: that is, a distinct scientific paradigm that is relevant to more than one discipline.

Once the human genome had been sequenced, and following the advances in areas such as neurology and cognitive science, the European Union thought the time had come to develop, within the Sixth Framework Programme, a European project on the human mind. The aim of the Human Mind Project (HMP) was to foster interdisciplinary research on what being human means. In 2005, the HLEG published What it means to be human: Origins and Evolution of Human Higher Cognitive Faculties (ftp://cordis.europa.eu/pub/nest/docs/whatitmeanstobehuman_b5_eur21795_en.pdf) (10.3.2011). According to the report, the research was to focus on five broad areas: 1) the genetics of human cognition; 2) the evolution of the mind; 3) the thought process; 4) motivation and decision making; and 5) the cultural context. From the point of view of CT, genetics, neurobiology, cognitive science, animal and human behaviour, palaeoanthropology, history, modelling and the philosophy of mind are to be taken into account.

In the context of this new demand for technologies and of the necessity to examine them, I should mention the research projects Tecnologías Convergentes NBIC (Aguiló, 2005; EOI, 2005) and CONTECS – Converging Technologies and their Impact on Social Sciences and Humanities (2008). Those projects serve as a base from which to reflect; the latter in particular is a Specific Support Action that, within the Sixth Framework Programme for research, was financed by the European Commission from February 2006 to April 2008 and was conducted by the Institut für Technikfolgenabschätzung und Systemanalyse of the Forschungszentrum Karlsruhe (ITAS), the Fraunhofer-Institut für System- und Innovationsforschung (FhG-ISI Karlsruhe) as project coordinator, the Saïd Business School (Oxford) and the École Normale Supérieure (Paris) (Andler et al., 2008; Fleischer et al., 2008; STOA, 2009).
3. WHY EXAMINE AND REFLECT ON CT, AND RELATE IT TO NATURALIZATION?

Careful reading of and reflection on the National Science Foundation document (Roco & Bainbridge, 2003), the CTEKS document, the cited works by Grunwald, Coenen and Zonneveld et al., as well as the CONTECS project, STOA (2009), Galert et al. (2009) and Galert (2010), among many other publications that appear in the bibliography, leads us to consider the promises and possible scenarios of futuristic human development. These contain themes that are very important in terms of society, ethics, ontology and anthropology, insofar as CT can open up new fields for human choice and action. This is emancipating, since it offers us new opportunities for action: technical enhancement that includes remodelling the body and mind. This could lead us to think that we are facing a culturalization of elements of the human being that until now were natural.

The naturalist argument that I have been setting out (that human beings are physiologically what evolution has naturally made of them, and that this is how it should be) is no longer of use in a world dominated by technology. Today, the romantic concepts of nature and of human nature have to face up to another concept of nature. If we consider the concept of nature as something sacred (as it was in the past), then the concept of technological progress could be seen as an error, since for some it destroys nature while for others it transforms it. As a rule, human beings are perfectly satisfied with, and indeed proud of, building dikes against a flood: it signifies a victory over nature, which is not always our benevolent friend, but often our enemy. But is the human being not, from its very beginning, naturally an artificial being? (Gesang, 2007, pp. 12, 108-139). Gesang is a philosopher who tells us that if that is the case, then the desire for technical enhancement is just as natural and sacred as any inherited part of the human body. Since we have no natural relation with these two spheres that we could look upon in order to settle the dispute, we need other criteria that go beyond naturalness. Neither nature nor human nature, according to the author, can themselves be considered values, and moral values are only deduced from subjective interests. The value of nature only exists for human beings. In the light of these considerations, nature is highly valuable; in particular, it is the base for the great moral acquisitions, such as human rights. What are we to understand by the nature of the human being, and what makes it normative?viii

It can be claimed that in this context to seek recourse in the nature of the human being is rather problematic, since our understandings of what the human being is, of what really constitutes it, are highly varied and can even be completely different (Clausen, 2006; 2009, p. 25). For human nature to be upheld as morally normative, it would be necessary to show, as Engelhardt says (1991, p. 80), either: 1) that the process of its design was such that it was allocated an intrinsic moral significance; or 2) that there are properties of the design which show that it imposes an absolute moral obligation on us. Neither of these can be demonstrated, since human nature is considered to be the result of natural processes, and these are explained without recourse to transcendental religious or teleological explanations. Human nature, as the result of natural biological processes and as the biological substrate and base for human psychological and sociological phenomena, is the result of random mutations, coincidence, accident, the limits imposed by biochemistry, genetic drift, natural selection and other natural forces. We are, therefore, the result of a specific history, that of Homo sapiens, but not of a normative structure.

At the same time, CT and the corresponding futuristic expectations can dissolve traditionally held values, certainties and self-understandings, to the degree that the contingency of the conditio humana increases. That may require a new orientation in a new contingent situation that expands the possible choices between options. At the same time it may require a reduction in dependency on nature and on the traditions of humanity; but it may be accompanied by ambiguous or uncertain situations for the human being, through bringing traditionally recognized certainties into question (Grunwald, 2007a, pp. 3-7; 2007b, pp. 381-383; 2007c, pp. 949-953; 2007e, pp. 271-288; 2009a, pp. 208, 210, 213, 218).

Will we be technoscientifically determined, or will there be spaces of freedom based on social and ethical considerations? Will all human problems be resolved technoscientifically? Are we not passively witnessing a technification of the human being: a human being-machine symbiosis, cyborgs (Sanmartín, 1990, pp. 173-179), a perfect synthesis between minds and machines (where the soul will fuse with the chip, a union of human sensitivity and artificial intelligence will occur, and machines will, in the end, acquire human attributes), as Ray Kurzweil (1999) claims? Are we witnessing a trans-human or post-human process (movements in favour of radical human enhancement, which could lead to the transformation and surpassing of the human species through an entirely technical civilization)? Will human enhancement destroy the base of human rights and give rise to a world in which some human beings do not recognize others as humans, as they will have other, totally different capacities? What does human enhancement mean, and how should we view it?


José Sanmartín, in his evocative, illustrative and critical book of 1987, Los nuevos redentores, sets out some reflections on genetic engineering and the "brave new world" that we are promised by certain scientists and technologists, who attempt not only to dominate nature and orient it towards ends determined by us, but to replace it. Such a move from knowing to doing has been made, according to that philosopher of science (Sanmartín, 1987, p. 12), on several occasions without taking sufficient note of the ethical implications, the social costs or the ecological consequences. For his part, Jürgen Habermas (2009, p. 38), in his attempt to "moralize human nature", to cite W. van den Daele, says: "What science makes technically available, moral control should make normatively unavailable."
4. CT AND THE ENHANCEMENT OF HUMAN CAPACITIES

Human enhancement (both physical and intellectual), that is, the increase of human cognitive and physical capacities, is not a new idea. In fact, it is a very old idea, as old as human beings themselves, and it has taken up very many technical machinations and become home to much futurism. (Even our biblical ancestors Adam and Eve, for example, desired to become godlike.) How many people, at present, are satisfied with their body and with their mind? Would we not like to be different? Would we not like to know everything and be all-powerful? Would we not like to conquer aging and even death?

The working group of the Europäische Akademie Bad Neuenahr-Ahrweiler in Germany has adopted the expression human enhancement as a terminus technicus. It can be defined as the attempt to technically improve the normal properties of healthy human beings by means of technical bodily intervention, or as any attempt to improve something or someone. Within bioethics, it focuses on the use of technologies or pharmaceutical products (psychoactive drugs) to improve human capacities, particularly improving them beyond the standard normal level. Each technique and technicalization signifies a permanent extension of, or increase in, human possibilities that serves to improve human capacities. The terminus technicus human enhancement refers to a modification that is aimed at an improvement in individual human performance, brought about through interventions in the human mind and body based on science or technology. This definition from the STOA report (2009, p. 22), which offers a balanced, critical and rational perspective on the development of science, technology, medicine and society, is in contrast to positions adopted concerning human enhancement that can sometimes be highly visionary and very ideologized (Coenen, 2009b, pp. 143-144). The definition includes strong forms of human enhancement, which have long-term effects or permanent results, together with temporary enhancements. Since this concept is not related to a specific definition of health, it is a non-medical concept of human enhancement. The STOA study of human enhancement distinguishes between purely restorative therapies that do not aim to enhance, enhancement therapies, and enhancement that is not therapeutic. This umbrella term, as it can be considered in general, therefore refers to a very broad spectrum of existing and emergent technologies, some of which are visionary. The term also includes pharmaceutical products; neuroimplants that restore sight, for example, or that provide other artificial senses; neuro-enhancement pharmaceutical products (commonly known as "brain doping": drugs that increase the power of the mind of healthy people or their emotional state); germ-line engineering and existing assisted reproductive technologies; new technologies of brain stimulation; gene doping in sport; cosmetic surgery; anti-aging and longevity medication; highly sophisticated technological applications that can supply special sensorial inputs or mechanical outputs; and others. In general, underlying all the techniques and products that aim at human enhancement is the goal of pushing back the frontiers of scientific and medical research. All the research on which technologies of enhancement are based expands the known limits of scientific disciplines.ix

It seems that today, more than ever, we are witnessing what has been called a type of normative discontent with one or several specific bodily characteristics, independently of any psychopathology, which we want to improve (Fuchs et al., 2002, p. 73). Our corporal dissatisfaction is overcome or corrected, at the individual level, through plastic and cosmetic surgery (body surgery). This is gaining ever more importance in our society at the individual level, and it has greater and greater economic repercussions, while at the same time it is not frowned upon, or at least not entirely. It should be noted, however, that the concept of outer appearance is a subjective one and that it arises, in general, in comparison with others. In this field a distinction must be made between those people who can be helped by a small surgical correction and those with a dysmorphophobia, who undergo an operation and require additional psychiatric treatment. From a socio-ethical point of view, we have to analyse whether the patient undergoes the operation voluntarily and autonomously, or whether they are conforming to highly controversial imposed social norms (the tyranny of social norms and the consumer culture). Sometimes there is an illusion that we can control our lives and old age, and that we can continuously improve. Many people in our society are entirely won over by an external image and an appearance of youth, while they often overlook the internal aspects of the human being. In order not to offer blind support, it is important to analyse the interactive doctor-patient model, which can be paternalist, informative, interpretative or deliberative. This last possibility introduces a consideration of the values of both patient and doctor in relation to human health and welfare. Since this type of surgery is, as a rule, outside traditional medical teleology, it can be maintained that cosmetic surgery does not, roughly speaking, belong to the field of medical need, as its objective is not health but beauty. Medical aims are guided by physical realities that can be objectively put to the test, while cosmetic surgery aims more at satisfying subjective desires.x

We attempt to improve our physical limits, as a rule, through constant training. In sport, for example, there is intensive training or the use of technical means to improve performance. The slogan of sport has always been citius, altius, fortius: faster, higher, stronger.
Meanwhile, the International Olympic Committee defines doping as: "1) the use of an expedient (substance or method) which is potentially harmful to an athlete's health and/or capable of enhancing their performance; or 2) the presence in the athlete's body of a prohibited substance or evidence of the use thereof or evidence of the use of a prohibited method". In some cases the use of doping (stimulants, narcotics, anabolic steroids, diuretics, peptide hormones, growth hormones, gene doping, etc.) is considered very unsporting and is penalized. The discussion concerning doping is set within the idea of fair play, of equality of opportunities (that is, the same external conditions) and of authenticity (Fuchs et al., 2002, pp. 85-106). In 2002, Fuchs et al. turned the spotlight on the field of enhancement, in addition to plastic and cosmetic surgery and sport, as mentioned above, in the context of techno-genetic enhancement, growth hormone treatment in paediatrics and mental enhancement through psychoactive drugs.

At the collective level, humanity (which has always resented its defects, whether in moral terms or in terms of a civilizing goal) has tried to correct its deficiencies or improve itself, especially since the Enlightenment, through personal education and culture (the so-called cultural techniques, such as learning, training, intellectual exercise, etc.), in the hope of improving the human condition and society in general. Today there is a hope, however, that we will improve human capacities, and hence social actions, not through education and culture, but through the convergent development and application of nanotechnologies, biotechnologies, genetic technologies, information and communication technologies, cognitive sciences, neurotechnologies and research into the human brain.

If we accept the claim of Roco and Bainbridge (2003), the technical enhancement of human capacities is focused on quantitative technical capacities, which range from the expansion of human sensorial faculties (for example, the enhancement of the eye, or of human hearing, etc.) through memory and the brain (implantation of chips, neuroimplants, neuro-enhancement, etc.) to the slowing or disappearance of human aging. What certainly increases is the contingency of the conditio humana (Grunwald, 2007a, pp. 4-5; 2007b, pp. 382-383; 2007c, pp. 950-951; 2007e, pp. 271-288).

In general, we can state that today three techniques for the technical enhancement of the human being are available to us: 1) genetic techniques (genetic modification); 2) surgery and all sorts of implants in all parts of the body and brain, including artefacts, neuroimplants and neuroprosthetics (neuroelectronic enhancement) (Rosahl, 2009, pp. 13-20), incredibly complex brain-computer interfaces (Clausen, 2009, pp. 20-29; Hennen et al., 2008), nanorobots, and both prenatal and perinatal enhancement; and 3) the results of pharmacological and medical research applied to the increase of cognitive competences and of sensorimotor capacities, and to psychic and neurodegenerative disease therapy. In general, the aim is to apply NBIC technologies and products to, among other things, the perfecting of human capacities, the prolongation of life, the enhancement of learning processes, the acquisition of an infinite memory and the overcoming of the constraints of nature.



The CONTECS project (2008, p. 10), with the help of experts from different technoscientific fields, has identified eight research and development areas in which it is possible to locate almost all work in the field of CT: 1) neurosciences and the enhancement of the brain; 2) physical enhancement and biomedicine; 3) synthetic biology; 4) the human being-machine interface; 5) different sensors; 6) recognition models; 7) models based on computerization in the world; and 8) robots together with intelligent software and artefacts.
5. THE ROLE OF HUMAN ENHANCEMENT TECHNOLOGIES AND SOME SOCIO-ETHICAL CONSIDERATIONS

One of the fundamental questions in relation to human enhancement is what the goals and objectives of this human enhancement are. This question is related to social and political visions, to ideological factors and anthropological concepts, as well as to the fundamental values that shape the debates and the technoscientific activities, and that can undoubtedly influence the definition of concepts such as health, normality, therapy, perfectibility, etc.

Technologies of human enhancement range from specific projects, such as the alteration of the metabolism of soldiers or the development of sophisticated brain-machine interfaces, to the development of images of a posthuman future in which a symbiotic human being-machine civilization spreads from Earth out into space. Some posthuman and technofuturist visionaries even consider the possibility of the total replacement of humanity by intelligent machines, while others claim that human being-machine hybridization, rather than enhancement, is the only option left open to humanity in order to prevent a near-future scenario that could displace the human being from centre stage. Such transhumanism believes in a radically new technical civilization that moves beyond current humanity.xi O'Mathúna claims that transhumanism, the term used to describe philosophies that support the belief in progress towards a future posthuman condition, has been engulfed within the term posthumanism, which covers a group of philosophies united by the promotion of human enhancement. The latter maintain that technology should be developed and used to enhance the human mind, body and soul. Through science and technology, posthumanism attempts to control human evolution, perhaps towards the emergence of a new species: the posthuman. This tendency views the body as an entity separate from the self.

One very important representative of posthumanist technofuturism could, as Christopher Coenen claims (2007, pp. 141-172; 2009a, pp. 47-49), be Ray Kurzweil (1999, 2009). With considerable enthusiasm Kurzweil praises a radical change or transformation of the human being through science and technology, while offering us a techno-beyond-the-grave cosmic vision of transhumanism or posthumanism. Placing his seal on the discourse about nanotechnology and the special discourse related to CT, he has played a central role in the debate on human enhancement. The goal of such posthumanist technofuturism is the creation of a radically transformed human being, or the creation by means of engineering of a new artificial human being. This practical posthumanism, as Christopher Coenen defines it (2007, pp. 142-143), is the enhancement or increase of human capacities and abilities through the creation of new man-machine interfaces and through the use of drugs and the other means that I mention above. One of Kurzweil's favourite visions is the possibility of digitally scanning (uploading) the human mind and transferring consciousness to computers and new robotic artefacts or other artificial bodies. As a consequence of this, a type of personal immortality would be achieved and the ego (or at least a copy of it) would exist eternally. These quasi-immortal cyberminds could travel out into space and colonize it (Bainbridge, 2004).

This ideology of extreme progress, characterised by the transhumanist tendency and organized into a transhumanist movement (the World Transhumanist Association), can lead to the erosion of science from within in such central themes as CT and human enhancement. Such an ideology emerging from within the field of science can lead, on the one hand, to the emergence of a critical spirit in relation to technique and progress, setting up obstacles for non-scientific beliefs or fundamentalist tendencies; while on the other hand, due to its optimism regarding technique and progress, it may open the door to salvationist ideologies or to mystical, quasi-religious thought. In this way the struggle to gain knowledge and to move toward truth may be abandoned, at least as a normative idea, to become a mere decorative accessory. NBIC in the USA is an important point of reference for transhumanism for two fundamental reasons, according to Coenen (2007, p. 145): due to the prominence of human enhancement and the consideration of far-reaching posthumanist visions in the visionary NBIC programme; and because some NBIC practitioners are very close to organized transhumanism.

In order to analyse this subject, a group of experts meeting in Brussels in September 2008 (Coenen, 2008a, 2009a; STOA, 2009) stated that within human enhancement we should distinguish between the improvement of the species, with its eugenicist overtones (which is not at all promising as a guiding vision, for historical, pragmatic and metaphysical reasons), and enhancement in individuals.xii At that Brussels meeting it was suggested that a more appropriate guiding vision at the European level was to consider enhancement both at the level of individual welfare and at that of social cohesion, examining the relations between the social and individual factors. Coenen (2008b) prefers to talk, in this sense, of human optimisation rather than of human enhancement. Some experts claimed that human enhancement discourse is strongly influenced by an uncritical faith in science, and that alternative visions of the future and proposals for resolving the problems of society are totally absent from such discourse, or no attention is paid to them. Many people approach the issue with considerable hope, as they only listen to the political agents or members of the technocratic elite; it is therefore necessary to imagine alternatives and to develop social visions related to science and technology with greater public participation. Vision and orientation are therefore needed to guide the future development of research and technologies that are relevant in the context of human enhancement; and that vision and orientation should undoubtedly be based on a social perspective focused on social cohesion and distributive justice (improvement for all the people concerned) as a framework for individual choice.

There is still a considerable separation between the current visions of CT and the applications to which it will actually be put (future applications of CT are still very difficult to predict). We do not yet know the specific configuration of the techniques or their capacity to provide results. However, we can state (Andler et al., 2008, pp. 22-26; Fleischer et al., 2008, pp. 76-77) that a considerable part of the ethical discourse and of Technology Assessment is concerned with general issues about the development of techniques, and specifically with the opportunities and risks (risk characterization, risk assessment, risk management, risk communication) of human enhancement techniques. It will also be necessary to take into account, in this theme, the precautionary principle, as the European Union has defined it.xiii

Grunwald (2007a, pp. 7-13; 2007b, pp. 383-391; 2007c, pp. 953-955; 2007e, pp. 271-288; 2009a, pp. 210, 216, 218) proposes research into the role of futuristic communication in order to provide a new orientation, given the growing contingency of the human condition, in which there is no longer an ideal state for the physical or mental constitution of the human being and in which the normal ideal state can be optimized. This futuristic communication contributes: 1) to increasing the contingency (a mediating or catalyst function: the certainties of the past are eliminated and new contingencies are created without the technical preconditions having been established); 2) to indicating the growing contingency (an indicative function); and 3) to managing the consequences (an orientational function). This orientation is not an automatic consequence; there may be important doubts (expectations of salvation, the promise of new paradises, and possible fears or catastrophes). In order to make constructive use of the potential for orientation in futuristic communication, it is necessary to put forward a new instrument such as the assessment of the futuristic vision.xiv This assessment can constitute, according to Grunwald, a new element in the toolbox of Technology Assessment, as mentioned above. Philosophical reflection and research in the philosophy of science and technology can and must also make a contribution, together with empirical science and communication science, to the analysis of visions as a means of communication, with its cognitive and evaluative consequences, in order to make transparent and rational discussion possible. What is to be done in futuristic vision assessment, according to Grunwald, is: 1) epistemological vision analysis, that is, to reveal the content of the vision and to judge epistemologically the scope of its reality and realizability, based naturally on normal knowledge (the question of validity); 2) vision evaluation, in order to categorize and judge the cognitive aspects based on their degree of realization (the relation between knowledge and values), following the evaluative methods of Technology Assessment; and 3) vision management, in order to decide and act rationally. According to Grunwald, the question always consists of how the public, the mass media, politics and science can be informed in relation to rational uses of visions. Communication of the cognitive and normative background to the visions forms part of responsible communication, so as to be able to carry out transparent discussion.


Although it is not always easy to separate out the ethical as a field of social and legal research,xv and, in this case, of philosophical and anthropological reflection, there are a series of questions related to human enhancement in visionary expectations (or even possibilities) as a whole that could be expressed in the following way and which, of course, require an answer that can be understood from a philosophical, ethical and social perspective: What does "human being" mean today? Where does human dignity reside? What is nature today, and specifically, what is human nature; and what aspects of human nature could be considered to have a normative character? Can we, and indeed should we, place the naturalness of the human being in danger, or even eliminate it, through technical enhancement? What does the moralization of human nature mean (Habermas, 2009, pp. 38ff)? What are personal identity and authenticity, and what will become of them? What, in this context, do self-determination and free will mean? What feelings of self-deception do people have with respect to cognitive enhancement in work ethics, in personal ambition, in effort and in authenticity? How should we deal with coercion or pressure (social and economic aspects, such as disadvantage through a lack of enhancement, economic implications, etc.)? How can we distinguish the natural image from the unnatural image, and permanent modifications from the merely temporary? What happens to the matter of distributive justice: who can have access to technologies of enhancement? Will there, therefore, be a division of society between those who can and those who cannot have access to enhancement? What consequences and implications are there for our concept of humanity and for the society of the future? Are there, or should there be, limits to the technical enhancement of the human being (to the technification of the human being)? Under what circumstances should they be applied? How are such criteria to be decided on? How far can and should human beings go in the technical reconstruction of their bodies and minds? Are there risks associated with technical enhancement? What attitude should we adopt with respect to technical enhancements? Where does therapy end and non-medical use (or abuse) of technical enhancement start? Will informed consent be enough? Can anything be regulated normatively? What will the meaning of life be, and what will become of the good life? How should conflicts with norms, beliefs and values be dealt with? Will other cultures also have to be taken into account? Will the whole of civil society have to be involved in the debate? Should public funding be dedicated to these technical enhancements? Where does personal responsibility rest in the design of our body and mind? What world, what society is worth living in? Can philosophy provide an orientation on all these questions, which should not vanish from philosophical reflection if these themes are to be adequately tackled ethically and socially?xvi
6. NATURALISM AND THE DETERMINISTIC MODEL OF THE HUMAN BEING

The concept of CT places emphasis on naturalism and considers human beings, including their cognitive structures, to be natural entities that constantly interact with other entities. Natural-scientific and technological research will lead to explanations and enhancements of the human being, and CT is believed to have the necessary tools to do so, both at the individual level and at the level of society. The activity of CT is more technological than scientific; it depends on material or natural determinism, but it works together with human beings to impose a new and highly effective determinism. Such a posture is, according to some authors in the human sciences, not acceptable and may even be seen as detestable.

A fundamental question related to the theme of naturalization could be formulated in the following manner: when can naturalization be acceptable? Researchers from the CONTECS project (2008, p. 33) state the following as a hypothesis that is worthy of study: naturalization is acceptable, both to a great majority of researchers in the human and social sciences and to public opinion, to the extent that what is naturalized (in the double sense: in the strict sense of bio-naturalism and in the broad sense of the formal-quantitative method) is an inferior cognitive faculty, or that part of a superior faculty that consists of an empty form or a space of formal possibilities. Naturalization could not go beyond that form to reach the contents affected. Thus, what a person says, believes or intends on a specific occasion is not subject to naturalization. If this line is not crossed, then free will, responsibility, etc. will be beyond the reach of natural science and also beyond technological intervention. If this hypothesis is false, then it would be necessary to study the roots of the resistance to naturalization. If it is true, then it would be interesting to study in detail how this intuition is really expressed: in particular, how people (experts and non-experts alike) draw the line between form and content, and how the intuition stands up to counterexamples. It would be necessary, therefore, to study the phenomenon of intuition both from the naïve point of view and from the fields of philosophy and social epistemology, as well as from the cognitive sciences (developmental psychology, social cognition, and cognitive and evolutionary anthropology).

The claim that naturalizing the human being would eliminate or endanger its naturalness by means of technical enhancement cannot be seen as a strong argument, since the naturalness or culturality of the human being is tied to different interpretations of the human condition, as was already expressed in Section 3 above. It can therefore be stated, in accordance with the arguments given there, that if there is no normative design in nature that can be appreciated in lay terms (which means that a natural and rational philosophy demands a critical evaluation of the central, core ideas and images of the scientific and cultural setting), then, as Engelhardt claims (1991, p. 83), from a lay point of view there would be no moral difference, in theory, between curing a defect (i.e., restoring the de facto design of human beings) and increasing human capacities (i.e., altering the de facto design of human beings in order to better achieve the goals people have, goals that are themselves susceptible to change). It would therefore be possible to defend the claim that the argument that we should not technically improve faculties that were biologically-evolutionarily acquired, since they have developed this way and are evolutionarily adapted, could be considered a naturalist fallacy (Grunwald, 2007a, p. 6).
Such a flawed argument would lead nowhere in a normative sense, and in addition it would limit our capacities to those properties that were naturally given, which would reduce humanity to a museum piece. Furthermore, it would deny the cultural aspect of the human being, to which transcendence of the self belongs, that is, thinking beyond that which is given. Neither does it follow from such an argument that technical enhancement is permitted, or that it is a technological imperative. What are required are criteria and a solid orientation, so as to be able to make decisions personally in a responsible and rational way. The question should be: why improve, and to what end? We should reflect on and weigh the effects and the risks, and take decisions in the full light of the ethical and social background to the situation. As Dupuy says (2004), the role of ethics is not to tell us what is good and what is bad, but rather to force us to ask tricky questions about aspects of the human condition that we generally take for granted. Grunwald (2009a, pp. 216-218) in turn considers a double function of ethics in this issue: the function of limiting the technoscientific possibilities, or at least of limiting the socially, politically and ultimately legally acceptable applications (a role of warning and watching, of holding back, of being hypercritical); and an ethics that contributes to guiding technoscientific creativity and to dissipating certain traditional moral sureties and certain uses (a critical analysis of traditions that goes beyond that which is given).

The European Union's Seventh Framework Programme for research and development brings research ethics to the fore. Ethical precaution (Bidet, 2009, pp. 30-31) cannot be seen as a restraint on the freedom and independence of research, but rather as a way to fix certain limits and a way to guarantee and preserve excellence in research. Any project submitted to the European Commission is first subjected to scientific evaluation. If it passes that, a group of experts from the Governance and Ethics unit of the Directorate-General for Research carries out a preliminary ethical examination: the ethical report will consider whether the project is good or inadequate, and it can always be the object of an ethical audit.

The concept of human enhancement (Fuchs et al., 2002, pp. 16-17, 24-25, 44-45) deserves detailed consideration: 1) concerning the role that technoscience, and specifically medicine and medical actions, play in re-establishing and conserving health, together with the use and application, in general, of technoscientific knowledge; 2) concerning the use of scarce medical and research resources in relation to human health and welfare; 3) concerning the socio-ethical legitimacy of the enhancement of the human being in the light of distributive justice, as already mentioned, among people and regions, and of equality of opportunities; 4) concerning, of course, the model of society that we desire and in which we wish to live; and 5) concerning what type of human being we wish for, with what dignity and with what normative concept (the need for self-assessment of the human being and of its design). A certain philosophy and an evaluative anthropology are therefore required, within an evaluative conception of nature, in order to understand what it means to be human and thereby to obtain an adequate self-understanding of the human being, in the knowledge of what we want when we talk of improving the capacities of the human being, and a social orientation that occupies a significant place in research policy.



Finally, we can state that in order for research into clinical interventions (for example, direct intervention in the brain, or the brain-computer interface; see Clausen, 2009, p. 26) to be ethically acceptable, it has to meet the following criteria: a) it has to create value; b) it has to have scientific validity; c) the people who are the subjects of the study have to be chosen fairly; d) there has to be an acceptable risk-benefit relation (with information on the possible risks and benefits for the people involved); e) there has to be some form of independent testability; and f) informed consent is needed, together with respect for the people who take part in the research. To sum up, a basic principle for establishing what is ethically acceptable and what is not in all scientific research is to recognise human dignity and to respect it. When it comes down to it, we have to constantly ask ourselves what we want and what our objectives are when we aim to increase human capacities. This requires knowledge and reflection before deciding which option we really want.
NOTES
i This work forms part of the Research Project "Naturalizing Philosophy: A Metaphilosophical Reflection in the Context of Contemporary Culture" (EHU2009/03), funded by the University of the Basque Country (UPV/EHU). On the possibility of naturalizing philosophy, see Galparsoro (2010, pp. 77-123).
ii On the concept of naturalness, see Birnbacher (2006).
iii See Roco and Bainbridge (2003). They were the first to use the term CT. The concept has been expressed graphically through the NBIC tetrahedron.
iv From the extensive literature on the subject, see Parens (1998); Fuchs, Lanzerath, Hillebrand, Runkel, Balcerak, and Schmitz (2002); Aguiló (2005); EOI (2005); Coenen, Rader, and Fleischer (2004); Coenen (2007); Andler et al. (2008); Fleischer, Quendt, and Rader (2008); Coenen (2008c); Grunwald (2007a, 2007b, 2007c, 2007d, 2007e, 2008, pp. 227ff, 2009a, 2009b); Sandel (2009); Birnbacher (2006); Gesang (2007); Zonneveld, Dijstelbloem, and Ringoir (2008); O'Mathúna (2009); the complete issue of Technikfolgenabschätzung – Theorie und Praxis, 18(2), September 2009, devoted to Converging Technologies, with contributions by different authors, including Christopher Coenen, who presents CT under the epigraph "a magic word" (pp. 44-50); and Galert (2010). See also the website http://www.converging-technologies.org/convergingtechnologies.html (10.3.2011). I will return to the theme of human enhancement, the subject of the present reflection, in more detail below.
v Expert commission brought together and financed by the K2 Unit "Society and Technology Foresight" of Directorate K "Knowledge-based Economy and Society" of the European Commission's Directorate-General for Research.
vi See also Coenen et al. (2004), which reports the position of this European group.
vii See Saage (2006), which compares the American and European projects and claims that the latter is an alternative to the American, since, on the one hand, it recognizes the opportunities of these leading technologies and, on the other hand, it considers the dangers of these new developments, where a new social contract could be proposed that considers technological developments not as a mythical destiny which cannot be escaped, but as the result of a democratic agreement within European civil society. See also Wolbring (2009, pp. 30, 32), who claims that CT is much more than a set of different natural sciences and technologies.
viii On the concept of naturalness, see Birnbacher (2006), who deals with this concept in relation to artificiality (naturalness as value, as norm) and questions whether we can justify the preference for the natural over the artificial in our everyday morals.
ix. See, among others: Zonneveld et al. (2008); Technikfolgenabschätzung – Theorie und Praxis, 18(2), September 2009, pp. 6-7, 11, 14-15; Grunwald (2009a, 2009b); STOA (2009); Coenen (2009b); Gesang (2007, pp. 4, 37-41); O'Mathúna (2009, pp. 128-157, 197); Galert et al. (2009); Galert (2010); and Twine (2010).
x. See Fuchs et al. (2002, pp. 71-85).
xi. On "wish-fulfilling medicine", see González Quirós and Puerta (2009).
xii. On post-humanistic technofuturism, see Coenen (2006, 2007); STOA (2009, pp. 94-97); Birnbacher (2006, pp. 173-179); Gesang (2007, pp. 40-41); Twine (2010, pp. 175-195); and O'Mathúna (2009, pp. 158-186, 200-201).
xiii. On the discussion of whether human genetic engineering is heading towards a new "bottom-up" eugenics, see Luján (1991, pp. 125-156). These new technologies could lead to a new eugenics taking hold if, in a strongly meritocratic social context, parents are put under pressure when taking reproductive decisions (Irrgang, 2002; Romeo-Casabona, 2004, p. 325). The latter author claims that genetic interventions in the germ line or cell line, or in the process of biological selection, are lasting and even irreversible, and that they could be considered eugenics if they are directly linked to human reproduction. Jürgen Habermas (2009, pp. 9, 29ff) wonders whether we are moving towards a "liberal eugenics", understood as a recent development that promotes the use of genetic and reproductive technologies to help people select features of their own children (O'Mathúna, 2009, p. 197; Sanmartín, 1987, 1990, pp. 173-176).
xiv. See Comunicación de la Comisión sobre el recurso al Principio de Precaución, COM 01.02.2000. In connection with this, see also Grunwald (2004). Meanwhile, Coenen (2004) proposes "vision assessment" in order to examine nanofuturism as part of technofuturism.
xv. With regard to the legal perspectives on new psychiatric treatments and on research related to those treatments, together with the enhancement of psychic capacities and the need to establish new legal decisions, see Romeo-Casabona (2004).
xvi. See also Grunwald (2007a, pp. 5-6, 2007b, p. 383, 2007c, pp. 950, 952, 2007e, 2008, 2009a, p. 211) and Nordmann et al. (2006).

REFERENCES
Aguiló, Jordi (Coordinador). (2005). Tecnologías convergentes NBIC. Situación y perspectiva 2005. Barcelona: CSIC. (http://nbic.org.es; http://nbic.org.es/institute/downloads-eu/NBIC2005.pdf, 30.12.2008).
Albert, Hans. (1987). La posibilidad del conocimiento. Teorema, XIVII(2), 127-144.
Andler, Daniel, Barthelmé, Simon, Beckert, Bernd, Blümel, Clemens, Coenen, Christopher, Fleischer, Torsten, Friedewald, Michael, Quendt, Christiane, Rader, Michael, Simakova, Elena, & Woolgar, Steve. (2008). Converging technologies and their impact on the social sciences and humanities (CONTECS). An analysis of critical issues and suggestion for the future research agenda. Final Report, May. (http://www.contecs.fraunhofer.de/images/files/contecs_report_complete.pdf, 31.12.2008).
Bainbridge, William S. (2004). Progress toward cyberimmortality. In Immortality Institute (Ed.), The scientific conquest of death. Buenos Aires. (http://www.imminst.org/book, 10.3.2011).
Binet, Audrey. (2009, June). ¿El final de los científicos locos? Research EU. Revista del Espacio Europeo de la Investigación, 60, 30-31.
Birnbacher, Dieter. (2006). Natürlichkeit. Berlin: Walter de Gruyter.
Blasco, Josep Lluís, & Grimaltos, Tobies. (1997). Teoria del coneixement. València: Servei de Publicacions, Universitat de València.
Clausen, Jens. (2006). Die Natur des Menschen: Geworden und gemacht – Ethische Überlegungen zum Enhancement. Zeitschrift für medizinische Ethik, 52(4), 391-401.
Clausen, Jens. (2009, September). Ethische Aspekte konvergierender Technologien. Das Beispiel Gehirn-Computer-Schnittstellen. Technikfolgenabschätzung – Theorie und Praxis, 18(2), 20-29.
Coenen, Christopher. (2004, June). Nanofuturismus: Anmerkungen zu seiner Relevanz, Analyse und Bewertung. Technikfolgenabschätzung – Theorie und Praxis, 13(2), 78-85.
Coenen, Christopher. (2006). Der posthumanistische Technofuturismus in den Debatten über Nanotechnologie und Converging Technologies. In Alfred Nordmann, Joachim Schummer, & Astrid Schwarz (Eds.), Nanotechnologien im Kontext: Philosophische, ethische und gesellschaftliche Perspektiven (pp. 195-222). Berlin: Akademische Verlagsgesellschaft AKA.
Coenen, Christopher. (2007). Utopian aspects of the debate on converging technologies. In Gerhard Banse, Armin Grunwald, Imre Hronszky, & Gordon Nelson (Eds.), Assessing societal implications of converging technological development (pp. 141-172). Berlin: Edition Sigma.
Coenen, Christopher. (2008a, December). Expert meeting human enhancement. Shifting boundaries, changing concepts: The challenges of human enhancement to social, (dis-)ability, medical and ethical frameworks. Technikfolgenabschätzung – Theorie und Praxis, 17(3), 144-145.
Coenen, Christopher. (2008b). Die Vollstreckung des Prinzips der Technizität – Anmerkungen zu aktuellen Visionen wissenschaftlich-technischer Konvergenzprozesse. Presentation at the conference Topoi der Rationalität. Technizität, Medialität, Kulturalität. Potsdam: Institut für Philosophie der Universität Potsdam.
Coenen, Christopher. (2008c). Konvergierende Technologien und Wissenschaften. Der Stand der Debatte und politischen Aktivitäten zu Converging Technologies. TAB-Hintergrundpapier No. 16 (March). Berlin: Büro für Technikfolgen-Abschätzung beim Deutschen Bundestag. (Also available at: http://www.tab.fzk.de/de/projekt/zusammenfassung/hp16.pdf, 19.2.2010.)
Coenen, Christopher. (2009a, September). Zauberwort Konvergenz. Technikfolgenabschätzung – Theorie und Praxis, 18(2), 44-50.
Coenen, Christopher. (2009b, September). Human enhancement (May 2009). Technikfolgenabschätzung – Theorie und Praxis, 18(2), 143-144.
Coenen, Christopher, Rader, Michael, & Fleischer, Torsten. (2004, December). Of visions, dreams and nightmares: The debate on converging technologies. Technikfolgenabschätzung – Theorie und Praxis, 13(3), 118-125.
Comisión Europea. (2000). Comunicación de la Comisión sobre el recurso al Principio de Precaución (COM 2000/01.02.2000).
Dupuy, Jean-Pierre. (2004). Complexity and uncertainty. A prudential approach to nanotechnology. Available at: http://portal.unesco.org/ci/en/files/20003/11272944951Dupuy2.pdf/Dupuy2.pdf (3.2.2010).
Engelhardt, Tristram H. (1991, April). La naturaleza humana tecnológicamente reconsiderada. Arbor. Ciencia, Pensamiento y Cultura, 544, T. CXXXVIII (special issue on Gen-Ética: El impacto social de la ingeniería humana, collected by José Sanmartín), 75-95.
EOI. (2005). El desafío de las nuevas tecnologías (Nano-Bio-Info-Cogno). Madrid: Escuela de Organización Industrial. Programa Desafíos.
European Union, Report of a NEST (New and Emerging Science and Technology) High-Level Expert Group. (2005). Sixth framework programme. What it means to be human. Origins and evolution of human higher cognitive faculties. (ftp://ftp.cordis.europa.eu/pub/nest/docs/whatitmeanstobehuman_b5_eur21795_en.pdf, 10.3.2011).
Fleischer, Torsten, Quendt, Christiane, & Rader, Michael. (2008, September). Converging Technologies und die Sozial- und Geisteswissenschaften. Ergebnisse und Erfahrungen aus einem EU-Projekt. Technikfolgenabschätzung – Theorie und Praxis, 17(2), 74-77.
Fuchs, Michael, Lanzerath, Dirk, Hillebrand, Ingo, Runkel, Thomas, Balcerak, Magdalena, & Schmitz, Barbara. (2002). Enhancement. Die ethische Diskussion über biomedizinische Verbesserung des Menschen. Edited by Deutsches Referenzzentrum für Ethik in den Biowissenschaften. DRZE-Sachstandsbericht 1. Bonn.
Galert, Thorsten, Bublitz, Christoph, Heuser, Isabella, Merkel, Reinhard, Repantis, Dimitris, Schöne-Seifert, Bettina, & Talbot, Davinia. (2009). Das optimierte Gehirn. Gehirn und Geist, 11.
Galert, Thorsten. (2010, April). Das optimierte Gehirn. Potenziale und Risiken des pharmazeutischen Enhancements psychischer Eigenschaften. Technikfolgenabschätzung – Theorie und Praxis, 19(1), 67-70.
Galparsoro, José Ignacio. (2010). ¿Naturalizar la filosofía? In José Ignacio Galparsoro & Xabier Insausti (Eds.), Pensar la filosofía hoy (pp. 77-123). Madrid: Plaza y Valdés.
Gayon, Jean. (2003). Naturalisation de la culture, naturalisation de la philosophie: Enjeux et limites. In Wolfgang Buschlinger & Christoph Lütge (Eds.), Kaltblütig. Philosophie von einem rationalen Standpunkt. Festschrift für Gerhard Vollmer zum 60. Geburtstag (pp. 243-275). Stuttgart/Leipzig: S. Hirzel Verlag.
Gesang, Bernward. (2007). Perfektionierung des Menschen. Berlin: Walter de Gruyter.
González Quirós, José Luis, & Puerta, José Luis. (2009). Tecnología, demanda social y medicina del deseo. Medicina Clínica, 133, 671-675.
Grunwald, Armin. (2004). Vision assessment as a new element of the technology futures analysis toolbox. Presented at the EU-US Seminar: New Technology Foresight, Forecasting & Assessment Methods, Seville, 13-14 May 2004.
Grunwald, Armin. (2007a). Converging technologies for human enhancement – A new wave increasing the contingency of the conditio humana. Available at: http://www.itas.fzk.de/deu/lit/epp/2007/grun07-pre04.pdf (10.3.2011).
Grunwald, Armin. (2007b). Converging technologies: Visions, increased contingencies of the conditio humana, and search for orientation. Futures, 39, 380-392.
Grunwald, Armin. (2007c). Orientierungsbedarf, Zukunftswissen und Naturalismus. Das Beispiel der technischen Verbesserung des Menschen. Deutsche Zeitschrift für Philosophie, 55, 949-965.
Grunwald, Armin. (2007d). Kann, soll oder darf man den Menschen technisch verbessern? Neue wissenschaftliche Visionen und ethische Fragen. In Niels Boeing, Wolf Philipp, & Dietmar Herdt (Eds.), Nanotechnologie, Gentechnologie, moderne Hirnforschung – Machbarkeit und Verantwortung (pp. 71-93). Leipzig: Leipziger Universitätsverlag.
Grunwald, Armin. (2007e). Converging technologies for human enhancement. A new wave increasing the contingency of the conditio humana. In Gerhard Banse, Armin Grunwald, Imre Hronszky, & Gordon Nelson (Eds.), Assessing societal implications of converging technological development (pp. 271-288). Berlin: Edition Sigma.
Grunwald, Armin. (2008). Auf dem Weg in eine nanotechnologische Zukunft. Philosophisch-ethische Fragen (pp. 227-311). Freiburg: Karl Alber.
Grunwald, Armin. (2009a). Die technische Verbesserung des Menschen. Was besagt spontane moralische Entrüstung in ethischer Hinsicht? In Heinrich Ganthaler, Otto Neumaier, & Gerhard Zecha (Eds.), Rationalität und Emotionalität (pp. 203-219). Wien/Berlin: LIT Verlag.
Grunwald, Armin. (2009b). Human enhancement – What does enhancement mean here? Akademiebrief, Europäische Akademie zur Erforschung von Folgen wissenschaftlich-technischer Entwicklungen, 88, 1-3.
Habermas, Jürgen. (2009). El futuro de la naturaleza humana. ¿Hacia una eugenesia liberal? Barcelona: Paidós.
Hennen, Leonhard, Grünwald, Reinhard, Revermann, Christoph, & Sauter, Arnold. (2008). Einsichten und Eingriffe in das Gehirn. Die Herausforderung der Gesellschaft durch die Neurowissenschaften. Berlin: Edition Sigma.
HLEG. (2004). Converging technologies. Shaping the future of European societies. A report from the High Level Expert Group on Foresighting the New Technology Wave. Rapporteur: Alfred Nordmann. Brussels. (http://ec.europa.eu/research/conferences/2004/ntw/pdf/final_report_en.pdf, 10.3.2011).
Irrgang, Bernhard. (2002). Humangenetik auf dem Weg in eine neue Eugenik von unten? Europäische Akademie, Graue Reihe, No. 31. Bad Neuenahr-Ahrweiler: Wahrlich Druck.
Kanitscheider, Bernulf. (1991). Biología evolutiva, ética y destino del hombre. Folia Humanística, XXIX(322), 355-381.
Kanitscheider, Bernulf. (1994). Naturalismus und wissenschaftliche Weltorientierung. Logos. Neue Folge, 1(2), 184-199.
Knorr-Cetina, Karin. (1999). Epistemic cultures. How the sciences make knowledge. Cambridge, MA: Harvard University Press.
Kurzweil, Ray. (1999). The age of spiritual machines. New York: Penguin Books.
Kurzweil, Ray. (2009, April). Der Mensch, Version 2.0. Spektrum der Wissenschaft, 121-126.
Luhmann, Niklas. (1996). La ciencia de la sociedad. México: Anthropos/Universidad Iberoamericana.
Luján, José L. (1991, April). Ingeniería genética humana, ideología y eugenesia. Arbor, 544, 125-156.
Nordmann, Alfred, Schummer, Joachim, & Schwarz, Astrid (Eds.). (2006). Nanotechnologien im Kontext. Philosophische, ethische und gesellschaftliche Perspektiven. Berlin: Akademische Verlagsgesellschaft AKA.
O'Mathúna, Dónal P. (2009). Nanoethics. Big ethical issues with small technology. London: Continuum.
Parens, Erik. (1998). Enhancing human traits. Ethical and social implications. Washington: Georgetown University Press.
Pörzse, Gábor, & Várkonyi, László. (2007). European aspects of converging technology development. The case of life and medical sciences. In Gerhard Banse, Armin Grunwald, Imre Hronszky, & Gordon Nelson (Eds.), Assessing societal implications of converging technological development (pp. 121-138). Berlin: Edition Sigma.
Roco, Mihail C., & Bainbridge, William S. (Eds.). (2003). Converging technologies for improving human performance: Nanotechnology, biotechnology, information technology and cognitive science. Dordrecht/Boston/London: Kluwer Academic Publishers. (NSF/DOC-sponsored report, Arlington, VA: National Science Foundation, June. Online: http://www.wtec.org/ConvergingTechnologies/, 4.11.2011.)
Romeo-Casabona, Carlos. (2004). Legal perspectives in novel psychiatric treatments and related research. Poiesis & Praxis, 2, 315-328.
Rosahl, Steffen. (2009, September). Mehr als normal – Verstehen wir die Enhancement-Debatte? Technikfolgenabschätzung – Theorie und Praxis, 18(2), 13-20.
Saage, Richard. (2006). Konvergenztechnologische Zukunftsvisionen und der klassische Utopiediskurs. In Alfred Nordmann, Joachim Schummer, & Astrid Schwarz (Eds.), Nanotechnologien im Kontext: Philosophische, ethische und gesellschaftliche Perspektiven (pp. 179-194). Berlin: Akademische Verlagsgesellschaft AKA.
Sandel, Michael. (2009). The case against perfection. Ethics in the age of genetic engineering. Cambridge, MA: Harvard University Press.
Sanmartín, José. (1987). Los nuevos redentores. Reflexiones sobre la ingeniería genética, la sociobiología y el mundo feliz que nos prometen. Barcelona: Anthropos/Universidad del País Vasco/Euskal Herriko Unibertsitatea.
Sanmartín, José. (1990). La ciencia descubre. La industria aplica. El hombre se conforma. Imperativo tecnológico y diseño social. In Manuel Medina & José Sanmartín (Eds.), Ciencia, tecnología y sociedad. Estudios interdisciplinares en la Universidad, en la Educación y en la gestión pública (pp. 168-180). Barcelona: Anthropos/Universidad del País Vasco/Euskal Herriko Unibertsitatea.
STOA (Science and Technology Options Assessment). (2009, May). Human enhancement study (IPOL/A/STOA/2007-13. PE 417.483). (http://www.europarl.europa.eu/stoa/publications/studies/stoa2007-13_en.pdf, 19.2.2010.)
Technikfolgenabschätzung – Theorie und Praxis, 18(2). (2009, September). Issue devoted to Converging Technologies, with contributions from different authors.
Twine, Richard. (2010). Genomic natures read through posthumanisms. In Sarah Parry & John Dupré (Eds.), Nature after the genome (pp. 175-195). Malden: Blackwell Publishing.
Ursua, Nicanor. (1993). Cerebro y conocimiento. Un enfoque evolucionista. Barcelona: Anthropos.
Vollmer, Gerhard. (2003). ¿Cómo es que podemos conocer el mundo? Nuevos argumentos sobre la teoría evolucionista del conocimiento. Diálogo Filosófico, 57, 356-377.
Wolbring, Gregor. (2008). Why NBIC? Why human performance enhancement? Innovation: The European Journal of Social Science Research, 21(1), 25-40.
Wolbring, Gregor. (2009, September). Die Konvergenz der Governance von Wissenschaft und Technik mit der Governance des Ableism. Technikfolgenabschätzung – Theorie und Praxis, 18(2), 29-35.
Zonneveld, Leo, Dijstelbloem, Huub, & Ringoir, Danielle (Eds.). (2008). Reshaping the human condition. Exploring human enhancement. The Hague: Rathenau Institute. (www.rathenau.nl; http://www.rathenau.nl/publicaties/reshaping-the-human-condition-exploring-human-enhancement.html, 4.11.2011).


JULIÁN PACHO

NATURALISM AND THE NATURALIZATION OF PHILOSOPHY


Disputed Questionsi

Until the Philosophy (…) is finally (…) abandoned. (Bob Marley)

One of the most characteristic features of contemporary philosophy is the lack of consensus as to just what philosophy is. The discussion regarding the nature of philosophy has become more intense in the last few decades. Metaphilosophy has appeared on the scene alongside traditional disciplines such as metaphysics and epistemology (Moser & Mulder, 1994). Within metaphilosophical discourse the naturalist turn (Kitcher, 1992; Callebaut, 1993), which eventually leads to the debate on the naturalization of philosophy, occupies an important place. In one sense this is a place it has always occupied, since theoretical attitudes with respect to naturalism have constituted (expressly or tacitly) one of the most decisive aspects of the characterization of philosophical production. There can be no meaningful idea of philosophy without first adopting a position on questions regarding its proximity to naturalism (or its distance from it). If we had to name one feature of philosophy about which there is genuine consensus, it would be the assumption that, whatever else, philosophy should contribute to the conceptual clarity of the problems debated, given that it perhaps cannot compete with the particular sciences, whether natural or human, when it comes to explaining the world.

In what follows, I aim to make clear the current state of the issue regarding the naturalization of philosophy. In order to do so, I will have to clarify the meaning of the terms naturalism and naturalization; but that cannot be done without first describing, even if only roughly, the philosophical situation within which this debate is framed. I will then analyse some of today's controversial issues concerning the naturalization of philosophy.
1. AGREEING TO DISAGREE

As early as the beginning of the 20th century, the historian of philosophy Windelband (1920, p. 9) noted that there was no definition of philosophy that was universally accepted. In 1919 Oswald Külpe had agreed that it is inevitable that we move away from a common notion of philosophy (Külpe, 1919, p. 408). In 1903 A. Riehl maintained that the first problem facing philosophy today is to
know what philosophy is (Riehl, 1903, p. 3). A lack of consensus of this type would be unthinkable in any other classical discipline, and it would similarly have been unthinkable in philosophy back in the days when the answer was taken to be culturally obvious. Such a lack of consensus regarding what philosophy is could, however, be made much more palatable by appealing to the fact that reflection on its own conditions of possibility is inherent to philosophy. In the Protrepticus (B 6) Aristotle already claimed that to practice philosophy is to ask oneself whether it is necessary to practice philosophy. The history of western philosophy could, in fact, be understood as the never-ending effort to demonstrate that a positive answer to the question that Aristotle claims the practice of philosophy poses is both necessary and possible.

Hegel gives form to the conception of philosophy as indispensable. He maintains that it is an essential part of philosophy to ask where it should start, but also that this question is inevitable if we wish to justify knowledge in general. The origin (Anfang) of philosophy is for Hegel, just as it is for Aristotle, the question about the (initial) origin, about the foundation (Grund) of knowledge in general, without which knowledge would not be possible. Philosophy claims, then, to be the necessary study of the principles or foundations that are required for knowledge to be worthy of the name. Historically, this conception of philosophy has been dominant.

This self-understanding of philosophy as the necessary foundation was accompanied by consensus in the cultural setting. Here also, outside philosophy, it was taken for granted that philosophy was charged with laying the foundations of knowledge in general and, hence, of moral options. Philosophy, it was assumed from without, should and could perform this task. This understanding of philosophy, internally and externally, formed part of the dominant cultural certainties for centuries.

However, such consensus has been lost both from within philosophy and from outside. From without, the loss of consensus is inseparable from the spectacular and unceasing growth in the fecundity of the particular sciences since the start of the Modern Age. One of the direct consequences of this fecundity in areas of knowledge which philosophy was pleased to demarcate as external to or different from itself is, as Manfred Riedel (1978, p. 33) has diagnosed, that since Hegel the gesture no longer accompanies discourse regarding the origin of philosophy, but rather discourse regarding its end. And the terminal status of the practice of philosophy that this diagnosis describes issues from within philosophy itself. Undoubtedly, such a state of affairs is one of the causes of the rise of metaphilosophy in contemporary philosophical historiography (Moser & Mulder, 1994; Collins, 1998). A good part of the metaphilosophical positions of the last century, if not all of them, even before Adorno's influential Wozu noch Philosophie? of 1962 (Adorno, 1977), have to do, more or less directly, with the theme of the end of philosophy. One of the stages on which the end of philosophy would today take place, or be played out, would without a doubt be naturalism.


2. NATURALISM: A HISTORICAL NOTE

The positions adopted with regard to naturalism constitute one of the most important schisms that can open up between schools of philosophy. It would not be a difficult task to organize the whole history of philosophy in terms of those positions. Although not explicitly, this was the background to the great classical arguments in the history of philosophy: that between Aristotle and Plato concerning the ontological status of ideas, as well as the medieval disagreement about universals; that between Hobbes and Descartes about the acceptance of independent non-bodily realities such as the human soul; that between Locke and Leibniz regarding the origin and nature of ideas; Hume's argument with the then-dominant theory of causation; Kant's argument with the finalism present in the conception of nature; the argument between Marxist dialectical materialism and Hegelian idealism; or that of Nietzsche with the metaphysics that prevailed in the western tradition. All of those (and they are by no means the only ones) are arguments that include positions in favour of or against naturalism.

Nevertheless, the explicit debate concerning naturalism belongs well and truly to contemporary philosophy. Kant already used the term naturalist, although not systematically and only in the context of methodology. He used it to ridicule a particular type of empiricism as a pre-scientific method.ii This notion of methodological naturalism was naively contaminated with the notion of the natural, since it maintained that empiricism was a methodological stance that was too natural; that is, pre-scientific. It nonetheless became successful. Dilthey (1911) continued to use the term in this way and identified naturalism with sensualism and crude materialism since, in naturalism, he says, will is subordinated to a life of animal instincts. Even Husserl still employed this meaning when he characterized as naturalist the conception of the world that would follow from the cognitive natural attitude. This is not the place to analyse whether a naturalist conception of the world should respond to cognitive attitudes and instances of natural methodology in this sense. There is a broad consensus that modern science, since the initial stage of defending the mechanization of the world (Dijksterhuis, 1950), is the most prominent bastion of contemporary naturalism (Philipse, 1994, 2001). It would, however, be meaningless to maintain that modern science is pre-scientific. The similarity between natural and naturalist is unfortunate when what we aim to characterize are the processes involved in a way of understanding the world (Pacho, 2000).

It was, however, at the end of the 19th and beginning of the 20th centuries that the argument concerning naturalism became fully explicit.iii As a rule, the approach at that time was more metaphysical than methodological. Although the correlations between ontological and methodological aspects are (trivially) unavoidable, the argument concerning naturalism was in fact seen as a necessary derivation from the idealist and materialist (respectively) conceptions of the world defended throughout the 19th century. For the same reason, it was promoted within nineteenth-century positivist philosophy. It is easy to demonstrate historiographically that this argument became necessary because of the success that natural
and human sciences achieved throughout that century. The knowledge that science made available no longer allowed theoretical proposals to be configured on the basis of a priori divisions between nature and reason, nature and history, nature and culture, body and mind, spirit and matter, etc. The old way of thinking and its results (that is, its methodological and ontological suppositions) became blurred throughout the 19th century, if not meaningless and, especially, obsolete; that is, incongruent with the knowledge available. The culturally valid form that this old way of thinking took on was, despite its (not always merely superficial) novelty, that represented by post-Kantian German Idealism. Particular sciences, natural and human, have provided many good examples supporting the idea that the traditional logical alternatives to immanentist hypotheses regarding the explanation of the world (including human phenomena such as language or the capacity to reason) were not necessary, although they may have been logically consistent. The ontological principle of parsimony in explanations of the world, which is characteristic of a naturalistic conception of the world, was proving to be effective, regardless of philosophical speculation (or maybe even despite it). It was therefore unavoidable that the conflict would become explicit.

However, due to resignation in some and obstinacy in others, or simply out of caution, the explicitly metaphysical debate concerning naturalism was overtaken in the first half of the 20th century by others. They included issues concerning the differences between symbolic forms; the debate concerning the divide between science and non-science, or between natural and human sciences; and other related subjects such as the possibility of philosophy as a rigorous science or the best course of action in order to avoid the possible excesses of philosophy. That is, the naturalism debate slid out of the realm of metaphysics and into the realm of methodology and epistemology. Until well into the 20th century, from their two very different approaches, both analytic and continental philosophers preferred metatheoretical questions (without excluding metaphilosophical ones) to strictly metaphysical questions. Since the metaphysical questions regarding naturalism are not metatheoretical (or prima facie they are not), philosophy was able to ignore them. In short, following in the footsteps of Kant, contemporary philosophy was to be less interested in the way the world is and all the more interested in what we can say about knowledge of the world.

However, just what had happened in the 19th century with the classic positions occurred again in the 20th century. Once again the positive knowledge available to the particular sciences, natural and human, made basic conceptual suppositions of philosophical argument problematic. Ironically, this no longer affected our beliefs about the world, but rather our beliefs about how we acquire and justify knowledge of the world: it affected epistemology, which together with metaphysics is one of the most genuinely philosophical areas of knowledge. Quine (1969) arrived at the conclusion that the new state of knowledge made traditional epistemological positions obsolete, and this was the starting point for his proposal to naturalize epistemology. Although I will come back to this later, I would like to say here that, according to Quine, it is only thanks to the fact that
essential aspects of the subject matter are being successfully tackled by particular sciences that epistemology is alive today. Specifically, we are making progress thanks to the natural sciences, and in some sense in spite of philosophical attempts to portray them as failing. Part and parcel of Quine's interpretation is that he saw such progress in the field of psychology, which he considers a natural science. It is a trivial consequence that the naturalized epistemology Quine defends is, materially and formally, another natural science; and since it studies a natural phenomenon (ibid.), its results will pertain to and come from those very natural sciences.

Based on Quine's proposal to liberate epistemology from philosophical purism, much of the discussion concerning naturalism in the second half of the 20th century involved the methodological and metatheoretical aspects of naturalism, including metaphilosophy, rather than the metaphysical or ontological aspects. In that period, talk of naturalism almost always referred to the naturalization of some discipline or other, or of philosophy in general. It is therefore a good idea to establish the precise meaning of the terminology used in the debate on the naturalization of philosophy.
3. NATURALISM, NATURALIST AND NATURALIZATION

Throughout the second half of the 20th century, largely in response to the challenge launched by Quine, expressions such as naturalist programme and naturalization of philosophy became common in the philosophical literature in addition to the term naturalism. To simplify matters, naturalism is the generic term used to allude to the naturalist programme or the naturalization of philosophy. However, naturalism and naturalization do not have the same meaning, although they may be interdependent in certain respects. This is because the notion of naturalism, as can be seen from the preceding historical note, has at least two quite different meanings. It can refer both to methodological attitudes and to naturalist positions or theses regarding what the world is. It is therefore wise to differentiate naturalism from the ontological (or metaphysical) point of view and from the methodological (or epistemological) point of view. Of course, some overlap is unavoidable and often explicit.iv

Naturalism from the Ontological or Metaphysical Point of View

This concerns what the world is. Naturalist conceptions of the world are generically characterized by their denial of the existence of transnatural entities. This allows several different levels of commitment, with a distinction almost always drawn (Strawson, 1985; Papineau, 1993; Kanitscheider, 1996; Koppelberg, 2000) between a more or less strong or radical reductionist position and another weaker, more tolerant position. At all levels, however, an immanentist conception of the world is maintained (Vollmer, 2000), governed by what has been called the principle of ontological parsimony, according to which it is not
necessary to accept any ontological leap in the causal chains of the universe in order to explain the appearance of new qualities (Kanitscheider, 1996, p. 159). Since this is the clearer meaning, I will not dwell on it.

Naturalism from the Epistemological or Methodological Point of View

This concerns how the world is known or how it should be known, not what the world is. From the methodological point of view, naturalist describes a way of explaining the world, not the world itself. This meaning in fact corresponds to the first philosophical use of the term naturalism. It emerged in the second half of the 18th century as a name for a type of method characterized as empiricism or naïve methodological reductionism, which owes more to everyday language and experience than to scientific practice (Pacho, 2005). As indicated above, Kant provided the foundational text for this meaning (KrV A 855, B 883).

In contemporary philosophy, the argument concerning methodological naturalism has centred on the study of some or all of the objects or themes of epistemology or of the philosophy of science. This use emerged within a programme that its adherents called naturalist, which began with the naturalization of epistemology, then moved on to the philosophy or theory of science in general and finally to the whole of philosophy, to the dismay of more than a few philosophers. The start of this programme and its name are well known: Epistemology naturalized, by Quine. Specific to this meaning is the methodological commensurateness of philosophy and natural science: "Epistemology (…) simply falls into place as a chapter of natural science. It studies a natural phenomenon, viz., a physical human subject" (Quine, 1969, p. 83).

This version of methodological naturalism applied to traditional philosophical questions (in this case epistemology) is not the only one. It is the start in terms of a concerted programme, but it has been complemented (and reinforced) through the work of Kuhn (1962; see also Ambrogi, 1999), which led to approaches to the philosophy of science that today are considered post-Quinean naturalism (Giere, 1985). Such approaches are characterized by their call to include external points of view (economic, psychosocial, political-cultural, etc.) in the study of science. It is this methodological naturalism that has led to the debate concerning the naturalization of philosophy. When methodological naturalism is applied to (or its application is considered for) an object that, for whatever reasons, had not been processed following the standard scientific approach, then, following Quine's expression referring to epistemology, that object is subjected to naturalization. What is naturalized here is not, strictly, the object, but rather the study of that object. To the extent that the debate concerning naturalization is projected onto the whole of philosophy, what is talked of is precisely the naturalization of philosophy.


However, this weighty description of methodological naturalism is of great metaphilosophical importance. Defenders of methodological naturalism coincide in accepting what Kornblith (1985) calls the thesis of the replacement of de jure questions by de facto questions. The traditional point of view had maintained that the question of how we should support or justify our beliefs (quaestio juris) is independent of questions concerning how we acquire our beliefs (quaestio facti). The former would count firmly as the realm of philosophy; the latter as the realm of the particular sciences. This conceptual difference (on which the naturalist fallacy is based) is thus not unknown to methodological naturalism; rather, the methodological naturalist denies its consistency, maintaining that the first question cannot be answered independently of the second, since the second is relevant to the first because it addresses everything that the first does not [but which is nonetheless indispensable for it] (ibid., pp. 1-4).

The standard arguments against this position are further versions of the accusation of a naturalist fallacy, according to which it is not possible to obtain norms (whether epistemic or ethical) from facts. One of the most common versions of this rebuttal is to accuse naturalist or naturalized explanations of circularity. Symmetrically, the arguments in favour of methodological naturalism are also arguments against the accusation of the vicious circularity that recourse to facts to justify norms, or confusing causes with reasons, would imply. I cannot reproduce that debate here, but it should at least be said that, generically, methodological naturalism considers human knowledge in general as a complex system of virtuous circularities with no privileged objects or subsets that could escape from that circularity. Another way to express this naturalist position (with respect to epistemology and a fortiori to the metaphysics of knowledge) is to claim that knowledge of the world and knowledge of knowledge present no relevant features that distinguish them as instances of different levels of cognition. It is clear that circularity can only be avoided if it is recognized that, in contrast to the variability or historical contingency of knowledge of the world, there would be a non-contingent foundation that could and should be detected in a theoretical field and which is not exposed to historical erosion; that would be the equivalent of what has been called the metaphysics of knowledge. This is an epistemic supposition inherent to the position opposed to methodological naturalism. Evidently, the naturalist supposition according to which knowledge of the world and knowledge of knowledge do not differ substantially has considerable ontological weight. If one were to admit that knowledge has a different ontological status from that of facts about the world (for example, in virtue of the subject who has that knowledge, or of the human mind or reasoning), then it would be more difficult (if not impossible) to defend its naturalization, and correspondingly easier to deny the possibility of its naturalization.

Metaphysical naturalism and the naturalization of philosophy are not, then, the same thing, although they may be closely related. The former describes a conception or image of the world; the latter a methodological approach. To naturalize a philosophical object or a whole discipline means to follow the
approaches and methodological criteria of science, natural or human, without allowing methodological privilege for philosophical objects. (Below, I return to the question of privileged objects.) The result would not, in principle, be a naturalist theory, but a naturalized theory. It is nonetheless clear that a defender of methodological naturalism will very probably support a naturalist conception of the world. Criticism of the naturalization of certain philosophical problems, or of philosophy as a whole, however, does not necessarily oblige one to adopt commitments towards a non-naturalist ontology.

In a similar vein, it is not the same to talk about naturalization (or even to defend it) and to practice it. It is therefore worth noting that, despite the enormous influence of Quine's proposal on the epistemological debate and Kuhn's influence on the philosophy of science, in philosophical circles much more is said and debated about the naturalization of philosophy than is actually put into practice, in such a way that the majority of the arguments about methodological naturalism, including declarations in favour of it, are not naturalized arguments. In fact, Quine's arguments in favour of the naturalization of epistemology are not naturalized arguments. As a rule, Quine supports methodological naturalism, but does not always practice it. This is not an empty observation. There can a priori be no argument that requires us to answer the following question in the affirmative: Does the proposal for the naturalization of philosophy have to naturalize itself? Perhaps that is why the empirical information necessary for a naturalized treatment of a given question often does not go beyond rhetorical embellishment to reinforce the position being defended.

Nevertheless, if identification of the naturalization of philosophy is not restricted to very specialist naturalization programmes, such as biologicism and psychologism in epistemology (Quine) or the sociological programme, Kuhnian and post-Kuhnian, in the philosophy of science, not to mention the old eliminationist programmes that characterized the founding period of the Vienna Circle and the subsequent beginnings of analytic philosophy, then it is only fair to recognize that certain metaphilosophical positions that are not historiographically located within the naturalist spectrum (either metaphysical or methodological) are clearly naturalized. An example that is as clear as it is surprising is Dilthey's position when he maintains that philosophy has to be studied in such a way that it, as a human historical fact, becomes its own object (Dilthey, 1968, p. 13); that is, an empirical fact (Dilthey, 2006). Dilthey did indeed put this into practice with a certain rigour, in his own way, and offered a distanced and aseptic vision of philosophy, although he did not go as far as to adopt a sociological methodology such as, for example, Randall Collins used in his exhaustive study The sociology of philosophy (Collins, 1998). These are two quite different, but complementary, approaches; they are not mutually exclusive.
4. AN EXAMPLE: THE NATURALIZATION OF EPISTEMOLOGY AND THE UNEASE OF PHILOSOPHY

In order to better establish how hypothetical naturalization affects philosophy, I summarize the core of one of the most thoroughly worked-out and influential
naturalization proposals. In principle it does not affect the whole of philosophy, but it certainly affects a particularly significant realm that has been genuinely philosophical since Plato's dialogues: epistemology. When Quine launches his proposal for the naturalization of epistemology he is perfectly well aware of the radical metaphilosophical scope of his programme:

Philosophers have rightly despaired of translating everything into observational and logico-mathematical terms. […] And some philosophers have seen in this irreducibility the bankruptcy of epistemology. Carnap and the other logical positivists of the Vienna Circle had already pressed the term "metaphysics" into pejorative use, as connoting meaninglessness; and the term "epistemology" was next. Wittgenstein and his followers, mainly at Oxford, found a residual philosophical vocation in therapy: in curing philosophers of the delusion that there were epistemological problems. But […] epistemology still goes on […]. Epistemology, or something like it, simply falls into place as a chapter of […] natural science. (Quine, 1969, p. 81)

The most surprising aspect of this proposal for our purposes is that Quine proposes avoiding the bankruptcy of (part of) philosophy through its naturalization. The bankruptcy of epistemology would be the result of applying the standard Vienna Circle project to epistemological problems. That project proposed the elimination of all statements that could not be translated into logico-mathematical and observational terms. The result of such a project would be that the statements that formed the body of traditional epistemology would have to be eliminated, just as the elimination of propositions related to traditional metaphysical questions had been proposed. Quine maintains that the bankruptcy of epistemology could be avoided if we start from the assumption that our aim should not be to eliminate problems whose effective treatment has no logico-mathematical and observational translation, but to modify them so that they can be treated just like any other object of science. This may already be happening, but removed from, and at the expense of, philosophy. Fortunately, epistemology would survive, albeit with a different and clarified status (ibid.). This new, clarified status is that of an epistemology that operates in line with natural science. This is what Quine calls naturalized epistemology.

Thanks to naturalization, epistemological problems would cease to be false problems or to be treated in such a way that they became false problems. We can infer that, since the naturalization of epistemology is in fact already underway (although not within philosophy itself), it would only be the specifically philosophical version of epistemological problems that would be bankrupt; or rather, the supposition that, whatever the particular sciences may do, there would always be a genuinely philosophical logical space for tackling epistemological problems. The naturalization of epistemology would thus imply a denial of this supposition. Quine (1960, p. 275) expresses this through the much-cited metaphor of exile: there is no such cosmic exile. Philosophy cannot hope to
exile itself from human knowledge to some exclusive territory, methodologically exempt from meeting the established requirements. According to Quine, therefore, to adopt strategies that lead to exile would be the same as to accept the bankruptcy of philosophy. It is ironic that he intends to avoid such a situation through naturalization, since in his programme that means following the methods not of science in general, but of natural science. However, this is not so surprising if we adopt a broad historical perspective. In a cultural context different from the present (such as a pre-modern context in which there was no clear separate allocation of objects and competences to philosophy and science, so that the two terms designated generically the same thing), to naturalize a discipline could not have meant anything other than to subject it to the methods of science or philosophy. Things have changed to such an extent that Quine's requirement that epistemology be naturalized and follow scientific methods demands of this illustrious philosophical discipline that it acquire something akin to citizenship of the Republic of Science, not of the Republic of Philosophy. It is not a case of following the ways of natural science because it is natural, but rather because this is the type of science that will mark out the way for all science.

The unease and rejection that this programme of the naturalization of epistemology has caused in philosophical circles is understandable, especially given the evident risk that, if it were consistent for epistemology, the programme could be extended to any discipline or philosophical object. And that is so even if it were not construed under the strong version of naturalizing exclusively in accordance with the methods of natural science. The naturalization of a philosophical question implies, in general terms, accepting that the noble philosophical question affected must be dealt with, and indeed maybe resolved, by the means and procedures available in some particular science or sciences, whether natural or human. What is at stake here is not just (or even mainly) whether natural science can fathom philosophical objects. It is whether philosophy can justifiably be singled out as some specific and genuine form of knowledge, epistemically differentiated from science as a whole (whether natural science or not) when it comes to tackling and resolving problems. I mentioned above that, in addition to the programme of naturalization à la Quine, there are other projects that are just as naturalized but that do not follow the means of natural science.

More than a few philosophers see in such a process of the naturalization of philosophy an act of self-destruction. Sagal (1987) has used the expressions suicide and hara-kiri of philosophy in this context. This is an idea that is extremely widespread in philosophical circles. Naturalization is seen to entail an unnecessary and inconsistent relinquishment of philosophical identity. It is true that this rejection contains elements both of the debate concerning whether it is possible to naturalize some object or other and of the defensive reaction of philosophy when faced with the impossibility of competing with the particular sciences in explaining the world. However, such a confluence is well justified; the naturalization of a philosophical problem or of an entire discipline (such as, for example, epistemology) is equivalent to converting it into the object of study of one or several particular sciences.
Thereby, philosophy would lose its
specific competence in that domain. That is why I warned at the outset that the fecundity of knowledge in the particular sciences, the loss of cultural transparency of the philosophical project, and the problem of the naturalization of philosophy are inseparable. It could be concluded that whether to keep the label philosophy for the treatment of a given problem is a secondary question, purely a question of naming. However, the challenge of naturalization obliges us to decide whether there is any justification for maintaining that certain objects cannot or must not be naturalized. That is the same question as whether genuinely philosophical objects or problems exist.
5. THEORY AND CRITICISM OF PRIVILEGED PHILOSOPHICAL OBJECTS

Despite the weighty and technical nature of the question concerning whether genuinely philosophical objects exist, and if so what criteria we should use to identify them, an answer is always provided (whether knowingly or not) in any philosophical proposal. It is implicit in the notion of philosophy that is upheld, and it therefore has to be consistent with the desires and expectations held with respect to the cultural responsibility of philosophy. Although it is usually recognized as one of the definite demarcation criteria that normative questions concerning ethics and epistemology are philosophical questions, questions concerning facts, such as the existence of God, the existence of the soul or whether the mind is distinct from the body, are also still culturally considered to be genuine philosophical problems. Why is that so for those questions and not for others concerning facts, such as knowing whether there is a number greater than n or whether there is water on Mars?

From an intuitive point of view, two conditions appear necessary for an object to be considered philosophical: (a) that it be treated with a sufficient degree of generality, that is, via sufficiently exploratory or tentative analysis; (b) that, for whatever reasons, it have a special cultural relevance in the symbolic world or, if one prefers, that it be of broad interest to the members of the species. Candidates for meeting condition (b) have been, among many others: the origin or first cause of the universe; the existence of God; the immortality of the soul; the mind-body distinction and interaction; the origin of language; the basic structure of matter and of life; the basic structures of language in the categorization of facts and objects; the language-thought relation; the origin and nature of knowledge; the relation between the origin of experience and abstract concepts; the foundations of moral and aesthetic norms; and what constitutes freedom. Philosophy has been, and continues to be, concerned with such questions; but no longer with all of them. In the detailed breakdown of these issues, many aspects are no longer philosophical and have become purely scientific. It may be the case that the question of the origin
of language still has, in its most generic form, a certain philosophical aspect, but in its heyday this question included aspects such as: Who taught humans to speak? Was Hebrew the first language and all the others variations of it? Can the human mind create language without divine intervention? Evidently, not only are these no longer philosophical problems today: they simply are no longer problems. Something similar could be shown for the rest of the questions mentioned, and there are many more problems that in their day were philosophical but have now stopped being so (cf. Popper, 1952; examples from physics in Heller, 1970): the existence of the atom and of gravity; the structure of the solar system; the origin of the species; how many planets there are; the age, shape and position of the Earth in the universe; the nature of the Sun and the stars; the distinction between the heavenly and the earthly; the distinctions between the sexes and races; the nature of light; infinitesimal calculus; the origin of work and of suffering in human history; the origin of the dominant moral criteria; the origin of languages and of writing; what causes fossils; the effect of climate on intelligence and customs; the origin of the known forms of government; etc. In fact, the history of the classical particular sciences can be seen as the history of philosophical problems that are such no longer. And we do well not to forget it; this loss of part of the subject matter of philosophy, which is compatible with having gained new alternative philosophical issues, has been due to processes that in contemporary jargon could be classified as processes of naturalization.

The startling process of the de-philosophization of innumerable questions throughout history since the appearance of modern science makes the question of whether there are genuinely philosophical problems (that is, issues that are methodologically irreducible or that cannot be naturalized) even more pressing. More specifically, the question now takes the form: Can any object, or aspect of certain objects, that can never be touched by the historical process of de-philosophization be identified a priori? We cannot get off the hook through what could be called the argument of post hoc purging: by claiming, for example, that philosophers were concerned with the nature of light in the 17th century because they did not have a purged notion of philosophy, or because the frontiers between philosophy and science were still very blurred. A notion of philosophy that is purged, let us say, with Kantian criteria would exclude from the history of philosophy the majority of ancient, medieval and modern philosophical production. Moreover, it would be far too arbitrary to hold that philosophy, in a strict sense, starts with Kantian philosophy.v It should be recognized, nevertheless, that such an argument (particularly the Kantian eliminationist version for grand metaphysical questions) would have the great advantage of freeing philosophy once and for all from having to compete with science and face up to the loss of objects that were genuinely philosophical in their day. It does not seem that institutional philosophy is prepared to go quite so far.

Rorty (1979, p. 411) notes, and criticises, that the belief that philosophy can explain what the particular sciences leave unexplained is a common characteristic of traditional philosophy. It must be added that this characteristic also holds for
contemporary philosophy, particularly where it takes up the argument against the naturalist programme. This is reinforced by the theory that there are objects that are philosophical per se. If, as I suggested above, the history of the classic sciences is in part the history of philosophical problems that are such no longer, then the naturalization of philosophy would support the idea that there is no philosophical object that could not cease to be one. It is difficult, in keeping with the historical facts, to contradict Russell's thesis according to which a problem is philosophical to the extent that it does not yet have a solution, a solution which would be provided by one or more sciences:

It could be said that science is what we know and philosophy what we do not know. That is why questions constantly move from philosophy to science as our knowledge progresses. (Russell, 1960, p. 33)

This, which so scandalizes philosophers, means that philosophy is concerned with certain problems to which we have not yet found a lasting solution. Against this it will be argued that in science there are also many problems that are yet to be resolved. It is noticeable that when an unresolved scientific problem has, for whatever reason, great cultural relevance, there is a kind of cultural inertia that leads to the problem being considered a philosophical one. However, once a satisfactory rational solution has been found, such problems are no longer considered philosophical. Some notable examples of this phenomenon (mentioned above as losses of subject matter for philosophy) are: the nature of time and space; the structure of the solar system; the nature of light; the origins of the universe, life, species and language; and the existence of the atom.

This phenomenon of the de-philosophization of certain problems started to become partly visible, and to a large extent predictable, during the build-up to the scientific revolution of 1600. Since then, programmes to purge the object of philosophical debate began, in order to prevent the discipline from becoming disqualified from without. This was notably carried out by the Padua school, particularly through the logic of Zabarella (Opera logica, 1597). Nominalists from the Renaissance also contributed to this phenomenon and even went as far as to maintain, as Nizolio did,vi that metaphysics had no justifiable object; all its problems were either inconsistent or susceptible to being tackled by particular sciences such as theology, the sciences of language or physics. This process of purging, which was the same as accepting the naturalization (avant la lettre) of at least all objects of philosophical debate that dealt with questions of fact, culminated in Kant's Critique of Pure Reason. There, Kant renounces the idea that philosophy can establish any doctrine concerning the world and reserves for philosophy only the normative task of establishing a canon concerning the limits of justified knowledge (KrV B26 and B823). It should be understood that this restrictive Kantian metaphilosophical position, translated into our terminology, maintains that all questions related to facts, the human soul, the world as a whole, or God must be naturalized if there is any hope of gaining knowledge of them. We cannot gain knowledge of such issues if (and to the extent that) they cannot be naturalized. Others, in contrast, have believed that it

PACHO

is possible or necessary to maintain a certain type of objectsor a certain dimension of them; a certain dimension of realityas the proper and irrevocable object of philosophical enquiry. It is these objects, which are genuinely philosophical hypotheses, that I call here privileged objects. Genuinely philosophical objects would be immune from naturalization. They would be privileged because they could never be affected in any substantial way by methodological naturalism and, therefore, they could not be absorbed into or fathomed by particular sciences. There is certainly something like a tacit consensus, both within and outside philosophy, regarding the fact that certain problems will never be fathomable by means of particular sciences, whether natural or human. What is that consensus based on? Some consider that certain objects, by their very naturenot due to any structural limitation of the human cognitive systemcannot be successfully tackled via science; they will not be solved in Russells sense of problems that have been solved or have not. Examples of objects of this type would be the existence of God or the nature of consciousness. Others, in a more opportunistic but reasonable way, argue that science can never resolve all the problemsthe set of all the problems simply cannot be defined and some of them will be both attractive to philosophy and particularly resistant to resolution via science. Furthermore, scientific research itself will always throw up new problems that are attractive to philosophy. For both these reasons, philosophy will never become obsolete or something we can do without. The hard core of this complex argument represents a version of the argument know as cognitive closure, used here in favour of philosophy. The cognitive closure thesis maintains that our natural cognitive capacity must have natural limits.vii Used in favour of philosophy it effectively means: there are objects that, by their very nature, are beyond the reach of science (just as third-degree equations are beyond the reach of ruminants), but not beyond the reach of philosophy (just as third-degree equations are not beyond the reach of some humans). Objects such as intelligence itself, linguistic capacity or consciousness could be outside these limits. Therefore, we can only make certain more or less consistent observations; but never strictly solve them via science. The argument based on the nature of the objects themselves deserves at least the following observations. In the first place, to define what is not beyond the reach of science or the reach of philosophy cannot be separated from metatheory regarding what science is, what philosophy is, and what the difference between the two is. Such metatheory will undoubtedly bring with it ontological presuppositions. It is clear that whoever, for example, assumes an ontology based on the principle of ontological parsimony which characterizes naturalism, will find fewerif any material objects that will serve to maintain the argument that certain objects are within the reach of philosophy but beyond that of science. This line of argument makes it clear that, although it is not logically coercive to naturalize philosophy, when we adopt a naturalist metaphysical position, that position excludes support

This line of argument makes it clear that, although it is not logically coercive to naturalize philosophy, adopting a naturalist metaphysical position excludes appealing to ontological presuppositions in order to maintain the existence of privileged objects as an argument against the programme of naturalization.

More important, though, is the circularity of maintaining that the nature of certain problems, owing to the ontological status of their objects, demands a type of knowledge with its own epistemic status (in this case, philosophical). The claim implies that we know that this status cannot be naturalized thanks to a methodology that cannot be naturalized; and that we cannot naturalize it because of the ontological status of those objects. Furthermore, this position has the serious drawback that it would have to make itself invulnerable against any type of knowledge that is not compatible with it; that is, it has to do without the demand for external coherence, just in case. Yet the demand for external coherence is indispensable for any philosophical and, a fortiori, metaphilosophical position if the discussion is to be rational.

A more intuitive way to express this objection is to emphasize (and hereby I move on to the cognitive closure argument) that the methodological exclusion of certain objects, or their exile, is based on anticipating the future of human knowledge. It seems evident, more than just plausible, that the human mind must have natural limits. However, fixing those limits a priori would be equivalent to anticipating the essence of the human history of knowledge. The set of problems accessible to knowledge is not definable a priori for the simple reason that, according to the anomaly of knowledge known as the Hydra effect (Vossenkuhl, 1995, p. 65), the number of problems does not diminish in proportion to the number of problems resolved, but rather increases. Therefore, neither can the subset of problems that can never be resolved be predefined. Given the interdependent character of problems and solutions, it is impossible to conclude a priori whether a given problem can or cannot have a solution. This is not even possible within systems that are in principle more independent of historical and cultural contingencies and have a more aprioristic internal structure, such as formal systems. Certain mathematical problems, such as Fermat's conjecture (stated formally at the end of this section), consist precisely in knowing whether or not they can have a solution; showing that they do not have one is just as much a form of solution as showing that they do. The history of these problems seems to show, however, that every position adopted with respect to them is provisional, since it is not possible to know a priori all the eventualities of the future of knowledge.

In short, the attempt to set certain irreducible objects apart can be explained by the desire to protect the interests of philosophers. It is also compatible with strong cultural inertia. Yet it is inconsistent, since it has to make itself invulnerable against any hypothesis or fact that would strip it of external coherence, that is, of logical compatibility with the actual state of human knowledge. Perhaps that is why Rorty (2001, p. 19), making a virtue of necessity, considers that an indispensable condition for saving the future of philosophy is precisely to abandon competition with the particular sciences when it comes to explaining the world and, at the same time, to give up all claims to any type of specificity or autonomy.
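To make the Fermat example above concrete (the formalization is an editorial gloss, not the author's), the conjecture asserts a bare non-existence claim about the positive integers:

\[
\neg \exists\, a,\, b,\, c,\, n \in \mathbb{Z}^{+} \ \bigl(\, n > 2 \;\wedge\; a^{n} + b^{n} = c^{n} \,\bigr)
\]

Wiles's proof, published in 1995, established precisely this non-existence claim, illustrating the point made above: showing that no solution exists turned out to be the solution.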



6. PHILOSOPHIZING AFTER THE NATURALIZATION OF PHILOSOPHY?

Above I said that a problem is culturally perceived as philosophical if it meets the following two conditions: (a) it is treated with a sufficient degree of generality, that is, its analysis is sufficiently exploratory or tentative; and (b) for whatever reasons, it has particular importance in the symbolic world or is of broad interest to humans. We are now in a better position to state precisely what conditions or criteria the issues that are in fact (or can be) considered philosophical actually meet.

1. Cultural universality: they are of interest to all humans. This condition or criterion is not enough on its own, though; everybody is also interested in the resolution of problems such as finding a cure for cancer, which is not an issue that philosophers study. Therefore the following condition must also be met:

2. The absence of a definitive or consensual solution: they are in a state of tentative analysis or exploration. This condition means that the issue of universal interest has no resolution that commands general consensus: the issue is still undergoing exploration or deliberation. However, conditions 1 and 2 together are still not enough; there are many issues of general interest that have no solution yet but are disputed within and by science, not by philosophy. Therefore, a third condition is also required:

3. Formal universality: they are issues that force us to define our most basic or ultimate criteria of reality and rationality, and they expressly deal with those criteria.

It is clear that this third condition is more important than the other two; at least, that is what philosophers will think. However, none of the conditions is enough on its own. Even issues that meet the third condition could be incorporated into science if the second condition were not met. What is more, if we ask whether the conjunction of these three conditions would save philosophy from the risk of being naturalized, the answer would be: yes, but only to the extent that conditions 1 and 3 also meet condition 2.

The second condition is compatible with (if not equivalent to) Russell's thesis; furthermore, it is strengthened by the Hydra effect. According to Russell, a problem is philosophical to the extent that it is at an exploratory stage (and has a certain relevance). The Hydra effect encapsulates the idea that the number of unresolved problems does not decrease, but rather increases, as the number of resolved problems grows. In view of the almost obscene fecundity of contemporary knowledge, of its undeniable state of permanent revolution, there are good reasons to expect the magnitude of what we do not know to grow unceasingly and exponentially.

If, then, Russell is right, philosophy would never have enjoyed better social and cultural conditions than it does now, since the human mind has never before been confronted with so many unresolved problems. What characterizes methodological naturalism is its claim that this structural openness of knowledge also affects those other issues by means of which we aim to define the most basic suppositions of our theoretical edifice. This is precisely what the virtuous circle consists in: unlike fundamentalism, it situates naturalism within the actual history of human knowledge. However, even if the issues that meet the third condition are naturalized, others will always emerge that also meet it. It will therefore always be possible to philosophize after naturalization.
NOTES
i. This work forms part of the Research Project Naturalizing Philosophy: A Metaphilosophical Reflection in the Context of Contemporary Culture (EHU2009/03), funded by the University of the Basque Country (UPV/EHU).
ii. See, for example, KrV A 855 and The Conflict of the Faculties, AK VII A, 87. On the origin of the idea, cf. Pacho (2000).
iii. Krikorian (1944) offers an excellent summary of the state of this issue in the first half of the 20th century.
iv. Texts representative of various current naturalist methodological approaches can be found in Ambrogi (1999). On naturalism, both ontological and methodological, cf. Keil and Schnädelbach (2000).
v. It would be arbitrary, but not altogether unusual. The thesis is defended, however, with a hermeneutic subtlety that consists in reinterpreting pre-Kantian philosophical systems as proto-transcendental versions of philosophy.
vi. For Nizolio, "the few useful things that metaphysics includes belong in part to grammar, in part to lexicography, to natural philosophy [physics], to rhetoric, to theology …" (De veris principiis et vera ratione philosophandi contra pseudophilosophos libri IV, Parmae 1553 (repr. Roma 1956), liber III). Leibniz, who considered the rerelease of this work necessary, spent some 40 pages refuting it.
vii. This hypothesis was defended by Fodor and above all by Chomsky (1979, pp. 66ff.; 1995, pp. 2-5) and McGinn (1991, pp. 1-22).

REFERENCES
Adorno, T. (1977). Wozu noch Philosophie? Gesammelte Schriften, 10(2), 459-473.
Ambrogi, A. (Ed.). (1999). Filosofía de la ciencia: El giro naturalista. Palma: Universitat de les Illes Balears.
Baumgartner, H. (1978). Wozu noch Philosophie? Perspektiven einer […] Frage. In H. Lübbe (Ed.), Wozu noch Philosophie? (pp. 238-258). Berlin/New York.
Callebaut, W. (Ed.). (1993). Taking the naturalistic turn. Chicago: University of Chicago Press.
Chomsky, N. (1979). Language and responsibility. New York: Pantheon Books.
Chomsky, N. (1995). Language and nature. Mind, 104, 1-61.
Collins, R. (1998). The sociology of philosophy. Cambridge/London: Harvard University Press.
Dijksterhuis, E. J. (1950). De mechanisering van het wereldbeeld. Amsterdam.
Dilthey, W. (1968). Die Typen der Weltanschauung und ihre Ausbildung. In B. Groethuysen (Ed.), Gesammelte Schriften, Vol. VIII. Göttingen (1st ed., Berlin, 1911).
Dilthey, W. (2006). Was ist Philosophie? In Zur Weltanschauungslehre, Gesammelte Schriften, Vol. VIII. Göttingen.
Giere, R. N. (1985). Philosophy of science naturalized. Philosophy of Science, 52, 331-356.
Heller, B. (1970). Grundbegriffe der Physik im Wandel der Zeit. Braunschweig: Vieweg.
Kanitscheider, B. (1996). Im Innern der Natur: Philosophie und Physik. Darmstadt.
Keil, G., & Schnädelbach, H. (Eds.). (2000). Naturalismus. Frankfurt.
Kitcher, P. (1992). The naturalists return. Philosophical Review, 101, 53-114.
Koppelberg, D. (2000). Was ist Naturalismus in der gegenwärtigen Philosophie? In G. Keil & H. Schnädelbach (Eds.), Naturalismus (pp. 68-91). Frankfurt.
Kornblith, H. (1985). What is naturalistic epistemology? In H. Kornblith (Ed.), Naturalizing epistemology (pp. 1-13). Cambridge, MA/London.
Krikorian, Y. H. (Ed.). (1944). Naturalism and the human spirit. New York.
Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press (re-ed., 1969).
Külpe, O. (1919). Einleitung in die Philosophie. Berlin.
Lenk, H. (1974). Wozu Philosophie? München.
Lübbe, H. (Ed.). (1978). Wozu noch Philosophie? Berlin/New York.
McGinn, C. (1991). The problem of consciousness. Oxford: Blackwell.
Moser, P., & Mulder, D. (Eds.). (1994). Contemporary approaches to philosophy. New York/Toronto: Macmillan.
Pacho, J. (1995). ¿Naturalizar la razón? Alcance y límites del naturalismo evolucionista. Madrid: Siglo XXI.
Pacho, J. (2000). De la diferencia entre natural y naturalista: Datos para la historia de una confusión. Ontology Studies, 10, 337-347.
Pacho, J. (2005). Natural versus naturalista y viceversa. In VV.AA., La naturalización de la filosofía: Problemas y límites (pp. 17-46). Valencia: Pre-Textos.
Papineau, D. (1993). Philosophical naturalism. Oxford: Blackwell.
Philipse, H. (1994). Towards a postmodern conception of metaphysics: On the genealogy and successor disciplines of modern philosophy. An alternative to Heidegger, Quine, Wittgenstein and Rorty. Metaphilosophy, 25, 26-68.
Philipse, H. (2001). What is a natural conception of the world? International Journal of Philosophical Studies, 9, 385-399.
Quine, W. V. O. (1960). Word and object. Cambridge, MA: MIT Press.
Quine, W. V. O. (1969). Epistemology naturalized. In Ontological relativity and other essays. New York: Columbia University Press.
Riedel, M. (1978). Philosophieren nach dem Ende der Philosophie? In H. Lübbe (Ed.), Wozu noch Philosophie? (pp. 259-287). Berlin/New York.
Riehl, A. (1903). Zur Einführung in die Philosophie der Gegenwart. Leipzig.
Rorty, R. (1979). Philosophy and the mirror of nature. Princeton: Princeton University Press.
Rorty, R. (2001). Philosophie & Zukunft. In R. Rorty, Philosophie & Zukunft (pp. 14-25). Frankfurt: Fischer.
Russell, B. (1960). Bertrand Russell speaks his mind. London.
Sagal, P. T. (1987). Naturalistic epistemology and the harakiri of philosophy. In A. Shimony (Ed.), Naturalistic epistemology (pp. 321-344). Dordrecht/Tokyo: Reidel.
Vollmer, G. (2000). Was ist Naturalismus? In G. Keil & H. Schnädelbach (Eds.), Naturalismus (pp. 46-67). Frankfurt.
Vossenkuhl, W. (1995). Wann wird Wissenschaft verantwortungslos? In H. Zehetmair (Ed.), WissensWerte: Ethik und Wissenschaft (pp. 47-72). Starnberg: R. S. Schulz.
Windelband, W. (1920). Einleitung in die Philosophie (Grundriss der philosophischen Wissenschaften). Tübingen.

