

Indices of Theory Promise
Author(s): Laurie Anne Whitt
Source: Philosophy of Science, Vol. 59, No. 4 (Dec., 1992), pp. 612-634
Published by: The University of Chicago Press on behalf of the Philosophy of Science Association
Stable URL: http://www.jstor.org/stable/188133
Accessed: 28/08/2010 20:49

INDICES OF THEORY PROMISE*

LAURIE ANNE WHITT†‡


Department of Humanities
Michigan Technological University

Figuring prominently in their decisions regarding which theories to pursue are scientists' appeals to the promise or lack of promise of those theories. Yet philosophy of science has had little to say about how one is to assess theory promise. This essay identifies several indices that might be consulted to determine whether or not a theory is promising and worthy of pursuit. Various historical examples of appeals to such indices are introduced.

The normative proposals of traditional methodologies of science have largely overlooked one of the most common forms of theory appraisal in which scientists engage. Repeatedly scientists have supported their judgements that a particular theory is worthy of pursuit by appealing to its "promise", "fruitfulness", "fecundity" or "fertility". As commonly, they have challenged the judgements of other scientists to this effect. Such assessments are usually accompanied by frank acknowledgement, and sometimes detailed discussion, of the theory's conceptual and empirical shortcomings. Moreover, they are made not only of young or newly emergent theories, but also of those with long and well-developed careers. In many such cases the contention that specific theories are promising falls emphatically short of a call for their acceptance. Arguments are not invoked to the effect that the theory is true, provides the best explanation, or has the highest problem-solving effectiveness. The claims are typically more modest: that a theory is promising, or more promising than existing alternatives, with respect to a specific problem domain and so is worthy of further development. We might briefly consider Ludwig Boltzmann's defense of kinetic-molecular theory at the end of the nineteenth century. After a series of impressive achievements from roughly 1856-1880,
*Received January 1991; revised September 1991.
†A version of this paper was read at the Ninth International Congress for the Logic, Methodology and Philosophy of Science in Uppsala, Sweden, and to the Philosophy Department of the University of New Mexico at Albuquerque. I am grateful to members of these audiences, and to a referee for Philosophy of Science, for their comments.
‡Send reprint requests to the author, Department of Humanities, Michigan Technological University, 1400 Townsend Drive, Houghton, MI 49931, USA.

Philosophy of Science, 59 (1992) pp. 612-634. Copyright © 1992 by the Philosophy of Science Association.


kinetic-molecular theory lapsed into a period of stagnation which lasted until 1905. A spate of critical attacks was published, especially during the last decade of the century, raising serious doubts about the conceptual well-foundedness of a theory incompatible with classical thermodynamics. (According to the second law of thermodynamics, energy is subject to irreversible dissipation in all natural transformations, a property not shared by a mechanical system of particles obeying Newton's laws of motion.) Van't Hoff, for example, observed that "with a fairly large expenditure of mathematical development the kinetic theory barely gives the current 4% interest on capital, and I think that even this theory should be measured by its fruits" (van Melsen 1960, 151). Furthermore, Planck maintained that

[O]bstacles, at present unsurmountable, however, seem to stand in the way of its further progress. These are due not only to the highly complicated mathematical treatment of the accepted hypothesis, but principally to essential difficulties . . . in the mechanical interpretation of the fundamental principles of Thermodynamics. (Nye 1972, 18)

Both Van't Hoff's and Planck's assessments of kinetic-molecular theory as less than promising resulted from emphasizing the recent (post-1880) history of the theory, and they questioned it on both empirical and conceptual grounds. Boltzmann, by contrast, in assessing the theory in 1896 and 1899, judged it to be promising on the merits of its earlier (pre-1880) achievements and on its possession of a powerful heuristic, taking the form of a mechanical, billiard-ball model. He acknowledged that the theory had not done much for scientists lately: "The mathematical part of the gas theory . . . pursues mainly the purpose of further development of mathematical method, for the valuation of which immediate practical utility was never decisive. Let the purely practical man skip this part but also forebear to criticize" (1974, 39).
Yet he contended that, given its "earlier attainments", it should be "cultivated further" and that its critics had precipitously and erroneously "inferred from the small current yield of molecular theory to its decline" (ibid., 97-99). Denying "the alleged barrenness of atomism" (ibid., 39), Boltzmann argued:

Experience teaches that one will be led to new discoveries almost exclusively by means of special mechanical models. (1964, 26)

I am convinced that these attacks are merely based on a misunderstanding, and that the role of gas theory in science has not yet been played out. [I have tried to make clear the] abundance of results agreeing with experiment . . . gas theory has also provided suggestions that one could not obtain in any other way. (Ibid., 215)


Claiming that it would be a great tragedy for science if the kinetic theory "were temporarily thrown into oblivion because of a momentary hostile attitude toward it" (ibid., 216), he noted, "I am conscious of being only an individual struggling weakly against the stream of time. But it still remains in my power to contribute in such a way that, when the theory of gases is again revived, not too much will have to be rediscovered" (ibid.). As this case demonstrates, decisions to pursue or not to pursue a specific theory can be made on rational grounds, that is, on the basis of assessments of promise or lack of promise which purport to tell us something about the merits and value of the theory in question. As such, they deserve more careful philosophical attention than they have yet received. The notice that has been taken of them has tended to be either cursory or primarily directed to other concerns. Kuhn lists "fruitfulness" as one of his five criteria for theory choice and calls for further study of it (1977, 322). Laudan appears to regard it as interchangeable with a theory's rate of progress (which is to be comparatively determined using his appraisal measure): "In arguing that the rationality of pursuit is based on relative progress rather than overall success, I am making explicit what has been implicitly described in scientific usage as 'promise' or 'fecundity'" (1977, 112). Lakatos discusses it in terms of "progressive problemshifts":

[W]e must not discard a budding research programme simply because it has so far failed to overtake a powerful rival. . . . As long as [it] can be rationally reconstructed as a progressive problemshift, it should be sheltered for a while from a powerful established rival. (1978, 70-71)

The most direct and extended discussion is that of McMullin.

II

McMullin distinguishes between two types of theory appraisal. The first is an epistemic appraisal to establish theory acceptability.
As a realist, McMullin regards this as a matter of addressing the truth-value of the theory: "To what extent has the theory been corroborated? Does it conform reasonably closely to the structure of the real? Is one warranted in accepting the existence of the theoretical entities it postulates?" (1976, 422). To determine theory acceptability, a criterion of P-(proven) fertility is invoked. This is a past-, not a future-, oriented criterion, concerned with the theory's performance or proven record, not its potential:

[I]t is estimated by the actual success of the theory in opening up new areas, in meeting anomalies, and so forth. . . . To estimate the P-fertility of a theory, one has to retrace its career and see how successful it has been in suggesting the right modification at the right time and in allowing incorporation of new areas not originally foreseen. (Ibid., 400-401)

McMullin refers to the second type of theory appraisal as a heuristic appraisal. It subjects the theory to a "second, and very different, sort of demand" (ibid., 423):

[W]hat is its research-potential for the future? how likely is it to give rise to interesting extensions? Does it show promise of being able to handle the outstanding problems (inconsistencies, anomalies, etc.) in the field? Is it likely to unify hitherto diverse areas, or perhaps open up entirely new territory? (Ibid., 423-424)

The criterion to be invoked here is also different. Whereas P-fertility confirms the truth-value of a theory, the criterion of U-fertility determines its as-yet untested promise. This is a "tentative, future-oriented affair" (ibid., 400) which does not require tracing the theory's career. Rather, it involves "examining the theory here and now, and estimating its imaginative resources . . . for future extension and modification" (ibid.).

McMullin adds that the two sorts of appraisal are not unrelated, in that part of an epistemic appraisal is concerned with P-fertility, and the latter can be understood as the way in which the original U-fertility of the theory proves out. Yet he claims they are nevertheless distinct in at least two respects: (1) while a theory's high degree of U-fertility may be enough to persuade us to invest our efforts in it, this heuristic appraisal tells us nothing of the theory's epistemic status; (2) while a theory may, in virtue of its P-fertility (its past performance or actual achievements to date), have received a very positive epistemic appraisal, this tells us nothing of its heuristic potential for the future. McMullin's discussion of fertility contains much of interest for an account of theory promise. His comments regarding U-fertility, for example, suggest that at any time t, there are features of a theory which can be consulted to determine its promise, how worthy it is of further pursuit. These are independent of the theory's "research career" or "track record", of its performance over some interval t1 . . . tn. I will refer to the appraisal of these features as appraisals of the theory's formal indices of promise. The existence of such indices helps us to see how some theories can be judged promising and worthy of pursuit despite the fact that they have yet to enjoy significant deployment in research, or have lapsed into a period of stagnation. However, it seems mistaken to maintain, as McMullin does in (1), that such appraisals are "merely" heuristic, and thus that no epistemic commitments are involved in theory pursuit. To claim that a theory T has a
high degree of U-fertility in virtue of certain performance-independent features which characterize it at tn is to assert one's belief that the theory is promising; it has features which give grounds for believing that it will be productive in the future. Certainly there is a heuristic component to this; it involves answering the types of questions that McMullin formulates in his discussion of U-fertility. But our estimation of its research potential for the future, of how likely it is to give rise to interesting new extensions, to open up new, or to unify diverse, areas of research, is arrived at on the basis of an epistemic appraisal. We appraise certain features of the theory so as to get some sense of what we can reasonably expect of its future performance. Our estimation of its heuristic value then will turn on how well it has fared with respect to the indices of promise consulted. There seems no reason to accept the stipulation that epistemic appraisals are limited to contexts of acceptance. (See Whitt 1990b for an examination of the epistemic and pragmatic commitments involved in theory pursuit, and how these contrast with those involved in theory acceptance.)

We might also question his account in another respect. Since the criterion of U-fertility alone determines whether a theory is promising, for McMullin there are only formal indices of promise. References to the theory's actual performance, to its accomplishments, or lack thereof, over t1 . . . tn (to what I will refer to as the theory's historical indices of promise), can play no role in assessments of theory promise on his account. An appraisal of the theory's past performance, that is, of its P-fertility, is relevant only in determining which theory is worthy of acceptance. It is difficult to see how one could make a convincing case that the research successes or failures that a theory has already enjoyed ought not figure in an assessment of how worthy it is of further pursuit, particularly in the face of the frequency with which scientists have appealed to such factors in supporting their assessments of theory promise. Surely, no matter how minimal its career, a theory's empirical and conceptual performance over t1 . . . tn are factors highly relevant to determining its promise.

My proposal, then, is that both historical and formal indices need to be consulted in assessing the promise of a theory and in determining whether it is worthy of pursuit.¹ While Laudan's treatment of promise as "rate of progress" allows for the former, it has little room for the latter, seemingly reducing a theory's promise to its "track record".² Although the theory's
performance over some interval t1 . . . tn is relevant to an estimation of future problem-solving success, it is not sufficient. It leaves out of account the important matter of assessing the current resources of the theory for generating further problem solutions. McMullin's U-fertility, by contrast, does not permit us to consider the actual performance of the theory, that is, its record of achievements over t1 . . . tn. This, too, omits a vital aspect of an account of theory promise. He does, however, underscore the need to recognize formal indices of theory promise.

¹Whitt (1990b) argues that determinations of theory promise do not require comparative assessments at the community level, although they are required to a limited degree of individual scientists.
²The qualification is due to the conceptual component of Laudan's appraisal measure, which will reflect the current logical relations in which the theory being assessed stands vis-a-vis other theories. However, on that appraisal measure the promise of the theory is still a matter of conceptual problem elimination, along with empirical problem solution. See Whitt (1988).

III

One respect in which historical indices differ from formal indices of theory promise is that they require assessment over some interval of time. In appraising them, the conceptual as well as the empirical performance of the theory should be addressed. How has the theory fared conceptually and empirically as the result of however much research and testing it has already undergone? This can be understood as a matter of appraising the conceptual viability and the empirical fertility of a theory T over some interval t1 . . . tn. Scientific theories provide scientists with the conceptual resources needed to investigate and understand the natural world, and to direct and promote empirical research. In assessing a theory's conceptual viability, scientists raise questions about the ability a theory has demonstrated to undergo conceptual growth and refinement as the result of its deployment in research. A theory may enhance the conceptual resources it supplies for empirical problem solving in several ways: through the fine-tuning of theoretical concepts, through the appropriation of the conceptual resources of theories in other domains, and through the achievement of greater consilience (for discussion of these see Whitt 1988 and 1990a). My focus here will be on empirical fertility.

Scientific assessments of the empirical fertility of a theory, that is, of its demonstrated capacity to undergo empirical growth and refinement, are addressed primarily to how well the theory has responded to the empirical demands that have been made of it in research. Three ways in which a theory may enhance its empirical performance are by increasing the accuracy of its problem solutions and the number of problems it solves in a given domain; by extending its problem-domain (that is, by adding new classes of facts to the domain of facts which it explains); and by
increasing its dynamic consilience³ (that is, by extending its problem-solving domain without the addition of ad hoc hypotheses). In appraising empirical fertility, then, scientists' concerns are both quantitative and qualitative. They raise questions not only about how much a theory has been able to explain, or how many problems it has been able to solve over t1 . . . tn, but how well it has done so: Has the accuracy of its problem solutions improved? Has it extended its domain in a dynamically consilient or in an ad hoc way?

While a theory may improve the accuracy of its problem solutions and increase the number of problems it solves in a given domain at any stage of its career, this is perhaps the most reliable means by which a young and little-developed theory may impress and rapidly demonstrate its empirical fertility. Concentrating their efforts on the theory's primary research problems (on what Kuhn refers to as "that class of facts that the paradigm has shown to be particularly revealing of the nature of things" [1970, 25]), scientists can significantly increase the precision and scope of the theory's problem solutions by refining experimental methods and techniques of measurement, or by contributing a new, or by improving and adapting an existing, technical apparatus. The nineteenth-century chemist Jöns Jacob Berzelius, for example, devised well-planned gravimetric procedures and employed carefully purified reagents to perform analyses which were patiently repeated until he arrived at values for combining weights of the elements which could be refined no further. His contemporary, Sir Humphry Davy, explored and improved techniques for utilizing the voltaic cell in chemical research. Davy's successes in decomposing a number of salt solutions and solid compounds contributed to the early problem-solving abilities of the electrochemical theory of affinity (according to which the chemical attraction between elements, responsible for compound formation, was held to be electrical in nature). Later in the century, thermal methods, which were used to solve problems of affinity, were greatly advanced with the development of Pierre Berthelot's bomb calorimeter and with the improvement of thermochemical measurement techniques.

Similarly, in the first half of the nineteenth century, the development of numerous methods to arrive at accurate determinations of atomic weights helped further the empirical abilities of the atomic theory: Dalton's rule of greatest simplicity, Gay-Lussac's law of combining volumes, the law of Petit and Dulong on the relationship between specific heat and atomic weight, and Mitscherlich's law of isomorphism which established the relationship between chemical composition and crystalline form. Yet each of these specific methods contained limitations which often led to ambiguous and discordant results, and the accurate determination of atomic weights would not be fully secured until the general method proposed by Avogadro, based on vapor densities, was adopted.

³Thagard's (1978, 81-82) discussion of dynamic consilience is slightly different from that adopted here.

The remaining two ways in which a theory may demonstrate its empirical fertility have to do with extensions of the theory's problem-solving domain. They differ in regards to whether these extensions are achieved through the addition of ad hoc hypotheses or through an increase in dynamic consilience. Whitt (1988) argues that when a theory manages to extend its problem-solving domain in a dynamically consilient way it enjoys an important conceptual as well as empirical achievement by securing greater simplicity and generality of the network of conceptual resources it provides for empirical problem solving. Ad hoc extensions, by contrast, introduce greater complexity into this network of resources; new assumptions are added to the theory in order to account for a particular fact or class of facts, and these assumptions do not receive further empirical support as the theory is developed, or do not serve to uncover new facts which they help to explain. Despite these differences on the conceptual level, however, both ad hoc and dynamically consilient extensions of the theory constitute enhancements of its empirical abilities. Berzelius, for instance, was unable to provide a consilient explanation of the law of definite combining proportions using Berthollet's theory of affinities. He had simply added the assumption that those parts of bodies that combined do so in definite proportions; any excess of the various substances would remain in an equilibrium determined by their antagonistic forces. Later he would use his own electrochemical theory of affinity to provide an ad hoc explanation of isomerism.
He assumed the existence of different atomic states to account for the different properties of substances which had the same chemical composition. Neither of the above assumptions served to uncover new facts which they helped to explain, though both added to the empirical problem-solving abilities of their respective theories. A consilient explanation of isomerism was, however, provided by the atomic theory. Wollaston had early taken up and attempted to develop Dalton's claims regarding the geometrical considerations governing chemical combination, arguing that the atomic theory could not rest content with atomic weight determinations but would need to provide geometric models of the arrangement of atoms in space. With the development of such models and of three-dimensional views of chemical bonding in the third quarter of the century, atomic theory was able to explain the differences in properties of substances with the same composition in terms of structural differences. Another notable example of a theory extending its problem-solving domain without the addition of ad hoc assumptions
or modifications can be found in the dynamically consilient explanation provided by kinetic-molecular theory of Brownian motion.

Appraisals of a theory's empirical fertility, together with those of its conceptual viability, provide scientists with valuable historical indices which can be used to determine whether a theory is promising and worthy of pursuit. As such, they are highly sensitive to the specific time interval selected: Determinations of empirical fertility and conceptual viability may vary markedly if the interval over which the theory is being evaluated is lengthened or shortened. When scientists assess the promise of a theory at tn, they do not always consult the full career of the theory, that is, how it has developed from its inception at t1, but often emphasize how it has performed most recently, say from t4 . . . tn. (A clear example of this occurs in the diverging assessments which Boltzmann, Van't Hoff and Planck made of the promise of kinetic-molecular theory, discussed above.) Individual scientists, then, may reach different conclusions while appealing to the same historical indices of promise because they differ in the significance which they grant to the recent performance of the theory. The fact that scientists may differ in the selection of an appropriate time interval over which to assess theory promise is a welcome result which helps to provide for rational diversity of action in scientific communities. Such historical indices of promise are, however, not the only ones which must be consulted. Various formal indices must be addressed as well.

IV

Formal indices of promise are of particular interest, for they enable us to see how a newly emergent and little-developed theory, for which the historical indices will be of only marginal relevance, may nevertheless be judged promising and worthy of research. The ability to accommodate cases like this is important if we believe scientists can rationally invest their time and resources exploring and developing the empirical problem-solving abilities of new theories. Moreover, such formal indices may also enable us to account for the phenomena of rashness and tenacity which we find in scientific communities. They help us better understand why it might be rational for a scientist such as Boltzmann to tenaciously pursue a "stagnating" kinetic theory. Conversely, they may permit us to appreciate how a scientist can rationally abandon pursuit of a theory which, according to the historical indices, is relatively promising, and "rashly" pursue some other theory which is, according to these same indices, much less promising.

Two of the formal indices available may appropriately be regarded as constituting the heuristic of the theory of which they are part. That theories possess heuristics is a commonplace observation; the term is frequently invoked by both philosophers and historians of science. Less clear, though, is exactly what the heuristic of a theory might be understood to be. There has not yet been the kind of extensive, thorough analysis of this concept which it merits. Nor will I offer one here. My discussion of the formal indices of promise does, however, require a few tentative observations regarding the nature and role of the heuristic within a theory.

The heuristic provides a theory with programmatic research directives, of varying degrees of specificity and completeness, which orient and guide efforts by scientists to extend the scope and accuracy of the theory's empirical problem-solving abilities. It is, thus, one of the central conceptual resources which a theory makes available to scientists engaged in pursuit. As such, heuristics are amenable to, and, as will be seen below, regularly subjected to, evaluation by scientists. They undergo not only "public articulation and development" (Boyd 1979, 361) by the scientific community, but critique as well. Not only, then, does a heuristic invite further research on the theory, it also serves to focus scientists' efforts to refine and elaborate the theory, indicating what research strategies to employ and the lines along which the theory is to be pursued. It does this by singling out, from the set of authentic, or methodologically legitimated, empirical problems addressable by a theory, certain problems or types of problems, targeting these as primary research problems (PRPs) while relegating the complement of this subset to a lesser or secondary importance. Typically, this task is accomplished by means of an analogy. Along with this, a heuristic will articulate specific research procedures which are empirically oriented and serve to facilitate resolution of the PRPs. I will refer to these as experimental strategies.
Such strategies generally enjoin the use either of a particular experimental device or technique in empirical problem solving (e.g., the voltaic cell or new analytical procedures), or alternatively of an auxiliary hypothesis (e.g., Dalton's rule of greatest simplicity) bearing directly on how empirical results are to be formulated. Two important aspects of a heuristic, then, are: (i) its analogy, which will direct scientists to the resolution of a particular subset of empirical problems within the theory's domain, and (ii) the experimental strategies which it makes available for resolving these problems. Drawing on Whewell's discussion in another context, we might say a heuristic supplies scientists with a "bond of unity by which the phenomena are held together" (1847, 46), leaving

the subject . . . open to further prosecution; which ulterior process may, for the most part, be conducted in a more formal or technical manner. The first great outline of the subject is drawn; and the finishing . . . demands a more minute pencilling. . . . In the pursuance of this task, rules and precepts may be given, and features and leading circumstances pointed out, of which it may often be useful to the inquirer to be aware. (Ibid.)

Before considering how analogies and experimental strategies are to be appraised as formal indices of theory promise, it will be helpful to consider a particular case by way of illustration. According to the heuristic provided by early Daltonian atomism, atoms were analogous to piles of shot: The atoms of a given element were all alike with respect to their specific properties (e.g., weight, size and number per unit volume), while the atoms of different elements differed from one another in weight, size, and number per unit volume. An important property characterizing the atoms of the different elements was their relative weights and, indeed, Dalton had stated that "one great object" ([1808] 1964, 213) of his work was to show "the importance and advantage of ascertaining" (ibid.) these relative weights. Thus, the heuristic of Daltonian theory singled out the determination of atomic weights as a primary research problem. It also provided a specific research strategy to be employed in determining these weights, an auxiliary hypothesis commonly referred to as Dalton's rule of greatest simplicity. The equivalent weights of the elements (the weights that combine together to give definite compounds) could be determined directly by experiment. Data on combining weights could then be used to calculate atomic weights if one knew how many atoms of one element combined with a single atom of another. Since no means was available of estimating such combining numbers, Dalton assumed that combination would always be of the simplest type. Discrepancies between prediction and experimental results could be and were attributed by chemists working on the theory to inaccuracies resulting from insufficiently refined analytical techniques.
When Berzelius, for instance, encountered such discrepancies he assiduously repeated and modified his procedure, "Enlightened by the knowledge of my own errors, and with the aid of better methods, I finally found a great accord between the results of the analysis and the calculations of the theory" (Farber 1969, 150). V indices of promise We might turnnow to how these heuristic-constituitive are to be appraised. For purposes of this discussion I will adopt a broad conception of an analogy so as to include within it both models and metaphors. An analogy asserts that a relational structure that applies in one domain can be applied in another. More exactly, it may be understood as any nonliteral similarity comparison between complex systems which
maps relations in a known or base domain onto a target domain of inquiry (Gentner 1982, 108-109). Consider Rutherford's comparison of the hydrogen atom to the solar system. Here a relation of "revolves around" which holds between objects (a planet, the sun) in the base domain (i.e., the solar system) is one of several relations which is mapped onto objects (the electron, the nucleus) in the target domain (i.e., the hydrogen atom). One of the central contributions of an analogy to a theory is that it facilitates the formulation of predictions in a little-understood domain on the basis of known relations in a well- or better-understood domain (Gentner and Grudin 1985). Using his solar system model, Rutherford was able to predict that in the hydrogen atom a smaller object is peripheral to a larger object, the smaller object revolves around the larger one, the space between them is large relative to the size of the objects, and so on (Hanson 1958, 128-129).

Because analogies facilitate prediction, they can play a vital role in determining theory promise. Scientists often cite the possession of a certain analogy among their reasons for believing a particular theory to be promising. We have already seen Boltzmann, for example, arguing that the kinetic-molecular theory deserved to be "cultivated further" in part because "experience teaches that one will be led to new discoveries almost exclusively by means of special mechanical models". The mechanical model of billiard balls provided a well-understood basis for studying what was less well understood, that is, the behavior of the molecules of a gas. He was also emphatic that he was appealing to an analogy:

In describing the theory of gases as a mechanical analogy, we have already indicated, by the choice of this word, how far removed we are from that viewpoint which would see in visible matter the true properties of the smallest particles of the body.
(Boltzmann 1964, 26)

Similarly, throughout much of the eighteenth century, affinity theory was guided by a heuristic according to which the forces of chemical affinity were analogous to gravitational forces. Since the latter were, as the result of Newton's work, well understood and well established, they were believed to provide an analogical "key" for solving the PRPs of affinity theory: the effort to quantify chemistry by ordering and accounting for the observable reactions of chemical reagents. These examples point to two criteria valuable in evaluating a theory's analogy: how well understood and how well established is the base domain. The first of these might be referred to as the specificity (Gentner 1982), and the second as the validity, of the base. With regard to specificity, it should be noted that insofar as the base domain is well understood it is likely to be familiar as well. Yet familiarity does not guarantee specificity:

[M]olecular bonding is sometimes explained by analogy with interpersonal attraction, for example, 'The lonely sodium ion searches for a compatible chloride ion'. Interpersonal attraction is certainly familiar, but its rules are also unfortunately unclear. (Gentner 1982, 113)

This lack of clarity with regard to the base relations is transferred to the analogical mapping from the base to the target. It becomes unclear precisely what is to be mapped from the base to the target domain. This suggests a third criterion of evaluation, clarity, pertaining to the exactness or precision of the mapping of relations from base to target. In contrast to a fuzzy analogy, a well-clarified analogy permits strong new predictions. While the predictions of a fuzzy analogy may be verified, they usually reflect strong a priori convictions. Since such predictions could have been made "with or without the analogy we cannot credit the analogy with fostering new discoveries. Only a well-clarified analogy (can) . . . force a truly new and surprising prediction" (ibid., 127).

A caveat is perhaps in order here. An analogy may enjoy clarity without sacrificing open-endedness. In other words, not all the relevant aspects of similarity need, or in the ideal case should, be spelled out, although those that are should be well clarified. Open-endedness is another feature of a promising analogy. An open-ended analogy is one in which not all the relevant aspects of similarity are known. If the relevant aspects of similarity (or dissimilarity) have all been probed, the analogy will offer little promise for further extension of the empirical abilities of the theory. This is one way in which a theory's heuristic may be "exhausted". (Another is examined in section 6.) The validity of the base should also be considered more closely.
When an analogy draws on a base that is well established, the theory is in a position to make a stronger case for being promising. As Thagard has noted, because they play an important role in improving explanations, analogies can provide valuable support for the theories of which they are part:4

We get increased understanding of one set of phenomena if the kind of explanation used is similar to ones already used. This is because the rules and problem solutions used by a new theory to deal with a
4Boyd, moreover, even suggests that "the quite specific analogy between the cognitive content of a proposed theory and one already accepted may provide some evidential support for the new theory" (1979, 359).
phenomenon are enhanced by connections with well-established rules and solutions. (1988, 94)

A theory may thus enrich the conceptual resources it supplies for empirical problem solving through an analogical connection with those of a well-established theory. An analogy which makes this connection provides scientists with a way of drawing upon the resources of another successful theory to help resolve the PRPs singled out by the heuristic of the theory under pursuit. It may also increase the dynamic consilience of the target theory. Having an analogy with an established theory "is likely to lead to dynamic consilience, since explanations of new classes of facts may be achievable using further analogies" (ibid.).

Of course the analogy does not itself provide an explanation of the phenomena in the targeted research area, or solve the PRPs. It does suggest the kind of explanation which is needed and is likely to succeed, leaving scientists working on the theory the task of developing the detailed explanations required in each case. In Whewell's terms, it leaves the subject "open to further prosecution", by drawing the "first great outline" which then demands finishing through "a more minute pencilling" of detail. Moreover, points of disanalogy may play a role in ruling out a certain kind of explanation or problem solution. On the heuristic of Berthollet's affinity theory, for instance, forces of chemical affinity were analogous to gravitational forces in that they varied according to the masses, or quantities, of the reacting substances, whereas this was a point of disanalogy between the forces of affinity and those of gravitation according to the heuristic of Bergman's theory of elective affinities. Thus, Berthollet's theory suggested that the kind of explanation which would prove fruitful in accounting for particular chemical reactions would need to make reference to the relative quantities of the reacting substances.
This kind of explanation was ruled out by Bergman's theory: The products of a reaction were to be explained by reference only to the relative intensities of the affinities of substances, the quantities entering into the reaction being irrelevant.

Let us consider now the second of the formal indices of promise: the experimental strategies specified by the heuristic for resolving the PRPs. As noted above, one way in which such strategies are provided is by the articulation of an auxiliary hypothesis bearing directly on the formulation of experimental results. To see how such hypotheses may be critically evaluated, we might examine the objections raised against the use of Dalton's rule of greatest simplicity in the determination of atomic weights. Recall that since no means was available of estimating the numbers of atoms which combined to form compounds, Dalton had simply assumed
that such combination would always be of the simplest type. A number of chemists (including Wollaston, Berzelius, Gay-Lussac and Berthollet) were acutely dissatisfied with this assumption, regarding it as an arbitrary conjecture. The second clause of the rule states that "When two combinations are observed, they must be presumed to be a binary and a ternary . . ." (Dalton [1808] 1964, 167). As Wollaston noted, however, this does not enable one to determine which of the two observed combinations is binary and which is ternary:

[I]t is impossible in several instances, where only two combinations of the same ingredients are known, to discover which of the compounds is to be regarded as consisting of a pair of single atoms. . . . [T]he decision of these questions is purely theoretical. (1814, 7)

Gardiner (1979), however, has argued that the assumption was not arbitrary, in that Dalton gave a reason for it: He held that binary compounds are more likely than ternary since the mutual repulsions of atoms of the same element limit the number which can combine with those of the other element. But it was conjectural in that there was a lack, or an insufficient variety, of evidence for it. While the rule could be used in conjunction with Dalton's theory to arrive at correct predictions regarding combining proportions, there were quantities which appeared in the rule, namely, the combining numbers of atoms, which could only be determined by using the rule itself. Thus the rule could not be independently tested. Berthollet, for example, objected that:

[w]e have no means of determining the number of atoms which combine in this manner in each compound; we must therefore have recourse to conjectures. . . . Can such presumptions serve as the basis for the determination of the elements of chemical combination? Are we not accepting the vaguest of the speculations of metaphysics . . . ?
(Crosland 1968, 280) Berzelius too held that Dalton's assumption was not "fully warranted by facts" (Muir 1884, 17) and initially made use instead of Gay-Lussac's empirical law of combining volumes to determine the combining numbers of atoms. The ratios of the weights of the combining volumes of elementary gases were taken to represent the ratios of the weights of the atoms of those elements. Later, he also drew on the Dulong-Petit law which permitted the calculation of atomic weights on the basis of measurements of specific heats. Neither of those two methods for determining atomic weights relied on the rule of greatest simplicity. As a result of their use it was possible to calculate atomic weights in more than one way and to provide a variety of evidence for a hypothesis as to atomic weight. Such a hypothesis could be tested by measuring proportions and
then retested by measuring the specific heats via the Dulong-Petit law (Gardiner 1979, 19). What this case suggests is that the promise of such an auxiliary hypothesis as an experimental strategy is affected by the variety of evidence that can be cited in its support.

Experimental strategies, however, need not be limited to auxiliary hypotheses. The heuristic may instead enjoin the use of a particular experimental apparatus or technique. The promise of a theory may be greatly enhanced when its heuristic directs scientists to apply a powerful new experimental device or procedure in empirical problem solving. According to the heuristic of Davy's (as well as Berzelius's) electrochemical theory of affinity, for example, the forces of chemical affinity were regarded as analogous to those of electricity. In order to resolve the PRPs of this theory, its heuristic directed scientists to decompose compounds into their constituent elements by means of an electric current supplied by the voltaic battery. The great power of the latter, which Davy described as "an alarm bell to the slumbering energies of experimenters in every part of Europe" (1840, vol. 8, 271), suggested that it would be possible to decompose substances such as the alkalies, which were believed to be compounds but which had resisted every effort to break them down into simpler substances. It also promised to link hitherto diverse areas, serving

no less for demonstrating new properties of Electricity and for establishing the laws of this Science, than as an instrument of discovery in other branches of Knowledge; exhibiting relations between subjects before apparently without connection and serving as a bond of unity between chemical and physical philosophy. (Ibid.)

VI

I have already noted that appraisal of a theory's formal indices of promise does not require an assessment of the theory's actual performance over some interval t1 . . . tn.
Thus, the analogy and the experimental strategies embodied in the heuristic of a theory may be assessed at tn as part of an overall estimation of theory promise. It is, however, certainly possible to evaluate them over time. Indeed, this will form a major part of the appraisal of empirical fertility, in particular, of the determination of how well the theory has succeeded in extending its empirical problem-solving domain.

A few words are in order regarding one of the ways in which a theory may experience heuristic exhaustion, contributing to a gradual decline in its promise over time. This consideration will perhaps be of greatest relevance in appraising the promise of a theory that has already enjoyed considerable pursuit. In supporting the claim that a theory T at tn is no longer promising,
one might argue that either the analogy or the experimental strategies posited by T's heuristic have been "exhausted" or "played out". This raises questions about the actual ability that the heuristic has demonstrated to secure adequate solutions to the PRPs. In Lakatosian terms, it signals that the theory is degenerating; it is no longer succeeding in extending its empirical scope to new problems in its domain. The heuristic exhaustion of the theory reflects the gradual failure of one or both of its heuristic-constitutive indices of promise to guide and advance research efforts effectively.

An example of this can be seen in Berthollet's critique of Bergman's theory of elective affinities. According to the heuristic of Bergman's elective affinity theory, forces of affinity were analogous to gravitational forces in that the forces of attraction between minute particles of bodies varied according to their distance apart, but were disanalogous in that affinity was independent of the masses of the reacting substances. The PRPs targeted by this heuristic were concerned with the determination of the relative affinities of substances for one another and the establishment of affinity tables based thereon indicating the order of the affinities of substances. As an experimental strategy for resolving these PRPs, the heuristic advanced an auxiliary hypothesis specifying that the affinity between two substances is constant under similar conditions (and so is independent of the masses of those substances). Thus chemists, in arriving at determinations of relative affinities, were directed to discount as irrelevant the quantities of the reacting substances and to attend solely to the relative intensities of the affinities of substances. Elective affinity was, then, a constant, invariable force which alone determined the direction of a chemical reaction.
In application, this heuristic was enormously successful for a considerable time, enabling chemists to determine, and to establish the order of, elective affinities. Bergman had managed to extend his affinity tables to include most known substances and to reconcile the rules of elective affinity with a number of reactions which had seemed inconsistent with them. But as chemists continued working with the heuristic in order to fill out omissions in Bergman's tables and to dispose of the remaining exceptions to his rules, they found themselves unable to fit the reactions they discovered into his ordered displacement series without inconsistencies. They repeatedly uncovered what they termed anomalous reactions, that is, ones which did not fit the predicted order, for which they were able to do little more than hand-wave. Fourcroy, for instance, in 1801, declared that all anomalies were the result of unaccounted special circumstances such as the physical state of the substances, the heat or cooling employed, imaginary substance, and particularly haste and carelessness on the part of the investigator. (Holmes 1962, 108)
Clearly then, by the beginning of the nineteenth century, the heuristic of Bergman's theory had, as Lakatos would put it, "run out of steam". Positing that elective affinity was a constant force which alone determined the direction of a chemical reaction and which was independent of the relative quantities, or masses, of the reacting substances was no longer a fruitful means for securing resolutions of the PRPs. It was just this heuristic exhaustion of Bergman's theory which led Berthollet, in 1803, to regard it as unpromising and no longer worthy of pursuit. He attempted to reorient affinity theory by supplying it with a new heuristic which directed chemists' attention to the importance of considering not only the affinities, but the relative quantities of the reacting substances, as well as the properties which influenced the direction of a reaction. According to this heuristic, forces of chemical affinity were analogous to gravitational forces in that they were proportional to the relative masses of the reacting bodies. In this way Berthollet was able to explain a number of reactions of alkalies and alkaline earths with acids which were anomalous under Bergman's theory.

VII

Other formal indices figure in an assessment of theory promise. Although I will do little more than draw attention to them here, this does not reflect their value, much less their significance in scientists' appraisals of promise. Among these are the theory's logical relations (entailment, reinforcement, consistency, and so on) with well-founded theories in other domains. Laudan (1977) provides an extended discussion of such relations. Particularly instructive in this regard is the case considered in section 1. The incompatibility of kinetic-molecular theory with the absolute validity of thermodynamics was central in the arguments of various scientists that it was unpromising and no longer worthy of pursuit.
Those who supported kinetic-molecular theory argued that the second law of thermodynamics was essentially statistical in nature. Maxwell's efforts to apply statistical methods to the kinetic theory were followed by Boltzmann's, whose H-theorem placed the statistical interpretation on a more quantitative basis. Determined attempts to resolve the apparent contradiction between macroscopic irreversibility and the fundamental reversibility of Newtonian mechanics, and to overcome such obstacles as the "reversibility paradox" and the "recurrence paradox" which stood in their way, testify to the significance which scientists placed on the role of intertheoretic logical relations in determining the promise of kinetic-molecular theory.

In estimating the promise of T at tn, scientists also sometimes make
reference to the kind of theory which it is. This constitutes another important formal index of promise. That a distinction can be made between two different kinds of theories has been noted by various authors. (See Merz [1907] 1912 and Duhem [1914] 1954 for slightly differing accounts.) I will discuss it here in terms of a distinction between foundational and systematic theories. The former, unlike the latter, grant a central role to underlying entities, processes, or mechanisms in providing accounts of the nature or behavior of phenomena in their domain. They base such accounts on descriptions of entities, processes or mechanisms that are held to constitute-or to be responsible for-the phenomena. Systematic theories, by contrast, are addressed primarily to features or regularities which characterize the system under study as a whole, extending accounts of the phenomena in their domain that do not rely on claims regarding the nature or behavior of underlying entities, processes or mechanisms. This distinction between two different kinds of theories has been explored in Schelar (1966) and Whitt (1990a) in terms of a distinction between two different kinds of problem-solving approaches. Latter-day affinitist theories, for example, can be seen as systematic theories which employed a bulk approach to chemical problem solving. They dealt with the observed behavior of large quantities of substances independent of any reference to the atomic or molecular constitution of the substances. Chemical reactions were studied by considering the energy changes accompanying reactions: The system as a whole could be examined without requiring the construction of any underlying models or commitment to any particular theory of matter. Latter-day atomic theories, on the other hand, can be regarded as foundational theories which employed a particulate approach to chemical problem solving. 
Using this approach, chemists attempted to provide accounts of the behavior of matter in quantity by reference to the behavior of individual particles. Chemical reactions were studied in terms of atoms, molecules and their motions: A mechanism for chemical change could be provided, based on the laws governing the motions of atoms and molecules. These theories were committed to a discrete, particulate view of matter. How are considerations of theory kind relevant to appraisals of theory promise? Foundational theories take risks which systematic theories typically do not. Because their accounts of the phenomena rest on specific descriptions of underlying entities, processes or mechanisms, foundational theories are, for instance, more likely to clash with theories, as well as to feel the repercussion of changes of theories, in other domains. A foundational theory in chemistry may find itself at odds with the prevailing physical theory, as early Daltonian atomism did. Yet there is a virtue in such risk-taking. A foundational theory is capable of extending
our understanding of the natural world in significant ways. It often does this by provoking a great deal of experimental and theoretical research, by opening up new problem areas or by generating extensive programs of research for which it, some modification of it, or a new theory to which it gives rise, serves as the basis. This was certainly true of Dalton's atomic theory and Newton's theory of gravitation. Merz notes that

just as Newton's theory had given rise to a surprising activity in physical astronomy, to a long series of exact measurements and to theoretical deductions of a purely mathematical kind, so the atomic theory of Dalton in the early years of the century fixed the task of chemists for a long time ahead. ([1907] 1912, 395-396)

Regarding the atomic theory, Merz draws attention to

the extension which was gained in the domain of actual facts . . . the great harvest of actual knowledge of the things and processes of nature which was collaterally gained, whilst chemists were trying to prove or to refute existing opinions. (Ibid., 396)

One of the reasons Boltzmann offered for regarding kinetic-molecular theory as promising was this ability of a foundational theory to open up new problem areas for research. Arguing that the kinetic theory has "provided suggestions that one could not obtain in any other way", he observed that the "more boldly one goes beyond experience . . . the more surprising facts one can discover" (1974, 96). While acknowledging the usefulness of systematic theories (those of latter-day affinitism), Boltzmann claimed, "[A] theory which yields something that is independent and not to be got in any other way, for which, moreover, so many physical, chemical and crystallographic facts speak, must not be combated but further developed" (ibid.).

However, it would be a mistake to conclude that systematic theories are inherently less promising. Their virtue lies in their potential to control and regulate subsequent research.
This powerful function is possible in part because the accounts systematic theories offer of the phenomena in their domain are not based on specific claims about underlying entities, processes or mechanisms. Consequently, they are resilient, able to survive changes in, say, the prevailing theory of matter. The likelihood that they will be subject to intertheoretical conflicts is thereby lessened; they may be compatible with competing theories in other domains or with differing accounts that may be proposed of the nature of underlying features.

Maxwell, for instance, cited this as a virtue of his 1865 "Dynamical Theory of the Electromagnetic Field". There he followed Thomson and Tait in arguing that the energy of the electromagnetic field could be specified without also specifying a particular underlying mechanical structure for the system, that is, without constructing an account of "the nature of the connexions of the parts of the system" (Maxwell [1892] 1904, 213). To do so, he drew on Lagrange's generalized equations of motion, maintaining that the power of this method lay in the fact that "the final equations . . . are independent of the particular form of (the mechanical) connections" (ibid., 200), and recommending it as "presenting to the mind in the clearest and most general form the fundamental principles of dynamical reasoning" ([1890] 1965, 309). The formalism of analytical dynamics employed by Lagrange also provided the basis for George Green's 1838 theory of the elastic solid ether. Green, too, in defending the promise of this theory, argued that it is preferable to assume a general "physical principle as the basis of our reasoning, rather than assume certain modes of action which, after all, may be widely different from the mechanism employed by nature" (Harman 1982, 26).

VIII

The formal indices of theory promise provide a valuable complement to the historical indices, that is, to the evaluation of a theory's conceptual and empirical "research record". They are part of the varied conceptual resources that theories supply for empirical problem solving, and need to be assessed in determining theory promise. Doubtless, others not mentioned here need to be weighed as well,5 and a great deal more remains to be said about those considered here. I have hoped to establish the importance of taking more seriously an aspect of scientific methodology largely neglected by philosophers of science, to draw attention to its complexity, and to begin to sort out some of the relevant issues. Appraisals of theory promise are, or certainly can be, something more than mere intuitive hunches.
They are capable of rational defense and, as the examples introduced above illustrate, scientists typically endeavor to support them on both conceptual and empirical grounds. Unless Kuhnian Normal Science is the rule rather than the exception, appraisals of theory promise figure significantly in the history and practice of science. Within philosophy of science they have a vital role to play in the articulation of any theoretical pluralism, for such positions are committed to the view that a substantial portion of the research activities of any scientific community be devoted to the pursuit-the development, refinement and elaboration-of multiple theories. They may help us to better appreciate, and accommodate, rational diversity of action within scientific communities.
5Some consideration of the role which social factors play in determinations of theory promise is also required.
REFERENCES
Boltzmann, L. (1964), Lectures on Gas Theory. Translated by S. G. Brush. Berkeley and Los Angeles: University of California Press.
. (1974), Theoretical Physics and Philosophical Problems: Selected Writings. Holland: Reidel.
Boyd, R. (1979), "Metaphor and Theory Change: What is 'Metaphor' a Metaphor For?", in A. Ortony (ed.), Metaphor and Thought. Cambridge, England: Cambridge University Press, pp. 356-408.
Crosland, M. P. (1968), "The First Reception of Dalton's Atomic Theory in France", in D. Cardwell (ed.), John Dalton and the Progress of Science. New York: Barnes & Noble, pp. 274-287.
Dalton, J. ([1808] 1964), A New System of Chemical Philosophy. New York: Philosophical Library.
Davy, H. (1840), The Collected Works of Sir Humphry Davy. Vols. 1-9. London: Smith, Elder & Co.
Duhem, P. ([1914] 1954), The Aim and Structure of Physical Theory. Translated by P. Wiener. Princeton: Princeton University Press.
Farber, E. (1969), The Evolution of Chemistry: A History of Its Ideas, Methods, and Materials. 2d ed. New York: The Ronald Press.
Gardiner, M. (1979), "Realism and Instrumentalism in 19th Century Atomism", Philosophy of Science 46: 1-34.
Gentner, D. (1982), "Are Scientific Analogies Metaphors?", in D. Miall (ed.), Metaphor: Problems and Perspectives. Brighton, Sussex: Harvester Press, pp. 106-132.
Gentner, D. and Grudin, J. (1985), "The Evolution of Mental Metaphors in Psychology: A 90-Year Retrospective", American Psychologist 40: 181-192.
Hanson, N. R. (1958), Patterns of Discovery: An Inquiry into the Conceptual Foundations of Science. Cambridge, England: Cambridge University Press.
Harman, P. (1982), Energy, Force and Matter: The Conceptual Development of Nineteenth-Century Physics. New York: Cambridge University Press.
Holmes, F. (1962), "From Elective Affinities to Chemical Equilibria: Berthollet's Law of Mass Action", Chymia 8: 105-145.
Kuhn, T. (1970), The Structure of Scientific Revolutions. 2d ed. Chicago: University of Chicago Press.
. (1977), The Essential Tension: Selected Studies in Scientific Tradition and Change. Chicago: University of Chicago Press.
Lakatos, I. (1978), The Methodology of Scientific Research Programmes. Cambridge, England: Cambridge University Press.
Laudan, L. (1977), Progress and Its Problems: Toward a Theory of Scientific Growth. Berkeley and Los Angeles: University of California Press.
Maxwell, J. C. ([1892] 1904), A Treatise on Electricity and Magnetism. Vol. 2. 3d ed. London: Oxford University Press.
. ([1890] 1965), The Scientific Papers of James Clerk Maxwell. Vol. 2. Edited by W. D. Niven. New York: Dover.
McMullin, E. (1976), "The Fertility of Theory and the Unit for Appraisal in Science", in R. Cohen, P. Feyerabend, and M. Wartofsky (eds.), Boston Studies in the Philosophy of Science. Vol. 39, Essays in Memory of Imre Lakatos. Holland: Reidel, pp. 395-432.
Merz, J. ([1907] 1912), A History of European Thought in the Nineteenth Century. Vols. 1-2. London: William Blackwood & Sons.
Muir, P. (1884), A Treatise on the Principles of Chemistry. Cambridge, England: Cambridge University Press.
Nye, M. (1972), Molecular Reality: A Perspective on the Scientific Work of Jean Perrin. New York: American Elsevier.
Schelar, V. (1966), "Thermochemistry and the Third Law of Thermodynamics", Chymia 11: 99-121.
Thagard, P. (1978), "The Best Explanation: Criteria for Theory Choice", The Journal of Philosophy 75: 76-92.
. (1988), Computational Philosophy of Science. Cambridge, MA: MIT Press.
van Melsen, A. (1960), From Atomos to Atom: The History of the Concept Atom. New York: Harper & Row.
Whewell, W. (1847), The Philosophy of the Inductive Sciences, Founded upon Their History. 2d ed. London: J. W. Parker.
Whitt, L. A. (1988), "Conceptual Dimensions of Theory Appraisal", Studies in History and Philosophy of Science 19: 517-529.
. (1990a), "Atoms or Affinities? The Ambivalent Reception of Daltonian Theory", Studies in History and Philosophy of Science 21: 57-89.
. (1990b), "Theory Pursuit: Between Discovery and Acceptance", PSA 1990, vol. 1. East Lansing: Philosophy of Science Association, pp. 467-483.
Wollaston, W. H. (1814), "A Synoptic Scale of Chemical Equivalents", Philosophical Transactions of the Royal Society 104: 1-22.