Sunteți pe pagina 1din 30

Synthese (2009) 171:4775 DOI 10.

1007/s11229-008-9379-6

Conditionals in reasoning
John Cantwell

Received: 23 October 2006 / Accepted: 21 July 2008 / Published online: 27 August 2008 Springer Science+Business Media B.V. 2008

Abstract The paper presents a non-monotonic inference relation on a language containing a conditional that satises the Ramsey Test. The logic is a weakening of classical logic and preserves many of the paradoxes of implication associated with the material implication. It is argued, however, that once one makes the proper distinction between supposing that something is the case and accepting that it is the case, these paradoxes cease to be counterintuitive. A representation theorem is provided where conditionals are given a non-bivalent semantics and epistemic states are represented via preferential models. Keywords Conditionals Non-monotonic logic Ramsey test

Reasoning is a mental activity involving mental states and mental acts. Part of the subject matter of logic is to provide a theory of what constitutes correct reasoning. Such theories are often presented by rules of inference, where an inference is a particular kind of transition from one mental state to another and the rules governing inference characterize the legitimate transitions. The manner in which such rules are presented often leaves implicit both the nature of the transition and the normative status of the rule. For instance, the following is a standard rule presented in a standard way: A& B . A I am here less interested in the rule following aspect than I am in the underlying norm that justies the rule. For this particular rule I would suggest that the underlying norm is: It is a requirement of rationality that if one accepts A& B , then one accepts A.

J. Cantwell (B ) Royal Institute of Technology, Stockholm, Sweden e-mail: cantwell@kth.se

123

48

Synthese (2009) 171:4775

Instead of the above vertical notation I will use the following horizontal notation to express that A is accepted by an agent j : | j A. With this new notation the norm corresponding to the above rule could be stated as follows: x A. For any agent x , it is a requirement of rationality that if | x A& B , then | This, in turn, will be abbreviated to: If | A& B , then | A. Not all rules of inference have this format. The dominant logics, like classical and intuitionistic logic, contain a number of rules of inference that govern the interaction between what one supposes to be the case and what one accepts to be the case (Gentzen 1969; Prawitz 1965). So one nds the rule (sometimes called disjunction elimination) that here can be formulated as follows If C is accepted on the supposition that A, and C is accepted on the supposition that B , then C is accepted on the supposition that A B . Or, letting A | j B stand for B is accepted by j when supposing that A: For any agent x , it is a requirement of rationality that if A | x C and B | x C , then A B | x C . In abbreviated form: If A | C and B | C , then A B | C. Keep in mind that the left-hand side and right-hand side of the wobbly turnstile represent two different kinds of mental states. The act of supposing that A or assuming that A (I use the terms interchangeably) is a mental act that puts one in the mental state where one supposes or assumes that A (the same word is used to describe both the act and the state that results from the act). To suppose that A and to accept that A are two different things (e.g. see Levi 1996). If one supposes that A then one accepts that A (while one reasons under the supposition), but the converse need not hold. For instance, I believe, accept, that it will rain this evening (the clouds look very heavy), but this belief does not have the status of an assumption. This shows up because I can coherently (without accepting any contradiction) suppose, for the sake of the argument, that it will not rain this evening. The assumption that it will not rain, contradicts my belief that it will rain, but upon supposing that it will not rain I no longer, for the duration of the argument, accept that it will rain. My belief has been bracketed by an assumption to the contrary. Assumptions can be formed and given up at will, but while they are in place, they are not undermined by any further assumptions. Thus if I rst assume that it will rain and then, without canceling this rst assumption, assume that it will not rain, then I am committed to accepting a contradiction. Suppositions allow us to reason with and explore possibilities that are not supported by any available evidence, indeed that may contradict the available evidence. Thus even though I know nothing more than that either the gardener or the butler did it, I can for the sake of the argument assume (without any additional evidence) that the gardener didnt do it and reason from this supposition. In this case I would, while

123

Synthese (2009) 171:4775

49

reasoning under this supposition, accept that the butler did it. Or I can assume that it wont rain even though the heavy clouds provide sufcient evidence for me to believe (but not to be indefeasibly certain) that it will rain. The fact that suppositions can undermine acceptances makes our reasoning nonmonotonic in that reasoning does not (and should not) in general satisfy the requirement of monotonicity: If A | C , then A , B | C. As the classical consequence relation is monotonic this disqualies it as the basis for a general theory of correct reasoning.1 However, classical logic is (a strong candidate to be) the correct theory of purely suppositional reasoning: a theory of the kind of reasoning that proceeds only from what one assumes to be the case and what one accepts on the basis of these assumptions, without involving any proposition that one merely (and defeasibly) believes to be the case.2 The correct general theory of correct reasoninga theory which allows that assumptions can undermine or bracket what one otherwise acceptsmust be sought within the domain of non-monotonic logics. Non-monotonic logic is often associated with ampliative reasoning: default logic, circumscription logic, inductive logic can all be seen as forms of reasoning governing more or less plausible ways of extending ones beliefs beyond the available evidence, beyond but yet consistent with what is required by rationality. By contrast the non-monotonicity in reasoning that has been discussed above arises from the fact that beliefs may be defeated by assumptions and is not of the ampliative kind, it is not reasoning intended to reach beyond what one already has reason to believe on the assumptions made. If anything it is of the opposite kind: the non-monotonicity arises because not every belief can legitimately be introduced at every stage of reasoning, some beliefs are temporarily blocked by assumptions to the contrary.3 Yet another form of non-monotonicity arises from the dynamics of information. Acquiring new evidence can make you surrender old beliefs. The non-monotonicity in reasoning that I have in mind is not of this kind either. While the adding and canceling of assumptions in reasoning gives rise to a dynamics, it is not the dynamics of learning: to suppose that something is the case is quite different from learning that it is the case. Still, the different forms of non-monotonicity are structurally similar and are not always easy to keep apart. So for instance Isaac Levi has argued that some aspects of the inuential AGM (after Alchourrn et al. 1985) framework for information dynamics

1 A general theory should not only cover reasoning from what one assumes or believes to be the case theoretical reasoningit should also cover reasoning about what one wants and intends to dopractical reasoning. In this paper only theoretical reasoning is studied. Broome (2001; 2002) outlines an account of practical reasoning. 2 Axiomatic presentations of logic provide a list of claims that are to be indefeasibly accepted. So for instance, in some presentations of classical logic A A is put forward as an axiom, which means that one should accept it regardless of what one otherwise assumes. 3 There is, of course, a sense in which this kind of reasoning is ampliative. It may be required of a particular agent j that j accepts B on the assumption that A even though the same requirement does not apply to some other agent k (perhaps j justiably believes that A B and k does not), thus one cannot say that it is the assumption A alone that brings about j s requirement to accept B , but the assumption A together with j s beliefs.

123

50

Synthese (2009) 171:4775

(belief revision) are best viewed as an analysis not of belief revision, but of suppositional reasoning. Likewise the preferential non-monotonic logic of KLM (after Klaus et al. 1990), often presented as a logic for ampliative reasoning, can be re-interpreted as a theory of reasoning along the lines sketched above. The connection between them can easily be seen if we change representation (Grdenfors and Makinson 1991). If we let K j represent the set of propositions accepted by the agent j , and K j A be the set of propositions accepted by j under the supposition that A, then | j A if and only j B if and only if B K j A. I prefer to use the wobbly turnstile if A K j , and A | | rather than the set notation as the former, for historical reasons, emphasizes the connection to logic and, through it, to reasoning. The distinction between accepting that A and supposing that A is by now wellestablished. But it is not always appreciated how profound the consequences are for the proper understanding of even the most ancient and well-established rules of inference. Many classical rules of inference can be interpreted in several different ways. Take an apparently simple rule like the disjunctive syllogism which (trying to formulate it in a neutral way) says that B can be inferred from the premises A B and A. In the present setting this loose statement of the disjunctive syllogism can be interpreted in four different ways: DS1 DS2 DS3 DS4 A B, A | B. If | A B and | A, then | B. If | A B , then A | B. If | A, then A B | B.

The key here is whether we are to understand the premises A B and A as assumptions (DS1), or as claims that one (perhaps defeasibly) accepts (DS2), or as some combination of assumptions and acceptance (DS3, DS4). It is far from obvious that all these formulations are equivalent. Indeed, I claim that not only are they not all equivalent, but that two of them are false: they do not reect requirements of rationality. The rst two, DS1 and DS2, I think are correct, but it is easy to nd counterexamples to DS3 and DS4. For instance, say that I believe that the butler did it (committed the murder). Then I accept that either the butler or the gardener did it. But upon the supposition that the butler didnt do it I am not required to accept that the gardener did it. For upon the assumption that the butler didnt do it I bracket the belief that the butler did it, and with it the (dependent) belief that either the butler or the gardener did it. Thus DS3 is false. Similarly, even though I might believe that the gardener did it, this doesnt mean that on the assumption that either the gardener didnt do it or my dead grand-mother did it, I am required to accept that my dead grand-mother did it: the possibility that my dead grand-mother did it is not a live option. Thus DS4 is false. Enter conditionals.4 The Ramsey Test for conditionals, an essential ingredient in the analysis of conditionals, holds that there is a close-knit connection between the
4 The conditionals I have in mind are conditionals in the past or present tense: If Jane accepted the bet, she won, If the sausage is not in the fridge, someone has eaten it, If 1 + x = 4, then x =3 are representative examples. Conditionals in the subjunctive mood If the sausage hadnt been in the fridge, Jane wouldnt have eaten it and conditionals in the future tense If Jim jumps from the cliff, he will die are not covered in the analysis.

123

Synthese (2009) 171:4775

51

conditionals that one accepts, and what one accepts on the basis of an assumption. It says (letting A B correspond to the indicative conditional If A, then B ): | A B if and only if A | B. We can immediately note that the ancient rule modus ponens suffers the same ambiguity as the disjunctive syllogism. Trying to formulate it in a neutral way it says that B can be inferred from the premises A and A B . In the present setting this loose statement can be interpreted in four different ways: MP1 MP2 MP3 MP4 A, A B | B. If | A and | A B , then | B. If | A, then A B | B. If | A B , then A | B.

Again the key here is whether we are to understand the premises A and A B as assumptions (MP1), or as claims that one (perhaps defeasibly) accepts (MP2), or as some combination of assumptions and acceptance (MP3, MP4). Each are distinct non-equivalent formulations of modus ponens and, as it happens, I think two of them are requirements on reasoning and two are not. The two correct principles are MP1 and MP4 (the latter is just one direction of the Ramsey Test), while MP2 and MP3 are both subject to counterexamples. Here is a counterexample to MP3. I believe that Oswald killed Kennedy, but upon assuming that if Oswald killed Kennedy, then my grand-mother was a co-conspirator, I would bracket the belief that Oswald killed Kennedy and so I would not conclude that my grand-mother was a co-conspirator to the murder. For I believe that if Oswald killed Kennedy only if my grand-mother was a co-conspirator, then Oswald didnt kill Kennedy. Van McGee has presented a nice counter-example to MP2: Opinion polls taken just before the 1980 election showed the Republican Ronald Reagan decisively ahead of the Democrat Jimmy Carter, with the other Republican in the race, John Anderson, a distant third. Those apprised of the poll results believed, with good reason: If a Republican wins the election, then if its not Reagan who wins it will be Anderson. A Republican will win the election. Yet they did not have reason to believe If its not Reagan who wins, it will be Anderson. (McGee 1985, p. 462) The literature on conditionals, on their relationship to suppositional reasoning, on how they interact with the Ramsey Test in non-monotonic reasoning, and on the problems one encounters along the way is by now considerable (see Bennett 2003 for a general overview, and Arl-Costa 2007 for an overview of the main frameworks for dealing with conditional logic). I want to focus briey on two recurring issues in the literature. The rst issue concerns the semantics of conditionals. Many (e.g. Adams 1975; Levi 1988, 1996; Edgington 1995) hold that the kind of conditionals (indicative

123

52

Synthese (2009) 171:4775

conditionals) discussed here have no semantics. In the most radical form it is held that there is no sense in which these conditionals can be taken to express propositions that are true or false, and that the acceptance of a conditional is no more than the expression of a state of mind. Ramsey is a precursor: Many sentences express cognitive attitudes without being propositions; and the difference between saying yes or no to them is not the difference between saying yes or no to a proposition. F. P. Ramsey, Law and Causality (Ramsey 1929/1990). One need not actually endorse such expressivism to see the virtues of a non-semantic analysis. It is surely a worthwhile ambition to state the requirements of rationality that govern reasoning without appealing to any semantic properties of the language. Indeed there is an important tradition, inferentialism, which holds that once we have provided a theory of the norms governing the use of the various connectives we have also provided a theory of meaning for those connectives (e.g. Brandom 1994; Dummett 1997). Such a view, however, need not be irreconcilable with the view that the connectives have truth conditions. Soundness and completeness proofs are standard ways of linking the normative perspective with the semantic-descriptive perspective, regardless of which perspective one thinks has conceptual priority. A considerable amount of the formal work in the area has focused on epistemic models for the logic of conditionals, models that do not assume that conditionals have a semantics.5 This in itself need not be seen as a form of expressivism. It should already be clear that a theory of correct reasoning cannot be given in purely semantical terms. The mental state one is in when one believes that A has the same propositional content as the mental state one is in when one assumes that A, but the normative requirements on reasoning distinguish between the two mental states, thus the normative requirements on reasoning take more than propositional content into consideration.6 Expressivists take the need for epistemic (rather than semantic) analyses to be grounded in the fact that conditionals lack semantics, and they often invoke an array of impossibility resultsthe best known due to Lewis (1976)that suggest that conditionals do not behave like standard truth-carrying constructions. I think these semantic sceptics are right up to a point, but that they overstate their case. There are good reasons for holding that a conditional (of the kind considered here) with a false antecedent lacks truth value, here I agree with the expressivists. But there are also good reasons to hold that a conditional with a true antecedent and a false consequent is false, just as there are good reasons to hold that a conditional with a true antecedent and a true consequent is true. That is, I am inclined to think that conditionals express partial propositions, propositions that can be true or false, but that also can lack truth value. This is not a new idea (it goes back at least to Quine
5 There are those who hold that conditionals describe the epistemic state of the speaker or describe the

epistemic state of whoever assesses a conditional, in effect relativizing the truth conditions for conditionals to some agent (e.g. Stalnaker 1975; Weatherson 2001). Space does not allow for an in-depth discussion of these views here, although it should be noted that a difference in view on the semantics basis for conditionals need not amount to differences in views on the resulting logic.
6 This does not mean that the content of mental states is irrelevant for the norms of reasoningit may very well be that the norms of reasoning in part supervene on relationships between propositional contentsbut we cannot expect these semantic relationships to pull the whole weight.

123

Synthese (2009) 171:4775

53

1950), indeed some expressivists (e.g. Bennett 2003 and Edgington 1995) are willing to concede the point, but do not regard this as anything that can be put to use. I think they are wrong. The second issue is closely connected with the rst: the difculty of giving a general treatment of embedded conditionals, that is, of conditionals that occur in the scope of some other connective or in the scope of some other conditional. It is commonly conceded that a conditional like A ( B C ) is equivalent to ( A& B ) C . But apart from this special case there is little agreement on how to proceed. There are different suggestions on how to deal with negated conditionals, and many hold that conditionals cannot even meaningfully occur as left-hand embedded, embedded in the antecedent of another conditional (as in ( A B ) C ). Seen from the perspective of suppositional reasoning this would be a disaster. For it seems clear that we can coherently assume that a conditional A B holds and reason from such an assumption to a conclusion C . But if we have A B | C , then the Ramsey Test entails | ( A B) C . There have been attempts at enriching standard models for representing epistemic states in order to analyze left-hand embedded conditionals. But some of these are too domain specic to serve as a general analysis, and more general frameworks (such as Hansson 1992 and Arl-Costa 1995; 1999) are so general in character that they provide little guidance on how we are to proceed to extract a reasonable theory of conditionals. In this paper I take classical logic as a starting point and I assume that it to a large extent captures the actual norms invoked in purely suppositional reasoning. It is my starting point for investigating the norms involved in reasoning that mixes beliefs and assumptions. The intuitionist or relevance logician would choose a different starting point and hence arrive at a different theory. My choice of starting point should not be taken as unconditional acceptance of classical logic (to the contrary: it has problems). However, some of the general lessons emphasized herethe importance of keeping beliefs and assumptions apart, about the non-monotonicity induced by the fact that assumptions undermine beliefs, and the specic problems introduced by conditionalswould remain valid in a variety of starting points. In Section 2 I present and argue for a non-bivalent semantics for conditionals. Section 3 presents a preferential model for epistemic states and provides a representation theorem. Section 4, nally, discusses one of the problems with taking classical logic as a starting point.

1 Inference relations 1.1 Categorical inference relations First a language without conditionals will be considered: Denition 1 The categorical language L 0 consists of a countable set of propositional atoms (denoted by lower case letters p , q , r, . . .) including falsum, , closed under the connectives and & (so that if A and B are sentences of L , then so are A and A& B . A B is dened: ( A& B ).

123

54

Synthese (2009) 171:4775

Denition 2 A Categorical Inference Relation is a relation | (nite) sets of sentences B abbreviates { A} | B ): of L 0 to sentences of L 0 satisfying ( , A | &Acc I If E If I If E If I If E If | A& B if and only if | A and | B. | A, then | A B and | B A. ,A| C and , B | C , then , A B | C. ,A| , then | A. , A | , then | A. . | A& A, then | | , then | B , for any B .

Together with the structural constraints: Reexivity , A | A. Cut If | A and , A | B , then | B. Cautious Monotony If | A and | B , then , A | B. The requirements that characterize categorical inference relations provide a natural generalization of the classical requirements on reasoning on the basis of assumptions and axioms alone. Note that we get classical logic by replacing (actually: strengthening) Cautious Monotony by: Monotony If | B , then , A | B.

1.2 Conditional inference relations Denition 3 The language L consists of a countable set of propositional atoms, closed under the connectives , and & (so that if A and B are sentences of L , then so are A, A B and A& B ). A B is dened: ( A& B ). A sentence in L is categorical if it contains no instance of the conditional or if every instance of the conditional occurs within the scope of a negation. Denition 4 A Conditional Inference Relation is a relation | from (nite) sets of sentences of L to sentences of L , satisfying, in addition to &Acc, I, E, I, E, I, E and the structural requirement Reexivity: Acc | A B if and only if , A | B. As well as the following restricted forms of the structural constraints: R-Cut If | A and | B , then , A | B , provided that B is categorical. R-Cautious Monotony If | A and , A | B , then | B , provided that B is categorical. Here are some sample properties (the proofs are left as an exercise, for the proof of (2) it is helpful to use Theorem 2 below): Observation 1 For any conditional inference relation | : 1. 2. 3. A, A B | B. | A ( B C ) if and only if | ( A& B ) C . | A ( B &C ) if and only if | A B and | A C.

123

Synthese (2009) 171:4775

55

1.3 Discussion From the perspective of classical logic, the distinctive feature of conditional inference relations is that they fail to satisfy: Monotony If | B then , A | B. Cut If | A and , A | B , then | B. Failure of Monotony is of course the dening feature for all non-monotonic logics. Given the intended interpretation of conditional inference relations the failure of monotony corresponds to the fact that what you accept on a given set of assumptions , need not be accepted upon further suppositions . In particular, a belief (which is an unconditional acceptance) may be bracketed when reasoning under a supposition, so that we may have | A without having B | A. For instance, say that Jane, noticing that her cat comes inside drenched in water, draws the reasonable conclusion that it is raining outside. That is, after Jane has made her observation we have | j It is raining outside Her belief is no rmer, however, than that she can coherently consider the possibility that she is wrong; on the supposition that it isnt raining, she no longer accepts that it is raining. Instead, on supposing (for the sake of the argument) that it isnt raining, she brackets her belief that it is raining and accepts that someone poured water on the cat, that is, we have It isnt raining outside | j Someone poured water on the cat, but not It isnt raining outside | j It is raining outside. Or say that Bill saw Jane leaving home this morning, apparently without an umbrella. Thus, on the supposition that it rained during the day, Bill would conclude that Jane got wet: It rained | b Jane got wet. On the other hand, while unlikely, Bill is willing to consider the possibility that Jane carried a hidden umbrella, so he also accepts: It rained, Jane carried a hidden umbrella | b Jane didnt get wet, but not It rained, Jane carried a hidden umbrella | b Jane got wet.

123

56

Synthese (2009) 171:4775

The distinguishing property of arguments that have the structure of an reductio ad absurdum is that there are suppositions that one cannot coherently accommodate (that is, they lead one to accept a contradiction). So for instance, Jane accepts that 1 + 1 = 2 and she cannot coherently accommodate the supposition that 1 + 1= 2. So we have: | j 1 + 1 = 2, and 1+1 = 2 | j 1 + 1 = 2. but we also have: 1+1 = 2 | j 1 + 1 = 2. Accepting both a claim and its negation is the dening feature of a reductio argument and in classical logic, as well as here, it leads to a logical explosion: one accepts everything. While non-monotonicity is widely accepted as a feature of everyday reasoning, the failure of Cut is more contentious (for instance, Gabbay 1985, holds it as one of the dening properties of an inference relation). But, I will argue, it is a distinguishing feature of conditionals in the kind of inference relations considered here that they systematically violate Cut. It should already be clear that there is a difference in the logical status of accepting that A as opposed to supposing that A. This is obvious already in allowing for nonmonotonicity. As we have seen, in allowing for non-monotonicity one can coherently have both: | A and , B | A . Note, however, that one is committed to accepting anything one supposes to hold, so we always have: A, and we never have A, , B | A . | A,

That is, while the belief (or conditional acceptance) that A can be undermined by further assumptions that conict with A, the assumption or supposition that A is never undermined by further assumptions or suppositions. Or, in yet different words, the

123

Synthese (2009) 171:4775

57

fact that A occurs on the right-hand side of the relation | Athe fact that A is (hypothetically) acceptedis no guarantee that it will remain on the right-hand side no matter what we add to the left-hand side; but as soon as A occurs on the left-hand side, this is also a guarantee that it will occur on the right-hand side: no matter what , we always have , A | A. Suppositional reasoning allows us to explore the possibility that we are wrong in some of our beliefs (of course, they also allow us to explore possibilities that are consistent with what we believe); in supposing that A one explores a possibility ( A) that may go contrary to what one believes. So some of the conditionals that we accept explore the possibility that we are wrong in what we believe to be the case. But the conditionals that we accept never explore the possibility that we are wrong in what we assume or suppose to be the case. For instance, we have: A, A | B, for an arbitrary B (one cannot coherently explore the possibility that both A and A are true). So we have: A| A B, for an arbitrary B ; under the assumption that A the conditional A B explores a possibility ( A) that cannot be coherently explored while one is assuming that A. On the other hand it is not in general the case that just because we have: | A, we also have: A | A. One need not accept a contradiction when making an assumption that goes contrary to what one believes. So it is not in general the case that when we have | A, we also have, for an arbitrary B : A | B. Thus it is not in general the case that whenever we have | A, we also have: | A B. So here we begin to see the problem with Cut (and unrestricted Cautious Monotony). For Cut says that | A together with A | A B , entails | A B. It saysimplausibly when conditionals are involvedthat defeasible beliefs can be treated as if they were assumptions as long as no further assumptions are added. The problem is that when a conditional A B is evaluated the antecedent plays the role of an assumption in the evaluation of the consequent, an assumption that may

123

58

Synthese (2009) 171:4775

undermine what one otherwise believes (but not what one otherwise assumes), and so when reasoning with conditionals one cannot treat what one otherwise believes as on par with assumptions. We see this feature over and over again: Yes: Yes: Yes: Yes: Yes: Yes: A| A B. AB| ( A&C ) B . A, A B | B. AB| B A. A C. A B, B C | AB | A B. No: No: No: No: No: No: If | A, then | A B. If | A B , then | ( A&C ) B . If | A and | A B , then | B. If | A B , then | B A. If | A B and | B C, then | A C. If | A B , then | A B.

All the properties of the left-hand column (the yes column) hold for conditional inference relations (they are easy to prove). On the other hand, none of the properties of the right-hand column (the no column) in general hold for conditional inference relations (this can easily be shown given the representation results below). If we allow unrestricted Cut, all of the No-properties would follow from the Yesproperties. Why is this important? Because there is good evidence from considered judgements (e.g. Adams 1975) that none of the No-properties are valid in general. It is important because one of the cardinal mistakes that is endemic in the literature on conditionals is to take the evidence against the No-properties as suggesting that the Yes-properties are not valid, or that they pertain to some other conditional. It is important because the opposite mistake is just as common; the mistake of dismissing the evidence against the No-properties on the grounds that the Yes-properties belong to the core of classical logic, a deeply entrenched theory of reasoning. It is important because the two opposing views typically do not realize that they are discussing two distinct sets of logical properties, and that the systematic disagreement between the two parties is due to a systematic confusion about the properties they are talking about. Give up Cut and this false conict is exposed. The rst pair (with the right-hand side being a well-known paradox of implication) has already been discussed. Consider the second pair. The failure of antecedent strengthening is generally accepted for natural language conditionals: one can accept If Jim struck the match ( M ), it lit ( L ) while denying If Jim struck the match and the room was empty of oxygen ( O ), the match lit. In the present framework this is respected as we can have: | M L, and | ( M & O ) L . By contrast, when one holds xed (assumes/supposes) If Jim struck the match it lit, one is committed to accepting If Jim struck the match and the room was empty of oxygen, the match lit. That is, we have:

123

Synthese (2009) 171:4775

59

ML| (M &O) L . The third Yes/No-pair (two different forms of modus ponens) has already been discussed and similar stories can be told for the remaining pairs. I will not belabour the point any further. 2 Semantics 2.1 Classical semantics Denition 5 A classical valuation is an assignment V of truth values {t, f} to the sentences of L satisfying: A t f A f t AB t t t t f f f t A& B t t t f f f f f

Note that the conditional is here interpreted as the material implication. Denition 6 A is a classical consequence of , in symbols | A, if and only if for every classical valuation V : if V ( B ) = t for all B , then V ( A) = t. Theorem 1 (Supra-classicality) For any Conditional Inference relation | , if then | A. Theorem 2 For any Conditional Inference relation | : 1. If A | B and B | A, then 2. , A, B | C if and only if ,A| C if and only if , A& B | C. ,B | C. | A,

Any conditional inference relation contains the classical consequence relation. Furthermore, it is easily seen that the classical consequence relation is itself a conditional inference relation. Thus the classical consequence relation is the smallest conditional inference relation: it captures the requirements of rationality that hold regardless of what is otherwise accepted. For this reason I think we can think of classical logic as a theory of purely suppositional reasoning. An important property (Theorem 2) is that classically equivalent sentences are substitutable when they appear as assumptions. Thus, in particular, A B and A B have the same role in reasoning when they appear as assumptions: to assume that A B is to exclude the possibility that A is true and B is false, i.e. to assume that A B . This does not mean that A B and A B are substitutable under acceptance. It is consistent with the requirements of conditional inference relations that one can accept A B without accepting A B . Parts of the material truth conditions for the conditional can be grounded in usage. It is wrong to accept a conditional if one holds the antecedent to be true and the consequent false, and wrong to reject a conditional when one holds both the antecedent

123

60

Synthese (2009) 171:4775

and the consequent to be true. This is mirrored in the material truth conditions: it is wrong to accept a conditional with a true antecedent and false consequent as the conditional is false under these conditions, and it is wrong to reject a conditional with a true antecedent and true consequent as the conditional is true under these conditions. Note, however, that there is no general tendency for people to accept a conditional on the basis that they hold the antecedent of the conditional to be false, yet according to the material truth conditions a conditional is true if its antecedent is false. Still worse, people actually and reasonably reject conditionals that they according to the material truth conditions believe are true: I believe that Oswald murdered Kennedy, I do not believe that if Oswald didnt murder Kennedy, then my grand-mother did, indeed I would reject this conditional as preposterous. So this part of the material truth conditions cannot be grounded in usage, but then what reason can be offered for holding that a conditional is true as soon as its antecedent is false? In my view, the strongest justication for the material truth conditions for the conditional stems from two theoretical commitments: (i) classical logic is the correct theory of purely suppositional reasoning, (ii) an inference in purely suppositional reasoning is correct only if it preserves truth. Note that (i), according to the strictures of classical logic, on assuming A B one is committed to accepting A B . So (ii) A B is true whenever A B is true; the latter is true when A is false, so A B is true when A is false. If one holds, as I do, that the conclusion of this argument is false, one must reject one of the premises (i) or (ii). I think we should reject (ii) and with it the presupposition that every conditional sentence is either true or false.

2.2 Non-bivalent semantics If we agree that one can correctly infer A B whenever one has assumed that A B , and agree that in purely suppositional reasoning a valid inference cannot take one from a true premise to a false conclusion, we should conclude that A B cannot be false when A is false. In a bivalent setting this implies that A B is true when A is false, but if we accept that conditionals express partial propositions, and so can lack truth value, we can settle for a different conclusion: A B lacks truth value when A is false. The basic idea that actual usage of a particular kind of construction is important in xing the meaning ofand hence truth conditions forthe construction can hardly be controversial. Empirical studies on how speakers in various situations evaluate conditionals support the contention that conditionals lack truth value when the antecedent is false, for instance Politzer (2007), in a review of the empirical psychological investigations of reasoning with conditionals concludes In sum, reasoners seem to consider the not-A case as irrelevant to the truth value of conditional statements [of the form If A, then B ] (p. 80). Admittedly it is difcult to interpret data on how speakers evaluate claims as true or false, but when the data points in one direction and there is a coherent theoretical framework that can account for this data, this should count in favor of the theoretical framework (see Cantwell 2008a for a more thorough discussion).

123

Synthese (2009) 171:4775

61

Denition 7 A non-bivalent-valuation is an assignment W of truth values {t, f, } to the sentences of L such that: A A t f t f f AB t f t t f f t f A& B t f t t f f f f f f

Theorem 3 If a classical valuation V and a non-bivalent valuation W coincide on the atomic formulas, then for any A: V ( A) = t if and only if W ( A) = f Proof Just check the truth tables. So B is a classical consequence of A if and only if A is false in every non-bivalent valuation where B is false. That is, on the non-bivalent interpretation, classical logic is the strongest logic that guarantees that if the conclusion of an argument is false, then some premise is false. On this interpretation the distinctive feature of classical logic is not that it preserves truth (from the premises to the conclusion), but that it does not introduce falsity (in the conclusion from non-false premises). 2.2.1 Conditional negation A non-bivalent semantics allows for semantical distinctions that cannot be made in the classical bivalent semantics. In particular, it allows for a different notion of negation, inner or conditional negation, here represented by : A A t f f t This form of negation is of interest as it makes ( A B ) and A B logically equivalent and provides a plausible path to the property: | ( A B ) if and only if , A | B. This negative Ramsey Test is the plausible requirement that one should reject a conditional like If Jane applied, she got the position if and only if one, on the assumption that Jane applied, rejects the claim that Jane got the position. Conditional negation thus seems more appropriate in the present setting than the outer negation ( A B ) that says that A B is false. For note that, by supraclassicality, we have: , ( A B ) | A & B .

123

62

Synthese (2009) 171:4775

This is a very strong form of negated conditional. It seems clear that we do occasionally negate conditionals with a form of negation that is closer to than to . Inner and outer negation can (of course) coexist within the same language, but they can also be dened in terms of one another. A unary connective is denable in a language if for any A, there is a sentence B in the language such that A is logically equivalent to B (has the same truth value in every valuation). Theorem 4 is denable in L. Proof Actually, we dene both and the operator T ( A) ( A is true) with the truth conditions: A T ( A) t t f f f Dene T and as a function from sentences of L to sentences of L as follows: 1. 2. 3. 4. T ( p) = p. T ( A & B ) = T ( A )& T ( B ). T ( A B ) = A&T ( B ). T ( A ) = A .

Now dene A as (T ( A) A) A. It is straightforward to check that T ( A) and A, as dened, satisfy the requisite truth conditions. Note that | ( A B ) if and only if (by definition of ) | A B if and B. only if , A | I have chosen to characterize conditional inference relations using outer rather than inner negation to emphasize the connection to classical logic (if conditional negation had been taken as the primitive form of negation, the inference from A B to ( A B ) would come out valid, and while this seems plausible enough, it is not classically valid). See Cantwell (2008b) for a treatment of the logic of conditional negation. 3 Preferential models So far the notion of accepting a proposition on the basis of an assumption has been treated as a primitive. But it can be useful to study a model of rational acceptance. Here such a model is developed and it is shown that the model completely characterizes the class of conditional inference relations. Let U be a set of points (possible worlds) and I be a function that to each propositional letter p assigns a subset of U . Each point u in U generates a non-bivalent valuation Wu where Wu ( p ) = t if and only if u I ( p ), and Wu ( p ) = f if and only if u I ( p ). Let | A| denote the set of points in U where Wu ( A) = f. A proposition P is a subset of U where for some sentence A, P = | A|. Note that | A& B | = | A| | B | and | A B | = | A| | B |, so the set of propositions is closed under intersection and union. When is a nite set of sentences let | | = {| A|| A }.

123

Synthese (2009) 171:4775

63

Denition 8 A selection function is a function from propositions to subsets of U , with the restrictions: 1. (| A|) | A|. 2. If (| A|) | B |, then (| A| | B |) = (| A|). 3. (| A| | B |) (| A|) (| B |). The intended interpretation of (| A|) is that it is an agent relative function that selects the A-worlds (the worlds where A is not false) that the agent considers most plausible or most (epistemically) preferred.7 Due to the structural similarity between the present model and the model of KLM (Klaus et al. 1990) I will call a triple (U , I , ) a preferential model. Preferential models are not intended to provide an agent-relative semantics for conditionals (the semantics of conditionals was dealt with in the last section), but are instead intended to provide a structure rich enough to model the epistemic state of a rational agent and how the agents epistemic state changes upon making an assumption. Let represent the initial epistemic state of the agent and A represent the epistemic state of the agent upon supposing that A. Here A is dened: A(| B |) = (| A| | B |). That is, by assuming that A, the agent screens off all possible worlds where A is false from further consideration. Note that if is a selection function, then so is A. The task of the remaining sections is to give an account of how an epistemic state is related to the propositions accepted under different assumptions. That is, when represents the epistemic state of some rational agent j , we want to show how characterizes the relation | j. 3.1 The categorical case Denition 9 An acceptance relation on the categorical language L 0 and a preferential model M is a relation such that:

B if and only if (| |) | B |. if and only if B is true in the most preferred

So B is accepted on the assumptions -worlds. It follows that:

B if and only if

| |

B if and only if | |(U ) | B |,

That is, B is accepted on the assumptions if and only if B is true in the worlds that are most preferred after the agent has made the assumptions .
7 A transitive, stoppered, anti-reexive preference relation < on worlds gives rise to a selection function

as follows: < (| A|) = {u | A|| v | A| : u < v }.

123

64

Synthese (2009) 171:4775

We have the following representation result: Theorem 5 For any preferential model (U , I , ), relation.

is a categorical inference

Theorem 6 For any categorical inference relation | , there is a preferential model (U , I , ) that characterizes | , i.e. there is a preferential model (U , I , ) such that: | A if and only if 3.2 The conditional case The acceptance recipe for categorical propositions does not carry over to conditionals. Say that you believe that A, then (according to the semantics proposed above) you believe that A B lacks truth value, and this in itself does not give grounds for either accepting or rejecting the conditional. An acceptance recipe that works better with partial propositions is the following (here True( A) is the set of worlds where A is true and False( A) is the set of worlds where A is false):

A.

A if and only if | |(True( A) False( A)) True( A). is empty:

Or, in the special case when

A if and only if (True( A) False( A)) True( A).

That is, you accept that A if and only if A is true in the most preferred worlds where A has a truth value. This acceptance recipe allows us to derive the Ramsey Test:

A B if and only if A

B.

Proof Note rst: (E) (False( B ) True( B )) | A| = True( A B ) False( A B ). Now: 1. ( A)(False( B ) True( B )) True( B ) if and only if (by definition of ) 2. ((False( B ) True( B )) | A|) True( B ) if and only if (by (E)) 3. (False( A B ) True( A B )) True( B ). Note that as False( A B ) True( A B ) | A|, (3) holds if and only if: 4. (False( A B ) True( A B )) | A| True( B ).

123

Synthese (2009) 171:4775

65

As | A| True( B ) = True( A B ), (4) is equivalent to: 5. (False( A B ) True( A B )) True( A B ). So: A B if and only if, by the acceptance recipe, (1) if and only if (5) if and only if, by the acceptance recipe, A B . Informally: you accept a conditional like If Jane applied she got the position if and only if the conditional is true in all the most preferred worlds where it has a truth value; it has a truth value only in those worlds where Jane applied and is true only in those worlds where she applied and got the position. So you accept the conditional if and only if the most preferred worlds where Jane applied are worlds where Jane got the position. Unfortunately, the above acceptance recipe works badly when we consider conditionals embedded in conjunctions. What does it take for a conjunction to be true? Presumably that both conjuncts are true. What does it take for a conjunction to be accepted? Presumably that both conjuncts are accepted. In a bivalent setting the semantical and pragmatic theses go hand in hand, but when we turn to conditionals we face a problem. If a conditional lacks truth value when the antecedent is false then it cannot happen that both of the following are true (and so their conjunction cannot be true): 1. If the coin landed heads, Jim won the bet. 2. If the coin didnt land heads, Jim lost the bet. Yet clearly, one can accept both conditionals at the same time (if one believes that Jim has bet on heads, and if one does not know how the coin landed) and, hence, accept their conjunction. Thus, it is consistent with the requirements of rationality that one can accept a claim (such as the conjunction of the above conditionals) that cannot be true (but that might be false, maybe Jim didnt bet on heads). The acceptance recipe needs to be revised. For the moment I suggest the following: Denition 10 The acceptance relation characterized by a preferential model M is a relation dened recursively:

p if and only if (| |) I ( p ). A if and only if (| |) U | A|. A& B if and only if A and B. A B if and only if , A B .

We can now prove two representation theorems: Theorem 7 For any preferential model (U , I , ), relation.

is a conditional inference

Theorem 8 For any Conditional inference relation | , there is a preferential model (U , I , ), such that: | A if and only if

A.

123

66

Synthese (2009) 171:4775

3.3 Comments Note the weight that the proposed semantics of conditionals pulls in the above account. The idea is that an agent j satises A B p if p is true in the most plausible worlds where A B is not false. That is, when an agent assumes that A B she thereby excludes from consideration any world where A B is false. This presupposes that A B can be false, i.e. that it can have a truth value (although it does not presuppose that A B always has a truth value). Thus it is an essential aspect of the above model and the resulting representation theorem that conditionals can have truth values. The semantics also places constraints that are satised by the model. It is a requirement of rationality that if one accepts that a conditional is true, then one accepts the conditional. Furthermore, if one has consistent beliefs one should not accept a conditional that one believes is false. Both these constraints are satised when a nonbivalent semantics is employed (the rst property is not satised when the conditional is interpreted as a material implication): Obervation 2 1. If (| A|) True( B ), then A B . 2. If (| A|) = and (| A|) False( B ), then A

B.

Proof (1) Assume that (| A|) True( B ). Proof by induction over the length of B . B = p . Trivial. B = C & D . Note that (| A|) True(C & D ) if and only if (| A|) True(C ) and (| A|) True( D ) if and only if (by the induction hypothesis) A C and A D if and only if A C & D . B = C . Trivial. B = C D . Note that if (| A|) True(C D ), then (| A|) |C | and so (| A|) = (| A&C |). So (| A&C |) True(C D ) True( D ) and so, by the induction hypothesis, A&C D and so A C D . (2) By Corollary 2 (in Sect. 5.5), if A B , then (| A|) | B |. So assume that (| A|) = and that (| A|) False( B ). It follows that (| A|) | B | and so A B. Nevertheless, as conditionals express partial propositions the t to semantics is not perfect: an agent can believe a conditional even though she does not believe that it is true, indeed even though she believes that it lacks a truth value (the conditional must then be true in the, according to the agent, most plausible worlds where it has a truth value). Note that with the representation theorems at hand, one can prove a number of negative results. For instance, one can show that the following is not a requirement of rationality: If | A and | A B , then | B. This is done by constructing a preferential model where this form of modus ponens fails to hold. Let U consist of three worlds r, a , and c, where, alluding to McGees

123

Synthese (2009) 171:4775

67

example in the introduction, r is a world where Reagan won, a is a world where Anderson won and c is a world where Carter won. Of these r is the most plausible, followed by c, followed by r . Assuming that |Reagan won| = {r } (similarly for Anderson and Carter) and |A Republican won| = {r, a } we have:

Reagan won. A Republican won.

For the most plausible world is r where Reagan, who was a Republican, won. We also have:

A Republican won (Reagan won Anderson won).

For the most plausible world where a Republican won and it wasnt Reagan, is the world a where Anderson won. But we also have:

Reagan won Carter won.

For the most plausible world where Reagan didnt win is c where Carter won. For this reason we do not have:

Reagan won Anderson won.

Thus we see that one of the forms of modus ponens is not satised in all preferential models, hence we know (from the representation theorems) that it is not entailed by the requirements on conditional inference relations. Of course, a restricted version of this form of modus ponens still holds: Observation 2 If | A and | A B , then | B , provided that B is categorical. The proof is left as an exercise. 4 Implausible properties As already noted conditional inference relations contain classical logic (where is interpreted as the material implication), thus we have all the paradoxes of implication, such as A | A B , that have fueled part of the literature on conditionals. I have already argued that the property A | A B (together with a number of related properties) is in fact a plausible requirement of rationality when the premise A is viewed as an assumption, and that it should not be confused with its implausible sibling If | A, then | A B. Still, there remain some implausible properties. The problem with the strong classical negation, and how conditional negation might come to the rescue, has already been mentioned but other problems remain. For instance, due to supra-classicality, it becomes an alleged requirement of rationality that one is supposed to accept

123

68

Synthese (2009) 171:4775

( A B ) ( A B ) for any A and B . But clearly it is not a requirement of rationality that I accept: Either it is the case that if the coin landed heads then Napoleon is still alive, or it is the case that if the coin didnt land heads then Napoleon is still alive. Plausibly, a necessary condition for accepting a disjunction is that one does not reject both disjuncts, but in the above disjunction I can properly reject both disjuncts, so I cannot be required to accept their disjunction. Classical logic thus entails requirements, even on purely suppositional reasoning, that seem too strong. In its defense one can note that it is not possible for both disjuncts ( A B ) ( A B ) to be false at the same time (if one disjunct has a truth value, the other disjunct lacks truth value), thus we are not required to accept a claim that might be false. But this just shows that avoidance of falsity cannot be sufcient grounds on which to characterize acceptability conditions when we are dealing with partial propositions. Semantics is not enough. In this paper I have cheated as I have dened A B as ( A& B ), so A B will be true if neither A nor B are false but this means that A B cannot be interpreted as A or B . Given the way that A B has been dened, ( A B ) ( A B ) is not an obviously implausible property (both conditionals cannot be false at the same time). So this just leaves open the important problem of characterizing the logic that governs the English or when it combines with conditionals, a problem that has not been properly addressed here. What has been shown here is how a classical treatment of the connectives including the conditionalcan be combined with principles of non-monotonic logic. This yields all the theorems of classical logic (the same class of valid formulas as the classical consequence relation), but invalidates the classical reasoning patterns that are inappropriate when we turn to reasoning that is not purely suppositional. Classical logic is simple in many ways and very powerful but too strong to account for all our intuitions, so clearly there is more work to be done. 5 Proof of theorems 5.1 Proof of Theorem 1 The proof does not invoke Restricted Cut. Proof strategy: show that for every conditional preferential inference relation | : if | A, then there is a classical evaluation of the connectives (where is interpreted as the material implication) where every element of is true and A is false. The construction is standard. is a | -consistent set if for no nite subset 0 of . is a | -inconsistent set if there is some nite subset 0 of such that : 0| . is a maximal | -consistent set if no superset of is | -consistent. 0 | Lemma 1 Any | -consistent set can be extended to a maximal | -consistent set. Proof Assume that is a | -consistent set and assume some enumeration of the sentences of L . Dene:

123

Synthese (2009) 171:4775

69

1 2.a 2.b

. = =
i i

i +1 i +1

{ Bi }, if i { Bi } is a | -consistent set. { Bi }, otherwise.

Let = i i . First show that is a | -consistent set. Assume that it is not, then there is some nite set such that | . As is nite there is some i such that i . B . Thus by R-Cautious MonoTake any B i . Due to the inconsistency of , | tonicity, , B | , that is { B } is a | -inconsistent set. Reiterating this procedure . (In effect, we have shown that even if | is for each B i we nd that i | not in general monotonic, a | -inconsistent set remains inconsistent no matter what -consistent set there is some 0 j < i such that j is a we add to it). As 0 is a | -inconsistent sets. That | -consistent set but j { B j } and j { B j } are both | . But then j | B j by I; similarly, j | Bi by E. So j is is, j , B j | -consistent. | -inconsistent contradicting the assumption that j is | Next show that is a maximal | -consistent set. By the construction we know that for every B , either B or B . In either case we would get a | -inconsistent set by extending by any sentence B . Lemma 2 Take any maximal | -consistent set VC ( A) = t if and only if A . VC ( A) = f if and only if A . Claim: VC is a classical valuation of L (with interpreted as the material conditional). Proof Note that VC is well-dened as for no A do we have both A and A . Furthermore, for every A, either VC ( A) = t or VC ( A) = f (thus VC is bivalent). (). First: VC ( A) = t if and only if A if and only if VC ( A) = f . Second: Assume that VC ( A) = f . It follows that A . Thus A and so A , so VC ( A) = t . (&). Assume that VC ( A& B ) = t , then A& B . We have (due to Reexivity) A& B , A | A& B and so (due to &Acc) A& B , A | A. We also have (due to Reexivity) A& B , A | A. So (due to &Acc) A& B , A | A& A. As A& B and as is consistent, A not . Similarly B . But then A , B and so VC ( A) = VC ( B ) = t . Assume that VC ( A& B ) = f , then ( A& B ) . We have ( A& B ), A , B | A& B (due to Reexivity and &-IE). We also have (due to Reexivity) ( A& B ), A, B | ( A& B ). So (due to &-IE) ( A& B ), A , B | ( A& B )&( A& B ). Thus, as ( A& B ) , we cannot have both A and B in and so either A or B . But then either VC ( A) = f or VC ( B ) = f . (). Assume that VC ( A B ) = t , then A B . Assume for reductio that A VC ( A) = t and VC ( B ) = f . Then A , B . By Reexivity B , A B | B and so, by (Acc), B , A B , A | B . But we also have B , A B , A | B. So by (&Acc) B , A B , A | B & B . But then is not a | -consistent set. Thus either VC ( A) = t or VC ( B ) = f , i.e. either VC ( A) = f or VC ( B ) = t . Assume that VC ( A B ) = f , then ( A B ) . By Reexivity ( A B ), A, B | B so, by (Acc) ( A B ), B | A B and (via Reexivity and &Acc) and dene VC :


¬(A → B), B |∼ (A → B) & ¬(A → B). Thus B ∉ Γ and so ¬B ∈ Γ and VC(B) = f. Similarly, as ¬(A → B), ¬A, A |∼ B, we have ¬(A → B), ¬A |∼ A → B. But then ¬(A → B), ¬A |∼ (A → B) & ¬(A → B). So A ∈ Γ and VC(A) = t.

So assume that Γ |∼ A does not hold. Note that Γ ∪ {¬A} is |∼-consistent (as otherwise, by ¬E, Γ |∼ A). So Γ ∪ {¬A} can be extended to a maximal |∼-consistent set Γ*. As Γ ⊆ Γ* and ¬A ∈ Γ*, it follows that for each B ∈ Γ, VC(B) = t while VC(A) = f. As VC is a classical valuation we have shown that A is not a classical consequence of Γ.
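As a further aside (my own illustration, not material from the paper), the content of Lemma 2 can be made concrete with a toy computation: reading truth values off a maximal set and treating → as the material implication behaves exactly like an ordinary classical evaluation from the atoms. The Python sketch below evaluates sentences over a hypothetical fragment of such a set; only the atoms contained in the set need to be supplied.

def v(sentence, gamma):
    # Classical evaluation clauses matching those checked in the proof of Lemma 2;
    # sentences are tuples: ("atom", name), ("not", A), ("and", A, B), ("if", A, B).
    kind = sentence[0]
    if kind == "atom":
        return sentence in gamma
    if kind == "not":
        return not v(sentence[1], gamma)
    if kind == "and":
        return v(sentence[1], gamma) and v(sentence[2], gamma)
    if kind == "if":
        # The conditional is evaluated as the material implication.
        return (not v(sentence[1], gamma)) or v(sentence[2], gamma)
    raise ValueError("unknown connective")

p, q, r = ("atom", "p"), ("atom", "q"), ("atom", "r")
gamma = {p, q}  # hypothetical atoms contained in a maximal consistent set

print(v(("and", p, ("if", p, q)), gamma))  # True
print(v(("if", q, r), gamma))              # False: q holds but r does not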

5.2 Proof of Theorem 2

(1) Assume that A is classically equivalent to B (it follows by Supra-classicality that Γ, A |∼ B and Γ, B |∼ A). It is enough to show one direction of the Lemma: if Γ, A |∼ C, then Γ, B |∼ C. Proof by induction over the length of C.
C = p. Assume that Γ, A |∼ p. As Γ, A |∼ B we have Γ, A, B |∼ p by Restricted Cautious Monotony. As Γ, B |∼ A we have Γ, B |∼ p by Restricted Cut.
C = ¬D. Assume that the claim of the Lemma holds for D. Assume that Γ, A |∼ ¬D. As Γ, A |∼ B we have Γ, A, B |∼ ¬D by Restricted Cautious Monotony (note that every instance of → in ¬D occurs within the scope of a negation). As Γ, B |∼ A we have Γ, B |∼ ¬D by Restricted Cut.
C = D → E. Assume that the claim of the Lemma holds for D and E. Assume that Γ, A |∼ D → E. Then Γ, A, D |∼ E, but then (by the induction hypothesis) Γ, B, D |∼ E and so Γ, B |∼ D → E.
C = D & E. Assume that the claim of the Lemma holds for D and E. Assume that Γ, A |∼ D & E. Then Γ, A |∼ D and Γ, A |∼ E, but then (by the induction hypothesis) Γ, B |∼ D and Γ, B |∼ E and so Γ, B |∼ D & E.

(2) Show that Γ, A, B |∼ C if and only if Γ, A & B |∼ C. Proof by induction over the length of C.
C = p. Assume that Γ, A, B |∼ p. As Γ, A, B |∼ A & B we have Γ, A, B, A & B |∼ p by Restricted Cautious Monotony. As Γ, A & B |∼ A and Γ, A & B |∼ B we have Γ, A & B |∼ p by two applications of Restricted Cut. Assume that Γ, A & B |∼ p. As Γ, A & B |∼ A and Γ, A & B |∼ B we have Γ, A, B, A & B |∼ p by two applications of Restricted Cautious Monotony. As Γ, A, B |∼ A & B we have Γ, A, B |∼ p by Restricted Cut.
C = ¬D. Assume that Γ, A, B |∼ ¬D. As Γ, A, B |∼ A & B we have Γ, A, B, A & B |∼ ¬D by Restricted Cautious Monotony. As Γ, A & B |∼ A and Γ, A & B |∼ B we have Γ, A & B |∼ ¬D by two applications of Restricted Cut. Assume that Γ, A & B |∼ ¬D. As Γ, A & B |∼ A and Γ, A & B |∼ B we have Γ, A, B, A & B |∼ ¬D by two applications of Restricted Cautious Monotony. As Γ, A, B |∼ A & B we have Γ, A, B |∼ ¬D by Restricted Cut.
C = D → E. Γ, A, B |∼ D → E if and only if Γ, A, B, D |∼ E if and only if (by the induction hypothesis) Γ, A & B, D |∼ E if and only if Γ, A & B |∼ D → E.
C = D & E. Γ, A, B |∼ D & E if and only if Γ, A, B |∼ D and Γ, A, B |∼ E if and only if (by the induction hypothesis) Γ, A & B |∼ D and Γ, A & B |∼ E if and only if Γ, A & B |∼ D & E.


5.3 Proof of Theorem 5

We need to show that a given ⊩ satisfies all the constraints imposed on Categorical inference relations. This is fairly trivial for &Acc, ∨I, ¬I, ¬E, ⊥I, ⊥E and Reflexivity, so here it is shown only that ⊩ satisfies ∨E, Cut and Cautious Monotony.

∨E. Assume that Γ, A ⊩ C and Γ, B ⊩ C, i.e. that γ(|Γ| ∩ |A|) ⊆ |C| and γ(|Γ| ∩ |B|) ⊆ |C|. By requirement 3 on γ, γ((|Γ| ∩ |A|) ∪ (|Γ| ∩ |B|)) ⊆ γ(|Γ| ∩ |A|) ∪ γ(|Γ| ∩ |B|) and so γ((|Γ| ∩ |A|) ∪ (|Γ| ∩ |B|)) ⊆ |C|. But then γ(|Γ| ∩ (|A| ∪ |B|)) ⊆ |C| and so γ(|Γ| ∩ |A ∨ B|) ⊆ |C|, i.e. Γ, A ∨ B ⊩ C.

Cut and Cautious Monotony. Assume that Γ ⊩ A, i.e. γ(|Γ|) ⊆ |A|. By requirement 2 on γ, γ(|Γ|) = γ(|Γ| ∩ |A|). So Γ, A ⊩ B if and only if Γ ⊩ B.

5.4 Proof of Theorem 6

The proof is omitted as the proof of Theorem 8 below can easily be adapted to the categorical case.

5.5 Proof of Theorem 7

We need to show that a given ⊩ satisfies all the constraints imposed on Conditional inference relations. This is trivial for →Acc and &Acc. So we need to show that the remaining conditions are satisfied. First a number of auxiliary properties are presented.

Lemma 3 If B is categorical then A ⊩ B if and only if γ(|A|) ⊆ |B|.

The proof, which proceeds by induction over the length of B, is quite trivial and is omitted.

Lemma 4 If |A| = |B|, then A ⊩ C if and only if B ⊩ C.

Proof Assume that |A| = |B|. Proof by induction over C. The cases when C = p and C = ¬D are trivial. C = D → E. As |A| = |B|, we have |A| ∩ |D| = |B| ∩ |D| and so |A & D| = |B & D|. So A ⊩ D → E if and only if A, D ⊩ E if and only if (by the induction hypothesis) B, D ⊩ E if and only if B ⊩ D → E. C = D & E. A ⊩ D & E if and only if A ⊩ D and A ⊩ E if and only if (by the induction hypothesis) B ⊩ D and B ⊩ E if and only if B ⊩ D & E.

Lemma 5 If A, B ⊩ C, then γ(|A|) = γ(|A & (B → C)|).

Proof Assume that A, B ⊩ C. Proof by induction over the length of C. C = q. We have γ(|A| ∩ |B|) ⊆ |q|. Thus γ(|A| ∩ |B|) ⊆ |B → q|. Furthermore γ(|A| ∩ |¬B|) ⊆ |B → q|. So (by Requirement 3 on γ), γ((|A| ∩ |B|) ∪ (|A| ∩ |¬B|)) ⊆ |B → q|. But |A| = (|A| ∩ |B|) ∪ (|A| ∩ |¬B|), thus γ(|A|) ⊆ |B → q|. But then (by Requirement 2 on γ) γ(|A|) = γ(|A & (B → q)|).


C = E → F. It is assumed that A, B ⊩ E → F, i.e. A, B, E ⊩ F, i.e. A, B & E ⊩ F. Now: γ(|A|) = γ(|A & ((B & E) → F)|) = γ(|A & (B → (E → F))|) and we are done. C = E & F. A, B ⊩ E & F. Thus A, B ⊩ E and A, B ⊩ F. By the induction hypothesis γ(|A|) = γ(|A & (B → E)|) = γ(|A & (B → F)|). But then γ(|A & (B → E)|) ⊆ |B → F| and so γ(|A & (B → E)|) = γ(|A & (B → E) & (B → F)|) = γ(|A & (B → (E & F))|). C = ¬E. A, B ⊩ ¬E. So γ(|A & B|) ⊆ |¬E|. But then (using the same argument as in the base case), γ(|A|) = γ(|A & (B → ¬E)|).

Corollary 1 If A ⊩ B, then γ(|A|) = γ(|A & B|).

Proof Assume that A ⊩ B. Then A, A ⊩ B. By Lemma 5, γ(|A|) = γ(|A & (A → B)|) = γ(|A & B|).

Corollary 2 If A ⊩ B, then γ(|A|) ⊆ |B|.

Lemma 6 If |A| ⊆ |B|, then A ⊩ B.

Proof Proof by induction over the length of B. The cases when B = p and B = ¬C are trivial. B = C → D. Assume that |A| ⊆ |C → D|. Thus |A & C| ⊆ |D|. By the induction hypothesis A, C ⊩ D and so A ⊩ C → D. B = C & D. Assume that |A| ⊆ |C & D|. So |A| ⊆ |C| and |A| ⊆ |D|. By the induction hypothesis A ⊩ C and A ⊩ D, so A ⊩ C & D.

Lemma 7
∨I If Γ ⊩ A, then Γ ⊩ A ∨ B and Γ ⊩ B ∨ A.
∨E If Γ, A ⊩ C and Γ, B ⊩ C, then Γ, A ∨ B ⊩ C.
¬I If Γ, A ⊩ ⊥, then Γ ⊩ ¬A.
¬E If Γ, ¬A ⊩ ⊥, then Γ ⊩ A.
⊥I If Γ ⊩ A & ¬A, then Γ ⊩ ⊥.
⊥E If Γ ⊩ ⊥, then Γ ⊩ B, for any B.

Proof (∨I.) Assume that Γ ⊩ A. We need to show that Γ ⊩ A ∨ B, i.e. that Γ ⊩ ¬(¬A & ¬B). By Corollary 2, γ(|Γ|) ⊆ |A|. So γ(|Γ|) ⊆ |¬(¬A & ¬B)|. But then Γ ⊩ ¬(¬A & ¬B).

(∨E.) Assume that Γ, A ⊩ C and Γ, B ⊩ C. Proof by induction over C. C = p. It is assumed that Γ, A ⊩ p and Γ, B ⊩ p. So γ(|Γ| ∩ |A|) ⊆ |p| and γ(|Γ| ∩ |B|) ⊆ |p|. Thus (by Requirement 3), γ(|Γ| ∩ (|A| ∪ |B|)) ⊆ |p|, but then Γ, A ∨ B ⊩ p. C = D → E. As Γ, A ⊩ D → E and Γ, B ⊩ D → E, we have Γ, A, D ⊩ E and Γ, B, D ⊩ E. By the induction hypothesis Γ, A ∨ B, D ⊩ E and so Γ, A ∨ B ⊩ D → E. C = D & E. As Γ, A ⊩ D & E and Γ, B ⊩ D & E, we have Γ, A ⊩ D, Γ, A ⊩ E, Γ, B ⊩ D and Γ, B ⊩ E. Thus, by the induction hypothesis, Γ, A ∨ B ⊩ D and Γ, A ∨ B ⊩ E. So Γ, A ∨ B ⊩ D & E. This concludes the proof of ∨E.


(¬I.) Assume that Γ, A ⊩ ⊥. By construction γ(|Γ| ∩ |A|) = ∅. Furthermore |Γ| = (|Γ| ∩ |A|) ∪ (|Γ| ∩ |¬A|). By Requirement 3, γ((|Γ| ∩ |A|) ∪ (|Γ| ∩ |¬A|)) = γ(|Γ|) ⊆ γ(|Γ| ∩ |A|) ∪ γ(|Γ| ∩ |¬A|). So γ(|Γ|) ⊆ γ(|Γ| ∩ |¬A|). But then γ(|Γ|) ⊆ |¬A|. So Γ ⊩ ¬A.

(¬E.) Proof by induction over A. A = p. Assume that Γ, ¬p ⊩ ⊥. Then γ(|Γ| ∩ |¬p|) = ∅. But then (via Requirement 3) γ(|Γ|) ⊆ |p| and so Γ ⊩ p. A = B & C. Assume that Γ, ¬(B & C) ⊩ ⊥. Then γ(|Γ| ∩ |¬(B & C)|) = ∅, i.e. γ(|Γ| ∩ (|¬B| ∪ |¬C|)) = ∅. But then γ(|Γ| ∩ |¬B|) = γ(|Γ| ∩ |¬C|) = ∅. So Γ, ¬B ⊩ ⊥ and Γ, ¬C ⊩ ⊥. By the induction hypothesis Γ ⊩ B and Γ ⊩ C and so Γ ⊩ B & C. A = B → C. Assume that Γ, ¬(B → C) ⊩ ⊥. Then γ(|Γ| ∩ |¬(B → C)|) = ∅. As |¬(B → C)| = |B| ∩ |¬C|, γ(|Γ| ∩ |B| ∩ |¬C|) = ∅. But then Γ, B, ¬C ⊩ ⊥ and so, by the induction hypothesis, Γ, B ⊩ C. So Γ ⊩ B → C. A = ¬D. Assume that Γ, ¬¬D ⊩ ⊥. Then γ(|Γ| ∩ |D|) = ∅. So γ(|Γ|) ⊆ |¬D|. But then Γ ⊩ ¬D.

(⊥I.) Assume that Γ ⊩ A & ¬A. By Corollary 2, γ(|Γ|) ⊆ |A & ¬A|. But |A & ¬A| = ∅, so γ(|Γ|) = ∅ and so Γ ⊩ ⊥.

(⊥E.) Assume that Γ ⊩ ⊥. Then γ(|Γ|) = ∅. Prove by induction over A that Γ ⊩ A. The cases when A = p and A = ¬B are trivial. A = B & C. By the induction hypothesis Γ ⊩ B and Γ ⊩ C, so Γ ⊩ B & C. A = B → C. As Γ ⊩ ⊥, Γ, B ⊩ ⊥. By the induction hypothesis Γ, B ⊩ C. So Γ ⊩ B → C.

Corollary 3
Reflexivity Γ, A ⊩ A.
R-Cut If Γ ⊩ A and Γ, A ⊩ B, then Γ ⊩ B, provided that B is categorical.
R-Cautious Monotony If Γ ⊩ A and Γ ⊩ B, then Γ, A ⊩ B, provided that B is categorical.

Proof Reflexivity. An immediate consequence of Lemma 6. R-Cut and R-Cautious Monotony. Assume that Γ ⊩ A. By Corollary 2, γ(|Γ|) ⊆ |A|. Thus γ(|Γ|) = γ(|Γ| ∩ |A|). By Lemma 3, Γ, A ⊩ B if and only if Γ ⊩ B, as long as B contains no conditional.

5.6 Proof of Theorem 8

Given a conditional inference relation |∼ we need to construct a preferential model (U, I, γ) such that |∼ = ⊩. Let |∼ be a conditional inference relation. Let U be the set of all maximal classically consistent sets of sentences of L. Define:
I(p) = {u ∈ U | p ∈ u}
γ(|A|) = {u ∈ U | for all B: if A |∼ B and B is categorical, then B ∈ u}.


Note (i) that for any u ∈ U and sentence A: u ∈ |A| if and only if A ∈ u. Note (ii) that γ is well-defined. For if |A| = |C|, then A and C are classically equivalent. By Theorem 2, A |∼ B if and only if C |∼ B. Let M = (U, I, γ).

Lemma 8 M is a preferential model.

Proof We need to show that γ has the requisite properties.
(1) Show γ(|A|) ⊆ |A|. Note that for any A there is a categorical A′, classically equivalent to A, with |A| = |A′|. By Reflexivity A′ |∼ A′. From Theorem 2, A |∼ A′. Thus for every u ∈ γ(|A|), A′ ∈ u, i.e. u ∈ |A′| and so u ∈ |A|.
(2) Show that if γ(|A|) ⊆ |B|, then γ(|A & B|) = γ(|A|). Assume that γ(|A|) ⊆ |B|. First show that A |∼ B. Assume for reductio that A |∼ B does not hold. Let Δ = {C | A |∼ C and C is categorical}. Assume that Δ classically entails B. Then there are C1, . . . , Cn ∈ Δ such that C1 & · · · & Cn classically entails B. By Supra-classicality, C1 & · · · & Cn |∼ B. As A |∼ Ci, 1 ≤ i ≤ n, A |∼ C1 & · · · & Cn. By R-Cut, A |∼ B, contrary to assumption. Thus Δ does not classically entail B. But then Δ ∪ {¬B} can be extended to a maximal classically consistent set u ∈ U. As Δ ⊆ u, u ∈ γ(|A|), and as ¬B ∈ u, u ∉ |B|, contrary to assumption. So A |∼ B. By R-Cut and R-Cautious Monotony, A |∼ C if and only if A, B |∼ C, for any categorical C, and as, by Theorem 2, A, B |∼ C if and only if A & B |∼ C, we thus have γ(|A|) = γ(|A & B|).
(3) Show that γ(|A ∨ B|) ⊆ γ(|A|) ∪ γ(|B|). Assume that u ∉ γ(|A|) ∪ γ(|B|). Then there are categorical C and D such that A |∼ C and B |∼ D and C ∉ u, D ∉ u. By ∨Acc, A |∼ C ∨ D and B |∼ C ∨ D. By ∨E, A ∨ B |∼ C ∨ D. As C ∉ u and D ∉ u, C ∨ D ∉ u. So u ∉ γ(|A ∨ B|).

Lemma 9 Where ⊩ is the acceptance relation defined by the model M: for all A and B, A ⊩ B if and only if A |∼ B.

Proof By induction over the length of B. B = p. p is categorical, so A |∼ p if and only if γ(|A|) ⊆ |p| = I(p) if and only if A ⊩ p. B = ¬C. ¬C is categorical, so A |∼ ¬C if and only if γ(|A|) ⊆ |¬C| = U \ |C| if and only if A ⊩ ¬C. B = C & D. A ⊩ C & D if and only if A ⊩ C and A ⊩ D if and only if (by the induction hypothesis) A |∼ C and A |∼ D if and only if A |∼ C & D. B = C → D. A ⊩ C → D if and only if A, C ⊩ D if and only if (by the induction hypothesis) A, C |∼ D if and only if A |∼ C → D.
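Finally, the three requirements on γ that the proofs above rely on (γ(X) ⊆ X; if γ(X) ⊆ Y then γ(X) = γ(X ∩ Y); γ(X ∪ Y) ⊆ γ(X) ∪ γ(Y)) are easy to check mechanically for simple selection functions. The Python sketch below is an illustration of mine, not material from the paper: it uses a ranked selection function over six hypothetical worlds and verifies the requirements by brute force.

from itertools import chain, combinations

WORLDS = range(6)
rank = {0: 0, 1: 0, 2: 1, 3: 1, 4: 2, 5: 3}  # hypothetical preference ranking

def gamma(xs):
    # Select the most preferred (lowest-ranked) worlds in xs.
    xs = set(xs)
    if not xs:
        return set()
    best = min(rank[w] for w in xs)
    return {w for w in xs if rank[w] == best}

def subsets(universe):
    items = list(universe)
    return (set(c) for c in chain.from_iterable(
        combinations(items, n) for n in range(len(items) + 1)))

for x in subsets(WORLDS):
    assert gamma(x) <= x                            # requirement 1: gamma(X) is a subset of X
    for y in subsets(WORLDS):
        if gamma(x) <= y:
            assert gamma(x) == gamma(x & y)         # requirement 2
        assert gamma(x | y) <= gamma(x) | gamma(y)  # requirement 3

print("requirements 1-3 hold for this ranked selection function")

A ranked ordering is only one simple way of satisfying the requirements and is used here merely because it is the easiest case to test; the preferential models discussed in the paper need only satisfy the requirements themselves.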


