The logic of indicative conditionals remains the topic of deep and intractable philosophical disagreement. I show that two influential epistemic norms—the Lockean theory of belief and the Ramsey test for conditional belief—are jointly sufficient to ground a powerful new argument for a particular conception of the logic of indicative conditionals. Specifically, the argument demonstrates, contrary to the received historical narrative, that there is a real sense in which Stalnaker’s semantics for the indicative did succeed in capturing the logic of the Ramseyan indicative conditional.
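The two norms at issue admit a minimal probabilistic sketch. Here the Lockean thesis is read as "believe A iff P(A) ≥ t" and the Ramsey test as "believe if-A-then-B iff P(B|A) ≥ t"; the threshold, toy distribution, and function names below are illustrative assumptions, not the paper's own formalism.

```python
# Minimal sketch of the two norms: Lockean belief and Ramsey-test
# conditional belief over a toy finite probability space. The threshold
# t and the distribution are illustrative assumptions.

def prob(event, dist):
    """Probability of an event (a set of worlds) under dist."""
    return sum(p for w, p in dist.items() if w in event)

def lockean_believes(event, dist, t=0.9):
    """Lockean thesis: believe A iff P(A) >= t."""
    return prob(event, dist) >= t

def ramsey_believes(antecedent, consequent, dist, t=0.9):
    """Probabilistic Ramsey test: believe 'if A, B' iff P(B | A) >= t."""
    pa = prob(antecedent, dist)
    if pa == 0:
        return True  # vacuous case, by stipulation in this sketch
    return prob(antecedent & consequent, dist) / pa >= t

# Four worlds, named by which of A, B hold there.
dist = {"ab": 0.5, "a": 0.05, "b": 0.4, "": 0.05}
A = {"ab", "a"}
B = {"ab", "b"}
```

On this toy space the agent Lockean-believes B outright but fails to Lockean-believe A, while still accepting "if A, B" via the Ramsey test.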
In this article, I outline a logic of design of a system as a specific kind of conceptual logic of the design of the model of a system, that is, the blueprint that provides information about the system to be created. In section two, I introduce the method of levels of abstraction as a modelling tool borrowed from computer science. In section three, I use this method to clarify two main conceptual logics of information inherited from modernity: Kant’s transcendental logic of conditions of possibility of a system, and Hegel’s dialectical logic of conditions of in/stability of a system. Both conceptual logics of information analyse structural properties of given systems. Strictly speaking, neither is a conceptual logic of information about the conditions of feasibility of a system, that is, neither is a logic of information as a logic of design. So, in section four, I outline this third conceptual logic of information and then interpret the conceptual logic of design as a logic of requirements, by introducing the relation of “sufficientisation”. In the conclusion, I argue that the logic of requirements is exactly what we need in order to make sense of, and buttress, a constructionist approach to knowledge.
A uniform theory of conditionals is one which compositionally captures the behavior of both indicative and subjunctive conditionals without positing ambiguities. This paper raises new problems for the closest thing to a uniform analysis in the literature (Stalnaker, Philosophia, 5, 269–286 (1975)) and develops a new theory which solves them. I also show that this new analysis provides an improved treatment of three phenomena (the import-export equivalence, reverse Sobel-sequences and disjunctive antecedents). While these results concern central issues in the study of conditionals, broader themes in the philosophy of language and formal semantics are also engaged here. This new analysis exploits a dynamic conception of meaning where the meaning of a symbol is its potential to change an agent’s mental state (or the state of a conversation) rather than being the symbol’s content (e.g. the proposition it expresses). The analysis of conditionals is also built on the idea that the contrast between subjunctive and indicative conditionals parallels a contrast between revising and consistently extending some body of information.
Recent work in formal semantics suggests that the language system includes not only a structure building device, as standardly assumed, but also a natural deductive system which can determine when expressions have trivial truth-conditions (e.g., are logically true/false) and mark them as unacceptable. This hypothesis, called the `logicality of language', accounts for many acceptability patterns, including systematic restrictions on the distribution of quantifiers. To deal with apparent counter-examples consisting of acceptable tautologies and contradictions, the logicality of language is often paired with an additional assumption according to which logical forms are radically underspecified: i.e., the language system can see functional terms but is `blind' to open class terms to the extent that different tokens of the same term are treated as if independent. This conception of logical form has profound implications: it suggests an extreme version of the modularity of language, and can only be paired with non-classical---indeed quite exotic---kinds of deductive systems. The aim of this paper is to show that we can pair the logicality of language with a different and ultimately more traditional account of logical form. This framework accounts for the basic acceptability patterns which motivated the logicality of language, can explain why some tautologies and contradictions are acceptable, and makes better predictions in key cases. As a result, we can pursue versions of the logicality of language in frameworks compatible with the view that the language system is not radically modular vis-a-vis its open class terms and employs a deductive system that is basically classical.
Conditional excluded middle (CEM) is the following principle of counterfactual logic: either, if it were the case that φ, it would be the case that ψ, or, if it were the case that φ, it would be the case that not-ψ. I will first show that CEM entails the identity of indiscernibles, the falsity of physicalism, and the failure of the modal to supervene on the categorical and of the vague to supervene on the precise. I will then argue that we should accept these startling conclusions, since CEM is valid.
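Stated schematically, with the counterfactual conditional written as the usual boxarrow, CEM is:

```latex
\mathrm{CEM}:\quad (\varphi \mathrel{\square\!\!\rightarrow} \psi) \;\lor\; (\varphi \mathrel{\square\!\!\rightarrow} \lnot\psi)
```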
Priest has provided a simple tableau calculus for Chellas's conditional logic Ck. We provide rules which, when added to Priest's system, result in tableau calculi for Chellas's CK and Lewis's VC. Completeness of these tableaux, however, relies on the cut rule.
According to one tradition, uttering an indicative conditional involves performing a special sort of speech act: a conditional assertion. We introduce a formal framework that models this speech act. Using this framework, we show that any theory of conditional assertion validates several inferences in the logic of conditionals, including the False Antecedent inference. Next, we determine the space of truth-conditional semantics for conditionals consistent with conditional assertion. The truth value of any such conditional is settled whenever the antecedent is false, and whenever the antecedent is true and the consequent is false. Then, we consider the space of dynamic meanings consistent with the theory of conditional assertion. We develop a new family of dynamic conditional-assertion operators that combine a traditional test operator with an update operation.
This paper starts by indicating the analysis of Hempel's conditions of adequacy for any relation of confirmation (Hempel, 1945) as presented in Huber (submitted). There I argue contra Carnap (1962, Section 87) that Hempel felt the need for two concepts of confirmation: one aiming at plausible theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. The main part of the paper consists in working out the claim that one can have Hempel's cake and eat it too - in the sense that there is a logic of theory assessment that takes into account both of the two conflicting aspects of plausibility and informativeness. According to the semantics of this logic, a is an acceptable theory for evidence β if and only if a is both sufficiently plausible given β and sufficiently informative about β. This is spelt out in terms of ranking functions (Spohn, 1988) and shown to represent the syntactically specified notion of an assessment relation. The paper then compares these acceptability relations to explanatory and confirmatory consequence relations (Flach, 2000) as well as to nonmonotonic consequence relations (Kraus et al., 1990). It concludes by relating the plausibility-informativeness approach to Carnap's positive relevance account, thereby shedding new light on Carnap's analysis as well as solving another problem of confirmation theory.
How can we say no less and no more about the conditional than is needed? From a logical analysis of necessary and sufficient conditions (Section 1), we argue that a stronger account of the conditional can be obtained in two steps: firstly, by recalling its historical roots in modal logic and set theory (Section 2); secondly, by revising the meaning of logical values, thereby getting rid of the paradoxes of material implication whilst showing the bivalent roots of the conditional as a speech-act based on affirmations and rejections (Section 3). Finally, the two main inference rules for the conditional, viz. Modus Ponens and Modus Tollens, are reassessed through a broader definition of logical consequence that encompasses both a normal relation of truth propagation and a weaker relation of falsity non-propagation from premises to conclusion (Section 3).
Causal models provide a framework for making counterfactual predictions, making them useful for evaluating the truth conditions of counterfactual sentences. However, current causal models for counterfactual semantics face limitations compared to the alternative similarity-based approach: they only apply to a limited subset of counterfactuals and the connection to counterfactual logic is not straightforward. This paper argues that these limitations arise from the theory of interventions where intervening on variables requires changing structural equations rather than the values of variables. Using an alternative theory of exogenous interventions, this paper extends the causal approach to counterfactuals to handle more complex counterfactuals, including backtracking counterfactuals and those with logically complex antecedents. The theory also validates familiar principles of counterfactual logic and offers an explanation for counterfactual disagreement and backtracking readings of forward counterfactuals.
The paper presents a new analysis of Hempel’s conditions of adequacy, differing from the one in Carnap. Hempel, so it is argued, felt the need for two concepts of confirmation: one aiming at true theories, and another aiming at informative theories. However, so the analysis continues, he also realized that these two concepts were conflicting, and so he gave up the concept of confirmation aiming at informative theories. It is then shown that one can have the cake and eat it: There is a logic of confirmation that accounts for both of these two conflicting aspects.
This paper presents a new analysis of C.G. Hempel’s conditions of adequacy for any relation of confirmation [Hempel C. G. (1945). Aspects of scientific explanation and other essays in the philosophy of science. New York: The Free Press, pp. 3–51.], differing from the one Carnap gave in §87 of his [1962. Logical foundations of probability (2nd ed.). Chicago: University of Chicago Press.]. Hempel, it is argued, felt the need for two concepts of confirmation: one aiming at true hypotheses and another aiming at informative hypotheses. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative hypotheses. I then show that one can have Hempel’s cake and eat it too. There is a logic that takes into account both of these two conflicting aspects. According to this logic, a sentence H is an acceptable hypothesis for evidence E if and only if H is both sufficiently plausible given E and sufficiently informative about E. Finally, the logic sheds new light on Carnap’s analysis.
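The acceptability clause above can be mimicked with a crude probabilistic proxy. This is only a sketch: the paper's semantics is ranking-theoretic, whereas here plausibility is read as P(H|E), informativeness about E as 1 − P(H|¬E), and the thresholds are chosen arbitrarily.

```python
# Crude probabilistic proxy for: H is acceptable for E iff H is
# sufficiently plausible given E AND sufficiently informative about E.
# Measures and thresholds are illustrative assumptions, not the
# ranking-theoretic definitions used in the paper.

def prob(event, dist):
    return sum(p for w, p in dist.items() if w in event)

def cond_prob(h, e, dist):
    pe = prob(e, dist)
    return prob(h & e, dist) / pe if pe else 0.0

def acceptable(h, e, worlds, dist, t_plaus=0.7, t_inf=0.5):
    plausibility = cond_prob(h, e, dist)                  # P(H | E)
    informativeness = 1 - cond_prob(h, worlds - e, dist)  # 1 - P(H | not-E)
    return plausibility >= t_plaus and informativeness >= t_inf

worlds = {"he", "h", "e", "o"}   # 'he': H and E both true, 'h': H only, etc.
dist = {"he": 0.4, "h": 0.1, "e": 0.1, "o": 0.4}
H = {"he", "h"}
E = {"he", "e"}
```

A tautology (H = worlds) is maximally plausible but minimally informative, so it comes out unacceptable, which is the intended effect of the second conjunct.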
Darwiche and Pearl’s seminal 1997 article outlined a number of baseline principles for a logic of iterated belief revision. These principles, the DP postulates, have been supplemented in a number of alternative ways. Most suggestions have resulted in a form of ‘reductionism’ that identifies belief states with orderings of worlds. However, this position has recently been criticised as being unacceptably strong. Other proposals, such as the popular principle (P), aka ‘Independence’, characteristic of ‘admissible’ operators, remain commendably more modest. In this paper, we supplement the DP postulates and (P) with a number of novel conditions. While the DP postulates constrain the relation between a prior and a posterior conditional belief set, our new principles notably govern the relation between two posterior conditional belief sets obtained from a common prior by different revisions. We show that operators from the resulting family, which subsumes both lexicographic and restrained revision, can be represented as relating belief states associated with a ‘proper ordinal interval’ (POI) assignment, a structure more fine-grained than a simple ordering of worlds. We close the paper by noting that these operators satisfy iterated versions of many AGM era postulates, including Superexpansion, that are not sound for admissible operators in general.
What follows is a summary of basic principles pertaining to the definitions used in constructing an ontology. A definition is a statement of necessary and sufficient conditions. What this means in the simplest case can be understood as follows. To say that ɸ‐ing is a necessary condition for being an A is just another way of saying that every A ɸ’s; to say that ɸ‐ing is a sufficient condition for being an A is just another way of saying that everything that ɸ’s is an A. The goal in writing a definition is to specify a set of conditions of this sort which are all necessary, and which are jointly sufficient.
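Extensionally, the schema just stated amounts to two set inclusions; a minimal sketch, where the example classes are illustrative assumptions:

```python
# Extensional rendering of the schema above: phi-ing is necessary for
# being an A iff every A phi's (A is a subset of Phi); phi-ing is
# sufficient iff everything that phi's is an A (Phi is a subset of A).
# A good definition requires both, i.e. the two classes coincide.

def is_necessary(A, Phi):
    return A <= Phi   # every A phi's

def is_sufficient(A, Phi):
    return Phi <= A   # everything that phi's is an A

def defines(A, Phi):
    return is_necessary(A, Phi) and is_sufficient(A, Phi)

squares = {"s1", "s2"}
rectangles = squares | {"r1"}   # being a rectangle: necessary, not sufficient
```

Here `is_necessary(squares, rectangles)` holds but `defines(squares, rectangles)` fails, mirroring the point that necessity alone does not yield a definition.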
The program put forward in von Wright's last works defines deontic logic as ``a study of conditions which must be satisfied in rational norm-giving activity'' and thus introduces the perspective of logical pragmatics. In this paper a formal explication of von Wright's program is proposed within the framework of the set-theoretic approach and extended to a two-sets model which allows for the separate treatment of obligation-norms and permission-norms. Three translation functions connecting the language of deontic logic with the language of the extended set-theoretic approach are introduced, and used in proving the correspondence between the deontic theorems, on one side, and the perfection properties of the norm-set and the ``counter-set'', on the other. In this way the possibility of reinterpreting standard deontic logic as the theory of perfection properties that ought to be achieved in norm-giving activity is formally proved. The extended set-theoretic approach is applied to the problem of the rationality of principles of completion of normative systems. The paper concludes with a plea for the logical-pragmatics turn envisaged in the late phase of von Wright's work in deontic logic.
In this paper we consider conditional random quantities (c.r.q.’s) in the setting of coherence. Based on a betting scheme, a c.r.q. X|H is not looked at as a restriction but, in a more extended way, as \({XH + \mathbb{P}(X|H)H^c}\); in particular, (the indicator of) a conditional event E|H is looked at as EH + P(E|H)H^c. This extended notion of c.r.q. allows algebraic developments among c.r.q.’s even if the conditioning events are different; then, for instance, we can give a meaning to the sum X|H + Y|K and we can define the iterated c.r.q. (X|H)|K. We analyze the conjunction of two conditional events, introduced by the authors in a recent work, in the setting of coherence. We show that the conjoined conditional is a conditional random quantity, which may be a conditional event when there are logical dependencies. Moreover, we introduce the negation of the conjunction and, by applying De Morgan’s Law, we obtain the disjoined conditional. Finally, we give the lower and upper bounds for the conjunction and disjunction of two conditional events, showing that the usual probabilistic properties continue to hold.
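The extended indicator just described can be computed directly; a small numeric sketch, where the assessed value p = P(E|H) is an illustrative assumption:

```python
# Indicator of a conditional event E|H in the extended sense above:
# EH + P(E|H) * H^c. It equals 1 when E and H are both true, 0 when H
# is true and E false, and the assessed value p when H is false.

def conditional_indicator(e, h, p):
    """e, h are 0/1 truth values of E and H; p is the assessed P(E|H)."""
    return e * h + p * (1 - h)

p = 0.7
values = [conditional_indicator(e, h, p)
          for (e, h) in [(1, 1), (0, 1), (1, 0)]]
# values -> [1, 0, 0.7]: indicator of E when H is true, p when H is false
```

This three-valued object is what allows sums like X|H + Y|K to make sense even when the conditioning events differ.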
According to the truth-functional analysis of conditions, to be ‘necessary for’ and ‘sufficient for’ are converse relations. From this, it follows that to be ‘necessary and sufficient for’ is a symmetric relation, that is, that if P is a necessary and sufficient condition for Q, then Q is a necessary and sufficient condition for P. This view is contrary to common sense. In this paper, I point out that it is also contrary to a widely accepted ontological view of conditions, according to which if P is a necessary and sufficient condition for Q, then Q is in no sense a condition for P; it is a mere consequence of P.
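The truth-functional claim in question can be verified by brute enumeration; a short sketch:

```python
# On the truth-functional analysis, 'P is sufficient for Q' is P -> Q
# and 'P is necessary for Q' is Q -> P; so 'necessary and sufficient'
# is the biconditional, which is symmetric at every valuation.

from itertools import product

def implies(p, q):
    return (not p) or q

def nec_and_suf(p, q):
    """P is necessary and sufficient for Q, truth-functionally."""
    return implies(p, q) and implies(q, p)

symmetric = all(nec_and_suf(p, q) == nec_and_suf(q, p)
                for p, q in product([True, False], repeat=2))
```

That `symmetric` comes out True is exactly the consequence the paper finds contrary to common sense and to the ontological view of conditions.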
The Logic of Causation: Definition, Induction and Deduction of Deterministic Causality is a treatise of formal logic and of aetiology. It is an original and wide-ranging investigation of the definition of causation (deterministic causality) in all its forms, and of the deduction and induction of such forms. The work was carried out in three phases over a dozen years (1998-2010), each phase introducing more sophisticated methods than the previous to solve outstanding problems. This study was intended as part of a larger work on causal logic, which additionally treats volition and allied cause-effect relations (2004). The Logic of Causation deals with the main technicalities relating to reasoning about causation. Once all the deductive characteristics of causation in all its forms have been treated, and we have gained an understanding as to how it is induced, we are able to discuss more intelligently its epistemological and ontological status. In this context, past theories of causation are reviewed and evaluated (although some of the issues involved here can only be fully dealt with in a larger perspective, taking volition and other aspects of causality into consideration, as done in Volition and Allied Causal Concepts). Phase I: Macroanalysis. Starting with the paradigm of causation, its most obvious and strongest form, we can by abstraction of its defining components distinguish four genera of causation, or generic determinations, namely: complete, partial, necessary and contingent causation. When these genera and their negations are combined together in every which way, and tested for consistency, it is found that only four species of causation, or specific determinations, remain conceivable.
The concept of causation thus gives rise to a number of positive and negative propositional forms, which can be studied in detail with relative ease because they are compounds of conjunctive and conditional propositions whose properties are already well known to logicians. The logical relations (oppositions) between the various determinations (and their negations) are investigated, as well as their respective implications (eductions). Thereafter, their interactions (in syllogistic reasoning) are treated in the most rigorous manner. The main question we try to answer here is: is (or when is) the cause of a cause of something itself a cause of that thing, and if so to what degree? The figures and moods of positive causative syllogism are listed exhaustively; and the resulting arguments validated or invalidated, as the case may be. In this context, a general and sure method of evaluation called ‘matricial analysis’ (macroanalysis) is introduced. Because this (initial) method is cumbersome, it is used as little as possible – the remaining cases being evaluated by means of reduction. Phase II: Microanalysis. Seeing various difficulties encountered in the first phase, and the fact that some issues were left unresolved in it, a more precise method is developed in the second phase, capable of systematically answering most outstanding questions. This improved matricial analysis (microanalysis) is based on tabular prediction of all logically conceivable combinations and permutations of conjunctions between two or more items and their negations (grand matrices). Each such possible combination is called a ‘modus’ and is assigned a permanent number within the framework concerned (for 2, 3, or more items). This allows us to identify each distinct (causative or other, positive or negative) propositional form with a number of alternative moduses. 
This technique greatly facilitates all work with causative and related forms, allowing us to systematically consider their eductions, oppositions, and syllogistic combinations. In fact, it constitutes a most radical approach not only to causative propositions and their derivatives, but perhaps more importantly to their constituent conditional propositions. Moreover, it is not limited to logical conditioning and causation, but is equally applicable to other modes of modality, including extensional, natural, temporal and spatial conditioning and causation. From the results obtained, we are able to settle with formal certainty most of the historically controversial issues relating to causation. Phase III: Software Assisted Analysis. The approach in the second phase was very ‘manual’ and time consuming; the third phase is intended to ‘mechanize’ much of the work involved by means of spreadsheets (to begin with). This increases reliability of calculations (though no errors were found, in fact) – but also allows for a wider scope. Indeed, we are now able to produce a larger, 4-item grand matrix, and on its basis find the moduses of causative and other forms needed to investigate 4-item syllogism. As well, now each modus can be interpreted with greater precision and causation can be more precisely defined and treated. In this latest phase, the research is brought to a successful finish! Its main ambition, to obtain a complete and reliable listing of all 3-item and 4-item causative syllogisms, being truly fulfilled. This was made technically feasible, in spite of limitations in computer software and hardware, by cutting up problems into smaller pieces. For every mood of the syllogism, it was thus possible to scan for conclusions ‘mechanically’ (using spreadsheets), testing all forms of causative and preventive conclusions. Until now, this job could only be done ‘manually’, and therefore not exhaustively and with certainty. 
It took over 72’000 pages of spreadsheets to generate the sought-for conclusions. This is a historic breakthrough for causal logic and logic in general. Of course, not all conceivable issues are resolved. There is still some work that needs doing, notably with regard to 5-item causative syllogism. But what has been achieved solves the core problem. The method for the resolution of all outstanding issues has definitely now been found and proven. The only obstacle to solving most of them is the amount of labor needed to produce the remaining (less important) tables. As for 5-item syllogism, bigger computer resources are also needed.
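The grand-matrix and modus machinery described above can be illustrated for the simplest, 2-item case. The encoding below (0 = impossible, 1 = possible, for each of the four conjunctions of the items and their negations) is an illustrative reconstruction, not the book's own numbering scheme:

```python
# 2-item grand matrix: the four conjunctions of P, Q and their
# negations; a 'modus' is one assignment of possible (1) / impossible
# (0) to each row, giving 2**4 = 16 moduses. A propositional form is
# then identified with the set of moduses on which it holds.

from itertools import product

ROWS = ("P.Q", "P.notQ", "notP.Q", "notP.notQ")

def grand_matrix(n_rows=4):
    return list(product((0, 1), repeat=n_rows))

moduses = grand_matrix()

# Example: 'P implies Q' rules out the possibility of P-with-not-Q.
p_implies_q = [m for m in moduses if m[ROWS.index("P.notQ")] == 0]
```

Evaluating a causative syllogism then reduces to intersecting the modus-sets of its premises and checking which conclusion forms survive, which is the operation the spreadsheets mechanize at 3-item and 4-item scale.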
Starting from a recent paper by S. Kaufmann, we introduce a notion of conjunction of two conditional events and then we analyze it in the setting of coherence. We give a representation of the conjoined conditional and we show that this new object is a conditional random quantity, whose set of possible values normally contains the probabilities assessed for the two conditional events. We examine some cases of logical dependencies, where the conjunction is a conditional event; moreover, we give the lower and upper bounds on the conjunction. We also examine an apparent paradox concerning stochastic independence which can actually be explained in terms of uncorrelation. We briefly introduce the notions of disjunction and iterated conditioning and we show that the usual probabilistic properties still hold.
This paper discusses an almost sixty year old problem in the philosophy of science -- that of a logic of confirmation. We present a new analysis of Carl G. Hempel's conditions of adequacy (Hempel 1945), differing from the one Carnap gave in §87 of his Logical Foundations of Probability (1962). Hempel, it is argued, felt the need for two concepts of confirmation: one aiming at true theories and another aiming at informative theories. However, he also realized that these two concepts are conflicting, and he gave up the concept of confirmation aiming at informative theories. We then show that one can have Hempel's cake and eat it, too: There is a (rank-theoretic and genuinely nonmonotonic) logic of confirmation -- or rather, theory assessment -- that takes into account both of these two conflicting aspects. According to this logic, a statement H is an acceptable theory for the data E if and only if H is both sufficiently plausible given E and sufficiently informative about E. Finally, the logic sheds new light on Carnap's analysis (and solves another problem of confirmation theory).
We generalize, by a progressive procedure, the notions of conjunction and disjunction of two conditional events to the case of n conditional events. In our coherence-based approach, conjunctions and disjunctions are suitable conditional random quantities. We define the notion of negation, by verifying De Morgan’s Laws. We also show that conjunction and disjunction satisfy the associative and commutative properties, and a monotonicity property. Then, we give some results on coherence of prevision assessments for some families of compounded conditionals; in particular we examine the Fréchet-Hoeffding bounds. Moreover, we study the reverse probabilistic inference from the conjunction C_{n+1} of n+1 conditional events to the family {C_n, E_{n+1}|H_{n+1}}. We consider the relation with the notion of quasi-conjunction and we examine in detail the coherence of the prevision assessments related to the conjunction of three conditional events. Based on conjunction, we also give a characterization of p-consistency and of p-entailment, with applications to several inference rules in probabilistic nonmonotonic reasoning. Finally, we examine some non-p-valid inference rules; we then illustrate, by an example, two methods which allow one to suitably modify non-p-valid inference rules so as to obtain inferences which are p-valid.
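The Fréchet-Hoeffding bounds mentioned above have, in their standard form, a one-line computation. The sketch below states them for assessed probabilities x1, ..., xn and leaves aside the coherence machinery of the paper:

```python
# Standard Fréchet-Hoeffding bounds: for probabilities x1, ..., xn of
# n events, any coherent value z for their conjunction must satisfy
#   max(x1 + ... + xn - (n - 1), 0) <= z <= min(x1, ..., xn).

def frechet_hoeffding_bounds(xs):
    lower = max(sum(xs) - (len(xs) - 1), 0.0)
    upper = min(xs)
    return lower, upper

lo, hi = frechet_hoeffding_bounds([0.9, 0.8, 0.7])   # lower ~ 0.4, upper 0.7
```

When the assessed probabilities are low enough, the lower bound collapses to 0, so the conjunction's prevision is constrained only from above.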
I evaluate two claims: that (a) Jesus’ message as recorded in the gospels implies exclusivism with respect to salvation and that, correspondingly, (b) Christians should be exclusivists with respect to salvation. I evaluate these claims through a cataloguing and evaluation of the logical conditions involved in each of the claims regarding conditions for salvation made by Jesus in the Gospel of John. As a result, I argue that (a) is false and that, correspondingly, so is (b).
We argue that distinct conditionals—conditionals that are governed by different logics—are needed to formalize the rules of Truth Introduction and Truth Elimination. We show that revision theory, when enriched with the new conditionals, yields an attractive theory of truth. We go on to compare this theory with one recently proposed by Hartry Field.
Future Logic is an original and wide-ranging treatise of formal logic. It deals with the deduction and induction of categorical and conditional propositions, involving the natural, temporal, extensional, and logical modalities. Traditional and Modern logic have covered in detail only formal deduction from actual categoricals, or from logical conditionals (conjunctives, hypotheticals, and disjunctives). Deduction from modal categoricals has also been considered, though very vaguely and roughly; whereas deduction from natural, temporal and extensional forms of conditioning has been all but totally ignored. As for induction, apart from the elucidation of adductive processes (the scientific method), almost no formal work has been done. This is the first work ever to strictly formalize the inductive processes of generalization and particularization, through the novel methods of factorial analysis, factor selection and formula revision. It is also the first work ever to develop a formal logic of the natural, temporal and extensional types of conditioning (as distinct from logical conditioning), including their production from modal categorical premises. Future Logic contains a great many other new discoveries, organized into a unified, consistent and empirical system, with precise definitions of the various categories and types of modality (including logical modality), and full awareness of the epistemological and ontological issues involved. Though strictly formal, it uses ordinary language wherever symbols can be avoided. Among its other contributions: a full list of the valid modal syllogisms (which is more restrictive than previous lists); the main formalities of the logic of change (which introduces a dynamic instead of merely static approach to classification); the first formal definitions of the modal types of causality; a new theory of class logic, free of the Russell Paradox; as well as a critical review of modern metalogic.
But it is impossible to list briefly all the innovations in logical science — and therefore in epistemology and ontology — that this book presents; it has to be read for its scope to be appreciated.
It is customarily assumed that propositional attitudes present two independent components: a propositional component and a psychological component, in the form of an attitude. These two components are captured by means of two different methods: propositions by some model-theoretic theory, psychological attitudes by appeal to their functional or psychological role. Some authors have sought a convergence by individuating propositions by functional role semantics. In this paper I show that, when it comes to emotional attitudes with propositional content, either the independence of proposition and attitude collapses or functional role semantics leads to unstable individuation conditions for propositions. Some consequences of these two outcomes are considered.
In this paper we distinguish between various kinds of doxastic theories. One distinction is between informal and formal doxastic theories. AGM-type theories of belief change are of the former kind, while Hintikka’s logic of knowledge and belief is of the latter. Then we distinguish between static theories that study the unchanging beliefs of a certain agent and dynamic theories that investigate not only the constraints that can reasonably be imposed on the doxastic states of a rational agent but also rationality constraints on the changes of doxastic state that may occur in such agents. An additional distinction is that between non-introspective theories and introspective ones. Non-introspective theories investigate agents that have opinions about the external world but no higher-order opinions about their own doxastic states. Standard AGM-type theories as well as the currently existing versions of Segerberg’s dynamic doxastic logic (DDL) are non-introspective. Hintikka-style doxastic logic is of course introspective but it is a static theory. Thus, the challenge remains to devise doxastic theories that are both dynamic and introspective. We outline the semantics for truly introspective dynamic doxastic logic, i.e., a dynamic doxastic logic that allows us to describe agents who have both the ability to form higher-order beliefs and to reflect upon and change their minds about their own (higher-order) beliefs. This extension of DDL demands that we give up the Preservation condition on revision. We make some suggestions as to how such a non-preservative revision operation can be constructed. We also consider extending DDL with conditionals satisfying the Ramsey test and show that Gärdenfors’ well-known impossibility result applies to such a framework. Also in this case, Preservation has to be given up.
Kate Manne’s Down Girl: The Logic of Misogyny combines traditional conceptual analysis and feminist conceptual engineering with critical exploration of cases drawn from popular culture and current events in order to produce an ameliorative account of misogyny, i.e., one that will help address the problems of misogyny in the actual world. A feminist account of misogyny that is both intersectional and ameliorative must provide theoretical tools for recognizing misogyny in its many-dimensional forms, as it interacts and overlaps with other oppressions. While Manne thinks subtly about many of the material conditions that create misogyny as a set of normative social practices, she does not fully extend this care to the other intersectional forms of oppression she discusses. After touching on the book’s strengths, I track variations of its main problem, namely, its failure to fully conceive of oppressions besides sexism and misogyny as systemic patterns of social practices, as inherently structural rather than mere collections of individual beliefs and behaviors.
This paper presents a semantical analysis of the Weak Kleene Logics Kw3 and PWK from the tradition of Bochvar and Halldén. These are three-valued logics in which a formula takes the third value if at least one of its components does. The paper establishes two main results: a characterisation result for the relation of logical consequence in PWK – that is, we individuate necessary and sufficient conditions for a set.
In the *Science of Logic*, Hegel states unequivocally that the category of “life” is a strictly logical, or pure, form of thinking. His treatment of actual life – i.e., that which empirically constitutes nature – arises first in his *Philosophy of Nature*, when the logic is applied under the conditions of space and time. Nevertheless, many commentators find Hegel’s development of this category as a purely logical one especially difficult to accept. Indeed, they find this development comprehensible only if one simultaneously assumes that Hegel breaks his promise to let the logic do the leading. However, if Hegel were in fact to allow the logical development to be led by biological analogies at this point, problems would ensue. Not only would it contradict his own speculative method, which should secure the necessity of the categories, but it would also endanger the ontological generality of the category of life itself. Beyond undermining his method and the logical integrity of the category, I will argue that such a reading also makes the transition to the next category, “cognition”, unintelligible and problematic. My aim in the first part of this paper is to show how logical life can be read as a pure category. In the second part, I argue that my reconstruction makes the transition to cognition intelligible without resorting to profane or supernatural interpretations.
Aristotle presents a formal logic in the Prior Analytics in which the premises and conclusions are never conditionals. In this paper I argue that he did not simply overlook conditionals, nor does their absence reflect a metaphysical prejudice on his part. Instead, he thinks that arguments with conditionals cannot be syllogisms because of the way he understands the explanatory requirement in the definition of a syllogism: the requirement that the conclusion follow because of the premises. The key passage is Prior Analytics I.32, 47a22–40, where Aristotle considers an argument with conditionals that we would consider valid, but which he denies is a syllogism. I argue that Aristotle thinks that to meet the explanatory requirement a syllogism must draw its conclusion through the way its terms are predicated of one another. Because arguments with conditionals do not, in general, draw their conclusions through predications, he did not include them in his logic.
Logic and Existence: Deleuze on the “Conditions of the Real”. For Deleuze, one of the fundamental problems for a theory of thought is how thought can leave the sphere of the possible in order to think the real, that is, to think existence itself. The position of the real seems to lie outside the concept. Pre-Kantians such as Leibniz approached this problem through the distinction between truths of essence and truths of existence, whereas post-Kantians such as Maimon approached it through the distinction between the conditions of possible experience and the conditions of real experience. Classical logic defines the sphere of the possible by three logical principles – identity, non-contradiction, and the excluded middle – and the present study examines the three great trajectories in the history of philosophy that attempted to use one of these three classical principles to penetrate existence itself: (1) Leibniz sought to extend the principle of identity to the whole of existence; (2) Hegel sought to extend the principle of non-contradiction to the whole of experience; and (3) the group of thinkers broadly called “existentialists” sought to extend the principle of the excluded middle to the whole of existence. The conclusion examines why Deleuze was fascinated by each of these philosophical attempts to “think existence” while nonetheless holding that they all failed, and why he ultimately developed his own response to the problem by appealing to a principle of difference.
Although logical consistency is desirable in scientific research, standard statistical hypothesis tests are typically logically inconsistent. To address this issue, previous work introduced agnostic hypothesis tests and proved that they can be logically consistent while retaining statistical optimality properties. This article characterizes the credal modalities in agnostic hypothesis tests and uses the hexagon of oppositions to explain the logical relations between these modalities. Geometric solids that are composed of hexagons of oppositions illustrate the conditions for these modalities to be logically consistent. Prisms composed of hexagons of oppositions show how the credal modalities obtained from two agnostic tests vary according to their threshold values. Nested hexagons of oppositions summarize logical relations between the credal modalities in these tests and prove new relations.
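The distinctive feature of the agnostic tests described above is that they admit a third verdict besides acceptance and rejection. A minimal two-threshold sketch of that idea follows; the thresholds and the use of a p-value as the decision statistic are illustrative assumptions of mine, not the article's construction:

```python
def agnostic_verdict(p_value, reject_below=0.05, accept_above=0.95):
    """Three-valued verdict on a null hypothesis.

    Illustrative two-threshold rule: very small p-values reject H0,
    very large ones accept it, and intermediate values return the
    third, agnostic verdict instead of forcing a binary decision.
    """
    if p_value <= reject_below:
        return "reject"
    if p_value >= accept_above:
        return "accept"
    return "agnostic"
```

Varying `reject_below` and `accept_above` is the analogue of varying the threshold values that, per the abstract, move the credal modalities around the hexagon of oppositions.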
The main objective of the paper is to provide a conceptual apparatus for a general logical theory of language communication. The aim of the paper is to outline a formal-logical theory of language in which the concepts of the phenomenon of language communication and language communication in general are defined and some conditions for their adequacy are formulated. The theory explicates the key notions of contemporary syntax, semantics, and pragmatics. The theory is formalized on two levels: token-level and type-level. As such, it takes into account the dual – token and type – ontological character of linguistic entities. The basic notions of the theory – language communication, meaning, and interpretation – are introduced on the second, type-level of formalization, and they require the prior formalization of some of the notions introduced on the first, token-level, among others the notion of an act of communication. Owing to the theory, it is possible to address the problems of adequacy both of empirical acts of communication and of language communication in general. All the conditions of adequacy of communication discussed in the paper are valid for one-way communication (sender–recipient); nevertheless, they can also apply to the reverse direction of language communication (recipient–sender). Therefore, they bear on the problem of two-way understanding in language communication.
The purpose of this paper is to explore the conditions under which the post-positivist interest in rewriting or reinterpreting history could operate legitimately from an historical point of view. The first part of the paper outlines and explains some of the key thematic elements of historical post-positivism. The second proceeds to investigate how these elements can be configured and related to each other within Arthur Danto’s influential account of the development of contemporary art, and especially the avant-garde. The intention is to acquire a sense of the working dynamics of post-positivist thought, so as to better understand its possible implications for the writing of history. In the concluding section an argument is proposed to the effect that, although the post-positivist interest in rewriting history can in principle be admitted as entirely legitimate, its legitimacy depends on introducing a substantive constraint on content, in addition to the formal considerations that post-positivist discourse generally tends to favor. It is further suggested that this constraint should take the form of a requirement of historical literacy, the meaning of which is, finally, elucidated by drawing a contrast with historical common sense.
Perhaps the question “What is philosophy?” can only be posed late in life, when old age has come, and with it the time to speak in concrete terms. It is a question one poses when one no longer has anything to ask for, but its consequences can be considerable. One was asking the question before, one never ceased asking it, but it was too artificial, too abstract; one expounded and dominated the question, more than being grabbed by it. There are cases in which old age bestows not an eternal youth, but on the contrary a sovereign freedom, a pure necessity where one enjoys a moment of grace between life and death, and where all the parts of the machine combine to dispatch into the future a trait that traverses the ages: Turner, Monet, Matisse. The elderly Turner acquired or conquered the right to lead painting down a deserted path from which there was no return, and that was no longer distinguishable from a final question. In the same way, in philosophy, Kant’s Critique of Judgment is a work of old age, a wild work from which descendants will never cease to flow. We cannot lay claim to such a status. The time has simply come for us to ask what philosophy is. And we have never ceased to do this in the past, and we already had the response, which has not varied: philosophy is the art of forming, inventing, and fabricating concepts. But it was not only necessary for the response to take note of the question; it also had to determine a time, an occasion, the circumstances, the landscapes and personae, the conditions and unknowns of the question. One had to be able to pose the question “between friends” as a confidence or a trust, or else, faced with an enemy, as a challenge, and at the same time one had to reach that moment, between dog and wolf, when one mistrusts even the friend. Gilles Deleuze was professor of philosophy at the University of Paris VIII, Vincennes-St.-Denis, until his retirement in 1987.
Among his books translated into English are the two-volume Capitalism and Schizophrenia, the two-volume Cinema, The Logic of Sense, and Expressionism in Philosophy: Spinoza. Daniel W. Smith is a doctoral candidate in philosophy at the University of Chicago. He is at work on a study of the philosophy of Deleuze, and is translating Deleuze’s Francis Bacon: Logique de la sensation. Arnold I. Davidson, executive editor of Critical Inquiry, teaches philosophy at the University of Chicago and is currently Marta Sutton Weeks Fellow at the Stanford Humanities Center.
The paper defends a variant of the material implication approach to the meaning of conditional sentences against some arguments that are widely subscribed to and/or important in the philosophical, psychological, and linguistic literature. These arguments are shown to be wrong, debatable, or to miss their aim if the truth conditions defining material implication are viewed as determining nothing but the denotation of conditional sentences, and if the focus is on the function of conditional sentences in deduction (logic) rather than in inference (reasoning). It is shown that some ‘paradoxes of material implication’ are due to inconsistent premises of deductions, introduced by semantic relations between the clauses constituting the premises, a fact which does not invalidate the approach. Other ‘paradoxes’ are shown to arise because they are based on uninformative deductions, violating a basic pragmatic principle. In addition, the paper introduces the distinction between the set of possible states of a mental model of the actual world and that of alternative worlds. It is argued that material implication determines the denotation of an indicative conditional as a subset of the former set and the denotation of a subjunctive conditional as a subset of the latter set, thus unifying these two types of conditionals.
While many different mechanisms contribute to the generation of spatial order in biological development, the formation of morphogenetic fields, which in turn direct cell responses giving rise to pattern and form, is of major importance and essential for embryogenesis and regeneration. Most likely the fields represent concentration patterns of substances produced by molecular kinetics. Short-range autocatalytic activation in conjunction with longer-range “lateral” inhibition or depletion effects is capable of generating such patterns (Gierer and Meinhardt, 1972). Non-linear reactions are required, and mathematical criteria were derived to design molecular models capable of pattern generation. The classical embryological feature of proportion regulation can be incorporated into the models. The conditions are mathematically necessary for the simplest two-factor case, and are likely to be a fair approximation in multi-component systems in which activation and inhibition are systems parameters subsuming the action of several agents. Gradients, symmetric and periodic patterns, in one or two dimensions, stable or pulsing in time, can be generated on this basis. Our basic concept of autocatalysis in conjunction with lateral inhibition accounts for self-regulatory biological features, including the reproducible formation of structures from near-uniform initial conditions as required by the logic of the generation cycle. Real tissue form, for instance that of budding Hydra, may often be traced back to local curvature arising within an initially relatively flat cell sheet, the position of evagination being determined by morphogenetic fields. Shell theory developed for architecture may also be applied to such biological processes.
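The short-range activation / long-range inhibition mechanism described above can be sketched as a small simulation. This is a hedged illustration of the general Gierer–Meinhardt idea only: the grid size, time step, and rate constants below are chosen by me for readability, not taken from the 1972 paper.

```python
import random

def gierer_meinhardt_1d(n=40, steps=1000, dt=0.01,
                        da=0.02, dh=0.4, mu_a=1.0, mu_h=1.2, rho=1.0,
                        seed=0):
    """Activator-inhibitor dynamics on a 1-D row of n cells.

    a is a short-range autocatalytic activator, h a long-range
    inhibitor (dh >> da).  Explicit Euler stepping with zero-flux
    boundaries, starting from a near-uniform state with small noise,
    as in the "near-uniform initial conditions" the abstract mentions.
    """
    rng = random.Random(seed)
    a = [1.0 + 0.01 * rng.random() for _ in range(n)]
    h = [1.0] * n

    def lap(u, i):
        # discrete Laplacian with reflecting (zero-flux) ends
        return u[max(i - 1, 0)] - 2.0 * u[i] + u[min(i + 1, n - 1)]

    for _ in range(steps):
        # da/dt = rho*a^2/h - mu_a*a + da*lap(a);  dh/dt = rho*a^2 - mu_h*h + dh*lap(h)
        na = [a[i] + dt * (rho * a[i] * a[i] / h[i] - mu_a * a[i] + da * lap(a, i))
              for i in range(n)]
        nh = [h[i] + dt * (rho * a[i] * a[i] - mu_h * h[i] + dh * lap(h, i))
              for i in range(n)]
        a, h = na, nh
    return a, h
```

The autocatalytic term `rho*a*a/h` and the much larger inhibitor diffusion `dh` encode the two ingredients the abstract identifies; whether the run settles into a gradient or a periodic pattern depends on the parameter regime.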
Logical pluralism is the view that there is more than one correct logic. This very general characterization gives rise to a whole family of positions. I argue that not all of them are stable. The main argument in the paper is inspired by considerations known as the “collapse problem”, and it aims at the most popular form of logical pluralism, advocated by JC Beall and Greg Restall. I argue that there is a more general argument available that challenges all variants of logical pluralism that meet the following three conditions: that there are at least two correct logical systems characterized in terms of different consequence relations, that there is some sort of rivalry among the correct logics, and that logical consequence is normative. The hypothesis I argue for amounts to a conditional claim: if a position satisfies all these conditions, then that position is unstable in the sense that it collapses into competing positions.
We propose a new account of indicative conditionals, giving acceptability and logical closure conditions for them. We start from Adams’ Thesis: the claim that the acceptability of a simple indicative equals the corresponding conditional probability. The Thesis is widely endorsed, but arguably false and refuted by empirical research. To fix it, we submit, we need a relevance constraint: we accept a simple conditional 'If φ, then ψ' to the extent that (i) the conditional probability p(ψ|φ) is high, provided that (ii) φ is relevant for ψ. How (i) should work is well-understood. It is (ii) that holds the key to improve our understanding of conditionals. Our account has (i) a probabilistic component, using Popper functions; (ii) a relevance component, given via an algebraic structure of topics or subject matters. We present a probabilistic logic for simple indicatives, and argue that its (in)validities are both theoretically desirable and in line with empirical results on how people reason with conditionals.
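The two-part acceptability condition above, high conditional probability plus relevance, can be sketched in a few lines. The finite equiprobable-worlds model and the set-intersection stand-in for the article's algebraic topic structure are simplifying assumptions of mine, not the authors' formal apparatus (which uses Popper functions):

```python
from fractions import Fraction

def conditional_probability(worlds, phi, psi):
    """p(psi | phi) over a finite set of equiprobable worlds."""
    phi_worlds = [w for w in worlds if phi(w)]
    if not phi_worlds:
        return None  # Popper functions handle the p(phi)=0 case; we abstain
    return Fraction(sum(1 for w in phi_worlds if psi(w)), len(phi_worlds))

def acceptable(worlds, phi, psi, topic_phi, topic_psi,
               threshold=Fraction(9, 10)):
    """Accept 'If phi, then psi' iff (i) p(psi|phi) is high AND
    (ii) the topics of phi and psi overlap (crude relevance check)."""
    p = conditional_probability(worlds, phi, psi)
    relevant = bool(topic_phi & topic_psi)
    return p is not None and p >= threshold and relevant
```

On this sketch, a conditional with high p(ψ|φ) but topically disjoint antecedent and consequent ("If it rains, then 2+2=4") fails clause (ii) and is rejected, which is the behavior the relevance constraint is meant to deliver.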
This paper deals with Austinian ifs of every stripe within classical logic. It is argued that they are truth-functional, and the theory of conditional elements is used. Ellipsis is key. The paper corrects an error in Fulda (2010) in translation, and therefore in scope. The PDF is made available gratis by the publisher.
According to a theorem recently proved in the theory of logical aggregation, any nonconstant social judgment function that satisfies independence of irrelevant alternatives (IIA) is dictatorial. We show that the strong and not very plausible IIA condition can be replaced with a minimal independence assumption plus a Pareto-like condition. This new version of the impossibility theorem likens it to Arrow’s and arguably enhances its paradoxical value.
Curry's paradox for "if... then..." concerns the paradoxical features of sentences of the form "If this very sentence is true, then 2+2=5". Standard inference principles lead us to the conclusion that such conditionals have true consequents: so, for example, 2+2=5 after all. There has been a lot of technical work done on formal options for blocking Curry paradoxes while only compromising a little on the various central principles of logic and meaning that are under threat. Once we have a sense of the technical options, though, a philosophical choice remains. When dealing with puzzles in the logic of conditionals, a natural place to turn is independently motivated semantic theories of the behaviour of "if... then...". This paper argues that the closest-worlds approach outlined in Nolan 1997 offers a philosophically satisfying reason to deny conditional proof and so block the paradoxical Curry reasoning, and can give the verdict that standard Curry conditionals are false, along with related "contraction conditionals".
This paper extends Kripke’s theory of truth to a language with a variably strict conditional operator, of the kind that Stalnaker and others have used to represent ordinary indicative conditionals of English. It then shows how to combine this with a different and independently motivated conditional operator, to get a substantial logic of restricted quantification within naive truth theory.
The epistemic modal auxiliaries 'must' and 'might' are vehicles for expressing the force with which a proposition follows from some body of evidence or information. Standard approaches model these operators using quantificational modal logic, but probabilistic approaches are becoming increasingly influential. According to a traditional view, 'must' is a maximally strong epistemic operator and 'might' is a bare possibility one. A competing account---popular amongst proponents of a probabilistic turn---says that, given a body of evidence, 'must p' entails that Pr(p) is high but non-maximal and 'might p' that Pr(p) is significantly greater than 0. Drawing on several observations concerning the behavior of 'must', 'might' and similar epistemic operators in evidential contexts, deductive inferences, downplaying and retraction scenarios, and expressions of epistemic tension, I argue that those two influential accounts have systematic descriptive shortcomings. To better make sense of their complex behavior, I propose instead a broadly Kratzerian account according to which 'must p' entails that Pr(p) = 1 and 'might p' that Pr(p) > 0, given a body of evidence and a set of normality assumptions about the world. From this perspective, 'must' and 'might' are vehicles for expressing a common mode of reasoning whereby we draw inferences from specific bits of evidence against a rich set of background assumptions---some of which we represent as defeasible---which capture our general expectations about the world. I will show that the predictions of this Kratzerian account can be substantially refined once it is combined with a specific yet independently motivated 'grammatical' approach to the computation of scalar implicatures. Finally, I discuss some implications of these results for more general discussions concerning the empirical and theoretical motivation to adopt a probabilistic semantic framework.
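The proposed entailments, 'must p' requiring Pr(p) = 1 and 'might p' requiring Pr(p) > 0 relative to evidence filtered by normality assumptions, can be sketched over a toy model. The finite equiprobable-worlds setup below is an illustrative assumption of mine, not the article's formal semantics:

```python
from fractions import Fraction

def pr(worlds, p):
    """Probability of p over finitely many equiprobable worlds."""
    return Fraction(sum(1 for w in worlds if p(w)), len(worlds))

def must(evidence_worlds, normal, p):
    """'must p': Pr(p) = 1 over the normal worlds compatible with the evidence."""
    live = [w for w in evidence_worlds if normal(w)]
    return bool(live) and pr(live, p) == 1

def might(evidence_worlds, normal, p):
    """'might p': Pr(p) > 0 over the normal worlds compatible with the evidence."""
    live = [w for w in evidence_worlds if normal(w)]
    return bool(live) and pr(live, p) > 0
```

The `normal` filter plays the role of the defeasible background assumptions: a proposition true only in abnormal worlds is not even a 'might', while 'must' demands truth in every remaining world, not merely high probability.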
Major Research Paper Abstract – A Part of This World: Deleuze & The Logic of Creation. Is there a particular danger in following Deleuze’s philosophy to its end result? According to Peter Hallward and Alain Badiou, Deleuze’s philosophy has some rather severe conclusions. Deleuze has been known as a vitalist thinker of life and affirmation. Hallward and Badiou seek to challenge this accepted view of Deleuze, showing that these accepted norms in Deleuzian scholarship should be questioned, and that Deleuze ultimately calls for the evacuation of political action in order to remain firm in the realm of pure contemplation. I intend to investigate and defend Deleuze’s philosophy against critics like Badiou and Hallward, arguing that Deleuze’s philosophy is not only creative and vital but also highly revolutionary and ‘a part of this world.’ I will look at several works in Deleuze’s corpus, as well as at Deleuzian scholars who defend Deleuze’s position. Hallward sees Deleuze as a theophantic thinker of the One: like Spinoza, an individual mode must align itself with the intellectual love of God, so that creativity and expressivity may be mediated through it. Thus, according to Hallward, the major theme of Deleuze’s philosophy is creativity; a subject or a creature must tap into this vital spark of creation, which is also a form of creatural confinement. Hallward states that this creative act can only occur in the realm of the virtual, through lines of flight leading ‘out of this world’. The subject is then re-introduced to an extra-worldly existence of contemplation and removed further from decisions and lived experience. Deleuze, according to Hallward, falls prey to a cosmological pantheism. Badiou has similar concerns. Deleuze’s philosophy is too systematic and abstract. The entirety of Deleuze’s work is surrounded by a metaphysics of the One, and its repercussions ultimately lead to an overt asceticism.
Badiou notes that Deleuze wants us all to surrender thought to a renewed concept of the One. Through the surrender to the One, the multiple is lost and incorporated into the realm of simulacra. Everything in this Deleuzian world is ‘always-already’ in the infinite and inhuman totality of the One. According to Badiou, this entire process is articulated in the power of inorganic life that operates through all of us. Like Hallward, Badiou sees Deleuze demolishing the subject, who is stuck between machinery and exteriority. Subjects are forced to transcend and go beyond their limits, slowly collapsing into an infinite virtuality. Badiou believes this is a powerful metaphor for a philosophy of death. Thus the conditions of Deleuzian thought are contingent upon asceticism, making a Deleuzian world a sort of ‘crowned anarchy’. Badiou sees Deleuze’s ascetic mission as intimately linked with a philosophy of death, and like Hallward he holds that we should pay careful attention to the outcome of such an aristocratic philosophy. Death, according to Badiou, symbolizes Deleuzian thought, not only making it dangerous but also rendering it an ineffective position. Badiou also points out that Deleuze’s conceptual sources are not only limited but also repeated time and again through a monotonous selection of concepts. Is this a fair critique and representation of Deleuzian thought? Eugene Holland states that both Hallward and Badiou have misrepresented Deleuze. Deleuze does invoke the creation of a new earth, but one in which we all fully believe. The only world Deleuze wants to get out of is the world of habits, conformity, power, and forces that block creative being. According to Holland, Hallward presents us with a Deleuze who inhibits engagement with the world. However, Deleuze’s creative enterprise insists on forming concepts that can change and transform our world. So the question arises: where does the problem of misrepresentation begin?
It begins with both Badiou and Hallward having an erroneous account of the actual/virtual distinction in Deleuze’s philosophy. According to Protevi, Hallward posits a dualism between the actual and the virtual, denying the role of the intensive. Hallward sees only the relationship between the intensive and the virtual, ignoring the fact that the intensive has its own ontological register that mediates both the virtual and the actual. However, Protevi notes, if one could not accept the intensive as an ontological register and had to place it with one or the other, one would have to accept an interrelationship between the actual and the intensive. Hallward places it in the realm of the virtual, thus leading us to his major claim that Deleuze’s philosophy leads us out of the world. Protevi states that intensive processes happen in our world; they are a part of this world. Hallward completely empties all creativity from the actual, thus depending on the virtual and its slippery slope. Both Hallward and Badiou have missed the point altogether. We live in an intensive/actual world, and the main point of Deleuze’s politics has to do with experimentation and social interaction, and with the transformation and intervention of the concept. As Daniel W. Smith states, unlike Badiou, Deleuze is not searching for an axiomatic approach to the world, one that is prone to reductionism, but rather for problematic, inventive, and creative methods to transform a society.