Simple taste predications typically come with an ‘acquaintance requirement’: they normally require the speaker to have had a certain kind of first-hand experience with the object of predication. For example, if I told you that the crème caramel is delicious, you would ordinarily assume that I have actually tasted the crème caramel and am not simply relying on the testimony of others. The present essay argues in favor of a ‘lightweight’ expressivist account of the acquaintance requirement. This account consists of a recursive semantics and a ‘supervaluational’ account of assertion; it is compatible with a number of different accounts of truth and content, including contextualism, relativism, and purer forms of expressivism. The principal argument in favor of this account is that it correctly predicts a wide range of data concerning how the acquaintance requirement interacts with Boolean connectives, generalized quantifiers, epistemic modals, and attitude verbs.
In this paper, we provide a recipe that not only captures the common structure of the semantic paradoxes but also captures our intuitions regarding the relations between them. Before unveiling our recipe, we first discuss a popular schema introduced by Graham Priest, namely, the inclosure schema. Without rehashing previous arguments against the inclosure schema, we contribute new arguments for the same concern that the schema bundles the wrong paradoxes together. That is, we provide alternative arguments for why the inclosure schema is both too broad, in that it includes the Sorites paradox, and too narrow, in that it excludes Curry’s paradox.
We then spell out our recipe. Our recipe consists of three ingredients: (1) a predicate that has two specific rules, (2) a simple method to find a partial negative modality, and (3) a diagonal lemma that allows us to let sentences be their own partial negative modalities. The recipe shows that all of the following paradoxes share the same structure: the Liar, Curry’s paradox, Validity Curry, the Provability Liar, a paradox leading to Löb’s theorem, the Knower’s paradox, the Knower’s Curry, the Grelling–Nelson paradox, Russell’s paradox in terms of extensions, the alternative Liar and alternative Curry, and other new paradoxes.
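To illustrate the shape of such a recipe with its most familiar instance (a schematic sketch, not the paper’s exact formulation): take a truth predicate T with its capture and release rules, use negation composed with T as the partial negative modality, and let the diagonal lemma supply a sentence equivalent to its own partial negative modality.

```latex
% Ingredient (1): a predicate T with two rules (capture and release)
\varphi \vdash T\langle\varphi\rangle
\qquad
T\langle\varphi\rangle \vdash \varphi
% Ingredient (2): a partial negative modality, here \neg T\langle\,\cdot\,\rangle
% Ingredient (3): diagonalization yields the Liar
\lambda \leftrightarrow \neg T\langle\lambda\rangle
```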
We conclude the paper by stating the lessons we can learn from the recipe and the kinds of solutions it suggests if we want to adhere to the Principle of Uniform Solution.
Stalnaker’s minimal change semantics for conditionals fails to support the import-export law, according to which (a) and (b) are logically equivalent:
(a) if A, then if B, then C
(b) if A and B, then C
However, natural language conditionals seem to abide by the law. McGee (1985) outlines a minimal change semantics for conditionals that supports it. I argue that, in fact, the equivalence between (a) and (b) does not hold unrestrictedly, and I suggest that the facts follow from the interaction between the semantics of conditionals and the ways suppositions may affect the context. I conclude by describing the consequences of my account for the issue of the validity of modus ponens.
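In symbols, writing ‘>’ for the conditional, the import-export law can be rendered as follows (a schematic statement, not McGee’s own formulation):

```latex
% Import-Export: (a) and (b) are logically equivalent
A > (B > C) \;\equiv\; (A \wedge B) > C
```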
Circumstantialists already have a logical semantics for impossibilities. They expand their logical space of possible worlds by adding impossible worlds. These are impossible circumstances serving as indices of evaluation, at which impossibilities are true. A variant of circumstantialism, namely modal Meinongianism, adds impossible objects as well. The opposite of circumstantialism, namely structuralism, has some catching-up to do. What might a structuralist logical semantics without impossible worlds or impossible objects look like? This paper makes a structuralist counterproposal. I present a semantics based on a procedural interpretation of the typed λ-calculus. The fundamental idea is that talk about impossibilities should be construed in terms of procedures yielding as their product a condition that could not possibly have a satisfier, or else failing to yield a product at all. Dispensing with a ‘bottom’ of impossibilia requires instead a ‘top’ consisting of structured hyperintensions, intensions, intensions defining other intensions, a typed universe, and dual predication. I explain how the theory works by going through a couple of cases.
The talk advocates a marriage of inferentialist bilateralism and truth-maker bilateralism. Inferentialist bilateralists like Restall and Ripley say that a collection of sentences, Y, follows from a collection of sentences, X, iff it is incoherent (or out-of-bounds) to assert all the sentences in X and, at the same time, deny all the sentences in Y. In Fine’s truth-maker theory, we have a partially ordered set of states that exactly verify and falsify sentences, and some of these states are impossible. We can think of making-true as the worldly analogue of asserting, of making-false as the worldly analogue of denying, and of impossibility as the worldly analogue of incoherence. This suggests that we may say that, in truth-maker theory, a collection of sentences, Y, follows (logically) from a collection of sentences, X, iff (in all models) any fusion of exact verifiers of the members of X and exact falsifiers of the members of Y is impossible. Under routine assumptions about truth-making, this yields classical logic. Relaxing one such assumption yields the non-transitive logic ST. Relaxing another assumption yields the non-reflexive logic TS. We can use known facts about the relation between ST, LP, and K3 to provide an interpretation of LP as the logic of falsifiers and K3 as the logic of verifiers. The resulting semantics for ST is more flexible than its usual three-valued semantics because it allows us, e.g., to reject monotonicity. We can also recover fine-grained logics, like Correia’s logic of factual equivalence.
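The proposed consequence relation can be stated schematically as follows (notation assumed for illustration only: s_A an exact verifier of A, t_B an exact falsifier of B, and ⊔ state fusion):

```latex
X \vDash Y \;\iff\; \text{in all models, for every choice of } s_A, t_B:
\Big(\bigsqcup_{A \in X} s_A\Big) \sqcup \Big(\bigsqcup_{B \in Y} t_B\Big)
\text{ is impossible.}
```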
Our topic is logical inference in natural language, as it is done by people and computers. The first main topic will be monotonicity inference, arguably the best of the simple ideas in the area. Monotonicity can be incorporated in running systems whereby one can take parsed real-life sentences and see simple inferences in action. I will present some of the theory, related to higher-order monotonicity and the syntax-semantics interface. In a different direction, these days monotonicity inference can be done by machines as well as humans. The talk also discusses this development along with some ongoing work on the borderline of natural logic and machine learning.
The second direction in the talk will be an overview of the large number of logical systems for various linguistic phenomena. This work begins as an updating of traditional syllogistic logic, but with much greater expressive power.
Overall, the goal of the talk is to persuade you that the research program of “natural logic” leads to a lively research area with connections to many areas both inside and outside of the more mainstream areas of logic.
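A toy instance of monotonicity inference can be sketched in a few lines (the lexicon, names, and the determiner chosen are illustrative assumptions, not taken from the talk): the determiner ‘every’ is downward monotone in its noun argument and upward monotone in its verb argument.

```python
# A minimal sketch of monotonicity inference over a toy lexical ordering.
# The lexicon below is an illustrative assumption.

# Hypernym pairs: (more specific term, more general term)
LEX_ORDER = {("dog", "animal"), ("run", "move")}

def leq(a, b):
    """a is at least as specific as b in the toy lexicon."""
    return a == b or (a, b) in LEX_ORDER

def every_entails(premise, conclusion):
    """'every N1 V1' entails 'every N2 V2' when the noun slot is
    downward monotone (N2 <= N1) and the verb slot is upward
    monotone (V1 <= V2)."""
    (n1, v1), (n2, v2) = premise, conclusion
    return leq(n2, n1) and leq(v1, v2)

# "every animal runs" entails "every dog moves"
print(every_entails(("animal", "run"), ("dog", "move")))  # True
# but not the other way around
print(every_entails(("dog", "move"), ("animal", "run")))  # False
```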
(joint work with Norbert Gratzl, MCMP, Munich)
Free logics are a family of first-order logics that came about as a result of examining the existence assumptions of classical logic. What those assumptions are varies, but the central ones are that (i) the domain of interpretation is not empty, (ii) every name denotes exactly one object in the domain, and (iii) the quantifiers have existential import. Free logics usually reject the claim in (ii) that names need to denote; of the systems considered in this paper, positive free logic concedes that some atomic formulas containing non-denoting names (including self-identity claims) are true, while negative free logic rejects even this.
These logics have complex and varied axiomatizations and semantics, and the goal of the present work is to offer an orderly examination of the various systems and their mutual relations. This is done by first offering a formalization, using sequent calculi which possess all the desired structural properties of a good proof system, including admissibility of contraction and cut, while streamlining free logics in a way no other approach has. We then present a simple and unified system of generalized semantics, which allows for a straightforward demonstration of the meta-theoretical properties, while also offering insights into the relationship between different logics (free and classical). Finally, we extend the system with modalities by using a labeled sequent calculus, and here we are again able to map out the different approaches and their mutual relations using the same framework.
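The contrast between the two systems can be displayed with a non-denoting name t (a schematic illustration using the customary existence predicate E!; the paper’s own sequent formulation may differ):

```latex
% Positive free logic: self-identity holds even for non-denoting t
\vdash t = t
% Negative free logic: atomic predications carry existential import,
% so every true atomic formula requires its terms to denote
A(t_1,\dots,t_n) \vdash E!\,t_i
```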
This paper adapts Lewis’s “Ptolemaic Astronomy” from Counterfactuals for use in thinking about social hierarchy and subordination.
In this talk I will provide an overview of my recent investigations, some published, some unpublished, on neologicism, and in particular on topics related to the good company and bad company objections.
Contextual analysis deals with systems of random variables. Each random variable within a system is labeled in two ways: by its content (that which the variable measures or responds to) and by its context (conditions under which it is recorded). Dependence of random variables on contexts is classified into (1) direct (causal) cross-influences and (2) purely contextual (non-causal) influences. The two can be conceptually separated from each other and measured in a principled way. The theory has numerous applications in quantum mechanics, and also in such areas as decision making and computer databases. A system of deterministic variables (as a special case of random variables) is always void of purely contextual influences. There are, however, situations when we know that a system is one of a set of deterministic systems, but we cannot know which one. In such situations we can assign epistemic (Bayesian) probabilities to possible deterministic systems, create thereby a system of epistemic random variables, and subject it to contextual analysis. In this way one can treat, in particular, such logical antinomies as the Liar paradox. The simplest systems of epistemic random variables describing the latter have no direct cross-influences and the maximal possible degree of purely contextual influences.