A growing number of philosophers of logic have suggested that logic is best analyzed in terms of acceptance and rejection. In several influential papers, it has been proposed that we understand acceptance in terms of answering a yes-or-no question with `yes’ and rejection in terms of answering a yes-or-no question with `no’. In this talk I will examine question-answer setups of this sort very generally, and use them to examine what sorts of negations can be endorsed by a deflationist about negation.
Michael Rescorla has argued that it makes sense to compute directly with numbers, and he faulted Turing for not giving an analysis of number-theoretic computability. I argue, however, in line with a later paper of Rescorla’s, that it only makes sense to compute directly with syntactic entities, such as strings over a given alphabet. Computing with numbers goes via notation. This raises broader issues involving de re propositional attitudes towards numbers and other non-syntactic abstract entities.
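The point that computation on numbers proceeds via notation can be illustrated with a toy example (mine, not Rescorla’s): a successor function that operates purely on decimal numerals, i.e. on strings over the alphabet {0,…,9}, never on numbers themselves.

```python
def succ(numeral: str) -> str:
    """Successor computed syntactically, on a decimal numeral (a string)."""
    digits = list(numeral)
    i = len(digits) - 1
    # Carry propagation: trailing 9s roll over to 0.
    while i >= 0 and digits[i] == '9':
        digits[i] = '0'
        i -= 1
    if i < 0:
        # Every digit was 9: the numeral grows by one symbol.
        return '1' + ''.join(digits)
    digits[i] = str(int(digits[i]) + 1)
    return ''.join(digits)

print(succ("199"))  # prints "200"
```

The procedure never mentions numbers; it manipulates symbols, and we read off facts about numbers only via the standard notation.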
The consensus for the last century or so has been that diagrammatic proofs are not genuine proofs. Recent philosophical work, however, has shown that (at least in some circumstances) diagrams can be perfectly rigorous. The implication of this work is that, if diagrammatic reasoning in a particular field is illegitimate, it must be so for local reasons, not because of some in-principle illegitimacy of diagrammatic reasoning in general. In this talk, I try to identify some of the reasons why geometers in particular began to reject diagrammatic proofs. I argue that the reasons often cited nowadays — that diagrams illicitly infer from a particular to all cases, or can’t handle analytic notions like continuity — played little role in this development. I highlight one very significant (but rarely discussed) flaw in diagrammatic reasoning: diagrammatic methods don’t allow for fully general proofs of theorems. I explain this objection (which goes back to Descartes), and how Poncelet and his school developed around 1820 new diagrammatic methods to meet this objection. As I explain, these new methods required a kind of diagrammatic reasoning that is fundamentally different from the now well-known diagrammatic method from Euclid’s Elements. And, as I show (using the case of synthetic treatments of the duals of curves of degrees higher than 2), it eventually became clear that this method does not work. Truly general results in “modern” geometry could not be proven diagrammatically.
Many proofs use a bootstrapping process. Initially, a simple version is proved, and then extensions are proved by repeated applications of the simple result. Proofs of many results about graph colorings use this strategy. We will concentrate on a specific example. Suppose we write RT(2,n) for Ramsey’s theorem for pairs and n colors, a graph coloring theorem. (Pictures will be provided.) The usual proof of RT(2,4) uses two applications of RT(2,2). In this talk, we address the question: Can RT(2,4) be proved with one use of RT(2,2)? Of course, the answer depends on the base system chosen and the formalization of what constitutes a use. In some settings, a formalization of Weihrauch reductions can lend insight, and the law of the excluded middle plays a surprising role.
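For concreteness, the standard two-application proof can be sketched as follows (a routine color-merging argument, not specific to this talk):

```latex
Given $f : [\mathbb{N}]^2 \to \{0,1,2,3\}$, define the merged coloring
$g : [\mathbb{N}]^2 \to \{0,1\}$ by
\[
  g(e) =
  \begin{cases}
    0 & \text{if } f(e) \in \{0,1\},\\
    1 & \text{if } f(e) \in \{2,3\}.
  \end{cases}
\]
A first application of $\mathrm{RT}(2,2)$ yields an infinite set $H_0$
homogeneous for $g$, so $f$ takes at most two values on $[H_0]^2$.
A second application of $\mathrm{RT}(2,2)$, to $f$ restricted to $[H_0]^2$
(with its two colors renamed $0$ and $1$), yields an infinite
$H_1 \subseteq H_0$ homogeneous for $f$.
```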
A number of inference patterns involving epistemic modalities, including Modus Ponens, display a split behavior. They seem valid by the lights of all-or-nothing judgments: whenever the premises are accepted as true, the conclusion must be accepted as true as well. But they seem invalid by the lights of probabilistic judgments: we can specify intuitive credal assignments where the drop in probability between premises and conclusions is incompatible with validity. I suggest that we explain this asymmetry by endorsing two bridge principles between logical notions and rational constraints on attitudes. The first links a notion of classical consequence to rational constraints on probabilistic states. The second links a broadly dynamic notion of consequence (so-called informational consequence) to rational constraints on acceptance. In existing literature, classical and informational consequence are seen as alternatives; I argue instead that we can and should have both.
Dan Lassiter (Stanford)
Recent years have seen successful new applications of probabilistic semantics to epistemic modality, and there are signs of a revival of a classic probabilistic approach to conditionals. These models have a simple implementation using tools from intensional and degree semantics, and they have considerable appeal in capturing inferences that are problematic for classical models. However, they generally fare worse than classical models in two key respects: they have trouble explaining the update/learning potential of epistemics and conditionals, as well as their interpretations when embedded. Focusing on epistemic language, I’ll describe some of the reasons for the probabilistic turn, explain why embeddings and learning present a problem, and suggest a way to deal with the problems by adopting an enriched probabilistic semantics based on Bayesian networks. Time permitting, I’ll venture some speculations about the attractions of, and challenges to, an extension to conditionals in the style of Kratzer 1986.
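As a toy illustration of the kind of computation a Bayesian-network semantics relies on (my example, not Lassiter’s; all names and numbers are invented), here is posterior updating in a minimal two-node network:

```python
# Minimal two-node Bayesian network: Rain -> WetGrass.
# Prior and conditional probabilities are invented for illustration.
p_rain = 0.3
p_wet_given_rain = 0.9
p_wet_given_dry = 0.1

# Marginal probability of wet grass, by total probability.
p_wet = p_rain * p_wet_given_rain + (1 - p_rain) * p_wet_given_dry

# Learning "the grass is wet" updates the credence in rain by Bayes' rule.
p_rain_given_wet = p_rain * p_wet_given_rain / p_wet

print(round(p_rain_given_wet, 3))  # prints 0.794
```

The update step is exactly the sort of "learning potential" that, on this view, a semantics for epistemic vocabulary should track.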
Jon Gajewski (UConn)
22 Apr, 2pm, LH 302
In this talk, I will discuss whether and how degree is represented in American Sign Language. I will begin by attempting to place American Sign Language into typologies of comparative constructions that have been based primarily on evidence from spoken languages. Placing ASL in such typologies leads us to the question of the role of iconicity in the representation of degree in ASL and how this relates to constructions in spoken language that may make use of pragmatic resources such as co-speech gesture. The results of the investigation of the comparative are then tested against the existence/structure of related degree constructions including measure phrases, differentials and degree questions.
Lenore Blum (Carnegie Mellon University)
Most logicians and theoretical computer scientists are familiar with Alan Turing’s 1936 seminal paper setting the stage for the foundational (discrete) theory of computation. Most, however, remain unaware of Turing’s seminal 1948 paper, which introduces the notion of condition and sets the stage for a natural theory of complexity for the “other theory of computation” emanating from the classical tradition of numerical analysis, equation solving and the continuous mathematics of calculus.
This talk will recognize Alan Turing’s work in the foundations of numerical computation (in particular, his 1948 paper “Rounding-Off Errors in Matrix Processes”), its influence in complexity theory today, and how it provides a unifying concept for the two major traditions of the Theory of Computation.
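The condition number from Turing’s 1948 paper measures how much a linear system amplifies small perturbations in its data. A minimal sketch (the matrix and numbers are mine, chosen only to make the effect visible):

```python
import numpy as np

# An ill-conditioned 2x2 system: the two rows are nearly parallel.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])
b = np.array([2.0, 2.0001])

x = np.linalg.solve(A, b)    # exact solution is (1, 1)

# Perturb the right-hand side by about one part in 10^4 ...
b2 = np.array([2.0, 2.0002])
x2 = np.linalg.solve(A, b2)  # ... and the solution jumps to (0, 2)

# The condition number quantifies this amplification (here roughly 4e4),
# which is why rounding errors in matrix processes can be so destructive.
print(np.linalg.cond(A))
```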
Whit Tabor (UConn)
Apr 15, 2pm, LH 302
Consider an iterated map f on a connected space X. A proper subset A of X is said to be asymptotically stable if, when the system is perturbed slightly away from A, it converges back onto A under continued iteration of f. When one is working in the realm of logic, or classical computation more broadly, stability is not an issue and does not really make sense. There is no possibility of being slightly off—formulas are either well-formed and have precise meaning, or they are ill-formed and have no meaning at all. Interestingly, in recent decades, a number of people have discovered ways of doing complex computation on connected spaces. This raises the question of whether there could be “stable computation” in such systems. In this talk, I define asymptotic stability for a particular type of connected-space computer and show that at least one interesting class of languages (the mirror recursion languages) has a realization which exhibits asymptotic stability of certain fractal sets. One interesting outcome is an answer to the question: Why is there grammaticality?
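The flavor of the definition can be seen in a standard one-dimensional example (the logistic map, not Tabor’s system): the fixed point below attracts nearby points, so it is asymptotically stable under small perturbations.

```python
# Iterated map f on the connected space X = [0, 1]: the logistic map.
def f(x, r=2.5):
    return r * x * (1 - x)

# For r = 2.5 the fixed point x* = 1 - 1/r = 0.6 is attracting,
# since |f'(x*)| = |r(1 - 2x*)| = 0.5 < 1.
x_star = 0.6

# Perturb slightly away from the fixed point ...
x = x_star + 0.05

# ... and continued iteration of f converges back onto it.
for _ in range(100):
    x = f(x)

print(abs(x - x_star))  # effectively zero
```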
Timothy Williamson (University of Oxford)
23 Mar, 4:30pm, Laurel 201.
The choice between alternative logics can be understood as a special case of theory choice in science, governed by broadly abductive criteria, without appeal to a special relation of logical consequence. This view will be applied to semantic and sorites paradoxes as putative motivations for departures from classical logic.