Modalized Disquotationalism

V. Halbach

The instances of the disquotation scheme “ ‘A’ is true if and only if A” are argued to be necessary. The consequences of combining this position with a suitable axiomatic theory of necessity are studied in a formal setting, and it is shown how the resulting theory overcomes some deficiencies of traditional disquotationalist theories of truth.

Wo stehen wir heute mit dem Problem der Induktion?

W. Spohn

The paper gives a broad overview of the current state of the discussion of the problem of induction, explains the importance of general formal treatments, and argues that its normative status cannot be dissolved into naturalized epistemology.

How to Understand the Foundations of Empirical Belief in a Coherentist Way

W. Spohn

The central claim of the paper is, roughly, that the fact that it looks to somebody as if p is a defeasibly a priori reason for assuming that p (and vice versa), for any person, even for the perceiver himself. As a preparation, it outlines a doxastic conception suitable for explicating this claim and explains how to analyse dispositions within this conception. Since an observable p has the disposition to look as if p, this analysis generalizes to the central claim, which is then argued to be at the bottom of coherentism. Thus, the defense of the claim supports coherentism as opposed to foundationalism and at the same time provides an answer to skepticism about the external world.

Lewis’ Principal Principle ist ein Spezialfall von van Fraassens Reflexion Principle

W. Spohn

The paper explains how Lewis’ (old) Principal Principle (which tries to specify the most basic connection between subjective probability, or credence, and objective probability, or chance) may be conceived and derived as a special case of van Fraassen’s Reflection Principle (which specifies a fundamental relation between present and future subjective probabilities).

A Disquotational Theory of Truth

V. Halbach

A variant of modalized disquotationalism is discussed. A sentence is said to be analytic in the truth predicate if and only if it is a logical consequence of the uniform T-sentences. It is claimed that the disquotationalist is committed to the view that T-analyticity parsed this way is sound, that is, if A is analytic in the truth predicate, then A. The resulting theory of truth turns out to be equivalent to the “Tarskian” theory of truth and is therefore much stronger than the theory of the pure T-sentences often associated with disquotationalism. Moreover, truth in foreign languages is discussed from a disquotationalist point of view.

Two Coherence Principles

W. Spohn

The paper proposes two principles of coherence (thus taking up work started in No. 18). The latter indeed serves as a weak, but precise explication of the notion of coherence as it is used in the current epistemological discussion. After discussing their epistemological setting, the paper considers four ways of establishing these principles. They can be inferred neither from enumerative induction, nor from the nature of propositions as objects of belief, nor in a Kantian way from self-consciousness. Rather, I propose a fairly rigorous way to infer them from an even more fundamental rationality principle of non-dogmatism and an elementary theory of perception.

Truth and Reduction

V. Halbach

Axiomatic theories of truth and subsystems of second-order arithmetic are compared. The paper contains a survey of proof-theoretic results connecting both kinds of systems. Certain semantic, i.e., truth-theoretic, principles like compositionality are argued to be equivalent to set existence principles (e.g., the existence of predicative sets of numbers). The value of these observations for ontological reduction is investigated.

On Lehrer’s Principle of Trustworthiness

V. Halbach

According to the usual foundationalist picture of knowledge, basic beliefs provide the empirical input on which the whole empirical knowledge of a person should rely. There are strong arguments showing that this picture is not completely convincing: the proposed basic beliefs are either not sufficient for providing foundations of knowledge, or they are in need of further justification and therefore not really basic.

Now coherentism claims that the distinction between basic and non-basic beliefs is not sensible at all. There may be basic beliefs, and coherentism does not have to deny this, but these possibly existing basic beliefs do not figure prominently in a sound account of epistemic justification. The rejection of basic beliefs as foundations of knowledge, however, seems to deprive knowledge of its empirical input. But if knowledge does not rest on basic beliefs providing the empirical input, how can external factors influence our knowledge at all? Coherentism seems prone to a conception where epistemic agents are completely isolated from the world.

Coherentists have developed several strategies to evade the isolation objection. Keith Lehrer’s approach to solving the problem has become especially important and has thus been discussed intensively.

In order to give a rough idea of how Lehrer arrives at a plausible picture of how we acquire empirical information, I will present an example. Suppose that I accept that there is a red rose. This assumption implies in particular that I do not believe that there is a red rose because of wishful thinking etc.; rather, I came to believe it with the objective of accepting it just in case it is true; but so far my belief lacks justification and is thus not yet knowledge. From my acceptance alone I cannot conclude (and thereby justify) that there is indeed a red rose. Given some additional information, however, I can and usually do infer this. The additional premises required may include that I am not dreaming, that normal daylight is present, and that I am familiar with roses. The common feature of all these premises is that I must be a reliable epistemic agent in the present kind of situation. Exactly this is expressed by the principle of trustworthiness proposed by Lehrer in his “Theory of Knowledge” (1990):

T. Whatever I accept with the objective of accepting something just in case it is true, I accept in a trustworthy manner.

He explains the benefits of this principle in the following way:

The consequence of adding principle T to my acceptance system is that whatever I accept is more reasonable for me to accept than its denial. It has the effect of permitting me to detach the content of what I accept from my acceptance of the content. My acceptance system tells me that I accept that p, accept that q, and so forth. Suppose I wish to justify accepting that p on the basis of my acceptance system telling me that I accept that p. How am I to detach the conclusion that p from my acceptance system? The information that I accept that p, which is included in my acceptance system, does not justify detaching p from my acceptance of it in order to obtain truth and avoid error. I need the additional information that my accepting that p is a trustworthy guide to these ends. Principle T supplies that information and, therefore, functions as a principle of detachment. It is the rule that enables me to detach the conclusion that p from my acceptance of p.

Although this account seems plausible at first, I will show that the principle of trustworthiness cannot be used as intended by Lehrer. By reformulating the principle of trustworthiness and by an appeal to Löb’s theorem, I will show that only trivial instances of the principle are consistent with our basic assumptions about beliefs.
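To indicate how Löb’s theorem can enter such an argument, here is one standard way to reconstruct the triviality worry in doxastic logic. This is a sketch under my own assumptions about the formalization, not necessarily the paper’s exact reformulation:

```latex
% Assume the belief operator B satisfies the Löb derivability conditions:
%   (K)    B(p \to q) \to (Bp \to Bq)
%   (4)    Bp \to BBp
%   (Nec)  from \vdash p infer \vdash Bp
% Löb's theorem then holds for B:
\[
  \vdash B(Bp \to p) \to Bp .
\]
% If trustworthiness is internalized, i.e. the agent accepts
% "if I accept p, then p" for every p, we have B(Bp \to p)
% for every p, and hence Bp for every p -- in particular:
\[
  \vdash B\bot .
\]
```

Under these assumptions the agent would believe every proposition, including a contradiction; only instances of the principle that avoid this internalization step escape the collapse.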

Disquotationalism and Infinite Conjunctions

V. Halbach

According to the disquotationalist theory of truth, the Tarskian equivalences, conceived as axioms, yield all there is to say about truth. Several authors have claimed that the expression of infinite conjunctions and disjunctions is the only purpose of the disquotationalist truth predicate. The way in which infinite conjunctions can be expressed by an axiomatized truth predicate is explored and it is considered whether the disquotationalist truth predicate is adequate for this purpose.

has appeared in Mind 108 (1999), pp. 1–22

Strategic Rationality

W. Spohn

The paper argues that the standard decision theoretic account of strategies and their rationality or optimality is much too narrow; that strategies should rather condition future action on future decision situations (a point of view already developed in my Grundlagen der Entscheidungstheorie, sect. 4.4); that practical deliberation must therefore essentially rely on a relation of superiority and inferiority between possible future decision situations; that all this allows us to substantially broaden the theory of practical rationality; that a long list of points attended to in the literature can be subsumed under the broadened perspective (including a novel view on the iterated prisoner’s dilemma and on iterated Newcomb’s problem, which, however, is revised in No. 42); and that the task of completing and systematizing this list indeed forms a fruitful research programme.

Conservative Theories of Classical Truth

V. Halbach

Some axiomatic theories of truth and related subsystems of second-order arithmetic are surveyed and shown to be conservative over their respective base theory. In particular, it is shown by purely finitistic means that the theory PA + “there is a satisfaction class” and the theory FS of Halbach (1994) with arithmetical induction only are conservative over PA.

Two Proof-Theoretic Remarks on EA+ECT

L. Horsten/V. Halbach

In this note two propositions about the epistemic formalization of Church’s Thesis (ECT) are proved. First it is shown that all arithmetical sentences deducible in Shapiro’s system of Epistemic Arithmetic (EA) from ECT are derivable from Peano arithmetic PA + uniform reflection for PA. Second it is shown that the system EA+ECT has the epistemic disjunction property and the epistemic numerical existence property for arithmetical formulas.

Ranking Functions, AGM Style

W. Spohn

The paper first points out that ranking functions are superior to AGM belief revision theory in two crucial respects, i.e. in solving the problem of iterated belief revision and in giving an adequate account of doxastic independence (this was indeed why ranking functions were developed in No. 15). Second, it shows how ranking functions are uniquely reflected in iterated belief change. More precisely, it specifies conditions on threefold contractions which suffice to represent contractions by a ranking function uniquely up to multiplication by a positive integer (a result independently obtained by Matthias Hild). Thus, an important advantage AGM theory was always claimed to have over ranking functions proves to be spurious.

Concepts Are Beliefs about Essences

U. Haas-Spohn/W. Spohn

The most promising strategy to understand (sentential) narrow contents or (subsentential) concepts seems to be to conceive them as primary intensions or diagonals within the epistemologically reinterpreted character theory of Kaplan. However, this strategy seems to founder either at Block’s dilemma between a too syntacticist and a too holistic understanding of narrow contents and concepts or at Schiffer’s problem that the character theory depends on functional role semantics without adding anything to it. The paper defends a way of steering between the horns of Block’s dilemma without recourse to functional role semantics, a way perfectly summarized in its title.

Über die Struktur theoretischer Gründe

W. Spohn

The paper sets out something like a scheme behind my various epistemological papers (Nos. 15, 18, 26, 27, 28, and 32): by explaining the relations I see between the dynamics of belief, reasons, and apriority, by introducing a number of basic general principles concerning the structure of reasons (such as a ban on dogmatism, the Schein-Sein-Prinzip, a special and a general coherence principle, a weak and a strong discoverability principle, a weak and a strong principle of the coherence of truth, and a weak and a very weak principle of causality), by explaining the relations between these principles, and by relating the principles to familiar doctrines like the verifiability theory of meaning, the unity of science, or internal realism.

Deterministic Causation

W. Spohn

The paper is the most complete presentation of my views on deterministic causation. It develops the deterministic theory in terms of ranking functions in perfect parallel to my theory of probabilistic causation (as was claimed to be possible in No. 17) and thus unites the two aspects. It also argues that the theory presented is superior to regularity and counterfactual theories of causation, in particular by giving a natural account of causal overdetermination. Since it presents a subjectivistic theory of causation, in a Humean spirit, it closes by briefly summarizing my account of how to objectify this theory of causation (which is fully presented in No. 20).

Vier Begründungsbegriffe

W. Spohn

The paper distinguishes four basic notions of one assumption or proposition being a reason for (or justifying) another: a deductive notion, a computational notion, a causal notion, and a positive relevance notion (as first defended by me in No. 10). After setting these notions within three important distinctions of present-day epistemology – knowledge vs. belief, internalistic vs. externalistic, and normative vs. naturalized epistemology – and after explaining why I tend to be a normative internalist belief theorist, I compare the four notions and argue that the positive relevance notion is the most adequate and fruitful one.

Disquotational Truth and Analyticity

V. Halbach

The uniform reflection principle for the theory of uniform T-sentences is added to PA. The resulting system is justified on the basis of a disquotationalist theory of truth where the provability predicate is conceived as a special kind of analyticity; it is equivalent to the system ACA of arithmetical comprehension. If the truth predicate is also allowed to occur in the instances of the T-sentences, yet not in the scope of negation, the system with the reflection schema for these T-sentences assumes the strength of the Kripke-Feferman theory KF and thus of ramified analysis up to epsilon_0.
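For orientation, the two schemata mentioned can be stated as follows, in standard notation for uniform T-sentences and uniform reflection; the paper’s exact formulation may differ:

```latex
% Uniform T-sentences for arithmetical formulas A(x):
\[
  \forall x \, \bigl( T(\ulcorner A(\dot{x}) \urcorner) \leftrightarrow A(x) \bigr)
\]
% Uniform reflection for a theory S (here: PA plus the uniform
% T-sentences), with Bew_S the provability predicate of S:
\[
  \forall x \, \bigl( \mathrm{Bew}_S(\ulcorner A(\dot{x}) \urcorner) \to A(x) \bigr)
\]
```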

Dependency Equilibria and the Causal Structure of Decision and Game Situations

W. Spohn

The paper attempts to rationalize cooperation in the one-shot prisoners’ dilemma (PD). The first step consists in introducing (and investigating) a new kind of equilibria (differing from Aumann’s correlated equilibria) according to which the players’ actions may be correlated (section 2). In PD the Pareto-optimal among these equilibria is joint cooperation. Since these equilibria seem to contradict causal preconceptions, the paper continues with a standard analysis of the causal structure of decision situations (section 3). The analysis then rises to a reflective point of view in which the agent integrates his own present and future decision situations into the causal picture of his situation (section 4). This reflective structure is first applied to the toxin puzzle and to Newcomb’s problem, showing a way to rationalize drinking the toxin and taking only one box without assuming causal mystery (section 5). The latter result is finally extended to a rationalization of cooperation in PD (section 6).

Bayesian Nets Are All There Is To Causal Dependence

W. Spohn

The paper displays the similarity between the theory of probabilistic causation developed by Glymour et al. since 1983 and mine developed since 1976: the core of both is that causal graphs are Bayesian nets. The similarity extends to the treatment of actions or interventions in the two theories. But there is also a crucial difference. Glymour et al. take causal dependencies as primitive and argue that they behave like Bayesian nets in a wide range of circumstances. By contrast, I argue that the behavior of Bayesian nets is ultimately the defining characteristic of causal dependence.

Occurrence-Dependence

M. Kupffer

Within a single sentence, different syntactic occurrences of the same expression sometimes refer differently. This is what I call “occurrence-dependence”. The paper compares two frameworks that are able to deal with occurrence-dependence, namely token-reflexive semantics and occurrence-interpretation. It is argued that the key concept of the latter framework is reducible to the key concept of the first: while token-reflexive semantics is concerned with the interpretation of utterances, occurrences are best understood as certain sets of utterances.

Continu’ous Time Goes by Russell

U. Lück

Russell and Walker proposed different ways of constructing instants from events. This article compares the two approaches with respect to their power of explaining “time as a continuum”. Thomason attributed such power to Walker's theory of time and not to Russell’s. It is shown here that these powers are about the same for both theories. This is done by solving, among others, a mathematical characterization problem corresponding to the characterization problem that Thomason solved with regard to Walker’s construction. Namely, it is shown how to characterize those event structures (formally: interval orders) which become linear orders isomorphic to (a) a given ordered real interval (depending on endpoints), or (b) any (non-trivial) ordered real interval (ignoring endpoints) through Russell’s construction of instants. Characterizations for each of the following conditions on the resulting linear order serve as tools: (i) Dedekind completeness, (ii) separability, (iii) plurality of elements, (iv) existence of certain end-points. Moreover, denseness is characterized in order to replace Russell’s erroneous attempt. The use of non-constructive principles to ensure the existence of instants is also discussed. It is shown that such principles can be replaced by the characterizing condition presented for (ii).

Beweisbarkeitslogik für Rosser-Sätze

C. von Bülow

In his classical paper ‘Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme I’ (1931), Kurt Gödel showed that the true sentences of mathematics could not all be obtained as the theorems of any single formal system. To prove this, Gödel constructed arithmetical sentences which state about themselves: ‘I am not provable (in the system considered).’ These Gödel sentences then are in fact not provable. But thus they are true, and therefore exemplify the system’s incompleteness. John Barkley Rosser, in his ‘Extensions of Some Theorems of Gödel and Church’ (1936), improved Gödel’s result by using, instead of Gödel sentences, so-called Rosser sentences, which roughly say: ‘If I am provable then my negation is provable as well.’ Now, formulae of propositional modal logic can be ‘translated’ into the language of arithmetic by replacing propositional variables with arithmetical sentences and interpreting the necessity operator as a formalized provability predicate. In 1976, Robert M. Solovay demonstrated (in ‘Provability Interpretations of Modal Logic’) that the modal system GL yields exactly those formulae which are provable under all such translations. Thus it is possible to investigate the formalized provability predicate by modal-logical means, i.e., to engage in provability logic. In their 1979 ‘Rosser Sentences’, Solovay and D. Guaspari made Rosser-like phenomena accessible to modal-logical methods, too. In this work, Guaspari’s and Solovay’s results, together with the necessary fundamentals from predicate and modal logic, are developed and slightly generalized. I have tried to present the subject so as to be easily accessible even for readers with only a basic knowledge of mathematical logic. Chapters I, II and IV by themselves can serve as an introduction to provability logic.

Kaplan’s A priori

M. Kupffer

In “Demonstratives”, David Kaplan proposes to explicate “a priori” as “true in every context”. This predicts that sentences like “I exist”, “I am now here”, and “everything is as it actually is” are a priori. While I agree with the prediction, I think that the account is dubious. I propose to explicate “is a priori” as “is knowable in virtue of semantic competence” instead. The first part of the paper tries to spell this out in some detail; it also deals with the application of the resulting notion with the help of some principles of disquotation. The second part is concerned with the connection between epistemology and Kaplanian semantics. A condition is formulated under which truth in every context indeed implies knowledge in virtue of semantic competence. On a plausible construal of semantic competence, this condition is not met by sentences like “Hesperus=Phosphorus”, so one can maintain that this sentence is true in every context but not a priori.

Conceivability and the A Priori

M. Kupffer

David Chalmers claims that there is (i) a sense of “conceivable” in which conceivability directly implies metaphysical possibility and (ii) a sense in which it implies epistemic possibility. There is reason to doubt this thesis. First, the proposed ambiguity cannot account for some basic examples; even in the context of a natural reading of the famous zombie argument, neither of the two senses really seems appropriate. Second, a proper analysis of conceivability does not validate the claim that conceivability always implies some kind of possibility. “It is conceivable that S” is not ambiguous and merely implies that it is epistemically possible that S is not inconsistent. Finally, I show how to square the weakness of these implications with the fact that thought experiments sometimes inform us of what we should rationally believe to be possible.