Contemporary perceptual psychology uses Bayesian decision theory to develop Helmholtz’s view that perception involves ‘unconscious inference’. The science provides mathematically rigorous, empirically well-confirmed explanations for diverse perceptual constancies and illusions. The explanations assign a central role to mental representation. This article highlights the explanatory centrality of representation within current Bayesian perceptual models. The article also discusses how Bayesian perceptual psychology bears upon several prominent philosophical topics, including: eliminativism about representation (defended by Churchland, Field, Quine, and Stich); relationalism about perception (endorsed by Brewer, Campbell, Martin, and Travis); phenomenal content (postulated by Chalmers, Horgan and Tienson, and Thompson); and the computational theory of mind (espoused by Fodor and many other philosophers).
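As a rough illustration of the schema such models share (the notation here is generic, not drawn from the article itself): the perceptual system is treated as combining a prior over distal conditions $h$ with a likelihood for proximal sensory input $e$ via Bayes' theorem, then selecting an estimate that minimizes expected loss:

\[
p(h \mid e) \;=\; \frac{p(e \mid h)\, p(h)}{p(e)}, \qquad \hat{h} \;=\; \arg\min_{a}\, \sum_{h} L(h, a)\, p(h \mid e),
\]

where $L(h, a)$ penalizes estimating $a$ when the true distal condition is $h$.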
James M. Joyce
This article is concerned with Bayesian epistemology. Bayesianism claims to provide a unified theory of epistemic and practical rationality based on the principle of mathematical expectation. In its epistemic guise, it requires believers to obey the laws of probability. In its practical guise, it asks agents to maximize their subjective expected utility. This article explains the five pillars of Bayesian epistemology and evaluates some of the justifications that have been offered for each of them. It also addresses some common objections to Bayesianism, in particular the “problem of old evidence” and the complaint that the view degenerates into an untenable subjectivism. It closes by painting a picture of Bayesianism as an “internalist” theory of reasons for action and belief that can be fruitfully augmented with “externalist” principles of practical and epistemic rationality.
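In schematic form (the symbols are generic, not the article's own), the epistemic norm requires a credence function $c$ to satisfy the probability axioms, and the practical norm requires choosing an act $A$ that maximizes subjective expected utility:

\[
EU(A) \;=\; \sum_{i} c(S_i)\, U(A, S_i),
\]

where the $S_i$ are the possible states of the world and $U(A, S_i)$ is the utility of performing $A$ in state $S_i$.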
This article begins with Charles Darwin's publication of his great evolutionary work, On the Origin of Species, which deals with artificial selection (the deliberate combination of traits by breeders) and natural selection (in which certain traits spread because they confer improved ability to survive). The ideal behind the book was that of laws bound deductively into a system. Population genetics does not deal directly with physical objects and their features; rather, it casts everything in terms of the presumed underlying genes, which brings out the semantic aspects of the theory. Various methods, such as the comparative method, are used to study adaptation. Darwin held that group selection might be an important factor in the evolution of morality. The theory of “punctuated equilibrium” supposes that rapid evolution takes place in groups when they are first isolated from the parent body. Darwinian theory adds to our new understanding and at the same time draws strength from that understanding.
This article finds it characteristic of orthodox Bayesians to hold that, for each person and each proposition that person comprehends, there is a precise degree of confidence the person has in the truth of that proposition, and that no person can be counted as rational unless the degree-of-confidence assignment he or she thus harbors satisfies the axioms of the probability calculus. In focusing exclusively on degrees of confidence, the Bayesian approach says nothing about the epistemic status of the doxastic states epistemologists have traditionally been concerned with—categorical beliefs. The purpose of this article is twofold. First, it aims to show that, powerful as many such criticisms are against orthodox Bayesianism, there is a credible kind of Bayesianism. Second, it aims to show how this Bayesianism finds a foundation in considerations concerning rational preference.
This article examines three competing views, entertained in economic theory, about the instrumental rationality of decisions. The first says to maximize self-interest, the second to maximize utility, and the third to “satisfice,” that is, to adopt a satisfactory option. Critics argue that the first view is too narrow, that the second overlooks the benefits of teamwork and planning, and that the third, when carefully formulated, reduces to the second. The article defends a refined version of the principle to maximize utility and discusses generalizations of utility theory that extend it to nonquantitative cases and other cases with nonstandard features. The study of rationality as it bears on law is typically restricted to the uses made of the notion of rationality by the “law and economics” movement. Legal economists accept the traditional economic assumption that rational agents seek primarily to maximize their personal utility.
Werner Güth and Hartmut Kliemt
The originally Hobbesian ideal of twentieth-century neoclassical economics as a discipline that studies human interaction “more geometrico,” as an arena of interactive rational decision making, is rejected. “Explaining” overt behavior as (if it were) the equilibrium outcome of opportunity-seeking rational choices is impossible if the requirement of approximate truth of the explanans is upheld. Stylized accounts of some central experiments (prisoner’s dilemma, ultimatum, dictator, and impunity games, and double oral auctions) show why this is so and illustrate basic contributions of experimental economics in an exemplary manner. A somewhat detailed account of an experiment concerning “equity” shows the explanatory potential and “workings” of experimental economics and how its findings can contribute to traditional philosophical and psychological discussions. Why the Humean “attempt to introduce the experimental method of reasoning into moral subjects” must remain incomplete until experimental economics and experimental psychology become fully complementary research strategies is indicated as well.
William G. Lycan
This article proposes that explanation and epistemology are related in at least three ways. First, “to explain something is an epistemic act and to have something explained is to learn.” The article begins its account of explanation by drawing out several paradigms of scientific explanation, but it finds it unlikely that scientific explanation is captured by a single set of necessary and sufficient conditions. Noting, however, that scientific explanation does not exhaust an account of explanation in general, it moves on to a second way in which explanation is related to epistemology: the idea of explanatory inference. To account for a hypothesis's being “the best,” the article introduces “pragmatic virtues” that can increase the value of a hypothesis. The third way in which explanation relates to epistemology is the claim that a belief can be justified if it is arrived at by explanatory inference.
Igor Douven and Jonah N. Schupbach
Formal epistemology is a young but vibrant field of research in analytic philosophy, characterized by both its subject matter and its method. The subject matter is epistemology, the theory of knowledge; the method involves the use of formal, logico-mathematical devices. This chapter highlights the major achievements of formal epistemology so far and gives a sense of what can be accomplished by addressing problems from mainstream epistemology with the use of logic, probability theory, computer simulations, and other formal tools. The historical roots of the field are also described, and there is a discussion of new questions raised by formal epistemology that should also be of interest to mainstream epistemologists. Attention is also paid to the currently emerging subfield of formal social epistemology.
Donald L.M. Baxter
For Hume, the ideas of space and of time are each a general idea of indivisible objects arranged in a certain manner, with additional qualities that make them conceivable to the mind. He argues that the structures of these ideas reflect the structures of space and time. Thus, space and time are not infinitely divisible, and there can be neither empty space nor time without succession. Hume's idiosyncratic theory can be seen to be reasonable if one pays careful attention to the fact that Hume, in accordance with his skepticism, is concerned only to advance views about space and time as they appear in experience. The chapter focuses on explicating Hume's central arguments rather than attempting a comprehensive treatment.
To begin with, economics has a paradigm, as plainly exhibited by the uniformity of its textbooks. It is a commonplace that, unlike in every other social and behavioral science, the introductory texts in economics are all pretty much the same. By and large you could permute the problems at the ends of chapters among the five largest-selling textbooks in the field and still test students' understanding of the chapters they had actually read. Second, the language of the discipline is highly mathematized. Third, the discipline has identifiable proprietary laws, albeit inexact ones, and a set of proprietary kinds, which the discipline's “discipline” requires to be applied to the solution of puzzles. And it rewards most generously those who find new puzzles to which to apply the laws and concepts. This progressive expansion of the domain of economics is decried by some social scientists as economic imperialism.
This article draws on several elements of Pierre Duhem's account of science to illuminate issues of the cumulation of knowledge and scientific progress in the social sciences. It concentrates on Duhem's principle of the underdetermination of theory by evidence, his holist account of the growth of scientific knowledge, and his conventionalist view of theories. Willard Van Orman Quine's version is holism on a still larger scale than Duhem's. Duhem's appeal to le bon sens does not introduce any damaging subjectivity into an account of science beyond what most philosophers of science recognize. It is clear that no measure-stipulation has been accepted by contemporary authors on all sides of the balance-of-power debate. It is shown that the concept of the measure-stipulation can explain the progress or absence of progress of debates in social science just as well as in the natural sciences.
This article is about the relationship between people, as modeled by the economist's concept of agency, and other entities modeled as agents in economics—in particular, subpersonal interests, as in models descended from Schelling (1978, 1980, 1984), and functional parts of people's brains, as modeled in neuroeconomics. This subject would not be very interesting if people and subpersonal interests reduced to, or were simple additive functions of, functional parts of their brains. However, for reasons that unfold here, it is argued that this common idea is untenable. The relationship between people, subpersonal interests, and brain systems is complicated, not simple. The objective of this article is to shed some light on it, on the basis of recent empirical research. It does not aim at stating a comprehensive theory of the relationship, which would be a premature ambition at this point in our collective knowledge.
This chapter provides an up-to-date discussion of work on two distinct problems in the foundations of quantum mechanics: the problem of the classical regime and the measurement problem. It explains that contemporary work has focused on the role of environmental decoherence in the emergence of classical kinematics and dynamics, and argues that the success of appeals to decoherence in solving the problem depends on the interpretation of the quantum theory. The chapter also considers the collapse postulate, the Born rule, and the apparatus of positive operator valued measures (POVMs).
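For reference, the Born rule the chapter mentions can be stated in its generalized POVM form (standard notation, not taken from the chapter itself): a measurement is represented by a set of positive operators $\{E_a\}$ summing to the identity, and for a system in state $\rho$,

\[
\Pr(a) \;=\; \operatorname{Tr}(\rho\, E_a), \qquad E_a \ge 0, \qquad \sum_a E_a = I,
\]

which reduces to $\Pr(a) = \langle \psi \mid P_a \mid \psi \rangle$ for a projective measurement on a pure state.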
This article is organized around two topics. The first is the methodology of experimental economics, a research program that is becoming increasingly influential in contemporary economic science. The second is normative methodology, an issue that has been widely debated by philosophers of economics over the last two decades. A methodological discussion of experimental economics could simply aim at describing the methods used by experimental economists in their daily work, without asking questions of efficacy or justification. Another approach, the one pursued here, takes a more direct normative stance: instead of passively observing what economists do, the philosopher steps down into the arena of scientific debate and tries to issue some cautious advice on methodological matters.
Stan du Plessis
Data mining could compromise the believability of econometric models. And yet there might be no alternative to data mining if economics is to be an empirical science practiced under the joint constraints of incomplete economic theory and non-experimental data. The organizing principle for this discussion of data mining is a philosophical spectrum that sorts the various econometric traditions according to their epistemological assumptions about the underlying data-generating process (DGP), starting with instrumentalism at one end and reaching claims of encompassing the DGP at the other; call it the DGP-spectrum. In the course of exploring this spectrum, the article discusses various Bayesian, specific-to-general (S–G), and general-to-specific (G–S) methods. A description of data mining and its potential dangers, and a short section on potential institutional safeguards against these problems, set the stage for this exploration.
This article focuses on naturalism. It makes one terminological distinction: between methodological naturalism and ontological naturalism. The methodological naturalist assumes there is a fairly definite set of rules, maxims, or prescriptions at work in the “natural” sciences, such as physics, chemistry, and molecular biology, these constituting “scientific method.” There is no algorithm that tells one in all cases how to apply this method; nonetheless, there is a body of workers—the scientific community—who generally agree on whether the method has been applied correctly. Whatever the method is, exactly—such virtues as simplicity, elegance, familiarity, scope, and fecundity appear in many accounts—it centrally involves an appeal to observation and experiment. Correct applications of the method have enormously increased our knowledge, understanding, and control of the world around us, to an extent that would scarcely be imaginable to generations living before the age of modern science.
Three views of the orientation of the perceptual field are discussed. According to the simple orientation view, the perceptual field itself is an absolute space, and the positions of perceptual objects are intrinsic phenomenal qualities. The sophisticated orientation view rejects the claim that perceived space is intrinsically oriented, and locates the apparent orientation of the perceptual field at the level of egocentric perceptual modes of presentation. Finally, the no-orientation view proposes that the orientation of the perceptual field is due to post-perceptual (e.g. memory) processes. In conclusion, it is suggested that the notion of a frame of reference can be applied at three different levels: the ontological level of perceived position, the conscious level of how perceived position is presented to the subject, and the subpersonal level of the cognitive mechanisms underlying conscious perception.
Andrew Gelman and Cosma Rohilla Shalizi
This article reports the authors' perspective on the philosophy of Bayesian statistics, based on their idiosyncratic readings of the philosophical literature and, more importantly, their experiences doing applied statistics in the social sciences and elsewhere. It is noted that Bayes need not be linked with subjectivity and induction, even though Bayesian statistics is conventionally connected with a formal inductive approach. A problem with the inductive philosophy of Bayesian statistics is that it assumes the true model is among the possibilities being considered. Bayesian data analysis instead fits well into a falsificationist approach. Because the authors feel that the status quo perception of Bayesian philosophy is wrong, they think it more helpful to present their perspective forcefully, with the understanding that it is only part of the larger philosophical picture.
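A minimal sketch of the falsificationist workflow the authors favor, using a toy conjugate model and a posterior predictive check (the model, test statistic, and data below are illustrative inventions, not taken from the article):

```python
# Posterior predictive check: fit a toy model, then ask whether data
# replicated from the fitted model resemble the data actually observed.
import numpy as np

rng = np.random.default_rng(0)
y = rng.standard_t(df=3, size=100)  # "observed" data: heavy-tailed

# Toy model: y_i ~ Normal(mu, 1) with a flat prior on mu, so the
# posterior is mu | y ~ Normal(mean(y), 1/n).
n, ybar = len(y), y.mean()
mu_draws = rng.normal(ybar, 1.0 / np.sqrt(n), size=4000)

# Draw replicated datasets from the posterior predictive distribution.
y_rep = rng.normal(mu_draws[:, None], 1.0, size=(4000, n))

# Tail-sensitive test statistic: largest absolute deviation from the mean.
T = lambda data: np.max(np.abs(data - data.mean()))
p_value = np.mean([T(rep) >= T(y) for rep in y_rep])
print(f"posterior predictive p-value: {p_value:.3f}")
# A p-value near 0 signals misfit: the normal model cannot reproduce the
# heavy tails of the data, so the model is (usefully) falsified.
```

The point of the sketch is the direction of inference: the model is not confirmed inductively but probed for failure, in the hypothetico-deductive spirit the article describes.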
This article examines a number of issues and problems that motivate much of the literature in the philosophy of mathematics. It first considers how the philosophy of mathematics is related to metaphysics, epistemology, and semantics. In particular, it reviews several views of the metaphysical nature of mathematical objects and how they compare to other sorts of objects, including realism in ontology and nominalism. It then discusses a common claim, attributed to Georg Kreisel, that the important issues in the philosophy of mathematics do not concern the nature of mathematical objects, but rather the objectivity of mathematical discourse. It also explores irrealism in truth-value, the dilemma posed by Paul Benacerraf, epistemological issues in ontological realism, ontological irrealism, and the connection between naturalism and mathematics.
Anna Alexandrova and Robert Northcott
This article begins by surveying existing work on scientific models, with an eye to the specific case of economics. It reviews four accounts in particular—the satisfaction-of-assumptions account, the capacities account, the credible-worlds account, and the partial-structures account. It then tells the detailed story of the 1994 Federal Communications Commission (FCC) spectrum auction in the United States, highlighting the crucial role of experiment as well as theory. In the light of this case study, the article presents its own open-formula account of economic models. It then turns to the issue of economic progress. Finally, it concludes that progress in economics should not be identified with progress in economic theory, or at least that the success of the spectrum auction provides no warrant for doing so. Rather, progress is better seen as akin to the worthy but piecemeal variety typical of engineering.