The existence and nature of the a priori are defining issues for philosophy. A philosopher's attitude to the a priori is a touchstone for his whole approach to the subject. Sometimes, as in Kant's critical philosophy, or in Quine's epistemology, a major new position emerges from reflection on questions that explicitly involve the notions of the a priori or the empirical. But even when no explicit use is made of the notion of the a priori in the questions addressed, a philosopher's methodology, the range of considerations to which the philosopher is open, his conception of the goals of the subject, his idea of what is involved in justification — all of these cannot fail to involve commitments about the nature and the existence of the a priori. So understanding the a priori is of interest in itself.
In Philosophical Investigations, Ludwig Wittgenstein talks about action and the will. The main ideas we need to be acquainted with in order to understand Wittgenstein's remarks on this topic are, first, Arthur Schopenhauer's neo-Kantian theory of the will, which Wittgenstein seems to have fully accepted in 1916, and which still influenced his thinking in 1947, and second, the theory advanced in William James's The Principles of Psychology, which Wittgenstein encountered in the 1930s, and rejected root and branch. Schopenhauer and James were in turn reacting, in very different ways, to the empiricist theory of the will, which received its classic exposition in John Locke's Essay Concerning Human Understanding. This article argues that Wittgenstein's treatment of action and the will in Philosophical Investigations is seriously flawed. Wittgenstein fails to disentangle the active/passive distinction and the voluntary/not voluntary distinction; he fails to see that voluntariness is not only an attribute of activity, but of passivity as well; and he confuses action and motion.
There are two main motivations for action-based approaches to perception: the parsimonious assumption that action and perception belong to a single overlapping functional system, and the tendency to minimize the load of internal processing in perception. For example, according to the ecological paradigm, visual perception consists in detecting affordances for action. Many advocates of action-based accounts of perception reject the computational/representational approach and embrace instead an embodied approach to perception and an empiricist view of the contents of concepts. For example, enactivists argue for constitutive links between an agent’s bodily movements and the content of her perceptual experiences. While enactivism is not easy to reconcile with evidence for the two visual systems model of human vision, further support for action-based accounts of social perception has been derived from the discovery of mirror neurons and mirroring processes.
Christian Pohl, Bernhard Truffer, and Gertrude Hirsch-Hadorn
In a number of European countries a particular understanding of transdisciplinarity has evolved over the past few decades, initiated by research on environmental problems. The focus of this type of transdisciplinary research is on helping society solve wicked problems. A specific feature is that, in addition to researchers from different disciplines, representatives of civil society and the private and public sectors are involved in the research process. “Addressing Wicked Problems through Transdisciplinary Research” describes this type of transdisciplinary research, its roots, and the challenges to be dealt with when addressing wicked societal problems.
The lengthy history of interdisciplinary activity in higher education offers important lessons about developing, administering, and assessing interdisciplinary programs. A deepening body of literature in higher education studies and organizational theory surrounds these lessons. This literature acknowledges that, like any other system, higher education institutions face multiple influences from both internal and external stakeholders, an interaction that requires an understanding of the environment in which these institutions operate. This chapter begins from the position of a changing environment for higher education to consider the challenges associated with administering interdisciplinary programs. After establishing organizational norms unique to higher education institutions, the chapter considers three specific areas: (1) the role of boundaries in shaping the university, and how interdisciplinary programs negotiate these boundaries; (2) the persistence of disciplinary cultures, and their impact on interdisciplinary programs; and (3) the resource challenge for contemporary higher education, and how this debate affects interdisciplinary activities.
This article focuses on the distinction between analytic truths and synthetic truths (i.e. every truth that isn’t analytic), and between a priori truths and a posteriori truths (i.e. every truth that isn’t a priori) in philosophy, beginning with a brief historical survey of work on the two distinctions, their relationship to each other, and to the necessary/contingent distinction. Four important stops in the history are considered: two involving Kant and W. V. O. Quine, and two relating to logical positivism and semantic externalism. The article then examines questions that have been raised about the analytic–synthetic and a priori–a posteriori distinctions, such as whether all distinctively philosophical truths fall on one side of the line and whether the distinctions are relevant to philosophy. It also discusses the argument that there is far more a priori knowledge than has commonly been supposed, and concludes by describing epistemological accounts of analyticity.
In the Posterior Analytics, Aristotle develops a theory of demonstration as a way of gaining causal knowledge of things or events (pragmata) under the general plan of constructing both an ideal structure for demonstrative science and a unified, comprehensive theory of heuristic inquiry. The Aristotelian idea of “demonstrative science” is derived from his attempt to characterize the conditions for “knowledge simpliciter (epistêmê haplôs),” that is, causal and necessary knowledge. This article first shows that Aristotle's inquiry theory is a heuristic theory and as such yields scientific knowledge within the scope of his theory of demonstration, and then examines the difficulties which arise concerning the relation between demonstration and definition. In particular, if demonstrations and definitions turn out to be unrelated in terms of their objects, predications, or methods, Aristotle's general plan will be a failure. The article explores how Aristotle attempts to construct a demonstration of what it is. Finally, by analysing his new theory of definition, the article considers how far he has succeeded in developing his heuristic demonstrative inquiry theory.
Catherine Z. Elgin
This article discusses the character of art and the centrality of art education to the curriculum. It argues against the claim that art is impervious to education. It explains that though inspiration is essential to art and inspiration cannot be taught, the assumption that art is entirely a product of inspiration is unfounded. In addition, sensory and emotional responses can be educated. It shows how a symbol-theoretic conception of art readily explains how art education is possible and why it is valuable.
The question of whether art gives us knowledge is as old as the philosophy of art itself: Plato in The Republic argued that, although poetry purports to give knowledge, it in fact does no such thing, but produces a mere deceptive appearance of knowledge. In contrast, Aristotle in The Poetics argued for the capacity of poetry to give its audience knowledge of universals. The dispute has reverberated down to the modern period, and a large part of the contemporary debate is still concerned with the classical form of the question. This can be dubbed the epistemic question: can art give its audience knowledge? Though rarely distinguished from the epistemic question, there is a distinct issue which also needs to be addressed under the general rubric of art and knowledge.
This article describes the concept of ascriber contextualism in relation to skepticism. It explains that ascriber contextualism in epistemology is the view that the truth conditions for sentences containing “know” and its cognates are context sensitive. It discusses the motivations for contextualism, skeptical paradoxes, the mechanism of context shifting, and sensitive moderate invariantism (SMI). It also comments on objections to ascriber contextualism and SMI.
Edward S. Casey
This chapter concentrates on the edges of the lived body, which act to mediate between the outermost and innermost edges. The prospects for construing bodily edges are explored. Bodily edges realise the paradigm of definitive but incomplete self-knowledge in a very particular way: namely, that such edges are parts of parts. The internal and external edges of bodily parts are not only glimpsed in the course of ongoing experience but also offer a grip for hands. Inside/outside is an especially significant binary edge structure, and is altogether central to edges that inhere in bodies. Bodily edges act as the mediatrix between the edges of the perceived earth (the farthest horizons) and those of the inner psyche (most interior parts), and also provide the conjunctures for many things in experienced life-worlds.
The article focuses on Broadbent's approach to the explanation of attention. Broadbent shows that one's information-processing resources have sufficient capacity to encode the simple physical properties of all the stimuli with which one is presented, but have only a limited capacity for encoding the semantic properties of those stimuli. The resulting model depicts perceptual processing as proceeding in two stages. In the first stage, a large-capacity sensory system processes the physical features of all stimuli in parallel. A subset of the representations generated by this large-capacity system is then selected to be passed on to a second perceptual system, which has a smaller processing capacity and which has the job of processing the stimuli's semantic properties. On Broadbent's theory, pre-bottleneck processing is responsible for the detection of simple physical features, and also for own-name detection. The phenomenology of one's shifting awareness in conditions of binocular rivalry is naturally described as the manifestation of a competition, and perhaps of a biased competition.
This chapter presents an account of auditory perception. Its focus is the question of how we perceive the sources of sounds (the sound-producing events) that we take ourselves to hear. The chapter begins by surveying two ways of answering that question—the epistemic account, and the demonstrative account—and argues that neither is adequate. An alternative account—the representational account—is set out and defended. According to this account, auditory experience represents both sounds (and their acoustic properties) and the sources of sounds. What this means is explained, and empirical evidence in support of the account is described.
This article examines philosopher J. L. Austin's argument against skepticism. It explains Austin's response to questions concerning the evaluation of Cartesian arguments in favour of skepticism and the correctness of those arguments' conclusions. It also presents Austin's argument, in his 1946 article, that attention to the everyday practice of making and challenging knowledge claims makes clear that it is a practice in which there is a substantial constraint on what counts as a legitimate challenge to a person's claim to knowledge.
Contemporary perceptual psychology uses Bayesian decision theory to develop Helmholtz’s view that perception involves ‘unconscious inference’. The science provides mathematically rigorous, empirically well-confirmed explanations for diverse perceptual constancies and illusions. The explanations assign a central role to mental representation. This article highlights the explanatory centrality of representation within current Bayesian perceptual models. The article also discusses how Bayesian perceptual psychology bears upon several prominent philosophical topics, including: eliminativism about representation (defended by Churchland, Field, Quine, and Stich); relationalism about perception (endorsed by Brewer, Campbell, Martin, and Travis); phenomenal content (postulated by Chalmers, Horgan and Tienson, and Thompson); and the computational theory of mind (espoused by Fodor and many other philosophers).
James M. Joyce
This article is concerned with Bayesian epistemology. Bayesianism claims to provide a unified theory of epistemic and practical rationality based on the principle of mathematical expectation. In its epistemic guise, it requires believers to obey the laws of probability. In its practical guise, it asks agents to maximize their subjective expected utility. This article explains the five pillars of Bayesian epistemology and evaluates some of the justifications that have been offered for each of them. It also addresses some common objections to Bayesianism, in particular the “problem of old evidence” and the complaint that the view degenerates into an untenable subjectivism. It closes by painting a picture of Bayesianism as an “internalist” theory of reasons for action and belief that can be fruitfully augmented with “externalist” principles of practical and epistemic rationality.
Christa Davis Acampora
Ecce Homo offers Nietzsche’s own interpretation of himself, his thoughts, and his works. This article analyzes how the text bears on his ideas about agency, fate, and freedom. It presents an account of “how one becomes what one is.” For Nietzsche, a person is a set of drives ordered or ranked a certain way; there is no will or subject separate from these that could carry out the work of becoming. What is most important is that one’s drives be coordinated in a single entity. Through such coordination of the drives, some of us can become what we are.
This article examines Nietzsche’s thoughts about becoming and being, and how these are at odds with both knowledge and life. It discusses how Nietzsche addresses this problem, beginning with its historical part: Nietzsche’s story of how the philosophical tradition first builds the concept of being, but then pulls it down by the stages described in the famous ‘history of an error’ chapter in Twilight of the Idols. This development culminates in the replacement of being with becoming. But understanding what Nietzsche means by becoming requires an understanding of its relation to time. We arrive at a genuine sense of becoming only by stripping away our experience of time as succession.
Dorit Bar-On and Kate Nolfi
A fundamental puzzle about self-knowledge is this: spontaneous, unreflective self-attributions of beliefs and other mental states (avowals) appear to be at once epistemically groundless and epistemically privileged. On the one hand, it seems that avowals simply do not require justification or evidence. On the other hand, avowals seem to represent a substantive epistemic achievement. Several authors have tried to explain away avowals’ groundlessness by appeal to the so-called transparency of present-tense self-attributions. After a critical discussion of two extant construals of transparency, this article presents an alternative reading of transparency (based on neo-expressivism about avowals) that explains, without explaining away, the apparent groundlessness of avowals. The article goes on to explore a way of coupling this alternative reading with a plausible account of how it is that ordinary avowals can represent genuine knowledge of present states of mind.
This article examines Bishop Berkeley's view and treatment of skepticism based on the content of his books Principles of Human Knowledge and Three Dialogues between Hylas and Philonous. It explains that the Principles of Human Knowledge was not well received when it was published in 1710 because readers and reviewers understood his denial of the existence of material substance as supporting skepticism. In his second book, Berkeley showed how he was opposed to skepticism. He also argued that certain principles were common among philosophers, and that these principles, either individually or jointly, lead to skepticism of some form.