
Introduction

Abstract and Keywords

This article provides a road map for the users of this handbook, and integrates the questions each author addresses into the larger debates of the field. An important portion of the handbook is devoted to the question of whether morphology follows the same rules as syntax, closely related to the issue of whether the lexicon exists as a distinct module of grammar with its own primitives and modes of combination. One of the important issues at the syntax–semantics interface concerns the notion of compositionality. The issues concerning the overall model of grammar are also considered. It is not appropriate to argue for any particular position in this introduction, but it is felt that the issues which emerge most forcefully from this collection are the scope and limitation of the syntactic component, and the autonomy of phonology.

Keywords: morphology, syntax, lexicon, semantics, grammar, phonology

In the Introduction to Syntactic Structures, Chomsky (1957: 11) states that the “central notion in linguistic theory is that of ‘linguistic level’ … such as phonemics, morphology, phrase structure … essentially a set of descriptive devices that are made available for the construction of grammars.” The term “grammar” is used here in the usual ambiguous fashion to refer to both the object of study and the linguist's model of that object. Thus, in chapter 3 (p. 18), Chomsky comes back to the issue of levels, referring both to the complexity of languages and to the usefulness of theoretical descriptions.

A language is an enormously involved system, and it is quite obvious that any attempt to present directly the set of grammatical phoneme sequences would lead to a grammar so complex that it would be practically useless. For this reason (among others), linguistic description proceeds in terms of a system of “levels of representations”. Instead of stating the phonemic structure of sentences directly, the linguist sets up such “higher level” elements as morphemes, and states separately the morphemic structure of sentences and the phonemic structure of morphemes. It can easily be seen that the joint description of these two levels will be much simpler than a direct description of the phonemic structure of sentences.

In current parlance, we say that knowledge of language is modular, and individual linguists tend to specialize in research on a particular module—syntax, morphology, semantics, or phonology. Of course, the very existence of each module and the boundaries and interfaces between the modules remain issues of controversy. For many years the dominant model of the architecture of the language faculty, including the relationship among modules, has been the Chomskian T-model dating from the 1960s. However, linguistic theory has been undergoing important changes over the last ten years. Recent work has succeeded both in deepening the theoretical issues and expanding the empirical domain of the object of inquiry. While the D-structure and S-structure levels are no longer universally accepted as useful levels of representation, the nature of PF, the interface of the grammar module(s) with the auditory-perceptual system, and LF, the interface of the grammar with the conceptual-intentional system, have increased in theoretical importance. The recent empirical and theoretical challenges to the dominant T-model have in many cases undermined the presuppositions underlying that basic architecture and have reopened many important questions concerning the interactions between components of the grammar.

One striking discovery that has emerged from recent work is the importance of the various interfaces between modules within the grammar in understanding the nature of the language faculty. Indeed, one could argue that in understanding the interfaces between syntax and semantics, semantics and pragmatics, phonetics and phonology, or even syntax and phonology, we place boundary conditions on the scope and architecture of the theory as a whole. It is not surprising then that some of the most intellectually engaging and challenging research in recent years has emerged precisely at these interfaces.

In commissioning the chapters for this volume, we have deliberately adopted a narrow interpretation of the term “interfaces” as referring to the informational connections and communication among putative modules within the grammar. The term “interface” can of course legitimately be applied to the connections between the language faculty and other aspects of cognition (e.g. vision, reasoning) or between linguistics and other disciplines (e.g. philosophy, psychology). Our choice of scope here reflects our belief that the narrower interpretation of “interface” allows us to focus on the most crucial issue facing generative linguistics—the internal structure of the language faculty—which we believe should be prior to consideration of how this faculty interacts with others. In other words, our decision reflects not only a practical choice based on current research trends but also the logical priority of defining the elements of comparison (say, language and vision) before undertaking such comparison.

The chapters in the book are original contributions by authors who have been working on specific empirical problems and issues in areas that cross-cut traditional domains of grammar. In some cases the authors address a particular debate that is ongoing in the field; in others, their aim is to throw light on an empirical area that has proved challenging for the modular view in general or in which the choice of analytic tools is still open. In some ways, this is a handbook with a difference because there is no prejudged territory to cover, and no obvious partitioning of our researchers' concerns into neat components (by definition). Our purpose here in the introduction, therefore, is to provide something of a road map for the users of this book, and to integrate the questions each author addresses into the larger debates of the field.

Part I Sound

Though Scobbie presents several possibilities found in the literature, both for a strict demarcation of phonetics and phonology and for no demarcation at all, he himself favours a less categorical view of the matter. He suggests that the problems faced by researchers in categorizing low-level but language-specific phenomena into phonetics or phonology, in defining clear-cut modules and their interface, reflect the nature of the phenomena themselves: the very existence of an ambiguous no-man's-land between phonetics and phonology may reflect (and be reflected by) the non-deterministic mental representations in the systems of individual speakers. Scobbie thus argues for a more flexible quasi-modular architecture, best modelled, he claims, stochastically. He points out that descriptive data based on transcription is biased in necessarily assuming traditional categorical and modular interpretations. Quantitative continuous data, he suggests, could provide new evidence for meaningful debate on the nature of the interface.

In addition to providing a useful survey of issues related to phonetics and phonology from a wide range of approaches, including generative phonology, articulatory phonology, exemplar theory, and others, Scobbie forces us to recognize that many common and intersecting assumptions concerning phonological representation and computation are rarely justified explicitly, even for the analysis of familiar data from languages as well studied as English.

In a chapter diametrically opposed to Scobbie's theoretical ecumenicism and desire to blur the phonetics–phonology boundary, Reiss defines a number of modules composing what are typically referred to as phonetics and phonology. He assumes that these modules are informationally encapsulated from each other, and defines the “i–j interface” as a situation in which the outputs of one module Mi serve as the inputs to another module Mj. His claim is that the problem of understanding the i–j interface reduces to identification of those outputs of module Mi which Mj receives. Reiss provides a speculative discussion about how results in auditory perception can aid our understanding of phonology. His general point is that by better understanding what phonology is not, we can understand better what phonology is: we will then not mistakenly attribute a property to the phonology that rightly belongs elsewhere. This issue is related to a general discussion of the purview of Universal Grammar and the relationship between our sources of data and the theories we construct.

Hale and Kissock use the Marshallese vowel system, which underlyingly has four distinct members, to explore the phonetics–phonology interface from an acquisition perspective. On the surface, these four vowels show wide variation, depending on the features of flanking consonants. Hale and Kissock discuss the difficulties that such a system poses for phonetically grounded versions of Optimality Theory. They also claim that acquisition of such a system would require, under OT assumptions, that markedness constraints be low-ranked at the initial state of the grammar. This claim, which contradicts all work on acquisition in OT except for earlier work by Hale, leads to a general critique of various aspects of the OT framework, including Richness of the Base and “the emergence of the unmarked” in child language.

Orgun and Dolbey discuss the morphology–phonology interface in the context of a theory of Sign-Based Morphology. In this framework, the grammar consists of lexical items and sets of relations among them. For example, the grammar lists both singular book and plural books, and there is a relation that maps these items to each other. In order to address the issue of apparent cyclicity and over- and underapplication of phonological processes in morphologically complex words, the authors develop a specific version of Sign-Based Morphology which treats paradigms as elements of the theory: Paradigmatic Sign-Based Morphology (PSBM). The authors present solutions for a number of puzzling phonology–morphology interactions in Turkic and Bantu languages by embedding their PSBM within an OT grammar, which allows them to invoke the type of output–output correspondence and uniform exponence constraints found elsewhere in the OT literature.

Elordieta surveys various models of the phonology–syntax interface of the past twenty years, all of them fairly closely related to the Government and Binding and the Minimalist versions of syntactic theory. This chapter is explicit about the shortcomings of its predecessors that each new model was most concerned to address. It concludes with Elordieta's analysis of a vowel-assimilation pattern in Basque, which he analyses as reflecting the syntactic and phonological closeness of elements entering into feature chains consisting of feature-checking relations. The process in question occurs in nominal contexts between a noun and a following determiner or case marker, and in verbal contexts between a verb and a following inflected auxiliary. The point of the analysis is to show that these two contexts form a natural class under a certain version of Minimalist checking theory. This contribution leads naturally into the second part of the book, in which researchers in the area of non-phonological structure grapple with problems and issues from a syntactic perspective.

Part II Structure

The existence of a syntactic component of grammar, encoding hierarchical structural relations, does not seem to be in any doubt within the field of generative grammar. However, questions arise as to the nature of the relational and transformational mechanisms involved (if any), and how far they should extend into domains like morphology and the lexicon. An important portion of this book is devoted to the question of whether morphology follows the same rules as syntax, closely related to the issue of whether the lexicon exists as a distinct module of grammar with its own primitives and modes of combination.

In her contribution, Rosen adopts a syntactic perspective on issues traditionally considered within the domain of lexical semantics and argument structure. She proposes a close connection between argument roles and the syntactic projections that are responsible for case, agreement, and notions like grammatical subject. However, she suggests that languages can systematically differ in whether they use a lower domain of functional projections for argument licensing (ones like vP and TP that are correlated with the well-known phenomena of nominative and accusative case), or whether they choose higher functional projections (those within the CP domain). In the former languages, arguments are classified on the basis of their effect on the event structure (specifically, initiation and telicity), while in the latter case the arguments are classified along more discourse-driven lines (topic-hood, point of view). Rosen offers a survey of the different languages of each type and the syntactic properties that distinguish the behaviour of their arguments. The claim here is that the supposedly semantic and thematic differences among arguments and their modes of organization are actually tied to syntax and the functional projections that are active, and do not belong to some separate semantic module of grammar.

Julien then presents a view of the syntax–morphology interface, arguing that the notion of word is an epiphenomenon, based on the specifics of syntactic structure combined with the possibility of certain morphemic collocations to assume a distributional reality. She suggests that items traditionally considered to correspond to “word” actually derive from many possible distinct syntactic head configurations (head–head, head–specifier of complement, and specifier–head in the basic cases) where movements and lexical access conspire to create linear adjacency and distributional coherence. Drawing on evidence from a variety of languages, she shows that constraints on syntactic structure, and specifically the functional sequence, can explain the patterns and non-patterns of so-called word-formation across languages, without invoking morphology-specific modes of combination. In this sense, Julien is arguing for a strongly syntactic approach to morphology and against a lexicalist view of the notion of “word”.

Svenonius approaches the same interface with a rather different set of theoretical tools in mind, and a different set of ordering data. He observes that, when elements ordered in a logical hierarchy appear in natural languages, some cross-linguistically robust patterns of linear ordering emerge at the expense of others. Rather than reflecting a simple syntactic head parameter, these generalizations, he shows, are more insightfully described by different sorts of phrasal movement—roll-up, curl, and constituent fronting. Strikingly, he reveals that in a parallel fashion, morpheme ordering conforms to many of the very same patterns and generalizations as are found in the syntactic domain. The argument here is thus not only that morphology operates on the same sorts of hierarchically ordered structures and primitives as syntax, but that it also participates in the very same sorts of transformations that affect word-word linearization.

Embick and Noyer present a Distributed Morphology view of the relation between morphology and syntax which shares some important properties with Julien's. In particular, they argue that the notion of “word” does not correspond to any genuine linguistic primitives and that the morphological patterns of vocabulary insertion are a direct reflection of syntactic structure. They position themselves strongly against what they call the “lexicalist” camp and deny that there is an independent lexical module with its own primitives and modes of combination. For them, the only generative component is the syntax, and they argue that this is the null, most “minimal” hypothesis. However, they differ from Julien in assuming that “words” are inserted at syntactic terminals, and therefore only countenance a subset of the syntactic configurations (basically just complex heads formed by head–head adjunction) that Julien allows to give rise to “word-like” (distributionally privileged) sequences. This more restrictive mapping from the syntax to insertion forces them to admit a larger set of counter-examples to the straightforward mapping between the two domains. Thus, in their system, they have a number of post-Spell-Out operations that can modify the syntactic representation prior to vocabulary insertion, as well as phonological rules that can change linear ordering. Embick and Noyer claim that these operations are only minimal departures from the strong hypothesis that syntactic structure is responsible for morphological patterns, and that these rules are learned on a language-by-language basis. However, the large number and the power of these operations raise the question of whether they are not in fact covertly constructing, perhaps not a lexical, but certainly a morphological component.

Ackema and Neeleman argue for keeping the domains of morphology and syntax distinct, but within the larger domain of the syntactic module. They argue that “morphology” is actually “Word syntax”, whereas what is traditionally called syntax is actually just a submodule concerned with “Phrasal syntax”. While both submodules share some primitives inherited from the fact that they are both a type of syntax (e.g. category labels, merge, c-command, argument), they also each have more specialized operations and primitives that make them distinct. For example, phrasal syntax makes reference to notions such as EPP and wh-movement; word syntax must make reference to features such as “latinate” vs. “germanic” or features that encode declension class membership. They present evidence in their chapter that the two types of syntax are indeed autonomous and that they do not interact with each other directly, and that it would complicate the notions required in phrasal syntax if one were to attempt to do so. In cases where it seems direct interaction might be necessary, they present analyses to argue that the effects derive instead from the interaction between the syntactic module as a whole and the phonological module of grammar, that is, from the correspondence principles required between the two macromodules.

This leads naturally to the chapter by Williams, who takes a position similar to that of Ackema and Neeleman despite some superficial differences in terminology. Williams argues that the syntax of the word is distinct and informationally encapsulated from the syntax of phrases and that this is responsible for a series of basic and robust effects. He agrees that both levels are in some sense syntactic and that they share some basic properties in that they are combinatoric and are sensitive to some of the same features. However, they differ in that the word-level does not tolerate “delayed resolution” of certain relations such as argument relations or anaphoric dependency. Williams uses the term “lexical” to refer to the word-level but, like Ackema and Neeleman, is careful to distinguish it from the notion of “listeme”, which clearly cross-cuts the word and phrasal domains. Again like Ackema and Neeleman, he argues that the confusion in the use of the term “lexicon” has been responsible for some of the general confusion in the debate, most particularly in the criticism of lexicalism by the proponents of Distributed Morphology. The second half of Williams's chapter is a careful criticism of the assumptions and analyses of a particular version of the DM view, showing that they cannot actually avoid the distinction between word-level and phrase-level syntax that he takes as primitive.

Stewart and Stump argue for a particular version of a realizational-inferential view of morphology, which they call Paradigm Function Morphology (PFM). The crucial aspects of this position involve the idea that the interface between morphology and syntax is “word-based” rather than “morpheme-based” and that external syntax is blind to the internal morphological structure of a word. They present analyses of important and pervasive properties of natural language morphological systems which can be straightforwardly described by a system of rules which, in a language-specific way, map roots and an associated bundle of morphosyntactic features to phonological forms. In this system, there is no internal “syntactic” structuring to the morphosyntactic features, although there is some structural complexity in the way in which reference to paradigms is exploited to express systematic generalizations about the way in which certain feature clusters or rule blocks interact in a particular language. They contrast their perspective with that of Distributed Morphology which, although also “realizational”, is fundamentally morpheme-based, and argue that the word-based view is empirically better motivated and conceptually preferable in being more restrictive. For Stewart and Stump, mirror principle effects, or effects that seem to correlate with syntactic generalizations, are epiphenomenal and derive from historical grammaticalization paths; they should not be built into the theory of the synchronic system which inserts words as unanalysed wholes into the syntactic derivation. Thus, the view presented in this chapter also contrasts generally with the more syntactic approaches to word formation as found in most radical form in the contributions by Julien and Svenonius, and is an important counterpoint to them.

Part III Meaning

One of the important issues at the syntax–semantics interface concerns the notion of compositionality. Higginbotham, in his contribution, argues forcefully that this is not a conceptual triviality, but an empirical working hypothesis which should be used to probe important questions about the syntax–semantics interface in natural language. In particular, if it is taken as a constraint which imposes function-argument application as the only semantic mode of combination for syntactic merge at the same time as allowing an n-ordered logic of indefinitely large n, then the principle itself reduces to vacuity. On the other hand, if it is construed as a hypothesis that restricts the composition of semantic values to be genuinely local within a conservative second-order logic, then it has some bite. Higginbotham emphasizes that the syntax–semantics interface problem is essentially one in three unknowns: the nature of the meanings involved, as known by a native speaker of the language; the nature of the syntactic inputs to interpretation; and the nature of the mapping between the two. Actual natural-language examples may require adjustments in any of these three areas, keeping strong compositionality as a background assumption. Higginbotham also assumes that there is a principled distinction between the semantics of lexical items and the “combinatorial semantics” of natural languages, only the latter being subject to the compositionality thesis. He takes a fairly conservative position on the size of those lexical items, eschewing the finer syntactic decompositions of lexical items such as those found in some recent theoretical work (see, for example, Rosen, this volume). On the other hand, his main theoretical point concerning the status of the compositionality thesis holds even if one believes that “lexical items” (in the sense of listed elements) are somewhat smaller than he assumes.

Büring examines an issue that directly concerns the phonological “component”, namely, intonation. This important area is a domain where phonological/intonational and semantic/informational structural information seem to be most directly correlated. Do we need to forge a direct connection, or are the relationships more subtle? Büring proposes an account whereby a single syntactic representation, which contains formal features such as F (focus) and CT (contrastive topic), is seen by both the phonological and interpretational modules of the grammar. He argues that there is no need for a level of information-structure representation per se, and further that there are formal syntactic features that have predictable effects at the interpretational interface. For Büring, these effects are discoursal and not directly truth-conditional in nature, although they can be modelled in terms of an update function from context to context. Focus is marked according to lack of “givenness” in the sense of Schwarzschild (1999), but is also affected by general principles relating to Question–Answer Congruence (QAC). The phonological interface operates with an entirely different vocabulary, and interprets the formal features as constraints on the placing of pitch accent, nuclear pitch accent, and intonational tunes within the context of a general prosodic implementation involving both specific rules and defaults. Büring finally considers the effect of information structure on constituent order in various languages. Here he suggests that some movements are clearly triggered by prosodic constraints. He discusses the implications of this for a derivational view of syntax: such a view would either have to embody “anticipatory” movements, or allow optional movements while filtering out ill-formed derivations at the interface under a matching condition. The latter system would be equivalent to a direct non-derivational mapping between prosodic and syntactic structure, and would raise the issue of whether a derivational view of the syntactic component gives the most natural modelling of the relation between syntax and prosody.

Potts takes the old definition of conventional implicatures from its Gricean source and argues that, far from being a class of meanings with no coherent identity, it singles out a distinct and pervasive phenomenon in natural language. He shows that there does exist a class of meanings which is at once (a) linguistically driven, (b) non-defeasible, and (c) independent of the main assertive content (“at-issue” content) of the sentence. This turns out to be the class of appositive, speaker-oriented meanings with its own particular syntactic and semantic properties. He first shows that these meanings must be distinguished from presuppositions (with which the old class of conventional implicatures has often been wrongly conflated), conversational implicatures, and at-issue content. He then argues that, given the parallel contribution of these items to the at-issue content (despite their syntactic integration), there are two obvious ways to create a compositional analysis of their systematic contribution to meaning: assume non-standard syntactic structures with a non-standard syntax–semantics mapping; or use a standard tree architecture within a radical multidimensional semantics. Potts argues that the former route is both theoretically undesirable and empirically problematic and develops the latter as an elegant formal solution to the problem. The impact of this class of meanings and their solution drastically alters our view of the structure of meaning representations: the multi-dimensionality it embraces has implications for the syntax–semantics interface and the relationship between semantics and pragmatics.

Beaver and Zeevat take on the complex and intricate problem of accommodation, which sits right at the interface between semantics and pragmatics. Since Lewis (1979), accommodation has been understood as the process by which speakers “repair” presupposition failures to achieve felicity in discourse, and as such it has been closely tied up with the research on presupposition. As with presupposition, the phenomena treated here extend the question of the various roles of syntax, information structure, discourse representation, and conversational principles in accounting for the ways in which speakers negotiate meanings. Presupposition triggers come from open-class lexical items (such as verbs like stop and realize), from functional items such as determiners, and even from certain constructions (such as clefts). Beaver and Zeevat argue that the complex process of accommodation is not a mere pragmatic accessory, but is “at the heart of modern presupposition theory” and not distinct from the problem of presupposition projection. Like the latter, it seems to be sensitive to syntactic domains and/or levels in discourse representation structure. The specific location and nature of accommodation also seems to be guided by conversational implicatures, information structure, and specific contextual information, although the debate continues about the centrality of these different influences. A further intriguing problem for a systematic and unified treatment of the phenomenon of accommodation lies in the fact that certain presuppositional elements, such as too, definite determiners, and pronominals, seem to require explicit discourse antecedents and cannot be “saved” by post hoc accommodation of referents. An important question raised for language is not only why there should be linguistic items that trigger presuppositions in the first place, but also why such differences between them in terms of the possibility of accommodation should exist.

Part IV Architecture

Because of the recent changes and re-axiomatizations ushered in by the Minimalist Program (following on from Chomsky 1993), the architecture of the grammar and its relation to the interfaces and levels of representation has been subject to more internal scrutiny and questioning of assumptions. Part IV addresses issues concerning the overall model of grammar.

Boeckx and Uriagereka present an overview of the issues that have motivated generative grammar and the changes within the specifically Chomskian line of thinking that culminates in the Minimalist Program (MP). They argue that the MP shows clear continuities with Chomsky's earlier thinking, and that it represents an advanced stage of theoretical understanding of the core syntactic and interface issues. In particular, they show that it allows novel and more explanatorily adequate analyses of phenomena such as existential/expletive constructions in natural language. They take these constructions as their case study because on one level they are among the most basic and simple constructions, but also because they are representatively complex in implicating many different (simple) interacting elements of grammar. Their strongly derivational minimalist approach to the particular problem of expletive constructions is used to demonstrate that the minimal devices present in this theory are sufficient to account for a phenomenon of great internal complexity. One interesting interface issue is raised by the prospect (made available in the MP) of a dynamically split model of multiple spell-out, in which the interfaces at PF and LF are accessed cyclically and locally in derivational chunks or “phases”, based on a partitioned numeration. This theoretical option gives a rather different architecture and makes different predictions from the pre-MP T-model. On a more conceptual level, Boeckx and Uriagereka argue that minimalist theorizing allows new and deeper questions to be asked about the relationships between grammar and mind/biology.

Steedman, in his chapter, argues that many basic minimalist tenets are sound and that our model of grammar should indeed be driven by our understanding of the necessary properties of the minimally necessary interfaces with sound and meaning. He gives a proposal for the form of the intervening derivational module which combines these basic prerequisites with the insights of computational and non-derivational frameworks (claiming that such a convergence is both timely and necessary, given that the ideal is a model in which competence and performance are closely linked). Essentially, he argues for a version of categorial grammar (Combinatory Categorial Grammar) which can be translated into a system of productions for generating information structures and syntactic-category structures in parallel. The novel aspects of his proposal (from the point of view of mainstream minimalism) lie in the fact that traditional notions of constituency are abandoned in favour of a more flexible mode of combination, and that final word order is argued to be under lexical control. Steedman motivates the flexibilities in constituency with data from the groupings found in intonational phrasing, and shows that the account he proposes can also account for classic cases of crossing dependencies in Dutch. The system also depends on re-evaluating the role of the numeration: rather than starting the derivational process with an arbitrarily chosen multiset of lexical elements (which may support more than one distinct string in the language, or no string at all), the numeration is simply the ordered multiset of terminal lexical elements of the derivation. As such, the notion of a numeration is largely redundant, being entirely determined by either the string or the derivation, depending on whether the analytic or generative viewpoint is taken. The advantages of the Steedman system are that PF, S-structure, and intonational structure can be unified under a single surface derivational module capturing many otherwise problematic correlations. At the same time, the model conforms with standard desiderata of having syntactic rules be universal and invariant, with language-specific information being relegated to the lexicon. In an important way, Steedman's model is much more lexicalist than the MP: word order and all bounded constructions are controlled by lexically specified information. With respect to word order, it is important to recognize that under certain versions of minimalism (i.e. those involving elaborate movements to derive word order effects (cf. Svenonius, this volume)), the triggering features and parametric differences distinguishing languages with different word orders are currently fairly obscure. Steedman's solution to locating these effects is an unashamedly lexical one. The issues in this chapter thus also bear on the debates elsewhere in this volume concerning the independent status of the lexicon as a module of grammar, and on the contribution by Jonas Kuhn on constraint-based models of grammar, which clearly place considerable weight on the lexicon as a module.

Kuhn considers the notion of interface from the point of view of non-derivational theories of grammar—specifically LFG and HPSG. As he points out, in some sense the notion of interface gains greater prominence within this class of theories than in either the Principles and Parameters framework or the MP. For the latter theories, there is one derivation (possibly with levels of representation linked by transformations) and the only “interfaces” are with modules outside the domain of the computation—minimally PF, LF, and possibly the Lexicon or Morphology if these are to be considered distinct modules. Within constraint-based theories, transformations are eschewed in favour of parallel representational modules with potentially different primitive elements and internal relations. A particular complex network with parallel representations in different domains is well formed if it can be successfully “unified”. Under this view, every set of constraining relations between one module and another constitutes an “interface”. Thus, for every level of representation or module there is a potential interface with every other module, since the modules form a network rather than a serialized pipeline set of representations. Kuhn offers a perspective from which the drive to eliminate modules does not exist. Within these theories, the claim is that distinct modules do more justice to the heterogeneity of linguistic generalizations and mismatches between domains than do theories which attempt to reduce the important domain of generalizations to very few modules. In addition, the perspective is different because the mapping principles themselves constitute the interface between highly articulated levels of representation. Within the MP, the syntactic computation is the mapping between the “interface” levels of PF and LF. It is important to keep this difference in the interpretation of “interface” in mind when approaching constraint-based theories, and to see what a difference in perspective it implies: from a constraint-based point of view, minimalists are actually pursuing a highly elaborated explanation of the interface between the representational levels of PF and LF, albeit couched within a derivational metaphor.

The issues explored in this volume are still in many cases open for debate. We do not think that it is appropriate to argue for any particular position in this introduction, but we feel that the issues that emerge most forcefully from this collection are (i) the scope and limitation of the syntactic component, and in a parallel way (ii) the autonomy of phonology. Although many of the debates remain inconclusive, we do believe that the chapters in this book are useful and original contributions to the most important questions in the field today.

References

Chomsky, N. (1957), Syntactic Structures, The Hague: Mouton.

——— (1993), “A Minimalist Program for Linguistic Theory”, in K. Hale and S. J. Keyser (eds.), The View from Building 20: Essays in Linguistics in Honor of Sylvain Bromberger, Cambridge, MA: MIT Press, 1–52.