Terrence Stewart and Chris Eliasmith
Cognitive theories have traditionally expressed their components in an artificial symbolic language, such as first-order predicate logic, in which the atoms of representation are non-decomposable letter strings. A neural theory that merely demonstrates how to implement a classical symbol system using neurons is, in effect, an argument against the importance of the neural description: the fact that symbol systems are physically instantiated in neurons becomes a mere implementational detail, since there is a direct way to translate from the symbolic description to the more neurally plausible one. It might then be argued that, while the neural aspects of the theory identify how behavior arises, they are not fundamentally important for understanding that behavior; classical symbol systems would continue to be seen as the right kind of description for psychological processes.
Non-Symbolic Compositional Representation and Its Neuronal Foundation: Towards an Emulative Semantics
This article proposes a neurobiologically motivated theory of meaning as internal representation that retains the principle of compositionality but rejects the principle of semantic constituency. The approach builds on neurobiological findings regarding topologically structured cortical feature maps and the mechanism of object-related binding by neuronal synchronization; it incorporates the Gestalt principles of psychology and is implemented by recurrent neural networks. The semantics to be developed is a neuro-emulative model-theoretical semantics of a first-order language, structurally analogous to some variant of model-theoretical semantics. Model-theoretical semantics by itself, however, is merely denotational and implies nothing about the structures of the mind or the underlying neural mechanisms that enable us to produce and comprehend meaningful expressions. The relation between a mental representation and its content is taken to be some form of causal-informational covariation.
The Parallel Architecture shares classical generative grammar’s concern with the psychological and biological foundations of the language capacity. However, it argues for different formal machinery: multiple independent sources of combinatoriality (phonology, syntax, and semantics) connected by interface components. In particular, words are treated as part of the interface components, linking the independent structures. This view leads to an extension of the lexicon to include regular affixes, idioms, meaningful constructions, and even phrase structure rules. Unlike the classical theory, the Parallel Architecture includes a rich and explicit theory of meaning, Conceptual Semantics, which in turn permits a far leaner notion of syntactic structure, Simpler Syntax. The approach integrates aspects of many nonclassical generative theories, makes possible more perspicuous descriptions of many well-known linguistic phenomena, and provides a far closer linkage to psycholinguistics, neuroscience, and evolutionary psychology than has heretofore been possible.