Optimality: From Neural Networks to Universal Grammar
- 14 March 1997
- journal article
- review article
- Published by American Association for the Advancement of Science (AAAS) in Science
- Vol. 275 (5306), 1604-1610
- https://doi.org/10.1126/science.275.5306.1604
Abstract
Can concepts from the theory of neural computation contribute to formal theories of the mind? Recent research has explored the implications of one principle of neural computation, optimization, for the theory of grammar. Optimization over symbolic linguistic structures provides the core of a new grammatical architecture, optimality theory. The proposition that grammaticality equals optimality sheds light on a wide range of phenomena, from the gulf between production and comprehension in child language, to language learnability, to the fundamental questions of linguistic theory: What is it that the grammars of all languages share, and how may they differ?
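The "grammaticality equals optimality" idea described in the abstract can be illustrated with a minimal sketch: candidate symbolic structures are scored against a ranked list of violable constraints, and the grammatical form is the candidate whose violation profile is lexicographically minimal under the ranking. The constraint names (ONSET, NOCODA) are standard textbook examples from the OT literature, but the candidate set and scoring functions below are hypothetical simplifications, not the paper's formalism.

```python
# Illustrative sketch of Optimality Theory evaluation (hypothetical example).
# Each constraint maps a candidate to a violation count; candidates are
# compared by their violation profiles under a fixed constraint ranking.

def optimal(candidates, ranked_constraints):
    """Return the candidate whose violation profile is lexicographically
    minimal under the ranking (highest-ranked constraint compared first)."""
    return min(
        candidates,
        key=lambda c: tuple(con(c) for con in ranked_constraints),
    )

# Toy constraints over dot-delimited syllable strings (simplified):
# ONSET penalizes syllables lacking an initial consonant;
# NOCODA penalizes syllables ending in a consonant.
VOWELS = set("aeiou")

def onset(candidate):
    return sum(1 for syl in candidate.split(".") if syl and syl[0] in VOWELS)

def nocoda(candidate):
    return sum(1 for syl in candidate.split(".") if syl and syl[-1] not in VOWELS)

# Hypothetical candidate parses competing for the same input:
candidates = ["a.pa", "ap.a", "pa"]

# Profiles: "a.pa" -> (1, 0), "ap.a" -> (2, 1), "pa" -> (0, 0),
# so "pa" is optimal under ONSET >> NOCODA.
print(optimal(candidates, [onset, nocoda]))
```

In the theory as the abstract presents it, the constraints are shared across languages while their ranking varies; reordering `ranked_constraints` is what would model cross-linguistic differences in this sketch.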