Context-free and context-sensitive dynamics in recurrent neural networks

Boden, Mikael and Wiles, Janet (2000) Context-free and context-sensitive dynamics in recurrent neural networks. Connection Science: journal of neural computing, artificial intelligence, and cognitive research, 12(3-4): 197-210. doi:10.1080/095400900750060122


Author Boden, Mikael
Wiles, Janet
Title Context-free and context-sensitive dynamics in recurrent neural networks
Journal name Connection Science: journal of neural computing, artificial intelligence, and cognitive research
ISSN 0954-0091 (print)
1360-0494 (online)
Publication date 2000-12
Sub-type Article (original research)
DOI 10.1080/095400900750060122
Volume 12
Issue 3-4
Start page 197
End page 210
Total pages 14
Place of publication Abingdon, England
Publisher Taylor & Francis
Language eng
Subject 08 Information and Computing Sciences
Abstract Continuous-valued recurrent neural networks can learn mechanisms for processing context-free languages. The dynamics of such networks are usually based on damped oscillation around fixed points in state space and require that the dynamical components be arranged in certain ways. It is shown that qualitatively similar dynamics, with similar constraints, hold for a^n b^n c^n, a context-sensitive language. The additional difficulty with a^n b^n c^n, compared with the context-free language a^n b^n, consists of 'counting up' and 'counting down' letters simultaneously. The network solution is to oscillate in two principal dimensions, one for counting up and one for counting down. This study focuses on the dynamics employed by the sequential cascaded network, in contrast to the simple recurrent network, and on the use of backpropagation through time. The solutions found generalize well beyond the training data; however, learning is not reliable. The contribution of this study lies in demonstrating how the dynamics in recurrent neural networks that process context-free languages can also be employed in processing some context-sensitive languages (traditionally thought of as requiring additional computational resources). This continuity of mechanism between language classes contributes to our understanding of neural networks in modelling language learning and processing.
Keyword Computer Science, Artificial Intelligence
Computer Science, Theory & Methods
Context-free Grammar
Context-sensitive Grammar
Dynamical System Language
Learning
Recurrent Neural Network
Recursive Structure
Starting Small
Recognizers
Time
Q-Index Code C1

Document type: Journal Article
Sub-type: Article (original research)
Collection: School of Information Technology and Electrical Engineering Publications
Created: Mon, 13 Aug 2007, 12:09:39 EST