Mozer and Das (1993) propose what could be considered an
asynchronous DTRNN-like architecture that effectively acts as a
pushdown automaton implementing a shift-reduce parsing
strategy. The implementation is based on the following modules:
- A stack which may contain input
symbols
(terminals) shifted onto it or nonterminal symbols
(variables).
- A set of demon units that can read the top two stack symbols.
Each demon unit reacts to a particular pair of symbols, pops both of
them from the stack, and pushes a particular nonterminal. When no
demon unit reacts, a new input symbol is pushed onto the stack.
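The shift-reduce behaviour of the two modules above can be sketched, in a purely discrete form, as follows. The rule table (one entry per "demon") and the example grammar are illustrative assumptions, not the grammars or the continuous dynamics used by Mozer and Das (1993):

```python
def shift_reduce(string, demons, start="S"):
    """Parse `string` with a stack; each demon maps a pair of
    top-of-stack symbols to the nonterminal that replaces them."""
    stack = []
    pos = 0
    while True:
        top_pair = tuple(stack[-2:]) if len(stack) >= 2 else None
        if top_pair in demons:
            # A demon fires: pop both symbols, push the nonterminal.
            stack[-2:] = [demons[top_pair]]
        elif pos < len(string):
            # No demon reacts: shift the next input symbol.
            stack.append(string[pos])
            pos += 1
        else:
            break
    # Accept iff the stack holds exactly the start symbol.
    return stack == [start]

# Hypothetical demon table for a^n b^n (S -> ab, S -> aSb via T):
demons = {("a", "b"): "S", ("a", "S"): "T", ("T", "b"): "S"}
print(shift_reduce("aabb", demons))  # True
print(shift_reduce("aab", demons))   # False
```

Each iteration first lets a demon reduce the top two symbols; only when no demon reacts is the next input symbol shifted, mirroring the description above.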
Grammars induced are of the form A -> XY, where X and Y are either
nonterminals or terminals. Both the stack
(as in Giles et al. (1990)) and the demons are continuous to allow for
partial demon activity and variable-thickness symbols in the stack.
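A continuous stack of this kind can be sketched as follows. The class and its interface are illustrative assumptions in the spirit of Giles et al. (1990), not the original implementation; the key point is that symbols carry a continuous "thickness" and may be pushed or popped partially:

```python
class ContinuousStack:
    """A stack whose symbols have continuous thickness (top at the end)."""

    def __init__(self):
        self.items = []  # list of (symbol, thickness) pairs

    def push(self, symbol, thickness):
        self.items.append((symbol, thickness))

    def pop(self, amount=1.0):
        """Remove a total thickness `amount` from the top; the topmost
        symbol may be removed only partially."""
        removed = []
        while amount > 1e-9 and self.items:
            sym, t = self.items.pop()
            if t > amount:
                self.items.append((sym, t - amount))  # partial pop
                removed.append((sym, amount))
                amount = 0.0
            else:
                removed.append((sym, t))
                amount -= t
        return removed

    def top(self, depth=1.0):
        """Read the composition of the top `depth` of the stack."""
        out, remaining = [], depth
        for sym, t in reversed(self.items):
            if remaining <= 1e-9:
                break
            out.append((sym, min(t, remaining)))
            remaining -= min(t, remaining)
        return out

s = ContinuousStack()
s.push("a", 1.0)
s.push("b", 0.6)   # a partially active demon pushes with strength 0.6
print(s.top(1.0))  # [('b', 0.6), ('a', 0.4)]
```

Because pushes and pops are continuous in the pushed thickness, the stack contents vary smoothly with demon activations, which is what makes gradient-based training possible.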
This in turn allows for the use of a gradient-based
method to train the network; the
network is trained to obtain a stack containing only the start
symbol after reading a grammatical
string, and anything else after reading a nongrammatical string.
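This training criterion can be sketched as a loss on the final stack contents. The squared-error formulation below is an illustrative assumption, not Mozer and Das's exact objective:

```python
def stack_loss(final_stack, grammatical, start="S"):
    """final_stack: list of (symbol, thickness) pairs.

    Illustrative loss: a grammatical string should leave exactly one
    unit of the start symbol on the stack and nothing else; a
    nongrammatical string should not look like an acceptance."""
    start_mass = sum(t for sym, t in final_stack if sym == start)
    other_mass = sum(t for sym, t in final_stack if sym != start)
    if grammatical:
        # Penalize deviation from a stack holding only the start symbol.
        return (start_mass - 1.0) ** 2 + other_mass ** 2
    # Nongrammatical: penalize an acceptance-like stack.
    accept = max(0.0, start_mass - other_mass)
    return accept ** 2

print(stack_loss([("S", 1.0)], True))   # 0.0 (correct acceptance)
print(stack_loss([("S", 1.0)], False))  # 1.0 (wrongly accepted)
```

Since the loss is a smooth function of the continuous stack contents, its gradient can be propagated back through the stack and demon activations.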
Mozer and Das (1993) successfully train the network to learn four
simple grammars from relatively small learning sets.