Figure 1: Overview of the model. The model consists of parsing and generating subsystems, and a central lexicon.

Figure 2: Lexicon. The lexicon is an associative memory, associating the text form of each word with its distributed representation.

Figure 3: Recurrent FGREP-module. At the end of each backpropagation cycle, the current input representations ...

Figure 4: ... hit, blamed the man has been read in. The output of the last act fragment, blamed the man ...

Figure 5: Networks generating helped. The system is at the beginning of generating The woman, who ... The sentence generator has produced the case-role representation of the first act fragment, The woman, and the act generator has output the first word of that fragment ...

Figure 6: Training configuration. Each network is trained separately and simultaneously, developing the same lexicon.

Table 1: Sentence templates.

Table 2: Restrictions. There are 3 different verbs, with 2 possible agents and patients each.

Table 3: Performance. The first column indicates the percentage of correct words out of all output words.
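The captions above only outline the FGREP mechanism, so a minimal sketch of the lexicon update may help: at the end of each backpropagation cycle the error signal is propagated one step further, to the input layer, and the word's distributed representation is modified and written back into the shared lexicon. This is a simplified, non-recurrent illustration under assumed details; the single hidden layer, sigmoid units, squared-error deltas, dimensions, learning rate, and the names lexicon and fgrep_step are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

REP_DIM, HIDDEN_DIM, OUT_DIM, LR = 12, 25, 12, 0.1

# Shared lexicon: text form of each word -> its distributed representation.
lexicon = {w: rng.uniform(0.0, 1.0, REP_DIM)
           for w in ("woman", "helped", "hit", "man")}

W_in = rng.normal(0.0, 0.1, (REP_DIM, HIDDEN_DIM))
W_out = rng.normal(0.0, 0.1, (HIDDEN_DIM, OUT_DIM))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fgrep_step(word, target):
    """One backpropagation cycle: update the weights as usual, then propagate
    the error one layer further and modify the word's input representation,
    storing the new representation back in the shared lexicon."""
    global W_in, W_out
    x = lexicon[word]
    h = sigmoid(x @ W_in)
    y = sigmoid(h @ W_out)

    # Standard backprop deltas for sigmoid units with squared error.
    d_out = (y - target) * y * (1.0 - y)
    d_hid = (d_out @ W_out.T) * h * (1.0 - h)
    # Error signal at the input layer, computed with the old weights.
    d_in = d_hid @ W_in.T

    # Weight updates.
    W_out -= LR * np.outer(h, d_out)
    W_in -= LR * np.outer(x, d_hid)

    # FGREP extension: the input representation itself is adjusted, and the
    # new representation replaces the old one in the shared lexicon.
    lexicon[word] = np.clip(x - LR * d_in, 0.0, 1.0)

# Toy usage: nudge the representation of "helped" toward an arbitrary target.
fgrep_step("helped", rng.uniform(0.0, 1.0, OUT_DIM))
```

Because the representations live in a single lexicon, every module trained this way (as in the training configuration of Figure 6) pushes on the same word vectors, which is how the separately trained networks come to share one set of representations.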