Towards Effective Parsing with Neural Networks: Inherent Generalisations and Bounded Resource Effects
Authors: Peter C. R. Lane (1), James B. Henderson (2)
Affiliations: (1) Department of Computer Science, University of Hertfordshire, Hatfield Campus, College Lane, Hatfield, AL10 9AB, UK; (2) Department of Computer Science, University of Geneva, 24 rue Général Dufour, CH-1211 Genève 4, Switzerland
Abstract: This article explores how the effectiveness of learning to parse with neural networks can be improved by including two architectural features relevant to language: generalisations across syntactic constituents and bounded resource effects. A number of neural network parsers have recently been proposed, each taking a different approach to the representational problem of outputting parse trees. In addition, some of these parsers explicitly attempt to capture an important regularity within language: the generalisation of information across syntactic constituents. A further property of language is that natural bounds exist on the number of constituents a parser need retain for later processing. Both the generalisations and the resource bounds can be captured in architectural features that enhance the effectiveness and efficiency of learning to parse with neural networks. We describe a number of different types of neural network parser and compare them with respect to these two features. Both features are explicitly present in the Simple Synchrony Network parser, and we explore and illustrate their impact on the process of learning to parse in experiments with a recursive grammar.
Keywords: neural networks; resource effects; structured representations; syntactic parsing; systematicity
Indexed in SpringerLink and other databases.