The processing of natural language is at once naturally symbolic and naturally subsymbolic. It is symbolic because symbols ultimately play a critical role: writing systems, for example, owe their existence to the symbolic nature of language. It is subsymbolic because of the nature of speech, the fuzziness of concepts, and a high degree of parallelism that is difficult to explain as a purely symbolic phenomenon. Building a processor of natural language therefore requires a hybrid approach. This report details a set of experiments that support the claim that natural language can be syntactically processed in a robust manner using a connectionist deterministic parser. The model is trained on patterns derived from a deterministic grammar and tested with grammatical, ungrammatical, and lexically ambiguous sentences.
In the shearing process, clearance has a significant effect on machining accuracy. However, the relationship between the uneven clearance caused by misalignment of the tool position and machining accuracy remains unclear. This is because the effect was previously small, as workpieces were not especially thin, and because no method for precisely measuring and adjusting the tool position had been established. In the present study, therefore, a new method of adjusting the initial tool position is developed. In addition, punching experiments are conducted with the initial tool position adjusted to an accuracy of 2 μm or better, and the effects of clearance on machining accuracy, cross-sectional shape, and hole diameter are investigated for three types of material. These results clarify the importance of adjusting the initial tool position.
Computing devices such as Turing machines resolve the dilemma between the necessary finitude of effective procedures and the potential infinity of a function's domain by distinguishing between a finite-state processing part, defined over finitely many representation types, and a memory sufficiently large to contain representation tokens for any of the function's arguments and values. Connectionist networks have been shown to be (at least) Turing-equivalent when provided with infinitely many nodes or with infinite-precision activation values and weights. Physical computation, however, is necessarily finite.
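The finite-control/unbounded-memory split described above can be sketched as a toy Turing machine. This is an illustrative example, not code from the paper: the transition table is the finite-state processing part, defined over finitely many symbol *types*, while the tape is the memory, holding symbol *tokens* and extendable on demand.

```python
def run_turing_machine(tape, transitions, state="scan", blank="_"):
    """Run until the machine enters the 'halt' state; return the tape contents."""
    tape = list(tape)
    head = 0
    while state != "halt":
        # The tape (memory) is extended on demand -- unbounded in principle,
        # so it can hold tokens for any argument of the function.
        if head == len(tape):
            tape.append(blank)
        symbol = tape[head]
        # The processing part is a *finite* lookup over symbol types:
        # (state, symbol) -> (next state, symbol to write, head move).
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Finite transition table computing the unary successor function:
# move right over a block of '1's, then write one more '1' on the blank.
SUCCESSOR = {
    ("scan", "1"): ("scan", "1", "R"),
    ("scan", "_"): ("halt", "1", "R"),
}

print(run_turing_machine("111", SUCCESSOR))  # -> 1111
```

The table has only two entries, yet the machine handles inputs of any length: the finitude of the effective procedure lives in the table, while the potential infinity of the function's domain is absorbed by the tape.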
The notion of a processing-memory system is introduced to discuss physical computing systems. Constitutive for a processing-memory system is that its causal structure supports the functional distinction between processing part and memory that is necessary for employing a type-token distinction for representations, which in turn allows representations to be the objects of computational manipulation. Moreover, the processing part realized by such systems provides a criterion of identity for the function computed and helps to define the competence and performance of a processing-memory system.
Networks, on the other hand, collapse the functional distinction between processing part and memory. Since preservation of this distinction is necessary for employing a type-token distinction for representations, connectionist information processing does not consist in the computational manipulation of representations. Moreover, since there is no criterion of identity for the function computed other than the behaviour of the network itself, we are left without a competence-performance distinction for connectionist networks.