Similar Documents
 20 similar documents found (search time: 31 ms)
1.
Two-way, two-tape automata were first considered by Rabin and Scott [1]. We show how surprisingly powerful these machines are by describing some programming techniques that can be implemented on them, and by using these techniques to prove that there exists a universal two-symbol, two-way, two-tape automaton. We then show that, contrary to an assertion by Rabin and Scott, the class of sets of pairs of tapes definable by two-way, two-tape automata is not closed under most Boolean operations. This work was supported by the National Science Foundation under grant GJ-596.

2.
Summary There have been many recent proposals for embedding abstract data types in programming languages. In order to reason about programs using abstract data types, it is desirable to specify their properties at an abstract level, independent of any particular implementation. This paper presents an algebraic technique for such specifications, develops some of the formal properties of the technique, and shows that these provide useful guidelines for the construction of adequate specifications. Supported in part by the National Science Foundation under grant MCS-76-06089 and the Joint Services Electronics Program monitored by the Air Force Office of Scientific Research under contract F44620-72C-0061. Also supported in part by the National Research Council of Canada.

3.
Abstract interpretation [6] has long been regarded as a promising optimization and analysis technique for high-level languages. In this article, we describe an implementation of a concurrent abstract interpreter. The interpreter evaluates programs written in an expressive parallel language that supports dynamic process creation, first-class locations, list data structures, and higher-order procedures. Synchronization in the input language is mediated via first-class shared locations. The analysis computes intra- and inter-thread control-flow and data-flow information. The interpreter is implemented on top of Sting [12], a multi-threaded dialect of Scheme that serves as a high-level operating system for modern programming languages.
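As a much-reduced illustration of the general technique (a sketch only, unrelated to the concurrent interpreter's actual implementation), an abstract interpreter evaluates operations over an abstract domain rather than concrete values; the classic example is the sign domain:

```python
# Minimal abstract-interpretation sketch over the sign domain.
# Values are abstracted to one of {NEG, ZERO, POS, TOP}; TOP means "unknown".

NEG, ZERO, POS, TOP = "-", "0", "+", "T"

def sign(n):
    """Abstraction function: map a concrete integer to its sign."""
    return ZERO if n == 0 else (POS if n > 0 else NEG)

def add(a, b):
    """Abstract addition: exact when one operand is ZERO or signs agree."""
    if ZERO in (a, b):
        return b if a == ZERO else a
    return a if a == b and a != TOP else TOP  # mixed signs lose precision

def mul(a, b):
    """Abstract multiplication: fully precise except when an operand is TOP."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

assert add(sign(3), sign(4)) == POS
assert mul(sign(-2), sign(5)) == NEG
assert add(sign(-1), sign(1)) == TOP  # the abstraction cannot decide the sign
```

The interpreter described in the abstract operates on far richer domains (locations, threads, higher-order procedures), but the same soundness idea applies: abstract results must cover every concrete outcome.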

4.
Summary Recent papers have attacked the use of locations, or pointers, as data objects in programs. This paper considers a generalization of this attack: attempting to remove locations from the underlying semantic model. Using the mathematical semantics of Scott and Strachey, we show that reference parameters and Algol scope rules for procedures, two constructs requiring the notion of reference, can be replaced with alternative constructs not requiring locations in the definition. Some conclusions are also drawn about the use of mathematical semantics as a language design tool. This research was partially supported by the National Research Council of Canada and by National Science Foundation grant MCS 76-14293.

5.
6.
7.
This paper proves that the complexity class ⊕P, parity polynomial time [PZ], contains the class of languages accepted by NP machines with few accepting paths. Indeed, ⊕P contains a broad class of languages accepted by path-restricted nondeterministic machines. In particular, ⊕P contains the polynomial accepting-path versions of NP, of the counting hierarchy, and of Mod_mNP for m > 1. We further prove that the class of nondeterministic path-restricted languages is closed under bounded truth-table reductions. These results were announced at the 6th Symposium on Theoretical Aspects of Computer Science [CH3]. Jin-yi Cai was supported in part by NSF Grant CCR-8709818; the work was done while he was at Yale University. Lane A. Hemachandra was supported in part by a Hewlett-Packard Corporation equipment grant, the National Science Foundation under Grant CCR-8809174/CCR-8996198, and a Presidential Young Investigator Award. His work was done in part while at Columbia University.

8.
There have been several proposals for logic programming languages based on linear logic: Lolli [8], Lygon [7], LO [3], LinLog [2], Forum [11], HACL [10]. In these languages it is possible to create and consume resources dynamically as logical formulas, so the efficient handling of resource formulas is an important issue in their implementation. Lolli, Lygon, and Forum are implemented as interpreter systems: Lolli on SML and λProlog, Lygon on Prolog, and Forum on SML, λProlog, and Prolog. However, none of them has been implemented in Java. In this paper, we describe the Prolog Café system, which translates a linear logic programming language called LLP to Java via the LLPAM [12, 5], an extension of the standard WAM [16, 1] for LLP. LLP is a superset of Prolog and a subset of Lolli. The main difference from the first implementation [4] is resource compilation: resource formulas are compiled into closures, each consisting of a reference to compiled code and a set of bindings for free variables, and calling these resources is integrated with ordinary predicate invocation. Prolog Café is portable to any platform supporting Java and easily extensible via Java's class libraries. In performance, Prolog Café generates code that runs, on average, 2.2 times faster than jProlog on a set of classical Prolog benchmarks.
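The resource-compilation idea can be caricatured in a few lines (a hypothetical sketch, not the LLPAM's actual representation): a resource formula becomes a closure pairing a code reference with its free-variable bindings, and linearity means each resource may be consumed at most once.

```python
# Sketch of linear resources as closures (illustrative; the class name and
# interface here are invented, not taken from Prolog Cafe or the LLPAM).

class Resource:
    def __init__(self, code, bindings):
        self.code = code          # reference to the "compiled" code
        self.bindings = bindings  # environment for free variables
        self.consumed = False

    def consume(self, *args):
        """Invoke the resource once; a second use violates linearity."""
        if self.consumed:
            raise RuntimeError("linear resource already consumed")
        self.consumed = True
        return self.code(self.bindings, *args)

# Example: a resource representing  p(X) :- X > N  with free variable N = 3.
r = Resource(lambda env, x: x > env["N"], {"N": 3})
assert r.consume(5) is True
```

The point of compiling to closures, as the abstract notes, is that invoking a resource then looks just like an ordinary predicate call.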

9.
We continue our investigations and study automated theorem proving for reasoning about perception of reasoning agents and their consensus reaching. Using our earlier techniques and those of logic programming, we develop processing techniques for consensus programs. Work partially supported by Polish Government grant KBN 2 2051 91 02 and by U.S. National Science Foundation grant IRI-9012902.

10.
Two notions of correctness and their relation to testing
Summary We consider two interpretations for what it means for test data to demonstrate correctness. For each interpretation, we examine under what conditions data sufficient to demonstrate correctness exists, and whether it can be automatically detected and/or generated. We establish the relation between these questions and the problem of deciding equivalence of two programs. The work of this author was supported in part by the National Science Foundation under grant MCS-8109547.

11.
We study the relationship between probabilistic and unambiguous computation, and provide strong relativized evidence that they are incomparable. In particular, we display a relativized world in which the complexity classes embodying these paradigms of computation are mutually immune. We answer questions formulated in, and extend the line of research opened by, Geske and Grollmann [15] and Balcázar and Russo [3]. Some of these results were presented at the 1989 International Conference on Computing and Information, Toronto [12]. D. Eppstein's research was supported in part by the National Science Foundation under Research Grants DCR-8511713 and CCR-8605353. The research of L. A. Hemachandra was supported in part by the National Science Foundation under Research Grants CCR-8809174, CCR-8957604, and CCR-8996198, and by a Hewlett-Packard Corporation equipment grant. B. Yener's research was supported in part by the Center for Telecommunications Research, Columbia University.

12.
The proliferation of large software systems written in high-level programming languages ensures the utility of analysis programs that examine interprocedural communication. Often these analysis programs need to reduce the dynamic relations between procedures to a static data representation. This paper presents one such representation, a directed acyclic graph named the call graph of a program. We delineate the programs representable by an acyclic call graph and present an algorithm for constructing it, using the property that its nodes may be linearly ordered. We prove the correctness of the algorithm and discuss the results obtained from an implementation of the algorithm in the PFORT Verifier [1].
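The linear-ordering property the abstract relies on is exactly topological ordering of an acyclic graph. A small sketch (procedure names invented for illustration; this is not the PFORT algorithm itself) of building a static call graph and linearly ordering its nodes:

```python
# Sketch: a call graph as a caller -> callees map, linearly ordered with a
# topological sort. Such an order exists exactly when the graph is acyclic
# (i.e., the program has no recursion, direct or mutual).

from graphlib import TopologicalSorter

calls = {
    "main":  ["parse", "solve"],
    "parse": ["lex"],
    "solve": ["lex"],
    "lex":   [],
}

# TopologicalSorter treats the mapped values as predecessors, so passing the
# caller -> callees map places each callee before every procedure calling it.
order = list(TopologicalSorter(calls).static_order())
assert order.index("lex") < order.index("main")
```

If the program were recursive, `static_order()` would raise `graphlib.CycleError`, mirroring the paper's restriction to programs with acyclic call graphs.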

13.
Owing to its rapid convergence, ease of computer implementation [1], and applicability to a wide class of practical problems [2, 3], separable programming is well established among the more useful nonlinear programming techniques [2, 4]. Yet at the same time, its impracticality for highly nonlinear problems, pointed out repeatedly [1, 5, 6], constitutes a severe limitation of this important approach. This emerges even more strongly when one observes the essential failure of the method for some of the very small (2 × 2) problems included in this report. In this context of high nonlinearity, we examine the performance of a convergent (to within a given ε > 0 of the optimal) alternative procedure based on Refs. [7, 8], which obviates the major difficulties effectively by solving a series of non-heuristic, rigorously determined small separable programs, as opposed to the single large one of the standard separable programming technique given, e.g., in Refs. [1, 2, 5]. Specifically, this paper first, in the absence of any such study in the literature, demonstrates the extreme degree of vulnerability of standard separable programming to high nonlinearity, then states the algorithm and some of its important characteristics, and shows its effectiveness on computational examples. These examples include problems requiring up to about 10,000 nonzero elements in their specifications and about 45,000 nonzero elements in the intermediate separable programs, resulting from up to 70 original nonlinear variables and 70 nonlinear constraints.
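The step that degrades under high nonlinearity is the piecewise-linear approximation at the heart of separable programming. A hedged sketch of just that step (the helper below is invented for illustration and is not the authors' algorithm):

```python
# Sketch: approximate a separable term f(x) on [lo, hi] by a k-segment
# piecewise-linear interpolant, the core device of separable programming.
# For highly nonlinear f, the grid must become very fine, which is exactly
# the impracticality the abstract discusses.

def piecewise_linear(f, lo, hi, k):
    """Return a k-segment linear interpolant of f on [lo, hi]."""
    xs = [lo + (hi - lo) * i / k for i in range(k + 1)]
    ys = [f(x) for x in xs]

    def g(x):
        for i in range(k):
            if xs[i] <= x <= xs[i + 1]:
                t = (x - xs[i]) / (xs[i + 1] - xs[i])
                return ys[i] + t * (ys[i + 1] - ys[i])
        raise ValueError("x outside [lo, hi]")

    return g

g = piecewise_linear(lambda x: x * x, 0.0, 1.0, 10)
assert abs(g(0.55) - 0.55 ** 2) < 0.01  # coarse grid: small but nonzero error
```

The alternative procedure the abstract describes replaces one large program built this way with a series of small, rigorously determined separable programs.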

14.
Leftist grammars are characterized in terms of rules of the form a → ba and cd → d, without distinction between terminals and nonterminals. They were introduced by Motwani et al. [13], where the accessibility problem for a general protection system was related to these grammars. This protection system was originally proposed in [4] and [15] in the context of Java virtual worlds. The accessibility problem is formulated in the form "Can object p gain (illegal) access to object q by a series of legal moves (as prescribed by the policy)?" The membership problem for leftist grammars is decidable [13], which implies decidability of the accessibility problem for the appropriate protection system. We study relationships between language classes defined by various types of leftist grammars and classes of the Chomsky hierarchy. We show that general leftist grammars can define languages which are not context-free, answering in the negative a question from [13]. Moreover, we study some restricted variants of leftist grammars and relate them to regular, deterministic context-free, and context-free languages.
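To make the rule forms concrete, here is a toy one-step rewriting function for leftist grammars (an illustrative sketch with an invented two-rule grammar, not a decision procedure for membership): an insertion rule a → ba adds a symbol immediately to the left of an occurrence of a, and a deletion rule cd → d erases the symbol to the left of d.

```python
# Toy leftist-grammar rewriting: rules a -> ba (insert to the left) and
# cd -> d (delete the left symbol). Symbols are single characters; there is
# no terminal/nonterminal distinction, matching the abstract's definition.

insert_rules = {"a": "b"}    # a -> ba
delete_rules = {("c", "d")}  # cd -> d

def step(word):
    """Yield every word reachable from `word` in one rewriting step."""
    for i, s in enumerate(word):
        if s in insert_rules:                      # apply a -> ba at position i
            yield word[:i] + insert_rules[s] + word[i:]
    for i in range(len(word) - 1):
        if (word[i], word[i + 1]) in delete_rules:  # apply cd -> d
            yield word[:i] + word[i + 1:]

assert set(step("a")) == {"ba"}    # insertion to the left of a
assert set(step("cd")) == {"d"}    # deletion of c before d
```

Membership asks whether a target word is reachable from a start word by iterating such steps; the decidability of that question [13] is what carries over to the protection-system accessibility problem.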

15.
We develop a method for generating pseudorandom sequences with Gaussian distribution. The method is based on completely uniformly distributed sequences and linear transformations, such as the Fourier transform and the Walsh transform. We obtain some discrepancy estimates and make a numerical comparison of the two transformations. Furthermore, we show how this method can be used for testing randomness. We remark that similar approaches are due to Gut, Egorov and Il’in [7], Yuen [26], and Rader [21]. The authors are supported by the Austrian-Hungarian Scientific Cooperation Programme, Project Nr. 10U3. This author is supported by the Austrian Science Foundation, Project Nr. P10223-PHY.
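A hedged sketch of the general idea, not the authors' exact construction (they use completely uniformly distributed sequences; plain seeded pseudorandom uniforms stand in here): an orthogonal linear transform such as the Walsh-Hadamard transform turns a block of independent uniforms into outputs that are normalized sums, hence approximately Gaussian.

```python
# Sketch: approximately Gaussian outputs from uniform inputs via the
# fast Walsh-Hadamard transform (orthonormal scaling preserves variance).

import math
import random

def hadamard(x):
    """Fast Walsh-Hadamard transform of a length-2^k list, orthonormalized."""
    h, n = 1, len(x)
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                x[j], x[j + h] = x[j] + x[j + h], x[j] - x[j + h]
        h *= 2
    return [v / math.sqrt(n) for v in x]

random.seed(0)
n = 4096
u = [random.uniform(-0.5, 0.5) for _ in range(n)]  # each has variance 1/12
g = hadamard(u)  # each entry is approximately N(0, 1/12)

mean = sum(g) / n
var = sum(v * v for v in g) / n
assert abs(mean) < 0.05 and abs(var - 1 / 12) < 0.02
```

The paper's discrepancy estimates quantify how close such transformed sequences come to a true Gaussian sample, which is also what makes the construction usable as a randomness test.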

16.
A number of programming languages are studied to compare their techniques for data management in large programs. In particular, it is argued that the hierarchical scope rules of Pascal violate accepted criteria for large program design. FORTRAN is shown to obey these criteria in its global data rules only. To support this, large programs in the two languages are studied in detail. There is no intention to provide statistical arguments, but rather these detailed case studies show the nature of programming style that can and cannot be supported by the languages used. Finally, the problems identified in the case studies are used to examine the suitability of a number of other programming languages with regard to data management in large projects.

17.
In this paper, the first stage of studies concerning the computer analysis of hand X-ray digital images is described. The images are preprocessed and skeletonization of the fingers is carried out. Then, the interphalangeal and metacarpophalangeal joints are detected and contoured, and joint widths are measured. The obtained results largely concur with those obtained by other authors; see Beier et al. [Segmentation of medical images combining local, regional, global, and hierarchical distances into a bottom-up region merging scheme, Proc. SPIE 5747 (2005) 546-555], Klooster et al. [Automatic quantification of osteoarthritis in hand radiographs: validation of a new method to measure joint space width, Osteoarthritis and Cartilage 16 (1) (2008) 18-25], Ogiela et al. [Image languages in intelligent radiological palm diagnostics, Pattern Recognition 39 (2006) 2157-2165], and Ogiela and Tadeusiewicz [Picture languages in automatic radiological palm interpretation, Int. J. Appl. Math. Comput. Sci. 15 (2) (2005) 305-312].

18.
Marker automata
This paper deals with finite automata augmented with markers which the automata can move about on their input tapes. The concept of augmenting automata with markers was first introduced by Blum and Hewitt [1] for two-dimensional automata. Kreider, Ritchie, and Springsteel [6, 7, 8, 12] investigated recognition of context-free languages by (one-dimensional) automata with markers. In this paper, we investigate some fundamental properties of marker automata and study their relationships to other types of automata and languages. The main result is the establishment of an infinite hierarchy of languages recognizable by deterministic, and by deterministic halting, marker automata. It also turns out that, because of the equivalence of finite marker automata and multi-head automata, the study of three-marker automata becomes very interesting in light of results of Hartmanis [4] and Savitch [10].

19.
20.
Summary In this paper we propose an axiomatic system for proving the total correctness of CSP programs. The system is based on the partial correctness system of [6, 7]. We use the proposed system to prove the total correctness of a program for set partitioning. This work was supported in part by NSF grant no. ECS8404725.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号