Similar Documents: 20 similar documents found (search time: 15 ms)
1.
Due to the growing complexity and size of software systems, developing correct and maintainable software has become increasingly difficult. This is especially true for distributed systems with real-time requirements, and great efforts have therefore been made to overcome this problem. However, most approaches either do not consider every aspect of interest or are restricted to a single development phase. This paper describes OASIS, an open environment that allows the integration of different analysis techniques in different system development phases, and presents the existing OASIS toolset, which is already incorporated in this environment.

2.
3.
Jacob Palme, Software, 1974, 4(4): 379-388
A typical list-structure application program was written in both SIMULA and PL/1 and executed using the IBM 360/370 SIMULA and PL/1 optimizing compilers. SIMULA was found to give a shorter compile time, about the same execution time, and a larger memory requirement than PL/1. The source program for the same algorithm was shorter in SIMULA, and the SIMULA system could discover semantic programming errors earlier and diagnose them better than PL/1.

4.
This paper addresses the problem of timestamped event sequence matching, a new type of similar sequence matching that retrieves the occurrences of interesting patterns from timestamped sequence databases. The sequential-scan-based method, the trie-based method, and the method based on the iso-depth index are well-known approaches to this problem. In this paper, we point out their shortcomings and propose a new method that effectively overcomes them. The proposed method employs an R-tree, a widely accepted multi-dimensional index structure that efficiently supports timestamped event sequence matching. To build the R-tree, the method extracts time windows from every item in a timestamped event sequence and represents them as rectangles in n-dimensional space by considering the first and last occurrence times of each event type. Here, n is the total number of distinct event types that may occur in a target application. To mitigate the curse of dimensionality when n is large, we suggest an algorithm that reduces the dimensionality by grouping the event types. Our R-tree-based sequence matching method proceeds in two steps: first, it efficiently identifies a small number of candidates by searching the R-tree; second, it picks out the true answers from the set of candidates. We prove the method's robustness formally and also show its effectiveness via extensive experiments.
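As a rough illustration of the filter-and-refine idea above, the sketch below summarizes each time window by per-event-type (first, last) occurrence intervals and uses plain interval intersection as the candidate filter. The function names, and the linear scan standing in for a real R-tree search, are assumptions for illustration, not details from the paper.

```python
def window_rect(window, event_types):
    """Summarize a window [(time, event), ...] as per-type (first, last)
    occurrence intervals; types absent from the window map to None."""
    rect = {e: None for e in event_types}
    for t, e in window:
        if rect[e] is None:
            rect[e] = (t, t)
        else:
            lo, hi = rect[e]
            rect[e] = (min(lo, t), max(hi, t))
    return rect

def overlaps(a, b):
    """Filter step: do two window rectangles intersect in every dimension
    (event type) where both have an interval?"""
    for e in a:
        if a[e] is not None and b[e] is not None:
            if a[e][1] < b[e][0] or b[e][1] < a[e][0]:
                return False
    return True
```

A real implementation would insert the rectangles into an R-tree and replace the linear scan over `overlaps` with an index search, followed by an exact refinement pass over the surviving candidates.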

5.
A new approach to software training is presented, the so-called Double-Fading Support (DFS) approach. According to this approach, which is based on Carroll's training-wheels idea and on cognitive theories of skill acquisition, two types of user support when learning to use a complex software system — locking the software's functionality and detailed guidance — are faded out gradually during the training course, so that by the end of the training the learners are able to use the complex software with minimal instructional support. Two 30-hour training experiments with two different CAD software systems and CAD-inexperienced university students were conducted. The results of Experiment 1, with 88 participants, indicate the effectiveness of the DFS approach for CAD software with a deeply structured menu system: participants working with the initially reduced software outperformed participants in the full-functionality group, and participants in the slowly faded guidance group outperformed participants receiving medium, fast, or no fading of guidance at all. Results of Experiment 2, with 120 participants, however, indicate less effectiveness of the DFS approach for an icon-based CAD software in which most of the relevant functions are permanently visible to the user. It seems that the two factors (fading out the locking of the software's functionality and fading out detailed guidance) counteract each other.

6.
Polynomial models are used to give a unified approach to the problem of classifying the set of all real symmetric solutions of the algebraic Riccati equation.
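For context, the algebraic Riccati equation in question is conventionally written as (the symbols below are the standard ones and are not taken from the abstract):

```latex
A^{\mathsf{T}}X + XA - XBB^{\mathsf{T}}X + Q = 0,
\qquad X = X^{\mathsf{T}} \in \mathbb{R}^{n \times n},
```

and classifying its set of real symmetric solutions $X$ is the problem the polynomial-model approach addresses.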

7.
We estimate parameters in the context of a discrete-time hidden Markov model with two latent states and two observed states through a Bayesian approach. We provide a Gibbs sampling algorithm for longitudinal data that ensures parameter identifiability. We examine two approaches to start the algorithm for estimation. The first approach generates the initial latent data from transition probability estimates under the false assumption of perfect classification. The second approach requires an initial guess of the classification probabilities and obtains bias-adjusted approximated estimators of the latent transition probabilities based on the observed data. These probabilities are then used to generate the initial latent data set based on the observed data set. Both approaches are illustrated on medical data and the performance of estimates is examined through simulation studies. The approach using bias-adjusted estimators is the best choice of the two options, since it generates a plausible initial latent data set. Our situation is particularly applicable to diagnostic testing, where specifying the range of plausible classification rates may be more feasible than specifying initial values for transition probabilities.
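A minimal sketch of the initialization idea, generating an initial latent sequence from the observed sequence given a guessed classification rate; the 0/1 state coding, the single `p_correct` parameter, and the function name are illustrative assumptions, not the paper's algorithm.

```python
import random

def initial_latent(observed, p_correct, seed=0):
    """Sample an initial latent 0/1 sequence from an observed 0/1 sequence,
    assuming each observation matches the latent state with probability
    p_correct (a guessed classification rate)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [o if rng.random() < p_correct else 1 - o for o in observed]
```

The resulting sequence can seed the Gibbs sampler, after which the latent states and transition probabilities are re-drawn from their conditional distributions.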

8.
9.
This article reports on our experiments and results on the effectiveness of different feature sets and information fusion from some combinations of them in classifying free text documents into a given number of categories. We use different feature sets and integrate neural network learning into the method. The feature sets are based on the “latent semantics” of a reference library — a collection of documents adequately representing the desired concepts. We found that a larger reference library is not necessarily better. Information fusion almost always gives better results than the individual constituent feature sets, with certain combinations doing better than the others.

10.
A Stereo Image Matching Method Based on a Pairwise Sequence Alignment Algorithm
Building on an analysis of existing stereo matching methods, this paper proposes a stereo image matching method based on a pairwise sequence alignment algorithm. The grayscale values of the pixels along corresponding epipolar lines in a stereo image pair are treated as a pair of character sequences, and a dynamic-programming-based pairwise sequence alignment algorithm matches these sequences to recover the stereo disparity. To verify the feasibility and applicability of the method, experiments were conducted on stereo image pairs of human faces. The results show that the method produces smooth, dense disparity maps, indicating that dynamic-programming-based pairwise sequence alignment can effectively solve the stereo image matching problem and thus provides a practical and effective approach to stereo matching.
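The dynamic-programming alignment at the heart of the method can be sketched as a Needleman-Wunsch-style cost recursion over two epipolar rows of grayscale values; the gap penalty and the absolute-difference matching cost below are illustrative choices, not the paper's exact scoring.

```python
def align_score(left_row, right_row, gap=10):
    """Minimal cost of aligning two rows of grayscale values, where matching
    pixels costs their intensity difference and skipping a pixel costs `gap`."""
    n, m = len(left_row), len(right_row)
    # dp[i][j] = minimal cost of aligning left_row[:i] with right_row[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            match = dp[i - 1][j - 1] + abs(left_row[i - 1] - right_row[j - 1])
            dp[i][j] = min(match, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[n][m]
```

Backtracking through the `dp` table would recover the pixel correspondences, from which the disparity along the epipolar line follows.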

11.
This paper presents the time-domain approach to the analysis of the convergence of continuous-time adaptive control and estimation algorithms. The time-domain definition of persistently exciting signals is introduced and the convergence of estimation algorithms is established in the cases of open-loop and closed-loop systems. An application of the persistency of excitation theory to the design of a globally stable adaptive pole-placement controller is given.

12.
Since many domains are constantly evolving, the associated domain-specific languages (DSLs) inevitably have to evolve too, to retain their value. But the evolution of a DSL can be very expensive, since existing words of the language (i.e. programs) and tools have to be adapted according to the changes of the DSL itself. In such cases, these costs seriously limit the adoption of DSLs. This paper presents Lever, a tool for the evolutionary development of DSLs. Lever aims at making evolutionary changes to a DSL much cheaper by automating the adaptation of the DSL parser as well as existing words, and by providing additional support for the correct adaptation of existing tools (e.g. program generators). This way, Lever simplifies DSL maintenance and paves the way for bottom-up DSL development.

13.
The paper presents and discusses a nonlinear, three-dimensional, finite-segment, dynamic model of a cable or chain. The model consists of a series of links connected to each other by ball-and-socket joints. The size, shape, and mass of the links are arbitrary and may be distinct for each link. The number of links is also arbitrary, and the model allows an arbitrary force system to be applied to each link.

The model is used to develop a computer code which consists primarily of subroutines containing algorithms to develop the kinematics, force systems, and governing dynamical equations. Although the integration of the equations is performed with a Runge-Kutta algorithm, the code is developed so that any other suitable integration technique or algorithm may be substituted. The input for the code requires the following: the number of links; the mass, centroidal inertia matrix, mass-center position, connection point, and external forces on each link; and the time history of the specified variables. The output consists of the time history of each variable, the position, velocity, and the acceleration of the mass-center of each link, and the unknown forces and moments.

An example problem is presented which describes the motion of a sphere dragged through water by a partially submerged cable suspended from a rotating surface crane. Viscous forces of the water are included. Although the example simulates a typical nautical rig, its inclusion in the paper is intended primarily to illustrate the capability of the model.
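The Runge-Kutta integration mentioned above can be sketched generically; a single classical fourth-order step for a state vector y is shown below, with the cable model's governing equations abstracted into an arbitrary right-hand side f(t, y).

```python
def rk4_step(f, t, y, h):
    """Advance y' = f(t, y) by one classical fourth-order Runge-Kutta step.
    y is a list of floats; f returns a list of derivatives of the same length."""
    def add(u, v, s):
        # u + s * v, componentwise
        return [ui + s * vi for ui, vi in zip(u, v)]
    k1 = f(t, y)
    k2 = f(t + h / 2, add(y, k1, h / 2))
    k3 = f(t + h / 2, add(y, k2, h / 2))
    k4 = f(t + h, add(y, k3, h))
    return [yi + (h / 6) * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
```

In the code described in the paper, f would assemble the kinematics and force systems of all links at each evaluation; any other suitable one-step integrator could be substituted for this routine, as the abstract notes.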


14.
This paper presents a range of optimization-based approaches to fault diagnosis. A variety of fault diagnosis problems are reformulated in the so-called standard problem set-up introduced in the literature on robust control. Once the standard problem formulations are given, the fault diagnosis problems can be solved by standard optimization techniques. The proposed methods include (1) fault estimation (FE) for systems with model uncertainties, (2) FE for systems with parametric faults, and (3) FE for a class of nonlinear systems. Copyright © 2002 John Wiley & Sons, Ltd.

15.
Popular techniques for modeling facial expression usually rely on the shape blending of a series of pre-defined facial models, the use of feature parameters, or the use of an anatomy-based facial model. This requires extensive user interaction to construct the pre-defined facial models, the deformation functions, or the anatomy-based facial model. Moreover, existing anatomy-based facial modeling techniques are targeted at human facial models and may not be usable directly for non-human character models. This paper presents an intuitive technique for the design of facial expressions using a physics-based deformation approach. The technique does not require specifying the deformation function associated with facial feature parameters, and does not require a detailed anatomical model of the head. By adjusting the contraction or relaxation of a set of facial muscles, different facial expressions can be obtained. Facial muscles and skin are assumed to be linearly elastic. The boundary element method (BEM) is adopted for evaluating deformation of the facial skin. This avoids the use of volumetric elements, as in the finite element method (FEM), and the setting up of complex mass–spring models. Given a polygon mesh of a facial model, a closed volume of the facial mesh is obtained by offsetting the polygon mesh according to a user-defined depth map. Each facial muscle is approximated with a series of muscle polygons on the mesh surface. Deformation of the facial mesh is attained by stretching or compressing the muscle polygons. By pre-computing the inverse of the stiffness matrix, interactive editing of facial expressions can be achieved.
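The interactivity trick mentioned at the end, pre-computing the inverse of the stiffness matrix so that each muscle adjustment costs only a matrix-vector product, can be sketched in two dimensions. The 2x2 size and the function names are purely illustrative; a BEM stiffness matrix is far larger and dense.

```python
def invert2(K):
    """Invert a 2x2 stiffness matrix once, up front, so that repeated
    muscle-force edits need only a matrix-vector product, not a fresh solve."""
    (a, b), (c, d) = K
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def displace(Kinv, f):
    """Skin displacement u = K^-1 f for a force vector f."""
    return [Kinv[0][0] * f[0] + Kinv[0][1] * f[1],
            Kinv[1][0] * f[0] + Kinv[1][1] * f[1]]
```

Each adjustment to a muscle's contraction only changes f, so the displacement follows from a cheap multiply, which is what makes interactive editing feasible.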

16.
T. R. Hopkins, Software, 1980, 10(3): 175-181
The increase in the number of available dialects of BASIC has led to the usual difficulties encountered when transporting software. The proposed American National Standard Minimal BASIC represents a small but almost universal subset of all BASICs. PBASIC is a verifier for ANS Minimal BASIC. The verifier is itself written in PFORT, a portable subset of FORTRAN IV.

17.
An important research topic in image processing and image interpretation methodology is the development of methods to incorporate contextual information into the interpretation of objects. Over the last decade, relaxation labelling has been a useful and much-studied approach to this problem. It is an attractive technique because it is highly parallel, involving the propagation of local information via iterative processing. The paper surveys the literature pertaining to relaxation labelling and highlights the important theoretical advances and the interesting applications for which it has proven useful.
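A minimal sketch of a relaxation labelling iteration in the spirit surveyed here, using a nonnegative compatibility matrix and a normalized multiplicative update; this update rule is one simple variant for illustration, not a specific scheme from the survey.

```python
def relax(probs, compat, neighbors, rounds=10):
    """Iteratively refine label probabilities using neighbor context.
    probs[i][l] is the belief that object i has label l; compat[l][m] >= 0
    rewards label l at an object when a neighbor carries label m."""
    n_labels = len(probs[0])
    for _ in range(rounds):
        new = []
        for i, p in enumerate(probs):
            support = []
            for l in range(n_labels):
                # contextual support for label l from all neighbors of i
                s = sum(compat[l][m] * probs[j][m]
                        for j in neighbors[i] for m in range(n_labels))
                support.append(p[l] * (1 + s))
            z = sum(support)
            new.append([v / z for v in support])  # renormalize to a distribution
        probs = new
    return probs
```

With an identity-like compatibility matrix, mutually supporting neighbors reinforce each object's initially favored label until the probabilities saturate, which is the local, parallel propagation the abstract describes.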

18.
This paper describes a software system (SOFTLIB) that has been developed to assist in the management of software documentation generated during systems development projects. It provides facilities to manage large numbers of documents, to file documents when they are complete, and to issue them to system developers and maintainers. It also includes an information retrieval facility that allows programming staff to find documents, to examine their contents before issue, and to assess the state of the software project documentation. SOFTLIB is explicitly intended to help manage the documentation generated during software development — it is not designed for use by end-users of that software or for managing end-user documentation. The novel characteristic of this system is the approach taken to the consistency and completeness of documentation. The documentation associated with a software system is organized in such a way that it can be detected whether document sets are complete (that is, whether all documentation which should be provided for a software component is available) and whether document sets are likely to be inconsistent. This means that if a document has been changed without a comparable change being made to other associated documents, this is detectable by the librarian system. In addition, a subsidiary aim of our work was to investigate the utility of menu systems for complex software tools by building a menu-based user interface to SOFTLIB. We conclude that menu systems are far from ideal in such situations because of the range of possible options which must be handled by the system.

19.
Michael J. Spier, Software, 1976, 6(3): 293-299
A sequence of events is described, leading to the severe deterioration of an initially well-conceived and cleanly implemented compiler. It is shown how an initial “optimization” implanted a latent bug in the compiler, how the bug was subsequently activated through innocent compiler modification, and how the compiler then deteriorated because of the incompetent correction of the bug manifestation. This exceedingly negative case history is presented in the hope of conveying to the reader a better feeling for the complex problems inherent in industrial software production and maintenance. The difficulty of proposing any constructive (and complete!) software engineering methodology is known and acknowledged; the study of an episode such as the one described in this paper might help put the difficulties with which we are confronted into better perspective.

20.
Integration of MCDM with DSS benefits both fields. MCDM tools are useful in identifying and evaluating incompatible alternatives for DSS, while DSS can implement MCDM approaches and help maintain and retrieve MCDM models. Over the years, MCDM has made considerable contributions to the development of various DSS subspecialties. This special issue on Multiple Criteria Decision Making and Decision Support Systems consists of nine selected papers from the 20th International Conference on Multiple Criteria Decision Making. The guest editors highlight the key ideas and contributions of the papers in the special issue.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.) · ICP license 京ICP备09084417号