Similar Documents (20 results found)
1.
In this paper, the performance of the linear, exponential and combined models in describing the temperature dependence of the excess Gibbs energy of solutions within the framework of the Redlich–Kister model is discussed. The models are not compared to existing Calphad-optimized databases; rather, they are tested against the 209 binary solid and liquid metallic alloys for which reliable experimental data on the heat of mixing and Gibbs energy of mixing exist in the handbook of Predel. It was found that the linear model often leads to a high-T artifact (artificial inverted miscibility gaps), with the excess Gibbs energy approaching infinity at high temperatures, which seems unreasonable. It was also found that although both the exponential and combined models can in principle lead to a low-T artifact (liquid re-stabilization), in real systems this probably does not take place, at least for the “normal” systems (a system is “normal” if the heat of mixing, excess entropy of mixing and excess Gibbs energy of mixing have the same sign at the temperature of measurement; 86% of all systems are found to be “normal”). The problem with the exponential model is that it is unable to describe the “exceptional” systems (14% of all systems). It is shown that the combined model is able to describe these “exceptional” systems as well. An algorithm is worked out to ensure that the combined model does not run into any high-T or low-T artifact, even when it is used to describe the “exceptional” systems. It is concluded that the T-dependence of the interaction energies of all solution phases described by Redlich–Kister polynomials should be described by the combined model. In this way, an improved databank of excess Gibbs energies of solution phases can be gradually built, free of any artifacts.
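The high-T behaviour discussed above can be made concrete with a small numeric sketch. The Redlich–Kister polynomial below is the standard form; the linear and exponential temperature dependences are common textbook choices, and all coefficient values are illustrative assumptions, not the paper's fitted parameters.

```python
import math

def g_excess(x, T, params, dep="linear"):
    """Excess Gibbs energy (J/mol) of a binary A-B solution from a
    Redlich-Kister polynomial: G_ex = x(1-x) * sum_v L_v(T) * (2x-1)^v.
    Each entry of `params` holds (a_v, b_v); the T-dependence of the
    interaction energy L_v is either linear, a + b*T, or exponential,
    a * exp(-T/b). Coefficients here are made-up example values."""
    total = 0.0
    for v, (a, b) in enumerate(params):
        if dep == "linear":
            L = a + b * T
        else:  # exponential decay toward ideal-solution behaviour
            L = a * math.exp(-T / b)
        total += L * (2 * x - 1) ** v
    return x * (1 - x) * total

# With a linear L(T) = a + b*T, |G_ex| grows without bound as T rises
# (the high-T artifact noted above); the exponential form instead decays.
params = [(-20000.0, 8.0)]
print(g_excess(0.5, 1000.0, params, dep="linear"))  # → -3000.0
```

The sign change of the linear interaction energy around T = 2500 K in this toy parameterization is exactly the kind of behaviour that can produce artificial inverted miscibility gaps.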

2.
3.
Systems development methodologies incorporate security requirements as an afterthought in the non-functional requirements of systems. The lack of appropriate access control on information exchange among business activities can leave organizations vulnerable to information assurance threats. The gap between systems development and systems security leads to software development efforts that lack an understanding of security risks. We address the research question: how can we incorporate security as a functional requirement in the analysis and modeling of business processes? This study extends the Semantic approach to Secure Collaborative Inter-Organizational eBusiness Processes of D'Aubeterre et al. (2008). We develop the secure activity resource coordination (SARC) artifact for a real-world business process and show how SARC can be used to create business process models characterized by the secure exchange of information within and across organizational boundaries. We present an empirical evaluation of the SARC artifact against the Enriched-Use Case (Siponen et al., 2006) and the standard UML-Activity Diagram to demonstrate the utility of the proposed design method.

4.
We are on the verge of realizing a new class of material that need not be machined or molded in order to make things. Rather, the material forms and re-forms itself according to software programmed into its component elements. These self-reconfiguring materials are composed of robotic modules that coordinate with each other locally to produce global behaviors. These robotic materials can be used to realize a new class of artifact: a shape that can change over time, i.e., a four-dimensional shape or a hyperform. Hyperforms present several opportunities: objects such as furniture could exhibit dynamic behaviors and respond to tangible and gestural input, and end-users could customize their form and behavior. To realize these opportunities, the tangible interaction community must begin to consider how we will create and interact with hyperforms. The behaviors that hyperforms can perform will be constrained by the capabilities of the self-reconfiguring materials they are made of. By considering how we will interact with hyperforms, we can inform the design of these systems. In this paper, we discuss the life cycle of a hyperform and the roles designers and end-users play in interacting with hyperforms at these various stages. We consider the interactions such a system could afford, as well as how the underlying hardware and software affect this interaction. We also consider the extent to which several current hardware systems, including our own prismatic cubes (Weller et al., in Intelligent Robots and Systems, IEEE, 2009), can support the hyperform interactions we envision.

5.
Despite diligent efforts made by the software engineering community, software projects keep failing at an alarming rate. After two decades of this problem recurring, one of the leading causes of the high failure rate is still poor process modeling (requirements specification). Both researchers and practitioners therefore recognize the importance of business process modeling in understanding and designing accurate software systems. However, the lack of a direct model-checking (verification) feature is one of the main shortcomings of conventional process modeling methods. It is important that models provide verifiable insight into the underlying business processes in order to design complex software systems such as Enterprise Information Systems (EIS). The software engineering community has been deploying the same methods that have haunted the industry with failure. In this paper, we try to remedy this issue by looking at a non-conventional framework. We introduce a business process modeling method that is amenable to automatic analysis (simulation), yet powerful enough to capture the rich reality of business systems as enacted in the behavior and interactions of users. The proposed method is based on the innovative language-action perspective.

6.
7.
Topic models are generative probabilistic models which have been applied to information retrieval to automatically organize and provide structure to a text corpus. Topic models discover topics in the corpus, which represent real-world concepts by frequently co-occurring words. Recently, researchers found topics to be effective tools for structuring various software artifacts, such as source code, requirements documents, and bug reports. This research also hypothesized that using topics to describe the evolution of software repositories could be useful for maintenance and understanding tasks. However, research has yet to determine whether these automatically discovered topic evolutions describe the evolution of source code in a way that is relevant or meaningful to project stakeholders, and thus it is not clear whether topic models are a suitable tool for this task. In this paper, we take a first step towards evaluating topic models in the analysis of software evolution by performing a detailed manual analysis on the source code histories of two well-known and well-documented systems, JHotDraw and jEdit. We define and compute various metrics on the discovered topic evolutions and manually investigate how and why the metrics evolve over time. We find that the large majority (87%–89%) of topic evolutions correspond well with actual code change activities by developers. We are thus encouraged to use topic models as tools for studying the evolution of a software system.
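As a rough illustration of the kind of topic discovery discussed above, the following is a self-contained collapsed Gibbs sampler for LDA run over a toy "commit history". The corpus, hyperparameters and topic count are invented for this sketch and have no connection to the JHotDraw/jEdit study.

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics=2, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Tiny collapsed Gibbs sampler for Latent Dirichlet Allocation.
    `docs` is a list of token lists; returns per-topic word counts."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})          # vocabulary size
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]
    ndk = [[0] * n_topics for _ in docs]           # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics                            # topic totals
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                        # remove current token
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # p(z=t) ∝ (n_dt + α)(n_tw + β)/(n_t + Vβ)
                weights = [(ndk[d][t] + alpha) * (nkw[t][w] + beta) /
                           (nk[t] + V * beta) for t in range(n_topics)]
                k = rng.choices(range(n_topics), weights)[0]
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return nkw

# Toy "commit history" with two themes: drawing vs. text editing.
docs = [["draw", "figure", "canvas"], ["draw", "canvas", "shape"],
        ["edit", "buffer", "text"], ["text", "buffer", "syntax"]]
topics = lda_gibbs(docs)
for k, counts in enumerate(topics):
    print(k, sorted(counts, key=counts.get, reverse=True)[:3])
```

In an evolution study, such a sampler would be run over the token streams of successive repository versions, with the per-topic weights tracked over time.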

8.
A general software system aimed at computer-aided design of controllers for robots and robotized technological systems is described in this paper. The software system includes modules for the synthesis of the various levels of a robot controller as well as controllers of complex robotized technological systems. The software includes simulation of robotic systems within manufacturing cells using various types of models: complete dynamic models, kinematic models and simple models in the form of finite automata. Using these models, various algorithms for all control levels in robot controllers may be synthesized, taking into account the actual interaction between the robot and its environment. The software system enables the solution of the important problem of the interaction between higher and lower levels of controllers. Finally, a general-purpose controller as a target system for the proposed software is described. The controller is designed as an open system, allowing the user to apply various control laws and to run in conjunction with an actual robot. The general software system together with the controller represents a powerful educational tool in modern robotics.

9.
Capture-recapture (CR) models have been proposed as an objective method for controlling software inspections. CR models were originally developed to estimate the size of animal populations. In software, they have been used to estimate the number of defects in an inspected artifact. This estimate can be another source of information for deciding whether the artifact requires a reinspection to ensure that a minimal inspection effectiveness level has been attained. Little evaluative research has been performed thus far on the utility of CR models for inspections with two inspectors. We report on an extensive Monte Carlo simulation that evaluated capture-recapture models suitable for two inspectors, assuming a code inspection context. We evaluate the relative error of the CR estimates as well as the accuracy of the reinspection decision made using the CR model. Our results indicate that the most appropriate capture-recapture model for two inspectors is an estimator that allows for inspectors with different capabilities. This model always produces an estimate (i.e., does not fail), has a predictable behavior (i.e., works well when its assumptions are met), has a relatively high decision accuracy, and performs better than the default decision of no reinspection. Furthermore, we identify the conditions under which this estimator performs best.
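The two-inspector setting can be sketched with the classic Chapman (bias-corrected Lincoln–Petersen) estimator, whose underlying model lets the two capture occasions (here, inspectors) have different detection probabilities. This is a generic illustration, not necessarily the estimator the paper recommends; the defect sets are invented.

```python
def chapman_estimate(defects_a, defects_b):
    """Chapman's bias-corrected capture-recapture estimate of the total
    number of defects, given the sets of defect IDs found by two
    inspectors. The model allows the inspectors to differ in capability
    but assumes every defect is equally catchable."""
    n1, n2 = len(defects_a), len(defects_b)
    m = len(defects_a & defects_b)          # defects found by both
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Inspector A found defects 1..8, inspector B found 5..12; 4 overlap.
a = set(range(1, 9))
b = set(range(5, 13))
est = chapman_estimate(a, b)
remaining = est - len(a | b)                # estimated undetected defects
print(round(est, 1), round(remaining, 1))   # → 15.2 3.2
```

A reinspection rule would then compare `len(a | b) / est` against a minimum effectiveness threshold (say 80%) and trigger a reinspection when the ratio falls below it.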

10.
An important area of reverse engineering is to produce digital models of mechanical parts from measured data points. In this process, inaccuracies may occur due to noise and the numerical nature of the algorithms, such as aligning point clouds, mesh processing, segmentation and surface fitting. As a consequence, faces will not be precisely parallel or orthogonal, smooth connections may be of poor quality, axes of concentric cylinders may be slightly tilted, and so on. In this paper we present algorithms to eliminate these inaccuracies and create “perfected” B-rep models suitable for downstream CAD/CAM applications. Using a segmented and classified set of smooth surface regions, we enforce various constraints for automatically selected groups of surfaces. We extend a formerly published technique of Benkő et al. (2002). It is an essential element of our approach, however, that we do not know in advance the set of surfaces that will actually be involved in the final constrained fitting. We propose local methods to select and synchronize “likely” geometric constraints detected between pairs of entities. We also propose global methods to determine constraints related to the whole object, although the best-fit coordinate systems, reference grids and symmetry planes will be determined only by surface entities qualified as relevant. Numerous examples illustrate how these constrained fitting algorithms improve the quality of reconstructed objects.
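A minimal sketch of the "detect a likely constraint, then enforce it" idea: two plane normals fitted from noisy scan data are snapped to a common direction when they are nearly parallel. The tolerance and the simple averaging scheme are stand-ins for illustration, not the paper's algorithms.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def angle_deg(u, v):
    """Angle between two unit directions, ignoring orientation."""
    dot = sum(a * b for a, b in zip(u, v))
    return math.degrees(math.acos(max(-1.0, min(1.0, abs(dot)))))

def enforce_parallelism(normals, tol_deg=2.0):
    """If all fitted plane normals lie within `tol_deg` of a common
    direction, replace them with their average direction: a tiny
    instance of selecting a 'likely' constraint between entities and
    then enforcing it exactly."""
    normals = [normalize(n) for n in normals]
    ref = normals[0]
    # flip normals into the same hemisphere before averaging
    oriented = [n if sum(a * b for a, b in zip(n, ref)) >= 0
                else tuple(-c for c in n) for n in normals]
    if all(angle_deg(ref, n) <= tol_deg for n in oriented):
        avg = normalize(tuple(sum(cs) / len(cs) for cs in zip(*oriented)))
        return [avg] * len(normals)
    return normals            # constraint not detected; keep fitted values

# Two faces fitted from noisy data, almost but not exactly parallel:
fitted = [(0.0, 0.01, 1.0), (0.005, -0.01, 1.0)]
perfected = enforce_parallelism(fitted)
print(perfected[0] == perfected[1])   # True: now exactly parallel
```

A real constrained-fitting pass would re-solve the surface parameters under the constraint rather than just averaging, but the detect-then-enforce structure is the same.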

11.
Information systems are widely used in all business areas. These systems typically integrate a set of functionalities that implement business rules and maintain databases. Users interact with these systems through user interfaces (UIs), usually composed of menus from which the user selects the desired functionality, thus accessing a new UI that corresponds to the desired feature. Hence, a system normally contains multiple UIs. However, keeping the UIs of a system consistent from a visual (organization, component style, etc.) and behavioral perspective is usually difficult. This problem also appears in software product lines, where it would be desirable to have patterns to guide the construction and maintenance of UIs. One possible way of defining such patterns is to use model-driven engineering (MDE). In MDE, models are defined at different levels, where the bottom level is called a metamodel. The metamodel determines the main characteristics of the models of the upper levels, serving as a guideline. Each new level must adhere to the rules defined by the lower levels; this way, if anything changes at a lower level, the changes are propagated to the levels above it. The goal of this work is to define and validate a metamodel that allows the modeling of UIs of software systems, thus allowing the definition of interface patterns and supporting system evolution. To build this metamodel, we use a graph structure, because a UI can be naturally represented as a graph: each UI component is a vertex and edges represent dependencies between these components. Moreover, graph theory provides support for a great number of operations and transformations that can be useful for UIs. The metamodel was defined based on the investigation of patterns that occur in UIs, using a sample of information systems containing different types of UIs.
To validate the metamodel, we built the complete UI models of one new system and of four existing real systems. This shows not only the expressive power of the metamodel, but also its versatility, since our validation was conducted using different types of systems (a desktop system, a web system, a mobile system, and a multiplatform system). Moreover, it demonstrated that the proposed approach can be used not only to build new models, but also to describe existing ones (by reverse engineering).
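The components-as-vertices, dependencies-as-edges representation described above can be sketched as follows. The component names, kinds and edge labels are illustrative, not the paper's metamodel.

```python
class UIModel:
    """Minimal sketch of a UI modeled as a labelled graph: components
    are vertices, dependencies (containment, navigation) are edges."""
    def __init__(self):
        self.vertices = {}           # component name -> component kind
        self.edges = []              # (source, target, label)

    def add_component(self, name, kind):
        self.vertices[name] = kind

    def add_dependency(self, src, dst, label):
        assert src in self.vertices and dst in self.vertices
        self.edges.append((src, dst, label))

    def reachable_from(self, start):
        """Components reachable via navigation edges: one example of a
        graph operation that comes 'for free' in this representation."""
        seen, stack = set(), [start]
        while stack:
            v = stack.pop()
            if v in seen:
                continue
            seen.add(v)
            stack.extend(t for s, t, lbl in self.edges
                         if s == v and lbl == "navigates")
        return seen

ui = UIModel()
ui.add_component("MainMenu", "menu")
ui.add_component("Orders", "screen")
ui.add_component("OrderDetail", "screen")
ui.add_dependency("MainMenu", "Orders", "navigates")
ui.add_dependency("Orders", "OrderDetail", "navigates")
print(sorted(ui.reachable_from("MainMenu")))  # → ['MainMenu', 'OrderDetail', 'Orders']
```

A metamodel would additionally constrain which component kinds may be connected by which edge labels; a model conforms when every vertex and edge satisfies those rules.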

12.
In this article we describe the importance of reusing software artifacts resulting from the earliest stages of the development life-cycle, i.e., software conception, requirements analysis, feasibility study, requirements specification, and architectural and detailed design. Although reuse of early artifacts is deemed beneficial to software development projects, there are no readily available software tools that could facilitate their effective reuse. To this end, we identified nearly one hundred early artifact types; we analyzed, compared and contrasted them, and clustered similar artifact types into distinct artifact affinity groups. We then proposed several methods and techniques useful in the processing of these artifacts to support their reuse. We believe that the proposed methods could be utilized by tool builders to construct software development environments capable of assisting analysts, designers, architects and programmers to effectively reuse the results of early life-cycle activities.

13.
In the scientific community, feature models are the de facto standard for representing variability in software product line engineering. This is different from industrial settings, where they appear to be used much less frequently. We and other authors found that in a number of cases they lack concision, naturalness and expressiveness, which is confirmed by industrial experience. When modelling variability, feature attributes are an efficient tool for making models intuitive and concise. Yet the semantics of feature models with attributes is not well understood, and most existing notations do not support them at all. Furthermore, the graphical nature of feature models' syntax also appears to be a barrier to industrial adoption, both psychological and rational. Existing tool support for graphical feature models is lacking or inadequate, and inferior in many regards to tool support for text-based formats. To overcome these shortcomings, we designed TVL, a text-based feature modelling language. In terms of expressiveness, TVL subsumes most existing dialects. The main goal of designing TVL was to provide engineers with a human-readable language with a rich syntax to make modelling easy and models natural, but also with a formal semantics to avoid ambiguity and allow powerful automation.

14.
Model-driven software modernization is a discipline in which model-driven development (MDD) techniques are used in the modernization of legacy systems. When existing software artifacts are evolved, they must be transformed into models so that MDD techniques such as model transformations can be applied. Since most modernization scenarios (e.g., application migration) involve dealing with code in general-purpose programming languages (GPLs), the extraction of models from GPL code is an essential task in a model-based modernization process. This activity could be performed by tools that bridge the grammarware and MDD technical spaces, which is normally carried out by dedicated parsers. Grammar-to-Model Transformation Language (Gra2MoL) is a domain-specific language (DSL) tailored to the extraction of models from GPL code. This DSL is actually a text-to-model transformation language which can be applied to any code conforming to a grammar. Gra2MoL aims to reduce the effort needed to implement grammarware-MDD bridges, since building dedicated parsers is a complex and time-consuming task. Like the ATL and RubyTL languages, Gra2MoL incorporates the binding concept needed to write mappings between grammar elements and metamodel elements in a simple declarative style. The language also provides a powerful query language which eases the retrieval of information scattered across syntax trees. Moreover, it incorporates extensibility and grammar reuse mechanisms. This paper describes Gra2MoL in detail and includes a case study on the application of the language to the extraction of models from Delphi code.

15.
Software architecture is a key artifact of the software development process, and its quality should be analyzed and evaluated as early as possible. Current research on software architecture evaluation focuses mainly on scenario-based methods, which are qualitative, subjective, and require no dedicated architecture description language. This paper proposes using the Unified Modeling Language (UML) as the architecture description language and quantitatively evaluating the measured software architecture. Given UML's visual, multi-view, semi-formal nature and its consistent use throughout software development activities, we propose a set of UML metrics that measure software architecture from three aspects: the information content expressed by UML diagrams, their visual impact, and the interconnections among graphical modeling elements. We analyze and discuss the application of these UML metrics in evaluating quality attributes of software architecture such as size, complexity, and structuredness.
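A hedged sketch of what diagram-level counting metrics might look like over a UML class diagram. The concrete metric definitions below (element counts for information content, association density for interconnection) are assumptions for illustration, not the paper's metric suite.

```python
def class_diagram_metrics(classes, associations):
    """Illustrative size/complexity counts over a UML class diagram,
    in the spirit of measuring information content and connectivity.
    `classes` maps class name -> {"attributes": [...], "operations": [...]};
    `associations` is a list of (class_a, class_b) pairs."""
    n_attrs = sum(len(c["attributes"]) for c in classes.values())
    n_ops = sum(len(c["operations"]) for c in classes.values())
    info_content = len(classes) + n_attrs + n_ops   # modelled elements
    # interconnection: average association ends per class
    connectivity = 2 * len(associations) / len(classes)
    return {"classes": len(classes), "information": info_content,
            "connectivity": round(connectivity, 2)}

classes = {
    "Order":    {"attributes": ["id", "date"], "operations": ["total"]},
    "Item":     {"attributes": ["sku", "qty"], "operations": []},
    "Customer": {"attributes": ["name"],       "operations": ["orders"]},
}
associations = [("Order", "Item"), ("Customer", "Order")]
print(class_diagram_metrics(classes, associations))
```

Such counts would then be compared across candidate architectures or tracked over revisions to assess size and structural complexity.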

16.
We investigate the changing relationship between the small research community of theoretical computer scientists and the much larger community of computer users, in particular, the technology transfer problem of how to exploit theoretical insights that can lead to better products. Our recommendation can be summarized in four points:
1. The computing community is impressed by usable tools and by little else. Although a powerful theorem or an elegant algorithm may be a useful tool for a fellow theoretician, by and large, the only tools directly usable by the general computing community are systems. No systems, no impact!
2. System development means programming-in-the-large, but the algorithms research community so far has learned only how to program in-the-small.
3. Algorithm researchers must enshrine their algorithms not merely in individual elegant programs, but collectively in useful application packages aimed at some identifiable user group.
4. Since the development of software systems easily turns into a full-time activity that requires different skills from those of algorithms research, we must strive to develop techniques that let a small group of algorithm researchers develop simply structured, open-ended systems whose kernel can be implemented with an effort on the order of one man-year. Low-complexity systems is the goal!

17.
A Modeling Tool for Software Architecture Design Decisions
Architecture design plays a key role throughout the software life cycle, yet the evaporation of design knowledge leads to costly system evolution, communication barriers among stakeholders, and limited reuse of architectural artifacts. Design decisions therefore need to be modeled explicitly at the software architecture level. Based on a decision-centric architecture design method, we implemented a modeling tool for software architecture design decisions. The tool helps architects model the core concepts of architecture design, such as issues, solutions, decisions and rationale, and supports the design process from requirements to architecture, including automated synthesis of candidate architecture solutions and partial capture of design rationale. The tool also provides mutual traceability between architecture design models and design decisions, and helps reuse design-decision knowledge acquired during the architecture design process.

18.
Variant-rich software systems offer a large degree of customization, allowing users to configure the target system according to their preferences and needs. Facing high degrees of variability, these systems often employ variability models to explicitly capture user-configurable features (e.g., system options) and the constraints they impose. The explicit representation of features allows them to be referenced in different variation points across different artifacts, enabling the latter to vary according to specific feature selections. In such settings, the evolution of variability models interplays with the evolution of related artifacts, requiring the two to evolve together, or coevolve. Interestingly, little is known about how such coevolution occurs in real-world systems, as existing research has focused mostly on variability evolution as it happens in variability models only. Furthermore, existing techniques supporting variability evolution are usually validated with randomly generated variability models or evolution scenarios that do not stem from practice. As the community lacks a deep understanding of how variability evolution occurs in real-world systems and how it relates to the evolution of different kinds of software artifacts, it is not surprising that industry reports existing tools and solutions as ineffective, since they do not handle the complexity found in practice. Attempting to mitigate this overall lack of knowledge and to support tool builders with insights on how variability models coevolve with other artifact types, we study a large and complex real-world variant-rich software system: the Linux kernel. Specifically, we extract variability-coevolution patterns capturing changes in the variability model of the Linux kernel together with subsequent changes in Makefiles and C source code.
From the analysis of these patterns, we report findings concerning evolution principles found in the kernel, and we reveal deficiencies in existing tools and theory when handling the changes captured by our patterns.

19.
Power plant process simulation software is well suited for the modelling of energy systems and, more importantly, tools for analysing energy efficiency are often built into the software. This work presents the development of a simulation model for a sulphuric acid plant using a commercial software package for power plant process simulation. This will be of value to, for instance, small consulting and engineering companies involved with audits and analysis of energy systems. For small companies, the cost of acquiring and maintaining many different specialised software packages is noticeable. However, companies involved with audits and analysis of energy systems will in most cases have access to at least one software package for power plant process calculations. The use of this kind of software for modelling chemical plants as well would be valuable to these companies. The results of this work show that it is possible to use an inexpensive but powerful power plant process simulation software package to model a common chemical process as part of a large energy system.

20.
The concept of the ‘information technology (IT) artifact’ plays a central role in the information systems (IS) research community's discourse on design science. We pose the alternative concept of the ‘IS artifact’, unpacking what has been called the IT artifact into a separate ‘information artifact’, ‘technology artifact’ and ‘social artifact’. Technology artifacts (such as hardware and software), information artifacts (such as a message) and social artifacts (such as a charitable act) are different kinds of artifacts that together interact in order to form the IS artifact. We illustrate the knowledge value of the IS artifact concept with material from three cases. The result is to restore the idea that the study of design in IS needs to attend to the design of the entire IS artifact, not just the IT artifact. This result encourages an expansion in the use of design science research methodology to study broader kinds of artifacts.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号