Similar Literature
Found 20 similar results (search time: 15 ms)
1.
International Journal on Software Tools for Technology Transfer - Goal models play a significant role in the early stages of the requirements engineering process. These models are subject to...

2.
Job evaluation refers to the systematic determination of the relative values of jobs in an organization. Often, jobs are evaluated on the basis of subjective judgement. By considering each job to consist of certain levels of different job factors, this paper develops a goal programming model to evaluate the various levels of job factors. The main constraints in such a formulation are obtained from a set of existing benchmark jobs. Model development and application to evaluating new jobs are illustrated by solving an example problem consisting of four factors and six levels per factor.
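The deviation-variable formulation that such a goal program uses can be sketched as follows. Everything below is invented for illustration (two factors, two levels, three hypothetical benchmark jobs), not the paper's actual data; the resulting arrays would be handed to any LP solver (e.g. scipy.optimize.linprog).

```python
# Goal program: min sum(d_minus + d_plus)
#   s.t. sum over factors of v[factor, level_of(job)] + d_minus_j - d_plus_j = target_j
# with all variables non-negative. We only build the LP arrays here.

N_FACTORS, N_LEVELS = 2, 2              # unknown point value v[f * N_LEVELS + l]
n_vars = N_FACTORS * N_LEVELS

# Benchmark jobs: (level of factor 0, level of factor 1, agreed relative value)
benchmarks = [(0, 0, 15.0), (1, 1, 35.0), (0, 1, 25.0)]
m = len(benchmarks)

def build_goal_program(benchmarks):
    """Return (c, A_eq, b_eq) over variables [v..., d_minus..., d_plus...]."""
    total = n_vars + 2 * m
    c = [0.0] * n_vars + [1.0] * (2 * m)          # minimise total deviation
    A_eq, b_eq = [], []
    for j, (l0, l1, target) in enumerate(benchmarks):
        row = [0.0] * total
        row[0 * N_LEVELS + l0] = 1.0              # factor 0 contribution
        row[1 * N_LEVELS + l1] = 1.0              # factor 1 contribution
        row[n_vars + j] = 1.0                     # underachievement d_minus_j
        row[n_vars + m + j] = -1.0                # overachievement d_plus_j
        A_eq.append(row)
        b_eq.append(target)
    return c, A_eq, b_eq

c, A_eq, b_eq = build_goal_program(benchmarks)
```

A new job is then scored by summing the solved factor-level values it contains; zero total deviation indicates the benchmarks were mutually consistent.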

3.
The analysis of workflow is crucial to the correctness of workflow applications. This paper introduces a simple and practical method for analyzing workflow logic models. Firstly, some definitions of the models and some properties, such as throughness, no-redundant-transition and boundedness, are presented. Then, we propose an approach based on synchronized reachability graphs (SRGs) to verify these properties. The SRG uses the characteristics of synchronizers in workflow logic models and mitigates the state explosion by constructing synchronized occurrence sequences rather than interleaving occurrence sequences. This paper also proposes some refined and feasible reduction rules which can preserve vital properties of workflow logic models. Using these two techniques, the SRG-based verification method can achieve higher efficiency. Furthermore, this research also develops a verification tool based on the method, presents the analysis results of some practical cases and compares our method with others. Copyright © 2007 John Wiley & Sons, Ltd.
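The SRG construction itself is specific to the paper; as a baseline for comparison, this is a plain reachability-graph builder for a tiny hypothetical workflow net (an AND-split into two parallel tasks followed by an AND-join; all names are invented).

```python
from collections import deque

# Safe workflow net: a marking is the frozenset of marked places.
# Each transition is (preset of places, postset of places).
transitions = {
    "split": ({"start"}, {"p1", "p2"}),
    "task_a": ({"p1"}, {"p3"}),
    "task_b": ({"p2"}, {"p4"}),
    "join": ({"p3", "p4"}, {"end"}),
}

def reachable_markings(initial):
    """Breadth-first exploration of all reachable markings."""
    seen = {frozenset(initial)}
    queue = deque(seen)
    while queue:
        marking = queue.popleft()
        for pre, post in transitions.values():
            if pre <= marking:                      # transition is enabled
                new = frozenset((marking - pre) | post)
                if new not in seen:
                    seen.add(new)
                    queue.append(new)
    return seen

markings = reachable_markings({"start"})
```

Even this two-task example yields six markings because the parallel tasks interleave; the paper's synchronized occurrence sequences are aimed at avoiding exactly this kind of growth.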

4.
This paper describes an actor-based approach to real-time programming, which focuses on the separation of functional from timing behaviour. The approach favours modularity and time predictability. Clusters of actors, allocated on distinct processors, are orchestrated by a control machine which provides an event-driven and time-driven customisable scheduling framework. The approach can be hosted by Java, which fosters a clean and type-safe programming style. Temporal analysis can be formally assisted by Coloured Petri Nets.
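A minimal sketch of the "control machine" idea, transliterated to Python rather than the paper's Java: timed messages are dispatched in timestamp order, so the actors' handlers stay purely functional while the scheduler alone owns timing. Actor names and messages are invented.

```python
import heapq

log = []

def make_actor(name):
    def handle(msg):                    # functional behaviour only
        log.append((name, msg))
    return handle

actors = {"sensor": make_actor("sensor"), "filter": make_actor("filter")}

class ControlMachine:
    """Time-driven dispatcher: delivers posted messages by due time."""
    def __init__(self):
        self.queue = []                 # heap of (due_time, seq, actor, msg)
        self.seq = 0                    # tie-breaker keeps FIFO order

    def post(self, due_time, actor, msg):
        heapq.heappush(self.queue, (due_time, self.seq, actor, msg))
        self.seq += 1

    def run(self):
        while self.queue:
            _, _, actor, msg = heapq.heappop(self.queue)
            actors[actor](msg)

cm = ControlMachine()
cm.post(20, "filter", "smooth")
cm.post(10, "sensor", "sample")
cm.run()                                # "sample" is dispatched first
```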

5.
Several models are considered to illustrate the feasibility and appropriateness of applying a game-theoretic framework to describe the process and outcome of informational confrontation in social networks.

6.
Infrastructures are interconnected and interdependent on multiple levels; the failure of one infrastructure can disrupt others, causing severe economic damage, loss of services, or loss of life. This paper introduces a methodological approach to analyzing the vulnerability of interdependent infrastructures, covering two types of vulnerability: structural and functional. Only infrastructure topologies are used in the analysis of structural vulnerability, while the operating regimes of the different infrastructures are additionally considered in the analysis of functional vulnerability. For both types, interdependent effects are the main focus, and the effects of interdependence strength between infrastructures are also analyzed. The analysis of structural vulnerability is helpful for designing or improving infrastructures in the long run, while the discussion of functional vulnerability is useful for protecting them in the short term. The methodology introduced here supports a comprehensive vulnerability analysis of interdependent infrastructures and their more efficient protection.
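The interdependence effect can be sketched in a few lines: a node failing in one network takes down the nodes in the other network that depend on it. The networks, nodes, and dependency map below are hypothetical, and only one propagation round (A to B) is modelled; the paper's structural/functional analysis is far richer.

```python
power = {"a1", "a2", "a3"}                 # infrastructure A
telecom = {"b1", "b2", "b3", "b4"}         # infrastructure B
depends_on = {"b1": "a1", "b2": "a1", "b3": "a2", "b4": "a3"}

def cascade(failed_power_node):
    """Surviving nodes after one A -> B dependency cascade round."""
    surviving_a = power - {failed_power_node}
    surviving_b = {b for b in telecom if depends_on[b] in surviving_a}
    return surviving_a | surviving_b

def vulnerability(failed_power_node):
    """Fraction of all nodes lost when one power node fails."""
    total = len(power) + len(telecom)
    return 1 - len(cascade(failed_power_node)) / total
```

Here failing "a1" removes a1 plus its two dependents b1 and b2, so its vulnerability score exceeds that of "a3", which has a single dependent; ranking nodes this way is one crude proxy for interdependence strength.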

7.
8.
In this study, a hybrid (qualitative and quantitative objectives) fuzzy multi-objective nonlinear programming (H-FMONLP) model with different goal priorities is developed for the aggregate production planning (APP) problem in a fuzzy environment. Through an interactive decision-making process, the proposed model tries to minimize total production costs, carrying and back-ordering costs, and the costs of changes in workforce level (quantitative objectives), and to maximize total customer satisfaction (qualitative objective), with regard to inventory level, demand, labor level, machine capacity, and warehouse space. A real-world industrial case study demonstrates the applicability of the proposed model to practical APP decision problems. GENOCOP III (Genetic Algorithm for Numerical Optimization of Constrained Problems) has been used to solve the final crisp nonlinear programming problem.
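A toy illustration of the fuzzy aggregation idea behind such models: each candidate aggregate plan gets a membership degree per goal, and the plan maximising the minimum membership is chosen. The plans, cost bounds, and satisfaction scores are invented, and the paper's priorities and GENOCOP III solution process are not reproduced; this is only the max-min step.

```python
def linear_down(value, best, worst):
    """Membership 1 at/below best, falling linearly to 0 at/above worst."""
    if value <= best:
        return 1.0
    if value >= worst:
        return 0.0
    return (worst - value) / (worst - best)

plans = {  # plan -> (total cost, customer satisfaction in [0, 1])
    "steady_workforce": (120.0, 0.70),
    "chase_demand": (100.0, 0.55),
    "mixed": (110.0, 0.65),
}

def overall(cost, satisfaction):
    # Max-min aggregation: the plan is only as good as its worst goal.
    return min(linear_down(cost, 90.0, 130.0), satisfaction)

best_plan = max(plans, key=lambda p: overall(*plans[p]))
```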

9.
A planner-based approach to generate and analyze minimal attack graph
In the present scenario, even well-administered networks are susceptible to sophisticated cyber attacks. Such attacks combine vulnerabilities existing on different systems/services and are potentially more harmful than single-point attacks. One method for analyzing such security vulnerabilities in an enterprise network is the attack graph: a complete graph giving a succinct representation of different attack scenarios, depicted as attack paths. An attack path is a logical succession of exploits, where each exploit in the series satisfies the preconditions of subsequent exploits, establishing a causal relationship among them. Analysis of the attack graph may thus help in assessing network security from the hackers' perspective. One of the intrinsic problems with the generation and analysis of such a complete attack graph is scalability. In this work, an approach based on a Planner, a special-purpose search algorithm from the artificial intelligence domain, is proposed for time-efficient, scalable representation of attack graphs. Further, customized algorithms have been developed for automatic generation of attack paths (using the Planner as a low-level module). The analysis shows that attack graph generation with the customized algorithms runs in polynomial time. A case study is also presented to demonstrate the efficacy of the proposed methodology.
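The precondition/postcondition chaining described above can be sketched as a forward search: an exploit fires when its preconditions hold, adds its postconditions, and we look for the shortest exploit sequence reaching the attacker's goal. The three exploits below are invented; the paper's Planner encoding is not reproduced, this is plain breadth-first search.

```python
from collections import deque

# exploit name -> (preconditions, postconditions)
exploits = {
    "ftp_rhost": ({"net_access"}, {"trust_hostA"}),
    "rsh_login": ({"trust_hostA"}, {"user_hostA"}),
    "local_bof": ({"user_hostA"}, {"root_hostA"}),
}

def shortest_attack_path(initial, goal):
    """BFS over sets of attained conditions; returns an exploit sequence or None."""
    start = frozenset(initial)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        conds, path = queue.popleft()
        if goal in conds:
            return path
        for name, (pre, post) in exploits.items():
            if pre <= conds:                       # preconditions satisfied
                new = frozenset(conds | post)
                if new not in seen:                # conditions are monotone
                    seen.add(new)
                    queue.append((new, path + [name]))
    return None

path = shortest_attack_path({"net_access"}, "root_hostA")
```

Because attained conditions only grow, the state space is the set of condition subsets, which is where the scalability problem the abstract mentions comes from.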

10.
Landslide incidence can be affected by a variety of environmental factors. Past studies have focused on the identification of these environmental factors, but most are based on statistical analysis. In this paper, spatial information techniques were applied to a case study of landslide occurrence in China by combining remote sensing and geographical information systems with an innovative data mining approach (rough set theory) and statistical analyses. Core and reducts of data attributes were obtained by data mining based on rough set theory. Rules for the impact factors, which can contribute to landslide occurrence, were generated from the landslide knowledge database. The 11 rules generated were found to comprise both exact and approximate rules. In terms of importance, three main rules were then extracted as the key decision-making rules for landslide predictions. Meanwhile, the relationship between landslide occurrence and environmental factors was statistically analyzed to validate the accuracy of the rules extracted by the rough set-based method. It was shown that the rough set-based approach is of use in analyzing environmental factors affecting landslide occurrence, and thus facilitates the decision-making process for landslide prediction.
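The reduct computation at the heart of the rough-set step can be sketched by brute force: find the smallest attribute subsets that still classify every object consistently. The landslide-factor decision table below is invented for illustration and is far smaller than any real dataset.

```python
from itertools import combinations

ATTRS = ("slope", "lithology", "rainfall")
rows = [  # (condition attribute values, landslide occurred?)
    (("steep", "soft", "high"), 1),
    (("steep", "hard", "high"), 1),
    (("gentle", "soft", "high"), 0),
    (("gentle", "hard", "low"), 0),
    (("steep", "hard", "low"), 0),
]

def consistent(attr_idx):
    """True if no two objects agree on these attributes but differ in decision."""
    seen = {}
    for values, decision in rows:
        key = tuple(values[i] for i in attr_idx)
        if seen.setdefault(key, decision) != decision:
            return False
    return True

def reducts():
    """All minimal consistent attribute subsets (brute force over sizes)."""
    for size in range(1, len(ATTRS) + 1):
        found = [c for c in combinations(range(len(ATTRS)), size) if consistent(c)]
        if found:
            return [{ATTRS[i] for i in c} for c in found]
    return []

minimal = reducts()
```

On this toy table no single attribute suffices, but slope and rainfall together do, so lithology is dispensable; attributes appearing in every reduct form the core.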

11.
12.
Abstract— In this paper, two models to evaluate the temporal behavior of liquid-crystal displays are described: a model assuming linear display behavior and a model that incorporates non-linear effects. With the linear temporal model, it can be predicted that the response time starts to contribute to motion blur when it is longer than one-sixth of the hold time and becomes dominant when it is longer than eight times the hold time. The non-linear model can be used to visualize the appearance of effects that cannot be determined via linear system theory. Some means to reduce display artifacts are also described and their impact is illustrated. Although the main focus of this article is the temporal behavior of liquid-crystal displays, the spatial properties defined by the pixel structure can be simulated as well. A formula for the spatio-temporal display behavior is given, which can be evaluated numerically to simulate the perceived image for arbitrary image-sequence input material.
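A crude numerical sketch in the spirit of the linear model: the display's temporal aperture is taken as a hold-time box convolved with an exponential liquid-crystal response, and blur is read off as the time span holding the central 80% of the aperture's area. All parameter values are illustrative and this is a qualitative reconstruction, not the paper's formula; it only shows that a long response spreads the aperture, while a short one leaves the hold-time blur dominant.

```python
import math

DT = 0.1  # simulation step, ms

def temporal_aperture(hold_ms, response_ms):
    """Discrete convolution of a hold-time box with an exponential response."""
    n = int((hold_ms + 8 * response_ms) / DT)
    box = [1.0 if i * DT < hold_ms else 0.0 for i in range(n)]
    lc = [math.exp(-i * DT / response_ms) for i in range(n)]
    return [sum(box[j] * lc[i - j] for j in range(i + 1)) for i in range(n)]

def blur_width(hold_ms, response_ms):
    """Width (ms) holding the central 80% of the aperture's area."""
    ap = temporal_aperture(hold_ms, response_ms)
    total = sum(ap)
    acc, t10, t90 = 0.0, None, None
    for i, a in enumerate(ap):
        acc += a
        if t10 is None and acc >= 0.1 * total:
            t10 = i * DT
        if t90 is None and acc >= 0.9 * total:
            t90 = i * DT
    return t90 - t10

slow = blur_width(16.7, 8.0)   # response time comparable to the hold time
fast = blur_width(16.7, 1.0)   # response time well below hold time / 6
```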

13.
This paper proposes several goal programming (GP) models for estimating the performance measure weights of firms by means of constrained regression. Since some single-criterion performance measures are usually in conflict, we propose two opposed alternatives for determining multiple-criterion performance: the first is to calculate a consensus performance that reflects the majority trend of the single-criterion measures, and the other is to calculate a performance that is biased towards the measures that show the most discrepancy with the rest. GP makes it possible to model both approaches as well as a compromise between the two extremes. Using two case studies reported in the literature and introducing another examining non-financial companies listed in the Ibex-35, we compare our proposal with other methods such as CRITIC and a modified version of TOPSIS. To strengthen the comparisons, a Monte Carlo simulation has been performed in all three case studies.
Scope and purpose
The study falls into the area of multiple-criteria analysis of business performance. Firms are obliged to report a vast amount of financial information at regular intervals, and for this there is a wide range of performance measures. Multicriteria performance is calculated from the single-criterion measures and is then used to draw up rankings of firms. As a complement to the other multicriteria methods described in the literature, we propose the use of GP for implementing two quite different strategies: overweighting the measures in line with the general trend, or overweighting the measures that conflict with the rest. Besides the use of Spearman's correlation, we introduce two other measures for comparing the solutions obtained.
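The contrast between a "majority trend" score and one biased toward the most discrepant measure can be shown with a solver-free analogue (the paper formulates both extremes as GP problems; the median stands in for the consensus here). The firms and normalised measure values are invented.

```python
import statistics

firms = {"A": [0.9, 0.8, 0.1], "B": [0.5, 0.5, 0.6]}

def consensus(values):
    """Majority-trend score: the median single-criterion measure."""
    return statistics.median(values)

def discrepancy_biased(values):
    """Score taken from the measure farthest from the majority trend."""
    med = statistics.median(values)
    return max(values, key=lambda v: abs(v - med))

ranking_consensus = sorted(firms, key=lambda f: consensus(firms[f]), reverse=True)
ranking_discrepant = sorted(firms, key=lambda f: discrepancy_biased(firms[f]),
                            reverse=True)
```

Firm A leads on the majority of measures but has one outlier, so the two strategies rank the firms in opposite orders, which is the tension the GP models are built to mediate.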

14.
A new discrete-time dynamic input-output economic model is proposed. A control system formulation is undertaken in which the rates of change of capital stock and production are used in the control (policy or instrument) vector. The model is a supply-demand disequilibrium model, allowing excess demand to exist at any time. The exogenous final demand is modeled as a disturbance input.
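A bare-bones discrete-time input-output recursion in the spirit of the model above: gross output evolves as x(t+1) = A x(t) + d(t), with A a hypothetical technical-coefficients matrix and d the exogenous final demand treated as a disturbance. The paper's control terms (rates of change of capital stock and production) are omitted.

```python
A = [[0.2, 0.1],
     [0.3, 0.4]]          # inter-industry technical coefficients (invented)

def step(x, demand):
    """One period of the open dynamic input-output recursion."""
    return [sum(A[i][j] * x[j] for j in range(len(x))) + demand[i]
            for i in range(len(x))]

x = [100.0, 50.0]          # initial gross output by sector
x = step(x, [10.0, 20.0])  # one simulation step
```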

15.
16.
Two-sided assembly lines are used especially in the assembly of large-sized products, such as trucks and buses. In this type of production line, both sides of the line are used in parallel. In practice, it may be necessary to optimize more than one conflicting objective simultaneously to obtain effective and realistic solutions. This paper presents a mathematical model, a pre-emptive goal programming model for precise goals, and a fuzzy goal programming model for imprecise goals for two-sided assembly line balancing. The mathematical model minimizes the number of mated stations as the primary objective and the number of stations as a secondary objective for a given cycle time. Zoning constraints are also considered in this model, and a set of test problems taken from the literature is solved. The proposed goal programming models are the first multiple-criteria decision-making approaches to the two-sided assembly line balancing problem with multiple objectives. The number of mated stations, the cycle time, and the number of tasks assigned per station are considered as goals. An example problem is solved and a computational study is conducted to illustrate the flexibility and efficiency of the proposed goal programming models. Based on the decision maker's preferences, the proposed models are capable of improving the goal values.
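The core objects of the problem can be sketched quickly: each task carries a side restriction (L, R, or E for either) and a duration, and tasks are packed into mated stations (a left and a right station sharing one position) under a cycle time. The greedy first-fit below is only a naive stand-in for the paper's goal programming models, and it ignores precedence and zoning constraints; the task set is invented, with every duration within the cycle time.

```python
CYCLE = 6
tasks = [("t1", "L", 4), ("t2", "R", 3), ("t3", "E", 5),
         ("t4", "L", 2), ("t5", "E", 4)]

def balance(tasks):
    """Greedy first-fit; returns [left_load, right_load] per mated station."""
    stations = []
    for _, side, dur in tasks:
        sides = {"L": [0], "R": [1], "E": [0, 1]}[side]
        for st in stations:
            placed = False
            for s in sides:
                if st[s] + dur <= CYCLE:   # fits on an allowed side
                    st[s] += dur
                    placed = True
                    break
            if placed:
                break
        else:                              # open a new mated station
            st = [0, 0]
            st[sides[0]] += dur
            stations.append(st)
    return stations

stations = balance(tasks)
```

Minimising the number of mated stations, as in the paper's primary objective, means driving len(stations) down, which greedy packing does not guarantee.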

17.
18.
A fuzzy regression model is developed to construct the relationship between the response and explanatory variables in fuzzy environments. To enhance explanatory power and take into account the uncertainty of the formulated model and parameters, a new operator, called the fuzzy product core (FPC), is proposed for the formulation processes to establish fuzzy regression models with fuzzy parameters using fuzzy observations that include fuzzy response and explanatory variables. In addition, the sign of parameters can be determined in the model-building processes. Compared to existing approaches, the proposed approach reduces the amount of unnecessary or unimportant information arising from fuzzy observations and determines the sign of parameters in the models to increase model performance. This improves the weakness of the relevant approaches in which the parameters in the models are fuzzy and must be predetermined in the formulation processes. The proposed approach outperforms existing models in terms of distance, mean similarity, and credibility measures, even when crisp explanatory variables are used.
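The FPC operator itself is specific to the paper; as background, this minimal triangular-fuzzy-number sketch shows how a fuzzy regression estimate y = a0 + a1 x is formed from fuzzy parameters and a crisp, non-negative explanatory variable (the abstract's final case). Coefficient values are invented.

```python
class TFN:
    """Triangular fuzzy number (left endpoint, mode, right endpoint)."""
    def __init__(self, l, m, r):
        assert l <= m <= r
        self.l, self.m, self.r = l, m, r

    def __add__(self, other):
        # Fuzzy addition is componentwise for triangular numbers.
        return TFN(self.l + other.l, self.m + other.m, self.r + other.r)

    def scale(self, k):
        # Multiplication by a crisp non-negative scalar preserves shape.
        assert k >= 0
        return TFN(self.l * k, self.m * k, self.r * k)

    def as_tuple(self):
        return (self.l, self.m, self.r)

a0, a1 = TFN(1.0, 2.0, 3.0), TFN(0.5, 1.0, 1.5)
y_hat = a0 + a1.scale(4.0)              # fuzzy prediction at x = 4
```

Note how the spread of y_hat grows with x, which is why the sign determination and information reduction the paper contributes matter for keeping fuzzy estimates tight.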

19.
We propose a new quantization of homogeneous cosmological models. Four fundamental methods are applied to the cosmological model and efficiently combined: the Dirac method for constrained systems is used, then the Fock space is built and second quantization is carried out; finally, the diagonalization ansatz, a combination of the Bogoliubov transformation method and the Heisenberg equation of motion, is formulated. The temperature of a quantum cosmological model is introduced.

20.
Conclusion  The notion of compatibility of automata was proposed in [1] to formalize the requirements that must be met by interacting partial automata. Testing the compatibility of automata is essential for the design of systems that interact with their environment, especially when a declarative specification of the system to be designed is used. Under the assumptions of this article for the automaton that models the environment, partiality of the specified automaton is a source of possible incompatibility with the environment. When declarative specification is used, we can never decide in advance whether the specified automaton is partial or not. Moreover, even a specification that a priori describes a completely defined automaton may be altered by the actions of the designer in the process of design (especially if these actions are incorrect) so that the specified automaton becomes partial. Therefore the initial specification, and each successive specification produced by human intervention in the design process, must be tested for compatibility with the environment. In the methodology of verification design of automata, compatibility testing is used to solve two problems: (a) generating the specification of the class of all automata that satisfy the initial specification and are compatible with the specification of the environment; (b) testing the correctness of the designer's decisions that alter the current specification of the automaton being designed. The results of this article have led to the development of an efficient resolution procedure for testing the compatibility of an automaton specification with the specification of the environment. This procedure has been implemented in the system for verification design of automata from their logical specifications.
The efficiency of the developed procedure is based on the results of the compatibility analysis of automata in [1] and on the restricted resolution strategy whose completeness and correctness were proved in [2]. Translated from Kibernetika i Sistemnyi Analiz, No. 6, pp. 36–50, November–December, 1994.
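The compatibility notion the conclusion describes can be sketched as a product traversal: a partial specification automaton is compatible with its environment if, along every reachable joint run, the environment never issues an input on which the specification is undefined. The two-state automata below are invented examples, and this explicit search stands in for the paper's resolution procedure.

```python
from collections import deque

# Partial specification automaton: (state, input) -> next state.
spec = {("q0", "x"): "q1", ("q1", "y"): "q0"}

def compatible(env, spec_init="q0", env_init="e0"):
    """BFS over the product; False iff an undefined spec input is reachable."""
    seen = {(spec_init, env_init)}
    queue = deque(seen)
    symbols = {a for (_, a) in env}
    while queue:
        q, e = queue.popleft()
        for a in symbols:
            if (e, a) in env:                  # environment can issue a
                if (q, a) not in spec:
                    return False               # spec undefined here: incompatible
                nxt = (spec[(q, a)], env[(e, a)])
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return True

env_ok = {("e0", "x"): "e1", ("e1", "y"): "e0"}    # alternates x, y
env_bad = {("e0", "x"): "e1", ("e0", "y"): "e0"}   # may issue y immediately
```

The same spec is compatible with one environment and not the other, illustrating why each edited specification must be re-tested during verification design.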


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号