Similar Documents
 20 similar documents found (search time: 46 ms)
1.
2.
In this paper, we present a fully Bayesian approach for the estimation and selection of generalized Dirichlet mixtures. Parameter estimation is based on Gibbs sampling combined with a Metropolis-Hastings step, and we obtain a posterior distribution that is conjugate to the generalized Dirichlet likelihood. The number of clusters is selected using the integrated likelihood. The performance of our Bayesian algorithm is tested and compared with the maximum likelihood approach through the classification of several synthetic and real data sets. The generalized Dirichlet mixture is also applied to the problem of IR eye modeling and introduced as a probabilistic kernel for support vector machines.
Riad I. Hammoud
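The estimation technique named in the abstract can be illustrated with a minimal, generic random-walk Metropolis-Hastings sampler. This is only a sketch of the general sampling step, not the paper's Gibbs-within-Metropolis scheme for generalized Dirichlet mixtures; the target density (a standard normal) and the step size are illustrative assumptions.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=0.5, seed=0):
    """Generic random-walk Metropolis-Hastings sampler (illustrative sketch)."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)          # symmetric proposal
        log_alpha = log_target(proposal) - log_target(x)
        if math.log(rng.random()) < log_alpha:       # accept with prob min(1, alpha)
            x = proposal
        samples.append(x)
    return samples

# Toy target: standard normal log-density (up to an additive constant).
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples[5000:]) / len(samples[5000:])     # discard burn-in
```

In the mixture setting, the same accept/reject step would be applied to parameters whose full conditionals are not available in closed form, while the remaining parameters are drawn directly by Gibbs sampling.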

3.
The duration of a software project is a very important feature, closely related to its cost. Various methods and models have been proposed to predict not only the cost of a software project but also its duration. Since duration is essentially the random length of a time interval from a starting to a terminating event, in this paper we present a framework of statistical tools appropriate for studying and modeling the distribution of duration. The idea for our approach comes from the parallel between a project's duration and the lifetime of an entity, which is frequently studied in biostatistics by a methodology known as survival analysis. This type of analysis offers great flexibility in modeling duration and in computing various statistics useful for inference and estimation. As in any other statistical methodology, the approach is based on datasets of measurements on projects. One of its most important advantages, however, is that the data can include information not only from completed projects but also from ongoing ones. In this paper we present the general principles of the methodology for a comprehensive duration analysis and illustrate it with applications to known data sets. The analysis showed that duration is affected by various factors such as customer participation, use of tools, software logical complexity, user requirements volatility, and staff tool skills.
Ioannis Stamelos
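The ability to use ongoing projects mentioned above is exactly what right-censoring gives survival analysis. A minimal sketch of the Kaplan-Meier estimator follows; the project data are hypothetical, and ongoing projects enter as censored observations.

```python
def kaplan_meier(durations):
    """Kaplan-Meier survival estimate.
    durations: list of (time, completed) pairs; completed=False marks an
    ongoing (right-censored) project observed only up to `time`.
    Returns [(t, S(t))] at each completion time."""
    at_risk = len(durations)
    surv, curve = 1.0, []
    # Sort by time; at tied times, process completions before censorings.
    for t, completed in sorted(durations, key=lambda p: (p[0], not p[1])):
        if completed:
            surv *= (at_risk - 1) / at_risk   # factor (1 - d/n) with d = 1
            curve.append((t, surv))
        at_risk -= 1                          # project leaves the risk set
    return curve

# Hypothetical durations in months; False = still ongoing at observation.
projects = [(3, True), (5, True), (5, False), (8, True), (10, False)]
curve = kaplan_meier(projects)
```

The censored projects at 5 and 10 months contribute to the risk sets of earlier completion times without ever being counted as completions, which is why discarding them would bias the estimated durations downward.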

4.

Ocean Colour Monitor (OCM) data gathered onboard the Indian Remote Sensing satellite (IRS P4), in eight VNIR bands covering the Gulf of Kutch and adjoining regions, were digitally processed using an image processing system. False colour composite images of the actual data and of principal components were generated to study sediment transport, depth variation and associated processes by mapping coastal and underwater geomorphic features and suspended sediment plumes. Submerged shoals located as deep as 20 m below water level were correctly delineated. By studying turbidity distribution patterns and sediment transport indicators, sediment distribution and dispersion have been inferred. The study demonstrates that OCM data capture information from water depths of up to 20 m in the high-energy, tide-dominated Gulf and hence are useful for mapping underwater bedforms, sediment plumes and other geomorphic features. The study also indicates that sediments are transported into the Gulf from the north as well as from the south, and that this transport is mainly season dependent. It is concluded that OCM data can be used for depth variation and sediment distribution studies.

5.
Numerical modelling is a tool to investigate the controls on the formation of the stratigraphic record on geological timescales. The model presented in this paper (DELTASIM) uses a process-response approach that simulates the stratigraphy of fluvial-dominated deltaic systems in two dimensions, based on simplified diffusion rules for cross-shore sedimentation. Net sedimentation is calculated for individual grain-size classes as the sum of independent erosion and deposition functions, enabling simulation of fluvio-deltaic stratigraphy in addition to clinoform evolution. Critical sediment transport parameters are validated using synthetic data from a process-based morphodynamic model, DELFT3D. Generic experiments show the effect of changes in sea level, sediment supply, offshore gradient and sediment size distribution, and demonstrate that the model is fully capable of reproducing classic concepts of delta development on geological timescales. Such experiments give students the opportunity to evaluate the controls on the formation of the stratigraphic record. DELTASIM has been successfully applied to improve understanding of the sedimentary evolution of a real-world fluvial-dominated delta in the Caspian Sea. Additional functionality encompasses a stochastic discharge model that can be used as input to simulate series of delta-development scenarios, exploiting the model's rapid run time. This enables probabilistic output of longitudinal stratigraphic sections as an alternative to the deterministic predictions often made by stratigraphic models. The model's simplicity, speed and output compatibility with conceptual sequence stratigraphic models make DELTASIM suitable as a teaching tool.
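The "simplified diffusion rules of cross-shore sedimentation" can be sketched as one explicit time step of 1D topographic diffusion. This is a generic toy, not DELTASIM's actual scheme; the profile, diffusivity and supply term are hypothetical.

```python
def diffuse(elevation, kappa, dx, dt, supply=0.0):
    """One explicit step of 1D topographic diffusion, dz/dt = kappa * d2z/dx2,
    with an optional sediment input at the landward cell as a crude stand-in
    for fluvial supply. Boundaries are treated as no-flux (closed basin)."""
    n = len(elevation)
    new = elevation[:]
    c = kappa * dt / (dx * dx)   # must be <= 0.5 for numerical stability
    for i in range(n):
        left = elevation[i - 1] if i > 0 else elevation[i]
        right = elevation[i + 1] if i < n - 1 else elevation[i]
        new[i] = elevation[i] + c * (left - 2 * elevation[i] + right)
    new[0] += supply * dt        # fluvial input at the landward end
    return new

# Hypothetical cross-shore elevation profile, landward to seaward.
profile = [10.0, 8.0, 6.0, 4.0, 2.0]
stepped = diffuse(profile, kappa=1.0, dx=1.0, dt=0.2)
```

With no supply, the no-flux boundaries conserve total sediment volume while smoothing the profile, which is the basic behaviour a diffusion-based clinoform model builds on.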

6.
Due to their complexity, the syntax of modern modeling languages is preferably defined in two steps. The abstract syntax identifies all modeling concepts, whereas the concrete syntax clarifies how these concepts are rendered by graphical and/or textual elements. While the abstract syntax is often defined in the form of a metamodel, no such standard format yet exists for concrete syntax definitions. The diversity of definition formats, ranging from EBNF grammars to informal text, is becoming a major obstacle to advances in modeling language engineering, including the automatic generation of editors. In this paper, we propose a uniform format for concrete syntax definitions. Our approach captures both textual and graphical model representations and even allows more than one rendering to be assigned to the same modeling concept. Consequently, following our approach, a model can have multiple, fully equivalent representations; however, in order to avoid ambiguities when reading a model representation, two different models should always have distinguishable representations. We call a syntax definition correct if all well-formed models are represented in a non-ambiguous way. As the main contribution of this paper, we present a rigorous analysis technique to check the correctness of concrete syntax definitions.
Thomas Baar
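The correctness notion above (multiple renderings per model are fine, but distinct models must stay distinguishable) can be sketched as an injectivity check over a finite set of models. This toy operates on an explicit rendering map, not on the paper's analysis technique; the modeling concepts and renderings are hypothetical.

```python
def find_ambiguities(render):
    """render: dict mapping a model to the set of its (possibly multiple)
    textual renderings. A syntax definition is ambiguous if two *different*
    models share a rendering; several renderings of the *same* model are
    allowed. Returns the set of clashing renderings."""
    seen = {}        # rendering -> model that produced it
    clashes = set()
    for model, forms in render.items():
        for form in forms:
            if form in seen and seen[form] != model:
                clashes.add(form)
            seen[form] = model
    return clashes

# Hypothetical concrete syntax: one concept has two equivalent renderings,
# and a second concept accidentally reuses one of them.
syntax = {
    "AssocAggregate": {"A <>- B", "aggregate A of B"},
    "AssocPlain": {"A -- B"},
    "AssocComposite": {"A <>- B"},   # clash: same text as AssocAggregate
}
clashes = find_ambiguities(syntax)
```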

7.
We consider a class of production/inventory control problems with a single product and a single stocking location, for which a stochastic demand with a known non-stationary probability distribution is given. Under the widely known replenishment cycle policy, the problem of computing policy parameters under service-level constraints has been modeled using various techniques. Tarim and Kingsman introduced a modeling strategy that constitutes the state-of-the-art approach for solving this problem. In this paper we identify two sources of approximation in Tarim and Kingsman's model and propose an exact stochastic constraint programming approach. We build our approach on a novel concept, global chance-constraints, which we introduce in this paper. Solutions provided by our exact approach are employed to analyze the accuracy of the model developed by Tarim and Kingsman. This work was supported by Science Foundation Ireland under Grant No. 03/CE3/I405 as part of the Centre for Telecommunications Value-Chain-Driven Research (CTVR) and Grant No. 00/PI.1/C075.
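The kind of policy parameter computed under a service-level constraint can be illustrated with the textbook order-up-to formula for normally distributed cycle demand. This is the standard approximation, not Tarim and Kingsman's model or the exact constraint programming approach; the demand figures are hypothetical.

```python
from statistics import NormalDist

def order_up_to(mean_demand, sd_demand, service_level):
    """Order-up-to level S such that P(cycle demand <= S) >= service_level,
    assuming normally distributed cycle demand (textbook approximation)."""
    z = NormalDist().inv_cdf(service_level)   # standard normal quantile
    return mean_demand + z * sd_demand

# Non-stationary demand: a different order-up-to level per replenishment
# cycle. Hypothetical (mean, standard deviation) pairs.
cycles = [(100, 20), (150, 30), (80, 10)]
levels = [order_up_to(m, s, 0.95) for m, s in cycles]
```

For a 95% service level the safety-stock term is about 1.645 standard deviations, so the first cycle's level is roughly 133 units.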

8.
A new approach to software reliability modeling is discussed in which variables indirectly related to software reliability are used to provide additional information for the modeling process. Previous studies, empirical and theoretical evidence, and experimental results indicate that there is a strong relationship between software reliability and the coverage of program elements required to be exercised by structural testing criteria. This paper develops a binomial-type coverage-based software reliability model through the definition of a coverage-based failure rate function. The Binomial software reliability Model Based on Coverage (BMBC) is proposed and discussed. In the BMBC, test data between failures are used instead of time as the independent variable; the model was assessed with test data from a real application, making use of the following structural testing criteria: all-nodes, all-edges, and potential-uses, a data-flow-based family of testing criteria. The results from our experiments show that our modeling approach has advantages over some traditional reliability models and points to a very promising research direction in software reliability.
José Carlos Maldonado

9.
The presentation of multimedia data is characterized not only by precise temporal constraints; spatial constraints must also be taken into account. An important requirement in multimedia systems is thus the integrated modeling of spatio-temporal constraints. Moreover, it is important to devise methods for checking the consistency of the specified constraints. In this paper, we first propose a spatio-temporal object graph (STOG) model that provides an integrated and graphical representation of spatio-temporal constraints. Second, we investigate consistency conditions between spatial and temporal constraints expressed in the STOG model. Then, we present a prototype system implementing the proposed model.

10.
This paper addresses the task of coordinated planning of a supply chain (SC). Work in process (WIP) in each facility participating in the SC, finished goods inventory, and backlogged demand costs are minimized over the planning horizon. In addition to the usual modeling of linear material flow balance equations, variable lead time (LT) requirements, resulting from the increasing incremental WIP as a facility’s utilization increases, are also modeled. In recognition of the emerging significance of quality of service (QoS), that is, control of stockout probability to meet demand on time, maximum stockout probability constraints are also modeled explicitly. Lead time and QoS modeling require incorporation of nonlinear constraints in the production planning optimization process. The quantification of these nonlinear constraints must capture statistics of the stochastic behavior of production facilities revealed during a time scale far shorter than the customary weekly time scale of the planning process. The apparent computational complexity of planning production against variable LT and QoS constraints has long resulted in MRP-based scheduling practices that ignore the LT and QoS impact to the plan’s detriment. The computational complexity challenge was overcome by proposing and adopting a time-scale decomposition approach to production planning, where short-time-scale stochastic dynamics are modeled in multiple facility-specific subproblems that receive tentative targets from a deterministic master problem and return statistics to it. A converging and scalable iterative methodology is implemented, providing evidence that significantly lower cost production plans are achievable in a computationally tractable manner.
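The nonlinear lead-time effect described above (WIP and delay grow sharply as utilization approaches capacity) can be sketched with the M/M/1 sojourn-time formula and a bisection search for the largest sustainable release rate. This is a generic queueing toy, not the paper's master/subproblem decomposition; the capacity and lead-time quota are hypothetical.

```python
def lead_time(throughput, capacity):
    """M/M/1 sojourn time: grows without bound as utilization approaches 1."""
    assert throughput < capacity
    return 1.0 / (capacity - throughput)

def max_throughput(capacity, lt_max, tol=1e-9):
    """Largest release rate whose queueing lead time stays within lt_max,
    found by bisection (a stand-in for the iterative planning scheme)."""
    lo, hi = 0.0, capacity - tol
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if lead_time(mid, capacity) <= lt_max:
            lo = mid
        else:
            hi = mid
    return lo

# Hypothetical facility: capacity 10 jobs/week, lead-time quota 0.5 weeks.
# Analytically, 1/(10 - x) <= 0.5 gives x <= 8.
lam = max_throughput(capacity=10.0, lt_max=0.5)
wip = lam * lead_time(lam, 10.0)   # Little's law: WIP = throughput * LT
```

The WIP of 4 jobs at 80% utilization would double if the quota were relaxed to 1 week, illustrating why planning that ignores the utilization-lead-time coupling degrades the plan.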

11.
Real-time discrete event systems are discrete event systems with timing constraints and can be modeled by timed automata, which are convenient for this purpose. However, due to their infinite state space, timed automata are not well suited for analyzing real-time discrete event systems. On the other hand, finite state automata, as the name suggests, are convenient for modeling and studying non-real-time discrete event systems. To exploit the advantages of finite state automata, one approach for studying real-time discrete event systems is to transform, by abstraction, the timed automata modeling them into finite state automata that describe the same behaviors; studies are then performed on the finite state automaton model by adapting methods designed for non-real-time discrete event systems. In this paper, we present a method for transforming timed automata into special finite state automata called Set-Exp automata. The method, called SetExp, models the passing of time through real events of two types: Set events, which correspond to resets with programming of clocks, and Exp events, which correspond to the expiration of clocks. These events allow the timing constraints to be expressed as event-ordering constraints. SetExp limits the state space explosion problem in comparison with other transformation methods for timed automata, notably when the magnitudes of the constants used to express the timing constraints are large. Moreover, SetExp is suitable, for example, for supervisory control and conformance testing of real-time discrete event systems.

12.
Topological relations play important roles in spatial query, analysis and reasoning. In two-dimensional space (ℝ²), most existing topological models can distinguish the eight basic topological relations between two spatial regions. Due to the arbitrariness and complexity of topological relations between spatial regions, it is difficult for these models to describe the order property of transformations among the topological relations, which is important for detailed analysis of spatial relations. To overcome this insufficiency of existing models, a multi-level modeling approach is employed to describe all the necessary details of region–region relations based upon topological invariants. In this approach, a set of hierarchical topological invariants is defined based upon the boundary–boundary intersection set (BBIS) of the two involved regions. These topological invariants are classified into three levels based upon the proposed spatial-set concept: content, dimension and separation number at the set level; the element type at the element level; and the sequence at the integrated level. Corresponding to these hierarchical invariants, multi-level formal models of topological relations between spatial regions are built. A practical example is provided to illustrate the use of the approach presented in this paper.
Zhilin Li

13.
Microgrids are subsystems of the distribution grid comprising generation capacity, storage devices and flexible loads, and operating as a single controllable system either connected to or isolated from the utility grid. In this work, a microgrid management system is developed in a stochastic framework. It is viewed as a constraint-based system that employs forecasts and stochastic techniques to manage microgrid operations. Uncertainties due to fluctuating demand and generation from renewable energy sources are taken into account, and a two-stage stochastic programming approach is applied to efficiently optimize microgrid operations while satisfying a time-varying request and operation constraints. At the first stage, before the realizations of the random variables are known, a decision on the microgrid operations has to be made. At the second stage, after the outcomes of the random variables become known, corrective actions must be taken, which have a cost. The proposed approach aims at minimizing the expected cost of these corrective actions. Mathematically, the stochastic optimization problem is stated as a mixed-integer linear programming problem, which is solved efficiently using commercial solvers. The stochastic problem is incorporated in a model predictive control scheme to further compensate for the uncertainty through the feedback mechanism. A case study of a microgrid is employed to assess the performance of the on-line optimization-based control strategy, and the simulation results are discussed. The method is also applied to an experimental microgrid: the experimental results show the feasibility and effectiveness of the proposed approach.
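The two-stage structure (commit before uncertainty is revealed, pay for recourse afterwards) can be sketched by scenario enumeration over a toy energy-commitment problem. The real model is a mixed-integer linear program solved by commercial solvers; the cost coefficients and net-demand scenarios below are hypothetical.

```python
def expected_cost(commit, scenarios, unit_cost=1.0, buy_cost=3.0, spill_cost=0.5):
    """First stage: commit `commit` units of generation at unit_cost.
    Second stage (recourse): per net-demand scenario, buy the shortfall
    at buy_cost or curtail the surplus at spill_cost."""
    total = commit * unit_cost
    for prob, demand in scenarios:
        shortfall = max(0.0, demand - commit)
        surplus = max(0.0, commit - demand)
        total += prob * (buy_cost * shortfall + spill_cost * surplus)
    return total

# Hypothetical net-demand scenarios as (probability, demand) pairs.
scenarios = [(0.3, 40.0), (0.5, 60.0), (0.2, 90.0)]
# Brute-force the first-stage decision over an integer grid.
best = min(range(0, 101), key=lambda c: expected_cost(float(c), scenarios))
```

Because buying shortfall is six times costlier than spilling surplus, the optimal commitment hedges above the low-demand scenario; a solver replaces the grid search once integer variables and network constraints enter.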

14.
Modeling process-related RBAC models with extended UML activity models

Context

Business processes are an important source for the engineering of customized software systems and are constantly gaining attention in the area of software engineering as well as in the area of information and system security. While the need to integrate processes and role-based access control (RBAC) models has been repeatedly identified in research and practice, standard process modeling languages do not provide corresponding language elements.

Objective

In this paper, we are concerned with the definition of an integrated approach for modeling processes and process-related RBAC models, including roles, role hierarchies, statically and dynamically mutually exclusive tasks, and binding-of-duty constraints on tasks.

Method

We specify a formal metamodel for process-related RBAC models. Based on this formal model, we define a domain-specific extension for a standard modeling language.

Results

Our formal metamodel is generic and can be used to extend arbitrary process modeling languages. To demonstrate our approach, we present a corresponding extension for UML2 activity models. The name of our extension is Business Activities. Moreover, we implemented a library and runtime engine that can manage Business Activity runtime models and enforce the different policies and constraints in a software system.

Conclusion

The definition of process-related RBAC models at the modeling level is an important prerequisite for the thorough implementation and enforcement of the corresponding policies and constraints in a software system. We identified the need for modeling support of process-related RBAC models from our experience in real-world role engineering projects and case studies. The Business Activities approach presented in this paper has been successfully applied in role engineering projects.
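One of the constraint types listed in the objective, statically mutual exclusive (SME) tasks, can be sketched as a simple consistency check over a role-task assignment. This toy is not the paper's formal metamodel or runtime engine; the roles, tasks and SME pairs are hypothetical.

```python
def sme_violations(role_tasks, sme_pairs):
    """Statically mutual exclusive tasks must never both be assignable to
    the same role. role_tasks: role -> set of tasks; sme_pairs: iterable
    of (task_a, task_b) pairs. Returns the list of violations found."""
    violations = []
    for role, tasks in role_tasks.items():
        for a, b in sme_pairs:
            if a in tasks and b in tasks:
                violations.append((role, a, b))
    return violations

# Hypothetical payment process: preparing and approving a payment are SME.
roles = {
    "clerk": {"prepare_payment", "file_invoice"},
    "manager": {"approve_payment", "prepare_payment"},   # violation
}
bad = sme_violations(roles, [("prepare_payment", "approve_payment")])
```

Dynamically mutual exclusive tasks would need the same check against per-process-instance execution histories rather than the static assignment, which is where a runtime engine comes in.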

15.
In this paper, we describe the process of parallelizing an existing, production-level, sequential Synthetic Aperture Radar (SAR) processor based on the Range-Doppler algorithmic approach. We show how, taking into account the constraints imposed by the software architecture and the related software engineering costs, it is still possible with a moderate programming effort to parallelize the software, and we present a message-passing interface (MPI) implementation whose speedup is about 8 on 9 processors, achieving near real-time processing of raw SAR data even on a moderately aged parallel platform. Moreover, we discuss a hybrid two-level parallelization approach that involves the use of both MPI and OpenMP. We also present GridStore, a novel data grid service to manage raw, focused and post-processed SAR data in a grid environment. Indeed, another aim of this work is to show how the processed data can be made available to a wide scientific community through the adoption of a data grid service providing both metadata and data management functionalities. In this way, along with near real-time processing of SAR images, we provide a data-grid-oriented system for data storage, publishing and management.
Giovanni Aloisio

16.
Construction and stepwise refinement of dependability models
This paper presents a stepwise approach for dependability modeling based on generalized stochastic Petri nets (GSPNs). The first-step model, called the functional-level model, is built from the system's functional specifications and is then completed by the structural model as soon as the system's architecture is known. It can then be refined according to three complementary aspects: component decomposition, state and event fine-tuning, and distribution adjustment to take into account increasing event rates. We define specific rules to make the successive transformations as easy and systematic as possible. This approach allows the various dependencies to be taken into account at the right level of abstraction: functional dependencies, structural dependencies and those induced by non-exponential distributions. Part of the approach is applied to an instrumentation and control (I&C) system in power plants.
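The smallest dependability model a GSPN reduces to is the two-state continuous-time Markov chain of a single repairable component, whose steady-state availability follows from the balance equations. The rates below are hypothetical; this illustrates the underlying arithmetic, not the paper's refinement rules.

```python
def availability(fail_rate, repair_rate):
    """Steady-state availability of one repairable component, the two-state
    CTMC underlying the simplest GSPN dependability model.
    Balance: p_up * fail_rate = p_down * repair_rate, with p_up + p_down = 1,
    giving p_up = repair_rate / (fail_rate + repair_rate)."""
    return repair_rate / (fail_rate + repair_rate)

# Hypothetical rates: one failure per 1000 h, repairs take 10 h on average.
a = availability(fail_rate=1 / 1000, repair_rate=1 / 10)
```

Refinement in the stepwise approach then replaces such a single component by a decomposed subnet, or an exponential delay by a phase-type one, while keeping this kind of steady-state analysis tractable.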

17.
An important aspect in the specification of conceptual schemas is the definition of general constraints that cannot be expressed by the predefined constructs provided by conceptual modeling languages. This is generally achieved by using general-purpose languages like OCL. In this paper we propose a new approach that facilitates the definition of such general constraints in UML. More precisely, we define a profile that extends the set of predefined UML constraints by adding certain types of constraints that are commonly used in conceptual schemas. We also show how our proposal facilitates reasoning about the constraints and their automatic code generation, study the application of our ideas to the specification of two real-life applications, and present a prototype tool implementation.
Ernest Teniente

18.
We present a new approach to modeling 2D surfaces and 3D volumetric data, as well as an approach for non-rigid registration; both are developed in the geometric algebra framework. The modeling approach is based on the marching cubes idea but uses spheres and their representation in the conformal geometric algebra; we therefore call it marching spheres. Since the object of interest must be segmented before modeling can proceed, we also include an approach for image segmentation based on texture and border information, developed in a region-growing strategy. We compare the results obtained with our modeling approach against those obtained with an approach based on Delaunay tetrahedrization, and our proposal considerably reduces the number of spheres. Afterward, a method for non-rigid registration of sphere-based models is presented. Registration is done in an annealing scheme, as in the Thin-Plate Spline Robust Point Matching (TPS-RPM) algorithm. As a final application of geometric algebra, we track in real time objects involved in surgical procedures.
Jorge Rivera-Rovelo

19.

This article investigates model predictive control (MPC) of linear systems subject to arbitrary (possibly unbounded) stochastic disturbances. An MPC approach is presented to account for hard input constraints and joint state chance constraints in the presence of unbounded additive disturbances. The Cantelli–Chebyshev inequality is used in combination with risk allocation to obtain computationally tractable but accurate surrogates for the joint state chance constraints when only the mean and variance of the arbitrary disturbance distributions are known. An algorithm is presented for determining the optimal feedback gain and optimal risk allocation by iteratively solving a series of convex programs. The proposed stochastic MPC approach is demonstrated on a continuous acetone–butanol–ethanol fermentation process, which is used in the production of biofuels.
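The Cantelli–Chebyshev surrogate can be sketched directly: for any distribution with known mean and variance, P(X ≥ μ + kσ) ≤ 1/(1 + k²), so choosing 1/(1 + k²) = δ yields a deterministic back-off of σ·√((1 − δ)/δ). The numerical values below are hypothetical; this shows only the single-constraint tightening, not the risk allocation across a joint chance constraint.

```python
import math

def cantelli_backoff(sigma, delta):
    """Distribution-free tightening from the Cantelli inequality:
    P(X >= mu + k*sigma) <= 1/(1 + k^2) for any distribution with mean mu
    and std sigma; setting 1/(1 + k^2) = delta gives k = sqrt((1-delta)/delta)."""
    return sigma * math.sqrt((1 - delta) / delta)

def surrogate_ok(mu, sigma, x_max, delta):
    """Deterministic surrogate guaranteeing the chance constraint
    P(X > x_max) <= delta, using only mean and variance."""
    return mu + cantelli_backoff(sigma, delta) <= x_max

# Hypothetical state bound: mean 1.0, std 0.5, limit 4.0, risk 5%.
ok = surrogate_ok(mu=1.0, sigma=0.5, x_max=4.0, delta=0.05)
```

At δ = 0.05 the back-off is √19 ≈ 4.36 standard deviations, far more conservative than the Gaussian quantile of about 1.64σ, which is the price of making no distributional assumption; risk allocation then trades the per-constraint δ values off against each other.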

20.
In this paper a new method is presented for combining 2D GIS vector data with a 2.5D DTM represented by a triangulated irregular network (TIN) to derive integrated, triangular, 2.5D object-based landscape models (also known as 2.5D-GIS-TIN). The algorithm takes into account special geometric constellations and fully exploits the existing topologies of both input data sets; it “sews the 2D data into the TIN like a sewing machine” while traversing the TIN along the 2D data. The new algorithm is called the radial topology algorithm. We discuss its advantages and limitations, and describe ways to eliminate redundant nodes generated during the integration process. With the help of four examples from practical work we show that it is feasible to compute and work with such integrated data sets. We also discuss the integrated data models in the light of various general requirements and conclude that integration based on triangulations has a number of distinct advantages.
Christian Heipke


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号