Similar Documents
10 similar documents found (search time: 78 ms)
1.
Software development cost estimation approaches — A survey
This paper summarizes several classes of software cost estimation models and techniques: parametric models, expertise-based techniques, learning-oriented techniques, dynamics-based models, regression-based models, and composite-Bayesian techniques for integrating expertise-based and regression-based models. Experience to date indicates that neural-net and dynamics-based techniques are less mature than the other classes, but that all classes are challenged by the rapid pace of change in software technology. The primary conclusion is that no single technique is best for all situations, and that a careful comparison of the results of several approaches is most likely to produce realistic estimates.
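To make the parametric class concrete, here is a minimal sketch of a COCOMO-81-style effort estimate, one representative of the parametric models the survey covers. The coefficients are the published basic-COCOMO values, but the example is illustrative only and is not any surveyed model's reference implementation.

```python
# Basic COCOMO-81-style parametric effort estimate: E = a * KLOC^b.
# Coefficients are the published basic-COCOMO values (illustrative only).
COEFFS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def basic_cocomo_effort(kloc: float, mode: str = "organic") -> float:
    """Estimated effort in person-months."""
    a, b = COEFFS[mode]
    return a * kloc ** b

for mode in COEFFS:
    print(f"{mode:13s}: {basic_cocomo_effort(32, mode):6.1f} person-months")
```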

2.
Equivalence is a fundamental notion for the semantic analysis of algebraic specifications. In this paper the notion of “crypt-equivalence” is introduced and studied w.r.t. two “loose” approaches to the semantics of an algebraic specification T: the class of all first-order models of T and the class of all term-generated models of T. Two specifications are called crypt-equivalent if for one specification there exists a predicate logic formula which implicitly defines an expansion (by new functions) of every model of that specification in such a way that the expansion (after forgetting unnecessary functions) is homologous to a model of the other specification, and if vice versa there exists another predicate logic formula with the same properties for the other specification. We speak of “first-order crypt-equivalence” if this holds for all first-order models, and of “inductive crypt-equivalence” if this holds for all term-generated models. Characterizations and structural properties of these notions are studied. In particular, it is shown that first-order crypt-equivalence is equivalent to the existence of explicit definitions, and that in the case of “positive definability” two first-order crypt-equivalent specifications admit the same categories of models and homomorphisms. Similarly, two specifications which are inductively crypt-equivalent via sufficiently complete implicit definitions determine the same associated categories. Moreover, crypt-equivalence is compared with other notions of equivalence for algebraic specifications: in particular, it is shown that first-order crypt-equivalence is strictly coarser than “abstract semantic equivalence” and that inductive crypt-equivalence is strictly finer than “inductive simulation equivalence” and “implementation equivalence”.
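As a toy illustration of the link between crypt-equivalence and explicit definitions (our example, not one from the paper): two specifications of Boolean algebras, one over {¬, ∧} and one over {¬, ∨}, are first-order crypt-equivalent, since each omitted operation is explicitly definable in the other signature via De Morgan's laws:

```latex
\begin{align*}
  x \lor y  &= \lnot(\lnot x \land \lnot y), \\
  x \land y &= \lnot(\lnot x \lor \lnot y).
\end{align*}
```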

3.
4.
Structured large margin machines: sensitive to data distributions
This paper proposes a new large margin classifier, the structured large margin machine (SLMM), that is sensitive to the structure of the data distribution. The SLMM approach incorporates the merits of “structured” learning models, such as radial basis function networks and Gaussian mixture models, with the advantages of “unstructured” large margin learning schemes, such as support vector machines and maxi-min margin machines. We derive the SLMM model from the concepts of “structured degree” and “homospace”, based on an analysis of existing structured and unstructured learning models. Then, by using Ward’s agglomerative hierarchical clustering on the input data (or the data mappings in the kernel space) to extract the underlying data structure, we formulate SLMM training as a sequential second-order cone programming problem. Many promising features of the SLMM approach are illustrated, including its accuracy, scalability, extensibility, and noise tolerance. We also demonstrate the theoretical importance of the SLMM model by showing that it generalizes existing approaches such as SVMs and M4s, provides novel insight into learning models, and lays a foundation for conceiving other “structured” classifiers.
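The structure-extraction step is easy to sketch. Below is a minimal, hedged version of that pre-processing stage only: Ward-linkage clustering of one class's data followed by per-cluster first and second moments. The cluster count is an arbitrary assumption here, and the SOCP training stage is omitted entirely.

```python
# Sketch of the SLMM structure-extraction step: Ward's agglomerative
# clustering on one class's inputs, then per-cluster (mean, covariance).
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def extract_structure(X: np.ndarray, n_clusters: int = 3):
    """Return (mean, covariance) for each Ward-linkage cluster of X."""
    labels = AgglomerativeClustering(n_clusters=n_clusters,
                                     linkage="ward").fit_predict(X)
    return [(X[labels == k].mean(axis=0),
             np.cov(X[labels == k], rowvar=False))
            for k in range(n_clusters)]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.3, size=(40, 2))
               for c in ([0, 0], [2, 2], [4, 0])])
for mean, cov in extract_structure(X):
    print("cluster mean:", np.round(mean, 2))
```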

5.
Most existing formulations for structural elements such as beams, plates, and shells do not allow for the use of general nonlinear constitutive models in a straightforward manner. Furthermore, because of the nature of the generalized coordinates used, such structural element models do not capture some Poisson modes, such as those that couple the deformation of the cross section of the structural element with its stretch and bending. In this paper, beam models that employ general nonlinear constitutive equations are presented using finite elements based on the nonlinear absolute nodal coordinate formulation. This formulation relaxes the assumptions of the Euler–Bernoulli and Timoshenko beam theories and allows for the use of general nonlinear constitutive models. The finite elements based on the absolute nodal coordinate formulation also allow for the rotation as well as the deformation of the cross section, thereby capturing Poisson modes which cannot be captured using other beam models. In this investigation, three different nonlinear constitutive models based on hyperelasticity theory are considered: the Neo-Hookean constitutive law for compressible materials, the Neo-Hookean constitutive law for incompressible materials, and the Mooney–Rivlin constitutive law, in which the material is assumed to be incompressible. These models, which allow capturing Poisson modes, are suitable for many materials and applications, including rubber-like materials and biological tissues governed by nonlinear elastic behavior. Numerical examples that demonstrate the implementation of these nonlinear constitutive models in the absolute nodal coordinate formulation are presented, and the results obtained using the nonlinear and linear constitutive models are compared. These results show that the use of nonlinear constitutive models can significantly enhance the performance and improve the computational efficiency of the finite element models based on the absolute nodal coordinate formulation. The results also show that when linear constitutive models are used in large deformation analysis, singular configurations are encountered and basic formulas such as Nanson’s formula are no longer valid; these singular configurations are not encountered when the nonlinear constitutive models are used.
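For reference, one common form of each of the three strain-energy densities named above is given below; the paper's exact variants may differ in the volumetric terms. Here I_1 and I_2 are invariants of the right Cauchy-Green deformation tensor, J = det F, and mu, lambda, C_10, C_01 are material constants.

```latex
\begin{align*}
  W_{\text{NH, compressible}}   &= \tfrac{\mu}{2}(I_1 - 3) - \mu \ln J + \tfrac{\lambda}{2}(\ln J)^2,\\
  W_{\text{NH, incompressible}} &= \tfrac{\mu}{2}(I_1 - 3), \qquad J = 1,\\
  W_{\text{Mooney--Rivlin}}     &= C_{10}(I_1 - 3) + C_{01}(I_2 - 3), \qquad J = 1.
\end{align*}
```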

6.
We list the most pressing reliability problems of the Unified Gas Supply System (UGSS) and its facilities, characterize the models developed for solving these problems, and present brief data on the software implementations of these models. Based on these models, the industry standard “Providing system reliability for gas transport and consumer gas supply stability” has been developed and approved.

7.
We discuss the relationship between ID-based key agreement protocols, certificateless encryption, and ID-based key encapsulation mechanisms. In particular, we show how, in some sense, ID-based key agreement is a primitive from which all the others can be derived. In doing so we focus on the distinction between what we term pure ID-based schemes and non-pure schemes, in various security models. We present security models for ID-based key agreement which do not “look natural” when considered as analogues of normal key agreement schemes, but which look more natural when considered in terms of the models used in certificateless encryption. We illustrate our models and constructions with two running examples, one pairing-based and one non-pairing-based. Our work highlights the distinctions between the two approaches to certificateless encryption and adds to the debate about what is the “correct” security model for certificateless encryption.
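To fix the shape of the objects being related, here is a minimal interface sketch in our own naming (not the paper's notation); it shows only the roles of the key generation centre (KGC) and the parties, with no actual cryptography.

```python
# Interface sketch of ID-based key agreement and an ID-based KEM.
# Naming and method signatures are our own illustrative assumptions.
from abc import ABC, abstractmethod

class IDBasedKeyAgreement(ABC):
    @abstractmethod
    def setup(self):
        """KGC: generate master secret and public parameters."""

    @abstractmethod
    def extract(self, identity: str):
        """KGC: derive the private key for a given identity."""

    @abstractmethod
    def agree(self, my_private_key, peer_identity: str, transcript):
        """Party: derive the shared session key from the exchange."""

class IDBasedKEM(ABC):
    """The paper argues that schemes like this can be derived from
    ID-based key agreement; only the interface is shown here."""

    @abstractmethod
    def encapsulate(self, identity: str):
        """Return (ciphertext, session_key) for the target identity."""

    @abstractmethod
    def decapsulate(self, private_key, ciphertext):
        """Recover the session key using the identity's private key."""
```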

8.
The abstraction of cryptographic operations by term algebras, called Dolev–Yao models, is essential in almost all tool-supported methods for proving security protocols. Recently, significant progress was made in proving that Dolev–Yao models can be sound with respect to actual cryptographic realizations and security definitions. The strongest results show this in the sense of blackbox reactive simulatability (BRSIM)/UC, a notion that essentially means the preservation of arbitrary security properties under arbitrary active attacks and in arbitrary protocol environments, with only small changes to the Dolev–Yao models and natural implementations. However, these results are so far restricted to core cryptographic systems like encryption and signatures. Typical modern tools and complexity results around Dolev–Yao models also allow operations with more algebraic properties, in particular XOR because of its clear structure and cryptographic usefulness. We show that it is not possible to extend the strong BRSIM/UC results to XOR, at least not with remotely the same generality and naturalness as for the core cryptographic systems. We also show that for every potential soundness result for XOR with secrecy implications, one significant change to typical Dolev–Yao models must be made. On the positive side, we show the soundness of a rather general Dolev–Yao model with XOR and its realization in the sense of BRSIM/UC under passive attacks. A preliminary version of this paper appeared in Proc. 10th European Symposium on Research in Computer Security [9].
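The algebraic rules that make XOR troublesome are easy to state operationally. The sketch below (our construction, not from the paper) normalizes a symbolic XOR term under associativity, commutativity, x XOR x = 0, and x XOR 0 = x: the normal form is just the set of atoms occurring an odd number of times, which is exactly the cancellation behaviour a symbolic attacker can exploit.

```python
# Normal form of a symbolic XOR term in a Dolev-Yao-style algebra:
# keep the atoms that occur an odd number of times (AC + cancellation).
from collections import Counter

ZERO = "0"

def xor_normalize(atoms):
    counts = Counter(a for a in atoms if a != ZERO)      # x XOR 0 = x
    odd = sorted(a for a, c in counts.items() if c % 2)  # x XOR x = 0
    return tuple(odd) if odd else (ZERO,)

# k XOR m XOR k collapses to m: a key used twice cancels out.
assert xor_normalize(["k", "m", "k"]) == ("m",)
assert xor_normalize(["k", "k"]) == (ZERO,)
print(xor_normalize(["n1", "k", "n2", "k", "0"]))  # ('n1', 'n2')
```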

9.
When extending the UML metamodel for a specific domain, the metamodel specifier frequently introduces meta-associations at MOF level M2 with the aim that they induce specific associations at MOF level M1. For instance, if a metamodel for software process modelling states that a “Role” is responsible for an “Artifact”, we can interpret that its specifier intended to model two aspects: (1) the implications of this meta-association at level M1 (e.g., the specific instance of Role “TestEngineer” is responsible for the specific instance of Artifact “TestPlans”); and (2) the implications of this meta-association at level M0 (e.g., “John Doe” is the responsible test engineer for elaborating the test plans for the package “Foo”). Unfortunately, the second aspect is often not enforced by the metamodel and, as a result, the models which are defined as its instances may not incorporate it. This problem, a consequence of the so-called “shallow instantiation” of Atkinson and Kühne (Proc. UML’01, LNCS 2185, Springer, 2001), prevents these models from being accurate enough, in the sense that they do not express all the information intended by the metamodel specifier and consequently do not distinguish meta-associations that induce associations at M1 from those that do not. In this article we introduce the concept of induced association that may arise when an extension of the UML metamodel is developed, and discuss the implications that this concept has both for the extended metamodel and for its instances. We also present a methodology to ensure that M1 models incorporate the associations induced by the metamodel of which they are instances. Next, as an example of application, we present a quality metamodel for software artifacts which makes intensive use of induced associations. Finally, we introduce a software tool that assists in the development of quality models as correct instantiations of the metamodel, ensuring the proper application of the induced associations as required by the metamodel.
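The running example can be phrased as a small executable check. The sketch below (our construction, reusing the abstract's Role/Artifact example) records the M2 meta-association and the M1 association it should induce, then tests whether an M0 link conforms, which is precisely the check that shallow instantiation leaves unenforced.

```python
# Toy three-level check for an induced association (illustrative naming).
# M2: meta-association between metaclasses Role and Artifact.
META_ASSOC = ("Role", "responsible_for", "Artifact")

# M1: instances of the metaclasses and the association induced by M2.
m1_induced = {("TestEngineer", "TestPlans")}

# M0: a concrete link conforms only if its M1 types are associated.
def conforms(role_type: str, artifact_type: str) -> bool:
    return (role_type, artifact_type) in m1_induced

# "John Doe" (a TestEngineer) is responsible for the test plans of "Foo".
print(conforms("TestEngineer", "TestPlans"))   # True
print(conforms("Developer", "TestPlans"))      # False: not induced at M1
```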

10.
Li P, Banerjee S, McBean AM. GeoInformatica, 2011, 15(3): 435–454
Statistical models for areal data are primarily used for smoothing maps to reveal spatial trends. Subsequent interest often resides in the formal identification of ‘boundaries’ on the map. Here boundaries refer to ‘difference boundaries’, representing significant differences between adjacent regions. Recently, Lu and Carlin (Geogr Anal 37:265–285, 2005) discussed a Bayesian framework for carrying out edge detection employing a spatial hierarchical model estimated using Markov chain Monte Carlo (MCMC) methods. Here we offer an alternative that avoids MCMC and is easier to implement. Our approach resembles a model comparison problem, where the models correspond to different underlying edge configurations across which we wish to smooth (or not). We incorporate these edge configurations in spatially autoregressive models and demonstrate how the Bayesian Information Criterion (BIC) can be used to detect difference boundaries in the map. We illustrate our methods with a Minnesota pneumonia and influenza hospitalization dataset to elicit the boundaries detected by the different models.
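In the same spirit, here is a heavily simplified sketch of the model-comparison idea: each candidate boundary configuration is a model, fitted by maximum likelihood, and the configuration minimizing BIC = k*ln(n) - 2*ln(Lhat) wins. The toy below uses iid normals per region rather than the paper's spatially autoregressive models, so it only illustrates the BIC mechanics.

```python
# BIC comparison of two edge configurations on a toy 1-D "map".
import numpy as np
from scipy.stats import norm

def bic(loglik: float, k: int, n: int) -> float:
    return k * np.log(n) - 2.0 * loglik

def gauss_loglik(y: np.ndarray) -> float:
    return norm.logpdf(y, y.mean(), y.std() + 1e-9).sum()

rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0, 1, 20), rng.normal(3, 1, 20)])

no_edge = bic(gauss_loglik(y), k=2, n=y.size)                      # one region
edge = bic(gauss_loglik(y[:20]) + gauss_loglik(y[20:]), k=4, n=y.size)
print(f"BIC, no boundary: {no_edge:.1f};  with boundary: {edge:.1f}")
# The lower BIC flags the difference boundary between the two regions.
```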

