Similar Literature
17 similar documents found.
1.
With the rapid development of computer network technology in China and the maturing trend of global informatization, the demand for information has grown ever stronger. Against this background, computer network systems, which use information as their transmission payload, have risen to prominence, and their reliability and security have drawn increasing attention. This paper studies the reliability of computer network systems, compares the strengths and weaknesses of existing approaches, and explores the feasibility of applying them to mine computer systems.

2.
Reliability analysis and optimal version-updating for open source software   Cited by: 1 (self-citations: 0; citations by others: 1)

Context

Although reliability is a major concern of most open source projects, research on this problem has attracted attention only recently. In addition, optimal version-updating for open source software, taking its special properties into account, has not yet been discussed.

Objective

In this paper, the reliability analysis and optimal version-updating for open source software are studied.

Method

A modified non-homogeneous Poisson process model is developed for open source software reliability modeling and analysis. Based on this model, optimal version-updating for open source software is investigated as well. In the decision process, the rapid release strategy and the level of reliability are the two most important factors; however, they essentially conflict with each other. In order to consider these two conflicting factors simultaneously, a new decision model based on multi-attribute utility theory is proposed.
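The specific modified NHPP model is not reproduced in this abstract. As a hedged illustration of the general approach, the sketch below fits the classic Goel-Okumoto mean value function m(t) = a(1 - exp(-bt)) to hypothetical cumulative failure counts and derives a post-release reliability estimate; the data, the grid ranges, and the choice of Goel-Okumoto are all assumptions for illustration, not the paper's model.

```python
import math

def mean_value(t, a, b):
    # Goel-Okumoto NHPP mean value function: expected cumulative failures by time t,
    # with a = expected total faults and b = fault detection rate.
    return a * (1.0 - math.exp(-b * t))

# Hypothetical cumulative failure counts observed at ten equally spaced times.
t_obs = list(range(1, 11))
failures = [5, 9, 13, 16, 18, 20, 21, 22, 23, 23]

# Crude least-squares fit over a parameter grid (a library optimizer would
# normally be used; a grid keeps the sketch dependency-free).
best = None
for a in [20 + 0.5 * i for i in range(41)]:
    for b in [0.01 * j for j in range(1, 101)]:
        sse = sum((mean_value(t, a, b) - y) ** 2 for t, y in zip(t_obs, failures))
        if best is None or sse < best[0]:
            best = (sse, a, b)
_, a_hat, b_hat = best

# Software reliability over a mission of length x after release time T:
# R(x | T) = exp(-(m(T + x) - m(T)))
T, x = 10.0, 1.0
reliability = math.exp(-(mean_value(T + x, a_hat, b_hat) - mean_value(T, a_hat, b_hat)))
```

The same fitted mean value function also supports the version-updating decision: releasing at a later T yields a higher R(x | T), which is what the rapid-release factor trades off against.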

Results

Our models are tested on real-world data sets from two well-known open source projects: Apache and GNOME. It is found that traditional software reliability models overestimate the reliability of open source software. In addition, the proposed decision model can help management make a rational decision on the optimal version-updating for open source software.

Conclusion

Empirical results reveal that the proposed model for open source software reliability describes the failure process more accurately. Furthermore, the proposed decision model can assist management in determining the appropriate version-update time for open source software.

3.
We present procedures and tools for the analysis of network traffic measurements. The tools consist of stand-alone modules that implement advanced statistical analysis procedures and a flexible web-based interface through which a user can create, modify, save, and execute experiments using the statistical analysis modules. The tools do not assume a specific source traffic model, but rather process actual measurements of network traffic. Indeed, the theory that the tools are based on can identify the time-scales that affect a link's performance, and hence suggest the appropriate time granularity of traffic measurements. We present and discuss case studies that demonstrate the application of the tools to questions of network management and dimensioning, such as the maximum link utilization when some quality of service is guaranteed, how this utilization is affected by the link buffer and traffic shaping, the acceptance region and the effects of the scheduling discipline, and the token (or leaky) bucket parameters of a traffic stream. The case studies involve actual traffic measurements obtained by a high performance measurement platform that we have deployed at the University of Crete network.
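One of the questions mentioned, deriving the token (leaky) bucket parameters of a traffic stream, admits a compact sketch: for a candidate token rate, the minimal bucket depth is the largest backlog a virtual bucket accumulates over the measured trace. The function and the trace below are illustrative assumptions, not the authors' implementation.

```python
def min_bucket_depth(byte_counts, rate):
    """Smallest bucket depth b such that a (rate, b) token bucket admits the
    whole trace without drops: track the worst-case deficit of a virtual
    bucket drained at `rate` bytes per interval."""
    depth, deficit = 0.0, 0.0
    for bytes_in in byte_counts:
        deficit = max(0.0, deficit + bytes_in - rate)
        depth = max(depth, deficit)
    return depth

# Hypothetical per-second byte counts from a measured trace.
trace = [1200, 300, 2500, 800, 100, 3000, 400]
depths = {r: min_bucket_depth(trace, r) for r in (800, 1200, 1600)}
```

Sweeping the rate yields the familiar rate/burstiness trade-off curve: a higher token rate always permits a smaller (never larger) bucket depth.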

4.
Traditional parametric software reliability growth models (SRGMs) are based on particular assumptions or distributions, and no single such model can produce accurate predictions in all circumstances. Non-parametric models such as artificial neural network (ANN) based models can predict software reliability from fault history data alone, without any assumptions. In this paper, we first propose a robust feedforward neural network (FFNN) based dynamic weighted combination model (PFFNNDWCM) for software reliability prediction. Four well-known traditional SRGMs are combined according to dynamically evaluated weights determined by the learning algorithm of the proposed FFNN. Based on this FFNN architecture, we also propose a robust recurrent neural network (RNN) based dynamic weighted combination model (PRNNDWCM) to predict software reliability more soundly. A real-coded genetic algorithm (GA) is proposed to train the ANNs. The predictability of the proposed models is compared with that of existing ANN-based software reliability models on three real software failure data sets. We also compare the proposed models with models built by combining two or three of the four SRGMs. Comparative studies demonstrate that PFFNNDWCM and PRNNDWCM provide more accurate fitting and predictive capability than the other existing ANN-based models. Numerical and graphical results show that PRNNDWCM is promising for software reliability prediction, since its fitting and prediction errors are much lower than those of PFFNNDWCM.
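The paper's dynamic weights are produced by GA-trained neural networks. As a minimal stand-in for the combination idea only, the sketch below finds the convex weight that best blends two hypothetical SRGM prediction series against observed failure counts; the data, the two-model restriction, and the grid search are all illustrative assumptions.

```python
def combine(preds_a, preds_b, w):
    # Convex combination of two SRGM prediction series; the paper learns
    # such weights dynamically with a neural network, a scalar weight is
    # the simplest instance of the idea.
    return [w * a + (1 - w) * b for a, b in zip(preds_a, preds_b)]

def fit_weight(preds_a, preds_b, observed, steps=1000):
    # Grid search for the weight minimizing squared error against observations.
    best_w, best_sse = 0.0, float("inf")
    for i in range(steps + 1):
        w = i / steps
        sse = sum((c - o) ** 2
                  for c, o in zip(combine(preds_a, preds_b, w), observed))
        if sse < best_sse:
            best_w, best_sse = w, sse
    return best_w

# Hypothetical cumulative-failure predictions of two traditional SRGMs,
# one over- and one under-estimating the observed counts.
observed = [5.0, 9.0, 13.0, 16.0, 18.0]
model_go = [6.0, 10.5, 14.5, 17.5, 19.5]   # overestimates
model_dl = [4.0, 7.5, 11.5, 14.5, 16.5]    # underestimates
w = fit_weight(model_go, model_dl, observed)
```

Because one model over- and the other under-predicts, the fitted blend sits between them, which is exactly the leverage a dynamically weighted combination exploits.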

5.
Software reliability modeling and application for embedded real-time control systems   Cited by: 1 (self-citations: 0; citations by others: 1)
郭荣佐  黄君 《计算机应用》2013,33(2):575-578
Embedded real-time control systems (ERCS) are widely used in all kinds of control systems. Their software differs from ordinary software: beyond real-time requirements, reliability is also critically important. This paper first gives a formal abstract definition of embedded real-time control system software, then builds reliability models for the indivisible software modules, and applies a Copula function to model the software system as a whole. Finally, the established model is used to compute the software reliability of a concrete system. Worked examples show that the Copula-based reliability model for embedded real-time control system software accounts for the dependence among software modules, and the resulting system reliability with dependent modules is higher than when the modules are assumed independent.
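The paper's specific Copula model is not given in this abstract. As an illustration of why accounting for module dependence raises the reliability estimate relative to the independence assumption, the sketch below evaluates a Gumbel copula on hypothetical module reliabilities; the copula family, theta, and the numbers are assumptions.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta = 1 gives independence,
    theta > 1 gives positive dependence between the two marginals."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

# Hypothetical reliabilities of two software modules in a series system.
r1, r2 = 0.95, 0.90

r_indep = r1 * r2                    # modules assumed independent
r_dep = gumbel_copula(r1, r2, 2.0)   # modules positively dependent
# r_dep exceeds r_indep: modeling the dependence raises the estimated
# system reliability, consistent with the abstract's conclusion.
```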

6.
Mobile payments can be categorized according to their usage in each of the five payment scenarios presented here. The paper proposes the mobile payment modeling approach (MPMA), especially suited for value-based analysis of mobile payment use cases. Based on this approach, the study also develops a set of seven reference models that can classify any given mobile payment use case or procedure and analyze it with regard to the business model, the roles of the market participants, and their interrelation from a value-based perspective. An introspective analysis of the mobile payment service provider role and a market constellation analysis, which shows the implications of different actors assuming one or more of the respective roles, complete the study.

7.
The analysis and classification of data is a common task in many fields of experimental research, such as bioinformatics, medicine, satellite remote sensing, and chemometrics, and it raises new challenges for appropriate analysis. Different machine learning methods have been proposed for this purpose, but they usually provide no information about the reliability of the classification, which is a common requirement in, e.g., medicine and biology. Along these lines, the present contribution offers an approach to enhance classifiers with reliability estimates in the context of prototype vector quantization. This extension can also be used to optimize the precision or recall of the classifier system and to determine items that are not classifiable, which can lead to significantly improved classification results. The method is demonstrated on satellite remote-sensing spectral data but is applicable to a wider range of data sets.

8.
As the structural and behavioral complexity of systems has increased, so has interest in reusing modules in early development phases. Developing reusable modules and then weaving them into specific systems has been addressed by many approaches, including plug-and-play software component technologies, aspect-oriented techniques, design patterns, superimposition, and product line techniques. Most of these ideas are expressed in an object-oriented framework, so they reuse behaviors after dividing them into methods that are owned by classes. In this paper, we present a crosscutting reuse approach that applies object-process methodology (OPM). OPM, which unifies system structure and behavior in a single view, supports the notion of a process class that does not belong to and is not encapsulated in an object class, but rather stands alone, capable of getting input objects and producing output objects. The approach features the ability to specify modules generically and concretize them in the target application. This is done in a three-step process: designing generic and target modules, weaving them into the system under development, and refining the combined specification in a way that enables the individual modules to be modified after their reuse. Rules for specifying and combining modules are defined and exemplified, showing the flexibility and benefits of this approach.

9.
In generalized renewal process (GRP) reliability analysis for repairable systems, Monte Carlo (MC) simulation is often used instead of numerical methods to estimate model parameters, because developing a mathematically tractable probabilistic model is complex and difficult. In this paper, based on the conditional Weibull distribution for repairable systems, using the negative log-likelihood as the objective function and adding inequality constraints on the model parameters, a nonlinear programming approach is proposed to estimate the restoration factor for the Kijima type I GRP model, as well as model II. This method minimizes the negative log-likelihood directly and avoids solving a complex system of equations. Three real, time-truncated field failure data sets of different types for NC machine tools are analyzed by the proposed numerical method. The sampling formulas of failure times for GRP models I and II are derived, and the effectiveness of the proposed method is validated against MC simulation. The results show that the GRP model is superior to the ordinary renewal process (ORP) and the power-law non-homogeneous Poisson process (PL-NHPP) models.
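The sampling formulas for the Kijima GRP models follow from the conditional Weibull distribution: given virtual age v, the next time-between-failures x solves S(v + x)/S(v) = U for uniform U, where S is the Weibull survival function. The sketch below is a generic illustration of that scheme; the parameters are invented, not taken from the NC machine tool data.

```python
import math, random

def sample_failures(beta, eta, q, n, kijima_type=1, seed=42):
    """Sample n successive times-between-failures for a GRP with Weibull
    baseline (shape beta, scale eta) and restoration factor q.
    Kijima I:  v_n = v_{n-1} + q * x_n   (repair removes only new damage)
    Kijima II: v_n = q * (v_{n-1} + x_n) (repair acts on cumulative age)
    """
    rng = random.Random(seed)
    v, times = 0.0, []
    for _ in range(n):
        u = rng.random()
        # Invert the conditional Weibull survival S(v + x) / S(v) = u.
        x = eta * ((v / eta) ** beta - math.log(u)) ** (1.0 / beta) - v
        times.append(x)
        v = v + q * x if kijima_type == 1 else q * (v + x)
    return times
```

Setting q = 0 recovers perfect repair (an ordinary renewal process), while q = 1 in model I gives minimal repair, so the restoration factor interpolates between the two limiting cases the abstract compares against.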

10.
朱清超  陈靖  龚水清 《计算机应用》2016,36(10):2664-2669
To address the low throughput and poor fairness of multi-rate medium access control (MAC) protocols in mobile ad hoc networks (MANETs), throughput expressions for nodes with different transmission rates are derived, and a quantitative analysis shows that the key limitation on protocol performance is the unfairness in channel occupancy time between low-rate and high-rate nodes. Aiming to maximize time fairness without degrading low-rate node performance, two mechanisms are proposed, optimizing the contention window and the packet length of low-rate nodes, so as to maximize high-rate node throughput and hence the saturation throughput of the network. Experimental results show that, with transmission rates of 1 Mb/s and 11 Mb/s and the Jain fairness index maximized, the simulated and theoretical optimal contention windows for low-rate nodes are 320 and 340, with packet lengths of 64 B and 60 B, respectively; low-rate node throughput is essentially unchanged, while the theoretical saturation throughput is 0.2-0.5 Mb/s higher than the simulated value, so both fairness and throughput are improved.
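The Jain fairness index used to select the optimal configuration is straightforward to compute. The sketch below evaluates it on hypothetical channel-occupancy shares; the 0.9/0.1 split mimics a low-rate node dominating airtime, the quantity the abstract's mechanisms correct.

```python
def jain_index(shares):
    """Jain's fairness index: 1.0 when all shares are equal,
    approaching 1/n when a single node monopolizes the resource."""
    n = len(shares)
    total = sum(shares)
    return total * total / (n * sum(x * x for x in shares))

# Hypothetical channel-occupancy shares for a 1 Mb/s and an 11 Mb/s node.
unfair = jain_index([0.9, 0.1])   # slow node dominates airtime
fair = jain_index([0.5, 0.5])     # time-fair allocation
```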

11.
The GroundWater Spatiotemporal Data Analysis Tool (GWSDAT) is a user-friendly, open source decision support tool for the analysis and reporting of groundwater monitoring data. Uniquely, GWSDAT applies a spatiotemporal model smoother for a more coherent and smooth interpretation of the interaction between the spatial and time-series components of groundwater solute concentrations. Data entry is via a standardised Microsoft Excel input template, whilst the underlying statistical modelling and graphical output are generated using the open source statistical program R. This paper describes in detail the various plotting options available and how the graphical user interface can be used for rapid, rigorous and interactive trend analysis with facilitated report generation. GWSDAT has been used extensively in the assessment of soil and groundwater conditions at Shell's downstream assets, and the discussion section describes the benefits of its applied use. Finally, some consideration is given to possible future developments.

12.
Requirements are the foundation of a software project, and correctly grasping user needs is key to a project's success. A flexible requirements-analysis technique can help developers capture user requirements accurately. Use-case modeling is an important part of object-oriented software development: it can fully capture a system's functional requirements and express the interaction between users and the system. Through a worked example, this paper briefly introduces how to use use-case modeling to carry out software requirements analysis.

13.
Non-Functional Requirements (NFRs) are rarely treated as "first-class" elements in software development the way Functional Requirements (FRs) are. Often NFRs are stated informally and incorporated into the final software as an afterthought. We leverage existing research on the treatment of NFRs to propose an approach that enables NFRs to be systematically analyzed and designed in parallel with FRs. Our approach is premised on the importance of focusing on tactics (the specific mechanisms used to fulfill NFRs) rather than on the NFRs themselves. Its advantages include filling the gap between NFR elicitation and NFR implementation; treating NFRs systematically by grouping tactics so that tactics in the same group can be addressed uniformly; remedying shortcomings in existing work by prioritizing NFRs and analyzing tradeoffs among them; and integrating FRs and NFRs by treating both as first-class entities.

14.
Best practices currently state that the security requirements and security architectures of distributed software-intensive systems should be based on security risk assessments, designed from security patterns, implemented in security standards, and tool-supported throughout the development life-cycle. Web service-based information systems uphold inter-enterprise relations through the Internet, and this technology has emerged as the reference solution with which to implement Service-Oriented Architectures. In this paper, we present the application of the Process for Web Service Security (PWSSec), developed by the authors, to a real web service-based case study. We also show how security in inter-organizational information systems can be analyzed, designed, and implemented by applying PWSSec, which combines risk analysis and management with a security architecture and a standards-based approach. We additionally present a tool built to support the PWSSec process.

15.
Organization scholars differ in their understanding and application of the construct of “knowledge” in theorizing and empirical research. Over the past years, two perspectives have become prevalent in organization science. The individualist perspective assumes the locus of knowledge is people who learn, and that knowledge cannot extend beyond the physical limits of human beings. The collectivist perspective assumes the locus of knowledge is collective. Collective entities accumulate knowledge through forms of social learning. Boundaries of knowledge are drawn around social entities—groups, communities, networks, and organizational units, etc. Recent work in management and organization science has accentuated the differences, and argued against the widespread adoption of a collectivist perspective. This argument holds implications for information systems research. The current paper reviews selected contributions on the locus of knowledge, presents an argument for a combined collectivist and individualist perspective, and outlines future directions for information systems research. Drawing on two significant examples, I show that information systems research has a strategic role to play in greatly advancing this combined perspective.

16.
This survey article highlights the difficulties in the field maintenance of telecommunication towers and critically analyses the main features of deploying robots to maintain them. The growing demand for mobile connectivity requires more towers, making the problem of network maintenance ever more critical. Most tower maintenance requires work at height, so height-related risks are frequent. A rigorous review is conducted, and the growth of the telecommunications network and key on-site maintenance challenges are analyzed. Despite the numerous challenges, these towers are maintained manually by riggers (certified climbers) worldwide. This raises the question: is it possible to automate the maintenance of telecommunication towers with robots? A feasibility analysis for deploying robots is conducted systematically. To enable a robot to access a tower, detailed information is collected on the types of towers, the climbing arrangements available on existing towers, and the operations to be carried out at height. A critical analysis of the climbing robots currently available in the literature, their grasping technology, and their control algorithms is performed. The opinions of experts in the telecommunication industry help identify the requirements of robotic systems. The design attributes needed for climbing robots and for executing maintenance at height are highlighted, and due justification is given for deploying robots for field maintenance of telecom towers. The recommended methodology for designing an automation system supports research on robotic maintenance of telecom towers, which could bring a remarkable solution to the telecom sector.

17.
Classical approaches to remote visualization and collaboration used in Computer-Aided Design and Engineering (CAD/E) applications are no longer adequate for the increasing amount of data generated, especially over standard networks. We introduce a lightweight computing platform for scientific simulation, collaboration in engineering, 3D visualization, and big data management. This ICT-based platform provides scientists with an “easy-to-integrate” generic tool, enabling worldwide collaboration and remote processing for any kind of data. Its service-oriented architecture is based on the cloud computing paradigm and relies on standard internet technologies to be efficient over a wide range of networks and clients. In this paper, we discuss the need for innovation in (i) pre- and post-processing visualization services, (ii) scalable compression and transmission methods for large 3D scientific data sets, (iii) collaborative virtual environments, and (iv) collaboration across multiple domains of CAD/E. We propose our open platform for collaborative simulation and scientific big data analysis, now available as an open project with all core components licensed under LGPL V2.1. We provide two examples of using the platform in CAD/E for sustainability engineering: one academic application and one industrial case study. First, we consider chemical process engineering, showing the development of a domain-specific service. With the rise of global warming issues and the growing importance granted to sustainable development, chemical process engineering has become increasingly environmentally conscious: the chemical engineer now takes into account not only the engineering and economic criteria of a process, but also its environmental and social performance.
Secondly, an example of natural hazards management illustrates the efficiency of our approach for remote collaboration involving big data exchange and analysis between distant locations. Finally, we highlight the platform's benefits and outline future work on innovation techniques and inventive design.
