Similar articles
20 similar articles found (search time: 0 ms)
1.
This paper describes computer-based heuristic models for Reliability and Maintainability (R&M) allocation in large systems. The model is an embellishment of the Maintenance Allocation Program (MAP) developed at McDonnell Douglas Corporation. The new version of MAP, known as REMAP (Reliability Embellished MAP), is a decision support tool for contractors involved in large-scale design projects such as aircraft design. REMAP lets the user supply preliminary R&M information about the system and reports "what if" changes to the requirements needed to meet the desired design reliability specification.

2.
This paper presents a multi-agent model system to characterize land-use change dynamics. Its replicable parameterization process should be useful for developing simulation frameworks that help environmental policy makers analyze different scenarios during decision making. The two-fold methodological approach builds a solid backbone on: (i) the systematic, structured empirical characterization of the model; and (ii) the definition of the conceptual structure according to the agent-based model documentation protocol Overview, Design concepts and Details (ODD). A multi-agent system for land-use change simulation was developed to validate the model, illustrated with a case study of the Brazilian Cerrado using LANDSAT ETM images. The simulation results demonstrate the model's value, with a figure of merit greater than 50%, meaning that the amount of correctly predicted change is larger than the sum of all types of error. The results compare favorably with nine popular peer-reviewed land change models.
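The figure-of-merit criterion can be made concrete with hypothetical cell counts (invented for illustration, not the case-study numbers):

```python
# Hypothetical change-map cell counts: hits are correctly predicted
# change cells; misses and false alarms are the two error types.
hits, misses, false_alarms = 120, 60, 40

fom = hits / (hits + misses + false_alarms)
print(fom)  # 0.5454…: above 0.5 exactly when hits > misses + false_alarms
```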

3.
Adaptive simulation for system reliability analysis of large structures   (cited 6 times: 0 self-citations, 6 citations by others)
This article proposes an efficient simulation-based methodology to estimate the system reliability of large structures. The proposed method uses a hybrid approach: first, a probabilistic enumeration technique is used to identify a significant system failure sequence. This provides an initial sampling domain for an adaptive importance sampling procedure. As further simulations are performed, information about other significant sequences is incorporated to refine the sampling domain and to estimate the failure probability of the system. The adaptive sampling overcomes the restrictive assumptions of analytical techniques, yet achieves the robustness and accuracy of basic Monte Carlo simulation in an efficient manner. In this article, the proposed method is implemented using the ANSYS finite element software, and is applied to the system reliability estimation of two redundant structural systems, a six-story building frame and a transmission tower. Both ductile and brittle failure modes are considered. The method is structured in a modular form such that it can be easily applied to different types of problems and commercial software, thus facilitating practical application.
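The ANSYS-based implementation is not reproduced here, but the importance-sampling idea underlying the method can be sketched for a scalar limit state (the limit state function and the shifted sampling density are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar limit state: failure when g(x) < 0, i.e. when the
# standard-normal load variable exceeds 3 (a rare event).
def g(x):
    return 3.0 - x

N = 100_000

# Crude Monte Carlo: very few samples land in the failure region.
x = rng.standard_normal(N)
p_crude = np.mean(g(x) < 0)

# Importance sampling: draw from a density shifted toward the failure
# region (normal centered at the design point 3) and reweight each sample
# by the likelihood ratio f(y)/h(y); the normalizing constants cancel.
y = rng.normal(3.0, 1.0, N)
w = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - 3.0) ** 2)
p_is = np.mean((g(y) < 0) * w)

print(p_crude, p_is)  # both estimate P(X > 3) ~ 1.35e-3; p_is has far lower variance
```

The adaptive step in the article refines the sampling density as more failure sequences are found; here the density is fixed for brevity.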

4.
A one-dimensional simulation procedure is developed for estimating structural reliability in a multi-dimensional load and resistance space with the loads represented as stochastic processes. The technique is based on the idea of using 'strips' of points, parallel to each other, sampled on the limit state hyperplanes. The 'local' outcrossing rate and the zero-time failure probability Pf(0) associated with the narrow strips are derived using the conditional reliability index. When the domain boundary consists of a set of limit states, second-order bounds are used to obtain a lower-bound approximation of the outcrossing rate and Pf(0) associated with the union of a set of λ strips. Examples show that, for high-reliability problems, λ may be much smaller than the number of limit states without significant loss of accuracy and with considerable savings in computation time. The simulations were also found to converge quite fast even without importance sampling.

5.
Since the assumptions of the model μ = a + bϕ(s) used in accelerated life testing analysis, namely that the function ϕ(·) is completely specified and that the relationship between μ and ϕ(s) is linear, generally do not hold, the estimate at the use stress level contains uncertainty. In this paper, we propose a non-linear fuzzy regression model for performing the extrapolation, adapting fuzzy probability theory to classical reliability so that uncertainty and process experience are incorporated and the fuzzy reliability of a component is obtained. Results show that the proposed model can estimate reliability when the above assumptions are violated and uncertainty is present, cases in which the classical models are unreliable.

6.
蔡金燕, 于志坚. Journal of Computer Applications, 2011, 31(5): 1428-1430
To address the system reliability assessment of complex electronic equipment under zero-failure data, a simulation-based assessment method built on circuit-unit performance is proposed. First, the circuit functional units composing the system are identified, and a system performance-reliability simulation model is built from the logical relationship between the units' performance parameters and the system output performance. Then, the performance-distribution parameters of each unit are estimated by statistical analysis of performance data, random samples are drawn from these distributions, and the system output performance is obtained by simulating with the sampled values. Finally, the failure proportion of the system output performance parameters is computed from the simulation statistics, realizing performance-reliability assessment of the system. An example verifies the effectiveness and practicality of the method.
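A minimal sketch of this sample-simulate-count loop (the two-stage amplifier, its distribution parameters, and the specification limits are all invented for illustration; in practice the distribution parameters are estimated from measured unit performance data):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50_000

# Hypothetical circuit units: performance parameters drawn from fitted
# distributions (parameters assumed here).
g1 = rng.normal(10.0, 0.3, N)    # stage-1 gain
g2 = rng.normal(5.0, 0.2, N)     # stage-2 gain
drift = rng.normal(0.0, 1.0, N)  # output offset drift

# System output follows from the logical relation between unit
# performance and system performance (here: product minus drift).
output = g1 * g2 - drift

# Performance reliability: share of simulated outputs inside the spec.
lo, hi = 44.0, 56.0
reliability = np.mean((output > lo) & (output < hi))
print(reliability)
```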

7.
Kan Guangyuan, He Xiaoyan, Li Jiren, Ding Liuqian, Zhang Dawei, Lei Tianjie, Hong Yang, Liang Ke, Zuo Depeng, Bao Zhenxin, Zhang Mengjie. Neural Computing & Applications, 2018, 29(7): 577-593

An artificial neural network (ANN)-based data-driven model is an effective and robust tool for multi-input single-output (MISO) system simulation tasks. However, several conundrums deteriorate the performance of the ANN model, including the hard tasks of topology design and parameter training, and the balance between simulation accuracy and generalization capability. To overcome these conundrums, a novel hybrid data-driven model named KEK is proposed in this paper. The KEK model couples the K-means method for input clustering, an ensemble of back-propagation (BP) ANNs for output estimation, and the K-nearest neighbor (KNN) method for output error estimation. A novel calibration method is also proposed for the automatic and global calibration of the KEK model. To compare model performance, the ANN model, the KNN model, and the proposed KEK model were applied to two tasks: simulation of the Peak benchmark function and real-world daily total electricity load forecasting. The testing results indicate that the KEK model outperformed the other two models and showed very good simulation accuracy and generalization capability in MISO system simulation tasks.
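A toy sketch of the three KEK stages, with linear least-squares models standing in for the paper's ensemble BP networks (the data set, cluster count, and neighbor count are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy MISO data: y = sin(x0) + 0.5*x1 + noise.
X = rng.uniform(-3, 3, size=(400, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.05, 400)

# 1) K-means-style clustering of the inputs (two centers, Lloyd steps).
C = X[rng.choice(len(X), 2, replace=False)]
for _ in range(20):
    lab = np.argmin(((X[:, None, :] - C[None]) ** 2).sum(-1), axis=1)
    C = np.array([X[lab == k].mean(0) for k in range(2)])
lab = np.argmin(((X[:, None, :] - C[None]) ** 2).sum(-1), axis=1)

# 2) One local model per cluster (linear least squares stands in for the
#    ensemble BP networks of the paper).
A = np.c_[X, np.ones(len(X))]
models = [np.linalg.lstsq(A[lab == k], y[lab == k], rcond=None)[0]
          for k in range(2)]

# 3) KNN output-error correction: keep the training residuals and, at
#    prediction time, add the mean residual of the 5 nearest neighbors.
resid = y - np.array([A[i] @ models[lab[i]] for i in range(len(X))])

def predict(x):
    k = int(np.argmin(((C - x) ** 2).sum(-1)))
    base = np.r_[x, 1.0] @ models[k]
    nn = np.argsort(((X - x) ** 2).sum(-1))[:5]
    return base + resid[nn].mean()
```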


8.
A neural network model that processes financial input data is developed to estimate the market price of options at closing. The network's ability to estimate closing prices is compared to the Black-Scholes model, the most widely used model for the pricing of options. Comparisons reveal that the mean squared error for the neural network is less than that of the Black-Scholes model in about half of the cases examined. The differences and similarities in the two modeling approaches are discussed. The neural network, which uses the same financial data as the Black-Scholes model, requires no distribution assumptions and learns the relationships between the financial input data and the option price from the historical data. The option-valuation equilibrium model of Black-Scholes determines option prices under the assumptions that prices follow a continuous time path and that the instantaneous volatility is nonstochastic.
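For reference, the Black-Scholes benchmark against which the network is compared has a closed form for a European call (the standard formula, not the paper's network):

```python
import math

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call: spot S, strike K, time to
    maturity T (years), risk-free rate r, volatility sigma."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

print(round(black_scholes_call(100, 100, 1.0, 0.05, 0.2), 4))  # 10.4506
```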

9.
Software cost estimation is an important concern for software managers and other software professionals. The hypothesized model in this research suggests that an organization's use of an estimate influences its estimating practices, which in turn influence both the basis of the estimating process and the accuracy of the estimate. The model also suggests that the estimating basis directly influences the accuracy of the estimate. A study of business information systems managers and professionals at 112 different organizations, using causal analysis with the Equations Modeling System (EQS), refined the model. The refined model shows that no managerial practice in this study discourages the use of intuition, guessing, and personal memory in cost estimating. Although user commitment and accountability appear to foster algorithm-based estimating, such an algorithmic basis does not portend greater accuracy. Only one managerial practice, the use of the estimate in performance evaluations of software managers and professionals, presages greater accuracy. By implication, the research suggests, somewhat ironically, that the most effective approach to improving estimating accuracy may be to make estimators, developers, and managers more accountable for the estimate, even though it may be impossible to direct them explicitly on how to produce a more accurate one.

10.
Service-oriented applications are dynamically built by assembling existing, loosely coupled, distributed, and heterogeneous services. Predicting their reliability is very important to appropriately drive the selection and assembly of services, to evaluate design feasibility, to compare design alternatives, to identify potential failure areas, and to maintain an acceptable reliability level under environmental extremes. This article presents a model for predicting the reliability of a service-oriented application based on its architecture specification in the lightweight formal language SCA-ASM. The SCA-ASM component model is based on the OASIS standard Service Component Architecture for heterogeneous service assembly and on the formal method of Abstract State Machines for modeling service behavior, interactions, and orchestration in an abstract but executable way. The proposed method provides an automatic and compositional means of predicting reliability at both system level and component level by combining a reliability model for an SCA assembly involving SCA-ASM components with a reliability model of an SCA-ASM component. It exploits ideas from architecture-based and path-based reliability models. A set of experimental results shows the effectiveness of the proposed approach and compares it with a state-of-the-art BPEL-based approach.
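The path-based ingredient of such reliability models can be illustrated with a toy service assembly (the service names, reliabilities, and path probabilities are all invented):

```python
from math import prod

# Per-service reliabilities: probability of failure-free execution.
rel = {"auth": 0.999, "search": 0.99, "pay": 0.995}

# Execution paths through the assembly and their usage probabilities.
paths = [
    (["auth", "search"], 0.7),         # browse-only path
    (["auth", "search", "pay"], 0.3),  # purchase path
]

# Path-based system reliability: probability-weighted product of the
# reliabilities of the services each path traverses.
r_system = sum(p * prod(rel[s] for s in svcs) for svcs, p in paths)
print(round(r_system, 6))  # 0.987526
```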

11.
Mattel Toys is a leading toy manufacturing company, and Barbie is one of Mattel's major product lines. Mattel develops several variations of the toy every year, which requires many duplicate tools for standard figure parts and tools for parts similar to the standard parts. The tooling engineer estimates the types of tools and the number of tools to be built to support a given production rate. This estimate, represented in a tool plan, is carried out manually, giving rise to inaccuracies. A decision support system was developed to estimate the number of duplicate tools as well as the budgets to build these tools. The decision support system was checked for accuracy in a case study.
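A minimal sketch of the duplicate-tool arithmetic such a decision support system automates (all rates, times, and costs are hypothetical, not Mattel's):

```python
import math

# Hypothetical production requirement for one standard figure part.
annual_parts = 10_000_000   # required parts/year
cycle_s = 30                # molding cycle time per shot, seconds
cavities = 4                # parts per shot in one tool
press_hours = 6_000         # available press hours/year per tool
utilization = 0.85          # planned uptime fraction

parts_per_tool = press_hours * 3600 * utilization * cavities / cycle_s
duplicate_tools = math.ceil(annual_parts / parts_per_tool)

tool_cost = 250_000         # assumed build cost per duplicate tool (USD)
budget = duplicate_tools * tool_cost
print(duplicate_tools, budget)  # 5 tools, 1250000
```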

12.
Structural and Multidisciplinary Optimization - Time-dependent global reliability sensitivity can quantify the effect of input variables in their whole distribution ranges on the time-dependent...

13.
Modeling the generation of a wind farm and its effect on power system reliability is a challenging task, largely due to the random behavior of the output power. In this paper, we propose a new probabilistic model for assessing the reliability of wind farms in a power system at hierarchical level II (HLII) using Monte Carlo simulation. The proposed model captures the effect of correlation between wind and load on the reliability calculation. It can also be used to rank candidate points of the network for installing new wind farms so as to improve the reliability of the whole system. A simple grid at hierarchical level I (HLI) and a network in the north-eastern region of Iran are studied. Simulation results show that the correlation between wind and load significantly affects the reliability.
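A stripped-down version of correlated wind-load sampling can be sketched with a Gaussian copula (the wind and load distributions, power curve, and firm capacity are all assumed, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000
rho = 0.6  # assumed wind-load correlation

# Correlated standard normals mapped to wind speed and load.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=N)
wind = np.clip(8.0 + 3.0 * z[:, 0], 0.0, None)  # m/s, assumed climate
load = 80.0 + 10.0 * z[:, 1]                    # MW, assumed demand

def farm_power(v, rated=30.0):
    # Simplified power curve: cut-in 3, rated 12, cut-out 25 m/s.
    p = np.where(v < 3.0, 0.0, np.where(v < 12.0, rated * (v - 3.0) / 9.0, rated))
    return np.where(v > 25.0, 0.0, p)

conventional = 70.0  # MW of firm conventional capacity, assumed

# Loss-of-load probability: fraction of sampled hours with a shortfall.
lolp = np.mean(load > conventional + farm_power(wind))
print(lolp)
```

Re-running with `rho = 0.0` shows how ignoring the wind-load correlation changes the reliability figure.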

14.
15.
Combat system effectiveness simulation (CoSES) plays an irreplaceable role in the effectiveness measurement of combat systems. Decades of research and practice have established composable modeling and multi-domain modeling as the two major modeling requirements in CoSES. Current effectiveness simulation studies attempt to cope with the structural and behavioral complexity of CoSES within a unified technological space, but they are constrained by their existing modeling paradigms and fail to meet these two requirements. In this work, we propose a model framework-based, domain-specific composable modeling method to solve this problem. The method builds a common model framework from application-invariant knowledge of CoSES, and designs domain-specific modeling infrastructures for the subdomains as extension points of the framework to support the modeling of application-variant knowledge. The method therefore supports domain-specific modeling in multiple subdomains and the composition of subsystem models across different subdomains on top of the model framework. A case study shows that the method raises the modeling abstraction level, supports generative modeling, and promotes model reuse and composability.

16.
17.
Missingness frequently complicates the analysis of longitudinal data. A popular solution for dealing with incomplete longitudinal data is the use of likelihood-based methods, for example with linear, generalized linear, or non-linear mixed models, due to their validity under the assumption of missing at random (MAR). Semi-parametric methods such as generalized estimating equations (GEEs) offer another attractive approach but require the assumption of missing completely at random (MCAR). Weighted GEE (WGEE) has been proposed as an elegant way to ensure validity under MAR. Alternatively, multiple imputation (MI) can be used to pre-process incomplete data, after which GEE is applied (MI-GEE). Focusing on incomplete binary repeated measures, both methods are compared using both so-called asymptotic and small-sample simulations, in a variety of correctly and incorrectly specified models. In spite of the asymptotic unbiasedness of WGEE, the results provide striking evidence that MI-GEE is both less biased and more accurate at the small to moderate sample sizes that typically arise in clinical trials.

18.
A prospective payment system based on diagnosis procedure combination (DPC/PPS) was introduced to acute care hospitals in Japan in April 2003. In order to increase hospital income, hospitals must shorten the average length of stay (ALOS) and increase the number of patients. We constructed a simulation program for evaluating the relationships among ALOS, bed occupation rate (BOR) and hospital income of hospitals in which DPC/PPS has been introduced. This program can precisely evaluate the hospital income by regulating the ALOS and the number of patients for each DPC. By using this program, it is possible to predict the optimum ALOS and optimum number of inpatients for each DPC in order to increase hospital income.
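The ALOS/BOR/income trade-off such a program evaluates can be sketched for a single hypothetical DPC (the declining per-diem tariff, bed count, and occupancy target are invented; DPC/PPS per-diem payments typically decline with length of stay):

```python
def stay_revenue(alos, rate_early=40_000, rate_late=25_000, cutoff=7):
    """Revenue per admission (yen) under a per-diem tariff that declines
    after `cutoff` days."""
    early_days = min(alos, cutoff)
    return early_days * rate_early + max(alos - cutoff, 0) * rate_late

beds, days = 100, 365
bor = 0.767  # bed occupation rate held fixed: freed beds admit new patients

def annual_income(alos):
    admissions = bor * beds * days / alos  # patients/year at this ALOS
    return admissions * stay_revenue(alos)

# Shortening ALOS from 14 to 10 days raises income under this tariff,
# because the freed bed-days are refilled at the higher early per-diem.
print(annual_income(14), annual_income(10))
```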

19.
Zoran, Igor. Data & Knowledge Engineering, 2008, 67(3): 504-516
The paper compares different approaches to estimating the reliability of individual predictions in regression. We compare the sensitivity-based reliability estimates developed in our previous work with four approaches from the literature: variance of bagged models, local cross-validation, density estimation, and local modeling. By combining pairs of individual estimates, we compose a combined estimate that performs better than the individual estimates. We tested the estimates by running data from 28 domains through eight regression models: regression trees, linear regression, neural networks, bagging, support vector machines, locally weighted regression, random forests, and generalized additive models. The results demonstrate the potential of the sensitivity-based estimate, as well as of local modeling of the prediction error with regression trees. Among the tested approaches, the best average performance was achieved by the bagging-variance approach, which performed best with neural networks, bagging, and locally weighted regression.
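Among the compared approaches, the bagging-variance estimate is the simplest to sketch; here linear base models and a toy heteroscedastic data set stand in for the paper's eight regressors:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy data with input-dependent noise: individual predictions should be
# flagged as less reliable where the bagged ensemble disagrees more.
X = rng.uniform(0, 1, 300)
y = 2.0 * X + rng.normal(0, 0.05 + 0.4 * X, 300)

# Bagging: fit B linear models, each on a bootstrap resample.
B = 50
coefs = []
for _ in range(B):
    idx = rng.integers(0, len(X), len(X))
    A = np.c_[X[idx], np.ones(len(idx))]
    coefs.append(np.linalg.lstsq(A, y[idx], rcond=None)[0])
coefs = np.array(coefs)  # shape (B, 2): slope and intercept per model

def predict_with_reliability(x):
    preds = coefs @ np.array([x, 1.0])
    # Ensemble spread is the per-prediction (un)reliability estimate.
    return preds.mean(), preds.std()
```

Predictions far from the training inputs (e.g. x = 1.5) come with a larger ensemble spread than predictions near the data center.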

20.
A smoke simulation approach based on the integration of traditional particle systems and density functions is presented in this paper. By attaching a density function to each particle as its attribute, the diffusion of smoke can be described by the variation of the particles' density functions, along with the effect of airflow by controlling the particles' movement and fragmentation. In addition, a continuous density field for realistic rendering can be generated quickly through look-up tables of the particles' density functions. Compared with traditional particle systems, this approach can describe smoke diffusion and provide a continuous density field for realistic rendering with much less computation. A quick rendering scheme is also presented as a useful preview tool for tuning appropriate parameters in the smoke model.
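A minimal sketch of the particle-plus-density-function idea (Gaussian kernels, the wind vector, and the diffusion rate are invented; the paper accelerates the field evaluation with look-up tables rather than direct evaluation):

```python
import numpy as np

rng = np.random.default_rng(5)

# Each smoke particle carries a Gaussian density function; the continuous
# density field at a point is the sum of all particles' kernels.
P = 200
pos = rng.normal(0.0, 0.5, size=(P, 2))  # particle centers
mass = np.full(P, 1.0 / P)               # density mass per particle
sigma = np.full(P, 0.1)                  # kernel radius per particle

def density(q, pos, mass, sigma):
    d2 = ((pos - q) ** 2).sum(axis=1)
    return (mass / (2.0 * np.pi * sigma**2) * np.exp(-d2 / (2.0 * sigma**2))).sum()

def step(pos, sigma, wind=(0.05, 0.1), diffusion=0.02):
    pos = pos + np.array(wind)             # advect particles with airflow
    sigma = np.sqrt(sigma**2 + diffusion)  # widening kernels model diffusion
    return pos, sigma

rho0 = density(np.zeros(2), pos, mass, sigma)
pos, sigma = step(pos, sigma)
rho1 = density(np.zeros(2), pos, mass, sigma)
print(rho0, rho1)
```

Because each kernel integrates to its particle's mass, the total smoke mass is conserved as the kernels widen, which is what makes the field suitable for direct rendering.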

