Similar literature
16 similar documents retrieved.
1.
This paper describes the construction of a discrete‐event simulation model of a proposed train maintenance depot for an underground transportation facility in the UK. The company bidding to operate the depot had traditionally been involved in manufacturing and so had no experience of either operating such a facility or meeting the type of performance indicators specified in the service‐level agreement. The simulation proved successful in providing a greater understanding of the operation of the depot and of the effect of various strategies for meeting demand. It also proved to be an excellent communication tool during the bid process, helping to demonstrate to the client the capability to meet the proposed service‐level targets over time and thus establish service reliability. Copyright © 2000 John Wiley & Sons, Ltd.
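The abstract gives no model details; as a rough illustration of the kind of discrete‐event model described, the sketch below uses the simpy library to simulate trains queueing for a limited number of maintenance roads. The arrival rate, service time and number of roads are invented for illustration only.

```python
# Minimal discrete-event sketch of a maintenance depot (illustrative parameters only).
import random
import simpy

N_ROADS = 3           # assumed number of maintenance roads
MEAN_ARRIVAL = 40.0   # assumed mean inter-arrival time of trains (minutes)
MEAN_SERVICE = 100.0  # assumed mean maintenance duration (minutes)
SIM_TIME = 7 * 24 * 60

waiting_times = []

def train(env, depot):
    arrive = env.now
    with depot.request() as req:          # queue for a free maintenance road
        yield req
        waiting_times.append(env.now - arrive)
        yield env.timeout(random.expovariate(1.0 / MEAN_SERVICE))

def arrivals(env, depot):
    while True:
        yield env.timeout(random.expovariate(1.0 / MEAN_ARRIVAL))
        env.process(train(env, depot))

random.seed(42)
env = simpy.Environment()
depot = simpy.Resource(env, capacity=N_ROADS)
env.process(arrivals(env, depot))
env.run(until=SIM_TIME)
print(f"served {len(waiting_times)} trains, "
      f"mean wait {sum(waiting_times) / len(waiting_times):.1f} min")
```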

2.
Degradation tests are widely used to assess the reliability of highly reliable products which are not likely to fail under traditional life tests or accelerated life tests. However, for some highly reliable products, the degradation may be very slow and hence it is impossible to have a precise assessment within a reasonable amount of testing time. In such cases, an alternative is to use higher stresses to extrapolate the product's reliability at the design stress. This is called an accelerated degradation test (ADT). In conducting an ADT, several decision variables at each stress level, such as the inspection frequency, sample size and termination time, strongly influence the experimental efficiency. An inappropriate choice of these decision variables not only wastes experimental resources but also reduces the precision of the estimation of the product's reliability at the use condition. The main purpose of this paper is to deal with the problem of designing an ADT. By using the criterion of minimizing the mean‐squared error of the estimated 100p‐th percentile of the product's lifetime distribution at the use condition, subject to the constraint that the total experimental cost does not exceed a predetermined budget, a nonlinear integer programming problem is built to derive the optimal combination of the sample size, inspection frequency and termination time at each stress level. A numerical example is provided to illustrate the proposed method. Copyright © 2003 John Wiley & Sons, Ltd.
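To make the structure of such a plan search concrete, the sketch below enumerates integer test plans under a budget constraint. The cost coefficients and the MSE surrogate are placeholders, not the paper's formulas; only the shape of the optimization (minimize estimated MSE subject to a cost ceiling) is illustrated.

```python
# Illustrative enumeration of ADT plans under a budget constraint.
# The cost model and MSE surrogate are placeholders, not the paper's formulas.
from itertools import product

BUDGET = 10_000.0
C_UNIT, C_INSPECT, C_TIME = 150.0, 20.0, 5.0   # assumed unit, inspection and operating costs

def total_cost(n, f, tau):
    """Cost of testing n units, inspecting every f hours until time tau."""
    inspections = tau // f
    return n * C_UNIT + n * inspections * C_INSPECT + tau * C_TIME

def approx_mse(n, f, tau):
    """Placeholder surrogate for the MSE of the estimated lifetime percentile:
    more units, more inspections and longer testing all reduce it."""
    inspections = tau // f
    return 1.0 / n + 5.0 / (n * inspections) + 200.0 / tau

best = None
for n, f, tau in product(range(5, 41), (12, 24, 48), range(240, 2001, 120)):
    if total_cost(n, f, tau) > BUDGET:
        continue                                # infeasible under the budget
    mse = approx_mse(n, f, tau)
    if best is None or mse < best[0]:
        best = (mse, n, f, tau)

mse, n, f, tau = best
print(f"best feasible plan: n={n}, inspect every {f} h, stop at {tau} h "
      f"(surrogate MSE {mse:.4f})")
```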

3.
ISO 10995 is the international standard for the reliability testing and archival lifetime prediction of optical media. The standard specifies the testing conditions in terms of combinations of the stress variables temperature and relative humidity. The data periodically collected during the tests are the error rate of the device, and failure is defined as the error rate exceeding a predetermined level. The standard assumes that the projected failure time is the actual failure time, and these projected failure times are then analyzed by using an Eyring or Arrhenius model. Since true failure times are often not directly observed, the uncertainties in the failure times must be taken into account. In this paper, we present a hierarchical model for degradation that can directly infer failure time at the use condition and compare this model with the ISO standard approach through a simulation study. Not accounting for the uncertainty in the projected failure times leads to unjustified confidence in the estimates of the median lifetime, both at the stress conditions used in the experiments and at the use condition.
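To make the "projected failure time" idea concrete, the sketch below fits a simple per-disc trend to periodically measured error rates and extrapolates the time at which the fitted curve crosses a failure threshold. This is only a schematic of the ISO-style projection discussed above; the data, the linear-in-log model and the threshold are all assumptions.

```python
# Schematic ISO-style projection: extrapolate each unit's error-rate trend
# to the failure threshold.  Data, model form and threshold are illustrative.
import numpy as np

THRESHOLD = 280.0                              # assumed failure level for the error rate
t = np.array([0., 250., 500., 750., 1000.])    # inspection times (hours)
# error-rate readings for three hypothetical discs at one stress condition
readings = np.array([
    [20., 35., 60., 95., 150.],
    [15., 30., 48., 80., 120.],
    [25., 45., 80., 130., 210.],
])

projected = []
for y in readings:
    # assume log(error rate) grows roughly linearly in time
    slope, intercept = np.polyfit(t, np.log(y), 1)
    projected.append((np.log(THRESHOLD) - intercept) / slope)

print("projected failure times (h):", np.round(projected, 0))
# Treating these projections as exact failure times ignores the regression
# uncertainty -- the point the hierarchical model in the paper addresses.
```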

4.
Statistically designed experiments provide a proactive means for improving reliability; moreover, they can be used to design products that are robust to noise factors which are hard or impossible to control. Traditionally, failure‐time data have been collected; for high‐reliability products, it is unlikely that failures will occur in a reasonable testing period, so the experiment will be uninformative. An alternative, however, is to collect degradation data. Take, for example, fluorescent lamps whose light intensity decreases over time. Observation of light‐intensity degradation paths, given that they are smooth, provides information about the reliability of the lamp, and does not require the lamps to fail. This paper considers experiments with such data for ‘reliability improvement’, as well as for ‘robust reliability achievement’ using Taguchi's robust design paradigm. A two‐stage maximum‐likelihood analysis based on a nonlinear random‐effects model is proposed and illustrated with data from two experiments. One experiment considers the reliability improvement of fluorescent lamps. The other experiment focuses on robust reliability improvement of light‐emitting diodes. Copyright © 2001 John Wiley & Sons, Ltd.
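A minimal sketch of a two-stage analysis of degradation paths (not the authors' exact model): stage 1 fits a per-lamp nonlinear curve, stage 2 summarizes the fitted coefficients across lamps and uses them to estimate a life percentile. The exponential-decay path form, the simulated data and the 60% failure threshold are assumptions.

```python
# Two-stage sketch: stage 1 fits each unit's degradation path,
# stage 2 models the unit-level coefficients.  Path form and data are assumed.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
t = np.linspace(0, 5000, 12)                       # measurement times (hours)

def path(t, b0, b1):
    return b0 * np.exp(-b1 * t)                    # assumed light-intensity decay

# simulate 8 lamps with random coefficients plus measurement noise
true_b0 = rng.normal(100.0, 3.0, size=8)
true_b1 = rng.lognormal(np.log(2e-4), 0.2, size=8)
data = [path(t, b0, b1) + rng.normal(0, 1.0, t.size) for b0, b1 in zip(true_b0, true_b1)]

# stage 1: individual nonlinear least-squares fits
coefs = np.array([curve_fit(path, t, y, p0=(100.0, 1e-4))[0] for y in data])

# stage 2: summarize the random coefficients across lamps
mean, cov = coefs.mean(axis=0), np.cov(coefs.T)
print("stage-2 mean of (b0, b1):", mean)

# reliability by simulation: time at which a path drops below 60% of its start
b0_s, b1_s = rng.multivariate_normal(mean, cov, size=10_000).T
fail_time = np.log(1 / 0.6) / np.clip(b1_s, 1e-6, None)
print("estimated B10 life (h):", np.percentile(fail_time, 10))
```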

5.
6.
In this work, a methodology to monitor a shift in a quantile of a distribution belonging to the log-symmetric family is proposed. Because the sampling distribution of a quantile estimator is often not available, the parametric bootstrap method is used to determine this sampling distribution and to establish the control limits when the process measurements follow a log-symmetric distribution. This family is helpful for describing data with positive support whose distribution is skewed to the right. Monte Carlo simulations are carried out to investigate the performance of the proposed bootstrap control charts for quantiles. An application to failure data of carbon fibers under stress is presented to illustrate the monitoring of reliability data. This illustration shows that non-conventional models, other than the Birnbaum-Saunders, log-normal and Weibull distributions, have potential to be used in practice. Two model selection procedures are considered to assess the adequacy of the fitted model to the data. To facilitate the public use of the proposed methodology, we have created an R package named chartslogsym whose main functions are detailed in this paper.
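A minimal sketch of a parametric-bootstrap control chart for a quantile, assuming a log-normal in-control model (one member of the log-symmetric family); it is not the chartslogsym implementation. The Phase-I data, subgroup size, monitored quantile and false-alarm rate are invented.

```python
# Parametric bootstrap control limits for a distribution quantile
# (log-normal used as one log-symmetric example; all data are simulated).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
P, N, B, ALPHA = 0.10, 25, 5000, 0.0027   # quantile, subgroup size, bootstrap reps, false-alarm rate

phase1 = rng.lognormal(mean=1.0, sigma=0.4, size=300)        # in-control Phase-I data

def q_hat(sample, p=P):
    """Parametric estimator of the p-quantile under a log-normal model."""
    m, s = np.log(sample).mean(), np.log(sample).std(ddof=1)
    return float(np.exp(m + s * norm.ppf(p)))

mu_hat, sd_hat = np.log(phase1).mean(), np.log(phase1).std(ddof=1)

# bootstrap the sampling distribution of the quantile estimator under the fitted model
boot = [q_hat(rng.lognormal(mu_hat, sd_hat, size=N)) for _ in range(B)]
lcl, ucl = np.percentile(boot, [100 * ALPHA / 2, 100 * (1 - ALPHA / 2)])
print(f"control limits for the {P:.0%} quantile: [{lcl:.3f}, {ucl:.3f}]")

# monitoring: a subgroup from a shifted process may fall outside the limits
stat = q_hat(rng.lognormal(mean=1.3, sigma=0.4, size=N))
print(f"new statistic {stat:.3f}:", "out of control" if not lcl <= stat <= ucl else "in control")
```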

7.
Statistical inference for mechanistic models of partially observed dynamic systems is an active area of research. Most existing inference methods place substantial restrictions upon the form of models that can be fitted and hence upon the nature of the scientific hypotheses that can be entertained and the data that can be used to evaluate them. In contrast, the so-called plug-and-play methods require only simulations from a model and are thus free of such restrictions. We show the utility of the plug-and-play approach in the context of an investigation of measles transmission dynamics. Our novel methodology enables us to ask and answer questions that previous analyses have been unable to address. Specifically, we demonstrate that plug-and-play methods permit the development of a modelling and inference framework applicable to data from both large and small populations. We thereby obtain novel insights into the nature of heterogeneity in mixing and comment on the importance of including extra-demographic stochasticity as a means of dealing with environmental stochasticity and model misspecification. Our approach is readily applicable to many other epidemiological and ecological systems.
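The defining property of plug-and-play methods is that inference needs only the ability to simulate the model forward. A much-reduced illustration (not the authors' measles analysis) is the bootstrap particle filter below, which evaluates the likelihood of case counts under an assumed stochastic SIR simulator with Poisson reporting; every parameter and the data are invented.

```python
# Plug-and-play likelihood evaluation: a bootstrap particle filter that only
# requires forward simulation of a stochastic SIR model.  Parameters, the
# reporting model and the case data are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(3)
N_POP, BETA, GAMMA, RHO = 50_000, 1.8, 1.0, 0.5   # population, transmission, recovery, reporting rate
N_PART, DT = 2000, 1.0                             # particles, one observation per time unit

def step(S, I, R):
    """One stochastic (binomial) SIR step for every particle."""
    p_inf = 1.0 - np.exp(-BETA * I / N_POP * DT)
    p_rec = 1.0 - np.exp(-GAMMA * DT)
    new_inf = rng.binomial(S, p_inf)
    new_rec = rng.binomial(I, p_rec)
    return S - new_inf, I + new_inf - new_rec, R + new_rec, new_inf

cases = np.array([12, 18, 30, 55, 80, 110, 95, 70, 40, 22])   # hypothetical reports

S = np.full(N_PART, N_POP - 50)
I = np.full(N_PART, 50)
R = np.zeros(N_PART, dtype=int)
log_lik = 0.0
for y in cases:
    S, I, R, new_inf = step(S, I, R)
    lam = np.maximum(RHO * new_inf, 1e-10)           # Poisson reporting intensity
    logw = y * np.log(lam) - lam - np.sum(np.log(np.arange(1, y + 1)))
    w = np.exp(logw - logw.max())
    log_lik += np.log(w.mean()) + logw.max()         # conditional log-likelihood
    idx = rng.choice(N_PART, size=N_PART, p=w / w.sum())   # resample particles
    S, I, R = S[idx], I[idx], R[idx]

print(f"simulation-based log-likelihood: {log_lik:.1f}")
```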

8.
In this case study, we investigate the degradation process of light‐emitting diodes (LEDs), which are used as a light source in DNA sequencing machines. Accelerated degradation tests are applied by varying temperature and forward current, and the light outputs are measured by a computerized measuring system. A degradation path model, linked to the LED function recommended in Mitsuo (1991), is used to describe the degradation process. We consider variations in both measurement errors and degradation paths among individual test units. It is demonstrated that the hierarchical modeling approach is flexible and powerful in modeling a complex degradation process with a nonlinear function and random coefficients. After fitting the model by maximum likelihood estimation, the failure time distribution can be obtained by simulation. Copyright © 2010 John Wiley & Sons, Ltd.
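The final step described above, obtaining the failure-time distribution by simulation once the model is fitted, can be sketched as follows; the path form, the fitted values and the 50% failure threshold are assumed, not the paper's estimates.

```python
# After a hierarchical degradation model is fitted, the failure-time distribution
# can be read off by simulation.  Fitted values and path form here are assumed.
import numpy as np

rng = np.random.default_rng(11)
N_SIM = 100_000
THRESH = 0.5                                      # failure: light output below 50% of initial

# assumed population (random-coefficient) estimates at the use condition
mu_lograte, sd_lograte = np.log(2.0e-5), 0.35     # degradation rate in exp(-rate * t)

rate = rng.lognormal(mu_lograte, sd_lograte, size=N_SIM)
fail_time = np.log(1.0 / THRESH) / rate           # solve exp(-rate * t) = THRESH

for p in (1, 5, 10, 50):
    print(f"B{p} life: {np.percentile(fail_time, p):,.0f} h")
```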

9.
In this paper, we focus on the performance of adjustment rules for a machine that produces items in batches and that can experience errors at each setup operation performed before machining a batch. The adjustment rule is applied to compensate for the setup offset in order to bring the process back to target. In particular, we deal with the case in which no prior information about the distribution of the offset or about the within‐batch variability is available. Under such conditions, adjustment rules that can be applied are Grubbs' rules, the exponentially‐weighted moving average (EWMA) controller and the Markov chain Monte Carlo (MCMC) adjustment rule, based on a Bayesian sequential estimation of unknown parameters that uses MCMC simulation. The performance metric of the different adjustment rules is the sum of the quadratic off‐target costs over the set of batches machined. Given the number of batches and the batch size, different production scenarios (characterized by different values of the lot‐to‐lot and the within‐lot variability and of the mean offset over the set of batches) are considered. The MCMC adjustment rule is shown to have better performance in almost all of the cases examined. Furthermore, a closer study of the cases in which the MCMC policy is not the best adjustment rule motivates a modified version of this rule which outperforms alternative adjustment policies in all the scenarios considered. Copyright © 2005 John Wiley & Sons, Ltd.
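As a rough sketch of two of the simpler rules mentioned above (Grubbs' harmonic rule and an EWMA controller), the code below simulates one batch with an unknown setup offset and compares the quadratic off-target cost; the offset size, noise level and EWMA weight are illustrative choices, not the paper's scenarios, and the MCMC rule is not reproduced.

```python
# Sketch comparing two setup-adjustment rules on a simulated batch:
# Grubbs' harmonic rule vs. an EWMA (integral) controller.
import numpy as np

rng = np.random.default_rng(5)
BATCH, OFFSET, SIGMA, LAM = 50, 3.0, 1.0, 0.3   # parts per batch, setup offset, noise sd, EWMA weight

def run(rule):
    u, cost = 0.0, 0.0                           # current compensation, quadratic off-target cost
    for i in range(1, BATCH + 1):
        y = OFFSET + u + rng.normal(0.0, SIGMA)  # deviation from target of part i
        cost += y ** 2
        if rule == "grubbs":                     # harmonic rule: shrinking corrections
            u -= y / i
        else:                                    # EWMA controller: constant-gain correction
            u -= LAM * y
    return cost

print("Grubbs harmonic rule cost:", round(np.mean([run("grubbs") for _ in range(2000)]), 1))
print("EWMA controller cost     :", round(np.mean([run("ewma") for _ in range(2000)]), 1))
```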

10.
A significant number of problems and applications in stochastic mechanics and engineering involve multi-dimensional random functions. The probabilistic analysis of these problems is usually computationally very expensive if a brute-force Monte Carlo method is used. Thus, a technique for the optimal selection of a moderate number of samples effectively representing the entire space of sample realizations is of paramount importance. Functional Quantization is a novel technique that has been proven to provide optimal approximations of random functions using a predetermined number of representative samples. The methodology is very easy to implement and it has been shown to work effectively for stationary and non-stationary one-dimensional random functions. This paper discusses the application of the Functional Quantization approach to the domain of multi-dimensional random functions and the applicability is demonstrated for the case of a 2D non-Gaussian field and a two-dimensional panel with uncertain Young's modulus under plane stress.
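Functional quantization replaces the full space of sample paths with a small set of representative paths and weights chosen to minimize a distortion measure. The sketch below conveys the idea with a plain k-means (Lloyd-type) quantizer applied to simulated one-dimensional Gaussian-process paths; the covariance kernel and sizes are invented, and this is an illustration rather than the optimal quantizers used in the paper.

```python
# Toy functional quantization: represent a 1-D Gaussian random function by a few
# representative paths and their probability weights via k-means clustering.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
M, N_GRID, N_REP = 5000, 64, 8             # Monte Carlo paths, grid points, representatives

x = np.linspace(0.0, 1.0, N_GRID)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)      # assumed exponential covariance
L = np.linalg.cholesky(cov + 1e-10 * np.eye(N_GRID))
paths = rng.standard_normal((M, N_GRID)) @ L.T            # zero-mean Gaussian sample paths

reps, labels = kmeans2(paths, N_REP, minit="++")          # representative paths
weights = np.bincount(labels, minlength=N_REP) / M        # probability of each representative

distortion = np.mean(np.sum((paths - reps[labels]) ** 2, axis=1)) / N_GRID
print("representative-path weights:", np.round(weights, 3))
print("mean-square quantization error:", round(distortion, 4))
```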

11.
Many uncertainties and cost variations occur in the work activities of a project, creating many possibilities of under-estimating or over-estimating a bid price. Each process of risk management should therefore be studied comprehensively to achieve project objectives. However, few studies take a comprehensive viewpoint on the benefits of risk management and its effect on project performance at the engineering design stage of engineering–procurement–construction (EPC) projects, especially the basic design stage. This research was conducted to identify and analyze the risks associated with a Basic Design Engineering (BDE) project for a high value-added petrochemical plant in Taiwan. First, a project risk management workflow was proposed as an effective tool to minimize project risks and maximize the management capacity of practitioners. Second, the cost effect of project risks was described through a case study of the design process of a high value-added petrochemical plant using Monte Carlo simulation. A risk register was compiled to provide the data required for the simulation analysis. The results of this paper provide reference points for risk management planning of project execution and help project managers evaluate particular risks at the engineering design stage of EPC projects to avoid cost overruns.
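A minimal sketch of the kind of Monte Carlo cost-risk analysis described: each risk-register item has an occurrence probability and a triangular cost impact, and the simulation yields cost percentiles for setting contingency. The register entries, probabilities and ranges below are invented, not the case study's data.

```python
# Illustrative Monte Carlo cost-risk analysis for a design-stage risk register.
# The register entries, probabilities and triangular cost ranges are invented.
import numpy as np

rng = np.random.default_rng(2024)
N = 100_000
BASE_COST = 4.0e6                # assumed deterministic base estimate

# (occurrence probability, min, most likely, max cost impact) per risk item
register = [
    (0.30, 5e4, 1.5e5, 4.0e5),   # incomplete client specifications
    (0.20, 8e4, 2.0e5, 6.0e5),   # rework of design deliverables
    (0.15, 3e4, 1.0e5, 3.0e5),   # late vendor data
    (0.10, 1e5, 3.0e5, 9.0e5),   # regulatory or permit changes
]

total = np.full(N, BASE_COST)
for p, lo, mode, hi in register:
    occurs = rng.random(N) < p
    impact = rng.triangular(lo, mode, hi, size=N)
    total += np.where(occurs, impact, 0.0)

for q in (50, 80, 95):
    print(f"P{q} project cost: {np.percentile(total, q):,.0f}")
print("contingency at P80 over base:", f"{np.percentile(total, 80) - BASE_COST:,.0f}")
```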

12.
Summary: Based on type-II censored samples, the maximum likelihood, uniformly minimum variance unbiased, Bayes and empirical Bayes estimators of one of the two shape parameters (k) and of the reliability function R(t) of the Burr type XII failure model are computed and compared. Computations show that when the censoring size is r = 10, the EBEs of k and of R(t), t = 0.9, are better than the corresponding UMVUEs for as few as m* = 7 past samples for k and m* = 11 past samples for R(0.9), in the sense of having smaller estimated risks, when the gamma conjugate prior is used.
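As a small worked sketch of the simplest of these estimators (not the paper's full UMVU or empirical Bayes development), the code below computes the maximum likelihood and conjugate-Bayes estimates of k, and the corresponding reliability R(t) = (1 + t^c)^(-k), from a type-II censored sample with the second shape parameter c treated as known. The data, c and the gamma prior hyper-parameters are illustrative.

```python
# Burr XII with known shape c under type-II censoring: the k-dependent likelihood
# is proportional to k^r * exp(-k*T), so the MLE is r/T and a gamma(a, b) prior
# gives posterior mean (a + r)/(b + T).  Data and hyper-parameters are illustrative.
import numpy as np

c = 2.0                                        # assumed known shape parameter
n, r = 20, 10                                  # sample size, number of observed failures
# first r ordered failure times (hypothetical)
x = np.array([0.21, 0.28, 0.35, 0.41, 0.47, 0.55, 0.63, 0.70, 0.82, 0.95])

# sufficient statistic: observed terms plus (n - r) copies of the censoring value
T = np.sum(np.log1p(x ** c)) + (n - r) * np.log1p(x[-1] ** c)

k_mle = r / T                                  # maximum likelihood estimate of k
a, b = 2.0, 1.0                                # assumed gamma prior for k
k_bayes = (a + r) / (b + T)                    # posterior mean under squared-error loss

def R(t, k):
    """Burr XII reliability function R(t) = (1 + t^c)^(-k)."""
    return (1.0 + t ** c) ** (-k)

t0 = 0.9
print(f"k_MLE = {k_mle:.3f}, k_Bayes = {k_bayes:.3f}")
print(f"R({t0}) via MLE = {R(t0, k_mle):.3f}, via Bayes = {R(t0, k_bayes):.3f}")
```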

13.
Recently, the manufacturing industry has been striving for sustainability because of the environmental degradation and resource depletion it causes. Remanufacturing saves considerable material and energy, and thus it can represent an important solution to environmental issues. However, the uncertainty of remanufacturing makes the practical management of closed-loop supply chains (CLSCs) difficult. To unlock the value potential of end-of-life (EOL) products, we studied a reuse, remanufacture, and recycle (3R) processing system under quality uncertainty for returned EOL engines. In the system, the returned cores were distributed into different processing routes, depending on the results of quality grading. The proposed matrix operations could efficiently assess the environmental benefits; moreover, we designed an algorithm to calculate the quality coefficient that reflects the overall quality condition of returned EOL cores. The impacts of quality uncertainty on the environment could be efficiently quantified via our proposed method. Furthermore, using Monte Carlo simulation and the law of large numbers, we devised a model to establish direct and definite quantitative relationships between the quality coefficient and production indexes. This model provides a basis for the formulation of optimal acquisition strategies under different returning scenarios.
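A minimal sketch of the Monte Carlo idea, assuming an invented quality-grade distribution, routing rule and per-route saving (none of which come from the paper): returned cores are sampled by grade, routed, and the Monte Carlo average of the per-core saving converges to its expectation by the law of large numbers, giving a simple quality coefficient for the scenario.

```python
# Illustrative Monte Carlo for routing returned engine cores by quality grade.
# Grade probabilities, routing rules and per-route savings are invented.
import numpy as np

rng = np.random.default_rng(8)
N_CORES = 200_000

grades = ["A", "B", "C", "D"]                 # A: reuse, B: remanufacture, C: recycle, D: dispose
probs  = [0.20, 0.45, 0.25, 0.10]             # assumed grade distribution of returned cores
savings = {"A": 1.0, "B": 0.6, "C": 0.2, "D": 0.0}   # relative material/energy saving per core

drawn = rng.choice(grades, size=N_CORES, p=probs)
route_share = {g: float(np.mean(drawn == g)) for g in grades}

# Monte Carlo estimate of the expected per-core saving (the "quality coefficient" here);
# the law of large numbers makes it converge to the exact expectation below.
quality_coeff = float(np.mean([savings[g] for g in drawn]))

print("route shares:", {g: round(s, 3) for g, s in route_share.items()})
print("estimated quality coefficient:", round(quality_coeff, 3))
print("exact expectation:", sum(p * savings[g] for g, p in zip(grades, probs)))
```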

14.
In Part 1 of this paper a methodology for back‐to‐back testing of simulation software was described. Residuals with error‐dependent geometric properties were generated. A set of potential coding errors was enumerated, along with a corresponding set of feature matrices, which describe the geometric properties imposed on the residuals by each of the errors. In this part of the paper, an algorithm is developed to isolate the coding errors present by analysing the residuals. A set of errors is isolated when the subspace spanned by their combined feature matrices corresponds to that of the residuals. Individual feature matrices are compared to the residuals and classified as ‘definite’, ‘possible’ or ‘impossible’. The status of ‘possible’ errors is resolved using a dynamic subset testing algorithm. To demonstrate and validate the testing methodology presented in Part 1 and the isolation algorithm presented in Part 2, a case study is presented using a model for biological wastewater treatment. Both single and simultaneous errors that are deliberately introduced into the simulation code are correctly detected and isolated. Copyright © 2003 John Wiley & Sons, Ltd.
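A stripped-down version of the isolation idea: each candidate coding error has a feature matrix whose columns span the residual directions it can produce, and an error can be ruled out when the observed residuals have a large component outside that span. The feature matrices, residuals and the 5% tolerance below are synthetic, and the dynamic subset-testing step of the paper is not reproduced.

```python
# Toy version of isolating coding errors from residual geometry: project the
# residuals onto each candidate error's feature subspace and classify by the
# unexplained fraction.  Matrices and residuals are synthetic examples.
import numpy as np

rng = np.random.default_rng(4)
DIM = 20

# feature matrices: columns span the residual directions each error can cause
features = {
    "error_1": rng.standard_normal((DIM, 2)),
    "error_2": rng.standard_normal((DIM, 3)),
    "error_3": rng.standard_normal((DIM, 2)),
}

# observed residuals: generated here from error_2's subspace plus small noise
residual = features["error_2"] @ np.array([1.5, -0.7, 2.0]) + 0.01 * rng.standard_normal(DIM)

for name, F in features.items():
    coef, *_ = np.linalg.lstsq(F, residual, rcond=None)
    unexplained = np.linalg.norm(residual - F @ coef) / np.linalg.norm(residual)
    status = "definite/possible" if unexplained < 0.05 else "impossible"
    print(f"{name}: unexplained fraction {unexplained:.3f} -> {status}")
```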

15.
We present a sub‐structuring method for the coupling between a large elastic structure and a stratified soil half‐space exhibiting random heterogeneities over a bounded domain and impinged by incident waves. Both media are also weakly dissipative. The concept of interfaces classically used in sub‐structuring methods is extended to ‘volume interfaces’ in the proposed approach. The random dimension of the stochastic fields modelling the heterogeneities in the soil is reduced by introducing a Karhunen–Loève expansion of these stochastic fields. The coupled overall problem is solved by Monte‐Carlo simulation techniques. A realistic example of a large industrial structure interacting with an uncertain stratified soil medium under earthquake is finally presented. This case study and others validate the presented methodology and its ability to handle complex mechanical systems. Copyright © 2002 John Wiley & Sons, Ltd.
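To illustrate the dimension-reduction step, the sketch below builds a truncated Karhunen–Loève expansion of a one-dimensional Gaussian random field from the eigendecomposition of an assumed covariance kernel and draws Monte Carlo realizations from it; the kernel, correlation length and truncation order are illustrative, not those of the soil model.

```python
# Truncated Karhunen-Loeve expansion of a 1-D Gaussian random field on a grid,
# followed by Monte Carlo sampling.  Kernel and truncation order are assumed.
import numpy as np

N_GRID, CORR_LEN, SIGMA, N_TERMS, N_MC = 200, 0.3, 1.0, 10, 1000
x = np.linspace(0.0, 1.0, N_GRID)

# assumed exponential covariance kernel, discretized on the grid
C = SIGMA ** 2 * np.exp(-np.abs(x[:, None] - x[None, :]) / CORR_LEN)

# eigen-decomposition (symmetric), keep the N_TERMS largest modes
vals, vecs = np.linalg.eigh(C)
vals, vecs = vals[::-1][:N_TERMS], vecs[:, ::-1][:, :N_TERMS]

captured = vals.sum() / np.trace(C)
print(f"variance captured by {N_TERMS} KL terms: {captured:.1%}")

# Monte Carlo realizations: field = sum_k sqrt(lambda_k) * xi_k * phi_k(x)
rng = np.random.default_rng(6)
xi = rng.standard_normal((N_MC, N_TERMS))
fields = xi @ (np.sqrt(vals)[:, None] * vecs.T)        # shape (N_MC, N_GRID)
print("sample std at mid-grid:", round(fields[:, N_GRID // 2].std(), 3))
```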

16.
This paper presents analyses of data from the Highway Safety Information System (HSIS) for the State of Illinois. Our analyses focus on whether various changes in road network infrastructure and geometric design can be associated with changes in road fatalities and reported accidents. We also evaluate models that control for demographic changes. County-level time-series data are used and fixed-effects negative binomial models are estimated. Results cannot confirm the hypothesis that changes in road infrastructure and geometric design have been beneficial for safety. Increases in the number of lanes appear to be associated with both increased traffic-related accidents and fatalities. Increased lane widths appear to be associated with increased fatalities. Increases in outside shoulder width appear to be associated with a decrease in accidents. Inclusion of demographic variables does not significantly change these results but does capture much of the residual time trend in the models. Potentially misleading results are found when the time trend is not included: in that case, a negative association is found between vertical curvature and both accidents and fatalities. No statistical association with changes in safety is found for median widths, inside shoulder widths, or horizontal and vertical curvature.
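A schematic of the type of count model described (not the authors' specification): county fixed effects enter as dummy variables, exposure enters as an offset, and a negative binomial GLM is fitted with statsmodels. The simulated panel and covariate names below are placeholders that only mirror the kind of variables discussed.

```python
# Schematic fixed-effects negative binomial model for county-level accident counts.
# The panel is simulated; variable names only mirror the covariates discussed.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(9)
counties, years = 40, 12
n_obs = counties * years
df = pd.DataFrame({
    "county": np.repeat(np.arange(counties), years),
    "year": np.tile(np.arange(years), counties),
    "lanes": rng.normal(2.2, 0.3, n_obs),
    "lane_width": rng.normal(11.5, 0.5, n_obs),
    "shoulder_width": rng.normal(6.0, 1.0, n_obs),
    "vmt": rng.lognormal(12.0, 0.5, n_obs),        # exposure: vehicle-miles travelled
})
county_effect = rng.normal(0.0, 0.3, counties)[df["county"].to_numpy()]
mu = np.exp(-9.5 + 0.15 * df["lanes"] + 0.02 * df["year"] + county_effect) * df["vmt"]
df["accidents"] = rng.negative_binomial(5, (5 / (5 + mu)).to_numpy())

# county fixed effects as dummies, exposure as an offset
X = pd.get_dummies(df[["lanes", "lane_width", "shoulder_width", "year", "county"]],
                   columns=["county"], drop_first=True).astype(float)
X = sm.add_constant(X)
fit = sm.GLM(df["accidents"], X,
             family=sm.families.NegativeBinomial(alpha=0.2),
             offset=np.log(df["vmt"])).fit()
print(fit.params[["lanes", "lane_width", "shoulder_width", "year"]])
```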
