Similar Documents
20 similar documents found (search time: 10 ms)
1.
The estimation of the tail index and extreme quantiles of a heavy-tailed distribution is addressed when some covariate information is available and the data are randomly right-censored. Several estimators are constructed by combining a moving-window technique (for tackling the covariate information) and the inverse probability-of-censoring weighting method. The asymptotic normality of these estimators is established and their finite-sample properties are investigated via simulations. A comparison with alternative estimators is provided. Finally, the proposed methodology is illustrated on a medical dataset.
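To make the censoring adjustment concrete, the sketch below computes a simple Hill-type tail-index estimate inside a covariate moving window and divides the classical Hill statistic by the fraction of uncensored observations among the top exceedances. This is a minimal illustration of the general idea, assuming a simple proportion-based adjustment rather than the paper's exact IPCW-weighted estimators; the function name, bandwidth, choice of k, and synthetic data are illustrative assumptions.

```python
import numpy as np

def censored_hill(z, delta, x, x0, bandwidth, k):
    """Hill-type tail-index estimate under random right censoring with a covariate:
    keep observations whose covariate lies within `bandwidth` of `x0`, compute the
    classical Hill estimate on the k largest observed values in the window, and
    divide by the proportion of uncensored observations among them.
    `z` are observed (possibly censored) values, `delta` the censoring indicators
    (1 = uncensored).  Simplified sketch, not the paper's exact estimators."""
    in_window = np.abs(x - x0) <= bandwidth
    z_w, d_w = z[in_window], delta[in_window]
    order = np.argsort(z_w)[::-1]                        # largest observations first
    z_top, d_top = z_w[order[:k + 1]], d_w[order[:k]]
    hill = np.mean(np.log(z_top[:k]) - np.log(z_top[k]))  # classical Hill statistic
    p_uncensored = np.mean(d_top)                        # fraction of uncensored exceedances
    return hill / p_uncensored

rng = np.random.default_rng(0)
n = 2000
x = rng.uniform(0, 1, n)                 # covariate
t = (1 + x) * rng.pareto(2.0, n)         # heavy-tailed lifetimes, true tail index 0.5
c = rng.exponential(5.0, n)              # censoring times
z, delta = np.minimum(t, c), (t <= c).astype(int)
print(censored_hill(z, delta, x, x0=0.5, bandwidth=0.2, k=100))
```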

2.
Computer Networks, 1999, 31(6): 627-637
Lucent Technologies' PacketStar ATM Core Switch is designed to deliver the high reliability required in public telecommunications networks. It achieves this reliability through a variety of strategies, including duplication of hardware and software components, isolation of groups of components to limit the impact of problems, fault-recovery hierarchies, and other techniques. As a result, cell loss during failures is minimal, and permanent virtual circuits are unaffected even if the main switch control system goes down.

3.
4.
Distribution functions for reliability analysis and an assessment of their practical applicability (cited 1 time: 0 self-citations, 1 by others)
Reliability testing is one of the main topics of reliability engineering. This paper describes in detail the various distribution functions used in reliability engineering and, through worked examples, analyses their practical applicability in reliability work, providing a reference for practitioners carrying out reliability analyses.

5.
Aiming at efficiently estimating the dynamic failure probability with multiple temporal and spatial parameters and at analyzing the global reliability sensitivity of the dynamic problem, a method is presented based on moment estimation of the extreme value of the dynamic limit state function. First, two strategies are proposed to estimate the dynamic failure probability: one combines the sparse grid technique for the extreme-value moments with the fourth-moment method for the dynamic failure probability; the other combines the dimensional reduction method for fractional extreme-value moments with the maximum entropy method. In both strategies, the key step is determining the temporal and spatial parameters at which the dynamic limit state function takes its minimum value; this issue is addressed efficiently by solving the differential equations satisfying the extreme value condition. Second, a three-point estimation is combined with the dynamic failure probability method to evaluate the global dynamic reliability sensitivity. The significance and effectiveness of the proposed methods for estimating the temporal and spatial multi-parameter dynamic reliability and the global sensitivity indices are demonstrated with several examples.

6.
Several methods of testing for the reliability of engineering systems and equipment are currently available, together with the applicable theoretical background. One of the more common methods is sequential testing, in which tests are continued until a decision is reached as to product acceptability; the standard procedures, assuming an exponential survival curve, outline the theory and steps involved in carrying out these tests. The present paper gives a method for simulating reliability tests on digital computers, whereby additional information, such as the probability of completing the tests as a function of time, can be obtained. The described Monte Carlo simulation uses exponential product survival characteristics and employs mathematical methods to reduce computation time. The outlined computer method gives users the flexibility to devise their own reliability testing procedures. The paper includes a numerical example and compares the results obtained from the computer program with those computed from theoretical formulas.
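As an illustration of the kind of simulation described, the sketch below runs a Wald-type sequential probability ratio test for the exponential mean life with one item on test (replaced on failure) and records, over many Monte Carlo runs, the decision reached and the time at which it is reached, from which the probability of completing the test by a given time can be estimated. The boundary formulas are the standard SPRT ones; the function name and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def simulate_sprt_exponential(theta0, theta1, alpha, beta, theta_true,
                              n_runs=10_000, rng=None):
    """Monte Carlo simulation of a sequential (SPRT) reliability test for an
    exponential mean life: returns the decision ('accept'/'reject') and the
    total test time for each simulated run."""
    rng = np.random.default_rng(rng)
    log_A, log_B = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    slope = 1.0 / theta1 - 1.0 / theta0          # rate at which the LLR drifts down with test time
    step = np.log(theta0 / theta1)               # upward LLR jump at each failure
    decisions, times = [], []
    for _ in range(n_runs):
        t, llr = 0.0, 0.0
        while True:
            gap = rng.exponential(theta_true)    # time to next failure
            t_accept = (llr - log_B) / slope     # extra failure-free time needed to accept
            if t_accept <= gap:
                decisions.append("accept"); times.append(t + t_accept); break
            t += gap
            llr += step - slope * gap            # update LLR at the failure epoch
            if llr >= log_A:
                decisions.append("reject"); times.append(t); break
    return np.array(decisions), np.array(times)

decisions, times = simulate_sprt_exponential(theta0=1000, theta1=500,
                                             alpha=0.1, beta=0.1,
                                             theta_true=1000, n_runs=5000)
print("P(accept) =", np.mean(decisions == "accept"))
print("P(decision by t = 2000 h) =", np.mean(times <= 2000))
```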

7.
We use the theory of order statistics and the concepts of first- and second-order stochastic dominance (FSD and SSD) to develop an order-statistics SSD minimax decision rule. It can be used to refine choice among the random variables in the SSD noninferior set. We are able to reduce the size of the SSD noninferior set when we assume that the decision-maker is most concerned about the potential adverse outcomes in the right tail of the probability distribution. In other words, we consider the risk of extreme events and build on order statistics in order to refine the decision rules. In some cases, the order-statistics SSD minimax decision rule can provide a unique choice from among the SSD noninferior set. We define the concept of conditional second-order stochastic dominance (CSSD) in order to model the risk of extreme events, and we also use it to develop a CSSD minimax decision rule.
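For the flavour of the approach, the snippet below checks empirical second-order stochastic dominance between equally weighted samples and then applies a simple minimax-style tie-breaker on an upper order statistic of the loss distribution. It is only a toy illustration of refining choice within an SSD noninferior set, not the authors' exact order-statistics or CSSD rule; the quantile level and the sample data are assumptions.

```python
import numpy as np

def ssd_dominates(x, y):
    """x SSD-dominates y (equally weighted samples of equal size, larger outcomes
    preferred) iff every partial sum of sorted x is at least that of y."""
    return np.all(np.cumsum(np.sort(x)) >= np.cumsum(np.sort(y)))

def minimax_tail_choice(losses_by_option, q=0.95):
    """Among options (arrays of simulated losses), keep those not SSD-dominated by
    another option (dominance checked on -loss, so smaller losses are better) and
    pick the one with the smallest upper order statistic, i.e. a minimax-style rule
    on the right tail of the loss distribution."""
    keys = list(losses_by_option)
    noninferior = [k for k in keys
                   if not any(ssd_dominates(-losses_by_option[j], -losses_by_option[k])
                              for j in keys if j != k)]
    return min(noninferior, key=lambda k: np.quantile(losses_by_option[k], q))

rng = np.random.default_rng(0)
options = {"A": rng.lognormal(0.0, 0.5, 1000), "B": rng.lognormal(0.1, 0.2, 1000)}
print(minimax_tail_choice(options))
```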

8.
Structural and Multidisciplinary Optimization - Uncertainties are usually modeled by random variables, and the values of distribution parameters are estimated from the collected samples. In...

9.
For a given prediction model, some predictions may be reliable while others may not. The average accuracy of the system cannot provide a reliability estimate for a single particular prediction, yet a measure of individual prediction reliability can be important information in risk-sensitive applications of machine learning (e.g. medicine, engineering, business). We define empirical measures for estimating prediction accuracy in regression. The presented measures are based on sensitivity analysis of regression models: they estimate the reliability of each individual regression prediction, in contrast to the average prediction reliability of a given regression model. We study the empirical sensitivity properties of five regression models (linear regression, locally weighted regression, regression trees, neural networks, and support vector machines) and the relation between the reliability measures and the distribution of learning examples with respect to prediction errors for all five models. We show that the suggested methodology is appropriate only for three of the studied models (regression trees, neural networks, and support vector machines), and we test the proposed estimates with these three models. The results of our experiments on 48 data sets indicate significant correlations of the proposed measures with the prediction error.
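The following sketch illustrates one way such a sensitivity-based reliability estimate can be computed for a single prediction: the query point is added to the training set with a slightly perturbed label, the model is retrained, and the spread of the resulting predictions is used as an (un)reliability score. It follows the spirit of the sensitivity analysis described above, but the perturbation magnitudes, the aggregation into a single score, and the scikit-learn model used are assumptions.

```python
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeRegressor

def sensitivity_reliability(model, X_train, y_train, x_query, eps_fracs=(0.01, 0.05)):
    """Sensitivity-based (un)reliability estimate for one regression prediction:
    add the query point with a perturbed label, retrain, and average how far the
    prediction moves.  Smaller values suggest a more reliable prediction.
    Perturbation sizes and the aggregation are illustrative assumptions."""
    y0 = clone(model).fit(X_train, y_train).predict(x_query.reshape(1, -1))[0]
    label_range = y_train.max() - y_train.min()
    deviations = []
    for frac in eps_fracs:
        for sign in (+1, -1):
            X_aug = np.vstack([X_train, x_query])
            y_aug = np.append(y_train, y0 + sign * frac * label_range)
            y_new = clone(model).fit(X_aug, y_aug).predict(x_query.reshape(1, -1))[0]
            deviations.append(abs(y_new - y0))
    return float(np.mean(deviations))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X[:, 0] ** 2 + 0.1 * rng.normal(size=200)
print(sensitivity_reliability(DecisionTreeRegressor(random_state=0), X, y, X[0]))
```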

10.
Structural and Multidisciplinary Optimization - It is widely recognized that the active learning kriging (AK) method combined with Monte Carlo simulation (AK-MCS) is a very efficient strategy for failure...

11.
In this paper, we propose a new method to estimate the relationship between software reliability and software development cost, taking into account the complexity of the software system and the size of the software to be developed during the implementation phase of the software development life cycle. On the basis of the estimated relationship, a set of empirical data is used to validate the proposed model by comparing its results with those of existing models. The outcome of this work shows that the proposed method offers a relatively straightforward way of formulating the relationship between reliability and cost during the implementation phase.

12.
Water demands vary, and consideration of the probabilistic nature of these variations should lead to more instructive assessments of the performance of water distribution systems. Water consumption data for several households were analysed using the chi-square technique, and it was found that distributions worth considering under certain circumstances include the normal and the lognormal. Reliability values were calculated for a range of critical demand values, and the corresponding confidence levels were determined from the probability distributions. Water consumption was assumed to be pressure dependent, and the water distribution system was modelled accordingly. This peaking-factor approach, coupled with the statistical modelling of demands, provides a more realistic way of incorporating demand variations in the evaluation and reporting of system performance than the traditional single-demand-value approach, in that both the extent to which a network can satisfy a given demand and the probability that the demand will occur are recognized explicitly. The method is illustrated by an example.
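A minimal version of the distribution-fitting step can be written with SciPy: fit a candidate distribution to the household consumption data and apply a chi-square goodness-of-fit test with equal-probability bins. The bin count, the synthetic demand data, and the degrees-of-freedom correction are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np
from scipy import stats

def chi_square_gof(data, dist, n_bins=8):
    """Chi-square goodness-of-fit test of `data` against a fitted SciPy
    distribution (e.g. stats.norm or stats.lognorm), using bins with equal
    expected probability under the fit.  Degrees of freedom are reduced by the
    number of fitted parameters.  Illustrative sketch only."""
    params = dist.fit(data)
    # interior bin edges at equally spaced quantiles of the fitted distribution
    interior_edges = dist.ppf(np.linspace(0, 1, n_bins + 1)[1:-1], *params)
    observed = np.bincount(np.searchsorted(interior_edges, data), minlength=n_bins)
    expected = np.full(n_bins, len(data) / n_bins)   # equal expected counts by construction
    chi2 = np.sum((observed - expected) ** 2 / expected)
    dof = n_bins - 1 - len(params)
    return chi2, stats.chi2.sf(chi2, dof)

# synthetic household demands standing in for the measured consumption data
demand = np.random.default_rng(0).lognormal(mean=5.0, sigma=0.3, size=200)
print("lognormal:", chi_square_gof(demand, stats.lognorm))
print("normal:   ", chi_square_gof(demand, stats.norm))
```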

13.
14.
We consider the estimation of the parameters indexing a parametric model for the conditional distribution of a diagnostic marker given covariates and disease status. Such models are useful for evaluating whether, and to what extent, a marker's ability to accurately detect or rule out disease depends on patient characteristics. A frequent problem that complicates the estimation of the model parameters is that estimation must be conducted from observational studies: often, not all patients undergo the gold-standard assessment of disease, and the decision as to whether a patient undergoes verification is not controlled by the study design. In such scenarios, maximum likelihood estimators based only on subjects with observed disease status are generally biased. In this paper, we propose estimators for the model parameters that adjust for selection to verification that may depend on measured patient characteristics and that additionally adjust for an assumed degree of residual association. Such estimators may be used as part of a sensitivity analysis over plausible degrees of residual association. We describe a doubly robust estimator that has the attractive feature of being consistent if either a model for the probability of selection to verification or a model for the probability of disease among the verified subjects (but not necessarily both) is correct.
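To illustrate the double-robustness idea in a much simpler setting than the paper's marker model, the toy sketch below estimates disease prevalence under verification bias by combining a working model for selection to verification with a working model for disease among the verified, via an augmented inverse-probability-weighted estimator; the estimate is consistent if either working model is correct. The variable names, the use of logistic regressions, and the synthetic data are assumptions, not the paper's estimator.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def doubly_robust_prevalence(X, verified, disease):
    """Toy doubly robust estimate of disease prevalence under verification bias:
    combines a verification (selection) model with a disease model fitted on the
    verified subjects.  Simplified illustration of double robustness only."""
    # working model 1: probability of being selected for verification
    pi = LogisticRegression().fit(X, verified).predict_proba(X)[:, 1]
    # working model 2: probability of disease, fitted on verified subjects only
    m = LogisticRegression().fit(X[verified == 1], disease[verified == 1]).predict_proba(X)[:, 1]
    d_obs = np.where(verified == 1, disease, 0.0)
    # augmented inverse-probability-weighted estimator
    return np.mean(verified * d_obs / pi - (verified - pi) / pi * m)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
disease = rng.binomial(1, 1 / (1 + np.exp(-0.8 * X[:, 0]))).astype(float)
verified = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + X[:, 1]))))   # selection depends on covariates
disease[verified == 0] = np.nan                                  # disease unknown if not verified
print(doubly_robust_prevalence(X, verified, disease))
```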

15.
The Birnbaum-Saunders distribution has recently received considerable attention in the statistical literature, including some applications in the environmental sciences. Several authors have generalized this distribution, but these generalizations are still inadequate for predicting extreme percentiles. In this paper, we consider a variation of the Birnbaum-Saunders distribution, which enables the prediction of extreme percentiles as well as the implementation of the EM algorithm for maximum likelihood estimation of the distribution parameters. This implementation has some advantages over the direct maximization of the likelihood function. Finally, we present results of a simulation study along with an application to a real environmental data set.
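For reference, the classical two-parameter Birnbaum-Saunders distribution is available in SciPy as the fatigue-life distribution, so a basic fit and an extreme-percentile estimate can be sketched as below. This uses SciPy's generic maximum likelihood fit of the classical model, not the paper's modified distribution or its EM algorithm, and the data are synthetic.

```python
import numpy as np
from scipy import stats

# The Birnbaum-Saunders (fatigue-life) distribution in SciPy: shape c plays the
# role of alpha, scale plays the role of beta.
rng = np.random.default_rng(1)
sample = stats.fatiguelife.rvs(c=0.5, scale=2.0, size=500, random_state=rng)

# Generic MLE fit of the classical two-parameter model (location fixed at 0)
# and an estimate of an extreme upper percentile.
c_hat, loc_hat, scale_hat = stats.fatiguelife.fit(sample, floc=0)
q999 = stats.fatiguelife.ppf(0.999, c_hat, loc=loc_hat, scale=scale_hat)
print(f"alpha = {c_hat:.3f}, beta = {scale_hat:.3f}, 99.9th percentile = {q999:.2f}")
```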

16.
In recent years, advanced geospatial technologies have played an increasingly important role in supporting critical decision making in disaster response. One rising challenge in effectively using the growing volume of geospatial data sets is to process the data rapidly and extract useful information. Unprocessed data are intangible and non-consumable, and often create the so-called “data-rich-but-information-poor” situation. To address this issue, this study proposes a Data Envelopment Analysis (DEA) based information-salience framework to prioritize the sequence of information processing tasks. The proposed model integrates the DEA efficiency score with a linguistic group decision process. For the input variables, computational complexity and intensity are selected to measure the difficulty of information processing. For the outputs, the performance of each processing task is evaluated based on experts' judgment of how well the task satisfies the needs of decision makers; these needs are characterized by four classic disaster functions. A unique element of the proposed framework is that cone constraints, derived from the experts' evaluation of the importance of the four disaster functions, are added to the DEA model to capture the dynamic information need. The proposed model was validated with a Hurricane Sandy case study. The results indicate that the framework is capable of prioritizing geospatial data processing tasks in a systematic manner and of accelerating information extraction from disaster-related geospatial data sets.
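The DEA building block can be illustrated with a plain CCR efficiency score computed by linear programming; the framework described above additionally imposes cone (weight-restriction) constraints derived from expert judgments on the four disaster functions, which are not included in this sketch. The toy input/output data are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, dmu):
    """CCR (constant returns to scale) DEA efficiency of one decision-making unit,
    multiplier form: maximise the weighted output of `dmu` subject to its weighted
    input equalling 1 and no unit exceeding efficiency 1.
    `inputs` has shape (n_dmus, n_in), `outputs` has shape (n_dmus, n_out)."""
    n, m_in = inputs.shape
    _, m_out = outputs.shape
    # decision variables: output weights u (m_out entries) then input weights v (m_in entries)
    c = np.concatenate([-outputs[dmu], np.zeros(m_in)])        # maximise u'y_o
    A_eq = [np.concatenate([np.zeros(m_out), inputs[dmu]])]    # v'x_o = 1
    A_ub = np.hstack([outputs, -inputs])                       # u'y_j - v'x_j <= 0 for all j
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (m_out + m_in), method="highs")
    return -res.fun                                            # efficiency score in (0, 1]

# toy example: inputs = computational complexity/intensity, output = expert salience score
X = np.array([[4.0, 2.0], [6.0, 3.0], [3.0, 5.0]])
Y = np.array([[7.0], [8.0], [6.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(len(X))])
```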

17.
Neural network models for conditional distributions under Bayesian analysis (cited 1 time: 0 self-citations, 1 by others)
We use neural networks (NNs) as a tool for nonlinear autoregression to predict the second moment of the conditional density of a return series. The NN models are compared to the popular econometric GARCH(1,1) model. We estimate the models in a Bayesian framework using Markov chain Monte Carlo posterior simulations. The interlinked aspects of the proposed Bayesian methodology are the identification of NN hidden units and the treatment of NN complexity based on model evidence. The empirical study applies the designed strategy to market data, where we found strong support for a nonlinear multilayer perceptron model with two hidden units.
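For context, the GARCH(1,1) benchmark's conditional variance follows the recursion sigma^2_t = omega + alpha * r^2_{t-1} + beta * sigma^2_{t-1}; a minimal implementation on synthetic returns is sketched below. The parameter values and initialisation are illustrative, not estimates from the paper, and the sketch does not cover the Bayesian NN side of the comparison.

```python
import numpy as np

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance recursion of the GARCH(1,1) benchmark model:
    sigma^2_t = omega + alpha * r^2_{t-1} + beta * sigma^2_{t-1}.
    Parameter values are passed in by the caller; they are not fitted here."""
    sigma2 = np.empty_like(returns, dtype=float)
    sigma2[0] = np.var(returns)                  # a common initialisation choice
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

r = np.random.default_rng(2).normal(0, 0.01, size=1000)   # synthetic return series
sig2 = garch11_variance(r, omega=1e-6, alpha=0.05, beta=0.9)
print("mean conditional volatility:", float(np.sqrt(sig2).mean()))
```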

18.
19.
Reliability-based design optimization (RBDO) requires evaluation of the sensitivities of probabilistic constraints. To develop RBDO utilizing the recently proposed novel second-order reliability method (SORM), which improves conventional SORM approaches in terms of accuracy, the sensitivities of the probabilistic constraints at the most probable point (MPP) are required. Thus, this study presents a sensitivity analysis of the novel SORM at the MPP for more accurate RBDO. In the analytic derivation, it is assumed that the Hessian matrix does not change under small changes of the design variables. Computing the sensitivity from the analytic derivation requires evaluating the probability density function (PDF) of a linear combination of non-central chi-square variables, which is obtained using the generalized chi-squared distribution. In terms of accuracy, the proposed probabilistic sensitivity analysis is compared with the finite difference method (FDM) using Monte Carlo simulation (MCS) through numerical examples. The numerical examples demonstrate that the analytic sensitivity of the novel SORM agrees very well with the sensitivity obtained by FDM using MCS when the performance function is quadratic in U-space and the input variables are normally distributed. It is further shown that the proposed sensitivity remains sufficiently accurate compared with the FDM results even for higher-order performance functions.

20.
The accurate and reliable measurement of effluent quality indices is essential for successful control and optimization of wastewater treatment plants. In order to enhance estimation performance in terms of accuracy and reliability, we present a partial least-squares-based extreme learning machine (PLS-ELM) in this paper. Partial least squares (PLS) regression is applied within the ELM framework to improve the algebraic properties of the hidden-layer output matrix, which can be ill-conditioned owing to high multicollinearity of the hidden-layer outputs. The main idea behind the proposed PLS-ELM is to achieve robust generalization performance by extracting a reduced number of latent variables from the hidden layer and using orthogonal projection operations. The results from a case study of a municipal wastewater treatment plant show that the PLS-ELM can effectively capture the input-output relationship with favorable performance compared with the conventional ELM.
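A minimal sketch of the PLS-ELM idea: generate a random sigmoid hidden layer as in a standard ELM, then regress the targets on the hidden-layer outputs with partial least squares instead of the usual pseudo-inverse, so that a reduced number of latent variables absorbs the multicollinearity. The hyper-parameters, the activation function, and the synthetic data below are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

class PLSELM:
    """Sketch of a PLS-based extreme learning machine: a random sigmoid hidden
    layer (standard ELM) whose possibly ill-conditioned output matrix is regressed
    on the targets with partial least squares rather than a pseudo-inverse."""
    def __init__(self, n_hidden=50, n_components=10, seed=0):
        self.n_hidden, self.n_components = n_hidden, n_components
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))            # sigmoid activations

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))     # random input weights
        self.b = self.rng.normal(size=self.n_hidden)                   # random biases
        self.pls = PLSRegression(n_components=self.n_components).fit(self._hidden(X), y)
        return self

    def predict(self, X):
        return self.pls.predict(self._hidden(X)).ravel()

# toy usage with synthetic data standing in for effluent-quality measurements
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
y = X[:, 0] ** 2 + np.sin(X[:, 1]) + 0.1 * rng.normal(size=200)
model = PLSELM().fit(X[:150], y[:150])
print("test RMSE:", float(np.sqrt(np.mean((model.predict(X[150:]) - y[150:]) ** 2))))
```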
