Similar Documents
1.
The Bayesian approach to uncertainty evaluation is a classical example of the fusion of information from different sources. It is founded on knowledge about both the measurement process and the influencing quantities and parameters. Knowledge about the measurement process is primarily represented by the so-called model equation, which forms the basic relationship for the fusion of all involved quantities. Knowledge about the influencing quantities and parameters is expressed by degrees of belief, i.e. appropriate probability density functions, usually obtained by utilizing the principle of maximum information entropy and Bayes' theorem. In practice, the Bayesian approach to uncertainty evaluation is put into effect by employing numerical integration techniques, preferably Monte Carlo methods. Compared to the ISO-GUM procedure, the Bayesian approach has no restrictions with respect to nonlinearities or the calculation of confidence intervals.
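The Monte Carlo evaluation described above can be sketched in a few lines: draw from each input's probability density, push the draws through the model equation, and summarize the output distribution. The model equation Y = X1 + X2 and the Gaussian/rectangular inputs below are invented stand-ins for real calibration data, not taken from the paper.

```python
import random
import statistics

def monte_carlo_uncertainty(model, inputs, n=50_000, seed=0):
    """Propagate input PDFs through a model equation by Monte Carlo
    sampling and summarize the output distribution."""
    rng = random.Random(seed)
    ys = sorted(model(*(draw(rng) for draw in inputs)) for _ in range(n))
    mean = statistics.fmean(ys)
    u = statistics.stdev(ys)                        # standard uncertainty
    ci = (ys[int(0.025 * n)], ys[int(0.975 * n)])   # 95% coverage interval
    return mean, u, ci

# Hypothetical model equation Y = X1 + X2 with one Gaussian and one
# rectangular (uniform) input -- illustrative only.
mean, u, ci = monte_carlo_uncertainty(
    lambda x1, x2: x1 + x2,
    [lambda r: r.gauss(10.0, 0.1), lambda r: r.uniform(-0.2, 0.2)],
)
```

Because the coverage interval is read off the empirical quantiles, no linearization of the model and no normality assumption on Y is needed, which is exactly the advantage over the ISO-GUM procedure noted above.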

2.
Inference for the extreme-value regression model under Type-II censoring is discussed. The likelihood function and the score functions of the unknown parameters are presented. The asymptotic variance-covariance matrix is derived through the inverse of the expected Fisher information matrix. Since the maximum likelihood estimators (MLEs) cannot be obtained analytically, approximations to these MLEs are proposed. The variance-covariance matrix of these approximate estimators is also derived. Next, confidence intervals are proposed based on the MLEs and the approximate estimators. An extensive simulation study is carried out to assess the bias and variance of all these estimators, as well as the coverage probabilities and expected widths of the confidence intervals. Finally, all the inferential procedures discussed here are illustrated with practical data.

3.
Comparative lifetime experiments are of paramount importance when the object of a study is to ascertain the relative merits of two competing products with regard to the duration of their service life. In this paper, we discuss exact inference for two exponential populations when Type-II censoring is implemented on the two samples in a combined manner. We obtain the conditional maximum likelihood estimators (MLEs) of the two exponential mean parameters, and then derive the moment generating functions and exact distributions of these MLEs, along with exact confidence intervals and simultaneous confidence regions. Moreover, simultaneous approximate confidence regions based on the asymptotic normality of the MLEs and simultaneous credible regions from a Bayesian viewpoint are also discussed. A comparison of the exact, approximate, bootstrap and Bayesian intervals is made in terms of coverage probabilities. Finally, an example is presented to illustrate all the methods of inference discussed here.
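As a building block for the joint censoring scheme above, the single-sample case has a well-known closed form: under Type-II censoring of an exponential sample, the MLE of the mean is the total time on test divided by the number of observed failures. A minimal sketch (sample size, censoring level and θ are invented for illustration):

```python
import random

def exp_mean_mle_type2(lifetimes, r):
    """MLE of the exponential mean from a Type-II censored sample:
    only the r smallest of n lifetimes are observed; the remaining
    n - r units survive past the r-th failure time."""
    n = len(lifetimes)
    observed = sorted(lifetimes)[:r]
    # Total time on test: observed failures plus censored units,
    # each counted up to the r-th failure time.
    total_time_on_test = sum(observed) + (n - r) * observed[-1]
    return total_time_on_test / r

# Invented example: n = 200 units on test, stop after r = 150 failures.
rng = random.Random(1)
theta = 100.0
sample = [rng.expovariate(1.0 / theta) for _ in range(200)]
theta_hat = exp_mean_mle_type2(sample, r=150)
```

Since 2r·θ̂/θ follows a chi-square distribution with 2r degrees of freedom in this setting, exact confidence intervals follow directly from chi-square quantiles, which is the kind of exact distributional result the paper extends to the two-sample combined scheme.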

4.
Two-stage stochastic linear complementarity problems (TSLCPs) model a large class of equilibrium problems subject to data uncertainty and are closely related to two-stage stochastic optimization problems. The sample average approximation (SAA) method is one of the basic approaches for solving TSLCPs, and the consistency of SAA solutions has been well studied. This paper focuses on building confidence regions for the solution of a TSLCP when SAA is implemented. We first establish an error-bound condition for the TSLCP and then build asymptotic and nonasymptotic confidence regions for its solutions via the error-bound approach, which combines the error-bound condition with central limit theory, empirical likelihood theory, and large deviation theory.

5.
In this paper, the problem of estimating uncertainty regions for identified models is considered. A typical approach in this context is to resort to the asymptotic theory of prediction error methods for system identification, by means of which ellipsoidal uncertainty regions can be constructed for the uncertain parameters. We show that the uncertainty regions worked out through the asymptotic theory can be unreliable in certain situations, precisely characterized in the paper. We then critically analyze the theoretical conditions for the validity of the asymptotic theory and prove that it also applies under new assumptions which are less restrictive than those usually required. Thanks to this result, we single out the classes of standard models (ARX, ARMAX, Box-Jenkins, etc.) for which the asymptotic theory can be safely used in practical applications to assess the quality of the identified model. These results are of interest in many applications, including iterative controller design schemes.

6.
In this paper, we consider single-machine scheduling problems under a position-dependent fuzzy learning effect with fuzzy processing times. We study three objectives: minimizing makespan, total completion time, and total weighted completion time, and show that all three problems are polynomially solvable in this setting. To model the uncertainty of fuzzy parameters such as processing times and the learning effect, we use the likelihood profile approach, which depends on the possibility and necessity measures of the fuzzy parameters. For the three objective functions, we build fuzzy mixed-integer nonlinear programming (FMINP) models using dependent chance-constrained programming techniques for the same predetermined confidence levels. Furthermore, we present polynomial-time algorithms for these problems at different confidence levels.

7.
A statistical minimax method is proposed for optimizing linear models whose parameters are known only up to membership in some uncertainty sets. Statistical methods for constructing uncertainty sets as confidence regions with a given reliability level are presented. A numerical method for finding a minimax strategy is proposed for arbitrary uncertainty sets satisfying convexity and compactness conditions. A number of examples that admit an analytical solution of the optimization problem are considered. Results of numerical simulation are given.

8.
A probabilistic construction of model validation
We describe a procedure to assess the predictive accuracy of process models subject to approximation error and uncertainty. The proposed approach is a functional-analysis-based probabilistic approach in which random quantities are represented using polynomial chaos expansions (PCEs). The approach permits the uncertainty assessment in validation, a significant component of the process, to be formulated as a problem of approximation theory. It has two essential parts. First, a statistical procedure is implemented to calibrate the uncertain parameters of the candidate model from experimental or model-based measurements. This calibration technique employs PCEs to represent the inherent uncertainty of the model parameters. Based on the asymptotic behavior of the statistical parameter estimator, the associated PCE coefficients are then characterized as independent random quantities representing epistemic uncertainty due to lack of information. Second, a simple hypothesis test is implemented to explore the validity of the computational model assumed for the physics of the problem. The above validation path is implemented for the dynamical system validation challenge exercise.
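As a toy illustration of the PCE representation mentioned above (not the paper's calibration machinery), a scalar random quantity Y = g(ξ), with ξ standard normal, can be projected onto probabilists' Hermite polynomials by Monte Carlo. Y = exp(ξ) is chosen here because its exact coefficients, c_k = e^(1/2)/k!, are known and can be checked.

```python
import math
import random

def he(k, x):
    """Probabilists' Hermite polynomial He_k(x) via the recurrence
    He_{n+1}(x) = x He_n(x) - n He_{n-1}(x)."""
    h_prev, h = 1.0, x
    if k == 0:
        return h_prev
    for n in range(1, k):
        h_prev, h = h, x * h - n * h_prev
    return h

def pce_coefficients(g, order, n=100_000, seed=0):
    """Monte Carlo projection of Y = g(xi), xi ~ N(0,1), onto Hermite
    chaos: c_k = E[g(xi) He_k(xi)] / k!  (He_k are orthogonal with
    squared norm k! under the standard normal weight)."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(g(x) * he(k, x) for x in xs) / (n * math.factorial(k))
            for k in range(order + 1)]

# For Y = exp(xi) the exact coefficients are c_k = exp(1/2) / k!.
c = pce_coefficients(math.exp, order=3)
```

In the paper's setting the PCE coefficients themselves become the random quantities encoding epistemic uncertainty; this sketch only shows the underlying orthogonal-projection step.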

9.
Manual calibration of distributed models with many unknown parameters can result in problems of equifinality and high uncertainty. In this study, the Generalized Likelihood Uncertainty Estimation (GLUE) technique was used to address these issues through uncertainty and sensitivity analysis of a distributed watershed-scale model (SAHYSMOD) for predicting changes in the groundwater levels of the Rechna Doab basin, Pakistan. The study proposes and describes a stepwise methodology for SAHYSMOD uncertainty analysis that has not been explored in any previous study. One thousand input data files created through Monte Carlo simulations were classified into behavioral and non-behavioral sets using threshold likelihood values. The model was calibrated (1983–1988) and validated (1998–2003), with satisfactory agreement between simulated and observed data and acceptable values of the statistical performance indices. Approximately 70% of the observed groundwater level values fell within the uncertainty bounds. Groundwater pumping (Gw) and hydraulic conductivity (Kaq) were found to be highly sensitive parameters affecting groundwater recharge.
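The GLUE classification step described above reduces to: sample parameter sets by Monte Carlo, score each model run with a likelihood measure, and keep the sets scoring above a threshold as "behavioral". A toy sketch with a linear stand-in model in place of SAHYSMOD (the threshold 0.9, the parameter ranges, and the parameters a and b are all invented):

```python
import random

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 indicates a perfect fit."""
    mean_obs = sum(obs) / len(obs)
    err = sum((s - o) ** 2 for s, o in zip(sim, obs))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - err / var

# "Observations" from a toy model y = 2x + 1, standing in for observed
# groundwater levels; a and b stand in for parameters like Gw and Kaq.
xs = list(range(10))
obs = [2.0 * x + 1.0 for x in xs]

rng = random.Random(0)
behavioral = []
for _ in range(1000):                  # Monte Carlo parameter sets
    a, b = rng.uniform(0.0, 4.0), rng.uniform(-2.0, 4.0)
    sim = [a * x + b for x in xs]
    likelihood = nash_sutcliffe(sim, obs)
    if likelihood > 0.9:               # threshold likelihood value
        behavioral.append((likelihood, a, b))
```

In a real GLUE study the retained likelihoods are then used as weights to form prediction quantiles, which is how the uncertainty bounds containing ~70% of the observations above are obtained.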

10.
Bootstrap confidence intervals for the mode of the hazard function
In many applications of lifetime data analysis it is important to draw inferences about the mode of the hazard function when the lifetime data are modeled with unimodal hazard functions. For lifetime distributions where the mode of the hazard function can be calculated analytically, its maximum likelihood estimator is easily obtained from the invariance property of maximum likelihood estimators, and confidence intervals follow from the asymptotic normality of the MLEs. However, these results might not be very accurate for small sample sizes and/or a large proportion of censored observations. Considering the log-logistic distribution with shape parameter β > 1 for the lifetime data, we present and compare the accuracy of asymptotic confidence intervals with two confidence intervals based on bootstrap simulation. The alternative methodologies for confidence intervals for the mode of the log-logistic hazard function are illustrated in three numerical examples.
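For the log-logistic distribution with scale α and shape β > 1, the hazard mode is available analytically as t* = α(β − 1)^(1/β), so a percentile bootstrap interval for it is straightforward: refit on each resample and take empirical quantiles of the resulting modes. In the sketch below a simple moment-style fit (log T is logistic with location log α and scale 1/β) stands in for the MLE used in the paper, and all parameter values are invented:

```python
import math
import random
import statistics

def loglogistic_fit(ts):
    """Moment-style estimates of (alpha, beta): log T is logistic with
    location log(alpha) and scale 1/beta, and a logistic variable with
    scale s has standard deviation pi*s/sqrt(3).
    (A quick stand-in for the MLE, for illustration only.)"""
    logs = [math.log(t) for t in ts]
    alpha = math.exp(statistics.median(logs))
    beta = math.pi / (math.sqrt(3.0) * statistics.stdev(logs))
    return alpha, beta

def hazard_mode(alpha, beta):
    """Analytic mode of the log-logistic hazard (requires beta > 1)."""
    return alpha * (beta - 1.0) ** (1.0 / beta)

rng = random.Random(2)
alpha0, beta0 = 1.0, 2.0
# Inverse-CDF sampling: F^{-1}(u) = alpha * (u / (1 - u))^(1/beta)
data = [alpha0 * (u / (1.0 - u)) ** (1.0 / beta0)
        for u in (rng.random() for _ in range(300))]

# Percentile bootstrap interval for the hazard mode
modes = []
for _ in range(500):
    boot = [rng.choice(data) for _ in data]
    a, b = loglogistic_fit(boot)
    if b > 1.0:                        # mode defined only for beta > 1
        modes.append(hazard_mode(a, b))
modes.sort()
lo, hi = modes[int(0.025 * len(modes))], modes[int(0.975 * len(modes))]
```

The invariance property mentioned above is what justifies plugging the fitted (α, β) into the analytic mode formula on each resample.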

11.
Generalized Extended to the Limit LU sparse factorization procedures for the solution of large sparse unsymmetric linear systems of irregular and unsymmetric structure are presented. Composite “inner-outer” iterative schemes incorporating these procedures are introduced for solving non-linear elliptic and parabolic difference equations. Applications of the methods on non-linear boundary-value problems are discussed and numerical results are given.

12.
A fuzzy chance-constrained programming model for an integrated remanufacturing/manufacturing logistics network
In an integrated remanufacturing/manufacturing (R/M) logistics network, the quantity of returned products is uncertain. Accordingly, the recovery quantities of used products in each consumer region are treated as fuzzy parameters, and a fuzzy chance-constrained programming model of the integrated logistics network is proposed. By converting the fuzzy chance constraints into their crisp equivalents, the model is transformed into a deterministic mixed-integer programming model. Using example data, the model is analyzed under different confidence levels, and the results provide a basis for the design of the integrated logistics network.

13.
Parameter uncertainty and sensitivity for a watershed-scale simulation model in Portugal were explored to identify the model parameters most critical to calibration and prediction. The research is intended to provide guidance on allocating limited data-collection and model-parameterization resources for modelers working in any data- and resource-limited environment. The watershed-scale hydrology and water quality simulation model Hydrologic Simulation Program – FORTRAN (HSPF) was used to predict the hydrology of the Lis River basin in Portugal. The model was calibrated for the 5-year period 1985–1989 and validated for the 4-year period 2003–2006. Agreement between simulated and observed streamflow data was satisfactory according to performance measures such as the Nash–Sutcliffe efficiency (E), deviation of runoff (Dv) and coefficient of determination (R2). The Generalized Likelihood Uncertainty Estimation (GLUE) method was used to establish uncertainty bounds for the simulated flow, using the Nash–Sutcliffe coefficient as the performance likelihood measure. Sensitivity analysis indicates that runoff estimates are most sensitive to parameters related to climate conditions, soil and land use. These results indicate that even though climate conditions are generally most significant in water balance modeling, attention should also be paid to land use characteristics. Specifically with respect to HSPF, the two most sensitive parameters, INFILT and LZSN, are both directly dependent on soil and land use characteristics.

14.
An R package is developed for the Generalized Extreme Value conditional density estimation network (GEVcdn). Parameters in a GEV distribution are specified as a function of covariates using a probabilistic variant of the multilayer perceptron neural network. If the covariate is time or is dependent on time, then the GEVcdn model can be used to perform nonlinear, nonstationary extreme value analysis. Due to the flexibility of the neural network architecture, the model is capable of representing a wide range of nonstationary relationships, including those involving interactions between covariates. Model parameters are estimated by generalized maximum likelihood, an approach that is tailored to the analysis of hydroclimatological extremes. Functions are included to assist in the calculation of parameter uncertainty via bootstrapping.

15.
Some work has been done in the past on the estimation of the parameters of the three-parameter lognormal distribution based on complete and censored samples. In this article, we develop inferential methods based on progressively Type-II censored samples from a three-parameter lognormal distribution. In particular, we use the EM algorithm, as well as some other numerical methods, to determine the maximum likelihood estimates (MLEs) of the parameters. The asymptotic variances and covariances of the MLEs from the EM algorithm are computed using the missing information principle. An alternative estimator, a modification of the MLE, is also proposed. The methodology developed here is then illustrated with some numerical examples. Finally, we discuss interval estimation based on large-sample theory and examine the actual coverage probabilities of these confidence intervals for small samples by means of a Monte Carlo simulation study.

16.
Varying-coefficient models are popular multivariate nonparametric fitting techniques. When all coefficient functions in a varying-coefficient model share the same smoothing variable, available inference tools include the F-test, the sieve empirical likelihood ratio test and the generalized likelihood ratio (GLR) test. However, when the coefficient functions have different smoothing variables, these tools cannot be used directly to make inferences on the model because of differences in the process of estimating the functions. In this paper, the GLR test is extended to the latter case by means of efficient estimators of the coefficient functions. Under the null hypothesis, the proposed GLR test statistic follows the χ2-distribution asymptotically, with scale constant and degrees of freedom independent of the nuisance parameters, a property known as the Wilks phenomenon. We further derive its asymptotic power, which is shown to achieve the optimal rate of convergence for nonparametric hypothesis testing. A simulation study is conducted to evaluate the test procedure empirically.

17.
Based on progressively Type-II censored samples, constant-stress partially accelerated life tests (PALTs) are considered when the lifetimes of items under use conditions follow the two-parameter Burr Type-XII (Burr(c,k)) distribution. The likelihood equations of the parameters involved are derived and reduced to a single nonlinear equation, which is solved numerically to obtain the maximum likelihood estimates (MLEs). The observed Fisher information matrix and the asymptotic variance-covariance matrix of the MLEs are derived. Approximate confidence intervals (CIs) for the parameters based on the normal approximation to the asymptotic distribution of the MLEs, as well as studentized-t and percentile bootstrap CIs, are derived. A Monte Carlo simulation study is carried out to investigate the precision of the MLEs and to compare the performance of the CIs considered. Finally, two examples are presented to illustrate the results, followed by conclusions.

18.
In this paper the problem of computing uncertainty regions for models identified through an instrumental variable technique is considered. Recently, it has been pointed out that, in certain operating conditions, the asymptotic theory of system identification (the most widely used method for model quality assessment) may deliver unreliable confidence regions. The aim of this paper is to show that, in an instrumental variable setting, the asymptotic theory exhibits a certain “robustness” that makes it reliable even with a moderate number of data samples. Reasons for this are highlighted in the paper through a theoretical analysis and simulation examples.

19.
In this paper, the effects of environmental and hunting parameters on interspecific interacting populations are considered by applying the Rosenzweig-MacArthur model with a Holling type II functional response. Attenuating functions of the carrying capacity are introduced with attention to the hunting parameters. We carry out a numerical study to investigate how the population densities behave when environmental quantities change, and obtain Hopf bifurcation diagrams from the numerical results.
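A numerical study of this kind starts from time integration of the predator-prey system: logistic prey growth with carrying capacity K, and a Holling type II (saturating) predation term. A minimal forward-Euler sketch of the plain Rosenzweig-MacArthur model follows; the parameter values are illustrative, not taken from the paper, and the hunting and carrying-capacity attenuation terms it introduces are omitted.

```python
def rhs(n, p, r=1.0, k=10.0, a=1.0, h=0.5, e=0.5, m=0.4):
    """Rosenzweig-MacArthur right-hand side: prey n with logistic
    growth, predator p with a Holling type II functional response
    (attack rate a, handling time h, conversion e, mortality m)."""
    intake = a * n / (1.0 + a * h * n)      # Holling type II, saturates at 1/h
    dn = r * n * (1.0 - n / k) - intake * p
    dp = e * intake * p - m * p
    return dn, dp

# Forward-Euler time stepping (small dt for stability)
n, p, dt = 5.0, 2.0, 0.001
for _ in range(50_000):                     # integrate to t = 50
    dn, dp = rhs(n, p)
    n, p = n + dt * dn, p + dt * dp
```

Sweeping a parameter such as K or m over a grid, re-running the integration, and recording the long-run extrema of n and p is the usual way such Hopf bifurcation diagrams are produced numerically.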

20.
Generalized Approximate Inverse Matrix (GAIM) techniques based on LU-type sparse factorization procedures are introduced for explicitly calculating approximate inverses of large sparse unsymmetric matrices of regular structure without inverting the factors L and U. Explicit first- and second-order iterative methods, in conjunction with modified forms of the GAIM techniques, are presented for numerically solving three-dimensional initial/boundary-value problems on multiprocessor systems. Application of the new methods to a 3D boundary-value problem is discussed and numerical results are given.
