Similar Literature
 20 similar references found (search time: 31 ms)
1.
To account for the input-model and input-parameter uncertainties inherent in many simulations as well as the usual stochastic uncertainty, we present a Bayesian input-modeling technique that yields improved point and confidence-interval estimators for a selected posterior mean response. Exploiting prior information to specify the prior probabilities of the postulated input models and the associated prior input-parameter distributions, we use sample data to compute the posterior input-model and input-parameter distributions. Our Bayesian simulation replication algorithm involves: (i) estimating parameter uncertainty by randomly sampling the posterior input-parameter distributions; (ii) estimating stochastic uncertainty by running independent replications of the simulation using each set of input-model parameters sampled in (i); and (iii) estimating input-model uncertainty by weighting the responses generated in (ii) using the corresponding posterior input-model probabilities. Sampling effort is allocated among input models to minimize final point-estimator variance subject to a computing-budget constraint. A queueing simulation demonstrates the advantages of this approach.
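A rough illustrative sketch of the three-step replication algorithm: posterior input parameters are sampled, replications are run for each draw, and the results are weighted by posterior model probabilities. The model set, the posterior quantities and the stand-in "simulation" below are assumed for illustration and are not the queueing example from the paper.

```python
# Minimal sketch of the Bayesian simulation replication idea (hypothetical
# model set and posterior quantities; the "simulation" is a stand-in response).
import numpy as np

rng = np.random.default_rng(0)

# Posterior input-model probabilities (the step-(iii) weights) -- assumed values.
post_model_prob = {"exponential": 0.6, "lognormal": 0.4}

# Posterior input-parameter samplers for each model (step i) -- assumed forms.
def sample_params(model):
    if model == "exponential":
        return {"rate": rng.gamma(shape=20.0, scale=1.0 / 10.0)}   # gamma-like posterior
    return {"mu": rng.normal(0.6, 0.05), "sigma": 0.25}            # lognormal parameters

# Stand-in simulation replication: mean of generated service times (step ii).
def run_replication(model, params, n=200):
    if model == "exponential":
        x = rng.exponential(1.0 / params["rate"], size=n)
    else:
        x = rng.lognormal(params["mu"], params["sigma"], size=n)
    return x.mean()

def posterior_mean_response(n_param_draws=50, n_reps=10):
    total = 0.0
    for model, w in post_model_prob.items():
        draws = []
        for _ in range(n_param_draws):                  # (i) parameter uncertainty
            params = sample_params(model)
            reps = [run_replication(model, params) for _ in range(n_reps)]  # (ii)
            draws.append(np.mean(reps))
        total += w * np.mean(draws)                     # (iii) model-probability weighting
    return total

print(posterior_mean_response())
```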

2.
The concept of a Bayesian probability of agreement was recently introduced to give the posterior probabilities that the response surfaces for two different groups are within δ of one another. For example, a difference of less than δ in the mean response at fixed levels of the predictor variables might be thought to be practically unimportant. In such a case, we would say that the mean responses are in agreement. The posterior probability of this is called the Bayesian probability of agreement. In this article, we quantify the probability that new response observations from two groups will be within δ for a continuous response, and the probability that the two responses agree completely for categorical cases such as logistic regression and Poisson regression. We call these Bayesian comparative predictive probabilities, with the former being the predictive probability of agreement. We use Markov chain Monte Carlo simulation to estimate the posterior distribution of the model parameters and then the predictive probability of agreement. We illustrate the use of this methodology with three examples and provide a freely available R Shiny app that automates the computation and estimation associated with the methodology.
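A minimal sketch of the continuous-response case: given posterior draws of each group's parameters (synthetic draws standing in for MCMC output), the predictive probability of agreement is the fraction of simulated new-response pairs that differ by less than an assumed δ at fixed predictor levels.

```python
# Sketch of estimating a predictive probability of agreement from posterior draws
# (the draws, delta and predictor level are all assumed placeholders).
import numpy as np

rng = np.random.default_rng(1)
delta = 2.0
x0 = np.array([1.0, 3.5])          # intercept + one predictor, fixed level

# Pretend these are MCMC draws of (coefficients, residual sd) for each group.
beta1 = rng.normal([10.0, 1.2], 0.1, size=(4000, 2))
beta2 = rng.normal([10.5, 1.1], 0.1, size=(4000, 2))
sigma1 = rng.gamma(50, 0.02, size=4000)
sigma2 = rng.gamma(50, 0.02, size=4000)

# Draw new responses for each posterior draw and count |y1 - y2| < delta.
y1 = rng.normal(beta1 @ x0, sigma1)
y2 = rng.normal(beta2 @ x0, sigma2)
pred_prob_agreement = np.mean(np.abs(y1 - y2) < delta)
print(f"predictive probability of agreement: {pred_prob_agreement:.3f}")
```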

3.
Software reliability modeling is of great significance in improving software quality and managing the software development process. However, existing methods cannot accurately model software reliability improvement behavior: single-model methods rely on restrictive assumptions, and combination models cannot adequately handle model uncertainty. In this article, we propose a Bayesian model averaging (BMA) method to model software reliability. First, the existing reliability modeling methods are selected as the candidate models, and Bayesian theory is used to obtain the posterior probability of each reliability model. Then, the posterior probabilities are used as weights to average the candidate models. Both the Markov Chain Monte Carlo (MCMC) algorithm and the Expectation–Maximization (EM) algorithm are used to evaluate a candidate model's posterior probability and to compare the two. The results show that the BMA method has superior performance in software reliability modeling, and that the MCMC algorithm performs better than the EM algorithm when they are used to estimate the parameters of the BMA method.
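A hedged sketch of the averaging step: two hypothetical candidate reliability models are fitted to made-up weekly failure counts and combined with posterior weights approximated via BIC. The paper itself obtains the posterior model probabilities with MCMC and EM; this only illustrates how such weights average the candidates.

```python
# BMA sketch with BIC-approximated posterior model weights (illustrative data).
import numpy as np

t = np.arange(1, 11, dtype=float)                 # test weeks
failures = np.array([27, 16, 11, 10, 11, 7, 2, 5, 3, 1], dtype=float)

# Two hypothetical mean-value functions: a Goel-Okumoto-style exponential
# intensity and a linearly decaying intensity.
def loglik(model, params):
    a, b = params
    if model == "exp":
        m = a * b * np.exp(-b * t)                # expected failures per week
    else:
        m = np.maximum(a - b * t, 1e-6)
    return np.sum(failures * np.log(m) - m)       # Poisson log-likelihood (up to const)

def fit(model):
    # crude grid search stands in for MCMC/EM parameter estimation
    grid = [(a, b) for a in np.linspace(50, 300, 60) for b in np.linspace(0.01, 5, 120)]
    return max(grid, key=lambda p: loglik(model, p))

bics = {}
for model in ("exp", "linear"):
    p_hat = fit(model)
    bics[model] = -2 * loglik(model, p_hat) + 2 * np.log(len(t))

b_min = min(bics.values())
w = {m: np.exp(-0.5 * (b - b_min)) for m, b in bics.items()}
z = sum(w.values())
post_weight = {m: v / z for m, v in w.items()}
print(post_weight)   # BMA forecast = weighted average of the candidates' forecasts
```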

4.
马君明  李惠  兰成明  刘彩平 《工程力学》2022,39(3):11-22, 63
This paper focuses on a model for updating structural system reliability with observed information and on its rejection-sampling algorithm. A model for updating the failure probability of a structural system that incorporates observed information is established within the Bayesian framework; likelihood functions of the random variables under inequality-type and equality-type observations are constructed according to the type of observation event, and the corresponding posterior probability density functions are derived. A rejection-sampling strategy for drawing posterior samples of the random variables is determined from the observation-information domain, the sampling efficiency of the rejection-sampling algorithm is investigated, and formulas for the updated estimate of the system failure probability and its standard deviation are derived. The method is applied to updating the system reliability of a rigid frame undergoing plastic failure. The study shows that the conditional failure-probability updating model that accounts for observed information can be converted into an integral of the posterior probability density of the random variables over the failure domain; constructing prior samples that satisfy the observation-information domain and using them as posterior samples of the random variables is a feasible sampling strategy, and this strategy can handle system reliability updating with multiple random variables and multiple observations. Larger measured values of resistance-related random variables and higher proof-load values both reduce the updated failure probability of the structural system; the standard deviation of the measurement error of resistance-related random variables must also be controlled in order to reduce the uncertainty of the observed information.
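A minimal sketch of the rejection-sampling strategy under an inequality-type observation: prior samples that satisfy the observation domain (here, survival of an assumed proof load q_proof) are kept as posterior samples, and the updated failure probability is estimated from them. The lognormal resistance, Gumbel load and limit state below are illustrative, not the rigid-frame example.

```python
# Rejection sampling on the observation domain, then failure-probability update.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

R = rng.lognormal(mean=np.log(220.0), sigma=0.15, size=n)   # resistance (assumed)
S = rng.gumbel(loc=150.0, scale=20.0, size=n)               # load effect (assumed)

q_proof = 230.0                  # inequality observation: the structure survived this proof load
accepted = R > q_proof           # rejection step: keep samples consistent with the observation

pf_prior = np.mean(R - S < 0)
pf_post = np.mean((R[accepted] - S[accepted]) < 0)
print(f"acceptance rate {accepted.mean():.3f}, "
      f"failure probability prior {pf_prior:.4f} -> updated {pf_post:.4f}")
```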

5.
A warning system such as the Command, Control, Communication, and Intelligence system (C3I) for the United States nuclear forces operates on the basis of various sources of information among which are signals from sensors. A fundamental problem in the use of such signals is that these sensors provide only imperfect information. Bayesian probability, defined as a degree of belief in the possibility of each event, is therefore a key concept in the logical treatment of the signals. However, the base of evidence for estimation of these probabilities may be small and, therefore, the results of the updating (posterior probabilities of attack) may also be uncertain. In this paper, we examine the case where uncertainties hinge upon the existence of several possible underlying hypotheses (or models), and where the decision-maker attributes a different probability of attack to each of these fundamental hypotheses. We present a two-stage Bayesian updating process, first of the probabilities of the fundamental hypotheses, then of the probabilities of attack conditional on each hypothesis, given a positive signal from the C3I. We illustrate the method in the discrete case where there are only two possible fundamental hypotheses, and in the case of a continuous set of hypotheses. We discuss briefly the implications of the results for decision-making. The method can be generalized to other warning systems with imperfect signals, when the prior probability of the event of interest is uncertain.
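A small worked sketch of the two-stage update in the discrete two-hypothesis case, given a positive signal. All probabilities below are illustrative placeholders, not values from the paper.

```python
# Two-stage Bayesian update: hypotheses first, then attack probability per hypothesis.
p_h = [0.7, 0.3]                    # prior probabilities of the two fundamental hypotheses
p_attack_given_h = [1e-5, 1e-3]     # prior attack probability under each hypothesis
p_signal_given_attack = 0.95        # assumed sensor true-positive rate
p_signal_given_no_attack = 0.05     # assumed sensor false-positive rate

# Probability of a positive signal under each hypothesis.
p_signal_given_h = [
    p_signal_given_attack * pa + p_signal_given_no_attack * (1 - pa)
    for pa in p_attack_given_h
]

# Stage 1: update the hypothesis probabilities given the positive signal.
norm = sum(ps * ph for ps, ph in zip(p_signal_given_h, p_h))
post_h = [ps * ph / norm for ps, ph in zip(p_signal_given_h, p_h)]

# Stage 2: update the attack probability conditional on each hypothesis,
# then mix over the updated hypothesis probabilities.
post_attack_given_h = [
    p_signal_given_attack * pa / ps
    for pa, ps in zip(p_attack_given_h, p_signal_given_h)
]
p_attack = sum(pa * ph for pa, ph in zip(post_attack_given_h, post_h))
print(post_h, post_attack_given_h, p_attack)
```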

6.
The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol’ sequences and Bucher’s design. Subsequently, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.

7.
Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology.
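A minimal sketch of one piece of this workflow: assumed effective event counts and observation times feed a conjugate gamma update of a plant-specific CCF rate, which is then turned into a basic-event probability for a given test interval via the standard standby-unavailability approximation. The numbers are placeholders, not the recommended values from the paper.

```python
# Gamma-conjugate update of a CCF rate from effective events/observation time,
# then conversion to a basic-event probability for a standby system.
alpha0, beta0 = 0.5, 1000.0      # assumed gamma prior on the CCF rate (events, hours)
n_eff, t_eff = 1.2, 45000.0      # assumed effective events / observation time from impact vectors

alpha_post = alpha0 + n_eff
beta_post = beta0 + t_eff
rate_mean = alpha_post / beta_post                 # posterior mean CCF rate [1/h]

test_interval = 4380.0                             # hours between tests (assumed)
p_basic_event = rate_mean * test_interval / 2.0    # lambda*T/2 standby-unavailability approximation
print(rate_mean, p_basic_event)
```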

8.
Uncertainties in common cause event observation, documentation and interpretation are taken into account by conditional probabilities and generalized impact vector weights that separate single and double events of a specific multiplicity in a single observation. Distributions and moments of common cause failure (CCF) rates of a system are obtained in terms of the weights by using probability generating functions, combining assessment uncertainties and statistical uncertainties. These results are then used to generate effective plant-specific input data to general empirical Bayes estimation methods to combine data from many plants. The posterior output yields CCF probabilities for standby safety system fault tree analysis or probabilistic safety assessments of a target plant.

9.
In this paper, an entirely new procedure for the classification of high-dimensional vectors on the basis of a few training samples is described. The proposed method is based on the Bayesian paradigm and provides posterior probabilities that a new vector belongs to each of the classes; it therefore adapts naturally to any number of classes. The classification technique is based on a small vector which can be viewed as a regression of the new observation onto the space spanned by the training samples, similar to the Support Vector Machine classification paradigm. This is achieved by employing matrix-variate distributions in classification, which is an entirely new idea.

10.
沈凌洁  王蔚 《声学技术》2018,37(2):167-174
A method for Mandarin tone recognition of short utterances is proposed that fuses prosodic features (fundamental frequency and duration) with Mel-frequency cepstral coefficient (MFCC) features, aiming to exploit the strengths of both feature types to improve tone recognition accuracy for short utterances. The fused feature consists of 7 prosodic features and statistical parameters obtained from different models and 4 log posterior probabilities computed from the MFCCs of each segment; Gaussian mixture models are used to represent the distributions of the cepstral features of the four tones. The experiments proceed in two steps. First, the prosody-based and cepstrum-based classifiers are combined at the decision stage for tone classification; each classifier is assigned a weight, and the weights of the cepstral and prosodic features in the tone classification task are computed. Second, the word-level prosodic features and frame-level cepstral features are combined into a fused-feature supervector, and Mandarin tones are recognized with the fused features; five classifiers (Gaussian mixture models in two configurations, a back-propagation neural network, a support vector machine, and a convolutional neural network (CNN)) are compared and evaluated on an imbalanced data set using three metrics: accuracy, unweighted average recall (UAR), and Cohen's kappa. The results show that: (1) the cepstral-feature method improves Mandarin tone recognition, with a weight of 0.11 in the overall classification task; (2) the deep-learning (CNN) method based on the fused features achieves the highest tone recognition rate, 87.6%, a 5.87% improvement over the Gaussian-mixture-model baseline system. The study demonstrates that cepstral features provide information complementary to prosodic features and thus improve short-utterance Mandarin tone recognition; the method can also be applied to related research such as prosody detection and paralinguistic information detection.
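A minimal sketch of the decision-stage fusion in the first step: the two classifiers' tone posteriors are mixed with the cepstral weight reported in the abstract (0.11). The posterior vectors themselves are made-up placeholders for the two classifiers' outputs.

```python
# Decision-level fusion of cepstral and prosodic tone classifiers.
import numpy as np

w_cepstral = 0.11
p_cepstral = np.array([0.10, 0.55, 0.25, 0.10])   # P(tone | MFCC features), placeholder
p_prosody  = np.array([0.05, 0.70, 0.15, 0.10])   # P(tone | F0/duration features), placeholder

p_fused = w_cepstral * p_cepstral + (1.0 - w_cepstral) * p_prosody
predicted_tone = int(np.argmax(p_fused)) + 1       # tones numbered 1-4
print(p_fused, predicted_tone)
```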

11.
The posterior probabilities of K given models when improper priors are used depend on the proportionality constants assigned to the prior densities corresponding to each of the models. It is shown that this assignment can be done using natural geometric priors in multiple regression problems if the normal distribution of the residual errors is truncated. This truncation is a realistic modification of the regression models, and since it will be made far away from the mean, it has no other effect beyond the determination of the proportionality constants, provided that the sample size is not too large. In the case K = 2, the posterior odds ratio is related to the usual F statistic in “classical” statistics. Assuming zero-one losses the optimal selection of a regression model is achieved by maximizing the posterior probability of a submodel. It is shown that the geometric criterion obtained in this way is asymptotically equivalent to Schwarz’s asymptotic Bayesian criterion, sometimes called the BIC criterion. An example of polynomial regression is used to provide numerical comparisons between the new geometric criterion, the BIC criterion and the Akaike information criterion. Villegas and Swartz were partially supported by grants from the Natural Sciences and Engineering Research Council of Canada.
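A small numerical sketch of the kind of polynomial-regression comparison mentioned above, using synthetic data and the BIC criterion only; the geometric criterion itself is not reproduced here.

```python
# BIC comparison of polynomial regression orders on synthetic data.
import numpy as np

rng = np.random.default_rng(3)
n = 60
x = np.linspace(-2, 2, n)
y = 1.0 + 0.5 * x - 0.8 * x**2 + rng.normal(0, 0.3, n)   # true model is quadratic

def bic(degree):
    X = np.vander(x, degree + 1)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = degree + 2                            # coefficients + error variance
    return n * np.log(rss / n) + k * np.log(n)

scores = {d: bic(d) for d in range(1, 6)}
print(scores, "selected degree:", min(scores, key=scores.get))
```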

12.
Objects described by stochastic differential equations and the associated control systems are considered under conditions in which the models generating these systems may be distorted. A solution to the problems of detecting and diagnosing distortions, which uses sequential decision rules and the maximum-posterior-probability criterion, is validated. As an application, an efficient algorithm for the fastest detection of a maneuver of a moving object, satisfying prescribed probabilities of type I and type II errors, is developed. The necessary programming system is established and positive results of computational experiments are obtained. Translated from Izmeritel'naya Tekhnika, No. 3, pp. 9–11, March, 1996.

13.
A Bayes Classification Method Based on Shape Information
This paper proposes a new Bayes classification method based on shape information for classifying individual objects in an image. The method first applies image edge extraction and registration algorithms to construct a shape-similarity energy functional, which is used to compute the prior probability of the shape information. This prior is then combined with the posterior probabilities of the object's other features, and classification is carried out with the Bayes rule. The method is applied to a practical problem of classifying pathogen images; the experimental results show that it is highly effective: it reduces the feature dimensionality required for classification, improves classification accuracy, and meets the computation-speed requirements of the practical problem.
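A minimal sketch of the combination step: a shape-similarity energy is converted into a prior over classes and multiplied by a class-conditional likelihood of the remaining features. The energy values, the exponential conversion and the Gaussian feature model are all assumed placeholders, not the paper's functional.

```python
# Bayes rule combining a shape-based prior with a feature likelihood.
import numpy as np

def shape_prior(energies, beta=1.0):
    """Turn shape-similarity energies (lower = more similar) into prior class probabilities."""
    w = np.exp(-beta * np.asarray(energies))
    return w / w.sum()

def feature_likelihood(x, means, sigma=1.0):
    """Class-conditional likelihood of the remaining object features (Gaussian placeholder)."""
    d2 = np.sum((np.asarray(means) - x) ** 2, axis=1)
    return np.exp(-0.5 * d2 / sigma**2)

energies = [0.8, 2.5, 1.6]                       # shape energy of the object vs. 3 class templates
x = np.array([1.1, 0.4])                         # other extracted features of the object
means = [[1.0, 0.5], [0.0, 2.0], [2.0, -1.0]]    # class feature prototypes

posterior = shape_prior(energies) * feature_likelihood(x, means)
posterior /= posterior.sum()
print(posterior, "class:", int(np.argmax(posterior)))
```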

14.
Bayesian analysis was performed to estimate an appropriate value of the uncertain propagation rate of cracks that can initiate at the wheelseat of a Shinkansen vehicle axle. In the analysis, the fatigue life distribution obtained by numerical simulation, which employed the crack propagation rate measured on small specimens, was used as the prior distribution. It was then updated with the results of fatigue tests on full-scale models as additional information to obtain the posterior distribution. The variance of the fatigue life distribution was reduced through the analysis. Using the crack propagation rate obtained from the posterior fatigue life distribution, the failure probabilities of the Shinkansen vehicle axle in operation, previously calculated with the crack propagation rate from the small-specimen experiments, were recalculated. The resulting failure probabilities were almost the same as the unmodified values, but slightly lower. Although the difference was not large, the updated failure probabilities are considered more trustworthy.
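A minimal sketch of the prior-to-posterior step: a grid-based Bayes update of a (log) fatigue-life parameter, with the simulation-based distribution as prior and a few full-scale test lives as data. All numbers are illustrative, not the axle data from the paper.

```python
# Grid-based Bayesian update of a mean log10 fatigue life.
import numpy as np

# Prior on the mean log10 fatigue life, from the small-specimen-based simulation (assumed).
mu_grid = np.linspace(6.0, 7.0, 401)
prior = np.exp(-0.5 * ((mu_grid - 6.5) / 0.15) ** 2)

# Full-scale model test results (log10 cycles to failure) as additional information (assumed).
log_lives = np.array([6.62, 6.58, 6.70])
sigma_test = 0.10                      # assumed scatter of the full-scale tests

loglik = np.array([-0.5 * np.sum(((log_lives - m) / sigma_test) ** 2) for m in mu_grid])
posterior = prior * np.exp(loglik - loglik.max())
posterior /= posterior.sum()

post_mean = np.sum(mu_grid * posterior)
post_sd = np.sqrt(np.sum((mu_grid - post_mean) ** 2 * posterior))
print(post_mean, post_sd)              # posterior is tighter than the prior, pulled toward the tests
```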

15.
The present work addresses the problem of structural damage identification built on the statistical inversion approach. Here, the damage state of the structure is continuously described by a cohesion parameter, which is spatially discretized by the finite element method. The inverse problem of damage identification is then posed as the determination of the posterior probability densities of the nodal cohesion parameters. The Markov Chain Monte Carlo method, implemented with the Metropolis–Hastings algorithm, is considered in order to approximate the posterior probabilities by drawing samples from the desired joint posterior probability density function. With this approach, prior information on the sought parameters can be used and the uncertainty concerning the known values of the material properties can be quantified in the estimation of the cohesion parameters. The assessment of the proposed approach has been performed by means of numerical simulations on a simply supported Euler–Bernoulli beam. The damage identification and assessment are performed considering time domain response data. Different damage scenarios and noise levels were addressed, demonstrating the feasibility of the proposed approach.
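A minimal random-walk Metropolis–Hastings sketch for a single cohesion-type parameter with a toy forward model; the beam finite element model, priors and data of the paper are not reproduced.

```python
# Random-walk Metropolis-Hastings for one damage parameter (toy forward model).
import numpy as np

rng = np.random.default_rng(4)

def model_response(theta):
    return 10.0 * theta            # stand-in for the finite element prediction

y_obs = model_response(0.7) + rng.normal(0, 0.2, size=20)   # noisy synthetic data
sigma = 0.2

def log_post(theta):
    if not 0.0 <= theta <= 1.0:    # uniform prior on [0, 1] for the cohesion parameter
        return -np.inf
    return -0.5 * np.sum((y_obs - model_response(theta)) ** 2) / sigma**2

theta, samples = 0.5, []
lp = log_post(theta)
for _ in range(20000):
    prop = theta + rng.normal(0, 0.05)           # symmetric random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # Metropolis acceptance step
        theta, lp = prop, lp_prop
    samples.append(theta)

samples = np.array(samples[5000:])               # discard burn-in
print(samples.mean(), samples.std())
```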

16.

The characteristics of human intuitive reasoning in estimating posterior probability can often be clarified through counterintuitive problems. A modified version of the “problem of three prisoners” (Shimojo & Ichikawa, 1989) is a very difficult Bayesian problem. Most subjects cannot produce the normative answer, and they do not intuitively accept it even after they understand the solution based on Bayes’ theorem. However, it has been pointed out that this problem is ambiguous about a conditional probability that is critical to its solution. In the present study, the subjects were first required to set a value for the ambiguous parameter freely and then solve the problem. This pre-manipulation did not improve their performance: most of them could not give the answer consistent with the parameter they had set for themselves. Moreover, an additional questionnaire revealed that many subjects held a crucial fallacy about the relation between prior and posterior probabilities. It is argued that the difficulty of the problem lies not in setting a value of the parameter but in the subjects’ erroneous beliefs about the nature of probability.
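A small worked computation that makes the ambiguous conditional probability explicit: q is the probability that the guard names B when A is the prisoner to be pardoned (the parameter the subjects were asked to set). The symmetric 1/3 priors of the standard version of the problem are assumed here.

```python
# Posterior that prisoner A is pardoned, given the guard says "B will not be pardoned".
def posterior_A(q, prior_A=1/3, prior_B=1/3, prior_C=1/3):
    # Likelihood of the guard naming B:
    #  - 0   if B is the pardoned prisoner,
    #  - 1   if C is the pardoned prisoner (B is the only one he may name),
    #  - q   if A is the pardoned prisoner (he may name either B or C).
    evidence = q * prior_A + 0.0 * prior_B + 1.0 * prior_C
    return q * prior_A / evidence

for q in (0.0, 0.5, 1.0):
    print(f"q = {q:.1f}  ->  P(A pardoned | guard says B) = {posterior_A(q):.3f}")
# q = 0.5 gives the familiar normative answer 1/3; the posterior ranges from 0 to 1/2 as q varies.
```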


17.
The calibration of discrete element method (DEM) simulations is typically accomplished in a trial-and-error manner. It generally lacks objectivity and is filled with uncertainties. To deal with these issues, the sequential quasi-Monte Carlo (SQMC) filter is employed as a novel approach to calibrating the DEM models of granular materials. Within the sequential Bayesian framework, the posterior probability density functions (PDFs) of micromechanical parameters, conditioned on the experimentally obtained stress–strain behavior of granular soils, are approximated by independent model trajectories. In this work, two different contact laws are employed in DEM simulations and a granular soil specimen is modeled as a polydisperse packing using various numbers of spherical grains. Knowing the evolution of physical states of the material, the proposed probabilistic calibration method can recursively update the posterior PDFs in a five-dimensional parameter space based on Bayes’ rule. Both the identified parameters and posterior PDFs are analyzed to understand the effect of grain configuration and loading conditions. Numerical predictions using parameter sets with the highest posterior probabilities agree well with the experimental results. The advantage of the SQMC filter lies in the estimation of posterior PDFs, from which the robustness of the selected contact laws, the uncertainties of the micromechanical parameters and their interactions are all analyzed. The micro–macro correlations, which are byproducts of the probabilistic calibration, are extracted to provide insights into the multiscale mechanics of dense granular materials.
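A heavily simplified sketch of the sequential reweighting idea behind such a filter: parameter "particles" are recursively weighted against a measured stress–strain curve. The DEM model is replaced by a toy constitutive stand-in, quasi-Monte Carlo sampling by plain random sampling, and no resampling step is included.

```python
# Sequential Bayesian reweighting of parameter particles against stress-strain data.
import numpy as np

rng = np.random.default_rng(5)

def toy_model(strain, k, n):
    return k * strain**n                      # stand-in for a DEM-predicted stress

strains = np.linspace(0.001, 0.01, 10)
stress_obs = toy_model(strains, 4000.0, 0.8) + rng.normal(0, 2.0, strains.size)

# Parameter "particles" (e.g. a contact stiffness k and an exponent n), initially equally weighted.
k_particles = rng.uniform(1000, 8000, 500)
n_particles = rng.uniform(0.5, 1.2, 500)
log_w = np.zeros(500)

sigma_obs = 2.0
for eps, s_obs in zip(strains, stress_obs):   # recursive weight update as loading proceeds
    pred = toy_model(eps, k_particles, n_particles)
    log_w += -0.5 * ((s_obs - pred) / sigma_obs) ** 2

w = np.exp(log_w - log_w.max())
w /= w.sum()
k_map = k_particles[np.argmax(w)]             # highest-posterior-probability stiffness
print("posterior mean k:", np.sum(w * k_particles), "MAP k:", k_map)
```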

18.
Multivariate quality characteristics are often monitored using a single statistic or a few statistics. However, it is difficult to determine the causes of an out-of-control signal based on a few summary statistics. Therefore, if a control chart for the mean detects a change in the mean, the quality engineer needs to determine which means shifted and the directions of the shifts to facilitate identification of root causes. We propose a Bayesian approach that gives a direct answer to this question. For each mean, an indicator variable that indicates whether the mean shifted upward, shifted downward, or remained unchanged is introduced. Prior distributions for the means and indicators capture prior knowledge about mean shifts and allow for asymmetry in upward and downward shifts. The mode of the posterior distribution of the vector of indicators or the mode of the marginal posterior distribution of each indicator gives the most likely scenario for each mean. Evaluation of the posterior probabilities of all possible values of the indicators is avoided by employing Gibbs sampling. This renders the computational cost more affordable for high-dimensional problems. This article has supplementary materials online.
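A minimal single-mean sketch of the indicator-variable idea: the indicator takes values -1 (shift down), 0 (no change), +1 (shift up), and its posterior is obtained here by direct enumeration with fixed shift sizes rather than the Gibbs sampler the article uses for the multivariate problem. All priors and data below are illustrative.

```python
# Posterior of a three-state shift indicator for a single mean (direct enumeration).
import numpy as np

rng = np.random.default_rng(6)
mu0, sigma = 10.0, 1.0
x = rng.normal(mu0 + 0.8, sigma, size=15)        # recent observations, true upward shift

prior = {-1: 0.10, 0: 0.80, +1: 0.10}            # asymmetry in up/down shifts could be encoded here
shift = {-1: -1.0, 0: 0.0, +1: 1.0}              # assumed shift magnitudes

def loglik(delta):
    return -0.5 * np.sum((x - (mu0 + delta)) ** 2) / sigma**2

log_post = {k: np.log(p) + loglik(shift[k]) for k, p in prior.items()}
m = max(log_post.values())
w = {k: np.exp(v - m) for k, v in log_post.items()}
z = sum(w.values())
posterior = {k: v / z for k, v in w.items()}
print(posterior)          # the mode indicates the most likely scenario for this mean
```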

19.
This paper uses mixture priors for Bayesian assessment of performance. In any Bayesian performance assessment, a prior distribution for performance parameter(s) is updated based on current performance information. The performance assessment is then based on the posterior distribution for the parameter(s). This paper uses a mixture prior, a mixture of conjugate distributions, which is itself conjugate and which is useful when performance may have changed recently. The present paper illustrates the process using simple models for reliability, involving parameters such as failure rates and demand failure probabilities. When few failures are observed the resulting posterior distributions tend to resemble the priors. However, when more failures are observed, the posteriors tend to change character in a rapid nonlinear way. This behavior is arguably appropriate for many applications. Choosing realistic parameters for the mixture prior is not simple, but even the crude methods given here lead to estimators that show qualitatively good behavior in examples.
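A minimal sketch of a conjugate mixture prior for a failure rate: a two-component gamma mixture ("performance as before" vs. "recently degraded") is updated with Poisson failure data, and the posterior mixture weights shift rapidly once several failures are observed. All numbers are illustrative placeholders.

```python
# Conjugate gamma-mixture prior for a failure rate, updated with Poisson counts.
from math import lgamma, exp, log

components = [
    # (weight, alpha, beta): rate prior mean = alpha / beta per hour
    (0.9, 2.0, 2000.0),    # historical rate ~ 1e-3 per hour
    (0.1, 2.0,  200.0),    # degraded rate ~ 1e-2 per hour
]

def update(components, n_fail, t_obs):
    """Posterior mixture after observing n_fail failures in t_obs hours."""
    new = []
    for w, a, b in components:
        # log marginal likelihood of the data under this component (common factors dropped)
        log_ml = (lgamma(a + n_fail) - lgamma(a)
                  + a * log(b) - (a + n_fail) * log(b + t_obs))
        new.append((log(w) + log_ml, a + n_fail, b + t_obs))
    m = max(lw for lw, _, _ in new)
    z = sum(exp(lw - m) for lw, _, _ in new)
    return [(exp(lw - m) / z, a, b) for lw, a, b in new]

for n_fail in (0, 1, 3, 6):
    post = update(components, n_fail, 1000.0)
    mean_rate = sum(w * a / b for w, a, b in post)
    print(n_fail, [round(w, 3) for w, _, _ in post], f"posterior mean rate {mean_rate:.2e}")
```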

20.
Chen-Pin Wang  Malay Ghosh 《TEST》2007,16(1):145-171
This paper presents a Bayesian diagnostic procedure for examining the change-point assumption in the competing risks model framework. It considers the family of distributions arising from the cause-specific model of Chiang (Introduction to stochastic processes in biostatistics. Wiley, New York, 1968), to which change-points are added to accommodate possible distributional heterogeneity. Model departure, due to misspecification of change-points associated with either the overall survival distribution or the cause-specific probabilities, is quantified in terms of a sequence of cumulative-sum statistics between each pair of adjacent assumed change-points. Assessing the asymptotic behavior of each sequence of cumulative-sum statistics using its posterior predictive p-values (Rubin, Ann Stat 12:1151–1172, 1984) and partial posterior predictive p-values (Bayarri and Berger, J Am Stat Assoc 95:1127–1142, 2000), we show that both types of p-values attain their greatest departure from 0.5 at the change-point that is missed in the assumed model, from which a diagnostic procedure is formalized. The statistical power of these two types of p-values is discussed.
