Similar Documents
20 similar documents found (search time: 31 ms)
1.
In Bayesian signal processing, all the information about the unknowns of interest is contained in their posterior distributions. The unknowns can be parameters of a model, or a model and its parameters. In many important problems, these distributions are impossible to obtain in analytical form. An alternative is to generate approximations of them by Monte Carlo-based methods such as Markov chain Monte Carlo (MCMC) sampling, adaptive importance sampling (AIS), or particle filtering (PF). While MCMC sampling and PF have received considerable attention in the literature and are reasonably well understood, the AIS methodology remains relatively unexplored. This article reviews the basics of AIS and provides a comprehensive survey of the state of the art on the topic. Some of its most relevant implementations are revisited and compared through computer simulation examples.
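The core AIS loop, drawing from a proposal, computing self-normalized importance weights, and adapting the proposal by moment matching, can be sketched in a few lines. This is an illustrative toy example under our own assumptions (a Gaussian target and proposal, and a simple moment-matching adaptation rule), not code from the article:

```python
import math
import random

def target_logpdf(x):
    # Unnormalized log-density of a toy target: N(3, 1) up to a constant.
    return -0.5 * (x - 3.0) ** 2

def adaptive_importance_sampling(iters=20, n=500, seed=1):
    rng = random.Random(seed)
    mu, sigma = 0.0, 5.0  # deliberately poor initial Gaussian proposal
    for _ in range(iters):
        xs = [rng.gauss(mu, sigma) for _ in range(n)]
        # Log importance weights: log p(x) - log q(x), constants dropped.
        logw = [target_logpdf(x)
                + 0.5 * ((x - mu) / sigma) ** 2 + math.log(sigma)
                for x in xs]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        total = sum(w)
        w = [wi / total for wi in w]  # self-normalized weights
        # Adapt the proposal by matching the weighted sample moments.
        mu = sum(wi * x for wi, x in zip(w, xs))
        var = sum(wi * (x - mu) ** 2 for wi, x in zip(w, xs))
        sigma = max(math.sqrt(var), 1e-3)
    return mu, sigma

mu_hat, sigma_hat = adaptive_importance_sampling()
```

Starting from a deliberately mismatched proposal N(0, 5²), the adapted proposal moves toward the target N(3, 1) within a few iterations; more sophisticated AIS schemes differ mainly in how the weights are computed across iterations and how the proposal family is adapted.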

2.
This paper applies the transferable belief model (TBM) interpretation of the Dempster-Shafer theory of evidence to approximate the distribution of a circuit performance function for parametric yield estimation. Treating the input parameters of the performance function as credal variables defined on a continuous frame of real numbers, the suggested approach constructs a random-set-type evidence for these parameters. The corresponding random set of the function output is obtained by the extension principle of random sets. Within the TBM framework, the random set of the function output in the credal state can be transformed to a pignistic state, where it is represented by the pignistic cumulative distribution. As an approximation to the actual cumulative distribution, it can be used to estimate yield according to circuit response specifications. The advantage of the proposed method over Monte Carlo (MC) methods lies in its ability to run the simulation process only once to obtain an approximate value of the yield with a deterministic estimation error. For the same error, the new method requires fewer calculations than MC methods. Two examples, a high-speed railway track circuit and an eight-dimensional quadratic function, are included to demonstrate the efficiency of this technique.

3.
Approximate Bayes estimators applied to the inverse Gaussian lifetime model
In this paper, Bayesian estimates of the two (unknown) parameters and the reliability function of the inverse Gaussian distribution are obtained using the approximation forms of Lindley [1] and Tierney and Kadane [2]. Based on a Monte Carlo simulation, the Bayes estimates using Tierney and Kadane's approximation form are the best, compared with Lindley's form and the maximum likelihood method.

4.
Monte Carlo simulation in conjunction with Fourier transform based spectral windowing is used to model the live load on bridges. Vehicles are classified into a few groups, and the probability distributions of axle weight and length associated with each group are estimated. The vehicle arriving at an instant is determined through Monte Carlo simulation, which uses a vehicle group density function derived from measurement data on the relative contribution of each group to total vehicles. The weight and length of the arriving vehicle are also simulated by Monte Carlo using the distribution function for the corresponding group. Vehicle arrivals are modeled by the Poisson distribution. The vehicle velocities are realized through spectral simulation based on decaying power spectra of the velocity time series. The simulations are performed over a sufficient time interval in several lanes, and the ensemble sampling of load is thus obtained. Fourier transform based windowing is used to characterize the power spectra of the mechanical load on the bridge. The study shows the white-noise nature of the load spectral density, which is in agreement with the assumptions of previous investigators. A parametric sensitivity analysis of the spectra is also performed, and recommendations are made to include site-specific parameters in the model. Finally, applications are illustrated for frequency domain random vibration analysis of a simple model of bridge structures.
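The arrival-and-load part of such a model can be sketched compactly: Poisson arrivals via exponential inter-arrival times, a Monte Carlo draw of the vehicle group, then a draw of the vehicle weight from that group's distribution. The group frequencies and weight parameters below are invented for illustration, not the paper's measured data:

```python
import random

# Hypothetical vehicle groups with relative frequencies and Gaussian
# weight parameters in kN (illustrative values only).
GROUPS = {
    "car":   {"freq": 0.70, "mean_kn": 15.0,  "sd_kn": 3.0},
    "truck": {"freq": 0.25, "mean_kn": 120.0, "sd_kn": 30.0},
    "heavy": {"freq": 0.05, "mean_kn": 300.0, "sd_kn": 60.0},
}

def simulate_lane_load(rate_per_s=0.1, horizon_s=3600.0, seed=7):
    """Poisson arrivals; each arrival draws a group, then a weight."""
    rng = random.Random(seed)
    names = list(GROUPS)
    freqs = [GROUPS[g]["freq"] for g in names]
    t, loads = 0.0, []
    while True:
        t += rng.expovariate(rate_per_s)  # exponential inter-arrival time
        if t > horizon_s:
            break
        g = rng.choices(names, weights=freqs)[0]
        w = max(0.0, rng.gauss(GROUPS[g]["mean_kn"], GROUPS[g]["sd_kn"]))
        loads.append((t, g, w))
    return loads

loads = simulate_lane_load()
```

Repeating this over several lanes and many realizations yields the ensemble of load histories whose power spectra the paper then characterizes by Fourier windowing.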

5.
In reliability analysis, accelerated life-testing allows for gradual increment of stress levels on test units during an experiment. In a special class of accelerated life tests known as step-stress tests, the stress levels increase discretely at pre-fixed time points, which allows the experimenter to obtain information on the parameters of the lifetime distributions more quickly than under normal operating conditions. Moreover, when a test unit fails, there is often more than one fatal cause of failure, such as mechanical or electrical. In this article, we consider the simple step-stress model under time constraint when the lifetime distributions of the different risk factors are independently exponentially distributed. Under this setup, we derive the maximum likelihood estimators (MLEs) of the unknown mean parameters of the different causes under the assumption of a cumulative exposure model. Since the MLEs do not exist when there is no failure by any particular risk factor within the specified time frame, the exact sampling distributions of the MLEs are derived through the use of conditional moment generating functions. Using these exact distributions as well as the asymptotic distributions, the parametric bootstrap method, and the Bayesian posterior distribution, we discuss the construction of confidence intervals and credible intervals for the parameters. Their performance is assessed through Monte Carlo simulations, and finally we illustrate the methods of inference discussed here with an example.

6.
This paper proposes optimal quadrature rules over the hemisphere for the shading integral. We leverage recent work on the theory of quadrature rules over the sphere to derive a new theoretical framework for the general case of hemispherical quadrature error analysis. We then apply our framework to the case of the shading integral. We show that our quadrature error theory can be used to derive optimal sample weights (OSW) which account for both the features of the sampling pattern and the bidirectional reflectance distribution function (BRDF). Our method significantly outperforms familiar quasi-Monte Carlo (QMC) and stochastic Monte Carlo techniques. Our results show that the OSW are very effective in compensating for possible irregularities in the sample distribution. This makes it possible, for example, to significantly exceed the regular convergence rate of stochastic Monte Carlo while keeping the exact same sample sets. Another important benefit of our method is that OSW can be applied regardless of the sample point distribution: the sample distribution need not follow a probability density function, which makes our technique much more flexible than QMC or stochastic Monte Carlo solutions. In particular, our theoretical framework makes it easy to combine point sets derived from different sampling strategies (e.g. targeted to diffuse and glossy BRDFs). In this context, our rendering results show that our approach outperforms multiple importance sampling (MIS) techniques.

7.
In estimating the effect of a change in a random-variable parameter on the (time-invariant) probability of structural failure estimated through Monte Carlo methods, the usual approach is to carry out a duplicate simulation run for each parameter being varied. The associated computational cost may become prohibitive when many random variables are involved. Herein a procedure is proposed in which the numerical results from a Monte Carlo reliability estimation procedure are converted to a form that allows the basic ideas of the first-order reliability method to be employed. These in turn allow sensitivity estimates to be made at low computational cost. Illustrative examples, with sensitivities computed both by conventional Monte Carlo and by the proposed procedure, show good agreement over a range of probability distributions for the input random variables and for various complexities of the limit state function.

8.
Because more-electric aircraft have complex topologies and low failure probabilities, traditional Monte Carlo sampling for reliability evaluation suffers from large sample counts and long simulation times. By introducing an approximate probability distribution for multi-component systems through information entropy, this paper proposes a reliability evaluation method suited to more-electric aircraft power supply systems. An optimal parameter is introduced into the information entropy, which modifies the component failure probability distributions and constructs an approximation to each component's zero-variance probability density function; differential evolution is then used to solve for the optimal parameter. Combining the information-entropy method with antithetic sampling further reduces the variance of the sampling process and improves the sampling efficiency of the traditional Monte Carlo method. Finally, a more-electric aircraft power system is used as an application case to analyze the convergence and accuracy of several reliability analysis methods; the results show that the proposed method has clear advantages for rare-event evaluation problems.

9.
The Burr type III distribution covers a wider region of the skewness-kurtosis plane, encompassing several distributions including the log-logistic, Weibull, and Burr type XII distributions. However, outliers may occur in the data set. Robust regression methods, such as an M-estimator with a symmetric influence function, have been used successfully to diminish the effect of outliers on statistical inference. However, when the data distribution is asymmetric, these methods yield biased estimators. We present an M-estimator with an asymmetric influence function (AM-estimator), based on the quantile function of the Burr type III distribution, to estimate the parameters for complete data with outliers. The simulation results show that the AM-estimator generally outperforms the maximum likelihood and traditional M-estimator methods in terms of bias and root mean square error. A real example is used to demonstrate the performance of our proposed method.

10.
Omnibus procedures for testing serial correlation are developed, using spectral density estimation and wavelet shrinkage. The asymptotic distributions of the wavelet coefficients under the null hypothesis of no serial correlation are derived. Under some general conditions on the wavelet basis, the wavelet coefficients asymptotically follow a normal distribution. Furthermore, they are asymptotically uncorrelated. Adopting a spectral approach and using results on wavelet shrinkage, new one-sided test statistics are proposed. As a spatially adaptive estimation method, wavelets can effectively detect fine features in the spectral density, such as sharp peaks and high frequency alternations. Using an appropriate thresholding parameter, shrinkage rules are applied to the empirical wavelet coefficients, resulting in a non-linear wavelet-based spectral density estimator. Consequently, the advocated approach avoids the need to select the finest scale J, since the noise in the wavelet coefficients is naturally suppressed. Simple data-dependent threshold parameters are also considered. In general, the convergence of the spectral test statistics toward their respective asymptotic distributions appears to be relatively slow. In view of that, Monte Carlo methods are investigated. In a small simulation study, several spectral test statistics are compared, with respect to level and power, including versions of these test statistics using Monte Carlo simulations.

11.
An efficient Monte Carlo method for random sample generation from high dimensional distributions of complex structures is developed. The method is based on random discretization of the sample space and direct inversion of the discretized cumulative distribution function. It requires only the knowledge of the target density function up to a multiplicative constant and applies to standard distributions as well as high-dimensional distributions arising from real data applications. Numerical examples and real data applications are used for illustration. The algorithms are implemented in statistical software R and a package dsample has been developed and is available online.
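In one dimension, the idea of random discretization followed by direct inversion of the discretized CDF can be sketched as follows. This is our own illustrative re-implementation under simplifying assumptions (uniform random grid, 1-D support), not the authors' R package dsample:

```python
import bisect
import math
import random

def dsample_sketch(logf, lo, hi, n_grid=10000, n_draw=1000, seed=3):
    """Sample from a density known only up to a multiplicative constant
    by discretizing the support at random points and inverting the
    resulting discrete CDF (1-D sketch of the idea)."""
    rng = random.Random(seed)
    grid = sorted(rng.uniform(lo, hi) for _ in range(n_grid))
    w = [math.exp(logf(x)) for x in grid]  # unnormalized density values
    cum, s = [], 0.0
    for wi in w:
        s += wi
        cum.append(s)
    cum = [c / s for c in cum]             # discretized, normalized CDF
    # Direct inversion: binary-search each uniform draw into the CDF.
    return [grid[bisect.bisect_left(cum, rng.random())]
            for _ in range(n_draw)]

# Draws from an unnormalized standard normal density:
xs = dsample_sketch(lambda x: -0.5 * x * x, -6.0, 6.0)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
```

Because the grid points are uniform on the support, weighting each point by the unnormalized density and inverting the cumulative weights yields approximate draws from the target; the normalizing constant cancels in the normalization of `cum`.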

12.
Maximum likelihood and Bayes estimates for the two parameters and the reliability function of the Burr type XII distribution are obtained based on progressive Type II censored samples. An approximation based on the Laplace approximation method developed by Tierney and Kadane [1] and a bivariate prior density for the two unknown parameters, suggested by Al-Hussaini and Jaheen [2], are used to obtain the Bayes estimates. These estimates are compared via a Monte Carlo simulation study.

13.
An investigation of the applicability of neural network-based methods in predicting the values of multiple parameters, given the value of a single parameter within a particular problem domain, is presented. In this context, the input parameter may be an important source of variation that is related through a complex mapping function to the remaining sources of variation within a multivariate distribution. Defining the relationship between the variables of a multivariate distribution and a single source of variation allows the estimation of the values of multiple variables given the value of the single variable, thereby addressing an ill-conditioned one-to-many mapping problem. As part of our investigation, two problem domains are considered: predicting the values of individual stock shares, given the value of the general index, and predicting the grades received by high school pupils, given the grade for a single course or the average grade. In our work, we compare the performance of standard neural network-based methods, in particular multilayer perceptrons (MLPs), radial basis functions (RBFs), mixture density networks (MDNs), and a latent variable method, the generative topographic mapping (GTM). According to the results, MLPs and RBFs outperform MDNs and the GTM on these one-to-many mapping problems.

14.
A new method of data augmentation for binary and multinomial logit models is described. First, the latent utilities are introduced as auxiliary latent variables, leading to a latent model which is linear in the unknown parameters, but involves errors from the type I extreme value distribution. Second, for each error term the density of this distribution is approximated by a mixture of normal distributions, and the component indicators in these mixtures are introduced as further latent variables. This leads to Markov chain Monte Carlo estimation based on a convenient auxiliary mixture sampler that draws from standard distributions like normal or exponential distributions and, in contrast to more common Metropolis-Hastings approaches, does not require any tuning. It is shown how the auxiliary mixture sampler is implemented for binary or multinomial logit models, and it is demonstrated how to extend the sampler to mixed effect models and time-varying parameter models for binary and categorical data. Finally, an application to Austrian labor market data is discussed.

15.
This paper presents a framework for state estimation which tolerates uncertainty in observation model parameters by (1) incorporating this uncertainty in state observation, and (2) correcting model parameters to improve future state observations. The first objective is met by an uncertainty propagation approach, while the second is achieved by gradient-descent optimization. The novel framework allows state estimates to be represented by non-Gaussian probability distribution functions. By correcting observation model parameters, estimation performance is enhanced since the accuracy of observations is increased. Monte Carlo simulation experiments validate the efficacy of the proposed approach in comparison with conventional estimation techniques, showing that as model parameters converge to ground-truth over time, state estimation correspondingly improves when compared to a static model estimate. Because observation models cannot be known with perfect accuracy and existing approaches do not address parametric uncertainties in non-Gaussian estimation, this work has both novelty and usefulness in most state estimation contexts.

16.
A validated simulation model primarily requires performing an appropriate input analysis, mainly by determining the behavior of real-world processes using probability distributions. In many practical cases, probability distributions of the random inputs vary over time in such a way that the functional forms of the distributions and/or their parameters depend on time. This paper answers the question of whether a sequence of observations from a process follows the same statistical distribution, and if not, where the exact change points are, so that observations between two consecutive change points follow the same distribution. We propose two different methods, based on the likelihood ratio test and on cluster analysis, to detect multiple change points when observations follow a non-stationary Poisson process with diverse occurrence rates over time. Results from a comprehensive Monte Carlo study indicate satisfactory performance for the proposed methods. A well-known example is also considered to show the application of our findings in real-world cases.
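A stripped-down version of the likelihood-ratio idea, restricted to a single change point in Poisson counts, might look like the sketch below (hypothetical data and a simplified exhaustive search; the paper itself handles multiple change points and formal testing):

```python
import math
import random

def rpois(rng, lam):
    # Knuth's algorithm for Poisson draws; fine for small rates.
    L = math.exp(-lam)
    p, k = 1.0, 0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def poisson_loglik(counts, rate):
    # Log-likelihood up to terms that do not depend on the split.
    return sum(-rate + c * math.log(rate) for c in counts)

def best_changepoint(counts):
    """Maximize the split log-likelihood over all candidate change points,
    fitting a separate MLE rate to each segment."""
    n = len(counts)
    best_k, best_ll = None, -math.inf
    for k in range(2, n - 1):
        r1 = sum(counts[:k]) / k
        r2 = sum(counts[k:]) / (n - k)
        if r1 == 0 or r2 == 0:
            continue
        ll = poisson_loglik(counts[:k], r1) + poisson_loglik(counts[k:], r2)
        if ll > best_ll:
            best_ll, best_k = ll, k
    return best_k

# Synthetic counts: rate 3 for 60 periods, then rate 8 for 60 periods.
rng = random.Random(2)
counts = ([rpois(rng, 3.0) for _ in range(60)]
          + [rpois(rng, 8.0) for _ in range(60)])
k_hat = best_changepoint(counts)
```

Comparing the maximized split likelihood against the no-change fit gives the likelihood-ratio statistic; applying the search recursively to each segment is one simple route to the multiple-change-point setting the paper addresses.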

17.
The Bayesian method is widely used to identify a joint distribution, which is modeled by marginal distributions and a copula. The joint distribution can be identified by a one-step procedure, which directly tests all candidate joint distributions, or by a two-step procedure, which first identifies the marginal distributions and then the copula. The weight-based Bayesian method using the two-step procedure and the Markov chain Monte Carlo (MCMC)-based Bayesian method using the one-step and two-step procedures were recently developed. In this paper, a one-step weight-based Bayesian method and a two-step MCMC-based Bayesian method using parametric marginal distributions are proposed. Comparison studies among the Bayesian methods have not been thoroughly carried out. In this paper, the weight-based and MCMC-based Bayesian methods using one-step and two-step procedures are compared through simulation studies to see which Bayesian method identifies the correct joint distribution most accurately and efficiently. It is validated that the two-step weight-based Bayesian method has the best performance.

18.
Positron emission tomography is an effective means of physiological functional imaging, but the high noise level in the projection data makes reconstruction difficult. To improve reconstruction quality, a fully Bayesian reconstruction is proposed that uses a segmentation-template prior derived from other high-quality anatomical imaging results. The segmentation-template prior can be expressed as a Markov random field containing hyperparameters, but its non-convexity and the presence of the hyperparameters make the maximum a posteriori estimate unobtainable by conventional methods. A dynamic posterior simulation algorithm is therefore adopted to compute the posterior mean estimate. Based on the full conditional distributions, the dynamic posterior simulation updates the pixel intensities and the hyperparameters simultaneously, and the variances and confidence intervals of the reconstruction are easily obtained. Compared with maximum likelihood and maximum a posteriori estimation, the reconstruction achieves good results in both spatial resolution and noise suppression.

19.
We solve the light transport problem by introducing a novel unbiased Monte Carlo algorithm called replica exchange light transport, inspired by the replica exchange Monte Carlo method from the fields of computational physics and statistical information processing. The replica exchange Monte Carlo method is a sampling technique whose operation resembles simulated annealing in optimization algorithms, using a set of sampling distributions. We apply it to the solution of light transport integration by extending the probability density function of an integrand of the integration to a set of distributions. That set of distributions is composed of combinations of the path densities of different path generation types: uniform distributions in the integral domain, explicit and implicit paths in light (particle/photon) tracing, indirect paths in bidirectional path tracing, explicit and implicit paths in path tracing, and implicit caustics paths seen through specular surfaces including the delta function in path tracing. The replica-exchange light transport algorithm generates a sequence of path samples from each distribution and samples the simultaneous distribution of those distributions as a stationary distribution using the Markov chain Monte Carlo method. The algorithm then combines the obtained path samples from each distribution using multiple importance sampling. We compare the images generated with our algorithm to those generated with bidirectional path tracing and Metropolis light transport based on the primary sample space. The proposed algorithm has better convergence properties than bidirectional path tracing and Metropolis light transport, and it is easy to implement by extending Metropolis light transport.
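The replica exchange mechanism itself is compact. The sketch below is a minimal parallel-tempering example on a toy bimodal one-dimensional target of our own choosing, unrelated to light transport, showing the two ingredients the abstract relies on: per-temperature Metropolis updates and occasional state swaps between adjacent temperatures:

```python
import math
import random

def replica_exchange(logp, temps, steps=20000, seed=5):
    """Minimal replica-exchange (parallel tempering) sketch: one
    Metropolis chain per temperature, with periodic state swaps."""
    rng = random.Random(seed)
    xs = [0.0] * len(temps)
    cold = []
    for step in range(steps):
        # One Metropolis update per temperature (tempered target p^(1/T)).
        for i, T in enumerate(temps):
            prop = xs[i] + rng.gauss(0.0, 1.0)
            if math.log(rng.random()) < (logp(prop) - logp(xs[i])) / T:
                xs[i] = prop
        # Periodically attempt a swap between adjacent temperatures.
        if step % 10 == 0:
            i = rng.randrange(len(temps) - 1)
            d = (logp(xs[i]) - logp(xs[i + 1])) \
                * (1.0 / temps[i] - 1.0 / temps[i + 1])
            if math.log(rng.random()) < -d:
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
        cold.append(xs[0])  # keep only the unit-temperature chain
    return cold

# Toy bimodal target with well-separated modes near -4 and +4.
logp = lambda x: -0.5 * min((x - 4.0) ** 2, (x + 4.0) ** 2)
draws = replica_exchange(logp, temps=[1.0, 4.0, 16.0])
```

The hot chains see a flattened landscape and cross between modes freely; swaps transport those crossings down to the cold chain, which a single Metropolis chain at unit temperature would traverse only rarely. In the light transport setting, the role of the temperature ladder is played by the set of path-density distributions.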

20.
Decision tree models may be more realistic if branching probabilities (and possibly utilities) are represented as distributions rather than point estimates. However, numerical analysis of such "probabilistic" trees is more difficult. This study employed the Mathematica computer algebra system to implement and verify previously described probabilistic methods. Both algebraic approximations and Monte Carlo simulation methods were used; in particular, simulations with beta, logistic-normal, and triangular distributions for branching probabilities were compared. Algebraic and simulation methods of sensitivity analysis were also implemented and compared. Computation required minimal programming and was reasonably fast using Mathematica on a standard personal computer. This study verified previously published results, including methods of sensitivity analysis. Changing the input distributional form had little effect. Computation is no longer a significant barrier to the use of probabilistic methods for analysis of decision trees.
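The Monte Carlo part of such an analysis is straightforward to reproduce outside Mathematica. The sketch below evaluates a hypothetical two-branch treat-versus-wait tree in which the branching probabilities are Beta-distributed rather than fixed (all Beta parameters and utilities are invented for illustration):

```python
import random

def mc_decision_tree(n=20000, seed=11):
    """Monte Carlo evaluation of a two-branch decision tree whose
    branching probabilities are Beta-distributed (hypothetical numbers)."""
    rng = random.Random(seed)
    u_cure, u_no_cure = 1.0, 0.2  # point-estimate utilities
    ev_treat = ev_wait = 0.0
    for _ in range(n):
        # Draw the uncertain branch probabilities from Beta distributions.
        p_treat = rng.betavariate(80, 20)  # cure probability if treated
        p_wait = rng.betavariate(50, 50)   # cure probability if waiting
        ev_treat += p_treat * u_cure + (1 - p_treat) * u_no_cure
        ev_wait += p_wait * u_cure + (1 - p_wait) * u_no_cure
    return ev_treat / n, ev_wait / n

ev_treat, ev_wait = mc_decision_tree()
```

Beyond the two expected values, keeping the per-replication results would give the full distribution of each strategy's value, which is what distinguishes the probabilistic tree from a point-estimate rollback.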


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号