Found 20 similar documents; search took 31 ms.
1.
J. Peters 《Nuclear instruments & methods in physics research. Section A, Accelerators, spectrometers, detectors and associated equipment》2005,540(2-3):419-429
An analytical calculation of the transmission probability of neutrons travelling through a revolving slit is presented. For the first time, two effects are taken into account in the same approach: that the neutron beam may be divergent, and that neutrons from a continuous source can arrive at the chopper at different times. Furthermore, the neutron distribution at a given distance behind the chopper has been calculated, and these theoretical results have been compared with simulated data obtained with the VITESS simulation program. The theoretical and simulated curves are in good agreement.
2.
The environment in which a population evolves can have a crucial impact on selection. We study evolutionary dynamics in finite populations of fixed size in a changing environment. The population dynamics are driven by birth and death events. The rates of these events may vary in time depending on the state of the environment, which follows an independent Markov process. We develop a general theory for the fixation probability of a mutant in a population of wild-types, and for mean unconditional and conditional fixation times. We apply our theory to evolutionary games for which the payoff structure varies in time. The mutant can exploit the environmental noise; a dynamic environment that switches between two states can lead to a probability of fixation that is higher than in any of the individual environmental states. We provide an intuitive interpretation of this surprising effect. We also investigate stationary distributions when mutations are present in the dynamics. In this regime, we find two approximations of the stationary measure. One works well for rapid switching, the other for slowly fluctuating environments.
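As an illustration of the kind of model entry 2 studies, the following is a minimal Monte Carlo sketch of a Moran birth-death process in which the mutant's relative fitness depends on a two-state environment that switches at random. The population size, fitness pair, and switching probability are made-up illustrative values, not parameters from the paper, and the per-event switching rule is only a crude discretisation of the independent Markov environment described in the abstract.

```python
import random

def fixation_probability(N=12, r_env=(2.0, 0.5), switch=0.1,
                         runs=1000, seed=1):
    """Estimate the fixation probability of a single mutant in a Moran
    process whose relative fitness r depends on a two-state environment
    flipping with probability `switch` per birth-death event.
    All parameter values are illustrative, not taken from the paper."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(runs):
        i, env = 1, 0                       # one mutant; environment state 0
        while 0 < i < N:
            if rng.random() < switch:       # environment flips (Markov chain)
                env = 1 - env
            r = r_env[env]                  # mutant fitness in current state
            p_birth_mut = r * i / (r * i + (N - i))
            if rng.random() < p_birth_mut:  # a mutant reproduces...
                if rng.random() >= i / N:   # ...and replaces a wild-type
                    i += 1
            else:                           # a wild-type reproduces...
                if rng.random() < i / N:    # ...and replaces a mutant
                    i -= 1
        fixed += (i == N)
    return fixed / runs
```

Because the estimate is a run average, comparing it across environment settings (e.g. freezing `switch=0` in either state) is the natural way to look for the paper's amplification effect.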
3.
Ruoyu Su Xiaojun Sun Fei Ding Dengyin Zhang Hongbo Zhu M. I. M. Wahab 《Computers, Materials & Continua》2020,62(3):1387-1398
Wireless communications face several security issues in practice owing to the broadcast nature of the medium. Information theory is well known to provide efficient approaches to addressing these security issues, and has attracted much attention in both industry and academia in recent years. In this paper, inspired by information theory, we study the outage probability of opportunistic relay selection based on cognitive decode-and-forward relaying under a secrecy constraint. Specifically, a closed-form expression for the outage probability is derived. Moreover, the asymptotic performance based on the analytical results is investigated. Simulation results show that relay selection reduces the outage probability, in accordance with our theoretical analysis.
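The outage behaviour of opportunistic relay selection can be illustrated with a much-simplified Monte Carlo sketch: K relays over i.i.d. Rayleigh fading (so each instantaneous SNR is exponential), the relay with the largest SNR selected, and no secrecy constraint. The closed-form expression below is the textbook best-of-K result for this toy setting, not the cognitive decode-and-forward expression derived in the paper.

```python
import math
import random

def outage_best_relay(K, snr_avg, snr_th, trials=100_000, seed=11):
    """Monte Carlo outage probability when the scheme opportunistically
    selects the relay with the largest instantaneous SNR.  Rayleigh
    fading makes each relay's SNR exponential with mean `snr_avg`; this
    is a simplified single-hop stand-in for the paper's model."""
    rng = random.Random(seed)
    out = sum(max(rng.expovariate(1 / snr_avg) for _ in range(K)) < snr_th
              for _ in range(trials))
    return out / trials

def outage_closed_form(K, snr_avg, snr_th):
    """The best of K i.i.d. exponential SNRs falls below the threshold
    only if every relay does: P_out = (1 - exp(-snr_th/snr_avg))**K."""
    return (1 - math.exp(-snr_th / snr_avg)) ** K
```

The simulated and closed-form curves should agree to within Monte Carlo error, mirroring the validation methodology the abstract describes.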
4.
Evolutionary dynamics on graphs can lead to many interesting and counterintuitive findings. We study the Moran process, a discrete time birth–death process, that describes the invasion of a mutant type into a population of wild-type individuals. Remarkably, the fixation probability of a single mutant is the same on all regular networks. But non-regular networks can increase or decrease the fixation probability. While the time until fixation formally depends on the same transition probabilities as the fixation probabilities, there is no obvious relation between them. For example, an amplifier of selection, which increases the fixation probability and thus decreases the number of mutations needed until one of them is successful, can at the same time slow down the process of fixation. Based on small networks, we show analytically that (i) the time to fixation can decrease when links are removed from the network and (ii) the node providing the best starting conditions in terms of the shortest fixation time depends on the fitness of the mutant. Our results are obtained analytically on small networks, but numerical simulations show that they are qualitatively valid even in much larger populations.
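The graph Moran process that entry 4 analyses can be simulated directly: a node reproduces with probability proportional to its fitness, and its offspring replaces a uniformly chosen neighbour. The sketch below returns both the fixation probability and the mean conditional fixation time, the two quantities the abstract contrasts; the 4-node cycle and fitness value are illustrative, not the networks studied in the paper.

```python
import random

def moran_on_graph(adj, r=2.0, start=0, runs=3000, seed=7):
    """Simulate the Moran birth-death process on a graph given as an
    adjacency dict.  Returns (fixation probability, mean conditional
    fixation time in birth-death events).  Parameters are illustrative."""
    rng = random.Random(seed)
    n = len(adj)
    fixed, times = 0, []
    for _ in range(runs):
        mutants, t = {start}, 0
        while 0 < len(mutants) < n:
            weights = [r if v in mutants else 1.0 for v in range(n)]
            v = rng.choices(range(n), weights=weights)[0]  # reproducer
            u = rng.choice(adj[v])                         # replaced neighbour
            if v in mutants:
                mutants.add(u)
            else:
                mutants.discard(u)
            t += 1
        if len(mutants) == n:
            fixed += 1
            times.append(t)
    return fixed / runs, sum(times) / len(times)

# a 4-node cycle: a small regular network, so the fixation probability
# matches the well-mixed Moran value
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
```

Removing an edge from `adj` and re-running gives a direct, if noisy, way to probe the paper's finding that link removal can shorten fixation.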
5.
Mohammad M. Hamasha 《Quality Engineering》2017,29(2):322-328
The left-sided truncated normal distribution is especially important in quality engineering, where the left side of the distribution is removed, for example, when estimating the life of used products. Although the theoretical background of the truncated normal distribution is already established, there is very little work on mathematical approximation of its probability density and cumulative probability density functions. In this article, a high-accuracy mathematical approximation of the left-sided truncated normal distribution is proposed. A full analysis of errors and recommendations for implementation using Microsoft Excel are provided at the end of the article.
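For reference, the exact density and distribution function of a left-truncated normal can be written directly in terms of the standard normal φ and Φ (the latter via `math.erf`); an approximation such as the one this article proposes would be benchmarked against them. A minimal sketch:

```python
import math

def _phi(z):
    """Standard normal probability density."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def _Phi(z):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def trunc_pdf(x, mu=0.0, sigma=1.0, a=0.0):
    """Density of a normal(mu, sigma) truncated on the left at a:
    the parent density rescaled by the surviving mass 1 - Phi(alpha)."""
    if x < a:
        return 0.0
    return _phi((x - mu) / sigma) / (sigma * (1 - _Phi((a - mu) / sigma)))

def trunc_cdf(x, mu=0.0, sigma=1.0, a=0.0):
    """Cumulative distribution of the left-truncated normal."""
    if x < a:
        return 0.0
    za, zx = (a - mu) / sigma, (x - mu) / sigma
    return (_Phi(zx) - _Phi(za)) / (1 - _Phi(za))
```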
6.
The goal of this work is to quantitatively examine the effect of adhesive resin cement on the probability of crack initiation from the internal surface of ceramic dental restorations. The possible crack-bridging mechanism and the residual stress effect of the resin cement on the ceramic surface are examined. Based on a fracture-mechanics-based failure probability model, we predict the failure probability of glass-ceramic disks bonded to simulated dentin and subjected to indentation loads. The theoretical predictions match the experimental data, suggesting that both resin bridging and shrinkage play an important role and need to be considered for accurate prognostics.
7.
8.
Dae Woo Kim Dong-Woo Suh R. S. Qin H. K. D. H. Bhadeshia 《Journal of Materials Science》2010,45(15):4126-4132
The crystallographic relationship between austenite and grain boundary nucleated allotriomorphic ferrite has been investigated using electron back-scattered diffraction with a view to establishing a mechanism of variant selection. It is possible in some circumstances for the ferrite to adopt a favoured orientation relationship with both of the austenite grains with which it is in contact. However, the theoretical probability for the development of such a dual orientation has in previous work been shown to be very small, although experiments indicate otherwise. In this work, we have discovered experimentally that the probability of dual orientations is significantly increased when adjacent austenite grains are connected by special high-angle boundaries. Crystallographic calculations validate these observations and lead to the conclusion that simultaneous lattice matching between ferrite and its parent austenite grains is more likely in the presence of certain kinds of microscopic texture in the austenite. The phenomenon of dual orientation provides a criterion for crystallographic variant selection during diffusional transformation.
9.
In this paper we discuss detection problems for a high resolution radar. Fluctuation in the target radar cross section usually decreases the probability of detection. However, through integration of cells within range profiles of a high resolution radar, variation of the integrated magnitude with respect to the change of carrier frequency and target aspect becomes much smaller, and this is helpful for improving the probability of detection. Two detection algorithms, the cell integration method and the correlation method, for a high resolution radar are proposed, and their detection performances are compared with that obtained by a conventional low resolution radar. Some theoretical formulations are developed. Simulation results show the effectiveness of the proposed algorithms.
10.
Raj R Geisler WS Frazor RA Bovik AC 《Journal of the Optical Society of America. A, Optics, image science, and vision》2005,22(10):2039-2049
The human visual system combines a wide field of view with a high-resolution fovea and uses eye, head, and body movements to direct the fovea to potentially relevant locations in the visual scene. This strategy is sensible for a visual system with limited neural resources. However, for this strategy to be effective, the visual system needs sophisticated central mechanisms that efficiently exploit the varying spatial resolution of the retina. To gain insight into some of the design requirements of these central mechanisms, we have analyzed the effects of variable spatial resolution on local contrast in 300 calibrated natural images. Specifically, for each retinal eccentricity (which produces a certain effective level of blur), and for each value of local contrast observed at that eccentricity, we measured the probability distribution of the local contrast in the unblurred image. These conditional probability distributions can be regarded as posterior probability distributions for the "true" unblurred contrast, given an observed contrast at a given eccentricity. We find that these conditional probability distributions are adequately described by a few simple formulas. To explore how these statistics might be exploited by central perceptual mechanisms, we consider the task of selecting successive fixation points, where the goal on each fixation is to maximize total contrast information gained about the image (i.e., minimize total contrast uncertainty). We derive an entropy minimization algorithm and find that it performs optimally at reducing total contrast uncertainty and that it also works well at reducing the mean squared error between the original image and the image reconstructed from the multiple fixations. Our results show that measurements of local contrast alone could efficiently drive the scan paths of the eye when the goal is to gain as much information about the spatial structure of a scene as possible.
11.
Mohammad T. Khasawneh Shannon R. Bowling Sittichai Kaewkuekool Byung Rae Cho 《Quality Engineering》2004,17(1):33-50
A normal distribution has a unique position in many engineering fields, and the standard normal distribution table has been widely used for more than a century. There are many situations, however, in which a truncated normal distribution needs to be considered. Although the theoretical foundations of the truncated normal distribution are well established, there has been little work on tabulating the characteristics associated with the truncated normal distribution, such as a cumulative probability, a truncated mean, and a truncated variance. In this article, we provide tables for a singly truncated normal distribution, which may be useful for quality practitioners.
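The truncated mean and truncated variance that such tables list have closed forms via the inverse Mills ratio, so the tabulated values can be regenerated directly. A minimal sketch for the left-truncated (singly truncated from below) case:

```python
import math

def _phi(z):
    """Standard normal probability density."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def _Phi(z):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def truncated_moments(mu, sigma, a):
    """Mean and variance of normal(mu, sigma) truncated below at a,
    using the inverse Mills ratio lam = phi(alpha) / (1 - Phi(alpha)):
        E[X | X > a]   = mu + sigma * lam
        Var[X | X > a] = sigma**2 * (1 + alpha*lam - lam**2)"""
    alpha = (a - mu) / sigma
    lam = _phi(alpha) / (1 - _Phi(alpha))
    mean = mu + sigma * lam
    var = sigma ** 2 * (1 + alpha * lam - lam ** 2)
    return mean, var
```

Truncating at the mean (`a = mu`) gives the half-normal case, a convenient check: the truncated mean is mu + sigma·sqrt(2/π) and the variance is sigma²(1 − 2/π).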
12.
The Monte Carlo method (MCM) was used to implement "propagation of probability distributions" to evaluate the measurement uncertainty of the NOS terminator in transgenic rice samples, and to analyze the probability distribution of the measurement uncertainty of transgenic components. The evaluation shows that the relative content of the NOS terminator is 2.95%, very close to the theoretical content (3%), and that its standard uncertainty is 2.13×10⁻⁴, far smaller than 1.00×10⁻². At a 95% coverage probability, the relative content of the NOS terminator lies within the very narrow coverage interval 2.91%–3.00%, which fully demonstrates that the measurement quality is good and the results are reliable. The probability distribution of the relative content of the NOS terminator is a standard normal distribution, revealing that the measurement conditions for transgenic components satisfy the assumptions of the GUM method; both MCM and the GUM method can therefore be applied to evaluate the measurement uncertainty of transgenic components.
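The MCM workflow described above — draw the inputs from their distributions, push the draws through the measurement model, and read off the standard uncertainty and a coverage interval — can be sketched generically. The ratio model and the input distributions below are illustrative stand-ins, not the qPCR measurement model of the paper.

```python
import random
import statistics

def mcm_uncertainty(model, inputs, trials=50_000, coverage=0.95, seed=42):
    """GUM Supplement 1-style Monte Carlo propagation of distributions.
    `inputs` is a list of callables that draw one value each from an
    input distribution; returns (mean, standard uncertainty, coverage
    interval) computed from the sorted output sample."""
    rng = random.Random(seed)
    ys = sorted(model(*(draw(rng) for draw in inputs))
                for _ in range(trials))
    mean = statistics.fmean(ys)
    u = statistics.stdev(ys)                 # standard uncertainty
    lo = ys[int(trials * (1 - coverage) / 2)]
    hi = ys[int(trials * (1 + coverage) / 2) - 1]
    return mean, u, (lo, hi)

# toy stand-in for a relative-content measurement: the ratio of two
# normally distributed signals (NOT the paper's qPCR model)
inputs = [lambda r: r.gauss(2.95, 0.02), lambda r: r.gauss(100.0, 0.5)]
mean, u, ci = mcm_uncertainty(lambda s, t: 100 * s / t, inputs)
```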
13.
There are difficulties with probability as a representation of uncertainty. However, we argue that there is an important distinction between principle and practice. In principle, probability is uniquely appropriate for the representation and quantification of all forms of uncertainty; it is in this sense that we claim that ‘probability is perfect’. In practice, people find it difficult to express their knowledge and beliefs in probabilistic form, so that elicitation of probability distributions is a far from perfect process. We therefore argue that there is no need for alternative theories, but that any practical elicitation of expert knowledge must fully acknowledge imprecision in the resulting distribution. We outline a recently developed Bayesian technique that allows the imprecision in elicitation to be formulated explicitly, and apply it to some of the challenge problems.
14.
J. F. Schutte R. T. Haftka B. J. Fregly 《International journal for numerical methods in engineering》2007,71(6):678-702
For some problems global optimization algorithms may have a significant probability of not converging to the global optimum or require an extremely large number of function evaluations to reach it. For such problems, the probability of finding the global optimum may be improved by performing multiple independent short searches rather than using the entire available budget of function evaluations on a single long search. The main difficulty in adopting such a strategy is to decide how many searches to carry out for a given function evaluation budget. The basic premise of this paper is that different searches may have substantially different outcomes, but they all start with rapid initial improvement of the objective function followed by much slower progress later on. Furthermore, we assume that the number of function evaluations to the end of the initial stage of rapid progress does not change drastically from one search to another for a given problem and algorithmic setting. Therefore we propose that the number of function evaluations required for this rapid-progress stage be estimated with one or two runs, and then the same number of function evaluations be allocated to all subsequent searches. We show that these assumptions work well for the particle swarm optimization algorithm applied to a set of difficult analytical test problems with known global solutions. For these problems we show that the proposed strategy can substantially improve the probability of obtaining the global optimum for a given budget of function evaluations. We also test a Bayesian criterion for estimating the probability of having reached the global optimum at the end of the series of searches and find that it can provide a conservative estimate for most problems. Finally, we demonstrate the approach on a particularly challenging engineering design problem constructed so as to have at least 32 widely separated local optima. Copyright © 2006 John Wiley & Sons, Ltd.
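The budget-allocation strategy described above can be sketched as follows: a single pilot run estimates how many evaluations the rapid-progress stage takes, and the remaining budget is split into equal-length restarts. The greedy perturbation search, the 1-D multimodal test function, and the "99% of total gain" plateau rule below are all illustrative stand-ins, not the particle swarm setup or stopping criterion of the paper.

```python
import math
import random

def local_search(f, rng, x0, evals, step=0.5):
    """Greedy Gaussian-perturbation search; returns (x, f(x), trace)."""
    x, fx = x0, f(x0)
    history = [fx]
    for _ in range(evals - 1):
        cand = x + rng.gauss(0, step)
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
        history.append(fx)
    return x, fx, history

def multistart(f, budget=4000, pilot_evals=500, domain=(-5.0, 5.0), seed=3):
    """Run one pilot search to estimate where rapid progress ends (here:
    the first evaluation that has captured 99% of the pilot's total
    improvement), then spend the rest of the budget on restarts of
    exactly that length, keeping the best value found."""
    rng = random.Random(seed)
    _, _, hist = local_search(f, rng, rng.uniform(*domain), pilot_evals)
    gain = hist[0] - hist[-1]
    plateau = next((i for i, v in enumerate(hist)
                    if hist[0] - v >= 0.99 * gain), pilot_evals)
    length = max(plateau, 10)
    best = math.inf
    for _ in range((budget - pilot_evals) // length):
        _, fx, _ = local_search(f, rng, rng.uniform(*domain), length)
        best = min(best, fx)
    return best

# a 1-D multimodal test function with global minimum 0 at x = 0
f = lambda x: x * x + 10 * (1 - math.cos(2 * math.pi * x))
```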
15.
Nathaniel T. Stevens Steven E. Rigdon Christine M. Anderson‐Cook 《Quality and Reliability Engineering International》2018,34(6):968-978
The concept of a Bayesian probability of agreement was recently introduced to give the posterior probabilities that the response surfaces for two different groups are within δ of one another. For example, a difference of less than δ in the mean response at fixed levels of the predictor variables might be thought to be practically unimportant. In such a case, we would say that the mean responses are in agreement. The posterior probability of this is called the Bayesian probability of agreement. In this article, we quantify the probability that new response observations from two groups will be within δ for a continuous response, and the probability that the two responses agree completely for categorical cases such as logistic regression and Poisson regression. We call these Bayesian comparative predictive probabilities, with the former being the predictive probability of agreement. We use Markov chain Monte Carlo simulation to estimate the posterior distribution of the model parameters and then the predictive probability of agreement. We illustrate the use of this methodology with three examples and provide a freely available R Shiny app that automates the computation and estimation associated with the methodology.
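The predictive probability of agreement can be illustrated in a deliberately simple conjugate setting that avoids MCMC: two groups with a normal model, known sigma, and a flat prior on each group mean, so posterior draws are available in closed form. This is a toy stand-in for the regression models and MCMC estimation used in the article.

```python
import random
import statistics

def predictive_prob_agreement(x1, x2, delta, sigma=1.0,
                              draws=50_000, seed=0):
    """Monte Carlo estimate of P(|y1_new - y2_new| < delta), where each
    new response is drawn from its group's posterior predictive
    distribution.  Assumes normal data with known sigma and a flat
    prior on each group mean, so the posterior of each mean is
    N(sample mean, sigma**2 / n) -- a simple conjugate stand-in."""
    rng = random.Random(seed)
    m1, m2 = statistics.fmean(x1), statistics.fmean(x2)
    s1, s2 = sigma / len(x1) ** 0.5, sigma / len(x2) ** 0.5
    hits = 0
    for _ in range(draws):
        mu1 = rng.gauss(m1, s1)        # posterior draw, group 1 mean
        mu2 = rng.gauss(m2, s2)        # posterior draw, group 2 mean
        y1 = rng.gauss(mu1, sigma)     # posterior predictive responses
        y2 = rng.gauss(mu2, sigma)
        hits += abs(y1 - y2) < delta
    return hits / draws
```

Widening delta drives the probability toward 1, and shrinking it toward 0, which is the practical dial the methodology turns on "practically unimportant" differences.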
16.
This paper studies a risk model based on a compound Poisson-Geometric process in which the premium rate varies with time. Using an infinitesimal method, we obtain the renewal equation satisfied by the Gerber-Shiu discounted penalty function for this model. On this basis, we derive the renewal equations satisfied by the ruin probability, the surplus immediately before ruin, and the deficit at the time of ruin. In particular, when individual claims are exponentially distributed, we solve a differential equation to obtain an explicit expression for the ruin probability of the model and an inequality it satisfies. Finally, through numerical simulation and worked examples, we discuss how an insurer's claims-payment and premium policies affect its own risk.
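For orientation, the classical constant-premium compound Poisson (Cramér-Lundberg) model with exponential claims admits an explicit ruin probability, and it is against baselines of this kind that refinements such as the paper's Poisson-Geometric model with time-varying premium are usually compared. The sketch below implements only that classical baseline, not the paper's model.

```python
import math

def ruin_probability(u, lam=1.0, mu=2.0, c=3.0):
    """Explicit ruin probability for the classical Cramér-Lundberg model:
    Poisson claim arrivals at rate lam, exponential claim sizes with
    mean mu, constant premium rate c.  For c > lam*mu,
        psi(u) = (lam * mu / c) * exp(-(1/mu - lam/c) * u),
    and without positive safety loading ruin is certain."""
    if c <= lam * mu:
        return 1.0                      # no safety loading: ruin is certain
    return (lam * mu / c) * math.exp(-(1 / mu - lam / c) * u)
```

Note that psi(0) = lam*mu/c depends only on the loading, while the initial surplus u enters through the exponential decay rate.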
17.
Input-profile-based software failure probability quantification for safety signal generation systems
Hyun Gook Kang Ho Gon Lim Ho Jung Lee Man Cheol Kim Seung Cheol Jang 《Reliability Engineering & System Safety》2009,94(10):1542-1546
The approaches for software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs encountered in actual use. The test inputs for a safety-critical application such as a reactor protection system (RPS) of a nuclear power plant are the inputs which cause the activation of a protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter. The input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that we do not have to repeat the test for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method is expected to provide a simple but realistic means to quantify the software failure probability based on the input profile and system dynamics.
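A common statistical building block for test-based failure probability quantification is the binomial upper confidence bound after n independent test cases with zero failures; for zero failures it reduces to the exact form behind the "rule of three" (about 3/n at 95% confidence). This is a generic sketch of that standard bound, not the specific input-profile-based quantification proposed in the paper.

```python
def failure_prob_upper_bound(n_tests, n_failures=0, confidence=0.95):
    """Binomial upper confidence bound on a per-demand failure
    probability after n_tests independent tests drawn from the input
    profile.  With zero observed failures the exact bound is
    1 - (1 - confidence)**(1/n_tests), which is approximately
    3/n_tests at 95% confidence ('rule of three')."""
    if n_failures == 0:
        return 1 - (1 - confidence) ** (1 / n_tests)
    raise NotImplementedError("use a Clopper-Pearson bound when failures > 0")

# ~1e-3 per-demand bound after 3000 failure-free profile-based tests
bound = failure_prob_upper_bound(3000)
```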
18.
A model for predicting the residual fatigue life of laminated composites
Using reliability analysis methods, a model for predicting the residual fatigue life of laminated composites under fatigue loading is derived. The model has been validated with experimental data for typical laminated composites under constant-amplitude fatigue loading. The experimental results show that the agreement between the theoretical predictions and the experimental values is reasonable.
19.
In financial risk management, the study of risk measures has long been an important topic in the field. We first construct a risk measure using a probability measure and a convex function, but find that this measure satisfies neither coherence nor the idea of downside risk. We then use a convex function to construct a risk measure based on a semi-probability measure, and find that it encompasses many common risk measures, such as semi-variance, semi-absolute deviation, lower partial moments, and expected shortfall (ES). The study shows that the new risk measure satisfies not only convexity but also coherence. Given the importance of convexity and coherence in portfolio selection and risk management, this risk measure has research value and practical significance.
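The downside risk measures the abstract lists as special cases have simple empirical estimators, shown below on a sample of returns. These are the standard textbook definitions, not the paper's unifying semi-probability-measure construction.

```python
import statistics

def semi_variance(returns, target=0.0):
    """Mean squared shortfall below the target (downside risk only)."""
    return statistics.fmean(min(r - target, 0.0) ** 2 for r in returns)

def lower_partial_moment(returns, target=0.0, order=1):
    """Generalises the family: order 1 is the semi-absolute deviation,
    order 2 the semi-variance (both about the target)."""
    return statistics.fmean(max(target - r, 0.0) ** order for r in returns)

def expected_shortfall(returns, alpha=0.05):
    """Empirical ES: the average loss over the worst alpha-fraction of
    outcomes, with losses reported as positive numbers."""
    losses = sorted((-r for r in returns), reverse=True)
    k = max(1, int(len(losses) * alpha))
    return statistics.fmean(losses[:k])
```

Unlike plain variance, all three penalise only outcomes below the target, which is the "downside risk" idea the abstract says a coherent measure should respect.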
20.
In this paper, we investigate the performance of a secondary transmission scheme based on the Markov ON-OFF state of primary users in underlay cognitive radio networks. We propose a flexible secondary cooperative transmission scheme with an interference cancellation technique according to the ON-OFF status of the primary transmitter. For maximal ratio combining (MRC) at the destination, we derive exact closed-form expressions for the outage probability in different situations. The numerical simulation results also reveal that the proposed scheme improves secondary transmission performance compared with the traditional mechanism in terms of secondary outage probability and energy efficiency.