Similar Documents (20 results)
1.
A new type of perfluorinated polymer, “Low Temperature Fomblin,” has been tested as a wall coating in an ultracold neutron (UCN) storage experiment using a gravitational storage system. The data show a UCN reflection loss coefficient η as low as ≈ 5 × 10⁻⁶ in the temperature range 105 K to 150 K. We plan to use this oil in a new type of neutron lifetime measurement, in which a bellows system (“accordion”) makes it possible to vary the trap size over a wide range while the total surface area and the distribution of surface area over height remain constant. These unique characteristics, combined with the scaling technique developed by W. Mampe et al. in 1989, ensure exact linearity of the extrapolation from inverse storage lifetimes to the inverse neutron lifetime. Linearity holds for any energy dependence of the loss coefficient µ(E). Using the UCN source at the Institut Laue-Langevin, we expect to achieve a lifetime precision below ±1 s.
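For context, the extrapolation rests on the standard ultracold-neutron storage relation (written here in conventional notation, which may differ from the authors'): the inverse storage lifetime equals the inverse free-neutron lifetime plus a wall-loss term proportional to the energy-averaged loss probability per bounce and the wall-collision frequency, which in turn scales with the surface-to-volume ratio of the trap.

```latex
\frac{1}{\tau_{\mathrm{storage}}} = \frac{1}{\tau_n} + \bar{\mu}\,\nu_{\mathrm{coll}},
\qquad \nu_{\mathrm{coll}} \propto \frac{S}{V}
```

Varying the trap size with the accordion changes ν_coll while the fixed distribution of surface area over height keeps the energy averaging unchanged, so extrapolating 1/τ_storage linearly to zero collision rate recovers 1/τ_n.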

2.
We present a new value for the neutron lifetime of 878.5 ± 0.7(stat.) ± 0.3(syst.) s. This result differs from the world-average value by 6.5 standard deviations and from the previous most precise result by 5.6 standard deviations. However, this new value for the neutron lifetime, together with a β-asymmetry in neutron decay of A0 = −0.1189(7), is in good agreement with the Standard Model.
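For a quick check on how such a tension is quantified, the short Python snippet below combines the quoted statistical and systematic uncertainties in quadrature and compares against a world-average value. The world-average figure of 885.7 ± 0.8 s is an assumption added here for illustration (the contemporaneous Particle Data Group number), not a value taken from the abstract.

```python
import math

new_value, stat, syst = 878.5, 0.7, 0.3   # this measurement, in seconds
world_avg, world_err = 885.7, 0.8         # assumed contemporaneous world average (illustrative)

sigma_new = math.hypot(stat, syst)        # statistical and systematic combined in quadrature
tension = abs(world_avg - new_value) / math.hypot(sigma_new, world_err)
print(f"discrepancy ≈ {tension:.1f} combined standard deviations")   # ≈ 6.5
```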

3.
We report progress on an experiment to measure the neutron lifetime using magnetically trapped neutrons. Neutrons are loaded into a 1.1 T deep superconducting Ioffe-type trap by scattering 0.89 nm neutrons in isotopically pure superfluid ⁴He. Neutron decays are detected in real time using the scintillation light produced in the helium by the beta-decay electrons. The trap lifetime measured at a helium temperature of 300 mK, without magnetic field ramping, is substantially shorter than the free neutron lifetime. This is attributed to the presence of neutrons with energies higher than the magnetic potential of the trap. Magnetic field ramping is implemented to eliminate these neutrons, resulting in a trap lifetime of 833 (+74/−63) s, consistent with the currently accepted value of the free neutron lifetime.

4.
We present a conceptual design for an experiment to measure the neutron lifetime (~886 s) with an accuracy of 10⁻⁴. The lifetime will be measured by observing the decay rate of a sample of ultracold neutrons (UCN) confined in vacuum in a magnetic trap. The UCN collaboration at Los Alamos National Laboratory has developed a prototype UCN source that is expected to produce a bottled UCN density of more than 100/cm³ [1]. The availability of such an intense source makes it possible to approach the measurement of the neutron lifetime in a new way. We argue below that it is possible to measure the neutron lifetime to 10⁻⁴ in a vacuum magnetic trap. The measurement involves no new technology beyond the expected UCN density; if even higher densities are available, the experiment can be made better and/or less expensive. We present the design and methodology for the measurement. The slow loss of neutrons that occupy stable orbits but are not energetically trapped would produce a systematic uncertainty in the measurement. We discuss a new approach, chaotic cleaning, to eliminating these quasi-trapped neutrons: breaking the rotational symmetry of the quadrupole trap makes the neutron orbits chaotic, and mode mixing causes neutrons on quasi-bound orbits to leave the trap.

5.
A spectral representation based model for Monte Carlo simulation
A new model is proposed for generating samples of real-valued stationary Gaussian processes. The model is based on the spectral representation theorem, which states that a weakly stationary process can be viewed as a superposition of harmonics with random properties. The classical use of this theorem for Monte Carlo simulation is based on models consisting of a superposition of harmonics with fixed frequencies but random amplitude and phase; all resulting samples then share the same period, which is set by the discretization of the frequency band. In contrast, the proposed model consists of a superposition of harmonics with random amplitude, phase, and frequency, so that different samples have different periods depending on the particular sample values of the harmonic frequencies.

A band-limited Gaussian white noise process is used to illustrate the proposed Monte Carlo simulation algorithm and to demonstrate that covariance-function estimates based on samples of the proposed model are not periodic.
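To make the random-frequency construction concrete, here is a minimal Python sketch (not the authors' code; for brevity the amplitudes are deterministic, whereas the paper's model also randomizes them). Each sample path superposes harmonics whose frequencies are drawn from the normalized flat spectral density of a band-limited white noise and whose phases are uniform, so different paths get different frequency sets and hence no common period.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_path(t, sigma=1.0, omega_c=2 * np.pi, n_harmonics=500):
    """One sample of band-limited Gaussian white noise via random-frequency harmonics."""
    omega = rng.uniform(0.0, omega_c, n_harmonics)   # frequencies ~ normalized flat PSD
    phi = rng.uniform(0.0, 2 * np.pi, n_harmonics)   # independent uniform phases
    # Central limit theorem over many harmonics makes the superposition approximately Gaussian
    return sigma * np.sqrt(2.0 / n_harmonics) * np.cos(np.outer(t, omega) + phi).sum(axis=1)

t = np.linspace(0.0, 50.0, 2001)
paths = np.stack([sample_path(t) for _ in range(200)])

# Covariance estimates should follow sigma^2 * sin(omega_c*tau)/(omega_c*tau), with no periodicity
for lag in (0, 20, 40):
    est = np.mean(paths[:, : paths.shape[1] - lag] * paths[:, lag:])
    print(f"tau = {t[lag]:4.2f}: estimated covariance {est:6.3f}")
```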


6.
In order to maintain the high reliability of components in a nuclear power plant, it is important to understand the damage process caused by multiple small cracks. Crack growth shows random behavior because of microstructural inhomogeneity and interaction between cracks; the former includes the effects of crack kinking and anisotropic deformation in individual grains of the polycrystal. In this study, a Monte Carlo simulation method is developed to analyze this random behavior, taking these influences on the stress intensity factor into account. The damage process of mill-annealed Alloy 600 under primary water stress corrosion cracking (PWSCC) is numerically simulated by the proposed method. The crack size distribution obtained agrees well with experimental observation, and the maximum crack size is statistically estimated on the basis of Gumbel statistics.
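As an illustration of the final, statistical step only (the crack-depth numbers below are hypothetical, not the paper's data), a Gumbel distribution can be fitted to per-area maximum crack depths and used to estimate the largest crack expected over a larger inspected area:

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(1)

# Hypothetical data: deepest crack (micrometres) found in each of 30 inspected unit areas
max_depths = rng.gumbel(loc=120.0, scale=25.0, size=30)

loc, scale = gumbel_r.fit(max_depths)            # fit Gumbel location and scale
T = 100                                          # number of equivalent unit areas of interest
estimate = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)   # return-level estimate
print(f"estimated maximum crack depth over {T} unit areas ≈ {estimate:.0f} µm")
```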

7.
Dependability has been recognized in the transportation reliability literature as an effective measure of transit system service quality. Dependability models link system dependability with the reliability and maintainability characteristics of subsystems, incorporating the particular operating characteristics and failure-recovery policy of each transit system. In this paper a new transit system dependability model is proposed that accounts for the possibility that a passenger may be delayed by more than one failure in a single trip. The mathematical difficulties associated with the algebra of random variables are overcome by using the Monte Carlo method. The results of the proposed model are compared with those of other modelling approaches in the literature by applying the model to a common test scenario.
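A toy Monte Carlo in the same spirit (the failure rate, delay parameters, and tolerance threshold below are hypothetical) estimates dependability as the probability that the total delay in a trip, accumulated over possibly several failures, stays within a passenger's tolerance:

```python
import numpy as np

rng = np.random.default_rng(2)

trips = 100_000
n_failures = rng.poisson(lam=0.15, size=trips)     # more than one failure per trip is allowed
delay = np.array([rng.lognormal(np.log(8.0), 0.6, k).sum() if k else 0.0
                  for k in n_failures])            # minutes lost per failure, summed per trip
threshold = 10.0                                   # delay (minutes) a passenger will tolerate
print("estimated dependability:", np.mean(delay <= threshold))
```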

8.
This paper considers the inverse problem in electrical impedance tomography with non-informative prior information on the required conductivity function. The problem is approached with a Newton-type iterative algorithm in which the solution of the linearized approximation is estimated using Bayesian inference. The novelty of this work lies in maximum a posteriori estimation under a model that treats the linearization error as a random variable. Starting from an analytical expression for this term, we employ Monte Carlo simulation to characterize its probability distribution function. This simulation entails sampling an improper prior distribution, for which we propose a stable scheme based on QR decomposition. The simulation statistics show that the error of the linearized model is not Gaussian; however, to maintain computational tractability, we derive the posterior probability density function of the solution by imposing a Gaussian kernel approximation on the error density. Numerical results obtained with this approach indicate the superiority of the new model and its maximum a posteriori estimator over the conventional one that neglects the impact of the linearization error.

9.
In this paper stochastic modeling is used to predict fatigue life uncertainty by simulating small variations in the strain-life material constants. The Monte Carlo method [1], using either known cumulative distribution functions (CDFs) for the material constants and/or postulated CDFs, provides the mechanism for generating a set of failure data. These data are analyzed with ordinary statistical techniques to develop the Weibull CDF for cycles to failure. Small variations (∼±15%) in the values of the strain-life constants are seen to result in large variations (∼600%) in predicted fatigue life at moderate strain. The often-observed phenomenon that the spread in fatigue data is greater at lower strain is also reproduced by the stochastic model.
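The sketch below shows the mechanics of such a study in Python (all strain-life constants and the perturbation range are illustrative, not the paper's data): the Coffin-Manson/Basquin constants are perturbed at random, cycles to failure are solved for at a fixed strain amplitude, and a Weibull distribution is fitted to the resulting lives.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import weibull_min

rng = np.random.default_rng(3)

E = 200e3                        # Young's modulus, MPa (illustrative)
sigma_f, b = 900.0, -0.09        # fatigue strength coefficient (MPa) and exponent
eps_f, c = 0.6, -0.6             # fatigue ductility coefficient and exponent
strain_amp = 0.004               # applied strain amplitude

def cycles_to_failure(sf, bb, ef, cc):
    # Solve strain_amp = (sf/E)*(2N)^bb + ef*(2N)^cc for N (root search on log10 N)
    g = lambda logN: (sf / E) * (2 * 10**logN) ** bb + ef * (2 * 10**logN) ** cc - strain_amp
    return 10 ** brentq(g, 0.0, 9.0)

m = 2000                         # Monte Carlo samples, each constant perturbed by about +/-15%
lives = [cycles_to_failure(sf, bb, ef, cc) for sf, bb, ef, cc in zip(
    sigma_f * rng.uniform(0.85, 1.15, m), b * rng.uniform(0.85, 1.15, m),
    eps_f  * rng.uniform(0.85, 1.15, m), c * rng.uniform(0.85, 1.15, m))]

shape, _, scale = weibull_min.fit(lives, floc=0.0)
print(f"Weibull fit: shape {shape:.2f}, characteristic life {scale:.0f} cycles")
```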

10.
The 2D Direct Simulation Monte Carlo (DSMC) code LasVegas was originally developed and used for the simulation of atmospheric re-entry and of the flow around plasma wind tunnel probes. This code is reviewed with regard to its applicability to plasma technological applications. The main DSMC theory and the code's structure, features, and properties are described in order to position LasVegas with respect to these applications. General restrictions, model uncertainties, and geometrical considerations of the DSMC method itself, as well as of the LasVegas code, put the focus on the vacuum spray process. As a proof of concept, results of a supersonic free-stream simulation in a vacuum chamber with typical inflow properties are presented and discussed. Appropriate code extensions, such as flow-particle interaction, an ionization model, and parallel computation capability, were identified, indicating directions for future numerical work.

11.
Zhenlin Yang, TEST, 2000, 9(1): 123-131
A new statistic for testing a regression transformation is proposed based on a result of Yang (1999). This statistic is shown to be stable, having a null distribution almost independent of model type and parameter values, and to be accurate and easy to implement. The statistic is of the Wald type and is therefore compared with the Wald statistic given by Lawrance (1987) in terms of size, null distribution, and power using simulation. The simulation results show that the new statistic generally outperforms that of Lawrance.

12.
This paper combines Monte Carlo simulation and cellular automata for computing the availability of a complex network system and the importance measures of its elements.
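As a rough illustration of the combined idea (a toy four-node network with hypothetical link availabilities, not the paper's system), the sketch below samples link states by Monte Carlo and evaluates source-terminal connectivity with a cellular-automaton-style local update rule:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy network: links with per-sample probability of being up (hypothetical values)
links = {(0, 1): 0.95, (1, 3): 0.90, (0, 2): 0.85, (2, 3): 0.92, (1, 2): 0.80}
n_nodes, source, terminal = 4, 0, 3

def connected(up):
    # Cellular-automaton sweep: any node touching an "on" node through a working link turns on
    state = np.zeros(n_nodes, dtype=bool)
    state[source] = True
    for _ in range(n_nodes):                       # n sweeps suffice to propagate reachability
        for (i, j), link_up in up.items():
            if link_up and (state[i] or state[j]):
                state[i] = state[j] = True
    return state[terminal]

trials = 20_000
hits = sum(connected({l: rng.random() < p for l, p in links.items()}) for _ in range(trials))
print("estimated source-terminal availability:", hits / trials)
```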

13.
A class of stationary non-Gaussian processes, referred to as the class of mixtures of translation processes, is defined by finite-dimensional distributions that are mixtures of the finite-dimensional distributions of translation processes. The class of mixtures of translation processes includes translation processes and is useful for both Monte Carlo simulation and analytical studies. Like translation processes, mixtures of translation processes can have a wide range of marginal distributions and correlation functions; moreover, they can match a broader range of second-order correlation functions than translation processes. The paper also develops an algorithm for generating samples of any non-Gaussian process in the class of mixtures of translation processes. The algorithm is based on the sampling representation theorem for stochastic processes and on properties of the conditional distributions. Examples are presented to illustrate the proposed Monte Carlo algorithm and to compare features of translation processes and mixtures of translation processes.

14.
A simple procedure is proposed to identify line layout solutions when a production facility with work centres of unequal size uses conventional material handling devices and operates under stochastic demand scenarios. The procedure uses Monte Carlo simulation (MCS) to search empirically for robust solutions, defined as those that simultaneously meet minimum material handling cost performance levels across all demand scenarios. The results reported in this study suggest that ‘robust’ line layout solutions can be identified using a modest volume of random sampling. The procedure and results are demonstrated through a series of sample problems.

15.
The collimator of a space electron detector limits the instrument's detection angle and geometric factor, but electrons scatter strongly off the collimator material, so the quality of the collimator design directly affects the measurement accuracy of the instrument. Taking the high-energy electron detector on China's Fengyun-4 (FY-4) satellite as an example, this paper improves on a baseline collimator structure and analyzes how internal baffle teeth and parameters such as tooth shape, thickness, depth, and number affect the collimator's ability to suppress back-scattering. The analysis shows that tooth shape has the largest influence on scattering suppression, while tooth depth has only a small effect. An optimized collimator structure is proposed that effectively mitigates electron scattering inside the collimator, and the lessons and design principles for space electron detector collimators are summarized as a reference for future collimator optimization.

16.
A semi-analytical simulation method is proposed in this paper to assess the system reliability of structures. Monte Carlo simulation with variance-reduction techniques (systematic and antithetic sampling) is employed to obtain samples of the structural resistance; the variance-reduction techniques make it possible to characterize the structural resistance adequately with fewer runs of the structural analysis. Once the resistance samples and their moments are determined, the exponential polynomial method (EPM) is used to fit the probability density function of the structural resistance. EPM provides an approximate distribution and the statistical characteristics of the structural resistance, after which the first-order second-moment method is applied to calculate the structural failure probability. Numerical examples are provided for a structural component and two ductile frames, illustrating that the proposed method facilitates the evaluation of system reliability in assessments of structural safety.
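A minimal sketch of the antithetic-sampling ingredient (the two-variable resistance model below is a toy stand-in, not one of the paper's structures): each standard-normal input vector is paired with its mirror image, and the pair average serves as one low-variance resistance sample.

```python
import numpy as np

rng = np.random.default_rng(5)

def resistance(u):
    """Toy structural resistance as a function of two standard-normal variables."""
    return np.exp(5.0 + 0.10 * u[..., 0]) + 20.0 * u[..., 1]

n = 5_000
U = rng.standard_normal((n, 2))
r_antithetic = 0.5 * (resistance(U) + resistance(-U))     # negatively correlated pair averages
r_plain = resistance(rng.standard_normal((2 * n, 2)))     # same total number of model runs

print("mean resistance (antithetic):", r_antithetic.mean())
print("std of the mean, antithetic vs plain:",
      r_antithetic.std(ddof=1) / np.sqrt(n), r_plain.std(ddof=1) / np.sqrt(2 * n))
```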

17.
A critical appraisal of reliability estimation procedures for high dimensions
A critical appraisal of reliability estimation procedures for high dimensions is presented. Available approximate methods and methods based on Monte Carlo simulation are discussed. It is shown that procedures that perform well in low dimensions may become impractical as the dimension increases considerably or tends to infinity. It is observed that some types of Monte Carlo-based simulation procedures are in fact capable of treating high-dimensional problems.

18.
Failure mode and effect analysis (FMEA) is a widely applied technique for prioritizing equipment failures in the maintenance decision-making domain. Recent improvements to FMEA have largely focused on addressing the shortcomings of the conventional approach, in which the risk priority number is used as the measure for prioritizing failure modes. Considerable research effort has been directed towards addressing uncertainties in the risk priority number metrics, that is, occurrence, severity, and detection. Despite these improvements, assigning these metrics remains largely subjective and mostly relies on expert elicitation, especially where empirical data are sparse. Moreover, FMEA results remain static and are seldom updated as new failure information becomes available. In this paper, a dynamic risk assessment methodology based on hierarchical Bayes theory is proposed. Posterior distribution functions are derived for the risk metrics associated with equipment failures; each posterior combines prior functions elicited from experts with observed evidence based on empirical data. The posterior functions are then used as input to a Monte Carlo simulation model from which the expected cost of failure is generated, and failure modes are prioritized on this basis. A decision scheme for selecting an appropriate maintenance strategy is proposed, and its applicability is demonstrated in a case study of thermal power plant equipment failures.
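To illustrate the update-then-simulate flow in compact form (a conjugate Gamma-Poisson model is used here as a stand-in for the paper's hierarchical Bayes formulation, and every number below is hypothetical), an expert prior on a failure mode's occurrence rate is combined with observed plant records, and the posterior is propagated through a Monte Carlo cost model:

```python
import numpy as np

rng = np.random.default_rng(6)

# Expert-elicited prior on the occurrence rate (failures per year): Gamma(a0, b0), mean 0.5
a0, b0 = 2.0, 4.0
observed_failures, exposure_years = 3, 5.0          # hypothetical maintenance records

# Conjugate Gamma-Poisson update of the occurrence rate
a_post, b_post = a0 + observed_failures, b0 + exposure_years

# Monte Carlo: sample the posterior rate and a lognormal cost-per-failure assumption
rate = rng.gamma(a_post, 1.0 / b_post, size=100_000)
cost_per_failure = rng.lognormal(mean=np.log(10_000.0), sigma=0.5, size=rate.size)
expected_cost = (rate * cost_per_failure).mean()    # expected annual cost of this failure mode
print(f"expected cost of failure ≈ {expected_cost:,.0f} per year")
```

Repeating this for every failure mode and ranking by expected cost gives the kind of prioritization the abstract describes.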

19.
A new algorithm for the maximum-residual coefficient in measurement uncertainty evaluation
In measurement uncertainty evaluation, the method that obtains the standard deviation by multiplying the maximum residual of the measurement data by an appropriate coefficient is called the maximum-residual method. By analyzing the distribution function and numerical characteristics of the maximum residual, a probability model of the maximum residual is established, and a new method for computing the maximum-residual coefficient for measurement uncertainty is proposed using Monte Carlo simulation and Matlab. For fewer than 50 measurements, the distribution function, mean, standard deviation, and degrees of freedom corresponding to the maximum residual are computed, and a table of coefficients for the maximum-residual method is given. Finally, the correctness of the theoretical analysis is verified with a measurement example. The proposed method yields the required standard deviation simply, quickly, and reliably.
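As a rough Python counterpart of the Monte Carlo step (the paper derives its own coefficient tables; the convention below, standard deviation ≈ coefficient × mean maximum absolute residual, is one plausible reading and is labelled as an assumption):

```python
import numpy as np

rng = np.random.default_rng(7)

def max_residual_coefficient(n, trials=200_000):
    """Estimate c(n) such that s ≈ c(n) * max|residual| for n unit-variance measurements."""
    x = rng.standard_normal((trials, n))
    max_resid = np.abs(x - x.mean(axis=1, keepdims=True)).max(axis=1)
    return 1.0 / max_resid.mean()            # assumed convention, not the paper's exact definition

for n in (5, 10, 20, 50):
    print(f"n = {n:2d}: coefficient ≈ {max_residual_coefficient(n):.3f}")
```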

20.
Software reliability modeling is of great significance for improving software quality and managing the software development process. However, existing methods do not accurately model software reliability improvement behavior, because single-model methods rely on restrictive assumptions and combination models cannot deal well with model uncertainty. In this article, we propose a Bayesian model averaging (BMA) method for software reliability modeling. First, existing reliability models are selected as candidate models, and Bayesian theory is used to obtain the posterior probability of each reliability model. Then, the posterior probabilities are used as weights for averaging the candidate models. Both a Markov chain Monte Carlo (MCMC) algorithm and the Expectation-Maximization (EM) algorithm are used to evaluate a candidate model's posterior probability, allowing the two to be compared. The results show that the BMA method has superior performance in software reliability modeling, and that the MCMC algorithm performs better than the EM algorithm when estimating the parameters of the BMA method.
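To illustrate only the averaging step (every number below is a hypothetical placeholder, not a result from the article), candidate-model predictions are combined with weights proportional to each model's posterior probability, here approximated from log model evidences such as an MCMC or EM run would supply:

```python
import numpy as np

# Hypothetical log marginal likelihoods (evidence) of three candidate reliability models
log_evidence = np.array([-120.3, -118.7, -125.1])
# Each model's predicted reliability for the next release (also hypothetical)
predictions = np.array([0.91, 0.88, 0.95])

weights = np.exp(log_evidence - log_evidence.max())   # numerically stable exponentiation
weights /= weights.sum()                              # posterior model probabilities (equal priors)
print("BMA weights:", weights.round(3))
print("BMA prediction:", float(weights @ predictions))
```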

