Similar Articles
20 similar articles found (search time: 217 ms)
1.
Jun Li 《Sequential Analysis》2013,32(4):475-487
Abstract

Estimation of the offset between two network clocks has received a lot of attention in the literature, motivated by data networking applications that require synchronous communication protocols. Statistical modeling techniques have been used to develop improved estimation algorithms, a recent development being the construction of a confidence interval based on a fixed sample size. Lacking in the fixed sample size confidence interval procedures is a usable relationship between sample size and the width of the resulting confidence interval. Were that available, an optimum sample size could be determined to achieve a specified level of precision in the estimator, thereby improving the efficiency of the estimation procedure by reducing the unnecessary network overhead associated with collecting the data used by the estimation schemes. A fixed sample size confidence interval with a prescribed width is not available for this problem. In this paper, however, we develop and compare alternative sequential intervals with fixed width and demonstrate that an effective solution is available.
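A sequential fixed-width interval of the kind discussed above can be sketched as a generic Chow–Robbins-style stopping rule. This is an illustrative Python sketch of the general idea, not the authors' clock-offset procedure; the half-width, critical value, and pilot sample size are assumed values:

```python
import math
import random

def sequential_fixed_width_ci(sample, half_width=0.5, z=1.96, n0=10):
    """Keep sampling until the estimated half-width z*s/sqrt(n) falls
    below the target, then report a fixed-width interval around the mean."""
    data = []
    while True:
        data.append(sample())
        n = len(data)
        if n < n0:
            continue  # require a minimal pilot sample first
        mean = sum(data) / n
        var = sum((x - mean) ** 2 for x in data) / (n - 1)
        if z * math.sqrt(var / n) <= half_width:
            return mean - half_width, mean + half_width, n

random.seed(1)
lo, hi, n = sequential_fixed_width_ci(lambda: random.gauss(5.0, 2.0))
print(n, hi - lo)  # the width is exactly 2*half_width by construction
```

The interval width is fixed in advance; only the (random) stopping time n adapts to the unknown variance, which is the trade-off the abstract describes.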

2.
This work reviews a well-known methodology for batch distillation modeling, estimation, and optimization but adds a new case study with experimental validation. Use of nonlinear statistics and a sensitivity analysis provides valuable insight for model validation and optimization verification for batch columns. The application is a simple batch column with a binary methanol–ethanol mixture. Dynamic parameter estimation with an ℓ1-norm error, nonlinear confidence intervals, ranking of observable parameters, and efficient sensitivity analysis are used to refine the model and find the best parameter estimates for dynamic optimization implementation. The statistical and sensitivity analyses indicated that only a subset of the parameters are observable. For the batch column, the optimized production rate increases by 14% while maintaining product purity requirements.

3.
The influence of measurement uncertainties (MU) on the determination of the parameters of a distribution function has been analyzed using a Monte Carlo simulation technique, on the example of the Weibull distribution, which describes the strength of brittle materials. It is shown that in the parameter range relevant for strength testing of brittle materials (e.g. ceramics), very high precision measurements are necessary if the Weibull modulus of the parent distribution is m ≥ 20. In that case the MU should be lower than ±2% of the measured value to obtain the same confidence level as in the ideal case (MU = 0); otherwise a significant underestimation of m becomes very probable. However, if m ≤ 10, even relatively large MU (up to ±10%) are tolerable. In summary, the precision of the measurements is acceptable as long as the width of the error distribution is much smaller than the width of the parent distribution.
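The Monte Carlo experiment described above can be illustrated with a minimal sketch: draw Weibull strengths, perturb them with relative measurement error, and compare the linear-regression modulus estimates. This is a hedged Python sketch with assumed values (m = 20, 5% Gaussian noise, 500 specimens), not the paper's simulation code:

```python
import math
import random

def weibull_modulus_lr(strengths):
    """Estimate the Weibull modulus by linear regression of
    ln(-ln(1 - P)) on ln(strength), with P_i = (i - 0.5)/n."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1.0 - (i + 0.5) / n)) for i in range(n)]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    return num / sum((x - xbar) ** 2 for x in xs)

random.seed(0)
m_true, sigma0, rel_err = 20.0, 400.0, 0.05
true_strengths = [random.weibullvariate(sigma0, m_true) for _ in range(500)]
measured = [s * (1 + random.gauss(0, rel_err)) for s in true_strengths]
m_clean = weibull_modulus_lr(true_strengths)
m_noisy = weibull_modulus_lr(measured)
print(m_clean, m_noisy)  # measurement error biases the modulus estimate low
```

Since the noise widens the apparent strength distribution, the fitted modulus of the perturbed data falls below that of the clean data, matching the underestimation the abstract reports for high m.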

4.
This article considers the reliability analysis of a hybrid system with dependent components, which are linked by a copula function. Based on Type I progressive hybrid censored and masked system lifetime data, we derive some probability results for the hybrid system, and then the maximum likelihood estimates as well as the asymptotic confidence intervals and bootstrap confidence intervals of the unknown parameters are obtained. The effects of different dependence structures on the estimates of the parameters and the reliability function are investigated. Finally, Monte Carlo simulations are implemented to compare the performance of the estimates when the components are dependent with that when the components are independent.

5.
6.
Monte Carlo simulations were used to search for the probability estimator giving an unbiased estimate of the Weibull parameters in the linear regression method. Compared with commonly used probability estimators, the proposed estimator gives a more accurate estimation of the Weibull modulus and the same estimation precision for the scale parameter. It is found that the proposed estimator is more conservative than the estimator P_i = (i − 0.5)/n recommended by previous authors, and hence results in higher safety in reliability predictions. The unbiased properties of the estimated Weibull parameters were validated with actual experimental data. It is also concluded that the Weibull modulus estimated from actual experimental data is more dispersive than that from Monte Carlo simulation, which arises from the fact that strength data from actual experiments do not perfectly follow Weibull statistics.
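The role of the probability estimator P_i can be seen in a small Monte Carlo sketch comparing P_i = (i − 0.5)/n with the mean-rank estimator i/(n + 1). This is an illustrative Python sketch, not the paper's proposed estimator; the true modulus, sample size, and trial count are assumptions:

```python
import math
import random

def modulus_lr(strengths, prob):
    """Weibull modulus via linear regression of ln(-ln(1-P)) on ln(strength),
    where prob(i, n) supplies the probability estimator for rank i."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1.0 - prob(i + 1, n))) for i in range(n)]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    return num / sum((x - xbar) ** 2 for x in xs)

random.seed(7)
m_true, n, trials = 10.0, 30, 300
avg_half, avg_mean = 0.0, 0.0
for _ in range(trials):
    s = [random.weibullvariate(1.0, m_true) for _ in range(n)]
    avg_half += modulus_lr(s, lambda i, n: (i - 0.5) / n) / trials
    avg_mean += modulus_lr(s, lambda i, n: i / (n + 1.0)) / trials
print(avg_half, avg_mean)  # (i-0.5)/n assigns more extreme plotting positions
```

Because (i − 0.5)/n places the extreme ranks at more extreme probabilities than i/(n + 1), it yields the higher (less conservative) modulus estimate on every sample, which is the sense in which an alternative estimator can be "more conservative".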

7.
For analysis, design, and model-based control of crystallization processes, population balance models or reduced models derived therefrom are typically used. Usually the kinetic parameters in these models are determined by analyzing measured concentration trajectories and/or particle size distributions using parameter estimation procedures. In the case of preferential crystallization of enantiomers, the analysis of experiments is complex since there are two “competing” crystal populations. In this field, batch processes are often performed using seeds of the desired enantiomer. Currently, it is particularly challenging to quantify and optimize a new concept: the so-called “auto seeded programmed polythermal preferential crystallization” (“as3pc” [Coquerel, G., Petit, M.-N., Bouaziz, R., 2000. Method of resolution of two enantiomers by crystallization. United States Patent, Patent number: 6,022,409]). In order to design and optimize this process, the temperature-dependent kinetic constants for crystal growth, nucleation, and dissolution have to be known. In this work a reduced model for this auto seeded process is presented. The general identifiability of the model parameters is investigated, along with some suggestions on how to reparameterize the kinetic terms involved. The values of the identified key parameters are estimated by conventional least-squares optimization using experimental data determined for the model system threonine/water. Parameter confidence and cross-correlation are discussed, and finally the model is validated against experiments not used for parameter estimation.

8.
Source term identification is very important for contaminant gas emission events. Thus, it is necessary to study source parameter estimation methods with high computational efficiency, high estimation accuracy, and reasonable confidence intervals. Tikhonov regularization is a potentially good tool for identifying the source parameters; however, it is invalid for nonlinear inverse problems such as the gas emission process. The 2-step nonlinear and linear PSO (particle swarm optimization)–Tikhonov regularization methods proposed previously have estimated the emission source parameters successfully, but problems remain in computational efficiency and confidence intervals. Hence, a new 1-step nonlinear method combining Tikhonov regularization and the PSO algorithm with a nonlinear forward dispersion model was proposed. First, the method was tested with simulation and experimental cases. The test results showed that the 1-step nonlinear hybrid method is able to estimate multiple source parameters with reasonable confidence intervals. Then, the estimation performance of the different methods was compared across different cases. The estimates obtained with the 1-step nonlinear method were close to those obtained with the 2-step nonlinear and linear PSO–Tikhonov regularization methods; the 1-step nonlinear method even performs better than the other two methods in some cases, especially for source strength and downwind distance estimation. Compared with the 2-step nonlinear method, the 1-step method has higher computational efficiency. On the other hand, the confidence intervals obtained with the method proposed in this paper seem more reasonable than those of the other two methods. Finally, the single PSO algorithm was compared with the 1-step nonlinear PSO–Tikhonov hybrid regularization method. The results showed that the skill scores of the 1-step nonlinear hybrid method in estimating source parameters were close to those of the single PSO method, and even better in some cases. One more important property of the 1-step nonlinear PSO–Tikhonov regularization method is its reasonable confidence interval, which is not obtained by the single PSO algorithm. Therefore, the 1-step nonlinear hybrid regularization method proposed in this paper is a potentially good method for estimating the contaminant gas emission source term.

9.
Determination of Texture from Individual Grain Orientation Measurements
We present a technique for determining the texture of a polycrystalline material based on the measurement of the orientation of a number of individual grains. We assumed that the sample has fiber (i.e., axisymmetric) texture and that the texture can be characterized by a function (the March–Dollase function) with a single parameter. We simulated a large number, N, of orientation data sets, using the March–Dollase function for a total of five different texture parameters, r_init. Using the maximum-likelihood method, we solved for the texture parameter, r′, that best fits each simulated data set in order to determine the distribution of r′ and evaluate the precision and accuracy with which r′ can be determined. The 90% confidence limits of the ratio r′/r_init vary as N^(−1/2) but were independent of r_init. Using the texture of slightly textured Al2O3 as determined by X-ray diffraction, we calculated the 90% confidence limits for measurements of 131 grains. The orientations of 131 grains in textured Al2O3 were measured by electron backscatter diffraction, and the texture determined from those measurements lay within these 90% confidence limits.

10.
'Exact' methods for categorical data are exact in the sense of using probability distributions that do not depend on unknown parameters. However, they are inferentially conservative: the actual error probabilities for tests and confidence intervals are bounded above by the nominal level. This article examines this conservatism for interval estimation and describes ways of reducing it. We illustrate with confidence intervals for several basic parameters, including the binomial parameter, the difference between two binomial parameters for independent samples, and the odds ratio and relative risk. Less conservative behavior results from devices such as (1) inverting tests using statistics that are 'less discrete', (2) inverting a single two-sided test rather than two separate one-sided tests each having size at least half the nominal level, (3) using unconditional rather than conditional methods (where appropriate), and (4) inverting tests using alternative p-values. The article concludes with recommendations for selecting an interval in three situations: when one needs to guarantee a lower bound on a coverage probability, when it is sufficient to have actual coverage probability near the nominal level, and when teaching in a classroom or consulting environment.
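The conservatism can be seen numerically by computing the exact coverage probability of the Clopper–Pearson ("exact") binomial interval against the Wilson score interval, one common less-conservative alternative. This is a hedged sketch: n = 20 and p = 0.3 are arbitrary choices, and SciPy is assumed available for the beta quantile:

```python
import math
from scipy.stats import beta

def clopper_pearson(x, n, alpha=0.05):
    """Exact (Clopper-Pearson) interval via beta quantiles."""
    lo = 0.0 if x == 0 else beta.ppf(alpha / 2, x, n - x + 1)
    hi = 1.0 if x == n else beta.ppf(1 - alpha / 2, x + 1, n - x)
    return lo, hi

def wilson(x, n, z=1.959964):
    """Wilson score interval (inverts a single two-sided score test)."""
    p = x / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

def coverage(interval, n, p):
    """Exact coverage: total binomial probability of outcomes whose interval covers p."""
    return sum(math.comb(n, x) * p**x * (1 - p)**(n - x)
               for x in range(n + 1)
               if interval(x, n)[0] <= p <= interval(x, n)[1])

n, p = 20, 0.3
cp, w = coverage(clopper_pearson, n, p), coverage(wilson, n, p)
print(cp, w)  # Clopper-Pearson coverage is at least the nominal 0.95
```

The exact interval's coverage is guaranteed to be at least 0.95 at every p, which is precisely the conservatism the article describes; score-type intervals trade that guarantee for coverage closer to the nominal level on average.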

11.
We consider the problem of providing a fixed-width confidence interval for the difference of two normal means when the variances are unknown and unequal. We propose a two-stage procedure that differs from those of Chapman (1950) and Ghosh (1975). The procedure provides the desired confidence, subject to the restriction on the width, for certain values of the design parameter h. Values of h are given by the Monte Carlo method for various combinations of first-stage sample size and confidence level. Finally, it is shown that the procedure is asymptotically more efficient than those of Chapman and Ghosh with respect to total sample size as the width of the interval approaches zero.
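A Stein-type two-stage construction of the general kind discussed above can be sketched as follows. This is illustrative Python only: the design constant h, pilot size, and half-width are placeholder values, not the tabulated values from the paper:

```python
import math
import random

def two_stage_ci_diff(sample1, sample2, d=0.5, n0=15, h=2.13):
    """Two-stage fixed-width interval for mu1 - mu2: a pilot sample of size
    n0 per arm estimates each variance, which sets the final sample sizes.
    h plays the role of the design constant (placeholder value here)."""
    x = [sample1() for _ in range(n0)]
    y = [sample2() for _ in range(n0)]
    def var(v):
        m = sum(v) / len(v)
        return sum((u - m) ** 2 for u in v) / (len(v) - 1)
    # Second-stage sizes from the pilot variances and the target half-width d.
    n1 = max(n0, math.ceil((h / d) ** 2 * var(x)))
    n2 = max(n0, math.ceil((h / d) ** 2 * var(y)))
    x += [sample1() for _ in range(n1 - n0)]
    y += [sample2() for _ in range(n2 - n0)]
    diff = sum(x) / n1 - sum(y) / n2
    return diff - d, diff + d

random.seed(3)
lo, hi = two_stage_ci_diff(lambda: random.gauss(1.0, 1.0),
                           lambda: random.gauss(0.0, 2.0))
print(lo, hi)  # the width is 2*d = 1.0 by construction
```

The interval width is fixed in advance regardless of the unknown, unequal variances; the variances only affect how many second-stage observations each arm must take.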

12.
This study concerns understanding of the underlying mechanistic pathways in high temperature solution polymerization of n-butyl acrylate (nBA) in the absence of added thermal initiators. The particular system of interest is the batch polymerization of nBA in xylene at temperatures between 140 and 180 °C with initial monomer content between 20 and 40 wt%. A mechanistic process model is developed to capture the dynamics of the polymerization system. Postulated reaction mechanisms include chain-initiation by monomer (self-initiation), chain-initiation by unknown impurities, chain-propagation by secondary and tertiary radicals, intra-molecular chain-transfer to polymer (back-biting), chain-fragmentation (β-scission), chain-transfer to monomer and solvent, and chain termination by disproportionation and combination. The extent of the reactions is quantified by estimating the reaction rate constants of the initiation and the secondary reactions, based on a set of process measurements. The set of measurements considered in the parameter estimation includes monomer conversion, number- and weight-average molecular weights, and average number of chain-branches per chain (CBC). Effect of temperature on chain microstructures was observed to be most evident when microstructures are expressed in terms of their quantities per chain. The evolution of other microstructural quantities such as average number of terminal double bonds per chain (TDBC) and average number of terminal solvent groups per chain (TSGC) was then also investigated. Microstructural quantities per polymer chain (TDBC, TSGC, CBC) are defined based on combinations of 13C, 1H NMR and chromatographic measurements. This study presents (i) a mechanistic explanation for the competing nature of short-chain-branch and terminal double bond formation (i.e. 
as temperature increases, the number of chain branches per chain decreases and the number of terminal double bonds per chain increases), (ii) quantitative insights into dominant modes of chain-initiation and chain-termination reactions, and (iii) mechanistic explanations for the observed spontaneous polymerization. The study also reports estimated Arrhenius parameters for second-order self-initiation, tertiary radical propagation, secondary radical backbiting, and tertiary radical β-scission reaction rate constants. Validation of the mechanistic process model with the estimated Arrhenius parameters and comparison of the estimated parameter values to recently reported estimates are also presented.

13.
Experimental values of the Flory–Huggins parameter, χ, between polymers and solvents are frequently used to determine the solubility parameters of the polymers. A method using nonlinear curve fitting of RTχ/V was compared to the linear regression method commonly used. It was found that the formulas for the solubility parameter were the same, but the linear method produced a slightly different entropy term. The nonlinear method gave a lower correlation coefficient and wider confidence intervals and was more effective at distinguishing systems than the linear model. The effect of the deviation of probes in the solubility parameter model is discussed. Using probes with low solubility parameters to measure the polymer solubility parameter gave wider confidence intervals. © 2004 Wiley Periodicals, Inc. J Appl Polym Sci 91: 2894–2902, 2004

14.
A dynamic physical model of a microbial fuel cell (MFC) anode is presented and parameterized. It describes the evolution of current density and biofilm mass over time as a function of substrate concentration. The model is particularly useful for process monitoring or control purposes because it reproduces the dynamics of the MFC anode and contains comparatively few parameters. Parameters are identified using data from the response of the MFC to a substrate concentration pulse. Theoretical and practical identifiability of the parameters is evaluated, and parameter confidence intervals are determined.

15.
Conditions are given for weak convergence through random indices of a general stochastic approximation process which includes the Robbins-Monro and Kiefer-Wolfowitz processes. For a particular index, a sequential fixed-width bounded length confidence interval for the parameter being estimated is established. As an example, an optimal recursive estimator and confidence interval for the mode of a distribution function is constructed.

16.
Electrical field flow fractionation (EFFF) has two perpendicular driving forces that help to produce an optimal separation of solutes in a mixture [Giddings, Science 1993; 260:1456–1465]. For Couette flow based devices, the ratio of the velocities of the capillary walls offers an extra parameter that can be exploited to enhance the efficiency of EFFF applications. The analysis of the effects of this parameter on optimal separation times is the subject of this contribution. The use of this additional parameter increases flexibility in the design of new devices for improving the separation of solutes such as proteins, DNA, and pharmaceuticals, as will be illustrated by the results of this analysis (Jaroszeski et al., 2000; Trinh et al., 1999). The analysis has been illustrated by selecting parameter values that represent a number of potentially useful applications. A set of five parameters (i.e., z, the valence; µ, electrophoretic mobility; Pe, Peclet number; Ω, the orthogonal applied electrical field; and R, the ratio of channel wall velocities) has been combined to obtain the best operating conditions for optimal separation of solutes. Results indicate that R, the ratio of the channel wall velocities, is actually the most important driving parameter.

17.
朱奥  郭建华  王淑莹  彭永臻 《化工学报》2013,64(4):1387-1395
A new algorithm is proposed that combines global optimization (a genetic algorithm) with local optimization (a quasi-Newton method) to achieve robust, fast parameter estimation for systems of ordinary differential equations with initial values. Using this algorithm, parameter estimation was successfully performed for the dynamic variation of dissolved oxygen (DO) in the two-step nitrification model developed here, reaching a correlation of 0.9955. The reliability of the estimation results was analyzed by comparing the parameter confidence intervals obtained from the Fisher information matrix with those obtained by direct search. The results show that most parameters can be estimated reliably with this method, while only two parameters can be estimated but not reliably, providing a new verification approach for parameter estimation results in kinetic systems. The simulated DO results can serve as a soft-sensing tool, providing ample process information on the full-course variation of readily biodegradable COD, ammonia nitrogen, nitrite nitrogen, and nitrate nitrogen during the process.
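The global-plus-local strategy (a population-based global search followed by a quasi-Newton polish) can be sketched on a toy two-parameter problem. This is a hedged Python sketch using random search as a stand-in for the genetic algorithm and SciPy's BFGS for the local step; the exponential-decay model and all numerical values are assumptions, not the paper's nitrification model:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical two-parameter decay model and synthetic noisy data.
def model(t, k, c0):
    return c0 * np.exp(-k * t)

rng = np.random.default_rng(42)
t = np.linspace(0, 10, 50)
y = model(t, 0.7, 5.0) + rng.normal(0, 0.05, t.size)

def sse(p):
    """Sum-of-squares objective for the parameter vector p = (k, c0)."""
    return float(np.sum((model(t, *p) - y) ** 2))

# Stage 1: crude global search over the parameter box
# (stand-in for the genetic algorithm in the abstract).
cands = rng.uniform([0.01, 0.1], [5.0, 10.0], size=(200, 2))
best = min(cands, key=sse)

# Stage 2: quasi-Newton (BFGS) polish from the best global candidate.
res = minimize(sse, best, method="BFGS")
k_hat, c0_hat = res.x
print(k_hat, c0_hat)
```

The global stage guards against local minima in the objective; the quasi-Newton stage then refines the best candidate quickly, which is the division of labor the abstract describes.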

18.
This paper considers sequential procedures to construct fixed-width confidence intervals for a function θ of the mean μ and variance σ² of a normal distribution. Consideration is devoted to θ = exp(μ + σ²/2) and θ = μ/σ. Nonlinear renewal theory is used to derive asymptotic expansions of the expectation of the stopping time and of the estimate as the width of the confidence interval decreases to zero. An improvement of the coverage probability is also discussed.

19.
Particle breakage during dense-phase comminution processes is significantly affected by mechanical multi-particle interactions, which are neglected in the traditional discrete linear population balance model (DL-PBM). A discrete non-linear PBM (DNL-PBM) has recently been proposed to account for multi-particle interactions; however, the inverse problem, i.e., the estimation of the model parameters, has not been addressed. In this paper, a method for the estimation of DNL-PBM parameters is presented with the purpose of determining the consequences of neglecting multi-particle interactions in the traditional DL-PBM. The model parameters were obtained from a constrained, non-linear, least-squares minimization of the residuals between comminution data and the discrete PBM prediction. Comminution data exhibiting multi-particle interactions were obtained from a DNL-PBM simulation followed by addition of 0%, 10%, and 20% random error. A comprehensive statistical analysis of the goodness of fit and certainty of the parameters was performed to discriminate the models. Using the estimated parameters, the predictive capability of both models was further assessed by comparing their predictions with additional computer-generated data obtained with a different feed particle size distribution. The parameter estimation method was shown to be highly accurate and robust. The DNL-PBM can predict the influence of different feed conditions better than the DL-PBM when multi-particle interactions are significant. This study has demonstrated that neglecting multi-particle interactions in dense-phase comminution processes via the use of the DL-PBM can lead to falsified kinetics with erroneous breakage functions.

20.
The adaptive input design (also called online redesign of experiments) for parameter estimation is very effective for the compensation of uncertainties in nonlinear processes. Moreover, it enables substantial savings in experimental effort and greater reliability in modeling. We present theoretical details and experimental results from real-time adaptive optimal input design for parameter estimation. The case study considers the separation of three benzoates by reverse-phase liquid chromatography. Following a receding horizon scheme, adaptive D-optimal input designs are generated for a precise determination of competitive adsorption isotherm parameters. Moreover, numerical techniques for the regularization of arising ill-posed problems, e.g. due to scarce measurements, lack of prior information about parameters, low sensitivities, and parameter correlations, are discussed. The estimated parameter values are successfully validated by Frontal Analysis, and the benefits of optimal input designs are highlighted when compared to various standard/heuristic input designs in terms of parameter accuracy and precision.
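The idea behind D-optimal design can be illustrated in its simplest form: for a one-parameter model, maximizing the determinant of the Fisher information reduces to maximizing the squared parameter sensitivity of the output. This is an illustrative sketch with an assumed model y(t) = exp(−kt) and an assumed nominal k, not the chromatography system of the paper:

```python
import numpy as np

# For y(t) = exp(-k t), the sensitivity is dy/dk = -t exp(-k t), so the
# Fisher information contributed by a single sampling time t is
# proportional to (t exp(-k t))^2. A D-optimal design samples where it peaks.
k = 0.5  # assumed nominal parameter value (designs depend on this guess)

def sensitivity_sq(t):
    return float((t * np.exp(-k * t)) ** 2)

grid = np.linspace(0.1, 10.0, 200)
t_opt = float(max(grid, key=sensitivity_sq))
print(t_opt)  # the sensitivity peaks near t = 1/k = 2
```

Because the design depends on the unknown parameter through the nominal guess, redesigning the input online as estimates improve, as the abstract describes, is the natural adaptive extension.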
