20 similar documents found (search time: 15 ms)
1.
This paper presents an assessment of efficient response surface techniques based on the High Dimensional Model Representation (HDMR) and the Factorized High Dimensional Model Representation (FHDMR). The HDMR is a general set of quantitative model assessment and analysis tools for capturing the high-dimensional relationships between sets of input and output model variables. It is a very efficient formulation of the system response if higher-order variable correlations are weak and the response function is dominantly additive in nature, allowing the physical model to be captured by the first few lower-order terms. But if the multiplicative nature of the response function is dominant, then all the right-hand-side components of the HDMR must be used to obtain the best result. However, if the HDMR requires all of its components, 2^N of them, to reach the desired accuracy, making the method very expensive in practice, then the FHDMR can be used. The component functions of the FHDMR are determined by using the component functions of the HDMR. This paper presents the formulation of the FHDMR based response surface approximation of a limit state/performance function which is dominantly multiplicative in nature. Conventional methods for reliability analysis are computationally very demanding when applied in conjunction with complex finite element models. This study aims to assess how accurately and efficiently HDMR/FHDMR based response surface techniques can capture complex model output uncertainty. As part of this effort, the efficacy of the HDMR, which has recently been applied to reliability analysis, is also demonstrated. The response surface is constructed using the moving least squares interpolation formula by including constant, first-order, and second-order terms of the HDMR and the FHDMR. Once the response surface form is defined, the failure probability can be obtained by statistical simulation. Results of seven numerical examples involving structural/solid-mechanics/geo-technical engineering problems indicate that, for a limit state/performance function that is dominantly multiplicative in nature, the failure probability obtained using the FHDMR based response surface method agrees closely with that from the conventional Monte Carlo method, while requiring fewer original model simulations.
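For reference, the technique rests on the standard HDMR expansion of an N-variable response (generic notation, not quoted verbatim from the paper):

    g(x) = g_0 + \sum_{i=1}^{N} g_i(x_i) + \sum_{1 \le i < j \le N} g_{ij}(x_i, x_j) + \cdots + g_{12\cdots N}(x_1, x_2, \ldots, x_N)

Here g_0 is a constant, the g_i capture the independent effects of the individual inputs, and the higher-order terms capture cooperative effects; the full expansion contains 2^N component functions, which is why truncation after the first- or second-order terms, as done for the response surfaces above, is attractive when higher-order correlations are weak.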
2.
Phase I analysis of a control chart implementation comprises parameter estimation, chart design, and outlier filtering, which are performed iteratively until reliable control limits are obtained. These control limits are then used in Phase II for online monitoring and prospective analyses of the process to detect out-of-control states. Although a Phase I study is required only when the true values of the parameters of a process are unknown, this is the case in many practical applications. In the literature, research on the effects of parameter estimation (a component of Phase I analysis) on control chart performance has recently gained importance. However, these studies assume the availability of complete and clean data sets for estimation, without outliers or missing observations. In this article, we consider autoregressive models of order 1 (AR(1)) and study the effects of two extreme cases of Phase I analysis: the case where all outliers are filtered from the data set (parameter estimation from incomplete but clean data) and the case where all outliers remain in the data set during estimation. The performance of the maximum likelihood and conditional sum of squares estimators is evaluated, and the effects on Phase II use are investigated. Results indicate that the effect of not detecting outliers in Phase I can be severe on the Phase II application of a control chart. A real-world example is provided to illustrate the importance of an appropriate Phase I analysis.
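As a rough illustration of the two estimators compared above, the sketch below fits an AR(1) model x_t = c + phi*x_{t-1} + eps_t by conditional sum of squares (ordinary least squares on the lagged series) and by exact Gaussian maximum likelihood; the simulated data and all variable names are illustrative assumptions, not the article's setup.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate a clean AR(1) Phase I data set (illustrative only)
n, c_true, phi_true, sigma_true = 500, 1.0, 0.6, 1.0
x = np.empty(n)
x[0] = c_true / (1 - phi_true)
for t in range(1, n):
    x[t] = c_true + phi_true * x[t - 1] + rng.normal(0.0, sigma_true)

# Conditional sum of squares: OLS of x_t on (1, x_{t-1})
X = np.column_stack([np.ones(n - 1), x[:-1]])
c_css, phi_css = np.linalg.lstsq(X, x[1:], rcond=None)[0]

# Exact Gaussian likelihood, including the stationary density of x_0
def neg_loglik(theta):
    c, phi, log_sigma = theta
    sigma = np.exp(log_sigma)
    if abs(phi) >= 1.0:
        return np.inf
    mu0 = c / (1 - phi)              # stationary mean
    var0 = sigma**2 / (1 - phi**2)   # stationary variance
    ll = -0.5 * (np.log(2 * np.pi * var0) + (x[0] - mu0) ** 2 / var0)
    resid = x[1:] - c - phi * x[:-1]
    ll -= 0.5 * np.sum(np.log(2 * np.pi * sigma**2) + resid**2 / sigma**2)
    return -ll

res = minimize(neg_loglik, x0=[c_css, phi_css, 0.0], method="Nelder-Mead")
print(f"CSS: c = {c_css:.3f}, phi = {phi_css:.3f}")
print(f"MLE: c = {res.x[0]:.3f}, phi = {res.x[1]:.3f}")
```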
3.
In real applications, we may be confronted with the problem of informative censoring. The Koziol-Green model is commonly used to model the information contained in informative censoring. However, the proportionality assumption imposed by the Koziol-Green model (see (1.2) below) is “too restrictive in that it limits the scope of the Cox model in practice” (see Subramanian, 2000). In this paper, we try to relax the proportionality condition of the Koziol-Green model by modeling the censorship semiparametrically. It is shown that our suggested semiparametric censoring model is an applicable extension of the Koziol-Green model. Through a close connection with logistic regression, our model assumptions can readily be checked in practice. We also propose estimators of both the regression parameter and the cumulative baseline hazard function that incorporate the additional information contained in the semiparametric censorship model. Simulations and the analysis of a real data set confirm the applicability of the suggested model and estimators.
This research is partially supported by NSF Grant DMS-0772292.
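For context, the proportionality assumption referred to as (1.2) above is, in its standard form (stated here from the general literature, not quoted from the paper), that the censoring survival function is a power of the failure survival function,

    S_C(t) = [S_T(t)]^\beta  for some constant \beta > 0,

so the censoring mechanism is informative about the failure time only through the single exponent β; the semiparametric model proposed here relaxes this single-constant link, and, per the abstract, its assumptions can be checked through a connection with logistic regression.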
4.
Abdallah A. Abdel Ghaly Hanan M. Aly Rana N. Salah 《Quality and Reliability Engineering International》2016,32(3):1095-1108
Accelerated Life Testing (ALT) has long been used in several fields to obtain information on the reliability of product components and materials under operating conditions in a much shorter time. One of the main purposes of applying ALT is to estimate the failure time functions and reliability performance under normal conditions. This paper concentrates on estimation procedures under ALT and on how to select the estimation method that gives the most accurate estimates of the reliability function. For this purpose, different estimation methods are used: maximum likelihood (ML), least squares (LS), weighted LS, and probability weighted moments. Moreover, the reliability function under usual conditions is predicted. The estimation procedures are applied to the family of exponentiated distributions in general and to the exponentiated inverted Weibull (EIW) distribution as a special case. Numerical analysis including simulated data and a real life data set is conducted to compare the performance of these four methods. It is found that the ML method gives the best results among the four estimation methods. Finally, a comparison between the EIW and the Inverted Weibull (IW) distributions based on a real life data set is made using a likelihood ratio test. It is observed that the EIW distribution can provide a better fit than the IW in the case of ALT. Copyright © 2015 John Wiley & Sons, Ltd.
5.
S. Sarkani T.A. Mazzuchi D. Lewandowski D.P. Kihl 《Engineering Fracture Mechanics》2007,74(18):2971-2980
Those individual tests in a series of experiments that are not completed or must be suspended are known as “censored points” or “runouts”. Inclusion of runouts in data analysis can be problematic, and such ad hoc approaches as ignoring the runout observation or treating it as a failure can significantly affect estimation. The methodology offered here simplifies the handling of runouts and censored data by using the maximum likelihood estimation (MLE) method to incorporate the censored data properly. The methodology is illustrated with an example problem using actual data, and the effects of the ad hoc approaches are demonstrated.
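A minimal sketch of the censored-data likelihood idea described above, assuming a two-parameter Weibull fatigue-life model (the distribution choice and the sample values are illustrative assumptions, not the paper's data): failed specimens contribute the log-density and runouts contribute the log-survival function.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Illustrative fatigue lives (cycles); delta = 1 for failures, 0 for runouts
t = np.array([1.2e5, 2.3e5, 3.1e5, 4.0e5, 5.5e5, 7.0e5, 1.0e6, 1.0e6])
delta = np.array([1, 1, 1, 1, 1, 1, 0, 0])  # last two tests were suspended

def neg_loglik(theta):
    log_shape, log_scale = theta
    shape, scale = np.exp(log_shape), np.exp(log_scale)  # keep both positive
    logf = weibull_min.logpdf(t, shape, scale=scale)  # failures: log density
    logS = weibull_min.logsf(t, shape, scale=scale)   # runouts: log survival
    return -np.sum(delta * logf + (1 - delta) * logS)

res = minimize(neg_loglik, x0=[0.0, np.log(t.mean())], method="Nelder-Mead")
shape_hat, scale_hat = np.exp(res.x)
print(f"Weibull shape = {shape_hat:.2f}, scale = {scale_hat:.3g} cycles")
```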
6.
Mazen Nassar Sanku Dey Saralees Nadarajah 《Quality and Reliability Engineering International》2021,37(6):2853-2874
Accelerated life testing is an efficient tool frequently adopted for obtaining failure time data of test units in a shorter time period than under normal use conditions. We assume that the lifetime of a product at a constant stress level follows an exponentiated Poisson-exponential distribution and that the shape parameter of the model has a log-linear relationship with the stress level. Model parameters, the reliability function (RF), and the mean time to failure (MTTF) function under use conditions are estimated based on eight frequentist methods of estimation, namely, maximum likelihood, least squares, weighted least squares, maximum product of spacings, minimum spacing absolute-log distance, Cramér-von-Mises, Anderson–Darling, and right-tail Anderson–Darling. The performance of the different estimation methods is evaluated in terms of their mean relative estimate and mean squared error using small and large sample sizes through a Monte Carlo simulation study. Finally, two accelerated life test data sets are considered, and bootstrap confidence intervals are obtained for the unknown parameters, the predicted shape parameter, the predicted RF, and the MTTF at different stress levels.
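Of the eight criteria listed, the maximum product of spacings estimator may be the least familiar; for an ordered sample x_(1) ≤ ... ≤ x_(n) with model CDF F(·; θ) it maximizes the log-spacings sum (standard definition, given here only for reference):

    M(\theta) = \sum_{i=1}^{n+1} \log\left[ F(x_{(i)}; \theta) - F(x_{(i-1)}; \theta) \right],  with F(x_{(0)}; \theta) := 0 and F(x_{(n+1)}; \theta) := 1.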
7.
In the present paper, a mixture form of the factor analysis model is developed under the maximum-likelihood framework. In this new model structure, different noise levels of the process variables are considered. Afterward, the developed mixture factor analysis model is utilized for process monitoring. To enhance the monitoring performance, a soft combination strategy based on Bayesian inference is then proposed to integrate the different local monitoring results into a single monitoring chart. To test the modeling and monitoring performance of the proposed mixture factor analysis method, a numerical example and the Tennessee Eastman (TE) benchmark case studies are provided.
8.
Estimation of the parameters of the generalized inverted exponential distribution is considered under a constant stress accelerated life test. Besides the maximum likelihood method, nine different frequentist methods of estimation are used to estimate the unknown parameters. Moreover, the reliability function is estimated under use conditions based on the different methods of estimation. We perform extensive simulation experiments to assess the performance of the proposed estimators. As an illustration, a real data set is analyzed to demonstrate how the proposed methods may work in practice.
9.
J. M. A. Pinto J. C. F. Pujol C. A. Cimini Jr 《Fatigue & Fracture of Engineering Materials & Structures》2014,37(1):85-94
This paper introduces a numerical model to estimate fatigue life under step-stress conditions, using the Weibull and lognormal distributions. The maximum likelihood method was used to estimate the free parameters of the distributions. The model was fitted to experimental fatigue life data for specimens of SAE 8620 steel, using evolutionary computation to optimize the likelihood function. Results are reported for the parameter values and their confidence intervals. A validation of the model using an analysis of residuals is also discussed.
10.
11.
A cumulative model of fatigue crack growth
G Glinka 《International Journal of Fatigue》1982,4(2):59-67
A model of fatigue crack growth based on an analysis of elastic/plastic stress and strain at the crack tip is presented. It is shown that the fatigue crack growth rate can be calculated by means of the local stress/strain at the crack tip. The local stress and strain calculations are based on the general solutions given by Hutchinson, Rice and Rosengren. It is assumed that a small highly strained area existing at the crack tip is responsible for the fatigue crack growth. It is also assumed that the fatigue crack growth rate depends mainly on the width, x1, of the highly strained zone and on the strain range within the zone. A relationship between the stress intensity factor K and the local strain and stress has been developed, making it possible to calculate the local strain for a variety of crack problems. The number of cycles N1 required for material failure inside the highly strained zone is then calculated, and the fatigue crack growth rate is obtained as the ratio x1/N1. The calculated fatigue crack growth rates were compared to experimental ones. Two alloy steels and two aluminium alloys were analyzed. Good agreement between experimental and theoretical results is obtained.
12.
A quadratically constrained least squares method for updating structural computational models is proposed. Under the constraints that the mass and stiffness matrices satisfy the orthogonality conditions and the eigenvalue equation, the method minimizes the norm of the correction matrices, turning the model updating problem into a least squares problem with quadratic constraints. Using the singular value decomposition, numerical algorithms for updating the structural computational model are given for the two cases in which the mode shapes do and do not need to be expanded, and numerical experiments are carried out. The results show that the new algorithm is accurate and ensures that the first m modal parameters of the updated model agree well with the measured values.
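A generic statement of the quadratically constrained least squares problem described above (the notation is mine, not the paper's): given the analytical mass and stiffness matrices M and K and measured modal data Φ (mode shapes) and Λ (eigenvalues), find corrections ΔM and ΔK that

    minimize   ||\Delta M||_F^2 + ||\Delta K||_F^2
    subject to \Phi^T (M + \Delta M) \Phi = I,
               (K + \Delta K) \Phi = (M + \Delta M) \Phi \Lambda,

i.e., the updated model must reproduce the measured modal data through the eigenvalue equation and satisfy mass orthogonality, while the corrections stay as small as possible in norm.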
13.
N. HUYEN L. FLACELIERE F. MOREL 《Fatigue & Fracture of Engineering Materials & Structures》2008,31(1):12-28
The work proposed in this paper is a possible way of modelling some local observations at the surface of mild steel specimens subjected to uniaxial and multiaxial loads. It is clearly seen that local plasticity, controlled by local microstructural heterogeneities, plays a fundamental role in microcrack nucleation, and that the damage orientation is closely related to the applied loading mode. The framework of irreversible thermodynamics with internal variables for time-independent, isothermal and small deformations has been used to build a critical plane damage model by assuming the existence of a link between mesoplasticity and mesodamage. Non-associated plasticity and damage rules allow the evolution of some plastic slip before any damage nucleation, as seen in the observations. A key feature of this proposal is the capacity to reflect nonlinear damage accumulation under variable amplitude loading.
14.
15.
Haitao Guo Simon Watson Jiangping Xiang 《Reliability Engineering & System Safety》2009,94(6):1057-1063
Reliability has an impact on wind energy project costs and benefits. Both life test data and field failure data can be used for reliability analysis. In the wind energy industry, wind farm operators have a strong interest in recording wind turbine operating data. However, field failure data may be tainted or incomplete, and therefore a more general mathematical model, and algorithms to solve it, are needed. The aim of this paper is to provide a solution to this problem. A three-parameter Weibull failure rate function is discussed for wind turbines, and the parameters are estimated by maximum likelihood and least squares. Two populations of German and Danish wind turbines are analyzed. The traditional Weibull failure rate function is also employed for comparison. The analysis shows that the three-parameter Weibull function can describe the reliability growth of wind turbines more accurately. This work will be helpful in understanding the reliability growth of wind energy systems as wind energy technologies evolve. The proposed three-parameter Weibull function is also applicable to life tests of components that have already been used for a period of time, not only in wind energy but also in other industries.
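As an illustration of the three-parameter (shape, location, scale) Weibull fit discussed above, the sketch below uses maximum likelihood through scipy; the synthetic failure times are placeholders, not the German or Danish wind turbine data.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)

# Illustrative times to failure; a positive location mimics an early failure-free period
data = weibull_min.rvs(1.8, loc=30.0, scale=200.0, size=300, random_state=rng)

# Three-parameter maximum likelihood fit: shape, location, scale
shape3, loc3, scale3 = weibull_min.fit(data)
print(f"three-parameter: shape={shape3:.2f}, location={loc3:.1f}, scale={scale3:.1f}")

# Traditional two-parameter fit for comparison (location fixed at zero)
shape2, _, scale2 = weibull_min.fit(data, floc=0.0)
print(f"two-parameter:   shape={shape2:.2f}, scale={scale2:.1f}")

# Failure rate (hazard) of the fitted three-parameter model on a coarse time grid
tgrid = np.linspace(loc3 + 1.0, data.max(), 5)
hazard = weibull_min.pdf(tgrid, shape3, loc3, scale3) / weibull_min.sf(tgrid, shape3, loc3, scale3)
print(np.round(hazard, 5))
```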
16.
A new method to estimate case specific prediction uncertainty for univariate trilinear partial least squares (tri-PLS1) regression is introduced. This method is, from a theoretical point of view, the most exact finite sample approximation to true prediction uncertainty that has been reported to date. Using the new method, different error sources can be propagated, which is an advantage that cannot be offered by data driven approaches such as the bootstrap. In a concise example, it is illustrated how the method can be applied. In the Appendix, efficient algorithms are presented to compute the required estimates.
17.
This article introduces a method for local sensitivity analysis of practical interest. A theorem is given that provides a general and neat way to obtain all sensitivities of a general nonlinear programming problem (around a local minimum) with respect to any parameter, whether it is a right-hand side, objective function, or constraint constant. The method is based on the well-known duality property of mathematical programming, which states that the partial derivatives of the primal objective function with respect to the constraints' right-hand side parameters are the optimal values of the dual problem variables. To make the parameters or data for which sensitivities are sought appear on the right-hand side, they are converted into artificial variables and set to their actual values, thus obtaining the desired constraints. If the problem is degenerate and the partial derivatives do not exist, the method still permits obtaining the right, left, and directional derivatives, if they exist. In addition to its general applicability, the method is also computationally inexpensive because the necessary information becomes available without extra calculations. Moreover, locally valid analytical relations among the sensitivities are obtained straightforwardly. It is also shown how the roles of the objective function and any of the active constraints (equality or inequality) can be exchanged, leading to equivalent optimization problems. This permits obtaining the sensitivities of any constraint with respect to the parameters without repeating the calculations. The method is illustrated by its application to two examples, one degenerate and the other a competitive market.
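The duality property the method rests on can be stated compactly: for the optimal value z*(a, b) of the problem

    minimize_x f(x)   subject to h(x) = a,  g(x) ≤ b,

with Lagrangian L = f(x) + λ^T (h(x) − a) + μ^T (g(x) − b), at a regular, non-degenerate local minimum one has ∂z*/∂a = −λ* and ∂z*/∂b = −μ*, so the sensitivities to right-hand-side constants are read directly from the optimal multipliers (dual variables). A parameter that appears anywhere else is first moved to a right-hand side by introducing an artificial variable fixed at the parameter's value, which is exactly the device described above.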
18.
Lonny L. Thompson Peter M. Pinsky 《International journal for numerical methods in engineering》1995,38(3):371-397
In this paper a Galerkin least-squares (GLS) finite element method, in which residuals in least-squares form are added to the standard Galerkin variational equation, is developed to solve the Helmholtz equation in two dimensions. An important feature of GLS methods is the introduction of a local mesh parameter that may be designed to provide accurate solutions with relatively coarse meshes. Previous work has accomplished this for the one-dimensional Helmholtz equation using dispersion analysis. In this paper, the selection of the GLS mesh parameter for two dimensions is considered, and leads to elements that exhibit improved phase accuracy. For any given direction of wave propagation, an optimal GLS mesh parameter is determined using two-dimensional Fourier analysis. In general problems, the direction of wave propagation will not be known a priori. In this case, an optimal GLS parameter is found which reduces phase error for all possible wave vector orientations over elements. The optimal GLS parameters are derived for both consistent and lumped mass approximations. Several numerical examples are given and the results compared with those obtained from the Galerkin method. The extension of GLS to higher-order quadratic interpolations is also presented.
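Schematically, writing the Helmholtz operator as L u = −Δu − k²u, a Galerkin least-squares formulation adds element-level residual terms, weighted by the mesh parameter τ, to the standard Galerkin equation (generic GLS structure; the specific two-dimensional design of τ is the paper's contribution):

    B(w_h, u_h) + \sum_e \tau \int_{\Omega_e} (L w_h)(L u_h) \, d\Omega = F(w_h) + \sum_e \tau \int_{\Omega_e} (L w_h) f \, d\Omega,

where B and F are the usual Galerkin bilinear and load forms and the sums run over element interiors; choosing τ well is what restores phase accuracy on relatively coarse meshes.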
19.
Spandan Maiti 《Engineering Fracture Mechanics》2005,72(5):691-708
A cohesive failure model is proposed to simulate fatigue crack propagation in polymeric materials. The model relies on the combination of a bi-linear cohesive failure law used for fracture simulations under monotonic loading and an evolution law relating the cohesive stiffness, the rate of crack opening displacement and the number of cycles since the onset of failure. The fatigue component of the cohesive model involves two parameters that can be readily calibrated based on the classical log-log Paris failure curve between the crack advance per cycle and the range of applied stress intensity factor. The paper also summarizes a semi-implicit implementation of the cohesive model into a cohesive-volumetric finite element framework, allowing for the simulation of a wide range of fatigue fracture problems.
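The calibration target mentioned above is the classical Paris relation, which on a log-log plot of crack advance per cycle against the applied stress intensity factor range is a straight line:

    da/dN = C (\Delta K)^m,

so the two fatigue parameters of the cohesive model are tuned to reproduce the measured coefficient C and exponent m.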
20.
V. Adrov 《International Journal of Fatigue》1993,15(6):451-453
A new damage parameter is proposed for fatigue life prediction using a local stress-strain approach. This parameter has a physical energy basis, and makes it possible to obtain the same accuracy as, and better life assessments than, the well-known Smith et al. parameter, while using significantly less calculation time for load history treatment. Comparisons of life predictions obtained using the proposed parameter with experimental results and other predictions are presented.