Similar Documents (20 records retrieved)
1.
Attempts to model the variation of a random response in terms of the factors that affect it and its own self-generated variability constitute the bulk of the scientific and engineering research effort. This is particularly true of quality engineering, where modeling of a response is often required to solve quality problems or to improve quality. While models that are derived from established domain-specific theories are commonly used in various disciplines, a pragmatic approach may be conceived that assembles under a single general model features that are shared by models developed in disparate and unrelated disciplines.

In this paper, we develop a new approach compatible with this concept. On the basis of recently developed inverse normalizing transformations, the new model provides the quantile relationship between a response (the dependent variable) and the factors that affect it (the independent variables), assuming only that this relationship is either uniformly convex or concave. Furthermore, two independent sources of variation, one internal and one external, are assumed to account for the observed response variation. We demonstrate the validity of the new approach by showing that the new model is a generalization of models currently in use in three disparate engineering disciplines: hardware reliability, software reliability, and chemical engineering. Using previously published data sets, we demonstrate the modeling competence of the new approach.
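As an illustration only (the paper's exact RMM specification is not reproduced here), the Python sketch below simulates a convex response driven by one internal multiplicative source and one external additive source, then recovers the median curve; all parameter values are hypothetical.

    # Hypothetical parameter values; the model form is an illustrative
    # convex quantile relationship, not the paper's exact RMM specification.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    x = np.linspace(0.5, 3.0, 200)                     # affecting factor
    b0, b1, s1, s2 = 0.2, 0.8, 0.10, 0.05              # assumed true values
    y = (np.exp(b0 + b1 * x + s1 * rng.standard_normal(200))
         + s2 * rng.standard_normal(200))              # two variation sources

    def median_curve(x, b0, b1):
        # The internal log-normal source has median 1, so the conditional
        # median is the convex curve exp(b0 + b1 * x).
        return np.exp(b0 + b1 * x)

    popt, _ = curve_fit(median_curve, x, y, p0=[0.0, 1.0])
    print("estimated (b0, b1):", popt)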

2.
The data-transformation approach and generalized linear modeling both require specification of a transformation prior to deriving the linear predictor (LP). By contrast, response modeling methodology (RMM) requires no such specification. Furthermore, RMM effectively decouples modeling of the LP from modeling its relationship to the response. It may therefore be of interest to compare LPs obtained by the three approaches. Based on numerical quality problems that have appeared in the literature, these approaches are compared in terms of both the derived structure of the LPs and goodness-of-fit statistics. The relative advantages of RMM are discussed. Copyright © 2007 John Wiley & Sons, Ltd.
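A minimal sketch of two of the three LP routes using standard libraries (RMM has no off-the-shelf implementation, so it is omitted); the data and coefficients are hypothetical, and the link spelling assumes a recent statsmodels release.

    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(1)
    x = rng.uniform(1, 5, 150)
    y = np.exp(0.3 + 0.5 * x) * rng.lognormal(sigma=0.1, size=150)
    X = sm.add_constant(x)

    # Route 1: transform the response first, then fit an ordinary LP.
    y_bc, lam = stats.boxcox(y)
    ols = sm.OLS(y_bc, X).fit()

    # Route 2: a GLM keeps y on its original scale; the log link defines the LP.
    glm = sm.GLM(y, X,
                 family=sm.families.Gamma(link=sm.families.links.Log())).fit()

    print("Box-Cox lambda:", round(lam, 3))
    print("transformation-route LP:", ols.params)
    print("GLM-route LP:", glm.params)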

3.
Statistical process control monitoring of nonlinear relationships (profiles) has been the subject of much research recently. While attention is primarily given to the statistical aspects of the monitoring techniques, little effort has been devoted to developing a general modeling approach that would introduce 'uniformity of practice' in modeling nonlinear profiles (analogously with the three-sigma limits of Shewhart control charts). In this article, we use response modeling methodology (RMM) to demonstrate implementation of this approach to statistical process control monitoring of ecological relationships. Using 10 ecological models that have appeared in the literature, it is first shown that RMM models can replace (approximate) current ecological models with negligible loss in accuracy. Computer simulation is then used to demonstrate that estimated RMM models and estimated data-generating ecological models achieve goodness-of-fit that is practically indistinguishable from one another. A regression-adjusted control scheme, based on control charts for the predicted median and for residuals variation, is developed and demonstrated for three types of 'out of control' scenarios. Copyright © 2013 John Wiley & Sons, Ltd.
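A minimal sketch of the regression-adjusted monitoring idea, assuming a generic fitted profile (the exponential curve below is a stand-in, not one of the paper's ecological models): Phase I in-control residuals set the limits, and Phase II residuals are charted against them.

    import numpy as np

    def shewhart_limits(resid):
        """Center line and three-sigma limits for a residuals chart."""
        mu, sigma = resid.mean(), resid.std(ddof=1)
        return mu, mu - 3 * sigma, mu + 3 * sigma

    rng = np.random.default_rng(2)
    x = np.linspace(0, 10, 100)
    fitted_median = 5 * np.exp(-0.3 * x)        # stand-in for the fitted profile

    # Phase I: in-control residuals set the limits.
    resid_phase1 = rng.normal(0, 0.2, x.size)
    cl, lcl, ucl = shewhart_limits(resid_phase1)

    # Phase II: monitor residuals of new profiles (here, a shifted process).
    y_new = fitted_median + rng.normal(0.4, 0.2, x.size)
    resid_new = y_new - fitted_median
    print("signals at:", np.where((resid_new < lcl) | (resid_new > ucl))[0])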

4.
Response Modeling Methodology (RMM) is a general platform for modeling monotone convex relationships. In this article, RMM is combined with linear regression analysis to model and estimate linear predictors (LPs) embedded in a nonlinear profile. A regression-adjusted statistical process control scheme is then implemented to monitor the LP's residuals. To model and estimate the LP, RMM defines a Taylor series expansion of an unknown response transformation and then uses canonical correlation analysis to estimate the LP. A possible hindrance to implementing the new scheme is the occurrence of nonnormal errors (in violation of the linear regression model). Reasons for the occurrence of this phenomenon are explored and remedies offered. The effectiveness of the new scheme is demonstrated for data generated via Monte Carlo simulation. Results from hypothesis testing clearly indicate that the type of the response distribution, its skewness, and the sample size do not affect the effectiveness of the new approach. A detailed implementation routine is expounded, accompanied by a numerical example. When interest is solely focused on the stability of the LP, and the nonlinear profile per se is of little interest, the new general RMM-based statistical process control scheme delivers an effective platform for process monitoring. Copyright © 2015 John Wiley & Sons, Ltd.
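A sketch of the estimation idea under stated assumptions: low-order powers of the response stand in for the truncated Taylor expansion of the unknown transformation, and scikit-learn's CCA pairs them with the predictors. The data-generating model is hypothetical, and the estimated LP is identified only up to sign and scale.

    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(3)
    X = rng.uniform(0, 1, size=(200, 3))               # candidate LP variables
    lp = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1]           # assumed true LP
    y = np.exp(0.5 * lp) + rng.normal(0, 0.05, 200)    # nonlinear profile

    Y_taylor = np.column_stack([y, y**2, y**3])        # truncated expansion of
    cca = CCA(n_components=1).fit(Y_taylor, X)         # the unknown transform
    lp_hat = cca.transform(Y_taylor, X)[1].ravel()     # LP up to sign and scale

    print("|corr(true LP, estimated LP)| =",
          abs(np.corrcoef(lp, lp_hat)[0, 1]))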

5.
Efforts to improve quality require that the factors affecting it be identified. This allows either removal of root causes of low quality or finding optimal settings for the investigated product or process. When the common assumptions of the normal scenario are not met, two alternative approaches are commonly pursued: normalization of data and the use of generalized linear models (GLM). Recently, a third alternative has been developed, which models a response subject to self-generated random variation and externally generated systematic and random variation. It is assumed that the relationship between the response and the externally generated variation is uniformly convex (or concave). A unique feature of the new model is that both its structure and the parameters' values are determined solely by the data on hand (no theory-based arguments are required). Here, we compare the effectiveness of the new methodology relative to current approaches when applied to response modelling in quality improvement efforts. We do this by using published data sets that have previously been analysed within the framework of either the normalizing approach or the GLM approach (or both). The relative merits of the new methodology are demonstrated and discussed.

6.
The problem of determining the optimal warranty period, assumed to coincide with the manufacturer's lower specification limit for the lifetime of the product, is addressed. It is assumed that the quantity sold depends via a Cobb–Douglas-type demand function on the sale price and on the warranty period, and that both the cost incurred for a non-conforming item and the sale price increase with the warranty period. A general solution is derived using Response Modeling Methodology (RMM) and a new approximation for the standard normal cumulative distribution function. The general solution is compared with the exact optimal solutions derived under various distributional scenarios. Relative to the exact optimal solutions, RMM-based solutions are accurate to at least the first three significant digits. Some exact results are derived for the uniform and the exponential distributions. Copyright © 2006 John Wiley & Sons, Ltd.
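A numerical sketch under assumed functional forms (the paper also lets the sale price and nonconformance cost increase with the warranty period, which is suppressed here): Cobb-Douglas demand, a normal lifetime, and a per-unit cost for in-warranty failures. All parameter values are hypothetical.

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import norm

    k, a, b = 1e4, -1.5, 0.4     # hypothetical Cobb-Douglas demand parameters
    p, c = 100.0, 60.0           # sale price and unit production cost
    c_nc = 40.0                  # cost of an in-warranty (non-conforming) item
    mu, sigma = 5.0, 1.2         # normal lifetime distribution (years)

    def neg_profit(w):
        q = k * p**a * w**b                   # demand grows with warranty w
        frac_fail = norm.cdf(w, mu, sigma)    # P(lifetime < w)
        return -q * (p - c - c_nc * frac_fail)

    res = minimize_scalar(neg_profit, bounds=(0.1, mu), method="bounded")
    print("optimal warranty period: %.2f years" % res.x)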

7.
A rigorous method for interpolating a set of parameterized linear structural dynamics reduced-order models (ROMs) is presented. By design, this method does not operate on the underlying set of parameterized full-order models. Hence, it is amenable to an online real-time implementation. It is based on mapping appropriately the ROM data onto a tangent space to the manifold of symmetric positive-definite matrices, interpolating the mapped data in this space and mapping back the result to the aforementioned manifold. Algorithms for computing the forward and backward mappings are offered for the case where the ROMs are derived from a general Galerkin projection method and the case where they are constructed from modal reduction. The proposed interpolation method is illustrated with applications ranging from the fast dynamic characterization of a parameterized structural model to the fast evaluation of its response to a given input. In all cases, good accuracy is demonstrated at real-time processing speeds. Copyright © 2009 John Wiley & Sons, Ltd.
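A sketch of the core mapping under the standard affine-invariant construction (the paper's preprocessing of the ROM data is omitted); the two matrices below are hypothetical reduced operators at two parameter values.

    import numpy as np
    from scipy.linalg import expm, inv, logm, sqrtm

    def log_map(P, X):
        """Map SPD matrix X to the tangent space at SPD reference point P."""
        Ph = sqrtm(P)
        Phi = inv(Ph)
        return Ph @ logm(Phi @ X @ Phi) @ Ph

    def exp_map(P, S):
        """Map tangent vector S at P back to the SPD manifold."""
        Ph = sqrtm(P)
        Phi = inv(Ph)
        return Ph @ expm(Phi @ S @ Phi) @ Ph

    def interp_spd(P0, P1, t):
        """Interpolate two SPD ROM matrices; log_map(P0, P0) is zero."""
        return exp_map(P0, t * log_map(P0, P1)).real

    A = np.array([[4.0, 1.0], [1.0, 3.0]])   # hypothetical reduced operators
    B = np.array([[6.0, 0.5], [0.5, 2.0]])
    print(interp_spd(A, B, 0.5))

Interpolating in the tangent space keeps the result symmetric positive-definite, which direct entrywise interpolation of the ROM matrices does not guarantee.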

8.
New model fusion techniques based on spatial-random-process modeling are developed in this work for combining multi-fidelity data from simulations and experiments. Existing works in multi-fidelity modeling generally assume a hierarchical structure in which the levels of fidelity of the simulation models can be clearly ranked. In contrast, we consider the nonhierarchical situation in which one wishes to incorporate multiple models whose levels of fidelity are unknown or cannot be differentiated (e.g., if the fidelity of the models changes over the input domain). We propose three new nonhierarchical multi-model fusion approaches with different assumptions or structures regarding the relationships between the simulation models and physical observations. One approach models the true response as a weighted sum of the multiple simulation models and a single discrepancy function. The other two approaches model the true response as the sum of one simulation model and a corresponding discrepancy function, and differ in their assumptions regarding the statistical behavior of the discrepancy functions, such as independence with the true response or a common spatial correlation function. The proposed approaches are compared via numerical examples and a real engineering application. Furthermore, the effectiveness and relative merits of the different approaches are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
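A sketch of the first fusion variant (weighted sum of simulators plus a single discrepancy function), with the discrepancy modeled as a GP; the simulators and physical data are synthetic stand-ins, and the two-step fit is a simplification of a joint estimation.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(4)
    x = np.sort(rng.uniform(0, 1, 30)).reshape(-1, 1)
    S = np.hstack([np.sin(3 * x), x**2])            # two simulation models
    y_phys = (0.7 * np.sin(3 * x) + 0.3 * x**2 + 0.1 * x).ravel() \
             + rng.normal(0, 0.02, 30)              # physical observations

    w, *_ = np.linalg.lstsq(S, y_phys, rcond=None)  # simulator weights
    resid = y_phys - S @ w                          # left for the discrepancy
    gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-4).fit(x, resid)

    x_new = np.array([[0.5]])
    S_new = np.hstack([np.sin(3 * x_new), x_new**2])
    print("fused prediction:", (S_new @ w + gp.predict(x_new)).item())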

9.
Variation exists in all processes. Significant work has been done to identify and remove sources of variation in manufacturing processes resulting in large returns for companies. However, business process optimization is an area that has a large potential return for a company. Business processes can be difficult to optimize due to the nature of the output variables associated with them. Business processes tend to have output variables that are binary, nominal or ordinal. Examples of these types of output include whether a particular event occurred, a customer's color preference for a new product and survey questions that assess the extent of the survey respondent's agreement with a particular statement. Output variables that are binary, nominal or ordinal cannot be modeled using ordinary least-squares regression. Logistic regression is a method used to model data where the output is binary, nominal or ordinal. This article provides a review of logistic regression and demonstrates its use in modeling data from a business process involving customer feedback. Copyright © 2006 John Wiley & Sons, Ltd.
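A direct sketch of the reviewed method on simulated customer-feedback data; the predictor and effect sizes are hypothetical.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    wait_time = rng.uniform(1, 30, 300)              # hypothetical predictor
    lin = 2.0 - 0.15 * wait_time                     # true linear predictor
    satisfied = rng.binomial(1, 1 / (1 + np.exp(-lin)))   # binary outcome

    X = sm.add_constant(wait_time)
    fit = sm.Logit(satisfied, X).fit(disp=False)
    print(fit.params)                                # intercept and slope
    print("odds ratio per minute of wait:", np.exp(fit.params[1]))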

10.
Time series analysis methods have been applied to a large number of practical problems, including modeling and forecasting economic time series and process and quality control. One aspect of time series analysis is the use of discrete linear transfer functions to model the interrelationships between input and output time series. This paper is an introduction to the identification, estimation, and diagnostic checking of these models. Some aspects of forecasting with transfer function models are also discussed. A survey of intervention analysis models in which the input series is an indicator variable corresponding to an isolated event thought to influence the output is also given. Familiarity with univariate autoregressive integrated moving average modeling is assumed. Extensions to more general multiple time series analysis methods are also briefly discussed.
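A minimal sketch fitting the ARMAX special case of a transfer function model via statsmodels' state-space machinery; full Box-Jenkins transfer functions allow rational lag polynomials, which this omits. The series is simulated.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    n = 300
    x = rng.standard_normal(n)                  # input series
    y = np.zeros(n)
    for t in range(1, n):                       # y_t = 0.6 y_{t-1} + 0.8 x_t + e_t
        y[t] = 0.6 * y[t - 1] + 0.8 * x[t] + 0.3 * rng.standard_normal()

    res = sm.tsa.SARIMAX(y, exog=x, order=(1, 0, 0)).fit(disp=False)
    print(res.params)                           # exog gain, AR(1) coef, sigma2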

11.
Gaussian process (GP) metamodels have been widely used as surrogates for computer simulations or physical experiments. The heart of GP modeling lies in optimizing the log-likelihood function with respect to the hyperparameters to fit the model to a set of observations. The complexity of the log-likelihood function, computational expense, and numerical instabilities challenge this process. These issues limit the applicability of GP models more when the size of the training data set and/or problem dimensionality increase. To address these issues, we develop a novel approach for fitting GP models that significantly improves computational expense and prediction accuracy. Our approach leverages the smoothing effect of the nugget parameter on the log-likelihood profile to track the evolution of the optimal hyperparameter estimates as the nugget parameter is adaptively varied. The new approach is implemented in the R package GPM and compared to a popular GP modeling R package (GPfit) for a set of benchmark problems. The effectiveness of the approach is also demonstrated using an engineering problem to learn the constitutive law of a hyperelastic composite where the required level of accuracy in estimating the response gradient necessitates a large training data set.
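A sketch of the nugget-tracking idea using scikit-learn rather than the GPM package: refit over a shrinking nugget (alpha) sequence, warm-starting each fit from the previous kernel optimum so the hyperparameter estimates are tracked along the progressively less-smoothed likelihood. The schedule and data are illustrative.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(7)
    X = rng.uniform(0, 1, (40, 2))
    y = np.sin(4 * X[:, 0]) + np.cos(3 * X[:, 1])

    kernel = RBF(length_scale=[1.0, 1.0])
    for alpha in [1e-1, 1e-2, 1e-4, 1e-8]:      # adaptively shrunk nugget
        gp = GaussianProcessRegressor(kernel=kernel, alpha=alpha).fit(X, y)
        kernel = gp.kernel_                     # warm start the next fit
        print(f"alpha={alpha:.0e}  length scales={gp.kernel_.length_scale}")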

12.
This paper reviews the evolution of off-line quality engineering methods with respect to one or more quality criteria, and presents some recent results. The fundamental premises that justify the use of robust product/process design are established with an illustrative example. The use of designed experiments to model quality criteria and their optimization is briefly reviewed. The fact that most design-for-quality problems involve multiple quality criteria motivates the development of multiobjective optimization techniques for robust parameter design. Two situations are considered: one in which response surface models for the quality characteristics can be obtained using regression and considered over a continuous factor space, and one in which the problem scenario and the experiment permit only discrete parameter settings for the design factors. In the former scenario, a multiobjective optimization technique based on the reference-point method is presented; this technique also incorporates an inference mechanism to deal with uncertainty in the response surface models caused by finite, noisy data. In the discrete-factors scenario, an efficient method to reduce computational complexity for a class of models is presented.
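A sketch of reference-point scalarization for two hypothetical quality criteria, using an augmented Tchebycheff distance to the aspiration point; the paper's inference mechanism for response-surface uncertainty is not shown.

    import numpy as np
    from scipy.optimize import minimize

    def f1(x):                                  # e.g., deviation from target
        return (x[0] - 1.0)**2 + 0.5 * x[1]**2

    def f2(x):                                  # e.g., response variability
        return 0.5 * (x[0] + x[1] - 2.0)**2 + 0.1 * x[0]**2

    z_ref = np.array([0.05, 0.05])              # decision-maker's reference point
    w = np.array([1.0, 1.0])                    # criterion weights
    rho = 1e-3                                  # augmentation coefficient

    def scalarized(x):
        d = w * (np.array([f1(x), f2(x)]) - z_ref)
        return d.max() + rho * d.sum()          # augmented Tchebycheff distance

    res = minimize(scalarized, x0=[0.0, 0.0], method="Nelder-Mead")
    print("compromise design:", res.x)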

13.
The problem of calculating the uncertainty in the dynamic response of a structure due to uncertainties related to the modeling of its dynamic behavior is addressed. Based on a Bayesian probabilistic approach, a new approximate numerical method is proposed to investigate the resulting uncertainties in the structural response. The proposed method provides a very efficient and accurate approach to the solution of stochastic finite-element models. It can be used to quantify the uncertainties in the predicted response of a structure during its design, where engineering judgement is used to quantify the uncertainties in the modeling process.
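For orientation only, a plain Monte Carlo sketch of the problem being approximated (the paper's Bayesian method is precisely a more efficient alternative to such brute-force sampling): stiffness uncertainty propagated through a single-DOF model, with all values hypothetical.

    import numpy as np

    rng = np.random.default_rng(8)
    m, c = 10.0, 20.0                            # known mass and damping
    f0, w_load = 100.0, 18.0                     # harmonic load and frequency
    k = rng.lognormal(np.log(4000.0), 0.1, 20000)    # uncertain stiffness samples

    # Steady-state amplitude of a damped SDOF oscillator under harmonic load.
    amp = f0 / np.sqrt((k - m * w_load**2)**2 + (c * w_load)**2)
    print("response amplitude: mean=%.4g, std=%.4g" % (amp.mean(), amp.std()))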

14.
Flexible non-linear regression techniques have been widely used for data-based modeling of chemical processes, and they form the basis of process design under the framework of response surface methodology (RSM). These non-linear models typically achieve more accurate approximation to the factor-response relationship than traditional polynomial regressions. However, non-linear models usually lack a clear interpretation as to how the factors contribute to the prediction of process response. This paper applies the technique of sensitivity analysis (SA) to facilitate the interpretation of non-linear process models. By recognizing that derivative-based local SA is only valid within the neighborhood of certain "nominal" values, global SA is adopted to study the entire range of the factors. Global SA is based on the decomposition of the model and the variance of response into contributing terms of main effects and interactions. Therefore, the effect of individual factors and their interactions can be both visualized by graphs and quantified by sensitivity indices. The proposed methodology is demonstrated on two catalysis processes where non-linear data-based models have been developed to aid process design. The results indicate that global SA is a powerful tool to reveal the impact of process factors on the response variables.
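A sketch of variance-based global SA with Sobol indices via the SALib package (the import path is version-dependent; factor names, bounds, and the response function are hypothetical stand-ins for a fitted process model).

    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 2,
        "names": ["temperature", "pressure"],
        "bounds": [[300.0, 400.0], [1.0, 5.0]],
    }

    X = saltelli.sample(problem, 1024)          # Saltelli design (N a power of 2)
    response = (np.sin(X[:, 0] / 50.0) + 0.3 * X[:, 1]
                + 0.1 * X[:, 0] * X[:, 1] / 500.0)

    Si = sobol.analyze(problem, response)
    print("main effects S1:", Si["S1"])         # individual factor contributions
    print("total effects ST:", Si["ST"])        # including interactions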

15.
A new approach to linear plate theory for layered structures is introduced. The new theory distances itself from the classical strategy, where through-the-thickness displacement functions are assumed a priori. Instead, the homogenization is based on a single assumption that the plate response is a superposition of fundamental states. An equivalent single-layer (ESL) formulation is presented, which, for the first time, allows accurate three-dimensional stress and strain fields to be recovered through a postprocessing strategy that is consistent with the theory's assumptions. Copyright © 2010 John Wiley & Sons, Ltd.

16.
Based on an analysis of how variation propagates and accumulates in the product manufacturing process and of the relationships among the factors influencing variation, graph theory is used to reduce the topological structure of influence relationships in the manufacturing process to a directed graph and an adjacency matrix, and modeling steps for an interpretive structural model (ISM) of manufacturing-process variation sources are proposed. An engineering application case is presented in which the adjacency matrix is established from a cause-and-effect diagram, the reachability matrix is computed programmatically using Boolean algebra, an interpretive structural model of quality-variation sources is built, and improvement priorities and technical recommendations are put forward.
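A sketch of the Boolean-algebra step described above: starting from a hypothetical adjacency matrix, iterate Boolean matrix products to a fixed point to obtain the reachability matrix.

    import numpy as np

    A = np.array([[0, 1, 0, 0],
                  [0, 0, 1, 1],
                  [0, 0, 0, 1],
                  [0, 0, 0, 0]])                # hypothetical cause-effect links

    R = ((A + np.eye(4, dtype=int)) > 0).astype(int)    # start from A + I
    while True:
        R_next = ((R @ R + R) > 0).astype(int)  # Boolean "or" over all paths
        if (R_next == R).all():
            break                               # fixed point reached:
        R = R_next                              # R is the reachability matrix

    print(R)

A level partition of R (comparing each element's reachability and antecedent sets) then yields the hierarchy from which improvement priorities are read off.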

17.
Technometrics, 2013, 55(4): 328-337
Formulation and evaluation of environmental policy depends on receptor models that are used to assess the number and nature of pollution sources affecting the air quality for a region of interest. Different approaches have been developed for situations in which no information is available about the number and nature of these sources (e.g., exploratory factor analysis) and the composition of these sources is assumed known (e.g., regression and measurement error models). We propose a flexible approach for fitting the receptor model when only partial pollution source information is available. The use of latent variable modeling allows the direct incorporation of subject matter knowledge into the model, including known physical constraints and partial pollution source information obtained from laboratory measurements or past studies. Because air quality data often exhibit temporal and/or spatial dependence, we consider the importance of accounting for such correlation in estimating model parameters and making valid statistical inferences. We propose an approach for incorporating dependence structure directly into estimation and inference procedures via a new nested block bootstrap method that adjusts for bias in estimating moment matrices. A goodness-of-fit test that is valid in the presence of such dependence is proposed. The application of the approach is facilitated by a new multivariate extension of an existing block size determination algorithm. The proposed approaches are evaluated by simulation and illustrated with an analysis of hourly measurements of volatile organic compounds in the El Paso, Texas/Ciudad Juarez, Mexico area.
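A sketch of the ordinary moving-block bootstrap on which the paper's nested, bias-adjusted variant builds; the block length and the autocorrelated series are illustrative.

    import numpy as np

    def moving_block_bootstrap(series, block_len, rng):
        """Resample a time series by concatenating random contiguous blocks."""
        n = len(series)
        starts = rng.integers(0, n - block_len + 1,
                              size=int(np.ceil(n / block_len)))
        blocks = [series[s:s + block_len] for s in starts]
        return np.concatenate(blocks)[:n]

    rng = np.random.default_rng(9)
    voc = np.cumsum(rng.normal(0, 1, 500)) * 0.05 + 10   # autocorrelated series
    boot_means = [moving_block_bootstrap(voc, 25, rng).mean()
                  for _ in range(1000)]
    print("block-bootstrap s.e. of the mean:", np.std(boot_means))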

18.
Metamodels are widely used to facilitate the analysis and optimization of engineering systems that involve computationally expensive simulations. Kriging is a metamodelling technique that is well known for its ability to build surrogate models of responses with non-linear behaviour. However, the assumption of a stationary covariance structure underlying Kriging does not hold in situations where the level of smoothness of a response varies significantly. Although non-stationary Gaussian process models have been studied for years in statistics and geostatistics communities, this has largely been for physical experimental data in relatively low dimensions. In this paper, the non-stationary covariance structure is incorporated into Kriging modelling for computer simulations. To represent the non-stationary covariance structure, we adopt a non-linear mapping approach based on parameterized density functions. To avoid over-parameterizing for the high dimension problems typical of engineering design, we propose a modified version of the non-linear map approach, with a sparser, yet flexible, parameterization. The effectiveness of the proposed method is demonstrated through both mathematical and engineering examples. The robustness of the method is verified by testing multiple functions under various sampling settings. We also demonstrate that our method is effective in quantifying prediction uncertainty associated with the use of metamodels. Copyright © 2006 John Wiley & Sons, Ltd.
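A sketch of the non-linear mapping idea with a fixed beta-CDF warp of the input (in a full method the warp parameters are estimated jointly with the GP, and a sparser parameterization is used in high dimensions); the test function and warp values are illustrative.

    import numpy as np
    from scipy.stats import beta
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(10)
    x = rng.uniform(0, 1, (60, 1))
    y = (np.sin(30 * (x - 0.9)**4) * np.cos(2 * (x - 0.9))
         + (x - 0.9) / 2).ravel()               # smoothness varies over [0, 1]

    a_, b_ = 0.5, 2.0                           # fixed warp parameters
    x_warp = beta.cdf(x, a_, b_)                # stretch the wiggly region
    gp = GaussianProcessRegressor(kernel=RBF(0.1), alpha=1e-6).fit(x_warp, y)

    x_new = np.linspace(0, 1, 5).reshape(-1, 1)
    print(gp.predict(beta.cdf(x_new, a_, b_)))  # a stationary GP, warped inputs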

19.
This article presents the results of a study of the functional case of the problem of parameter estimation when there is error in all the variables. There is consequently no distinction between independent and dependent variables. Posterior probability density functions are developed for the parameters with both linear and nonlinear, and possibly multiple, relations among the true values of the variables. There is no distinction between models that are linear or nonlinear in the parameters. The results are equivalent to generalizations of the work of some previous authors, but lead to new and efficient algorithms for finding point estimates and their precisions. For most of the results the error covariance matrix is assumed known, though a case is treated where it is known except for a scalar multiplier. The results are also shown to be valid if the covariance matrix is singular. Geometric interpretations are described.
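A related frequentist sketch: with known error scales in both variables, orthogonal distance regression (scipy.odr) produces the kind of point estimates and precisions whose Bayesian treatment the article generalizes. The data are simulated.

    import numpy as np
    from scipy import odr

    rng = np.random.default_rng(11)
    x_true = np.linspace(0, 10, 40)
    y_true = 1.5 + 0.8 * x_true
    x_obs = x_true + rng.normal(0, 0.3, 40)     # error in x
    y_obs = y_true + rng.normal(0, 0.5, 40)     # error in y

    model = odr.Model(lambda beta, x: beta[0] + beta[1] * x)
    data = odr.RealData(x_obs, y_obs, sx=0.3, sy=0.5)    # known error scales
    out = odr.ODR(data, model, beta0=[0.0, 1.0]).run()
    print("estimates:", out.beta, " std. errors:", out.sd_beta)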

20.
In a complex manufacturing environment, there are hundreds of interrelated processes that form a complex hierarchy. This is especially true of semiconductor manufacturing. In such an environment, modeling and understanding the impact of critical process parameters on final performance measures such as defectivity is a challenging task. In addition, a number of modeling issues such as a small number of observations compared to process variables, difficulty in formulating a high-dimensional design matrix, and missing data due to failures pose challenges in using empirical modeling techniques such as classical linear modeling as well as generalized linear modeling (GLM) approaches. Our approach is to utilize GLM in a hierarchical structure to understand the impact of key process and subprocess variables on the system output. A two-level approach, comprising subprocess modeling and meta-modeling, is presented and modeling related issues such as bias and variance estimation are considered. The hierarchical GLM approach helps not only in improving output measures, but also in identifying and improving subprocess variables attributed to poor output quality. Copyright © 2011 John Wiley & Sons, Ltd.
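A two-level sketch of the hierarchical GLM idea, with subprocess GLMs feeding a meta-model; the process names, distributions, and coefficients are all hypothetical.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(12)
    n = 200
    etch, dep = rng.normal(size=(2, n))         # subprocess input variables

    # Level 1: one GLM per subprocess (Poisson counts of local defects).
    sub1 = sm.GLM(rng.poisson(np.exp(0.5 + 0.3 * etch)),
                  sm.add_constant(etch), family=sm.families.Poisson()).fit()
    sub2 = sm.GLM(rng.poisson(np.exp(0.2 + 0.4 * dep)),
                  sm.add_constant(dep), family=sm.families.Poisson()).fit()

    # Level 2: a meta-GLM links subprocess predictions to final defectivity.
    Z = sm.add_constant(np.column_stack([sub1.fittedvalues, sub2.fittedvalues]))
    final = rng.poisson(np.exp(-0.5 + 0.2 * sub1.fittedvalues
                               + 0.2 * sub2.fittedvalues))
    meta = sm.GLM(final, Z, family=sm.families.Poisson()).fit()
    print(meta.params)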
