Similar Documents
20 similar documents found.
1.
A robust strategy for real-time process monitoring (Total citations: 1; self-citations: 0; citations by others: 1)
An operator support system (OSS) is proposed to reliably retain salient information in a high-dimensional and correlated database, to uncover linear and nonlinear correlations among variables, to reconstruct failed/unavailable sensors, and to assess process operating performance in the presence of noise and outliers. The proposed strategy carries out this task in three steps. In the first step, a robust tandem filter is used to suppress noise and reject any outlying observations. Next, an orthogonal nonlinear principal component analysis network is utilized to optimally retain a parsimonious representation of the system. In the final step, the process status is checked against the normal operating region defined by kernel density estimation, and failed/unavailable sensors are reconstructed via constrained optimization and the trained network. The strategy is demonstrated in real time on a pilot-scale distillation column.
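The final step, checking new observations against a kernel-density-estimated normal operating region, can be sketched as follows. A linear PCA stands in for the paper's orthogonal nonlinear network, the data are synthetic, and the 99% coverage limit is an illustrative choice:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)
X_noc = rng.normal(size=(500, 8))                      # normal-operating-condition data

pca = PCA(n_components=3).fit(X_noc)                   # stand-in for the nonlinear network
kde = KernelDensity(bandwidth=0.5).fit(pca.transform(X_noc))
limit = np.percentile(kde.score_samples(pca.transform(X_noc)), 1)  # 99% coverage limit

x_new = rng.normal(size=(1, 8)) + 4.0                  # deliberately shifted observation
in_control = kde.score_samples(pca.transform(x_new))[0] >= limit
print("in control:", in_control)
```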

2.
Robust multi-scale principal component analysis (RMSPCA) improves on multi-scale principal component analysis (MSPCA) techniques by incorporating the uncertainty of signal noise distributions and eliminating or down-weighting the effects of abnormal data in the training set. The novelty of the approach lies in making MSPCA robust to violations of the typical normality assumption for noisy data. By using an M-estimator based on the generalized T distribution, RMSPCA adaptively transforms the data in the score space at each scale in order to eliminate or down-weight the effects of outliers in the original data. A robust estimate of the covariance or correlation matrix at each scale is thus obtained, so that accurate MSPCA models can be built for process monitoring purposes. The performance of the proposed approach in process fault detection is illustrated and compared with that of the conventional MSPCA approach in a pilot-scale setting.
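A minimal sketch of the robust reweighting idea, with a Huber-type M-estimator standing in for the paper's generalized-T M-estimator and the wavelet multi-scale decomposition omitted:

```python
import numpy as np

def robust_covariance(X, c=1.345, n_iter=20):
    # Huber weights stand in for the generalized-T M-estimator; the idea
    # (down-weight outlying rows, then re-estimate) is the same.
    mu, cov = np.median(X, axis=0), np.cov(X, rowvar=False)
    for _ in range(n_iter):
        Xc = X - mu
        d = np.sqrt(np.einsum("ij,jk,ik->i", Xc, np.linalg.pinv(cov), Xc))
        w = np.where(d <= c, 1.0, c / np.maximum(d, 1e-12))  # Huber-type weights
        mu = np.average(X, axis=0, weights=w)
        Xc = X - mu
        cov = (w[:, None] * Xc).T @ Xc / w.sum()
    return mu, cov

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(200, 4)), 10 + rng.normal(size=(5, 4))])  # 5 gross outliers
mu, cov = robust_covariance(X)                         # barely affected by the outliers
```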

3.
Chemical process monitoring based on independent component analysis (ICA) is among the most widely used multivariate statistical process monitoring methods and has progressed rapidly in recent years. ICA methods generally retain several independent components (ICs), ordered according to certain criteria, for process monitoring. However, fault information has no definite mapping relationship to any particular IC, and useful information may be submerged among the retained ICs. Weighted independent component analysis (WICA) for fault detection and identification is therefore proposed to recover this submerged information and reduce the missed detection rate of the I2 statistic. The main idea of WICA is to first build a conventional ICA model and then use the change rate of the I2 statistic (RI2) to evaluate the importance of each IC. Important ICs tend to have a higher RI2, so higher weighting values are adaptively assigned to these ICs to highlight the useful fault information. Case studies on both a simple simulated process and the Tennessee Eastman process demonstrate the effectiveness of the WICA method. Monitoring results indicate that the performance of the I2 statistic improves significantly compared with principal component analysis and conventional ICA methods.
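The weighting idea can be sketched as below; the exact RI2 update is specific to the paper, so the change-rate and weight formulas here are simplified stand-ins on synthetic data:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
X_noc = rng.laplace(size=(500, 6))                     # non-Gaussian NOC data

ica = FastICA(n_components=6, random_state=0).fit(X_noc)
S_noc = ica.transform(X_noc)
baseline = (S_noc ** 2).mean(axis=0)                   # average per-IC I2 contribution

x_new = X_noc[:1].copy(); x_new[0, 2] += 4.0           # fault on one channel
s = ica.transform(x_new).ravel()
ri2 = (s ** 2 - baseline) / baseline                   # per-IC change rate (simplified)
w = np.maximum(ri2, 0)
w = w / w.sum() if w.sum() > 0 else np.ones_like(w) / len(w)
weighted_I2 = float(w @ s ** 2)                        # WICA-style weighted statistic
```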

4.
In this study we bridge traditional standalone data-driven and knowledge-driven process monitoring approaches by proposing a novel hybrid framework that exploits the advantages of both simultaneously. Namely, we design a process monitoring system based on a data-driven model that includes two different data types: i) "actual" data coming from sensor measurements, and ii) "virtual" data coming from a state estimator based on a first-principles model of the system under investigation. We test the proposed approach on two simulated case studies: a continuous polycondensation process for the synthesis of poly-ethylene terephthalate, and a fed-batch fermentation process for the manufacturing of penicillin. The hybrid monitoring model shows superior fault detection and diagnosis performance with respect to conventional monitoring techniques, even when the first-principles model is relatively simple and process/model mismatch exists.
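A minimal sketch of the data-augmentation idea: virtual states from a first-principles state estimator (here a placeholder linear map) are appended to the sensor measurements, and one PCA monitoring model is fit on the combined matrix:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X_meas = rng.normal(size=(300, 10))                    # "actual" sensor measurements

# Placeholder for a first-principles state estimator (e.g., an EKF);
# here just a fixed linear map producing four "virtual" state estimates.
A = rng.normal(size=(10, 4))
X_virt = X_meas @ A

X_aug = np.hstack([X_meas, X_virt])                    # hybrid data matrix
pca = PCA(n_components=5).fit(X_aug)

T = pca.transform(X_aug)
T2 = np.sum(T ** 2 / pca.explained_variance_, axis=1)  # monitoring statistic
limit = np.percentile(T2, 99)                          # illustrative control limit
```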

5.
Statistical process monitoring with independent component analysis (Total citations: 6; self-citations: 0; citations by others: 6)
In this paper we propose a new statistical method for process monitoring that uses independent component analysis (ICA). ICA is a recently developed method whose goal is to decompose observed data into linear combinations of statistically independent components [1,2]. Such a representation has been shown to capture the essential structure of the data in many applications, including signal separation and feature extraction. The basic idea of our approach is to use ICA to extract the essential independent components that drive a process and to combine them with process monitoring techniques. I2, Ie2 and SPE charts are proposed as on-line monitoring charts, and contribution plots of these statistical quantities are also considered for fault identification. The proposed monitoring method was applied to fault detection and identification in both a simple multivariate process and the simulation benchmark of the biological wastewater treatment process, which is characterized by a variety of fault sources with non-Gaussian characteristics. The simulation results clearly show the power and advantages of ICA monitoring in comparison to PCA monitoring.
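The I2 and SPE statistics can be computed from a fitted ICA model roughly as below (the Ie2 statistic on the excluded components is omitted, and percentile control limits stand in for the paper's choices):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
X = rng.laplace(size=(1000, 5))                        # synthetic non-Gaussian NOC data

ica = FastICA(n_components=3, random_state=0).fit(X)   # dominant ICs only
S = ica.transform(X)

I2 = np.sum(S ** 2, axis=1)                            # I2 chart statistic
X_hat = ica.inverse_transform(S)
SPE = np.sum((X - X_hat) ** 2, axis=1)                 # SPE chart statistic

I2_lim, SPE_lim = np.percentile(I2, 99), np.percentile(SPE, 99)  # NOC control limits
```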

6.
Canonical correlation analysis (CCA) is a well-known data analysis technique that extracts the multidimensional correlation structure between two sets of variables. CCA focuses on maximizing the correlation between quality and process data, which leads to efficient use of latent dimensions. However, CCA does not exploit the variance or the magnitude of variations in the data, so it is rarely used for quality and process monitoring. In addition, it suffers from the collinearity problems that often exist in process data. To overcome the collinearity problem, a modified CCA method with regularization is developed to extract the correlation between process variables and quality variables. Next, to address the fact that CCA focuses only on correlation and ignores variance information, a new concurrent CCA (CCCA) modeling method with regularization is proposed to exploit the variance and covariance in the process-specific and quality-specific spaces. The CCCA method retains CCA's efficiency in predicting the quality variables while exploiting the variance structure for quality and process monitoring using subsequent principal component decompositions. The corresponding monitoring statistics and control limits are then developed in the decomposed subspaces. Numerical simulation examples and the Tennessee Eastman process are used to demonstrate the effectiveness of the CCCA-based monitoring method.
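A sketch of ridge-style regularized CCA via the usual generalized eigenproblem; the paper's exact regularization and the subsequent CCCA decompositions may differ:

```python
import numpy as np
from scipy.linalg import eigh

def regularized_cca(X, Y, reg=1e-2, n_comp=2):
    # Ridge terms on the within-set covariances handle collinear process data.
    X = X - X.mean(0); Y = Y - Y.mean(0)
    n, p, q = len(X), X.shape[1], Y.shape[1]
    Sxx = X.T @ X / n + reg * np.eye(p)
    Syy = Y.T @ Y / n + reg * np.eye(q)
    Sxy = X.T @ Y / n
    A = np.zeros((p + q, p + q)); B = np.zeros_like(A)
    A[:p, p:], A[p:, :p] = Sxy, Sxy.T                  # cross-covariance blocks
    B[:p, :p], B[p:, p:] = Sxx, Syy                    # regularized within-set blocks
    vals, vecs = eigh(A, B)                            # top eigenvalues = canonical corrs
    idx = np.argsort(vals)[::-1][:n_comp]
    return vals[idx], vecs[:p, idx], vecs[p:, idx]

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
Y = X[:, :2] @ rng.normal(size=(2, 3)) + 0.1 * rng.normal(size=(300, 3))
corrs, Wx, Wy = regularized_cca(X, Y)
```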

7.
Probabilistic principal component analysis (PPCA) based approaches have been widely used in the field of process monitoring. However, the traditional PPCA approach is limited to linear dimensionality reduction. Although a nonlinear projection version of PPCA can be obtained through Gaussian process mapping, the resulting model still lacks robustness and is susceptible to process noise. This paper therefore proposes a new nonlinear process monitoring and fault diagnosis approach based on the Bayesian Gaussian process latent variable model (Bay-GPLVM). Bay-GPLVM obtains the posterior distribution rather than a point estimate of the latent variables, making the model more robust. Two monitoring statistics, corresponding to the latent space and the residual space, are constructed for process monitoring and fault diagnosis (PM-FD). The cause of a fault is then analyzed by calculating the gradient of each variable at the fault point. Compared with several PPCA-based monitoring approaches in theory and in practical application, the Bay-GPLVM-based approach deals better with nonlinear processes and shows high efficiency in process monitoring.
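Assuming a Bayesian GPLVM has already been fit (e.g., with a library such as GPy), the two monitoring statistics can be formed from the posterior latent means and the reconstruction as sketched below; all arrays and the mapping here are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
Z_noc = rng.normal(size=(400, 2))                      # stand-in posterior latent means
X_noc = np.hstack([Z_noc, np.tanh(Z_noc)]) + 0.05 * rng.normal(size=(400, 4))
recon = lambda z: np.hstack([z, np.tanh(z)])           # stand-in posterior mean mapping

S_inv = np.linalg.inv(np.cov(Z_noc, rowvar=False))
T2 = np.einsum("ij,jk,ik->i", Z_noc, S_inv, Z_noc)     # latent-space statistic
SPE = np.sum((X_noc - recon(Z_noc)) ** 2, axis=1)      # residual-space statistic
T2_lim, SPE_lim = np.percentile(T2, 99), np.percentile(SPE, 99)  # NOC control limits
```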

8.
The growing demand for customized products has confronted industries with reduced lot sizes. As a result, frequent product model changes and short series of observable variables have degraded the performance of many traditional process control tools. This paper proposes the use of endogenous variables in predictive models aimed at overcoming the multiple-setup and short-production-run problems found in customized manufacturing systems. The endogenous variables describe the type/model of the manufactured products, while the response variable predicts a product quality characteristic. Three robust predictive models, ARIMA, a structural model with stochastic parameters fitted by the Kalman filter, and Partial Least Squares (PLS) regression, are tested on univariate time series relying on endogenous variables. PLS modeling yielded better predictions on real manufacturing data, while the structural model led to more robust results on simulated data.
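A sketch of the PLS variant with endogenous type/model dummies on hypothetical data (column names and the lagged term are illustrative):

```python
import numpy as np
import pandas as pd
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
df = pd.DataFrame({"model": rng.choice(["A", "B", "C"], size=120)})  # product type per run
df["y"] = df["model"].map({"A": 1.0, "B": 2.5, "C": 0.3}) + 0.1 * rng.normal(size=120)

X = pd.get_dummies(df[["model"]]).astype(float)        # endogenous type/model dummies
X["lag1"] = df["y"].shift(1).bfill()                   # short-run autocorrelation term

pls = PLSRegression(n_components=2).fit(X, df["y"])
y_hat = pls.predict(X).ravel()                         # quality characteristic prediction
```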

9.
Data-driven fault detection techniques have found wide application in industrial process monitoring. However, how to extract local and non-Gaussian features effectively is still an open problem. In this paper, statistics locality preserving projections (SLPP) is proposed to extract local and non-Gaussian features. First, statistics pattern analysis (SPA) is applied to construct process statistics and capture non-Gaussian statistical properties using higher-order statistics. Then, the locality preserving projections (LPP) method is used to discover the local manifold structure of the statistics. In essence, LPP maps points that are close in the original space to points that remain close in the low-dimensional space. Finally, T2 and squared prediction error (SPE) charts of the SLPP model are used to detect process faults. A simple simulated system and the Tennessee Eastman process show that the proposed SLPP method outperforms principal component analysis, LPP and statistics principal component analysis in fault detection performance.
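The LPP step can be sketched as below (the upstream SPA statistics construction is omitted, so plain process data stand in for the statistics):

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def lpp(X, n_comp=2, k=8, sigma=1.0):
    # Heat-kernel weights on a k-NN graph; close points stay close after projection.
    W = kneighbors_graph(X, k, mode="distance").toarray()
    W = np.exp(-(W ** 2) / sigma ** 2) * (W > 0)
    W = np.maximum(W, W.T)                             # symmetrize the graph
    D = np.diag(W.sum(1))
    L = D - W
    # Smallest generalized eigenvectors of X'LX a = lambda X'DX a
    vals, vecs = eigh(X.T @ L @ X, X.T @ D @ X + 1e-9 * np.eye(X.shape[1]))
    return vecs[:, :n_comp]                            # projection directions

X = np.random.default_rng(0).normal(size=(300, 6))
Z = X @ lpp(X)                                         # low-dimensional embedding
```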

10.
We propose a novel process monitoring method integrating independent component analysis (ICA) and the local outlier factor (LOF). LOF is a recently developed density-based outlier detection technique. In the proposed monitoring scheme, an ICA transformation is performed and the control limit of the LOF value is obtained from the normal operating condition (NOC) dataset. At the monitoring phase, the LOF value of the current observation is computed at each monitoring time, which determines whether the current operating condition is faulty. Comparison experiments are conducted against existing ICA-based monitoring schemes on widely used benchmark processes: a simple multivariate process and the Tennessee Eastman process. The proposed scheme shows improved accuracy over the existing schemes. By adopting LOF, the monitoring statistic is computed regardless of the data distribution; the proposed scheme integrating ICA and LOF is therefore better suited to real industrial settings, where the monitored variables are a mixture of Gaussian and non-Gaussian variables, whereas existing ICA-based schemes assume purely non-Gaussian distributions.
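A minimal sketch of the ICA-LOF scheme with scikit-learn; the control-limit percentile and all data are illustrative:

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
X_noc = rng.laplace(size=(500, 6))                     # NOC training data

ica = FastICA(n_components=4, random_state=0).fit(X_noc)
S_noc = ica.transform(X_noc)

lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(S_noc)
limit = np.percentile(-lof.score_samples(S_noc), 99)   # control limit on the LOF value

x_new = X_noc[:1] + 5.0                                # shifted test observation
lof_value = -lof.score_samples(ica.transform(x_new))[0]
print("fault" if lof_value > limit else "normal")
```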

11.
Incidents in the blast furnace strongly affect the stability and smoothness of the iron-making process. Thus far, diagnosis of abnormalities in furnaces still relies mainly on the personal experience of individual workers in many iron works. In this paper, principal component analysis (PCA)-based algorithms are developed to monitor the iron-making process and achieve early abnormality detection. Because the measurement data exhibit a non-normal distribution and a time-varying nature, a static convex hull-based PCA algorithm (SCHPCA), which replaces the traditional T2-based abnormality detection logic with convex hull-based detection logic, and its moving-window version, the moving window convex hull-based PCA algorithm (MWCHPCA), are proposed. These two algorithms are tested on real process data to verify their effectiveness in the early abnormality detection of the iron-making process.
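The convex hull detection logic (static version; the moving-window variant refits the hull on a sliding NOC window) can be sketched as:

```python
import numpy as np
from scipy.spatial import Delaunay
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X_noc = rng.normal(size=(400, 8))                      # synthetic NOC data

pca = PCA(n_components=2).fit(X_noc)
hull = Delaunay(pca.transform(X_noc))                  # triangulated hull of NOC scores

def is_abnormal(x):
    # Outside the convex hull (find_simplex < 0) flags an abnormality,
    # replacing the usual T2 control-ellipse test.
    return hull.find_simplex(pca.transform(x.reshape(1, -1)))[0] < 0

print(is_abnormal(X_noc[0]), is_abnormal(X_noc[0] + 10))
```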

12.
In this paper a sensor fault detection and isolation procedure based on principal component analysis (PCA) is proposed to monitor an air quality monitoring network. The PCA model of the network is optimal with respect to a reconstruction error criterion. Sensor fault detection is carried out in various residual subspaces using a new detection index. For our application, this index improves performance compared to the classical SPE detection index. The reconstruction approach makes it possible both to isolate the faulty sensors and to estimate the fault amplitudes.
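The reconstruction-based isolation idea can be sketched with an iterative PCA reconstruction of each suspected sensor; the paper's optimal model and new detection index are not reproduced here:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6)) @ rng.normal(size=(6, 6))  # correlated sensor data

pca = PCA(n_components=3).fit(X)
P = pca.components_.T                                   # loadings (m x l)
C = P @ P.T                                             # projection onto the PCA subspace

def reconstruct(x, i, n_iter=50):
    # Iteratively replace sensor i with its PCA-model prediction from the others.
    z = x.copy() - pca.mean_
    for _ in range(n_iter):
        z[i] = (C @ z)[i]                               # converges when C[i, i] < 1
    return z[i] + pca.mean_[i]

x = X[0].copy(); x[2] += 8.0                            # bias fault on sensor 2
print("estimated fault amplitude:", x[2] - reconstruct(x, 2))
```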

13.
Hidden Markov models (HMMs) perform parameter estimation using the forward-backward (FB) procedure and the Baum-Welch (BW) algorithm. Together, the two algorithms increase the computational complexity and make the algorithmic structure of HMMs harder to understand. In this study, an increasing-mapping-based hidden Markov model (IMHMM) is proposed. An increasing mapping is established between the observation sequence and the possible state sequences. The re-estimation formulas for the model parameters are derived directly from these mappings instead of FB variables. The IMHMM has a simpler algorithmic structure and lower storage requirements than the HMM. Based on the IMHMM, an expandable process monitoring and fault diagnosis framework for large-scale dynamical processes is developed. To characterize the dynamic process, a novel index considering serial correlation is used to evaluate the process state. The presented methodology is demonstrated on the Tennessee Eastman process (TEP). The results show improvement over the HMM in terms of memory complexity and model training time, and the power of the IMHMM is also evident in comparison with principal component analysis (PCA) based methods.
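IMHMM's increasing-mapping re-estimation is specific to the paper, but the conventional FB/BW-trained HMM baseline it is compared against can be sketched with hmmlearn, using a windowed log-likelihood as a simple monitoring index:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
X_noc = rng.normal(size=(1000, 4))                     # NOC training sequence

# Baum-Welch training of the conventional HMM baseline
hmm = GaussianHMM(n_components=3, covariance_type="full", n_iter=50).fit(X_noc)

def window_index(X_win):
    # Negative log-likelihood per sample as a simple monitoring index
    return -hmm.score(X_win) / len(X_win)

limit = np.percentile([window_index(X_noc[i:i + 50]) for i in range(0, 950, 50)], 99)
print(window_index(X_noc[:50] + 3.0) > limit)          # shifted window flags as faulty
```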

14.
Principal component regression (PCR), based on principal component analysis (PCA), and partial least squares regression (PLSR) are well-known projection methods for the analysis of multivariate data. They result in scores and loadings that may be visualized in a score-loading plot (biplot) and used for process monitoring. The difficulty is that often more than two principal or PLS components have to be used, resulting in a need to monitor more than one such plot. However, it has recently been shown that for a scalar response variable all PLSR/PCR models can be compressed into equivalent PLSR models with only two components. After a summary of the underlying theory, the present paper shows how such two-component PLS (2PLS) models can be utilized in informative score-loading biplots for process understanding and monitoring. The possible use of known projection-model monitoring statistics and variable contribution plots is also discussed, and a new method for visualizing contributions directly in the biplot is presented. An industrial data example is included.
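A minimal 2-component PLS biplot with scores and loadings overlaid (synthetic data; the arrow scaling is purely cosmetic):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 7))
y = X[:, 0] - 2 * X[:, 3] + 0.1 * rng.normal(size=200)  # scalar response

pls = PLSRegression(n_components=2).fit(X, y)           # the 2PLS model
T, P = pls.x_scores_, pls.x_loadings_

fig, ax = plt.subplots()
ax.scatter(T[:, 0], T[:, 1], s=10)                      # process observations (scores)
for j, (p1, p2) in enumerate(P):                        # variable loadings as arrows
    ax.arrow(0, 0, p1 * 3, p2 * 3, alpha=0.4)
    ax.annotate(f"x{j}", (p1 * 3, p2 * 3))
ax.set_xlabel("PLS component 1"); ax.set_ylabel("PLS component 2")
plt.show()
```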

15.
Process monitoring and quality prediction are crucial for maintaining favorable operating conditions and have received considerable attention in recent decades. For the majority of complicated cases in chemical and biological industrial processes with pronounced nonlinear characteristics, traditional latent variable models, such as principal component analysis (PCA), principal component regression (PCR) and partial least squares (PLS), may not work well. In this paper, various nonlinear latent variable models based on the autoencoder (AE) are developed. In order to extract deeper nonlinear features from process data, the basic shallow AE models are extended to deep latent variable models, which provide a deep generative structure for nonlinear process monitoring and quality prediction. Meanwhile, with the ever-increasing scale of industrial data, the computational burden of process modeling and analytics has become more and more severe, particularly for large-scale processes. To handle the big-data problem, a parallel computing strategy is further applied to the above models: the whole computational task is partitioned into a few sub-tasks that are assigned to parallel computing nodes. The parallel models are then used for process monitoring and quality prediction applications. The effectiveness of the developed methods is evaluated on the Tennessee Eastman (TE) benchmark process and a real-life industrial process in an ammonia synthesis plant (ASP).
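A minimal deep autoencoder sketch in PyTorch with an SPE-like reconstruction-error statistic; the paper's specific architectures and its parallel partitioning of the training task are not reproduced:

```python
import torch
from torch import nn

X = torch.randn(1000, 10)                              # stand-in NOC data

model = nn.Sequential(
    nn.Linear(10, 6), nn.ReLU(),
    nn.Linear(6, 3),                                   # deep nonlinear latent features
    nn.Linear(3, 6), nn.ReLU(),
    nn.Linear(6, 10),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):                                   # reconstruction training
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), X)
    loss.backward(); opt.step()

with torch.no_grad():
    spe = ((model(X) - X) ** 2).sum(dim=1)             # SPE-like monitoring statistic
    limit = torch.quantile(spe, 0.99)                  # NOC control limit
```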

16.
An intelligent process monitoring and fault diagnosis environment has been developed by interfacing multivariate statistical process monitoring (MSPM) techniques and knowledge-based systems (KBS) for monitoring multivariable process operation. The real-time KBS, developed in G2, is used with multivariate SPM methods based on canonical variate state space (CVSS) process models. Fault detection is based on T2 charts of the state variables. Contribution plots in G2 are used to determine the process variables that have contributed to an out-of-control signal indicated by large T2 values, and the G2 Diagnostic Assistant (GDA) is used to diagnose the source causes of abnormal process behavior. The MSPM modules developed in Matlab are linked with G2. This intelligent monitoring and diagnosis system can be used to monitor multivariable processes with autocorrelated, cross-correlated, and collinear data. The structure of the integrated system is described and its performance is illustrated by simulation studies.
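The T2 statistic and its variable contributions, which the G2 contribution plots visualize, can be computed roughly as below (one common contribution decomposition, shown for a plain PCA model; the CVSS modeling and the KBS layer itself are not reproduced):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6)) @ rng.normal(size=(6, 6))  # collinear data

pca = PCA(n_components=3).fit(X)
P, lam = pca.components_.T, pca.explained_variance_

def t2_contributions(x):
    # Per-variable contribution to T2; the contributions sum to T2 exactly.
    z = x - pca.mean_
    t = z @ P
    return (t / lam) @ P.T * z

x = X[0].copy(); x[4] += 6.0                            # fault on variable 4
cont = t2_contributions(x)
print("most implicated variable:", int(np.argmax(cont)))
```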

17.
Process control monitoring (PCM) data provide information that is used to track abnormal processes and estimate various probe bin yields. However, the multi-dimensional information in both PCM data and probe bins has not yet been fully utilized. In this paper, we propose a canonical correlation analysis to investigate the relationship between multiple PCM variables and various probe bin variables. Polynomial regression is also employed to maximize the performance yield based on the results of the canonical correlation analysis. Two conclusions were drawn from this research. First, the PCM variables that affected the probe bins were contact resistance, sheet resistance, and Isat_P4H, as well as threshold voltage (Vt), during the process tuning step. Second, the typical values of Vtl_P4H and Isat_P4H should be changed in order to maximize the performance yield. The proposed method can be used for yield improvement and as a problem-solving approach for optimizing the IC process.
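A sketch of the two-stage analysis, CCA between PCM and bin variables followed by polynomial regression on a canonical variate, on synthetic stand-in data (the real variables such as Vt or Isat_P4H appear only as comments):

```python
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X_pcm = rng.normal(size=(300, 4))                      # stand-in PCM variables (e.g., Vt)
Y_bins = X_pcm @ rng.normal(size=(4, 3)) + 0.1 * rng.normal(size=(300, 3))

cca = CCA(n_components=2).fit(X_pcm, Y_bins)
U, V = cca.transform(X_pcm, Y_bins)                    # canonical variates

# Polynomial regression of one bin yield on the first canonical variate
poly = PolynomialFeatures(degree=2)
reg = LinearRegression().fit(poly.fit_transform(U[:, :1]), Y_bins[:, 0])
u_best = U[np.argmax(reg.predict(poly.transform(U[:, :1]))), 0]  # yield-maximizing setting
```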

18.
In this paper, we evaluate multivariate pattern matching methods on the Tennessee Eastman (TE) challenge process. The pattern matching methodology includes principal component analysis based similarity factors and the dissimilarity factor of Kano et al., which compare current and historical data. In our similarity factor approach, the start and end times of disturbances are not known a priori, and the data are compared by moving a window through the historical data. Comparisons with methods used in earlier case studies of the TE challenge process show the advantages of the proposed similarity factor approach.
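A Krzanowski-type PCA similarity factor and the moving-window search can be sketched as below; the window length and step are illustrative:

```python
import numpy as np

def pca_similarity(X1, X2, k=3):
    # Mean squared cosine of the angles between the two k-dim PCA subspaces;
    # 1.0 means identical subspaces.
    U1 = np.linalg.svd(X1 - X1.mean(0), full_matrices=False)[2][:k].T
    U2 = np.linalg.svd(X2 - X2.mean(0), full_matrices=False)[2][:k].T
    return np.sum((U1.T @ U2) ** 2) / k

rng = np.random.default_rng(0)
hist = rng.normal(size=(2000, 5))                      # historical data
snapshot = rng.normal(size=(100, 5))                   # current data window

# Move a window through the historical data (window = snapshot length)
sims = [pca_similarity(snapshot, hist[i:i + 100]) for i in range(0, 1900, 25)]
best_start = int(np.argmax(sims)) * 25                 # most similar historical period
```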

19.
Due to the wide variety of fusion techniques available for combining multiple classifiers into a more accurate classifier, a number of studies have been devoted to determining in what situations some fusion methods should be preferred over others. However, the sample-size behavior of the various fusion methods has hitherto received little attention in the multiple classifier systems literature. The main contribution of this paper is thus to investigate the effect of training sample size on the relative performance of these methods and to gain more insight into the conditions for the superiority of some combination rules. A large experiment is conducted to study the performance of some fixed and trainable combination rules for one- and two-level classifier fusion at different training sample sizes. The experimental results yield the following conclusions: when implementing one-level fusion to combine homogeneous or heterogeneous base classifiers, fixed rules outperform trainable ones in nearly all cases, the only exception being the merging of heterogeneous classifiers at large sample sizes. Moreover, the best classification for any considered sample size is generally achieved by a second level of combination (namely, using one fusion rule to further combine a set of ensemble classifiers, each of them constructed by fusing base classifiers). Under these circumstances, it appears appropriate to adopt different types of fusion rules (fixed or trainable) as the combiners for the two levels of fusion.
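Fixed versus trainable one-level fusion can be illustrated with scikit-learn: soft voting as a fixed (mean) rule and stacking as a trainable combiner over heterogeneous base classifiers (the dataset and classifiers here are illustrative, not the paper's):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)  # a small training set
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

base = [("nb", GaussianNB()),
        ("tree", DecisionTreeClassifier(max_depth=3)),
        ("lr", LogisticRegression(max_iter=1000))]          # heterogeneous classifiers

fixed = VotingClassifier(base, voting="soft").fit(Xtr, ytr)               # fixed mean rule
trainable = StackingClassifier(base, LogisticRegression()).fit(Xtr, ytr)  # trained combiner

print("fixed:", fixed.score(Xte, yte), "trainable:", trainable.score(Xte, yte))
```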

20.
We present a data-driven method for monitoring machine status in manufacturing processes. Audio and vibration data from precision machining are used for inference in two operating scenarios: (a) variable machine health states (anomaly detection); and (b) settings of machine operation (state estimation). Audio and vibration signals are first processed through the Fast Fourier Transform and principal component analysis to extract transformed and informative features. These features are then used to train classification and regression models for machine state monitoring. Specifically, three classifiers (k-nearest neighbors, convolutional neural networks and support vector machines) and two regressors (support vector regression and neural network regression) were explored in terms of their accuracy in machine state prediction. It is shown that the audio and vibration signals are sufficiently rich in information about the machine that 100% state classification accuracy could be achieved. Data fusion was also explored, showing overall superior accuracy of the data-driven regression models.
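A minimal sketch of the FFT-PCA feature pipeline feeding one of the classifiers (an SVM) on synthetic two-state clips:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def clip(freq):
    # Synthetic 1-second "audio/vibration" clip dominated by one frequency
    t = np.linspace(0, 1, 1024, endpoint=False)
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.normal(size=t.size)

freqs = rng.choice([60, 120], size=200)                # two machine states
X_raw = np.array([clip(f) for f in freqs])
y = (freqs == 120).astype(int)

X_fft = np.abs(np.fft.rfft(X_raw, axis=1))             # spectral magnitude features
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC()).fit(X_fft, y)
print("training accuracy:", clf.score(X_fft, y))
```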

