Similar Documents
20 similar documents found
1.
The rhizome of cassava plants (Manihot esculenta Crantz) was catalytically pyrolysed at 500 °C using the analytical pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) method in order to investigate the relative effect of various catalysts on the pyrolysis products. Selected catalysts expected to affect bio-oil properties were used in this study: zeolites and related materials (ZSM-5, Al-MCM-41 and Al-MSU-F type), metal oxide catalysts (zinc oxide, zirconium(IV) oxide, cerium(IV) oxide and copper chromite), proprietary commercial catalysts (Criterion-534 and alumina-stabilised ceria MI-575) and natural catalysts (slate, char, and ashes derived from char and biomass). The pyrolysis product distributions were monitored using principal component analysis (PCA) models. The results showed that the zeolites, the proprietary commercial catalysts, copper chromite and biomass-derived ash were selective towards the reduction of most oxygenated lignin derivatives. The use of ZSM-5, Criterion-534 and Al-MSU-F catalysts enhanced the formation of aromatic hydrocarbons and phenols. No single catalyst was found to selectively reduce all carbonyl products; instead, most of the hydroxyl-containing carbonyl compounds were reduced by the zeolites and related materials, the proprietary catalysts and copper chromite. The PCA model for carboxylic acids showed that ZSM-5 and Al-MSU-F tend to produce significant amounts of acetic and formic acids.

2.
In this study, a two-step principal component analysis (TS-PCA) is proposed to handle the dynamic characteristics of chemical industrial processes in both steady and unsteady states. Unlike traditional dynamic PCA (DPCA), which deals with the static cross-correlation structure and the dynamic auto-correlation structure in process data simultaneously, TS-PCA handles them in two steps: it first identifies the dynamic structure using the least squares algorithm, and then monitors the innovation component using PCA. The innovation component is time-uncorrelated and independent of the initial state of the process. As a result, TS-PCA can monitor the process in both steady and unsteady states, whereas all other reported dynamic approaches are limited to processes in steady state. Even when tested in steady state, TS-PCA still achieves better performance than the existing dynamic approaches.
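The two-step idea can be sketched in a few lines: a least-squares (VAR-type) fit captures the dynamic structure, and ordinary PCA is then applied to the time-uncorrelated innovations. This is a minimal illustration on synthetic data, not the authors' implementation; the lag-1 model, dimensions and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a simple autocorrelated 3-variable process: x_t = A x_{t-1} + e_t
A = np.array([[0.8, 0.1, 0.0],
              [0.0, 0.7, 0.1],
              [0.1, 0.0, 0.6]])
n, m = 2000, 3
X = np.zeros((n, m))
for t in range(1, n):
    X[t] = A @ X[t - 1] + rng.normal(scale=0.1, size=m)

# Step 1: identify the dynamic structure by least squares (lag-1 fit)
X_past, X_now = X[:-1], X[1:]
A_hat, *_ = np.linalg.lstsq(X_past, X_now, rcond=None)
E = X_now - X_past @ A_hat          # innovation component (approximately white)

# Step 2: ordinary PCA on the innovations for monitoring
E_c = E - E.mean(axis=0)
_, s, Vt = np.linalg.svd(E_c, full_matrices=False)
var_ratio = s**2 / np.sum(s**2)     # variance explained per component

def lag1_autocorr(Z):
    """Average lag-1 autocorrelation across variables."""
    Zc = Z - Z.mean(axis=0)
    return np.mean([np.corrcoef(Zc[:-1, j], Zc[1:, j])[0, 1]
                    for j in range(Z.shape[1])])

# Raw data are strongly autocorrelated; the innovations are not
print(round(lag1_autocorr(X), 2), round(lag1_autocorr(E), 2))
```

Because the innovations are time-uncorrelated, the subsequent PCA monitoring model is valid whether the underlying process is in steady or unsteady state, which is the point of the two-step split.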

3.
For plant-wide processes with multiple operating conditions, the multimode feature poses challenges to conventional monitoring techniques. To solve this problem, this paper presents a novel local-component-based principal component analysis (LCPCA) approach for monitoring the status of a multimode process. LCPCA requires no prior process knowledge of mode division; it is based purely on the process data. First, LCPCA divides the process data into multiple local components using a finite Gaussian mixture model (FGMM). Then, the posterior probability is used to determine which local component each sample belongs to. After that, the local component information (such as mean and standard deviation) is used to standardize each sample of the local component. Finally, the standardized samples of all local components are combined to train the PCA monitoring model. Based on this model, the two monitoring statistics T2 and SPE are used to monitor multimode processes. Through a numerical example and the Tennessee Eastman (TE) process, the monitoring results demonstrate that LCPCA outperforms conventional PCA and LNS-PCA in fault detection rate.
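The pipeline described above — mixture-model mode division, local standardization, then one pooled PCA model — can be sketched with scikit-learn. The two synthetic operating modes, their parameters, and the two-component mixture are illustrative assumptions, not the paper's case study.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Two operating modes with different means and scales (a toy multimode process)
mode1 = rng.normal(loc=[0, 0], scale=[1.0, 0.5], size=(500, 2))
mode2 = rng.normal(loc=[10, 5], scale=[2.0, 1.0], size=(500, 2))
X = np.vstack([mode1, mode2])

# 1) Divide the data into local components with a finite Gaussian mixture
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)             # hard assignment via posterior probability

# 2) Standardize each sample with its own local component's mean / std
Z = np.empty_like(X)
for k in range(2):
    idx = labels == k
    Z[idx] = (X[idx] - X[idx].mean(axis=0)) / X[idx].std(axis=0)

# 3) One PCA monitoring model on the pooled, locally standardized data
pca = PCA().fit(Z)
T2 = np.sum((pca.transform(Z) / np.sqrt(pca.explained_variance_))**2, axis=1)
print(Z.mean(axis=0).round(2), Z.std(axis=0).round(2))
```

After local standardization the pooled data are zero-mean and unit-variance regardless of mode, which is why a single T2/SPE chart can then cover all operating conditions.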

4.
Data reconciliation (DR) and principal component analysis (PCA) are two popular data analysis techniques in process industries. Data reconciliation is used to obtain accurate and consistent estimates of variables and parameters from erroneous measurements. PCA is primarily used as a method for reducing the dimensionality of high dimensional data and as a preprocessing technique for denoising measurements. These techniques have been developed and deployed independently of each other. The primary purpose of this article is to elucidate the close relationship between these two seemingly disparate techniques. This leads to a unified framework for applying PCA and DR. Further, we show how the two techniques can be deployed together in a collaborative and consistent manner to process data. The framework has been extended to deal with partially measured systems and to incorporate partial knowledge available about the process model.
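As a toy illustration of the data-reconciliation side (not the article's unified framework), linear DR reduces to projecting the measurements onto the subspace that satisfies the balance constraints — a residual-subspace decomposition of exactly the kind PCA performs, which is the connection the article develops. The three-stream splitter, the numbers, and the identity error covariance below are assumptions for the sketch.

```python
import numpy as np

# Hypothetical flow splitter: stream 1 divides into streams 2 and 3,
# so the mass balance is x1 - x2 - x3 = 0, written as A x = 0.
A = np.array([[1.0, -1.0, -1.0]])
x_meas = np.array([10.3, 6.1, 4.4])   # noisy measurements violate the balance

# Classical linear DR with identity error covariance: least-squares
# projection onto the null space of A:  x_hat = x - A^T (A A^T)^{-1} (A x)
correction = A.T @ np.linalg.solve(A @ A.T, A @ x_meas)
x_hat = x_meas - correction

print(x_hat.round(4))   # reconciled flows satisfy the balance exactly
```

The reconciled estimate is the closest point to the measurements that obeys the process model, just as a PCA reconstruction is the closest point in the principal subspace.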

5.
The task of quality checking of a coil-coated system was undertaken using impedance spectroscopy. In order to properly characterize the system, a new impedance spectrum was calculated as the averaged result of 12 experimental spectra obtained from 12 different places on the sample at the same immersion time in 3% NaCl. The experimental spectra were recorded over a 24-h period at 1-h intervals between the starting points of each measurement. The averaged impedance at each frequency point of the calculated spectrum was characterized by its standard deviation. Impedance analysis using a two-time-constant equivalent electrical circuit was conducted, showing the permissible parameter scatter for the checked system. A new classification method for the obtained parameters using principal component analysis was presented. This approach enables identification of low-quality samples.

6.
7.
Most multivariate statistical monitoring methods based on principal component analysis (PCA) implicitly assume that the observations at one time are statistically independent of past observations and that the latent variables follow a Gaussian distribution. However, in real chemical and biological processes, these assumptions are invalid because of their dynamic and nonlinear characteristics. Therefore, monitoring charts based on conventional PCA tend to show many false alarms and poor detectability. In this paper, a new statistical process monitoring method using dynamic independent component analysis (DICA) is proposed to overcome these disadvantages. ICA is a recently developed technique for revealing hidden factors that underlie sets of measurements following a non-Gaussian distribution. Its goal is to decompose a set of multivariate data into a basis of statistically independent components without loss of information. The proposed DICA monitoring method applies ICA to the matrix augmented with time-lagged variables. DICA can deliver more powerful monitoring performance for a dynamic process since it extracts source signals that are independent of the auto- and cross-correlation of the variables. It is applied to fault detection in both a simple multivariate dynamic process and the Tennessee Eastman process. The simulation results clearly show that the method effectively detects faults in a multivariate dynamic process.
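A minimal sketch of the DICA idea — ICA applied to a time-lagged (augmented) data matrix — using scikit-learn's FastICA on synthetic non-Gaussian sources. The mixing matrix, the lag of 1, and the source distributions are illustrative assumptions, not the paper's case study.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
n = 3000

# Two non-Gaussian source signals mixed into 2 measured variables
s1 = np.sign(rng.standard_normal(n)) * rng.uniform(0.5, 1.5, n)  # bimodal
s2 = rng.uniform(-1, 1, n)                                       # uniform
S = np.column_stack([s1, s2])
X = S @ np.array([[1.0, 0.5], [0.3, 1.0]])       # mixed observations

# Augment the data matrix with time-lagged variables (lag = 1): [x_t, x_{t-1}]
lag = 1
X_aug = np.column_stack([X[lag:], X[:-lag]])

# ICA on the augmented matrix recovers statistically independent components
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X_aug)

# Recovered components should be (close to) mutually uncorrelated
C = np.corrcoef(S_hat.T)
print(np.round(C, 2))
```

Monitoring statistics (e.g. I² on the independent components) would then be built on `S_hat` instead of PCA scores, which is what gives DICA its advantage on dynamic, non-Gaussian data.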

8.
Principal component analysis (PCA) has been used to establish a new method for the detection of olive oil adulteration. The data set, composed of values obtained from the GC determination of the mole percentage of total FA and their regiospecific distribution in positions 1 and 3 of the TG of the oils (pure or mixtures), was subjected to PCA. 3-D scatter plots showed clearly that it is possible to distinguish the pure oils from the mixtures. Moreover, it is possible to discriminate the different types of seed oil used for the adulteration.

9.
In the present study, chemometric analysis is applied as a tool to evaluate the release behaviour of trace elements (TEs) during coal utilization processes. Principal component analysis (PCA) and linear discriminant analysis (LDA) were applied to the TE concentrations of raw and thermally treated coals. PCA and LDA successfully predicted the association of 21 trace elements (Na, Mg, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Sb, Te, Pb) contained in coal and their thermal behaviour at various temperatures. Application of chemometrics to thermally treated coals shows that at 450 °C, elements such as Na, P, K, Fe, Ca, Mg, Al and Si have an affinity with mineral matter and therefore have low volatility. Elements such as Te, Sb and Ti may form chlorides, which enhance their volatilities, while Co and Pb may form sulfides such as Co2S4 and PbS. In the temperature range of 600-850 °C, the coal undergoes intense structural degradation during pyrolysis, and the released elements may either be adsorbed on the coal surface or be volatilized. The elements Cr, Co, V and Ni may react with sulphurous gases evolved during pyrolysis. At 1000 °C, the data show wide dispersion as elements interact with carbon and sulphur compounds of the coals. The formation of compounds such as silicon carbide, bassanite, gehlenite and anorthite may also be responsible for the low volatilities of Si, Al and Ca at higher temperatures. The predictive capabilities of PCA and LDA were evaluated in terms of TE volatilities at different temperatures. The results of the chemometric analysis are in good agreement not only with the volatilities of the TEs present in the coals at various temperatures but also with FTIR analysis.

10.
The weighted principal component analysis technique is employed for the reconstruction of reflectance spectra of surface colors from the related tristimulus values. A dynamic eigenvector subspace, based on applying certain weights to the reflectance data of Munsell color chips, is formed for each particular sample, and the color difference between the target and the Munsell dataset is chosen as the criterion for determining the weighting factors. This method increases the influence of samples closer to the target on the extracted principal eigenvectors and correspondingly diminishes the effect of samples with larger color differences. The performance of the suggested method is evaluated in the spectral reflectance reconstruction of three different collections of colored samples using the first three Munsell bases. The resulting spectra show considerable improvements in terms of root mean square error between the actual and reconstructed reflectance curves, as well as CIELAB color difference under illuminant A, in comparison with those obtained from the standard PCA method. © 2008 Wiley Periodicals, Inc. Col Res Appl, 33, 360–371, 2008
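The weighting idea can be sketched with synthetic spectra: each training curve is weighted by its closeness to the target before the eigenvectors are extracted, so nearby samples dominate the sample-specific basis. In the paper the weights come from CIELAB color difference, the basis is built from Munsell chips, and reconstruction starts from tristimulus values; here an RMS distance, a synthetic training set, and projection of the known target spectrum are stand-ins to keep the sketch self-contained.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "reflectance" training set: smooth curves over 31 wavelengths
wl = np.linspace(400, 700, 31)
train = np.array([0.5 + 0.4 * np.sin(wl / 60 + p) for p in rng.uniform(0, 6, 200)])
target = 0.5 + 0.4 * np.sin(wl / 60 + 1.0)        # spectrum to reconstruct

# Weight each training sample by its closeness to the target (RMS distance
# here; the paper uses CIELAB color difference instead)
d = np.sqrt(np.mean((train - target)**2, axis=1))
w = 1.0 / (d + 1e-6)

# Weighted mean and weighted covariance give sample-specific eigenvectors
mu = np.average(train, axis=0, weights=w)
Xc = (train - mu) * np.sqrt(w)[:, None]
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
B = Vt[:3]                                         # first three basis vectors

# Project the target onto the dynamic basis and reconstruct
r_hat = mu + (target - mu) @ B.T @ B
rmse = np.sqrt(np.mean((target - r_hat)**2))
print(round(rmse, 6))
```

Because the basis is re-derived per target, samples far from it barely influence the eigenvectors, which is the mechanism behind the reported RMSE and color-difference improvements.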

11.
12.
Batch processes lie at the heart of many industries; hence the effective monitoring and control of batch processes is crucial to the production of high-quality materials. Multiway principal component analysis (MPCA) has been widely used for batch monitoring and has proved to be an effective method for monitoring many industrial batch processes. However, because MPCA is a fixed-model monitoring technique, it gives false alarms when it is used to monitor real processes whose normal operation involves slow changes. In this paper, we propose a simple on-line batch monitoring method that uses a consecutively updated MPCA model. The key to the proposed approach is that whenever a batch successfully remains within the bounds of normal operation, its batch data are added to the historical database of normal data and a new MPCA model is developed based on the revised database. The proposed method was applied to monitoring fed-batch penicillin production, and the results were compared with those obtained using conventional MPCA. The simulation results clearly show that the ability of the proposed method to adapt to new normal operating conditions eliminates the many false alarms generated by the fixed model and provides a reliable monitoring chart.
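The consecutively updated model can be sketched as follows: each new batch whose SPE stays inside the control limit is appended to the normal database, and the PCA model is refit, so the model tracks slow drift that a fixed model flags as faults. The unfolded-batch dimensions, the drift pattern and the limit below are illustrative assumptions, and random data stand in for the penicillin simulation.

```python
import numpy as np

rng = np.random.default_rng(4)

def fit_pca(X, k=2):
    """Mean and top-k loadings of the (unfolded) batch database."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def spe(x, mu, P):
    """Squared prediction error of one unfolded batch against a PCA model."""
    r = (x - mu) - ((x - mu) @ P.T) @ P
    return float(r @ r)

# Historical database of normal batches (each unfolded to 5 numbers here)
db = rng.normal(size=(30, 5))
mu_fix, P_fix = fit_pca(db)
limit = 2.0 * np.quantile([spe(x, mu_fix, P_fix) for x in db], 0.99)

# Normal operation drifts slowly; the adaptive model absorbs in-control batches
mu_u, P_u, db_u = mu_fix, P_fix, db.copy()
for d in np.linspace(0.0, 2.0, 20):
    x = rng.normal(size=5) + d              # slowly drifting new batch
    if spe(x, mu_u, P_u) <= limit:          # within normal operation:
        db_u = np.vstack([db_u, x])         # add the batch to the database
        mu_u, P_u = fit_pca(db_u)           # and refit the model

target = np.full(5, 2.0)                    # where the process ended up
print(np.linalg.norm(mu_fix - target), np.linalg.norm(mu_u - target))
```

The updated model's mean tracks the drifted operating point far more closely than the fixed model's, which is why the adaptive chart avoids the fixed model's false alarms.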

13.
The best way to describe a color is to study its reflectance spectrum, which provides the most useful information. Different methods have been proposed for reflectance spectra reconstruction from CIE tristimulus values, such as principal component analysis. In this study, the training samples were first divided into 3, 6, 9, and 12 subgroups by creating a competitive neural network. To do so, L*a*b*, L*C*h or L*a*b*C*h values were introduced to the neural network as input elements. To investigate the performance of the reflectance spectra reconstruction, the color difference and RMS error between the actual and reconstructed data were obtained. The reconstruction of the reflectance spectra was improved by using a six- or nine-neuron layer with L*a*b* input elements. © 2016 Wiley Periodicals, Inc. Col Res Appl, 42, 182–188, 2017

14.
In this paper, multiscale kernel principal component analysis based on a sliding median filter (SFM-MSKPCA) is proposed for fault detection in nonlinear systems with outliers. The SFM-MSKPCA algorithm is first proposed and applied to process monitoring. Its advantages are: (1) a dynamic multiscale monitoring method is proposed that combines the Kronecker product, the wavelet decomposition technique, the sliding median filter technique and KPCA, with the Kronecker product first used to build the dynamic model; (2) dynamic processes contain more disturbances and noise than static processes, and the sliding median filter technique is used to remove them; (3) SFM-MSKPCA gives a nonlinear dynamic interpretation, unlike MSPCA; (4) by decomposing the original data into multiple scales, SFM-MSKPCA analyzes the dynamic data at different scales, reconstructs the scales containing important information by IDWT, and eliminates the effects of noise in the original data, unlike kernel principal component analysis (KPCA). To demonstrate the feasibility of the SFM-MSKPCA method, its process monitoring abilities are tested by simulation examples and compared quantitatively with the monitoring abilities of the KPCA and MSPCA methods. The fault detection results and the comparison show the superiority of SFM-MSKPCA in fault detection.
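Two of the building blocks — the sliding median filter that suppresses impulsive disturbances, and kernel PCA on the filtered data — can be sketched directly; the full wavelet/Kronecker machinery is omitted. The RBF kernel, its width, the window length and the synthetic ring-shaped data are all assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(6)

def sliding_median(x, k=5):
    """Sliding median filter (window k) to suppress impulsive outliers."""
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)), mode="edge")
    return np.array([np.median(xp[i:i + k], axis=0) for i in range(len(x))])

def kpca_scores(X, n_comp=2, gamma=0.5):
    """Kernel PCA scores via eigendecomposition of the centered RBF Gram matrix."""
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                               # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(Kc)
    vals, vecs = vals[::-1][:n_comp], vecs[:, ::-1][:, :n_comp]
    return Kc @ vecs / np.sqrt(vals)             # training-sample scores

# Nonlinear process data with impulsive disturbances added
t = np.linspace(0, 4 * np.pi, 300)
X = np.column_stack([np.sin(t), np.cos(t)]) + rng.normal(scale=0.05, size=(300, 2))
X[20::40] += 3.0                                 # isolated spikes (outliers)

Xf = sliding_median(X)                           # spikes removed before KPCA
scores = kpca_scores(Xf)
print(scores.shape)
```

Filtering first means the kernel model is fit to the underlying nonlinear structure rather than to the spikes, which is the motivation for combining the two steps.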

15.
In recent years, neural networks have been used as a tool for modeling industrial processes. An improvement in their performance may be expected either by devising more efficient training algorithms or by intelligently manipulating the data set. The second approach is examined here. The problem chosen is that of predicting the properties of cotton yarn from the fiber properties. When the input data are known to correlate with each other, principal component analysis can be used to improve the performance of neural networks. © 2003 Wiley Periodicals, Inc. J Appl Polym Sci 91: 1746–1751, 2004

16.
Fluorescent materials are now a critical field of research due to their unique excitation and emission properties, which can be tailored to specific fluorescence detection technologies. In this work, a procedure is described to approximate the emission spectral data of fluorescent materials of different types from their excitation spectral data using the principal component analysis (PCA) technique. First, PCA, as a statistical and mathematical method, was used to reconstruct the excitation and emission spectra of the training dataset; then, the approximation was accomplished by multiple linear regression (MLR). The performance of the obtained function was examined on a testing dataset. Afterward, the CIE tristimulus values of the fluorescent samples were calculated based on the ASTM E2152-12 standard test method. The colorimetric accuracy was then evaluated by calculating the geometric differences in the CIE tristimulus values X, Y, and Z for the 1964 standard colorimetric observer under illuminant D65. The obtained results show a good curve fit between the actual and recovered emission spectra. In addition, based on cumulative variance and root mean square (RMS) error, eight principal components were selected as the optimum number for prediction of the emission spectra. © 2015 Wiley Periodicals, Inc. Col Res Appl, 41, 16–21, 2016

17.
We used the dynamic solvent effect to sample rat haunch odor, which we then analyzed using principal component analysis (PCA). PCA based on 22 volatile components indicated that one axis clearly separated rat haunch odor samples according to sex and female reproductive condition (estrus and diestrus), explaining 79.5% of the variation in proportional peak area. We have therefore been able to separate odors along biologically meaningful lines.

18.
刘昌明  时朵 《耐火材料》2020,54(2):93-97
The microstructure of refractories is highly complex and their failure modes are varied, so efficient and accurate identification of damage is important. To this end, acoustic emission (AE) signals were collected during compressive loading of magnesia-carbon refractories. To handle the massive AE data, empirical mode decomposition combined with principal component analysis was first used for feature extraction and parametric dimensionality reduction: from the large set of time-domain and frequency-domain AE parameters, the first seven principal components were selected to characterize the damage, with a cumulative damage contribution of 99%. The principal components were then fed into a support vector machine for damage classification, dividing the damage into two main types: matrix-phase damage and interface-phase damage. Finally, the analysis results were compared with scanning electron microscopy observations and found to agree well, providing an efficient and accurate solution for detecting damage in refractories by acoustic emission.

19.
To support cost-effective manufacturing in the more-than-Moore era, plasma etching, one of the major processes in semiconductor manufacturing, has seen the development of plasma sensors and their applications. Among these sensors, optical emission spectroscopy (OES) has been widely utilized, and its high dimensionality has required multivariate analysis (MVA) techniques such as principal component analysis (PCA). PCA, however, may obscure the physical meaning of the target process during its statistical calculation. In addition, inherent noise from the charge-coupled device (CCD) array in OES may deteriorate PCA model performance. It is therefore desirable to pre-select physically important variables and to filter out noisy signals before modeling OES-based plasma data. For these purposes, this paper introduces a peak wavelength selection algorithm for selecting physically meaningful wavelengths in the plasma, and a discrete wavelet transform (DWT) for filtering out noisy signals from the CCD array. The effectiveness of the PCA model introduced in this paper is verified by comparing its fault detection capability with that of a conventional PCA model under various source power or pressure faults in a capacitively coupled plasma etcher. Whereas the conventional PCA model fails to detect all of the faulty situations under test, the PCA model introduced here successfully detects even extremely small variations, such as a 0.67% source power fault. The results are expected to contribute to OES-based plasma monitoring capability in plasma etching for the more-than-Moore era.

20.
Conventionally, for probabilistic principal component analysis (PPCA) based regression models, noise with a Gaussian distribution is assumed for both input and output observations. This assumption makes the model vulnerable to large random errors, known as outliers. In this article, unlike the conventional noise assumption, a mixture noise model with a contaminated Gaussian distribution is adopted for probabilistic modeling to diminish the adverse effect of outliers, which usually occur due to irregular process disturbances, instrumentation failures or transmission problems. This is done by downweighting the effect of the noise component that accounts for contamination on the output prediction. Outliers are common in process industries; therefore, handling this issue is of practical importance. In comparison with the conventional PPCA-based regression model, the prediction performance of the developed robust probabilistic regression model is improved in the presence of data contamination. To evaluate the model performance, two case studies were carried out. A simulated data set with specific characteristics highlighting the presence of outliers was used to demonstrate the robustness of the developed model, and its advantages are further illustrated using a set of real industrial process data.
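The contaminated-Gaussian downweighting mechanism can be illustrated on a plain linear regression rather than the full PPCA model: the posterior responsibility of the wide "contamination" component becomes a per-sample weight, and an EM-style loop alternates between a weighted fit and re-computing the weights. The mixing fraction, the two noise widths and the simulated data are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(5)

# Linear process with a few gross outliers in the output
x = rng.uniform(0, 10, 200)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=200)
y[::25] += 15.0                                  # contamination

def gauss(r, s):
    return np.exp(-0.5 * (r / s)**2) / (s * np.sqrt(2 * np.pi))

A = np.column_stack([x, np.ones_like(x)])        # design matrix [slope, intercept]
beta_ols, *_ = np.linalg.lstsq(A, y, rcond=None) # ordinary LS, biased by outliers

# Contaminated-Gaussian noise: (1-eps)*N(0, s1^2) + eps*N(0, s2^2), s2 >> s1.
# EM-style loop: responsibility of the "clean" component becomes a sample weight.
eps, s1, s2 = 0.05, 0.5, 5.0
w = np.ones_like(y)
for _ in range(20):
    beta, *_ = np.linalg.lstsq(A * w[:, None], y * w, rcond=None)
    r = y - A @ beta
    p_clean = (1 - eps) * gauss(r, s1)
    p_out = eps * gauss(r, s2)
    w = np.sqrt(p_clean / (p_clean + p_out))     # downweight contaminated points

print(beta_ols.round(2), beta.round(2))
```

Points with residuals consistent with the wide component get weights near zero, so the final fit recovers the clean relationship — the same effect the robust PPCA regression achieves on its latent-variable model.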

