Similar Articles
20 similar articles found.
1.
This work addresses the optimal planning and campaign scheduling of biopharmaceutical manufacturing processes, considering multiple operational characteristics such as the campaign scheduling of batch and/or continuous process steps, multiple intermediate deliveries, sequence-dependent changeover operations, product storage subject to shelf-life limitations, and the tracking of production/campaign lots required by regulatory policies. A new mixed-integer linear programming (MILP) model, based on a Resource Task Network (RTN) continuous-time single-grid formulation, is developed to integrate all of these features. Model performance is assessed by solving a set of industrial problems with different data sets and process layouts, demonstrating the broad applicability of the proposed formulation. A comparison with a related model from the literature is also performed, showing the advantages of the continuous-time approach and the generality of our model for the optimal production management of biopharmaceutical processes.
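A toy illustration of the sequence-dependent changeovers described above (a brute-force enumeration, not the paper's MILP/RTN formulation): with hypothetical processing and changeover times, the campaign order that minimizes total makespan is found by exhaustive search.

```python
from itertools import permutations

# Hypothetical campaign-sequencing instance: three products, with
# sequence-dependent changeover times between consecutive campaigns.
processing = {"A": 5.0, "B": 3.0, "C": 4.0}
changeover = {("A", "B"): 1.0, ("A", "C"): 2.5, ("B", "A"): 2.0,
              ("B", "C"): 0.5, ("C", "A"): 1.5, ("C", "B"): 2.0}

def makespan(seq):
    """Total time: processing of every campaign plus the changeovers between them."""
    total = sum(processing[p] for p in seq)
    total += sum(changeover[(a, b)] for a, b in zip(seq, seq[1:]))
    return total

best = min(permutations(processing), key=makespan)
print(best, makespan(best))
```

For three products exhaustive search is trivial; the MILP formulation in the paper is what makes realistic instances (many products, shelf-life and lot-tracking constraints) tractable.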

2.
In the semiconductor industry, process monitoring has been recognized as a critical component of the manufacturing system. Multivariate statistical process monitoring (SPM) techniques, such as multiway principal component analysis and multiway partial least squares, have been extended to monitor semiconductor processes. These SPM methods require extensive, often off-line data preprocessing such as data unfolding, trajectory mean shift, and trajectory alignment. This requirement is usually not an issue for traditional chemical batch processes, but it poses a significant challenge for semiconductor batch processes, because the preprocessing makes model building and maintenance extremely labor-intensive given the large number of models in a typical semiconductor fab. In addition, semiconductor process data often show more severe nonnormality than data from traditional chemical processes under closed-loop control, which results in suboptimal performance in many applications. To address these challenges, several pattern classification based monitoring (PCM) methods have been developed recently, but some limitations remain and trajectory alignment is still required. In this article, we analyze the fundamental reasons for the limitations of SPM and PCM methods when applied to semiconductor processes. In addition, we propose a new statistics pattern analysis (SPA) framework to address the challenges associated with semiconductor processes. By monitoring batch statistics, the proposed SPA framework not only eliminates all data preprocessing steps but also provides superior fault detection performance. Finally, we use an industrial example to demonstrate the advantages of the proposed SPA framework and examine the fundamental reasons for the improved performance of SPA. © 2010 American Institute of Chemical Engineers AIChE J, 2011
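The batch-statistics idea above can be sketched in a few lines: each variable-length batch is reduced to a fixed-length vector of statistics, so no unfolding or trajectory alignment is needed, and ordinary PCA with a Hotelling T² statistic is applied to those vectors. The data and the statistic choices here are illustrative, not the paper's exact SPA design.

```python
import numpy as np

def batch_statistics(batch):
    """Reduce one batch trajectory (time x variables) to a fixed-length
    vector of per-variable statistics: mean, std, and skewness."""
    mu = batch.mean(axis=0)
    sd = batch.std(axis=0)
    centered = batch - mu
    skew = (centered ** 3).mean(axis=0) / np.where(sd > 0, sd ** 3, 1.0)
    return np.concatenate([mu, sd, skew])

rng = np.random.default_rng(0)
# 30 training batches of *different* lengths -- no alignment is needed,
# because only per-batch statistics are retained.
stats = np.array([batch_statistics(rng.normal(size=(rng.integers(80, 120), 4)))
                  for _ in range(30)])

# Ordinary PCA on the batch-statistics matrix
X = stats - stats.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[:2].T                                    # 2-component scores
T2 = (scores ** 2 / (s[:2] ** 2 / (len(X) - 1))).sum(axis=1)  # Hotelling T^2
```

A new batch would be scored the same way: compute its statistics vector, project it onto the loadings, and compare T² against a control limit.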

3.
Soft sensors are widely used to estimate process variables that are difficult to measure online. In polymer plants that produce various grades of polymers, product quality must be estimated with soft sensors in order to reduce the amount of off-grade material. During grade transitions, however, predictive accuracy deteriorates because the state in the polymer reactors is unsteady, so the values of the process variables differ from the steady-state values used to construct the regression models. We therefore propose constructing models that detect the completion of a transition, ensuring that the polymer quality evaluated after the transition conforms to the predicted one. Using these models together with regression models constructed for each product grade, the polymer quality can be predicted with high accuracy by selecting an appropriate regression model. The proposed method was applied to industrial plant data and exhibited higher predictive performance than traditional methods.

4.
A data‐based approach for developing robust processes is presented and illustrated with an application to an industrial membrane manufacturing process. Using historical process data, principal component analysis and partial least squares are used to extract models of the process and of the sensitivities of the process to various disturbances, including raw material variations, environmental conditions, and process equipment differences. Robustness measures are presented to quantify the robustness of the process to each of these disturbances. The process is then made robust (insensitive) to the disturbances over which one has some control (e.g., by modifying the equipment units to which the process is sensitive and imposing specification regions on sensitive raw materials). It is also made robust to disturbances over which one has little control (e.g., environmental variations) by optimizing the process operating conditions with respect to performance and robustness measures. The optimization is easily performed in the low‐dimensional space of the latent variables even though the number of process variables involved is very large. After applying the methodology to historical data from the membrane manufacturing process, results from several months of subsequent operation are used to demonstrate the large improvement achieved in the robustness of the process. © 2010 American Institute of Chemical Engineers AIChE J, 2011

5.
The paper presents an approach to improving product quality from batch to batch by exploiting the repetitive nature of batch processes to update the operating trajectories using process knowledge obtained from previous runs. The data-based methodology uses a linear time-varying (LTV) perturbation model in an iterative learning control (ILC) framework to provide convergent batch-to-batch improvement of the process performance indicator. The major contribution of this work is a novel hierarchical ILC (HILC) scheme for the systematic design of the supersaturation controller (SSC) of seeded batch cooling crystallizers. The HILC determines the required supersaturation setpoint for the SSC and the corresponding temperature trajectory needed to produce crystals with the desired end-point properties. The performance and robustness of these approaches are evaluated through simulation case studies. The results demonstrate the potential of ILC approaches for controlling batch processes without rigorous process models.
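A minimal sketch of the batch-to-batch idea above: a P-type ILC update u_{k+1} = u_k + L·e_k applied to a toy first-order batch "plant". The plant, horizon, and learning gain are assumed for illustration and are unrelated to the paper's hierarchical crystallizer scheme.

```python
import numpy as np

def run_batch(u):
    """Toy linear batch 'process': y(t) = 0.4*y(t-1) + 0.5*u(t-1)."""
    y = np.zeros_like(u)
    for t in range(1, len(u)):
        y[t] = 0.4 * y[t - 1] + 0.5 * u[t - 1]
    return y

r = np.ones(50)        # desired output trajectory
u = np.zeros(50)       # input profile, refined batch-to-batch
L = 1.0                # learning gain (assumed; must satisfy a convergence condition)

errors = []
for k in range(30):
    y = run_batch(u)
    e = r - y
    errors.append(np.abs(e[1:]).mean())   # y(0) is fixed by the initial condition
    u[:-1] += L * e[1:]                   # u(t) affects y(t+1): shift error one step
```

The same input profile is replayed every run, with only the previous run's tracking error used to correct it, which is exactly why ILC needs no rigorous process model.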

6.
To improve the availability and performance of fuel cells, the operating temperature of a molten carbonate fuel cell (MCFC) stack should be strictly maintained within a specified operating range, and an efficient control technique should be employed to meet this objective. While most modern control strategies are based on process models, many existing MCFC models are not ready to be applied in the synthesis and operation of control systems. In this study, we developed an auto-regressive moving average (ARMA) model and machine-learning models, namely least squares support vector machine (LS-SVM), artificial neural network (ANN), and partial least squares (PLS) models, for the MCFC system based on input-output operating data. The ARMA model showed the best tracking performance. A model predictive control (MPC) method for the operation of the MCFC system was then developed based on the ARMA model. The control performance of the proposed MPC method was compared with that of conventional controllers using numerical simulations on various process models, including an MCFC process. Numerical results show that the ARMA-model-based control provides improved performance compared with the other control methods.
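As a hedged sketch of the data-driven identification step described above: a least-squares fit of an ARX(1,1) model (a simpler relative of the ARMA structure used in the paper) to simulated input-output data from an assumed first-order plant.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate input-output data from an assumed "true" first-order plant:
# y(t) = 0.7*y(t-1) + 0.3*u(t-1) + noise
N = 500
u = rng.normal(size=N)
y = np.zeros(N)
for t in range(1, N):
    y[t] = 0.7 * y[t - 1] + 0.3 * u[t - 1] + 0.01 * rng.normal()

# Least-squares fit of an ARX(1,1) model: y(t) = a*y(t-1) + b*u(t-1)
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
a_hat, b_hat = theta
```

Once identified, such a model gives the multi-step predictions an MPC layer needs: simulate the difference equation forward with candidate future inputs.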

7.
Dynamic risk analysis (DRA) has been widely used to analyze the performance of alarm and safety interlock systems in manufacturing processes. Because the most critical alarm and safety interlock systems are rarely activated, little or no data from these systems are often available for purely statistical DRA methods. Moskowitz et al. (2015) introduced a repeated-simulation, process-model-based technique for constructing informed prior distributions, generating low-variance posterior distributions for Bayesian analysis, and making alarm-performance predictions. This article presents a method for quantifying process-model quality, which affects the prior and posterior distributions used in the Bayesian analysis. The method uses higher-frequency alarm and process data to select the most relevant constitutive equations and assumptions. New data-based probabilistic models that describe important special-cause event occurrences and operators' response times are proposed and validated with industrial plant data. These models can be used to improve estimates of failure probabilities for alarm and safety interlock systems. © 2016 American Institute of Chemical Engineers AIChE J, 62: 3461–3472, 2016
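The informed-prior idea can be illustrated with a conjugate Beta-Binomial update (all numbers here are hypothetical, not from the cited work): a simulation-derived prior keeps the failure-probability estimate stable when field data from a rarely demanded interlock are scarce.

```python
# Informed Beta prior, e.g. distilled from repeated process-model simulations:
# Beta(8, 392) has mean 8/400 = 0.02 (assumed values for illustration).
alpha0, beta0 = 8.0, 392.0

# Scarce field data from a rarely activated interlock: 1 failure in 20 demands.
failures, demands = 1, 20

# Conjugate update: Beta(a, b) + Binomial data -> Beta(a + k, b + n - k)
alpha_post = alpha0 + failures
beta_post = beta0 + demands - failures

prior_mean = alpha0 / (alpha0 + beta0)
post_mean = alpha_post / (alpha_post + beta_post)
```

With only 20 demands, the naive estimate 1/20 = 0.05 would be dominated by noise; the informed prior pulls the posterior mean only slightly above 0.02, which is the low-variance behavior the repeated-simulation priors are meant to provide.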

8.
This work presents an uncertainty-conscious methodology for assessing process performance (for example, run time) in the manufacturing of biopharmaceutical drug products. The methodology is presented as an activity model using the type 0 integrated definition (IDEF0) functional modeling method, which systematically interconnects information, tools, and activities. In executing the methodology, a hybrid stochastic-deterministic model that reflects operational uncertainty in the assessment result is developed using Monte Carlo simulation. This model is used in a stochastic global sensitivity analysis to identify tasks that have a large impact on process performance under the existing operational uncertainty. Other factors are also considered, such as the feasibility of process modification under Good Manufacturing Practice, and the tasks to be improved are identified as the overall output. In a case study on cleaning and sterilization processes, suggestions were produced that could reduce the mean total run time of the processes by up to 40%. © 2017 American Institute of Chemical Engineers AIChE J, 64: 1272–1284, 2018
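A minimal Monte Carlo sketch of the run-time assessment idea above: task durations drawn from assumed triangular distributions are summed to obtain a distribution of total run time, from which a mean and a percentile can be read off. Task names and distribution parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # number of Monte Carlo samples

# Hypothetical task-duration distributions (hours) for a cleaning/sterilization
# sequence; triangular(min, mode, max) expresses operational uncertainty.
cleaning = rng.triangular(1.0, 1.5, 3.0, size=n)
sterilization = rng.triangular(0.5, 0.8, 2.0, size=n)
changeover = rng.triangular(0.2, 0.3, 1.0, size=n)

total = cleaning + sterilization + changeover
mean_rt = total.mean()
p95_rt = np.percentile(total, 95)
```

A crude sensitivity check in the same spirit as the paper's global sensitivity analysis: re-run with one task's uncertainty removed and see how much the spread of `total` shrinks.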

9.
In this paper, a modified version of the support vector machine (SVM) is proposed as an empirical model for polymerization processes. Exact first-principles models of polymerization processes are seldom available, so the relations between input and output variables must be estimated with an empirical inference model, which can then be used in process monitoring, optimization, and quality control. The SVM is a good tool for modeling polymerization processes because it handles highly nonlinear systems successfully. The proposed method is derived by modifying the risk function of the standard SVM using the concept of locally weighted regression. Based on this smoothness concept, it handles correlations among many process variables and nonlinearities more effectively. Case studies show that the proposed method outperforms standard support vector regression (SVR), which itself outperforms traditional statistical learning machines on high-dimensional, sparse, and nonlinear data.
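The locally weighted idea the modified SVM borrows can be sketched with plain locally weighted linear regression: each query point gets its own fit, with training samples down-weighted by their distance to the query. This illustrates the weighting concept only, not the paper's modified risk function.

```python
import numpy as np

def lwr_predict(Xq, X, y, tau=0.3):
    """Locally weighted linear regression: refit a line at each query point
    with Gaussian distance weights of bandwidth tau."""
    A = np.column_stack([np.ones_like(X), X])
    preds = []
    for xq in Xq:
        w = np.exp(-((X - xq) ** 2) / (2 * tau ** 2))
        W = np.diag(w)
        beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
        preds.append(beta[0] + beta[1] * xq)
    return np.array(preds)

rng = np.random.default_rng(4)
X = np.sort(rng.uniform(-3, 3, 120))
y = np.sin(X) + 0.05 * rng.normal(size=X.size)

Xq = np.linspace(-2.5, 2.5, 25)
pred = lwr_predict(Xq, X, y)
max_err = np.abs(pred - np.sin(Xq)).max()
```

Even though every local fit is linear, the collection of local fits tracks the nonlinear sine curve, which is the mechanism the modified risk function exploits.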

10.
We study the problem of intervention effects generating various types of outliers in a linear count time-series model. This model belongs to the class of observation-driven models and extends the class of Gaussian linear time-series models within the exponential-family framework. Studies of the effects of covariates and interventions in count time-series models have largely lagged behind, because the underlying process, whose behaviour determines the dynamics of the observed process, is not observed. We suggest a computationally feasible approach to these problems, focusing especially on the detection and estimation of sudden shifts and outliers. We consider three scenarios: the detection of an intervention effect of a known type at a known time, the detection of an intervention effect when both the type and the time are unknown, and the detection of multiple intervention effects. We develop score tests for the first scenario and a parametric bootstrap procedure based on the maximum of the different score test statistics for the second. The third scenario is treated by a stepwise procedure in which intervention effects are detected and corrected iteratively. The usefulness of the proposed methods is illustrated using simulated and real data examples.
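A rough stand-in for detecting an intervention at an unknown time (a simple likelihood-ratio scan over independent Poisson counts, not the paper's score tests for observation-driven models): counts with a simulated level shift are scanned over candidate change points.

```python
import numpy as np

rng = np.random.default_rng(9)
# Poisson counts with a sudden level shift at t = 70 (simulated data)
y = np.concatenate([rng.poisson(3.0, 70), rng.poisson(8.0, 50)])

def loglik(counts, lam):
    """Poisson log-likelihood up to an additive constant."""
    return np.sum(counts * np.log(lam) - lam)

ll0 = loglik(y, y.mean())                      # no-shift model
best_tau, best_ll = None, -np.inf
for tau in range(10, len(y) - 10):             # scan candidate shift times
    ll = loglik(y[:tau], y[:tau].mean()) + loglik(y[tau:], y[tau:].mean())
    if ll > best_ll:
        best_tau, best_ll = tau, ll

lr = 2 * (best_ll - ll0)                       # likelihood-ratio statistic
```

Because the scan maximizes over candidate times, the statistic's null distribution is nonstandard; the paper addresses exactly this with a parametric bootstrap over the maximum of the score statistics.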

11.
Near-infrared spectroscopy (NIR) is the most widely used process analytical technology (PAT) in the pharmaceutical industry, and it is receiving growing attention for the online, real-time measurement and control of the quality of traditional Chinese medicine (TCM) products. Compared with chemical drugs, the compositional complexity of TCM and the particularities of its manufacturing processes pose new challenges for building NIR prediction models with chemometrics. This paper reviews the chemometric modeling methods and techniques used in NIR-based TCM quality control and offers an outlook on future developments. The review covers NIR data acquisition, preprocessing, and partitioning; automatic selection of characteristic wavebands; model building; and model validation and evaluation. The influence on the models of preprocessing methods such as smoothing, derivatives, normalization algorithms, data-enhancement algorithms, and principal component analysis is discussed. Waveband selection methods covered include interval partial least squares, genetic algorithms, uninformative variable elimination, random frog, competitive adaptive reweighted sampling, and variable importance in projection. Modeling methods include linear and nonlinear techniques such as principal component regression, partial least squares regression, artificial neural networks, and support vector regression. A future NIR modeling platform should integrate sophisticated mathematical algorithms and seamless data sharing in the back end, while presenting users with a friendly, simple, and intelligent semi-automatic front end. The discussion is illustrated with concrete examples.

12.
The application of multivariate statistical projection based techniques has been recognized as one approach to increasing the understanding of process behaviour. The key methodologies have included multi-way principal component analysis (PCA), multi-way partial least squares (PLS), and batch observation-level analysis. Batch processes typically exhibit nonlinear, time-variant behaviour, and these characteristics challenge the aforementioned techniques. To address these challenges, dynamic PLS has been proposed to capture the process dynamics. Likewise, approaches to removing the process nonlinearities have included removal of the mean trajectory and the application of nonlinear PLS. An alternative approach is described whereby the batch trajectories are sub-divided into operating regions, with a linear/linear dynamic model fitted to each region. These individual models are spliced together to provide an overall nonlinear global model. Such a structure offers an alternative approach to batch process performance monitoring. In the paper a number of techniques are considered for developing the local models, including multi-way PLS and dynamic multi-way PLS. Using the most promising set of results from a simulation study of a batch process, the local model comprising individual linear dynamic PLS models was benchmarked against a global nonlinear dynamic PLS model using data from an industrial batch fermentation process. The results for the local operating-region techniques were comparable to the global model in terms of the residual sum of squares, but the residuals of the global model showed structure. Consequently, the local modelling approach is statistically more robust.

13.
This work explores the design of distributed model predictive control (DMPC) systems for nonlinear processes using machine learning models to predict nonlinear dynamic behavior. Specifically, sequential and iterative DMPC systems are designed and analyzed with respect to closed-loop stability and performance properties. Extensive open-loop data within a desired operating region are used to develop long short-term memory (LSTM) recurrent neural network models with a sufficiently small modeling error relative to the actual nonlinear process model. These LSTM models are then utilized in Lyapunov-based DMPC to achieve efficient real-time computation while ensuring closed-loop state boundedness and convergence to the origin. Using a nonlinear chemical process network example, simulation results demonstrate the improved computational efficiency when the process is operated under sequential and iterative DMPC, while the closed-loop performance is very close to that of a centralized MPC system.
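The model-then-control workflow above can be sketched with a much simpler surrogate than an LSTM: a least-squares polynomial model fitted to open-loop data stands in for the recurrent network, and it is used inside a one-step receding-horizon controller. The plant, basis, and cost are assumed for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def plant(x, u):
    """Assumed 'true' nonlinear process (scalar, for illustration)."""
    return 0.9 * x + 0.2 * u - 0.1 * x ** 3

# 1) Fit a surrogate from open-loop data (stand-in for the LSTM in the paper)
data = rng.uniform(-2, 2, size=(500, 2))            # columns: x(t), u(t)
targets = plant(data[:, 0], data[:, 1])
feats = np.column_stack([data[:, 0], data[:, 1], data[:, 0] ** 3])
w, *_ = np.linalg.lstsq(feats, targets, rcond=None)

def surrogate(x, u):
    return w[0] * x + w[1] * u + w[2] * x ** 3

# 2) Receding horizon: at each step pick u minimizing the predicted cost
u_grid = np.linspace(-1, 1, 41)
x = 1.5
traj = [x]
for t in range(20):
    cost = [surrogate(x, u) ** 2 + 0.01 * u ** 2 for u in u_grid]
    u_star = u_grid[int(np.argmin(cost))]
    x = plant(x, u_star)                            # apply to the true plant
    traj.append(x)
```

The controller only ever queries the surrogate; closing the loop on the true plant still drives the state toward the origin because the surrogate's modeling error is small over the operating region, which mirrors the sufficiently-small-error condition the paper imposes on its LSTM models.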

14.
Soft sensor techniques have been widely used to estimate product quality or other key indices that cannot be measured online by hardware sensors. Unfortunately, their estimation performance deteriorates under certain circumstances, e.g., when the process characteristics change; this is especially true for global learning approaches. Meanwhile, local learning methods usually use only input information to select relevant instances, which may waste output information and lead to inaccurate sample selection. To overcome these disadvantages, a new local modeling algorithm, the adaptive local kernel-based learning scheme (ALKL), is proposed. First, a new similarity measure using both input and output information is proposed and employed in a supervised locality preserving projection technique to select relevant samples. Second, an adaptive weighted least squares support vector regression (AW-LSSVR) is used to establish a local model and predict output indices for each query point. In AW-LSSVR, instead of traditional cross-validation, the trade-off parameters are adjusted iteratively and the local model is updated recursively, which substantially reduces the computational complexity. The proposed ALKL is applied to online crude-oil endpoint prediction in an industrial fluidized catalytic cracking unit (FCCU) process. The experimental results demonstrate the high precision of the ALKL approach.

15.
This article focuses on the design of model predictive control (MPC) systems for nonlinear processes that utilize an ensemble of recurrent neural network (RNN) models to predict nonlinear dynamics. Specifically, RNN models are initially developed based on a data set generated from extensive open-loop simulations within a desired process operation region to capture process dynamics with a sufficiently small modeling error between the RNN model and the actual nonlinear process model. Subsequently, Lyapunov-based MPC (LMPC) that utilizes RNN models as the prediction model is developed to achieve closed-loop state boundedness and convergence to the origin. Additionally, machine learning ensemble regression modeling tools are employed in the formulation of LMPC to improve prediction accuracy of RNN models and overall closed-loop performance while parallel computing is utilized to reduce computation time. Computational implementation of the method and application to a chemical reactor example is discussed in the second article of this series.

16.
王亚君  孙福明 《化工学报》2014,65(12):4905-4913
To address the problem that traditional multivariate statistical monitoring methods cannot effectively detect weak faults in industrial processes caused by large fluctuations in initial conditions, a monitoring strategy based on multiple dynamic kernel clustering of kernel principal component analysis models (DKCPCA) is proposed for the online monitoring of weak faults in multistage batch processes. For each batch of data in each stage, the method first builds a dynamic kernel PCA model by combining an autoregressive moving-average exogenous (ARMAX) time-series model with kernel principal component analysis (KPCA). The batch models are then grouped by hierarchical clustering according to the similarity of their loadings, and the batches clustered together are unfolded and used to rebuild a dynamic kernel PCA model, so that multiple class models are obtained as the number of clusters varies. For online application, a multi-model selection strategy is given to improve monitoring precision. The method was applied to the monitoring of a penicillin fermentation process, and the results show that it achieves better monitoring performance than DKPCA and MKPCA.

17.
Two approaches for the optimal control of diffusion-convection-reaction processes based on reduced-order models are presented. The approaches differ in how spatial discretization is carried out to compute a reduced-order model suitable for controller design. In the first approach, the partial differential equation (PDE) that describes the process is first discretized in space and time using the finite difference method to derive a large number of recursive algebraic equations, which are written as a discrete-time state-space model with sparse state, input, and output matrices. Snapshots based on this high-dimensional state-space model are generated to calculate empirical eigenfunctions using proper orthogonal decomposition. The Galerkin projection with the computed empirical eigenfunctions as basis functions is then applied directly to the high-dimensional state-space model to derive a reduced-order model. In the second approach, a continuous-time finite-dimensional state-space model is constructed directly from the PDE through orthogonal collocation on finite elements in the spatial domain. The dimension of the derived state-space model can be further reduced using standard model reduction techniques. In both cases, optimal controllers are designed based on the low-order state-space models using discrete-time and continuous-time linear quadratic regulator (LQR) techniques. The effectiveness of the proposed methods is illustrated through applications to a diffusion-convection process and a diffusion-convection-reaction process.
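The snapshot/POD step of the first approach can be sketched directly: snapshots of a 1-D diffusion simulation are collected, and the empirical eigenfunctions are the left singular vectors of the snapshot matrix. The grid, time step, and initial condition are illustrative choices, not the paper's case study.

```python
import numpy as np

# Snapshot generation for 1-D diffusion u_t = u_xx with zero boundary values,
# using explicit finite differences (dt/dx^2 < 0.5 for stability).
nx, nt, dt = 50, 200, 1e-4
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
u = np.sin(np.pi * x) + 0.5 * np.sin(3 * np.pi * x)   # initial condition
snaps = [u.copy()]
for _ in range(nt - 1):
    u[1:-1] += dt / dx ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    snaps.append(u.copy())
X = np.array(snaps).T                  # snapshot matrix, shape (nx, nt)

# POD: empirical eigenfunctions are the left singular vectors of X
U, s, Vt = np.linalg.svd(X, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)
r = int(np.searchsorted(energy, 0.9999)) + 1          # modes for 99.99% energy
Xr = (U[:, :r] * s[:r]) @ Vt[:r]                      # rank-r reconstruction
rel_err = np.linalg.norm(X - Xr) / np.linalg.norm(X)
```

A handful of modes captures essentially all of the snapshot energy, which is what makes the subsequent Galerkin projection and low-order LQR design tractable.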

18.
In the process industry, automation and process control systems are widely implemented; information integration, however, remains far from satisfactory. It is still difficult for senior managers to make decisions based on plant-wide, real-time integrated information. This paper proposes a multi-layer information integration platform. At the data integration level, the standard for the exchange of product model data (STEP) and the extensible markup language (XML) are used to unify the chemical process data. At the model integration level, models are integrated using a neutral model repository and CAPE-OPEN. For the integration of process tasks, the common object request broker architecture (CORBA) is used as the communication mediator, with XML as the data standard. A uniform information platform is thus constructed and realized, and it is satisfactorily applied to the Tennessee Eastman (TE) problem.
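The XML-as-neutral-format idea can be illustrated with the Python standard library; the tag and attribute names below are hypothetical and are not drawn from the STEP or CAPE-OPEN specifications.

```python
import xml.etree.ElementTree as ET

# Wrap heterogeneous plant data in a neutral XML record that any layer of the
# platform (data, model, or task) could parse without vendor-specific code.
rec = ET.Element("ProcessRecord", unit="Reactor-101")
for name, value, uom in [("temperature", "351.2", "K"),
                         ("pressure", "2.4", "bar")]:
    var = ET.SubElement(rec, "Variable", name=name, uom=uom)
    var.text = value

xml_str = ET.tostring(rec, encoding="unicode")

# A consumer parses the neutral format back into native values
parsed = ET.fromstring(xml_str)
values = {v.get("name"): float(v.text) for v in parsed.findall("Variable")}
```

In the proposed platform the same serialized records would travel over CORBA between process tasks, with XML acting as the common data standard on both ends.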

20.
Low-pressure chemical vapor deposition (LPCVD) is one of the most important processes in semiconductor manufacturing. However, the spatial distribution of the internal temperature and the extremely small number of samples make it hard to build a good-quality model of this batch process. Moreover, because of the properties of this process, the reliability of the model must be taken into consideration when optimizing the manipulated variables (MVs). In this work, an optimal design strategy based on a self-learning Gaussian process model (GPM) is proposed to control this kind of spatial batch process. The GPM is used as the internal model to predict the thicknesses of the thin films on all spatially distributed wafers from the limited data. Unlike conventional model-based design, the prediction uncertainties provided by the GPM are taken into consideration to guide the optimal design of the manipulated variables, so that the design can be more prudent. In addition, the GPM is actively enhanced using as little data as possible, based on the predictive uncertainties. The effectiveness of the proposed strategy is successfully demonstrated on an LPCVD process.
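A minimal Gaussian-process sketch of the uncertainty-guided idea above: with very few samples, a GP supplies both a predictive mean and a predictive variance, and the next experiment is placed where the variance is largest. The kernel, its hyperparameters, and the data are all assumed for illustration.

```python
import numpy as np

def rbf(a, b, ell=0.3, sf=1.0):
    """Squared-exponential covariance (hyperparameters assumed, not fitted)."""
    return sf ** 2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

rng = np.random.default_rng(5)
X = np.array([0.0, 0.2, 0.5, 0.9])          # very few samples, as in LPCVD
y = np.sin(2 * np.pi * X) + 0.01 * rng.normal(size=X.size)

sn = 0.01                                   # assumed measurement-noise level
K = rbf(X, X) + sn ** 2 * np.eye(X.size)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

Xs = np.linspace(0.0, 1.0, 101)
Ks = rbf(X, Xs)
mean = Ks.T @ alpha                         # predictive mean
v = np.linalg.solve(L, Ks)
var = np.maximum(1.0 - (v ** 2).sum(axis=0), 0.0)   # predictive variance (sf=1)

# Active learning: the next experiment goes where the model is least certain
x_next = Xs[int(np.argmax(var))]
```

The variance collapses near the training points and grows in the gaps between them, so `x_next` automatically lands away from existing data, which is the self-learning behavior the paper exploits to enhance the GPM with as few runs as possible.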

