Similar Documents
20 similar records found (search time: 15 ms)
1.
李钢  代海飞 《计算机应用》2008,28(10):2718-2720
Building on an analysis of existing theory for small-batch and multivariate control charts, this paper proposes a modeling approach, based on Kalman filtering, that comprehensively addresses small-batch multivariate process control. Simulation experiments and an application example show that the approach makes full use of the data already collected and builds the control model dynamically, thereby solving the problem of insufficient modeling data in small-batch production.

2.
Neural networks have recently received a great deal of attention in the field of manufacturing process quality control, where statistical techniques have traditionally been used. In this paper, a neural-based procedure for quality monitoring is discussed from a statistical perspective. The neural network is based on Fuzzy ART, which is exploited for recognising any unnatural change in the state of a manufacturing process. Initially, the neural algorithm is analysed by means of geometrical arguments. Then, in order to evaluate control performance in terms of Type I and Type II errors, the effects of three tuneable parameters are examined through a statistical model. Upper bounds for the error rates are analytically computed, and then numerically illustrated for different combinations of the tuneable parameters. Finally, a criterion for designing the neural network is proposed and validated in a specific test case through simulation. The results demonstrate the effectiveness of the proposed neural-based procedure for manufacturing quality monitoring.
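The core Fuzzy ART mechanics described above (complement coding, a choice function for ranking categories, a vigilance test, and fast learning) can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's tuned design; the parameter values are assumptions.

```python
import numpy as np

class FuzzyART:
    """Minimal Fuzzy ART sketch: complement coding, choice function,
    vigilance test, and fast learning. rho = vigilance, alpha = choice
    parameter, beta = learning rate (standard formulation)."""

    def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.w = []  # one weight vector per learned category

    def _code(self, x):
        x = np.asarray(x, dtype=float)
        return np.concatenate([x, 1.0 - x])  # complement coding

    def train(self, x):
        i = self._code(x)
        # Rank existing categories by the choice function T_j.
        order = sorted(range(len(self.w)),
                       key=lambda j: -np.minimum(i, self.w[j]).sum()
                                     / (self.alpha + self.w[j].sum()))
        for j in order:
            match = np.minimum(i, self.w[j]).sum() / i.sum()
            if match >= self.rho:  # vigilance test passed: resonance
                self.w[j] = (self.beta * np.minimum(i, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return j
        self.w.append(i.copy())   # no category matches: create a new one
        return len(self.w) - 1
```

In a monitoring context, samples from the in-control process fall into established categories, while an unnatural process change fails the vigilance test and spawns a new category, which serves as the alarm signal.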

3.
The research described in this paper evaluates the implementation of a new method of quality control in continuous flow processes. Specifically, the Kalman filter is introduced as a technique that does not depend on the traditional assumption of independent data. The application, utilization, and evaluation of the Kalman filter for low-order processes (specifically, AR(2)) are provided. Small-scale prototypes indicate that the initial results are promising and should be the basis of future research.

4.
Automated quality control is a key aspect of industrial maintenance. In manufacturing processes, this is often done by monitoring relevant system parameters to detect deviations from normal behavior. Previous approaches define “normalcy” as statistical distributions for a given system parameter, and detect deviations from normal by hypothesis testing. This paper develops an approach to manufacturing quality control using a newly introduced method: Bayesian Posteriors Updated Sequentially and Hierarchically (BPUSH). This approach outperforms previous methods, achieving reliable detection of faulty parts with low computational cost and low false alarm rates (∼0.1%). Finally, this paper shows that sample size requirements for BPUSH fall well below typical sizes for comparable quality control methods, achieving True Positive Rates (TPR) >99% using as few as n = 25 samples.
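The abstract does not spell out BPUSH's hierarchical structure, but the sequential-posterior-updating idea behind it can be illustrated with a conjugate Beta-Binomial model for a defect rate. This is a hedged sketch of the general idea only, not the BPUSH algorithm itself; the batch data are invented.

```python
# Sequential Bayesian monitoring of a defect rate with a Beta-Binomial
# conjugate pair. This is NOT BPUSH itself (whose hierarchical details
# are not given in the abstract); it only illustrates the sequential
# posterior updating on which such a method is built.
a, b = 1.0, 1.0  # Beta(1, 1) prior on the defect probability

def update(a, b, defective, n):
    """Fold a batch of n inspected parts with `defective` failures
    into the Beta posterior (conjugate update)."""
    return a + defective, b + (n - defective)

def posterior_mean(a, b):
    return a / (a + b)

# Simulated inspection: 25 parts per batch, echoing the abstract's n = 25.
for defective in [0, 1, 0, 0, 2]:
    a, b = update(a, b, defective, 25)
```

After each batch the posterior sharpens, so a decision rule (e.g. flagging when the posterior probability of an elevated defect rate exceeds a threshold) can fire with far fewer samples than a fixed-sample hypothesis test.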

5.
In this paper we analyze the simultaneous monitoring of p Poisson quality characteristics, developing a new multivariate control chart based on a linear combination of the Poisson variables: the LCP control chart. The optimization of the coefficients of this linear combination (and of the control limit) to minimize the out-of-control ARL is constrained by the desired in-control ARL. To facilitate the use of this new control chart, the optimization is carried out with user-friendly Windows© software, which also compares the performance of this chart with other schemes for monitoring a set of Poisson variables: a control chart on the sum of the variables (MP chart), a control chart on their maximum (MX chart), and an optimized set of univariate Poisson charts (Multiple scheme). The LCP control chart shows very good performance. First, the desired in-control ARL (ARL0) is matched exactly because the linear combination of Poisson variables is not constrained to integer values, an advantage over the rest of the charts, which in general cannot match the required ARL0 value. Second, in the vast majority of cases this scheme signals process shifts faster than the other charts.
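The LCP chart signals when a linear combination of the Poisson counts exceeds a control limit, and for a chart that signals with probability p per sample the ARL is 1/p. A Monte Carlo sketch of that relationship is below; the coefficients and limit are illustrative assumptions, not the optimized values the paper's software would produce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (NOT optimized) coefficients and control limit for
# p = 2 Poisson characteristics with in-control means lam0.
lam0 = np.array([4.0, 2.0])
coef = np.array([0.6, 0.4])
UCL = 5.5

def signal_prob(lam, n=200_000):
    """Monte Carlo estimate of P(c'X > UCL) per sample."""
    x = rng.poisson(lam, size=(n, lam.size))
    return np.mean(x @ coef > UCL)

# A chart signalling with probability p per sample has geometric run
# lengths, so ARL = 1 / p.
ARL0 = 1.0 / signal_prob(lam0)            # in-control ARL
ARL1 = 1.0 / signal_prob(lam0 + [2, 0])   # ARL after a shift in X1
```

Because c'X takes non-integer values, the control limit can be tuned continuously to hit a target ARL0 exactly, which is the advantage the abstract highlights over integer-valued statistics such as the sum or maximum.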

6.
A method for controlling manufacturing tests in real time by statistically predicting test behavior is described. This statistical prediction is used to eliminate certain tests in sequential testing. Analytic methods for clustering tests for more efficient execution and an algorithm for predictive testing are presented. A relational database using a structured query language system, SQL/DB2, is proposed; its structure allows efficient retrieval of the information needed for test prediction, test clustering, and predictive testing. A test-system architecture based on personal computers is presented.

7.
The multiway partial least squares (MPLS) method is applied to modeling and fault diagnosis of a batch penicillin production process. Because the data from the penicillin fermentation process are multidimensional, conventional partial least squares makes statistical modeling and fault diagnosis of the process difficult to achieve. MPLS unfolds the multidimensional batch data along the variable direction, so that PLS models can be built from multiple batches for each operating stage of the process, enabling real-time monitoring and fault diagnosis. Using the T2 and Q statistics together with contribution plots, the process is analyzed in simulation; the agreement between the theoretical analysis and the simulation results demonstrates that the method is feasible for fault detection and diagnosis in penicillin production.
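The monitoring pipeline above (unfold the batch array, build a latent-variable model, chart T2 and Q) can be sketched as follows. Since no quality-variable block is available here, a PCA model stands in for the PLS model; the data, dimensions, and number of components are all illustrative assumptions, but T2 and Q are computed in the standard way.

```python
import numpy as np

rng = np.random.default_rng(1)

# Batch data: I batches x J variables x K time points, unfolded into an
# (I, J*K) matrix as in multiway modeling.
I, J, K = 30, 4, 20
X = rng.normal(size=(I, J, K)) + np.sin(np.linspace(0, 3, K))  # shared trajectory
Xu = X.reshape(I, J * K)

# Center/scale, then fit a latent-variable model. A PCA model (via SVD)
# stands in for the PLS model of the abstract; T2 and Q follow the same
# formulas either way.
mu, sd = Xu.mean(0), Xu.std(0) + 1e-12
Z = (Xu - mu) / sd
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
A = 3                                  # retained latent components (assumed)
lam = (s[:A] ** 2) / (I - 1)           # variances of the scores

def t2_q(x_new):
    z = (x_new - mu) / sd
    t = Vt[:A] @ z                     # scores of the new batch
    T2 = float(np.sum(t * t / lam))    # Hotelling T2 within the model plane
    r = z - Vt[:A].T @ t               # part the model cannot explain
    Q = float(r @ r)                   # squared prediction error (SPE)
    return T2, Q

T2_ok, Q_ok = t2_q(Xu[0])              # a normal batch
T2_bad, Q_bad = t2_q(Xu[0] + 5.0)      # faulty batch: level shift everywhere
```

T2 flags batches that are extreme within the model plane, while Q flags batches that break the correlation structure entirely; contribution plots then apportion an alarm back to individual variables.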

8.
Neural-network techniques for the development of models of critical parameters in continuous forest products manufacturing processes are described. Predictive models of strength parameters in particleboard manufacturing were developed utilizing both backpropagation and counterpropagation neural network techniques. The modeled strength parameters were modulus of rupture and internal bond. The backpropagation neural network model did not provide sufficient accuracy in predicting the values of the strength parameters. Counterpropagation was successful at predicting modulus of rupture within ± 10% and internal bond within ± 15%. The trained counterpropagation network can be used to improve process control and reduce the amount of substandard and scrap board produced. Efforts are underway to refine the counterpropagation network and further improve its predictive capability, as well as to evaluate alternative neural network paradigms.

9.
This paper proposes two kinds of statistical games constructed to show how to achieve a quality assurance system based on SPC (statistical process control) by using simple models and software tools. The proposed games are the Coin Shooting Game and the Paper Glider Releasing Game. These games can be played on a table using simple materials and are easy to play. They are described using actual data.

Participants in these games get the necessary outputs in a timely fashion by using prepared software tools, so that they can make effective and efficient decisions. This saves time and also raises the level of understanding of how to achieve a quality assurance system based on SPC.
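An X-bar chart is the kind of "prepared software tool" such games rely on: subgroup means from the game rounds are charted against limits derived from the subgroup ranges. The sketch below uses invented data and the standard Shewhart factor A2 for subgroups of size 5.

```python
# X-bar chart sketch for a tabletop SPC game. The measurements are
# invented (e.g. coin landing distances, 5 shots per round); A2 is the
# standard Shewhart control chart factor for subgroups of size n = 5.
data = [
    [50, 52, 49, 51, 50],
    [48, 51, 50, 49, 52],
    [51, 50, 53, 49, 50],
    [70, 72, 69, 71, 70],  # a shifted round the chart should flag
]
A2 = 0.577  # factor for n = 5

xbars = [sum(g) / len(g) for g in data[:3]]     # baseline rounds only
ranges = [max(g) - min(g) for g in data[:3]]
xbarbar = sum(xbars) / len(xbars)               # grand mean (center line)
rbar = sum(ranges) / len(ranges)                # mean range
UCL = xbarbar + A2 * rbar
LCL = xbarbar - A2 * rbar

def out_of_control(group):
    m = sum(group) / len(group)
    return m > UCL or m < LCL
```

Players plot each round's mean as it is produced; a point beyond the limits prompts a search for an assignable cause, which is exactly the decision loop the games are designed to teach.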


10.
The typical manufacturing facility is constantly developing new product designs and related manufacturing processes. The increased volume of new designs and processes causes rapid and inefficient construction of product designs and manufacturing processes. Many parts and manufacturing processes are developed over the life cycle of a production facility with no organized means of cataloging this past and present data. This procedure is extremely ineffective because there is no way to determine if a part or process has been previously developed. The constant “reinventing of the wheel” creates a tremendous waste of manpower and cost.

One approach to solving this problem is through the use of group technology. Group technology is the identification and grouping of similar parts and processes in order to take advantage of their similarities in the design and manufacturing process. Parts and processes can be grouped under a classification and implemented with a coding system. Concurrently, the number of parts and processes can be reduced by putting them in a “family.” This “family” has common characteristics such as shape, size, color, tolerance, or production operations.

For handling and manipulation of this data, a computer system has been developed. The computer system sets up a reporting format that classifies, codes, and groups the parts and processes, so the user can determine whether a previously designed process or part can be used in the current system and/or whether a better layout is feasible.

Reduced inventory cost, increased facility space, and better utilization of manpower are but a few of the benefits of this system.


11.
12.
13.
A computer network engineering project generally comprises multiple kinds of work and requires knowledge and techniques from several disciplines to solve its problems. Project execution involves many unknown factors, each of which may carry its own uncertainty; personnel with different backgrounds must be organized into a temporary team to achieve the planned objectives under fairly strict constraints on cost, schedule, and the like. A large computer network project sometimes consists of several subprojects, and these subprojects may in turn contain work units with logical precedence relations; the subprojects, work units, and other subsystems constrain and depend on one another, together forming the complete project system. These factors make computer network engineering project management a complex undertaking.

14.
There are complicated coupling relations among the quality features (QFs) of a manufacturing process. In general, machining errors in one key feature may cause errors in other features coupled with it. Considering the roles of key QFs, a weighted-coupled network (W-CN) based quality control method for improving key features is proposed in this paper. First, the W-CN model is established by defining the mapping rules of the network elements (node, edge, weight). Second, performance indices are introduced to evaluate the properties of the W-CN; the influence index of each node is calculated to identify the key nodes representing key features. Third, three coupling modes of nodes are discussed, and the coupling degrees of key nodes are calculated to describe coupling strength. Then a decoupling method based on a small-world optimization algorithm is discussed to analyze the status changes of the key nodes accurately. Finally, a case study of an engine cylinder body illustrates and verifies the proposed method. The results show that the method can provide guidance for improving product quality in the manufacturing process.
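The abstract does not define its influence index, but the basic W-CN bookkeeping (a weighted coupling matrix over quality features, a node-ranking index, and a pairwise coupling degree) can be sketched with weighted out-strength as an illustrative stand-in. The matrix values and the index choice are assumptions.

```python
import numpy as np

# Weighted coupling matrix among 5 hypothetical quality features:
# W[i, j] is the coupling weight from feature i to feature j. The
# paper's influence index is not specified in the abstract, so weighted
# out-strength serves here as an illustrative stand-in for ranking nodes.
W = np.array([
    [0.0, 0.8, 0.6, 0.0, 0.4],   # feature 0 drives many others
    [0.1, 0.0, 0.0, 0.2, 0.0],
    [0.0, 0.1, 0.0, 0.3, 0.0],
    [0.0, 0.0, 0.1, 0.0, 0.2],
    [0.0, 0.0, 0.0, 0.1, 0.0],
])

strength = W.sum(axis=1)                 # out-strength of each node
key_node = int(np.argmax(strength))      # candidate key quality feature

def coupling_degree(i, j):
    """Symmetric coupling degree between nodes i and j (illustrative)."""
    return float(W[i, j] + W[j, i])
```

Ranking nodes this way singles out the features whose machining errors propagate most widely, which is where the decoupling and quality-improvement effort described in the abstract would be focused.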

15.
Taking CMMI as an example, this paper focuses on statistical control methods for the seven lower-level process areas involved when the level-2 project management process is unstable, and when, at level 3, the organization's standard process is stable but the project's defined software process is not. The eight process areas are then unified under a single measurement model for management, in order to understand process behavior and promote the stability, predictability, and improvement of the software process.

16.
To apply statistical process control (SPC) to the software development process, the characteristics of software development and the difficulties of applying SPC to it are analyzed on the basis of project examples and experience. Through control-chart monitoring during the testing phase, the importance of a process-centered view is established, and a method of improving the software process through proactive activities is proposed. The significance and value of self-direction in software process improvement are discussed, and the method, steps, and caveats for applying SPC to software development are given; a peer-review example shows that the method is feasible and effective.
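A c-chart on defect counts (e.g. defects found per peer review or per test cycle) is one simple way to implement the control-chart monitoring described above; the counts below are illustrative.

```python
# c-chart sketch for software process data: defect counts per peer
# review, with 3-sigma limits for a Poisson count. The data are invented.
defects = [4, 6, 5, 3, 7, 5, 4, 6]      # baseline reviews
cbar = sum(defects) / len(defects)      # center line
UCL = cbar + 3 * cbar ** 0.5            # Poisson: variance equals the mean
LCL = max(0.0, cbar - 3 * cbar ** 0.5)  # counts cannot go below zero

def in_control(c):
    """True if a review's defect count is within the control limits."""
    return LCL <= c <= UCL
```

A review falling outside the limits is a signal to examine the process (unstable requirements, a rushed inspection, an inexperienced author) rather than the individuals, which is the process-centered view the abstract argues for.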

17.
Dreams of using digital computers in industrial control systems surfaced almost as soon as such a computer was invented in the mid to late 1940s. By the early fifties, the concepts of such use were fairly well established. However, actual applications had to wait until relatively small, reliable, and also relatively inexpensive machines were available, along with vendor companies with the will and the initiative to pursue this field vigorously. Such a company was the Ramo-Wooldridge Company, which entered this field in the mid-fifties. The company found ready acceptance of its products among the companies in the process industries. By the mid-sixties, there were installations in almost every process industry and many other vendors had entered the field. Such installations became the norm for computer applications to industrial control until the microprocessor and its associated distributed computer control systems superseded them beginning in the mid-seventies. The article chronicles the development of this early field by describing several of the early installations and their successes and difficulties.

18.
We propose a method for flow control of parts in a manufacturing system with machines that require setups. The setup scheduling problem is investigated in the context of a multilevel hierarchy of discrete events with distinct frequencies. The higher level of the hierarchy calculates a target trajectory in the surplus/backlog space of the part types which must be tracked at the level of setups. We consider a feedback setup scheduling policy which uses corridors in the surplus/backlog space of the part types to determine the timing of the setup changes in order to guide the trajectory in the desired direction. An interesting case in which the trajectory leads to a target point (e.g., a hedging point) is investigated in detail. It is shown that in this case the surplus/backlog trajectory at the setup level can lead to a limit cycle. Conditions for linear corridors which result in a stable limit cycle are determined.

19.
The ability to improve yield is an important determinant of competitiveness for thin-film transistor-liquid crystal display (TFT-LCD) factories. Until now, few studies have addressed the related issues of process analysis in the TFT-LCD industry, so the information (e.g. domain knowledge or parameter effects) and the improvement opportunities hidden in process analysis are frequently overlooked. That is, constructing yield or yield-loss models, identifying the critical manufacturing processes (or layers), and analyzing the clustering effect of abnormal positions (defects) on TFT-LCD glass are important issues to be addressed in the TFT-LCD industry. In this study, we propose an integrated procedure incorporating data mining techniques, namely artificial neural networks (ANNs) and stepwise regression, to construct the yield-loss model, analyze the effects of the manufacturing process, and cluster the abnormal positions (defects) of TFT-LCD products. An illustrative case from a TFT-LCD manufacturer at Tainan Science Park in Taiwan is used to verify the rationality and feasibility of the proposed procedure.

20.
This research presents a unique approach to quality monitoring of a process when the data are autocorrelated. The effect of autocorrelated data is evaluated by modelling the manufacturing process as either an autoregressive model of order one or of order two. Statistical process control is utilized and evaluated as a technique to detect known process disturbances. Because of the observed weaknesses of the statistical process control techniques when applied to such data, the Kalman filter is proposed as a technique to eliminate the autocorrelation from the process data.
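The idea of removing autocorrelation before charting can be sketched for the AR(1) case: fit the model, then chart the one-step-ahead prediction errors, which are approximately independent. For an AR model with known structure this is the role the Kalman filter plays; the simple least-squares fit below is an illustrative stand-in, and phi and the shift size are assumed values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated autocorrelated in-control data from an AR(1) process, as in
# the abstract. phi = 0.8 is an illustrative coefficient.
phi, n = 0.8, 500
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

# Estimate phi by least squares, then chart the one-step-ahead
# prediction errors, which are (approximately) independent white noise.
phi_hat = float(x[:-1] @ x[1:] / (x[:-1] @ x[:-1]))
resid = x[1:] - phi_hat * x[:-1]
sigma = resid.std()

def signals(series, mean_shift=0.0):
    """Count 3-sigma signals on the residuals of a (possibly shifted) series."""
    shifted = series + mean_shift
    r = shifted[1:] - phi_hat * shifted[:-1]
    return int(np.sum(np.abs(r) > 3 * sigma))
```

Charting the raw autocorrelated data with Shewhart limits produces excess false alarms; charting the residuals restores the independence assumption, and a genuine mean shift still shows up in them (attenuated by a factor of roughly 1 - phi).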
