20 similar documents were found; search took 15 milliseconds.
1.
Fraud detection is an important issue in service industries. Its technical challenges include the existence of complex patterns, its multivariate nature, and the incompleteness of records. As similar problems are observed in batch manufacturing process analysis, existing batch manufacturing techniques can be adapted and mapped to the fraud detection problem in the service sector. In particular, the batch library method can be modified to incorporate new information into incomplete customer records for analysis, while controlling the overall type I error rate. A real case is used to demonstrate the effectiveness of this approach, and simulation results are presented to show that the proposed approach consistently outperforms existing approaches.
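The key statistical constraint named above, controlling the overall type I error rate across many tested customer records, can be illustrated with a simple Bonferroni-style correction. This is a generic sketch with made-up p-values, not the batch library method itself:

```python
# Hypothetical sketch: family-wise type I error control when screening
# many customer records for fraud. The p-values are illustrative only.

def bonferroni_flags(p_values, overall_alpha=0.05):
    """Flag records that remain significant after a Bonferroni correction,
    keeping the family-wise (overall) type I error rate <= overall_alpha."""
    threshold = overall_alpha / len(p_values)
    return [p <= threshold for p in p_values]

p_values = [0.001, 0.20, 0.004, 0.03]
flags = bonferroni_flags(p_values)
print(flags)  # only records below 0.05 / 4 = 0.0125 are flagged
```

With four records, each individual test is held to 0.0125, so the two borderline records (0.03 and 0.20) are not flagged.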
2.
Multi-product systems with finite buffers and sequence-dependent set-up times are quite common in the modern manufacturing industry. In practice, the distribution of machine processing time can be arbitrary, while in the existing literature it is often assumed to follow an exponential distribution. In this paper, we develop an analytical method to study multi-product manufacturing systems with non-exponential processing times. An embedded Markov chain model is constructed and two approximation methods, Gamma estimation and linear approximation, are proposed. The model is validated, with high accuracy, by numerical experiments and by practical data from an automotive assembly system.
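The "Gamma estimation" named above can be illustrated by moment matching: choose the shape and scale of a Gamma distribution so that it reproduces the observed mean and variance of the processing times. A minimal sketch with synthetic data (the paper's actual estimation procedure may differ):

```python
import random

def gamma_fit_moments(samples):
    """Moment-matching sketch of 'Gamma estimation': choose shape k and
    scale theta so Gamma(k, theta) reproduces the sample mean and
    variance of the observed processing times."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean ** 2 / var, var / mean  # (shape k, scale theta)

# Synthetic non-exponential processing times (true shape 4.0, scale 1.5)
rng = random.Random(0)
times = [rng.gammavariate(4.0, 1.5) for _ in range(5000)]
k, theta = gamma_fit_moments(times)
print(round(k, 1), round(theta, 1))  # should land near 4.0 and 1.5
```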
3.
A real-time algorithm is developed for scheduling single-part-type production lines with work-in-process inventory buffers. We consider three classes of activities: operations, failures and repairs, and starvation and blockage. The scheduling objectives are to keep the actual production close to the demand, the work-in-process (WIP) inventory level low, and the cycle time short. A three-level hierarchical controller is constructed to regulate the production. At the top level, we determine the desirable buffer sizes and the target production level for each operation. At the middle level is a production flow rate controller that recalculates the production rates whenever a machine fails or is starved or blocked. The loading times for individual parts are determined at the bottom level of the hierarchy. The production scheduling algorithm is evaluated by computer simulation for a variety of cases. Compared with a transfer line policy, a significant improvement in system performance is observed.
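The middle-level flow rate controller can be sketched with a classic hedging-point rule, an illustrative stand-in rather than necessarily the paper's exact policy: run at capacity while cumulative production trails the hedging point, track demand at the hedging point, and stop above it; a failed, starved, or blocked machine produces nothing.

```python
def flow_rate(surplus, hedging_point, demand_rate, max_rate, machine_up):
    """Hedging-point sketch of a middle-level flow rate controller.
    surplus = cumulative production minus cumulative demand.
    A down (or starved/blocked) machine forces the rate to zero."""
    if not machine_up:
        return 0.0
    if surplus < hedging_point:
        return max_rate       # behind the hedging point: full capacity
    if surplus == hedging_point:
        return demand_rate    # at the hedging point: track demand
    return 0.0                # ahead of the hedging point: stop

print(flow_rate(-2.0, 0.0, 1.0, 3.0, True))  # behind demand: run at 3.0
```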
4.
In this paper, we propose a procedure for production flow control in reentrant manufacturing systems. The system under study consists of N machines and produces M product types simultaneously. Each part goes through the system following a predefined process and may visit a machine many times. All machines are subject to random failures and need random repair times. The scheduling objectives are to keep the production close to demand and to keep the WIP inventory level and cycle times at low values. The model is motivated by semiconductor fabrication production. A three-level hierarchical controller is constructed to regulate the production. At the top level of this hierarchy, we perform capacity planning by selecting the desirable buffer sizes and the target production level for each operation. A production flow rate controller at the middle level recalculates the production rates whenever a machine fails or is starved or blocked. The loading times for individual parts are determined at the bottom level of the hierarchy. Comparison with an alternative control policy, made through simulation, shows that the proposed control policy performs well.
5.
6.
This paper studies static complexity in manufacturing systems. We enumerate factors influencing static complexity, and define a static complexity measure in terms of the processing requirements of parts to be produced and machine capabilities. The measure suggested for static complexity in manufacturing systems needs only the information available from production orders and process plans. The variation in static complexity is studied with respect to part similarity, system size, and product design changes. Finally, we present relationships between the static complexity measure and system performance.
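A static complexity measure of the kind described, computable from process plans alone, might be sketched as the Shannon entropy of the distribution of processing requirements over machine capabilities. The measure and data below are illustrative, not the paper's exact definition:

```python
import math

def static_complexity(op_counts):
    """Entropy-style sketch of a static complexity measure: the share of
    each (machine, operation) requirement across all process plans is
    treated as a probability, and complexity is the Shannon entropy (in
    bits) of that distribution."""
    total = sum(op_counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in op_counts.values() if c)

uniform = {"mill": 5, "drill": 5, "turn": 5, "grind": 5}
skewed = {"mill": 17, "drill": 1, "turn": 1, "grind": 1}
print(static_complexity(uniform) > static_complexity(skewed))  # → True
```

Evenly spread processing requirements yield the maximum entropy (2 bits for four machines), matching the intuition that a system whose parts exercise all capabilities equally is harder to characterize.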
7.
8.
9.
Adequate and economic automation of production plants demands standardized and transferable solutions, especially as far as computer-controlled production is concerned. New developments in hardware and software components offer potential users extendable systems which, depending on the degree of extension, are capable of coping with both the technical and the organizational information flow within the production plant. In the course of research on the automation of production installations, a DNC system was first built using standardized process peripherals (CAMAC) and modular process control software written in a high-level language (PROCESS-FORTRAN), and later implemented in industry. The current set-up comprises two complex flexible manufacturing systems, one for profile milling pieces and one for rotational pieces. These DNC systems are being equipped with additional functions necessary for data processing as well as for material flow, handling functions and process monitoring. The presentation deals with the development and set-up of these two systems.
10.
Analytical approximations for the performance of flexible manufacturing systems (FMS) with blocking of machines due to limited local buffers are presented. The approximations are based on a detailed analysis of FMS configurations used in industry. The proposed method uses information generated by applying the classical closed queueing network (CQN) model to the FMS. The approximations developed are tested against simulation models for a wide variety of FMS configurations. The results presented show that the approximations are very good.
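The classical closed queueing network (CQN) model that these approximations build on can be evaluated exactly, for a single job class and single-server stations, with Mean Value Analysis. A minimal sketch:

```python
def mva(visit_ratios, service_times, n_jobs):
    """Exact Mean Value Analysis for a single-class closed queueing
    network with single-server stations, the classical CQN model.
    Returns system throughput and mean queue length per station."""
    m = len(service_times)
    q = [0.0] * m  # mean station populations with one fewer job
    for n in range(1, n_jobs + 1):
        # residence time at each station with n circulating jobs
        r = [visit_ratios[i] * service_times[i] * (1.0 + q[i])
             for i in range(m)]
        x = n / sum(r)          # throughput by Little's law on the cycle
        q = [x * r[i] for i in range(m)]
    return x, q

# Two stations, equal visit ratios, service times 1.0 and 0.5, 3 pallets
x, q = mva([1.0, 1.0], [1.0, 0.5], n_jobs=3)
print(round(x, 3))  # → 0.933
```

The mean queue lengths returned by the final iteration sum to the pallet population, a useful sanity check on the recursion.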
11.
12.
13.
Because of their complexity, manufacturing systems are difficult to model. However, modelling is very often required in order to study the behaviour of the system. In this paper an approach is described in which an analogy is drawn between the behaviour of a manufacturing system and that of a mechanical system. Manufacturing systems have to respond to a dynamic demand, namely, a demand that changes over time. The flexibility of a manufacturing system can be thought of as the ability, and the rapidity, with which the system responds to the dynamic demand. This resembles the behaviour of a mechanical system under the excitation of a force that changes over time. The paper attempts to establish a modelling method based on this analogy and uses this method in the study of a real industrial system.
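The mechanical analogy can be made concrete with a mass-spring-damper whose position (the system's output) chases a step change in demand, the forcing term. A simple Euler-integration sketch with purely illustrative parameters:

```python
def step_response(mass, damping, stiffness, demand, dt=0.001, t_end=20.0):
    """Euler sketch of the mechanical analogy: output x responds to a
    step change in demand like a mass-spring-damper driven by a step
    force. Larger stiffness/damping ratios mean a 'more flexible'
    system that settles onto the new demand faster."""
    x, v = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        a = (stiffness * (demand - x) - damping * v) / mass
        v += a * dt
        x += v * dt
    return x

# Critically damped illustrative system chasing a demand step to 10 units
final = step_response(mass=1.0, damping=2.0, stiffness=1.0, demand=10.0)
print(round(final, 2))  # settles near the new demand level
```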
14.
Zineb Simeu-Abazi, Olivier Daniel, Bernard Descotes-Genon. Reliability Engineering & System Safety, 1997, 55(2):125-130
The association between stochastic Petri nets and Markov chains constitutes a powerful tool for analysis. However, the Markovian models obtained for complex manufacturing systems are so large that their storage and analysis are very expensive and time-consuming. A method based on decomposition and iterative analysis is very effective against the combinatorial explosion of the number of states. In this paper, we present a method for modular modelling, an algorithm for iterative analysis, and some numerical results.
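The Petri-net-to-Markov-chain link can be illustrated on a tiny example: the reachability graph of a one-machine failure/repair net is a two-state CTMC, whose steady state can be found iteratively rather than by a direct linear solve (the motivation behind iterative analysis on huge state spaces). Rates below are illustrative:

```python
def steady_state(Q, iters=100):
    """Iterative sketch: steady-state distribution of a small CTMC
    (e.g. one derived from a stochastic Petri net's reachability graph)
    via uniformization and power iteration."""
    n = len(Q)
    rate = max(-Q[i][i] for i in range(n)) * 1.1  # uniformization rate
    # P = I + Q / rate is a stochastic matrix with the same fixed point
    P = [[(1.0 if i == j else 0.0) + Q[i][j] / rate for j in range(n)]
         for i in range(n)]
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Two-state machine: up --(failure 0.1)--> down --(repair 0.9)--> up
Q = [[-0.1, 0.1],
     [0.9, -0.9]]
pi = steady_state(Q)
print(round(pi[0], 2))  # availability = repair / (failure + repair) = 0.9
```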
15.
We present a framework for performance evaluation of manufacturing systems subject to failure and repair. In particular, we determine the mean and variance of accumulated production over a specified time frame and show the usefulness of these results in system design and in evaluating operational policies for manufacturing systems. We extend this analysis to lead time as well. A detailed performability study is carried out for the generic model of a manufacturing system with centralized material handling. Several numerical results are presented, and the relevance of performability analysis in resolving system design issues is highlighted. Specific problems addressed include computing the distribution of total production over a shift period, determining the shift length necessary to deliver a given production target with a desired probability, and obtaining the distribution of Manufacturing Lead Time, all in the face of potential subsystem failures.
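The mean and variance of production accumulated over a shift can be estimated by Monte Carlo for a single machine with exponential failure and repair times. A hedged sketch with illustrative rates (the paper derives such quantities analytically):

```python
import random

def production_stats(failure_rate, repair_rate, speed, horizon,
                     runs=5000, seed=1):
    """Monte Carlo sketch of performability analysis: mean and variance
    of production accumulated over a shift of length `horizon` by one
    machine alternating between exponential up and down periods."""
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        t, up, produced = 0.0, True, 0.0
        while t < horizon:
            dwell = rng.expovariate(failure_rate if up else repair_rate)
            dwell = min(dwell, horizon - t)
            if up:
                produced += speed * dwell
            t += dwell
            up = not up
        totals.append(produced)
    mean = sum(totals) / runs
    var = sum((x - mean) ** 2 for x in totals) / runs
    return mean, var

mean, var = production_stats(failure_rate=0.1, repair_rate=0.9,
                             speed=1.0, horizon=100.0)
print(round(mean))  # close to availability * speed * horizon = 90
```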
16.
An orthogonal projection sampling mode is proposed to reconstruct incomplete-data flow fields in optical computerized tomography (OCT). Using numerical simulation, a two-peak plane-symmetric flow field was reconstructed under different sampling modes, and the reconstruction accuracy was assessed with error indexes such as the mean square error (MSE) and the peak error (PE). Corresponding experiments were carried out with a Fabry-Perot rotary interferometer. The results indicate that the errors are drastically reduced and the precision improved when the orthogonal projection sampling mode is adopted for reconstructing the incomplete data field. The MSE obtained with the orthogonal sampling mode was 72.81% lower than that of the sequential projection sampling mode (the difference between the two MSE values divided by the MSE of the sequential sampling mode), and the PE was 73.97% lower. The precision obtained from the experimental results reached 10%, which shows that orthogonal projection sampling can be a practicable sampling mode for incomplete-data field reconstruction in OCT and can provide guidance for flow-field measurement and apparatus design in practical situations.
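The two error indexes, and the percent-decrease convention spelled out in parentheses above, are straightforward to compute. Values below are illustrative only, not the paper's data:

```python
def mse(recon, truth):
    """Mean square error between a reconstructed field and the true field."""
    return sum((r - t) ** 2 for r, t in zip(recon, truth)) / len(truth)

def peak_error(recon, truth):
    """Peak (maximum absolute) error of the reconstruction."""
    return max(abs(r - t) for r, t in zip(recon, truth))

def percent_decrease(orthogonal, sequential):
    """The abstract's convention: (sequential - orthogonal) / sequential."""
    return 100.0 * (sequential - orthogonal) / sequential

# Illustrative 1-D field samples, not the paper's two-peak field
truth = [0.0, 1.0, 2.0, 1.0, 0.0]
seq = [0.2, 1.4, 1.5, 0.7, 0.3]   # sequential-mode reconstruction
orth = [0.1, 1.1, 1.9, 0.9, 0.1]  # orthogonal-mode reconstruction
print(round(percent_decrease(mse(orth, truth), mse(seq, truth)), 1))
```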
17.
Maryam Keshavarz, Shervin Asadzadeh, Seyed Taghi Akhavan Niaki. Quality and Reliability Engineering International, 2019, 35(7):2314-2326
In recent years, much attention has been given to monitoring multistage processes in order to improve product reliability. To this end, the output of the process is investigated under special circumstances, and the values of the reliability-related quality characteristic are measured. However, analyzing reliability data is quite complicated because of unique features such as censoring and non-normal distributions. The picture becomes more complicated when the observations of the process are autocorrelated, which makes previous control procedures ineffective. In this paper, accelerated failure time (AFT) regression models are modified to account for autocorrelated data. Then, a cumulative sum (CUSUM) control chart and an exponentially weighted moving average (EWMA) control chart based on conditional expected values are proposed to monitor a quality variable with a Weibull distribution while taking the effective covariates into consideration. Extensive simulation studies reveal that the CUSUM control chart outperforms its counterpart in detecting out-of-control conditions. Finally, a real case study from the textile industry is provided to investigate the application of the CUSUM control scheme.
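The tabular CUSUM underlying such a chart can be sketched generically; thresholds and data here are illustrative, and the paper's actual scheme is built on AFT-based conditional expected values rather than raw observations:

```python
def cusum(observations, target, k, h):
    """Tabular CUSUM sketch: upper and lower cumulative sums with
    reference value k and decision interval h. Returns the index of the
    first out-of-control signal, or None if no signal occurs."""
    hi = lo = 0.0
    for i, x in enumerate(observations):
        hi = max(0.0, hi + (x - target) - k)  # accumulates upward shifts
        lo = max(0.0, lo + (target - x) - k)  # accumulates downward shifts
        if hi > h or lo > h:
            return i
    return None

data = [0.1, -0.2, 0.0, 1.5, 1.8, 2.1, 1.9]  # mean shifts up at index 3
print(cusum(data, target=0.0, k=0.5, h=3.0))  # → 5
```

The chart signals two observations after the shift begins, once the accumulated deviation beyond the reference value k crosses the decision interval h.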
18.
In this paper a method is developed for the selection of an efficient path in a fuzzy multi-objective network. The application of the methodology developed is illustrated by a process plan selection problem in a manufacturing environment.
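One simple way to select an efficient path once fuzzy edge scores are available is to defuzzify each triangular fuzzy cost by its centroid and run Dijkstra. This is an illustrative stand-in for the paper's fuzzy multi-objective method, applied to a toy process-plan network:

```python
import heapq

def centroid(tri):
    """Defuzzify a triangular fuzzy number (low, mode, high) by centroid."""
    a, b, c = tri
    return (a + b + c) / 3.0

def best_path(graph, start, goal):
    """Dijkstra over centroid-defuzzified triangular fuzzy edge costs
    (e.g. aggregated multi-objective scores). Returns the cheapest path."""
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, tri in graph.get(u, []):
            nd = d + centroid(tri)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

# Toy process plans: each edge cost is a fuzzy (low, mode, high) score
graph = {"raw": [("mill", (1, 2, 3)), ("cast", (2, 4, 6))],
         "mill": [("finish", (1, 1, 1))],
         "cast": [("finish", (0.5, 1, 1.5))]}
print(best_path(graph, "raw", "finish"))
```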
19.
In principle, data envelopment analysis (DEA) does not consider the possibility, which can occur in practice, of a production system being able to operate in different modes of functioning. In this paper, a new DEA modelling approach is proposed in which the different modes of functioning are taken into account and included in the analysis. The observed input consumption and output production in each mode of functioning is used to derive a mode-specific technology. The overall DEA technology aggregates these mode-specific technologies according to their respective time allocations. The proposed model computes a target operating point for each mode of functioning so that the operation of the overall system is efficient. The proposed approach is applied to assess the technical, cost and allocative efficiency of a reconfigurable manufacturing system. The inputs considered are modules/tools usage, labour and energy consumption. The outputs are the number of units produced of each part type. The production possibility set is determined by previous observations of the system functioning, from which the best practices can be identified. Technical, cost and allocative efficiency scores can be computed. The proposed approach not only generates input cost savings but also lead time reductions.
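In the single-input, single-output special case, the CCR DEA efficiency of each unit (here, each mode of functioning) reduces to its output/input ratio divided by the best observed ratio. A sketch with illustrative data; the general multi-input, multi-output model requires solving a linear program per unit:

```python
def dea_ccr_single(inputs, outputs):
    """Single-input, single-output special case of CCR DEA: each unit's
    efficiency is its output/input ratio divided by the best observed
    ratio. Units scoring 1.0 lie on the efficient frontier."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Illustrative data: energy consumed (input) and units produced (output)
# by three modes of functioning of a reconfigurable system
energy = [10.0, 8.0, 12.0]
units = [100.0, 96.0, 90.0]
eff = dea_ccr_single(energy, units)
print([round(e, 2) for e in eff])  # mode 2 is the best practice
```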
20.
Coarse-grained models of atomic systems, created by aggregating groups of atoms into molecules to reduce the number of degrees of freedom, have been used for decades in important scientific and technological applications. In recent years, interest in developing a more rigorous theory for coarse graining and in assessing the predictivity of coarse-grained models has arisen. In this work, Bayesian methods for the calibration and validation of coarse-grained models of atomistic systems in thermodynamic equilibrium are developed. For specificity, only configurational models of systems in canonical ensembles are considered. Among the major challenges in validating coarse-grained models are (1) the development of validation processes that lead to information essential in establishing confidence in the model's ability to predict key quantities of interest, and (2), above all, the determination of the coarse-grained model itself; that is, the characterization of the molecular architecture and the choice of interaction potentials, and thus parameters, which best fit available data. The all-atom model is treated as the "ground truth," and it provides the basis with respect to which properties of the coarse-grained model are compared. This base all-atom model is characterized by an appropriate statistical mechanics framework, in this work by canonical ensembles involving only configurational energies. The all-atom model thus supplies data for Bayesian calibration and validation methods for the molecular model. To address the first challenge, we develop priors based on the maximum entropy principle and likelihood functions based on Gaussian approximations of the uncertainties in the parameter-to-observation error. To address challenge (2), we introduce the notion of model plausibilities as a means for model selection. This methodology provides a powerful approach toward constructing coarse-grained models which are most plausible for given all-atom data.
We demonstrate the theory and methods through applications to representative atomic structures and we discuss extensions to the validation process for molecular models of polymer structures encountered in certain semiconductor nanomanufacturing processes. The powerful method of model plausibility as a means for selecting interaction potentials for coarse-grained models is discussed in connection with a coarse-grained hexane molecule. Discussions of how all-atom information is used to construct priors are contained in an appendix.
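The notion of model plausibility can be illustrated in miniature: given (approximate) log model evidences for candidate coarse-grained models and equal model priors, the posterior plausibilities are a softmax over the evidences. The evidence values below are illustrative, not from the paper:

```python
import math

def plausibilities(log_evidences):
    """Sketch of Bayesian model plausibility: under equal model priors,
    the posterior plausibility of each candidate model is its evidence
    normalized over all candidates, computed stably in log space."""
    m = max(log_evidences)
    w = [math.exp(le - m) for le in log_evidences]
    s = sum(w)
    return [x / s for x in w]

# Two hypothetical candidate interaction potentials for a coarse-grained
# molecule, with illustrative log evidences computed from all-atom data
p = plausibilities([-100.0, -103.0])
print(round(p[0], 3))  # the first candidate is strongly favoured
```

A log-evidence gap of 3 nats already concentrates roughly 95% of the plausibility on the better model, which is why the method gives a sharp selection criterion.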